Practice Test 1 | Google Cloud Certified Professional Cloud Architect | Dumps | Mock Test
For this question, refer to the EHR Healthcare case study.
EHR Healthcare wants to onboard new insurance providers as quickly as possible.
To do this, it must be able to acquire records both atomically, record by record, and through the transmission of files in various formats: that is, both streaming and batch ingestion.
EHR Healthcare wants to create standard data transformation procedures for loading data into its structures, while keeping the process as easy and fast as possible.
What are the GCP services that can meet this requirement (Select TWO)?
A. Manage streaming data acquisition with Pub/Sub and Dataflow
B. Manage the acquisition of streaming data with web applications exposing a standard HTTP interface
C. Manage file acquisition with signed URLs in Cloud Storage and use Dataflow for transformation
D. Share public buckets with a naming convention for file acquisition
Correct answers: A and C
Dataflow can process both batch and streaming data pipelines with the same code in a serverless way. It is based on Apache Beam, so the procedures can be created as reusable templates for any environment.
The dev team can build programs that define the pipeline; then one of Apache Beam's supported distributed processing backends, such as Dataflow, executes it. The data processing job supports parallel processing.
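To illustrate the "same code for batch and streaming" idea, here is a minimal pure-Python sketch (not actual Apache Beam code; the transform `normalize_record` and the field names are hypothetical) showing one standard transformation routine reused for both a record-at-a-time stream, such as Pub/Sub message payloads, and a file-based batch, such as a CSV pulled from Cloud Storage:

```python
import csv
import io
import json

def normalize_record(raw: dict) -> dict:
    # Hypothetical standard transformation: lowercase the keys
    # and strip whitespace from the values.
    return {k.lower(): v.strip() for k, v in raw.items()}

def process_stream(messages):
    # Streaming path: one record (e.g. a Pub/Sub message payload) at a time.
    for msg in messages:
        yield normalize_record(json.loads(msg))

def process_batch(csv_text: str):
    # Batch path: a whole file (e.g. downloaded from Cloud Storage).
    reader = csv.DictReader(io.StringIO(csv_text))
    return [normalize_record(row) for row in reader]

if __name__ == "__main__":
    stream = ['{"Name": " Alice ", "Plan": "gold"}']
    batch = "Name,Plan\n Bob ,silver\n"
    print(list(process_stream(stream)))  # [{'name': 'Alice', 'plan': 'gold'}]
    print(process_batch(batch))          # [{'name': 'Bob', 'plan': 'silver'}]
```

In a real Beam pipeline the same transform would sit behind a `ParDo`, and only the source (Pub/Sub subscription vs. Cloud Storage files) would differ between the streaming and batch variants.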
- B is wrong because it requires more development work and may not sustain heavy traffic.
- D is wrong because public buckets are not secure.
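On answer C: with the official `google-cloud-storage` client a signed URL is generated with `Blob.generate_signed_url()`, which performs Cloud Storage's V4 signing. As a stdlib-only illustration of the underlying idea (an expiry plus an HMAC signature that the provider cannot forge, which is why C is secure where D's public buckets are not), here is a sketch; the key, bucket name, and helper names are made up and this is not GCS's actual signing algorithm:

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

# Demo-only secret; real signed URLs are backed by a service account key.
SECRET_KEY = b"demo-service-account-key"

def sign_url(base_url: str, expires_in: int, now=None) -> str:
    # Attach an expiry timestamp and an HMAC over (URL + expiry).
    expiry = (now if now is not None else int(time.time())) + expires_in
    to_sign = f"{base_url}?expires={expiry}".encode()
    signature = hmac.new(SECRET_KEY, to_sign, hashlib.sha256).hexdigest()
    return f"{base_url}?{urlencode({'expires': expiry, 'sig': signature})}"

def verify_url(url: str, now: int) -> bool:
    # Reject expired links, then check the signature in constant time.
    base, _, query = url.partition("?")
    params = dict(p.split("=") for p in query.split("&"))
    expiry = int(params["expires"])
    if now > expiry:
        return False  # link expired
    expected = hmac.new(SECRET_KEY, f"{base}?expires={expiry}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, params["sig"])

if __name__ == "__main__":
    url = sign_url("https://storage.example.com/ehr-intake/claims.csv",
                   3600, now=0)
    print(verify_url(url, now=10))    # True: within the validity window
    print(verify_url(url, now=4000))  # False: expired
```

This is why signed URLs let an insurance provider upload files for a limited time without the bucket ever being public.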