Google Associate Data Practitioner Practice Exams
Last updated on Apr 07, 2025
- Exam Code: Associate Data Practitioner
- Exam Name: Google Cloud Associate Data Practitioner (ADP Exam)
- Certification Provider: Google
You have a Dataproc cluster that performs batch processing on data stored in Cloud Storage. You need to schedule a daily Spark job to generate a report that will be emailed to stakeholders. You need a fully-managed solution that is easy to implement and minimizes complexity.
What should you do?
- A . Use Cloud Composer to orchestrate the Spark job and email the report.
- B . Use Dataproc workflow templates to define and schedule the Spark job, and to email the report.
- C . Use Cloud Run functions to trigger the Spark job and email the report.
- D . Use Cloud Scheduler to trigger the Spark job, and use Cloud Run functions to email the report.
Your retail organization stores sensitive application usage data in Cloud Storage. You need to encrypt the data without the operational overhead of managing encryption keys.
What should you do?
- A . Use Google-managed encryption keys (GMEK).
- B . Use customer-managed encryption keys (CMEK).
- C . Use customer-supplied encryption keys (CSEK).
- D . Use customer-supplied encryption keys (CSEK) for the sensitive data and customer-managed encryption keys (CMEK) for the less sensitive data.
You work for a financial organization that stores transaction data in BigQuery. Your organization has a regulatory requirement to retain data for a minimum of seven years for auditing purposes. You need to ensure that the data is retained for seven years using an efficient and cost-optimized approach.
What should you do?
- A . Create a partition by transaction date, and set the partition expiration policy to seven years.
- B . Set the table-level retention policy in BigQuery to seven years.
- C . Set the dataset-level retention policy in BigQuery to seven years.
- D . Export the BigQuery tables to Cloud Storage daily, and enforce a lifecycle management policy that has a seven-year retention rule.
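To make the trade-off in option A concrete: a partition expiration in BigQuery *deletes* partitions once they pass the interval, which enforces a maximum age rather than a minimum retention. A sketch of that DDL, with the table name and day count assumed for illustration:

```sql
-- Hypothetical table; partitions older than roughly seven years
-- are deleted automatically once this option is set.
ALTER TABLE finance.transactions
SET OPTIONS (partition_expiration_days = 2557);  -- ~7 years, incl. leap days
```

Note that automatic deletion at the seven-year mark is the opposite of an auditing requirement to keep data *at least* seven years, which is worth keeping in mind when weighing the options.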
Your company uses Looker to generate and share reports with various stakeholders. You have a complex dashboard with several visualizations that needs to be delivered to specific stakeholders on a recurring basis, with customized filters applied for each recipient. You need an efficient and scalable solution to automate the delivery of this customized dashboard. You want to follow the Google-recommended approach.
What should you do?
- A . Create a separate LookML model for each stakeholder with predefined filters, and schedule the dashboards using the Looker Scheduler.
- B . Create a script using the Looker Python SDK, and configure user attribute filter values. Generate a new scheduled plan for each stakeholder.
- C . Embed the Looker dashboard in a custom web application, and use the application’s scheduling features to send the report with personalized filters.
- D . Use the Looker Scheduler with a user attribute filter on the dashboard, and send the dashboard with personalized filters to each stakeholder based on their attributes.

You are a database administrator managing sales transaction data by region stored in a BigQuery table. You need to ensure that each sales representative can only see the transactions in their region.
What should you do?
- A . Add a policy tag in BigQuery.
- B . Create a row-level access policy.
- C . Create a data masking rule.
- D . Grant the appropriate IAM permissions on the dataset.
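For reference, the technique named in option B is expressed in BigQuery as a row access policy, which filters query results per grantee. A minimal sketch, with the table, policy, and user names assumed for illustration:

```sql
-- Sketch: a sales rep only sees rows for their own region.
-- Table, policy name, user, and region value are hypothetical.
CREATE ROW ACCESS POLICY region_filter
ON sales.transactions
GRANT TO ("user:rep_west@example.com")
FILTER USING (region = "us-west1");
```

One policy is typically created per region (or a single policy can filter on `SESSION_USER()` against a mapping table), so access control lives in the table itself rather than in each query.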
Your company is migrating their batch transformation pipelines to Google Cloud. You need to choose a solution that supports programmatic transformations using only SQL. You also want the technology to support Git integration for version control of your pipelines.
What should you do?
- A . Use Cloud Data Fusion pipelines.
- B . Use Dataform workflows.
- C . Use Dataflow pipelines.
- D . Use Cloud Composer operators.
You are a data analyst working with sensitive customer data in BigQuery. You need to ensure that only authorized personnel within your organization can query this data, while following the principle of least privilege.
What should you do?
- A . Enable access control by using IAM roles.
- B . Encrypt the data by using customer-managed encryption keys (CMEK).
- C . Update dataset privileges by using the SQL GRANT statement.
- D . Export the data to Cloud Storage, and use signed URLs to authorize access.
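As an illustration of the mechanism in option C, BigQuery also supports SQL DCL statements that grant IAM roles on a dataset. A sketch, with the dataset and user names assumed:

```sql
-- Sketch: grant read-only query access on a dataset via DCL
-- (dataset and principal are hypothetical).
GRANT `roles/bigquery.dataViewer`
ON SCHEMA sensitive_data
TO "user:analyst@example.com";
```

Under the hood this updates the same IAM policy that option A manages through the console or `gcloud`, so the two options differ in interface rather than in the underlying access model.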