Snowflake ARA-C01 SnowPro Advanced Architect Certification Online Training
The questions for ARA-C01 were last updated on Mar 30, 2025.
- Exam Code: ARA-C01
- Exam Name: SnowPro Advanced Architect Certification
- Certification Provider: Snowflake
- Latest update: Mar 30, 2025
A healthcare company wants to share data with a medical institute. The institute is running a Standard edition of Snowflake; the healthcare company is running a Business Critical edition.
How can this data be shared?
- A . The healthcare company will need to change the institute’s Snowflake edition in the accounts panel.
- B . By default, sharing is supported from a Business Critical Snowflake edition to a Standard edition.
- C . Contact Snowflake and they will execute the share request for the healthcare company.
- D . Set the share_restriction parameter on the shared object to false.
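For context on why the question turns on share restrictions: by default a Business Critical account cannot add a non-Business Critical account to a share. An ACCOUNTADMIN (or a role granted OVERRIDE SHARE RESTRICTIONS) can lift this per consumer account. A minimal sketch, where the share and account names are placeholders:

```sql
-- Run as ACCOUNTADMIN in the Business Critical (provider) account.
-- Share name and the institute's account identifier are placeholders.
ALTER SHARE patient_metrics_share
  ADD ACCOUNTS = medinstitute_account
  SHARE_RESTRICTIONS = false;  -- allows the Standard edition consumer
```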
An Architect is designing a pipeline to stream event data into Snowflake using the Snowflake Kafka connector. The Architect’s highest priority is to configure the connector to stream data in the MOST cost-effective manner.
Which of the following is recommended for optimizing the cost associated with the Snowflake Kafka connector?
- A . Utilize a higher Buffer.flush.time in the connector configuration.
- B . Utilize a higher Buffer.size.bytes in the connector configuration.
- C . Utilize a lower Buffer.size.bytes in the connector configuration.
- D . Utilize a lower Buffer.count.records in the connector configuration.
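The buffer properties named in the options are real Snowflake Kafka connector settings. Larger buffers produce fewer, larger files per flush, which reduces the per-file ingestion overhead. A hedged fragment of a connector configuration, with illustrative values only:

```properties
# Snowflake Kafka connector buffer settings (values are illustrative)
buffer.flush.time=120      # seconds between flushes; higher = fewer, larger files
buffer.count.records=10000 # records buffered per partition before a flush
buffer.size.bytes=20000000 # bytes buffered before a flush
```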
An Architect has chosen to separate their Snowflake Production and QA environments using two
separate Snowflake accounts.
The QA account is intended to run and test changes to data and database objects before those changes are pushed to the Production account. All database objects and data in the QA account, including privileges, must be an exact copy of those in the Production account on at least a nightly basis.
Which is the LEAST complex approach to use to populate the QA account with the Production account’s data and database objects on a nightly basis?
- A . 1) Create a share in the Production account for each database
2) Share access to the QA account as a Consumer
3) The QA account creates a database directly from each share
4) Create clones of those databases on a nightly basis
5) Run tests directly on those cloned databases
- B . 1) Create a stage in the Production account
2) Create a stage in the QA account that points to the same external object-storage location
3) Create a task that runs nightly to unload each table in the Production account into the stage
4) Use Snowpipe to populate the QA account
- C . 1) Enable replication for each database in the Production account
2) Create replica databases in the QA account
3) Create clones of the replica databases on a nightly basis
4) Run tests directly on those cloned databases
- D . 1) In the Production account, create an external function that connects into the QA account and returns all the data for one specific table
2) Run the external function as part of a stored procedure that loops through each table in the Production account and populates each table in the QA account
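The replication-plus-clone pattern in option C can be sketched as follows; account, database, and organization names are placeholders, not part of the original question:

```sql
-- In the Production account: enable replication of the database
-- to the QA account.
ALTER DATABASE prod_db ENABLE REPLICATION TO ACCOUNTS myorg.qa_account;

-- In the QA account: create the secondary (replica) database once...
CREATE DATABASE prod_db AS REPLICA OF myorg.prod_account.prod_db;

-- ...then refresh nightly (e.g. from a scheduled task) and clone for testing.
ALTER DATABASE prod_db REFRESH;
CREATE OR REPLACE DATABASE qa_test_db CLONE prod_db;
```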
A user can change object parameters using which of the following roles?
- A . ACCOUNTADMIN, SECURITYADMIN
- B . SYSADMIN, SECURITYADMIN
- C . ACCOUNTADMIN, USER with PRIVILEGE
- D . SECURITYADMIN, USER with PRIVILEGE
A media company needs a data pipeline that will ingest customer review data into a Snowflake table and apply some transformations. The company also needs to use Amazon Comprehend for sentiment analysis and make the de-identified final data set publicly available to advertising companies that use different cloud providers in different regions.
The data pipeline needs to run continuously and efficiently as new records arrive in the object storage, leveraging event notifications. Operational complexity, infrastructure maintenance (including platform upgrades and security), and development effort should all be minimal.
Which design will meet these requirements?
- A . Ingest the data using COPY INTO and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.
- B . Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.
- C . Ingest the data into Snowflake using Amazon EMR and PySpark using the Snowflake Spark connector. Apply transformations using another Spark job. Develop a python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.
- D . Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.
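The Snowpipe/streams/tasks pattern in option B can be sketched as below. All object names are placeholders, and `get_sentiment` stands in for a hypothetical external function wrapping the Amazon Comprehend API:

```sql
-- Auto-ingest pipe fed by object-storage event notifications.
CREATE PIPE reviews_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_reviews FROM @reviews_stage FILE_FORMAT = (TYPE = JSON);

-- A stream tracks newly loaded rows; a task applies transformations
-- plus sentiment scoring via the external function.
CREATE STREAM raw_reviews_stream ON TABLE raw_reviews;

CREATE TASK score_reviews
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_REVIEWS_STREAM')
AS
  INSERT INTO scored_reviews
  SELECT review_id, review_text, get_sentiment(review_text)
  FROM raw_reviews_stream;
```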
A Snowflake Architect is designing an application and tenancy strategy for an organization in which both strong legal isolation and multi-tenancy are requirements.
Which approach will meet these requirements if Role-Based Access Control (RBAC) is a viable option for isolating tenants?
- A . Create accounts for each tenant in the Snowflake organization.
- B . Create an object for each tenant strategy if row level security is viable for isolating tenants.
- C . Create an object for each tenant strategy if row level security is not viable for isolating tenants.
- D . Create a multi-tenant table strategy if row level security is not viable for isolating tenants.
Which statements describe characteristics of the use of materialized views in Snowflake? (Choose two.)
- A . They can include ORDER BY clauses.
- B . They cannot include nested subqueries.
- C . They can include context functions, such as CURRENT_TIME().
- D . They can support MIN and MAX aggregates.
- E . They can support inner joins, but not outer joins.
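As a point of reference for the options above, a materialized view that stays within Snowflake's supported constructs looks like this (table and column names are illustrative): a single-table query with simple aggregates such as MIN and MAX, and no ORDER BY, joins, subqueries, or context functions.

```sql
-- A materialized view restricted to supported constructs:
-- simple aggregates over a single table (names are placeholders).
CREATE MATERIALIZED VIEW daily_price_bounds AS
  SELECT item_id, MIN(price) AS min_price, MAX(price) AS max_price
  FROM sales
  GROUP BY item_id;
```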
The Data Engineering team at a large manufacturing company needs to engineer data coming from many sources to support a wide variety of use cases and data consumer requirements which include:
1) Finance and Vendor Management team members who require reporting and visualization
2) Data Science team members who require access to raw data for ML model development
3) Sales team members who require engineered and protected data for data monetization
What Snowflake data modeling approaches will meet these requirements? (Choose two.)
- A . Consolidate data in the company’s data lake and use EXTERNAL TABLES.
- B . Create a raw database for landing and persisting raw data entering the data pipelines.
- C . Create a set of profile-specific databases that aligns data with usage patterns.
- D . Create a single star schema in a single database to support all consumers’ requirements.
- E . Create a Data Vault as the sole data pipeline endpoint and have all consumers directly access the Vault.
An Architect on a new project has been asked to design an architecture that meets Snowflake security, compliance, and governance requirements as follows:
1) Use Tri-Secret Secure in Snowflake
2) Share some information stored in a view with another Snowflake customer
3) Hide portions of sensitive information from some columns
4) Use zero-copy cloning to refresh the non-production environment from the production environment
To meet these requirements, which design elements must be implemented? (Choose three.)
- A . Define row access policies.
- B . Use the Business-Critical edition of Snowflake.
- C . Create a secure view.
- D . Use the Enterprise edition of Snowflake.
- E . Use Dynamic Data Masking.
- F . Create a materialized view.
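Requirements 2 and 3 above map to a secure view and Dynamic Data Masking respectively. A minimal sketch, assuming hypothetical table, column, and role names:

```sql
-- Dynamic Data Masking: hide part of a sensitive column from most roles.
CREATE MASKING POLICY ssn_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('COMPLIANCE_ADMIN') THEN val
       ELSE 'XXX-XX-' || RIGHT(val, 4)
  END;

ALTER TABLE patients MODIFY COLUMN ssn SET MASKING POLICY ssn_mask;

-- Secure view for sharing: the definition is hidden from consumers.
CREATE SECURE VIEW shared_patients AS
  SELECT patient_id, ssn FROM patients;
```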
Which of the following are characteristics of how row access policies can be applied to external tables? (Choose three.)
- A . An external table can be created with a row access policy, and the policy can be applied to the VALUE column.
- B . A row access policy can be applied to the VALUE column of an existing external table.
- C . A row access policy cannot be directly added to a virtual column of an external table.
- D . External tables are supported as mapping tables in a row access policy.
- E . While cloning a database, both the row access policy and the external table will be cloned.
- F . A row access policy cannot be applied to a view created on top of an external table.
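Applying a row access policy to the VALUE column of an external table, as described in options A and B, can be sketched as follows; the policy logic and object names are illustrative only:

```sql
-- Row access policy applied to an external table's VALUE column
-- (names and the filtering predicate are placeholders).
CREATE ROW ACCESS POLICY region_policy AS (v VARIANT) RETURNS BOOLEAN ->
  v:region::STRING = 'EMEA' OR CURRENT_ROLE() = 'AUDITOR';

ALTER TABLE ext_events ADD ROW ACCESS POLICY region_policy ON (VALUE);
```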