How is the change of local time due to daylight saving time handled in Snowflake tasks? (Choose two.)
A. A task scheduled on a UTC-based schedule will have no issues with the time changes.
B. Task schedules can be designed to follow specified or local time zones to accommodate the...
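For context, a task's cron schedule can be pinned to UTC or to a DST-observing local time zone. A minimal sketch, assuming hypothetical task, warehouse, and procedure names:

    -- UTC-based schedule: fires at the same absolute time year-round, unaffected by DST.
    CREATE TASK utc_task
      WAREHOUSE = my_wh
      SCHEDULE  = 'USING CRON 0 2 * * * UTC'
    AS
      CALL refresh_daily();

    -- Local-time-zone schedule: follows America/Los_Angeles, including its DST transitions.
    CREATE TASK local_task
      WAREHOUSE = my_wh
      SCHEDULE  = 'USING CRON 0 2 * * * America/Los_Angeles'
    AS
      CALL refresh_daily();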
When using the Snowflake Connector for Kafka, what data formats are supported for the messages? (Choose two.)
A. CSV
B. XML
C. Avro
D. JSON
E. Parquet
Answer: C, D
Explanation: The data formats that are supported for the messages when using the Snowflake Connector for Kafka are Avro and...
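Whether the payload arrives as Avro or JSON, the connector lands each message into VARIANT columns, so it can be queried with semi-structured syntax. A sketch, assuming a hypothetical target table and payload field:

    -- RECORD_METADATA and RECORD_CONTENT are the VARIANT columns the connector populates.
    SELECT
        record_metadata:topic::string   AS topic,
        record_metadata:offset::number  AS kafka_offset,
        record_content:event_id::string AS event_id   -- hypothetical payload field
    FROM kafka_events;                                 -- hypothetical target table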
An Architect is designing a pipeline to stream event data into Snowflake using the Snowflake Kafka connector. The Architect’s highest priority is to configure the connector to stream data in the MOST cost-effective manner.
Which of the following is recommended for optimizing the cost associated with the Snowflake Kafka connector?
A. ...
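As background, the connector's default Snowpipe-based ingestion is billed through Snowpipe, so its cost can be observed per pipe. A monitoring sketch, assuming access to the SNOWFLAKE.ACCOUNT_USAGE share:

    -- Credits and bytes ingested by connector-managed pipes over the last 7 days.
    SELECT pipe_name,
           SUM(credits_used)   AS credits_used,
           SUM(bytes_inserted) AS bytes_inserted
    FROM snowflake.account_usage.pipe_usage_history
    WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
    GROUP BY pipe_name
    ORDER BY credits_used DESC;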
What is a characteristic of loading data into Snowflake using the Snowflake Connector for Kafka?
A. The Connector only works in Snowflake regions that use AWS infrastructure.
B. The Connector works with all file formats, including text, JSON, Avro, ORC, Parquet, and XML.
C. The Connector creates and manages its...
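For reference, the connector provisions its own Snowflake objects (an internal stage and pipes) for the topics it ingests. A sketch for inspecting them, assuming the SNOWFLAKE_KAFKA_CONNECTOR prefix used by the connector's default object naming:

    -- List the internal stages and pipes created and managed by the connector.
    SHOW STAGES LIKE 'SNOWFLAKE_KAFKA_CONNECTOR%';
    SHOW PIPES  LIKE 'SNOWFLAKE_KAFKA_CONNECTOR%';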
A large manufacturing company runs a dozen individual Snowflake accounts across its business divisions. The company wants to increase the level of data sharing to support supply chain optimizations and increase its purchasing leverage with multiple vendors. The company’s Snowflake Architects need to design a solution that would allow the...
According to Snowflake recommended best practice, how should these requirements be met?
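One building block for this kind of cross-account collaboration is Secure Data Sharing between accounts. A minimal sketch, with all object and account names hypothetical:

    -- Provider account: create a share and expose a table to another division's account.
    CREATE SHARE supply_chain_share;
    GRANT USAGE  ON DATABASE supply_db                        TO SHARE supply_chain_share;
    GRANT USAGE  ON SCHEMA   supply_db.public                 TO SHARE supply_chain_share;
    GRANT SELECT ON TABLE    supply_db.public.purchase_orders TO SHARE supply_chain_share;
    ALTER SHARE supply_chain_share ADD ACCOUNTS = myorg.division_account;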
The Data Engineering team at a large manufacturing company needs to engineer data coming from many sources to support a wide variety of use cases and data consumer requirements, which include:
1) Finance and Vendor Management team members who require reporting and visualization
2) Data Science team members who require...
What Snowflake data modeling approaches will meet these requirements?
At which object type level can the APPLY MASKING POLICY, APPLY ROW ACCESS POLICY and APPLY SESSION POLICY privileges be granted?
A. Global
B. Database
C. Schema
D. Table
Answer: A
Explanation: The object type level at which the APPLY MASKING POLICY, APPLY ROW ACCESS POLICY and APPLY SESSION POLICY...
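These APPLY privileges are account-level (global) grants. A sketch, assuming a hypothetical governance role:

    -- Granting the policy-application privileges at the account (global) level.
    GRANT APPLY MASKING POLICY    ON ACCOUNT TO ROLE governance_admin;
    GRANT APPLY ROW ACCESS POLICY ON ACCOUNT TO ROLE governance_admin;
    GRANT APPLY SESSION POLICY    ON ACCOUNT TO ROLE governance_admin;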
How do Snowflake databases that are created from shares differ from standard databases that are not created from shares? (Choose three.)
A. Shared databases are read-only.
B. Shared databases must be refreshed in order for new data to be visible.
C. Shared databases cannot be cloned.
D. Shared databases are...
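For context, a consumer materializes an inbound share as a database, and the result is read-only. A sketch with hypothetical provider account, share, and table names:

    -- Consumer account: create a database on top of an inbound share.
    CREATE DATABASE shared_supply_db FROM SHARE provider_account.supply_chain_share;

    -- Reads work as usual; DML against the shared database fails because it is read-only.
    SELECT COUNT(*) FROM shared_supply_db.public.purchase_orders;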
What integration object should be used to place restrictions on where data may be exported?
A. Stage integration
B. Security integration
C. Storage integration
D. API integration
Answer: C
Explanation: According to the SnowPro Advanced: Architect documents and learning resources, the integration object that should be used to place restrictions...
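A storage integration restricts export destinations through its allowed and blocked location lists. A sketch, with the role ARN and bucket paths as placeholder values:

    -- Limit external stages (and therefore unload targets) to approved locations only.
    CREATE STORAGE INTEGRATION export_s3_int
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'S3'
      ENABLED = TRUE
      STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::111122223333:role/snowflake-unload-role'
      STORAGE_ALLOWED_LOCATIONS = ('s3://approved-bucket/exports/')
      STORAGE_BLOCKED_LOCATIONS = ('s3://approved-bucket/exports/restricted/');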
How can an Architect enable optimal clustering to enhance performance for different access paths on a given table?
A. Create multiple clustering keys for a table.
B. Create multiple materialized views with different cluster keys.
C. Create super projections that will automatically create clustering.
D. Create a clustering key that...
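For reference, a table supports only one clustering key, but each materialized view over it can declare its own. A sketch with hypothetical table and column names:

    -- Two materialized views over the same table, each clustered for a different access path.
    CREATE MATERIALIZED VIEW orders_by_customer
      CLUSTER BY (customer_id)
      AS SELECT order_id, customer_id, ship_date, amount FROM orders;

    CREATE MATERIALIZED VIEW orders_by_ship_date
      CLUSTER BY (ship_date)
      AS SELECT order_id, customer_id, ship_date, amount FROM orders;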