Snowflake Cloud · ~20 mins

File formats (CSV, JSON, Parquet, Avro) in Snowflake - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️
File Format Mastery in Snowflake
Get all challenges correct to earn this badge!
Test your skills under time pressure!
Service Behavior
Intermediate
2:00 remaining
Understanding CSV File Loading Behavior in Snowflake

You load a CSV file into a Snowflake table using the COPY INTO command with default options. The CSV file contains some rows with missing values at the end of the line.

What will happen to those missing values during the load?

Snowflake
COPY INTO my_table FROM @my_stage/file.csv FILE_FORMAT = (TYPE = 'CSV');
A. Missing trailing columns are ignored and the row is truncated to available columns.
B. The load fails with an error due to missing columns in some rows.
C. Missing trailing columns are loaded as NULL values in the table columns.
D. Snowflake fills missing columns with empty strings instead of NULL.
Attempts: 2 left
💡 Hint

Think about how Snowflake handles incomplete rows in CSV files by default.
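For reference, column-count behavior for CSV loads is configurable rather than fixed. A minimal sketch (table and stage names are placeholders) showing the relevant option, `ERROR_ON_COLUMN_COUNT_MISMATCH`:

```sql
-- Default is ERROR_ON_COLUMN_COUNT_MISMATCH = TRUE; setting it FALSE tells
-- Snowflake to tolerate rows whose column count differs from the table's.
COPY INTO my_table
  FROM @my_stage/file.csv
  FILE_FORMAT = (TYPE = 'CSV' ERROR_ON_COLUMN_COUNT_MISMATCH = FALSE);
```

Comparing this explicit setting against the default in the question is a good way to reason about what the default load does with short rows.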

Architecture
Intermediate
2:00 remaining
Choosing the Best File Format for Semi-Structured Data in Snowflake

You need to store and query semi-structured JSON data efficiently in Snowflake. Which file format should you choose to optimize query performance and storage?

A. Parquet
B. JSON
C. CSV
D. Avro
Attempts: 2 left
💡 Hint

Consider which format supports columnar storage and efficient compression.
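Whatever source format you choose, semi-structured data typically lands in a VARIANT column and is queried with path notation. A minimal sketch, with illustrative table, stage, and key names:

```sql
-- Land raw JSON into a single VARIANT column (names are placeholders).
CREATE OR REPLACE FILE FORMAT my_json_format TYPE = 'JSON';
CREATE OR REPLACE TABLE raw_events (v VARIANT);

COPY INTO raw_events
  FROM @my_stage/events.json
  FILE_FORMAT = (FORMAT_NAME = 'my_json_format');

-- Path notation plus a cast extracts typed values from the VARIANT column.
SELECT v:user.id::STRING AS user_id FROM raw_events;
```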

Security
Advanced
2:00 remaining
Handling Schema Evolution with Avro Files in Snowflake

You have Avro files with evolving schemas being loaded into a Snowflake table. What is the best practice to handle schema changes without causing load failures?

A. Convert Avro files to CSV before loading to avoid schema issues.
B. Use a fixed schema in Snowflake and reject files with schema differences.
C. Manually update the Snowflake table schema before loading new Avro files.
D. Use Snowflake's VARIANT column type to store Avro data and parse schema dynamically.
Attempts: 2 left
💡 Hint

Think about how Snowflake handles semi-structured data and schema flexibility.
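As background for this question: Avro records can be loaded into a VARIANT column, where added or removed fields do not break the load. A minimal sketch with placeholder names:

```sql
-- Each Avro record becomes one VARIANT value; no fixed column list to mismatch.
CREATE OR REPLACE TABLE avro_landing (rec VARIANT);

COPY INTO avro_landing
  FROM @my_stage/avro/
  FILE_FORMAT = (TYPE = 'AVRO');

-- Fields added by a newer schema appear as extra keys;
-- fields absent from older records simply come back as NULL.
SELECT rec:customer_id::NUMBER AS customer_id,
       rec:recently_added_field::STRING AS new_field
FROM avro_landing;
```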

🧠 Conceptual
Advanced
2:00 remaining
Comparing Compression and Query Efficiency of File Formats in Snowflake

Which file format among CSV, JSON, Parquet, and Avro generally provides the best compression and fastest query performance in Snowflake?

A. CSV
B. Parquet
C. Avro
D. JSON
Attempts: 2 left
💡 Hint

Consider columnar vs row-based formats and compression capabilities.
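One practical consequence of a columnar layout is worth keeping in mind while answering: Snowflake can select only the columns it needs from a staged file during a transformation COPY. A minimal sketch (table, stage, and field names are illustrative):

```sql
-- With a columnar source, only the referenced fields need to be read;
-- a row-based format would force reading every full record.
COPY INTO sales (order_id, amount)
  FROM (SELECT $1:order_id::NUMBER, $1:amount::NUMBER
        FROM @my_stage/sales.parquet)
  FILE_FORMAT = (TYPE = 'PARQUET');
```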

Best Practice
Expert
3:00 remaining
Optimizing Snowflake Data Loading with Mixed File Formats

You have a Snowflake pipeline that loads data from multiple file formats: CSV, JSON, Parquet, and Avro. To optimize loading speed and minimize errors, which approach is best?

A. Create separate tables optimized for each file format and load accordingly.
B. Load all files into one table with columns matching the CSV schema.
C. Load all files into a single VARIANT column table regardless of format.
D. Convert all files to CSV before loading to simplify the process.
Attempts: 2 left
💡 Hint

Think about how different file formats have different strengths and schema requirements.
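One common way a mixed-format pipeline is wired up is with a named FILE FORMAT per source type and a PATTERN on each COPY so every file is parsed by the right reader. A minimal sketch under that assumption (all object names are placeholders):

```sql
-- One named format per source type keeps parsing options in one place.
CREATE OR REPLACE FILE FORMAT ff_csv     TYPE = 'CSV' SKIP_HEADER = 1;
CREATE OR REPLACE FILE FORMAT ff_parquet TYPE = 'PARQUET';

-- PATTERN routes each file extension to the matching load.
COPY INTO staging_csv
  FROM @my_stage
  FILE_FORMAT = (FORMAT_NAME = 'ff_csv')
  PATTERN = '.*[.]csv';

COPY INTO staging_parquet
  FROM (SELECT $1 FROM @my_stage)        -- Parquet rows arrive as VARIANT ($1)
  FILE_FORMAT = (FORMAT_NAME = 'ff_parquet')
  PATTERN = '.*[.]parquet';
```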