Complete the code to create a stage for loading data from an S3 bucket.
CREATE OR REPLACE STAGE my_s3_stage URL='s3://mybucket/data/' [1] = (AWS_KEY_ID='your_key' AWS_SECRET_KEY='your_secret');
The CREDENTIALS keyword is used to provide AWS access keys for the S3 stage.
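For reference, a completed version of this stage definition might look like the following; the bucket name and key values are placeholders you would replace with your own:

```sql
-- Create an external stage pointing at an S3 prefix (placeholder credentials).
CREATE OR REPLACE STAGE my_s3_stage
  URL = 's3://mybucket/data/'
  CREDENTIALS = (AWS_KEY_ID = 'your_key' AWS_SECRET_KEY = 'your_secret');

-- Verify the stage can see the staged files before loading:
LIST @my_s3_stage;
```

`LIST` is a quick sanity check that the URL and credentials are correct before running any `COPY INTO` statement.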
Complete the code to load data from an Azure Blob storage stage into a Snowflake table.
COPY INTO my_table FROM @my_azure_stage/[1] FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',');
The blank is the file name data.csv, which matches the CSV file format specified in the COPY command.
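A completed version, together with a matching Azure stage definition, might look like the sketch below; the account, container, and SAS token are placeholders, and `SKIP_HEADER = 1` is an optional addition for files with a header row:

```sql
-- Stage over an Azure Blob container, authenticated with a SAS token (placeholder).
CREATE OR REPLACE STAGE my_azure_stage
  URL = 'azure://myaccount.blob.core.windows.net/mycontainer/'
  CREDENTIALS = (AZURE_SAS_TOKEN = '?sv=...');

-- Load one CSV file from the stage into the target table.
COPY INTO my_table
  FROM @my_azure_stage/data.csv
  FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1);
```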
Fix the error in the Snowflake stage creation for Google Cloud Storage by completing the missing keyword.
CREATE OR REPLACE STAGE gcs_stage URL='gcs://mybucket/data/' [1] = my_gcs_integration;
The STORAGE_INTEGRATION keyword is required: Snowflake authenticates to Google Cloud Storage through a storage integration object rather than inline credentials.
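Snowflake authenticates GCS stages through a storage integration object created by an account administrator; a sketch with a hypothetical integration name `my_gcs_integration`:

```sql
-- The integration must already exist and be granted to your role (hypothetical name).
CREATE OR REPLACE STAGE gcs_stage
  URL = 'gcs://mybucket/data/'
  STORAGE_INTEGRATION = my_gcs_integration;
```

The storage integration holds the GCS service-account trust relationship, so no secrets appear in the stage definition itself.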
Fill both blanks to create a file format for loading JSON files from cloud storage.
CREATE OR REPLACE FILE FORMAT my_json_format TYPE = '[1]' STRIP_OUTER_ARRAY = [2];
The file format type for JSON files is JSON, and STRIP_OUTER_ARRAY = TRUE removes the outer square brackets so that each element of a top-level JSON array is loaded as a separate row.
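A completed version, followed by one way to use the named format in a load (the table and file names are placeholders), might look like:

```sql
-- Named file format for JSON files whose top level is an array of objects.
CREATE OR REPLACE FILE FORMAT my_json_format
  TYPE = 'JSON'
  STRIP_OUTER_ARRAY = TRUE;

-- Load into a table with a single VARIANT column, referencing the named format.
COPY INTO my_json_table
  FROM @my_stage/records.json
  FILE_FORMAT = (FORMAT_NAME = 'my_json_format');
```

Using `FORMAT_NAME` keeps the parsing options in one place instead of repeating them in every `COPY INTO` statement.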
Fill all three blanks to copy data from an S3 stage into a table with a specific file format and error handling.
COPY INTO my_table FROM @my_s3_stage/[1] FILE_FORMAT = (TYPE = '[2]') ON_ERROR = '[3]';
The first blank is the file name data.csv, the second is the CSV file format type, and ON_ERROR = 'CONTINUE' tells Snowflake to skip rows that fail to parse and continue loading the rest of the file.
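A completed version of this statement might look like the sketch below; other valid ON_ERROR settings include 'SKIP_FILE' (skip the whole file on any error) and 'ABORT_STATEMENT' (the default, which fails the load):

```sql
-- Load a CSV file from the S3 stage, skipping individual error rows.
COPY INTO my_table
  FROM @my_s3_stage/data.csv
  FILE_FORMAT = (TYPE = 'CSV')
  ON_ERROR = 'CONTINUE';
```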