
Loading from S3, Azure Blob, GCS in Snowflake - Commands & Configuration

Introduction
Loading data into Snowflake from cloud storage services such as Amazon S3, Azure Blob Storage, or Google Cloud Storage lets you bring files into your database without manual uploads, so you can start analyzing and working with your data quickly.
When you have data files stored in Amazon S3 that you want to analyze in Snowflake.
When your data is saved in Azure Blob Storage and you need to load it into Snowflake tables.
When you want to import CSV or JSON files from Google Cloud Storage into Snowflake.
When you want to automate data loading from cloud storage into Snowflake for regular updates.
When you want to keep your data in cloud storage but query it efficiently inside Snowflake.
Config File - load_data.sql
load_data.sql
CREATE OR REPLACE STAGE my_s3_stage
  URL='s3://my-bucket/data/'
  STORAGE_INTEGRATION = my_s3_integration;

CREATE OR REPLACE STAGE my_azure_stage
  URL='azure://myaccount.blob.core.windows.net/mycontainer/data/'
  STORAGE_INTEGRATION = my_azure_integration;

CREATE OR REPLACE STAGE my_gcs_stage
  URL='gcs://my-gcs-bucket/data/'
  STORAGE_INTEGRATION = my_gcs_integration;

COPY INTO my_table
  FROM @my_s3_stage
  FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1);

COPY INTO my_table
  FROM @my_azure_stage
  FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1);

COPY INTO my_table
  FROM @my_gcs_stage
  FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1);

This SQL script creates three external stages in Snowflake, each pointing to a different cloud storage location: Amazon S3, Azure Blob Storage, and Google Cloud Storage. The STORAGE_INTEGRATION parameter references a pre-configured integration object that handles authentication, so no credentials are embedded in the stage definition.
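The stage definitions above assume the integration objects already exist. A minimal sketch of creating one for S3 (the integration name matches the config file; the role ARN and bucket are placeholders you must replace with your own):

```sql
-- Integration that lets Snowflake assume an AWS IAM role to read the bucket.
CREATE OR REPLACE STORAGE INTEGRATION my_s3_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/my-snowflake-role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/data/');

-- Shows the IAM user and external ID Snowflake will use; add these
-- to the IAM role's trust policy in AWS to complete the handshake.
DESC INTEGRATION my_s3_integration;
```

Azure and GCS integrations follow the same pattern with STORAGE_PROVIDER = 'AZURE' or 'GCS' and the provider-specific identity parameters.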

Then, it runs COPY INTO commands to load CSV files from each stage into the Snowflake table my_table. The file format specifies comma-delimited CSV and skips the header row so column names are not loaded as data.
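Since the same inline file format is repeated three times, you could instead define it once as a named file format object and reference it from each COPY INTO. A sketch, using a hypothetical name my_csv_format:

```sql
-- Reusable file format object; change it in one place for all loads.
CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = 'CSV'
  FIELD_DELIMITER = ','
  SKIP_HEADER = 1;

COPY INTO my_table
  FROM @my_s3_stage
  FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');
```

The named format can also be attached to the stage itself with the FILE_FORMAT property, so COPY INTO picks it up automatically.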

Commands
This command creates a Snowflake stage named 'my_s3_stage' that points to the Amazon S3 bucket folder where your data files are stored. The storage integration handles secure access.
Terminal
snowsql -q "CREATE OR REPLACE STAGE my_s3_stage URL='s3://my-bucket/data/' STORAGE_INTEGRATION = my_s3_integration;"
Expected Output
Stage area MY_S3_STAGE successfully created.
This command loads data from the S3 stage into the Snowflake table 'my_table' using the specified CSV file format. It skips the header row to avoid loading column names as data.
Terminal
snowsql -q "COPY INTO my_table FROM @my_s3_stage FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1);"
Expected Output
COPY INTO my_table executed successfully. Loaded 1000 rows.
This command creates a Snowflake stage named 'my_azure_stage' that points to your Azure Blob Storage container folder with data files.
Terminal
snowsql -q "CREATE OR REPLACE STAGE my_azure_stage URL='azure://myaccount.blob.core.windows.net/mycontainer/data/' STORAGE_INTEGRATION = my_azure_integration;"
Expected Output
Stage area MY_AZURE_STAGE successfully created.
This command loads data from the Azure Blob Storage stage into the Snowflake table 'my_table' using the CSV format.
Terminal
snowsql -q "COPY INTO my_table FROM @my_azure_stage FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1);"
Expected Output
COPY INTO my_table executed successfully. Loaded 800 rows.
This command creates a Snowflake stage named 'my_gcs_stage' that points to your Google Cloud Storage bucket folder.
Terminal
snowsql -q "CREATE OR REPLACE STAGE my_gcs_stage URL='gcs://my-gcs-bucket/data/' STORAGE_INTEGRATION = my_gcs_integration;"
Expected Output
Stage area MY_GCS_STAGE successfully created.
This command loads data from the Google Cloud Storage stage into the Snowflake table 'my_table' using the CSV file format.
Terminal
snowsql -q "COPY INTO my_table FROM @my_gcs_stage FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1);"
Expected Output
COPY INTO my_table executed successfully. Loaded 1200 rows.
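The commands above load data on demand. For the regular-update use case mentioned in the introduction, a Snowpipe object can run the same COPY INTO automatically whenever new files land in the bucket. A sketch, with a hypothetical pipe name (auto-ingest on S3 additionally requires configuring bucket event notifications to the queue shown by SHOW PIPES):

```sql
-- Continuously load new files from the S3 stage as they arrive.
CREATE OR REPLACE PIPE my_s3_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO my_table
  FROM @my_s3_stage
  FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1);
```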
Key Concept

If you remember nothing else from this pattern, remember: Snowflake stages link your cloud storage to your database, and COPY INTO loads the data from those stages into your tables.

Common Mistakes
Not creating or configuring the STORAGE_INTEGRATION before creating the stage.
Without STORAGE_INTEGRATION, Snowflake cannot securely access the cloud storage, so the stage creation or data loading will fail.
Always create and configure the STORAGE_INTEGRATION object with proper permissions before creating stages.
Using incorrect URL format for the cloud storage location in the stage definition.
Snowflake requires exact URL formats for each cloud provider; wrong URLs cause stage creation errors or inability to find files.
Use the correct URL prefix and path format: s3://bucket/path/, azure://account.blob.core.windows.net/container/path/, gcs://bucket/path/.
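A quick way to confirm the URL and integration are right is to list the stage before loading anything; if the stage can see your files, COPY INTO will too:

```sql
-- Sanity check: show the files visible through the stage.
LIST @my_s3_stage;
```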
Not specifying the correct FILE_FORMAT or skipping the header row when loading CSV files.
If the file format is wrong, data may load incorrectly or cause errors; including headers as data can corrupt your table.
Define a FILE_FORMAT with the right type and options, and use SKIP_HEADER=1 if your files have headers.
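To catch format problems before they corrupt the table, COPY INTO supports a validation mode that parses the files and reports errors without loading any rows:

```sql
-- Dry run: report parse errors instead of loading data.
COPY INTO my_table
  FROM @my_s3_stage
  FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1)
  VALIDATION_MODE = RETURN_ERRORS;
```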
Summary
Create Snowflake stages to connect to your cloud storage locations using STORAGE_INTEGRATION for secure access.
Use COPY INTO commands to load data from these stages into your Snowflake tables with the correct file format.
Verify each step by checking the success messages and row counts after loading data.
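Beyond reading the COPY INTO result, load activity can be verified after the fact with the COPY_HISTORY table function. A sketch, assuming the loads happened within the last hour:

```sql
-- Per-file load status, row counts, and the first error (if any) for my_table.
SELECT file_name, status, row_count, first_error_message
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
  TABLE_NAME => 'MY_TABLE',
  START_TIME => DATEADD(hour, -1, CURRENT_TIMESTAMP())
));
```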