Config path: /<database>/<schema>/pipe/<name>.yaml


Example

copy:
  table: test_table_2
  stage: test_external_stage
  transform:
    id: "GET($1, 'id')"
    name: "GET($1, 'name')"

auto_ingest: false
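Conceptually, a config like the one above maps onto a Snowflake pipe similar to the following sketch (the pipe name test_pipe is hypothetical, and the exact DDL generated by SnowDDL may differ in details such as quoting and role prefixes):

```sql
CREATE PIPE test_pipe
  AUTO_INGEST = FALSE
  AS
  COPY INTO test_table_2 (id, name)
  FROM (
    -- transform entries become SELECT expressions over the stage
    SELECT GET($1, 'id'), GET($1, 'name')
    FROM @test_external_stage
  );
```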


Schema

  • copy (dict)

    • table (ident) - target table for COPY INTO

    • stage (ident) - source stage for COPY INTO

    • path (str) - path prefix for files in stage

    • pattern (str) - regular expression to filter files in stage

    • file_format (ident) - file format for files in stage

    • transform (dict)

      • {key} (ident) - column name in target table

      • {value} (str) - SQL expression to extract column value from source stage

    • options (dict)

      • {key} (ident) - COPY option name

      • {value} (bool, float, int, list, str) - COPY option value

  • auto_ingest (bool) - enable AUTO_INGEST

  • aws_sns_topic (str) - ARN of the SNS topic receiving notifications from the S3 bucket

  • integration (ident) - notification integration name for Azure

  • error_integration (ident) - notification integration to monitor ingestion errors

  • comment (str)
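A fuller config sketch exercising the optional parameters is shown below; all object names, the SNS topic ARN, and the transform expression here are hypothetical and should be replaced with your own:

```yaml
copy:
  table: raw_events
  stage: events_stage
  path: "incoming/"
  pattern: ".*[.]json"
  file_format: json_format
  transform:
    event_id: "GET($1, 'event_id')"
  options:
    ON_ERROR: SKIP_FILE

auto_ingest: true
aws_sns_topic: "arn:aws:sns:us-east-1:123456789012:snowpipe_topic"
error_integration: error_notify_integration
comment: "Ingest JSON events from S3"
```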

Usage notes

  1. Re-creating pipes correctly is complicated. Make sure you read and fully understand the pipe recreation considerations. SnowDDL can only suggest or apply DDL queries to change pipes; it cannot pause, monitor, or refresh pipes for you.

  2. Keeping pipes active incurs additional maintenance costs.

  3. file_format can only be specified by name, referencing a FILE_FORMAT object; inline file format definitions are not supported.

  4. The notification integration referenced by the error_integration parameter must also be listed in the schema parameter owner_integration_usage; otherwise the schema owner role will not be able to send notifications.
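The schema-level requirement in note 4 might look like the following fragment in the schema's params.yaml (the integration name error_notify_integration is hypothetical):

```yaml
# /<database>/<schema>/params.yaml
owner_integration_usage:
  - error_notify_integration
```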
