Integrating your data pipelines via platform endpoints lets you add data to the platform and retrieve data from it for use in your local environments.

It is a good idea to set up your cloud endpoints as early as possible, because doing so sometimes requires assistance from other departments in your company.

Note: Use the Snowflake web interface (Snowpipe) or any other supported Snowflake ETL tool to move data to or from your cloud storage endpoint.

A direct connection to Snowflake is coming soon.

Note: You must have a Technician role to perform this action.

Your endpoints must already exist within your cloud accounts; if they do not, set up a cloud-agnostic SFTP endpoint instead.

To create an endpoint:

  1. Click on your Organization Logo on the navigation bar.

  2. Select the Endpoints tab.

  3. Click Create endpoint.

  4. Enter the endpoint Name.

  5. Enter an endpoint Description.

  6. Select the Type of endpoint you want to set up.
    Follow the instructions in the appropriate section below.

S3

  1. Choose Type > S3.

  2. Insert your S3 path.
    Example: s3://my-bucket/my-data
    This is the name of the bucket you created in your AWS account.

  3. Select your AWS Region from the list.

  4. Insert your IAM credentials (if requested):

    1. Insert your Amazon access key.

    2. Insert your Amazon secret key.

  5. Click Create.

  6. Follow the Your AWS S3 bucket policy steps:

    • Navigate to the bucket in the AWS console S3 browser.

    • Open the `Permissions` tab and select the `Bucket Policy` option.

    • Copy the generated policy statement and paste it into the text editor.

    • Click Save.

  7. Click Close.

Still not sure? Watch this video on how to set up an AWS S3 bucket.
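For orientation, the policy the Platform generates will resemble the minimal sketch below. Everything in it is hypothetical (the Sid, the principal ARN, the bucket name, and the action list are placeholders); always paste the exact policy the Platform generates for you rather than writing your own.

```python
import json

# Sketch only: builds a bucket policy document shaped like the one the
# Platform generates. The principal ARN and Sid below are made-up placeholders.
def make_bucket_policy(bucket: str, principal_arn: str) -> str:
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowPlatformEndpointAccess",  # hypothetical Sid
                "Effect": "Allow",
                "Principal": {"AWS": principal_arn},
                "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket}",      # the bucket itself (ListBucket)
                    f"arn:aws:s3:::{bucket}/*",    # the objects within it
                ],
            }
        ],
    }
    return json.dumps(policy, indent=2)

print(make_bucket_policy("my-bucket", "arn:aws:iam::123456789012:role/platform-endpoint-role"))
```

The real policy may grant different actions or additional statements; treat this only as a guide to reading the statement you paste in step 6.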

GCS

  1. Choose Type > GCS.

  2. Enter the GCS bucket name.

  3. (Optional) Enter the GCS subdirectory (path/to/dir).
    The Platform uses these locations when publishing and exporting to the endpoint.

  4. Select your GCP Key File (if requested).

  5. Click Create.
    The New GCS endpoint information page appears.
    Help: This page confirms that the endpoint is ready and that the member account and roles have been created. It also explains how to apply the required roles and access permissions to the bucket member account. Ensure bucket access control is set to Fine-grained.

  6. Go back to the Google Cloud Console and log in.

  7. Go to your bucket.

  8. Click Roles > Add.

  9. Under New principals, add the member email that the Platform gave you when you created the endpoint.

  10. Add the following roles:

    1. Storage Legacy Bucket Writer

    2. Storage Legacy Object Owner

  11. Click Close.

Still not sure? Watch this video on how to set up a GCS bucket.

Azure Blob Storage

  1. Choose Type > Azure Blob Storage.

  2. In the Shared Access Signature section:

    1. Enter the Blob SAS URL.

    2. (Optional) Insert a Blob prefix.

  3. Click Create.
    The new endpoint ready message box appears.

  4. Use the access permissions instructions to add the displayed member details to your permissions tab.

  5. Click Close.
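If you are unsure which parts of a Blob SAS URL the form expects, this small stdlib-only Python sketch splits one into its storage account, container, and permission fields. The URL shown is a made-up example, not a real SAS token.

```python
from urllib.parse import urlsplit, parse_qs

# Sketch only: pull the storage account, container, and "sp" (permissions)
# field out of a Blob SAS URL. The example URL below is entirely fictitious.
def split_sas_url(sas_url: str) -> dict:
    parts = urlsplit(sas_url)
    account = parts.netloc.split(".")[0]            # <account>.blob.core.windows.net
    container = parts.path.lstrip("/").split("/")[0]  # first path segment
    query = parse_qs(parts.query)
    return {
        "account": account,
        "container": container,
        "permissions": query.get("sp", [""])[0],    # e.g. "rwl" = read/write/list
    }

info = split_sas_url("https://myaccount.blob.core.windows.net/mycontainer?sv=2022-11-02&sp=rwl&sig=abc")
print(info)
# → {'account': 'myaccount', 'container': 'mycontainer', 'permissions': 'rwl'}
```

The `sp` field must include the permissions the Platform needs to publish to and export from the container; check the endpoint page for the exact requirements.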

SFTP

(Available to some customers only. Contact Support if you would like to know more.)

  1. Choose Type > SFTP.

  2. Click + Create key to create an SSH key. You can create up to ten SSH keys. SSH keys can be downloaded as .pem files and used to access the SFTP server.

Warning: For security purposes SFTP endpoint SSH keys are only accessible to the user that created the endpoint (the endpoint owner) on behalf of the organization. If you are using an SFTP endpoint created by a different user, contact this user to obtain access to the SFTP server location to retrieve the exported data.

Now you are ready to add data to the Platform, or take data off the Platform, via a cloud endpoint.


Related Pages

Create a GCS bucket

Create an S3 Bucket

Create an Azure Blob SAS token and URL

Source your Data Product from an Endpoint

Consume Data Products Off Platform