GCS

Steps to integrate GCS with Locale.ai

You can use a GCS bucket to send your data directly to the locale.ai platform. Here's how it's done.

Basic Requirements

You will need a dedicated bucket in GCS; the data that you want to upload for one or more entities should be placed in this bucket. Each dataset can also be imported from a separate bucket.
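If you prefer to script the upload rather than move files manually, the sketch below shows one way to place entity data in the bucket with the google-cloud-storage Python client. The bucket name, entity prefixes, and local file paths are placeholders for illustration, not values expected by Locale.

```python
# Minimal sketch: place entity data files in a dedicated GCS bucket using the
# google-cloud-storage client (pip install google-cloud-storage).
# Bucket name, prefixes, and local paths below are hypothetical placeholders.
from google.cloud import storage

BUCKET_NAME = "my-locale-exports"  # hypothetical dedicated bucket

client = storage.Client()  # uses your default GCP credentials
bucket = client.bucket(BUCKET_NAME)

# One prefix ("folder") per entity keeps each dataset easy to locate.
uploads = {
    "orders/orders_2024_01.csv": "exports/orders_2024_01.csv",
    "riders/riders_2024_01.csv": "exports/riders_2024_01.csv",
}

for destination, local_path in uploads.items():
    blob = bucket.blob(destination)
    blob.upload_from_filename(local_path)
    print(f"Uploaded {local_path} to gs://{BUCKET_NAME}/{destination}")
```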

GCS bucket access steps

1. Navigate to your GCS bucket in the GCP console and go to the Permissions section.

2. We require the bucket to have uniform access control. If the bucket has fine-grained access control, change it to uniform by clicking the Switch to uniform button in the Access control section, selecting the Uniform option in the dialog box, and saving.

3. After this step, in the Permissions section, click Add.

4. In the Add members to bucket side panel, add the following member: gaia-extractor@localeai-314712.iam.gserviceaccount.com

5. After adding the member, assign the following role: Cloud Storage > Storage Object Viewer.

This bucket policy allows Locale to list the objects in the bucket and read any file present in it. For more information regarding GCS cross-account access, please refer to the official GCP documentation. A programmatic sketch of steps 2 to 5 is included after this list.

6. Finally, after completing these steps, tick the Send notification email checkbox; optionally, you may include the bucket name in the message.
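If you manage bucket permissions in code rather than through the console, the sketch below is one way to perform steps 2 through 5 with the google-cloud-storage Python client. The bucket name is a placeholder; the service account email and role are the ones listed in the steps above. The notification email in step 6 is a console-only action and is not covered here.

```python
# Minimal sketch of steps 2-5 done programmatically with google-cloud-storage.
# BUCKET_NAME is a hypothetical placeholder; the member and role are taken
# from the steps above.
from google.cloud import storage

BUCKET_NAME = "my-locale-exports"  # hypothetical dedicated bucket
LOCALE_MEMBER = "serviceAccount:gaia-extractor@localeai-314712.iam.gserviceaccount.com"

client = storage.Client()
bucket = client.get_bucket(BUCKET_NAME)

# Step 2: switch the bucket to uniform bucket-level access.
bucket.iam_configuration.uniform_bucket_level_access_enabled = True
bucket.patch()

# Steps 3-5: grant Storage Object Viewer to the Locale service account.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append(
    {"role": "roles/storage.objectViewer", "members": {LOCALE_MEMBER}}
)
bucket.set_iam_policy(policy)
```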

Once you are done with all these steps, share the bucket name with us so that we can get started with the integration.
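Before sharing the bucket name, you can optionally double-check that the grant is in place. The sketch below, again using a placeholder bucket name, reads the bucket's IAM bindings and confirms that the Locale service account holds Storage Object Viewer.

```python
# Minimal sketch: confirm the Locale service account holds Storage Object Viewer
# on the bucket before sharing the bucket name. BUCKET_NAME is a placeholder.
from google.cloud import storage

BUCKET_NAME = "my-locale-exports"  # hypothetical dedicated bucket
LOCALE_MEMBER = "serviceAccount:gaia-extractor@localeai-314712.iam.gserviceaccount.com"

client = storage.Client()
bucket = client.bucket(BUCKET_NAME)

policy = bucket.get_iam_policy(requested_policy_version=3)
granted = any(
    binding["role"] == "roles/storage.objectViewer"
    and LOCALE_MEMBER in binding["members"]
    for binding in policy.bindings
)
print(f"gs://{BUCKET_NAME}: Locale access granted = {granted}")
```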
