Selecting the BigQuery destination sends your test result data or issue data to your BigQuery data warehouse. BigQuery is another recommended integration because you can easily query the data with SQL to build useful reports and analysis. In addition, Google provides extensive SDKs, APIs, tools, training, and support communities, and you don't have to host your own infrastructure or develop complex systems to manage things like high-availability databases.

When configured, you will see a table created under the dataset you specify for each test type, plus a table for issues. Each table is partitioned by date according to the UTC time the data is inserted into BigQuery, which allows you to run efficient queries.
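Because the tables are partitioned by ingestion date, queries that filter on BigQuery's `_PARTITIONTIME` pseudocolumn scan only the partitions they need. A minimal sketch of building such a query string in Python (the project, dataset, and table names are illustrative, not the actual names UXI creates):

```python
from datetime import date, timedelta


def partitioned_query(project_id: str, dataset: str, table: str, days: int) -> str:
    """Build a query that filters on the ingestion-time partition column
    (_PARTITIONTIME) so BigQuery scans only the most recent partitions."""
    since = date.today() - timedelta(days=days)
    return (
        f"SELECT * FROM `{project_id}.{dataset}.{table}` "
        f"WHERE _PARTITIONTIME >= TIMESTAMP('{since.isoformat()}')"
    )


# Example: last 7 days of data from a hypothetical test-results table.
print(partitioned_query("my-project", "uxi_data", "http_test_results", 7))
```

Without the `_PARTITIONTIME` filter, BigQuery would scan (and bill for) every partition in the table.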

To get started with the UXI Data Push Destination for BigQuery:

1. Create a Google Cloud account

2. In the Google Cloud Console, select or create a Google Cloud project. Add billing information if necessary.

3. In the Cloud Console, open BigQuery.

4. In the Explorer panel, click your project name.

5. Expand the View actions option and click Create Dataset. Give the dataset an ID, select a location, and leave the remaining options at their defaults.

6. Return to the Google Cloud console and create a service account. Ensure the service account can access the project containing the dataset: it needs either the BigQuery Data Editor predefined IAM role (roles/bigquery.dataEditor) or a role granting at least the following permissions:

bigquery.datasets.get 
bigquery.tables.create
bigquery.tables.get
bigquery.tables.getData
bigquery.tables.list
bigquery.tables.update
bigquery.tables.updateData

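To sanity-check the account's access, you can compare the permissions it actually holds against the list above. A sketch in Python (in practice the granted list would come from an IAM `testIamPermissions` call; here it is passed in directly):

```python
# The minimum BigQuery permissions listed above.
REQUIRED_PERMISSIONS = {
    "bigquery.datasets.get",
    "bigquery.tables.create",
    "bigquery.tables.get",
    "bigquery.tables.getData",
    "bigquery.tables.list",
    "bigquery.tables.update",
    "bigquery.tables.updateData",
}


def missing_permissions(granted):
    """Return the required permissions absent from `granted`,
    sorted for stable output. Empty means the role is sufficient."""
    return sorted(REQUIRED_PERMISSIONS - set(granted))


print(missing_permissions(["bigquery.datasets.get", "bigquery.tables.create"]))
```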
Create and download a key for this service account as a JSON file.
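Before uploading the key in a later step, it can help to confirm the downloaded JSON is actually a service account key. A small sketch checking for a few of the standard fields Google includes in every service account key file:

```python
import json

# Standard fields in a Google service account key file.
EXPECTED_FIELDS = {"type", "project_id", "private_key", "client_email"}


def check_key(key: dict) -> list:
    """Return a list of problems with a parsed service account key;
    an empty list means the key looks usable."""
    problems = [f"missing field: {f}" for f in sorted(EXPECTED_FIELDS - key.keys())]
    if key.get("type") != "service_account":
        problems.append("'type' is not 'service_account'")
    return problems


# Example: parse a (truncated, illustrative) key and report problems.
sample = json.loads(
    '{"type": "service_account", "project_id": "my-project", '
    '"private_key": "...", "client_email": "sa@my-project.iam.gserviceaccount.com"}'
)
print(check_key(sample))  # []
```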

7. In the UXI dashboard, go to Settings > Integrations. In the Data Push Destinations section, click on the Add Destination button.

8. Specify the following information in the Add Destination modal that appears:

  • Data Type: Test Results or Issues

  • Destination Type: BigQuery

  • Name: Give this integration a friendly name

  • Project Id: Enter the Google Cloud Project Id

  • Dataset: Enter the name of the BigQuery dataset

  • Service Account Details: Upload the service account JSON key file from step 6

9. Click the Add button.
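The settings collected in steps 8 and 9 can be sketched as a simple configuration check (the field names below are illustrative, not the exact labels in the UXI dashboard):

```python
# Illustrative field names mirroring the Add Destination modal.
REQUIRED_FIELDS = ("data_type", "destination_type", "name", "project_id", "dataset")


def validate_destination(cfg: dict) -> list:
    """Return a list of problems with the destination settings;
    an empty list means the configuration is complete."""
    problems = [f"missing: {f}" for f in REQUIRED_FIELDS if not cfg.get(f)]
    if cfg.get("data_type") not in ("Test Results", "Issues"):
        problems.append("data_type must be 'Test Results' or 'Issues'")
    return problems


print(validate_destination({
    "data_type": "Test Results",
    "destination_type": "BigQuery",
    "name": "UXI to BigQuery",
    "project_id": "my-project",
    "dataset": "uxi_data",
}))  # []
```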

See Also:

Here is a short demo setting up the data push destination for Google BigQuery.
