Configuring BigQuery Connector with WSO2 Enterprise Integrator

Asara Kumarasena
5 min read · May 2, 2021

The BigQuery connector allows you to access the BigQuery REST API through WSO2 ESB. BigQuery is a tool that allows you to execute SQL-like queries on large amounts of data at outstanding speeds.

To work with the BigQuery connector, you need a Google Cloud Platform account. If you do not have one, go to console.cloud.google.com and create a Google Cloud Platform trial account.

The BigQuery API requires all requests to be authenticated as either a user or a service account. In this article, I explain how to create a service account and how to configure BigQuery operations.

You can download the BigQuery connector from the WSO2 connector store. I have tested this connector (v1.0.8) against both the updated WSO2 EI 6.6 and WSO2 MI 1.2 servers.

Follow the steps below to create a Service Account.

Step 1

Open the Service Accounts page in the GCP console. If you do not already have a project, click CREATE PROJECT.

Then provide the project details.

Step 2

Select your project, click Open, and then click Create Service Account.

Enter the service account details, and then click Create.

Step 3

Select a role you wish to grant to the service account (optional) and click Continue.

Step 4

Grant users access to this service account (optional) and click Done.

Step 5

Find the row of the service account for which you wish to create a key, and click the service account's name in that row.

Step 6

Click Create new key.

Select P12 as the key type and click Create. The generated key file will then be downloaded.

That’s all for creating a service account. Next, let’s look at how to create a dataset and a table.

Search for the BigQuery API and click Enable. This enables the BigQuery API for your project.
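If you prefer the command line, the same step can be done with the gcloud CLI (assuming it is installed and authenticated against your project):

```shell
# Enable the BigQuery API for the currently configured project
gcloud services enable bigquery.googleapis.com
```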

Creating Dataset

Step 1

Go to BigQuery under the Big Data category.

Then click Create Dataset.

Step 2

Enter the required dataset details and click Create Dataset.

Creating Table

Step 1

After creating the dataset, click it to view its details. Then click Create Table.

Step 2

Enter the required table details and click Create.

The final structure and table will be as below.
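As a reference, the same dataset and table could also be created with the bq command-line tool; here is a sketch assuming a dataset named mydataset and a table mytable with a simple two-column schema (all names are placeholders, not values from this article):

```shell
# Create the dataset in the current project
bq mk --dataset mydataset

# Create a table with a two-column schema (name:STRING, age:INTEGER)
bq mk --table mydataset.mytable name:STRING,age:INTEGER
```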

✔ Configuring BigQuery Operations

We need to add the <bigquery.init> or <bigquery.getAccessTokenFromServiceAccount> element to our configuration before carrying out any other BigQuery operations. Since we created a service account, we use <bigquery.getAccessTokenFromServiceAccount> in the mediation.

<bigquery.getAccessTokenFromServiceAccount>
    <apiUrl>https://www.googleapis.com</apiUrl>
    <keyStoreLocation>{$ctx:keyStoreLocation}</keyStoreLocation>
    <serviceAccount>{$ctx:serviceAccount}</serviceAccount>
    <scope>https://www.googleapis.com/auth/bigquery</scope>
</bigquery.getAccessTokenFromServiceAccount>

Properties

  • apiUrl: The base endpoint URL of the BigQuery API.
  • keyStoreLocation: The path to the downloaded P12 key file.
  • serviceAccount: The service account email address.
  • scope: The space-delimited list of scopes required to access the API.
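For illustration, a populated element might look like the following; the key file path and service account email are hypothetical placeholders that you must replace with your own values:

```xml
<bigquery.getAccessTokenFromServiceAccount>
    <apiUrl>https://www.googleapis.com</apiUrl>
    <!-- Path to the downloaded P12 key file (placeholder) -->
    <keyStoreLocation>/home/user/bigquery-key.p12</keyStoreLocation>
    <!-- Service account email from the GCP console (placeholder) -->
    <serviceAccount>my-sa@my-project.iam.gserviceaccount.com</serviceAccount>
    <scope>https://www.googleapis.com/auth/bigquery</scope>
</bigquery.getAccessTokenFromServiceAccount>
```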

Now I am sharing a sample REST API service that includes the getTable and runQuery operations.

<?xml version="1.0" encoding="UTF-8"?>
<api xmlns="http://ws.apache.org/ns/synapse"
     name="bigquery-testAPI"
     context="/resources">
    <!-- getTable operation -->
    <resource methods="POST" uri-template="/gettabledetails">
        <inSequence>
            <property name="serviceAccount" expression="json-eval($.serviceAccount)"/>
            <property name="keyStoreLocation" expression="json-eval($.keyStoreLocation)"/>
            <property name="projectId" expression="json-eval($.projectId)"/>
            <property name="datasetId" expression="json-eval($.datasetId)"/>
            <property name="tableId" expression="json-eval($.tableId)"/>
            <bigquery.getAccessTokenFromServiceAccount>
                <apiUrl>https://www.googleapis.com</apiUrl>
                <keyStoreLocation>{$ctx:keyStoreLocation}</keyStoreLocation>
                <serviceAccount>{$ctx:serviceAccount}</serviceAccount>
                <scope>https://www.googleapis.com/auth/bigquery</scope>
            </bigquery.getAccessTokenFromServiceAccount>
            <bigquery.getTable>
                <tableId>{$ctx:tableId}</tableId>
                <datasetId>{$ctx:datasetId}</datasetId>
                <projectId>{$ctx:projectId}</projectId>
            </bigquery.getTable>
            <log level="custom">
                <property name="log" value="after bigquery gettable"/>
            </log>
            <respond/>
        </inSequence>
    </resource>
    <!-- runQuery operation -->
    <resource methods="POST" uri-template="/runQuery">
        <inSequence>
            <property name="serviceAccount" expression="json-eval($.serviceAccount)"/>
            <property name="keyStoreLocation" expression="json-eval($.keyStoreLocation)"/>
            <property name="projectId" expression="json-eval($.projectId)"/>
            <property name="query" expression="json-eval($.query)"/>
            <property name="defaultDatasetId" expression="json-eval($.defaultDatasetId)"/>
            <bigquery.getAccessTokenFromServiceAccount>
                <apiUrl>https://www.googleapis.com</apiUrl>
                <keyStoreLocation>{$ctx:keyStoreLocation}</keyStoreLocation>
                <serviceAccount>{$ctx:serviceAccount}</serviceAccount>
                <scope>https://www.googleapis.com/auth/bigquery</scope>
            </bigquery.getAccessTokenFromServiceAccount>
            <bigquery.runQuery>
                <projectId>{$ctx:projectId}</projectId>
                <kind>bigquery#queryRequest</kind>
                <query>{$ctx:query}</query>
                <maxResults>100</maxResults>
                <defaultDatasetId>{$ctx:defaultDatasetId}</defaultDatasetId>
                <defaultProjectId>{$ctx:projectId}</defaultProjectId>
                <timeoutMs>10000</timeoutMs>
                <dryRun>false</dryRun>
                <useQueryCache>true</useQueryCache>
            </bigquery.runQuery>
            <log level="custom">
                <property name="log" value="after run query"/>
            </log>
            <respond/>
        </inSequence>
    </resource>
</api>
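Once the API is deployed, the /gettabledetails resource can be invoked with a POST request such as the one below; the host and port assume the default EI HTTP listener, and all field values are placeholders for your own environment:

```shell
# Invoke the getTable resource on a locally running EI server
curl -X POST http://localhost:8280/resources/gettabledetails \
  -H "Content-Type: application/json" \
  -d '{
    "serviceAccount": "my-sa@my-project.iam.gserviceaccount.com",
    "keyStoreLocation": "/home/user/bigquery-key.p12",
    "projectId": "my-project",
    "datasetId": "mydataset",
    "tableId": "mytable"
  }'
```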

We are done! 🎉 🎉

We have now successfully configured the ESB server with the BigQuery connector.

Input request for runQuery:
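A runQuery request body takes the following shape; all values are placeholders for your environment, not the ones from the original screenshot:

```json
{
  "serviceAccount": "my-sa@my-project.iam.gserviceaccount.com",
  "keyStoreLocation": "/home/user/bigquery-key.p12",
  "projectId": "my-project",
  "defaultDatasetId": "mydataset",
  "query": "SELECT name, age FROM mytable LIMIT 10"
}
```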

Output response:
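The response follows the structure of the BigQuery jobs.query REST API; the sketch below shows its general shape with illustrative (not real) values:

```json
{
  "kind": "bigquery#queryResponse",
  "schema": {
    "fields": [
      {"name": "name", "type": "STRING"},
      {"name": "age", "type": "INTEGER"}
    ]
  },
  "jobComplete": true,
  "totalRows": "1",
  "rows": [
    {"f": [{"v": "Alice"}, {"v": "30"}]}
  ]
}
```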


Asara Kumarasena, Graduate Student @Wayne State University | Former Software Engineer @WSO2