Google BigQuery

Google BigQuery is an enterprise data warehouse that lets you store and query massive datasets at high speed using Google's infrastructure.

Credential configuration

This credential configuration is done with a single JSON object that looks like this:

{
  "projectId": "PROJECT_ID",
  "credentials": {
    "type": "service_account",
    "project_id": "PROJECT_ID",
    "private_key_id": "KEY_ID",
    "private_key": "-----BEGIN PRIVATE KEY-----\nPRIVATE_KEY\n-----END PRIVATE KEY-----\n",
    "client_email": "SERVICE_ACCOUNT_EMAIL",
    "client_id": "CLIENT_ID",
    "auth_uri": "",
    "token_uri": "",
    "auth_provider_x509_cert_url": "",
    "client_x509_cert_url": ""
  }
}

The projectId corresponds to the project ID from the Google Developers Console.

The credentials field of the JSON corresponds to a JSON key of the service account. This key is downloaded as a JSON file when you create it; you only need to paste its content inside the credentials field. If you don't have one, create a service account and then a JSON key for that service account. The service account you use needs permissions to access BigQuery in order to work properly.
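The relation between the downloaded key file and the credential configuration can be sketched like this. This is only an illustration: the key object below is a stub with placeholder values in the same shape as a real service-account key, not real credentials.

```javascript
// Sketch: assemble the YepCode credential JSON from a service-account key.
// In practice `key` would be the parsed content of the JSON file you
// downloaded when creating the key; all values here are placeholders.
const key = {
  type: "service_account",
  project_id: "my-project",
  private_key_id: "KEY_ID",
  private_key: "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  client_email: "sa@my-project.iam.gserviceaccount.com",
  client_id: "1234567890",
};

// The credential configuration pairs the project ID with the whole key
// object: the key goes in `credentials` unchanged.
const credentialConfig = {
  projectId: key.project_id,
  credentials: key,
};

console.log(JSON.stringify(credentialConfig, null, 2));
```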

Here is an example of a filled credential configuration form in YepCode.

Google BigQuery snippets available in editor


The title is the trigger text that YepCode uses to autocomplete the script in the editor.


New integration from credential
const googleBigQueryClient = yepcode.integration.googleBigQuery("credential-slug");
New integration from plain authentication data
const { BigQuery } = require("@google-cloud/bigquery");

const googleBigQueryCredentials = {
  projectId: "YepCode",
  credentials: {
    type: "service_account",
    project_id: "yepcode",
    private_key_id: "XXXXX",
    private_key: `-----BEGIN PRIVATE KEY-----\nx\n-----END PRIVATE KEY-----`,
    client_email: "",
    client_id: "1234567890",
    auth_uri: "",
    token_uri: "",
    auth_provider_x509_cert_url: "",
    client_x509_cert_url: "",
  },
};

const googleBigQueryClient = new BigQuery(googleBigQueryCredentials);

Create a dataset

Create a dataset
async function createDataset() {
  // Specify the geographic location where the dataset should reside
  const options = {
    location: "dataset-location",
  };

  const [dataset] = await googleBigQueryClient.createDataset("dataset-id", options);
  console.log(`Dataset ${dataset.id} created.`);
}


Delete a dataset

Delete a dataset
async function deleteDataset() {
  // Create a reference to the existing dataset
  const dataset = googleBigQueryClient.dataset("dataset-id");

  await dataset.delete({ force: true });
  console.log(`Dataset ${dataset.id} deleted.`);
}


Query to a dataset

Query to a dataset
async function query() {
  const query = "your-SQL-query";

  // For all options, see the createQueryJob / Jobs API documentation
  const options = {
    query: query,
    // Location must match that of the dataset(s) referenced in the query.
    location: "dataset-location",
  };

  // Run the query as a job
  const [job] = await googleBigQueryClient.createQueryJob(options);
  console.log(`Job ${job.id} started.`);

  // Wait for the query to finish
  const [rows] = await job.getQueryResults();

  // Use the result
  // Your code here
}
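Once getQueryResults() resolves, each row arrives as a plain JavaScript object keyed by column name. A minimal sketch of consuming the rows follows; the sample data and the name and total columns are illustrative assumptions, not part of the snippet above.

```javascript
// Sketch: consuming query results. In the snippet above, `rows` comes from
// `await job.getQueryResults()`; here it is stubbed with sample data so the
// shape is visible. The "name" and "total" columns are illustrative only.
const rows = [
  { name: "alice", total: 3 },
  { name: "bob", total: 5 },
];

// Each row behaves like an ordinary object, so standard array methods apply.
rows.forEach((row) => console.log(`${row.name}: ${row.total}`));

// Small aggregations are fine in JS; push heavy work into the SQL itself.
const grandTotal = rows.reduce((sum, row) => sum + row.total, 0);
console.log(`grand total: ${grandTotal}`);
```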