Random `ApiError: Invalid Credentials` when calling BigQuery from Google Cloud Functions
My function is called 15 times per hour (every 4 minutes) and runs a query via `startQuery`. The "Invalid Credentials" error starts happening randomly after 30 minutes, and then happens more and more often until all calls fail.

The query reads data from a table in one dataset and saves the result to a table located in another dataset, via the options `destination` and `writeDisposition: 'WRITE_TRUNCATE'`. Both datasets are located in the EU.

Redeploying the function removes the problem temporarily.
A call to `gcloud beta functions describe my-function` indicates that the function uses the App Engine default service account: `my-project-id@appspot.gserviceaccount.com`.
Here are the error details:
```
ApiError: Invalid Credentials
    at Object.parseHttpRespBody (/user_code/node_modules/@google-cloud/bigquery/node_modules/@google-cloud/common/src/util.js:192:30)
    at Object.handleResp (/user_code/node_modules/@google-cloud/bigquery/node_modules/@google-cloud/common/src/util.js:132:18)
    at /user_code/node_modules/@google-cloud/bigquery/node_modules/@google-cloud/common/src/util.js:465:12
    at Request.onResponse [as _callback] (/user_code/node_modules/@google-cloud/bigquery/node_modules/retry-request/index.js:160:7)
    at Request.self.callback (/user_code/node_modules/@google-cloud/bigquery/node_modules/request/request.js:188:22)
    at emitTwo (events.js:106:13)
    at Request.emit (events.js:191:7)
    at Request.<anonymous> (/user_code/node_modules/@google-cloud/bigquery/node_modules/request/request.js:1171:10)
    at emitOne (events.js:96:13)
    at Request.emit (events.js:188:7)
```
EDIT
The code, stripped down:
```js
// util.inspect is used in the catch handler below
const { inspect } = require('util');

const bigquery = require('@google-cloud/bigquery')();
const destinationDataset = bigquery.dataset('destinationDataset');
const destinationTable = destinationDataset.table('destinationTable');

exports.aggregate = function aggregate (event) {
  const message = event.data;
  const attributes = message.attributes;

  let job;

  return Promise.resolve()
    .then(() => {
      if (attributes.partition == null) {
        throw new Error('Partition time not provided. Make sure you have a "partitionTime" attribute in your message');
      }

      const query = 'SELECT ... FROM sourceTable WHERE _PARTITIONTIME = TIMESTAMP(@partitionTime)';

      // The source dataset is specified in the job options below.
      // Query options list: https://cloud.google.com/bigquery/docs/reference/v2/jobs/query
      // and: https://googlecloudplatform.github.io/google-cloud-node/#/docs/bigquery/0.9.6/bigquery?method=startQuery
      const options = {
        destination: destinationTable,
        writeDisposition: 'WRITE_TRUNCATE',
        query: query,
        defaultDataset: {
          datasetId: 'sourceDataset'
        },
        timeoutMs: 540000, // 9 minutes, same timeout as the Cloud Function
        useLegacySql: false,
        parameterMode: 'named',
        queryParameters: [{
          name: 'partitionTime',
          parameterType: { type: 'STRING' },
          parameterValue: { value: attributes.partition }
        }]
      };

      return bigquery.startQuery(options);
    })
    .then((results) => {
      job = results[0];
      console.log(`BigQuery job ${job.id} started, generating data in ${destinationTable.dataset.id}.${destinationTable.id}.`);
      return job.promise();
    })
    .then((results) => {
      // Get the job's status
      return job.getMetadata();
    })
    .then((metadata) => {
      // Check the job's status for errors
      const errors = metadata[0].status.errors;
      if (errors && errors.length > 0) {
        throw errors;
      }
    })
    // If a destination table was given
    .then(() => {
      console.log(`BigQuery job ${job.id} completed, data generated in ${destinationTable.dataset.id}.${destinationTable.id}.`);
    })
    .catch((err) => {
      console.log(`Job failed for ${inspect(attributes)}: ${inspect(err)}`);
      return Promise.reject(err);
    });
};
```
You'll notice I give no options when initializing the BigQuery object: `require('@google-cloud/bigquery')()`.
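For comparison, here is what passing explicit options at initialization would look like. This is just a sketch: both the project ID and the key file name are hypothetical placeholders, not something from my actual setup.

```js
// Sketch only: initialize the BigQuery client with explicit credentials
// instead of relying on the runtime's default service account.
// 'my-project-id' and './bq-job-runner-key.json' are placeholders.
const bigquery = require('@google-cloud/bigquery')({
  projectId: 'my-project-id',
  keyFilename: './bq-job-runner-key.json'
});
```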
Should I create a service account with the BigQuery Job User role, and use the RuntimeConfig API to avoid pushing credentials to my Git origin?
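If I go that route, I assume the setup would look roughly like the following sketch. The names are hypothetical (`bq-job-runner` and the key file path are made up), and note that BigQuery Job User only allows running jobs, so read access on the source dataset and write access on the destination dataset would still be needed on top of it:

```sh
# Sketch: create a dedicated service account for the function (names are hypothetical)
gcloud iam service-accounts create bq-job-runner \
  --display-name "BigQuery job runner for Cloud Functions"

# Grant it the BigQuery Job User role on the project
gcloud projects add-iam-policy-binding my-project-id \
  --member serviceAccount:bq-job-runner@my-project-id.iam.gserviceaccount.com \
  --role roles/bigquery.jobUser

# Export a key the function could load (this is the part I'd rather not commit to Git)
gcloud iam service-accounts keys create ./bq-job-runner-key.json \
  --iam-account bq-job-runner@my-project-id.iam.gserviceaccount.com
```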
The question still remains: why does this error occur randomly? Looking at the function logs right now, the error happened on every call between midnight and 4 AM CEST, then on about one third of the calls until 5:36 AM. Since that time (4 hours ago) it has not happened once.
EDIT 2
The invocations chart shows the frequency of failed invocations compared to successful ones. The errors (in green) are all "Invalid Credentials" errors. Absolutely nothing was touched during these 7 days: no deployments, no configuration changes, no fiddling in BigQuery.