How to allow errors when uploading to BigQuery with gcloud in Node.js

Time: 2022-01-30 14:56:47

Using the BigQuery UI I have the option to allow errors by setting the field

Number of errors allowed

Now, when I am using gcloud in Node.js, how can I allow errors?

// myJSON, table and metadata are defined earlier from the gcloud BigQuery client.
// The read has to wait for the write to finish, so it runs inside the callback.
fs.writeFile("/tmp/bq_json_file_new.json", myJSON, function(err) {
  if (err) throw err;
  fs.createReadStream("/tmp/bq_json_file_new.json")
    .pipe(table.createWriteStream(metadata))
    .on('complete', function(job) {
      job
        .on('error', console.log)
        .on('complete', function(metadata) {
          console.log('job completed', metadata);
        });
    });
});

2 Solutions

#1

This is the maxBadRecords field in the job configuration. You can specify it when inserting a job through the BigQuery API. I'm not sure exactly what the nodejs client looks like, but if you're passing in a job-shaped object, you should be able to specify maxBadRecords in its load job configuration.

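For reference, this is roughly what such a job-shaped object looks like when calling jobs.insert directly in the BigQuery REST API; the nodejs snippet in the answer below passes the same load options as metadata. The project, dataset, table and source URI here mirror the placeholders used in that answer:

// Rough shape of a load job configuration for jobs.insert;
// all names below are placeholders, not values from the question.
var jobConfig = {
  configuration: {
    load: {
      sourceFormat: 'NEWLINE_DELIMITED_JSON',
      maxBadRecords: 2,   // the "Number of errors allowed" setting
      destinationTable: {
        projectId: 'my-project',
        datasetId: 'my_dataset',
        tableId: 'my_table'
      },
      sourceUris: ['gs://my-bucket/myFile.json']
    }
  }
};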

#2

Here's a working snippet based on Danny Kitt's answer:

var gcloud = require('gcloud')({
  keyFilename: '../config/keyfile.json',
  projectId: 'my-project'
});

var fs = require('fs');

var bigquery = gcloud.bigquery();

var dataset = bigquery.dataset('my_dataset');
var table = dataset.table('my_table');

// maxBadRecords is the load-job counterpart of the UI's
// "Number of errors allowed" field.
var metadata = {
  sourceFormat: 'NEWLINE_DELIMITED_JSON',
  maxBadRecords: 2
};

fs.createReadStream('./myFile.json')
  .pipe(table.createWriteStream(metadata))
  .on('complete', function(job) {
    job
      .on('error', console.log)
      .on('complete', function(metadata) {
        console.log('job completed', metadata);
      });
  });
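
If you are on a newer version of the client (the old gcloud package has since been split into per-product packages such as @google-cloud/bigquery), the same option can be passed as load-job metadata. A rough, untested sketch using the same placeholder names as above:

// Sketch using the newer @google-cloud/bigquery client; exact API may vary by version.
const {BigQuery} = require('@google-cloud/bigquery');

const bigquery = new BigQuery({
  keyFilename: '../config/keyfile.json',
  projectId: 'my-project'
});

async function loadWithAllowedErrors() {
  const [job] = await bigquery
    .dataset('my_dataset')
    .table('my_table')
    .load('./myFile.json', {
      sourceFormat: 'NEWLINE_DELIMITED_JSON',
      maxBadRecords: 2   // same "Number of errors allowed" setting
    });
  console.log('job completed', job);
}

loadWithAllowedErrors().catch(console.error);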
