This library is compatible with Go 1.5+. Please refer to CHANGELOG.md if you encounter breaking changes.
By default this library uses SQL mode and the streaming API to insert data. To use legacy SQL, add the `/* USE LEGACY SQL */` hint to your query; in that case you will not be able to fetch repeated and nested fields.
To control the insert method, provide `config.parameters` with the following value:

```
_table_name_.insertMethod = "load"
```
Note that UPDATE and DELETE statements are currently not supported when streaming is used.
For streaming you can specify which column to use as `insertId` with the following `config.parameters`:

```
_table_name_.insertMethod = "stream"
_table_name_.insertIdColumn = "sessionId"
```
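For illustration, assuming a hypothetical table named `sessions`, the streaming settings above could appear in a config.yaml like this (the table name, dataset, and column are placeholders, not part of the library's defaults):

```yaml
driverName: bigquery
credentials: bq
parameters:
  datasetId: myDataset
  sessions.insertMethod: "stream"
  sessions.insertIdColumn: "sessionId"
```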
`streamBatchCount` controls the number of rows per streaming batch (default: 9999).
When inserting data, this library checks for up to 60 seconds whether the data has been added. To control this behaviour you can set `insertWaitTimeoutInMs` (default: 60 sec). To disable this mechanism, set `insertWaitTimeoutInMs: -1`.
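Putting the batching and wait settings together, a config.yaml fragment might look like the sketch below (the values are examples only, not recommended defaults):

```yaml
parameters:
  streamBatchCount: 5000         # rows per streaming batch
  insertWaitTimeoutInMs: 120000  # wait up to 2 minutes for inserted data to appear
```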
Inserts are retried when the service returns a 503 internal error.
| Parameter | Default | Description |
|---|---|---|
| datasetId | | default dataset |
| maxResults | 500 | The maximum number of rows of data to return per page of results. In addition to this limit, responses are also limited to 10 MB. |
- Google secrets for service account

a) set the `GOOGLE_APPLICATION_CREDENTIALS` environment variable

b) credentials can be the name (without the .json extension) of a JSON secret file placed in the `~/.secret/` folder
config.yaml
```yaml
driverName: bigquery
credentials: bq # place your BigQuery secret JSON in ~/.secret/bq.json
parameters:
  datasetId: myDataset
```
c) a full URL to the secret file

config.yaml
```yaml
driverName: bigquery
credentials: file://tmp/secret/mySecret.json
parameters:
  datasetId: myDataset
```
The secret file has to specify the following attributes:

```go
type Config struct {
	//google cloud credential
	ClientEmail  string `json:"client_email,omitempty"`
	TokenURL     string `json:"token_uri,omitempty"`
	PrivateKey   string `json:"private_key,omitempty"`
	PrivateKeyID string `json:"private_key_id,omitempty"`
	ProjectID    string `json:"project_id,omitempty"`
}
```
- Private key (PEM)

config.yaml
```yaml
driverName: bigquery
credentials: bq
parameters:
  serviceAccountId: "***@developer.gserviceaccount.com"
  datasetId: MyDataset
  projectId: spheric-arcadia-98015
  privateKeyPath: /tmp/secret/bq.pem
```
The following is a very simple example of reading and inserting data:
```go
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/viant/bgc"
	"github.com/viant/dsc"
)

type MostLikedCity struct {
	City      string
	Visits    int
	Souvenirs []string
}

type Traveler struct {
	Id            int
	Name          string
	LastVisitTime time.Time
	Achievements  []string
	MostLikedCity MostLikedCity
}
```