Trigger
type: "io.kestra.plugin.gcp.bigquery.Trigger"
Wait for a SQL query to return results on BigQuery.
Examples
Wait for a SQL query to return results and iterate through the rows.
id: bigquery-listen
namespace: company.team
tasks:
  - id: each
    type: io.kestra.plugin.core.flow.EachSequential
    tasks:
      - id: return
        type: io.kestra.plugin.core.debug.Return
        format: "{{ taskrun.value }}"
    value: "{{ trigger.rows }}"
triggers:
  - id: watch
    type: io.kestra.plugin.gcp.bigquery.Trigger
    interval: "PT5M"
    sql: "SELECT * FROM `myproject.mydataset.mytable`"
    fetch: true
Properties
conditions
- Type: array
- SubType: Condition
- Dynamic: ❌
- Required: ❌
List of conditions used to limit when the flow trigger runs.
fetch
- Type: boolean
- Dynamic: ❌
- Required: ❌
- Default:
false
Whether to fetch the data from the query result into the task output.
fetchOne
- Type: boolean
- Dynamic: ❌
- Required: ❌
- Default:
false
Whether to fetch only one row of data from the query result into the task output.
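As a sketch (the column name my_column is a placeholder for illustration), a trigger with fetchOne exposes the single fetched row under trigger.row:
id: bigquery-fetch-one
namespace: company.team
tasks:
  - id: log_row
    type: io.kestra.plugin.core.log.Log
    message: "First row: {{ trigger.row.my_column }}"
triggers:
  - id: watch
    type: io.kestra.plugin.gcp.bigquery.Trigger
    interval: "PT5M"
    sql: "SELECT * FROM `myproject.mydataset.mytable` LIMIT 1"
    fetchOne: true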
interval
- Type: string
- Dynamic: ❌
- Required: ❌
- Default:
PT1M (60 seconds)
- Format:
duration
Interval between polling.
The interval between two consecutive polls of the schedule; a sensible value avoids overloading the remote system with too many calls. For most triggers that depend on external systems, the interval should be at least PT30S. See ISO 8601 durations for more information on available interval values.
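For example, to poll every 30 minutes rather than the 60-second default, set the interval as an ISO 8601 duration on the trigger (a minimal sketch reusing the query from the example above):
triggers:
  - id: watch
    type: io.kestra.plugin.gcp.bigquery.Trigger
    interval: "PT30M"
    sql: "SELECT * FROM `myproject.mydataset.mytable`"
    fetch: true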
legacySql
- Type: boolean
- Dynamic: ❌
- Required: ❌
- Default:
false
Whether to use BigQuery's legacy SQL dialect for this query.
By default, this property is set to false, meaning standard SQL is used.
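A sketch of a trigger running a legacy SQL query; note that legacy SQL references tables with the [project:dataset.table] bracket notation instead of backticks:
triggers:
  - id: watch
    type: io.kestra.plugin.gcp.bigquery.Trigger
    interval: "PT5M"
    legacySql: true
    sql: "SELECT * FROM [myproject:mydataset.mytable]"
    fetch: true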
projectId
- Type: string
- Dynamic: ❓
- Required: ❌
scopes
- Type: array
- SubType: string
- Dynamic: ❓
- Required: ❌
- Default:
[https://www.googleapis.com/auth/cloud-platform]
serviceAccount
- Type: string
- Dynamic: ❓
- Required: ❌
sql
- Type: string
- Dynamic: ✔️
- Required: ❌
The SQL query to run.
stopAfter
- Type: array
- SubType: string
- Dynamic: ❌
- Required: ❌
List of execution states after which a trigger should be stopped (a.k.a. disabled).
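For example, to disable the trigger after a failed execution, you could list the FAILED state (a sketch; other execution states can be listed as well):
triggers:
  - id: watch
    type: io.kestra.plugin.gcp.bigquery.Trigger
    interval: "PT5M"
    sql: "SELECT * FROM `myproject.mydataset.mytable`"
    fetch: true
    stopAfter:
      - FAILED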
store
- Type: boolean
- Dynamic: ❌
- Required: ❌
- Default:
false
Whether to store the data from the query result into an Ion-serialized data file.
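When the result set is large, storing it avoids passing every row through the execution context; the sketch below stores the result and logs the resulting URI (the downstream Log task is purely illustrative):
id: bigquery-store
namespace: company.team
tasks:
  - id: log_uri
    type: io.kestra.plugin.core.log.Log
    message: "Stored result at {{ trigger.uri }}"
triggers:
  - id: watch
    type: io.kestra.plugin.gcp.bigquery.Trigger
    interval: "PT5M"
    sql: "SELECT * FROM `myproject.mydataset.mytable`"
    store: true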
Outputs
destinationTable
- Type: Query-DestinationTable
- Required: ❌
The destination table (if any) or the temporary table created automatically.
jobId
- Type: string
- Required: ❌
The job ID.
row
- Type: object
- Required: ❌
Map containing the first row of fetched data.
Only populated if the 'fetchOne' parameter is set to true.
rows
- Type: array
- SubType: object
- Required: ❌
List containing the fetched data.
Only populated if the 'fetch' parameter is set to true.
size
- Type: integer
- Required: ❌
The number of rows fetched.
uri
- Type: string
- Required: ❌
- Format:
uri
The URI of the stored result.
Only populated if the 'store' parameter is set to true.
Definitions
io.kestra.plugin.gcp.bigquery.Query-DestinationTable
Properties
dataset
- Type: string
- Dynamic: ❓
- Required: ❓
The dataset of the table
project
- Type: string
- Dynamic: ❓
- Required: ❓
The project of the table
table
- Type: string
- Dynamic: ❓
- Required: ❓
The table name