---
page_title: "airbyte_connections Data Source - terraform-provider-airbyte"
subcategory: ""
description: |-
  Connections DataSource
---
# airbyte_connections (Data Source)

Connections DataSource
## Example Usage

```terraform
data "airbyte_connections" "my_connections" {
  include_deleted = false
  limit           = 20
  offset          = 0
  tag_ids = [
    "05db8e59-424e-49ea-80ce-1db6a74f3dfc"
  ]
  workspace_ids = [
    "a31bb8f4-e5b5-4dc9-bf1e-872e426f3223"
  ]
}
```

## Schema

### Optional

- `include_deleted` (Boolean) Include deleted connections in the returned results.
- `limit` (Number) Set the limit on the number of Connections returned. The default is 20.
- `offset` (Number) Set the offset to start at when returning Connections. The default is 0.
- `tag_ids` (List of String) The UUIDs of the tags you wish to list connections for. An empty list will retrieve all connections.
- `workspace_ids` (List of String) The UUIDs of the workspaces you wish to list connections for. An empty list will retrieve all allowed workspaces.
### Read-Only

- `data` (Attributes List) (see below for nested schema)
- `next` (String)
- `previous` (String)
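As a sketch of how the read-only results can be consumed (assuming the `my_connections` data source declared in the example above), a `for` expression can turn the `data` list into a map keyed by connection ID:

```terraform
# Hypothetical usage: summarize every returned connection as id => name.
output "connection_summaries" {
  value = {
    for c in data.airbyte_connections.my_connections.data :
    c.connection_id => c.name
  }
}
```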
### Nested Schema for `data`

Read-Only:

- `configurations` (Attributes) A list of configured stream options for a connection. (see below for nested schema)
- `connection_id` (String)
- `created_at` (Number)
- `destination_id` (String)
- `name` (String)
- `namespace_definition` (String) Define the location where the data will be stored in the destination.
- `namespace_format` (String)
- `non_breaking_schema_updates_behavior` (String) Set how Airbyte handles syncs when it detects a non-breaking schema change in the source.
- `prefix` (String)
- `schedule` (Attributes) Schedule for when the connection should run, per the schedule type. (see below for nested schema)
- `source_id` (String)
- `status` (String)
- `status_reason` (String)
- `tags` (Attributes List) (see below for nested schema)
- `workspace_id` (String)
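Attributes such as `status` can be used to filter the results in a `locals` block. This is an illustrative sketch; the exact status strings (here assumed to include `"active"`) come from the Airbyte API rather than this page:

```terraform
# Hypothetical filter: keep only connections whose status is "active"
# (assumed status value, not documented on this page).
locals {
  active_connections = [
    for c in data.airbyte_connections.my_connections.data :
    c if c.status == "active"
  ]
}
```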
### Nested Schema for `data.configurations`

Read-Only:

- `streams` (Attributes Set) (see below for nested schema)
### Nested Schema for `data.configurations.streams`

Read-Only:

- `cursor_field` (List of String) Path to the field that will be used to determine if a record is new or modified since the last sync. This field is REQUIRED if `sync_mode` is `incremental` unless there is a default.
- `destination_object_name` (String) The name of the destination object that this stream will be written to; used for data activation destinations.
- `include_files` (Boolean) Whether to move raw files from the source to the destination during the sync.
- `mappers` (Attributes List) Mappers that should be applied to the stream before writing to the destination. (see below for nested schema)
- `name` (String)
- `namespace` (String) Namespace of the stream.
- `primary_key` (List of List of String) Paths to the fields that will be used as primary key. This field is REQUIRED if `destination_sync_mode` is `*_dedup` unless it is already supplied by the source schema.
- `selected_fields` (Attributes Set) Paths to the fields that will be included in the configured catalog. (see below for nested schema)
- `sync_mode` (String)
### Nested Schema for `data.configurations.streams.mappers`

Read-Only:

- `id` (String)
- `mapper_configuration` (Attributes) The values required to configure the mapper. (see below for nested schema)
- `type` (String)
### Nested Schema for `data.configurations.streams.mappers.mapper_configuration`

Read-Only:

- `encryption` (Attributes) (see below for nested schema)
- `field_filtering` (Attributes) (see below for nested schema)
- `field_renaming` (Attributes) (see below for nested schema)
- `hashing` (Attributes) (see below for nested schema)
- `row_filtering` (Attributes) (see below for nested schema)
### Nested Schema for `data.configurations.streams.mappers.mapper_configuration.encryption`

Read-Only:

- `aes` (Attributes) (see below for nested schema)
- `rsa` (Attributes) (see below for nested schema)
### Nested Schema for `data.configurations.streams.mappers.mapper_configuration.encryption.aes`

Read-Only:

- `algorithm` (String)
- `field_name_suffix` (String)
- `key` (String, Sensitive)
- `mode` (String)
- `padding` (String)
- `target_field` (String)
### Nested Schema for `data.configurations.streams.mappers.mapper_configuration.encryption.rsa`

Read-Only:

- `algorithm` (String)
- `field_name_suffix` (String)
- `public_key` (String)
- `target_field` (String)
### Nested Schema for `data.configurations.streams.mappers.mapper_configuration.field_filtering`

Read-Only:

- `target_field` (String) The name of the field to filter.
### Nested Schema for `data.configurations.streams.mappers.mapper_configuration.field_renaming`

Read-Only:

- `new_field_name` (String) The new name for the field after renaming.
- `original_field_name` (String) The current name of the field to rename.
### Nested Schema for `data.configurations.streams.mappers.mapper_configuration.hashing`

Read-Only:

- `field_name_suffix` (String) The suffix to append to the field name after hashing.
- `method` (String) The hashing algorithm to use.
- `target_field` (String) The name of the field to be hashed.
### Nested Schema for `data.configurations.streams.mappers.mapper_configuration.row_filtering`

Read-Only:

- `conditions` (String) Parsed as JSON.
### Nested Schema for `data.configurations.streams.selected_fields`

Read-Only:

- `field_path` (List of String)
### Nested Schema for `data.schedule`

Read-Only:

- `basic_timing` (String)
- `cron_expression` (String)
- `schedule_type` (String)
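Since `schedule_type` distinguishes how a connection is scheduled, it can drive conditional expressions. A sketch, assuming `"cron"` is one of the possible `schedule_type` values (the enum itself is not listed on this page):

```terraform
# Hypothetical: list the names of connections that run on a cron schedule.
output "cron_scheduled_connections" {
  value = [
    for c in data.airbyte_connections.my_connections.data :
    c.name if c.schedule.schedule_type == "cron"
  ]
}
```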
### Nested Schema for `data.tags`

Read-Only:

- `color` (String)
- `name` (String)
- `tag_id` (String)
- `workspace_id` (String)
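Because each connection carries a list of tags, a nested `for` expression can flatten them into a name-to-tags map. A minimal sketch using the `my_connections` data source from the example above:

```terraform
# Hypothetical usage: map each connection name to the names of its tags.
output "connection_tag_names" {
  value = {
    for c in data.airbyte_connections.my_connections.data :
    c.name => [for t in c.tags : t.name]
  }
}
```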