Class: Google::Cloud::Dataplex::V1::DataDiscoverySpec::BigQueryPublishingConfig

Inherits:
Object
Extended by:
Protobuf::MessageExts::ClassMethods
Includes:
Protobuf::MessageExts
Defined in:
proto_docs/google/cloud/dataplex/v1/data_discovery.rb

Overview

Describes BigQuery publishing configurations.

Defined Under Namespace

Modules: TableType

Instance Attribute Summary

  • #connection ⇒ ::String
  • #location ⇒ ::String
  • #table_type ⇒ ::Google::Cloud::Dataplex::V1::DataDiscoverySpec::BigQueryPublishingConfig::TableType

Instance Attribute Details

#connection::String

Returns Optional. The BigQuery connection used to create BigLake tables. Must be in the form projects/{project_id}/locations/{location_id}/connections/{connection_id}.

Returns:

  • (::String)

    Optional. The BigQuery connection used to create BigLake tables. Must be in the form projects/{project_id}/locations/{location_id}/connections/{connection_id}.



# File 'proto_docs/google/cloud/dataplex/v1/data_discovery.rb', line 68

class BigQueryPublishingConfig
  include ::Google::Protobuf::MessageExts
  extend ::Google::Protobuf::MessageExts::ClassMethods

  # Determines how discovered tables are published.
  module TableType
    # Table type unspecified.
    TABLE_TYPE_UNSPECIFIED = 0

    # Default. Discovered tables are published as BigQuery external tables
    # whose data is accessed using the credentials of the user querying the
    # table.
    EXTERNAL = 1

    # Discovered tables are published as BigLake external tables whose data
    # is accessed using the credentials of the associated BigQuery
    # connection.
    BIGLAKE = 2
  end
end
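
A minimal construction sketch (not part of the generated source): the generated protobuf classes accept keyword arguments for their fields, and enum fields accept symbols. The resource IDs below are placeholders, and the bigquery_publishing_config field name on DataDiscoverySpec is assumed from the proto definition.

require "google/cloud/dataplex/v1"

# Publish discovered tables as BigLake external tables through a BigQuery
# connection. All resource IDs below are placeholders.
publishing_config =
  Google::Cloud::Dataplex::V1::DataDiscoverySpec::BigQueryPublishingConfig.new(
    table_type: :BIGLAKE,
    connection: "projects/my-project/locations/us-central1/connections/my-connection",
    location:   "us-central1"
  )

# The config is typically embedded in a DataDiscoverySpec.
discovery_spec = Google::Cloud::Dataplex::V1::DataDiscoverySpec.new(
  bigquery_publishing_config: publishing_config
)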

#location::String

Returns Optional. The location of the BigQuery dataset to publish BigLake external or non-BigLake external tables to.

  1. If the Cloud Storage bucket is in a multi-region, the BigQuery dataset can be in the same multi-region or in any single region included in that multi-region. The datascan can be created in any single region included in that multi-region.
  2. If the Cloud Storage bucket is in a dual-region, the BigQuery dataset can be in one of the regions included in the dual-region, or in a multi-region that includes the dual-region. The datascan can be created in any single region included in that dual-region.
  3. If the Cloud Storage bucket is in a single region, the BigQuery dataset can be in the same single region or in any multi-region that includes that single region. The datascan will be created in the same single region as the bucket.
  4. If the BigQuery dataset is in a single region, it must be in the same single region as the datascan.

For supported values, refer to https://cloud.google.com/bigquery/docs/locations#supported_locations.

Returns:

  • (::String)

    Optional. The location of the BigQuery dataset to publish BigLake external or non-BigLake external tables to.

    1. If the Cloud Storage bucket is in a multi-region, the BigQuery dataset can be in the same multi-region or in any single region included in that multi-region. The datascan can be created in any single region included in that multi-region.
    2. If the Cloud Storage bucket is in a dual-region, the BigQuery dataset can be in one of the regions included in the dual-region, or in a multi-region that includes the dual-region. The datascan can be created in any single region included in that dual-region.
    3. If the Cloud Storage bucket is in a single region, the BigQuery dataset can be in the same single region or in any multi-region that includes that single region. The datascan will be created in the same single region as the bucket.
    4. If the BigQuery dataset is in a single region, it must be in the same single region as the datascan.

    For supported values, refer to https://cloud.google.com/bigquery/docs/locations#supported_locations.
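
To make rule 1 concrete, here is an illustrative sketch (not from the source), assuming the Cloud Storage bucket lives in the "us" multi-region; the location values are examples only.

# Bucket in the "us" multi-region: the dataset may be published to the same
# multi-region...
config = Google::Cloud::Dataplex::V1::DataDiscoverySpec::BigQueryPublishingConfig.new(
  location: "us"
)

# ...or to any single region contained in that multi-region.
config.location = "us-central1"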




#table_type::Google::Cloud::Dataplex::V1::DataDiscoverySpec::BigQueryPublishingConfig::TableType

Returns Optional. Determines whether to publish discovered tables as BigLake external tables or non-BigLake external tables.

Returns:

  • (::Google::Cloud::Dataplex::V1::DataDiscoverySpec::BigQueryPublishingConfig::TableType)

    Optional. Determines whether to publish discovered tables as BigLake external tables or non-BigLake external tables.

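
A short sketch of the TableType choice (assumed usage, not generated code): enum fields take symbols when building the message and return symbols when read, and the numeric values match the constants in the TableType module shown in the source above.

# EXTERNAL is the default: discovered tables are published as BigQuery external
# tables read with the querying user's credentials.
config = Google::Cloud::Dataplex::V1::DataDiscoverySpec::BigQueryPublishingConfig.new(
  table_type: :EXTERNAL
)

config.table_type # => :EXTERNAL

# The symbols correspond to the enum constants, e.g.:
Google::Cloud::Dataplex::V1::DataDiscoverySpec::BigQueryPublishingConfig::TableType::BIGLAKE # => 2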