Class: Google::Cloud::Dlp::V2::DataProfileAction::Export

Inherits:
Object
Extended by:
Protobuf::MessageExts::ClassMethods
Includes:
Protobuf::MessageExts
Defined in:
proto_docs/google/privacy/dlp/v2/dlp.rb

Overview

If set, the detailed data profiles will be persisted to the location of your choice whenever updated.
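
As a quick orientation, here is a minimal sketch of constructing this action with the classes documented on this page. The export_data field name on DataProfileAction and the attachment to a discovery configuration are assumptions drawn from the wider DLP API, not statements from this page.

require "google/cloud/dlp/v2"

# Leaving profile_table unset lets the service create the default
# sensitive_data_protection_discovery dataset and discovery_profiles table,
# as described under #profile_table below.
export = Google::Cloud::Dlp::V2::DataProfileAction::Export.new

# Wrap the Export in a DataProfileAction (oneof field name `export_data`
# assumed from the DLP proto); the action is typically set on the
# configuration that generates data profiles.
action = Google::Cloud::Dlp::V2::DataProfileAction.new(export_data: export)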

Instance Attribute Summary

  • #profile_table ⇒ ::Google::Cloud::Dlp::V2::BigQueryTable
  • #sample_findings_table ⇒ ::Google::Cloud::Dlp::V2::BigQueryTable

Instance Attribute Details

#profile_table ⇒ ::Google::Cloud::Dlp::V2::BigQueryTable

Returns Store all profiles to BigQuery.

  • The system will create a new dataset and table for you if none are provided. The dataset will be named sensitive_data_protection_discovery and the table will be named discovery_profiles. This table will be placed in the same project as the container project running the scan. After the first profile is generated and the dataset and table are created, the discovery scan configuration will be updated with the dataset and table names.
  • See Analyze data profiles stored in BigQuery.
  • See Sample queries for your BigQuery table.
  • Data is inserted using streaming inserts, so data may remain in the buffer for a period of time after the profile has finished.
  • The Pub/Sub notification is sent before the streaming buffer is guaranteed to be written, so data may not be instantly visible to queries by the time your topic receives the Pub/Sub notification.
  • The best practice is to use the same table for an entire organization so that you can take advantage of the provided Looker reports; a sketch of configuring an organization-wide table follows the source listing below. If you use VPC Service Controls to define security perimeters, then you must use a separate table for each boundary.

Returns:

  • (::Google::Cloud::Dlp::V2::BigQueryTable)


# File 'proto_docs/google/privacy/dlp/v2/dlp.rb', line 4328

class Export
  include ::Google::Protobuf::MessageExts
  extend ::Google::Protobuf::MessageExts::ClassMethods
end
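
The best-practice note above suggests a single organization-wide table. A minimal sketch of supplying that table explicitly is shown here; the project, dataset, and table IDs are hypothetical placeholders.

require "google/cloud/dlp/v2"

# Hypothetical organization-wide destination for all profiles.
profile_table = Google::Cloud::Dlp::V2::BigQueryTable.new(
  project_id: "org-dlp-project",                      # placeholder
  dataset_id: "sensitive_data_protection_discovery",  # placeholder
  table_id:   "discovery_profiles"                    # placeholder
)

export = Google::Cloud::Dlp::V2::DataProfileAction::Export.new(
  profile_table: profile_table
)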

#sample_findings_table ⇒ ::Google::Cloud::Dlp::V2::BigQueryTable

Returns Store sample data profile findings (DataProfileFinding) in an existing table or a new table in an existing dataset. Each regeneration will result in new rows in BigQuery. Data is inserted using streaming inserts, so data may remain in the buffer for a period of time after the profile has finished.

Returns:

  • (::Google::Cloud::Dlp::V2::BigQueryTable)



# File 'proto_docs/google/privacy/dlp/v2/dlp.rb', line 4328

class Export
  include ::Google::Protobuf::MessageExts
  extend ::Google::Protobuf::MessageExts::ClassMethods
end
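
Similarly, a sketch of exporting sample findings to an existing dataset; the project, dataset, and table IDs are placeholders, and as noted above the table may be new or existing as long as the dataset already exists.

require "google/cloud/dlp/v2"

export = Google::Cloud::Dlp::V2::DataProfileAction::Export.new(
  sample_findings_table: Google::Cloud::Dlp::V2::BigQueryTable.new(
    project_id: "org-dlp-project",  # placeholder
    dataset_id: "dlp_findings",     # placeholder: must be an existing dataset
    table_id:   "sample_findings"   # placeholder: new or existing table
  )
)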