
A

AbstractResult() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
 
ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Return the ack deadline, in seconds, for the subscription.
ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
ackId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
Id to pass back to Pubsub to acknowledge receipt of this message.
acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Acknowledge messages from the subscription with the given ackIds.
acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
ActionFactory - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
Factory class for creating instances that will handle different functions of DoFns.
ActionFactory() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
 
ActionFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
Factory class for creating instances that will handle each type of record within a change stream query.
ActionFactory() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
 
ACTIVE_PARTITION_READ_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for the active partition reads during the execution of the Connector.
actuateProjectionPushdown(Map<TupleTag<?>, FieldAccessDescriptor>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
add(List<ValueInSingleWindow<T>>, TableDataInsertAllResponse.InsertErrors, TableReference, FailsafeValueInSingleWindow<TableRow, TableRow>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
 
addIncompleteNewPartitions(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
Add NewPartition if it hasn't been updated for 15 minutes.
addIncompleteNewPartitions(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
Capture NewPartition row that cannot merge on its own.
addMissingPartitions(List<Range.ByteStringRange>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
Add all the missingPartitions.
addMissingPartitions(List<Range.ByteStringRange>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
Capture partitions that are not currently being streamed.
addUuids() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
Add Uuids to to-be-published messages, ensuring uniqueness is maintained.
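A minimal, hedged sketch of how addUuids() is commonly composed with PubsubLiteIO.write(); the upstream PCollection and the PublisherOptions value are assumptions for illustration, not part of this index:

    import com.google.cloud.pubsublite.proto.PubSubMessage;
    import org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions;
    import org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO;
    import org.apache.beam.sdk.values.PCollection;

    // Hypothetical helper: `messages` and `options` are assumed to be built elsewhere.
    static void writeWithUuids(PCollection<PubSubMessage> messages, PublisherOptions options) {
      messages
          .apply("AddUuids", PubsubLiteIO.addUuids())               // attach a Uuid to each message
          .apply("WriteToPubsubLite", PubsubLiteIO.write(options)); // publish to Pub/Sub Lite
    }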
AddUuidsTransform - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
A transform to add UUIDs to each message to be written to Pub/Sub Lite.
AddUuidsTransform() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.AddUuidsTransform
 
advance() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
For subscription mode only: track the progression of time according to the Clock passed in.
advance() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
alwaysRetry() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
Always retry all failures.
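As a hedged illustration of where such a policy is usually attached (the table spec is a placeholder and the upstream PCollection of TableRows is assumed):

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy;
    import org.apache.beam.sdk.values.PCollection;

    // Hypothetical helper: retry every failed streaming insert.
    static void writeWithRetries(PCollection<TableRow> rows) {
      rows.apply("WriteWithRetries",
          BigQueryIO.writeTableRows()
              .to("my-project:my_dataset.my_table")                                    // placeholder
              .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
              .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)  // table assumed to exist
              .withFailedInsertRetryPolicy(InsertRetryPolicy.alwaysRetry()));
    }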
API_METRIC_LABEL - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
 
appendRows(long, ProtoRows) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
Append rows to a Storage API write stream at the given offset.
appendRowsRowStatusCounter(BigQuerySinkMetrics.RowStatus, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
apply(Row) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation.ToBigtableRowFn
 
apply(Row) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableRowToBeamRow
 
apply(Row) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.GetMutationsFromBeamRow
 
apply(HealthcareIOError<T>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
 
apply(ValueInSingleWindow<byte[]>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.ParsePubsubMessageProtoAsPayloadFromWindowedValue
 
apply(byte[]) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly
 
apply(PubsubMessage) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePayloadAsPubsubMessageProto
 
apply(byte[]) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePubsubMessageProtoAsPayload
 
apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
 
apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
 
applyRowMutations() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
Write RowMutation messages to BigQuery.
asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
Returns the string representation of this subscription as a path used in the Cloud Pub/Sub API.
asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
Returns the string representation of this topic as a path used in the Cloud Pub/Sub API.
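For context, the "projects/&lt;project&gt;/topics/&lt;topic&gt;" path form returned here is the same form PubsubIO accepts when reading from a topic; a hedged sketch with placeholder names:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
    // Read UTF-8 payloads from a topic given as a full Cloud Pub/Sub path.
    p.apply("ReadFromTopic",
        PubsubIO.readStrings().fromTopic("projects/my-project/topics/my-topic"));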
assertSubscriptionEventuallyCreated(String, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Block until a subscription is created for this test topic in the specified project.
assertThatAllRows(Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
assertThatTopicEventuallyReceives(Matcher<PubsubMessage>...) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Repeatedly pull messages from TestPubsub.subscriptionPath() until receiving one for each matcher (or timeout is reached), then assert that the received messages match the expectations.
asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
Deprecated.
the v1beta1 API for Cloud Pub/Sub is deprecated.
asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
Deprecated.
the v1beta1 API for Cloud Pub/Sub is deprecated.
asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
Deprecated.
the v1beta2 API for Cloud Pub/Sub is deprecated.
asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
Deprecated.
the v1beta2 API for Cloud Pub/Sub is deprecated.
attached() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
ATTRIBUTE_ARRAY_ENTRY_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
 
ATTRIBUTE_ARRAY_FIELD_TYPE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
 
ATTRIBUTE_MAP_FIELD_TYPE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
 
AuthenticatedRetryInitializer(GoogleCredentials) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.AuthenticatedRetryInitializer
 
AvroGenericRecordToStorageApiProto - Class in org.apache.beam.sdk.io.gcp.bigquery
Utility methods for converting Avro GenericRecord objects to dynamic protocol messages, for use with the Storage Write API.
AvroGenericRecordToStorageApiProto() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.AvroGenericRecordToStorageApiProto
 
AvroWriteRequest<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
 
AvroWriteRequest(T, Schema) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
 

B

batchGetDocuments() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
Factory method to create a new type-safe builder for BatchGetDocumentsRequest operations.
batchWrite() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Write
Factory method to create a new type-safe builder for Write operations.
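A hedged sketch of how the batchWrite() builder is typically applied; the upstream PCollection of com.google.firestore.v1.Write requests is assumed to have been built earlier in the pipeline:

    import com.google.firestore.v1.Write;
    import org.apache.beam.sdk.io.gcp.firestore.FirestoreIO;
    import org.apache.beam.sdk.values.PCollection;

    // Hypothetical helper: issue the accumulated write requests in batches.
    static void batchWriteToFirestore(PCollection<Write> writes) {
      writes.apply("BatchWriteToFirestore",
          FirestoreIO.v1().write().batchWrite().build());
    }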
BeamRowToBigtableMutation - Class in org.apache.beam.sdk.io.gcp.bigtable
Bigtable reference: .
BeamRowToBigtableMutation(Map<String, Set<String>>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation
 
BeamRowToBigtableMutation.ToBigtableRowFn - Class in org.apache.beam.sdk.io.gcp.bigtable
 
BeamRowToStorageApiProto - Class in org.apache.beam.sdk.io.gcp.bigquery
Utility methods for converting Beam Row objects to dynamic protocol messages, for use with the Storage Write API.
BeamRowToStorageApiProto() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BeamRowToStorageApiProto
 
BIG_QUERY_INSERT_ERROR_ERROR_CONTAINER - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
 
BIGQUERY_EARLY_ROLLOUT_REGION - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
 
BIGQUERY_JOB_TEMPLATE - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
Template for BigQuery jobs created by BigQueryIO.
BigqueryClient - Class in org.apache.beam.sdk.io.gcp.testing
A wrapper class for making BigQuery API calls.
BigqueryClient(String) - Constructor for class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
BigQueryCoderProviderRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
A CoderProviderRegistrar for standard types used with BigQueryIO.
BigQueryCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
 
BigQueryDirectReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
 
BigQueryDirectReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
An implementation of TypedSchemaTransformProvider for BigQuery Storage Read API jobs configured via BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.
BigQueryDirectReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
 
BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
A SchemaTransform for BigQuery Storage Read API, configured with BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration and instantiated by BigQueryDirectReadSchemaTransformProvider.
BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
Configuration for reading from BigQuery with Storage Read API.
BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
 
BigQueryDlqProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryDlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryDlqProvider
 
BigQueryExportReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery
Configuration for reading from BigQuery.
BigQueryExportReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
 
BigQueryExportReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryExportReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
An implementation of TypedSchemaTransformProvider for BigQuery read jobs configured using BigQueryExportReadSchemaTransformConfiguration.
BigQueryExportReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
 
BigQueryExportReadSchemaTransformProvider.BigQueryExportSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery
An implementation of SchemaTransform for BigQuery read jobs configured using BigQueryExportReadSchemaTransformConfiguration.
BigQueryFileLoadsWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery
Configuration for writing to BigQuery.
BigQueryFileLoadsWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformConfiguration
 
BigQueryFileLoadsWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryFileLoadsWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
An implementation of TypedSchemaTransformProvider for BigQuery write jobs configured using BigQueryFileLoadsWriteSchemaTransformConfiguration.
BigQueryFileLoadsWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformProvider
 
BigQueryFileLoadsWriteSchemaTransformProvider.BigQueryWriteSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery
A SchemaTransform that performs BigQueryIO.Writes based on a BigQueryFileLoadsWriteSchemaTransformConfiguration.
BigQueryHelpers - Class in org.apache.beam.sdk.io.gcp.bigquery
A set of helper functions and classes used by BigQueryIO.
BigQueryHelpers() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
BigQueryInsertError - Class in org.apache.beam.sdk.io.gcp.bigquery
Model definition for BigQueryInsertError.
BigQueryInsertError(TableRow, TableDataInsertAllResponse.InsertErrors, TableReference) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
BigQueryInsertErrorCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
A Coder that encodes BigQueryInsertError objects.
BigQueryInsertErrorCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
BigQueryIO - Class in org.apache.beam.sdk.io.gcp.bigquery
PTransforms for reading and writing BigQuery tables.
BigQueryIO.Read - Class in org.apache.beam.sdk.io.gcp.bigquery
Implementation of BigQueryIO.read().
BigQueryIO.TypedRead<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
BigQueryIO.TypedRead.Method - Enum in org.apache.beam.sdk.io.gcp.bigquery
Determines the method used to read data from BigQuery.
BigQueryIO.TypedRead.QueryPriority - Enum in org.apache.beam.sdk.io.gcp.bigquery
An enumeration type for the priority of a query.
BigQueryIO.Write<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
Implementation of BigQueryIO.write().
BigQueryIO.Write.CreateDisposition - Enum in org.apache.beam.sdk.io.gcp.bigquery
An enumeration type for the BigQuery create disposition strings.
BigQueryIO.Write.Method - Enum in org.apache.beam.sdk.io.gcp.bigquery
Determines the method used to insert data in BigQuery.
BigQueryIO.Write.SchemaUpdateOption - Enum in org.apache.beam.sdk.io.gcp.bigquery
An enumeration type for the BigQuery schema update options strings.
BigQueryIO.Write.WriteDisposition - Enum in org.apache.beam.sdk.io.gcp.bigquery
An enumeration type for the BigQuery write disposition strings.
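A compact, hedged sketch of the read and write entry points listed above; the project, dataset, and table names are placeholders:

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

    // Read all rows from a source table.
    PCollection<TableRow> rows =
        p.apply("ReadRows", BigQueryIO.readTableRows().from("my-project:my_dataset.src_table"));

    // Append them to a destination table that is assumed to already exist.
    rows.apply("WriteRows",
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.dst_table")
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

    p.run().waitUntilFinish();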
BigQueryIOTranslation - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryIOTranslation() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation
 
BigQueryIOTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryIOTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigqueryMatcher - Class in org.apache.beam.sdk.io.gcp.testing
A matcher to verify data in BigQuery by processing the given query and comparing the result with the content's checksum.
BigqueryMatcher.TableAndQuery - Class in org.apache.beam.sdk.io.gcp.testing
 
BigQueryOptions - Interface in org.apache.beam.sdk.io.gcp.bigquery
Properties needed when using Google BigQuery with the Apache Beam SDK.
BigQuerySchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
An implementation of SchemaIOProvider for reading and writing to BigQuery with BigQueryIO.
BigQuerySchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
 
BigQuerySchemaRetrievalException - Exception in org.apache.beam.sdk.io.gcp.bigquery
Exception to signal that BigQuery schema retrieval failed.
BigQueryServices - Interface in org.apache.beam.sdk.io.gcp.bigquery
An interface for real, mock, or fake implementations of Cloud BigQuery services.
BigQueryServices.BigQueryServerStream<T> - Interface in org.apache.beam.sdk.io.gcp.bigquery
Container for reading data from streaming endpoints.
BigQueryServices.DatasetService - Interface in org.apache.beam.sdk.io.gcp.bigquery
An interface to get, create and delete Cloud BigQuery datasets and tables.
BigQueryServices.DatasetService.TableMetadataView - Enum in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryServices.JobService - Interface in org.apache.beam.sdk.io.gcp.bigquery
An interface for the Cloud BigQuery load service.
BigQueryServices.StorageClient - Interface in org.apache.beam.sdk.io.gcp.bigquery
An interface representing a client object for making calls to the BigQuery Storage API.
BigQueryServices.StreamAppendClient - Interface in org.apache.beam.sdk.io.gcp.bigquery
An interface for appending records to a Storage API write stream.
BigQueryServices.WriteStreamService - Interface in org.apache.beam.sdk.io.gcp.bigquery
An interface to get, create and flush Cloud BigQuery Storage API write streams.
BigQueryServicesImpl - Class in org.apache.beam.sdk.io.gcp.bigquery
An implementation of BigQueryServices that actually communicates with the cloud BigQuery service.
BigQueryServicesImpl() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
 
BigQueryServicesImpl.DatasetServiceImpl - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryServicesImpl.WriteStreamServiceImpl - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQuerySinkMetrics - Class in org.apache.beam.sdk.io.gcp.bigquery
Helper class to create per-worker metrics for BigQuery sink stages.
BigQuerySinkMetrics() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
BigQuerySinkMetrics.ParsedMetricName - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryStorageApiInsertError - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryStorageApiInsertError(TableRow) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
 
BigQueryStorageApiInsertError(TableRow, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
 
BigQueryStorageApiInsertErrorCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
 
BigQueryStorageApiInsertErrorCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
 
BigQueryStorageTableSource<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
A Source representing reading from a table.
BigQueryStorageWriteApiSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration
 
BigQueryStorageWriteApiSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
An implementation of TypedSchemaTransformProvider for BigQuery Storage Write API jobs configured via BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.
BigQueryStorageWriteApiSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
Configuration for writing to BigQuery with Storage Write API.
BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.ErrorHandling - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
 
BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.ErrorHandling.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
 
BigQueryUtils - Class in org.apache.beam.sdk.io.gcp.bigquery
Utility methods for BigQuery related operations.
BigQueryUtils() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
BigQueryUtils.ConversionOptions - Class in org.apache.beam.sdk.io.gcp.bigquery
Options for how to convert BigQuery data to Beam data.
BigQueryUtils.ConversionOptions.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
BigQueryUtils.ConversionOptions.TruncateTimestamps - Enum in org.apache.beam.sdk.io.gcp.bigquery
Controls whether to truncate timestamps to millisecond precision lossily, or to crash when truncation would be required.
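A hedged sketch of selecting the truncation behaviour when building ConversionOptions; the setter and enum value names are assumed from the AutoValue-style builder and the description above, so verify against the actual class:

    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.TruncateTimestamps;

    // TRUNCATE lossily drops sub-millisecond precision; the alternative value rejects such timestamps.
    ConversionOptions options =
        BigQueryUtils.ConversionOptions.builder()
            .setTruncateTimestamps(TruncateTimestamps.TRUNCATE)  // assumed setter name
            .build();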
BigQueryUtils.SchemaConversionOptions - Class in org.apache.beam.sdk.io.gcp.bigquery
Options for how to convert BigQuery schemas to Beam schemas.
BigQueryUtils.SchemaConversionOptions.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
BigtableChangeStreamAccessor - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
This is probably a temporary solution to what is a bigger migration from cloud-bigtable-client-core to java-bigtable.
BigtableChangeStreamTestOptions - Interface in org.apache.beam.sdk.io.gcp.bigtable.changestreams
 
BigtableClientOverride - Interface in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
Override the configuration of Cloud Bigtable data and admin client.
BigtableConfig - Class in org.apache.beam.sdk.io.gcp.bigtable
Configuration for a Cloud Bigtable client.
BigtableConfig() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
 
BigtableIO - Class in org.apache.beam.sdk.io.gcp.bigtable
Transforms for reading from and writing to Google Cloud Bigtable.
BigtableIO.ExistingPipelineOptions - Enum in org.apache.beam.sdk.io.gcp.bigtable
Options that determine what to do if a change stream name is being reused and metadata for the same change stream name already exists.
BigtableIO.Read - Class in org.apache.beam.sdk.io.gcp.bigtable
A PTransform that reads from Google Cloud Bigtable.
BigtableIO.ReadChangeStream - Class in org.apache.beam.sdk.io.gcp.bigtable
 
BigtableIO.Write - Class in org.apache.beam.sdk.io.gcp.bigtable
A PTransform that writes to Google Cloud Bigtable.
BigtableIO.WriteWithResults - Class in org.apache.beam.sdk.io.gcp.bigtable
A PTransform that writes to Google Cloud Bigtable and emits a BigtableWriteResult for each batch written.
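A hedged sketch of a basic BigtableIO.Read; project, instance, and table ids are placeholders (writes expect KV&lt;ByteString, Iterable&lt;Mutation&gt;&gt; elements and are omitted here):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigtable.BigtableIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
    // Emits one com.google.bigtable.v2.Row per Bigtable row in the table.
    p.apply("ReadFromBigtable",
        BigtableIO.read()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withTableId("my-table"));
    p.run().waitUntilFinish();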
BigtableReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
 
BigtableReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigtable
An implementation of TypedSchemaTransformProvider for Bigtable Read jobs configured via BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.
BigtableReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
 
BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigtable
Configuration for reading from Bigtable.
BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigtable
BigtableReadSchemaTransformProvider.BigtableRowToBeamRow - Class in org.apache.beam.sdk.io.gcp.bigtable
 
BigtableRowToBeamRow() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableRowToBeamRow
 
BigtableRowToBeamRow - Class in org.apache.beam.sdk.io.gcp.bigtable
Bigtable reference: .
BigtableRowToBeamRow(Schema) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRow
 
BigtableRowToBeamRowFlat - Class in org.apache.beam.sdk.io.gcp.bigtable
Bigtable reference: .
BigtableRowToBeamRowFlat(Schema, Map<String, Set<String>>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRowFlat
 
BigtableUtils - Class in org.apache.beam.sdk.io.gcp.testing
 
BigtableUtils() - Constructor for class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
 
BigtableWriteResult - Class in org.apache.beam.sdk.io.gcp.bigtable
The result of writing a batch of rows to Bigtable.
BigtableWriteResult() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
 
BigtableWriteResultCoder - Class in org.apache.beam.sdk.io.gcp.bigtable
A coder for BigtableWriteResult.
BigtableWriteResultCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
 
BigtableWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
 
BigtableWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigtable
An implementation of TypedSchemaTransformProvider for Bigtable Write jobs configured via BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.
BigtableWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
 
BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigtable
Configuration for writing to Bigtable.
BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigtable
BigtableWriteSchemaTransformProvider.GetMutationsFromBeamRow - Class in org.apache.beam.sdk.io.gcp.bigtable
 
BlockingCommitterImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
booleanToByteArray(boolean) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
 
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.ErrorHandling.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Create a new instance of RpcQosOptions from the current builder state.
build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Builds a PartitionMetadata from the given fields.
build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
 
build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformConfiguration
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.ErrorHandling
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.ErrorHandling.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
 
Builder(JodaClock, FirestoreStatefulComponentFactory, RpcQosOptions, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments.Builder
 
Builder(JodaClock, FirestoreStatefulComponentFactory, RpcQosOptions, boolean, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig.Builder
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
 
builder(Dialect) - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
 
builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
 
Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
 
buildExternal(ExternalRead.Configuration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.ReadBuilder
 
buildExternal(ExternalWrite.Configuration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.WriteBuilder
 
buildExternal(SpannerTransformRegistrar.ReadBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder
 
buildReader() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
buildSchemaWithAttributes(Schema, List<String>, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
Builds a new Schema by adding additional optional attributes and a map field to the provided schema.
buildWriter() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
BytesThroughputEstimator<T> - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
An estimator that provides an estimate of the byte throughput of the output elements.
BytesThroughputEstimator(SizeEstimator<T>, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
 
BytesThroughputEstimator(SizeEstimator<T>, int, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
 
BytesThroughputEstimator<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
An estimator that provides an estimate of the throughput of the output elements.
BytesThroughputEstimator(int, SizeEstimator<T>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
 
byteString(byte[]) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
byteString(byte[]) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
 
ByteStringRangeHelper - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
Helper functions to evaluate the completeness of a collection of ByteStringRanges.
ByteStringRangeHelper() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
 
byteStringUtf8(String) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
byteStringUtf8(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
 

C

cancel() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.BigQueryServerStream
Cancels the stream, releasing any client- and server-side resources.
cancel() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
 
CELL_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
 
CF_CONTINUATION_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_INITIAL_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_LOCK - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_MISSING_PARTITIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_PARENT_LOW_WATERMARKS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_PARENT_PARTITIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_SHOULD_DELETE - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_VERSION - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CF_WATERMARK - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
CHANGE_SQN_COLUMN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
 
CHANGE_STREAM_MUTATION_GC_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of ChangeStreamMutations that are initiated by garbage collection (not user initiated) identified during the execution of the Connector.
CHANGE_STREAM_MUTATION_USER_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of ChangeStreamMutations that are initiated by users (not garbage collection) identified during the execution of the Connector.
CHANGE_TYPE_COLUMN - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
 
changeStreamAction(ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
Creates and returns a singleton instance of an action class for processing individual ChangeStreamMutation in ReadChangeStreamPartitionDoFn.
ChangeStreamAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
This class is responsible for processing individual ChangeStreamRecord.
ChangeStreamAction(ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ChangeStreamAction
Constructs ChangeStreamAction to process individual ChangeStreamRecord.
ChangeStreamContinuationTokenHelper - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
 
ChangeStreamContinuationTokenHelper() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamContinuationTokenHelper
 
ChangeStreamDao - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
Data access object to list and read stream partitions of a table.
ChangeStreamDao(BigtableDataClient, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.ChangeStreamDao
 
ChangeStreamDao - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Responsible for making change stream queries for a given partition.
ChangeStreamMetrics - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
Class to aggregate metrics-related functionality.
ChangeStreamMetrics() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
 
ChangeStreamMetrics - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
Class to aggregate metrics-related functionality.
ChangeStreamMetrics() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Constructs a ChangeStreamMetrics instance with the following metrics enabled by default.
ChangeStreamMetrics(Set<MetricName>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Constructs a ChangeStreamMetrics instance with the given metrics enabled.
changeStreamQuery(String, Timestamp, Timestamp, long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamDao
Performs a change stream query.
ChangeStreamRecord - Interface in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Represents a Spanner Change Stream Record.
ChangeStreamRecordMapper - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
This class is responsible for transforming a Struct to a List of ChangeStreamRecord models.
changeStreamRecordMapper() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.MapperFactory
Creates and returns a singleton instance of a mapper class capable of transforming a Struct into a List of ChangeStreamRecord subclasses.
ChangeStreamRecordMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Holds internal execution metrics / metadata for the processed ChangeStreamRecord.
ChangeStreamRecordMetadata.Builder - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
 
ChangeStreamResultSet - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Decorator class over a ResultSet that provides telemetry for the streamed records.
ChangeStreamResultSetMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Represents telemetry metadata gathered during the consumption of a change stream query.
ChangeStreamsConstants - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
Single place for defining the constants used in the Spanner.readChangeStreams() connector.
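For orientation, a hedged sketch of the Spanner change streams connector that these constants, actions, and DAOs support; the entry point is SpannerIO.readChangeStream() (not part of this index section), and all ids, the stream name, and the start time are placeholders:

    import com.google.cloud.Timestamp;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.spanner.SpannerConfig;
    import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

    SpannerConfig config =
        SpannerConfig.create()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withDatabaseId("my-database");

    // Emits DataChangeRecords from the named change stream, starting from now.
    p.apply("ReadChangeStream",
        SpannerIO.readChangeStream()
            .withSpannerConfig(config)
            .withChangeStreamName("my_change_stream")
            .withInclusiveStartAt(Timestamp.now()));

    p.run().waitUntilFinish();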
ChangeStreamsConstants() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
 
checkDone() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
This is to signal to the runner that this restriction has completed.
checkDone() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
Checks if the restriction has been processed successfully.
checkIfAnySubscriptionExists(String, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
CheckpointMarkImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
ChildPartition - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
A child partition represents a new partition that should be queried.
ChildPartition(String, HashSet<String>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
Constructs a child partition, which will have its own token and the parents that it originated from.
ChildPartition(String, String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
Constructs a child partition, which will have its own token and the parent that it originated from.
ChildPartitionsRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Represents a ChildPartitionsRecord.
ChildPartitionsRecord(Timestamp, String, List<ChildPartition>, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
Constructs a child partitions record containing one or more child partitions.
childPartitionsRecordAction(PartitionMetadataDao, ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
Creates and returns a singleton instance of an action class capable of processing ChildPartitionsRecords.
ChildPartitionsRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
This class is part of the process for ReadChangeStreamPartitionDoFn SDF.
CivilTimeEncoder - Class in org.apache.beam.sdk.io.gcp.bigquery
Encoder for TIME and DATETIME values, according to civil_time encoding.
cleanUpPrefix() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
Delete all the metadata rows starting with the change stream name prefix, except for the detect-new-partition row, because it signals the existence of a pipeline with the change stream name.
CleanUpReadChangeStreamDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
 
CleanUpReadChangeStreamDoFn(DaoFactory) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.CleanUpReadChangeStreamDoFn
 
close() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
Close the client object.
close() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
close() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
close() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
 
close() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
Gracefully close the underlying netty channel.
close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.BlockingCommitterImpl
 
close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactoryImpl
 
close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
close() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
Closes the current change stream ResultSet.
close() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
close() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
close() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
CLOSESTREAM_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of CloseStream messages identified during the execution of the Connector.
CloudPubsubTransforms - Class in org.apache.beam.sdk.io.gcp.pubsublite
A class providing transforms between Cloud Pub/Sub and Pub/Sub Lite message types.
CoderSizeEstimator<T> - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
This class is used to estimate the size in bytes of a given element.
CoderSizeEstimator(Coder<T>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.CoderSizeEstimator
 
Column() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.Column
 
COLUMN_CREATED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the timestamp at which the partition row was first created.
COLUMN_END_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the timestamp to end the change stream query of the partition.
COLUMN_FAMILIES - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
COLUMN_FINISHED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the timestamp at which the partition was marked as finished by the ReadChangeStreamPartitionDoFn SDF.
COLUMN_HEARTBEAT_MILLIS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the change stream query heartbeat interval in millis.
COLUMN_PARENT_TOKENS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for parent partition tokens.
COLUMN_PARTITION_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the partition token.
COLUMN_RUNNING_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the timestamp at which the partition was marked as running by the ReadChangeStreamPartitionDoFn SDF.
COLUMN_SCHEDULED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the timestamp at which the partition was scheduled by the DetectNewPartitionsDoFn SDF.
COLUMN_START_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the timestamp to start the change stream query of the partition.
COLUMN_STATE - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the state that the partition is currently in.
COLUMN_WATERMARK - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Metadata table column name for the current watermark of the partition.
COLUMNS - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
 
COLUMNS_MAPPING - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
ColumnType - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Defines a column type from a Cloud Spanner table with the following information: column name, column type, a flag indicating whether the column is a primary key, and the column's position in the table.
ColumnType(String, TypeCode, boolean, long) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
 
commitOffset(Offset) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.BlockingCommitterImpl
 
commitWriteStreams(String, Iterable<String>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
Commit write streams of type PENDING.
commitWriteStreams(String, Iterable<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
commitWriteStreams(String, Iterable<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
 
Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
Returns the expected class of the configuration.
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformProvider
Returns the expected class of the configuration.
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
configurationClass() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
 
configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
Returns the expected schema of the configuration object.
configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
Returns the expected schema of the configuration object.
configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
Returns the expected schema of the configuration object.
Context(TableDataInsertAllResponse.InsertErrors) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy.Context
 
ConversionOptions() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
 
convertAvroFormat(Schema.FieldType, Object, BigQueryUtils.ConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Tries to convert an Avro decoded value to a Beam field value based on the target type of the Beam field.
convertGenericRecordToTableRow(GenericRecord, TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
convertNewPartitionRowKeyToPartition(ByteString) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Convert a new partition row key back to its partition, in order to process metadata read from Bigtable.
convertNumbers(TableRow) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
convertPartitionToNewPartitionRowKey(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Convert partition to a New Partition row key to query for partitions ready to be streamed as the result of splits and merges.
convertPartitionToStreamPartitionRowKey(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Convert partition to a Stream Partition row key to query for metadata of partitions that are currently being streamed.
convertStreamPartitionRowKeyToPartition(ByteString) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Convert a stream partition row key back to its partition, in order to process metadata read from Bigtable.
countPartitionsCreatedAfter(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Counts all partitions with a PartitionMetadataAdminDao.COLUMN_CREATED_AT less than the given timestamp.
coverSameKeySpace(List<Range.ByteStringRange>, Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Returns true if parentPartitions form a proper superset of childPartition.
create(String, ImmutableMap<String, String>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.ParsedMetricName
 
create(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.ParsedMetricName
 
create(ValueProvider<TableReference>, DataFormat, ValueProvider<List<String>>, ValueProvider<String>, SerializableFunction<SchemaAndRecord, T>, Coder<T>, BigQueryServices, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
 
create(ValueProvider<TableReference>, ValueProvider<List<String>>, ValueProvider<String>, SerializableFunction<SchemaAndRecord, T>, Coder<T>, BigQueryServices) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
 
create(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
Creates an instance of this rule.
create(long) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
 
create(Schema, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.EntityToRow
Create a PTransform instance.
create(String, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
Create a PTransform instance.
create() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Creates an instance of this rule using options provided by TestPipeline.testingPipelineOptions().
create() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
Creates an instance of this rule.
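The create() factories above return JUnit rules that provision and tear down temporary GCP resources around an integration test. A minimal sketch, assuming a JUnit 4 test class; the class name and schema fields are illustrative:

import org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery;
import org.apache.beam.sdk.io.gcp.pubsub.TestPubsub;
import org.apache.beam.sdk.schemas.Schema;
import org.junit.Rule;

public class GcpIoIntegrationTest {
  // Illustrative schema for the temporary BigQuery table managed by the rule.
  private static final Schema SCHEMA =
      Schema.builder().addStringField("name").addInt64Field("count").build();

  // Each rule creates a temporary resource (table, topic) and cleans it up after the test.
  @Rule public transient TestBigQuery bigQuery = TestBigQuery.create(SCHEMA);
  @Rule public transient TestPubsub pubsub = TestPubsub.create();
}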
create(SubscriptionPartition) - Method in interface org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactory
 
create(SubscriptionPartition) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactoryImpl
 
create(SpannerConfig, String, String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.MetadataSpannerConfigFactory
Generates a SpannerConfig that can be used to access the change stream metadata database by copying only the necessary fields from the given primary database SpannerConfig and setting the instance ID and database ID to the supplied metadata values.
create(Mutation, Mutation...) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
Creates a new group.
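A MutationGroup ties a primary Spanner Mutation to companion mutations so they can be submitted together (for example via SpannerIO.write().grouped()). A minimal sketch; the table and column names are illustrative:

import com.google.cloud.spanner.Mutation;
import org.apache.beam.sdk.io.gcp.spanner.MutationGroup;

Mutation singer =
    Mutation.newInsertBuilder("Singers").set("SingerId").to(1L).set("Name").to("Alice").build();
Mutation album =
    Mutation.newInsertBuilder("Albums").set("SingerId").to(1L).set("AlbumId").to(10L).build();

// The first mutation is the group's primary; the remaining mutations are grouped with it.
MutationGroup group = MutationGroup.create(singer, album);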
create(Mutation, Iterable<Mutation>) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
create(BatchTransactionId) - Static method in class org.apache.beam.sdk.io.gcp.spanner.Transaction
 
create(String, String, String, Boolean) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
 
createBigQueryClientCustomErrors() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
 
createDataset(String, String, String, String, Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Create a Dataset with the given location, description and default expiration time for tables in the dataset (if null, tables don't expire).
createDataset(String, String, String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
Create a Dataset with the given location, description and default expiration time for tables in the dataset (if null, tables don't expire).
createDataset(String, String, String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
createDicomStore(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Create a DicomStore.
createDicomStore(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Create a DicomStore with a PubSub listener.
createDicomStore(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
createDicomStore(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
createFactoryForCreateSubscription() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
createFactoryForGetSchema(PubsubClient.TopicPath, PubsubClient.SchemaPath, Schema) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
createFactoryForPublish(PubsubClient.TopicPath, Iterable<PubsubClient.OutgoingMessage>, Iterable<PubsubClient.OutgoingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
Return a factory for testing publishers.
createFactoryForPull(Clock, PubsubClient.SubscriptionPath, int, Iterable<PubsubClient.IncomingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
Return a factory for testing subscribers.
createFactoryForPullAndPublish(PubsubClient.SubscriptionPath, PubsubClient.TopicPath, Clock, int, Iterable<PubsubClient.IncomingMessage>, Iterable<PubsubClient.OutgoingMessage>, Iterable<PubsubClient.OutgoingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
Returns a factory for a test that is expected to both publish and pull messages over the course of the test.
createFhirStore(String, String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Create FHIR Store with a PubSub topic listener.
createFhirStore(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Create FHIR Store.
createFhirStore(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
createFhirStore(String, String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
createHL7v2Message(String, Message) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Creates an HL7v2 message.
createHL7v2Message(String, Message) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
createHL7v2Store(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Create an HL7v2 store.
createHL7v2Store(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
createMetadataTable() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
Create the metadata table if it does not exist yet.
createNewDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
Creates a new dataset.
createNewDataset(String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
Creates a new dataset with defaultTableExpirationMs.
createNewDataset(String, String, Long, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
Creates a new dataset with defaultTableExpirationMs and in a specified location (GCP region).
createNewTable(String, String, Table) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
createOrUpdateReadChangeStreamMetadataTable(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
Utility method to create or update Read Change Stream metadata table.
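A minimal sketch of calling this utility before launching a change stream pipeline; the identifiers are illustrative, and the argument order (project ID, instance ID, metadata table name) is an assumption here:

import org.apache.beam.sdk.io.gcp.bigtable.BigtableIO;

public class MetadataTableSetup {
  public static void main(String[] args) throws Exception {
    // Ensure the Read Change Stream metadata table exists (or is up to date) before the pipeline runs.
    BigtableIO.createOrUpdateReadChangeStreamMetadataTable(
        "my-project", "my-instance", "change-stream-metadata");
  }
}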
createPartitionMetadataTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Creates the metadata table in the instance and database given by the configuration, using the table name specified in the constructor.
createQuery(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
 
createQueryUsingStandardSql(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
 
createRandomSubscription(PubsubClient.ProjectPath, PubsubClient.TopicPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Create a random subscription for topic.
createReader(PipelineOptions, CheckpointMarkImpl) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
 
createReadSession(CreateReadSessionRequest) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
Create a new read session against an existing table.
createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Create Schema from Schema definition content.
createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
Create Schema from Schema definition content.
createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
Create Schema from Schema definition content.
createSchema(PubsubClient.SchemaPath, String, Schema.Type) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Create subscription to topic.
createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
createTable(Table) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Creates the specified table if it does not exist.
createTable(Table) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
Creates the specified table if it does not exist.
createTable(Table) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
CreateTableHelpers - Class in org.apache.beam.sdk.io.gcp.bigquery
 
CreateTableHelpers() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers
 
CreateTables<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
Creates any tables needed before performing streaming writes to the tables.
CreateTables(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
The list of tables created so far, so we don't try the creation each time.
createTest(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
 
createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Create topic.
createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Create TopicPath with PubsubClient.SchemaPath.
createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
createTopic(PubsubClient.TopicPath, PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
createTransaction() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
Returns a transform that creates a batch transaction.
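A minimal sketch of sharing one batch transaction across Spanner reads; the identifiers are illustrative, and it assumes the transform yields a PCollectionView<Transaction> that SpannerIO.read() accepts via withTransaction(...):

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.spanner.SpannerConfig;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
import org.apache.beam.sdk.io.gcp.spanner.Transaction;
import org.apache.beam.sdk.values.PCollectionView;

Pipeline pipeline = Pipeline.create();
SpannerConfig config =
    SpannerConfig.create()
        .withProjectId("my-project")
        .withInstanceId("my-instance")
        .withDatabaseId("my-database");

// One batch transaction shared as a side input, so every read observes the same snapshot.
PCollectionView<Transaction> tx =
    pipeline.apply(SpannerIO.createTransaction().withSpannerConfig(config));

pipeline.apply(
    SpannerIO.read()
        .withSpannerConfig(config)
        .withTable("Singers")
        .withColumns("SingerId", "Name")
        .withTransaction(tx));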
CreateTransaction() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
createWriteStream(String, WriteStream.Type) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
Create a Write Stream for use with the Storage Write API.
createWriteStream(String, WriteStream.Type) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
createWriteStream(String, WriteStream.Type) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
CrossLanguageConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
 
CURRENT_METADATA_TABLE_VERSION - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
currentRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
Returns the streamProgress that was successfully claimed.
currentRestriction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 

D

DaoFactory - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
 
DaoFactory(BigtableConfig, BigtableConfig, String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
DaoFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Factory class to create data access objects to perform change stream queries and access the metadata tables.
DaoFactory(SpannerConfig, String, SpannerConfig, String, Options.RpcPriority, String, Dialect, Dialect) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
Constructs a DaoFactory with the configuration to be used for the underlying instances.
DATA_RECORD_COMMITTED_TO_EMITTED_0MS_TO_1000MS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for record latencies [0, 1000) ms during the execution of the Connector.
DATA_RECORD_COMMITTED_TO_EMITTED_1000MS_TO_3000MS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for record latencies [1000, 3000) ms during the execution of the Connector.
DATA_RECORD_COMMITTED_TO_EMITTED_3000MS_TO_INF_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for record latencies equal or above 3000ms during the execution of the Connector.
DATA_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for the total number of data records identified during the execution of the Connector.
DataChangeRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
A data change record encodes modifications to Cloud Spanner rows.
DataChangeRecord(String, Timestamp, String, boolean, String, String, List<ColumnType>, List<Mod>, ModType, ValueCaptureType, long, long, String, boolean, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
Constructs a data change record for a given partition, at a given timestamp, for a given transaction.
dataChangeRecordAction(ThroughputEstimator<DataChangeRecord>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
Creates and returns a singleton instance of an action class capable of processing DataChangeRecords.
DataChangeRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
This class is part of the process for ReadChangeStreamPartitionDoFn SDF.
DataChangeRecordAction(ThroughputEstimator<DataChangeRecord>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DataChangeRecordAction
 
dataSchema - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
dataset - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
DatasetServiceImpl(BigQueryOptions) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
DatastoreIO - Class in org.apache.beam.sdk.io.gcp.datastore
DatastoreIO provides an API for reading from and writing to Google Cloud Datastore over different versions of the Cloud Datastore Client libraries.
DatastoreV1 - Class in org.apache.beam.sdk.io.gcp.datastore
DatastoreV1 provides an API to Read, Write and Delete PCollections of Google Cloud Datastore version v1 Entity objects.
DatastoreV1.DeleteEntity - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform that deletes Entities from Cloud Datastore.
DatastoreV1.DeleteKey - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform that deletes Entities associated with the given Keys from Cloud Datastore.
DatastoreV1.Read - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform that reads the result rows of a Cloud Datastore query as Entity objects.
DatastoreV1.Write - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform that writes Entity objects to Cloud Datastore.
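A minimal sketch of the DatastoreV1 read and write entry points listed above; the project, kind, and query are illustrative:

import com.google.datastore.v1.Entity;
import com.google.datastore.v1.KindExpression;
import com.google.datastore.v1.Query;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.datastore.DatastoreIO;
import org.apache.beam.sdk.values.PCollection;

Pipeline pipeline = Pipeline.create();

// Query all entities of kind "Task".
Query query =
    Query.newBuilder().addKind(KindExpression.newBuilder().setName("Task")).build();

PCollection<Entity> tasks =
    pipeline.apply(
        DatastoreIO.v1().read().withProjectId("my-project").withQuery(query));

// Write the (possibly transformed) entities back to Cloud Datastore.
tasks.apply(DatastoreIO.v1().write().withProjectId("my-project"));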
DataStoreV1SchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.datastore
An implementation of SchemaIOProvider for reading and writing payloads with DatastoreIO.
DataStoreV1SchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
 
DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO - Class in org.apache.beam.sdk.io.gcp.datastore
An abstraction to create schema-aware IOs.
DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
The tag for the deadletter output of FHIR resources.
DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
The tag for the deadletter output of FHIR Resources.
DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
The tag for the deadletter output of FHIR Resources from a GetPatientEverything request.
DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read
The tag for the deadletter output of HL7v2 read responses.
DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
The tag for the deadletter output of HL7v2 Messages.
decActivePartitionReadCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Decrements the ChangeStreamMetrics.ACTIVE_PARTITION_READ_COUNT by 1 if the metric is enabled.
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation.RowMutationCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponseCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
 
decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
 
decodePacked32TimeSeconds(int) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldTimeSeconds as a LocalTime with seconds precision.
decodePacked32TimeSecondsAsJavaTime(int) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldTimeSeconds as a LocalTime with seconds precision.
decodePacked64DatetimeMicros(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldDatetimeMicros as a LocalDateTime with microseconds precision.
decodePacked64DatetimeMicrosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldDatetimeMicros as a LocalDateTime with microseconds precision.
decodePacked64DatetimeSeconds(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldDatetimeSeconds as a LocalDateTime with seconds precision.
decodePacked64DatetimeSecondsAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldDatetimeSeconds as a LocalDateTime with seconds precision.
decodePacked64TimeMicros(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldTimeMicros as a LocalTime with microseconds precision.
decodePacked64TimeMicrosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldTimeMicros as a LocalTime with microseconds precision.
decodePacked64TimeNanos(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldTimeNanos as a LocalTime with nanoseconds precision.
decodePacked64TimeNanosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Decodes bitFieldTimeNanos as a LocalTime with nanoseconds precision.
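The encodePacked*/decodePacked* pairs above pack civil times and date-times into BigQuery's bit-field integer encodings. A minimal round-trip sketch using the java.time overloads:

import java.time.LocalTime;
import org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder;

LocalTime time = LocalTime.of(12, 34, 56, 789_000_000);

// Pack the civil time into the 8-byte micros encoding, then decode it back.
long packed = CivilTimeEncoder.encodePacked64TimeMicros(time);
LocalTime roundTripped = CivilTimeEncoder.decodePacked64TimeMicrosAsJavaTime(packed);
// roundTripped equals the original down to microsecond precision.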
decodeQueryResult(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
decPartitionStreamCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
deduplicate(UuidDeduplicationOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
Remove duplicates from the output of a read PTransform.
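A minimal sketch, assuming Uuids were attached at publish time with PubsubLiteIO.addUuids() and that the default extractor and deduplication window are acceptable; the subscription path is illustrative:

import com.google.cloud.pubsublite.SubscriptionPath;
import com.google.cloud.pubsublite.proto.SequencedMessage;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO;
import org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions;
import org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions;
import org.apache.beam.sdk.values.PCollection;

Pipeline pipeline = Pipeline.create();

PCollection<SequencedMessage> messages =
    pipeline.apply(
        PubsubLiteIO.read(
            SubscriberOptions.newBuilder()
                .setSubscriptionPath(
                    SubscriptionPath.parse(
                        "projects/123/locations/us-central1-a/subscriptions/my-subscription"))
                .build()));

// Drop redundant deliveries keyed on the Uuid attribute.
messages.apply(PubsubLiteIO.deduplicate(UuidDeduplicationOptions.newBuilder().build()));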
deduplicate() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
 
DEFAULT_ATTRIBUTE - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
 
DEFAULT_CHANGE_STREAM_NAME - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
The default change stream name for a change stream query is the empty String.
DEFAULT_DEDUPLICATE_DURATION - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
 
DEFAULT_INCLUSIVE_END_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
The default end timestamp for a change stream query is ChangeStreamsConstants.MAX_INCLUSIVE_END_AT.
DEFAULT_INCLUSIVE_START_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
The default start timestamp for a change stream query is Timestamp.MIN_VALUE.
DEFAULT_METADATA_TABLE_NAME - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
DEFAULT_RPC_PRIORITY - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
The default priority for a change stream query is Options.RpcPriority.HIGH.
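The ChangeStreamsConstants defaults above apply when the corresponding SpannerIO.readChangeStream() options are left unset. A minimal sketch that overrides only the change stream name and start timestamp, so the end timestamp and RPC priority fall back to the defaults (identifiers are illustrative):

import com.google.cloud.Timestamp;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.spanner.SpannerConfig;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;

Pipeline pipeline = Pipeline.create();
SpannerConfig config =
    SpannerConfig.create()
        .withProjectId("my-project")
        .withInstanceId("my-instance")
        .withDatabaseId("my-database");

// Reads change records from now onward; end timestamp and priority use the DEFAULT_* values above.
pipeline.apply(
    SpannerIO.readChangeStream()
        .withSpannerConfig(config)
        .withChangeStreamName("my_change_stream")
        .withInclusiveStartAt(Timestamp.now()));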
DEFAULT_TIME_DOMAIN - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
 
DEFAULT_UUID_EXTRACTOR - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
 
defaultOptions() - Static method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
Factory method to return a new instance of RpcQosOptions with all default values.
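A minimal sketch of starting from the default QoS settings; attaching the options to a Firestore batch write via a withRpcQosOptions(...) builder setter is an assumption here:

import org.apache.beam.sdk.io.gcp.firestore.FirestoreIO;
import org.apache.beam.sdk.io.gcp.firestore.FirestoreV1;
import org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions;

// Library defaults for throttling, retries, and batching of Firestore RPCs.
RpcQosOptions qosOptions = RpcQosOptions.defaultOptions();

// Assumed wiring: the batchWrite() builder accepts the QoS options before build().
FirestoreV1.BatchWriteWithSummary batchWrite =
    FirestoreIO.v1().write().batchWrite().withRpcQosOptions(qosOptions).build();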
deidentify(String, String, DeidentifyConfig) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Deidentify FHIR resources.
deidentify(ValueProvider<String>, ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Deidentify FHIR resources.
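A minimal sketch of de-identifying one FHIR store into another, assuming the v1 Healthcare API model class for DeidentifyConfig; the store paths are illustrative:

import com.google.api.services.healthcare.v1.model.DeidentifyConfig;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.healthcare.FhirIO;

Pipeline pipeline = Pipeline.create();

// The destination store receives the de-identified copies of the source resources.
String sourceStore =
    "projects/my-project/locations/us-central1/datasets/my-dataset/fhirStores/source";
String destinationStore =
    "projects/my-project/locations/us-central1/datasets/my-dataset/fhirStores/deidentified";

pipeline.apply(FhirIO.deidentify(sourceStore, destinationStore, new DeidentifyConfig()));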
Deidentify(ValueProvider<String>, ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify
 
deidentify(DoFn<String, String>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
 
deidentifyFhirStore(String, String, DeidentifyConfig) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Deidentify a GCP FHIR Store and write the result into a new FHIR Store.
deidentifyFhirStore(String, String, DeidentifyConfig) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
DeidentifyFn(ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
 
DELETE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
DeleteBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.DeleteBuilder
 
deleteDataset(String, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Deletes the dataset specified by the datasetId value.
deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
Deletes the dataset specified by the datasetId value.
deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
deleteDicomStore(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Delete a Dicom Store.
deleteDicomStore(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
deleteEntity() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
Returns an empty DatastoreV1.DeleteEntity builder.
deleteFhirStore(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Delete a FHIR store.
deleteFhirStore(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
deleteHL7v2Message(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Deletes an HL7v2 message.
deleteHL7v2Message(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
deleteHL7v2Store(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Deletes an HL7v2 store.
deleteHL7v2Store(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
deleteKey() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
Returns an empty DatastoreV1.DeleteKey builder.
deleteNewPartition(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
This is the second step of a two-phase delete.
deletePartitionMetadataTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
Drops the metadata table.
deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Delete SchemaPath.
deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
Delete SchemaPath.
deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
Delete SchemaPath.
deleteSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
Delete SchemaPath.
deleteStreamPartitionRow(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
This is the second step of the two-phase delete of a StreamPartition.
deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Delete subscription.
deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
deleteTable(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Deletes the table specified by tableId from the dataset.
deleteTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
Deletes the table specified by tableId from the dataset.
deleteTable(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
deleteTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
describeMismatchSafely(BigqueryMatcher.TableAndQuery, Description) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
 
describeTo(Description) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
 
description() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
DeserializeBytesIntoPubsubMessagePayloadOnly() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly
 
DETECT_NEW_PARTITION_SUFFIX - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
detectNewPartitionsAction(ChangeStreamMetrics, MetadataTableDao, Instant, GenerateInitialPartitionsAction, ResumeFromPreviousPipelineAction, ProcessNewPartitionsAction) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
Creates and returns a singleton instance of an action class for processing DetectNewPartitionsDoFn.
DetectNewPartitionsAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
This class processes DetectNewPartitionsDoFn.
DetectNewPartitionsAction(ChangeStreamMetrics, MetadataTableDao, Instant, GenerateInitialPartitionsAction, ResumeFromPreviousPipelineAction, ProcessNewPartitionsAction) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.DetectNewPartitionsAction
 
detectNewPartitionsAction(PartitionMetadataDao, PartitionMetadataMapper, ChangeStreamMetrics, Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
Creates and returns a single instance of an action class capable of detecting and scheduling new partitions to be queried.
DetectNewPartitionsAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
This class is responsible for scheduling partitions.
DetectNewPartitionsAction(PartitionMetadataDao, PartitionMetadataMapper, ChangeStreamMetrics, Duration) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DetectNewPartitionsAction
Constructs an action class for detecting / scheduling new partitions.
DetectNewPartitionsDoFn - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
 
DetectNewPartitionsDoFn(Instant, ActionFactory, DaoFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
DetectNewPartitionsDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
A SplittableDoFn (SDF) that is responsible for scheduling partitions to be queried.
DetectNewPartitionsDoFn(DaoFactory, MapperFactory, ActionFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
This class needs a DaoFactory to build DAOs to access the partition metadata tables.
DetectNewPartitionsRangeTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
This restriction tracker delegates most of its behavior to an internal TimestampRangeTracker.
DetectNewPartitionsRangeTracker(TimestampRange) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.DetectNewPartitionsRangeTracker
 
DetectNewPartitionsState - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
Metadata of the progress of DetectNewPartitionsDoFn from the metadata table.
DetectNewPartitionsState(Instant, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
 
DetectNewPartitionsTracker - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction
 
DetectNewPartitionsTracker(long) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.DetectNewPartitionsTracker
 
DicomIO - Class in org.apache.beam.sdk.io.gcp.healthcare
The DicomIO connector allows Beam pipelines to make calls to the DICOM API of the Google Cloud Healthcare API (https://cloud.google.com/healthcare/docs/how-tos#dicom-guide).
DicomIO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.DicomIO
 
DicomIO.ReadStudyMetadata - Class in org.apache.beam.sdk.io.gcp.healthcare
This class makes a call to the retrieve metadata endpoint (https://cloud.google.com/healthcare/docs/how-tos/dicomweb#retrieving_metadata).
DicomIO.ReadStudyMetadata.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
 
dicomStorePath - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
DicomWebPath() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
DlqProvider - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
DlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
 
doesMetadataTableExist() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
doHoldLock(Range.ByteStringRange, String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Return true if the uuid holds the lock of the partition.
doPartitionsOverlap(Range.ByteStringRange, Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Returns true if the two ByteStringRanges overlap, otherwise false.
doubleToByteArray(double) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
 
dryRunQuery(String, JobConfigurationQuery, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Dry runs the query in the given project.
dryRunQuery(String, JobConfigurationQuery, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
DYNAMIC_DESTINATIONS - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
DynamicDestinations<T,DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
This class provides the most general way of specifying dynamic BigQuery table destinations.
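A minimal sketch of a DynamicDestinations subclass that routes each element to a per-user table; the element format, table naming, and schema are illustrative:

import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableSchema;
import java.util.Arrays;
import org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations;
import org.apache.beam.sdk.io.gcp.bigquery.TableDestination;
import org.apache.beam.sdk.values.ValueInSingleWindow;

// Routes "userId:payload" strings to one BigQuery table per user.
class PerUserDestinations extends DynamicDestinations<String, String> {
  @Override
  public String getDestination(ValueInSingleWindow<String> element) {
    // The destination key: everything before the first ':' is treated as the user id.
    return element.getValue().split(":", 2)[0];
  }

  @Override
  public TableDestination getTable(String userId) {
    return new TableDestination(
        "my-project:my_dataset.events_" + userId, "Events for user " + userId);
  }

  @Override
  public TableSchema getSchema(String userId) {
    return new TableSchema()
        .setFields(
            Arrays.asList(new TableFieldSchema().setName("payload").setType("STRING")));
  }
}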
DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
 

E

ENABLE_CUSTOM_PUBSUB_SINK - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
 
ENABLE_CUSTOM_PUBSUB_SOURCE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
 
encode(BigQueryInsertError, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
encode(BigQueryStorageApiInsertError, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
 
encode(RowMutation, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation.RowMutationCoder
 
encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
 
encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
 
encode(TableRow, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
encode(TableRow, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
encode(BigtableWriteResult, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
 
encode(FhirSearchParameter<T>, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
 
encode(HealthcareIOError<T>, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
 
encode(HL7v2Message, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
 
encode(HL7v2ReadResponse, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponseCoder
 
encode(JsonArray, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
 
encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
encode(PubsubMessage, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
 
encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
 
encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
encode(PubsubMessage, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
 
encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
 
encode(OffsetByteRange, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
 
encode(SubscriptionPartition, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
 
encode(Uuid, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
 
encodePacked32TimeSeconds(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes time as a 4-byte integer with seconds precision.
encodePacked32TimeSeconds(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes time as a 4-byte integer with seconds precision.
encodePacked64DatetimeMicros(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes dateTime as a 8-byte integer with microseconds precision.
encodePacked64DatetimeMicros(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes dateTime as a 8-byte integer with microseconds precision.
encodePacked64DatetimeSeconds(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes dateTime as a 8-byte integer with seconds precision.
encodePacked64DatetimeSeconds(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes dateTime as a 8-byte integer with seconds precision.
encodePacked64TimeMicros(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes time as a 8-byte integer with microseconds precision.
encodePacked64TimeMicros(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes time as a 8-byte integer with microseconds precision.
encodePacked64TimeNanos(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes time as a 8-byte integer with nanoseconds precision.
encodePacked64TimeNanos(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
Encodes time as a 8-byte integer with nanoseconds precision.
encodeQueryResult(Table) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
encodeQueryResult(Table, List<TableRow>) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
EncodingException - Exception in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
Represents an error during encoding (serializing) a class.
EncodingException(Throwable) - Constructor for exception org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.EncodingException
 
EncodingException - Exception in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
Represents an error during encoding (serializing) a class.
EncodingException(Throwable) - Constructor for exception org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.EncodingException
 
ensureUsableAsCloudPubsub() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
Ensure that all messages that pass through can be converted to Cloud Pub/Sub messages using the standard transformation methods in the client library.
EntityToRow - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform to perform a conversion of Entity to Row.
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
 
equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
ERROR_MESSAGE - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata
TupleTag for any error response.
ERROR_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
ERROR_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
ERROR_TAG - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
ErrorContainer<T> - Interface in org.apache.beam.sdk.io.gcp.bigquery
ErrorContainer interface.
ErrorCounterFn(String, SerializableFunction<Row, byte[]>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
 
ErrorCounterFn(String, SerializableFunction<Row, byte[]>, Schema, boolean, List<String>, Schema) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
 
ErrorFn(String, SerializableFunction<byte[], Row>, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
 
ErrorFn(String, SerializableFunction<byte[], Row>, Schema, List<String>, String, Schema, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
 
ErrorHandling() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.ErrorHandling
 
ErrorHandling() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling
 
ErrorHandling() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling
 
eventually(Matcher<Iterable<? extends Row>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery.RowsAssertion
 
ExecuteBundles(ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
Instantiates a new ExecuteBundles transform.
ExecuteBundles(String) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
 
executeBundles(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
Executes a bundle of requests, either in batch mode or as a single transaction.
executeBundles(ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
Executes a bundle of requests, either in batch mode or as a single transaction.
executeFhirBundle(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Execute a FHIR bundle and return the HTTP body.
executeFhirBundle(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider.BigQueryExportSchemaTransform
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformProvider.BigQueryWriteSchemaTransform
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
 
expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform
 
expand(PCollectionRowTuple) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.ReifyAsIterable
 
expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages
 
expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
 
expand(PCollection<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteRecordsInconsistent
 
expand(PCollection<KV<ShardedKey<DestinationT>, Iterable<StorageApiWritePayload>>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritesShardedRecords
 
expand(PCollection<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords
 
expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
 
expand(PCollection<KV<TableDestination, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
 
expand() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
 
expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
 
expand(PCollection<KV<ByteString, Iterable<Mutation>>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
expand(PCollection<KV<ByteString, Iterable<Mutation>>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
 
expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRow
 
expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRowFlat
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
expand(PCollection<Entity>) - Method in class org.apache.beam.sdk.io.gcp.datastore.EntityToRow
 
expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
 
expand(PCollection<BatchGetDocumentsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments
 
expand(PCollection<Write>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue
 
expand(PCollection<Write>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary
 
expand(PCollection<ListCollectionIdsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds
 
expand(PCollection<ListDocumentsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments
 
expand(PCollection<PartitionQueryRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery
 
expand(PCollection<RunQueryRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata
 
expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify
 
expand(PCollection<FhirBundleParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
 
expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
 
expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
 
expand(PCollection<FhirSearchParameter<T>>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
 
expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
 
expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
 
expand(PCollection<FhirIOPatientEverything.PatientEverythingParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
 
expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
 
expand(PCollection<HL7v2ReadParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read
 
expand(PCollection<HL7v2ReadParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.ListHL7v2Messages
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
 
expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message
 
expand(PCollection<HL7v2Message>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
 
expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
expand(PCollection<PubsubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
expand(PCollection<PubSubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.AddUuidsTransform
 
expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ProtoFromBytes
 
expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ProtoToBytes
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscribeTransform
 
expand(PCollection<SequencedMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidDeduplicationTransform
 
expand(PCollection<PubSubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage
 
expand(PInput) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
expand(PCollection<ReadOperation>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
 
expand(PCollection<Mutation>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
expand(PCollection<MutationGroup>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
 
expand() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
expandInconsistent(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
 
expandTriggered(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>, Coder<StorageApiWritePayload>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
 
expandUntriggered(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
 
expectDryRunQuery(String, String, JobStatistics) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
Export(ValueProvider<String>, ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export
 
exportFhirResourceToBigQuery(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Export a FHIR Resource to BigQuery.
exportFhirResourceToBigQuery(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
exportFhirResourceToGcs(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Export a FHIR Resource to GCS.
exportFhirResourceToGcs(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
exportResources(DoFn<String, String>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
 
exportResources(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Export resources to GCS.
exportResources(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
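A minimal usage sketch (assuming the standard Beam Java imports; the FHIR store path and GCS destination are placeholder values, and it is assumed the first argument names the source store and the second the export destination):
    Pipeline p = Pipeline.create();
    p.apply("ExportFhirResources",
        FhirIO.exportResources(
            "projects/my-project/locations/us-central1/datasets/my-dataset/fhirStores/my-store",
            "gs://my-bucket/fhir-export/"));
    p.run().waitUntilFinish();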
 
ExportResourcesFn(ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
 
ExternalRead - Class in org.apache.beam.sdk.io.gcp.pubsub
Exposes PubsubIO.Read as an external transform for cross-language usage.
ExternalRead() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead
 
ExternalRead.Configuration - Class in org.apache.beam.sdk.io.gcp.pubsub
Parameters class to expose the transform to an external SDK.
ExternalRead.ReadBuilder - Class in org.apache.beam.sdk.io.gcp.pubsub
 
ExternalTransformRegistrarImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
ExternalTransformRegistrarImpl() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
 
ExternalWrite - Class in org.apache.beam.sdk.io.gcp.pubsub
Exposes PubsubIO.Write as an external transform for cross-language usage.
ExternalWrite() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite
 
ExternalWrite.Configuration - Class in org.apache.beam.sdk.io.gcp.pubsub
Parameters class to expose the transform to an external SDK.
ExternalWrite.ParsePubsubMessageProtoAsPayloadFromWindowedValue - Class in org.apache.beam.sdk.io.gcp.pubsub
 
ExternalWrite.WriteBuilder - Class in org.apache.beam.sdk.io.gcp.pubsub
 
extractTimestampAttribute(String, Map<String, String>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Return the timestamp (in ms since unix epoch) to use for a Pubsub message with timestampAttribute and attributes.

F

FACTORY - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
Factory for creating Pubsub clients using gRPC transport.
FACTORY - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
Factory for creating Pubsub clients using Json transport.
FAILED - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
The tag for the failed writes to the HL7v2 store.
FAILED_BODY - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
The tag for the failed writes to FHIR store.
FAILED_BUNDLES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
The TupleTag used for bundles that failed to be executed for any reason.
FAILED_FILES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
The tag for the files that failed to import to the FHIR store.
FailedWritesException(List<FirestoreV1.WriteFailure>) - Constructor for exception org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.FailedWritesException
 
failOnInsert(Map<TableRow, List<TableDataInsertAllResponse.InsertErrors>>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
Cause a given TableRow object to fail when it's inserted.
FakeBigQueryServerStream(List<T>) - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
 
FakeBigQueryServices - Class in org.apache.beam.sdk.io.gcp.testing
A fake implementation of BigQuery's query service.
FakeBigQueryServices() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
FakeBigQueryServices.FakeBigQueryServerStream<T> - Class in org.apache.beam.sdk.io.gcp.testing
An implementation of BigQueryServerStream which takes a List as the Iterable to simulate a server stream.
FakeDatasetService - Class in org.apache.beam.sdk.io.gcp.testing
A fake dataset service that can be serialized, for use in testReadFromTable.
FakeDatasetService() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
FakeJobService - Class in org.apache.beam.sdk.io.gcp.testing
A fake implementation of BigQuery's job service.
FakeJobService() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
FakeJobService(int) - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
FetchHL7v2Message() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message
Instantiates a new Fetch HL7v2 message DoFn.
FetchHL7v2Message() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message
Instantiates a new Fetch HL7v2 message DoFn.
FhirBundleParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
FhirBundleParameter represents a FHIR bundle in JSON format to be executed on a FHIR store.
FhirBundleParameter() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
 
FhirBundleResponse - Class in org.apache.beam.sdk.io.gcp.healthcare
 
FhirBundleResponse() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
 
FhirIO - Class in org.apache.beam.sdk.io.gcp.healthcare
FhirIO provides an API for reading and writing resources to the Google Cloud Healthcare FHIR API.
FhirIO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
 
FhirIO.Deidentify - Class in org.apache.beam.sdk.io.gcp.healthcare
Deidentify FHIR resources from a FHIR store to a destination FHIR store.
FhirIO.Deidentify.DeidentifyFn - Class in org.apache.beam.sdk.io.gcp.healthcare
A function that schedules a deidentify operation and monitors the status.
FhirIO.ExecuteBundles - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Execute bundles.
FhirIO.ExecuteBundlesResult - Class in org.apache.beam.sdk.io.gcp.healthcare
ExecuteBundlesResult contains both successfully executed bundles and information to help debug failed executions (e.g. metadata and error messages).
FhirIO.Export - Class in org.apache.beam.sdk.io.gcp.healthcare
Export FHIR resources from a FHIR store to new line delimited json files on GCS or BigQuery.
FhirIO.Export.ExportResourcesFn - Class in org.apache.beam.sdk.io.gcp.healthcare
A function that schedules an export operation and monitors the status.
FhirIO.Import - Class in org.apache.beam.sdk.io.gcp.healthcare
Writes each bundle of elements to a new-line delimited JSON file on GCS and issues a fhirStores.import Request for that file.
FhirIO.Import.ContentStructure - Enum in org.apache.beam.sdk.io.gcp.healthcare
The enum Content structure.
FhirIO.Read - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Read.
FhirIO.Read.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Result.
FhirIO.Search<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Search.
FhirIO.Search.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
 
FhirIO.Write - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Write.
FhirIO.Write.AbstractResult - Class in org.apache.beam.sdk.io.gcp.healthcare
 
FhirIO.Write.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Result.
FhirIO.Write.WriteMethod - Enum in org.apache.beam.sdk.io.gcp.healthcare
The enum Write method.
FhirIOPatientEverything - Class in org.apache.beam.sdk.io.gcp.healthcare
The type FhirIOPatientEverything for querying a FHIR Patient resource's compartment.
FhirIOPatientEverything() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
 
FhirIOPatientEverything.PatientEverythingParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
PatientEverythingParameter defines required attributes for a FHIR GetPatientEverything request in FhirIOPatientEverything.
FhirIOPatientEverything.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
The Result for a FhirIOPatientEverything request.
FhirResourcePagesIterator(HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod, HealthcareApiClient, String, String, String, Map<String, Object>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
 
FhirSearchParameter<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
FhirSearchParameter represents the query parameters for a FHIR search request, used as a parameter for FhirIO.Search.
FhirSearchParameterCoder<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
FhirSearchParameterCoder is the coder for FhirSearchParameter, which takes a coder for type T.
fhirStoresImport(String, String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
Import method for batch writing resources.
fhirStoresImport(String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
 
fhirStoresImport(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
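A hedged sketch of the batch import path (the store path, bucket names, and sample bundle below are placeholders):
    PCollection<String> fhirJson =
        p.apply(Create.of("{\"resourceType\":\"Bundle\",\"type\":\"batch\",\"entry\":[]}"));
    fhirJson.apply("ImportFhirResources",
        FhirIO.Write.fhirStoresImport(
            "projects/my-project/locations/us-central1/datasets/my-dataset/fhirStores/my-store",
            "gs://my-bucket/fhir-import-temp/",
            "gs://my-bucket/fhir-import-deadletter/",
            FhirIO.Import.ContentStructure.BUNDLE));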
 
FilterForMutationDoFn - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
 
FilterForMutationDoFn() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.FilterForMutationDoFn
 
finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.CheckpointMarkImpl
 
finalizeWriteStream(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
Finalize a write stream.
finalizeWriteStream(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
finalizeWriteStream(String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
finish(DoFn<SequencedMessage, Row>.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
 
finish() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
 
finish(DoFn<DataChangeRecord, Row>.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.DataChangeRecordToRow
 
finishBundle(DoFn<Iterable<KV<DestinationT, WriteTables.Result>>, Iterable<KV<TableDestination, WriteTables.Result>>>.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
 
finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
 
finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
FirestoreIO - Class in org.apache.beam.sdk.io.gcp.firestore
FirestoreIO provides an API for reading from and writing to Google Cloud Firestore.
FirestoreOptions - Interface in org.apache.beam.sdk.io.gcp.firestore
 
FirestoreV1 - Class in org.apache.beam.sdk.io.gcp.firestore
FirestoreV1 provides an API which provides lifecycle managed PTransforms for Cloud Firestore v1 API.
FirestoreV1.BatchGetDocuments - Class in org.apache.beam.sdk.io.gcp.firestore
Concrete class representing a PTransform<PCollection<BatchGetDocumentsRequest>, PCollection<BatchGetDocumentsResponse>> which will read from Firestore.
FirestoreV1.BatchGetDocuments.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
A type safe builder for FirestoreV1.BatchGetDocuments allowing configuration and instantiation.
FirestoreV1.BatchWriteWithDeadLetterQueue - Class in org.apache.beam.sdk.io.gcp.firestore
Concrete class representing a PTransform<PCollection<Write>, PCollection<FirestoreV1.WriteFailure>> which will write to Firestore.
FirestoreV1.BatchWriteWithDeadLetterQueue.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
A type safe builder for FirestoreV1.BatchWriteWithDeadLetterQueue allowing configuration and instantiation.
FirestoreV1.BatchWriteWithSummary - Class in org.apache.beam.sdk.io.gcp.firestore
Concrete class representing a PTransform<PCollection<Write>, PDone> which will write to Firestore.
FirestoreV1.BatchWriteWithSummary.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
A type safe builder for FirestoreV1.BatchWriteWithSummary allowing configuration and instantiation.
FirestoreV1.FailedWritesException - Exception in org.apache.beam.sdk.io.gcp.firestore
Exception that is thrown if one or more Writes is unsuccessful with a non-retryable status code.
FirestoreV1.ListCollectionIds - Class in org.apache.beam.sdk.io.gcp.firestore
Concrete class representing a PTransform<PCollection<ListCollectionIdsRequest>, PCollection<ListCollectionIdsResponse>> which will read from Firestore.
FirestoreV1.ListCollectionIds.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
A type safe builder for FirestoreV1.ListCollectionIds allowing configuration and instantiation.
FirestoreV1.ListDocuments - Class in org.apache.beam.sdk.io.gcp.firestore
Concrete class representing a PTransform<PCollection<ListDocumentsRequest>, PCollection<ListDocumentsResponse>> which will read from Firestore.
FirestoreV1.ListDocuments.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
A type safe builder for FirestoreV1.ListDocuments allowing configuration and instantiation.
FirestoreV1.PartitionQuery - Class in org.apache.beam.sdk.io.gcp.firestore
Concrete class representing a PTransform<PCollection<PartitionQueryRequest>, PCollection<RunQueryRequest>> which will read from Firestore.
FirestoreV1.PartitionQuery.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
A type safe builder for FirestoreV1.PartitionQuery allowing configuration and instantiation.
FirestoreV1.Read - Class in org.apache.beam.sdk.io.gcp.firestore
Type safe builder factory for read operations.
FirestoreV1.RunQuery - Class in org.apache.beam.sdk.io.gcp.firestore
Concrete class representing a PTransform<PCollection<RunQueryRequest>, PCollection<RunQueryResponse>> which will read from Firestore.
FirestoreV1.RunQuery.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
A type safe builder for FirestoreV1.RunQuery allowing configuration and instantiation.
FirestoreV1.Write - Class in org.apache.beam.sdk.io.gcp.firestore
Type safe builder factory for write operations.
FirestoreV1.WriteFailure - Class in org.apache.beam.sdk.io.gcp.firestore
Failure details for an attempted Write.
FirestoreV1.WriteSuccessSummary - Class in org.apache.beam.sdk.io.gcp.firestore
Summary object produced when a number of writes are successfully written to Firestore in a single BatchWrite.
floatToByteArray(float) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
 
flush(String, long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
Flush a given stream up to the given offset.
flush(String, long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
flush(String, long) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
formatByteStringRange(Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Returns a formatted string of a partition for debugging.
from(BigQueryExportReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
Returns the expected SchemaTransform of the configuration.
from(BigQueryFileLoadsWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformProvider
Returns the expected SchemaTransform of the configuration.
from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Reads a BigQuery table specified as "[project_id]:[dataset_id].[table_id]" or "[dataset_id].[table_id]" for tables within the current project.
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Same as from(String), but with a ValueProvider.
from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Read from table specified by a TableReference.
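For illustration only (project, dataset, and table names are placeholders), reading a whole table with the String form might look like:
    PCollection<TableRow> rows =
        p.apply("ReadWeatherStations",
            BigQueryIO.readTableRows().from("my-project:samples.weather_stations"));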
from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
Produces a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
from(BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
 
from(BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
from(BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
 
from(BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
 
from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
Produce a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
from(PubsubReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
Produce a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
from(PubsubWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
from(PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
from(PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
from(Struct) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.PartitionMetadataMapper
Transforms a Struct representing a partition metadata row into a PartitionMetadata model.
from(SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
from(SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
 
fromCloudPubsubMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
Transforms messages publishable with PubsubIO into their equivalent Pub/Sub Lite publishable messages.
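A hedged sketch of bridging the two models (the subscription path is a placeholder); PubsubMessage here is the Beam Pub/Sub type and PubSubMessage the Pub/Sub Lite proto:
    PCollection<PubsubMessage> cloudMessages =
        p.apply(PubsubIO.readMessagesWithAttributes()
            .fromSubscription("projects/my-project/subscriptions/my-subscription"));
    PCollection<PubSubMessage> liteMessages =
        cloudMessages.apply(CloudPubsubTransforms.fromCloudPubsubMessages());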
fromJsonString(String, Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
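A small illustration, assuming the JSON literal below is a valid serialized TableSchema:
    TableSchema schema =
        BigQueryHelpers.fromJsonString(
            "{\"fields\":[{\"name\":\"name\",\"type\":\"STRING\"}]}",
            TableSchema.class);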
 
fromModel(Message) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Converts a model Message to an HL7v2 message.
fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Creates an instance of this rule using provided options.
fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
Creates a class representing a Pub/Sub subscription from the specified subscription path.
fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
Creates a class representing a Cloud Pub/Sub topic from the specified topic path.
fromProto(PubsubMessage) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages
 
fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Reads results received after executing the given query.
fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Same as fromQuery(String), but with a ValueProvider.
fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
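For illustration (the query and project are placeholders), the query-based read can be combined with usingStandardSql():
    PCollection<TableRow> results =
        p.apply("ReadFromQuery",
            BigQueryIO.readTableRows()
                .fromQuery("SELECT name, year FROM `my-project.samples.weather_stations`")
                .usingStandardSql());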
fromSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Reads from the given subscription.
fromSubscription(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Like fromSubscription(String), but with a ValueProvider.
fromTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a BigQuery TableSchema to a Beam Schema.
fromTableSchema(TableSchema, BigQueryUtils.SchemaConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a BigQuery TableSchema to a Beam Schema.
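A minimal sketch of the schema conversion (the field names are arbitrary):
    Schema beamSchema =
        BigQueryUtils.fromTableSchema(
            new TableSchema().setFields(Arrays.asList(
                new TableFieldSchema().setName("name").setType("STRING"),
                new TableFieldSchema().setName("year").setType("INTEGER"))));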
fromTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Creates and returns a transform for reading from a Cloud Pub/Sub topic.
fromTopic(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Like PubsubIO.Read.fromTopic(String) but with a ValueProvider.
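A hedged sketch of both read entry points (resource names are placeholders); when reading from a topic, a transient subscription is created for the pipeline:
    PCollection<String> fromSubscription =
        p.apply(PubsubIO.readStrings()
            .fromSubscription("projects/my-project/subscriptions/my-subscription"));
    PCollection<String> fromTopic =
        p.apply(PubsubIO.readStrings()
            .fromTopic("projects/my-project/topics/my-topic"));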

G

GcpIoPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.gcp.common
A registrar containing the default GCP options.
GcpIoPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
 
generateInitialChangeStreamPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.ChangeStreamDao
Returns the result from the GenerateInitialChangeStreamPartitions API.
generateInitialPartitionsAction(ChangeStreamMetrics, ChangeStreamDao, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
Creates and returns a singleton instance of an action class for processing DetectNewPartitionsDoFn.
GenerateInitialPartitionsAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
Class to generate first set of outputs for DetectNewPartitionsDoFn.
GenerateInitialPartitionsAction(ChangeStreamMetrics, ChangeStreamDao, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.GenerateInitialPartitionsAction
 
generatePartitionMetadataTableName(String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.NameGenerator
Generates a unique name for the partition metadata table in the form of "Metadata_<databaseId>_<uuid>".
generateRowKeyPrefix() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.UniqueIdGenerator
Returns a random base64-encoded 8-byte string.
get() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
Returns the estimated throughput bytes for this run.
get() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
Returns the estimated throughput bytes for now.
get() - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.ThroughputEstimator
Returns the estimated throughput for now.
getAll() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Retrieve all HL7v2 Messages from a PCollection of message IDs (such as from a Pub/Sub notification subscription).
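A hedged sketch of the notification-driven pattern (resource names are placeholders, and it is assumed HL7v2IO.Read.Result exposes the fetched messages via getMessages()):
    PCollection<String> messageIds =
        p.apply(PubsubIO.readStrings()
            .fromSubscription("projects/my-project/subscriptions/hl7v2-notifications"));
    HL7v2IO.Read.Result readResult = messageIds.apply(HL7v2IO.getAll());
    PCollection<HL7v2Message> messages = readResult.getMessages();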
getAllIds(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getAllJobs() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
getAllPartitionsCreatedAfter(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Fetches all partitions with a PartitionMetadataAdminDao.COLUMN_CREATED_AT less than the given timestamp.
getAllRows(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getApplicationName() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
 
getAppProfileId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
Returns the app profile being read from.
getAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
Returns the given attribute value.
getAttributeId() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getAttributeId() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getAttributeMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
Returns the full map of attributes.
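As an illustration (the "eventType" attribute key is hypothetical), a single attribute can be pulled out of each message:
    PCollection<String> eventTypes =
        p.apply(PubsubIO.readMessagesWithAttributes()
                .fromTopic("projects/my-project/topics/events"))
            .apply(MapElements.into(TypeDescriptors.strings())
                .via((PubsubMessage m) -> m.getAttribute("eventType")));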
getAttributeMap() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getAttributesMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getAttributesMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
getAutoSharding() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration
 
getBaseName() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.ParsedMetricName
 
getBatchClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
getBatchInitialCount() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The initial size of a batch; used in the absence of the QoS system having significant data to determine a better batch size.
getBatchMaxBytes() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The maximum number of bytes to include in a batch.
getBatchMaxCount() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The maximum number of writes to include in a batch.
getBatchTargetLatency() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
Target latency for batch requests.
getBigQueryLocation() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
 
getBigQueryProject() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getBigtableChangeStreamInstanceId() - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.BigtableChangeStreamTestOptions
 
getBigtableClientOverride() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
Returns the Bigtable client override.
getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Deprecated.
read options are configured directly on BigtableIO.read(). Use BigtableIO.Read.populateDisplayData(DisplayData.Builder) to view the current configurations.
getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Deprecated.
write options are configured directly on BigtableIO.write(). Use BigtableIO.Write.populateDisplayData(DisplayData.Builder) to view the current configurations.
getBqStreamingApiLoggingFrequencySec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getBundle() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
FHIR R4 bundle resource object as a string.
getCellsMutatedPerColumn(String, String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
Return the total number of cells affected when the specified column is mutated.
getCellsMutatedPerRow(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
Return the total number of cells affected when the given row is deleted.
getChangeStreamContinuationTokens() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
getChangeStreamContinuationTokens() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
getChangeStreamDao() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
getChangeStreamDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
Creates and returns a singleton DAO instance for querying a partition change stream.
getChangeStreamName() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
getChangeStreamName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
getChangeStreamNamePrefix() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
Return the prefix used to identify the rows belonging to this job.
getChangeStreamNamePrefix() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
 
getCheckpointMark() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
 
getChildPartitions() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
List of child partitions yielded within this record.
getClient(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
getClientFactory() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getClock() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getCloseStream() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
getClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
Returns the type code of the column.
getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
 
getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
 
getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
 
getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
 
getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
 
getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubCoderProviderRegistrar
 
getColumns() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
getColumns(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
 
getCommitDeadline() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getCommitRetrySettings() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getCommitTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
Returns the commit timestamp of the read / write transaction.
getCommitTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The timestamp at which the modifications within were committed in Cloud Spanner.
getCreatedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The time at which this partition was first detected and created in the metadata table.
getCreateDisposition() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformConfiguration
Specifies whether the table should be created if it does not exist.
getCreateDisposition() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration
 
getCreateTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets create time.
getCurrent() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
getCurrentRowAsStruct() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
Returns the record at the current pointer as a Struct.
getCurrentSource() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
getCurrentToken() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
getData() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets data.
getDatabaseAdminClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
getDatabaseClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
 
getDatabaseRole() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getDataBoostEnabled() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getDataClient() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
 
getDataResource() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
 
getDataset(String, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Gets the specified Dataset resource by dataset ID.
getDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
Gets the specified Dataset resource by dataset ID.
getDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getDatasetService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
Returns a real, mock, or fake BigQueryServices.DatasetService.
getDatasetService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
 
getDatasetService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
getDescriptorFromTableSchema(TableSchema, boolean, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
getDescriptorFromTableSchema(TableSchema, boolean, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
Given a BigQuery TableSchema, returns a protocol-buffer Descriptor that can be used to write data using the BigQuery Storage API.
getDestination(ValueInSingleWindow<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns an object that represents at a high level which table is being written to.
getDestinationCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns the coder for DestinationT.
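A hedged sketch of a DynamicDestinations subclass (the "eventType" field, dataset, and schema are hypothetical) that routes each row to a per-event-type table; such a class would typically be passed to BigQueryIO.writeTableRows().to(...):
    static class PerEventType extends DynamicDestinations<TableRow, String> {
      @Override
      public String getDestination(ValueInSingleWindow<TableRow> element) {
        // Route on a field of the incoming row.
        return (String) element.getValue().get("eventType");
      }
      @Override
      public TableDestination getTable(String eventType) {
        return new TableDestination("my-project:events." + eventType, "Events of type " + eventType);
      }
      @Override
      public TableSchema getSchema(String eventType) {
        return new TableSchema().setFields(Arrays.asList(
            new TableFieldSchema().setName("eventType").setType("STRING"),
            new TableFieldSchema().setName("payload").setType("STRING")));
      }
    }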
getEarliestHL7v2SendTime(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Gets the earliest HL7v2 send time.
getEarliestHL7v2SendTime(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
getElement() - Method in class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
 
getEmulatorHost() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
A host port pair to allow connecting to a Cloud Firestore emulator instead of the live service.
getEmulatorHost() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getEnableBundling() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getEncodedElementByteSize(BigQueryInsertError) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
getEncodedElementByteSize(TableRow) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
getEnd() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
 
getEndAtTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
getEndTime() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
getEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The end time for querying this given partition.
getError() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration
 
getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getErrorHandling() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getErrorInfo(IOException) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
getErrorMessage() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
 
getErrorMessage() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
 
getEstimatedLowWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
 
getExecuteStreamingSqlRetrySettings() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
 
getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
 
getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
Gets failed bodies with err.
getFailedBundles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
Gets failed FhirBundleResponse wrapped inside HealthcareIOError.
getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
 
getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
 
getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
Gets failed file imports with err.
getFailedInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
Returns a PCollection containing the TableRows that didn't make it to BQ.
getFailedInsertsWithErr() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
Returns a PCollection containing the BigQueryInsertErrors with detailed error information.
getFailedInsertsWithErr() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
 
getFailedMutations() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
Gets failed reads.
getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
Gets failed reads.
getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
Gets failed reads.
getFailedRowsTag() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
 
getFailedRowsTupleTag() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
 
getFailedSearches() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
Gets failed searches.
getFailedStorageApiInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
Return any rows that persistently fail to insert when using a storage-api method.
getFhirBundleParameter() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
FhirBundleParameter represents a FHIR bundle in JSON format to be executed on a FHIR store.
getFhirStore() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
 
getField() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.KeyPart
 
getFinishedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The time at which the connector finished processing this partition.
getFirestoreDb() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
The Firestore database ID to connect to.
getFirestoreHost() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
A host port pair to allow connecting to a Cloud Firestore instead of the default live service.
getFlatJsonRows(Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
Loads rows from BigQuery into Rows with given Schema.
getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getFrom(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
Returns the estimated throughput bytes for a specified time.
getFrom(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.NullThroughputEstimator
Always returns 0.
getFrom(Timestamp) - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.ThroughputEstimator
Returns the estimated throughput for a specified time.
getFrom() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
Returns the range start timestamp (inclusive).
getFullPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
getFullPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
getGson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.DataChangeRecordToRow
 
getHeartbeatMillis() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The number of milliseconds of idle time after which a heartbeat record will be emitted in the change stream query.
getHintMaxNumWorkers() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
A hint to the QoS system for the intended max number of workers for a pipeline.
getHL7v2Message(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Gets an HL7v2 message by its name from an HL7v2 store.
getHL7v2Message() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
Gets hl7v2Message.
getHL7v2Message(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
Gets HL7v2 message.
getHl7v2MessageId() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
HL7v2MessageId string.
getHL7v2Store(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Gets an HL7v2 store.
getHL7v2Store(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
Gets HL7v2 store.
getHost() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getHTTPWriteTimeout() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
 
getId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SchemaPath
 
getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
Get the id attribute.
getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the id attribute.
getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
getIndex() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
getInferMaps() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
Controls whether to use the map or row FieldType for a TableSchema field that appears to represent a map (it is an array of structs containing only key and value fields).
getInflightWaitSeconds() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
If the previous call to appendRows blocked due to flow control, returns how long the call blocked for.
getInitialBackoff() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The initial backoff duration to be used before retrying a request for the first time.
getInitialWatermarkEstimatorState(InitialPipelineState) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
getInitialWatermarkEstimatorState(PartitionRecord) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
getInitialWatermarkEstimatorState(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
 
getInitialWatermarkEstimatorState(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
getInsertBundleParallelism() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getInsertCount() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getInsertErrors() - Method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy.Context
 
getInstanceAdminClient() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
 
getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
Returns the instance id being written to.
getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
 
getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
 
getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
 
getIntersectingPartition(Range.ByteStringRange, Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Return the overlapping parts of 2 partitions.
getIsLocalChannelProvider() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getJob(JobReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Gets the specified Job by the given JobReference.
getJob(JobReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
getJobService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
Returns a real, mock, or fake BigQueryServices.JobService.
getJobService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
 
getJobService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
getJsonClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getJsonFactory() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
getJsonTimePartitioning() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getKey() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
 
getKeyedResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
Gets resources with input SearchParameter key.
getKeyParts(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
 
getKeySet() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
getKeysJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
The primary keys of this specific modification.
getKind() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
getKind() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
Return the display name for this factory.
getLabels() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets labels.
getLastRunTimestamp() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
getLastUpdated() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
getLatestHL7v2SendTime(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Gets the latest HL7v2 send time.
getLatestHL7v2SendTime(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
getLiteralGqlQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getLocalhost() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getLocation() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getLocation() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getLowWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
getMaxAttempts() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The maximum number of times a request will be attempted for a complete successful result.
getMaxBufferingDurationMilliSec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getMaxCumulativeBackoff() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getMaxStreamingBatchSize() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getMaxStreamingRowsToBatch() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getMessage() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
Underlying Message.
getMessageConverter(DestinationT, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
 
getMessageId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
Returns the messageId of the message populated by Cloud Pub/Sub.
getMessageType() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets message type.
getMetadata() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
String representing the metadata of the Bundle to be written.
getMetadata() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
String representing the metadata of the messageId to be read.
getMetadata() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
Gets metadata.
getMetadata() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
Returns the gathered metadata for the change stream query so far.
getMetadata() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The connector execution metadata for this record.
getMetadataTable() - Method in interface org.apache.beam.sdk.io.gcp.spanner.SpannerIO.SpannerChangeStreamOptions
Returns the name of the metadata table.
getMetadataTableAdminDao() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
getMetadataTableDao() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
getMetadataTableDebugString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
getMethod() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
getMethod() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
getMetricLabels() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.ParsedMetricName
 
getMissingPartitionsFrom(List<Range.ByteStringRange>, ByteString, ByteString) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Return the partitions, bounded by start and end, that are missing from the given list of partitions.
getMissingPartitionsFromEntireKeySpace(List<Range.ByteStringRange>) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Return missing partitions from the entire keyspace.
getMods() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The modifications within this record.
getModType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The type of operation that caused the modifications within this record.
getMutationInformation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation
 
GetMutationsFromBeamRow() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.GetMutationsFromBeamRow
 
getMutationType() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
 
getName() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets name.
getName() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
getName() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
getName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
The name of the column.
getName() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.Column
 
getNamespace() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getNeedsAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
getNeedsMessageId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
getNeedsOrderingKey() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
 
getNewBigqueryClient(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
getNewValuesJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
The new column values after the modification was applied.
getNextId() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.UniqueIdGenerator
Return a random base64-encoded 8-byte string.
getNumberOfPartitionsInTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The total number of partitions for the given transaction.
getNumberOfRecordsInTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The total number of data change records for the given transaction.
getNumberOfRecordsRead() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
Returns the total number of records read from the change stream so far.
getNumberOfRecordsRead() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The number of records read in the partition change stream query before reading this record.
getNumBytes() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
 
getNumEntities(PipelineOptions, String, String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns the number of entities available for reading.
getNumExtractJobCalls() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
getNumQuerySplits() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getNumRows(BigQueryOptions, TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
Returns the number of rows for a given table.
getNumStorageWriteApiStreamAppendClients() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getNumStorageWriteApiStreams() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getNumStreamingKeys() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getNumStreams() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration
 
getNumWrites() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
 
getObservedTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
 
getOldValuesJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
The old column values before the modification was applied.
getOrCreate(BigtableConfig) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
Create a BigtableChangeStreamAccessor if one doesn't exist and store it in the cache for faster access.
getOrCreate(SpannerConfig) - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
 
getOrderingKey() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
Returns the ordering key of the message.
getOrdinalPosition() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
The position of the column in the table.
getOrphanedNewPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
Returns a list of NewPartition entries that have existed for long enough and do not overlap with any missing partition.
getOutput() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.ErrorHandling
 
getOutput() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling
 
getOutput() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling
 
getOutput() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
getOutputCoder() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
 
getOverlappingPartitions(List<Range.ByteStringRange>) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Return a list of overlapping partitions.
getOverloadRatio() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The target ratio between requests sent and successful requests.
getParentLowWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
getParentPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
getParentPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
getParentTokens() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
The unique partition identifiers of the parent partitions where this child partition originated from.
getParentTokens() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The unique partition identifiers of the parent partitions where this child partition originated from.
getPartition() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
getPartition() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
getPartition() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
 
getPartition(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Fetches the partition metadata row data for the given partition token.
getPartition(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
Fetches the partition metadata row data for the given partition token.
getPartitionCreatedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The time at which this partition was first detected and created in the metadata table.
getPartitionEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The end time for the partition change stream query, which produced this record.
getPartitionMetadataAdminDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
Creates and returns a singleton DAO instance for admin operations over the partition metadata table.
getPartitionMetadataDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
Creates and returns a singleton DAO instance for accessing the partition metadata table.
getPartitionQueryTimeout() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getPartitionReadTimeout() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getPartitionRunningAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The time at which the connector started processing this partition.
getPartitionScheduledAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The time at which this partition was scheduled to be queried.
getPartitionStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The start time for the partition change stream query, which produced this record.
getPartitionsToReconcile(Instant, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
For missing partitions, try to organize the mismatched parent tokens so that they fill the missing partitions.
getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The partition token that produced this change stream record.
getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The unique identifier of the partition that generated this record.
getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
Unique partition identifier, which can be used to perform a change stream query.
getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
 
getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SchemaPath
 
getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
getPatientCompartments() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
Gets the patient compartment responses for GetPatientEverything requests.
getPatientEverything() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Get the patient compartment for a FHIR Patient using the GetPatientEverything/$everything API.
getPatientEverything(String, Map<String, Object>, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Gets the GetPatientEverything ($everything) response for a FHIR Patient as an HTTP body.
getPatientEverything(String, Map<String, Object>, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
getPayload() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
getPayload() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
Returns the main PubSub message.
getPgJsonb(int) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
Returns the record at the current pointer as JsonB.
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
 
getPipeline() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
getPipelineOptions() - Method in class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
 
getProgress() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
Returns the progress made within the restriction so far.
getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the project path.
getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
Returns the project id being written to.
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SchemaPath
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getProjectId() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
 
getPubsubRootUrl() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
Root URL for use with the Google Cloud Pub/Sub API.
getQueries() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
 
getQuery() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
Configures the BigQuery read job with the SQL query.
getQuery() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
 
getQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getQuery() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
getQuery() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
 
getQueryLocation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
BigQuery geographic location where the query job will be executed.
getQueryName() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
getQueryStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
Returns the timestamp at which the change stream query for a ChangeStreamResultSet first started.
getQueryStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The time that the change stream query which produced this record started.
getRawBytesToRowFunction(Schema) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
getReadOperation() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
getReadResponse() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
Gets resources.
getReadTime() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
getRecord() - Method in class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
 
getRecordReadAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
Returns the timestamp at which a record was read from the ChangeStreamResultSet.
getRecordReadAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The time at which the record was fully read.
getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
Indicates the order in which a record was put into the stream.
getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
Indicates the order in which this record was put into the change stream in the scope of a partition, commit timestamp and transaction tuple.
getRecordStreamEndedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
Returns the timestamp at which a record finished to be streamed.
getRecordStreamEndedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The time at which the record finished streaming.
getRecordStreamStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
Returns the timestamp at which a record first started to be streamed.
getRecordStreamStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The time at which the record started to be streamed.
getRecordTimestamp() - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecord
The timestamp associated with the record.
getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The Cloud Spanner timestamp time when this record occurred.
getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
Returns the timestamp at which this partition started being valid in Cloud Spanner.
getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The timestamp at which the modifications within were committed in Cloud Spanner.
getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
Indicates the timestamp for which the change stream query has returned all changes.
getResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
Gets resources.
getResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
Gets resources.
getResourceType() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
 
getResponse() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
HTTP response from the FHIR store after attempting to write the Bundle.
getResult() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
Returns the result of the transaction execution.
getRetryableCodes() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
getRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
 
getRowRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
 
getRowsWritten() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
The number of rows written in this batch.
getRowToRawBytesFunction(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
getRowType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The type of the primary keys and modified columns within this record.
getRpcPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
getRunningAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The time at which the connector started processing this partition.
getSamplePeriod() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The length of time sampled request data will be retained.
getSamplePeriodBucketSize() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The size of buckets within the specified samplePeriod.
getScheduledAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The time at which this partition was scheduled to be queried.
getSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
 
getSchema(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns the table schema for the destination.
getSchema(TableReference, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
 
getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Return a Beam Schema from the Pub/Sub schema resource, if it exists.
getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
Return a Beam Schema from the Pub/Sub schema resource, if it exists.
getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
Return a Beam Schema from the Pub/Sub schema resource, if it exists.
getSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getSchema(PubsubClient.SchemaPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
getSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Return the SchemaPath from the TopicPath, if it exists.
getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
Return the SchemaPath from the TopicPath, if it exists.
getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
Return the SchemaPath from the TopicPath, if it exists.
getSchemaPath(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
getSchematizedData() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets schematized data.
getSchemaWithoutAttributes(Schema, List<String>) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
getSelectedFields() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
 
getSendFacility() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets send facility.
getSendTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Gets send time.
getSequenceNumber() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
 
getServerTransactionId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The unique transaction id in which the modifications occurred.
getShortTableUrn() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
Return shortened tablespec in datasets/[dataset]/tables/[table] format.
getSideInputs() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Specifies that this object needs access to one or more side inputs.
getSingleTokenNewPartition(Range.ByteStringRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
Return a new NewPartition that only contains one token that matches the parentPartition.
getSize() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
getSize(StreamProgress) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
getSize(TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
 
getSize(PartitionMetadata, TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
getSplitBacklogBytes() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
getStackTrace() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
 
getStart() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
 
getStartAtTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
getStartTime() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
 
getStartTime() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
The partition_start_time of the child partition token.
getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The start time at which the partition started existing in Cloud Spanner.
getState() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The current state of the partition.
getStatus() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
 
getStatusCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
 
getStorageApiAppendThresholdBytes() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getStorageApiAppendThresholdRecordCount() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getStorageClient(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
Returns a real, mock, or fake BigQueryServices.StorageClient.
getStorageClient(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
 
getStorageClient(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
getStorageWriteApiMaxRequestSize() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getStorageWriteApiTriggeringFrequencySec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getStorageWriteMaxInflightBytes() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getStorageWriteMaxInflightRequests() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getStreamAppendClient(String, DescriptorProtos.DescriptorProto, boolean, AppendRowsRequest.MissingValueInterpretation) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
Create an append client for a given Storage API write stream.
getStreamAppendClient(String, DescriptorProtos.DescriptorProto, boolean, AppendRowsRequest.MissingValueInterpretation) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
getStreamAppendClient(String, DescriptorProtos.DescriptorProto, boolean, AppendRowsRequest.MissingValueInterpretation) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getStreamTableDebugString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.DaoFactory
 
getSubscription() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getSubscription() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the subscription being read from.
getSubscriptionName() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
getSubscriptionProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the ValueProvider for the subscription being read from.
getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
 
getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
 
getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
Gets successful bodies from Write.
getSuccessfulBundles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
Gets successful FhirBundleResponse from execute bundles operation.
getSuccessfulInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
Returns a PCollection containing the TableRows that were written to BigQuery via the streaming insert API.
getSuccessfulStorageApiInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
Return all rows successfully inserted using one of the storage-api insert methods.
getSuccessfulTableLoads() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
Returns a PCollection containing the TableDestinations that were successfully loaded using the batch load API.
getTable(BigQueryOptions, TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Returns the table to read, or null if reading from a query instead.
getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
Returns the table to read, or null if reading from a query instead.
getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Returns the table reference, or null.
getTable(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Gets the specified Table resource by table ID.
getTable(TableReference, List<String>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
 
getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
 
getTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
Gets the specified Table resource by table ID.
getTable(TableReference, List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
getTable(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns a TableDestination object for the destination.
getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration
 
getTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
getTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
getTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getTable(TableReference, List<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getTableAdminClient() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableChangeStreamAccessor
 
getTableConstraints(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns TableConstraints (including primary and foreign key) to be used when creating the table.
getTableDescription() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns the table being read from.
getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
 
getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
 
getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
Return the metadata table name.
getTableId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
 
getTableImpl(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getTableName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The name of the table in which the modifications within this record occurred.
getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Returns the table to read, or null if reading from a query instead.
getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
Returns the table to read, or null if reading from a query instead.
getTableReference() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getTableResource(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
getTableRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation
 
getTables() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
 
getTableSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
 
getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
Specifies a table for a BigQuery read job.
getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformConfiguration
Writes to the given table specification.
getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
 
getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
Return the tablespec in [project:].dataset.tableid format.
getTableUrn(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
Return the tablespec in projects/[project]/datasets/[dataset]/tables/[table] format.
getTargetDataset() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
 
getTargetTable(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
 
getTargetTableId(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
 
getTempDatasetId() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getThrottleDuration() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
The amount of time an attempt will be throttled if deemed necessary based on previous success rate.
getThroughputEstimate() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
getTimePartitioning() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
getTimestamp() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
getTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
Indicates the timestamp for which the change stream query has returned all changes.
getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
Get the timestamp attribute.
getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the timestamp attribute.
getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
getTimestampMsSinceEpoch() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
Timestamp for element (ms since epoch).
getTo() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
Returns the range end timestamp (exclusive).
getToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
Unique partition identifier, which can be used to perform a change stream query.
getTokenWithCorrectPartition(Range.ByteStringRange, ChangeStreamContinuationToken) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamContinuationTokenHelper
Return the continuation token with correct partition.
getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
Get the topic being written to.
getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the topic being read from.
getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
getTopicName() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
Get the ValueProvider for the topic being written to.
getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Get the ValueProvider for the topic being read from.
getTotalStreamDuration() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
Returns the total stream duration of change stream records so far.
getTotalStreamTimeMillis() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
The total streaming time (in millis) for this record.
getTransactionTag() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The transaction tag associated with the given transaction.
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation.ReadRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation.WriteRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.ReadRegistrar
 
getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.WriteRegistrar
 
getTriggeringFrequencySeconds() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration
 
getTruncateTimestamps() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
 
getType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
The type of the column.
getType() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.Column
 
getUnfinishedMinWatermark() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Fetches the earliest partition watermark from the partition metadata table that is not in a PartitionMetadata.State.FINISHED state.
getUnknownFields() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
getUnknownFieldsPayload() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
getUpdatedSchema() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
If the table schema has been updated, returns the new schema.
getUpdatedSchema(TableSchema, TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaUpdateUtils
 
getUseAtLeastOnceSemantics() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration
 
getUseStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration
Enables BigQuery's Standard SQL dialect when reading from a query.
getUseStorageApiConnectionPool() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getUseStorageWriteApi() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getUseStorageWriteApiAtLeastOnce() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
getUsingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
 
getUuid() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
getUuidFromMessage(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
getValueCaptureType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
The capture type of the change stream that generated this record.
getWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
 
getWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
 
getWatermark() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
getWatermark() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
The time such that all records with a smaller timestamp have already been processed.
getWatermarkLastUpdated() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
 
getWrite() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
 
getWriteDisposition() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformConfiguration
Specifies what to do with existing data in the table, in case the table already exists.
getWriteDisposition() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration
 
getWriteFailures() - Method in exception org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.FailedWritesException
The list of FirestoreV1.WriteFailures detailing which writes failed and for what reason.
getWriteResult() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
 
getWriteStream(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.WriteStreamService
 
getWriteStream(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
getWriteStream(String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
getWriteStreamService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
Returns a real, mock, or fake BigQueryServices.WriteStreamService.
getWriteStreamService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl
 
getWriteStreamService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
grouped() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Same transform, but applicable to a PCollection of MutationGroup.

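A minimal sketch of reading the PubsubMessage accessors indexed above (getPayload(), getMessageId(), getOrderingKey()) inside a DoFn; the DoFn scaffolding, the output format, and the class name LogPubsubFieldsFn are illustrative assumptions, not part of the API.

    import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;
    import org.apache.beam.sdk.transforms.DoFn;

    /** Hypothetical DoFn reading the PubsubMessage accessors listed in this index. */
    class LogPubsubFieldsFn extends DoFn<PubsubMessage, String> {
      @ProcessElement
      public void processElement(@Element PubsubMessage message, OutputReceiver<String> out) {
        // getPayload() returns the raw message bytes; getMessageId() and
        // getOrderingKey() are populated by Cloud Pub/Sub when available.
        byte[] payload = message.getPayload();
        String messageId = message.getMessageId();
        String orderingKey = message.getOrderingKey();
        out.output(messageId + " (" + payload.length + " bytes, orderingKey=" + orderingKey + ")");
      }
    }
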
H

hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
 
hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
hasNext() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
 
hasNext() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages.HL7v2MessagePagesIterator
 
HealthcareApiClient - Interface in org.apache.beam.sdk.io.gcp.healthcare
Defines a client to communicate with the GCP HCLS API (version v1).
HealthcareIOError<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
Class for capturing errors on IO operations on Google Cloud Healthcare APIs resources.
HealthcareIOErrorCoder<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
 
HealthcareIOErrorToTableRow<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
Convenience transform to write dead-letter HealthcareIOErrors to BigQuery TableRows.
HealthcareIOErrorToTableRow() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
 
HEARTBEAT_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of heartbeats identified during the execution of the Connector.
HEARTBEAT_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for the total number of heartbeat records identified during the execution of the Connector.
HeartbeatRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
A heartbeat record serves as a notification that the change stream query has returned all changes for the partition less or equal to the record timestamp.
HeartbeatRecord(Timestamp, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
Constructs the heartbeat record with the given timestamp and metadata.
heartbeatRecordAction(ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
Creates and returns a singleton instance of an action class capable of processing HeartbeatRecords.
HeartbeatRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
This class is part of the process for ReadChangeStreamPartitionDoFn SDF.
HL7v2IO - Class in org.apache.beam.sdk.io.gcp.healthcare
HL7v2IO provides an API for reading from and writing to Google Cloud Healthcare HL7v2 API.
HL7v2IO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
 
HL7v2IO.HL7v2Read - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Read that reads HL7v2 message contents given a PCollection of HL7v2ReadParameter.
HL7v2IO.HL7v2Read.FetchHL7v2Message - Class in org.apache.beam.sdk.io.gcp.healthcare
PTransform to fetch a message from a Google Cloud Healthcare HL7v2 store based on msgID.
HL7v2IO.HL7v2Read.FetchHL7v2Message.HL7v2MessageGetFn - Class in org.apache.beam.sdk.io.gcp.healthcare
DoFn for fetching messages from the HL7v2 store with error handling.
HL7v2IO.HL7v2Read.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Result includes PCollection of HL7v2ReadResponse objects for successfully read results and PCollection of HealthcareIOError objects for failed reads.
HL7v2IO.ListHL7v2Messages - Class in org.apache.beam.sdk.io.gcp.healthcare
List HL7v2 messages in HL7v2 Stores with optional filter.
HL7v2IO.Read - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Read that reads HL7v2 message contents given a PCollection of message IDs strings.
HL7v2IO.Read.FetchHL7v2Message - Class in org.apache.beam.sdk.io.gcp.healthcare
PTransform to fetch a message from a Google Cloud Healthcare HL7v2 store based on msgID.
HL7v2IO.Read.FetchHL7v2Message.HL7v2MessageGetFn - Class in org.apache.beam.sdk.io.gcp.healthcare
DoFn for fetching messages from the HL7v2 store with error handling.
HL7v2IO.Read.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Result includes PCollection of HL7v2Message objects for successfully read results and PCollection of HealthcareIOError objects for failed reads.
HL7v2IO.Write - Class in org.apache.beam.sdk.io.gcp.healthcare
The type Write that writes the given PCollection of HL7v2 messages.
HL7v2IO.Write.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
 
HL7v2IO.Write.WriteMethod - Enum in org.apache.beam.sdk.io.gcp.healthcare
The enum Write method.
HL7v2Message - Class in org.apache.beam.sdk.io.gcp.healthcare
The type HL7v2 message to wrap the Message model.
HL7v2Message(String, String, String, String, String, String, String, Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
 
HL7v2MessageCoder - Class in org.apache.beam.sdk.io.gcp.healthcare
 
HL7v2MessageGetFn() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message.HL7v2MessageGetFn
 
HL7v2Read() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read
 
HL7v2ReadParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
HL7v2ReadParameter represents the read parameters for a HL7v2 read request, used as the input type for HL7v2IO.HL7v2Read.
HL7v2ReadParameter() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
 
HL7v2ReadResponse - Class in org.apache.beam.sdk.io.gcp.healthcare
HL7v2ReadResponse represents the response format for a HL7v2 read request, used as the output type of HL7v2IO.HL7v2Read.
HL7v2ReadResponse(String, HL7v2Message) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
 
HL7v2ReadResponseCoder - Class in org.apache.beam.sdk.io.gcp.healthcare
 
HttpHealthcareApiClient - Class in org.apache.beam.sdk.io.gcp.healthcare
A client that talks to the Cloud Healthcare API through HTTP requests.
HttpHealthcareApiClient() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
Instantiates a new HTTP Healthcare API client.
HttpHealthcareApiClient(CloudHealthcare) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
Instantiates a new HTTP Healthcare API client.
HttpHealthcareApiClient.AuthenticatedRetryInitializer - Class in org.apache.beam.sdk.io.gcp.healthcare
 
HttpHealthcareApiClient.FhirResourcePagesIterator - Class in org.apache.beam.sdk.io.gcp.healthcare
The type FhirResourcePagesIterator for methods which return paged output.
HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod - Enum in org.apache.beam.sdk.io.gcp.healthcare
 
HttpHealthcareApiClient.HealthcareHttpException - Exception in org.apache.beam.sdk.io.gcp.healthcare
Wraps HttpResponse in an exception with a statusCode field for use with HealthcareIOError.
HttpHealthcareApiClient.HL7v2MessagePages - Class in org.apache.beam.sdk.io.gcp.healthcare
 
HttpHealthcareApiClient.HL7v2MessagePages.HL7v2MessagePagesIterator - Class in org.apache.beam.sdk.io.gcp.healthcare
Iterator over pages of HL7v2 message IDs.

I

identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryDlqProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
Implementation of the TypedSchemaTransformProvider identifier method.
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformProvider
Implementation of the TypedSchemaTransformProvider identifier method.
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
Returns an id that uniquely represents this IO.
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
Returns an id that uniquely represents this IO.
identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubDlqProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
Returns an id that uniquely represents this IO.
identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
identifier() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
 
ignoreInsertIds() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Setting this option to true disables insertId based data deduplication offered by BigQuery.
ignoreUnknownValues() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Accept rows that contain values that do not match the schema.
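These options are typically chained onto a BigQueryIO.Write; a minimal sketch, assuming an upstream PCollection<TableRow> rows and a placeholder table name:

    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")
            .ignoreUnknownValues()   // drop row fields that do not match the table schema
            .ignoreInsertIds());     // skip insertId-based deduplication for higher streaming throughput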
importFhirResource(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Imports a FHIR resource from GCS.
importFhirResource(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
importResources(String, String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Import resources.
importResources(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Import resources.
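A rough sketch of importing NDJSON bundles from GCS into a FHIR store; the store path and GCS prefixes are placeholders, and the meaning of the second and third arguments (temp and dead-letter prefixes) plus the ContentStructure value are assumptions to verify against FhirIO.Import:

    PCollection<String> gcsUris =
        p.apply(Create.of("gs://my-bucket/fhir/bundles.ndjson"));

    gcsUris.apply(
        FhirIO.importResources(
            "projects/my-project/locations/us-central1/datasets/my-dataset/fhirStores/my-store",
            "gs://my-bucket/temp",          // assumed: temporary GCS prefix used during import
            "gs://my-bucket/dead-letter",   // assumed: dead-letter GCS prefix for failed resources
            FhirIO.Import.ContentStructure.BUNDLE));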
in(Pipeline, PCollection<FhirBundleResponse>, PCollection<HealthcareIOError<FhirBundleParameter>>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
Entry point for the ExecuteBundlesResult, storing the successful and failed bundles and their metadata.
incActivePartitionReadCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.ACTIVE_PARTITION_READ_COUNT by 1 if the metric is enabled.
incChangeStreamMutationGcCounter() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.CHANGE_STREAM_MUTATION_GC_COUNT by 1 if the metric is enabled.
incChangeStreamMutationUserCounter() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.CHANGE_STREAM_MUTATION_USER_COUNT by 1 if the metric is enabled.
incClosestreamCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.CLOSESTREAM_COUNT by 1 if the metric is enabled.
incDataRecordCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.DATA_RECORD_COUNT by 1 if the metric is enabled.
incHeartbeatCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.HEARTBEAT_COUNT by 1 if the metric is enabled.
incHeartbeatRecordCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.HEARTBEAT_RECORD_COUNT by 1 if the metric is enabled.
incListPartitionsCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.LIST_PARTITIONS_COUNT by 1 if the metric is enabled.
IncomingMessage() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
 
incOrphanedNewPartitionCleanedCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
incPartitionMergeCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.PARTITION_MERGE_COUNT by 1 if the metric is enabled.
incPartitionReconciledWithoutTokenCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
incPartitionReconciledWithTokenCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
incPartitionRecordCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.PARTITION_RECORD_COUNT by 1 if the metric is enabled.
incPartitionRecordMergeCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.PARTITION_RECORD_MERGE_COUNT by 1 if the metric is enabled.
incPartitionRecordSplitCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.PARTITION_RECORD_SPLIT_COUNT by 1 if the metric is enabled.
incPartitionSplitCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.PARTITION_SPLIT_COUNT by 1 if the metric is enabled.
incPartitionStreamCount() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
incQueryCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Increments the ChangeStreamMetrics.QUERY_COUNT by 1 if the metric is enabled.
ingestHL7v2Message(String, Message) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Ingest an HL7v2 message.
ingestHL7v2Message(String, Message) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
ingestMessages(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Write with Messages.Ingest method.
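A minimal sketch of writing with the ingest method, assuming an upstream PCollection<HL7v2Message> messages and a placeholder store path:

    messages.apply(
        "IngestHL7v2",
        HL7v2IO.ingestMessages(
            "projects/my-project/locations/us-central1/datasets/my-dataset/hl7V2Stores/my-store"));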
initClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
 
initClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
 
initialize(HttpRequest) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.AuthenticatedRetryInitializer
 
InitializeDoFn - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
A DoFn responsible for initializing the metadata table and preparing it to manage the state of the pipeline.
InitializeDoFn(DaoFactory, Instant, BigtableIO.ExistingPipelineOptions) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.InitializeDoFn
 
InitializeDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
A DoFn responsible for initializing the change stream Connector.
InitializeDoFn(DaoFactory, MapperFactory, Timestamp, Timestamp) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.InitializeDoFn
 
InitialPartition - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Utility class to determine initial partition constants and methods.
InitialPartition() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
 
InitialPipelineState - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
State used to initialize a pipeline, output by InitializeDoFn.
InitialPipelineState(Instant, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
 
initialRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
initialRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
initialRestriction(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
Uses a TimestampRange with a max range.
initialRestriction(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
The restriction for a partition will be defined from the start and end timestamp to query the partition for.
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
Implementation of the TypedSchemaTransformProvider inputCollectionNames method.
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformProvider
Implementation of the TypedSchemaTransformProvider inputCollectionNames method.
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
 
insert(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Inserts the partition metadata.
insert(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
Inserts the partition metadata.
INSERT_OR_UPDATE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
INSERT_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
insertAll(TableReference, List<FailsafeValueInSingleWindow<TableRow, TableRow>>, List<String>, InsertRetryPolicy, List<ValueInSingleWindow<T>>, ErrorContainer<T>, boolean, boolean, boolean, List<ValueInSingleWindow<TableRow>>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Inserts TableRows with the specified insertIds if not null.
insertAll(TableReference, List<FailsafeValueInSingleWindow<TableRow, TableRow>>, List<String>, InsertRetryPolicy, List<ValueInSingleWindow<T>>, ErrorContainer<T>, boolean, boolean, boolean, List<ValueInSingleWindow<TableRow>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
insertAll(TableReference, List<TableRow>, List<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
insertAll(TableReference, List<FailsafeValueInSingleWindow<TableRow, TableRow>>, List<String>, InsertRetryPolicy, List<ValueInSingleWindow<T>>, ErrorContainer<T>, boolean, boolean, boolean, List<ValueInSingleWindow<TableRow>>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
InsertBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.InsertBuilder
 
insertDataToTable(String, String, String, List<Map<String, Object>>) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
Inserts rows to a table using a BigQuery streaming write.
InsertOrUpdateBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.InsertOrUpdateBuilder
 
InsertRetryPolicy - Class in org.apache.beam.sdk.io.gcp.bigquery
A retry policy for streaming BigQuery inserts.
InsertRetryPolicy() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
 
InsertRetryPolicy.Context - Class in org.apache.beam.sdk.io.gcp.bigquery
Contains information about a failed insert.
insertRows(Schema, Row...) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
instanceId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
instantiateHealthcareClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message.HL7v2MessageGetFn
Instantiates a Healthcare API client (version v1).
instantiateHealthcareClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message.HL7v2MessageGetFn
Instantiates a Healthcare API client (version v1).
InTransactionContext(String, TransactionContext, Dialect) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
Constructs a context to execute a user defined function transactionally.
isAppProfileSingleClusterAndTransactional(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
Verifies that the app profile uses single-cluster routing and allows single-row transactions.
isBounded() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
Indicates whether the PCollections produced by this transform will contain a bounded or unbounded number of elements.
isBounded() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
This restriction tracker is for unbounded streams.
isBounded() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
 
isBounded() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
 
isBounded() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
isEmpty() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
isFailToLock() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
isHeartbeat() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
isInitialPartition(String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
Verifies if the given partition token is the initial partition.
isLastRecordInTransactionInPartition() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
Indicates whether this record is the last emitted for the given transaction in the given partition.
isPrimaryKey() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
True if the column is part of the primary key, false otherwise.
isResume() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
 
isRowLocked(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
Returns true if row is locked.
isShouldReportDiagnosticMetrics() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
Whether additional diagnostic metrics should be reported for a Transform.
isSystemTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
Whether the given transaction is a Spanner system transaction.
isTableEmpty(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Returns true if the table is empty.
isTableEmpty(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
isTableEmpty(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
isValidPartition(Range.ByteStringRange) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Checks if the partition's start key is before its end key.
iterator() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
 
iterator() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
iterator() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
 

J

JsonArrayCoder - Class in org.apache.beam.sdk.io.gcp.healthcare
 
JsonArrayCoder() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
 
jsonValueFromMessageValue(Descriptors.FieldDescriptor, Object, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 

K

KEY - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
KEY_FIELD_PROPERTY - Static variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
 
keyField - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
KeyPart() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema.KeyPart
 
kind - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
knownBuilderInstances() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
 
knownBuilderInstances() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
knownBuilders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead
 
knownBuilders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite
 

L

LABELS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
lastAttemptedPosition - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
lastClaimedPosition - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
LIST_PARTITIONS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of partitions identified during the execution of the Connector.
listAllFhirStores(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
List all FHIR stores in a dataset.
listAllFhirStores(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
listCollectionIds() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
Factory method to create a new type safe builder for ListCollectionIdsRequest operations.
listDocuments() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
Factory method to create a new type safe builder for ListDocumentsRequest operations.
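These builders hang off FirestoreIO.v1().read(); a rough sketch for listing documents, assuming an upstream PCollection<ListDocumentsRequest> requests and that the transform emits Document elements:

    PCollection<Document> documents =
        requests.apply(FirestoreIO.v1().read().listDocuments().build());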
listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Return a list of subscriptions for topic in project.
listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Return a list of topics for project.
listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
location - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
location - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
lockAndRecordPartition(PartitionRecord) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Lock the partition in the metadata table for the DoFn streaming it.
longToByteArray(long) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
 

M

makeHL7v2ListRequest(String, String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Makes an HL7v2 list request and returns the list-messages response.
makeHL7v2ListRequest(String, String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
makeListRequest(HealthcareApiClient, String, Instant, Instant, String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
Makes a list request and returns the list-messages response.
makeSendTimeBoundHL7v2ListRequest(String, Instant, Instant, String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Time Bound HL7v2 list request.
makeSendTimeBoundHL7v2ListRequest(String, Instant, Instant, String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
ManagedFactory<T extends java.lang.AutoCloseable> - Interface in org.apache.beam.sdk.io.gcp.pubsublite.internal
A ManagedFactory produces instances and tears down any produced instances when it is itself closed.
ManagedFactoryImpl<T extends java.lang.AutoCloseable> - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
MapperFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
Factory class for creating instances that will map a struct to a connector model.
MapperFactory(Dialect) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.MapperFactory
 
markNewPartitionForDeletion(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
This is the first step of a two-phase delete.
matchesSafely(BigqueryMatcher.TableAndQuery) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
 
MAX_INCLUSIVE_END_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
Represents the max end at that can be specified for a change stream.
message() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
Underlying Message.
messageFromBeamRow(Descriptors.Descriptor, Row, String, long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BeamRowToStorageApiProto
Given a Beam Row object, returns a protocol-buffer message that can be used to write data using the BigQuery Storage streaming API.
messageFromGenericRecord(Descriptors.Descriptor, GenericRecord, String, long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.AvroGenericRecordToStorageApiProto
Given an Avro GenericRecord object, returns a protocol-buffer message that can be used to write data using the BigQuery Storage streaming API.
messageFromMap(TableRowToStorageApiProto.SchemaInformation, Descriptors.Descriptor, AbstractMap<String, Object>, boolean, boolean, TableRow, String, long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
messageFromTableRow(TableRowToStorageApiProto.SchemaInformation, Descriptors.Descriptor, TableRow, boolean, boolean, TableRow, String, long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
Given a BigQuery TableRow, returns a protocol-buffer message that can be used to write data using the BigQuery Storage API.
METADATA - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata
TupleTag for the main output.
MetadataSpannerConfigFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
This class generates a SpannerConfig for the change stream metadata database by copying only the necessary fields from the SpannerConfig of the primary database.
MetadataSpannerConfigFactory() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.MetadataSpannerConfigFactory
 
MetadataTableAdminDao - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
Data access object for creating and dropping the metadata table.
MetadataTableAdminDao(BigtableTableAdminClient, BigtableInstanceAdminClient, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
MetadataTableDao - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
Data access object for managing the state of the metadata Bigtable table.
MetadataTableDao(BigtableDataClient, String, ByteString) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
 
MetadataTableEncoder - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder
Helper methods that simplify conversion and extraction of metadata table content.
MetadataTableEncoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
 
METRICS_NAMESPACE - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
microsecondToInstant(long) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
 
Mod - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Represents a modification in a table emitted within a DataChangeRecord.
Mod(String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
Constructs a mod from the primary key values, the old state of the row and the new state of the row.
modeToProtoMode(String, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Modify the ack deadline for messages from subscription with ackIds to be deadlineSeconds from now.
modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
ModType - Enum in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Represents the type of modification applied in the DataChangeRecord.
MutationGroup - Class in org.apache.beam.sdk.io.gcp.spanner
A bundle of mutations that must be submitted atomically.

N

NameGenerator - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
This class generates a unique name for the partition metadata table, which is created when the Connector is initialized.
NameGenerator() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.NameGenerator
 
neverRetry() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
Never retry any failures.
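Retry policies are attached to streaming inserts through BigQueryIO.Write.withFailedInsertRetryPolicy; a brief sketch with a placeholder table name:

    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")
            .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
            .withFailedInsertRetryPolicy(InsertRetryPolicy.neverRetry()));  // fail fast instead of retrying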
NEW_PARTITION_PREFIX - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
Factory method to return a new instance of RpcQosOptions.Builder with all values set to their initial default values.
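A minimal sketch of building the options from their defaults; withMaxAttempts is an assumed Builder method name to verify against RpcQosOptions.Builder:

    RpcQosOptions options =
        RpcQosOptions.newBuilder()
            .withMaxAttempts(3)  // assumed builder setter; start from defaults and override as needed
            .build();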
newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions
 
newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
 
newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
 
newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
 
newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
Creates a builder for constructing a partition metadata instance.
newClient(String, String, PubsubOptions, String) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
Construct a new Pubsub client.
newClient(String, String, PubsubOptions) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
 
newDlqTransform(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryDlqProvider
 
newDlqTransform(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubDlqProvider
 
newDlqTransform(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
 
NewPartition - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
Represents a new partition resulting from splits and merges.
NewPartition(Range.ByteStringRange, List<ChangeStreamContinuationToken>, Instant, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
NewPartition(Range.ByteStringRange, List<ChangeStreamContinuationToken>, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
newTracker(TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
 
newTracker(PartitionMetadata, TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
 
newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
next() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
 
next() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages.HL7v2MessagePagesIterator
 
next() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
Moves the pointer to the next record in the ResultSet if there is one.
next(Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
Adds one nanosecond to the given timestamp.
now(Matcher<Iterable<? extends Row>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery.RowsAssertion
 
NullSizeEstimator<T> - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
NoOp implementation of a size estimator.
NullSizeEstimator() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.NullSizeEstimator
 
NullThroughputEstimator<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
NoOp implementation of a throughput estimator.
NullThroughputEstimator() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.NullThroughputEstimator
 
NUM_QUERY_SPLITS_MAX - Static variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
An upper bound on the number of splits for a query.

O

of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
 
of(TableRow, RowMutationInformation) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutation.RowMutationCoder
 
of(RowMutationInformation.MutationType, long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
 
of(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
 
of(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
 
of(FhirBundleParameter, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
 
of(String, String, Map<String, T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
Creates a FhirSearchParameter of type T.
of(String, Map<String, T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
Creates a FhirSearchParameter of type T, without a key.
of(Coder<T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
 
of(Coder<T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
 
of(PCollectionTuple) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.Result
 
of(PCollectionTuple) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
 
of() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
 
of(Class<HL7v2Message>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
 
of(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadParameter
 
of(String, HL7v2Message) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
Builds an HL7v2ReadResponse from metadata and an hl7v2Message.
of() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponseCoder
 
of(Class<HL7v2ReadResponse>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponseCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
 
of(PubsubMessage, long, long, String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
 
of(PubsubMessage, long, String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
 
of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
 
of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
 
of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
 
of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
 
of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
 
of(ByteString) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
 
of(Timestamp, Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
Constructs a timestamp range.
OffsetByteRangeCoder - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
OffsetByteRangeCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
 
ofPatientEverything(HealthcareApiClient, String, Map<String, Object>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
Instantiates a new GetPatientEverything FHIR resource pages iterator.
ofSearch(HealthcareApiClient, String, String, Map<String, Object>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
Instantiates a new search FHIR resource pages iterator.
OK - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages.ConvertMessagesDoFn
 
onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiFlushAndFinalizeDoFn
 
onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
 
optimizedWrites() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If true, enables new codepaths that are expected to use fewer resources while writing to BigQuery.
org.apache.beam.sdk.io.gcp.bigquery - package org.apache.beam.sdk.io.gcp.bigquery
Defines transforms for reading and writing from Google BigQuery.
org.apache.beam.sdk.io.gcp.bigquery.providers - package org.apache.beam.sdk.io.gcp.bigquery.providers
Defines SchemaTransformProviders for reading and writing from Google BigQuery.
org.apache.beam.sdk.io.gcp.bigtable - package org.apache.beam.sdk.io.gcp.bigtable
Defines transforms for reading and writing from Google Cloud Bigtable.
org.apache.beam.sdk.io.gcp.bigtable.changestreams - package org.apache.beam.sdk.io.gcp.bigtable.changestreams
Change stream for Google Cloud Bigtable.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.action - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
Business logic to process change stream for Google Cloud Bigtable.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao
Data access object for change stream for Google Cloud Bigtable.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
DoFn and SDF definitions to process Google Cloud Bigtable Change Streams.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder
Encoders for writing and reading from Metadata Table for Google Cloud Bigtable Change Streams.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
Classes related to estimating the throughput of the change streams SDFs.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.model - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
User models for the Google Cloud Bigtable change stream API.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler
Partition reconciler for Google Cloud Bigtable Change Streams.
org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction - package org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction
Custom RestrictionTracker for Google Cloud Bigtable Change Streams.
org.apache.beam.sdk.io.gcp.common - package org.apache.beam.sdk.io.gcp.common
Defines common Google Cloud Platform IO support classes.
org.apache.beam.sdk.io.gcp.datastore - package org.apache.beam.sdk.io.gcp.datastore
Provides an API for reading from and writing to Google Cloud Datastore over different versions of the Cloud Datastore Client libraries.
org.apache.beam.sdk.io.gcp.firestore - package org.apache.beam.sdk.io.gcp.firestore
Provides an API for reading from and writing to Google Cloud Firestore.
org.apache.beam.sdk.io.gcp.healthcare - package org.apache.beam.sdk.io.gcp.healthcare
Provides an API for reading from and writing to the Google Cloud Healthcare API (HL7v2, FHIR, and DICOM stores).
org.apache.beam.sdk.io.gcp.pubsub - package org.apache.beam.sdk.io.gcp.pubsub
Defines transforms for reading and writing from Google Cloud Pub/Sub.
org.apache.beam.sdk.io.gcp.pubsublite - package org.apache.beam.sdk.io.gcp.pubsublite
Defines transforms for reading and writing from Google Cloud Pub/Sub Lite.
org.apache.beam.sdk.io.gcp.pubsublite.internal - package org.apache.beam.sdk.io.gcp.pubsublite.internal
Defines transforms for reading and writing from Google Cloud Pub/Sub Lite.
org.apache.beam.sdk.io.gcp.spanner - package org.apache.beam.sdk.io.gcp.spanner
Provides an API for reading from and writing to Google Cloud Spanner.
org.apache.beam.sdk.io.gcp.spanner.changestreams - package org.apache.beam.sdk.io.gcp.spanner.changestreams
Provides an API for reading change stream data from Google Cloud Spanner.
org.apache.beam.sdk.io.gcp.spanner.changestreams.action - package org.apache.beam.sdk.io.gcp.spanner.changestreams.action
Action processors for each of the types of Change Stream records received.
org.apache.beam.sdk.io.gcp.spanner.changestreams.dao - package org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Database Access Objects for querying change streams and modifying the Connector's metadata tables.
org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn - package org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
DoFn and SDF definitions to process Google Cloud Spanner Change Streams.
org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder - package org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder
User model for the Spanner change stream API.
org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator - package org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
Classes related to estimating the throughput of the change streams SDFs.
org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper - package org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
Mapping related functionality, such as from ResultSets to Change Stream models.
org.apache.beam.sdk.io.gcp.spanner.changestreams.model - package org.apache.beam.sdk.io.gcp.spanner.changestreams.model
User models for the Spanner change stream API.
org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction - package org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
Custom restriction tracker related classes.
org.apache.beam.sdk.io.gcp.testing - package org.apache.beam.sdk.io.gcp.testing
Defines utilities for unit testing Google Cloud Platform components of Apache Beam pipelines.
ORPHANED_NEW_PARTITION_CLEANED_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of orphaned new partitions cleaned up.
OrphanedMetadataCleaner - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler
 
OrphanedMetadataCleaner() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.OrphanedMetadataCleaner
 
OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
The tag for the main output of FHIR resources.
OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
The tag for the main output of FHIR Resources from a search.
OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
The tag for the main output of FHIR Resources from a GetPatientEverything request.
OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read
The tag for the main output of HL7v2 read responses.
OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
The tag for the main output of HL7v2 Messages.
OutgoingMessage() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
 
OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
OUTPUT_TAG - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformProvider
Implementation of the TypedSchemaTransformProvider outputCollectionNames method.
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformProvider
Implementation of the TypedSchemaTransformProvider outputCollectionNames method.
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
 

P

PARENT_TOKENS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
The empty set representing the initial partition parent tokens.
parseDicomWebpath(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser
 
ParsedMetricName() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics.ParsedMetricName
 
parseInitialContinuationTokens(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
Returns a list of initial tokens from a row.
parseLockUuid(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
Returns the uuid from a row.
parseMetricName(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
Parse a 'metric name' String that was created with 'createLabeledMetricName'.
ParsePayloadAsPubsubMessageProto() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePayloadAsPubsubMessageProto
 
ParsePubsubMessageProtoAsPayload() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePubsubMessageProtoAsPayload
 
ParsePubsubMessageProtoAsPayloadFromWindowedValue() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.ParsePubsubMessageProtoAsPayloadFromWindowedValue
 
parseTableSpec(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
Parse a table specification in the form "[project_id]:[dataset_id].[table_id]" or "[project_id].[dataset_id].[table_id]" or "[dataset_id].[table_id]".
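A small sketch of the accepted forms; the fallback to the pipeline's default project for the two-part form is an assumption to verify:

    TableReference t1 = BigQueryHelpers.parseTableSpec("my-project:my_dataset.my_table");
    TableReference t2 = BigQueryHelpers.parseTableSpec("my-project.my_dataset.my_table");
    TableReference t3 = BigQueryHelpers.parseTableSpec("my_dataset.my_table");  // project resolved elsewhere (assumed)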
parseTableUrn(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
parseTimestampAsMsSinceEpoch(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Parses timestamp and returns it as milliseconds since the Unix epoch.
parseTokenFromRow(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
Read the continuation token cell of a row from ReadRows.
parseWatermarkFromRow(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
Read the watermark cell of a row from ReadRows.
parseWatermarkLastUpdatedFromRow(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.encoder.MetadataTableEncoder
Return the timestamp (the time it was updated) of the watermark cell.
PARTITION_CREATED_TO_SCHEDULED_MS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Time in milliseconds that a partition took to transition from PartitionMetadata.State.CREATED to PartitionMetadata.State.SCHEDULED.
PARTITION_MERGE_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of partition merges identified during the execution of the Connector.
PARTITION_RECONCILED_WITH_TOKEN_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of partitions reconciled with continuation tokens.
PARTITION_RECONCILED_WITHOUT_TOKEN_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of partitions reconciled without continuation tokens.
PARTITION_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for the total number of partitions identified during the execution of the Connector.
PARTITION_RECORD_MERGE_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for the total number of partition merges identified during the execution of the Connector.
PARTITION_RECORD_SPLIT_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for the total number of partition splits / moves identified during the execution of the Connector.
PARTITION_SCHEDULED_TO_RUNNING_MS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Time in milliseconds that a partition took to transition from PartitionMetadata.State.SCHEDULED to PartitionMetadata.State.RUNNING.
PARTITION_SPLIT_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of partition splits / moves identified during the execution of the Connector.
PARTITION_STREAM_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Counter for the total number of active partitions being streamed.
PARTITION_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
The token of the initial partition.
PartitionMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Model for the partition metadata database table used in the Connector.
PartitionMetadata(String, HashSet<String>, Timestamp, Timestamp, long, PartitionMetadata.State, Timestamp, Timestamp, Timestamp, Timestamp, Timestamp) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
 
PartitionMetadata.Builder - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Partition metadata builder for better user experience.
PartitionMetadata.State - Enum in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
The state at which a partition can be in the system: CREATED: the partition has been created, but no query has been done against it yet.
PartitionMetadataAdminDao - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Data access object for creating and dropping the partition metadata table.
PartitionMetadataDao - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Data access object for the Connector metadata tables.
PartitionMetadataDao.InTransactionContext - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Represents the execution of a read / write transaction in Cloud Spanner.
PartitionMetadataDao.TransactionResult<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
Represents a result from executing a Cloud Spanner read / write transaction.
partitionMetadataMapper() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.MapperFactory
Creates and returns a single instance of a mapper class capable of transforming a Struct into a PartitionMetadata class.
PartitionMetadataMapper - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
This class is responsible for transforming a Struct to a PartitionMetadata.
partitionQuery() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
Factory method to create a new type safe builder for PartitionQueryRequest operations.
PartitionReconciler - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler
There can be a race when many splits and merges happen to a single partition in quick succession.
PartitionReconciler(MetadataTableDao, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.reconciler.PartitionReconciler
 
PartitionRecord - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
Output result of DetectNewPartitionsDoFn containing information required to stream a partition.
PartitionRecord(Range.ByteStringRange, List<ChangeStreamContinuationToken>, Instant, List<NewPartition>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
PartitionRecord(Range.ByteStringRange, Instant, Instant, List<NewPartition>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
PartitionRecord(Range.ByteStringRange, Instant, String, Instant, List<NewPartition>, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
PartitionRecord(Range.ByteStringRange, List<ChangeStreamContinuationToken>, String, Instant, List<NewPartition>, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
partitionsToString(List<Range.ByteStringRange>) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ByteStringRangeHelper
Convert partitions to a string for debugging.
patchTableDescription(TableReference, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
Patch BigQuery Table description.
patchTableDescription(TableReference, String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.DatasetServiceImpl
 
patchTableDescription(TableReference, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
PatientEverythingParameter() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.PatientEverythingParameter
 
PAYLOAD_TOO_LARGE - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
pin() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
Pin this object.
pollFor(Duration) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery.PollingAssertion
 
pollJob(JobReference, int) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Waits until the job is Done, and returns the job.
pollJob(JobReference, int) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
pollOperation(Operation, Long) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Poll operation.
pollOperation(Operation, Long) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
 
PostProcessingMetricsDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
A DoFn class to gather metrics about the emitted DataChangeRecords.
PostProcessingMetricsDoFn(ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.PostProcessingMetricsDoFn
 
PreparePubsubWriteDoFn<InputT> - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PrepareWrite<InputT,DestinationT,OutputT> - Class in org.apache.beam.sdk.io.gcp.bigquery
Prepare an input PCollection for writing to BigQuery.
PrepareWrite(DynamicDestinations<InputT, DestinationT>, SerializableFunction<InputT, OutputT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite
 
previous(Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
 
primary() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
process(PipelineOptions, KV<String, StorageApiFlushAndFinalizeDoFn.Operation>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiFlushAndFinalizeDoFn
 
process(InputT, Instant, BoundedWindow, PaneInfo, DoFn.OutputReceiver<PubsubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PreparePubsubWriteDoFn
 
process(SequencedMessage, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.ErrorFn
 
process(Row, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn
 
process(DataChangeRecord, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.DataChangeRecordToRow
 
processElement(DoFn<KV<DestinationT, ElementT>, KV<DestinationT, StorageApiWritePayload>>.ProcessContext, PipelineOptions, KV<DestinationT, ElementT>, Instant, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages.ConvertMessagesDoFn
 
processElement(Iterable<KV<DestinationT, WriteTables.Result>>, DoFn<Iterable<KV<DestinationT, WriteTables.Result>>, Iterable<KV<TableDestination, WriteTables.Result>>>.ProcessContext, BoundedWindow) - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
 
processElement(InitialPipelineState, RestrictionTracker<OffsetRange, Long>, DoFn.OutputReceiver<PartitionRecord>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
processElement(KV<ByteString, ChangeStreamRecord>, DoFn.OutputReceiver<KV<ByteString, ChangeStreamMutation>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.FilterForMutationDoFn
 
processElement(DoFn.OutputReceiver<InitialPipelineState>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.InitializeDoFn
 
processElement(PartitionRecord, RestrictionTracker<StreamProgress, StreamProgress>, DoFn.OutputReceiver<KV<ByteString, ChangeStreamRecord>>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
processElement(DoFn<T, T>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
Emit only as many elements as the exponentially increasing budget allows.
processElement(DoFn<HL7v2ReadParameter, HL7v2ReadResponse>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.HL7v2Read.FetchHL7v2Message.HL7v2MessageGetFn
Process element.
processElement(DoFn<String, HL7v2Message>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message.HL7v2MessageGetFn
Process element.
processElement(PubsubMessage, Instant) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
processElement(Row, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider.ErrorFn
 
processElement(PubSubMessage) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
 
processElement(PubSubMessage, DoFn.OutputReceiver<PubSubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage.SetUuidFn
 
processElement(DoFn.OutputReceiver<Void>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.CleanUpReadChangeStreamDoFn
 
processElement(RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<PartitionMetadata>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
Main processing function for the DetectNewPartitionsDoFn function.
processElement(DoFn.OutputReceiver<PartitionMetadata>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.InitializeDoFn
 
processElement(DataChangeRecord, DoFn.OutputReceiver<DataChangeRecord>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.PostProcessingMetricsDoFn
Stage to measure a data record's latencies and metrics.
processElement(PartitionMetadata, RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<DataChangeRecord>, ManualWatermarkEstimator<Instant>, DoFn.BundleFinalizer) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
Performs a change stream query for a given partition.
processElement(DoFn<Void, SpannerSchema>.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
 
PROCESSING_DELAY_FROM_COMMIT_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Distribution for measuring processing delay from commit timestamp.
processNewPartition(NewPartition, DoFn.OutputReceiver<PartitionRecord>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ProcessNewPartitionsAction
Process a single new partition.
processNewPartitionsAction(ChangeStreamMetrics, MetadataTableDao, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
 
ProcessNewPartitionsAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
 
ProcessNewPartitionsAction(ChangeStreamMetrics, MetadataTableDao, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ProcessNewPartitionsAction
 
project - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
projectId - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
projectPathFromId(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
projectPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
ProtoFromBytes<T extends com.google.protobuf.Message> - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
protoModeToJsonMode(TableFieldSchema.Mode) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
protoSchemaToTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
protoTableFieldToTableField(TableFieldSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
protoTableSchemaFromAvroSchema(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.AvroGenericRecordToStorageApiProto
Given an Avro Schema, returns a protocol-buffer TableSchema that can be used to write data through BigQuery Storage API.
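A minimal usage sketch, assuming the usual Beam and Avro imports; the field names are illustrative and the Storage Write API TableSchema return type is assumed from this description:
    // Build a simple Avro schema with Avro's SchemaBuilder (illustrative fields).
    org.apache.avro.Schema avroSchema =
        org.apache.avro.SchemaBuilder.record("User").fields()
            .requiredString("name")
            .requiredLong("score")
            .endRecord();
    // Convert it into a TableSchema proto usable with the BigQuery Storage Write API.
    com.google.cloud.bigquery.storage.v1.TableSchema tableSchema =
        AvroGenericRecordToStorageApiProto.protoTableSchemaFromAvroSchema(avroSchema);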
ProtoToBytes<T extends com.google.protobuf.Message> - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
ProtoToBytes() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.ProtoToBytes
 
protoTypeToJsonType(TableFieldSchema.Type) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Publish outgoingMessages to Pubsub topic.
publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
publish(List<PubsubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Publish messages to TestPubsub.topicPath().
PublisherOptions - Class in org.apache.beam.sdk.io.gcp.pubsublite
Options needed for a Pub/Sub Lite Publisher.
PublisherOptions() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions
 
PublisherOptions.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubClient - Class in org.apache.beam.sdk.io.gcp.pubsub
An (abstract) helper class for talking to Pubsub via an underlying transport.
PubsubClient() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
PubsubClient.IncomingMessage - Class in org.apache.beam.sdk.io.gcp.pubsub
A message received from Pubsub.
PubsubClient.OutgoingMessage - Class in org.apache.beam.sdk.io.gcp.pubsub
A message to be sent to Pubsub.
PubsubClient.ProjectPath - Class in org.apache.beam.sdk.io.gcp.pubsub
Path representing a cloud project id.
PubsubClient.PubsubClientFactory - Interface in org.apache.beam.sdk.io.gcp.pubsub
Factory for creating clients.
PubsubClient.SchemaPath - Class in org.apache.beam.sdk.io.gcp.pubsub
Path representing a Pubsub schema.
PubsubClient.SubscriptionPath - Class in org.apache.beam.sdk.io.gcp.pubsub
Path representing a Pubsub subscription.
PubsubClient.TopicPath - Class in org.apache.beam.sdk.io.gcp.pubsub
Path representing a Pubsub topic.
PubsubCoderProviderRegistrar - Class in org.apache.beam.sdk.io.gcp.pubsub
A CoderProviderRegistrar for standard types used with PubsubIO.
PubsubCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubCoderProviderRegistrar
 
PubsubDlqProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubDlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubDlqProvider
 
PubsubGrpcClient - Class in org.apache.beam.sdk.io.gcp.pubsub
A helper class for talking to Pubsub via grpc.
PubsubIO - Class in org.apache.beam.sdk.io.gcp.pubsub
Read and Write PTransforms for Cloud Pub/Sub streams.
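A minimal read/write sketch, assuming an existing Pipeline p and placeholder subscription/topic names:
    // Continuously read UTF-8 strings from a subscription...
    PCollection<String> lines = p.apply("ReadFromPubsub",
        PubsubIO.readStrings()
            .fromSubscription("projects/my-project/subscriptions/my-subscription"));
    // ...and publish them to a topic.
    lines.apply("WriteToPubsub",
        PubsubIO.writeStrings()
            .to("projects/my-project/topics/my-topic"));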
PubsubIO.PubsubSubscription - Class in org.apache.beam.sdk.io.gcp.pubsub
Class representing a Cloud Pub/Sub Subscription.
PubsubIO.PubsubTopic - Class in org.apache.beam.sdk.io.gcp.pubsub
Class representing a Cloud Pub/Sub Topic.
PubsubIO.Read<T> - Class in org.apache.beam.sdk.io.gcp.pubsub
Implementation of read methods.
PubsubIO.Write<T> - Class in org.apache.beam.sdk.io.gcp.pubsub
Implementation of write methods.
PubsubIO.Write.PubsubBoundedWriter - Class in org.apache.beam.sdk.io.gcp.pubsub
Writer to Pubsub which batches messages from bounded collections.
PubsubJsonClient - Class in org.apache.beam.sdk.io.gcp.pubsub
A Pubsub client using JSON transport.
PubsubLiteIO - Class in org.apache.beam.sdk.io.gcp.pubsublite
I/O transforms for reading from Google Pub/Sub Lite.
PubsubLiteReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration
 
PubsubLiteReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
PubsubLiteReadSchemaTransformProvider.ErrorFn - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteSink - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
A sink which publishes messages to Pub/Sub Lite.
PubsubLiteSink(PublisherOptions) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
 
PubsubLiteWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration
 
PubsubLiteWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
PubsubLiteWriteSchemaTransformProvider.ErrorCounterFn - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage.SetUuidFn - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
PubsubMessage - Class in org.apache.beam.sdk.io.gcp.pubsub
Class representing a Pub/Sub message.
PubsubMessage(byte[], Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
PubsubMessage(byte[], Map<String, String>, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
PubsubMessage(byte[], Map<String, String>, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
PubsubMessagePayloadOnlyCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
A coder for PubsubMessage treating the raw bytes being decoded as the message's payload.
PubsubMessagePayloadOnlyCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
 
PubsubMessages - Class in org.apache.beam.sdk.io.gcp.pubsub
Common util functions for converting between PubsubMessage proto and PubsubMessage.
PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubMessages.ParsePayloadAsPubsubMessageProto - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubMessages.ParsePubsubMessageProtoAsPayload - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
A coder for PubsubMessage including all fields of a PubSub message from the server.
PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
 
PubsubMessageWithAttributesAndMessageIdCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
A coder for PubsubMessage including attributes and the message id from the PubSub server.
PubsubMessageWithAttributesAndMessageIdCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
 
PubsubMessageWithAttributesCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
A coder for PubsubMessage including attributes.
PubsubMessageWithAttributesCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
 
PubsubMessageWithMessageIdCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
A coder for PubsubMessage treating the raw bytes being decoded as the message's payload, with the message id from the PubSub server.
PubsubMessageWithMessageIdCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
 
PubsubMessageWithTopicCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
A coder for PubsubMessage including the topic from the PubSub server.
PubsubMessageWithTopicCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithTopicCoder
 
PubsubOptions - Interface in org.apache.beam.sdk.io.gcp.pubsub
Properties that can be set when using Google Cloud Pub/Sub with the Apache Beam SDK.
PubSubPayloadTranslation - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubSubPayloadTranslation() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation
 
PubSubPayloadTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubSubPayloadTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.pubsub
Configuration for reading from Pub/Sub.
PubsubReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration
 
PubsubReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubReadSchemaTransformConfiguration.ErrorHandling - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubReadSchemaTransformConfiguration.ErrorHandling.Builder - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
An implementation of TypedSchemaTransformProvider for Pub/Sub reads configured using PubsubReadSchemaTransformConfiguration.
PubsubReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
PubsubSchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
An implementation of SchemaIOProvider for reading and writing JSON/AVRO payloads with PubsubIO.
PubsubSchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
 
PubsubTestClient - Class in org.apache.beam.sdk.io.gcp.pubsub
A (partial) implementation of PubsubClient for use by unit tests.
PubsubTestClient() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
PubsubTestClient.PubsubTestClientFactory - Interface in org.apache.beam.sdk.io.gcp.pubsub
Closing the factory will validate all expected messages were processed.
PubsubUnboundedSink - Class in org.apache.beam.sdk.io.gcp.pubsub
A PTransform which streams messages to Pubsub.
PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int, int, int) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int, int, int, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
 
PubsubUnboundedSource - Class in org.apache.beam.sdk.io.gcp.pubsub
Users should use PubsubIO#read instead.
PubsubUnboundedSource(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.ProjectPath>, ValueProvider<PubsubClient.TopicPath>, ValueProvider<PubsubClient.SubscriptionPath>, String, String, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Construct an unbounded source to consume from the Pubsub subscription.
PubsubUnboundedSource(Clock, PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.ProjectPath>, ValueProvider<PubsubClient.TopicPath>, ValueProvider<PubsubClient.SubscriptionPath>, String, String, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Construct an unbounded source to consume from the Pubsub subscription.
PubsubUnboundedSource(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.ProjectPath>, ValueProvider<PubsubClient.TopicPath>, ValueProvider<PubsubClient.SubscriptionPath>, String, String, boolean, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
Construct an unbounded source to consume from the Pubsub subscription.
PubsubWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.pubsub
Configuration for writing to Pub/Sub.
PubsubWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration
 
PubsubWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubWriteSchemaTransformConfiguration.ErrorHandling - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubWriteSchemaTransformConfiguration.ErrorHandling.Builder - Class in org.apache.beam.sdk.io.gcp.pubsub
 
PubsubWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
An implementation of TypedSchemaTransformProvider for Pub/Sub writes configured using PubsubWriteSchemaTransformConfiguration.
PubsubWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
PubsubWriteSchemaTransformProvider.ErrorFn - Class in org.apache.beam.sdk.io.gcp.pubsub
 
pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
Request the next batch of up to batchSize messages from subscription.
pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
 
pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
 
pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
 
putSchemaIfAbsent(TableReference, TableSchema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
Registers a schema for a table if one is not already present.

Q

QUALIFIER_DEFAULT - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
QUERY_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Counter for the total number of queries issued during the execution of the Connector.
queryChangeStreamAction(ChangeStreamDao, PartitionMetadataDao, ChangeStreamRecordMapper, PartitionMetadataMapper, DataChangeRecordAction, HeartbeatRecordAction, ChildPartitionsRecordAction, ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
Creates and returns a single instance of an action class capable of performing a change stream query for a given partition.
QueryChangeStreamAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
Main action class for querying a partition change stream.
queryResultHasChecksum(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
 
queryUnflattened(String, String, boolean, boolean) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
Performs a query without flattening results.
queryUnflattened(String, String, boolean, boolean, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
Performs a query without flattening results.
queryWithRetries(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
queryWithRetries(String, String, boolean) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
queryWithRetriesUsingStandardSql(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 

R

RampupThrottlingFn<T> - Class in org.apache.beam.sdk.io.gcp.datastore
An implementation of a client-side throttler that enforces a gradual ramp-up, broadly in line with Datastore best practices.
RampupThrottlingFn(ValueProvider<Integer>, PCollectionView<Instant>) - Constructor for class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
 
RampupThrottlingFn(int, PCollectionView<Instant>) - Constructor for class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
 
random() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
 
range - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
read() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
read(SerializableFunction<SchemaAndRecord, T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
Reads from a BigQuery table or query and returns a PCollection with one element per row of the table or query result, parsed from the BigQuery AVRO format using the specified function.
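A minimal sketch, assuming an existing Pipeline p and an illustrative table and field name:
    // Parse each row (delivered as a GenericRecord wrapped in SchemaAndRecord) into a String.
    PCollection<String> names = p.apply(
        BigQueryIO.read(
                (SchemaAndRecord elem) -> elem.getRecord().get("name").toString())
            .from("my-project:my_dataset.my_table")
            .withCoder(StringUtf8Coder.of()));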
read() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
Creates an uninitialized BigtableIO.Read.
Read() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
read() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
Returns an empty DatastoreV1.Read builder.
Read() - Constructor for class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
read() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1
The class returned by this method provides the ability to create PTransforms for read operations available in the Firestore V1 API provided by FirestoreStub.
Read() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
Instantiates a new Read.
read(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from a single store.
read(ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from a single store.
Read() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
 
Read() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
 
read(SubscriberOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
Read messages from Pub/Sub Lite.
read(Object, Decoder) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder.TimestampEncoding
Deserializes a Timestamp from the given Decoder.
read() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
Creates an uninitialized instance of SpannerIO.Read.
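A minimal sketch, assuming an existing Pipeline p and illustrative instance, database, table, and column names:
    // Read selected columns of a table as Cloud Spanner Structs.
    PCollection<Struct> rows = p.apply(
        SpannerIO.read()
            .withInstanceId("my-instance")
            .withDatabaseId("my-database")
            .withTable("Users")
            .withColumns("UserId", "Name"));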
Read() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
READ_URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
 
READ_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
readAll(List<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from multiple stores.
readAll(ValueProvider<List<String>>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from multiple stores.
readAll() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
A PTransform that works like SpannerIO.read(), but executes read operations coming from a PCollection.
ReadAll() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
readAllRequests() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Retrieve all HL7v2 Messages from a PCollection of HL7v2ReadParameter.
readAllStreamPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Read all the StreamPartition rows and output PartitionRecords so they can be streamed.
readAllWithFilter(List<String>, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from multiple stores matching a filter.
readAllWithFilter(ValueProvider<List<String>>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from multiple stores matching a filter.
readAvroGenericRecords(Schema) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads binary encoded Avro messages into the Avro GenericRecord type.
readAvros(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads binary encoded Avro messages of the given type from a Google Cloud Pub/Sub stream.
readAvrosWithBeamSchema(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads binary encoded Avro messages of the specific type.
ReadBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.ReadBuilder
 
ReadBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder
 
readCallMetric(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
readChangeStream() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
Creates an uninitialized BigtableIO.ReadChangeStream.
ReadChangeStream() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
 
readChangeStream() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
Creates an uninitialized instance of SpannerIO.ReadChangeStream.
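A minimal sketch, assuming an existing Pipeline p and illustrative project, instance, database, and change stream names:
    // Stream DataChangeRecords from a Spanner change stream, starting from now.
    PCollection<DataChangeRecord> changeRecords = p.apply(
        SpannerIO.readChangeStream()
            .withSpannerConfig(SpannerConfig.create()
                .withProjectId("my-project")
                .withInstanceId("my-instance")
                .withDatabaseId("my-database"))
            .withChangeStreamName("my_change_stream")
            .withMetadataDatabase("my-metadata-database")
            .withInclusiveStartAt(com.google.cloud.Timestamp.now()));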
ReadChangeStream() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
 
readChangeStreamPartition(PartitionRecord, StreamProgress, Instant, Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.ChangeStreamDao
Streams a partition.
readChangeStreamPartitionAction(MetadataTableDao, ChangeStreamDao, ChangeStreamMetrics, ChangeStreamAction, Duration, SizeEstimator<KV<ByteString, ChangeStreamRecord>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
Creates and returns a singleton instance of an action class for processing ReadChangeStreamPartitionDoFn.
ReadChangeStreamPartitionAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
This class is part of ReadChangeStreamPartitionDoFn SDF.
ReadChangeStreamPartitionAction(MetadataTableDao, ChangeStreamDao, ChangeStreamMetrics, ChangeStreamAction, Duration, SizeEstimator<KV<ByteString, ChangeStreamRecord>>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ReadChangeStreamPartitionAction
 
ReadChangeStreamPartitionDoFn - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn
 
ReadChangeStreamPartitionDoFn(DaoFactory, ActionFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
ReadChangeStreamPartitionDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
A SDF (Splittable DoFn) class which is responsible for performing a change stream query for a given partition.
ReadChangeStreamPartitionDoFn(DaoFactory, MapperFactory, ActionFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
This class needs a DaoFactory to build DAOs to access the partition metadata tables and to perform the change streams query.
ReadChangeStreamPartitionProgressTracker - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction
RestrictionTracker used by ReadChangeStreamPartitionDoFn to keep track of the progress of the stream and to split the restriction for runner initiated checkpoints.
ReadChangeStreamPartitionProgressTracker(StreamProgress) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
Constructs a restriction tracker with the streamProgress.
ReadChangeStreamPartitionRangeTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
This restriction tracker delegates most of its behavior to an internal TimestampRangeTracker.
ReadChangeStreamPartitionRangeTracker(PartitionMetadata, TimestampRange) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.ReadChangeStreamPartitionRangeTracker
Receives the partition that will be queried and the timestamp range that belongs to it.
readDetectNewPartitionMissingPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Read and deserialize missing partitions, and how long they have been missing, from the metadata table.
readDetectNewPartitionsState() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Read the low watermark of the pipeline from Detect New Partition row.
readFhirResource(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Reads a FHIR resource HTTP body.
readFhirResource(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
readMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
readMessagesWithAttributes() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
readMessagesWithAttributesAndMessageId() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
readMessagesWithAttributesAndMessageIdAndOrderingKey() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
readMessagesWithCoderAndParseFn(Coder<T>, SimpleFunction<PubsubMessage, T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream, mapping each PubsubMessage into type T using the supplied parse function and coder.
readMessagesWithMessageId() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads from a Google Cloud Pub/Sub stream.
readNewPartitions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
 
readNewPartitionsIncludingDeleted() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
 
ReadOperation - Class in org.apache.beam.sdk.io.gcp.spanner
Encapsulates a spanner read operation.
ReadOperation() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
readProtoDynamicMessages(ProtoDomain, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads binary encoded protobuf messages for the type specified by fullMessageName.
readProtoDynamicMessages(Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Similar to PubsubIO.readProtoDynamicMessages(ProtoDomain, String) but for when the Descriptors.Descriptor is already known.
readProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads binary encoded protobuf messages of the given type from a Google Cloud Pub/Sub stream.
ReadRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation.ReadRegistrar
 
ReadRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.ReadRegistrar
 
readResources() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Read resources from a PCollection of resource IDs (e.g.
readRows(ReadRowsRequest) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
Read rows in the context of a specific read stream.
readRows(ReadRowsRequest, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
 
ReadSpannerSchema - Class in org.apache.beam.sdk.io.gcp.spanner
This DoFn reads Cloud Spanner 'information_schema.*' tables to build the SpannerSchema.
ReadSpannerSchema(SpannerConfig, PCollectionView<Dialect>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
Constructor for creating an instance of the ReadSpannerSchema class.
ReadSpannerSchema(SpannerConfig, PCollectionView<Dialect>, Set<String>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
Constructor for creating an instance of the ReadSpannerSchema class.
readStreamPartitionsWithWatermark() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Return list of locked StreamPartition and their watermarks.
readStrings() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that continuously reads UTF-8 encoded strings from a Google Cloud Pub/Sub stream.
readStudyMetadata() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO
 
readTableRows() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
Like BigQueryIO.read(SerializableFunction) but represents each row as a TableRow.
readTableRowsWithSchema() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
Like BigQueryIO.readTableRows() but with Schema support.
readWithDatumReader(AvroSource.DatumReaderFactory<T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
Reads from a BigQuery table or query and returns a PCollection with one element per row of the table or query result.
readWithFilter(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from a single store matching a filter.
readWithFilter(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
Read all HL7v2 Messages from a single store matching a filter.
recordId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
Id to pass to the runner to distinguish this message from all others.
recordId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
If using an id attribute, the record id to associate with this record's metadata so the receiver can reject duplicates.
refreshSchema(TableReference, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
 
refreshThread() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
 
ReifyAsIterable<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
This transform turns a side input into a singleton PCollection that can be used as the main input for another transform.
ReifyAsIterable() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.ReifyAsIterable
 
releaseStreamPartitionLockForDeletion(Range.ByteStringRange, String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
This is the first step of the two-phase delete of a StreamPartition.
REPLACE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
ReplaceBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReplaceBuilder
 
reportFailedRPCMetrics(RetryManager.Operation.Context<?>, BigQuerySinkMetrics.RpcMethod) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
Records RpcRequests counter and RpcLatency histogram for this RPC call.
reportFailedRPCMetrics(RetryManager.Operation.Context<?>, BigQuerySinkMetrics.RpcMethod, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
Records RpcRequests counter and RpcLatency histogram for this RPC call.
reportSuccessfulRpcMetrics(RetryManager.Operation.Context<?>, BigQuerySinkMetrics.RpcMethod) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
Records RpcRequests counter and RpcLatency histogram for this RPC call.
reportSuccessfulRpcMetrics(RetryManager.Operation.Context<?>, BigQuerySinkMetrics.RpcMethod, String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
Records RpcRequests counter and RpcLatency histogram for this RPC call.
requestTimeMsSinceEpoch() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
Timestamp (in system time) at which we requested the message (ms since epoch).
requiresDataSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
Indicates whether this transform requires a specified data schema.
requiresDataSchema() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
 
requiresDataSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
 
restrictionTracker(OffsetRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
restrictionTracker(StreamProgress) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
resumeFromPreviousPipelineAction(ChangeStreamMetrics, MetadataTableDao, Instant, ProcessNewPartitionsAction) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ActionFactory
 
ResumeFromPreviousPipelineAction - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.action
 
ResumeFromPreviousPipelineAction(ChangeStreamMetrics, MetadataTableDao, Instant, ProcessNewPartitionsAction) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ResumeFromPreviousPipelineAction
 
retrieveDicomStudyMetadata(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Retrieve DicomStudyMetadata.
retrieveDicomStudyMetadata(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
retryTransientErrors() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
Retry all failures except for known persistent errors.
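A minimal sketch showing the policy attached to a streaming-insert write, assuming an existing PCollection<TableRow> rows and an illustrative table name:
    // Append rows, retrying only transient insert failures.
    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
            .withFailedInsertRetryPolicy(InsertRetryPolicy.retryTransientErrors()));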
ROW_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider
 
RowMutation - Class in org.apache.beam.sdk.io.gcp.bigquery
A convenience class for applying row updates to BigQuery using BigQueryIO.applyRowMutations().
RowMutation() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.RowMutation
 
RowMutation.RowMutationCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
 
RowMutationCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.RowMutation.RowMutationCoder
 
RowMutationInformation - Class in org.apache.beam.sdk.io.gcp.bigquery
This class indicates how to apply a row update to BigQuery.
RowMutationInformation() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation
 
RowMutationInformation.MutationType - Enum in org.apache.beam.sdk.io.gcp.bigquery
 
RowToEntity - Class in org.apache.beam.sdk.io.gcp.datastore
A PTransform to perform a conversion of Row to Entity.
RowUtils - Class in org.apache.beam.sdk.io.gcp.bigtable
 
RowUtils() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
RpcQosOptions - Class in org.apache.beam.sdk.io.gcp.firestore
Quality of Service manager options for Firestore RPCs.
RpcQosOptions.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
Mutable Builder class for creating instances of RpcQosOptions.
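A minimal sketch; the builder methods shown (newBuilder, withMaxAttempts, withHintMaxNumWorkers, build) are the ones this class is believed to expose, so treat the exact names and values as assumptions:
    RpcQosOptions options = RpcQosOptions.newBuilder()
        .withMaxAttempts(3)           // assumed option: attempts per RPC before failing
        .withHintMaxNumWorkers(25)    // assumed option: worker-count hint used for client-side throttling
        .build();
    // The resulting options object can then be supplied to Firestore read/write transforms that accept RpcQosOptions.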
run(PartitionRecord, ChangeStreamRecord, RestrictionTracker<StreamProgress, StreamProgress>, DoFn.OutputReceiver<KV<ByteString, ChangeStreamRecord>>, ManualWatermarkEstimator<Instant>, BytesThroughputEstimator<KV<ByteString, ChangeStreamRecord>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ChangeStreamAction
This class processes ReadChangeStreamResponse messages from the Bigtable server.
run(RestrictionTracker<OffsetRange, Long>, DoFn.OutputReceiver<PartitionRecord>, ManualWatermarkEstimator<Instant>, InitialPipelineState) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.DetectNewPartitionsAction
Perform the necessary steps to manage initial set of partitions and new partitions.
run(DoFn.OutputReceiver<PartitionRecord>, Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.GenerateInitialPartitionsAction
The very first step of the pipeline when there are no partitions being streamed yet.
run(PartitionRecord, RestrictionTracker<StreamProgress, StreamProgress>, DoFn.OutputReceiver<KV<ByteString, ChangeStreamRecord>>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ReadChangeStreamPartitionAction
Streams changes from a specific partition.
run(DoFn.OutputReceiver<PartitionRecord>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.action.ResumeFromPreviousPipelineAction
Resume from previously drained pipeline.
run(PartitionMetadata, ChildPartitionsRecord, RestrictionTracker<TimestampRange, Timestamp>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ChildPartitionsRecordAction
This is the main processing function for a ChildPartitionsRecord.
run(PartitionMetadata, DataChangeRecord, RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<DataChangeRecord>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DataChangeRecordAction
This is the main processing function for a DataChangeRecord.
run(RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<PartitionMetadata>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DetectNewPartitionsAction
Executes the main logic to schedule new partitions.
run(PartitionMetadata, HeartbeatRecord, RestrictionTracker<TimestampRange, Timestamp>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.HeartbeatRecordAction
This is the main processing function for a HeartbeatRecord.
run(PartitionMetadata, RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<DataChangeRecord>, ManualWatermarkEstimator<Instant>, DoFn.BundleFinalizer) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.QueryChangeStreamAction
This method dispatches a change stream query for the given partition, delegates the processing of the records to the corresponding registered action classes, and keeps the state of the partition up to date in the Connector's metadata table.
runInTransaction(Function<PartitionMetadataDao.InTransactionContext, T>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Runs a given function in a transaction context.
runInTransaction(Function<PartitionMetadataDao.InTransactionContext, T>, String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
 
runQuery() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
Factory method to create a new type safe builder for RunQueryRequest operations.

S

SAMPLE_PARTITION - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
We use a bogus partition here to estimate the average size of a partition metadata record.
schema() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
 
SchemaAndRecord - Class in org.apache.beam.sdk.io.gcp.bigquery
A wrapper for a GenericRecord and the TableSchema representing the schema of the table (or query) it was generated from.
SchemaAndRecord(GenericRecord, TableSchema) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
 
SchemaConversionOptions() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
 
schemaPathFromId(String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
schemaPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
schemaToProtoTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
searchFhirResource(String, String, Map<String, Object>, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Searches a FHIR resource HTTP body.
searchFhirResource(String, String, Map<String, Object>, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
searchResources(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Search resources from a Fhir store with String parameter values.
searchResourcesWithGenericParameters(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
Search resources from a Fhir store with any type of parameter values.
seriesId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
setAttributeId(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setAttributeId(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setAttributeMap(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setAttributes(List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setAttributes(List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
setAttributes(List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setAttributes(List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setAttributesMap(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setAttributesMap(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
setAutoSharding(Boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.Builder
 
setAveragePartitionBytesSize(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
Sets the average partition bytes size to estimate the backlog of this DoFn.
setBatching(Boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setBigQueryLocation(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
 
setBigQueryProject(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setBigQueryServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransform
 
setBigQueryServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransform
 
setBigtableChangeStreamInstanceId(String) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.BigtableChangeStreamTestOptions
 
setBqStreamingApiLoggingFrequencySec(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setChangeStreamName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
setClientFactory(PubsubTestClient.PubsubTestClientFactory) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setClock(Clock) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setClock(PubsubIO.Read<T>, Clock) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient.PubsubTestClientFactory
 
setCreatedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the time at which the partition was created.
setCreateDisposition(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformConfiguration.Builder
Specifies whether the table should be created if it does not exist.
setCreateDisposition(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.Builder
 
setDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
setDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
 
setDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
 
setDeduplicate(Deduplicate.KeyedValues<Uuid, SequencedMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
Set the deduplication transform.
setEmulatorHost(String) - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
Defines a host:port pair for connecting to a Cloud Firestore emulator instead of the live service.
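For orientation, a minimal sketch of pointing the Firestore connector at a local emulator through pipeline options; the emulator address "localhost:8080" is a hypothetical placeholder.
    import org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // View freshly created pipeline options as FirestoreOptions.
    FirestoreOptions options = PipelineOptionsFactory.create().as(FirestoreOptions.class);
    // Route Firestore traffic to a local emulator (placeholder address) instead of the live service.
    options.setEmulatorHost("localhost:8080");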
setEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
 
setEnableBundling(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setEndAtTimestamp(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
setEndTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
setEndTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the end time of the partition.
setErrorHandling(BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.Builder
 
setErrorHandling(PubsubReadSchemaTransformConfiguration.ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setErrorHandling(PubsubWriteSchemaTransformConfiguration.ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setErrorHandling(ErrorHandling) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setFailToLock(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
setFinishedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the time at which the partition finished running.
setFirestoreDb(String) - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
Set the Firestore database ID to connect to.
setFirestoreHost(String) - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
Defines a host:port pair for connecting to a Cloud Firestore endpoint other than the default live service.
setFormat(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setFormat(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
setFormat(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setFormat(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setHeartbeatMillis(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the heartbeat interval in milliseconds.
setHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
 
setHTTPWriteTimeout(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
setIdLabel(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
 
setIdLabel(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
 
setInferMaps(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions.Builder
 
setInsertBundleParallelism(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
 
setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
 
setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
 
setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
 
setLocation(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setLocation(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setMaxBufferingDurationMilliSec(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setMaxStreamingBatchSize(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setMaxStreamingRowsToBatch(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setMetadataTable(String) - Method in interface org.apache.beam.sdk.io.gcp.spanner.SpannerIO.SpannerChangeStreamOptions
Specifies the name of the metadata table.
setNumFailuresExpected(int) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
setNumStorageWriteApiStreamAppendClients(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setNumStorageWriteApiStreams(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setNumStreamingKeys(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setNumStreams(Integer) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.Builder
 
setOutput(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.ErrorHandling.Builder
 
setOutput(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.ErrorHandling.Builder
 
setOutput(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.ErrorHandling.Builder
 
setParentTokens(HashSet<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the collection of parent partition identifiers.
setPartitionToken(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the unique partition identifier.
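As a rough illustration of how the PartitionMetadata.Builder setters indexed here fit together, a hedged sketch of assembling a metadata record; the newBuilder() and build() factory methods are assumed, and every value is a placeholder.
    import com.google.cloud.Timestamp;
    import java.util.Collections;
    import java.util.HashSet;
    import org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata;

    // Build partition metadata with placeholder values (assumed newBuilder()/build()).
    PartitionMetadata partition =
        PartitionMetadata.newBuilder()
            .setPartitionToken("partition-token")
            .setParentTokens(new HashSet<>(Collections.singletonList("parent-token")))
            .setStartTimestamp(Timestamp.MIN_VALUE)
            .setHeartbeatMillis(1000L)
            .setState(PartitionMetadata.State.CREATED)
            .setWatermark(Timestamp.MIN_VALUE)
            .build();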
setPayload(byte[]) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
 
setPrimaryKey(TableReference, List<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
setProject(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setProject(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
 
setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
 
setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
 
setPubsubRootUrl(String) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
 
setQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
Configures the BigQuery read job with the SQL query.
setQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
 
setQueryLocation(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
BigQuery geographic location where the query job will be executed.
setReadTimestamp(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setRowRestriction(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
 
setRunningAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the time at which the partition started running.
setScheduledAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the time at which the partition was scheduled.
setSchema(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setSchema(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setSchema(byte[]) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setSchematizedData(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
 
setSelectedFields(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
 
setShouldFailRow(Function<TableRow, Boolean>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
setSizeEstimator(CoderSizeEstimator<KV<ByteString, ChangeStreamRecord>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
Sets the estimator to track throughput for each DoFn instance.
setSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
 
setSql(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setStaleness(Long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setStartAtTimestamp(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
setStartTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the start time of the partition.
setState(PartitionMetadata.State) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the current state of the partition.
setStorageApiAppendThresholdBytes(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setStorageApiAppendThresholdRecordCount(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setStorageWriteApiMaxRequestSize(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setStorageWriteApiTriggeringFrequencySec(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setStorageWriteMaxInflightBytes(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setStorageWriteMaxInflightRequests(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
 
setSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setSubscriptionName(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider.PubsubLiteReadSchemaTransformConfiguration.Builder
 
setSubscriptionPath(SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions.Builder
 
setSupportMetricsDeletion(Boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
setTable(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.Builder
 
setTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder
 
setTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration.Builder
 
setTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration.Builder
 
setTableId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
 
setTableSpec(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
Specifies a table for a BigQuery read job.
setTableSpec(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformConfiguration.Builder
Writes to the given table specification.
setTableSpec(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
 
setTargetDataset(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
 
setTempDatasetId(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setThroughputEstimator(BytesThroughputEstimator<DataChangeRecord>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
Sets the estimator to calculate the backlog of this function.
setTimestamp(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
 
setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
 
setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
 
setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
setTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
 
setTimestampBoundMode(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setTimeSupplier(Supplier<Timestamp>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
setTimeUnit(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
 
setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
 
setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
 
setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformConfiguration.Builder
 
setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformConfiguration.Builder
 
setTopicName(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.PubsubLiteWriteSchemaTransformConfiguration.Builder
 
setTopicPath(TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions.Builder
 
setTriggeringFrequencySeconds(Long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.Builder
 
setTruncateTimestamps(BigQueryUtils.ConversionOptions.TruncateTimestamps) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.Builder
 
setUnknownFieldsPayload(byte[]) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload.Builder
 
setup() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.DetectNewPartitionsDoFn
 
setup() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dofn.ReadChangeStreamPartitionDoFn
 
setup() - Method in class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
 
setup() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
Obtains the instance of DetectNewPartitionsAction.
setup() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
setup() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
 
setUp() - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
setUp() - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
setUseAtLeastOnceSemantics(Boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.Builder
 
setUseStandardSql(Boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryExportReadSchemaTransformConfiguration.Builder
Enables BigQuery's Standard SQL dialect when reading from a query.
setUseStorageApiConnectionPool(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setUseStorageWriteApi(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setUseStorageWriteApiAtLeastOnce(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
 
setUuid(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
setUuidExtractor(SerializableFunction<SequencedMessage, Uuid>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
 
SetUuidFn(String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage.SetUuidFn
 
SetUuidFromPubSubMessage(String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider.SetUuidFromPubSubMessage
 
setWatermark(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
Sets the watermark (last processed timestamp) for the partition.
setWithAttributes(Boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
 
setWriteDisposition(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformConfiguration.Builder
Specifies what to do with existing data if the table already exists.
setWriteDisposition(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration.Builder
 
shouldRetry(InsertRetryPolicy.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
Return true if this failure should be retried.
sideInput(PCollectionView<SideInputT>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
Returns the value of a given side input.
signalStart() - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
Outputs a message that the pipeline has started.
signalSuccessWhen(Coder<T>, SerializableFunction<T, String>, SerializableFunction<Set<T>, Boolean>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
Outputs a success message when successPredicate evaluates to true.
signalSuccessWhen(Coder<T>, SerializableFunction<Set<T>, Boolean>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
size() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
SizeEstimator<T> - Interface in org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator
 
SizeEstimator<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
This class is used to estimate the size in bytes of a given element.
SizeEstimator(Coder<T>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.SizeEstimator
 
sizeOf(T) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.CoderSizeEstimator
Estimates the size in bytes of the given element with the configured Coder.
sizeOf(T) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.NullSizeEstimator
 
sizeOf(T) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.SizeEstimator
 
sizeOf(T) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.SizeEstimator
Estimates the size in bytes of the given element with the configured Coder.
skipInvalidRows() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Insert all valid rows of a request, even if invalid rows exist.
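As an illustration only, a hedged sketch of a streaming-insert write that keeps the valid rows of each request and plugs in a retry policy (see InsertRetryPolicy.shouldRetry above); `rows` is assumed to be a PCollection<TableRow> and the table spec is a placeholder.
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy;

    // Keep valid rows from each insert request even when some rows are rejected,
    // and retry only transient streaming-insert failures.
    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")
            .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
            .skipInvalidRows()
            .withFailedInsertRetryPolicy(InsertRetryPolicy.retryTransientErrors()));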
SpannerAccessor - Class in org.apache.beam.sdk.io.gcp.spanner
Manages the lifecycle of DatabaseClient and Spanner instances.
SpannerChangestreamsReadConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration
 
SpannerChangestreamsReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
 
SpannerChangestreamsReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.SpannerChangestreamsReadSchemaTransformProvider
 
SpannerChangestreamsReadSchemaTransformProvider.DataChangeRecordToRow - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
 
SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
 
SpannerChangestreamsReadSchemaTransformProvider.SpannerChangestreamsReadConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
 
SpannerConfig - Class in org.apache.beam.sdk.io.gcp.spanner
Configuration for a Cloud Spanner client.
SpannerConfig() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
SpannerConfig.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
Builder for SpannerConfig.
SpannerIO - Class in org.apache.beam.sdk.io.gcp.spanner
Reading from Cloud Spanner
SpannerIO.CreateTransaction - Class in org.apache.beam.sdk.io.gcp.spanner
A PTransform that creates a transaction.
SpannerIO.CreateTransaction.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
SpannerIO.FailureMode - Enum in org.apache.beam.sdk.io.gcp.spanner
A failure handling strategy.
SpannerIO.Read - Class in org.apache.beam.sdk.io.gcp.spanner
Implementation of SpannerIO.read().
SpannerIO.ReadAll - Class in org.apache.beam.sdk.io.gcp.spanner
Implementation of SpannerIO.readAll().
SpannerIO.ReadChangeStream - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerIO.SpannerChangeStreamOptions - Interface in org.apache.beam.sdk.io.gcp.spanner
Interface to display the name of the metadata table on the Dataflow UI.
SpannerIO.Write - Class in org.apache.beam.sdk.io.gcp.spanner
A PTransform that writes Mutation objects to Google Cloud Spanner.
SpannerIO.WriteGrouped - Class in org.apache.beam.sdk.io.gcp.spanner
Same as SpannerIO.Write but supports grouped mutations.
SpannerSchema - Class in org.apache.beam.sdk.io.gcp.spanner
Encapsulates Cloud Spanner Schema.
SpannerSchema() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerSchema
 
SpannerSchema.Column - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerSchema.KeyPart - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar - Class in org.apache.beam.sdk.io.gcp.spanner
Exposes SpannerIO.WriteRows and SpannerIO.ReadRows as an external transform for cross-language usage.
SpannerTransformRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
SpannerTransformRegistrar.CrossLanguageConfiguration - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar.DeleteBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar.InsertBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar.InsertOrUpdateBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar.ReadBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar.ReadBuilder.Configuration - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar.ReplaceBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerTransformRegistrar.UpdateBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerWriteResult - Class in org.apache.beam.sdk.io.gcp.spanner
The results of a SpannerIO.write() transform.
SpannerWriteResult(Pipeline, PCollection<Void>, PCollection<MutationGroup>, TupleTag<MutationGroup>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
 
SpannerWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
 
SpannerWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
 
SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.spanner
 
SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
 
split(int, PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
 
splitReadStream(SplitReadStreamRequest) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
 
splitReadStream(SplitReadStreamRequest, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
 
start() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
 
startBundle(DoFn<Iterable<KV<DestinationT, WriteTables.Result>>, Iterable<KV<TableDestination, WriteTables.Result>>>.StartBundleContext) - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
 
startBundle(DoFn<PubsubMessage, Void>.StartBundleContext) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
 
startBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
 
startCopyJob(JobReference, JobConfigurationTableCopy) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Start a BigQuery copy job.
startCopyJob(JobReference, JobConfigurationTableCopy) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
startExtractJob(JobReference, JobConfigurationExtract) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Start a BigQuery extract job.
startExtractJob(JobReference, JobConfigurationExtract) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
startLoadJob(JobReference, JobConfigurationLoad) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Start a BigQuery load job.
startLoadJob(JobReference, JobConfigurationLoad, AbstractInputStreamContent) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Start a BigQuery load job with stream content.
startLoadJob(JobReference, JobConfigurationLoad) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
startLoadJob(JobReference, JobConfigurationLoad, AbstractInputStreamContent) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
startQueryJob(JobReference, JobConfigurationQuery) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
Start a BigQuery query job.
startQueryJob(JobReference, JobConfigurationQuery) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
 
StorageApiCDC - Class in org.apache.beam.sdk.io.gcp.bigquery
Constants and variables for CDC support.
StorageApiCDC() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiCDC
 
StorageApiConvertMessages<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
A transform that converts messages to protocol buffers in preparation for writing to BigQuery.
StorageApiConvertMessages(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryServices, TupleTag<BigQueryStorageApiInsertError>, TupleTag<KV<DestinationT, StorageApiWritePayload>>, Coder<BigQueryStorageApiInsertError>, Coder<KV<DestinationT, StorageApiWritePayload>>, SerializableFunction<ElementT, RowMutationInformation>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages
 
StorageApiConvertMessages.ConvertMessagesDoFn<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
 
StorageApiDynamicDestinationsTableRow<T,DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
 
StorageApiFlushAndFinalizeDoFn - Class in org.apache.beam.sdk.io.gcp.bigquery
This DoFn flushes and, if requested, finalizes Storage API streams.
StorageApiFlushAndFinalizeDoFn(BigQueryServices) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiFlushAndFinalizeDoFn
 
StorageApiLoads<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
This PTransform manages loads into BigQuery using the Storage API.
StorageApiLoads(Coder<DestinationT>, StorageApiDynamicDestinations<ElementT, DestinationT>, SerializableFunction<ElementT, RowMutationInformation>, BigQueryIO.Write.CreateDisposition, String, Duration, BigQueryServices, int, boolean, boolean, boolean, boolean, boolean, boolean, AppendRowsRequest.MissingValueInterpretation) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
 
StorageApiWritePayload - Class in org.apache.beam.sdk.io.gcp.bigquery
Class used to wrap elements being sent to the Storage API sinks.
StorageApiWritePayload() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
StorageApiWritePayload.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
 
StorageApiWriteRecordsInconsistent<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
A transform to write sharded records to BigQuery using the Storage API.
StorageApiWriteRecordsInconsistent(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryServices, TupleTag<BigQueryStorageApiInsertError>, TupleTag<TableRow>, Coder<BigQueryStorageApiInsertError>, Coder<TableRow>, boolean, boolean, BigQueryIO.Write.CreateDisposition, String, boolean, AppendRowsRequest.MissingValueInterpretation) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteRecordsInconsistent
 
StorageApiWritesShardedRecords<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
A transform to write sharded records to BigQuery using the Storage API (Streaming).
StorageApiWritesShardedRecords(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryIO.Write.CreateDisposition, String, BigQueryServices, Coder<DestinationT>, Coder<BigQueryStorageApiInsertError>, Coder<TableRow>, TupleTag<BigQueryStorageApiInsertError>, TupleTag<TableRow>, boolean, boolean, AppendRowsRequest.MissingValueInterpretation) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritesShardedRecords
 
StorageApiWriteUnshardedRecords<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
Write records to the Storage API using a standard batch approach.
StorageApiWriteUnshardedRecords(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryServices, TupleTag<BigQueryStorageApiInsertError>, TupleTag<TableRow>, Coder<BigQueryStorageApiInsertError>, Coder<TableRow>, boolean, boolean, BigQueryIO.Write.CreateDisposition, String, boolean, AppendRowsRequest.MissingValueInterpretation) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords
 
storeId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
STREAM_PARTITION_PREFIX - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableAdminDao
 
StreamingInserts<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
PTransform that performs streaming BigQuery writes.
StreamingInserts(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>, Coder<ElementT>, SerializableFunction<ElementT, TableRow>, SerializableFunction<ElementT, TableRow>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
Constructor.
StreamingWriteTables<ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
This transform takes in key-value pairs of TableRow entries and the TableDestination they should be written to.
StreamingWriteTables() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
 
StreamPartitionWithWatermark - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.model
 
StreamPartitionWithWatermark(Range.ByteStringRange, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
 
StreamProgress - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction
StreamProgress() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
StreamProgress(ChangeStreamContinuationToken, Instant, BigDecimal, Instant, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
StreamProgress(CloseStream) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
stripPartitionDecorator(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
Strip off any partition decorator information from a tablespec.
studyId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
 
SubscriberOptions - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
SubscriberOptions() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
 
SubscriberOptions.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
SubscribeTransform - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
SubscribeTransform(SubscriberOptions) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscribeTransform
 
SubscriptionPartition - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
SubscriptionPartition() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartition
 
SubscriptionPartitionCoder - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
SubscriptionPartitionCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
 
subscriptionPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Subscription path used to listen for messages on TestPubsub.topicPath().
subscriptionPath() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
 
subscriptionPathFromName(String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
subscriptionPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
SUCCESS - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
The tag for successful writes to the HL7v2 store.
SUCCESSFUL_BODY - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
The tag for successful writes to the FHIR store.
SUCCESSFUL_BUNDLES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
The TupleTag used for bundles that were executed successfully.
SUPPORTED_FORMATS - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
SUPPORTED_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteWriteSchemaTransformProvider
 
supportsProjectionPushdown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 

T

TABLE_FIELD_SCHEMAS - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
 
TABLE_ROW_ERROR_CONTAINER - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
 
TableAndQuery() - Constructor for class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
 
TableDestination - Class in org.apache.beam.sdk.io.gcp.bigquery
Encapsulates a BigQuery table destination.
TableDestination(String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(TableReference, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(TableReference, String, TimePartitioning) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(String, String, TimePartitioning) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(String, String, TimePartitioning, Clustering) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(TableReference, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(TableReference, String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestination(String, String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
TableDestinationCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
A coder for TableDestination objects.
TableDestinationCoderV2 - Class in org.apache.beam.sdk.io.gcp.bigquery
A Coder for TableDestination that includes time partitioning information.
TableDestinationCoderV2() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
 
TableDestinationCoderV3 - Class in org.apache.beam.sdk.io.gcp.bigquery
A Coder for TableDestination that includes time partitioning and clustering information.
TableDestinationCoderV3() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
 
tableExists() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Checks whether the metadata table already exists in the database.
tableFieldToProtoTableField(TableFieldSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
tableReference() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
tableRowFromBeamRow() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
tableRowFromMessage(Message, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
TableRowJsonCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
A Coder that encodes BigQuery TableRow objects in their native JSON format.
tableRowToBeamRow() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
TableRowToStorageApiProto - Class in org.apache.beam.sdk.io.gcp.bigquery
Utility methods for converting JSON TableRow objects to dynamic protocol messages, for use with the Storage Write API.
TableRowToStorageApiProto() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
TableRowToStorageApiProto.SchemaDoesntMatchException - Exception in org.apache.beam.sdk.io.gcp.bigquery
 
TableRowToStorageApiProto.SchemaTooNarrowException - Exception in org.apache.beam.sdk.io.gcp.bigquery
 
TableRowToStorageApiProto.SingleValueConversionException - Exception in org.apache.beam.sdk.io.gcp.bigquery
 
TableSchemaCache - Class in org.apache.beam.sdk.io.gcp.bigquery
An updatable cache for table schemas.
TableSchemaUpdateUtils - Class in org.apache.beam.sdk.io.gcp.bigquery
Helper utilities for handling schema-update responses.
TableSchemaUpdateUtils() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaUpdateUtils
 
tableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
 
targetForRootUrl(String) - Static method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
Internal only utility for converting PubsubOptions.getPubsubRootUrl() (e.g.
teardown() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema
 
TEMP_FILES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
The tag for temp files used for import to the FHIR store.
TestBigQuery - Class in org.apache.beam.sdk.io.gcp.bigquery
Test rule which creates a new table with the specified schema and a randomized name, and exposes a few APIs to work with it.
TestBigQuery.PollingAssertion - Interface in org.apache.beam.sdk.io.gcp.bigquery
Interface to implement a polling assertion.
TestBigQuery.RowsAssertion - Class in org.apache.beam.sdk.io.gcp.bigquery
Helper class for creating a polling eventual assertion.
TestBigQueryOptions - Interface in org.apache.beam.sdk.io.gcp.bigquery
TestPipelineOptions for TestBigQuery.
TestPubsub - Class in org.apache.beam.sdk.io.gcp.pubsub
Test rule which creates a new topic and subscription with randomized names and exposes the APIs to work with them.
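As a rough sketch of declaring this rule in a test class; the TestPubsub.create() factory is assumed and the pipeline wiring is elided.
    import org.apache.beam.sdk.io.gcp.pubsub.TestPubsub;
    import org.junit.Rule;

    // Provisions a temporary topic and subscription with randomized names for the test.
    @Rule public transient TestPubsub testPubsub = TestPubsub.create();

    // In a test body, testPubsub.topicPath() and testPubsub.subscriptionPath()
    // identify the resources to publish to or read from.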
TestPubsub.PollingAssertion - Interface in org.apache.beam.sdk.io.gcp.pubsub
 
TestPubsubOptions - Interface in org.apache.beam.sdk.io.gcp.pubsub
PipelineOptions for TestPubsub.
TestPubsubSignal - Class in org.apache.beam.sdk.io.gcp.pubsub
Test rule which observes elements of the PCollection and checks whether they match the success criteria.
throttledTimeCounter(BigQuerySinkMetrics.RpcMethod) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
 
THROUGHPUT_WINDOW_SECONDS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
The sliding window size in seconds for throughput reporting.
ThroughputEstimator<T> - Interface in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
An estimator to calculate the throughput of the elements output from a DoFn.
throwableToGRPCCodeString(Throwable) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySinkMetrics
Converts a Throwable to a gRPC Status code.
TIMESTAMP_FIELD_NAME - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
 
TIMESTAMP_MICROS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
TimestampConverter - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
Convert between different Timestamp and Instant classes.
TimestampConverter() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
 
TimestampEncoding - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder
This encoder/decoder writes a com.google.cloud.Timestamp object as a pair of long and int to Avro and reads a Timestamp object from the same pair.
TimestampEncoding() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder.TimestampEncoding
 
timestampMsSinceEpoch() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.IncomingMessage
Timestamp for element (ms since epoch).
TimestampRange - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
A restriction represented by a range of timestamps [from, to).
TimestampRangeTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
A RestrictionTracker for claiming positions in a TimestampRange in a monotonically increasing fashion.
TimestampRangeTracker(TimestampRange) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
TimestampUtils - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
Provides methods to convert a timestamp to its nanoseconds representation and back.
TimestampUtils() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
 
timeSupplier - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
to(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Writes to the given table, specified in the format described in BigQueryHelpers.parseTableSpec(java.lang.String).
to(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Writes to the given table, specified as a TableReference.
to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Same as BigQueryIO.Write.to(String), but with a ValueProvider.
to(SerializableFunction<ValueInSingleWindow<T>, TableDestination>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Writes to table specified by the specified table function.
to(DynamicDestinations<T, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Writes to the table and schema specified by the DynamicDestinations object.
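A hedged sketch of the table-function variant of to(...) above, routing each element to a per-key table; the field name "user", the dataset, the `rows` collection, and the `schema` variable (a com.google.api.services.bigquery.model.TableSchema) are all placeholders.
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.TableDestination;

    // Route each TableRow to a table derived from one of its fields (placeholder names).
    rows.apply(
        BigQueryIO.writeTableRows()
            .to(row -> new TableDestination(
                "my-project:my_dataset.events_" + row.getValue().get("user"),
                "per-user events table"))
            .withSchema(schema)
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));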
to(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Publishes to the specified topic.
to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Like topic() but with a ValueProvider.
to(SerializableFunction<ValueInSingleWindow<T>, String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Provides a function to dynamically specify the target topic per message.
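A minimal, hedged sketch of publishing with PubsubIO.Write; the project and topic names are placeholders and `messages` is assumed to be a PCollection<String>.
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;

    // Publish each string element to a fixed topic (placeholder names).
    messages.apply(PubsubIO.writeStrings().to("projects/my-project/topics/my-topic"));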
toBeamRow(GenericRecord, Schema, BigQueryUtils.ConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
toBeamRow(Schema, TableRow) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Tries to convert a JSON TableRow from BigQuery into a Beam Row.
toBeamRow(Schema, TableSchema, TableRow) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Tries to parse the JSON TableRow from BigQuery.
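To illustrate the BigQueryUtils conversions indexed here, a small sketch that round-trips a row between the Beam and BigQuery representations; the schema and values are made up for the example.
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    // Build a Beam Row against a small schema (placeholder fields).
    Schema schema = Schema.builder().addStringField("name").addInt64Field("count").build();
    Row row = Row.withSchema(schema).addValues("beam", 42L).build();

    // Convert to a BigQuery TableRow and back again.
    TableRow tableRow = BigQueryUtils.toTableRow(row);
    Row roundTripped = BigQueryUtils.toBeamRow(schema, tableRow);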
ToBigtableRowFn(Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation.ToBigtableRowFn
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
Create a new RpcQosOptions.Builder initialized with the values from this instance.
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
 
toBuilder() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
Transforms the instance into a builder, so field values can be modified.
toChangeStreamRecords(PartitionMetadata, ChangeStreamResultSet, ChangeStreamResultSetMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.ChangeStreamRecordMapper
In GoogleSQL, change stream records are returned as an array of Struct.
toCloudPubsubMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
Transforms messages read from Pub/Sub Lite into the equivalent Cloud Pub/Sub messages that would have been read from PubsubIO.
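A hedged sketch of applying this transform; `liteMessages` is assumed to be a PCollection of Pub/Sub Lite SequencedMessages produced elsewhere in the pipeline.
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;
    import org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms;
    import org.apache.beam.sdk.values.PCollection;

    // Convert Pub/Sub Lite messages to the Cloud Pub/Sub message shape used by PubsubIO.
    PCollection<PubsubMessage> cloudMessages =
        liteMessages.apply(CloudPubsubTransforms.toCloudPubsubMessages());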
toGenericAvroSchema(String, List<TableFieldSchema>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a list of BigQuery TableFieldSchema to Avro Schema.
toJodaTime(Instant) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
 
toJsonString(Object) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
 
toModel() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
Converts this message to the HL7v2 Message model.
toNanos(Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
Converts the given timestamp to its nanoseconds representation.
topic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
 
topicPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Topic path to which events will be published.
topicPath() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions
 
topicPathFromName(String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
topicPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
 
toProto(PubsubMessage) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages
 
toSeconds(Instant) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.DetectNewPartitionsState
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.InitialPipelineState
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.NewPartition
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.PartitionRecord
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.model.StreamPartitionWithWatermark
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
 
toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.StreamProgress
 
toString() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
 
toString() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
 
toString() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
 
toString() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
 
toString() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
 
toString() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2ReadResponse
 
toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
 
toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
 
toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
 
toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
 
toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
 
toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
 
toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
 
toTableReference(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
toTableRow() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a Beam Row to a BigQuery TableRow.
toTableRow(SerializableFunction<T, Row>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a Beam schema type to a BigQuery TableRow.
toTableRow(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a Beam Row to a BigQuery TableRow.
toTableSchema(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
Convert a Beam Schema to a BigQuery TableSchema.
toTableSpec(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
Returns a canonical string representation of the TableReference.
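A small sketch of the tablespec helpers in BigQueryHelpers; the project, dataset, and table names are placeholders.
    import com.google.api.services.bigquery.model.TableReference;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers;

    // Parse a tablespec string into a TableReference and render it back canonically.
    TableReference ref = BigQueryHelpers.parseTableSpec("my-project:my_dataset.my_table");
    String spec = BigQueryHelpers.toTableSpec(ref);

    // Drop a partition decorator from a tablespec (see stripPartitionDecorator above).
    String undecorated = BigQueryHelpers.stripPartitionDecorator("my_dataset.my_table$20240101");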
toTableSpec(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
toThreetenInstant(Instant) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.TimestampConverter
 
toTimestamp(BigDecimal) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
Converts nanoseconds to their respective timestamp.
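A brief sketch of the toNanos/toTimestamp round trip, assuming toNanos returns the BigDecimal accepted by toTimestamp; the timestamp value is arbitrary.
    import com.google.cloud.Timestamp;
    import java.math.BigDecimal;
    import org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils;

    // Convert a Timestamp to its nanosecond representation and back (arbitrary example value).
    Timestamp ts = Timestamp.ofTimeSecondsAndNanos(1700000000L, 123);
    BigDecimal nanos = TimestampUtils.toNanos(ts);
    Timestamp roundTripped = TimestampUtils.toTimestamp(nanos);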
TrackerWithProgress - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
TrackerWithProgress() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.TrackerWithProgress
 
Transaction - Class in org.apache.beam.sdk.io.gcp.spanner
A transaction object.
Transaction() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.Transaction
 
transactionId() - Method in class org.apache.beam.sdk.io.gcp.spanner.Transaction
 
TransactionResult(T, Timestamp) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
 
tryClaim(StreamProgress) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
Claims a new StreamProgress to be processed.
tryClaim(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.DetectNewPartitionsRangeTracker
Attempts to claim the given position.
tryClaim(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.ReadChangeStreamPartitionRangeTracker
Attempts to claim the given position.
tryClaim(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
Attempts to claim the given position.
tryClaim(Timestamp, PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
 
trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.DetectNewPartitionsTracker
 
trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.restriction.ReadChangeStreamPartitionProgressTracker
Splits the work that's left.
trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.ReadChangeStreamPartitionRangeTracker
If the partition token is the InitialPartition.PARTITION_TOKEN, it does not allow for splits (returns null).
trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
Splits the restriction through the following algorithm:
TypeCode - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Represents a type of a column within Cloud Spanner.
TypeCode(String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
Constructs a type code from the given String code.
TypedRead() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
typeToProtoType(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 

U

UnboundedReaderImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
UnboundedSourceImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
 
UniqueIdGenerator - Class in org.apache.beam.sdk.io.gcp.bigtable.changestreams
Generate unique IDs that can be used to differentiate different jobs and partitions.
UniqueIdGenerator() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.changestreams.UniqueIdGenerator
 
unpin() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
Unpin this object.
update(Instant, T) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.estimator.BytesThroughputEstimator
Updates the estimator with the bytes of records if it is selected to be sampled.
update(Timestamp, T) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
Updates the estimator with the bytes of records.
update(Timestamp, T) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.NullThroughputEstimator
NoOp.
update(Timestamp, T) - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.ThroughputEstimator
Updates the estimator with the size of the records.
UPDATE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
 
UpdateBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.UpdateBuilder
 
updateDataClientSettings(BigtableDataSettings.Builder) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableClientOverride
Update BigtableDataSettings.Builder with custom configurations.
updateDataRecordCommittedToEmitted(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
 
updateDetectNewPartitionWatermark(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Update the watermark cell for Detect New Partition step.
updateInstanceAdminClientSettings(BigtableInstanceAdminSettings.Builder) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableClientOverride
Update BigtableInstanceAdminSettings.Builder with custom configurations.
updatePartitionCreatedToScheduled(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Adds measurement of an instance for the ChangeStreamMetrics.PARTITION_CREATED_TO_SCHEDULED_MS if the metric is enabled.
updatePartitionScheduledToRunning(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
Adds measurement of an instance for the ChangeStreamMetrics.PARTITION_SCHEDULED_TO_RUNNING_MS if the metric is enabled.
updateProcessingDelayFromCommitTimestamp(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.ChangeStreamMetrics
Adds measurement of an instance for the ChangeStreamMetrics.PROCESSING_DELAY_FROM_COMMIT_TIMESTAMP.
UpdateSchemaDestination<DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
Update destination schema based on data that is about to be copied into it.
UpdateSchemaDestination(BigQueryServices, PCollectionView<String>, ValueProvider<String>, BigQueryIO.Write.WriteDisposition, BigQueryIO.Write.CreateDisposition, int, String, Set<BigQueryIO.Write.SchemaUpdateOption>, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
 
updateTableAdminClientSettings(BigtableTableAdminSettings.Builder) - Method in interface org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.BigtableClientOverride
Update BigtableTableAdminSettings.Builder with custom configurations.
updateTableSchema(String, String, String, TableSchema) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
 
updateTableSchema(TableReference, TableSchema) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
 
updateToFinished(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
Updates a partition row to PartitionMetadata.State.FINISHED state.
updateToFinished(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Updates a partition row to PartitionMetadata.State.FINISHED state.
updateToRunning(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
Updates a partition row to PartitionMetadata.State.RUNNING state.
updateToRunning(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Updates a partition row to PartitionMetadata.State.RUNNING state.
updateToScheduled(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
Updates multiple partition rows to PartitionMetadata.State.SCHEDULED state.
updateToScheduled(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Updates multiple partition rows to PartitionMetadata.State.SCHEDULED state.
updateWatermark(Range.ByteStringRange, Instant, ChangeStreamContinuationToken) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Update the metadata for the row key represented by the partition.
updateWatermark(String, Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
Update the partition watermark to the given timestamp.
updateWatermark(String, Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
Update the partition watermark to the given timestamp.
uploadToDicomStore(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
Upload to a Dicom Store.
uploadToDicomStore(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
 
URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead
 
URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite
 
useAvroLogicalTypes() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
useAvroLogicalTypes() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Enables interpreting logical types into their corresponding types (ie.
useBeamSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If true, then the BigQuery schema will be inferred from the input schema.
usingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Enables BigQuery's Standard SQL dialect when reading from a query.
usingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
Uuid - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
A Uuid storable in a Pub/Sub Lite attribute.
Uuid() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
 
UuidCoder - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
A coder for a Uuid.
UuidCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
 
UuidDeduplicationOptions - Class in org.apache.beam.sdk.io.gcp.pubsublite
Options for deduplicating Pub/Sub Lite messages based on the UUID they were published with.
UuidDeduplicationOptions() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
 
UuidDeduplicationOptions.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
 
UuidDeduplicationTransform - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
A transform for deduplicating Pub/Sub Lite messages based on the UUID they were published with.
UuidDeduplicationTransform(UuidDeduplicationOptions) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidDeduplicationTransform
 
uuidExtractor() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
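A minimal sketch of wiring the deduplication step into a Pub/Sub Lite pipeline, assuming `messages` is a PCollection<SequencedMessage> produced elsewhere (for example by a Pub/Sub Lite read):

    import com.google.cloud.pubsublite.proto.SequencedMessage;
    import org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO;
    import org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions;
    import org.apache.beam.sdk.values.PCollection;

    // Deduplicate messages by the UUID attribute they were published with,
    // using the default UuidDeduplicationOptions.
    PCollection<SequencedMessage> deduplicated =
        messages.apply(PubsubLiteIO.deduplicate(UuidDeduplicationOptions.newBuilder().build()));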
 

V

v1() - Static method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreIO
Returns a DatastoreV1 that provides an API for accessing Cloud Datastore through the v1 version of the Datastore client library.
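A minimal sketch of a Datastore read built on DatastoreIO.v1() (the project id and GQL query below are placeholders):

    import com.google.datastore.v1.Entity;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.datastore.DatastoreIO;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create();
    // Reads entities matching a literal GQL query from the given project.
    PCollection<Entity> entities =
        p.apply(DatastoreIO.v1().read()
            .withProjectId("my-project")
            .withLiteralGqlQuery("SELECT * FROM Task"));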
v1() - Static method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreIO
 
V1_READ_OVERRIDE - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscribeTransform
 
VALID_DATA_FORMATS - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
VALID_DATA_FORMATS - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
VALID_DATA_FORMATS - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
VALID_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubReadSchemaTransformProvider
 
VALID_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubWriteSchemaTransformProvider
 
VALID_FORMATS_STR - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteReadSchemaTransformProvider
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryFileLoadsWriteSchemaTransformProvider.BigQueryWriteSchemaTransform
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
validate() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
 
validate() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryStorageWriteApiSchemaTransformProvider.BigQueryStorageWriteApiSchemaTransformConfiguration
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
 
validate() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableReadSchemaTransformProvider.BigtableReadSchemaTransformConfiguration
validate() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteSchemaTransformProvider.BigtableWriteSchemaTransformConfiguration
Validates the configuration object.
validate() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
 
VALUE - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
 
value() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
 
ValueCaptureType - Enum in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
Represents the capture type of a change stream.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.SchemaUpdateOption
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.TruncateTimestamps
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation.MutationType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ExistingPipelineOptions
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.WriteMethod
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.WriteMethod
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
Returns the enum constant of this type with the specified name.
valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
Returns the enum constant of this type with the specified name.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.SchemaUpdateOption
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.TruncateTimestamps
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.RowMutationInformation.MutationType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ExistingPipelineOptions
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.WriteMethod
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.WriteMethod
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
Returns an array containing the constants of this enum type, in the order they are declared.
values() - Static method in enum org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
Returns an array containing the constants of this enum type, in the order they are declared.
verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
 
verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder

W

waitForNMessages(int, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
Repeatedly pulls messages from TestPubsub.subscriptionPath(); returns after receiving n messages or after waiting for timeoutDuration.
waitForStart(Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
Returns a future that waits up to duration for a start signal.
waitForSuccess(Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
Waits up to duration for a success signal.
waitForUpTo(Duration) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.TestPubsub.PollingAssertion
 
WebPathParser - Class in org.apache.beam.sdk.io.gcp.healthcare
 
WebPathParser() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser
 
WebPathParser.DicomWebPath - Class in org.apache.beam.sdk.io.gcp.healthcare
 
withAppProfileId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read using the specified app profile id.
withAppProfileId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read using the specified app profile id.
withAppProfileId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will stream from the cluster specified by app profile id.
withAppProfileId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write using the specified app profile id.
withAppProfileId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write using the specified app profile id.
withAttemptTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read with the attempt timeout.
withAttemptTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write with the attempt timeout.
withAutoSchemaUpdate(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If true, enables automatically detecting BigQuery table schema updates.
withAutoSharding() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If true, enables using a dynamically determined number of shards to write to BigQuery.
withAvroFormatFunction(SerializableFunction<AvroWriteRequest<T>, GenericRecord>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Formats the user's type into a GenericRecord to be written to BigQuery.
withAvroSchemaFactory(SerializableFunction<TableSchema, Schema>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Uses the specified function to convert a TableSchema to a Schema.
withAvroWriter(SerializableFunction<Schema, DatumWriter<T>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Writes the user's type as avro using the supplied DatumWriter.
withAvroWriter(SerializableFunction<AvroWriteRequest<T>, AvroT>, SerializableFunction<Schema, DatumWriter<AvroT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Converts the user's type to an Avro record using the supplied avroFormatFunction.
withBatching(boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
If true, uses the Cloud Spanner batch API.
withBatching(boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
By default the PartitionQuery API is used to read data from Cloud Spanner.
withBatchInitialCount(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the initial size of a batch; used in the absence of the QoS system having significant data to determine a better batch size.
withBatchMaxBytes(long) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the maximum number of bytes to include in a batch.
withBatchMaxCount(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the maximum number of writes to include in a batch.
withBatchSizeBytes(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the batch size limit (max number of bytes mutated per batch).
withBatchTargetLatency(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Target latency for batch requests.
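A minimal sketch of building RpcQosOptions with these batch settings, assuming the builder is obtained via RpcQosOptions.newBuilder() (the specific values are arbitrary examples):

    import org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions;
    import org.joda.time.Duration;

    // Batch sizing and retry limits for Firestore RPCs; values are illustrative only.
    RpcQosOptions qosOptions = RpcQosOptions.newBuilder()
        .withBatchMaxCount(200)                           // at most 200 writes per batch
        .withBatchMaxBytes(5L * 1024 * 1024)              // at most ~5 MB per batch
        .withBatchTargetLatency(Duration.standardSeconds(5))
        .withMaxAttempts(5)
        .build();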
withBeamRowConverters(TypeDescriptor<T>, BigQueryIO.TypedRead.ToBeamRowFunction<T>, BigQueryIO.TypedRead.FromBeamRowFunction<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
Sets the functions to convert elements to/from Row objects.
withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
Deprecated.
please set the options directly in BigtableIO.
withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Deprecated.
please set the configurations directly: BigtableIO.read().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
withBigtableOptions(BigtableOptions.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Deprecated.
please set the configurations directly: BigtableIO.read().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Deprecated.
please configure the write options directly: BigtableIO.write().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
withBigtableOptions(BigtableOptions.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Deprecated.
please configure the write options directly: BigtableIO.write().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
withBigtableOptionsConfigurator(SerializableFunction<BigtableOptions.Builder, BigtableOptions.Builder>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
Deprecated.
please set the options directly in BigtableIO.
withBigtableOptionsConfigurator(SerializableFunction<BigtableOptions.Builder, BigtableOptions.Builder>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Deprecated.
please set the configurations directly: BigtableIO.read().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
withBigtableOptionsConfigurator(SerializableFunction<BigtableOptions.Builder, BigtableOptions.Builder>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Deprecated.
please configure the write options directly: BigtableIO.write().withProjectId(projectId).withInstanceId(instanceId).withTableId(tableId) and set credentials in PipelineOptions.
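The deprecation notes above recommend configuring BigtableIO directly; a minimal sketch of that style (project, instance, and table ids are placeholders):

    import com.google.bigtable.v2.Row;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigtable.BigtableIO;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create();
    // Configure the read directly instead of through BigtableOptions.
    PCollection<Row> rows =
        p.apply(BigtableIO.read()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withTableId("my-table"));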
withChangeStreamName(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that uses changeStreamName as prefix for the metadata table.
withChangeStreamName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the change stream name.
withClientFactory(PubsubClient.PubsubClientFactory) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
The default client used to read from Pub/Sub is the PubsubJsonClient, created by the PubsubJsonClient.PubsubJsonClientFactory.
withClientFactory(PubsubClient.PubsubClientFactory) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
The default client to write to Pub/Sub is the PubsubJsonClient, created by the PubsubJsonClient.PubsubJsonClientFactory.
withClustering(Clustering) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specifies the clustering fields to use when writing to a single output table.
withClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
Sets a Coder for the result of the parse function.
withCoderAndParseFn(Coder<T>, SimpleFunction<PubsubMessage, T>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Causes the source to return a PubsubMessage that includes Pubsub attributes, and uses the given parsing function to transform the PubsubMessage into an output type.
withColumns(String...) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withColumns(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withColumns(String...) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withColumns(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withCommitDeadline(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the commit deadline.
withCommitDeadline(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the commit deadline.
withCommitDeadline(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the deadline for the Commit API call.
withCommitRetrySettings(RetrySettings) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the commit retry settings.
withCreateDisposition(BigQueryIO.Write.CreateDisposition) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specifies whether the table should be created if it does not exist.
withCreateOrUpdateMetadataTable(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that, if set to true, will create or update the metadata table before launching the pipeline.
withCustomGcsTempLocation(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Provides a custom location on GCS for storing temporary files to be loaded via BigQuery batch load jobs.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Returns a new DatastoreV1.DeleteEntity that deletes entities from the Cloud Datastore for the specified database.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Same as DatastoreV1.DeleteEntity.withDatabaseId(String) but with a ValueProvider.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Returns a new DatastoreV1.DeleteKey that deletes entities from the Cloud Datastore for the specified database.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Same as DatastoreV1.DeleteKey.withDatabaseId(String) but with a ValueProvider.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads from the Cloud Datastore for the specified database.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Returns a new DatastoreV1.Write that writes to the Cloud Datastore for the database id.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Same as DatastoreV1.Write.withDatabaseId(String) but with a ValueProvider.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner database ID.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner database ID.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner database.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner database.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner database.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner database.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner database.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner database.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the Cloud Spanner database.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the Cloud Spanner database.
withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner database.
withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner database.
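A minimal sketch of a SpannerIO read using these instance and database setters (ids and the query are placeholders):

    import com.google.cloud.spanner.Struct;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create();
    // Reads rows from Cloud Spanner with a SQL query.
    PCollection<Struct> results =
        p.apply(SpannerIO.read()
            .withInstanceId("my-instance")
            .withDatabaseId("my-database")
            .withQuery("SELECT id, name FROM users"));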
withDatabaseRole(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner database role.
withDataBoostEnabled(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies whether the pipeline should run on independent compute resources (Spanner Data Boost).
withDatasetService(FakeDatasetService) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
withDeadLetterQueue() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary.Builder
 
withDeadLetterTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Creates and returns a transform for writing read failures out to a dead-letter topic.
withDeadLetterTopic(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
Like PubsubIO.Read.withDeadLetterTopic(String) but with a ValueProvider.
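A minimal sketch combining a subscription read with a dead-letter topic (paths are placeholders):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create();
    // Messages that the read transform cannot handle are forwarded to the dead-letter topic.
    PCollection<PubsubMessage> messages =
        p.apply(PubsubIO.readMessagesWithAttributes()
            .fromSubscription("projects/my-project/subscriptions/my-subscription")
            .withDeadLetterTopic("projects/my-project/topics/my-dead-letter"));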
withDefaultMissingValueInterpretation(AppendRowsRequest.MissingValueInterpretation) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specify how missing values should be interpreted when there is a default value in the schema.
withDeterministicRecordIdFn(SerializableFunction<T, String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
withDialectView(PCollectionView<Dialect>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
withDirectWriteProtos(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
withEmulator(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
 
withEmulator(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will use an official Bigtable emulator.
withEmulator(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will use an official Bigtable emulator.
withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner host, when an emulator is used.
withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner emulator host.
withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner emulator host.
withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner emulator host.
withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner emulator host.
withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
withErrorHandler(ErrorHandler<BadRecord, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
withExecuteStreamingSqlRetrySettings(RetrySettings) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the ExecuteStreamingSql retry settings.
withExistingPipelineOptions(BigtableIO.ExistingPipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that decides what to do if an existing pipeline exists with the same change stream name.
withExtendedErrorInfo() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Enables extended error information by enabling WriteResult.getFailedInsertsWithErr().
withExtendedErrorInfo(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
Specify whether to use extended error info or not.
withFailedInsertRetryPolicy(InsertRetryPolicy) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specifies a policy for handling failed inserts.
withFailureMode(SpannerIO.FailureMode) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies failure mode.
withFlowControl(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write with flow control enabled if enableFlowControl is true.
withFormat(DataFormat) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
See DataFormat.
withFormatFunction(SerializableFunction<T, TableRow>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Formats the user's type into a TableRow to be written to BigQuery.
withFormatRecordOnFailureFunction(SerializableFunction<T, TableRow>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If an insert failure occurs, this function is applied to the originally supplied row T.
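A minimal sketch of a write that uses a format function (MyEvent, its accessors, and the table/schema details are hypothetical examples):

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition;

    // Assuming `events` is a PCollection<MyEvent> and MyEvent exposes getName()/getCount().
    events.apply(BigQueryIO.<MyEvent>write()
        .to("my-project:my_dataset.my_table")
        .withFormatFunction(e -> new TableRow().set("name", e.getName()).set("count", e.getCount()))
        .withJsonSchema("{\"fields\":[{\"name\":\"name\",\"type\":\"STRING\"},{\"name\":\"count\",\"type\":\"INT64\"}]}")
        .withCreateDisposition(CreateDisposition.CREATE_IF_NEEDED)
        .withWriteDisposition(WriteDisposition.WRITE_APPEND));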
withGroupingFactor(int) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the multiple of max mutation (in terms of both bytes per batch and cells per batch) that is used to select a set of mutations to sort by key for batching.
withHighPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withHighPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
withHighPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
withHintMaxNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Provide a hint to the QoS system for the intended max number of workers for a pipeline.
withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Returns a new DatastoreV1.DeleteEntity with a different worker count hint for ramp-up throttling.
withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Same as DatastoreV1.DeleteEntity.withHintNumWorkers(int) but with a ValueProvider.
withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Returns a new DatastoreV1.DeleteKey with a different worker count hint for ramp-up throttling.
withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Same as DatastoreV1.DeleteKey.withHintNumWorkers(int) but with a ValueProvider.
withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Returns a new DatastoreV1.Write with a different worker count hint for ramp-up throttling.
withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Same as DatastoreV1.Write.withHintNumWorkers(int) but with a ValueProvider.
withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner host.
withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner host.
withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner host.
withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner host.
withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner host.
withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner host.
withIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
When reading from Cloud Pub/Sub where unique record identifiers are provided as Pub/Sub message attributes, specifies the name of the attribute containing the unique identifier.
withIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Writes to Pub/Sub, adding each record's unique identifier to the published messages in an attribute with the specified name.
withInclusiveEndAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the end time of the change stream.
withInclusiveStartAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the time that the change stream should be read from.
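A minimal sketch of a Spanner change stream read using only the setters listed here (ids, stream name, and start time are placeholders; the project is assumed to come from the pipeline options):

    import com.google.cloud.Timestamp;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
    import org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline p = Pipeline.create();
    // Streams data change records from "my_change_stream", starting from now.
    PCollection<DataChangeRecord> records =
        p.apply(SpannerIO.readChangeStream()
            .withInstanceId("my-instance")
            .withDatabaseId("my-database")
            .withChangeStreamName("my_change_stream")
            .withInclusiveStartAt(Timestamp.now()));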
withIndex(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withIndex(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withInitialBackoff(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the initial backoff duration to be used before retrying a request for the first time.
withInitialSplitDuration(Duration) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.ListHL7v2Messages
 
withInsertRetryPolicy(InsertRetryPolicy) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
Specify a retry policy for failed inserts.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
 
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the Cloud Bigtable instance indicated by the given parameter; requires BigtableIO.Read.withProjectId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the project.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the Cloud Bigtable instance indicated by the given parameter; requires BigtableIO.Read.withProjectId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the project.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will stream from the Cloud Bigtable instance indicated by the given parameter; requires BigtableIO.ReadChangeStream.withProjectId(java.lang.String) to be called to determine the project.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write into the Cloud Bigtable instance indicated by the given parameter; requires BigtableIO.Write.withProjectId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the project.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write into the Cloud Bigtable instance indicated by the given parameter; requires BigtableIO.Write.withProjectId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the project.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner instance ID.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner instance ID.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner instance.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner instance.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner instance.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner instance.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner instance.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner instance.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the Cloud Spanner instance.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the Cloud Spanner instance.
withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner instance.
withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner instance.
withIsLocalChannelProvider(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies whether a local channel provider should be used.
withJobService(BigQueryServices.JobService) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
withJsonSchema(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Similar to BigQueryIO.Write.withSchema(TableSchema) but takes in a JSON-serialized TableSchema.
withJsonSchema(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Same as BigQueryIO.Write.withJsonSchema(String) but using a deferred ValueProvider.
withJsonTimePartitioning(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
withKeyRange(ByteKeyRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read only rows in the specified range.
withKeyRanges(ValueProvider<List<ByteKeyRange>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read only rows in the specified ranges.
withKeyRanges(List<ByteKeyRange>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read only rows in the specified ranges.
withKeySet(KeySet) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withKeySet(KeySet) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withKmsKey(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
For query sources, use this Cloud KMS key to encrypt any temporary tables created.
withKmsKey(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
withLiteralGqlQuery(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads the results of the specified GQL query.
withLiteralGqlQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Same as DatastoreV1.Read.withLiteralGqlQuery(String) but with a ValueProvider.
withLoadJobProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Set the project the BigQuery load job will be initiated from.
withLoadJobProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Returns a new DatastoreV1.DeleteEntity that deletes entities from the Cloud Datastore Emulator running locally on the specified host port.
withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Returns a new DatastoreV1.DeleteKey that deletes entities from the Cloud Datastore Emulator running locally on the specified host port.
withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads from a Datastore Emulator running at the given localhost address.
withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Returns a new DatastoreV1.Write that writes to the Cloud Datastore Emulator running locally on the specified host port.
withLowPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withLowPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
withLowPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
withMaxAttempts(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the maximum number of times a request will be attempted for a complete successful result.
withMaxBatchBytesSize(int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Writes to Pub/Sub are generally limited to 10 MB.
withMaxBatchSize(int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Writes to Pub/Sub are batched to efficiently send data.
withMaxBufferElementCount(Integer) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will break up read requests into smaller batches.
withMaxBytesPerBatch(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write with the max bytes a batch can have.
withMaxBytesPerPartition(long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Control how much data will be assigned to a single BigQuery load job.
withMaxCumulativeBackoff(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the maximum cumulative backoff.
withMaxCumulativeBackoff(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the maximum cumulative backoff.
withMaxCumulativeBackoff(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the maximum cumulative backoff time when retrying after DEADLINE_EXCEEDED errors.
withMaxElementsPerBatch(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write with the max elements a batch can have.
withMaxFilesPerBundle(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Control how many files will be written concurrently by a single worker when using BigQuery load jobs before spilling to a shuffle.
withMaxNumMutations(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the cell mutation limit (maximum number of mutated cells per batch).
withMaxNumRows(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the row mutation limit (maximum number of mutated rows per batch).
withMaxOutstandingBytes(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write with the max number of outstanding bytes allowed before enforcing flow control.
withMaxOutstandingElements(long) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write with the max number of outstanding elements allowed before enforcing flow control.
withMaxRetryJobs(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If set, this will set the max number of retry of batch load jobs.
withMetadataDatabase(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the metadata database.
withMetadataInstance(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the metadata database.
withMetadataTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the metadata table name.
withMetadataTableAppProfileId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will use the cluster specified by app profile id to store the metadata of the stream.
withMetadataTableInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will use the Cloud Bigtable instance indicated by the given parameter to manage the metadata of the stream.
withMetadataTableProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will use the Cloud Bigtable project indicated by the given parameter to manage the metadata of the stream.
withMetadataTableTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will use the specified table to store the metadata of the stream.
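A minimal sketch of a Bigtable change stream read using these setters (ids and the change stream name are placeholders; withTableId is assumed to name the table being streamed):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigtable.BigtableIO;

    Pipeline p = Pipeline.create();
    // Stream changes from a table, identifying this pipeline by its change stream name.
    p.apply(BigtableIO.readChangeStream()
        .withProjectId("my-project")
        .withInstanceId("my-instance")
        .withTableId("my-table")
        .withAppProfileId("my-app-profile")
        .withChangeStreamName("my-change-stream"));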
withMethod(BigQueryIO.TypedRead.Method) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
withMethod(BigQueryIO.Write.Method) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Choose the method used to write data to BigQuery.
withNameOnlyQuery() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
Update produced queries to only retrieve their __name__, thereby not retrieving any fields and reducing resource requirements.
withNamespace(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads from the given namespace.
withNamespace(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Same as DatastoreV1.Read.withNamespace(String) but with a ValueProvider.
withNumberOfRecordsRead(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the number of records read in the partition change stream query before reading this record.
withNumFileShards(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Control how many file shards are written when using BigQuery load jobs.
withNumQuerySplits(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads by splitting the given query into numQuerySplits.
withNumStorageWriteApiStreams(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Control how many parallel streams are used when using Storage API writes.
withOperationTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read with the operation timeout.
withOperationTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write with the operation timeout.
withoutResultFlattening() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
withoutResultFlattening() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Disable validation that the table exists or the query succeeds prior to pipeline submission.
withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Disables BigQuery table validation.
withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Disables validation that the table being read from exists.
withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Disables validation that the table being written to exists.
withOverloadRatio(double) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
The target ratio between requests sent and successful requests.
withPartitionCreatedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the time at which this partition was first detected and created in the metadata table.
withPartitionEndTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the end time for the partition change stream query that originated this record.
withPartitionOptions(PartitionOptions) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withPartitionOptions(PartitionOptions) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Note that PartitionOptions are currently ignored.
withPartitionQueryTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the PartitionQuery timeout.
withPartitionQueryTimeout(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the PartitionQuery timeout.
withPartitionReadTimeout(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the PartitionRead timeout.
withPartitionReadTimeout(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the PartitionRead timeout.
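A sketch of assembling a SpannerConfig that sets the partition timeouts indexed here; the IDs and durations are hypothetical:

    import org.apache.beam.sdk.io.gcp.spanner.SpannerConfig;
    import org.joda.time.Duration;

    SpannerConfig spannerConfig =
        SpannerConfig.create()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withDatabaseId("my-database")
            .withPartitionQueryTimeout(Duration.standardMinutes(5))
            .withPartitionReadTimeout(Duration.standardMinutes(5));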
withPartitionRunningAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the time at which the connector started processing this partition.
withPartitionScheduledAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the time at which this partition was scheduled to be queried.
withPartitionStartTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the start time for the partition change stream query that originated this record.
withPartitionToken(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the partition token where this record originated from.
withPrimaryKey(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
 
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the Cloud Bigtable project indicated by the given parameter; requires BigtableIO.Read.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the Cloud Bigtable project indicated by the given parameter; requires BigtableIO.Read.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will stream from the Cloud Bigtable project indicated by the given parameter; requires BigtableIO.ReadChangeStream.withInstanceId(java.lang.String) to be called to determine the instance.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write into the Cloud Bigtable project indicated by the given parameter; requires BigtableIO.Write.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write into the Cloud Bigtable project indicated by the given parameter; requires BigtableIO.Write.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
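For illustration (hypothetical IDs), the usual way these BigtableIO.Write setters are chained; the resulting transform is then applied to a PCollection of KV<ByteString, Iterable<Mutation>>:

    import org.apache.beam.sdk.io.gcp.bigtable.BigtableIO;

    BigtableIO.Write bigtableWrite =
        BigtableIO.write()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withTableId("my-table");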
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Returns a new DatastoreV1.DeleteEntity that deletes entities from the Cloud Datastore for the specified project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Same as DatastoreV1.DeleteEntity.withProjectId(String) but with a ValueProvider.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Returns a new DatastoreV1.DeleteKey that deletes entities from the Cloud Datastore for the specified project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Same as DatastoreV1.DeleteKey.withProjectId(String) but with a ValueProvider.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads from the Cloud Datastore for the specified project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Same as DatastoreV1.Read.withProjectId(String) but with a ValueProvider.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Returns a new DatastoreV1.Write that writes to the Cloud Datastore for the default database.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Same as DatastoreV1.Write.withProjectId(String) but with a ValueProvider.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner project ID.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the Cloud Spanner project ID.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner project.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner project.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner project.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the Cloud Spanner project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the Cloud Spanner project.
withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner project.
withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner project.
withPropagateSuccessfulStorageApiWrites(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If set to true, all successful writes will be propagated to WriteResult and made accessible via the WriteResult.getSuccessfulStorageApiInserts() method.
withPubsubRootUrl(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
withQuery(Query) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads the results of the specified query.
withQuery(Statement) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withQuery(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withQuery(Statement) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withQuery(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
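A sketch (hypothetical IDs and SQL) of a SpannerIO.Read configured with withSpannerConfig and withQuery; applying it yields a PCollection<Struct>:

    import org.apache.beam.sdk.io.gcp.spanner.SpannerConfig;
    import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;

    SpannerIO.Read spannerRead =
        SpannerIO.read()
            .withSpannerConfig(SpannerConfig.create()
                .withProjectId("my-project")
                .withInstanceId("my-instance")
                .withDatabaseId("my-database"))
            .withQuery("SELECT id, name FROM users");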
withQueryLocation(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
BigQuery geographic location where the query job will be executed.
withQueryName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withQueryName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withQueryPriority(BigQueryIO.TypedRead.QueryPriority) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
withQueryStartedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the time at which the change stream query that produced this record started.
withQueryTempDataset(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
Temporary dataset reference when using BigQueryIO.TypedRead.fromQuery(String).
withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
Returns a new DatastoreV1.DeleteEntity that does not throttle during ramp-up.
withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
Returns a new DatastoreV1.DeleteKey that does not throttle during ramp-up.
withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
Returns a new DatastoreV1.Write that does not throttle during ramp-up.
withReadOperation(ReadOperation) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withReadTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
Returns a new DatastoreV1.Read that reads at the specified readTime.
withRecordReadAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the time at which the record was fully read.
withRecordStreamEndedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the time at which the record finished streaming.
withRecordStreamStartedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the time at which the record started to be streamed.
withRecordTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the timestamp of when this record occurred.
withReportDiagnosticMetrics() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Whether additional diagnostic metrics should be reported for a Transform.
withRetryableCodes(ImmutableSet<StatusCode.Code>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the errors that will be retried by the client library for all operations.
withRowFilter(ValueProvider<RowFilter>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will filter the rows read from Cloud Bigtable using the given row filter.
withRowFilter(RowFilter) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will filter the rows read from Cloud Bigtable using the given row filter.
withRowMutationInformationFn(SerializableFunction<T, RowMutationInformation>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Allows upserting and deleting rows for tables with a primary key defined.
withRowRestriction(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
withRowRestriction(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
Read only rows which match the specified filter, which must be a SQL expression compatible with Google standard SQL.
withRpcPriority(Options.RpcPriority) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the RPC priority.
withRpcPriority(ValueProvider<Options.RpcPriority>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
Specifies the RPC priority.
withRpcPriority(Options.RpcPriority) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the priority of the change stream queries.
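A sketch of a Spanner change stream read with an explicit RPC priority; the IDs and stream name are hypothetical, and the companion setters withChangeStreamName and withInclusiveStartAt (not listed in this part of the index) are assumptions:

    import com.google.cloud.Timestamp;
    import com.google.cloud.spanner.Options;
    import org.apache.beam.sdk.io.gcp.spanner.SpannerConfig;
    import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;

    SpannerIO.ReadChangeStream changeStreamRead =
        SpannerIO.readChangeStream()
            .withSpannerConfig(SpannerConfig.create()
                .withProjectId("my-project")
                .withInstanceId("my-instance")
                .withDatabaseId("my-database"))
            .withChangeStreamName("my_change_stream")   // assumed stream name
            .withInclusiveStartAt(Timestamp.now())
            .withRpcPriority(Options.RpcPriority.MEDIUM);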
withSamplePeriod(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the length of time sampled request data will be retained.
withSamplePeriodBucketSize(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the size of buckets within the specified samplePeriod.
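For illustration, building an RpcQosOptions instance from the Builder setters indexed in this section; RpcQosOptions.newBuilder() is assumed as the entry point and every value shown is hypothetical:

    import org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions;
    import org.joda.time.Duration;

    RpcQosOptions rpcQosOptions =
        RpcQosOptions.newBuilder()
            .withSamplePeriod(Duration.standardMinutes(2))
            .withSamplePeriodBucketSize(Duration.standardSeconds(10))
            .withOverloadRatio(1.05)
            .withThrottleDuration(Duration.standardSeconds(5))
            .withReportDiagnosticMetrics()
            .build();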
withSchema(TableSchema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Uses the specified schema for rows to be written.
withSchema(ValueProvider<TableSchema>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Same as BigQueryIO.Write.withSchema(TableSchema) but using a deferred ValueProvider.
withSchemaFromView(PCollectionView<Map<String, String>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Allows the schemas for each table to be computed within the pipeline itself.
withSchemaReadySignal(PCollection<?>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies an optional input PCollection that can be used as the signal for Wait.OnSignal to indicate when the database schema is ready to be read.
withSchemaUpdateOptions(Set<BigQueryIO.Write.SchemaUpdateOption>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Allows the schema of the destination table to be updated as a side effect of the write.
withSelectedFields(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
withSelectedFields(ValueProvider<List<String>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
Read only the specified fields (columns) from a BigQuery table.
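A sketch (hypothetical table, fields, and filter) combining withSelectedFields and withRowRestriction with the Storage Read API method:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;

    BigQueryIO.TypedRead<TableRow> directRead =
        BigQueryIO.readTableRows()
            .from("my-project:my_dataset.my_table")
            .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
            .withSelectedFields(Arrays.asList("id", "name"))
            .withRowRestriction("state = 'CA'");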
withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
Specifies the Cloud Spanner configuration.
withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
Specifies the Cloud Spanner configuration.
withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
Specifies the Cloud Spanner configuration.
withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Specifies the Cloud Spanner configuration.
withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
Specifies the Cloud Spanner configuration.
withStartTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will start streaming at the specified start time.
withStorageClient(BigQueryServices.StorageClient) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
 
withSuccessfulInsertsPropagation(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
If true, enables propagation of the TableRows successfully inserted into BigQuery as part of the WriteResult object when using BigQueryIO.Write.Method.STREAMING_INSERTS.
withTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
 
withTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withTableDescription(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specifies the table description.
withTableId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the specified table.
withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
Returns a new BigtableIO.Read that will read from the specified table.
withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.ReadChangeStream
Returns a new BigtableIO.ReadChangeStream that will stream from the specified table.
withTableId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write to the specified table.
withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a new BigtableIO.Write that will write to the specified table.
withTableReference(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
 
withTemplateCompatibility() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
Use new template-compatible source implementation.
withTemplateCompatibility() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
withTestServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
 
withTestServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
 
withTestServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
withThrottleDuration(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
Configure the amount of time an attempt will be throttled if deemed necessary based on previous success rate.
withTimePartitioning(TimePartitioning) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Allows newly created tables to include a TimePartitioning class.
withTimePartitioning(ValueProvider<TimePartitioning>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Like BigQueryIO.Write.withTimePartitioning(TimePartitioning) but using a deferred ValueProvider.
withTimestamp(Instant) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
 
withTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
withTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
When reading from Cloud Pub/Sub where record timestamps are provided as Pub/Sub message attributes, specifies the name of the attribute that contains the timestamp.
withTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
Writes to Pub/Sub and adds each record's timestamp to the published messages in an attribute with the specified name.
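A minimal sketch (hypothetical topic and attribute name) of publishing strings with a timestamp attribute:

    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;

    PubsubIO.Write<String> pubsubWrite =
        PubsubIO.writeStrings()
            .to("projects/my-project/topics/my-topic")
            .withTimestampAttribute("event_time");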
withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
 
withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
withTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
 
withTotalStreamTimeMillis(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
Sets the total streaming time (in millis) for this record.
withTraceSampleProbability(Double) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
Deprecated.
This configuration has no effect, as tracing is not available.
withTransaction(PCollectionView<Transaction>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
 
withTransaction(PCollectionView<Transaction>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
 
withTriggeringFrequency(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Choose the frequency at which file writes are triggered.
withValidate(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableConfig
 
withWriteDisposition(BigQueryIO.Write.WriteDisposition) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Specifies what to do with existing data in the table, in case the table already exists.
withWriteResults() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
Returns a BigtableIO.WriteWithResults that will emit a BigtableWriteResult for each batch of rows written.
withWriteTempDataset(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
Temporary dataset in which the write's temporary (staging) tables are created.
wrapDescriptorProto(DescriptorProtos.DescriptorProto) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
 
write() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
A PTransform that writes a PCollection to a BigQuery table.
Write() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
 
write() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
Creates an uninitialized BigtableIO.Write.
Write() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
 
write() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
Returns an empty DatastoreV1.Write builder.
write() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1
The class returned by this method provides the ability to create PTransforms for the write operations available in the Firestore V1 API, as provided by FirestoreStub.
Write() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
 
Write() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
 
Write() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
 
write(PublisherOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
Write messages to Pub/Sub Lite.
write(Object, Encoder) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder.TimestampEncoding
Serializes a Timestamp received as datum to the output encoder out.
write() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
Creates an uninitialized instance of SpannerIO.Write.
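For illustration (hypothetical IDs), a SpannerIO.Write configured directly; the transform is then applied to a PCollection<Mutation>:

    import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;

    SpannerIO.Write spannerWrite =
        SpannerIO.write()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withDatabaseId("my-database");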
Write() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
 
WRITE_URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
 
writeAvros(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes binary-encoded Avro messages of a given type to a Google Cloud Pub/Sub stream.
writeAvros(Class<T>, SerializableFunction<ValueInSingleWindow<T>, Map<String, String>>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes binary-encoded Avro messages of a given type to a Google Cloud Pub/Sub stream.
WriteBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.WriteBuilder
 
writeCallMetric(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
 
writeDetectNewPartitionMissingPartitions(HashMap<Range.ByteStringRange, Instant>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Writes serialized missing partitions, and how long they have been missing, to the metadata table.
writeDetectNewPartitionVersion() - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
Set the version number for DetectNewPartition.
WriteFailure(Write, WriteResult, Status) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
 
writeGenericRecords() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
A PTransform that writes a PCollection containing GenericRecords to a BigQuery table.
WriteGrouped(SpannerIO.Write) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
 
writeMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes to a Google Cloud Pub/Sub stream.
writeMessagesDynamic() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Enables dynamic destination topics.
writeNewPartition(NewPartition) - Method in class org.apache.beam.sdk.io.gcp.bigtable.changestreams.dao.MetadataTableDao
After a split or merge from a close stream, writes the new partition's information to the metadata table.
writeProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
A PTransform that writes a PCollection containing protocol buffer objects to a BigQuery table.
writeProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes binary-encoded protobuf messages of a given type to a Google Cloud Pub/Sub stream.
writeProtos(Class<T>, SerializableFunction<ValueInSingleWindow<T>, Map<String, String>>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes binary-encoded protobuf messages of a given type to a Google Cloud Pub/Sub stream.
WriteRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOTranslation.WriteRegistrar
 
WriteRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.WriteRegistrar
 
WriteResult - Class in org.apache.beam.sdk.io.gcp.bigquery
The result of a BigQueryIO.Write transform.
WriteStreamServiceImpl(BigQueryOptions) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.WriteStreamServiceImpl
 
writeStrings() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
Returns a PTransform that writes UTF-8 encoded strings to a Google Cloud Pub/Sub stream.
WriteSuccessSummary(int, long) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
 
writeTableRows() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
A PTransform that writes a PCollection containing TableRows to a BigQuery table.
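A sketch (hypothetical table and schema) of writeTableRows combined with the schema, disposition, and time-partitioning setters indexed above:

    import com.google.api.services.bigquery.model.TableFieldSchema;
    import com.google.api.services.bigquery.model.TableRow;
    import com.google.api.services.bigquery.model.TableSchema;
    import com.google.api.services.bigquery.model.TimePartitioning;
    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;

    // Hypothetical two-column schema.
    TableSchema schema = new TableSchema().setFields(Arrays.asList(
        new TableFieldSchema().setName("id").setType("INT64"),
        new TableFieldSchema().setName("event_time").setType("TIMESTAMP")));

    BigQueryIO.Write<TableRow> tableRowWrite =
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table")
            .withSchema(schema)
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
            .withTimePartitioning(new TimePartitioning().setType("DAY").setField("event_time"));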