A
- AbstractResult() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return the ack deadline, in seconds, for subscription.
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- ackDeadlineSeconds(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Acknowledge messages from subscription with ackIds.
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- acknowledge(PubsubClient.SubscriptionPath, List<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- ActionFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
Factory class for creating instances that will handle each type of record within a change stream query.
- ActionFactory() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
- ACTIVE_PARTITION_READ_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the active partition reads during the execution of the Connector.
- actuateProjectionPushdown(Map<TupleTag<?>, FieldAccessDescriptor>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- add(List<ValueInSingleWindow<T>>, TableDataInsertAllResponse.InsertErrors, TableReference, FailsafeValueInSingleWindow<TableRow, TableRow>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
- addUuids() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
-
Add UUIDs to to-be-published messages, ensuring that uniqueness is maintained.
- AddUuidsTransform - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
A transform to add UUIDs to each message to be written to Pub/Sub Lite.
- AddUuidsTransform() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.AddUuidsTransform
- advance() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
For subscription mode only: Track progression of time according to the Clock passed.
- advance() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- ALLOW_FIELD_ADDITION - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.SchemaUpdateOption
-
Allow adding a nullable field to the schema.
- ALLOW_FIELD_RELAXATION - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.SchemaUpdateOption
-
Allow relaxing a required field in the original schema to nullable.
- alwaysRetry() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
-
Always retry all failures.
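The InsertRetryPolicy.alwaysRetry() entry above is typically paired with a streaming-insert BigQueryIO.Write. A minimal, hedged sketch of that pairing (the class name and the table reference "my-project:my_dataset.my_table" are hypothetical, and the code assumes the Beam GCP IO dependency is on the classpath):

```java
// Sketch: configure a streaming-insert BigQuery write that retries every
// failed insert, using InsertRetryPolicy.alwaysRetry(). Hypothetical names.
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy;

public class RetryPolicyExample {
  // Builds (but does not run) a write transform with the retry policy attached.
  static BigQueryIO.Write<TableRow> alwaysRetryingWrite() {
    return BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.my_table") // hypothetical table reference
        .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
        .withFailedInsertRetryPolicy(InsertRetryPolicy.alwaysRetry());
  }
}
```

The transform would then be applied to a PCollection&lt;TableRow&gt; in a pipeline; retryTransientErrors() is the usual alternative when permanent failures should be surfaced instead.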
- appendRows(long, ProtoRows) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
-
Append rows to a Storage API write stream at the given offset.
- apply(byte[]) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly
- apply(byte[]) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePubsubMessageProtoAsPayload
- apply(HealthcareIOError<T>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
- apply(PubsubMessage) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePayloadAsPubsubMessageProto
- apply(Row) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation.ToBigtableRowFn
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
- apply(Statement, Description) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
- asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Returns the string representation of this subscription as a path used in the Cloud Pub/Sub API.
- asPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Returns the string representation of this topic as a path used in the Cloud Pub/Sub API.
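The asPath() entries above return the fully-qualified resource paths used by the Cloud Pub/Sub API. A small self-contained sketch of those path formats (the project and resource names are hypothetical; the helper class is illustrative, not part of the SDK):

```java
// Sketch: the "projects/<project>/topics/<topic>" and
// "projects/<project>/subscriptions/<subscription>" formats that
// PubsubIO.PubsubTopic.asPath() and PubsubIO.PubsubSubscription.asPath() produce.
public class PubsubPathExample {
  static String topicPath(String project, String topic) {
    return String.format("projects/%s/topics/%s", project, topic);
  }

  static String subscriptionPath(String project, String subscription) {
    return String.format("projects/%s/subscriptions/%s", project, subscription);
  }

  public static void main(String[] args) {
    // Prints: projects/my-project/topics/my-topic
    System.out.println(topicPath("my-project", "my-topic"));
  }
}
```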
- assertSubscriptionEventuallyCreated(String, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Block until a subscription is created for this test topic in the specified project.
- assertThatAllRows(Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
- assertThatTopicEventuallyReceives(Matcher<PubsubMessage>...) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Repeatedly pull messages from TestPubsub.subscriptionPath() until receiving one for each matcher (or timeout is reached), then assert that the received messages match the expectations.
- asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Deprecated. The v1beta1 API for Cloud Pub/Sub is deprecated.
- asV1Beta1Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Deprecated. The v1beta1 API for Cloud Pub/Sub is deprecated.
- asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Deprecated. The v1beta2 API for Cloud Pub/Sub is deprecated.
- asV1Beta2Path() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Deprecated. The v1beta2 API for Cloud Pub/Sub is deprecated.
- attached() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- ATTRIBUTE_ARRAY_ENTRY_SCHEMA - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- ATTRIBUTE_ARRAY_FIELD_TYPE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- ATTRIBUTE_MAP_FIELD_TYPE - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- AuthenticatedRetryInitializer(GoogleCredentials) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.AuthenticatedRetryInitializer
- AvroWriteRequest<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
- AvroWriteRequest(T, Schema) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
B
- BASIC - org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
- BATCH - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
-
Specifies that a query should be run with a BATCH priority.
- BATCH_IMPORT - org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.WriteMethod
-
Batch import write method.
- batchGetDocuments() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
-
Factory method to create a new type safe builder for BatchGetDocumentsRequest operations.
- batchWrite() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Write
-
Factory method to create a new type safe builder for Write operations.
- BeamRowToBigtableMutation - Class in org.apache.beam.sdk.io.gcp.bigtable
- BeamRowToBigtableMutation(Map<String, Set<String>>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation
- BeamRowToBigtableMutation.ToBigtableRowFn - Class in org.apache.beam.sdk.io.gcp.bigtable
- BeamRowToStorageApiProto - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Utility methods for converting Beam Row objects to dynamic protocol messages, for use with the Storage write API.
- BeamRowToStorageApiProto() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BeamRowToStorageApiProto
- BIG_QUERY_INSERT_ERROR_ERROR_CONTAINER - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
- BIGQUERY_JOB_TEMPLATE - Static variable in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Template for BigQuery jobs created by BigQueryIO.
- BigqueryClient - Class in org.apache.beam.sdk.io.gcp.testing
-
A wrapper class for making BigQuery API calls.
- BigqueryClient(String) - Constructor for class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- BigQueryCoderProviderRegistrar - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A CoderProviderRegistrar for standard types used with BigQueryIO.
- BigQueryCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
- BigQueryDirectReadSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- BigQueryDirectReadSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
An implementation of TypedSchemaTransformProvider for BigQuery Storage Read API jobs configured via BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.
- BigQueryDirectReadSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
-
Configuration for reading from BigQuery with Storage Read API.
- BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery.providers
- BigQueryDlqProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryDlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryDlqProvider
- BigQueryHelpers - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A set of helper functions and classes used by BigQueryIO.
- BigQueryHelpers() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
- BigQueryInsertError - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Model definition for BigQueryInsertError.
- BigQueryInsertError(TableRow, TableDataInsertAllResponse.InsertErrors, TableReference) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- BigQueryInsertErrorCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A Coder that encodes BigQuery BigQueryInsertError objects.
- BigQueryInsertErrorCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- BigQueryIO - Class in org.apache.beam.sdk.io.gcp.bigquery
-
PTransforms for reading and writing BigQuery tables.
- BigQueryIO.Read - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Implementation of BigQueryIO.read().
- BigQueryIO.TypedRead<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Implementation of BigQueryIO.read(SerializableFunction).
- BigQueryIO.TypedRead.Method - Enum in org.apache.beam.sdk.io.gcp.bigquery
-
Determines the method used to read data from BigQuery.
- BigQueryIO.TypedRead.QueryPriority - Enum in org.apache.beam.sdk.io.gcp.bigquery
-
An enumeration type for the priority of a query.
- BigQueryIO.Write<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Implementation of BigQueryIO.write().
- BigQueryIO.Write.CreateDisposition - Enum in org.apache.beam.sdk.io.gcp.bigquery
-
An enumeration type for the BigQuery create disposition strings.
- BigQueryIO.Write.Method - Enum in org.apache.beam.sdk.io.gcp.bigquery
-
Determines the method used to insert data in BigQuery.
- BigQueryIO.Write.SchemaUpdateOption - Enum in org.apache.beam.sdk.io.gcp.bigquery
-
An enumeration type for the BigQuery schema update options strings.
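The SchemaUpdateOption values above are passed to a BigQueryIO write as a set. A hedged sketch of how both options might be combined (the class name and the table reference are hypothetical, and the code assumes the Beam GCP IO dependency is available):

```java
// Sketch: a load-job style BigQuery write that allows adding nullable fields
// and relaxing required fields during schema updates. Hypothetical names.
import java.util.EnumSet;

import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;

public class SchemaUpdateExample {
  // Builds (but does not run) a write transform with schema update options set.
  static BigQueryIO.Write<TableRow> relaxedWrite() {
    return BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.my_table") // hypothetical table reference
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
        .withSchemaUpdateOptions(
            EnumSet.of(
                BigQueryIO.Write.SchemaUpdateOption.ALLOW_FIELD_ADDITION,
                BigQueryIO.Write.SchemaUpdateOption.ALLOW_FIELD_RELAXATION));
  }
}
```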
- BigQueryIO.Write.WriteDisposition - Enum in org.apache.beam.sdk.io.gcp.bigquery
-
An enumeration type for the BigQuery write disposition strings.
- BigqueryMatcher - Class in org.apache.beam.sdk.io.gcp.testing
-
A matcher to verify data in BigQuery by running the given query and comparing the checksum of its contents.
- BigqueryMatcher.TableAndQuery - Class in org.apache.beam.sdk.io.gcp.testing
- BigQueryOptions - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
Properties needed when using Google BigQuery with the Apache Beam SDK.
- BigQuerySchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
-
An implementation of SchemaIOProvider for reading and writing to BigQuery with BigQueryIO.
- BigQuerySchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
- BigQuerySchemaRetrievalException - Exception in org.apache.beam.sdk.io.gcp.bigquery
-
Exception to signal that BigQuery schema retrieval failed.
- BigQuerySchemaTransformReadConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Configuration for reading from BigQuery.
- BigQuerySchemaTransformReadConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration
- BigQuerySchemaTransformReadConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQuerySchemaTransformReadProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
-
An implementation of TypedSchemaTransformProvider for BigQuery read jobs configured using BigQuerySchemaTransformReadConfiguration.
- BigQuerySchemaTransformReadProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadProvider
- BigQuerySchemaTransformWriteConfiguration - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Configuration for writing to BigQuery.
- BigQuerySchemaTransformWriteConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteConfiguration
- BigQuerySchemaTransformWriteConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQuerySchemaTransformWriteProvider - Class in org.apache.beam.sdk.io.gcp.bigquery
-
An implementation of TypedSchemaTransformProvider for BigQuery write jobs configured using BigQuerySchemaTransformWriteConfiguration.
- BigQuerySchemaTransformWriteProvider() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteProvider
- BigQueryServices - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface for real, mock, or fake implementations of Cloud BigQuery services.
- BigQueryServices.BigQueryServerStream<T> - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
Container for reading data from streaming endpoints.
- BigQueryServices.DatasetService - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface to get, create and delete Cloud BigQuery datasets and tables.
- BigQueryServices.DatasetService.TableMetadataView - Enum in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryServices.JobService - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface for the Cloud BigQuery load service.
- BigQueryServices.StorageClient - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface representing a client object for making calls to the BigQuery Storage API.
- BigQueryServices.StreamAppendClient - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
An interface for appending records to a Storage API write stream.
- BigQueryStorageApiInsertError - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryStorageApiInsertError(TableRow) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
- BigQueryStorageApiInsertError(TableRow, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
- BigQueryStorageApiInsertErrorCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
- BigQueryStorageApiInsertErrorCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
- BigQueryStorageTableSource<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A Source representing reading from a table.
- BigQueryUtils - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Utility methods for BigQuery related operations.
- BigQueryUtils() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- BigQueryUtils.ConversionOptions - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Options for how to convert BigQuery data to Beam data.
- BigQueryUtils.ConversionOptions.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Builder for BigQueryUtils.ConversionOptions.
- BigQueryUtils.ConversionOptions.TruncateTimestamps - Enum in org.apache.beam.sdk.io.gcp.bigquery
-
Controls whether to truncate timestamps to millisecond precision lossily, or to crash when truncation would result.
- BigQueryUtils.SchemaConversionOptions - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Options for how to convert BigQuery schemas to Beam schemas.
- BigQueryUtils.SchemaConversionOptions.Builder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Builder for BigQueryUtils.SchemaConversionOptions.
- BigtableIO - Class in org.apache.beam.sdk.io.gcp.bigtable
-
Transforms for reading from and writing to Google Cloud Bigtable.
- BigtableIO.Read - Class in org.apache.beam.sdk.io.gcp.bigtable
-
A PTransform that reads from Google Cloud Bigtable.
- BigtableIO.Write - Class in org.apache.beam.sdk.io.gcp.bigtable
-
A PTransform that writes to Google Cloud Bigtable.
- BigtableIO.WriteWithResults - Class in org.apache.beam.sdk.io.gcp.bigtable
-
A PTransform that writes to Google Cloud Bigtable and emits a BigtableWriteResult for each batch written.
- BigtableRowToBeamRow - Class in org.apache.beam.sdk.io.gcp.bigtable
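A hedged sketch of configuring the BigtableIO.Read transform listed above (the project, instance, and table identifiers are all hypothetical, and the code assumes the Beam GCP IO dependency is available):

```java
// Sketch: build (but do not run) a Bigtable read transform.
// All identifiers below are hypothetical placeholders.
import org.apache.beam.sdk.io.gcp.bigtable.BigtableIO;

public class BigtableReadExample {
  static BigtableIO.Read bigtableRead() {
    return BigtableIO.read()
        .withProjectId("my-project")   // hypothetical project
        .withInstanceId("my-instance") // hypothetical instance
        .withTableId("my-table");      // hypothetical table
  }
}
```

Applied in a pipeline, the transform would yield a PCollection of Bigtable Row results; BigtableIO.write() is configured analogously.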
- BigtableRowToBeamRow(Schema) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRow
- BigtableRowToBeamRowFlat - Class in org.apache.beam.sdk.io.gcp.bigtable
- BigtableRowToBeamRowFlat(Schema, Map<String, Set<String>>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRowFlat
- BigtableUtils - Class in org.apache.beam.sdk.io.gcp.testing
- BigtableUtils() - Constructor for class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
- BigtableWriteResult - Class in org.apache.beam.sdk.io.gcp.bigtable
-
The result of writing a batch of rows to Bigtable.
- BigtableWriteResult() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
- BigtableWriteResultCoder - Class in org.apache.beam.sdk.io.gcp.bigtable
-
A coder for BigtableWriteResult.
- BigtableWriteResultCoder() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
- BlockingCommitterImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- booleanToByteArray(boolean) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration.Builder
-
Builds the BigQuerySchemaTransformReadConfiguration configuration.
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteConfiguration.Builder
-
Builds the BigQuerySchemaTransformWriteConfiguration configuration.
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Create a new instance of RpcQosOptions from the current builder state.
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration.Builder
-
Builds a PubsubSchemaTransformReadConfiguration instance.
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformWriteConfiguration.Builder
-
Builds a PubsubSchemaTransformWriteConfiguration instance.
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Builds the ChangeStreamRecordMetadata.
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Builds a PartitionMetadata from the given fields.
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
- build() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration
-
Instantiates a BigQuerySchemaTransformReadConfiguration.Builder.
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteConfiguration
-
Instantiates a BigQuerySchemaTransformWriteConfiguration.Builder.
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
- builder() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
-
Instantiates a BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder instance.
- builder() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
Instantiates a PubsubSchemaTransformReadConfiguration.Builder.
- builder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformWriteConfiguration.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
- Builder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
- Builder(JodaClock, FirestoreStatefulComponentFactory, RpcQosOptions, boolean, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
- Builder(JodaClock, FirestoreStatefulComponentFactory, RpcQosOptions, Instant) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments.Builder
- Builder(PartitionRestrictionMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata.Builder
- buildExternal(ExternalRead.Configuration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.ReadBuilder
- buildExternal(ExternalWrite.Configuration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.WriteBuilder
- buildExternal(SpannerTransformRegistrar.ReadBuilder.Configuration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder
- buildReader() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- buildWriter() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- BUNDLE - org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
-
The source file contains one or more lines of newline-delimited JSON (ndjson).
- BytesThroughputEstimator<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
-
An estimator to provide an estimate on the throughput of the outputted elements.
- BytesThroughputEstimator(int, SizeEstimator<T>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
- byteString(byte[]) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- byteString(byte[]) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
- byteStringUtf8(String) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- byteStringUtf8(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
C
- cancel() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.BigQueryServerStream
-
Cancels the stream, releasing any client- and server-side resources.
- cancel() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
- ChangeStreamDao - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Responsible for making change stream queries for a given partition.
- ChangeStreamMetrics - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
-
Class to aggregate metrics related functionality.
- ChangeStreamMetrics() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Constructs a ChangeStreamMetrics instance with the following metrics enabled by default.
- ChangeStreamMetrics(Set<MetricName>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Constructs a ChangeStreamMetrics instance with the given metrics enabled.
- changeStreamQuery(String, Timestamp, Timestamp, long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamDao
-
Performs a change stream query.
- ChangeStreamRecord - Interface in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents a Spanner Change Stream Record.
- changeStreamRecordMapper() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.MapperFactory
-
Creates and returns a singleton instance of a mapper class capable of transforming a Struct into a List of ChangeStreamRecord subclasses.
- ChangeStreamRecordMapper - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
- ChangeStreamRecordMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Holds internal execution metrics / metadata for the processed ChangeStreamRecord.
- ChangeStreamRecordMetadata.Builder - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
- ChangeStreamResultSet - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Decorator class over a ResultSet that provides telemetry for the streamed records.
- ChangeStreamResultSetMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Represents telemetry metadata gathered during the consumption of a change stream query.
- ChangeStreamsConstants - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
-
Single place for defining the constants used in the Spanner.readChangeStreams() connector.
- ChangeStreamsConstants() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
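The change stream classes in this index back the Spanner change streams connector. A minimal configuration sketch of how such a read might be assembled (the project, instance, database, and stream names are hypothetical, and the code assumes the Beam GCP IO dependency is available):

```java
// Sketch: build (but do not run) a Spanner change stream read starting now.
// All identifiers below are hypothetical placeholders.
import com.google.cloud.Timestamp;
import org.apache.beam.sdk.io.gcp.spanner.SpannerConfig;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;

public class ChangeStreamExample {
  static SpannerIO.ReadChangeStream changeStreamRead() {
    return SpannerIO.readChangeStream()
        .withSpannerConfig(
            SpannerConfig.create()
                .withProjectId("my-project")     // hypothetical project
                .withInstanceId("my-instance")   // hypothetical instance
                .withDatabaseId("my-database"))  // hypothetical database
        .withChangeStreamName("my-change-stream") // hypothetical stream
        .withInclusiveStartAt(Timestamp.now());
  }
}
```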
- checkDone() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionTracker
- checkDone() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
-
Checks if the restriction has been processed successfully.
- checkIfAnySubscriptionExists(String, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
- CheckpointMarkImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- ChildPartition - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
A child partition represents a new partition that should be queried.
- ChildPartition(String, String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
Constructs a child partition, which will have its own token and the parent that it originated from.
- ChildPartition(String, HashSet<String>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
Constructs a child partition, which will have its own token and the parents that it originated from.
- ChildPartitionsRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents a ChildPartitionsRecord.
- ChildPartitionsRecord(Timestamp, String, List<ChildPartition>, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
Constructs a child partitions record containing one or more child partitions.
- childPartitionsRecordAction(PartitionMetadataDao, ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class capable of processing ChildPartitionsRecords.
- ChildPartitionsRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is part of the process for the ReadChangeStreamPartitionDoFn SDF.
- CivilTimeEncoder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Encoder for TIME and DATETIME values, according to civil_time encoding.
- CleanUpReadChangeStreamDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
- CleanUpReadChangeStreamDoFn(DaoFactory) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.CleanUpReadChangeStreamDoFn
- close() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
-
Close the client object.
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Gracefully close the underlying netty channel.
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.BlockingCommitterImpl
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactoryImpl
- close() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- close() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Closes the current change stream
ResultSet
. - close() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- close() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- close() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- CloudPubsubTransforms - Class in org.apache.beam.sdk.io.gcp.pubsublite
-
A class providing transforms between Cloud Pub/Sub and Pub/Sub Lite message types.
- COLUMN_CREATED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp at which the partition row was first created.
- COLUMN_END_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp to end the change stream query of the partition.
- COLUMN_FINISHED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp at which the partition was marked as finished by the
ReadChangeStreamPartitionDoFn
SDF. - COLUMN_HEARTBEAT_MILLIS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the change stream query heartbeat interval in millis.
- COLUMN_PARENT_TOKENS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for parent partition tokens.
- COLUMN_PARTITION_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the partition token.
- COLUMN_RUNNING_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp at which the partition was marked as running by the
ReadChangeStreamPartitionDoFn
SDF. - COLUMN_SCHEDULED_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp at which the partition was scheduled by the
DetectNewPartitionsDoFn
SDF. - COLUMN_START_TIMESTAMP - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the timestamp to start the change stream query of the partition.
- COLUMN_STATE - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the state that the partition is currently in.
- COLUMN_WATERMARK - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Metadata table column name for the current watermark of the partition.
- COLUMNS_MAPPING - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- ColumnType - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Defines a column type from a Cloud Spanner table with the following information: column name, column type, a flag indicating whether the column is a primary key, and the column position in the table.
- ColumnType(String, TypeCode, boolean, long) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
- commitOffset(Offset) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.BlockingCommitterImpl
- commitWriteStreams(String, Iterable<String>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Commit write streams of type PENDING.
- commitWriteStreams(String, Iterable<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
- Configuration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadProvider
-
Returns the expected class of the configuration.
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteProvider
-
Returns the expected class of the configuration.
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadProvider
-
Returns the expected class of the configuration.
- configurationClass() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
-
Returns the expected schema of the configuration object.
- configurationSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
Returns the expected schema of the configuration object.
- CONTENT_STRUCTURE_UNSPECIFIED - org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
-
If the content structure is not specified, the default value BUNDLE will be used.
- Context(TableDataInsertAllResponse.InsertErrors) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy.Context
- ConversionOptions() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
- convertAvroFormat(Schema.FieldType, Object, BigQueryUtils.ConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Tries to convert an Avro decoded value to a Beam field value based on the target type of the Beam field.
- convertGenericRecordToTableRow(GenericRecord, TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- convertNumbers(TableRow) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- countPartitionsCreatedAfter(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Counts all partitions with a
PartitionMetadataAdminDao.COLUMN_CREATED_AT
less than the given timestamp. - create() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Creates an instance of this rule using options provided by
TestPipeline.testingPipelineOptions()
. - create() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
Creates an instance of this rule.
- create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- create() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- create(long) - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
- create(BatchTransactionId) - Static method in class org.apache.beam.sdk.io.gcp.spanner.Transaction
- create(Mutation, Mutation...) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
-
Creates a new group.
- create(Mutation, Iterable<Mutation>) - Static method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- create(String, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
-
Create a PTransform instance.
- create(String, String, String, Boolean) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- create(SubscriptionPartition) - Method in interface org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactory
- create(SubscriptionPartition) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ManagedFactoryImpl
- create(ValueProvider<TableReference>, ValueProvider<List<String>>, ValueProvider<String>, SerializableFunction<SchemaAndRecord, T>, Coder<T>, BigQueryServices) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- create(ValueProvider<TableReference>, DataFormat, ValueProvider<List<String>>, ValueProvider<String>, SerializableFunction<SchemaAndRecord, T>, Coder<T>, BigQueryServices, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- create(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
-
Creates an instance of this rule.
- create(Schema, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.EntityToRow
-
Create a PTransform instance.
- CREATE_IF_NEEDED - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
-
Specifies that tables should be created if needed.
- CREATE_NEVER - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
-
Specifies that tables should not be created.
- CREATED - org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
- createDataset(String, String, String, String, Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Create a
Dataset
with the givenlocation
,description
and default expiration time for tables in the dataset (ifnull
, tables don't expire). - createDataset(String, String, String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- createDicomStore(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
- createDicomStore(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createDicomStore(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createDicomStore(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
- createFactoryForCreateSubscription() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createFactoryForPublish(PubsubClient.TopicPath, Iterable<PubsubClient.OutgoingMessage>, Iterable<PubsubClient.OutgoingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
Return a factory for testing publishers.
- createFactoryForPull(Clock, PubsubClient.SubscriptionPath, int, Iterable<PubsubClient.IncomingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
Return a factory for testing subscribers.
- createFactoryForPullAndPublish(PubsubClient.SubscriptionPath, PubsubClient.TopicPath, Clock, int, Iterable<PubsubClient.IncomingMessage>, Iterable<PubsubClient.OutgoingMessage>, Iterable<PubsubClient.OutgoingMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
-
Returns a factory for a test that is expected to both publish and pull messages over the course of the test.
- createFhirStore(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
- createFhirStore(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createFhirStore(String, String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createFhirStore(String, String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
- createHL7v2Message(String, Message) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Creates an HL7v2 message.
- createHL7v2Message(String, Message) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createHL7v2Store(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Creates an HL7v2 store.
- createHL7v2Store(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- createNewDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Creates a new dataset.
- createNewDataset(String, String, Long) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Creates a new dataset with defaultTableExpirationMs.
- createNewTable(String, String, Table) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- createPartitionMetadataTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Creates the metadata table in the given instance and database configuration, with the table name specified in the constructor.
- createQuery(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- createQueryUsingStandardSql(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- createRandomSubscription(PubsubClient.ProjectPath, PubsubClient.TopicPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create a random subscription for
topic
. - createReader(PipelineOptions, CheckpointMarkImpl) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
- createReadSession(CreateReadSessionRequest) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
-
Create a new read session against an existing table.
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create
subscription
totopic
. - createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- createSubscription(PubsubClient.TopicPath, PubsubClient.SubscriptionPath, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createTable(Table) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Creates the specified table if it does not exist.
- createTable(Table) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- CreateTableDestinations<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Creates any tables needed before performing writes to the tables.
- CreateTableDestinations(BigQueryIO.Write.CreateDisposition, BigQueryServices, DynamicDestinations<?, DestinationT>, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTableDestinations
- CreateTableDestinations(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTableDestinations
- CreateTableHelpers - Class in org.apache.beam.sdk.io.gcp.bigquery
- CreateTableHelpers() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTableHelpers
- CreateTables<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Creates any tables needed before performing streaming writes to the tables.
- CreateTables(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
-
The list of tables created so far, so we don't try the creation each time.
- createTest(String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Create
topic
. - createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- createTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- createTransaction() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
-
Returns a transform that creates a batch transaction.
- CreateTransaction() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
- createWriteStream(String, WriteStream.Type) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Create a Write Stream for use with the Storage Write API.
- createWriteStream(String, WriteStream.Type) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- CrossLanguageConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
- currentRestriction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionTracker
- currentRestriction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
D
- DaoFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Factory class to create data access objects to perform change stream queries and access the metadata tables.
- DaoFactory(SpannerConfig, String, SpannerConfig, String, Options.RpcPriority, String, Dialect, Dialect) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Constructs a
DaoFactory
with the configuration to be used for the underlying instances. - DATA_RECORD_COMMITTED_TO_EMITTED_0MS_TO_1000MS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for record latencies [0, 1000) ms during the execution of the Connector.
- DATA_RECORD_COMMITTED_TO_EMITTED_1000MS_TO_3000MS_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for record latencies [1000, 3000) ms during the execution of the Connector.
- DATA_RECORD_COMMITTED_TO_EMITTED_3000MS_TO_INF_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for record latencies equal to or above 3000 ms during the execution of the Connector.
- DATA_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of data records identified during the execution of the Connector.
- DataChangeRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
A data change record encodes modifications to Cloud Spanner rows.
- DataChangeRecord(String, Timestamp, String, boolean, String, String, List<ColumnType>, List<Mod>, ModType, ValueCaptureType, long, long, String, boolean, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
Constructs a data change record for a given partition, at a given timestamp, for a given transaction.
- dataChangeRecordAction(ThroughputEstimator<DataChangeRecord>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class capable of processing
DataChangeRecord
s. - DataChangeRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is part of the process for
ReadChangeStreamPartitionDoFn
SDF. - DataChangeRecordAction(ThroughputEstimator<DataChangeRecord>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DataChangeRecordAction
- dataSchema - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- dataset - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- DatastoreIO - Class in org.apache.beam.sdk.io.gcp.datastore
-
DatastoreIO
provides an API for reading from and writing to Google Cloud Datastore over different versions of the Cloud Datastore Client libraries. - DatastoreV1 - Class in org.apache.beam.sdk.io.gcp.datastore
-
DatastoreV1
provides an API to Read, Write and DeletePCollections
of Google Cloud Datastore version v1Entity
objects. - DatastoreV1.DeleteEntity - Class in org.apache.beam.sdk.io.gcp.datastore
-
A
PTransform
that deletesEntities
from Cloud Datastore. - DatastoreV1.DeleteKey - Class in org.apache.beam.sdk.io.gcp.datastore
-
A
PTransform
that deletesEntities
associated with the givenKeys
from Cloud Datastore. - DatastoreV1.Read - Class in org.apache.beam.sdk.io.gcp.datastore
-
A
PTransform
that reads the result rows of a Cloud Datastore query asEntity
objects. - DatastoreV1.Write - Class in org.apache.beam.sdk.io.gcp.datastore
-
A
PTransform
that writesEntity
objects to Cloud Datastore. - DataStoreV1SchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.datastore
-
An implementation of
SchemaIOProvider
for reading and writing payloads withDatastoreIO
. - DataStoreV1SchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
- DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO - Class in org.apache.beam.sdk.io.gcp.datastore
-
An abstraction to create schema aware IOs.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
-
The tag for the deadletter output of FHIR resources.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
-
The tag for the deadletter output of FHIR resources.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
-
The tag for the deadletter output of FHIR resources from a GetPatientEverything request.
- DEAD_LETTER - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
-
The tag for the deadletter output of HL7v2 Messages.
- decActivePartitionReadCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Decrements the
ChangeStreamMetrics.ACTIVE_PARTITION_READ_COUNT
by 1 if the metric is enabled. - decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
- decode(InputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- decode(InputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- decodePacked32TimeSeconds(int) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldTimeSeconds
as aLocalTime
with seconds precision. - decodePacked32TimeSecondsAsJavaTime(int) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldTimeSeconds
as aLocalTime
with seconds precision. - decodePacked64DatetimeMicros(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldDatetimeMicros
as aLocalDateTime
with microseconds precision. - decodePacked64DatetimeMicrosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldDatetimeMicros
as aLocalDateTime
with microseconds precision. - decodePacked64DatetimeSeconds(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldDatetimeSeconds
as aLocalDateTime
with seconds precision. - decodePacked64DatetimeSecondsAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldDatetimeSeconds
as aLocalDateTime
with seconds precision. - decodePacked64TimeMicros(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldTimeMicros
as aLocalTime
with microseconds precision. - decodePacked64TimeMicrosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldTimeMicros
as aLocalTime
with microseconds precision. - decodePacked64TimeNanos(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldTimeNanos
as aLocalTime
with nanoseconds precision. - decodePacked64TimeNanosAsJavaTime(long) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Decodes
bitFieldTimeNanos
as aLocalTime
with nanoseconds precision. - decodeQueryResult(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
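The decodePacked* methods above all follow the civil_time bit-field encoding that the CivilTimeEncoder class documents. As a hedged illustration (a standalone sketch based on the civil_time layout — 5-bit hour, 6-bit minute, 6-bit second, 20-bit microseconds — not the Beam implementation itself), the Packed64TimeMicros round trip can be written as:

```java
import java.time.LocalTime;

public class CivilTimeSketch {
  // Packs a LocalTime as | hour:5 | minute:6 | second:6 | micros:20 | bits.
  static long encodePacked64TimeMicros(LocalTime t) {
    return ((long) t.getHour() << 32)
        | ((long) t.getMinute() << 26)
        | ((long) t.getSecond() << 20)
        | (t.getNano() / 1_000L); // nanoseconds truncated to microseconds
  }

  // Reverses the packing: masks each bit field back out of the long.
  static LocalTime decodePacked64TimeMicros(long bitField) {
    int hour = (int) (bitField >>> 32) & 0x1F;
    int minute = (int) (bitField >>> 26) & 0x3F;
    int second = (int) (bitField >>> 20) & 0x3F;
    int micros = (int) (bitField & 0xFFFFF);
    return LocalTime.of(hour, minute, second, micros * 1_000);
  }

  public static void main(String[] args) {
    LocalTime t = LocalTime.of(12, 34, 56, 789_000_000);
    long packed = encodePacked64TimeMicros(t);
    System.out.println(decodePacked64TimeMicros(packed)); // prints 12:34:56.789
  }
}
```

The seconds-precision variants (decodePacked32TimeSeconds) use the same layout without the trailing 20-bit microsecond field.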
- deduplicate() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- deduplicate(UuidDeduplicationOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
-
Removes duplicate messages from the output of a read.
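Conceptually, PubsubLiteIO.deduplicate drops messages whose Uuid attribute has already been seen. The sketch below illustrates that idea with plain Java collections — it is an assumption-laden teaching example (a simple in-memory set, hypothetical helper names), not the Beam transform, which tracks seen UUIDs in windowed pipeline state:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class DedupSketch {
  /** Keeps the first message seen for each UUID, dropping later redeliveries. */
  static List<String> deduplicateByUuid(List<String[]> uuidAndPayload) {
    Set<String> seen = new HashSet<>();
    List<String> out = new ArrayList<>();
    for (String[] msg : uuidAndPayload) {
      if (seen.add(msg[0])) { // Set.add returns false for a duplicate UUID
        out.add(msg[1]);
      }
    }
    return out;
  }

  public static void main(String[] args) {
    List<String[]> messages = List.of(
        new String[] {"uuid-1", "a"},
        new String[] {"uuid-2", "b"},
        new String[] {"uuid-1", "a (redelivered)"});
    System.out.println(deduplicateByUuid(messages)); // prints [a, b]
  }
}
```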
- DEFAULT - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
-
The default behavior if no method is explicitly set.
- DEFAULT - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
The default behavior if no method is explicitly set.
- DEFAULT_ATTRIBUTE - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
- DEFAULT_CHANGE_STREAM_NAME - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default change stream name for a change stream query is the empty
String
. - DEFAULT_DEDUPLICATE_DURATION - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- DEFAULT_INCLUSIVE_END_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default end timestamp for a change stream query is
ChangeStreamsConstants.MAX_INCLUSIVE_END_AT
. - DEFAULT_INCLUSIVE_START_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default start timestamp for a change stream query is
Timestamp.MIN_VALUE
. - DEFAULT_RPC_PRIORITY - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The default priority for a change stream query is
Options.RpcPriority.HIGH
. - DEFAULT_TIME_DOMAIN - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- DEFAULT_UUID_EXTRACTOR - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- defaultOptions() - Static method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Factory method to return a new instance of
RpcQosOptions
with all default values. - deidentify(String, String, DeidentifyConfig) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Deidentify FHIR resources.
- deidentify(ValueProvider<String>, ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Deidentify FHIR resources.
- deidentify(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
- Deidentify(ValueProvider<String>, ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify
- deidentifyFhirStore(String, String, DeidentifyConfig) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
- deidentifyFhirStore(String, String, DeidentifyConfig) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- DeidentifyFn(ValueProvider<String>, ValueProvider<DeidentifyConfig>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
- DELETE - org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
- DELETE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- DeleteBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.DeleteBuilder
- deleteDataset(String, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Deletes the dataset specified by the datasetId value.
- deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- deleteDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- deleteDicomStore(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
- deleteDicomStore(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- deleteEntity() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
-
Returns an empty
DatastoreV1.DeleteEntity
builder. - deleteFhirStore(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
- deleteFhirStore(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- deleteHL7v2Message(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Deletes an HL7v2 message.
- deleteHL7v2Message(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- deleteHL7v2Store(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Deletes an HL7v2 store.
- deleteHL7v2Store(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- deleteKey() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
-
Returns an empty
DatastoreV1.DeleteKey
builder. - deletePartitionMetadataTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataAdminDao
-
Drops the metadata table.
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Delete
subscription
. - deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- deleteSubscription(PubsubClient.SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- deleteTable(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Deletes the table specified by tableId from the dataset.
- deleteTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- deleteTable(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- deleteTopic(PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- describeMismatchSafely(BigqueryMatcher.TableAndQuery, Description) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- describeTo(Description) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- DeserializeBytesIntoPubsubMessagePayloadOnly() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly
- detectNewPartitionsAction(PartitionMetadataDao, PartitionMetadataMapper, ChangeStreamMetrics, Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a single instance of an action class capable of detecting and scheduling new partitions to be queried.
- DetectNewPartitionsAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is responsible for scheduling partitions.
- DetectNewPartitionsAction(PartitionMetadataDao, PartitionMetadataMapper, ChangeStreamMetrics, Duration) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DetectNewPartitionsAction
-
Constructs an action class for detecting / scheduling new partitions.
- DetectNewPartitionsDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
A SplittableDoFn (SDF) that is responsible for scheduling partitions to be queried.
- DetectNewPartitionsDoFn(DaoFactory, MapperFactory, ActionFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
-
This class needs a
DaoFactory
to build DAOs to access the partition metadata tables. - DetectNewPartitionsRangeTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
This restriction tracker delegates most of its behavior to an internal
TimestampRangeTracker
. - DetectNewPartitionsRangeTracker(TimestampRange) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.DetectNewPartitionsRangeTracker
- DicomIO - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The DicomIO connector allows Beam pipelines to make calls to the DICOM API of the Google Cloud Healthcare API (https://cloud.google.com/healthcare/docs/how-tos#dicom-guide).
- DicomIO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.DicomIO
- DicomIO.ReadStudyMetadata - Class in org.apache.beam.sdk.io.gcp.healthcare
-
This class makes a call to the retrieve metadata endpoint (https://cloud.google.com/healthcare/docs/how-tos/dicomweb#retrieving_metadata).
- DicomIO.ReadStudyMetadata.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
- dicomStorePath - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- DicomWebPath() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- DIRECT_READ - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
-
Read the contents of a table directly using the BigQuery storage API.
- DlqProvider - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- DlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
- done() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionPosition
- done(Timestamp, Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- DONE - org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionMode
- doubleToByteArray(double) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
- dryRunQuery(String, JobConfigurationQuery, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Dry runs the query in the given project.
- dryRunQuery(String, JobConfigurationQuery, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- DynamicDestinations<T,DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
This class provides the most general way of specifying dynamic BigQuery table destinations.
- DynamicDestinations() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
E
- encode(TableRow, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- encode(TableRow, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- encode(JsonArray, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
- encode(BigQueryInsertError, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- encode(BigQueryStorageApiInsertError, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
- encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
- encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
- encode(TableDestination, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
- encode(BigtableWriteResult, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
- encode(FhirSearchParameter<T>, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
- encode(HealthcareIOError<T>, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
- encode(HL7v2Message, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- encode(PubsubMessage, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
- encode(PubsubMessage, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- encode(PubsubMessage, OutputStream, Coder.Context) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- encode(OffsetByteRange, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
- encode(SubscriptionPartition, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
- encode(Uuid, OutputStream) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
- encodePacked32TimeSeconds(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes
time
as a 4-byte integer with seconds precision. - encodePacked32TimeSeconds(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes
time
as a 4-byte integer with seconds precision. - encodePacked64DatetimeMicros(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes
dateTime
as an 8-byte integer with microseconds precision. - encodePacked64DatetimeMicros(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes
dateTime
as an 8-byte integer with microseconds precision. - encodePacked64DatetimeSeconds(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes
dateTime
as an 8-byte integer with seconds precision. - encodePacked64DatetimeSeconds(LocalDateTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes
dateTime
as an 8-byte integer with seconds precision. - encodePacked64TimeMicros(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes
time
as an 8-byte integer with microseconds precision. - encodePacked64TimeMicros(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes
time
as an 8-byte integer with microseconds precision. - encodePacked64TimeNanos(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes
time
as an 8-byte integer with nanoseconds precision. - encodePacked64TimeNanos(LocalTime) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.CivilTimeEncoder
-
Encodes
time
as an 8-byte integer with nanoseconds precision. - encodeQueryResult(Table) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- encodeQueryResult(Table, List<TableRow>) - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
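The CivilTimeEncoder entries above pack civil time fields into fixed-width bit fields. A minimal sketch of the documented layout — hour (5 bits), minute (6 bits), second (6 bits), with sub-second digits in the low bits of the 64-bit variants — is shown below; this is an illustration of the bit packing, not Beam's implementation:

```java
// Sketch of packed civil-time encoding: hour/minute/second in fixed-width
// bit fields, microseconds appended in the low 20 bits of the 64-bit form.
// Illustrative only; not the CivilTimeEncoder source.
public class PackedTimeSketch {
    // | hour (5 bits) | minute (6 bits) | second (6 bits) | -> 17 bits, fits in an int
    static int encodePacked32TimeSeconds(int hour, int minute, int second) {
        return (hour << 12) | (minute << 6) | second;
    }

    // Same fields shifted left 20 bits, with microseconds in the low 20 bits.
    static long encodePacked64TimeMicros(int hour, int minute, int second, int micros) {
        return ((long) encodePacked32TimeSeconds(hour, minute, second) << 20) | micros;
    }

    public static void main(String[] args) {
        // 12:34:56 -> (12 << 12) | (34 << 6) | 56
        System.out.println(encodePacked32TimeSeconds(12, 34, 56)); // 51384
    }
}
```

The 32-bit seconds form is the common prefix: the datetime variants prepend year/month/day fields in the same style, and the micros/nanos variants append sub-second bits.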
- EncodingException - Exception in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
-
Represents an error during encoding (serializing) a class.
- EncodingException(Throwable) - Constructor for exception org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.EncodingException
- ensureUsableAsCloudPubsub() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
-
Ensure that all messages that pass through can be converted to Cloud Pub/Sub messages using the standard transformation methods in the client library.
- EntityToRow - Class in org.apache.beam.sdk.io.gcp.datastore
-
A
PTransform
to perform a conversion ofEntity
toRow
. - equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionPosition
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
- equals(Object) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
- ERROR_MESSAGE - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata
-
TupleTag for any error response.
- ErrorContainer<T> - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
ErrorContainer interface.
- eventually(Matcher<Iterable<? extends Row>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery.RowsAssertion
- EXECUTE_BUNDLE - org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.WriteMethod
- executeBundles(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- executeBundles(ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- ExecuteBundles(String) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
- ExecuteBundles(ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
-
Instantiates a new ExecuteBundles transform.
- executeFhirBundle(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Executes a FHIR bundle and returns the HTTP body.
- executeFhirBundle(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- expand() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
- expand() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.ListHL7v2Messages
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscribeTransform
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- expand(PBegin) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
- expand(PCollection<byte[]>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ProtoFromBytes
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRow
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableRowToBeamRowFlat
- expand(PCollection<PubSubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.AddUuidsTransform
- expand(PCollection<SequencedMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidDeduplicationTransform
- expand(PCollection<Mutation>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- expand(PCollection<Entity>) - Method in class org.apache.beam.sdk.io.gcp.datastore.EntityToRow
- expand(PCollection<BatchGetDocumentsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments
- expand(PCollection<ListCollectionIdsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds
- expand(PCollection<ListDocumentsRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments
- expand(PCollection<PartitionQueryRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery
- expand(PCollection<RunQueryRequest>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery
- expand(PCollection<Write>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue
- expand(PCollection<Write>) - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary
- expand(PCollection<InputT>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
- expand(PCollection<String>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message
- expand(PCollection<FhirBundleParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
- expand(PCollection<FhirIOPatientEverything.PatientEverythingParameter>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
- expand(PCollection<FhirSearchParameter<T>>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
- expand(PCollection<HL7v2Message>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
- expand(PCollection<PubsubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
- expand(PCollection<MutationGroup>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
- expand(PCollection<ReadOperation>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- expand(PCollection<KV<ByteString, Iterable<Mutation>>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
- expand(PCollection<KV<ByteString, Iterable<Mutation>>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
- expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.CreateTableDestinations
- expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.CreateTables
- expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages
- expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- expand(PCollection<KV<DestinationT, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
- expand(PCollection<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteRecordsInconsistent
- expand(PCollection<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords
- expand(PCollection<KV<TableDestination, ElementT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
- expand(PCollection<KV<ShardedKey<DestinationT>, Iterable<StorageApiWritePayload>>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritesShardedRecords
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation
- expand(PCollection<Row>) - Method in class org.apache.beam.sdk.io.gcp.datastore.RowToEntity
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.ReifyAsIterable
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
- expand(PCollection<T>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ProtoToBytes
- expand(PInput) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
- expandInconsistent(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- expandTriggered(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>, Coder<StorageApiWritePayload>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- expandUntriggered(PCollection<KV<DestinationT, ElementT>>, Coder<KV<DestinationT, StorageApiWritePayload>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- expectDryRunQuery(String, String, JobStatistics) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- Export(ValueProvider<String>, ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export
- EXPORT - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
-
Export data to Google Cloud Storage in Avro format and read data files from that location.
- exportFhirResourceToBigQuery(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
- exportFhirResourceToBigQuery(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- exportFhirResourceToGcs(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
- exportFhirResourceToGcs(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- exportResources(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Export resources to GCS.
- exportResources(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
- exportResources(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
- ExportResourcesFn(ValueProvider<String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
- ExternalRead - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Exposes
PubsubIO.Read
as an external transform for cross-language usage. - ExternalRead() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead
- ExternalRead.Configuration - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Parameters class to expose the transform to an external SDK.
- ExternalRead.ReadBuilder - Class in org.apache.beam.sdk.io.gcp.pubsub
- ExternalTransformRegistrarImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- ExternalTransformRegistrarImpl() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
- ExternalWrite - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Exposes
PubsubIO.Write
as an external transform for cross-language usage. - ExternalWrite() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite
- ExternalWrite.Configuration - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Parameters class to expose the transform to an external SDK.
- ExternalWrite.WriteBuilder - Class in org.apache.beam.sdk.io.gcp.pubsub
- extractTimestampAttribute(String, Map<String, String>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return the timestamp (in ms since unix epoch) to use for a Pubsub message with
timestampAttribute
and attributes
.
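The extractTimestampAttribute entry above resolves a message attribute to a timestamp in ms since the unix epoch. A hedged sketch of one common resolution policy — numeric values taken as millis, anything else parsed as RFC 3339 — is shown below; it is illustrative only, not Beam's exact rules:

```java
import java.time.Instant;
import java.util.Map;

// Sketch: resolve a timestamp attribute to ms-since-epoch. Numeric values
// are treated as millis; otherwise the value is parsed as RFC 3339.
// Hypothetical helper, not PubsubClient's implementation.
public class TimestampAttributeSketch {
    static long extractTimestamp(String timestampAttribute, Map<String, String> attributes) {
        String value = attributes.get(timestampAttribute);
        if (value == null) {
            throw new IllegalArgumentException("missing attribute: " + timestampAttribute);
        }
        try {
            return Long.parseLong(value); // already ms since unix epoch
        } catch (NumberFormatException e) {
            return Instant.parse(value).toEpochMilli(); // e.g. "2021-01-01T00:00:00Z"
        }
    }
}
```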
F
- FACTORY - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
-
Factory for creating Pubsub clients using gRPC transport.
- FACTORY - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
-
Factory for creating Pubsub clients using JSON transport.
- FAIL_FAST - org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
-
An invalid write to Spanner will cause the pipeline to fail.
- FAILED - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
-
The tag for failed writes to the HL7v2 store.
- FAILED_BODY - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
The tag for failed writes to the FHIR store.
- FAILED_BUNDLES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
-
The TupleTag used for bundles that failed to be executed for any reason.
- FAILED_FILES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
The tag for files that failed to import to the FHIR store.
- FailedWritesException(List<FirestoreV1.WriteFailure>) - Constructor for exception org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.FailedWritesException
- failOnInsert(Map<TableRow, List<TableDataInsertAllResponse.InsertErrors>>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
-
Cause a given
TableRow
object to fail when it's inserted. - FakeBigQueryServerStream(List<T>) - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
- FakeBigQueryServices - Class in org.apache.beam.sdk.io.gcp.testing
-
A fake implementation of BigQuery's query service.
- FakeBigQueryServices() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- FakeBigQueryServices.FakeBigQueryServerStream<T> - Class in org.apache.beam.sdk.io.gcp.testing
-
An implementation of
BigQueryServices.BigQueryServerStream
which takes aList
as theIterable
to simulate a server stream. - FakeDatasetService - Class in org.apache.beam.sdk.io.gcp.testing
-
A fake dataset service that can be serialized, for use in testReadFromTable.
- FakeDatasetService() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
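The FakeBigQueryServerStream entry above describes a test pattern: backing a server stream with an in-memory List. A minimal sketch of that pattern — a hypothetical stand-in, not Beam's class — might look like:

```java
import java.util.Iterator;
import java.util.List;

// Sketch of a fake server stream for tests: an Iterable backed by an
// in-memory List, with a no-op cancel(). Hypothetical stand-in for
// FakeBigQueryServices.FakeBigQueryServerStream.
public class FakeServerStream<T> implements Iterable<T> {
    private final List<T> items;

    public FakeServerStream(List<T> items) {
        this.items = items;
    }

    @Override
    public Iterator<T> iterator() {
        return items.iterator();
    }

    // Real server streams support cancellation; the fake has nothing to cancel.
    public void cancel() {}
}
```

Tests can then drive code that expects a server stream without any network I/O.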
- FakeJobService - Class in org.apache.beam.sdk.io.gcp.testing
-
A fake implementation of BigQuery's job service.
- FakeJobService() - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- FakeJobService(int) - Constructor for class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- FetchHL7v2Message() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message
-
Instantiates a new Fetch HL7v2 message DoFn.
- FhirBundleParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
-
FhirBundleParameter represents a FHIR bundle in JSON format to be executed on a FHIR store.
- FhirBundleParameter() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
- FhirBundleResponse - Class in org.apache.beam.sdk.io.gcp.healthcare
- FhirBundleResponse() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
- FhirIO - Class in org.apache.beam.sdk.io.gcp.healthcare
-
FhirIO
provides an API for reading and writing resources to the Google Cloud Healthcare FHIR API. - FhirIO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
- FhirIO.Deidentify - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Deidentify FHIR resources from a FHIR store to a destination FHIR store.
- FhirIO.Deidentify.DeidentifyFn - Class in org.apache.beam.sdk.io.gcp.healthcare
-
A function that schedules a deidentify operation and monitors the status.
- FhirIO.ExecuteBundles - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Executes FHIR bundles against a FHIR store.
- FhirIO.ExecuteBundlesResult - Class in org.apache.beam.sdk.io.gcp.healthcare
-
ExecuteBundlesResult contains both successfully executed bundles and information to help debug failed executions (e.g. metadata and error messages).
- FhirIO.Export - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Export FHIR resources from a FHIR store to newline-delimited JSON files on GCS, or to BigQuery.
- FhirIO.Export.ExportResourcesFn - Class in org.apache.beam.sdk.io.gcp.healthcare
-
A function that schedules an export operation and monitors the status.
- FhirIO.Import - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Writes each bundle of elements to a newline-delimited JSON file on GCS and issues a fhirStores.import request for that file.
- FhirIO.Import.ContentStructure - Enum in org.apache.beam.sdk.io.gcp.healthcare
-
The content structure of the source files for a FHIR import.
- FhirIO.Read - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Reads FHIR resources from a FHIR store.
- FhirIO.Read.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The result of a FhirIO.Read.
- FhirIO.Search<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Searches a FHIR store using FHIR search parameters.
- FhirIO.Search.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
- FhirIO.Write - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Writes FHIR resources to a FHIR store.
- FhirIO.Write.AbstractResult - Class in org.apache.beam.sdk.io.gcp.healthcare
- FhirIO.Write.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The result of a FhirIO.Write.
- FhirIO.Write.WriteMethod - Enum in org.apache.beam.sdk.io.gcp.healthcare
-
The method used to write to a FHIR store.
- FhirIOPatientEverything - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The FhirIOPatientEverything transform for querying a FHIR Patient resource's compartment.
- FhirIOPatientEverything() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
- FhirIOPatientEverything.PatientEverythingParameter - Class in org.apache.beam.sdk.io.gcp.healthcare
-
PatientEverythingParameter defines required attributes for a FHIR GetPatientEverything request in
FhirIOPatientEverything
. - FhirIOPatientEverything.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The Result for a
FhirIOPatientEverything
request. - FhirResourcePagesIterator(HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod, HealthcareApiClient, String, String, String, Map<String, Object>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
- FhirSearchParameter<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
FhirSearchParameter represents the query parameters for a FHIR search request, used as a parameter for
FhirIO.Search
. - FhirSearchParameterCoder<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
FhirSearchParameterCoder is the coder for
FhirSearchParameter
, which takes a coder for type T. - fhirStoresImport(String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- fhirStoresImport(String, String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
Import method for batch writing resources.
- fhirStoresImport(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- FILE_LOADS - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
Use BigQuery load jobs to insert data.
- finalizeCheckpoint() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.CheckpointMarkImpl
- finalizeWriteStream(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Finalize a write stream.
- finalizeWriteStream(String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
- finishBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
- finishBundle(DoFn.FinishBundleContext) - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
- FINISHED - org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
- finishSpecifyingOutput(String, PInput, PTransform<?, ?>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- FirestoreIO - Class in org.apache.beam.sdk.io.gcp.firestore
-
FirestoreIO
provides an API for reading from and writing to Google Cloud Firestore. - FirestoreOptions - Interface in org.apache.beam.sdk.io.gcp.firestore
- FirestoreV1 - Class in org.apache.beam.sdk.io.gcp.firestore
-
FirestoreV1
provides an API which provides lifecycle managed
PTransform
s for the Cloud Firestore v1 API. - FirestoreV1.BatchGetDocuments - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a
PTransform
<
PCollection
<
BatchGetDocumentsRequest
>,
PTransform
<
BatchGetDocumentsResponse
>>
which will read from Firestore. - FirestoreV1.BatchGetDocuments.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for
FirestoreV1.BatchGetDocuments
allowing configuration and instantiation. - FirestoreV1.BatchWriteWithDeadLetterQueue - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a
PTransform
<
PCollection
<
Write
>,
PCollection
<
FirestoreV1.WriteFailure
>>
which will write to Firestore. - FirestoreV1.BatchWriteWithDeadLetterQueue.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for
FirestoreV1.BatchWriteWithDeadLetterQueue
allowing configuration and instantiation. - FirestoreV1.BatchWriteWithSummary - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a
PTransform
<
PCollection
<
Write
>,
PDone
>
which will write to Firestore. - FirestoreV1.BatchWriteWithSummary.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for
FirestoreV1.BatchWriteWithSummary
allowing configuration and instantiation. - FirestoreV1.FailedWritesException - Exception in org.apache.beam.sdk.io.gcp.firestore
-
Exception that is thrown if one or more
Write
s is unsuccessful with a non-retryable status code. - FirestoreV1.ListCollectionIds - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a
PTransform
<
PCollection
<
ListCollectionIdsRequest
>,
PTransform
<
ListCollectionIdsResponse
>>
which will read from Firestore. - FirestoreV1.ListCollectionIds.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for
FirestoreV1.ListCollectionIds
allowing configuration and instantiation. - FirestoreV1.ListDocuments - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a
PTransform
<
PCollection
<
ListDocumentsRequest
>,
PTransform
<
ListDocumentsResponse
>>
which will read from Firestore. - FirestoreV1.ListDocuments.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for
FirestoreV1.ListDocuments
allowing configuration and instantiation. - FirestoreV1.PartitionQuery - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a
PTransform
<
PCollection
<
PartitionQueryRequest
>,
PTransform
<
RunQueryRequest
>>
which will read from Firestore. - FirestoreV1.PartitionQuery.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for
FirestoreV1.PartitionQuery
allowing configuration and instantiation. - FirestoreV1.Read - Class in org.apache.beam.sdk.io.gcp.firestore
-
Type safe builder factory for read operations.
- FirestoreV1.RunQuery - Class in org.apache.beam.sdk.io.gcp.firestore
-
Concrete class representing a
PTransform
<
PCollection
<
RunQueryRequest
>,
PTransform
<
RunQueryResponse
>>
which will read from Firestore. - FirestoreV1.RunQuery.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
A type safe builder for
FirestoreV1.RunQuery
allowing configuration and instantiation. - FirestoreV1.Write - Class in org.apache.beam.sdk.io.gcp.firestore
-
Type safe builder factory for write operations.
- FirestoreV1.WriteFailure - Class in org.apache.beam.sdk.io.gcp.firestore
-
Failure details for an attempted
Write
. - FirestoreV1.WriteSuccessSummary - Class in org.apache.beam.sdk.io.gcp.firestore
-
Summary object produced when a number of writes are successfully written to Firestore in a single BatchWrite.
- floatToByteArray(float) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
- flush(String, long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Flush a given stream up to the given offset.
- flush(String, long) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Read from table specified by a
TableReference
. - from(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- from(Struct) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.PartitionMetadataMapper
-
Transforms a
Struct
representing a partition metadata row into a
PartitionMetadata
model. - from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Reads a BigQuery table specified as
"[project_id]:[dataset_id].[table_id]"
or "[dataset_id].[table_id]"
for tables within the current project. - from(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
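As an aside, the two table-spec forms documented above ("[project_id]:[dataset_id].[table_id]" and "[dataset_id].[table_id]") can be checked with a small regex. This is an illustrative sketch only; the class name and pattern are hypothetical and this is not Beam's actual parser.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative validation of the two documented table-spec forms.
// Not Beam's parser; the class name and pattern are made up for this sketch.
public class TableSpecCheck {
    private static final Pattern SPEC =
            Pattern.compile("(?:(?<project>[^:.]+):)?(?<dataset>[^:.]+)\\.(?<table>[^:.]+)");

    // True if the spec matches either documented form.
    public static boolean isValid(String spec) {
        return SPEC.matcher(spec).matches();
    }

    // Returns the dataset component, or null when the spec does not match.
    public static String datasetOf(String spec) {
        Matcher m = SPEC.matcher(spec);
        return m.matches() ? m.group("dataset") : null;
    }
}
```

A spec without a project component is resolved against the current project, which is why the project group is optional here.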
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Produces a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
-
Produce a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(String, Row, Schema) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
Produce a SchemaIO given a String representing the data's location, the schema of the data that resides there, and some IO-specific configuration object.
- from(BigQuerySchemaTransformReadConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadProvider
-
Returns the expected
SchemaTransform
of the configuration. - from(BigQuerySchemaTransformWriteConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteProvider
-
Returns the expected
SchemaTransform
of the configuration. - from(BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- from(PubsubSchemaTransformReadConfiguration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadProvider
-
Returns the expected
SchemaTransform
of the configuration. - from(SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Same as
from(String)
, but with a
ValueProvider
. - from(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- fromCloudPubsubMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
-
Transform messages publishable using PubsubIO to their equivalent Pub/Sub Lite publishable message.
- fromJsonString(String, Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
- fromModel(Message) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
From model
Message
to HL7v2 message. - fromOptions(PipelineOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Creates an instance of this rule using provided options.
- fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
-
Creates a class representing a Pub/Sub subscription from the specified subscription path.
- fromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
-
Creates a class representing a Cloud Pub/Sub topic from the specified topic path.
- fromProto(PubsubMessage) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages
- fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Reads results received after executing the given query.
- fromQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Same as
fromQuery(String)
, but with a
ValueProvider
. - fromQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- fromSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Reads from the given subscription.
- fromSubscription(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Like
subscription()
but with a
ValueProvider
. - fromTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a BigQuery
TableSchema
to a BeamSchema
. - fromTableSchema(TableSchema, BigQueryUtils.SchemaConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a BigQuery
TableSchema
to a BeamSchema
. - fromTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Creates and returns a transform for reading from a Cloud Pub/Sub topic.
- fromTopic(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Like
PubsubIO.Read.fromTopic(String)
but with a
ValueProvider
. - FULL - org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
G
- GcpIoPipelineOptionsRegistrar - Class in org.apache.beam.sdk.io.gcp.common
-
A registrar containing the default GCP options.
- GcpIoPipelineOptionsRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
- generatePartitionMetadataTableName(String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.NameGenerator
-
Generates a unique name for the partition metadata table in the form of
"Metadata_<databaseId>_<uuid>"
. - get() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
-
Returns the estimated throughput bytes for now.
- get() - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.ThroughputEstimator
-
Returns the estimated throughput for now.
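The ThroughputEstimator contract above (an estimate "for now", or for a given time via getFrom) can be illustrated with a minimal windowed bytes-per-second estimator. This is a sketch of the concept under assumed parameters (a fixed 1-second window), not the connector's BytesThroughputEstimator.

```java
import java.util.ArrayDeque;

// Minimal windowed bytes/sec estimator illustrating the ThroughputEstimator
// idea. The 1-second window and the class itself are assumptions of this
// sketch, not Beam's implementation.
public class SimpleThroughputEstimator {
    private static final long WINDOW_MILLIS = 1000;
    private final ArrayDeque<long[]> samples = new ArrayDeque<>(); // {timestampMillis, bytes}

    // Record that `bytes` were processed at the given time.
    public void update(long timestampMillis, long bytes) {
        samples.addLast(new long[] {timestampMillis, bytes});
    }

    // Estimated throughput in bytes/sec over the window ending at nowMillis.
    public double getFrom(long nowMillis) {
        long total = 0;
        for (long[] s : samples) {
            if (s[0] > nowMillis - WINDOW_MILLIS && s[0] <= nowMillis) {
                total += s[1];
            }
        }
        return total / (WINDOW_MILLIS / 1000.0);
    }
}
```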
- getAll() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Retrieve all HL7v2 Messages from a PCollection of message IDs (such as from a Pub/Sub notification subscription).
- getAllIds(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getAllJobs() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- getAllPartitionsCreatedAfter(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Fetches all partitions with a
PartitionMetadataAdminDao.COLUMN_CREATED_AT
greater than the given timestamp. - getAllRows(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getApplicationName() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- getAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the given attribute value.
- getAttributeMap() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the full map of attributes.
- getBatchClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- getBatchInitialCount() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The initial size of a batch; used in the absence of the QoS system having significant data to determine a better batch size.
- getBatchMaxBytes() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The maximum number of bytes to include in a batch.
- getBatchMaxCount() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The maximum number of writes to include in a batch.
- getBatchTargetLatency() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Target latency for batch requests.
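The batch limits above (getBatchMaxCount, getBatchMaxBytes) compose naturally: a batch closes when adding another element would exceed either bound. The sketch below illustrates that interaction with a hypothetical class; it is not the Firestore connector's batching logic, which also adapts sizes via the QoS system.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative interaction of max-count and max-bytes batch limits.
// Hypothetical helper, not Firestore connector code.
public class BatchLimiter {
    private final int maxCount;
    private final long maxBytes;

    public BatchLimiter(int maxCount, long maxBytes) {
        this.maxCount = maxCount;
        this.maxBytes = maxBytes;
    }

    // Splits element sizes (in bytes) into batches respecting both limits.
    public List<List<Long>> batch(List<Long> sizes) {
        List<List<Long>> batches = new ArrayList<>();
        List<Long> current = new ArrayList<>();
        long currentBytes = 0;
        for (long size : sizes) {
            // Close the current batch if either limit would be exceeded.
            if (!current.isEmpty()
                    && (current.size() >= maxCount || currentBytes + size > maxBytes)) {
                batches.add(current);
                current = new ArrayList<>();
                currentBytes = 0;
            }
            current.add(size);
            currentBytes += size;
        }
        if (!current.isEmpty()) {
            batches.add(current);
        }
        return batches;
    }
}
```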
- getBigQueryProject() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Deprecated. Will be replaced by the Bigtable options configurator.
- getBigtableOptions() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Deprecated. Will be replaced by the Bigtable options configurator.
- getBqStreamingApiLoggingFrequencySec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getBundle() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
-
FHIR R4 bundle resource object as a string.
- getChangeStreamDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Creates and returns a singleton DAO instance for querying a partition change stream.
- getCheckpointMark() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getCheckpointMarkCoder() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
- getChildPartitions() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
List of child partitions yielded within this record.
- getClient(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- getClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
-
Returns the type code of the column.
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
- getCoderProvider() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
- getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryCoderProviderRegistrar
- getCoderProviders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubCoderProviderRegistrar
- getColumns() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getCommitDeadline() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getCommitRetrySettings() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getCommitTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
-
Returns the commit timestamp of the read / write transaction.
- getCommitTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The timestamp at which the modifications within were committed in Cloud Spanner.
- getCreatedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which this partition was first detected and created in the metadata table.
- getCreateDisposition() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteConfiguration
-
Specifies whether the table should be created if it does not exist.
- getCreateTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets create time.
- getCurrent() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getCurrentRowAsStruct() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Returns the record at the current pointer as a
Struct
. - getCurrentSource() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getCurrentTimestamp() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getData() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets data.
- getDatabaseAdminClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- getDatabaseClient() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getDatabaseId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- getDatabaseRole() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getDataResource() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getDataSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
The expected schema of the Pub/Sub message.
- getDataset(String, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Gets the specified
Dataset
resource by dataset ID. - getDataset(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getDatasetService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
-
Returns a real, mock, or fake
BigQueryServices.DatasetService
. - getDatasetService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- getDeadLetterQueue() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
The Pub/Sub topic path to write failures.
- getDescriptorFromTableSchema(TableSchema, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- getDescriptorFromTableSchema(TableSchema, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
-
Given a BigQuery TableSchema, returns a protocol-buffer Descriptor that can be used to write data using the BigQuery Storage API.
- getDestination(ValueInSingleWindow<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns an object that represents at a high level which table is being written to.
- getDestinationCoder() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns the coder for
DynamicDestinations
. - getEarliestHL7v2SendTime(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Gets the earliest HL7v2 send time.
- getEarliestHL7v2SendTime(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- getElement() - Method in class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
- getEmulatorHost() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
A host port pair to allow connecting to a Cloud Firestore emulator instead of the live service.
- getEmulatorHost() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getEncodedElementByteSize(TableRow) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- getEncodedElementByteSize(BigQueryInsertError) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- getEncodedTypeDescriptor() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- getEnd() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
- getEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The end time for querying this given partition.
- getEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- getError() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- getErrorMessage() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
- getErrorMessage() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getEstimatedSizeBytes(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- getExecuteStreamingSqlRetrySettings() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- getFailedBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
-
Gets failed bodies with errors.
- getFailedBundles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
Gets failed FhirBundleResponse wrapped inside HealthcareIOError.
- getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- getFailedFiles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
-
Gets failed file imports with errors.
- getFailedInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Returns a
PCollection
containing theTableRow
s that didn't make it to BQ. - getFailedInsertsWithErr() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Returns a
PCollection
containing theBigQueryInsertError
s with detailed error information. - getFailedInsertsWithErr() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
- getFailedMutations() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
-
Gets failed reads.
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
-
Gets failed reads.
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
-
Gets failed reads.
- getFailedReads() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- getFailedSearches() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
-
Gets failed searches.
- getFailedStorageApiInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
- getFhirBundleParameter() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
-
FhirBundleParameter represents a FHIR bundle in JSON format to be executed on a FHIR store.
- getFhirStore() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
- getFinishedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which the connector finished processing this partition.
- getFirestoreDb() - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
The Firestore database ID to connect to.
- getFlatJsonRows(Schema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
-
Loads rows from BigQuery into
Rows
with givenSchema
. - getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
The expected format of the Pub/Sub message.
- getFormat() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformWriteConfiguration
-
The expected format of the Pub/Sub message.
- getFrom() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
-
Returns the range start timestamp (inclusive).
- getFrom(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
-
Returns the estimated throughput bytes for a specified time.
- getFrom(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.NullThroughputEstimator
-
Always returns 0.
- getFrom(Timestamp) - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.ThroughputEstimator
-
Returns the estimated throughput for a specified time.
- getFullPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- getFullPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- getHeartbeatMillis() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The number of milliseconds of stream inactivity after which a heartbeat record will be emitted in the change stream query.
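The heartbeat semantics can be pictured as follows: if no data record has arrived within the configured number of milliseconds, a heartbeat record is due. A minimal sketch of that rule (hypothetical class, not the Spanner change stream implementation):

```java
// Sketch of heartbeat-due tracking: a heartbeat record is due once the
// stream has been idle for heartbeatMillis. Illustrative only.
public class HeartbeatTracker {
    private final long heartbeatMillis;
    private long lastRecordMillis;

    public HeartbeatTracker(long heartbeatMillis, long startMillis) {
        this.heartbeatMillis = heartbeatMillis;
        this.lastRecordMillis = startMillis;
    }

    // Call when a data record arrives; resets the idle clock.
    public void onRecord(long nowMillis) {
        lastRecordMillis = nowMillis;
    }

    // True when the stream has been idle long enough to emit a heartbeat.
    public boolean heartbeatDue(long nowMillis) {
        return nowMillis - lastRecordMillis >= heartbeatMillis;
    }
}
```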
- getHintMaxNumWorkers() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
A hint to the QoS system for the intended max number of workers for a pipeline.
- getHL7v2Message(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Fetches an HL7v2 message by its name from an HL7v2 store.
- getHL7v2Message(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
Gets HL7v2 message.
- getHL7v2Store(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Gets HL7v2 store.
- getHL7v2Store(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
Gets HL7v2 store.
- getHost() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getHTTPWriteTimeout() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
When reading from Cloud Pub/Sub where unique record identifiers are provided as Pub/Sub message attributes, specifies the name of the attribute containing the unique identifier.
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformWriteConfiguration
-
When writing to Cloud Pub/Sub where unique record identifiers are provided as Pub/Sub message attributes, specifies the name of the attribute containing the unique identifier.
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the id attribute.
- getIdAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the id attribute.
- getIndex() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getInferMaps() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
-
Controls whether to use the map or row FieldType for a TableSchema field that appears to represent a map (it is an array of structs containing only
key
andvalue
fields). - getInflightWaitSeconds() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
-
If the previous call to appendRows blocked due to flow control, returns how long the call blocked for.
- getInitialBackoff() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The initial backoff duration to be used before retrying a request for the first time.
- getInitialWatermarkEstimatorState(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
- getInitialWatermarkEstimatorState(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
- getInsertBundleParallelism() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getInsertCount() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getInsertErrors() - Method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy.Context
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getInstanceId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- getIsLocalChannelProvider() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getJob(JobReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Gets the specified
Job
by the givenJobReference
. - getJob(JobReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- getJobService(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
-
Returns a real, mock, or fake
BigQueryServices.JobService
. - getJobService(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- getJsonClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getJsonFactory() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- getJsonTimePartitioning() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getKey() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- getKeyedResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
-
Gets resources with input SearchParameter key.
- getKeySet() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getKeysJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
-
The primary keys of this specific modification.
- getKind() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- getKind() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
-
Return the display name for this factory.
- getLabels() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets labels.
- getLatestHL7v2SendTime(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
- getLatestHL7v2SendTime(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- getLiteralGqlQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getLocalhost() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getMaxAttempts() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The maximum number of times a request will be attempted for a complete successful result.
- getMaxBufferingDurationMilliSec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getMaxCumulativeBackoff() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getMaxStreamingBatchSize() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getMaxStreamingRowsToBatch() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getMessageConverter(DestinationT, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow
- getMessageId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the messageId of the message populated by Cloud Pub/Sub.
- getMessages() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- getMessageType() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets message type.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
-
String representing the metadata of the Bundle to be written.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Returns the gathered metadata for the change stream query so far.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The connector execution metadata for this record.
- getMetadata() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- getMetadataTable() - Method in interface org.apache.beam.sdk.io.gcp.spanner.SpannerIO.SpannerChangeStreamOptions
-
Returns the name of the metadata table.
- getMode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionPosition
- getMode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- getMods() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The modifications within this record.
- getModType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The type of operation that caused the modifications within this record.
- getName() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets name.
- getName() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- getName() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- getName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
The name of the column.
- getNamespace() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getNeedsAttributes() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
- getNeedsMessageId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
- getNeedsOrderingKey() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
- getNewBigqueryClient(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- getNewValuesJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
-
The new column values after the modification was applied.
- getNumberOfPartitionsInTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The total number of partitions for the given transaction.
- getNumberOfRecordsInTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The total number of data change records for the given transaction.
- getNumberOfRecordsRead() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the total number of records read from the change stream so far.
- getNumberOfRecordsRead() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The number of records read in the partition change stream query before reading this record.
- getNumBytes() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- getNumEntities(PipelineOptions, String, String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns the number of entities available for reading.
- getNumExtractJobCalls() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- getNumQuerySplits() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getNumRows(BigQueryOptions, TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
-
Returns the number of rows for a given table.
- getNumStorageWriteApiStreamAppendClients() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getNumStorageWriteApiStreams() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getNumStreamingKeys() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getNumWrites() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- getObservedTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getOldValuesJson() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
-
The old column values before the modification was applied.
- getOrCreate(SpannerConfig) - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor
- getOrderingKey() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the ordering key of the message.
- getOrdinalPosition() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
The position of the column in the table.
- getOutput() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- getOutputCoder() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
- getOverloadRatio() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The target ratio between requests sent and successful requests.
- getParentTokens() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
The unique partition identifiers of the parent partitions from which this child partition originated.
- getParentTokens() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The unique partition identifiers of the parent partitions from which this child partition originated.
- getPartition(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Fetches the partition metadata row data for the given partition token.
- getPartition(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Fetches the partition metadata row data for the given partition token.
- getPartitionCreatedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which this partition was first detected and created in the metadata table.
- getPartitionEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The end time for the partition change stream query that produced this record.
- getPartitionEndTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata
- getPartitionMetadataAdminDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Creates and returns a singleton DAO instance for admin operations over the partition metadata table.
- getPartitionMetadataDao() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.DaoFactory
-
Creates and returns a singleton DAO instance for accessing the partition metadata table.
- getPartitionRunningAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the connector started processing this partition.
- getPartitionScheduledAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which this partition was scheduled to be queried.
- getPartitionStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The start time for the partition change stream query that produced this record.
- getPartitionStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata
- getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The partition token that produced this change stream record.
- getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The unique identifier of the partition that generated this record.
- getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
Unique partition identifier, which can be used to perform a change stream query.
- getPartitionToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- getPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- getPatientCompartments() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
-
Gets the patient compartment responses for GetPatientEverything requests.
- getPatientEverything() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Get the patient compartment for a FHIR Patient using the GetPatientEverything/$everything API.
- getPatientEverything(String, Map<String, Object>, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Gets the FHIR GetPatientEverything HTTP body.
- getPatientEverything(String, Map<String, Object>, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- getPayload() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- getPayload() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
-
Returns the main PubSub message.
- getPgJsonb(int) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Returns the record at the current pointer as
JsonB
. - getPipeline() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.Result
- getPipeline() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- getPipelineOptions() - Method in class org.apache.beam.sdk.io.gcp.common.GcpIoPipelineOptionsRegistrar
- getProgress() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionTracker
- getProgress() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
-
Returns the progress made within the restriction so far.
- getProgress(PartitionRestriction, PartitionPosition) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionProgressChecker
- getProject() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the project path.
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getProjectId() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- getProtoClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
Used by the ProtoPayloadSerializerProvider when serializing from a Pub/Sub message.
- getPubsubRootUrl() - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
-
Root URL for use with the Google Cloud Pub/Sub API.
- getQueries() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration
-
Configures the BigQuery read job with the SQL query.
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getQuery() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- getQueryLocation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration
-
BigQuery geographic location where the query job will be executed.
- getQueryName() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getQueryStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the timestamp at which the change stream query for a
ChangeStreamResultSet
first started. - getQueryStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the change stream query that produced this record started.
- getReadOperation() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- getReadResponse() - Method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata.Result
-
Gets resources.
- getReadTime() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- getRecord() - Method in class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
- getRecordReadAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the timestamp at which a record was read from the
ChangeStreamResultSet
. - getRecordReadAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the record was fully read.
- getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
Indicates the order in which a record was put into the stream.
- getRecordSequence() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
Indicates the order in which this record was put into the change stream in the scope of a partition, commit timestamp and transaction tuple.
- getRecordStreamEndedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the timestamp at which a record finished being streamed.
- getRecordStreamEndedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the record finished streaming.
- getRecordStreamStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the timestamp at which a record first started to be streamed.
- getRecordStreamStartedAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The time at which the record started to be streamed.
- getRecordTimestamp() - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecord
-
The timestamp associated with the record.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The Cloud Spanner timestamp time when this record occurred.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
Returns the timestamp at which this partition started being valid in Cloud Spanner.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The timestamp at which the modifications within were committed in Cloud Spanner.
- getRecordTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
-
Indicates the timestamp for which the change stream query has returned all changes.
- getResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read.Result
-
Gets resources.
- getResources() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search.Result
-
Gets resources.
- getResourceType() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- getResponse() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
-
HTTP response from the FHIR store after attempting to write the Bundle.
- getResult() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
-
Returns the result of the transaction execution.
- getRetryableCodes() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- getRow() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
- getRowRestriction() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- getRowsWritten() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResult
-
The number of rows written in this batch.
- getRowType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The type of the primary keys and modified columns within this record.
- getRpcPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- getRunningAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which the connector started processing this partition.
- getSamplePeriod() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The length of time sampled request data will be retained.
- getSamplePeriodBucketSize() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The size of buckets within the specified
samplePeriod
. - getScheduledAt() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The time at which this partition was scheduled to be queried.
- getSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.AvroWriteRequest
- getSchema(TableReference, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
- getSchema(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns the table schema for the destination.
- getSchematizedData() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets schematized data.
- getSelectedFields() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- getSendFacility() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets send facility.
- getSendTime() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Gets send time.
- getServerTransactionId() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The unique transaction id in which the modifications occurred.
- getSideInputs() - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Specifies that this object needs access to one or more side inputs.
- getSize(PartitionMetadata, TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
- getSize(TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
- getSplitBacklogBytes() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getStackTrace() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getStart() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
- getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
-
The partition_start_time of the child partition token.
- getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The start time at which the partition started existing in Cloud Spanner.
- getStartTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- getState() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The state in which the current partition is.
- getStatus() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- getStatusCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOError
- getStoppedMode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- getStorageApiAppendThresholdBytes() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageApiAppendThresholdRecordCount() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageClient(BigQueryOptions) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices
-
Returns a real, mock, or fake
BigQueryServices.StorageClient
. - getStorageClient(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- getStorageWriteApiMaxRequestSize() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStorageWriteApiTriggeringFrequencySec() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getStreamAppendClient(String, Descriptors.Descriptor, boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Create an append client for a given Storage API write stream.
- getStreamAppendClient(String, Descriptors.Descriptor, boolean) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getSubscription() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
The subscription from which to read Pub/Sub messages.
- getSubscription() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the subscription being read from.
- getSubscriptionProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the
ValueProvider
for the subscription being read from. - getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
- getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.AbstractResult
- getSuccessfulBodies() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.Result
-
Gets successful bodies from Write.
- getSuccessfulBundles() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
Gets successful FhirBundleResponse from execute bundles operation.
- getSuccessfulInserts() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Returns a
PCollection
containing the TableRow
s that were written to BQ via the streaming insert API. - getSuccessfulTableLoads() - Method in class org.apache.beam.sdk.io.gcp.bigquery.WriteResult
-
Returns a
PCollection
containing the TableDestination
s that were successfully loaded using the batch load API. - getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Returns the table to read, or
null
if reading from a query instead. - getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- getTable() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Returns the table reference, or
null
. - getTable() - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- getTable(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Gets the specified
Table
resource by table ID. - getTable(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getTable(TableReference, List<String>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
- getTable(TableReference, List<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
- getTable(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getTable(DestinationT) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns a
TableDestination
object for the destination. - getTableDescription() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns the table being read from.
- getTableId() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- getTableImpl(TableReference, List<String>, BigQueryServices.DatasetService.TableMetadataView) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- getTableName() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The name of the table in which the modifications within this record occurred.
- getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Returns the table to read, or
null
if reading from a query instead. - getTableProvider() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- getTableReference() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getTableResource(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- getTableSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
- getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration
-
Specifies a table for a BigQuery read job.
- getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteConfiguration
-
Writes to the given table specification.
- getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- getTableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
Returns the tablespec in [project:].dataset.tableid format.
- getTableUrn() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
-
Returns the table URN in projects/[project]/datasets/[dataset]/tables/[table] format.
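The tablespec returned by getTableSpec() and the URN returned by getTableUrn() encode the same table reference. A minimal plain-Java sketch of converting one into the other (TableSpecs.toUrn is a hypothetical helper written for illustration, not part of the Beam API):

```java
// Hypothetical helper: convert a "project:dataset.table" tablespec into the
// "projects/<p>/datasets/<d>/tables/<t>" URN form described for getTableUrn().
public class TableSpecs {
  public static String toUrn(String tableSpec) {
    int colon = tableSpec.indexOf(':');
    String project = tableSpec.substring(0, colon);
    String rest = tableSpec.substring(colon + 1);
    int dot = rest.indexOf('.');
    String dataset = rest.substring(0, dot);
    String table = rest.substring(dot + 1);
    return String.format("projects/%s/datasets/%s/tables/%s", project, dataset, table);
  }
}
```

This sketch assumes the project qualifier is present; the bracketed "[project:]" in the tablespec format indicates it may be omitted, which a real converter would need to handle.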
- getTargetDataset() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
- getTargetTable(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- getTargetTableId(BigQueryOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- getTempDatasetId() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getThriftClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
Used by the ThriftPayloadSerializerProvider when serializing from a Pub/Sub message.
- getThriftProtocolFactoryClass() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
Used by the ThriftPayloadSerializerProvider when serializing from a Pub/Sub message.
- getThrottleDuration() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
The amount of time an attempt will be throttled if deemed necessary based on previous success rate.
- getTimePartitioning() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- getTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
-
Indicates the timestamp for which the change stream query has returned all changes.
- getTimestamp() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionPosition
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
When reading from Cloud Pub/Sub where record timestamps are provided as Pub/Sub message attributes, specifies the name of the attribute that contains the timestamp.
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformWriteConfiguration
-
When writing to Cloud Pub/Sub where record timestamps are configured as Pub/Sub message attributes, specifies the name of the attribute that contains the timestamp.
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the timestamp attribute.
- getTimestampAttribute() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the timestamp attribute.
- getTo() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
-
Returns the range end timestamp (exclusive).
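The end-exclusive semantics of the entry above can be illustrated with a short self-contained sketch. The class below is a hypothetical simplified model (plain `long` timestamps, e.g. microseconds since epoch), not Beam's actual `TimestampRange`, which operates on `com.google.cloud.Timestamp`:

```java
// Hypothetical simplified model of a half-open timestamp range [from, to):
// as in the entry above, getTo() returns an exclusive end, so a record at
// exactly `to` falls outside the range.
public class TimestampRangeSketch {

  private final long from; // inclusive start
  private final long to;   // exclusive end

  public TimestampRangeSketch(long from, long to) {
    if (from > to) {
      throw new IllegalArgumentException("from must not be greater than to");
    }
    this.from = from;
    this.to = to;
  }

  public long getFrom() {
    return from;
  }

  public long getTo() {
    return to;
  }

  // A timestamp is inside the range iff from <= ts < to.
  public boolean contains(long ts) {
    return ts >= from && ts < to;
  }
}
```

A range `[0, 10)` contains `9` but not `10`, which is why callers must treat `getTo()` as the first timestamp not covered by the range.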
- getToken() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
-
Unique partition identifier, which can be used to perform a change stream query.
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
-
The topic from which to read Pub/Sub messages.
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformWriteConfiguration
-
The topic to which to write Pub/Sub messages.
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the topic being written to.
- getTopic() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the topic being read from.
- getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
-
Get the
ValueProvider
for the topic being written to. - getTopicProvider() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Get the
ValueProvider
for the topic being read from. - getTotalStreamDuration() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSetMetadata
-
Returns the total stream duration of change stream records so far.
- getTotalStreamTimeMillis() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
-
The total streaming time (in millis) for this record.
- getTransactionTag() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The transaction tag associated with the given transaction.
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.ReadRegistrar
- getTransformPayloadTranslators() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.WriteRegistrar
- getTruncateTimestamps() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions
- getType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
The type of the column.
- getUnfinishedMinWatermark() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Fetches the earliest partition watermark from the partition metadata table that is not in a
PartitionMetadata.State.FINISHED
state. - getUseStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration
-
Enables BigQuery's Standard SQL dialect when reading from a query.
- getUseStorageApiConnectionPool() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getUseStorageWriteApi() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getUseStorageWriteApiAtLeastOnce() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- getUsingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- getValueCaptureType() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
The capture type of the change stream that generated this record.
- getWatermark() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- getWatermark() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
The watermark: all records with a timestamp less than this value have been processed.
- getWrite() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- getWriteDisposition() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteConfiguration
-
Specifies what to do with existing data in the table, in case the table already exists.
- getWriteFailures() - Method in exception org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.FailedWritesException
-
The list of
FirestoreV1.WriteFailure
s detailing which writes failed and for what reason. - getWriteResult() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- grouped() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Same transform but can be applied to
PCollection
of
MutationGroup
.
H
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertError
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionPosition
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
- hashCode() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- hashSchemaDescriptorDeterministic(Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Hashes a schema descriptor using a deterministic hash function.
- hasNext() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
- hasNext() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages.HL7v2MessagePagesIterator
- HealthcareApiClient - Interface in org.apache.beam.sdk.io.gcp.healthcare
-
Defines a client that talks to the Cloud Healthcare API.
- HealthcareIOError<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Class for capturing errors on IO operations on Google Cloud Healthcare APIs resources.
- HealthcareIOErrorCoder<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
- HealthcareIOErrorToTableRow<T> - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Convenience transform to write dead-letter
HealthcareIOError
s to BigQuery
TableRow
s. - HealthcareIOErrorToTableRow() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
- HEARTBEAT_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of heartbeat records identified during the execution of the Connector.
- HeartbeatRecord - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
A heartbeat record serves as a notification that the change stream query has returned all changes for the partition with timestamps less than or equal to the record timestamp.
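Because a heartbeat at timestamp T guarantees that all changes at or before T have been returned, a consumer can use it to advance a per-partition watermark. The sketch below is a hypothetical illustration of that idea, not the connector's actual watermark code:

```java
// Hypothetical sketch of heartbeat-driven watermark advancement: a
// heartbeat at time T means the change stream query has returned all
// changes for the partition up to and including T, so the watermark can
// safely move forward to T. Watermarks only ever move forward.
public class HeartbeatWatermarkSketch {

  private long watermarkMicros;

  public HeartbeatWatermarkSketch(long initialWatermarkMicros) {
    this.watermarkMicros = initialWatermarkMicros;
  }

  // Advance the watermark monotonically; stale heartbeats are ignored.
  public void onHeartbeat(long heartbeatTimestampMicros) {
    watermarkMicros = Math.max(watermarkMicros, heartbeatTimestampMicros);
  }

  public long getWatermark() {
    return watermarkMicros;
  }
}
```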
- HeartbeatRecord(Timestamp, ChangeStreamRecordMetadata) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
-
Constructs the heartbeat record with the given timestamp and metadata.
- heartbeatRecordAction(ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a singleton instance of an action class capable of processing
HeartbeatRecord
s. - HeartbeatRecordAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
This class is part of the process for
ReadChangeStreamPartitionDoFn
SDF. - HL7v2IO - Class in org.apache.beam.sdk.io.gcp.healthcare
-
HL7v2IO
provides an API for reading from and writing to Google Cloud Healthcare HL7v2 API. - HL7v2IO() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
- HL7v2IO.ListHL7v2Messages - Class in org.apache.beam.sdk.io.gcp.healthcare
-
List HL7v2 messages in HL7v2 Stores with optional filter.
- HL7v2IO.Read - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Read that reads HL7v2 message contents given a PCollection of message ID strings.
- HL7v2IO.Read.FetchHL7v2Message - Class in org.apache.beam.sdk.io.gcp.healthcare
-
PTransform
to fetch a message from a Google Cloud Healthcare HL7v2 store based on msgID. - HL7v2IO.Read.FetchHL7v2Message.HL7v2MessageGetFn - Class in org.apache.beam.sdk.io.gcp.healthcare
-
DoFn for fetching messages from the HL7v2 store with error handling.
- HL7v2IO.Read.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
- HL7v2IO.Write - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type Write that writes the given PCollection of HL7v2 messages.
- HL7v2IO.Write.Result - Class in org.apache.beam.sdk.io.gcp.healthcare
- HL7v2IO.Write.WriteMethod - Enum in org.apache.beam.sdk.io.gcp.healthcare
-
The enum of write methods.
- HL7v2Message - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type HL7v2 message to wrap the
Message
model. - HL7v2Message(String, String, String, String, String, String, String, Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
- HL7v2MessageCoder - Class in org.apache.beam.sdk.io.gcp.healthcare
- HttpHealthcareApiClient - Class in org.apache.beam.sdk.io.gcp.healthcare
-
A client that talks to the Cloud Healthcare API through HTTP requests.
- HttpHealthcareApiClient() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
Instantiates a new HTTP Healthcare API client.
- HttpHealthcareApiClient(CloudHealthcare) - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
-
Instantiates a new HTTP Healthcare API client.
- HttpHealthcareApiClient.AuthenticatedRetryInitializer - Class in org.apache.beam.sdk.io.gcp.healthcare
- HttpHealthcareApiClient.FhirResourcePagesIterator - Class in org.apache.beam.sdk.io.gcp.healthcare
-
The type FhirResourcePagesIterator for methods which return paged output.
- HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod - Enum in org.apache.beam.sdk.io.gcp.healthcare
- HttpHealthcareApiClient.HealthcareHttpException - Exception in org.apache.beam.sdk.io.gcp.healthcare
-
Wraps
HttpResponse
in an exception with a statusCode field for use with
HealthcareIOError
. - HttpHealthcareApiClient.HL7v2MessagePages - Class in org.apache.beam.sdk.io.gcp.healthcare
- HttpHealthcareApiClient.HL7v2MessagePages.HL7v2MessagePagesIterator - Class in org.apache.beam.sdk.io.gcp.healthcare
-
Iterator over pages of HL7v2 message IDs.
I
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryDlqProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Returns an id that uniquely represents this IO.
- identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadProvider
-
Implementation of the
TypedSchemaTransformProvider
identifier method. - identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteProvider
-
Implementation of the
TypedSchemaTransformProvider
identifier method. - identifier() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
-
Returns an id that uniquely represents this IO.
- identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubDlqProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
-
Returns an id that uniquely represents this IO.
- identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadProvider
-
Implementation of the
TypedSchemaTransformProvider
identifier method. - identifier() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
- identifier() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- ignoreInsertIds() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Setting this option to true disables the insertId-based data deduplication offered by BigQuery.
- ignoreUnknownValues() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Accept rows that contain values that do not match the schema.
- IMPORT - org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.WriteMethod
-
The Import method bulk-imports resources from GCS.
- importFhirResource(String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
- importFhirResource(String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- importResources(String, String, String, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Import resources.
- importResources(ValueProvider<String>, ValueProvider<String>, ValueProvider<String>, FhirIO.Import.ContentStructure) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Import resources.
- in(Pipeline, PCollection<FhirBundleResponse>, PCollection<HealthcareIOError<FhirBundleParameter>>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundlesResult
-
Entry point for the ExecuteBundlesResult, storing the successful and failed bundles and their metadata.
- incActivePartitionReadCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.ACTIVE_PARTITION_READ_COUNT
by 1 if the metric is enabled. - incDataRecordCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.DATA_RECORD_COUNT
by 1 if the metric is enabled. - incHeartbeatRecordCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.HEARTBEAT_RECORD_COUNT
by 1 if the metric is enabled. - incPartitionRecordCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.PARTITION_RECORD_COUNT
by 1 if the metric is enabled. - incPartitionRecordMergeCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.PARTITION_RECORD_MERGE_COUNT
by 1 if the metric is enabled. - incPartitionRecordSplitCount() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.PARTITION_RECORD_SPLIT_COUNT
by 1 if the metric is enabled. - incQueryCounter() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Increments the
ChangeStreamMetrics.QUERY_COUNT
by 1 if the metric is enabled. - INGEST - org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.WriteMethod
-
Ingest write method.
- ingestHL7v2Message(String, Message) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Ingests an HL7v2 message, returning the ingest message response.
- ingestHL7v2Message(String, Message) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- ingestMessages(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Write using the Messages.Ingest method.
- initClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Deidentify.DeidentifyFn
- initClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Export.ExportResourcesFn
- initialize(HttpRequest) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.AuthenticatedRetryInitializer
- InitializeDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
A DoFn responsible for initializing the change stream Connector.
- InitializeDoFn(DaoFactory, MapperFactory, Timestamp, Timestamp) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.InitializeDoFn
- InitialPartition - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Utility class holding constants and methods for the initial partition.
- InitialPartition() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
- initialRestriction(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
-
Uses a
TimestampRange
with a max range. - initialRestriction(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
-
The restriction for a partition will be defined from the start and end timestamp to query the partition for.
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadProvider
-
Implementation of the
TypedSchemaTransformProvider
inputCollectionNames method. - inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteProvider
-
Implementation of the
TypedSchemaTransformProvider
inputCollectionNames method. - inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadProvider
-
Implementation of the
TypedSchemaTransformProvider
inputCollectionNames method. - inputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- insert(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Inserts the partition metadata.
- insert(PartitionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Inserts the partition metadata.
- INSERT - org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
- INSERT_OR_UPDATE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- INSERT_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- insertAll(TableReference, List<TableRow>, List<String>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- insertAll(TableReference, List<FailsafeValueInSingleWindow<TableRow, TableRow>>, List<String>, InsertRetryPolicy, List<ValueInSingleWindow<T>>, ErrorContainer<T>, boolean, boolean, boolean, List<ValueInSingleWindow<TableRow>>) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Inserts
TableRows
with the specified insertIds if not null. - insertAll(TableReference, List<FailsafeValueInSingleWindow<TableRow, TableRow>>, List<String>, InsertRetryPolicy, List<ValueInSingleWindow<T>>, ErrorContainer<T>, boolean, boolean, boolean, List<ValueInSingleWindow<TableRow>>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- InsertBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.InsertBuilder
- insertDataToTable(String, String, String, List<Map<String, Object>>) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Inserts rows into a table using a BigQuery streaming write.
- InsertOrUpdateBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.InsertOrUpdateBuilder
- InsertRetryPolicy - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A retry policy for streaming BigQuery inserts.
- InsertRetryPolicy() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
- InsertRetryPolicy.Context - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Contains information about a failed insert.
- insertRows(Schema, Row...) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
- instanceId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- instantiateHealthcareClient() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message.HL7v2MessageGetFn
-
Instantiates the healthcare client.
- INTERACTIVE - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
-
Specifies that a query should be run with an INTERACTIVE priority.
- InTransactionContext(String, TransactionContext, Dialect) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Constructs a context to execute a user defined function transactionally.
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Indicates whether the PCollections produced by this transform will contain a bounded or unbounded number of elements.
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionTracker
- isBounded() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return true if
PubsubClient.pull(long, org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath, int, boolean)
will always return an empty list. - isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- isEOF() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- isInitialPartition(String) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
-
Verifies if the given partition token is the initial partition.
- isLastRecordInTransactionInPartition() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
Indicates whether this record is the last emitted for the given transaction in the given partition.
- isPrimaryKey() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
-
True if the column is part of the primary key, false otherwise.
- isShouldReportDiagnosticMetrics() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Whether additional diagnostic metrics should be reported for a Transform.
- isSystemTransaction() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
-
Whether the given transaction is a Spanner system transaction.
- isTableEmpty(TableReference) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Returns true if the table is empty.
- isTableEmpty(TableReference) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- iterator() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
- iterator() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- iterator() - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices.FakeBigQueryServerStream
J
- JsonArrayCoder - Class in org.apache.beam.sdk.io.gcp.healthcare
- JsonArrayCoder() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
- jsonValueFromMessageValue(Descriptors.FieldDescriptor, Object, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
K
- KEY - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- KEY_FIELD_PROPERTY - Static variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
- keyField - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- kind - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- knownBuilderInstances() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
- knownBuilderInstances() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- knownBuilders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead
- knownBuilders() - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite
L
- LABELS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- lastAttemptedPosition - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- lastClaimedPosition - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- listAllFhirStores(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
List all FHIR stores in a dataset.
- listAllFhirStores(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- listCollectionIds() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
-
Factory method to create a new type safe builder for
ListCollectionIdsRequest
operations. - listDocuments() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
-
Factory method to create a new type safe builder for
ListDocumentsRequest
operations. - listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return a list of subscriptions for
topic
in
project
. - listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- listSubscriptions(PubsubClient.ProjectPath, PubsubClient.TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return a list of topics for
project
. - listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- listTopics(PubsubClient.ProjectPath) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- location - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- location - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- longToByteArray(long) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigtableUtils
M
- makeHL7v2ListRequest(String, String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Makes an HL7v2 list request, returning the list messages response.
- makeHL7v2ListRequest(String, String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- makeListRequest(HealthcareApiClient, String, Instant, Instant, String, String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages
-
Makes a list request, returning the list messages response.
- makeSendTimeBoundHL7v2ListRequest(String, Instant, Instant, String, String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Makes a send-time-bound HL7v2 list request.
- makeSendTimeBoundHL7v2ListRequest(String, Instant, Instant, String, String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- ManagedFactory<T extends java.lang.AutoCloseable> - Interface in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
A ManagedFactory produces instances and tears down any produced instances when it is itself closed.
- ManagedFactoryImpl<T extends java.lang.AutoCloseable> - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- MapperFactory - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
-
Factory class for creating instances that will map a struct to a connector model.
- MapperFactory(Dialect) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.MapperFactory
- matchesSafely(BigqueryMatcher.TableAndQuery) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- MAX_INCLUSIVE_END_AT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
Represents the maximum inclusive end timestamp that can be specified for a change stream.
- message() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
-
Underlying Message.
- messageFromBeamRow(Descriptors.Descriptor, Row) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BeamRowToStorageApiProto
-
Given a Beam
Row
object, returns a protocol-buffer message that can be used to write data using the BigQuery Storage streaming API. - messageFromMap(TableRowToStorageApiProto.SchemaInformation, Descriptors.Descriptor, AbstractMap<String, Object>, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- messageFromTableRow(TableRowToStorageApiProto.SchemaInformation, Descriptors.Descriptor, TableRow, boolean) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
-
Given a BigQuery TableRow, returns a protocol-buffer message that can be used to write data using the BigQuery Storage API.
- METADATA - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO.ReadStudyMetadata
-
TupleTag for the main output.
- Mod - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents a modification in a table emitted within a
DataChangeRecord
. - Mod(String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
-
Constructs a mod from the primary key values, the old state of the row and the new state of the row.
- modeToProtoMode(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Modify the ack deadline for messages from
subscription
with
ackIds
to be
deadlineSeconds
from now. - modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- modifyAckDeadline(PubsubClient.SubscriptionPath, List<String>, int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- ModType - Enum in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents the type of modification applied in the
DataChangeRecord
. - MutationGroup - Class in org.apache.beam.sdk.io.gcp.spanner
-
A bundle of mutations that must be submitted atomically.
N
- NameGenerator - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams
-
This class generates a unique name for the partition metadata table, which is created when the Connector is initialized.
- NameGenerator() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.NameGenerator
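NameGenerator must produce a table name that is unique per pipeline run so that concurrent Connectors do not collide on the same metadata table. A minimal sketch of one possible scheme; the method name, prefix, and exact format below are assumptions for illustration, not the Connector's actual naming rule:

```java
import java.util.UUID;

public class MetadataTableNames {
    // Hypothetical naming scheme: prefix + database id + a UUID, with
    // hyphens replaced because Spanner table names cannot contain them.
    static String generatePartitionMetadataTableName(String databaseId) {
        String uuid = UUID.randomUUID().toString().replace("-", "_");
        return "Metadata_" + databaseId + "_" + uuid;
    }

    public static void main(String[] args) {
        String a = generatePartitionMetadataTableName("orders_db");
        String b = generatePartitionMetadataTableName("orders_db");
        // Two calls yield distinct names, so concurrent pipelines do not collide.
        System.out.println(a.equals(b));  // false
    }
}
```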
- neverRetry() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
-
Never retry any failures.
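A retry policy is essentially a predicate over insert failures. A simplified sketch of the `neverRetry` behavior; the `RetryPolicySketch` shape below is a stand-in for illustration, not Beam's actual `InsertRetryPolicy` class:

```java
public abstract class RetryPolicySketch {
    // Simplified stand-in for an insert retry policy: decide per failure
    // whether the failed row should be re-inserted.
    public abstract boolean shouldRetry(String errorReason);

    // Never retry any failures.
    public static RetryPolicySketch neverRetry() {
        return new RetryPolicySketch() {
            @Override
            public boolean shouldRetry(String errorReason) {
                return false;
            }
        };
    }

    // Retry all failures, for contrast.
    public static RetryPolicySketch alwaysRetry() {
        return new RetryPolicySketch() {
            @Override
            public boolean shouldRetry(String errorReason) {
                return true;
            }
        };
    }

    public static void main(String[] args) {
        System.out.println(neverRetry().shouldRetry("timeout"));   // false
        System.out.println(alwaysRetry().shouldRetry("timeout"));  // true
    }
}
```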
- NEW_ROW - org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
- NEW_VALUES - org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
- newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Factory method to return a new instance of
RpcQosOptions.Builder
with all values set to their initial default values. - newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions
- newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
- newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
- newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
Creates a builder for constructing a partition metadata instance.
- newBuilder() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata
- newBuilder(PartitionRestrictionMetadata) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata
- newClient(String, String, PubsubOptions) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
- newClient(String, String, PubsubOptions, String) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.PubsubClientFactory
-
Construct a new Pubsub client.
- newDlqTransform(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryDlqProvider
- newDlqTransform(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubDlqProvider
- newDlqTransform(String) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.DlqProvider
- newTracker(PartitionMetadata, TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
- newTracker(TimestampRange) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
- newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
- newWatermarkEstimator(Instant) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
- next() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
- next() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.HL7v2MessagePages.HL7v2MessagePagesIterator
- next() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.ChangeStreamResultSet
-
Moves the pointer to the next record in the
ResultSet
if there is one. - next(Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
-
Adds one nanosecond to the given timestamp.
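Adding one nanosecond requires a carry into the seconds field when the nanosecond fraction reaches 10^9. A self-contained sketch using a plain (seconds, nanos) pair in place of the Cloud `Timestamp` type:

```java
public class TimestampNext {
    // Timestamps as (seconds since epoch, nanosecond fraction in [0, 1e9)).
    static long[] next(long seconds, int nanos) {
        if (nanos == 999_999_999) {
            return new long[] {seconds + 1, 0};  // carry into the seconds field
        }
        return new long[] {seconds, nanos + 1};
    }

    public static void main(String[] args) {
        long[] t = next(10, 999_999_999);
        System.out.println(t[0] + "." + t[1]);  // 11.0
    }
}
```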
- now(Matcher<Iterable<? extends Row>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery.RowsAssertion
- NullThroughputEstimator<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
-
NoOp implementation of a throughput estimator.
- NullThroughputEstimator() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.NullThroughputEstimator
- NUM_QUERY_SPLITS_MAX - Static variable in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
An upper bound on the number of splits for a query.
O
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertErrorCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteResultCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.JsonArrayCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- of() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
- of(Timestamp, Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
-
Constructs a timestamp range.
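A timestamp range is a half-open interval, inclusive of the start and exclusive of the end, and the factory rejects inverted ranges. A minimal sketch of that validation; the plain-nanoseconds representation here is an assumption standing in for the Cloud `Timestamp` type:

```java
public class Range {
    final long fromNanos;
    final long toNanos;  // half-open interval [from, to)

    private Range(long fromNanos, long toNanos) {
        this.fromNanos = fromNanos;
        this.toNanos = toNanos;
    }

    // Mirrors the of(Timestamp, Timestamp) shape: reject inverted ranges.
    static Range of(long fromNanos, long toNanos) {
        if (fromNanos > toNanos) {
            throw new IllegalArgumentException("from must not be after to");
        }
        return new Range(fromNanos, toNanos);
    }

    boolean contains(long nanos) {
        return nanos >= fromNanos && nanos < toNanos;
    }

    public static void main(String[] args) {
        Range r = Range.of(100, 200);
        System.out.println(r.contains(100));  // true: inclusive start
        System.out.println(r.contains(200));  // false: exclusive end
    }
}
```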
- of(ByteString) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
- of(PubsubMessage, long, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
- of(Class<HL7v2Message>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2MessageCoder
- of(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
- of(String, String, Map<String, T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
-
Creates a FhirSearchParameter of type T.
- of(String, Map<String, T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
-
Creates a FhirSearchParameter of type T, without a key.
- of(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleParameter
- of(Coder<T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameterCoder
- of(Coder<T>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorCoder
- of(FhirBundleParameter, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirBundleResponse
- of(PubsubMessage, long, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
- of(PCollectionTuple) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.Result
- of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
- of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
- of(TypeDescriptor<PubsubMessage>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
- OffsetByteRangeCoder - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- OffsetByteRangeCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.OffsetByteRangeCoder
- ofPatientEverything(HealthcareApiClient, String, Map<String, Object>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
-
Instantiates a new GetPatientEverything FHIR resource pages iterator.
- ofSearch(HealthcareApiClient, String, String, Map<String, Object>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator
-
Instantiates a new search FHIR resource pages iterator.
- OLD_AND_NEW_VALUES - org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
- onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages.ConvertMessagesDoFn
- onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiFlushAndFinalizeDoFn
- onTeardown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
- optimizedWrites() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If true, enables new codepaths that are expected to use less resources while writing to BigQuery.
- org.apache.beam.sdk.io.gcp.bigquery - package org.apache.beam.sdk.io.gcp.bigquery
-
Defines transforms for reading from and writing to Google BigQuery.
- org.apache.beam.sdk.io.gcp.bigquery.providers - package org.apache.beam.sdk.io.gcp.bigquery.providers
-
Defines SchemaTransformProviders for reading from and writing to Google BigQuery.
- org.apache.beam.sdk.io.gcp.bigtable - package org.apache.beam.sdk.io.gcp.bigtable
-
Defines transforms for reading from and writing to Google Cloud Bigtable.
- org.apache.beam.sdk.io.gcp.common - package org.apache.beam.sdk.io.gcp.common
-
Defines common Google Cloud Platform IO support classes.
- org.apache.beam.sdk.io.gcp.datastore - package org.apache.beam.sdk.io.gcp.datastore
-
Provides an API for reading from and writing to Google Cloud Datastore over different versions of the Cloud Datastore Client libraries.
- org.apache.beam.sdk.io.gcp.firestore - package org.apache.beam.sdk.io.gcp.firestore
-
Provides an API for reading from and writing to Google Cloud Firestore.
- org.apache.beam.sdk.io.gcp.healthcare - package org.apache.beam.sdk.io.gcp.healthcare
-
Provides an API for reading from and writing to the Google Cloud Healthcare API (FHIR, HL7v2, and DICOM stores).
- org.apache.beam.sdk.io.gcp.pubsub - package org.apache.beam.sdk.io.gcp.pubsub
-
Defines transforms for reading from and writing to Google Cloud Pub/Sub.
- org.apache.beam.sdk.io.gcp.pubsublite - package org.apache.beam.sdk.io.gcp.pubsublite
-
Defines transforms for reading from and writing to Google Cloud Pub/Sub Lite.
- org.apache.beam.sdk.io.gcp.pubsublite.internal - package org.apache.beam.sdk.io.gcp.pubsublite.internal
-
Defines transforms for reading from and writing to Google Cloud Pub/Sub Lite.
- org.apache.beam.sdk.io.gcp.spanner - package org.apache.beam.sdk.io.gcp.spanner
-
Provides an API for reading from and writing to Google Cloud Spanner.
- org.apache.beam.sdk.io.gcp.spanner.changestreams - package org.apache.beam.sdk.io.gcp.spanner.changestreams
-
Provides an API for reading change stream data from Google Cloud Spanner.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.action - package org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
Action processors for each of the types of Change Stream records received.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.dao - package org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Database Access Objects for querying change streams and modifying the Connector's metadata tables.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn - package org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
DoFn and SDF definitions to process Google Cloud Spanner Change Streams.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder - package org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder
-
Encoders for serializing the Spanner change stream model classes.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator - package org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
-
Classes related to estimating the throughput of the change streams SDFs.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper - package org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
-
Mapping related functionality, such as from
ResultSet
s to Change Stream models. - org.apache.beam.sdk.io.gcp.spanner.changestreams.model - package org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
User models for the Spanner change stream API.
- org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction - package org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
Custom restriction tracker related classes.
- org.apache.beam.sdk.io.gcp.testing - package org.apache.beam.sdk.io.gcp.testing
-
Defines utilities for unit testing Google Cloud Platform components of Apache Beam pipelines.
- OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
-
The tag for the main output of FHIR resources.
- OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Search
-
The tag for the main output of FHIR Resources from a search.
- OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything
-
The tag for the main output of FHIR Resources from a GetPatientEverything request.
- OUT - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
-
The tag for the main output of HL7v2 Messages.
- OutgoingMessage() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadProvider
-
Implementation of the
TypedSchemaTransformProvider
outputCollectionNames method. - outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteProvider
-
Implementation of the
TypedSchemaTransformProvider
outputCollectionNames method. - outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider
- outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadProvider
-
Implementation of the
TypedSchemaTransformProvider
outputCollectionNames method. - outputCollectionNames() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
P
- PARENT_TOKENS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
-
The empty set representing the initial partition parent tokens.
- parseDicomWebpath(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser
- ParsePayloadAsPubsubMessageProto() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePayloadAsPubsubMessageProto
- ParsePubsubMessageProtoAsPayload() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages.ParsePubsubMessageProtoAsPayload
- parseTableSpec(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
-
Parse a table specification in the form
"[project_id]:[dataset_id].[table_id]"
or"[project_id].[dataset_id].[table_id]"
or"[dataset_id].[table_id]"
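The three accepted spec forms above can be distinguished with a single pattern. A simplified sketch of the parsing logic; the real Beam helper returns a `TableReference`, and real project IDs may themselves contain dots (domain-scoped projects), which this sketch ignores:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TableSpecParser {
    // Accepts "[project]:[dataset].[table]", "[project].[dataset].[table]",
    // or "[dataset].[table]"; project is null in the last form.
    private static final Pattern SPEC = Pattern.compile(
        "^(?:(?<project>[^:.]+)[:.])?(?<dataset>[^:.]+)\\.(?<table>[^:.]+)$");

    static String[] parse(String spec) {
        Matcher m = SPEC.matcher(spec);
        if (!m.matches()) {
            throw new IllegalArgumentException("Bad table spec: " + spec);
        }
        return new String[] {m.group("project"), m.group("dataset"), m.group("table")};
    }

    public static void main(String[] args) {
        System.out.println(String.join("/", parse("my-project:ds.tbl")));  // my-project/ds/tbl
    }
}
```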
. - parseTableUrn(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
- parseTimestampAsMsSinceEpoch(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Return timestamp as ms-since-unix-epoch corresponding to
timestamp
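Pub/Sub publish timestamps are RFC 3339 strings, and `java.time` can perform this conversion directly. A minimal sketch; the real Beam helper may accept additional formats, and this sketch handles only the RFC 3339 form:

```java
import java.time.Instant;

public class PubsubTimestamps {
    // Convert an RFC 3339 timestamp string to milliseconds since the Unix epoch.
    static long parseTimestampAsMsSinceEpoch(String timestamp) {
        return Instant.parse(timestamp).toEpochMilli();
    }

    public static void main(String[] args) {
        System.out.println(parseTimestampAsMsSinceEpoch("1970-01-01T00:00:01.500Z"));  // 1500
    }
}
```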
. - PARTITION_CREATED_TO_SCHEDULED_MS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Time in milliseconds that a partition took to transition from
PartitionMetadata.State.CREATED
toPartitionMetadata.State.SCHEDULED
. - PARTITION_RECORD_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of partitions identified during the execution of the Connector.
- PARTITION_RECORD_MERGE_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of partition merges identified during the execution of the Connector.
- PARTITION_RECORD_SPLIT_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of partition splits / moves identified during the execution of the Connector.
- PARTITION_SCHEDULED_TO_RUNNING_MS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Time in milliseconds that a partition took to transition from
PartitionMetadata.State.SCHEDULED
toPartitionMetadata.State.RUNNING
. - PARTITION_TOKEN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.InitialPartition
-
The token of the initial partition.
- PartitionMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Model for the partition metadata database table used in the Connector.
- PartitionMetadata(String, HashSet<String>, Timestamp, Timestamp, long, PartitionMetadata.State, Timestamp, Timestamp, Timestamp, Timestamp, Timestamp) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
- PartitionMetadata.Builder - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Partition metadata builder for better user experience.
- PartitionMetadata.State - Enum in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
The states that a partition can be in within the system: CREATED: the partition has been created, but no query has been done against it yet.
- PartitionMetadataAdminDao - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Data access object for creating and dropping the partition metadata table.
- PartitionMetadataDao - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Data access object for the Connector metadata tables.
- PartitionMetadataDao.InTransactionContext - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Represents the execution of a read / write transaction in Cloud Spanner.
- PartitionMetadataDao.TransactionResult<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dao
-
Represents a result from executing a Cloud Spanner read / write transaction.
- partitionMetadataMapper() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.MapperFactory
-
Creates and returns a single instance of a mapper class capable of transforming a
Struct
into aPartitionMetadata
class. - PartitionMetadataMapper - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper
-
This class is responsible for transforming a
Struct
to aPartitionMetadata
. - PartitionMode - Enum in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
This enum contains the states that PartitionRestrictionTracker will go through.
- PartitionPosition - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
Represents the current position of the running SDF within PartitionRestriction.
- PartitionPosition(Optional<Timestamp>, PartitionMode) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionPosition
- partitionQuery() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
-
Factory method to create a new type safe builder for
PartitionQueryRequest
operations. - PartitionRestriction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
Represents the restriction for PartitionRestrictionTracker.
- PartitionRestriction(Timestamp, Timestamp, PartitionMode, PartitionMode) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- PartitionRestrictionClaimer - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
The PartitionRestrictionClaimer class.
- PartitionRestrictionClaimer() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionClaimer
- PartitionRestrictionMetadata - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
The PartitionRestrictionMetadata class.
- PartitionRestrictionMetadata(String, Timestamp, Timestamp) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata
- PartitionRestrictionMetadata.Builder - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
- PartitionRestrictionProgressChecker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
- PartitionRestrictionProgressChecker() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionProgressChecker
-
Indicates how many mode transitions have been completed for the current mode.
- PartitionRestrictionSplitter - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
The PartitionRestrictionSplitter class.
- PartitionRestrictionSplitter() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionSplitter
- PartitionRestrictionTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
- PartitionRestrictionTracker(PartitionRestriction) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionTracker
- patchTableDescription(TableReference, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService
-
Patch BigQuery
Table
description. - patchTableDescription(TableReference, String) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- PATIENT_EVERYTHING - org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod
- PatientEverythingParameter() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIOPatientEverything.PatientEverythingParameter
- pin() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
-
Pin this object.
- pollFor(Duration) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery.PollingAssertion
- pollJob(JobReference, int) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Waits until the job is done, then returns the job.
- pollJob(JobReference, int) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- pollOperation(Operation, Long) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
- pollOperation(Operation, Long) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageTableSource
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- populateDisplayData(DisplayData.Builder) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
- PostProcessingMetricsDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
A DoFn class to gather metrics about the emitted
DataChangeRecord
s. - PostProcessingMetricsDoFn(ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.PostProcessingMetricsDoFn
- PrepareWrite<InputT,DestinationT extends @NonNull java.lang.Object,OutputT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Prepare an input
PCollection
for writing to BigQuery. - PrepareWrite(DynamicDestinations<InputT, DestinationT>, SerializableFunction<InputT, OutputT>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite
- previous(Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
- primary() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- process(PipelineOptions, KV<String, StorageApiFlushAndFinalizeDoFn.Operation>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiFlushAndFinalizeDoFn
- processElement(PubSubMessage) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
- processElement(Iterable<KV<DestinationT, WriteTables.Result>>, DoFn.ProcessContext, BoundedWindow) - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
- processElement(DataChangeRecord, DoFn.OutputReceiver<DataChangeRecord>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.PostProcessingMetricsDoFn
-
Stage to measure a data record's latencies and metrics.
- processElement(PartitionMetadata, RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<DataChangeRecord>, ManualWatermarkEstimator<Instant>, DoFn.BundleFinalizer) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
-
Performs a change stream query for a given partition.
- processElement(DoFn.OutputReceiver<Void>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.CleanUpReadChangeStreamDoFn
- processElement(DoFn.OutputReceiver<PartitionMetadata>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.InitializeDoFn
- processElement(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read.FetchHL7v2Message.HL7v2MessageGetFn
-
Process element.
- processElement(DoFn.ProcessContext, PipelineOptions, KV<DestinationT, ElementT>, DoFn.MultiOutputReceiver) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages.ConvertMessagesDoFn
- processElement(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
- processElement(DoFn.ProcessContext) - Method in class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
-
Emit only as many elements as the exponentially increasing budget allows.
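An exponentially increasing budget of this kind typically starts at a base write rate and multiplies it by a constant factor on a fixed schedule (as in Datastore's "500/50/5" ramp-up guidance). A sketch of such a budget curve; the base rate, growth factor, and interval below are illustrative assumptions, not necessarily what RampupThrottlingFn uses:

```java
public class RampUpBudget {
    // Illustrative ramp-up: start at baseBudget writes/sec and multiply
    // by 1.5 after each completed interval.
    static double budget(double baseBudget, long elapsedSeconds, long intervalSeconds) {
        long steps = Math.max(0, elapsedSeconds / intervalSeconds);
        return baseBudget * Math.pow(1.5, steps);
    }

    public static void main(String[] args) {
        System.out.println(budget(500, 0, 300));    // 500.0
        System.out.println(budget(500, 600, 300));  // 1125.0
    }
}
```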
- processElement(RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<PartitionMetadata>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
-
Main processing function for the
DetectNewPartitionsDoFn
function. - project - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- projectId - Variable in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- projectPathFromId(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- projectPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- ProtoFromBytes<T extends com.google.protobuf.Message> - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- protoModeToJsonMode(TableFieldSchema.Mode) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- protoSchemaToTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- protoTableFieldToTableField(TableFieldSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- ProtoToBytes<T extends com.google.protobuf.Message> - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- ProtoToBytes() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.ProtoToBytes
- protoTypeToJsonType(TableFieldSchema.Type) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- publish(List<PubsubMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Publish messages to
TestPubsub.topicPath()
. - publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Publish
outgoingMessages
to Pubsubtopic
. - publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- publish(PubsubClient.TopicPath, List<PubsubClient.OutgoingMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- PublisherOptions - Class in org.apache.beam.sdk.io.gcp.pubsublite
-
Options needed for a Pub/Sub Lite Publisher.
- PublisherOptions() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions
- PublisherOptions.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
- PubsubClient - Class in org.apache.beam.sdk.io.gcp.pubsub
-
An (abstract) helper class for talking to Pubsub via an underlying transport.
- PubsubClient() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- PubsubClient.OutgoingMessage - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A message to be sent to Pubsub.
- PubsubClient.ProjectPath - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Path representing a cloud project id.
- PubsubClient.PubsubClientFactory - Interface in org.apache.beam.sdk.io.gcp.pubsub
-
Factory for creating clients.
- PubsubClient.SubscriptionPath - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Path representing a Pubsub subscription.
- PubsubClient.TopicPath - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Path representing a Pubsub topic.
- PubsubCoderProviderRegistrar - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A
CoderProviderRegistrar
for standard types used withPubsubIO
. - PubsubCoderProviderRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubCoderProviderRegistrar
- PubsubDlqProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubDlqProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubDlqProvider
- PubsubGrpcClient - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A helper class for talking to Pubsub via gRPC.
- PubsubIO - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Read and Write
PTransform
s for Cloud Pub/Sub streams. - PubsubIO.PubsubSubscription - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Class representing a Cloud Pub/Sub Subscription.
- PubsubIO.PubsubTopic - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Class representing a Cloud Pub/Sub Topic.
- PubsubIO.Read<T> - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Implementation of read methods.
- PubsubIO.Write<T> - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Implementation of write methods.
- PubsubIO.Write.PubsubBoundedWriter - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Writer to Pubsub which batches messages from bounded collections.
- PubsubJsonClient - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A Pubsub client using JSON transport.
- PubsubLiteIO - Class in org.apache.beam.sdk.io.gcp.pubsublite
-
I/O transforms for reading from Google Pub/Sub Lite.
- PubsubLiteSink - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
A sink which publishes messages to Pub/Sub Lite.
- PubsubLiteSink(PublisherOptions) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
- PubsubMessage - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Class representing a Pub/Sub message.
- PubsubMessage(byte[], Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- PubsubMessage(byte[], Map<String, String>, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- PubsubMessage(byte[], Map<String, String>, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
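A short sketch of the two-argument constructor listed above (payload bytes plus an attribute map); the payload and attribute values are made up for illustration:

```java
import java.nio.charset.StandardCharsets;
import java.util.Collections;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;

public class PubsubMessageExample {
  public static void main(String[] args) {
    byte[] payload = "order-created".getBytes(StandardCharsets.UTF_8);
    // Attach a single attribute alongside the raw payload.
    PubsubMessage message = new PubsubMessage(
        payload, Collections.singletonMap("eventType", "order"));
    // Payload and attributes are readable back off the message.
    System.out.println(new String(message.getPayload(), StandardCharsets.UTF_8));
    System.out.println(message.getAttribute("eventType"));
  }
}
```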
- PubsubMessagePayloadOnlyCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A coder for PubsubMessage treating the raw bytes being decoded as the message's payload.
- PubsubMessagePayloadOnlyCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessagePayloadOnlyCoder
- PubsubMessages - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Common util functions for converting between PubsubMessage proto and
PubsubMessage
. - PubsubMessages.DeserializeBytesIntoPubsubMessagePayloadOnly - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubMessages.ParsePayloadAsPubsubMessageProto - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubMessages.ParsePubsubMessageProtoAsPayload - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A coder for PubsubMessage including all fields of a PubSub message from server.
- PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdAndOrderingKeyCoder
- PubsubMessageWithAttributesAndMessageIdCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A coder for PubsubMessage including attributes and the message id from the PubSub server.
- PubsubMessageWithAttributesAndMessageIdCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesAndMessageIdCoder
- PubsubMessageWithAttributesCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A coder for PubsubMessage including attributes.
- PubsubMessageWithAttributesCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder
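When a PCollection of PubsubMessage needs an explicit coder (for example between PubsubIO.readMessagesWithAttributes() and a custom DoFn), the coder above can be set directly. A sketch, where `messages` is a hypothetical PCollection&lt;PubsubMessage&gt;:

```java
import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithAttributesCoder;
import org.apache.beam.sdk.values.PCollection;

// Ensure downstream stages serialize both payload and attributes.
PCollection<PubsubMessage> withCoder =
    messages.setCoder(PubsubMessageWithAttributesCoder.of());
```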
- PubsubMessageWithMessageIdCoder - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A coder for PubsubMessage treating the raw bytes being decoded as the message's payload, with the message id from the PubSub server.
- PubsubMessageWithMessageIdCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessageWithMessageIdCoder
- PubsubOptions - Interface in org.apache.beam.sdk.io.gcp.pubsub
-
Properties that can be set when using Google Cloud Pub/Sub with the Apache Beam SDK.
- PubSubPayloadTranslation - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubSubPayloadTranslation() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation
- PubSubPayloadTranslation.ReadRegistrar - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubSubPayloadTranslation.WriteRegistrar - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubSchemaIOProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
-
An implementation of
SchemaIOProvider
for reading and writing JSON/AVRO payloads with PubsubIO
. - PubsubSchemaIOProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- PubsubSchemaTransformReadConfiguration - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Configuration for reading from Pub/Sub.
- PubsubSchemaTransformReadConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration
- PubsubSchemaTransformReadConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.pubsub
- PubsubSchemaTransformReadProvider - Class in org.apache.beam.sdk.io.gcp.pubsub
-
An implementation of
TypedSchemaTransformProvider
for Pub/Sub reads configured using PubsubSchemaTransformReadConfiguration
. - PubsubSchemaTransformReadProvider() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadProvider
- PubsubSchemaTransformWriteConfiguration - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Configuration for writing to Pub/Sub.
- PubsubSchemaTransformWriteConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformWriteConfiguration
- PubsubSchemaTransformWriteConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Builder for
PubsubSchemaTransformWriteConfiguration
. - PubsubTestClient - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A (partial) implementation of
PubsubClient
for use by unit tests. - PubsubTestClient() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- PubsubTestClient.PubsubTestClientFactory - Interface in org.apache.beam.sdk.io.gcp.pubsub
-
Closing the factory will validate all expected messages were processed.
- PubsubUnboundedSink - Class in org.apache.beam.sdk.io.gcp.pubsub
-
A PTransform which streams messages to Pubsub.
- PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
- PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int, int, int) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
- PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int, int, int, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
- PubsubUnboundedSink(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.TopicPath>, String, String, int, String) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSink
- PubsubUnboundedSource - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Users should use
PubsubIO#read
instead. - PubsubUnboundedSource(Clock, PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.ProjectPath>, ValueProvider<PubsubClient.TopicPath>, ValueProvider<PubsubClient.SubscriptionPath>, String, String, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Construct an unbounded source to consume from the Pubsub
subscription
. - PubsubUnboundedSource(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.ProjectPath>, ValueProvider<PubsubClient.TopicPath>, ValueProvider<PubsubClient.SubscriptionPath>, String, String, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Construct an unbounded source to consume from the Pubsub
subscription
. - PubsubUnboundedSource(PubsubClient.PubsubClientFactory, ValueProvider<PubsubClient.ProjectPath>, ValueProvider<PubsubClient.TopicPath>, ValueProvider<PubsubClient.SubscriptionPath>, String, String, boolean, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource
-
Construct an unbounded source to consume from the Pubsub
subscription
. - pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
-
Request the next batch of up to
batchSize
messages from subscription
. - pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubGrpcClient
- pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubJsonClient
- pull(long, PubsubClient.SubscriptionPath, int, boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubTestClient
- putSchemaIfAbsent(TableReference, TableSchema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
-
Registers schema for a table if one is not already present.
Q
- QUERY_CHANGE_STREAM - org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionMode
- QUERY_COUNT - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Counter for the total number of queries issued during the execution of the Connector.
- queryChangeStream(Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionPosition
- queryChangeStream(Timestamp, Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- queryChangeStreamAction(ChangeStreamDao, PartitionMetadataDao, ChangeStreamRecordMapper, PartitionMetadataMapper, DataChangeRecordAction, HeartbeatRecordAction, ChildPartitionsRecordAction, ChangeStreamMetrics) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ActionFactory
-
Creates and returns a single instance of an action class capable of performing a change stream query for a given partition.
- QueryChangeStreamAction - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.action
-
Main action class for querying a partition change stream.
- queryResultHasChecksum(String) - Static method in class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher
- queryUnflattened(String, String, boolean, boolean) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
-
Performs a query without flattening results.
- queryWithRetries(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- queryWithRetries(String, String, boolean) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
- queryWithRetriesUsingStandardSql(String, String) - Method in class org.apache.beam.sdk.io.gcp.testing.BigqueryClient
R
- RampupThrottlingFn<T> - Class in org.apache.beam.sdk.io.gcp.datastore
-
An implementation of a client-side throttler that enforces a gradual ramp-up, broadly in line with Datastore best practices.
- RampupThrottlingFn(int, PCollectionView<Instant>) - Constructor for class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
- RampupThrottlingFn(ValueProvider<Integer>, PCollectionView<Instant>) - Constructor for class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
- random() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
- range - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- read() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Deprecated. Use
BigQueryIO.read(SerializableFunction)
or BigQueryIO.readTableRows()
instead. BigQueryIO.readTableRows()
does exactly the same as BigQueryIO.read()
; however, BigQueryIO.read(SerializableFunction)
performs better. - read() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
-
Creates an uninitialized
BigtableIO.Read
. - read() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
-
Returns an empty
DatastoreV1.Read
builder. - read() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1
-
The class returned by this method provides the ability to create
PTransforms
for read operations available in the Firestore V1 API provided by FirestoreStub
. - read() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
-
Creates an uninitialized instance of
SpannerIO.Read
. - read(Object, Decoder) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder.TimestampEncoding
-
Deserializes a
Timestamp
from the given Decoder
. - read(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from a single store.
- read(SubscriberOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
-
Read messages from Pub/Sub Lite.
- read(ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from a single store.
- read(SerializableFunction<SchemaAndRecord, T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Reads from a BigQuery table or query and returns a
PCollection
with one element for each row of the table or query result, parsed from the BigQuery AVRO format using the specified function. - Read() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
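A sketch of the non-deprecated read(SerializableFunction) form, parsing one field out of each Avro record; the table and field names are hypothetical:

```java
import org.apache.beam.sdk.coders.StringUtf8Coder;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord;
import org.apache.beam.sdk.values.PCollection;

// Parse each row's "name" field out of the Avro GenericRecord.
PCollection<String> names = pipeline.apply(
    BigQueryIO.read(
            (SchemaAndRecord elem) -> elem.getRecord().get("name").toString())
        .from("my-project:my_dataset.my_table")
        .withCoder(StringUtf8Coder.of()));
```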
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Read
-
Instantiates a new Read.
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
- Read() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- READ_URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
- READ_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- readAll() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
-
A
PTransform
that works like SpannerIO.read()
, but executes read operations coming from a PCollection
. - readAll(List<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from multiple stores.
- readAll(ValueProvider<List<String>>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from multiple stores.
- ReadAll() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- readAllWithFilter(List<String>, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from multiple stores matching a filter.
- readAllWithFilter(ValueProvider<List<String>>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from multiple stores matching a filter.
- readAvroGenericRecords(Schema) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that continuously reads binary encoded Avro messages into the AvroGenericRecord
type. - readAvros(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that continuously reads binary encoded Avro messages of the given type from a Google Cloud Pub/Sub stream. - readAvrosWithBeamSchema(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that continuously reads binary encoded Avro messages of the specific type. - ReadBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.ReadBuilder
- ReadBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder
- readCallMetric(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- readChangeStream() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
-
Creates an uninitialized instance of
SpannerIO.ReadChangeStream
. - ReadChangeStream() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
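A sketch of configuring the change stream read above; the project, instance, database, and stream names are placeholders:

```java
import com.google.cloud.Timestamp;
import org.apache.beam.sdk.io.gcp.spanner.SpannerConfig;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
import org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord;
import org.apache.beam.sdk.values.PCollection;

// Stream data change records from a Spanner change stream,
// starting at the current time.
PCollection<DataChangeRecord> records = pipeline.apply(
    SpannerIO.readChangeStream()
        .withSpannerConfig(SpannerConfig.create()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withDatabaseId("my-database"))
        .withChangeStreamName("my_change_stream")
        .withInclusiveStartAt(Timestamp.now()));
```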
- ReadChangeStreamPartitionDoFn - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn
-
An SDF (Splittable DoFn) class which is responsible for performing a change stream query for a given partition.
- ReadChangeStreamPartitionDoFn(DaoFactory, MapperFactory, ActionFactory, ChangeStreamMetrics) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
-
This class needs a
DaoFactory
to build DAOs to access the partition metadata tables and to perform the change stream query. - ReadChangeStreamPartitionRangeTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
This restriction tracker delegates most of its behavior to an internal
TimestampRangeTracker
. - ReadChangeStreamPartitionRangeTracker(PartitionMetadata, TimestampRange) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.ReadChangeStreamPartitionRangeTracker
-
Receives the partition that will be queried and the timestamp range that belongs to it.
- readFhirResource(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Read FHIR resource HTTP body.
- readFhirResource(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- readMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that continuously reads from a Google Cloud Pub/Sub stream. - readMessagesWithAttributes() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that continuously reads from a Google Cloud Pub/Sub stream. - readMessagesWithAttributesAndMessageId() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that continuously reads from a Google Cloud Pub/Sub stream. - readMessagesWithAttributesAndMessageIdAndOrderingKey() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that continuously reads from a Google Cloud Pub/Sub stream. - readMessagesWithCoderAndParseFn(Coder<T>, SimpleFunction<PubsubMessage, T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that continuously reads from a Google Cloud Pub/Sub stream, mapping each PubsubMessage
into type T using the supplied parse function and coder. - readMessagesWithMessageId() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that continuously reads from a Google Cloud Pub/Sub stream. - ReadOperation - Class in org.apache.beam.sdk.io.gcp.spanner
-
Encapsulates a Spanner read operation.
- ReadOperation() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
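ReadOperation pairs with SpannerIO.readAll(): each element of the input PCollection describes one read to execute. A sketch, where the query and `spannerConfig` are hypothetical:

```java
import org.apache.beam.sdk.io.gcp.spanner.ReadOperation;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
import org.apache.beam.sdk.transforms.Create;

// Describe a single query to run against Spanner.
ReadOperation op = ReadOperation.create()
    .withQuery("SELECT id, name FROM users");
pipeline
    .apply(Create.of(op))
    .apply(SpannerIO.readAll()
        .withSpannerConfig(spannerConfig)); // spannerConfig built elsewhere
```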
- readProtoDynamicMessages(Descriptors.Descriptor) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Similar to
PubsubIO.readProtoDynamicMessages(ProtoDomain, String)
but for when the Descriptors.Descriptor
is already known. - readProtoDynamicMessages(ProtoDomain, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that continuously reads binary encoded protobuf messages for the type specified by fullMessageName
. - readProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that continuously reads binary encoded protobuf messages of the given type from a Google Cloud Pub/Sub stream. - ReadRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.ReadRegistrar
- readResources() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Read resources from a PCollection of resource IDs (e.g.
- readRows(ReadRowsRequest) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
-
Read rows in the context of a specific read stream.
- readRows(ReadRowsRequest, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
- readStrings() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that continuously reads UTF-8 encoded strings from a Google Cloud Pub/Sub stream. - readStudyMetadata() - Static method in class org.apache.beam.sdk.io.gcp.healthcare.DicomIO
- readTableRows() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Like
BigQueryIO.read(SerializableFunction)
but represents each row as a TableRow
. - readTableRowsWithSchema() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Like
BigQueryIO.readTableRows()
but with Schema
support. - readWithDatumReader(AvroSource.DatumReaderFactory<T>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
Reads from a BigQuery table or query and returns a
PCollection
with one element for each row of the table or query result. - readWithFilter(String, String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from a single store matching a filter.
- readWithFilter(ValueProvider<String>, ValueProvider<String>) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO
-
Read all HL7v2 Messages from a single store matching a filter.
- recordId() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
-
If using an id attribute, the record id to associate with this record's metadata so the receiver can reject duplicates.
- refreshSchema(TableReference, BigQueryServices.DatasetService) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
- refreshThread() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableSchemaCache
- ReifyAsIterable<T> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
This transform turns a side input into a singleton PCollection that can be used as the main input for another transform.
- ReifyAsIterable() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.ReifyAsIterable
- REJECT - org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.TruncateTimestamps
-
Reject timestamps with greater-than-millisecond precision.
- REPLACE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- ReplaceBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReplaceBuilder
- REPORT_FAILURES - org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
-
Invalid mutations will be returned as part of the result of the write transform.
- requiresDataSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaIOProvider
-
Indicates whether this transform requires a specified data schema.
- requiresDataSchema() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider
- requiresDataSchema() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaIOProvider
- RESOURCE - org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
-
The source file contains one or more lines of newline-delimited JSON (ndjson).
- restriction - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionTracker
- retrieveDicomStudyMetadata(String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
- retrieveDicomStudyMetadata(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- retryTransientErrors() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
-
Retry all failures except for known persistent errors.
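A sketch of attaching this retry policy to a streaming BigQuery write; the table name and `rows` PCollection are hypothetical:

```java
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy;

// Retry transient insert failures; give up on known persistent errors.
rows.apply(
    BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.my_table")
        .withFailedInsertRetryPolicy(InsertRetryPolicy.retryTransientErrors())
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));
```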
- RowToEntity - Class in org.apache.beam.sdk.io.gcp.datastore
-
A
PTransform
to perform a conversion ofRow
toEntity
. - RowUtils - Class in org.apache.beam.sdk.io.gcp.bigtable
- RowUtils() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- RpcQosOptions - Class in org.apache.beam.sdk.io.gcp.firestore
-
Quality of Service manager options for Firestore RPCs.
- RpcQosOptions.Builder - Class in org.apache.beam.sdk.io.gcp.firestore
-
Mutable Builder class for creating instances of
RpcQosOptions
. - run(PartitionMetadata, ChildPartitionsRecord, RestrictionTracker<TimestampRange, Timestamp>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.ChildPartitionsRecordAction
-
This is the main processing function for a
ChildPartitionsRecord
. - run(PartitionMetadata, DataChangeRecord, RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<DataChangeRecord>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DataChangeRecordAction
-
This is the main processing function for a
DataChangeRecord
. - run(PartitionMetadata, HeartbeatRecord, RestrictionTracker<TimestampRange, Timestamp>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.HeartbeatRecordAction
-
This is the main processing function for a
HeartbeatRecord
. - run(PartitionMetadata, RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<DataChangeRecord>, ManualWatermarkEstimator<Instant>, DoFn.BundleFinalizer) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.QueryChangeStreamAction
-
This method dispatches a change stream query for the given partition, delegates the processing of the records to one of the corresponding registered action classes, and keeps the state of the partition up to date in the Connector's metadata table.
- run(RestrictionTracker<TimestampRange, Timestamp>, DoFn.OutputReceiver<PartitionMetadata>, ManualWatermarkEstimator<Instant>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.action.DetectNewPartitionsAction
-
Executes the main logic to schedule new partitions.
- runInTransaction(Function<PartitionMetadataDao.InTransactionContext, T>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Runs a given function in a transaction context.
- RUNNING - org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
- runQuery() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.Read
-
Factory method to create a new type safe builder for
RunQueryRequest
operations.
S
- SAMPLE_PARTITION - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
We use a bogus partition here to estimate the average size of a partition metadata record.
- SCHEDULED - org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
- schema() - Method in class org.apache.beam.sdk.io.gcp.datastore.DataStoreV1SchemaIOProvider.DataStoreV1SchemaIO
- SchemaAndRecord - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A wrapper for a
GenericRecord
and theTableSchema
representing the schema of the table (or query) it was generated from. - SchemaAndRecord(GenericRecord, TableSchema) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord
- SchemaConversionOptions() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions
- schemaToProtoTableSchema(TableSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- SEARCH - org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod
- searchFhirResource(String, String, Map<String, Object>, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
-
Search FHIR resource HTTP body.
- searchFhirResource(String, String, Map<String, Object>, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- searchResources(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Search resources from a FHIR store with String parameter values.
- searchResourcesWithGenericParameters(String) - Static method in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO
-
Search resources from a FHIR store with any type of parameter values.
- seriesId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- setAveragePartitionBytesSize(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
-
Sets the average partition bytes size to estimate the backlog of this DoFn.
- setBatching(Boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setBigQueryProject(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setBqStreamingApiLoggingFrequencySec(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setCreatedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the time at which the partition was created.
- setCreateDisposition(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteConfiguration.Builder
-
Specifies whether the table should be created if it does not exist.
- setDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
- setDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
- setDataSchema(Schema) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration.Builder
-
The expected schema of the Pub/Sub message.
- setDeadLetterQueue(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration.Builder
-
The Pub/Sub topic path to write failures.
- setDeduplicate(Deduplicate.KeyedValues<Uuid, SequencedMessage>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
-
Set the deduplication transform.
- setEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
- setEmulatorHost(String) - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
Define a host port pair to allow connecting to a Cloud Firestore emulator instead of the live service.
- setEndTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the end time of the partition.
- setFinishedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the time at which the partition finished running.
- setFirestoreDb(String) - Method in interface org.apache.beam.sdk.io.gcp.firestore.FirestoreOptions
-
Set the Firestore database ID to connect to.
- setFormat(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration.Builder
-
The expected format of the Pub/Sub message.
- setFormat(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformWriteConfiguration.Builder
-
The expected format of the Pub/Sub message.
- setHeartbeatMillis(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the heartbeat interval in millis.
- setHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
- setHTTPWriteTimeout(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration.Builder
-
When reading from Cloud Pub/Sub where unique record identifiers are provided as Pub/Sub message attributes, specifies the name of the attribute containing the unique identifier.
- setIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformWriteConfiguration.Builder
-
When reading from Cloud Pub/Sub where unique record identifiers are provided as Pub/Sub message attributes, specifies the name of the attribute containing the unique identifier.
- setIdLabel(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
- setIdLabel(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
- setInferMaps(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.SchemaConversionOptions.Builder
- setInsertBundleParallelism(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
- setInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
- setMaxBufferingDurationMilliSec(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setMaxStreamingBatchSize(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setMaxStreamingRowsToBatch(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setMetadataTable(String) - Method in interface org.apache.beam.sdk.io.gcp.spanner.SpannerIO.SpannerChangeStreamOptions
-
Specifies the name of the metadata table.
- setNumFailuresExpected(int) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- setNumStorageWriteApiStreamAppendClients(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setNumStorageWriteApiStreams(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setNumStreamingKeys(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setParentTokens(HashSet<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the collection of parent partition identifiers.
- setPartitionToken(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the unique partition identifier.
- setProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.CrossLanguageConfiguration
- setProtoClass(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration.Builder
-
Used by the ProtoPayloadSerializerProvider when deserializing a Pub/Sub message.
- setPubsubRootUrl(String) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
- setQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration.Builder
-
Configures the BigQuery read job with the SQL query.
- setQuery(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- setQueryLocation(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration.Builder
-
BigQuery geographic location where the query job will be executed.
- setReadTimestamp(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setRowRestriction(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- setRunningAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the time at which the partition started running.
- setScheduledAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the time at which the partition was scheduled.
- setSchema(byte[]) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setSchematizedData(String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
- setSelectedFields(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- setShouldFailRow(Function<TableRow, Boolean>) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- setSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
- setSql(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setStaleness(Long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setStartTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the start time of the partition.
- setState(PartitionMetadata.State) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the current state of the partition.
- setStorageApiAppendThresholdBytes(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setStorageApiAppendThresholdRecordCount(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setStorageWriteApiMaxRequestSize(Long) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setStorageWriteApiTriggeringFrequencySec(Integer) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
- setSubscription(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration.Builder
-
The subscription from which to read Pub/Sub messages.
- setSubscriptionPath(SubscriptionPath) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions.Builder
- setTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setTableId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder
- setTableSpec(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration.Builder
-
Specifies a table for a BigQuery read job.
- setTableSpec(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteConfiguration.Builder
-
Writes to the given table specification.
- setTableSpec(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration.Builder
- setTargetDataset(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.TestBigQueryOptions
- setTempDatasetId(String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setThriftClass(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration.Builder
-
Used by the ThriftPayloadSerializerProvider when deserializing a Pub/Sub message.
- setThriftProtocolFactoryClass(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration.Builder
-
Used by the ThriftPayloadSerializerProvider when deserializing a Pub/Sub message.
- setThroughputEstimator(BytesThroughputEstimator<DataChangeRecord>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
-
Sets the estimator to calculate the backlog of this function.
- setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
- setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
- setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration.Builder
-
When reading from Cloud Pub/Sub where record timestamps are provided as Pub/Sub message attributes, specifies the name of the attribute that contains the timestamp.
- setTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformWriteConfiguration.Builder
-
When writing to Cloud Pub/Sub where record timestamps are configured as Pub/Sub message attributes, specifies the name of the attribute that contains the timestamp.
- setTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction.Builder
- setTimestampBoundMode(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setTimeSupplier(Supplier<Timestamp>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionProgressChecker
- setTimeSupplier(Supplier<Timestamp>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionTracker
- setTimeSupplier(Supplier<Timestamp>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- setTimeUnit(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.ReadBuilder.Configuration
- setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
- setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.Configuration
- setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformReadConfiguration.Builder
-
The topic from which to read Pub/Sub messages.
- setTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubSchemaTransformWriteConfiguration.Builder
-
The topic to which to write Pub/Sub messages.
- setTopicPath(TopicPath) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions.Builder
- setTruncateTimestamps(BigQueryUtils.ConversionOptions.TruncateTimestamps) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.Builder
- setup() - Method in class org.apache.beam.sdk.io.gcp.datastore.RampupThrottlingFn
- setup() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.DetectNewPartitionsDoFn
-
Obtains the instance of DetectNewPartitionsAction.
- setup() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dofn.ReadChangeStreamPartitionDoFn
-
Constructs instances for the PartitionMetadataDao, ChangeStreamDao, ChangeStreamRecordMapper, PartitionMetadataMapper, DataChangeRecordAction, HeartbeatRecordAction, ChildPartitionsRecordAction and QueryChangeStreamAction.
- setUp() - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- setUp() - Static method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- setUseStandardSql(Boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformReadConfiguration.Builder
-
Enables BigQuery's Standard SQL dialect when reading from a query.
- setUseStorageApiConnectionPool(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setUseStorageWriteApi(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setUseStorageWriteApiAtLeastOnce(Boolean) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryOptions
- setUuidExtractor(SerializableFunction<SequencedMessage, Uuid>) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions.Builder
- setWatermark(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.Builder
-
Sets the watermark (last processed timestamp) for the partition.
- setWithAttributes(Boolean) - Method in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead.Configuration
- setWriteDisposition(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaTransformWriteConfiguration.Builder
-
Specifies what to do with existing data in the table, in case the table already exists.
- shouldRetry(InsertRetryPolicy.Context) - Method in class org.apache.beam.sdk.io.gcp.bigquery.InsertRetryPolicy
-
Return true if this failure should be retried.
- sideInput(PCollectionView<SideInputT>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations
-
Returns the value of a given side input.
- signalStart() - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
Outputs a message that the pipeline has started.
- signalSuccessWhen(Coder<T>, SerializableFunction<Set<T>, Boolean>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
Invocation of TestPubsubSignal.signalSuccessWhen(Coder, SerializableFunction, SerializableFunction) with Object.toString() as the formatter.
- signalSuccessWhen(Coder<T>, SerializableFunction<T, String>, SerializableFunction<Set<T>, Boolean>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
Outputs a success message when successPredicate evaluates to true.
- size() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- SizeEstimator<T> - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
-
This class is used to estimate the size in bytes of a given element.
- SizeEstimator(Coder<T>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.SizeEstimator
- sizeOf(T) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.SizeEstimator
-
Estimates the size in bytes of the given element with the configured Coder.
- skipInvalidRows() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Insert all valid rows of a request, even if invalid rows exist.
- SpannerAccessor - Class in org.apache.beam.sdk.io.gcp.spanner
-
Manages lifecycle of
DatabaseClient
andSpanner
instances. - SpannerConfig - Class in org.apache.beam.sdk.io.gcp.spanner
-
Configuration for a Cloud Spanner client.
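As an illustration of how this configuration is typically used (a hedged sketch — the project, instance, database and query here are placeholders, not values from this index), a SpannerConfig can be built once and passed to a SpannerIO read transform:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.spanner.SpannerConfig;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class SpannerConfigSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Placeholder project/instance/database identifiers.
    SpannerConfig config =
        SpannerConfig.create()
            .withProjectId("my-project")
            .withInstanceId("my-instance")
            .withDatabaseId("my-database");

    // Read rows with a SQL query; the same config can back a write transform.
    p.apply(
        SpannerIO.read()
            .withSpannerConfig(config)
            .withQuery("SELECT id, name FROM users"));

    p.run().waitUntilFinish();
  }
}
```

Running this requires the beam-sdks-java-io-google-cloud-platform dependency and a pipeline runner on the classpath.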
- SpannerConfig() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- SpannerConfig.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
-
Builder for SpannerConfig.
- SpannerIO - Class in org.apache.beam.sdk.io.gcp.spanner
-
Experimental Transforms for reading from and writing to Google Cloud Spanner.
- SpannerIO.CreateTransaction - Class in org.apache.beam.sdk.io.gcp.spanner
-
A PTransform that creates a transaction.
- SpannerIO.CreateTransaction.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
-
A builder for SpannerIO.CreateTransaction.
- SpannerIO.FailureMode - Enum in org.apache.beam.sdk.io.gcp.spanner
-
A failure handling strategy.
- SpannerIO.Read - Class in org.apache.beam.sdk.io.gcp.spanner
-
Implementation of SpannerIO.read().
- SpannerIO.ReadAll - Class in org.apache.beam.sdk.io.gcp.spanner
-
Implementation of SpannerIO.readAll().
- SpannerIO.ReadChangeStream - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerIO.SpannerChangeStreamOptions - Interface in org.apache.beam.sdk.io.gcp.spanner
-
Interface to display the name of the metadata table on Dataflow UI.
- SpannerIO.Write - Class in org.apache.beam.sdk.io.gcp.spanner
-
A PTransform that writes Mutation objects to Google Cloud Spanner.
- SpannerIO.WriteGrouped - Class in org.apache.beam.sdk.io.gcp.spanner
-
Same as SpannerIO.Write but supports grouped mutations.
- SpannerTransformRegistrar - Class in org.apache.beam.sdk.io.gcp.spanner
-
Exposes SpannerIO.WriteRows and SpannerIO.ReadRows as an external transform for cross-language usage.
- SpannerTransformRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- SpannerTransformRegistrar.CrossLanguageConfiguration - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerTransformRegistrar.DeleteBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerTransformRegistrar.InsertBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerTransformRegistrar.InsertOrUpdateBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerTransformRegistrar.ReadBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerTransformRegistrar.ReadBuilder.Configuration - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerTransformRegistrar.ReplaceBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerTransformRegistrar.UpdateBuilder - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerWriteResult - Class in org.apache.beam.sdk.io.gcp.spanner
-
The results of a SpannerIO.write() transform.
- SpannerWriteResult(Pipeline, PCollection<Void>, PCollection<MutationGroup>, TupleTag<MutationGroup>) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult
- SpannerWriteSchemaTransformConfiguration() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration
- SpannerWriteSchemaTransformProvider - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerWriteSchemaTransformProvider() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerWriteSchemaTransformProvider
- SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration - Class in org.apache.beam.sdk.io.gcp.spanner
- SpannerWriteSchemaTransformProvider.SpannerWriteSchemaTransformConfiguration.Builder - Class in org.apache.beam.sdk.io.gcp.spanner
- split(int, PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedSourceImpl
- splitReadStream(SplitReadStreamRequest) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
- splitReadStream(SplitReadStreamRequest, String) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StorageClient
- start() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.UnboundedReaderImpl
- startBundle() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.PubsubLiteSink
- startBundle(DoFn.StartBundleContext) - Method in class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
- startBundle(DoFn.StartBundleContext) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write.PubsubBoundedWriter
- startCopyJob(JobReference, JobConfigurationTableCopy) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Start a BigQuery copy job.
- startCopyJob(JobReference, JobConfigurationTableCopy) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- startExtractJob(JobReference, JobConfigurationExtract) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Start a BigQuery extract job.
- startExtractJob(JobReference, JobConfigurationExtract) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- startLoadJob(JobReference, JobConfigurationLoad) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Start a BigQuery load job.
- startLoadJob(JobReference, JobConfigurationLoad) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- startLoadJob(JobReference, JobConfigurationLoad, AbstractInputStreamContent) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Start a BigQuery load job with stream content.
- startLoadJob(JobReference, JobConfigurationLoad, AbstractInputStreamContent) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- startQueryJob(JobReference, JobConfigurationQuery) - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.JobService
-
Start a BigQuery query job.
- startQueryJob(JobReference, JobConfigurationQuery) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeJobService
- stop() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionPosition
- stop(PartitionRestriction) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- STOP - org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionMode
- STORAGE_API_AT_LEAST_ONCE - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
Use the new Storage Write API without exactly-once semantics enabled.
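The Storage Write API methods in this enum are selected on BigQueryIO.Write via withMethod. A minimal sketch (the tablespec and row content are placeholders, and TableRowJsonCoder is set explicitly since Create cannot infer a coder for TableRow):

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;

public class StorageWriteApiSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply(Create.of(new TableRow().set("name", "beam")).withCoder(TableRowJsonCoder.of()))
        .apply(
            BigQueryIO.writeTableRows()
                .to("my-project:my_dataset.my_table") // placeholder tablespec
                // Exactly-once semantics; use STORAGE_API_AT_LEAST_ONCE to trade
                // the exactly-once guarantee for lower latency.
                .withMethod(BigQueryIO.Write.Method.STORAGE_WRITE_API)
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER));

    p.run().waitUntilFinish();
  }
}
```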
- STORAGE_STATS - org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
- STORAGE_WRITE_API - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
Use the new, exactly-once Storage Write API.
- StorageApiConvertMessages<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
- StorageApiConvertMessages<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A transform that converts messages to protocol buffers in preparation for writing to BigQuery.
- StorageApiConvertMessages(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryServices, TupleTag<BigQueryStorageApiInsertError>, TupleTag<KV<DestinationT, StorageApiWritePayload>>, Coder<BigQueryStorageApiInsertError>, Coder<KV<DestinationT, StorageApiWritePayload>>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages
- StorageApiConvertMessages.ConvertMessagesDoFn<DestinationT extends @NonNull java.lang.Object,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
- StorageApiDynamicDestinationsTableRow<T,DestinationT extends @NonNull java.lang.Object> - Class in org.apache.beam.sdk.io.gcp.bigquery
- StorageApiFlushAndFinalizeDoFn - Class in org.apache.beam.sdk.io.gcp.bigquery
-
This DoFn flushes and optionally (if requested) finalizes Storage API streams.
- StorageApiFlushAndFinalizeDoFn(BigQueryServices) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiFlushAndFinalizeDoFn
- StorageApiLoads<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
This PTransform manages loads into BigQuery using the Storage API.
- StorageApiLoads(Coder<DestinationT>, StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryIO.Write.CreateDisposition, String, Duration, BigQueryServices, int, boolean, boolean) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiLoads
- StorageApiWritePayload - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Class used to wrap elements being sent to the Storage API sinks.
- StorageApiWritePayload() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritePayload
- StorageApiWriteRecordsInconsistent<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A transform to write sharded records to BigQuery using the Storage API.
- StorageApiWriteRecordsInconsistent(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryServices, TupleTag<BigQueryStorageApiInsertError>, Coder<BigQueryStorageApiInsertError>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteRecordsInconsistent
- StorageApiWritesShardedRecords<DestinationT extends @NonNull java.lang.Object,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A transform to write sharded records to BigQuery using the Storage API.
- StorageApiWritesShardedRecords(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryIO.Write.CreateDisposition, String, BigQueryServices, Coder<DestinationT>, Coder<BigQueryStorageApiInsertError>, TupleTag<BigQueryStorageApiInsertError>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritesShardedRecords
- StorageApiWriteUnshardedRecords<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Write records to the Storage API using a standard batch approach.
- StorageApiWriteUnshardedRecords(StorageApiDynamicDestinations<ElementT, DestinationT>, BigQueryServices, TupleTag<BigQueryStorageApiInsertError>, Coder<BigQueryStorageApiInsertError>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StorageApiWriteUnshardedRecords
- storeId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- STREAMING_INSERTS - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
Use the BigQuery streaming insert API to insert data.
- StreamingInserts<DestinationT,ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
PTransform that performs streaming BigQuery write.
- StreamingInserts(BigQueryIO.Write.CreateDisposition, DynamicDestinations<?, DestinationT>, Coder<ElementT>, SerializableFunction<ElementT, TableRow>, SerializableFunction<ElementT, TableRow>) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
-
Constructor.
- StreamingWriteTables<ElementT> - Class in org.apache.beam.sdk.io.gcp.bigquery
-
This transform takes in key-value pairs of TableRow entries and the TableDestination it should be written to.
- StreamingWriteTables() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.StreamingWriteTables
- stripPartitionDecorator(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
-
Strip off any partition decorator information from a tablespec.
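The decorator is the "$" suffix that pins a tablespec to a single partition (e.g. a daily partition such as my_dataset.my_table$20240101). The stripping behavior can be sketched in plain Java — this is a stand-in illustrating the documented behavior, not Beam's actual implementation:

```java
public class PartitionDecoratorSketch {
  // Returns the tablespec with any trailing "$<partition>" decorator removed.
  static String stripPartitionDecorator(String tableSpec) {
    int index = tableSpec.lastIndexOf('$');
    return index == -1 ? tableSpec : tableSpec.substring(0, index);
  }

  public static void main(String[] args) {
    System.out.println(stripPartitionDecorator("my_dataset.my_table$20240101"));
    System.out.println(stripPartitionDecorator("my_dataset.my_table"));
  }
}
```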
- studyId - Variable in class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser.DicomWebPath
- SubscriberOptions - Class in org.apache.beam.sdk.io.gcp.pubsublite
- SubscriberOptions() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
- SubscriberOptions.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
- SubscribeTransform - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- SubscribeTransform(SubscriberOptions) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscribeTransform
- SubscriptionPartition - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- SubscriptionPartition() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartition
- SubscriptionPartitionCoder - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- SubscriptionPartitionCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscriptionPartitionCoder
- subscriptionPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Subscription path used to listen for messages on TestPubsub.topicPath().
- subscriptionPath() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
- subscriptionPathFromName(String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- subscriptionPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- SUCCESS - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
-
The tag for successful writes to the HL7v2 store.
- SUCCESSFUL_BODY - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
The tag for successful writes to FHIR store.
- SUCCESSFUL_BUNDLES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.ExecuteBundles
-
The TupleTag used for bundles that were executed successfully.
- supportsProjectionPushdown() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
T
- TABLE_FIELD_SCHEMAS - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
- TABLE_METADATA_VIEW_UNSPECIFIED - org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
- TABLE_ROW_ERROR_CONTAINER - Static variable in interface org.apache.beam.sdk.io.gcp.bigquery.ErrorContainer
- TableAndQuery() - Constructor for class org.apache.beam.sdk.io.gcp.testing.BigqueryMatcher.TableAndQuery
- TableDestination - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Encapsulates a BigQuery table destination.
- TableDestination(TableReference, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(TableReference, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(TableReference, String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(TableReference, String, TimePartitioning) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(String, String, String, String) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(String, String, TimePartitioning) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestination(String, String, TimePartitioning, Clustering) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- TableDestinationCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A coder for TableDestination objects.
- TableDestinationCoderV2 - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A Coder for TableDestination that includes time partitioning information.
- TableDestinationCoderV2() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
- TableDestinationCoderV3 - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A Coder for TableDestination that includes time partitioning and clustering information.
- TableDestinationCoderV3() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
- tableExists() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Checks whether the metadata table already exists in the database.
- tableFieldToProtoTableField(TableFieldSchema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- tableReference() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
- tableRowFromBeamRow() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- tableRowFromMessage(Message) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- TableRowJsonCoder - Class in org.apache.beam.sdk.io.gcp.bigquery
-
A Coder that encodes BigQuery TableRow objects in their native JSON format.
- tableRowToBeamRow() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- TableRowToStorageApiProto - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Utility methods for converting JSON TableRow objects to dynamic protocol messages, for use with the Storage Write API.
- TableRowToStorageApiProto() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
- TableRowToStorageApiProto.SchemaConversionException - Exception in org.apache.beam.sdk.io.gcp.bigquery
- TableRowToStorageApiProto.SchemaDoesntMatchException - Exception in org.apache.beam.sdk.io.gcp.bigquery
- TableRowToStorageApiProto.SchemaTooNarrowException - Exception in org.apache.beam.sdk.io.gcp.bigquery
- TableSchemaCache - Class in org.apache.beam.sdk.io.gcp.bigquery
-
An updatable cache for table schemas.
- tableSpec() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TestBigQuery
- targetForRootUrl(String) - Static method in interface org.apache.beam.sdk.io.gcp.pubsub.PubsubOptions
-
Internal only utility for converting PubsubOptions.getPubsubRootUrl() (e.g.
- TEMP_FILES - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
-
The tag for temp files for import to FHIR store.
- TestBigQuery - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Test rule which creates a new table with the specified schema and a randomized name, and exposes a few APIs to work with it.
- TestBigQuery.PollingAssertion - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
Interface to implement a polling assertion.
- TestBigQuery.RowsAssertion - Class in org.apache.beam.sdk.io.gcp.bigquery
-
Interface for creating a polling eventual assertion.
- TestBigQueryOptions - Interface in org.apache.beam.sdk.io.gcp.bigquery
-
TestPipelineOptions for TestBigQuery.
- TestPubsub - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Test rule which creates a new topic and subscription with randomized names and exposes the APIs to work with them.
- TestPubsub.PollingAssertion - Interface in org.apache.beam.sdk.io.gcp.pubsub
- TestPubsubOptions - Interface in org.apache.beam.sdk.io.gcp.pubsub
-
PipelineOptions for TestPubsub.
- TestPubsubSignal - Class in org.apache.beam.sdk.io.gcp.pubsub
-
Test rule which observes elements of the PCollection and checks whether they match the success criteria.
- THROUGHPUT_WINDOW_SECONDS - Static variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamsConstants
-
The sliding window size in seconds for throughput reporting.
- ThroughputEstimator<T> - Interface in org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator
-
An estimator to calculate the throughput of the outputted elements from a DoFn.
- TIMESTAMP_FIELD_NAME - Static variable in class org.apache.beam.sdk.io.gcp.healthcare.HealthcareIOErrorToTableRow
- TIMESTAMP_MICROS - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- TimestampEncoding - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder
-
This encoder/decoder writes a com.google.cloud.Timestamp object as a pair of long and int to Avro and reads a Timestamp object from the same pair.
- TimestampEncoding() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder.TimestampEncoding
- timestampMsSinceEpoch() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.OutgoingMessage
-
Timestamp for element (ms since epoch).
- TimestampRange - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
A restriction represented by a range of timestamps [from, to).
- TimestampRangeTracker - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
A RestrictionTracker for claiming positions in a TimestampRange in a monotonically increasing fashion.
- TimestampRangeTracker(TimestampRange) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
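The half-open `[from, to)` range with monotonically increasing claims can be sketched in plain Java. This is a hypothetical mini-version for illustration only — the real Beam classes operate on com.google.cloud.Timestamp, not long, and carry SDF-specific state:

```java
// Hypothetical sketch of the TimestampRange / TimestampRangeTracker contract:
// positions may only be claimed inside [from, to), never moving backwards.
public class RangeTrackerSketch {
    private final long from; // inclusive
    private final long to;   // exclusive
    private long lastClaimed = Long.MIN_VALUE;

    public RangeTrackerSketch(long from, long to) {
        this.from = from;
        this.to = to;
    }

    // Returns true only if 'position' lies in [from, to) and is not
    // before a previously claimed position (monotonically increasing).
    public boolean tryClaim(long position) {
        if (position < from || position >= to) return false;
        if (position < lastClaimed) return false;
        lastClaimed = position;
        return true;
    }

    public static void main(String[] args) {
        RangeTrackerSketch tracker = new RangeTrackerSketch(10, 20);
        System.out.println(tracker.tryClaim(10)); // true: 'from' is inclusive
        System.out.println(tracker.tryClaim(15)); // true: moves forward
        System.out.println(tracker.tryClaim(12)); // false: would go backwards
        System.out.println(tracker.tryClaim(20)); // false: 'to' is exclusive
    }
}
```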
- TimestampUtils - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction
-
Provides methods to convert timestamps to a nanoseconds representation and back.
- TimestampUtils() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
- timeSupplier - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionProgressChecker
- timeSupplier - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionTracker
- timeSupplier - Variable in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
- to(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Writes to the given table, specified as a TableReference.
- to(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Writes to the given table, specified in the format described in BigQueryHelpers.parseTableSpec(java.lang.String).
- to(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Publishes to the specified topic.
- to(DynamicDestinations<T, ?>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Writes to the table and schema specified by the DynamicDestinations object.
- to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Same as BigQueryIO.Write.to(String), but with a ValueProvider.
- to(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Like topic() but with a ValueProvider.
- to(SerializableFunction<ValueInSingleWindow<T>, TableDestination>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Writes to the table specified by the given table function.
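The table-spec string accepted by the to(String) overload above has the shape `[project:]dataset.table`. A simplified stand-in parser can illustrate it; the authoritative grammar lives in BigQueryHelpers.parseTableSpec, and this regex is an illustrative approximation, not Beam's implementation:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative sketch of the "[project:]dataset.table" table-spec format.
public class TableSpecSketch {
    private static final Pattern SPEC =
        Pattern.compile("(?:(?<project>[^:.]+):)?(?<dataset>[^.:]+)\\.(?<table>[^.:]+)");

    // Returns {project (may be null), dataset, table}.
    static String[] parse(String spec) {
        Matcher m = SPEC.matcher(spec);
        if (!m.matches()) throw new IllegalArgumentException("Bad table spec: " + spec);
        return new String[] {m.group("project"), m.group("dataset"), m.group("table")};
    }

    public static void main(String[] args) {
        String[] full = parse("my-project:my_dataset.my_table");
        System.out.println(full[0] + " / " + full[1] + " / " + full[2]);
        // The project part may be omitted, in which case it is null here
        // (Beam resolves it from the pipeline options instead).
        String[] noProject = parse("my_dataset.my_table");
        System.out.println(noProject[0]);
    }
}
```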
- toBeamRow(GenericRecord, Schema, BigQueryUtils.ConversionOptions) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- toBeamRow(Schema, TableRow) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Tries to convert a JSON TableRow from BigQuery into a Beam Row.
- toBeamRow(Schema, TableSchema, TableRow) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Tries to parse the JSON TableRow from BigQuery.
- ToBigtableRowFn(Map<String, String>) - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BeamRowToBigtableMutation.ToBigtableRowFn
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchGetDocuments
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithDeadLetterQueue
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListCollectionIds
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.ListDocuments
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.RunQuery
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
-
Create a new RpcQosOptions.Builder initialized with the values from this instance.
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.SubscriberOptions
- toBuilder() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
-
Transforms the instance into a builder, so field values can be modified.
- toChangeStreamRecords(PartitionMetadata, ChangeStreamResultSet, ChangeStreamResultSetMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.mapper.ChangeStreamRecordMapper
-
In GoogleSQL, change stream records are returned as an array of Struct.
- toCloudPubsubMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.CloudPubsubTransforms
-
Transform messages read from Pub/Sub Lite to their equivalent Cloud Pub/Sub Message that would have been read from PubsubIO.
- toGenericAvroSchema(String, List<TableFieldSchema>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a list of BigQuery TableFieldSchema to an Avro Schema.
- toJsonString(Object) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
- toModel() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
-
Converts this message to its model representation.
- toNanos(Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
-
Converts the given timestamp to its nanoseconds representation.
- topicPath() - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Topic path to which events will be published.
- topicPath() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.PublisherOptions
- topicPathFromName(String, String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- topicPathFromPath(String) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient
- toProto(PubsubMessage) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessages
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryStorageApiInsertError
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
- toString() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
- toString() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- toString() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- toString() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions
- toString() - Method in class org.apache.beam.sdk.io.gcp.healthcare.FhirSearchParameter
- toString() - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2Message
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.ProjectPath
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.SubscriptionPath
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubClient.TopicPath
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubSubscription
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.PubsubTopic
- toString() - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartition
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChildPartitionsRecord
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ColumnType
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.HeartbeatRecord
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.Mod
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionPosition
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRange
- toString() - Method in class org.apache.beam.sdk.io.gcp.spanner.MutationGroup
- toTableReference(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- toTableRow() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a Beam Row to a BigQuery TableRow.
- toTableRow(SerializableFunction<T, Row>) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a Beam schema type to a BigQuery TableRow.
- toTableRow(Row) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a Beam Row to a BigQuery TableRow.
- toTableSchema(Schema) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
-
Convert a Beam Schema to a BigQuery TableSchema.
- toTableSpec(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers
-
Returns a canonical string representation of the TableReference.
- toTimestamp(BigDecimal) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampUtils
-
Converts nanoseconds to their respective timestamp.
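The toNanos/toTimestamp pair above maps a (seconds, nanos-of-second) timestamp to a single nanosecond count and back. A hypothetical plain-Java sketch of that conversion, using BigDecimal as the carrier type as the BigDecimal parameter suggests (method names and the long[] return type here are illustrative, not Beam's):

```java
import java.math.BigDecimal;

// Illustrative round-trip between (seconds, nanos) and a total-nanoseconds value.
public class NanosConversion {
    static final BigDecimal NANOS_PER_SECOND = BigDecimal.valueOf(1_000_000_000L);

    // (seconds, nanos-of-second) -> total nanoseconds
    static BigDecimal toNanos(long seconds, int nanos) {
        return BigDecimal.valueOf(seconds).multiply(NANOS_PER_SECOND)
                .add(BigDecimal.valueOf(nanos));
    }

    // total nanoseconds -> {seconds, nanos-of-second} (non-negative inputs)
    static long[] fromNanos(BigDecimal totalNanos) {
        BigDecimal[] parts = totalNanos.divideAndRemainder(NANOS_PER_SECOND);
        return new long[] {parts[0].longValueExact(), parts[1].longValueExact()};
    }

    public static void main(String[] args) {
        BigDecimal n = toNanos(5L, 250_000_000);
        long[] back = fromNanos(n); // round-trips to 5 s, 250000000 ns
        System.out.println(n + " -> " + back[0] + "s " + back[1] + "ns");
    }
}
```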
- TrackerWithProgress - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- TrackerWithProgress() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.TrackerWithProgress
- Transaction - Class in org.apache.beam.sdk.io.gcp.spanner
-
A transaction object.
- Transaction() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.Transaction
- transactionId() - Method in class org.apache.beam.sdk.io.gcp.spanner.Transaction
- TransactionResult(T, Timestamp) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.TransactionResult
- TRUNCATE - org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.TruncateTimestamps
-
Truncate timestamps to millisecond precision.
- tryClaim(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.DetectNewPartitionsRangeTracker
-
Attempts to claim the given position.
- tryClaim(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.ReadChangeStreamPartitionRangeTracker
-
Attempts to claim the given position.
- tryClaim(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
-
Attempts to claim the given position.
- tryClaim(PartitionPosition) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionTracker
- tryClaim(PartitionRestriction, PartitionPosition, PartitionPosition) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionClaimer
- trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionTracker
- trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.ReadChangeStreamPartitionRangeTracker
-
If the partition token is the InitialPartition.PARTITION_TOKEN, it does not allow for splits (returns null).
- trySplit(double) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.TimestampRangeTracker
-
Splits the restriction through the following algorithm:
- trySplit(double, PartitionPosition, PartitionRestriction) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionSplitter
- TypeCode - Class in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents a type of a column within Cloud Spanner.
- TypeCode(String) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.TypeCode
-
Constructs a type code from the given String code.
- TypedRead() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- typeToProtoType(String) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto
U
- UnboundedReaderImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- UnboundedSourceImpl - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
- UNKNOWN - org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
- UNKNOWN - org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
- unpin() - Method in interface org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.StreamAppendClient
-
Unpin this object.
- update(Timestamp, T) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.BytesThroughputEstimator
-
Updates the estimator with the bytes of records.
- update(Timestamp, T) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.NullThroughputEstimator
-
NoOp.
- update(Timestamp, T) - Method in interface org.apache.beam.sdk.io.gcp.spanner.changestreams.estimator.ThroughputEstimator
-
Updates the estimator with the size of the records.
- UPDATE - org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
- UPDATE_STATE - org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionMode
- UPDATE_URN - Static variable in class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar
- UpdateBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar.UpdateBuilder
- updateDataRecordCommittedToEmitted(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
- updatePartitionCreatedToScheduled(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Adds measurement of an instance for the ChangeStreamMetrics.PARTITION_CREATED_TO_SCHEDULED_MS if the metric is enabled.
- updatePartitionScheduledToRunning(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.ChangeStreamMetrics
-
Adds measurement of an instance for the ChangeStreamMetrics.PARTITION_SCHEDULED_TO_RUNNING_MS if the metric is enabled.
- UpdateSchemaDestination<DestinationT> - Class in org.apache.beam.sdk.io.gcp.bigquery
- UpdateSchemaDestination(BigQueryServices, PCollectionView<String>, ValueProvider<String>, BigQueryIO.Write.WriteDisposition, BigQueryIO.Write.CreateDisposition, int, String, Set<BigQueryIO.Write.SchemaUpdateOption>, DynamicDestinations) - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.UpdateSchemaDestination
- updateState() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionPosition
- updateState(Timestamp, Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- updateTableSchema(TableReference, TableSchema) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeDatasetService
- updateToFinished(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Updates a partition row to PartitionMetadata.State.FINISHED state.
- updateToFinished(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Updates a partition row to PartitionMetadata.State.FINISHED state.
- updateToRunning(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Updates a partition row to PartitionMetadata.State.RUNNING state.
- updateToRunning(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Updates a partition row to PartitionMetadata.State.RUNNING state.
- updateToScheduled(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Updates multiple partition rows to PartitionMetadata.State.SCHEDULED state.
- updateToScheduled(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Updates multiple partition rows to PartitionMetadata.State.SCHEDULED state.
- updateWatermark(String, Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao.InTransactionContext
-
Update the partition watermark to the given timestamp.
- updateWatermark(String, Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.dao.PartitionMetadataDao
-
Update the partition watermark to the given timestamp.
- uploadToDicomStore(String, String) - Method in interface org.apache.beam.sdk.io.gcp.healthcare.HealthcareApiClient
- uploadToDicomStore(String, String) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient
- URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.ExternalRead
- URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite
- useAvroLogicalTypes() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- useAvroLogicalTypes() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Enables interpreting logical types into their corresponding types (i.e.
- useBeamSchema() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If true, then the BigQuery schema will be inferred from the input schema.
- usingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Enables BigQuery's Standard SQL dialect when reading from a query.
- usingStandardSql() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- Uuid - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
A Uuid storable in a Pub/Sub Lite attribute.
- Uuid() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
- UuidCoder - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
A coder for a Uuid.
- UuidCoder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidCoder
- UuidDeduplicationOptions - Class in org.apache.beam.sdk.io.gcp.pubsublite
-
Options for deduplicating Pub/Sub Lite messages based on the UUID they were published with.
- UuidDeduplicationOptions() - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
- UuidDeduplicationOptions.Builder - Class in org.apache.beam.sdk.io.gcp.pubsublite
- UuidDeduplicationTransform - Class in org.apache.beam.sdk.io.gcp.pubsublite.internal
-
A transform for deduplicating Pub/Sub Lite messages based on the UUID they were published with.
- UuidDeduplicationTransform(UuidDeduplicationOptions) - Constructor for class org.apache.beam.sdk.io.gcp.pubsublite.internal.UuidDeduplicationTransform
- uuidExtractor() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.UuidDeduplicationOptions
V
- v1() - Static method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreIO
-
Returns a DatastoreV1 that provides an API for accessing Cloud Datastore through the v1 version of the Datastore client library.
- v1() - Static method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreIO
- V1_READ_OVERRIDE - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.SubscribeTransform
- validate() - Method in class org.apache.beam.sdk.io.gcp.bigquery.providers.BigQueryDirectReadSchemaTransformProvider.BigQueryDirectReadSchemaTransformConfiguration
- validate() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
- validate(PipelineOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.WriteWithResults
- value() - Method in class org.apache.beam.sdk.io.gcp.pubsublite.internal.Uuid
- VALUE - Static variable in class org.apache.beam.sdk.io.gcp.bigtable.RowUtils
- ValueCaptureType - Enum in org.apache.beam.sdk.io.gcp.spanner.changestreams.model
-
Represents the capture type of a change stream.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.SchemaUpdateOption
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.TruncateTimestamps
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.WriteMethod
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.WriteMethod
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionMode
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
-
Returns the enum constant of this type with the specified name.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.QueryPriority
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.SchemaUpdateOption
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryServices.DatasetService.TableMetadataView
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils.ConversionOptions.TruncateTimestamps
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Import.ContentStructure
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write.WriteMethod
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write.WriteMethod
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.healthcare.HttpHealthcareApiClient.FhirResourcePagesIterator.FhirMethod
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ModType
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.PartitionMetadata.State
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ValueCaptureType
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionMode
-
Returns an array containing the constants of this enum type, in the order they are declared.
- values() - Static method in enum org.apache.beam.sdk.io.gcp.spanner.SpannerIO.FailureMode
-
Returns an array containing the constants of this enum type, in the order they are declared.
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryInsertErrorCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoder
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV2
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestinationCoderV3
- verifyDeterministic() - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder
W
- WAIT_FOR_CHILD_PARTITIONS - org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionMode
- waitForChildPartitions() - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionPosition
- waitForChildPartitions(Timestamp, Timestamp) - Static method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- waitForNMessages(int, Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsub
-
Repeatedly pull messages from TestPubsub.subscriptionPath(); returns after receiving n messages or after waiting for timeoutDuration.
- waitForStart(Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
Future that waits for a start signal for duration.
- waitForSuccess(Duration) - Method in class org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal
-
Wait for a success signal for duration.
- waitForUpTo(Duration) - Method in interface org.apache.beam.sdk.io.gcp.pubsub.TestPubsub.PollingAssertion
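The waitForNMessages behavior described above — pull repeatedly until either n messages have arrived or a deadline passes — can be sketched with the pull source abstracted as a Supplier. This is a hypothetical illustration of the polling pattern, not the TestPubsub implementation:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

// Illustrative poll loop: accumulate pulled messages, stop on n or on timeout.
public class PollSketch {
    static List<String> waitForN(Supplier<List<String>> pull, int n, long timeoutMillis) {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        List<String> received = new ArrayList<>();
        while (received.size() < n && System.currentTimeMillis() < deadline) {
            received.addAll(pull.get()); // each pull may return zero or more messages
        }
        return received;
    }

    public static void main(String[] args) {
        // Fake pull source that delivers one message per call.
        List<String> out = waitForN(() -> List.of("msg"), 3, 1_000);
        System.out.println(out.size()); // 3: stops as soon as n is reached
    }
}
```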
- WebPathParser - Class in org.apache.beam.sdk.io.gcp.healthcare
- WebPathParser() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.WebPathParser
- WebPathParser.DicomWebPath - Class in org.apache.beam.sdk.io.gcp.healthcare
- withAutoSchemaUpdate(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If true, enables automatically detecting BigQuery table schema updates.
- withAutoSharding() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If true, enables using a dynamically determined number of shards to write to BigQuery.
- withAvroFormatFunction(SerializableFunction<AvroWriteRequest<T>, GenericRecord>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Formats the user's type into a GenericRecord to be written to BigQuery.
- withAvroSchemaFactory(SerializableFunction<TableSchema, Schema>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Uses the specified function to convert a TableSchema to a Schema.
- withAvroWriter(SerializableFunction<Schema, DatumWriter<T>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Writes the user's type as Avro using the supplied DatumWriter.
- withAvroWriter(SerializableFunction<AvroWriteRequest<T>, AvroT>, SerializableFunction<Schema, DatumWriter<AvroT>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Converts the user's type to an Avro record using the supplied avroFormatFunction.
- withBatching(boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
If true, uses the Cloud Spanner Batch API.
- withBatching(boolean) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
By default, the Batch API is used to read data from Cloud Spanner.
- withBatchInitialCount(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the initial size of a batch; used in the absence of the QoS system having significant data to determine a better batch size.
- withBatchMaxBytes(long) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the maximum number of bytes to include in a batch.
- withBatchMaxCount(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the maximum number of writes to include in a batch.
- withBatchSizeBytes(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the batch size limit (max number of bytes mutated per batch).
- withBatchTargetLatency(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Target latency for batch requests.
- withBeamRowConverters(TypeDescriptor<T>, BigQueryIO.TypedRead.ToBeamRowFunction<T>, BigQueryIO.TypedRead.FromBeamRowFunction<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
Sets the functions to convert elements to/from Row objects.
- withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Deprecated.will be replaced by bigtable options configurator.
- withBigtableOptions(BigtableOptions) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Deprecated.will be replaced by bigtable options configurator.
- withBigtableOptions(BigtableOptions.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Deprecated.will be replaced by bigtable options configurator.
- withBigtableOptions(BigtableOptions.Builder) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Deprecated.will be replaced by bigtable options configurator.
- withBigtableOptionsConfigurator(SerializableFunction<BigtableOptions.Builder, BigtableOptions.Builder>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will read from the Cloud Bigtable instance with customized options provided by given configurator. - withBigtableOptionsConfigurator(SerializableFunction<BigtableOptions.Builder, BigtableOptions.Builder>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
that will read from the Cloud Bigtable instance with customized options provided by given configurator. - withChangeStreamName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the change stream name.
- withClientFactory(PubsubClient.PubsubClientFactory) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
The default client to read from Pub/Sub is the PubsubJsonClient, created by the PubsubJsonClient.PubsubJsonClientFactory.
- withClientFactory(PubsubClient.PubsubClientFactory) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
The default client to write to Pub/Sub is the PubsubJsonClient, created by the PubsubJsonClient.PubsubJsonClientFactory.
- withClustering() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Allows writing to clustered tables when BigQueryIO.Write.to(SerializableFunction) or BigQueryIO.Write.to(DynamicDestinations) is used.
- withClustering(Clustering) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Specifies the clustering fields to use when writing to a single output table.
- withCoder(Coder<T>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
Sets a Coder for the result of the parse function.
- withCoderAndParseFn(Coder<T>, SimpleFunction<PubsubMessage, T>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Causes the source to return a PubsubMessage that includes Pubsub attributes, and uses the given parsing function to transform the PubsubMessage into an output type.
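As a hedged sketch of how a coder and parse function are typically supplied together (the MyEvent type, its parse method, and the subscription name are hypothetical):

```java
import org.apache.beam.sdk.coders.SerializableCoder;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;
import org.apache.beam.sdk.transforms.SimpleFunction;
import org.apache.beam.sdk.values.PCollection;

// Sketch: parse raw PubsubMessages into a user type.
PCollection<MyEvent> events =
    pipeline.apply(
        PubsubIO.readMessagesWithCoderAndParseFn(
                SerializableCoder.of(MyEvent.class),
                new SimpleFunction<PubsubMessage, MyEvent>() {
                  @Override
                  public MyEvent apply(PubsubMessage msg) {
                    return MyEvent.parse(msg.getPayload());  // hypothetical parser
                  }
                })
            .fromSubscription("projects/my-project/subscriptions/my-sub"));
```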
- withColumns(String...) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withColumns(String...) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withColumns(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withColumns(List<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withCommitDeadline(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the commit deadline.
- withCommitDeadline(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the commit deadline.
- withCommitDeadline(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the deadline for the Commit API call.
- withCommitRetrySettings(RetrySettings) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the commit retry settings.
- withCreateDisposition(BigQueryIO.Write.CreateDisposition) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Specifies whether the table should be created if it does not exist.
- withCustomGcsTempLocation(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Provides a custom location on GCS for storing temporary files to be loaded via BigQuery batch load jobs.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner database ID.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner database.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner database.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner database.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the Cloud Spanner database.
- withDatabaseId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner database.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner database ID.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner database.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner database.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner database.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the Cloud Spanner database.
- withDatabaseId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner database.
- withDatabaseRole(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner database role.
- withDatasetService(FakeDatasetService) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- withDeadLetterQueue() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.BatchWriteWithSummary.Builder
- withDeadLetterTopic(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Creates and returns a transform for writing read failures out to a dead-letter topic.
- withDeadLetterTopic(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
Like PubsubIO.Read.withDeadLetterTopic(String) but with a ValueProvider.
- withDeterministicRecordIdFn(SerializableFunction<T, String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- withDialectView(PCollectionView<Dialect>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- withEmulator(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new BigtableIO.Read that will use an official Bigtable emulator.
- withEmulator(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new BigtableIO.Write that will use an official Bigtable emulator.
- withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
- withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- withEmulatorHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner host, when an emulator is used.
- withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner emulator host.
- withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner emulator host.
- withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner emulator host.
- withEmulatorHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner emulator host.
- withExecuteStreamingSqlRetrySettings(RetrySettings) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the ExecuteStreamingSql retry settings.
- withExtendedErrorInfo() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Enables extended error information by enabling WriteResult.getFailedInsertsWithErr().
- withExtendedErrorInfo(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
-
Specify whether to use extended error info or not.
- withFailedInsertRetryPolicy(InsertRetryPolicy) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Specifies a policy for handling failed inserts.
- withFailureMode(SpannerIO.FailureMode) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies failure mode.
- withFormat(DataFormat) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
See DataFormat.
- withFormatFunction(SerializableFunction<T, TableRow>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Formats the user's type into a TableRow to be written to BigQuery.
- withFormatRecordOnFailureFunction(SerializableFunction<T, TableRow>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If an insert failure occurs, this function is applied to the originally supplied row T.
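A minimal sketch combining the write-configuration methods above, assuming a hypothetical Event type and a jsonSchema string defined elsewhere (table and field names are illustrative):

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;

// Sketch: write a user type as TableRows, creating the table if needed.
events.apply(
    BigQueryIO.<Event>write()
        .to("my-project:my_dataset.events")
        .withFormatFunction(e -> new TableRow().set("id", e.id).set("ts", e.timestamp))
        .withCreateDisposition(CreateDisposition.CREATE_IF_NEEDED)
        .withJsonSchema(jsonSchema));  // JSON-serialized TableSchema, defined elsewhere
```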
- withGroupingFactor(int) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the multiple of max mutation (in terms of both bytes per batch and cells per batch) that is used to select a set of mutations to sort by key for batching.
- withHighPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withHighPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- withHighPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- withHintMaxNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Provide a hint to the QoS system for the intended max number of workers for a pipeline.
- withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Returns a new DatastoreV1.DeleteEntity with a different worker count hint for ramp-up throttling.
- withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Returns a new DatastoreV1.DeleteKey with a different worker count hint for ramp-up throttling.
- withHintNumWorkers(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Returns a new DatastoreV1.Write with a different worker count hint for ramp-up throttling.
- withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Same as DatastoreV1.DeleteEntity.withHintNumWorkers(int) but with a ValueProvider.
- withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Same as DatastoreV1.DeleteKey.withHintNumWorkers(int) but with a ValueProvider.
- withHintNumWorkers(ValueProvider<Integer>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Same as DatastoreV1.Write.withHintNumWorkers(int) but with a ValueProvider.
- withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
- withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- withHost(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner host.
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner host.
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner host.
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner host.
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner host.
- withHost(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner host.
- withIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
When reading from Cloud Pub/Sub where unique record identifiers are provided as Pub/Sub message attributes, specifies the name of the attribute containing the unique identifier.
- withIdAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Writes to Pub/Sub, adding each record's unique identifier to the published messages in an attribute with the specified name.
- withInclusiveEndAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the end time of the change stream.
- withInclusiveStartAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the time that the change stream should be read from.
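The change-stream configuration methods above compose as in the following sketch (project, instance, database, and stream names are hypothetical; the one-hour window is an illustrative choice):

```java
import com.google.cloud.Timestamp;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;

// Sketch: read a change stream for roughly one hour starting now.
Timestamp start = Timestamp.now();
pipeline.apply(
    SpannerIO.readChangeStream()
        .withProjectId("my-project")
        .withInstanceId("my-instance")
        .withDatabaseId("my-database")
        .withChangeStreamName("my_change_stream")
        .withInclusiveStartAt(start)
        .withInclusiveEndAt(
            Timestamp.ofTimeSecondsAndNanos(start.getSeconds() + 3600, 0)));
```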
- withIndex(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withIndex(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withInitialBackoff(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the initial backoff duration to be used before retrying a request for the first time.
- withInitialSplitDuration(Duration) - Method in class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.ListHL7v2Messages
- withInsertRetryPolicy(InsertRetryPolicy) - Method in class org.apache.beam.sdk.io.gcp.bigquery.StreamingInserts
-
Specify a retry policy for failed inserts.
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new BigtableIO.Read that will read from the Cloud Bigtable instance indicated by the given parameter; requires BigtableIO.Read.withProjectId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the project.
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new BigtableIO.Write that will write into the Cloud Bigtable instance indicated by the given parameter; requires BigtableIO.Write.withProjectId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the project.
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner instance ID.
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner instance.
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner instance.
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner instance.
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the Cloud Spanner instance.
- withInstanceId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner instance.
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new BigtableIO.Read that will read from the Cloud Bigtable instance indicated by the given parameter; requires BigtableIO.Read.withProjectId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the project.
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new BigtableIO.Write that will write into the Cloud Bigtable instance indicated by the given parameter; requires BigtableIO.Write.withProjectId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the project.
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner instance ID.
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner instance.
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner instance.
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner instance.
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the Cloud Spanner instance.
- withInstanceId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner instance.
- withIsLocalChannelProvider(ValueProvider<Boolean>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies whether a local channel provider should be used.
- withJobService(BigQueryServices.JobService) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- withJsonSchema(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Similar to BigQueryIO.Write.withSchema(TableSchema) but takes in a JSON-serialized TableSchema.
- withJsonSchema(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Same as BigQueryIO.Write.withJsonSchema(String) but using a deferred ValueProvider.
- withJsonTimePartitioning(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
The same as BigQueryIO.Write.withTimePartitioning(com.google.api.services.bigquery.model.TimePartitioning), but takes a JSON-serialized object.
- withKeyRange(ByteKeyRange) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new BigtableIO.Read that will read only rows in the specified range.
- withKeyRanges(List<ByteKeyRange>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new BigtableIO.Read that will read only rows in the specified ranges.
- withKeyRanges(ValueProvider<List<ByteKeyRange>>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new BigtableIO.Read that will read only rows in the specified ranges.
- withKeySet(KeySet) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withKeySet(KeySet) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withKmsKey(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
For query sources, use this Cloud KMS key to encrypt any temporary tables created.
- withKmsKey(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- withLiteralGqlQuery(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns a new DatastoreV1.Read that reads the results of the specified GQL query.
- withLiteralGqlQuery(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Same as DatastoreV1.Read.withLiteralGqlQuery(String) but with a ValueProvider.
- withLoadJobProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Set the project the BigQuery load job will be initiated from.
- withLoadJobProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Returns a new DatastoreV1.DeleteEntity that deletes entities from the Cloud Datastore Emulator running locally on the specified host port.
- withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Returns a new DatastoreV1.DeleteKey that deletes entities from the Cloud Datastore Emulator running locally on the specified host port.
- withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns a new DatastoreV1.Read that reads from a Datastore Emulator running at the given localhost address.
- withLocalhost(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Returns a new DatastoreV1.Write that writes to the Cloud Datastore Emulator running locally on the specified host port.
- withLowPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withLowPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- withLowPriority() - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- withMaxAttempts(int) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the maximum number of times a request will be attempted for a complete successful result.
- withMaxBatchBytesSize(int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Writes to Pub/Sub are limited to 10 MB in general.
- withMaxBatchSize(int) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Writes to Pub/Sub are batched to efficiently send data.
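The two batching knobs above can be set together, as in this sketch (the topic name and limits are illustrative assumptions):

```java
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;

// Sketch: cap batch element count and stay well below the 10 MB request limit.
messages.apply(
    PubsubIO.writeMessages()
        .to("projects/my-project/topics/my-topic")
        .withMaxBatchSize(100)
        .withMaxBatchBytesSize(5 * 1024 * 1024));  // 5 MB, leaving headroom
```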
- withMaxBufferElementCount(Integer) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new BigtableIO.Read that will break up read requests into smaller batches.
- withMaxBytesPerPartition(long) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Control how much data will be assigned to a single BigQuery load job.
- withMaxCumulativeBackoff(ValueProvider<Duration>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the maximum cumulative backoff.
- withMaxCumulativeBackoff(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the maximum cumulative backoff.
- withMaxCumulativeBackoff(Duration) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the maximum cumulative backoff time when retrying after DEADLINE_EXCEEDED errors.
- withMaxFilesPerBundle(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Control how many files will be written concurrently by a single worker when using BigQuery load jobs before spilling to a shuffle.
- withMaxNumMutations(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the cell mutation limit (maximum number of mutated cells per batch).
- withMaxNumRows(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the row mutation limit (maximum number of mutated rows per batch).
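The byte, cell, and row batch limits above apply together to SpannerIO.Write; the following sketch shows them combined (identifiers and limit values are illustrative, not defaults):

```java
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;

// Sketch: bound Spanner write batches by bytes, cells, and rows.
mutations.apply(
    SpannerIO.write()
        .withProjectId("my-project")
        .withInstanceId("my-instance")
        .withDatabaseId("my-database")
        .withBatchSizeBytes(1024 * 1024)  // at most 1 MB of mutations per batch
        .withMaxNumMutations(5000)        // cell mutation limit per batch
        .withMaxNumRows(500));            // row limit per batch
```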
- withMetadata(PartitionRestrictionMetadata) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestriction
- withMetadataDatabase(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the metadata database.
- withMetadataInstance(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the metadata instance.
- withMetadataTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the metadata table name.
- withMethod(BigQueryIO.TypedRead.Method) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withMethod(BigQueryIO.Write.Method) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Choose the method used to write data to BigQuery.
- withNameOnlyQuery() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.PartitionQuery.Builder
-
Update produced queries to only retrieve their __name__, thereby not retrieving any fields and reducing resource requirements.
- withNamespace(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns a new DatastoreV1.Read that reads from the given namespace.
- withNamespace(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Same as DatastoreV1.Read.withNamespace(String) but with a ValueProvider.
- withNumberOfRecordsRead(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the number of records read in the partition change stream query before reading this record.
- withNumFileShards(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Control how many file shards are written when using BigQuery load jobs.
- withNumQuerySplits(int) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
- withNumStorageWriteApiStreams(int) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Control how many parallel streams are used when using Storage API writes.
- withoutResultFlattening() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Disable flattening of query results.
- withoutResultFlattening() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Disable validation that the table exists or the query succeeds prior to pipeline submission.
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Disables BigQuery table validation.
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Disables validation that the table being read from exists.
- withoutValidation() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Disables validation that the table being written to exists.
- withOverloadRatio(double) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
The target ratio between requests sent and successful requests.
- withPartitionCreatedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the time at which this partition was first detected and created in the metadata table.
- withPartitionEndTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the end time for the partition change stream query that originated this record.
- withPartitionEndTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata.Builder
- withPartitionOptions(PartitionOptions) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withPartitionOptions(PartitionOptions) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withPartitionRunningAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the time at which the connector started processing this partition.
- withPartitionScheduledAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the time at which this partition was scheduled to be queried.
- withPartitionStartTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the start time for the partition change stream query that originated this record.
- withPartitionStartTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata.Builder
- withPartitionToken(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the partition token where this record originated from.
- withPartitionToken(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.restriction.PartitionRestrictionMetadata.Builder
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new BigtableIO.Read that will read from the Cloud Bigtable project indicated by the given parameter; requires BigtableIO.Read.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new BigtableIO.Write that will write into the Cloud Bigtable project indicated by the given parameter; requires BigtableIO.Write.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Returns a new DatastoreV1.DeleteEntity that deletes entities from the Cloud Datastore for the specified project.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Returns a new DatastoreV1.DeleteKey that deletes entities from the Cloud Datastore for the specified project.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns a new DatastoreV1.Read that reads from the Cloud Datastore for the specified project.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Returns a new DatastoreV1.Write that writes to the Cloud Datastore for the specified project.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner project ID.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner project.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner project.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner project.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the Cloud Spanner project.
- withProjectId(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner project.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new BigtableIO.Read that will read from the Cloud Bigtable project indicated by the given parameter; requires BigtableIO.Read.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new BigtableIO.Write that will write into the Cloud Bigtable project indicated by the given parameter; requires BigtableIO.Write.withInstanceId(org.apache.beam.sdk.options.ValueProvider<java.lang.String>) to be called to determine the instance.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Same as DatastoreV1.DeleteEntity.withProjectId(String) but with a ValueProvider.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Same as DatastoreV1.DeleteKey.withProjectId(String) but with a ValueProvider.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Same as DatastoreV1.Read.withProjectId(String) but with a ValueProvider.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Same as DatastoreV1.Write.withProjectId(String) but with a ValueProvider.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the Cloud Spanner project ID.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner project.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner project.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner project.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the Cloud Spanner project.
- withProjectId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner project.
- withPubsubRootUrl(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
- withQuery(Statement) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withQuery(Statement) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withQuery(Query) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns a new
DatastoreV1.Read
that reads the results of the specified query.
- withQuery(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withQuery(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withQueryLocation(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
BigQuery geographic location where the query job will be executed.
- withQueryName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withQueryName(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withQueryPriority(BigQueryIO.TypedRead.QueryPriority) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withQueryStartedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the time that the change stream query which produced this record started.
- withQueryTempDataset(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
Temporary dataset reference when using
BigQueryIO.TypedRead.fromQuery(String)
.
- withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteEntity
-
Returns a new
DatastoreV1.DeleteEntity
that does not throttle during ramp-up.
- withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.DeleteKey
-
Returns a new
DatastoreV1.DeleteKey
that does not throttle during ramp-up.
- withRampupThrottlingDisabled() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Write
-
Returns a new
DatastoreV1.Write
that does not throttle during ramp-up.
- withReadOperation(ReadOperation) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withReadTime(Instant) - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
-
Returns a new
DatastoreV1.Read
that reads at the specified
readTime
.
- withRecordReadAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the time at which the record was fully read.
- withRecordStreamEndedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the time at which the record finished streaming.
- withRecordStreamStartedAt(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the time at which the record started to be streamed.
- withRecordTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the timestamp of when this record occurred.
- withReportDiagnosticMetrics() - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Whether additional diagnostic metrics should be reported for a Transform.
- withRetryableCodes(ImmutableSet<StatusCode.Code>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the errors that will be retried by the client library for all operations.
- withRowFilter(RowFilter) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will filter the rows read from Cloud Bigtable using the given row filter.
- withRowFilter(ValueProvider<RowFilter>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will filter the rows read from Cloud Bigtable using the given row filter.
- withRowRestriction(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withRowRestriction(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
Read only rows which match the specified filter, which must be a SQL expression compatible with Google standard SQL.
- withRpcPriority(Options.RpcPriority) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the RPC priority.
- withRpcPriority(Options.RpcPriority) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the priority of the change stream queries.
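As a hedged illustration of how the SpannerIO.ReadChangeStream options indexed above compose, the following sketch (all project, instance, database, and change stream names are hypothetical, and the Beam GCP IO dependency is assumed to be on the classpath) sets the RPC priority on a change stream read:

```java
import com.google.cloud.Timestamp;
import com.google.cloud.spanner.Options;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
import org.apache.beam.sdk.io.gcp.spanner.changestreams.model.DataChangeRecord;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class ChangeStreamSketch {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // Hypothetical resource names; replace with real Cloud Spanner identifiers.
    PCollection<DataChangeRecord> records =
        pipeline.apply(
            SpannerIO.readChangeStream()
                .withProjectId("my-project")
                .withInstanceId("my-instance")
                .withDatabaseId("my-database")
                .withChangeStreamName("my_change_stream")
                .withInclusiveStartAt(Timestamp.now())
                // Run the underlying change stream queries at low priority.
                .withRpcPriority(Options.RpcPriority.LOW));
    pipeline.run().waitUntilFinish();
  }
}
```

Lowering the priority here only affects the Spanner-side scheduling of the change stream queries; the shape of the emitted DataChangeRecords is unchanged.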
- withRpcPriority(ValueProvider<Options.RpcPriority>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerConfig
-
Specifies the RPC priority.
- withSamplePeriod(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the length of time sampled request data will be retained.
- withSamplePeriodBucketSize(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the size of buckets within the specified
samplePeriod
.
- withSchema(TableSchema) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Uses the specified schema for rows to be written.
- withSchema(ValueProvider<TableSchema>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Same as
BigQueryIO.Write.withSchema(TableSchema)
but using a deferred
ValueProvider
.
- withSchemaFromView(PCollectionView<Map<String, String>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Allows the schemas for each table to be computed within the pipeline itself.
- withSchemaReadySignal(PCollection<?>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies an optional input PCollection that can be used as the signal for
Wait.OnSignal
to indicate when the database schema is ready to be read.
- withSchemaUpdateOptions(Set<BigQueryIO.Write.SchemaUpdateOption>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Allows the schema of the destination table to be updated as a side effect of the write.
- withSelectedFields(List<String>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withSelectedFields(ValueProvider<List<String>>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
-
Read only the specified fields (columns) from a BigQuery table.
- withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
-
Specifies the Cloud Spanner configuration.
- withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
-
Specifies the Cloud Spanner configuration.
- withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
-
Specifies the Cloud Spanner configuration.
- withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Specifies the Cloud Spanner configuration.
- withSpannerConfig(SpannerConfig) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
-
Specifies the Cloud Spanner configuration.
- withStorageClient(BigQueryServices.StorageClient) - Method in class org.apache.beam.sdk.io.gcp.testing.FakeBigQueryServices
- withSuccessfulInsertsPropagation(boolean) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
If true, it enables the propagation of the successfully inserted TableRows on BigQuery as part of the
WriteResult
object when using
BigQueryIO.Write.Method.STREAMING_INSERTS
.
- withTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.ReadOperation
- withTable(String) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withTableDescription(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Specifies the table description.
- withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will read from the specified table.
- withTableId(String) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
that will write to the specified table.
- withTableId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Read
-
Returns a new
BigtableIO.Read
that will read from the specified table.
- withTableId(ValueProvider<String>) - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a new
BigtableIO.Write
that will write to the specified table.
- withTableReference(TableReference) - Method in class org.apache.beam.sdk.io.gcp.bigquery.TableDestination
- withTemplateCompatibility() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
-
Use new template-compatible source implementation.
- withTemplateCompatibility() - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withTestServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Read
- withTestServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead
- withTestServices(BigQueryServices) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- withThrottleDuration(Duration) - Method in class org.apache.beam.sdk.io.gcp.firestore.RpcQosOptions.Builder
-
Configure the amount of time an attempt will be throttled if deemed necessary based on previous success rate.
- withTimePartitioning(TimePartitioning) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Allows newly created tables to include a
TimePartitioning
class.
- withTimePartitioning(ValueProvider<TimePartitioning>) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Like
BigQueryIO.Write.withTimePartitioning(TimePartitioning)
but using a deferred
ValueProvider
.
- withTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withTimestamp(Timestamp) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- withTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Read
-
When reading from Cloud Pub/Sub where record timestamps are provided as Pub/Sub message attributes, specifies the name of the attribute that contains the timestamp.
- withTimestampAttribute(String) - Method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
-
Writes to Pub/Sub and adds each record's timestamp to the published messages in an attribute with the specified name.
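To illustrate the timestamp-attribute pairing described in the two entries above, here is a minimal hedged sketch (topic, subscription, and attribute names are hypothetical) in which the writer stamps each message and a matching reader would use the same attribute for event times:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;

public class TimestampAttributeSketch {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    pipeline
        .apply(Create.of("event-1", "event-2"))
        .apply(
            PubsubIO.writeStrings()
                .to("projects/my-project/topics/my-topic") // hypothetical topic
                // Each published message carries its element timestamp in the "ts" attribute.
                .withTimestampAttribute("ts"));
    // A separate reading pipeline would pair this with:
    //   PubsubIO.readStrings()
    //       .fromSubscription("projects/my-project/subscriptions/my-sub")
    //       .withTimestampAttribute("ts");
    pipeline.run().waitUntilFinish();
  }
}
```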
- withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.CreateTransaction
- withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withTimestampBound(TimestampBound) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- withTotalStreamTimeMillis(long) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.model.ChangeStreamRecordMetadata.Builder
-
Sets the total streaming time (in millis) for this record.
- withTraceSampleProbability(Double) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadChangeStream
-
Deprecated. This configuration has no effect, as tracing is not available.
- withTransaction(PCollectionView<Transaction>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Read
- withTransaction(PCollectionView<Transaction>) - Method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.ReadAll
- withTriggeringFrequency(Duration) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Choose the frequency at which file writes are triggered.
- withWriteDisposition(BigQueryIO.Write.WriteDisposition) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Specifies what to do with existing data in the table, in case the table already exists.
- withWriteResults() - Method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
-
Returns a
BigtableIO.WriteWithResults
that will emit aBigtableWriteResult
for each batch of rows written.
- withWriteTempDataset(String) - Method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
-
Temporary dataset.
- write() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
A
PTransform
that writes a
PCollection
to a BigQuery table.
- write() - Static method in class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO
-
Creates an uninitialized
BigtableIO.Write
.
- write() - Method in class org.apache.beam.sdk.io.gcp.datastore.DatastoreV1
-
Returns an empty
DatastoreV1.Write
builder.
- write() - Method in class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1
-
The class returned by this method provides the ability to create
PTransforms
for write operations available in the Firestore V1 API provided by
FirestoreStub
.
- write() - Static method in class org.apache.beam.sdk.io.gcp.spanner.SpannerIO
-
Creates an uninitialized instance of
SpannerIO.Write
.
- write(Object, Encoder) - Method in class org.apache.beam.sdk.io.gcp.spanner.changestreams.encoder.TimestampEncoding
-
Serializes a
Timestamp
received as datum to the output encoder out.
- write(PublisherOptions) - Static method in class org.apache.beam.sdk.io.gcp.pubsublite.PubsubLiteIO
-
Write messages to Pub/Sub Lite.
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.bigtable.BigtableIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.FhirIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.healthcare.HL7v2IO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO.Write
- Write() - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.Write
- WRITE_APPEND - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
-
Specifies that rows may be appended to an existing table.
- WRITE_EMPTY - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
-
Specifies that the output table must be empty.
- WRITE_TRUNCATE - org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition
-
Specifies that write should replace a table.
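A hedged sketch tying together several of the BigQueryIO.Write options indexed above (the destination table, dataset, and field names are hypothetical, and the Beam GCP IO dependency is assumed): it supplies a schema and chooses WRITE_APPEND so rows are appended when the table already exists:

```java
import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import java.util.Arrays;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;

public class WriteDispositionSketch {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    // Hypothetical two-column schema for the destination table.
    TableSchema schema =
        new TableSchema()
            .setFields(
                Arrays.asList(
                    new TableFieldSchema().setName("name").setType("STRING"),
                    new TableFieldSchema().setName("count").setType("INTEGER")));
    pipeline
        .apply(
            Create.of(new TableRow().set("name", "a").set("count", 1))
                .withCoder(TableRowJsonCoder.of()))
        .apply(
            BigQueryIO.writeTableRows()
                .to("my-project:my_dataset.my_table") // hypothetical destination
                .withSchema(schema)
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
                // WRITE_APPEND: append rows if the table already exists.
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));
    pipeline.run().waitUntilFinish();
  }
}
```

Swapping in WRITE_TRUNCATE would replace the table's contents, and WRITE_EMPTY would make the job fail unless the destination table is empty.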
- WRITE_URN - Static variable in class org.apache.beam.sdk.io.gcp.pubsublite.internal.ExternalTransformRegistrarImpl
- writeAvros(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that writes binary encoded Avro messages of a given type to a Google Cloud Pub/Sub stream.
- WriteBuilder() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.ExternalWrite.WriteBuilder
- writeCallMetric(TableReference) - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils
- WriteFailure(Write, WriteResult, Status) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteFailure
- WriteGrouped(SpannerIO.Write) - Constructor for class org.apache.beam.sdk.io.gcp.spanner.SpannerIO.WriteGrouped
- writeMessages() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that writes to a Google Cloud Pub/Sub stream.
- writeProtos(Class<T>) - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that writes binary encoded protobuf messages of a given type to a Google Cloud Pub/Sub stream.
- WriteRegistrar() - Constructor for class org.apache.beam.sdk.io.gcp.pubsub.PubSubPayloadTranslation.WriteRegistrar
- WriteResult - Class in org.apache.beam.sdk.io.gcp.bigquery
-
The result of a
BigQueryIO.Write
transform.
- writeStrings() - Static method in class org.apache.beam.sdk.io.gcp.pubsub.PubsubIO
-
Returns a
PTransform
that writes UTF-8 encoded strings to a Google Cloud Pub/Sub stream.
- WriteSuccessSummary(int, long) - Constructor for class org.apache.beam.sdk.io.gcp.firestore.FirestoreV1.WriteSuccessSummary
- writeTableRows() - Static method in class org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO
-
A
PTransform
that writes a
PCollection
containing
TableRows
to a BigQuery table.
All Classes All Packages