Package | Description |
---|---|
com.google.api.services.dataflow | |
com.google.api.services.dataflow.model | |
Class | Description |
---|---|
Job | Defines a job to be run by the Dataflow service. |
LeaseWorkItemRequest | Request to lease WorkItems. |
ReportWorkItemStatusRequest | Request to report the status of WorkItems. |
SendWorkerMessagesRequest | A request for sending worker messages to the service. |
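As a hedged illustration of how the request classes above are used, the sketch below builds a `LeaseWorkItemRequest` and shows where it would be sent through the generated `Dataflow` client. This assumes the `google-api-services-dataflow` library (and its `google-api-client` dependencies) are on the classpath; the worker ID, work-item types, project, and job IDs are placeholder values, and the network call itself is left commented out because it requires real credentials.

```java
import java.util.Arrays;

import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.dataflow.Dataflow;
import com.google.api.services.dataflow.model.LeaseWorkItemRequest;

public class LeaseSketch {
    public static void main(String[] args) throws Exception {
        // Build the generated service client; a production client would also
        // pass an HttpRequestInitializer that attaches OAuth2 credentials
        // instead of null.
        Dataflow dataflow = new Dataflow.Builder(
                GoogleNetHttpTransport.newTrustedTransport(),
                JacksonFactory.getDefaultInstance(),
                /* httpRequestInitializer= */ null)
            .setApplicationName("dataflow-sketch")
            .build();

        // Populate the request model; "worker-0" and the work-item types are
        // illustrative, not required values.
        LeaseWorkItemRequest lease = new LeaseWorkItemRequest()
            .setWorkerId("worker-0")
            .setWorkItemTypes(Arrays.asList("map_task", "seq_map_task"));

        // Sending the lease request needs valid credentials and a live job,
        // so the call is shown but not executed:
        // dataflow.projects().jobs().workItems()
        //     .lease("my-project", "my-job-id", lease).execute();

        System.out.println(lease.getWorkerId());
    }
}
```

The same builder-plus-model pattern applies to `ReportWorkItemStatusRequest` and `SendWorkerMessagesRequest`: construct the model object, then pass it to the corresponding `Dataflow` client method.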
Class | Description |
---|---|
ApproximateProgress | Obsolete in favor of ApproximateReportedProgress and ApproximateSplitRequest. |
ApproximateReportedProgress | A progress measurement of a WorkItem by a worker. |
ApproximateSplitRequest | A suggestion by the service to the worker to dynamically split the WorkItem. |
AutoscalingSettings | Settings for WorkerPool autoscaling. |
ComputationTopology | All configuration data for a particular Computation. |
ConcatPosition | A position that encapsulates an inner position and an index for the inner position. |
CustomSourceLocation | Identifies the location of a custom source. |
DataDiskAssignment | Data disk assignment for a given VM instance. |
DataflowPackage | Packages that need to be installed in order for a worker to run the steps of the Dataflow job which will be assigned to its worker pool. |
DerivedSource | Specification of one of the bundles produced as a result of splitting a Source. |
Disk | Describes the data disk used by a workflow job. |
DynamicSourceSplit | When a task splits using WorkItemStatus.dynamic_source_split, this message describes the two parts of the split relative to the description of the current task's input. |
Environment | Describes the environment in which a Dataflow Job runs. |
FlattenInstruction | An instruction that copies its inputs (zero or more) to its (single) output. |
InstructionInput | An input of an instruction, as a reference to an output of a producer instruction. |
InstructionOutput | An output of an instruction. |
Job | Defines a job to be run by the Dataflow service. |
JobExecutionInfo | Additional information about how a Dataflow job will be executed which isn't contained in the submitted job. |
JobExecutionStageInfo | Contains information about how a particular google.dataflow.v1beta3.Step will be executed. |
JobMessage | A particular message pertaining to a Dataflow job. |
JobMetrics | JobMetrics contains a collection of metrics describing the detailed progress of a Dataflow job. |
KeyRangeDataDiskAssignment | Data disk assignment information for a specific key-range of a sharded computation. |
KeyRangeLocation | Location information for a specific key-range of a sharded computation. |
LeaseWorkItemRequest | Request to lease WorkItems. |
LeaseWorkItemResponse | Response to a request to lease WorkItems. |
ListJobMessagesResponse | Response to a request to list job messages. |
ListJobsResponse | Response to a request to list Dataflow jobs. |
MapTask | MapTask consists of an ordered set of instructions, each of which describes one particular low-level operation for the worker to perform in order to accomplish the MapTask's WorkItem. |
MetricStructuredName | Identifies a metric, by describing the source which generated the metric. |
MetricUpdate | Describes the state of a metric. |
MountedDataDisk | Describes a mounted data disk. |
MultiOutputInfo | Information about an output of a multi-output DoFn. |
ParallelInstruction | Describes a particular operation comprising a MapTask. |
ParDoInstruction | An instruction that does a ParDo operation. |
PartialGroupByKeyInstruction | An instruction that does a partial group-by-key. |
Position | Position defines a position within a collection of data. |
PubsubLocation | Identifies a pubsub location to use for transferring data into or out of a streaming Dataflow job. |
ReadInstruction | An instruction that reads records. |
ReportedParallelism | Represents the level of parallelism in a WorkItem's input, reported by the worker. |
ReportWorkItemStatusRequest | Request to report the status of WorkItems. |
ReportWorkItemStatusResponse | Response from a request to report the status of WorkItems. |
SendWorkerMessagesRequest | A request for sending worker messages to the service. |
SendWorkerMessagesResponse | The response to the worker messages. |
SeqMapTask | Describes a particular function to invoke. |
SeqMapTaskOutputInfo | Information about an output of a SeqMapTask. |
ShellTask | A task which consists of a shell command for the worker to execute. |
SideInputInfo | Information about a side input of a DoFn or an input of a SeqDoFn. |
Sink | A sink that records can be encoded and written to. |
Source | A source that records can be read and decoded from. |
SourceFork | DEPRECATED in favor of DynamicSourceSplit. |
SourceGetMetadataRequest | A request to compute the SourceMetadata of a Source. |
SourceGetMetadataResponse | The result of a SourceGetMetadataOperation. |
SourceMetadata | Metadata about a Source useful for automatically optimizing and tuning the pipeline, etc. |
SourceOperationRequest | A work item that represents the different operations that can be performed on a user-defined Source specification. |
SourceOperationResponse | The result of a SourceOperationRequest, specified in ReportWorkItemStatusRequest.source_operation when the work item is completed. |
SourceSplitOptions | Hints for splitting a Source into bundles (parts for parallel processing) using SourceSplitRequest. |
SourceSplitRequest | Represents the operation to split a high-level Source specification into bundles (parts for parallel processing). |
SourceSplitResponse | The response to a SourceSplitRequest. |
SourceSplitShard | DEPRECATED in favor of DerivedSource. |
StateFamilyConfig | State family configuration. |
Status | The `Status` type defines a logical error model that is suitable for different programming environments, including REST APIs and RPC APIs. |
Step | Defines a particular step within a Dataflow job. |
StreamingComputationRanges | Describes full or partial data disk assignment information of the computation ranges. |
StreamingComputationTask | A task which describes what action should be performed for the specified streaming computation ranges. |
StreamingSetupTask | A task which initializes part of a streaming Dataflow job. |
StreamingSideInputLocation | Identifies the location of a streaming side input. |
StreamingStageLocation | Identifies the location of a streaming computation stage, for stage-to-stage communication. |
StreamLocation | Describes a stream of data, either as input to be processed or as output of a streaming Dataflow job. |
TaskRunnerSettings | Taskrunner configuration settings. |
TopologyConfig | Global topology of the streaming Dataflow job, including all computations and their sharded locations. |
WorkerHealthReport | WorkerHealthReport contains information about the health of a worker. |
WorkerHealthReportResponse | WorkerHealthReportResponse contains information returned to the worker in response to a health ping. |
WorkerMessage | WorkerMessage provides information to the backend about a worker. |
WorkerMessageCode | A message code is used to report status and error messages to the service. |
WorkerMessageResponse | A worker_message response allows the server to pass information to the sender. |
WorkerPool | Describes one particular pool of Dataflow workers to be instantiated by the Dataflow service in order to perform the computations required by a job. |
WorkerSettings | Provides data to pass through to the worker harness. |
WorkItem | WorkItem represents basic information about a WorkItem to be executed in the cloud. |
WorkItemServiceState | The Dataflow service's idea of the current state of a WorkItem being processed by a worker. |
WorkItemStatus | Conveys a worker's progress through the work described by a WorkItem. |
WriteInstruction | An instruction that writes records. |
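All of the model classes above follow the same generated pattern: plain data objects with chainable setters and matching getters. As a minimal sketch (assuming the `google-api-services-dataflow` library is on the classpath; the job name and worker counts are illustrative placeholders, not defaults), this shows how a `Job` is assembled from an `Environment`, a `WorkerPool`, and `AutoscalingSettings`:

```java
import java.util.Collections;

import com.google.api.services.dataflow.model.AutoscalingSettings;
import com.google.api.services.dataflow.model.Environment;
import com.google.api.services.dataflow.model.Job;
import com.google.api.services.dataflow.model.WorkerPool;

public class ModelSketch {
    public static void main(String[] args) {
        // A worker pool with a fixed starting size and an autoscaling ceiling;
        // the numbers here are example values only.
        WorkerPool pool = new WorkerPool()
            .setNumWorkers(3)
            .setAutoscalingSettings(
                new AutoscalingSettings().setMaxNumWorkers(10));

        // Compose the nested model objects into a Job ready for submission.
        Job job = new Job()
            .setName("example-job")
            .setEnvironment(new Environment()
                .setWorkerPools(Collections.singletonList(pool)));

        System.out.println(job.getName() + " -> "
            + job.getEnvironment().getWorkerPools().get(0).getNumWorkers()
            + " workers");
    }
}
```

Because every setter returns the model object itself, nested structures like `Environment.workerPools` can be built inline and the finished `Job` handed to the client's `projects().jobs().create(...)` method.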