public final class WriteResult
extends java.lang.Object
implements org.apache.beam.sdk.values.POutput
The result of a BigQueryIO.Write transform.

Modifier and Type | Method and Description
---|---
java.util.Map<org.apache.beam.sdk.values.TupleTag<?>,org.apache.beam.sdk.values.PValue> | expand()
void | finishSpecifyingOutput(java.lang.String transformName, org.apache.beam.sdk.values.PInput input, org.apache.beam.sdk.transforms.PTransform<?,?> transform)
org.apache.beam.sdk.values.PCollection<com.google.api.services.bigquery.model.TableRow> | getFailedInserts() Returns a PCollection containing the TableRows that failed to be written to BigQuery.
org.apache.beam.sdk.values.PCollection<BigQueryInsertError> | getFailedInsertsWithErr() Returns a PCollection containing the BigQueryInsertErrors with detailed error information.
org.apache.beam.sdk.values.PCollection<BigQueryStorageApiInsertError> | getFailedStorageApiInserts() Returns any rows that persistently failed to insert when using a Storage API write method.
org.apache.beam.sdk.Pipeline | getPipeline()
org.apache.beam.sdk.values.PCollection<com.google.api.services.bigquery.model.TableRow> | getSuccessfulInserts() Returns a PCollection containing the TableRows that were written to BigQuery via the streaming insert API.
org.apache.beam.sdk.values.PCollection<com.google.api.services.bigquery.model.TableRow> | getSuccessfulStorageApiInserts() Returns all rows successfully inserted using one of the Storage API insert methods.
org.apache.beam.sdk.values.PCollection<TableDestination> | getSuccessfulTableLoads() Returns a PCollection containing the TableDestinations that were successfully loaded using the batch load API.
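As a sketch of how a WriteResult is typically obtained (the project, dataset, and table names below are placeholders, and the snippet assumes the `beam-sdks-java-io-google-cloud-platform` dependency is on the classpath):

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.io.gcp.bigquery.WriteResult;
import org.apache.beam.sdk.transforms.Create;

public class WriteResultExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create();

    // Applying BigQueryIO.Write yields a WriteResult rather than a PCollection.
    WriteResult result =
        p.apply(
                Create.of(new TableRow().set("name", "example"))
                    .withCoder(TableRowJsonCoder.of()))
            .apply(
                BigQueryIO.writeTableRows()
                    .to("my-project:my_dataset.my_table") // placeholder destination
                    .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS));

    p.run().waitUntilFinish();
  }
}
```

The WriteResult can then be queried with the accessors below to recover failed or successful rows.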
public java.util.Map<org.apache.beam.sdk.values.TupleTag<?>,org.apache.beam.sdk.values.PValue> expand()

Specified by: expand in interface org.apache.beam.sdk.values.POutput
public org.apache.beam.sdk.values.PCollection<TableDestination> getSuccessfulTableLoads()

Returns a PCollection containing the TableDestinations that were successfully loaded using the batch load API.

public org.apache.beam.sdk.values.PCollection<com.google.api.services.bigquery.model.TableRow> getSuccessfulInserts()

Returns a PCollection containing the TableRows that were written to BigQuery via the streaming insert API.

public org.apache.beam.sdk.values.PCollection<com.google.api.services.bigquery.model.TableRow> getFailedInserts()

Returns a PCollection containing the TableRows that failed to be written to BigQuery. Use this method only if BigQueryIO.Write.withExtendedErrorInfo() is not enabled; otherwise use getFailedInsertsWithErr().
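For instance, the failed rows can be routed to a side output for logging or reprocessing. A minimal sketch, assuming `rows` is an existing `PCollection<TableRow>`, extended error info is not enabled, and the destination string is a placeholder:

```java
WriteResult result =
    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table") // placeholder destination
            .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
            .withFailedInsertRetryPolicy(InsertRetryPolicy.retryTransientErrors()));

// Rows that exhausted their retries come back as plain TableRows.
result
    .getFailedInserts()
    .apply(
        "FormatFailedRows",
        MapElements.into(TypeDescriptors.strings()).via(row -> "failed insert: " + row));
```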
public org.apache.beam.sdk.values.PCollection<BigQueryInsertError> getFailedInsertsWithErr()

Returns a PCollection containing the BigQueryInsertErrors with detailed error information. Use this method only if BigQueryIO.Write.withExtendedErrorInfo() is enabled; otherwise use getFailedInserts().
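A sketch of consuming the detailed errors, assuming the write was configured with BigQueryIO.Write.withExtendedErrorInfo() and `result` is the WriteResult returned by that write:

```java
result
    .getFailedInsertsWithErr()
    .apply(
        "FormatInsertErrors",
        MapElements.into(TypeDescriptors.strings())
            // BigQueryInsertError carries the failed row, the error payload,
            // and the destination table reference.
            .via(err -> err.getTable() + ": " + err.getError()));
```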
public org.apache.beam.sdk.values.PCollection<BigQueryStorageApiInsertError> getFailedStorageApiInserts()

Returns any rows that persistently failed to insert when using a Storage API write method.
public org.apache.beam.sdk.values.PCollection<com.google.api.services.bigquery.model.TableRow> getSuccessfulStorageApiInserts()

Returns all rows successfully inserted using one of the Storage API insert methods.
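When writing with a Storage API method, both outputs can be consumed from the same WriteResult. A sketch, assuming `rows` is a `PCollection<TableRow>`, the destination is a placeholder, and (in recent Beam versions) successful rows are only propagated when withPropagateSuccessfulStorageApiWrites(true) is set:

```java
WriteResult result =
    rows.apply(
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table") // placeholder destination
            .withMethod(BigQueryIO.Write.Method.STORAGE_WRITE_API)
            .withPropagateSuccessfulStorageApiWrites(true));

// Rows that permanently failed, with the error message attached.
result
    .getFailedStorageApiInserts()
    .apply(
        "FormatStorageApiErrors",
        MapElements.into(TypeDescriptors.strings())
            .via(err -> err.getErrorMessage() + ": " + err.getRow()));

// Rows that were committed successfully.
PCollection<TableRow> ok = result.getSuccessfulStorageApiInserts();
```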
public org.apache.beam.sdk.Pipeline getPipeline()

Specified by: getPipeline in interface org.apache.beam.sdk.values.POutput
public void finishSpecifyingOutput(java.lang.String transformName, org.apache.beam.sdk.values.PInput input, org.apache.beam.sdk.transforms.PTransform<?,?> transform)

Specified by: finishSpecifyingOutput in interface org.apache.beam.sdk.values.POutput