Class ParallelInterleaveDataset

org.platanios.tensorflow.api.ops.io.data

case class ParallelInterleaveDataset[T, O, D, S, RT, RO, RD, RS](inputDataset: Dataset[T, O, D, S], function: (O) ⇒ Dataset[RT, RO, RD, RS], cycleLength: Output, blockLength: Output = 1, sloppy: Boolean = false, bufferOutputElements: Output = null, prefetchInputElements: Output = null, name: String = "ParallelInterleaveDataset")(implicit evData: Aux[T, O, D, S] = inputDataset.evData, evFunctionInput: ArgType[O] = inputDataset.evFunctionInput, evStructure: Aux[RT, RO, RD, RS], evRData: Aux[RT, RO, RD, RS], evFunctionOutput: ArgType[RO]) extends Dataset[RT, RO, RD, RS] with Product with Serializable

Dataset that wraps the application of the parallelInterleave op.

This dataset maps `function` across the elements of `inputDataset` to produce nested datasets, and interleaves their elements: up to `cycleLength` nested datasets are processed concurrently, with `blockLength` consecutive elements taken from each before cycling to the next.

T

Tensor type (i.e., nested structure of tensors).

O

Output type (i.e., nested structure of symbolic tensors).

D

Data type of the outputs (i.e., nested structure of TensorFlow data types).

S

Shape type of the outputs (i.e., nested structure of TensorFlow shapes).

inputDataset

Input dataset.

function

Mapping function.

cycleLength

Number of elements from the input dataset that will be processed concurrently.

blockLength

Number of consecutive elements to produce from each input element before cycling to another input element.

sloppy

If false, elements are produced in deterministic order. Otherwise, the implementation is allowed, for the sake of expediency, to produce elements in a non-deterministic order.

bufferOutputElements

Number of elements each iterator being interleaved should buffer (similar to the prefetch(...) transformation for each interleaved iterator).

prefetchInputElements

Number of input elements to transform to iterators before they are needed for interleaving.

name

Name for this dataset.
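The ordering produced by cycleLength and blockLength can be illustrated with a plain-Scala sketch. This is not the library op: it ignores parallelism, buffering, and prefetching, and models only the deterministic (sloppy = false) ordering; all names here are hypothetical.

```scala
object InterleaveSketch {
  // Deterministic interleave (the sloppy = false case): up to `cycleLength`
  // inner sequences are active at once, and `blockLength` consecutive
  // elements are drawn from each active sequence before cycling to the next.
  def interleave[A, B](
      input: Seq[A], f: A => Seq[B],
      cycleLength: Int, blockLength: Int): Seq[B] = {
    val out     = scala.collection.mutable.ArrayBuffer.empty[B]
    val pending = scala.collection.mutable.Queue(input: _*)
    val active  = scala.collection.mutable.ArrayBuffer.empty[Iterator[B]]
    // Open iterators for the first `cycleLength` input elements.
    while (active.length < cycleLength && pending.nonEmpty)
      active += f(pending.dequeue()).iterator
    var idx = 0
    while (active.nonEmpty) {
      if (idx >= active.length) idx = 0
      val it    = active(idx)
      var taken = 0
      while (taken < blockLength && it.hasNext) { out += it.next(); taken += 1 }
      if (!it.hasNext) {
        // Exhausted: replace with the next pending input, or shrink the cycle.
        if (pending.nonEmpty) { active(idx) = f(pending.dequeue()).iterator; idx += 1 }
        else active.remove(idx)
      } else idx += 1
    }
    out.toSeq
  }
}
```

For example, with input 1 to 3, a function that repeats each element four times, cycleLength = 2, and blockLength = 2, this yields 1, 1, 2, 2, 1, 1, 2, 2, 3, 3, 3, 3: the third input only enters the cycle once an earlier one is exhausted.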

Linear Supertypes
Serializable, Serializable, Product, Equals, Dataset[RT, RO, RD, RS], AnyRef, Any

Instance Constructors

  1. new ParallelInterleaveDataset(inputDataset: Dataset[T, O, D, S], function: (O) ⇒ Dataset[RT, RO, RD, RS], cycleLength: Output, blockLength: Output = 1, sloppy: Boolean = false, bufferOutputElements: Output = null, prefetchInputElements: Output = null, name: String = "ParallelInterleaveDataset")(implicit evData: Aux[T, O, D, S] = inputDataset.evData, evFunctionInput: ArgType[O] = inputDataset.evFunctionInput, evStructure: Aux[RT, RO, RD, RS], evRData: Aux[RT, RO, RD, RS], evFunctionOutput: ArgType[RO])


    inputDataset

    Input dataset.

    function

    Mapping function.

    cycleLength

    Number of elements from the input dataset that will be processed concurrently.

    blockLength

    Number of consecutive elements to produce from each input element before cycling to another input element.

    sloppy

    If false, elements are produced in deterministic order. Otherwise, the implementation is allowed, for the sake of expediency, to produce elements in a non-deterministic order.

    bufferOutputElements

    Number of elements each iterator being interleaved should buffer (similar to the prefetch(...) transformation for each interleaved iterator).

    prefetchInputElements

    Number of input elements to transform to iterators before they are needed for interleaving.

    name

    Name for this dataset.

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. val blockLength: Output


    Number of consecutive elements to produce from each input element before cycling to another input element.

  6. val bufferOutputElements: Output


    Number of elements each iterator being interleaved should buffer (similar to the prefetch(...) transformation for each interleaved iterator).

  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. def createHandle(): Output

    Creates a VARIANT scalar tensor representing this dataset. This function adds ops to the current graph that create the dataset resource.

    Definition Classes
    ParallelInterleaveDataset → Dataset
  9. def createInitializableIterator(sharedName: String = "", name: String = "InitializableIterator"): InitializableIterator[RT, RO, RD, RS]

    Creates an Iterator for enumerating the elements of this dataset.

    **Note:** The returned iterator will be in an uninitialized state. You must execute the InitializableIterator.initializer op before using it.

    sharedName

    If non-empty, then the constructed reader will be shared under the provided name across multiple sessions that share the same devices (e.g., when using a remote server).

    name

    Name for the op created in relation to the iterator.

    returns

    Created iterator.

    Definition Classes
    Dataset
  10. val cycleLength: Output


    Number of elements from the input dataset that will be processed concurrently.

  11. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  12. implicit val evData: Aux[RT, RO, RD, RS]

    Definition Classes
    Dataset
  13. implicit val evFunctionInput: ArgType[RO]

    Definition Classes
    Dataset
  14. implicit val evStructure: Aux[RT, RO, RD, RS]

    Definition Classes
    Dataset
  15. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  16. val function: (O) ⇒ Dataset[RT, RO, RD, RS]


    Mapping function.

  17. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  18. val inputDataset: Dataset[T, O, D, S]


    Input dataset.

  19. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  20. val name: String

    Name for this dataset.

    Definition Classes
    ParallelInterleaveDataset → Dataset
  21. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  22. final def notify(): Unit

    Definition Classes
    AnyRef
  23. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  24. def outputDataTypes: RD

    Returns the data types corresponding to each element of this dataset, matching the structure of the elements.

    Definition Classes
    ParallelInterleaveDataset → Dataset
  25. def outputShapes: RS

    Returns the shapes corresponding to each element of this dataset, matching the structure of the elements.

    Definition Classes
    ParallelInterleaveDataset → Dataset
  26. val prefetchInputElements: Output


    Number of input elements to transform to iterators before they are needed for interleaving.

  27. def shard(numShards: Long, shardIndex: Long): Dataset[RT, RO, RD, RS]

    Creates a dataset that includes only 1 / numShards of the elements of this dataset.

    This operator is very useful when running distributed training, as it allows each worker to read a unique subset of the dataset.

    When reading a single input file, you can skip elements as follows:

    tf.data.TFRecordDataset(inputFile)
      .shard(numWorkers, workerIndex)
      .repeat(numEpochs)
      .shuffle(shuffleBufferSize)
      .map(parserFn, numParallelCalls)

    Important caveats:

    • Be sure to shard before you use any randomizing operator (such as shuffle).
    • Generally, it is best if the shard operator is used early in the dataset pipeline. For example, when reading from a set of TensorFlow record files, shard before converting the dataset to input samples. This avoids reading every file on every worker. The following is an example of an efficient sharding strategy within a complete pipeline:

    tf.data.listFiles(pattern)
      .shard(numWorkers, workerIndex)
      .repeat(numEpochs)
      .shuffle(shuffleBufferSize)
      .repeat()
      .interleave(tf.data.TFRecordDataset, cycleLength = numReaders, blockLength = 1)
      .map(parserFn, numParallelCalls)

    numShards

    Number of shards to use.

    shardIndex

    Index of the shard to obtain.

    returns

    Created (sharded) dataset.

    Definition Classes
    Dataset
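The 1 / numShards selection can be sketched in plain Scala. This assumes the usual modulo-based selection (an element is kept when its position modulo numShards equals shardIndex); the names are hypothetical and this is not the library implementation.

```scala
object ShardSketch {
  // Keeps every element whose position modulo `numShards` equals
  // `shardIndex`, i.e. a 1/numShards slice of the input sequence
  // (assumed modulo-based selection, as described for shard above).
  def shard[A](elems: Seq[A], numShards: Int, shardIndex: Int): Seq[A] = {
    require(numShards > 0 && shardIndex >= 0 && shardIndex < numShards)
    elems.zipWithIndex.collect { case (e, i) if i % numShards == shardIndex => e }
  }
}
```

With ten elements, three shards, and shardIndex = 1, this keeps the elements at positions 1, 4, and 7, which is why each worker ends up with a disjoint subset of the data.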
  28. val sloppy: Boolean

    If false, elements are produced in deterministic order. Otherwise, the implementation is allowed, for the sake of expediency, to produce elements in a non-deterministic order.

  29. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  30. def toString(): String

    Definition Classes
    Dataset → AnyRef → Any
  31. def transform[TT, TO, TD, TS](transformFn: (Dataset[RT, RO, RD, RS]) ⇒ Dataset[TT, TO, TD, TS])(implicit evStructure: Aux[TT, TO, TD, TS], evT: Aux[TT, TO, TD, TS], evFunctionInputT: ArgType[TO]): Dataset[TT, TO, TD, TS]

    Applies a transformation function to this dataset.

    transform() enables chaining of custom dataset transformations, which are represented as functions that take one dataset argument and return a transformed dataset.

    transformFn

    Dataset transformation function.

    returns

    Transformed dataset.

    Definition Classes
    Dataset
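As a plain-Scala analogy of the transform() pattern (the Pipe type below is hypothetical, not part of the library), a whole-dataset transformation is just a function from one collection to another, applied while preserving the fluent chaining style:

```scala
// Hypothetical analog of transform(): a whole-collection transformation is
// a function Seq[A] => Seq[B], and transform applies it while keeping the
// fluent chaining style.
final case class Pipe[A](elems: Seq[A]) {
  def transform[B](f: Seq[A] => Seq[B]): Pipe[B] = Pipe(f(elems))
}

// Chaining custom transformations:
val doubledEvens = Pipe(Seq(1, 2, 3, 4))
  .transform(_.filter(_ % 2 == 0))
  .transform(_.map(_ * 2))
  .elems  // Seq(4, 8)
```

Each transformation stays a reusable, named function, which is the point of transform() over inlining the logic at every call site.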
  32. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  33. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  34. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from Serializable

Inherited from Serializable

Inherited from Product

Inherited from Equals

Inherited from Dataset[RT, RO, RD, RS]

Inherited from AnyRef

Inherited from Any
