Class/Object

org.platanios.tensorflow.api.ops.io.data

InterleaveDataset

Related Docs: object InterleaveDataset | package data

case class InterleaveDataset[T, O, D, S, RT, RO, RD, RS](inputDataset: Dataset[T, O, D, S], function: (O) ⇒ Dataset[RT, RO, RD, RS], cycleLength: Output, blockLength: Output = 1, name: String = "InterleaveDataset")(implicit evData: Aux[T, O, D, S] = inputDataset.evData, evFunctionInput: ArgType[O] = inputDataset.evFunctionInput, evStructure: Aux[RT, RO, RD, RS], evRData: Aux[RT, RO, RD, RS], evFunctionOutput: ArgType[RO]) extends Dataset[RT, RO, RD, RS] with Product with Serializable

Dataset that wraps the application of the interleave op.

The interleave op applies function to each element of the input dataset to obtain nested datasets, and interleaves their elements in the resulting dataset: cycleLength input elements are processed concurrently, and blockLength consecutive elements are produced from each nested dataset before cycling on to the next one.

T

Tensor type (i.e., nested structure of tensors).

O

Output type (i.e., nested structure of symbolic tensors).

D

Data type of the outputs (i.e., nested structure of TensorFlow data types).

S

Shape type of the outputs (i.e., nested structure of TensorFlow shapes).

inputDataset

Input dataset.

function

Mapping function.

cycleLength

Number of elements from the input dataset that will be processed concurrently.

blockLength

Number of consecutive elements to produce from each input element before cycling to another input element.

name

Name for this dataset.
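
For illustration, the following is a minimal sketch that reuses the entry points appearing in the shard example further down this page (tf.data.listFiles and tf.data.TFRecordDataset); filePattern, numReaders, and blockSize are placeholder values, not part of this API.

  // Read TensorFlow record files, drawing from numReaders files at a time and
  // emitting blockSize consecutive records from each file before moving on.
  val files = tf.data.listFiles(filePattern)
  val records = files.interleave(
    tf.data.TFRecordDataset,    // maps each file name to a nested dataset of its records
    cycleLength = numReaders,   // number of nested datasets drawn from concurrently
    blockLength = blockSize)    // consecutive elements taken from each nested dataset per turn

With cycleLength = 2 and blockLength = 2, for example, the resulting dataset yields two consecutive records from the first file, two from the second, two more from the first, and so on.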

Linear Supertypes
Serializable, Serializable, Product, Equals, Dataset[RT, RO, RD, RS], AnyRef, Any

Instance Constructors

  1. new InterleaveDataset(inputDataset: Dataset[T, O, D, S], function: (O) ⇒ Dataset[RT, RO, RD, RS], cycleLength: Output, blockLength: Output = 1, name: String = "InterleaveDataset")(implicit evData: Aux[T, O, D, S] = inputDataset.evData, evFunctionInput: ArgType[O] = inputDataset.evFunctionInput, evStructure: Aux[RT, RO, RD, RS], evRData: Aux[RT, RO, RD, RS], evFunctionOutput: ArgType[RO])

    inputDataset

    Input dataset.

    function

    Mapping function.

    cycleLength

    Number of elements from the input dataset that will be processed concurrently.

    blockLength

    Number of consecutive elements to produce from each input element before cycling to another input element.

    name

    Name for this dataset.

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. val blockLength: Output

    Number of consecutive elements to produce from each input element before cycling to another input element.

  6. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  7. def createHandle(): Output

    Creates a VARIANT scalar tensor representing this dataset. This function adds ops to the current graph that create the dataset resource.

    Definition Classes
    InterleaveDataset → Dataset
  8. def createInitializableIterator(sharedName: String = "", name: String = "InitializableIterator"): InitializableIterator[RT, RO, RD, RS]

    Creates an Iterator for enumerating the elements of this dataset.

    **Note:** The returned iterator will be in an uninitialized state. You must execute the InitializableIterator.initializer op before using it. A short usage sketch appears after this member list.

    sharedName

    If non-empty, then the constructed reader will be shared under the provided name across multiple sessions that share the same devices (e.g., when using a remote server).

    name

    Name for the op created in relation to the iterator.

    returns

    Created iterator.

    Definition Classes
    Dataset
  9. val cycleLength: Output

    Number of elements from the input dataset that will be processed concurrently.

  10. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  11. implicit val evData: Aux[RT, RO, RD, RS]

    Definition Classes
    Dataset
  12. implicit val evFunctionInput: ArgType[RO]

    Definition Classes
    Dataset
  13. implicit val evStructure: Aux[RT, RO, RD, RS]

    Definition Classes
    Dataset
  14. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  15. val function: (O) ⇒ Dataset[RT, RO, RD, RS]

    Mapping function.

  16. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  17. val inputDataset: Dataset[T, O, D, S]

    Input dataset.

  18. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  19. val name: String

    Name for this dataset.

    Definition Classes
    InterleaveDataset → Dataset
  20. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  21. final def notify(): Unit

    Definition Classes
    AnyRef
  22. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  23. def outputDataTypes: RD

    Returns the data types corresponding to each element of this dataset, matching the structure of the elements.

    Definition Classes
    InterleaveDataset → Dataset
  24. def outputShapes: RS

    Returns the shapes corresponding to each element of this dataset, matching the structure of the elements.

    Definition Classes
    InterleaveDataset → Dataset
  25. def shard(numShards: Long, shardIndex: Long): Dataset[RT, RO, RD, RS]

    Creates a dataset that includes only 1 / numShards of the elements of this dataset.

    This operator is very useful when running distributed training, as it allows each worker to read a unique subset of the dataset.

    When reading a single input file, you can skip elements as follows:

    tf.data.TFRecordDataset(inputFile)
      .shard(numWorkers, workerIndex)
      .repeat(numEpochs)
      .shuffle(shuffleBufferSize)
      .map(parserFn, numParallelCalls)

    Important caveats:

    • Be sure to shard before you use any randomizing operator (such as shuffle).
    • Generally, it is best to use the shard operator early in the dataset pipeline. For example, when reading from a set of TensorFlow record files, shard before converting the dataset to input samples. This avoids reading every file on every worker. The following is an example of an efficient sharding strategy within a complete pipeline:

    tf.data.listFiles(pattern)
      .shard(numWorkers, workerIndex)
      .repeat(numEpochs)
      .shuffle(shuffleBufferSize)
      .interleave(tf.data.TFRecordDataset, cycleLength = numReaders, blockLength = 1)
      .map(parserFn, numParallelCalls)

    numShards

    Number of shards to use.

    shardIndex

    Index of the shard to obtain.

    returns

    Created (sharded) dataset.

    Definition Classes
    Dataset
  26. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  27. def toString(): String

    Definition Classes
    Dataset → AnyRef → Any
  28. def transform[TT, TO, TD, TS](transformFn: (Dataset[RT, RO, RD, RS]) ⇒ Dataset[TT, TO, TD, TS])(implicit evStructure: Aux[TT, TO, TD, TS], evT: Aux[TT, TO, TD, TS], evFunctionInputT: ArgType[TO]): Dataset[TT, TO, TD, TS]

    Applies a transformation function to this dataset.

    transform() enables chaining of custom dataset transformations, which are represented as functions that take one dataset argument and return a transformed dataset. A short sketch appears after this member list.

    transformFn

    Dataset transformation function.

    returns

    Transformed dataset.

    Definition Classes
    Dataset
  29. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  30. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  31. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
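
Usage sketch for createInitializableIterator, as referenced above. This is a hedged example: records stands for any concrete dataset (for instance, the one built in the interleave sketch near the top of this page), and the Session and run(fetches = ..., targets = ...) calls follow the usual tensorflow_scala conventions; they are assumptions, not something documented on this page.

  val iterator = records.createInitializableIterator()
  val nextElement = iterator.next()            // symbolic tensor(s) for the next element (assumed API)

  val session = tf.Session()
  session.run(targets = iterator.initializer)  // the iterator starts uninitialized; run its initializer first
  val firstValue = session.run(fetches = nextElement)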
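
Likewise, a minimal sketch of transform(), as referenced above. The shuffle and repeat calls mirror the ones in the shard example and only illustrate that the transformation is an ordinary function from one dataset to another; shuffleBufferSize and numEpochs are placeholders.

  // transform() lets a reusable Dataset-to-Dataset function be applied within a pipeline.
  val transformed = records.transform(d => d.shuffle(shuffleBufferSize).repeat(numEpochs))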
