io.prediction.controller

PDataSource

abstract class PDataSource[TD, EI, Q, A] extends BaseDataSource[TD, EI, Q, A]

Base class of a parallel data source.

A parallel data source runs locally within a single machine, or in parallel on a cluster, to return data that is distributed across a cluster.

TD

Training data class.

EI

Evaluation Info class.

Q

Input query class.

A

Actual value class.

Linear Supertypes
BaseDataSource[TD, EI, Q, A], AbstractDoer, Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new PDataSource()

Abstract Value Members

  1. abstract def readTraining(sc: SparkContext): TD

    Implement this method to return only training data from a data source (evaluation data is handled separately by readEval).
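
    As a minimal sketch, a concrete subclass might implement readTraining like the following. All class names here (MyTrainingData, MyEvalInfo, MyQuery, MyActual, MyDataSource) and the in-memory sample data are assumptions for illustration only, not part of this API:

    ```scala
    import io.prediction.controller.PDataSource
    import org.apache.spark.SparkContext
    import org.apache.spark.rdd.RDD

    // Hypothetical engine classes, for illustration only.
    case class MyTrainingData(ratings: RDD[(String, String, Double)])
    case class MyEvalInfo()
    case class MyQuery(user: String)
    case class MyActual(items: Set[String])

    class MyDataSource
      extends PDataSource[MyTrainingData, MyEvalInfo, MyQuery, MyActual] {

      // Return training data as RDDs so Spark can keep it distributed
      // across the cluster.
      override def readTraining(sc: SparkContext): MyTrainingData = {
        // A real engine would read from the event server or other storage;
        // a tiny in-memory sample is parallelized here as a placeholder.
        val ratings = sc.parallelize(Seq(
          ("u1", "i1", 4.0),
          ("u2", "i2", 5.0)
        ))
        MyTrainingData(ratings)
      }
    }
    ```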

Concrete Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  13. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  14. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  15. final def notify(): Unit

    Definition Classes
    AnyRef
  16. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  17. def readEval(sc: SparkContext): Seq[(TD, EI, RDD[(Q, A)])]

    To provide an evaluation feature for your engine, you must override this method to return data for evaluation from a data source. Returned data can optionally include a sequence of query and actual value pairs for evaluation purposes.

    The default implementation returns an empty sequence as a stub, so that an engine can be compiled without implementing evaluation.
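
    A sketch of overriding readEval inside a PDataSource subclass follows. The type names (MyTrainingData, MyEvalInfo, MyQuery, MyActual) and the single hand-built fold are hypothetical, used only to show the shape of the returned sequence:

    ```scala
    import org.apache.spark.SparkContext
    import org.apache.spark.rdd.RDD

    // Inside a PDataSource[MyTrainingData, MyEvalInfo, MyQuery, MyActual]
    // subclass; the My* classes are assumed for illustration.
    override def readEval(sc: SparkContext)
      : Seq[(MyTrainingData, MyEvalInfo, RDD[(MyQuery, MyActual)])] = {
      // Each element of the sequence is one evaluation fold: training data,
      // evaluation info, and an RDD of (query, actual) pairs that the
      // evaluation workflow scores the engine against.
      val training = MyTrainingData(sc.parallelize(Seq(("u1", "i1", 4.0))))
      val queryActual: RDD[(MyQuery, MyActual)] =
        sc.parallelize(Seq((MyQuery("u1"), MyActual(Set("i2")))))
      Seq((training, MyEvalInfo(), queryActual))
    }
    ```

    Returning several tuples instead of one yields multiple folds, which the evaluation workflow runs independently.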

  18. def readEvalBase(sc: SparkContext): Seq[(TD, EI, RDD[(Q, A)])]

    :: DeveloperApi :: Engine developers should not use this directly. This is called by the evaluation workflow to read training and validation data.

    sc

    Spark context

    returns

    Sets of training data, evaluation information, queries, and actual results

    Definition Classes
    PDataSource → BaseDataSource
  19. def readTrainingBase(sc: SparkContext): TD

    :: DeveloperApi :: Engine developers should not use this directly. This is called by the workflow to read training data.

    sc

    Spark context

    returns

    Training data

    Definition Classes
    PDataSource → BaseDataSource
  20. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  21. def toString(): String

    Definition Classes
    AnyRef → Any
  22. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def read(sc: SparkContext): Seq[(TD, EI, RDD[(Q, A)])]

    Annotations
    @deprecated
    Deprecated

    (Since version 0.9.0) Use readEval() instead.

Inherited from BaseDataSource[TD, EI, Q, A]

Inherited from AbstractDoer

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any