io.prediction.controller

PAlgorithm

abstract class PAlgorithm[PD, M, Q, P] extends BaseAlgorithm[PD, M, Q, P]

Base class of a parallel algorithm.

A parallel algorithm can be run in parallel on a cluster and produces a model that can also be distributed across a cluster.

If your input query class requires custom JSON4S serialization, the most idiomatic way is to implement a trait that extends CustomQuerySerializer and mix it into your algorithm class, instead of overriding querySerializer directly.

To support evaluation, one must override and implement the batchPredict method; otherwise, an exception will be thrown when pio eval is used.

PD

Prepared data class.

M

Trained model class.

Q

Input query class.

P

Output prediction class.
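
A minimal sketch of a concrete parallel algorithm, assuming hypothetical MyPreparedData, MyModel, MyQuery and MyPrediction classes; the word-counting logic is purely illustrative and not part of this API. Note that the model holds an RDD, so it is itself distributed across the cluster.

import io.prediction.controller.PAlgorithm
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// Hypothetical engine-specific classes (assumptions for illustration only).
case class MyPreparedData(events: RDD[String])
case class MyModel(counts: RDD[(String, Long)])   // the model is itself distributed
case class MyQuery(item: String)
case class MyPrediction(count: Long)

class MyAlgorithm
  extends PAlgorithm[MyPreparedData, MyModel, MyQuery, MyPrediction] {

  // Produce a model from prepared data; runs in parallel on the cluster.
  def train(sc: SparkContext, pd: MyPreparedData): MyModel =
    MyModel(pd.events.map(e => (e, 1L)).reduceByKey(_ + _).cache())

  // Answer a single query against the trained, distributed model.
  def predict(model: MyModel, query: MyQuery): MyPrediction =
    MyPrediction(
      model.counts.filter(_._1 == query.item).map(_._2)
        .collect().headOption.getOrElse(0L))
}

Because this model is backed by an RDD, it cannot be serialized automatically; see makePersistentModel and the PersistentModel sketch later on this page for how such models are persisted.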

Linear Supertypes
BaseAlgorithm[PD, M, Q, P], BaseQuerySerializer, AbstractDoer, Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new PAlgorithm()

Abstract Value Members

  1. abstract def predict(model: M, query: Q): P

    Implement this method to produce a prediction from a query and trained model.

    model

    Trained model produced by train.

    query

    An input query.

    returns

    A prediction.

  2. abstract def train(sc: SparkContext, pd: PD): M

    Implement this method to produce a model from prepared data.

    sc

    Spark context

    pd

    Prepared data for model training.

    returns

    Trained model.

Concrete Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def batchPredict(m: M, qs: RDD[(Long, Q)]): RDD[(Long, P)]

    To support evaluation, one must override and implement this method to generate many predictions in batch; otherwise, an exception will be thrown when pio eval is used.

    The default implementation throws an exception. A sketch of a possible override appears after this member list.

    m

    Trained model produced by train.

    qs

    An RDD of index-query tuples. The index is used to associate each predicted result with its corresponding query.

  8. def batchPredictBase(sc: SparkContext, bm: Any, qs: RDD[(Long, Q)]): RDD[(Long, P)]

    :: DeveloperApi :: Engine developers should not use this directly. This is called by the evaluation workflow to perform batch prediction.

    sc

    Spark context

    bm

    Model

    qs

    Batch of queries

    returns

    Batch of predicted results

    Definition Classes
    PAlgorithm → BaseAlgorithm
  9. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  10. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  11. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  12. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  13. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  14. lazy val gsonTypeAdapterFactories: Seq[TypeAdapterFactory]

    :: DeveloperApi :: Serializer for Java query classes using Gson

    Definition Classes
    BaseQuerySerializer
  15. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  16. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  17. def makePersistentModel(sc: SparkContext, modelId: String, algoParams: Params, bm: Any): Any

    :: DeveloperApi :: Engine developers should not use this directly (read on to see how parallel algorithm models are persisted).

    In general, parallel models may contain multiple RDDs. It is not easy to infer and persist them programmatically, since these RDDs are potentially huge. To persist such models, engine developers need to mix the PersistentModel trait into the model class and implement PersistentModel.save. If it returns true, an io.prediction.workflow.PersistentModelManifest will be returned so that during deployment, PredictionIO will use PersistentModelLoader to retrieve the model. Otherwise, Unit will be returned and the model will be re-trained on the fly. A sketch of this pattern appears after this member list.

    sc

    Spark context

    modelId

    Model ID

    algoParams

    Algorithm parameters that trained this model

    bm

    Model

    returns

    The model itself for automatic persistence, an instance of io.prediction.workflow.PersistentModelManifest for manual persistence, or Unit for re-training on deployment

    Definition Classes
    PAlgorithm → BaseAlgorithm
    Annotations
    @DeveloperApi()
  18. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  19. final def notify(): Unit

    Definition Classes
    AnyRef
  20. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  21. def predictBase(baseModel: Any, query: Q): P

    :: DeveloperApi :: Engine developers should not use this directly. Called by serving to perform a single prediction.

    returns

    Predicted result

    Definition Classes
    PAlgorithm → BaseAlgorithm
  22. def queryClass: Class[Q]

    :: DeveloperApi :: Obtains the type signature of query for this algorithm

    returns

    Type signature of query

    Definition Classes
    BaseAlgorithm
  23. lazy val querySerializer: Formats

    :: DeveloperApi :: Serializer for Scala query classes using io.prediction.controller.Utils.json4sDefaultFormats. A sketch of a custom serializer trait appears after this member list.

    Definition Classes
    BaseQuerySerializer
  24. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  25. def toString(): String

    Definition Classes
    AnyRef → Any
  26. def trainBase(sc: SparkContext, pd: PD): M

    :: DeveloperApi :: Engine developers should not use this directly. This is called by the workflow to train a model.

    sc

    Spark context

    pd

    Prepared data

    returns

    Trained model

    Definition Classes
    PAlgorithm → BaseAlgorithm
  27. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  28. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  29. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
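
A sketch of the batchPredict override mentioned under member 7, placed inside the hypothetical MyAlgorithm class from the earlier sketch; the join-based logic and class names are illustrative assumptions. This is the method the evaluation workflow relies on (via batchPredictBase) when pio eval is run.

// Inside the hypothetical MyAlgorithm class shown earlier.
override def batchPredict(
  m: MyModel, qs: RDD[(Long, MyQuery)]): RDD[(Long, MyPrediction)] = {
  // Key the indexed queries by item so they can be joined with the
  // distributed model; the Long index keeps each prediction attached to
  // its originating query.
  val queriesByItem = qs.map { case (index, query) => (query.item, index) }
  queriesByItem.leftOuterJoin(m.counts).map {
    case (_, (index, maybeCount)) => (index, MyPrediction(maybeCount.getOrElse(0L)))
  }
}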
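
As described under makePersistentModel, an RDD-backed model can be persisted by mixing PersistentModel into the model class and providing a matching PersistentModelLoader. A hedged sketch follows, re-declaring the hypothetical MyModel from the first sketch with the trait mixed in; MyAlgoParams and the storage path are illustrative assumptions.

import io.prediction.controller.{Params, PersistentModel, PersistentModelLoader}
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// Hypothetical algorithm parameters (an assumption for illustration).
case class MyAlgoParams(minCount: Int = 1) extends Params

case class MyModel(counts: RDD[(String, Long)])
    extends PersistentModel[MyAlgoParams] {
  // Returning true tells PredictionIO the model was saved manually, so an
  // io.prediction.workflow.PersistentModelManifest is recorded instead of
  // re-training the model on deployment.
  def save(id: String, params: MyAlgoParams, sc: SparkContext): Boolean = {
    counts.saveAsObjectFile(s"/tmp/pio-models/$id")  // illustrative path
    true
  }
}

object MyModel extends PersistentModelLoader[MyAlgoParams, MyModel] {
  // Called by PredictionIO at deploy time to restore the saved model.
  def apply(id: String, params: MyAlgoParams, sc: Option[SparkContext]): MyModel =
    MyModel(sc.get.objectFile[(String, Long)](s"/tmp/pio-models/$id"))
}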
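
As suggested in the class description, custom JSON4S serialization for the query class is done by implementing a trait that extends CustomQuerySerializer and mixing it into the algorithm class, rather than overriding querySerializer directly there. A hedged sketch, in which the UUID-carrying query and its serializer are illustrative assumptions:

import io.prediction.controller.{CustomQuerySerializer, Utils}
import org.json4s.{CustomSerializer, JString}
import java.util.UUID

// Hypothetical query with a field JSON4S cannot handle by default.
case class MyQuery(sessionId: UUID)

// JSON4S serializer that maps UUID <-> JSON string.
class UUIDSerializer extends CustomSerializer[UUID](_ => (
  { case JString(s) => UUID.fromString(s) },
  { case id: UUID => JString(id.toString) }
))

// Mix this trait into the algorithm class, e.g.
//   class MyAlgorithm extends PAlgorithm[...] with MyQuerySerializer
trait MyQuerySerializer extends CustomQuerySerializer {
  @transient override lazy val querySerializer =
    Utils.json4sDefaultFormats + new UUIDSerializer
}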
