Package com.intel.analytics.zoo.pipeline.api.keras

package objectives

Linear Supertypes: AnyRef, Any

Type Members

  1. class BinaryCrossEntropy[T] extends TensorLossFunction[T]

    This loss function measures the Binary Cross Entropy between the target and the output:

    loss(o, t) = -1/n * sum_i (t[i] * log(o[i]) + (1 - t[i]) * log(1 - o[i]))

    or, when the weights argument is specified:

    loss(o, t) = -1/n * sum_i weights[i] * (t[i] * log(o[i]) + (1 - t[i]) * log(1 - o[i]))
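
    The formula above can be sketched in plain Python (an illustration of the math only, not the Scala API; the function name is hypothetical):

```python
import math

def binary_cross_entropy(output, target, weights=None):
    # loss(o, t) = -1/n * sum_i w[i] * (t[i]*log(o[i]) + (1 - t[i])*log(1 - o[i]))
    n = len(output)
    w = weights if weights is not None else [1.0] * n
    return -sum(wi * (t * math.log(o) + (1 - t) * math.log(1 - o))
                for wi, o, t in zip(w, output, target)) / n
```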

  2. class CategoricalCrossEntropy[T] extends TensorLossFunction[T]

    This is the same as the cross entropy criterion, except that the target tensor is a one-hot tensor.

  3. class CosineProximity[T] extends TensorLossFunction[T]

    The negative of the mean cosine proximity between predictions and targets. The cosine proximity is defined as:

    x'(i) = x(i) / sqrt(max(sum(x(i)^2), 1e-12))
    y'(i) = y(i) / sqrt(max(sum(y(i)^2), 1e-12))
    cosine_proximity(x, y) = mean(-1 * x'(i) * y'(i))
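
    A plain-Python sketch of this computation, normalizing each vector by its own L2 norm (illustration only; the function name is an assumption):

```python
import math

def cosine_proximity(x, y):
    # Normalize each vector by max(L2 norm^2, 1e-12)^0.5, then
    # return the mean of the negated element-wise products.
    xn = math.sqrt(max(sum(v * v for v in x), 1e-12))
    yn = math.sqrt(max(sum(v * v for v in y), 1e-12))
    xp = [v / xn for v in x]
    yp = [v / yn for v in y]
    return sum(-a * b for a, b in zip(xp, yp)) / len(x)
```

    For identical (perfectly aligned) vectors the result is -1/n summed to -1 overall mean contribution, i.e. the loss is minimized when predictions point in the same direction as targets.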

  4. class Hinge[T] extends TensorLossFunction[T]

    Creates a criterion that optimizes a two-class classification hinge loss (margin-based loss) between input x (a Tensor of dimension 1) and output y.
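
    A minimal Python sketch of a margin-based hinge loss, assuming targets encoded as -1/1 and a margin of 1 with the loss averaged over elements (illustration only, not the Scala API):

```python
def hinge(y_true, y_pred):
    # Elements with t * p >= 1 incur no loss; otherwise the loss
    # grows linearly with the margin violation.
    terms = [max(0.0, 1.0 - t * p) for t, p in zip(y_true, y_pred)]
    return sum(terms) / len(terms)
```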

  5. class KullbackLeiblerDivergence[T] extends TensorLossFunction[T]

    Loss calculated as:

    y_true = K.clip(y_true, K.epsilon(), 1)
    y_pred = K.clip(y_pred, K.epsilon(), 1)
    loss = K.sum(y_true * K.log(y_true / y_pred), axis=-1)
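
    The same calculation can be sketched in plain Python (illustration only; the epsilon value is an assumption standing in for K.epsilon()):

```python
import math

EPSILON = 1e-7  # assumed stand-in for K.epsilon()

def kld(y_true, y_pred):
    # Clip both distributions into [epsilon, 1], then sum t * log(t / p).
    t = [min(max(v, EPSILON), 1.0) for v in y_true]
    p = [min(max(v, EPSILON), 1.0) for v in y_pred]
    return sum(ti * math.log(ti / pi) for ti, pi in zip(t, p))
```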

  6. abstract class LossFunction[A <: Activity, B <: Activity, T] extends AbstractCriterion[A, B, T]

    The base class for Keras-style API objectives in Analytics Zoo.

    A: Input data type.
    B: Target data type.

  7. class MeanAbsoluteError[T] extends TensorLossFunction[T]

    A loss that measures the mean absolute value of the element-wise difference between the input and the target.
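
    The mean absolute error can be sketched in plain Python (illustration only; the function name is an assumption):

```python
def mae(a, b):
    # Mean of the element-wise absolute differences.
    return sum(abs(ai - bi) for ai, bi in zip(a, b)) / len(a)
```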

  8. class MeanAbsolutePercentageError[T] extends TensorLossFunction[T]

    It calculates diff = K.abs((y - x) / K.clip(K.abs(y), K.epsilon(), Double.MaxValue)) and returns 100 * K.mean(diff) as output.
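
    A plain-Python sketch of this formula (illustration only; the epsilon value stands in for K.epsilon() and is an assumption):

```python
import sys

def mape(y_true, y_pred, epsilon=1e-7):
    # diff = |(y - x) / clip(|y|, epsilon, Double.MaxValue)|; loss = 100 * mean(diff)
    diffs = [abs((t - p) / min(max(abs(t), epsilon), sys.float_info.max))
             for t, p in zip(y_true, y_pred)]
    return 100.0 * sum(diffs) / len(diffs)
```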

  9. class MeanSquaredError[T] extends TensorLossFunction[T]

    The mean squared error criterion, e.g. for input a, target b, and total elements n:

    loss(a, b) = 1/n * sum_i |a_i - b_i|^2
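
    The formula can be sketched in plain Python (illustration only; the function name is an assumption):

```python
def mse(a, b):
    # Mean of the element-wise squared differences.
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) / len(a)
```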

  10. class MeanSquaredLogarithmicError[T] extends TensorLossFunction[T]

    It calculates:

    first_log = K.log(K.clip(y, K.epsilon(), Double.MaxValue) + 1.)
    second_log = K.log(K.clip(x, K.epsilon(), Double.MaxValue) + 1.)
    loss = K.mean(K.square(first_log - second_log))
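
    The same calculation in plain Python (illustration only; the epsilon value stands in for K.epsilon() and is an assumption):

```python
import math
import sys

def msle(y_true, y_pred, epsilon=1e-7):
    # log(clip(v, epsilon, Double.MaxValue) + 1) for each side,
    # then the mean of the squared differences of the logs.
    def log1p_clipped(v):
        return math.log(min(max(v, epsilon), sys.float_info.max) + 1.0)
    sq = [(log1p_clipped(t) - log1p_clipped(p)) ** 2
          for t, p in zip(y_true, y_pred)]
    return sum(sq) / len(sq)
```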

  11. class Poisson[T] extends TensorLossFunction[T]

    Loss calculated as: K.mean(y_pred - y_true * K.log(y_pred + K.epsilon()), axis=-1)
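
    The Poisson loss formula can be sketched in plain Python (illustration only; the epsilon value stands in for K.epsilon() and is an assumption):

```python
import math

def poisson(y_true, y_pred, epsilon=1e-7):
    # mean(y_pred - y_true * log(y_pred + epsilon))
    terms = [p - t * math.log(p + epsilon) for t, p in zip(y_true, y_pred)]
    return sum(terms) / len(terms)
```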

  12. class RankHinge[T] extends TensorLossFunction[T]

    Hinge loss for pairwise ranking problems.

  13. class SparseCategoricalCrossEntropy[T] extends TensorLossFunction[T]

    A loss often used in multi-class classification problems with SoftMax as the last layer of the neural network.

    By default, input(y_pred) is supposed to be probabilities of each class, and target(y_true) is supposed to be the class label starting from 0.
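
    For a single sample this reduces to the negative log-probability of the true class, which can be sketched in plain Python (illustration only; the function name is an assumption, and the code omits the clipping a real implementation would apply):

```python
import math

def sparse_categorical_cross_entropy(probs, label):
    # probs: predicted per-class probabilities; label: integer class index from 0.
    return -math.log(probs[label])
```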

  14. class SquaredHinge[T] extends TensorLossFunction[T]

    Creates a criterion that optimizes a two-class classification squared hinge loss (margin-based loss) between input x (a Tensor of dimension 1) and output y.
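
    A minimal Python sketch of the squared variant, under the same assumptions as the plain hinge loss (targets in -1/1, margin of 1, mean over elements; illustration only):

```python
def squared_hinge(y_true, y_pred):
    # Same margin violation as hinge loss, but squared, so large
    # violations are penalized more strongly.
    terms = [max(0.0, 1.0 - t * p) ** 2 for t, p in zip(y_true, y_pred)]
    return sum(terms) / len(terms)
```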

  15. abstract class TensorLossFunction[T] extends LossFunction[Tensor[T], Tensor[T], T]

    A subclass of LossFunction where input and target are both Tensors.

Value Members

  1. object BinaryCrossEntropy extends Serializable
  2. object CategoricalCrossEntropy extends Serializable
  3. object CosineProximity extends Serializable
  4. object Hinge extends Serializable
  5. object KullbackLeiblerDivergence extends Serializable
  6. val MAE: MeanAbsoluteError.type
  7. val MAPE: MeanAbsolutePercentageError.type
  8. val MSE: MeanSquaredError.type
  9. val MSLE: MeanSquaredLogarithmicError.type
  10. object MeanAbsoluteError extends Serializable
  11. object MeanAbsolutePercentageError extends Serializable
  12. object MeanSquaredError extends Serializable
  13. object MeanSquaredLogarithmicError extends Serializable
  14. object Poisson extends Serializable
  15. object RankHinge extends Serializable
  16. object SparseCategoricalCrossEntropy extends Serializable
  17. object SquaredHinge extends Serializable
  18. val mae: MeanAbsoluteError.type
  19. val mape: MeanAbsolutePercentageError.type
  20. val mse: MeanSquaredError.type
  21. val msle: MeanSquaredLogarithmicError.type
