HardSwish

lamp.autograd.HardSwish
case class HardSwish(scope: Scope, a: Variable) extends Op

Attributes

Supertypes
trait Serializable
trait Product
trait Equals
trait Op
class Object
trait Matchable
class Any

Members list

Value members

Inherited methods

def productElementNames: Iterator[String]

Attributes

Inherited from: Product
def productIterator: Iterator[Any]

Attributes

Inherited from: Product

Concrete fields

val params: List[(Variable, (STen, STen) => Unit)]

Implementation of the backward pass

A list of input variables, each paired with an anonymous function that computes the corresponding partial derivative. Using the notation from the documentation of the trait lamp.autograd.Op, each function maps the incoming partial derivative dy/dw2 to dy/dw2 * dw2/dw1: its first argument is the incoming partial derivative (dy/dw2), and its second argument is the output tensor into which the result (dy/dw2 * dw2/dw1) is accumulated (added). A sketch of such a pair is shown at the end of this entry.

If the operation does not support computing the partial derivative with respect to one of its arguments, that argument is not included in this list.

Attributes

See also

The documentation on the trait lamp.autograd.Op for more details and an example.
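
The following is a rough sketch (not lamp's actual HardSwish implementation) of what one such (Variable, (STen, STen) => Unit) pair could look like for a unary op. It assumes STen supports elementwise * and in-place += under a Scope; hardSwishGrad is a hypothetical helper, not part of lamp.

  import lamp.{Scope, STen}
  import lamp.autograd.Variable

  // Hypothetical helper (not part of lamp): elementwise derivative of
  // hard-swish, d/dx [x * relu6(x + 3) / 6], which is 0 for x <= -3,
  // 1 for x >= 3, and x/3 + 1/2 in between.
  def hardSwishGrad(x: STen)(implicit scope: Scope): STen = ???

  // One `params` entry for a unary op whose single input is `a`:
  // given the incoming gradient dy/dw2, add dy/dw2 * dw2/dw1 into the
  // accumulator tensor (accumulate, do not overwrite).
  def backwardEntry(a: Variable): (Variable, (STen, STen) => Unit) =
    a -> { (incoming: STen, accumulator: STen) =>
      Scope.root { implicit scope =>
        val local = hardSwishGrad(a.value) // dw2/dw1 at the op's input
        accumulator += incoming * local    // assumed STen ops: * and +=
      }
    }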

val value: Variable

The value of this operation


Inherited fields

val joinedBackward: Option[STen => Unit]

Attributes

Inherited from: Op