Class

com.johnsnowlabs.ml.tensorflow

TensorflowElmo

class TensorflowElmo extends Serializable

Embeddings from a language model trained on the 1 Billion Word Benchmark.

Note that this is a very computationally expensive module compared to word embedding modules that only perform embedding lookups. The use of an accelerator is recommended.

The model exposes the following output layers:

word_emb: the character-based word representations with shape [batch_size, max_length, 512].

lstm_outputs1: the first LSTM hidden state with shape [batch_size, max_length, 1024].

lstm_outputs2: the second LSTM hidden state with shape [batch_size, max_length, 1024].

elmo: the weighted sum of the 3 layers, where the weights are trainable. This tensor has shape [batch_size, max_length, 1024].

See https://github.com/JohnSnowLabs/spark-nlp/blob/master/src/test/scala/com/johnsnowlabs/nlp/embeddings/ElmoEmbeddingsTestSpec.scala for further reference on how to use this API.
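
For orientation, here is a minimal usage sketch (not part of the generated documentation). It assumes a TensorflowWrapper holding the ELMo graph and a Seq of TokenizedSentence are prepared elsewhere, and that TokenizedSentence and WordpieceEmbeddingsSentence live under com.johnsnowlabs.nlp.annotators.common; in normal use the ElmoEmbeddings annotator is expected to construct and drive TensorflowElmo internally.

  // Minimal sketch: wiring TensorflowElmo by hand (assumptions noted above).
  import com.johnsnowlabs.ml.tensorflow.{TensorflowElmo, TensorflowWrapper}
  import com.johnsnowlabs.nlp.annotators.common.{TokenizedSentence, WordpieceEmbeddingsSentence}

  val wrapper: TensorflowWrapper = ???        // loaded ELMo TensorFlow model, obtained elsewhere
  val sentences: Seq[TokenizedSentence] = ??? // tokenized input sentences

  val elmo = new TensorflowElmo(
    tensorflow = wrapper,
    batchSize = 32,          // number of sentences per TensorFlow call (illustrative value)
    configProtoBytes = None  // optional TensorFlow session configuration
  )

  // "elmo" selects the trainable weighted sum of the three layers (dimension 1024);
  // "word_emb" would instead return the 512-dimensional character-based layer.
  val embedded: Seq[WordpieceEmbeddingsSentence] =
    elmo.calculateEmbeddings(sentences, poolingLayer = "elmo")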

Linear Supertypes
Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new TensorflowElmo(tensorflow: TensorflowWrapper, batchSize: Int, configProtoBytes: Option[Array[Byte]] = None)

    tensorflow

    ELMo model wrapped with TensorflowWrapper

    batchSize

    Size of each batch

    configProtoBytes

    Configuration for the TensorFlow session

    Sources: https://tfhub.dev/google/elmo/3, https://arxiv.org/abs/1802.05365

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def calculateEmbeddings(sentences: Seq[TokenizedSentence], poolingLayer: String): Seq[WordpieceEmbeddingsSentence]

    Calculates the embeddings for a sequence of tokenized sentences and creates WordpieceEmbeddingsSentence objects from them.

    sentences

    A sequence of tokenized sentences for which embeddings will be calculated

    poolingLayer

    The output layer to take from the model: word_emb, lstm_outputs1, lstm_outputs2, or elmo. See https://tfhub.dev/google/elmo/3 for reference

    returns

    A Seq of WordpieceEmbeddingsSentence, one element for each input sentence

  6. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  7. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  8. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  9. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  10. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  11. def getDimensions(layer: String): Int

    Returns the embedding dimension of the chosen output layer (an illustrative mapping is sketched after this member list):

    word_emb: the character-based word representations with shape [batch_size, max_length, 512] → 512

    lstm_outputs1: the first LSTM hidden state with shape [batch_size, max_length, 1024] → 1024

    lstm_outputs2: the second LSTM hidden state with shape [batch_size, max_length, 1024] → 1024

    elmo: the weighted sum of the 3 layers, where the weights are trainable, with shape [batch_size, max_length, 1024] → 1024

    layer

    The layer specification: word_emb, lstm_outputs1, lstm_outputs2, or elmo

    returns

    The dimension of the chosen layer

  12. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  13. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  14. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  15. final def notify(): Unit

    Definition Classes
    AnyRef
  16. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  17. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  18. def tag(batch: Seq[TokenizedSentence], embeddingsKey: String, dimension: Int): Seq[Array[Array[Float]]]

    Tags a sequence of TokenizedSentences and retrieves the embeddings for the given key (a usage sketch follows this member list).

    batch

    The tokenized sentences for which embeddings are calculated

    embeddingsKey

    The ELMo output layer to retrieve: word_emb, lstm_outputs1, lstm_outputs2, or elmo

    dimension

    ELMo's embedding dimension: either 512 or 1024

    returns

    The embedding vectors: one Seq element per sentence, one Array per word in that sentence, and one Array[Float] per word holding its embedding values

  19. val tensorflow: TensorflowWrapper

    ELMo model wrapped with TensorflowWrapper

  20. def toString(): String

    Definition Classes
    AnyRef → Any
  21. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  22. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
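
As referenced above for getDimensions and tag, the following sketch (not part of the generated documentation) illustrates the documented layer-to-dimension mapping and the shape of tag's nested result. The helper layerDimension is purely illustrative, not the actual getDimensions implementation, and the elmo and sentences values are assumed to be set up as in the earlier sketch.

  import com.johnsnowlabs.ml.tensorflow.TensorflowElmo
  import com.johnsnowlabs.nlp.annotators.common.TokenizedSentence

  // Illustrative mapping from the dimensions documented for getDimensions:
  // word_emb -> 512, every other layer -> 1024. Not the actual implementation.
  def layerDimension(layer: String): Int = layer match {
    case "word_emb"                                 => 512
    case "lstm_outputs1" | "lstm_outputs2" | "elmo" => 1024
    case other => throw new IllegalArgumentException(s"Unknown ELMo layer: $other")
  }

  // Assumed to be prepared as in the earlier sketch.
  val elmo: TensorflowElmo = ???
  val sentences: Seq[TokenizedSentence] = ???

  // Consuming tag's result: one Seq element per sentence, one Array per word in
  // that sentence, and one Array[Float] per word holding its embedding values.
  val key = "lstm_outputs2"
  val vectors: Seq[Array[Array[Float]]] = elmo.tag(sentences, key, layerDimension(key))
  vectors.foreach { sentenceVectors =>
    val dim = sentenceVectors.headOption.map(_.length).getOrElse(0)
    println(s"${sentenceVectors.length} words, each with a $dim-dimensional embedding")
  }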

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
