Implements ConstEmbeddings using GloVe vectors. Note: there are two differences from the vanilla GloVe vectors: words with low frequency may be filtered out of the matrix, and the empty string ("") stands for UNK, whose embedding is typically computed by averaging the embeddings of the culled words
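The UNK construction described above can be sketched as follows. This is an illustrative outline only, not the actual implementation; the function name `build_matrix`, the `counts` input, and the `min_freq` cutoff are assumptions.

```python
def build_matrix(embeddings, counts, min_freq=5):
    """Sketch: filter low-frequency words out of a GloVe-style matrix and
    map the empty string ("") to the mean of the culled embeddings (UNK).

    embeddings: word -> list[float]; counts: word -> int (assumed inputs).
    """
    kept, culled = {}, []
    for word, vec in embeddings.items():
        if counts.get(word, 0) >= min_freq:
            kept[word] = vec
        else:
            culled.append(vec)
    # The empty string stands for UNK: the average of the culled vectors
    if culled:
        dim = len(culled[0])
        kept[""] = [sum(v[i] for v in culled) / len(culled) for i in range(dim)]
    return kept
```

Looking up a word absent from `kept` would then fall back to the `""` entry.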
This layer takes a sequence of words and produces a sequence of Expressions that store the words' full embeddings
First layer in a sequence modeling architecture: goes from words to Expressions
Intermediate layer in a sequence modeling architecture: goes from ExpressionVector to ExpressionVector
A sequence of layers that implements a complete NN architecture for sequence modeling
Multi-task learning (MeTaL) for sequence modeling. Designed to model any sequence labeling task (e.g., POS tagging, NER) as well as SRL
This layer applies a biLSTM over the sequence of Expressions produced by a previous layer
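The output structure of such a bidirectional layer can be illustrated with a toy recurrence in plain Python (the real layer uses DyNet LSTMs; the scalar `step` function here is a stand-in, not the actual cell): run the sequence left-to-right and right-to-left, then pair each token's forward and backward states.

```python
import math

def rnn_states(xs, step):
    """Run a recurrent step left-to-right; return the hidden state at each position."""
    h, out = 0.0, []
    for x in xs:
        h = step(x, h)
        out.append(h)
    return out

def birnn(xs, step):
    """Pair forward and backward states per token (the biLSTM output pattern)."""
    fwd = rnn_states(xs, step)
    bwd = list(reversed(rnn_states(list(reversed(xs)), step)))
    return list(zip(fwd, bwd))

# Toy recurrence standing in for an LSTM cell (assumption, for illustration)
step = lambda x, h: math.tanh(x + 0.5 * h)
```

Each output element concatenates a left-context and a right-context summary of the token, which is what downstream layers consume.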
Manages the tasks in LstmCrfMtl
Basic vector math that happens outside of DyNet
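For illustration, the kind of vector math meant here is along these lines (a minimal sketch; the actual set of operations in the file is not specified):

```python
import math

def dot(a, b):
    """Dot product of two equal-length vectors."""
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    """Euclidean (L2) norm."""
    return math.sqrt(dot(a, a))

def cosine(a, b):
    """Cosine similarity; 0.0 if either vector has zero norm."""
    d = norm(a) * norm(b)
    return dot(a, b) / d if d else 0.0
```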
Averages the parameter weights from multiple DyNet model files
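Parameter averaging amounts to an element-wise mean over corresponding parameters across models. A minimal sketch, assuming each model is represented as a dict from parameter name to a flat list of floats (the real code reads DyNet model files):

```python
def average_weights(models):
    """Element-wise mean of corresponding parameters across models.

    models: list of dicts mapping parameter name -> flat list of floats.
    Assumes all models share the same parameter names and shapes.
    """
    n = len(models)
    avg = {}
    for name in models[0]:
        vecs = [m[name] for m in models]
        avg[name] = [sum(vals) / n for vals in zip(*vecs)]
    return avg
```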
Diffs two DyNet models. Necessary to …
Scores the labels assigned to a sequence of words. Unlike the CoNLL-2003 scorer, this scorer operates over individual tokens rather than entity spans
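The token-level scoring described above can be sketched as per-token accuracy (the function name is illustrative; span-based CoNLL-2003 F1 would instead require matching whole entity spans):

```python
def token_accuracy(gold, pred):
    """Fraction of tokens whose predicted label matches the gold label.

    Scores each token independently, unlike span-level CoNLL-2003 scoring.
    """
    assert len(gold) == len(pred), "sequences must be aligned"
    if not gold:
        return 0.0
    correct = sum(1 for g, p in zip(gold, pred) if g == p)
    return correct / len(gold)
```

Note that token accuracy can look deceptively high for NER, where most tokens are "O"; that is the usual motivation for span-level scoring.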
Utility methods used by DyNet applications