TWEC is an efficient method for generating temporal word embeddings, based on a simple heuristic: first, an atemporal word embedding (the "compass") is trained on the whole corpus; this compass is then used to freeze one of the layers of the CBOW architecture. The partially frozen architecture is used to train time-specific slices, whose embeddings are all mutually comparable after training.
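The heuristic above can be sketched in plain NumPy. This is a minimal illustrative implementation, not the authors' code: it trains a tiny softmax CBOW as the compass on the full corpus, then retrains per-slice input embeddings while keeping the compass's output matrix frozen, so the slice embeddings land in a shared space. All corpus data, function names, and hyperparameters here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def build_vocab(corpus):
    """Map each word in the corpus to an integer id."""
    words = sorted({w for sent in corpus for w in sent})
    return {w: i for i, w in enumerate(words)}

def cbow_pairs(sent, window=2):
    """Yield (context ids, target id) pairs for one sentence."""
    for i, target in enumerate(sent):
        ctx = sent[max(0, i - window):i] + sent[i + 1:i + 1 + window]
        if ctx:
            yield ctx, target

def train_cbow(corpus, vocab, dim=16, epochs=50, lr=0.05,
               W_out=None, freeze_out=False):
    """Softmax CBOW. If freeze_out is True, the output matrix
    (here playing the role of the frozen compass layer) is not updated."""
    V = len(vocab)
    W_in = rng.normal(scale=0.1, size=(V, dim))
    if W_out is None:
        W_out = rng.normal(scale=0.1, size=(V, dim))
    for _ in range(epochs):
        for sent in corpus:
            ids = [vocab[w] for w in sent]
            for ctx, tgt in cbow_pairs(ids):
                h = W_in[ctx].mean(axis=0)        # hidden layer: mean of context vectors
                scores = W_out @ h
                p = np.exp(scores - scores.max())
                p /= p.sum()                      # softmax over the vocabulary
                p[tgt] -= 1.0                     # gradient of cross-entropy w.r.t. scores
                grad_h = W_out.T @ p
                if not freeze_out:
                    W_out -= lr * np.outer(p, h)
                W_in[ctx] -= lr * grad_h / len(ctx)
    return W_in, W_out

# Two toy "time slices" (invented data for illustration).
slice_a = [["the", "king", "rules", "the", "land"],
           ["the", "queen", "rules", "the", "land"]]
slice_b = [["the", "king", "signs", "the", "law"],
           ["the", "queen", "signs", "the", "law"]]

corpus_all = slice_a + slice_b
vocab = build_vocab(corpus_all)

# 1) Compass: both layers trained on the whole (atemporal) corpus.
_, compass_out = train_cbow(corpus_all, vocab)

# 2) Slices: the compass output layer is frozen, so the per-slice
#    input embeddings are trained against a shared reference space.
emb_a, _ = train_cbow(slice_a, vocab, W_out=compass_out.copy(), freeze_out=True)
emb_b, _ = train_cbow(slice_b, vocab, W_out=compass_out.copy(), freeze_out=True)
```

Because `emb_a` and `emb_b` were both aligned to the same frozen matrix, a word's vector in one slice can be compared directly (e.g. by cosine similarity) to any vector in the other slice, which is exactly the property TWEC is after.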
Source: Training Temporal Word Embeddings with a Compass
| Task | Papers | Share |
|---|---|---|
| Diachronic Word Embeddings | 5 | 55.56% |
| General Classification | 1 | 11.11% |
| Clustering | 1 | 11.11% |
| De-aliasing | 1 | 11.11% |
| Natural Language Understanding | 1 | 11.11% |