Jointly Learning Knowledge Graph Embedding, Fine-Grained Entity Types, and Language Modeling.
Updated Dec 25, 2020 · Python
This project explores both transfer learning and feature extraction for obtaining contextual word embeddings from a BERT-family model, applied to Stance Detection, a subtask of Fake News Detection.
😷 The Fill-Mask Association Test (FMAT): Measuring Propositions in Natural Language.
Implementation of GAP: Graph Neighborhood Attentive Pooling (https://arxiv.org/abs/2001.10394), a context-sensitive graph (network) representation learning algorithm that relies only on the structure of the graph.
Code for "Contextualized Embeddings in Named-Entity Recognition", ECIR 2020
The official repo for the EACL 2023 paper "Quantifying Context Mixing in Transformers"
Code for "Let's Stop Incorrect Comparisons in End-to-end Relation Extraction!", EMNLP 2020
Arabic NER system with strong performance
Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling
Code for the paper "Contextualized Weak Supervision for Text Classification"
AI chatbot built with Python, TensorFlow, and Natural Language Processing (NLP), alongside TFLearn
A curated list of pretrained sentence and word embedding models