A review on language models as knowledge bases
Recently, there has been a surge of interest in the NLP community on the use of pretrained
Language Models (LMs) as Knowledge Bases (KBs). Researchers have shown that LMs …
Kilm: Knowledge injection into encoder-decoder language models
Large pre-trained language models (PLMs) have been shown to retain implicit knowledge
within their parameters. To enhance this implicit knowledge, we propose Knowledge …
Language model analysis for ontology subsumption inference
Investigating whether pre-trained language models (LMs) can function as knowledge bases
(KBs) has raised wide research interests recently. However, existing works focus on simple …
A survey on knowledge-enhanced multimodal learning
Multimodal learning has been a field of increasing interest, aiming to combine various
modalities in a single joint representation. Especially in the area of visiolinguistic (VL) …
Adapters for enhanced modeling of multilingual knowledge and text
Large language models appear to learn facts from the large text corpora they are trained on.
Such facts are encoded implicitly within their many parameters, making it difficult to verify or …
Lexicon-based fine-tuning of multilingual language models for low-resource language sentiment analysis
Pre-trained multilingual language models (PMLMs) such as mBERT and XLM-R have
shown good cross-lingual transferability. However, they are not specifically trained to …
Towards multi-sense cross-lingual alignment of contextual embeddings
Cross-lingual word embeddings (CLWE) have been proven useful in many cross-lingual
tasks. However, most existing approaches to learn CLWE including the ones with contextual …
Exploring Self-supervised Logic-enhanced Training for Large Language Models
Existing efforts to improve logical reasoning ability of language models have predominantly
relied on supervised fine-tuning, hindering generalization to new domains and/or tasks. The …
KE-QI: A Knowledge Enhanced Article Quality Identification Dataset
C Ai, D Wang, X Yan, Y Xu, W Xie, Z Cao - arXiv preprint arXiv:2206.07556, 2022 - arxiv.org
With so many articles of varying quality being produced every moment, it is an urgent
task to screen outstanding articles and commit them to social media. To our best knowledge …