Learning from few examples: A summary of approaches to few-shot learning
Few-Shot Learning refers to the problem of learning the underlying pattern in the data just
from a few training samples. Requiring a large number of data samples, many deep learning …
A survey on knowledge graphs: Representation, acquisition, and applications
Human knowledge provides a formal understanding of the world. Knowledge graphs that
represent structural relations between entities have become an increasingly popular …
Graph neural networks: foundation, frontiers and applications
The field of graph neural networks (GNNs) has seen rapid and incredible strides in recent years. Graph neural networks, also known as deep learning on graphs, graph …
Knowprompt: Knowledge-aware prompt-tuning with synergistic optimization for relation extraction
Recently, prompt-tuning has achieved promising results on specific few-shot classification tasks. The core idea of prompt-tuning is to insert text pieces (i.e., templates) into the input and …
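To make the template idea concrete, here is a minimal sketch of prompt-style inference with a masked language model. The template wording, the example sentence, and the candidate label words are illustrative assumptions, not the actual KnowPrompt configuration.

```python
# A minimal sketch of the prompt idea described above: wrap the input
# sentence in a template containing a masked slot and let a masked
# language model fill it. The template, sentence, and label words are
# illustrative assumptions, not the ones used by KnowPrompt.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

sentence = "Steve Jobs co-founded Apple in 1976."
template = f"{sentence} Steve Jobs is the [MASK] of Apple."

# The model's distribution over the masked slot acts as a classifier:
# a high score for a label word such as "founder" signals the relation.
for candidate in fill_mask(template, top_k=5):
    print(candidate["token_str"], round(candidate["score"], 3))
```

In prompt-tuning proper, the template (and, in KnowPrompt, knowledge-informed virtual label words) is optimized rather than fixed; the sketch only shows the inference pattern.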
A survey of knowledge enhanced pre-trained language models
Pre-trained Language Models (PLMs), which are trained on large text corpora via self-supervised learning, have yielded promising performance on various tasks in …
KLUE: Korean Language Understanding Evaluation
S Park, arXiv preprint arXiv:2105.09680, 2021
We introduce Korean Language Understanding Evaluation (KLUE) benchmark. KLUE is a
collection of 8 Korean natural language understanding (NLU) tasks, including Topic …
KEPLER: A unified model for knowledge embedding and pre-trained language representation
Pre-trained language representation models (PLMs) do not capture factual knowledge from text well. In contrast, knowledge embedding (KE) methods can effectively represent the …
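As a toy illustration of what a KE method scores, the sketch below implements a TransE-style plausibility score of the kind KEPLER-like models pair with a language-modeling loss; the random vectors are hypothetical stand-ins for learned embeddings.

```python
# Toy TransE-style knowledge-embedding score: a triple (head, relation, tail)
# is plausible when head + relation lies close to tail in vector space.
# The random vectors below are hypothetical stand-ins for learned embeddings
# (in KEPLER, entity embeddings come from encoding entity descriptions).
import numpy as np

def transe_score(h: np.ndarray, r: np.ndarray, t: np.ndarray) -> float:
    """Negative L2 distance; higher means the triple is more plausible."""
    return -float(np.linalg.norm(h + r - t))

rng = np.random.default_rng(0)
head, relation, tail = (rng.normal(size=16) for _ in range(3))
print(transe_score(head, relation, tail))
```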
ERNIE: Enhanced language representation with informative entities
Neural language representation models such as BERT, pre-trained on large-scale corpora, can capture rich semantic patterns from plain text and be fine-tuned to consistently …
Generalizing from a few examples: A survey on few-shot learning
Machine learning has been highly successful in data-intensive applications but is often hampered when the data set is small. Recently, Few-shot Learning (FSL) has been proposed to …
Matching the blanks: Distributional similarity for relation learning
General-purpose relation extractors, which can model arbitrary relations, are a core aspiration in information extraction. Efforts have been made to build general-purpose …