Does BERT understand idioms? A probing-based empirical study of BERT encodings of idioms

M Tan, J Jiang - 2021 - ink.library.smu.edu.sg
Understanding idioms is important in NLP. In this paper, we study to what extent a pre-trained
BERT model can encode the meaning of a potentially idiomatic expression (PIE) in a certain …

A Dynamic, Interpreted CheckList for Meaning-oriented NLG Metric Evaluation--through the Lens of Semantic Similarity Rating

L Zeidler, J Opitz, A Frank - arXiv preprint arXiv:2205.12176, 2022 - arxiv.org
Evaluating the quality of generated text is difficult, since traditional NLG evaluation metrics,
focusing more on surface form than meaning, often fail to assign appropriate scores. This is …

Contextualized embeddings encode monolingual and cross-lingual knowledge of idiomaticity

S Fakharian, P Cook - Proceedings of the 17th workshop on …, 2021 - aclanthology.org
Potentially idiomatic expressions (PIEs) are ambiguous between non-compositional
idiomatic interpretations and transparent literal interpretations. For example, "hit the road" …

Token-level identification of multiword expressions using pre-trained multilingual language models

R Swaminathan - 2023 - unbscholar.lib.unb.ca
Multiword expressions (MWEs) are combinations of words where the meaning of the
expression cannot be derived from its component words. MWEs are commonly used in …

Leaving no stone unturned: flexible retrieval of idiomatic expressions from a large text corpus

C Hughes, M Filimonov, A Wray, I Spasić - Machine Learning and …, 2021 - mdpi.com
Idioms are multi-word expressions whose meaning cannot always be deduced from the
literal meaning of constituent words. A key feature of idioms that is central to this paper is …

A bigger fish to fry: Scaling up the automatic understanding of idiomatic expressions

H Haagsma - 2020 - research.rug.nl
University of Groningen. A Bigger Fish to Fry. Haagsma, Hessel. DOI: 10.33612/diss.131057087 …

Chinese idiom understanding with transformer-based pretrained language models

M Tan - 2022 - ink.library.smu.edu.sg
In this dissertation, I study the understanding of Chinese idioms using transformer-based
pretrained language models. By "understanding", I confine the topics to word embeddings …

HiJoNLP at SemEval-2022 Task 2: Detecting Idiomaticity of Multiword Expressions using Multilingual Pretrained Language Models

M Tan - arXiv preprint arXiv:2205.13708, 2022 - arxiv.org
This paper describes an approach to detect idiomaticity only from the contextualized
representation of an MWE over multilingual pretrained language models. Our experiments …

Embedding with different levels for idiom disambiguation

SY Park, YJ Kang, HR Kang, YJ Jang… - Annual Conference on …, 2021 - koreascience.kr
Many idiomatic expressions are ambiguous: a single expression may be interpreted in two or more ways
depending on context, as either a literal or an idiomatic meaning, so handling this type of idiom without disambiguation …