Discrete opinion tree induction for aspect-based sentiment analysis
Dependency trees have been intensively used with graph neural networks for aspect-based
sentiment classification. Though effective, such methods rely on external dependency …
What do they capture? A structural analysis of pre-trained language models for source code
Recently, many pre-trained language models for source code have been proposed to model
the context of code and serve as a basis for downstream code intelligence tasks such as …
Head-driven phrase structure grammar parsing on Penn treebank
Head-driven phrase structure grammar (HPSG) enjoys a uniform formalism representing rich
contextual syntactic and even semantic meanings. This paper makes the first attempt to …
A survey of syntactic-semantic parsing based on constituent and dependency structures
MS Zhang, Science China Technological Sciences, 2020, Springer
Syntactic and semantic parsing has been investigated for decades and is a primary
topic in the natural language processing community. This article aims for a brief survey on …
Unveiling code pre-trained models: Investigating syntax and semantics capacities
Code models have made significant advancements in code intelligence by encoding
knowledge about programming languages. While previous studies have explored the …
Ordered GNN: Ordering message passing to deal with heterophily and over-smoothing
Most graph neural networks follow the message passing mechanism. However, it faces the
over-smoothing problem when message passing is applied multiple times to a graph …
Are pre-trained language models aware of phrases? Simple but strong baselines for grammar induction
With the recent success and popularity of pre-trained language models (LMs) in natural
language processing, there has been a rise in efforts to understand their inner workings. In …
Rethinking self-attention: Towards interpretability in neural parsing
Attention mechanisms have improved the performance of NLP tasks while allowing models
to remain explainable. Self-attention is currently widely used; however, interpretability is …
Fast and accurate neural CRF constituency parsing
Estimating probability distributions is one of the core issues in the NLP field. However, in both
deep learning (DL) and pre-DL eras, unlike the vast applications of linear-chain CRF in …
Bottom-up constituency parsing and nested named entity recognition with pointer networks
Constituency parsing and nested named entity recognition (NER) are similar tasks since
they both aim to predict a collection of nested and non-crossing spans. In this work, we cast …