Is neuro-symbolic AI meeting its promises in natural language processing? A structured review
K Hamilton, A Nayak, B Božić, L Longo - Semantic Web, 2024 - content.iospress.com
Abstract Advocates for Neuro-Symbolic Artificial Intelligence (NeSy) assert that combining
deep learning with symbolic reasoning will lead to stronger AI than either paradigm on its …
Human-like systematic generalization through a meta-learning neural network
The power of human language and thought arises from systematic compositionality—the
algebraic ability to understand and produce novel combinations from known components …
Least-to-most prompting enables complex reasoning in large language models
Chain-of-thought prompting has demonstrated remarkable performance on various natural
language reasoning tasks. However, it tends to perform poorly on tasks that require …
Compositional semantic parsing with large language models
Humans can reason compositionally when presented with new tasks. Previous research
shows that appropriate prompting techniques enable large language models (LLMs) to …
How to reuse and compose knowledge for a lifetime of tasks: A survey on continual learning and functional composition
A major goal of artificial intelligence (AI) is to create an agent capable of acquiring a general
understanding of the world. Such an agent would require the ability to continually …
An empirical survey of data augmentation for limited data learning in NLP
NLP has achieved great progress in the past decade through the use of neural models and
large labeled datasets. The dependence on abundant data prevents NLP models from being …
Neuro-symbolic artificial intelligence: Current trends
MK Sarker, L Zhou, A Eberhart… - Ai …, 2022 - journals.sagepub.com
Neuro-Symbolic Artificial Intelligence–the combination of symbolic methods with methods
that are based on artificial neural networks–has a long-standing history. In this article, we …
Large language models can learn rules
When prompted with a few examples and intermediate steps, large language models (LLMs)
have demonstrated impressive performance in various reasoning tasks. However, prompting …
The devil is in the detail: Simple tricks improve systematic generalization of transformers
Recently, many datasets have been proposed to test the systematic generalization ability of
neural networks. The companion baseline Transformers, typically trained with default hyper …
Compositional generalization and natural language variation: Can a semantic parsing approach handle both?
Sequence-to-sequence models excel at handling natural language variation, but have been
shown to struggle with out-of-distribution compositional generalization. This has motivated …