Survey of hallucination in natural language generation
Natural Language Generation (NLG) has improved exponentially in recent years thanks to
the development of sequence-to-sequence deep learning technologies such as Transformer …
Towards mitigating hallucination in large language models via self-reflection
Large language models (LLMs) have shown promise for generative and knowledge-
intensive tasks including question-answering (QA) tasks. However, the practical deployment …
Contrastive learning reduces hallucination in conversations
Pre-trained language models (LMs) store knowledge in their parameters and can generate
informative responses when used in conversational systems. However, LMs suffer from the …
Think before you speak: Explicitly generating implicit commonsense knowledge for response generation
Implicit knowledge, such as common sense, is key to fluid human conversations. Current
neural response generation (RG) models are trained to generate responses directly …
Deep learning for dialogue systems: Chit-chat and beyond
With the rapid progress of deep neural models and the explosion of available data
resources, dialogue systems that support extensive topics and chit-chat conversations are …
Infusing internalized knowledge of language models into hybrid prompts for knowledgeable dialogue generation
Existing knowledge-grounded dialogue (KGD) systems access the knowledge from an
external knowledge base, then generate the context-coherent response accordingly …
[PDF][PDF] Evaluating adapter-based knowledge-enhanced language models in the biomedical domain
A Fichtl - 2024 - wwwmatthes.in.tum.de
In the rapidly evolving field of biomedical natural language processing (BioNLP),
knowledge-enhanced language models (KELMs) have emerged as promising tools to bridge …
Stabilized in-context learning with pre-trained language models for few shot dialogue state tracking
Prompt-based methods with large pre-trained language models (PLMs) have shown
impressive unaided performance across many NLP tasks. These models improve even …
Knowing What to Say: Towards knowledge grounded code-mixed response generation for open-domain conversations
Inculcating knowledge in dialogue agents is an important step towards making an
agent more human-like. Hence, the use of knowledge while conversing is crucial for building …
Yes, I am afraid of the sharks and also wild lions!: A multitask framework for enhancing dialogue generation via knowledge and emotion grounding
Current end-to-end neural conversation models inherently lack the capability to generate
coherently engaging responses. Efforts to boost informativeness have an adversarial effect …