Efficient and effective text encoding for Chinese LLaMA and Alpaca
Large Language Models (LLMs), such as ChatGPT and GPT-4, have dramatically
transformed natural language processing research and shown promising strides towards …
MER 2024: Semi-supervised learning, noise robustness, and open-vocabulary multimodal emotion recognition
Multimodal emotion recognition is an important research topic in artificial intelligence.
However, due to problems such as complex environments and inaccurate annotations …
MERBench: A unified evaluation benchmark for multimodal emotion recognition
Multimodal emotion recognition plays a crucial role in enhancing user experience in human-
computer interaction. Over the past few decades, researchers have proposed a series of …
An iteratively parallel generation method with the pre-filling strategy for document-level event extraction
In document-level event extraction (DEE) tasks, a document typically contains many event
records with multiple event roles. Therefore, accurately extracting all event records is a big …
A Language Model-based Fine-Grained Address Resolution Framework in UAV Delivery System
Accurate address resolution plays a vital role in UAV delivery systems. Existing address
resolution systems heavily rely on user-provided Point of Interest (POI) information …
Gradual Syntactic Label Replacement for Language Model Pre-Training
Pre-training serves as a foundation of recent NLP models, where language modeling tasks
are performed over large texts. Typical models like BERT and GPT take the corpus as a …
Mitigating frequency bias and anisotropy in language model pre-training with syntactic smoothing
Language models strongly rely on frequency information because they maximize the
likelihood of tokens during pre-training. As a consequence, language models tend to not …
Chinese Cyberbullying Detection Using XLNet and Deep Bi-LSTM Hybrid Model
S Chen, J Wang, K He - Information, 2024 - mdpi.com
The popularization of the internet and the widespread use of smartphones have led to a
rapid growth in the number of social media users. While information technology has brought …
Two Heads are Better than One: Zero-shot Cognitive Reasoning via Multi-LLM Knowledge Fusion
Cognitive reasoning holds a significant place within Natural Language Processing (NLP).
Yet, the exploration of zero-shot scenarios, which align more closely with real-life situations …
Can LLM substitute human labeling? A case study of a fine-grained Chinese address entity recognition dataset for UAV delivery
We present CNER-UAV, a fine-grained Chinese Named Entity Recognition dataset
specifically designed for the task of address resolution in Unmanned Aerial Vehicle …