A survey of GPT-3 family large language models including ChatGPT and GPT-4
KS Kalyan - Natural Language Processing Journal, 2024 - Elsevier
Large language models (LLMs) are a special class of pretrained language models (PLMs)
obtained by scaling model size, pretraining corpus and computation. LLMs, because of their …
Machine learning methods for small data challenges in molecular science
Small data are often used in scientific and engineering research due to the presence of
various constraints, such as time, cost, ethics, privacy, security, and technical limitations in …
On building a two-level neural network structure for applications in mechanical engineering
АС Волков, ОО Варламов - МИВАР'22, 2022 - elibrary.ru
A new method is proposed for building a two-level neural network structure intended to solve a range of tasks by training separate neural networks for …
GPT-4 passes the bar exam
In this paper, we experimentally evaluate the zero-shot performance of GPT-4 against prior
generations of GPT on the entire Uniform Bar Examination (UBE), including not only the …
A survey on generative diffusion models
Deep generative models have unlocked another profound realm of human creativity. By
capturing and generalizing patterns within data, we have entered the epoch of all …
Computational approaches to explainable artificial intelligence: advances in theory, applications and trends
Deep Learning (DL), a groundbreaking branch of Machine Learning (ML), has emerged as a
driving force in both theoretical and applied Artificial Intelligence (AI). DL algorithms, rooted …
Blockchain-empowered federated learning: Challenges, solutions, and future directions
Federated learning is a privacy-preserving machine learning technique that trains models
across multiple devices holding local data samples without exchanging them. There are …
Semantic communications: Principles and challenges
Semantic communication, regarded as the breakthrough beyond the Shannon paradigm,
aims at the successful transmission of semantic information conveyed by the source rather …
A survey of visual transformers
Transformer, an attention-based encoder–decoder model, has already revolutionized the
field of natural language processing (NLP). Inspired by such significant achievements, some …
Progressive-hint prompting improves reasoning in large language models
The performance of Large Language Models (LLMs) in reasoning tasks depends heavily on
prompt design, with Chain-of-Thought (CoT) and self-consistency being critical methods that …