Neurosymbolic programming
We survey recent work on neurosymbolic programming, an emerging area that bridges
deep learning and program synthesis. As in classic machine learning, the goal …
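To make the framing concrete, here is a minimal Python sketch of the kind of object the survey studies: interpretable symbolic control flow with a learned neural component filling one "hole". The names, the toy scorer, and the routing task are illustrative assumptions, not an example from the paper.

    # A neurosymbolic program: symbolic structure (if/else) around a neural part.
    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(size=2)  # stand-in weights for a trained network

    def neural_score(x: np.ndarray) -> float:
        """Learned, opaque component (a real system would use a trained net)."""
        return float(W @ x)

    def neurosymbolic_policy(x: np.ndarray) -> str:
        """Symbolic, human-readable skeleton that synthesis would search for."""
        if x[0] > 0.5:            # interpretable symbolic guard
            return "fast_path"
        if neural_score(x) > 0:   # neural guard filling a hole in the program
            return "learned_path"
        return "default_path"

    print(neurosymbolic_policy(np.array([0.9, 0.1])))  # fast_path (symbolic guard fires)
    print(neurosymbolic_policy(np.array([0.1, 0.2])))  # routed by the neural guard

Roughly, the discrete skeleton is found by program synthesis while the neural holes are trained by gradient descent; the survey organizes techniques for doing both jointly.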
CodeGen: An open large language model for code with multi-turn program synthesis
Program synthesis strives to generate a computer program as a solution to a given problem
specification, expressed with input-output examples or natural language descriptions. The …
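As a toy illustration of the first kind of specification the abstract mentions (input-output examples), the sketch below enumerates a four-function DSL and returns the first program consistent with all examples. The DSL and the examples are invented for illustration and are unrelated to CodeGen's actual training setup.

    # Specification by example: the user supplies (input, output) pairs, not code.
    from typing import Callable

    DSL: dict[str, Callable[[int], int]] = {
        "x + 1": lambda x: x + 1,
        "x * 2": lambda x: x * 2,
        "x * x": lambda x: x * x,
        "x - 3": lambda x: x - 3,
    }

    def synthesize(examples: list[tuple[int, int]]) -> str | None:
        """Return the source of the first DSL program matching every example."""
        for source, fn in DSL.items():
            if all(fn(i) == o for i, o in examples):
                return source
        return None

    print(synthesize([(1, 2), (3, 6), (5, 10)]))  # x * 2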
WebShop: Towards scalable real-world web interaction with grounded language agents
Most existing benchmarks for grounding language in interactive environments either lack
realistic linguistic elements or prove difficult to scale up due to substantial human …
Compositional exemplars for in-context learning
Large pretrained language models (LMs) have shown impressive in-context learning (ICL)
ability, where the model learns to do an unseen task simply by conditioning on a prompt …
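A minimal sketch of the ICL setup the abstract alludes to: "learning" happens entirely in the prompt, by conditioning on exemplars, with no weight updates. The exemplars and format below are invented, and the model call is deliberately left out since the point is prompt construction.

    # In-context learning: the task is specified by exemplars in the prompt.
    exemplars = [
        ("great movie, loved it", "positive"),
        ("waste of two hours", "negative"),
    ]

    def build_icl_prompt(query: str) -> str:
        """Concatenate demonstrations before the query; the LM sees only this string."""
        demos = "\n".join(f"Review: {x}\nLabel: {y}" for x, y in exemplars)
        return f"{demos}\nReview: {query}\nLabel:"

    print(build_icl_prompt("an unexpected delight"))

Which exemplars go into the prompt matters a great deal in practice, which is the selection problem this paper's title points at.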
Synchromesh: Reliable code generation from pre-trained language models
Large pre-trained language models have been used to generate code, providing a flexible
interface for synthesizing programs from natural language specifications. However, they …
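Synchromesh's reliability mechanism, Constrained Semantic Decoding, rejects continuations that cannot be extended into a valid program. The sketch below keeps only the token-masking idea, with a toy balanced-parentheses "grammar"; the vocabulary, the validity check, and the random stand-in for the language model are our simplifications, not the paper's system.

    # Constrained decoding: mask tokens that would make the prefix unrepairable.
    import random

    VOCAB = ["(", ")", "x", "<eos>"]

    def is_valid_prefix(prefix: str, token: str) -> bool:
        """Allow a token only if prefix + token can still grow into a valid program."""
        depth = 0
        for ch in prefix + token:
            depth += ch == "("
            depth -= ch == ")"
            if depth < 0:              # a ')' with no matching '('
                return False
        if token == "<eos>":
            return depth == 0 and prefix != ""  # only stop on a complete program
        return True

    def constrained_decode(max_len: int = 8) -> str:
        random.seed(0)
        out = ""
        for _ in range(max_len):
            allowed = [t for t in VOCAB if is_valid_prefix(out, t)]
            token = random.choice(allowed)  # stand-in for sampling from the LM
            if token == "<eos>":
                break
            out += token
        return out

    print(constrained_decode())  # every prefix passed the validity check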
In-context learning with retrieved demonstrations for language models: A survey
Language models, especially pre-trained large language models, have showcased
remarkable abilities as few-shot learners via in-context learning (ICL), adept at adapting to new tasks …
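The shared recipe behind most of the methods such a survey covers is: embed a pool of candidate demonstrations, embed the query, and promote the most similar demonstrations into the prompt. Below is a deliberately tiny sketch in which a bag-of-words counter stands in for a dense retriever; the pool, query, and scoring are all illustrative.

    # Demonstration retrieval for ICL via cosine similarity.
    import math
    from collections import Counter

    POOL = [
        "translate cat to french",
        "translate dog to german",
        "summarize this news article",
        "summarize the meeting notes",
    ]

    def embed(text: str) -> Counter:
        return Counter(text.lower().split())

    def cosine(a: Counter, b: Counter) -> float:
        dot = sum(a[w] * b[w] for w in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb)

    def retrieve(query: str, k: int = 2) -> list[str]:
        """Pick the k pool items most similar to the query as prompt exemplars."""
        q = embed(query)
        return sorted(POOL, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

    print(retrieve("translate bird to spanish"))
    # ['translate cat to french', 'translate dog to german']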
Improving coherence and consistency in neural sequence models with dual-system, neuro-symbolic reasoning
Human reasoning can be understood as an interplay between two systems: the intuitive and
associative ("System 1") and the deliberative and logical ("System 2"). Neural sequence …
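One concrete way to realize the two-system interplay, and roughly the shape of this paper's approach, is generate-then-check: a neural System 1 proposes candidate continuations and a symbolic System 2 filters out those inconsistent with a tracked world state. The candidates, world state, and sentence format below are hard-coded assumptions for illustration, not the paper's models.

    # Dual-system decoding: neural proposals gated by a symbolic consistency check.
    FACTS = {"alice": "kitchen"}   # minimal world model System 2 checks against

    def system1_propose() -> list[str]:
        """Stand-in for candidate continuations sampled from a neural LM."""
        return [
            "alice is in the garden",
            "alice is in the attic",
            "alice is in the kitchen",
        ]

    def system2_check(sentence: str) -> bool:
        """Deliberative symbolic check: does the sentence match the tracked facts?"""
        person, place = sentence.split(" is in the ")
        return FACTS.get(person) == place

    def generate_coherent() -> str:
        for candidate in system1_propose():
            if system2_check(candidate):  # keep only System-2-approved outputs
                return candidate
        raise RuntimeError("no candidate survived the symbolic check")

    print(generate_coherent())  # alice is in the kitchen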
L2CEval: Evaluating Language-to-Code Generation Capabilities of Large Language Models
Recently, large language models (LLMs), especially those pretrained on code, have
demonstrated strong capabilities in generating programs from natural language inputs …
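Evaluations in this line of work typically execute sampled programs against tests and report pass@k. The unbiased estimator below comes from the Codex/HumanEval evaluation methodology and is shown here only as a representative metric, not as a claim about L2CEval's exact metric suite.

    # Unbiased pass@k: of n sampled programs, c passed the tests; estimate the
    # probability that at least one of k randomly drawn samples passes.
    from math import comb

    def pass_at_k(n: int, c: int, k: int) -> float:
        if n - c < k:      # fewer failures than draws: some draw must pass
            return 1.0
        return 1.0 - comb(n - c, k) / comb(n, k)

    print(round(pass_at_k(10, 3, 1), 3))  # 0.3
    print(round(pass_at_k(10, 3, 5), 3))  # 0.917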
Decision-oriented dialogue for human-AI collaboration
We describe a class of tasks called decision-oriented dialogues, in which AI assistants such
as large language models (LMs) must collaborate with one or more humans via natural …
Compositionality in computational linguistics
Neural models greatly outperform grammar-based models across many tasks in modern
computational linguistics. This raises the question of whether linguistic principles, such as …