Survey on frontiers of language and robotics
The understanding and acquisition of a language in a real-world environment is an
important task for future robotics services. Natural language processing and cognitive …
[PDF] Parsing with compositional vector grammars
Natural language parsing has typically been done with small sets of discrete categories
such as NP and VP, but this representation does not capture the full syntactic nor semantic …
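To ground the terminology, here is a minimal sketch (Python; a toy parse, not an example from the paper) of the discrete-category representation the snippet describes, where every phrase carries a single symbol such as NP or VP:

```python
# Illustrative only: a toy constituency parse using the kind of discrete phrase
# labels (NP, VP, ...) the snippet refers to; not an example from the paper.
# Each node is (label, children); leaves are plain word strings.
tree = ("S",
        [("NP", [("DT", ["the"]), ("NN", ["cat"])]),
         ("VP", [("VBZ", ["sleeps"])])])

def labels(node):
    """Collect every category label used in the tree."""
    if isinstance(node, str):      # a word
        return []
    label, children = node
    out = [label]
    for child in children:
        out.extend(labels(child))
    return out

print(labels(tree))  # ['S', 'NP', 'DT', 'NN', 'VP', 'VBZ']
```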
Physics of language models: Part 1, context-free grammar
We design experiments to study $\textit{how}$ generative language models, like GPT, learn
context-free grammars (CFGs)--diverse language systems with a tree-like structure capturing …
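For readers unfamiliar with the formalism, the sketch below (Python; an illustrative toy grammar, not one of the synthetic grammars studied in the paper) samples sentences from a small CFG, making the tree-like, top-down generative process concrete:

```python
import random

# Illustrative toy CFG, not one of the synthetic grammars studied in the paper.
# Each nonterminal maps to a list of possible right-hand sides.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["DT", "N"]],
    "VP": [["V", "NP"], ["V"]],
    "DT": [["the"], ["a"]],
    "N":  [["dog"], ["ball"]],
    "V":  [["chases"], ["sleeps"]],
}

def generate(symbol="S"):
    """Expand a symbol top-down, returning a flat list of terminals."""
    if symbol not in GRAMMAR:             # terminal: emit it
        return [symbol]
    rhs = random.choice(GRAMMAR[symbol])  # pick a production uniformly
    words = []
    for sym in rhs:
        words.extend(generate(sym))
    return words

for _ in range(3):
    print(" ".join(generate()))           # e.g. "the dog chases a ball"
```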
[BOOK] Handbook of natural language processing
N Indurkhya, FJ Damerau - 2010 - taylorfrancis.com
The Handbook of Natural Language Processing, Second Edition presents practical tools
and techniques for implementing natural language processing in computer systems. Along …
Compound probabilistic context-free grammars for grammar induction
We study a formalization of the grammar induction problem that models sentences as being
generated by a compound probabilistic context-free grammar. In contrast to traditional …
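To make the probabilistic side concrete, the following sketch scores a parse tree under an ordinary PCFG as the product of its rule probabilities; it is illustrative only, and the compound PCFG of the paper additionally conditions these rule probabilities on a per-sentence latent variable:

```python
import math

# Illustrative toy PCFG: each rule (lhs, rhs) has a probability, and the
# probabilities for a given lhs sum to 1. Not the grammar from the paper.
RULES = {
    ("S",  ("NP", "VP")): 1.0,
    ("NP", ("DT", "N")):  1.0,
    ("VP", ("V", "NP")):  0.7,
    ("VP", ("V",)):       0.3,
    ("DT", ("the",)):     0.6,
    ("DT", ("a",)):       0.4,
    ("N",  ("dog",)):     0.5,
    ("N",  ("ball",)):    0.5,
    ("V",  ("chases",)):  1.0,
}

def log_prob(node):
    """Log-probability of a parse tree: sum of log rule probabilities."""
    label, children = node
    if all(isinstance(c, str) for c in children):   # preterminal rule
        return math.log(RULES[(label, tuple(children))])
    rhs = tuple(child[0] for child in children)
    lp = math.log(RULES[(label, rhs)])
    return lp + sum(log_prob(child) for child in children)

tree = ("S", [("NP", [("DT", ["the"]), ("N", ["dog"])]),
              ("VP", [("V", ["chases"]), ("NP", [("DT", ["a"]), ("N", ["ball"])])])])
print(log_prob(tree))   # ≈ -3.17, i.e. log(0.6 * 0.5 * 0.7 * 0.4 * 0.5)
```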
[PDF] Learning accurate, compact, and interpretable tree annotation
We present an automatic approach to tree annotation in which basic nonterminal symbols
are alternately split and merged to maximize the likelihood of a training treebank. Starting …
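The split step can be pictured as follows; this is an illustrative sketch of symbol splitting only, with made-up rules, and the likelihood-driven re-estimation and the merge step described in the snippet are omitted:

```python
from itertools import product

# Illustrative sketch of the "split" step: every nonterminal is divided into
# two latent subcategories, and each rule is expanded over all subcategory
# combinations. Probabilities would then be re-estimated and unhelpful splits
# merged back; that part is not shown here.
rules = [("NP", ("DT", "NN")), ("VP", ("VBZ", "NP"))]

def split_symbol(sym, k=2):
    return [f"{sym}-{i}" for i in range(k)]

split_rules = []
for lhs, rhs in rules:
    for new_lhs in split_symbol(lhs):
        for new_rhs in product(*(split_symbol(s) for s in rhs)):
            split_rules.append((new_lhs, new_rhs))

print(len(split_rules))   # 16: each binary rule expands into 2 * 2 * 2 = 8 variants
print(split_rules[0])     # ('NP-0', ('DT-0', 'NN-0'))
```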
Unsupervised recurrent neural network grammars
Recurrent neural network grammars (RNNG) are generative models of language which
jointly model syntax and surface structure by incrementally generating a syntax tree and …
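As a rough picture of what incrementally generating a syntax tree and surface string means, the sketch below replays a transition sequence using the NT/GEN/REDUCE action inventory commonly used to describe RNNGs; the toy sentence and the replay code are illustrative, not the paper's model, which parameterizes the action probabilities with recurrent networks:

```python
# Schematic sketch of a generative transition sequence of the kind RNNGs model.
actions = [
    ("NT", "S"),        # open an S constituent
    ("NT", "NP"),       # open an NP inside it
    ("GEN", "the"),     # emit a word
    ("GEN", "dog"),
    ("REDUCE", None),   # close the NP
    ("NT", "VP"),
    ("GEN", "barks"),
    ("REDUCE", None),   # close the VP
    ("REDUCE", None),   # close the S
]

def replay(actions):
    """Replay the actions with a stack to recover the bracketed tree."""
    stack = []
    for act, arg in actions:
        if act == "NT":
            stack.append(("OPEN", arg))   # an open, unfinished constituent
        elif act == "GEN":
            stack.append(arg)             # a generated word
        else:                             # REDUCE: close the nearest open constituent
            parts = []
            while not (isinstance(stack[-1], tuple) and stack[-1][0] == "OPEN"):
                parts.append(stack.pop())
            _, label = stack.pop()
            stack.append(f"({label} {' '.join(reversed(parts))})")
    return stack[0]

print(replay(actions))  # (S (NP the dog) (VP barks))
```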
A latent variable model for generative dependency parsing
Dependency parsing has been a topic of active research in natural language processing
during the last several years. The CoNLL-2006 shared task (Buchholz and Marsi, 2006) …
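For reference, a dependency parse of the kind these parsers produce can be written as one head index (plus a relation label) per word, with 0 for the artificial root, in the spirit of the CoNLL shared-task format the snippet mentions; the sentence and labels below are illustrative, not from the paper:

```python
# Illustrative only: a dependency parse as (position, word, head index, relation),
# with head 0 denoting the artificial root node.
sentence = [
    (1, "the",    2, "det"),
    (2, "dog",    3, "nsubj"),
    (3, "chases", 0, "root"),
    (4, "a",      5, "det"),
    (5, "ball",   3, "obj"),
]

for idx, word, head, rel in sentence:
    head_word = "ROOT" if head == 0 else sentence[head - 1][1]
    print(f"{word:8s} --{rel}--> {head_word}")
```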
[PDF] Improved inference for unlexicalized parsing
S Petrov, D Klein - … Technologies 2007: The Conference of the …, 2007 - aclanthology.org
We present several improvements to unlexicalized parsing with hierarchically state-split
PCFGs. First, we present a novel coarse-to-fine method in which a grammar's own …
[PDF] Simple semi-supervised dependency parsing
We present a simple and effective semi-supervised method for training dependency parsers.
We focus on the problem of lexical representation, introducing features that incorporate word …