David Smith of the Computer Science Department will give a talk in the Friday lunch series at Yale University on February 17. The title and abstract follow.
Efficient Inference for Declarative Approaches to Language
Much recent work in natural language processing treats linguistic
analysis as an inference problem over graphs. This development opens
up useful connections between machine learning, graph theory, and
linguistics. In particular, we will see how linguists can
declaratively specify linguistic inference problems in terms of hard
and soft constraints on grammatical structures. The first part of the
talk formulates syntactic parsing as a graphical model with the novel
ingredient of global constraints. These global constraints are
propagated by combinatorial optimization algorithms and greatly
improve on collections of purely local constraints. The second
part extends these models to learn transformations between
non-isomorphic structures efficiently. These noisy
(quasi-synchronous) mappings
have applications to adapting parsers across domains, projecting
parsers to new languages, learning features of the syntax-semantics
interface, and reranking passages for information retrieval.
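
As a concrete (if toy) picture of the hard/soft-constraint formulation
mentioned in the abstract, the sketch below scores dependency parses of
a three-word sentence: hypothetical arc scores play the role of soft
constraints, and the requirement that the arcs form a tree plays the
role of a hard constraint. It uses brute-force enumeration rather than
the efficient combinatorial optimization the talk describes, and every
name and number in it is invented purely for illustration.

    import itertools

    # Toy sentence; index 0 is an artificial ROOT token.
    words = ["ROOT", "dogs", "chase", "cats"]
    n = len(words)

    # Soft constraints: hypothetical arc scores, keyed by (head, dependent).
    # Higher means the arc is preferred; unlisted arcs get a default penalty.
    score = {
        (0, 2): 2.0,          # ROOT -> chase
        (2, 1): 1.5,          # chase -> dogs
        (2, 3): 1.5,          # chase -> cats
        (0, 1): 0.2, (0, 3): 0.2, (1, 2): 0.1,
        (3, 2): 0.1, (1, 3): 0.0, (3, 1): 0.0,
    }

    def satisfies_tree_constraint(heads):
        """Hard constraint: following head links from any word must reach
        ROOT without revisiting a word (i.e., the arcs form a tree)."""
        for dep in range(1, n):
            seen, node = set(), dep
            while node != 0:
                if node in seen:
                    return False          # cycle detected
                seen.add(node)
                node = heads[node]
        return True

    best_heads, best_total = None, float("-inf")
    # Brute-force search over head assignments; fine for a toy sentence,
    # whereas the talk is about doing this kind of inference efficiently.
    for assignment in itertools.product(range(n), repeat=n - 1):
        heads = (0,) + assignment         # heads[i] = index of word i's head
        if any(heads[d] == d for d in range(1, n)):
            continue                      # a word may not head itself
        if not satisfies_tree_constraint(heads):
            continue                      # violates the hard constraint
        total = sum(score.get((heads[d], d), -1.0) for d in range(1, n))
        if total > best_total:
            best_heads, best_total = heads, total

    print("best parse:",
          [(words[best_heads[d]], words[d]) for d in range(1, n)])

Running the sketch prints the highest-scoring tree, with ROOT attached
to "chase" and both nouns attached to the verb.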