
[nl-uiuc] NLP lunch: Tutorial on Constrained Conditional Models for NLP


  • From: Julia Hockenmaier <juliahmr AT cs.uiuc.edu>
  • To: nl-uiuc AT cs.uiuc.edu, nlp-lunch AT cs.uiuc.edu
  • Subject: [nl-uiuc] NLP lunch: Tutorial on Constrained Conditional Models for NLP
  • Date: Mon, 9 Mar 2009 23:36:22 -0500
  • List-archive: <http://lists.cs.uiuc.edu/pipermail/nl-uiuc>
  • List-id: Natural language research announcements <nl-uiuc.cs.uiuc.edu>

Dear all,

This week and next week, we will have a tutorial by Lev Ratinov, Ming-Wei Chang and Dan Roth on Constrained Conditional Models for NLP. This is a practice run for their EACL tutorial, so please come and give them feedback!

The schedule for this semester is almost full --
http://nlp.cs.uiuc.edu/lunch.html
but if you want to give a talk, please let me know!

Julia


Constrained Conditional Models for NLP

Ming-Wei Chang, Lev Ratinov, Dan Roth

Making decisions in natural language processing problems often involves assigning values to sets of interdependent variables, where the expressive dependency structure can influence, or even dictate, which assignments are possible. This setting is of particular significance in structured learning problems such as semantic role labeling, named entity and relation recognition, co-reference resolution, transliteration, summarization and machine translation, but the approach has a broader set of applications, such as textual entailment and question answering. In all these cases, it is natural either to formulate the decision process as a constrained optimization problem, or to break the complex problem up into a set of subproblems and require the solutions to be consistent modulo (possibly soft) constraints. In both cases, the resulting objective function is composed of learned models, subject to domain- or problem-specific constraints.

Constrained Conditional Models form a learning and inference framework that augments the learning of conditional (probabilistic or discriminative) models with declarative constraints (written, for example, in a first-order representation) as a way to support decisions in an expressive output space while maintaining modularity and tractability of training and inference. Models of this kind have recently attracted much attention within the NLP community.

Formulating problems as constrained optimization problems over the output of learned models has several advantages. It allows one to focus on modeling the problem, since problem-specific global constraints can be incorporated using a first-order language, freeing the developer from much of the low-level feature engineering, and it can also guarantee exact inference. It further provides the freedom to decouple the model-generation (learning) stage from the constrained inference stage, often simplifying both the learning stage and the engineering problem of building an NLP system, while improving the quality of the solutions.
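To make this concrete, here is a minimal, hypothetical sketch (not from the tutorial) of the kind of inference the abstract describes: learned per-token label scores are combined into a global objective, and a declarative constraint (here, a BIO well-formedness rule for named entity tags, enforced as a penalty) steers the argmax toward consistent outputs. The scores, labels, and penalty weight below are invented for illustration; real CCM systems typically solve this with integer linear programming rather than brute-force enumeration.

```python
from itertools import product

# Hypothetical toy example: three tokens, BIO-style labels for person names.
LABELS = ["O", "B-PER", "I-PER"]

# Stand-in for learned per-token label scores (e.g. from a local classifier).
scores = [
    {"O": 0.6, "B-PER": 0.3, "I-PER": 0.1},
    {"O": 0.1, "B-PER": 0.1, "I-PER": 0.8},
    {"O": 0.5, "B-PER": 0.2, "I-PER": 0.3},
]

RHO = 10.0  # penalty per constraint violation; large value makes it effectively hard


def violations(assignment):
    """Count BIO violations: I-PER must follow B-PER or I-PER."""
    count = 0
    for i, label in enumerate(assignment):
        if label == "I-PER" and (i == 0 or assignment[i - 1] == "O"):
            count += 1
    return count


def ccm_inference(scores):
    """Brute-force argmax of (sum of model scores) minus constraint penalties."""
    best, best_obj = None, float("-inf")
    for assignment in product(LABELS, repeat=len(scores)):
        obj = sum(s[lab] for s, lab in zip(scores, assignment))
        obj -= RHO * violations(assignment)
        if obj > best_obj:
            best, best_obj = assignment, obj
    return list(best)


print(ccm_inference(scores))  # -> ['B-PER', 'I-PER', 'O']
```

Note that taking each token's locally best label would yield O, I-PER, O, which violates the BIO constraint; the global objective instead flips the first token to B-PER, trading a small amount of local score for a consistent output. This is the sense in which the learned models and the declarative constraints are combined in one objective.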

The primary goal of this tutorial is to introduce the framework of Constrained Conditional Models (CCMs) to the broader ACL community, motivate it as a generic framework for learning and inference in global NLP decision problems, present some of the key theoretical and practical issues involved in using CCMs, and survey some of its existing applications, as a way to promote further development of the framework and additional applications. The tutorial will thus be useful for senior and junior researchers who are interested in global decision problems in NLP, providing a concise overview of recent perspectives and research results.



