
nl-uiuc AT lists.cs.illinois.edu

Subject: Natural language research announcements


[nl-uiuc] AIIS: Ofer Meshi - today @ 2pm


  • From: Daniel Khashabi <khashab2 AT illinois.edu>
  • To: nl-uiuc <nl-uiuc AT cs.uiuc.edu>, AIVR <aivr AT cs.uiuc.edu>, Vision List <vision AT cs.uiuc.edu>, <aiis AT cs.uiuc.edu>, <aistudents AT cs.uiuc.edu>, "Girju, Corina R" <girju AT illinois.edu>, Catherine Blake <clblake AT illinois.edu>, "Efron, Miles James" <mefron AT illinois.edu>, Jana Diesner <jdiesner AT illinois.edu>, "Raginsky, Maxim" <maxim AT illinois.edu>, "Schwartz, Lane Oscar Bingaman" <lanes AT illinois.edu>, Ranjitha Kumar <ranjitha AT illinois.edu>, Tandy Warnow <warnow AT illinois.edu>, Avrim Blum <avrim AT cs.cmu.edu>
  • Subject: [nl-uiuc] AIIS: Ofer Meshi - today @ 2pm
  • Date: Thu, 30 Oct 2014 11:46:01 -0500
  • List-archive: <http://lists.cs.uiuc.edu/pipermail/nl-uiuc/>
  • List-id: Natural language research announcements <nl-uiuc.cs.uiuc.edu>

When: Thursday, Oct 30 @ 2pm
Where: 3405 SC
Speaker:  Ofer Meshi - Toyota Technological Institute at Chicago (http://ttic.uchicago.edu/~meshi/)

Title:  Efficient Training of Structured SVMs via Soft Constraints

Abstract:
Structured output prediction is a powerful framework for jointly predicting interdependent output labels. Learning the parameters of structured predictors is a central task in machine learning applications. However, training the model from data often becomes computationally expensive. Several methods have been proposed to exploit the model structure, or decomposition, in order to obtain efficient training algorithms. In particular, methods based on linear programming relaxation, or dual decomposition, decompose the prediction task into multiple simpler prediction tasks and enforce agreement between overlapping predictions. In this work we observe that relaxing these agreement constraints and replacing them with soft constraints yields a much easier optimization problem. Based on this insight we propose an alternative training objective, analyze its theoretical properties, and derive an algorithm for its optimization. Our method, based on the Frank-Wolfe algorithm, achieves significant speedups over existing state-of-the-art methods without hurting prediction accuracy.
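The Frank-Wolfe (conditional gradient) method the abstract builds on can be sketched generically. The toy example below minimizes a quadratic over the probability simplex, where the linear oracle is especially cheap; it illustrates only the algorithm family, not the paper's actual structured-SVM training objective, and all names in it are illustrative.

```python
import numpy as np

def frank_wolfe(grad, linear_oracle, x0, num_iters=500):
    """Generic Frank-Wolfe loop.

    grad: gradient of the objective at x.
    linear_oracle: returns argmin over the feasible set of <g, s>.
    x0: a feasible starting point.
    """
    x = x0
    for t in range(num_iters):
        g = grad(x)
        s = linear_oracle(g)          # cheap linear subproblem
        gamma = 2.0 / (t + 2)         # standard step-size schedule
        x = (1 - gamma) * x + gamma * s  # stays feasible (convex combination)
    return x

# Toy objective: minimize ||x - b||^2 over the probability simplex.
b = np.array([0.1, 0.7, 0.2])
grad = lambda x: 2 * (x - b)

def simplex_oracle(g):
    # The minimizer of a linear function over the simplex is a vertex:
    # put all mass on the coordinate with the smallest gradient entry.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

x_star = frank_wolfe(grad, simplex_oracle, np.ones(3) / 3)
```

The appeal in structured prediction is that the linear oracle reduces to a MAP-style prediction problem, so each iteration only needs inference rather than a projection step.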

Bio:
Ofer Meshi received his bachelor's degree in Computer Science and Philosophy from Tel Aviv University in 2004, a master's degree in Computer Science from the Hebrew University of Jerusalem in 2008, and a PhD from the same university in 2013. In 2010 he received the Google European Doctoral Fellowship in Machine Learning. He interned at IBM Research in 2008 and at Google Research in 2011. His research interests include machine learning and optimization, with a focus on efficient algorithms for graphical models and structured output prediction.

----
AIIS : http://cogcomp.cs.illinois.edu/sites/aiis/

