[nl-uiuc] Upcoming talk at the AIIS seminar (next Thursday).


  • From: "Alexandre Klementiev" <klementi AT uiuc.edu>
  • To: nl-uiuc AT cs.uiuc.edu, aivr AT cs.uiuc.edu, dais AT cs.uiuc.edu, cogcomp AT cs.uiuc.edu, vision AT cs.uiuc.edu, krr-group AT cs.uiuc.edu, group AT vision2.ai.uiuc.edu
  • Subject: [nl-uiuc] Upcoming talk at the AIIS seminar (next Thursday).
  • Date: Wed, 3 Sep 2008 18:53:13 -0500
  • List-archive: <http://lists.cs.uiuc.edu/pipermail/nl-uiuc>
  • List-id: Natural language research announcements <nl-uiuc.cs.uiuc.edu>

Dear faculty and students,

We are restarting our AIIS seminar this semester with a talk by Nathan Srebro next Thursday (details below).

Hope to see you there,
Alex.


Title: More Data Less Work: SVM Training in Time Decreasing with Larger Data Sets
Speaker:
Nathan Srebro, TTI Chicago

Date: September 11, 4:00pm
Location: Siebel 3405 


Abstract: 

Traditional runtime analysis of training Support Vector Machines, and indeed of most learning methods, shows how the training runtime increases as more training examples become available. Considering the true objective of training, which is to obtain a good predictor, I will argue that training time should instead be studied as a decreasing function of training set size. I will then present both theoretical and empirical results demonstrating how a simple stochastic subgradient descent approach for training SVMs indeed displays such monotonically decreasing behavior.

I will also discuss a similar phenomenon in the context of Gaussian mixture clustering, where it appears that excess data turns the problem from computationally intractable to computationally tractable.

Joint work with Shai Shalev-Shwartz, Yoram Singer, Greg Shakhnarovich and Sam Roweis.



