CFP: NIPS Workshop on Discrete Optimization in Machine Learning (DISCML) *Extended Deadline: Oct 23*

Posted by Rebecca Martin on Mon, 14/10/2013 - 09:59

Call for Contributions
5th Workshop on
Discrete and Combinatorial Problems in Machine Learning (DISCML) at the Annual Conference on Neural Information Processing Systems (NIPS 2013) Lake Tahoe

December 9, 2013

Submission Deadline: October 23, 2013
(max. 6 pages, NIPS format)


Combinatorial structures and optimization problems with discrete solutions are becoming a core component of machine learning. When aiming to process larger quantities of more complex data, one may quickly find oneself working with graphs, relations, partitions, or sparsity structures. Or one may want to predict structured, sparse estimators, or combinatorial objects such as permutations, trees or other graphs, group structures, and so on.

While complex structures enable much richer applications, they often come with the caveat that the related learning and inference problems become computationally very hard. When scaling to large data, combinatorial problems also add challenges to compact representation, streaming, and distributed computation.

Despite discouraging theoretical worst-case results, many practically interesting problems can be much more well behaved (when modeled appropriately). Beneficial properties are most often structural and include symmetry, exchangeability, sparsity, or submodularity. The DISCML workshop revolves around such structures in machine learning, their theory and applications.
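As a small illustration of why such structure helps (a sketch, not part of the CFP): submodularity is the diminishing-returns property f(S ∪ {v}) − f(S) ≥ f(T ∪ {v}) − f(T) for S ⊆ T, and for monotone submodular functions the simple greedy algorithm gives a (1 − 1/e)-approximation under a cardinality constraint, despite the problem being NP-hard in general. Set coverage is a standard example:

```python
def coverage(sets_chosen):
    """Coverage f(S) = size of the union of chosen sets -- monotone submodular."""
    covered = set()
    for s in sets_chosen:
        covered |= s
    return len(covered)

def greedy(candidates, k):
    """Pick k sets, each round adding the one with the largest marginal gain."""
    chosen = []
    for _ in range(k):
        best = max(candidates,
                   key=lambda s: coverage(chosen + [s]) - coverage(chosen))
        chosen.append(best)
    return chosen

# Toy instance: four candidate sets over a universe of six elements.
universe_sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
picked = greedy(universe_sets, 2)
print(coverage(picked))  # 6: greedy picks {1,2,3} then {4,5,6}
```

The same greedy scheme, with lazy evaluations, scales to large data sets, which is one reason submodularity recurs in the workshop topics below.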

We would like to invite high-quality submissions that present
* recent results related to discrete and combinatorial problems in machine learning, or
* open problems, controversial questions and observations.

Areas of interest include (but are not limited to):
* learning and optimization (combinatorial algorithms, submodular optimization, discrete convex analysis, pseudo-Boolean optimization, online learning, structure learning),
* continuous relaxations (sparse reconstruction, regularization),
* combinatorics and big data (streaming, sketching, subset selection, parallel and distributed combinatorial algorithms),
* random combinatorial structures and combinatorial stochastic processes,
* applications, e.g. combinatorial approaches to information retrieval, speech and natural language processing, computer vision, or bioinformatics.

Invited speakers:
* Kazuo Murota
* Michael Jordan
* Yisong Yue
* Jeff Bilmes


Please send submissions in NIPS 2013 format (max. 6 pages) to submit(at) .
Deadline: October 23, 2013.

Organizers:
* Stefanie Jegelka (UC Berkeley)
* Andreas Krause (ETH Zurich, Switzerland)
* Jeff A. Bilmes (University of Washington, Seattle)
* Pradeep Ravikumar (University of Texas, Austin)