DISCML 2013: NIPS Workshop on Discrete and Combinatorial Problems in Machine Learning -- Call for Contributions

Posted by Rebecca Martin on Tue, 01/10/2013 - 09:05

Call for Contributions
5th Workshop on
Discrete and Combinatorial Problems in Machine Learning (DISCML) at the Annual Conference on Neural Information Processing Systems (NIPS 2013), Lake Tahoe


Submission Deadline: 9th October 2013
(max. 6 pages, NIPS format)

Combinatorial structures and optimization problems with discrete solutions are becoming a core component of machine learning. When aiming to process larger quantities of increasingly complex data, one quickly finds oneself working with graphs, relations, partitions, or sparsity structures. One may also want to predict structured, sparse estimators, or combinatorial objects such as permutations, trees or other graphs, group structures, and so on.

While complex structures enable much richer applications, they often come with the caveat that the related learning and inference problems become computationally very hard. When scaling to large data, combinatorial problems also add challenges to compact representation, streaming, and distributed computation.

Despite discouraging theoretical worst-case results, many practically interesting problems can be much better behaved (when modeled appropriately). Beneficial properties are most often structural and include symmetry, exchangeability, sparsity, or submodularity. The DISCML workshop revolves around such structures in machine learning, their theory and applications.
The workshop will feature invited keynote lectures as well as contributed spotlight talks and posters.
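As a concrete illustration of one such structure, here is a minimal sketch (with hypothetical data) of submodularity: a set-coverage function exhibits diminishing returns, which is exactly the property that lets a simple greedy algorithm achieve a (1 - 1/e) approximation for the otherwise NP-hard maximization problem.

```python
# Illustrative sketch only; the ground sets below are made up.

def coverage(sets, chosen):
    """f(S): number of elements covered by the union of chosen sets."""
    covered = set()
    for i in chosen:
        covered |= sets[i]
    return len(covered)

def marginal_gain(sets, chosen, i):
    """Gain from adding set i to the current selection."""
    return coverage(sets, chosen | {i}) - coverage(sets, chosen)

sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]

# Diminishing returns: the gain of set 1 shrinks as the context grows.
small, large = {0}, {0, 2}
assert marginal_gain(sets, small, 1) >= marginal_gain(sets, large, 1)

# Greedy maximization under a cardinality constraint k = 2.
chosen = set()
for _ in range(2):
    best = max((i for i in range(len(sets)) if i not in chosen),
               key=lambda i: marginal_gain(sets, chosen, i))
    chosen.add(best)
print(sorted(chosen), coverage(sets, chosen))  # prints: [0, 2] 6
```

Greedy here picks the two sets that jointly cover all six elements; for general monotone submodular objectives the same loop is provably within a (1 - 1/e) factor of optimal.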

We would like to invite high-quality submissions that present
* recent results related to discrete and combinatorial problems in
machine learning, or
* open problems, controversial questions and observations.

Areas of interest include (but are not limited to)
* learning and optimization (combinatorial algorithms, submodular
optimization, discrete convex analysis, pseudo-boolean optimization,
online learning, structure learning),
* continuous relaxations (sparse reconstruction, regularization),
* combinatorics and big data (streaming, sketching, subset selection,
parallel and distributed combinatorial algorithms),
* random combinatorial structures and combinatorial stochastic processes,
* applications, e.g., combinatorial approaches to information retrieval,
speech and natural language processing, computer vision, or bioinformatics.


Please send submissions in NIPS 2013 format (max. 6 pages) to submit(at)discml.cc .
Deadline: October 9, 2013.

Stefanie Jegelka (UC Berkeley),
Andreas Krause (ETH Zurich, Switzerland),
Jeff A. Bilmes (University of Washington, Seattle),
Pradeep Ravikumar (University of Texas, Austin)