CfP -- NIPS Workshop on Spectral Learning

Posted by Rebecca Martin on Wed, 28/08/2013 - 10:18

Call for Papers: Workshop on Spectral Learning -- NIPS 2013 December 9 or 10, Lake Tahoe (NV), USA
Website: http://sites.google.com/site/spectrallearningworkshop/

Many problems in machine learning involve collecting high-dimensional multivariate observations or sequences of observations, and then fitting a compact model that explains these observations. Recently, linear algebra techniques have given a fundamentally different perspective on how to fit and perform inference in these models.
Exploiting the underlying spectral properties of the model parameters has led to fast, provably consistent methods for parameter learning that stand in contrast to previous approaches, such as Expectation Maximization, which suffer from bad local optima and slow convergence.

In the past several years, spectral learning algorithms have become increasingly popular. They have been applied to learn the structure and parameters of many models, including predictive state representations, finite state transducers, hidden Markov models, latent trees, latent junction trees, probabilistic context-free grammars, and mixture/admixture models. Spectral learning algorithms have also been applied to a wide range of application domains, including system identification, video modeling, speech modeling, robotics, and natural language processing.

The focus of this workshop will be on spectral learning algorithms, broadly construed as any method that fits a model by way of a spectral decomposition of moments of (features of) observations. We would like the workshop to be as inclusive as possible and encourage paper submissions and participation from a wide range of research related to this focus.
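As a minimal illustration of this idea (a hedged sketch, not any particular algorithm under submission), the snippet below forms the empirical second-order moment matrix of consecutive observations from a toy hidden Markov model and inspects its singular values. Because this moment matrix factors through the hidden state, its rank is at most the number of hidden states, so the spectrum of the moments alone already reveals model structure. The HMM parameters and sample size are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy HMM (parameters invented for illustration): k hidden states, n symbols.
k, n = 2, 4
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])                # state transition probabilities
O = np.array([[0.70, 0.20, 0.05, 0.05],
              [0.05, 0.05, 0.20, 0.70]])  # per-state emission probabilities

# Empirical pair frequencies Pr[x_t = i, x_{t+1} = j] from one long run.
def empirical_pair_moments(num_steps):
    pairs = np.zeros((n, n))
    s = rng.integers(k)                   # arbitrary initial state
    x_prev = rng.choice(n, p=O[s])
    for _ in range(num_steps):
        s = rng.choice(k, p=T[s])
        x = rng.choice(n, p=O[s])
        pairs[x_prev, x] += 1
        x_prev = x
    return pairs / num_steps

P21 = empirical_pair_moments(50_000)

# Spectral step: P21 factors through the hidden state
# (P21 = O^T diag(pi) T O), so its rank is at most k; singular
# values beyond the k-th reflect only sampling noise.
sv = np.linalg.svd(P21, compute_uv=False)
print(np.round(sv, 4))
```

Full spectral learning algorithms go further and use the singular vectors to build observable-operator parameterizations of the model, but the key consistency argument is already visible here: the moments converge to a low-rank matrix determined by the true parameters.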

We especially encourage submissions on the following themes:

- How can spectral techniques help us develop fast solutions, free of bad local optima, to real-world problems where existing methods such as Expectation Maximization are unsatisfactory?
- How do spectral/moment methods compare to maximum-likelihood estimators and Bayesian methods, especially in terms of robustness, statistical efficiency, and computational efficiency?
- What notions of spectral decompositions are appropriate for latent variable models and structured prediction problems?
- How can spectral methods take advantage of multi-core/multi-node computing environments?
- What computational problems, besides parameter estimation, can benefit from spectral decompositions and operator parameterizations?

The workshop will feature invited talks from renowned researchers, a poster session with contributed papers, and an open discussion panel.

** Submissions **

Extended abstracts should be submitted using the NIPS 2013 format with a maximum of 4 pages (not including references). Please e-mail your submission to workshop.spectral.learning(at)gmail.com with the subject line "Submission to Workshop on Spectral Learning".

Concurrent submissions to the workshop and the main conference (or other conferences) are permitted.

** Important dates **

- Submission deadline: October 9, 2013
- Notification of acceptance: October 23, 2013
- Workshop: December 9 or 10, 2013

** Organizers **

- Byron Boots (University of Washington)
- Daniel Hsu (Columbia University)
- Borja Balle (Universitat Politècnica de Catalunya)