Invited speakers

Luc de Raedt

Luc de Raedt, Katholieke Universiteit Leuven, Belgium

Short biography

Declarative Modeling for Machine Learning

Presentation slides

Despite the popularity of machine learning and data mining today, it remains challenging to develop applications and software that incorporate machine learning or data mining techniques. This is because machine learning and data mining have focused on developing high-performance algorithms for solving particular tasks rather than on developing general principles and techniques.

I propose to alleviate these problems by applying the constraint programming methodology to machine learning and data mining, and to specify machine learning and data mining problems as constraint satisfaction and optimization problems. What is essential is that the user be provided with a way to declaratively specify what the machine learning or data mining problem is, rather than having to outline how the solution should be computed. This corresponds to a model + solver-based approach to machine learning and data mining, in which the user specifies the problem in a high-level modeling language and the system automatically transforms such models into a format that a solver can use to efficiently generate a solution. This should be much easier for the user than having to implement or adapt an algorithm that computes a particular solution to a specific problem.

This view will be related to inductive logic programming and illustrated using our results on constraint programming for itemset mining and on probabilistic logic programming.
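
The model + solver split described above can be sketched in a few lines. This is a minimal illustration, not the speaker's system: the toy transaction database, the minimum-frequency threshold, and the brute-force solver are all invented for the example, and a real constraint programming solver would prune the search rather than enumerate.

```python
# Illustrative sketch of the "model + solver" view: frequent-itemset
# mining stated declaratively as constraints, then handed to a generic
# solver that knows nothing about itemset mining.
from itertools import chain, combinations

# Hypothetical toy transaction database (each row is a set of items).
DB = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}]

def frequency(itemset):
    """Number of transactions that contain the whole itemset."""
    return sum(itemset <= t for t in DB)

# Declarative model (the *what*): an itemset is a solution iff it
# satisfies every constraint; here a single minimum-frequency constraint.
constraints = [lambda s: frequency(s) >= 2]

# Generic solver (the *how*): enumerate candidate subsets and keep those
# satisfying all constraints.  A CP solver would prune this search.
def solve(items, constraints):
    candidates = chain.from_iterable(
        combinations(sorted(items), r) for r in range(1, len(items) + 1))
    return [set(c) for c in candidates
            if all(con(set(c)) for con in constraints)]

solutions = solve({"a", "b", "c"}, constraints)
```

On this toy database the solver returns the three singletons and the three pairs, but not {a, b, c}, which occurs only once. Adding a new mining task (e.g. a maximality constraint) means adding a predicate to `constraints`, not writing a new algorithm.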

Ben Taskar

Ben Taskar, University of Pennsylvania, USA

Short biography

Geometry of Diversity and Determinantal Point Processes: Representation, Inference and Learning

Presentation slides

Graphical models are the dominant tool for capturing complex joint distributions in computer vision and natural language processing prediction tasks. However, inference and learning for all but a small, restrictive subset of graphical models is intractable, and standard approximation methods often fail for distributions with global, negative correlations. Determinantal point processes (DPPs) provide a computationally tractable alternative. DPPs arise in random matrix theory and quantum physics as models of random variables with negative correlations. Among their many remarkable properties, they offer tractable algorithms for exact inference, including computing marginals, computing certain conditional probabilities, and sampling. DPPs are a natural model for subset selection problems where diversity is preferred. For example, they can be used to select diverse sets of sentences to form document summaries, to return relevant but varied text and image search results, or to detect non-overlapping multiple object trajectories in video.

I’ll present our recent work on a novel factorization and dual representation of DPPs that enables efficient inference for exponentially-sized structured sets. We develop a new inference algorithm based on Newton identities for DPPs conditioned on subset size and derive efficient parameter estimation for DPPs from several types of observations. I’ll show the advantages of the model on several natural language and vision tasks: document summarization, image search, and multi-person pose estimation in images.
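
The exact-inference properties mentioned above can be seen on a tiny example. This is a hedged sketch assuming the standard L-ensemble parameterization P(Y) ∝ det(L_Y); the 3-item kernel is invented for illustration and is not from the talk.

```python
# Exact DPP inference on a toy 3-item kernel.  Off-diagonal entries of
# L encode similarity, so similar items are unlikely to co-occur.
import numpy as np

L = np.array([[1.0, 0.8, 0.1],
              [0.8, 1.0, 0.1],
              [0.1, 0.1, 1.0]])

# Normalizer: the sum over all subsets Y of det(L_Y) equals det(L + I),
# so the partition function is available in closed form.
Z = np.linalg.det(L + np.eye(3))

def prob(subset):
    """Exact probability of drawing exactly `subset` (tuple of indices)."""
    idx = list(subset)
    return np.linalg.det(L[np.ix_(idx, idx)]) / Z

# Marginal kernel K = L(L + I)^{-1}; K[i, i] = P(item i is included).
K = L @ np.linalg.inv(L + np.eye(3))

# Negative correlation in action: the similar pair {0, 1} is less
# likely than the diverse pair {0, 2}.
assert prob((0, 1)) < prob((0, 2))
```

Both the normalizer and all marginals reduce to determinants, which is what makes marginals, conditionals, and sampling tractable; the factorizations in the talk extend this tractability to exponentially-sized structured ground sets.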

Geraint A. Wiggins

Geraint A. Wiggins, University of London

Short biography

Learning and Creativity in the Global Workspace

Presentation slides

The key goal of cognitive science is to produce an account of the phenomenon of mind that is mechanistic, empirically supported, and credible from the perspective of evolution. In this talk, I will present a model based on Baars’ (1988) Global Workspace account of consciousness that attempts to provide a general, uniform mechanism for information regulation. Key ideas involved are: information content and entropy (Shannon, 1948; MacKay, 2003), expectation (Huron, 2006; Pearce and Wiggins, 2012), learning of multi-dimensional, multi-level representations (Gärdenfors, 2000) from data (Pearce, 2005), and data-driven segmentation (Pearce et al., 2010).

The model was originally developed for music, but it can be generalised to language (Wiggins, 2012). Most importantly, it can account not only for perception and action, but also for creativity, possibly serving as a model for original linguistic thought.

  • Baars, B. J. (1988). A cognitive theory of consciousness. Cambridge, UK: Cambridge University Press.
  • Gärdenfors, P. (2000). Conceptual Spaces: the geometry of thought. Cambridge, MA: MIT Press.
  • Huron, D. (2006). Sweet Anticipation: Music and the Psychology of Expectation. Bradford Books. Cambridge, MA: MIT Press.
  • MacKay, D. J. C. (2003). Information Theory, Inference, and Learning Algorithms. Cambridge, UK: Cambridge University Press.
  • Pearce, M. T. (2005). The Construction and Evaluation of Statistical Models of Melodic Structure in Music Perception and Composition. PhD thesis, Department of Computing, City University, London, UK.
  • Pearce, M. T., Müllensiefen, D. and Wiggins, G. A. (2010). The role of expectation and probabilistic learning in auditory boundary perception: A model comparison. Perception, 39:1367–1391.
  • Pearce, M. T. and Wiggins, G. A. (2012). Auditory Expectation: The Information Dynamics of Music Perception and Cognition. Topics in Cognitive Science. In press.
  • Shannon, C. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379–423, 623–56.
  • Wiggins, G. A. (2012). “I let the music speak”: cross-domain application of a cognitive model of musical learning. In Statistical Learning and Language Acquisition, Rebuschat, P. and Williams, J. (eds.), Amsterdam, NL: Mouton De Gruyter.
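
The information-theoretic quantities the talk builds on can be made concrete with a small example. This is an illustrative sketch, not the speaker's model: the predictive distribution over next notes is invented for the example.

```python
# Shannon information content (surprisal) and entropy of a predictive
# distribution over the next event, e.g. the next note given a melodic
# context, as estimated by a learned sequence model.
import math

# Hypothetical predictive distribution over the next symbol.
p_next = {"C": 0.5, "E": 0.25, "G": 0.25}

def information_content(p):
    """Surprisal -log2(p) of an observed event: high when unexpected."""
    return -math.log2(p)

def entropy(dist):
    """Expected surprisal: the model's uncertainty before the event."""
    return sum(p * information_content(p) for p in dist.values())

print(information_content(p_next["C"]))  # 1.0 bit: the expected note
print(entropy(p_next))                   # 1.5 bits of uncertainty
```

In expectation-based accounts of perception, such as the ones cited above, these per-event surprisal and uncertainty profiles are the signals proposed to drive segmentation and attention.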
