Andrew Barron (Yale University)
Title: Information Theory and Statistics for Machine Learning
Abstract: Tools of information theory and statistics are presented for the analysis of risk (generalization error) of function estimation procedures, including mixture density estimators, neural networks, function aggregation, basis adaptation, and adaptive kernel machines. Methods of estimation, model selection, and model averaging may be familiar in practice. Here I advocate understanding the associated risk and its implications for the viability of these procedures, especially in high-dimensional cases. Fundamental roles in characterizing risk are played by indices of resolvability, which express the ideal balance between descriptive complexity (or log reciprocal prior probability, or metric entropy) and model accuracy measured by relative entropy. Adaptive procedures are essential for accurate performance in high dimensions, but implementation is often ad hoc to avoid computational excesses. Provably accurate and computationally feasible estimators for high-dimensional problems are not yet available. Clues toward effective computations are discussed.
Simon Haykin (McMaster University)
Title: Cognitive Machines
Abstract: In this lecture, I will discuss examples of cognitive machines in radar, radio, and hearing systems, where two pervasive fields, neural computation and signal processing, meet. In particular, I will focus on how these fields can enrich each other in the design of a new generation of systems.
Barry Horwitz (NIH)
Title: Using Neural Modeling and Functional Neuroimaging to Study the Neural Basis of Auditory and Visual Object Processing
Abstract: Formidable conceptual problems exist in interpreting human functional neuroimaging data in terms of the underlying neural activity. To surmount these difficulties, we have developed two neurobiologically realistic models (one for vision, one for audition) of the object recognition pathway in human neocortex, in which data at different spatiotemporal levels can be simulated and cross-validated by multiple disciplines, including functional brain imaging. Our models, based on neurophysiological and neuroanatomical data from primate and human studies, enable us to simultaneously simulate cellular electrophysiological and functional magnetic resonance imaging data in multiple, interconnected brain regions. This type of network modeling provides a mechanism by which the physiological consequences of assumptions about the neural bases of high-level cognitive, sensorimotor, and emotional processes can be tested.