Plenary Lecturers

Professor Sergios Theodoridis (National and Kapodistrian University of Athens, Greece / Aalborg University, Denmark)

Lecture Title: “Deep Neural Networks: A Nonparametric Bayesian View with Local Competition”

Abstract: In this talk, a fully probabilistic approach to the design and training of deep neural networks will be presented. The framework is that of nonparametric Bayesian learning. Both fully connected and convolutional networks (CNNs) will be discussed. The structure of the networks is not chosen a priori. Adopting nonparametric priors over infinite binary matrices, such as the Indian Buffet Process (IBP), the number of weights, as well as the number of nodes or kernels (in CNNs), is estimated via the resulting posterior distributions. The training revolves around variational Bayesian arguments.
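
To give a concrete feel for the "infinite binary matrix" prior mentioned above, the short sketch below samples from a plain IBP; this illustrates only the prior itself (with an assumed concentration parameter alpha), not the variational training scheme of the talk.

```python
import numpy as np

def sample_ibp(n_rows, alpha, rng=np.random.default_rng(0)):
    """Sample a binary matrix from the Indian Buffet Process (illustrative).

    Row i reuses an existing column k with probability m_k / (i + 1),
    where m_k counts the ones seen so far in column k, and then opens
    Poisson(alpha / (i + 1)) brand-new columns.
    """
    Z = np.zeros((n_rows, 0), dtype=int)
    for i in range(n_rows):
        if Z.shape[1] > 0:
            counts = Z.sum(axis=0)
            Z[i] = rng.random(Z.shape[1]) < counts / (i + 1)
        n_new = rng.poisson(alpha / (i + 1))
        if n_new > 0:
            Z = np.hstack([Z, np.zeros((n_rows, n_new), dtype=int)])
            Z[i, -n_new:] = 1
    return Z

Z = sample_ibp(n_rows=100, alpha=3.0)
print(Z.shape)  # the number of active columns stays finite, roughly alpha * log(n)
```

In the networks of the talk, the entries of such a matrix act as on/off indicators for weights, nodes, or kernels, so the posterior over the matrix effectively selects the network size from the data.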

Besides the probabilistic arguments followed for the inference of the involved parameters, the nonlinearities used are neither squashing functions nor rectified linear units (ReLUs), which are typically used in standard networks. Instead, inspired by neuroscientific findings, the nonlinearities comprise units of probabilistically competing linear neurons, in line with what is known as the local winner-take-all (LWTA) strategy. In each node, only one neuron fires to provide the output. Thus, the neurons in each node are laterally related (within the same layer) and only one “survives”; yet this takes place in a probabilistic context, based on an underlying distribution that relates the neurons of the respective node. Such a rationale more closely mimics the way the neurons in our brain cooperate.
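
A minimal NumPy sketch of such a stochastic LWTA unit follows; the softmax-based competition and the block size of two are assumptions made for illustration and need not match the exact distribution used in the talk.

```python
import numpy as np

def stochastic_lwta(x, W, units=2, rng=np.random.default_rng(0)):
    """Stochastic local winner-take-all layer (illustrative sketch).

    Neurons are grouped into blocks of `units` competing linear units;
    within each block a single winner is sampled (here, from a softmax
    over the linear responses) and only its output is passed on.
    """
    h = x @ W                                   # linear responses, shape (k * units,)
    blocks = h.reshape(-1, units)               # k blocks of competing neurons
    logits = blocks - blocks.max(axis=1, keepdims=True)
    p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    winners = np.array([rng.choice(units, p=row) for row in p])
    out = np.zeros_like(blocks)
    rows = np.arange(blocks.shape[0])
    out[rows, winners] = blocks[rows, winners]  # only the winner "fires"
    return out.reshape(-1)

x = np.ones(4)
W = np.random.default_rng(1).standard_normal((4, 6))  # 3 blocks of 2 neurons
print(stochastic_lwta(x, W))
```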

The experiments, over a number of standard data sets, verify that highly efficient structures are obtained, in terms of the numbers of units, weights, and kernels, as well as in terms of bit-precision requirements, with no sacrifice in performance compared to previously published state-of-the-art results. Moreover, such networks turn out to exhibit much higher resilience to adversarial-example attacks.

The presentation mainly focuses on the concepts and the rationale behind the method and less on the mathematical details.

Professor Ali H. Sayed (EPFL, Switzerland)

Lecture Title: “Learning over Graphs”

Abstract: This talk explains how agents over a graph can learn from dispersed information and solve inference tasks of varying degrees of complexity through localized processing. The presentation also shows how information or misinformation diffuses over graphs, how beliefs are formed, and how the graph topology can hinder or enable manipulation. Examples will be considered in the context of social learning, teamwork, distributed optimization, and adversarial behavior.
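
As a minimal illustration of localized processing of this kind, the sketch below runs a diffusion-LMS-style strategy over a ring of five agents, each observing noisy local data about a common parameter; the topology, step size, and data model are assumptions chosen for the example, not the talk's specific setting.

```python
import numpy as np

rng = np.random.default_rng(1)
N, mu, steps = 5, 0.05, 2000
w_true = np.array([1.0, -2.0])        # common parameter all agents try to learn

# ring topology: each agent averages with its two neighbours (doubly stochastic A)
A = np.zeros((N, N))
for k in range(N):
    A[k, [k, (k - 1) % N, (k + 1) % N]] = 1 / 3

w = np.zeros((N, 2))                  # local estimates, one row per agent
for _ in range(steps):
    psi = np.empty_like(w)
    for k in range(N):                # adapt: purely local LMS step on local data
        u = rng.standard_normal(2)
        d = u @ w_true + 0.1 * rng.standard_normal()
        psi[k] = w[k] + mu * (d - u @ w[k]) * u
    w = A @ psi                       # combine: average over graph neighbours only

print(np.round(w, 2))                 # every agent's row approaches w_true
```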

Professor Patrick L. Combettes (North Carolina State University, USA)

Lecture Title: “Signal Recovery and Synthesis from Nonlinear Equation Models”

Abstract: Building on classical linear formulations, we posit that a broad class of problems in signal synthesis and in signal recovery are reducible to the basic task of finding a point in a closed convex subset of a Hilbert space that satisfies a number of nonlinear equations involving firmly nonexpansive operators. We investigate this formalism in the case when, due to inaccurate modeling or perturbations, the nonlinear equations are inconsistent. A relaxed formulation of the original problem is proposed in the form of a variational inequality. The properties of the relaxed problem are investigated, and a provably convergent block-iterative algorithm, whereby only blocks of the underlying firmly nonexpansive operators are activated at a given iteration, is devised to solve it. Numerical experiments illustrate robust recoveries in several signal and image processing applications.
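
The talk's algorithm is considerably more general but, as a toy instance of the block-activation idea, the sketch below seeks a common point of two firmly nonexpansive operators (here, projections onto a box and a halfspace) while activating only a random block of them at each iteration; the sets, activation rule, and relaxation are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# two firmly nonexpansive operators: projections onto a box and onto a halfspace
def proj_box(x, lo=-1.0, hi=1.0):
    return np.clip(x, lo, hi)

def proj_halfspace(x, a=np.array([1.0, 1.0]), b=0.5):
    # project onto {x : <a, x> <= b}
    viol = a @ x - b
    return x if viol <= 0 else x - viol * a / (a @ a)

ops = [proj_box, proj_halfspace]
x = np.array([3.0, -4.0])
for _ in range(200):
    # block-iterative step: activate a random nonempty subset of the operators,
    # then relax toward the average of the activated outputs
    block = [T for T in ops if rng.random() < 0.7] or [ops[rng.integers(2)]]
    x = 0.5 * x + 0.5 * np.mean([T(x) for T in block], axis=0)

print(np.round(x, 3))  # a point compatible with both constraints (consistent case)
```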

Based on joint work with Z. C. Woodstock.
