What: Independent component analysis (ICA) models a random vector as the superposition of independent components and identifies these components without resorting to any other prior knowledge.
Why: A typical application is blind signal separation (or the cocktail party problem): assume n independent non-Gaussian signals (say, n speakers in a room) and the observation of n mixtures of these signals (say, the output of n microphones in the room); the problem is to recover the source signals using only the assumption of statistical independence between the sources.
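The mixing model can be sketched numerically as follows (the dimensions, seed, and source distribution are illustrative assumptions, not from the talk): n independent non-Gaussian sources are mixed by an unknown matrix A, and only the mixtures X are observed.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 3, 10_000                     # n sources/microphones, T samples (illustrative)

# independent non-Gaussian sources: uniform signals, scaled to unit variance
S = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(n, T))

A = rng.normal(size=(n, n))          # unknown invertible mixing matrix
X = A @ S                            # the n observed mixtures ("microphone outputs")

# the separation problem: recover S from X alone, up to permutation and scale,
# using only the statistical independence of the sources
```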
Another perspective is data analysis: ICA yields (not necessarily orthogonal) components that are as independent as possible, as an alternative to principal components (PCA), which are constrained to be orthogonal but are only uncorrelated.
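The gap between uncorrelatedness and independence can be made concrete with a small sketch (the mixing matrix and source distribution are illustrative assumptions): PCA-whitened mixtures have identity covariance, yet a fourth-order cross-statistic still reveals their dependence.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 100_000
S = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, T))  # independent, unit variance
A = np.array([[1.0, 0.8], [0.2, 1.0]])                 # illustrative mixing matrix
X = A @ S

# PCA whitening: rotate and rescale so the sample covariance becomes the identity
C = X @ X.T / T
eigval, eigvec = np.linalg.eigh(C)
Y = np.diag(eigval ** -0.5) @ eigvec.T @ X

cov = Y @ Y.T / T                    # identity: the components are uncorrelated
# for independent unit-variance signals E[y1^2 y2^2] = 1, so this cross-statistic
# would vanish; here it does not, exposing the remaining dependence
cross4 = np.mean(Y[0] ** 2 * Y[1] ** 2) - 1.0
```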
How: Contrast functions: Because of the weakness of the assumptions underlying the ICA model, it is necessary to assume non-Gaussian components and to resort to other-than-second-order information (the problem bears strong similarity to blind deconvolution). I will describe entropic contrast functions (Shannon entropy, Kullback divergence, mutual information) whose optimization solves the ICA problem; I will interpret these contrasts as distances in the framework of information geometry and explain how they can be approximated by higher-order cumulants.
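A minimal sketch of the cumulant side of this story (the test distributions are illustrative): the fourth-order cumulant, or excess kurtosis, vanishes for Gaussian data, which is why its magnitude can serve as a cheap low-order proxy for the entropic contrasts above.

```python
import numpy as np

def kurtosis(y):
    """Normalized fourth-order cumulant: k4 = E[y^4]/(E[y^2])^2 - 3.

    It is zero for Gaussian data, negative for sub-Gaussian (e.g. uniform)
    and positive for super-Gaussian (e.g. Laplacian) data, so its magnitude
    measures non-Gaussianity in cumulant-based ICA contrasts.
    """
    y = y - np.mean(y)
    v = np.mean(y ** 2)
    return np.mean(y ** 4) / v ** 2 - 3.0

rng = np.random.default_rng(2)
g = rng.normal(size=200_000)             # Gaussian: kurtosis near 0
u = rng.uniform(-1.0, 1.0, size=200_000) # uniform: kurtosis near -1.2
l = rng.laplace(size=200_000)            # Laplacian: kurtosis near +3
```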
Equivariance: The ICA model is a transformation model: good off-line estimators should have the equivariance property. On-line (or adaptive) estimators can also be made equivariant in a special and strong sense; this is achieved by updating the decomposition in the direction of the stochastic _Lie_ derivative of a contrast function. I show that this device generates on-line algorithms which are extremely simple but behave in a nearly optimal fashion.
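One such equivariant on-line rule can be sketched as a relative-gradient update, W ← W + μ(I − φ(y)yᵀ)W with y = Wx; the step size, cubic nonlinearity, and mixing matrix below are illustrative assumptions, not the talk's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)
n, T = 2, 60_000
S = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(n, T))  # sub-Gaussian, unit variance
A = np.array([[1.0, 0.6], [0.4, 1.0]])                 # illustrative mixing matrix
X = A @ S

W = np.eye(n)            # separating matrix estimate
mu = 1e-3                # small constant step size (assumption)
phi = lambda y: y ** 3   # cubic nonlinearity, suited to sub-Gaussian sources

for t in range(T):
    y = W @ X[:, t]
    # relative-gradient update: the correction multiplies W on the left by a
    # function of the output y only, so the trajectory of the global system
    # W A does not depend on A -- the equivariance property
    W += mu * (np.eye(n) - np.outer(phi(y), y)) @ W

G = W @ A                # global system: should approach a scaled permutation
```

The key design point is that the update is driven entirely by the outputs y, never by X directly; this is what makes the convergence behavior uniform over all mixing matrices.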
Does it work? The talk will be illustrated by the results of processing ECG data and digital communication signals.