Raphaël Berthier

Briefly
I am a first-year Ph.D. student under the supervision of Francis Bach and Pierre Gaillard. I work in the SIERRA team in Paris, which is a joint team between INRIA Paris, ENS Paris and CNRS.
My research interests lie mainly within statistics, optimization and probability theory. More precisely, my current work focuses on developing polynomial-based iterative methods to accelerate the sharing of information in decentralized networks, drawing inspiration from numerical analysis. Before that, I worked under the supervision of Andrea Montanari to develop a rigorous analysis of Approximate Message Passing (AMP) algorithms in the case of non-separable denoisers.
Here is a short CV.
Contact

Publications and Preprints
R. Berthier, F. Bach, P. Gaillard. Gossip of Statistical Values using Orthogonal Polynomials. [hal, arXiv], 2018.
Abstract: Consider a network of agents connected by communication links, where each agent holds a real value. The gossip problem consists in estimating the average of the values diffused in the network in a distributed manner. Current techniques for gossiping are designed to deal with worst-case scenarios, which is irrelevant in applications to distributed statistical learning and denoising in sensor networks. We design second-order gossip methods tailor-made for the case where the real values are i.i.d. samples from the same distribution. In some regular network structures, we are able to prove optimality of our methods, and simulations suggest that they are efficient in a wide range of random networks. Our approach to gossip stems from a new acceleration framework using the family of orthogonal polynomials with respect to the spectral measure of the network graph.
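For intuition, here is a minimal sketch of the plain first-order gossip iteration that such methods accelerate: every agent repeatedly replaces its value with a weighted average of its neighbors' values, so all values converge to the network average. The cycle graph, averaging weights, and iteration count below are illustrative choices, not the paper's setup.

```python
import numpy as np

n = 20
rng = np.random.default_rng(0)
x = rng.normal(size=n)          # one i.i.d. sample per agent
target = x.mean()               # the quantity gossip should estimate

# Doubly stochastic gossip matrix of the n-cycle:
# each agent averages with its two neighbors.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

for _ in range(500):
    x = W @ x                   # one synchronous gossip round

# After enough rounds, every agent's value is close to the average.
```

The convergence speed of this baseline is governed by the spectral gap of W; the polynomial-based methods in the paper improve on this by combining past iterates.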
R. Berthier, A. Montanari, P.M. Nguyen. State Evolution for Approximate Message Passing with Non-Separable Functions. [arXiv], 2017, accepted for publication in Information and Inference: A Journal of the IMA.
Abstract: Given a high-dimensional data matrix A, Approximate Message Passing (AMP) algorithms construct sequences of vectors u^t, v^t, indexed by t, by iteratively applying A or A^T, and suitable nonlinear functions, which depend on the specific application. Special instances of this approach have been developed –among other applications– for compressed sensing reconstruction, robust regression, Bayesian estimation, low-rank matrix recovery, phase retrieval, and community detection in graphs. For certain classes of random matrices A, AMP admits an asymptotically exact description in the high-dimensional limit, which goes under the name of ‘state evolution.’
Earlier work established state evolution for separable nonlinearities (under certain regularity conditions). Nevertheless, empirical work demonstrated several important applications that require non-separable functions. In this paper we generalize state evolution to Lipschitz continuous non-separable nonlinearities, for Gaussian matrices A. Our proof makes use of Bolthausen's conditioning technique along with several approximation arguments. In particular, we introduce a modified algorithm (called LAMP for Long AMP) which is of independent interest.
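To make the iteration concrete, here is a minimal sketch of a standard AMP recursion for sparse regression with a separable soft-threshold denoiser — the simple case that the paper's analysis generalizes to non-separable denoisers. The problem sizes, threshold rule, and iteration count are illustrative choices, not the paper's setting.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 400, 200                                    # signal dim, measurements
delta = m / n
x0 = rng.normal(size=n) * (rng.random(n) < 0.1)    # sparse ground truth
A = rng.normal(size=(m, n)) / np.sqrt(m)           # Gaussian sensing matrix
y = A @ x0                                         # noiseless observations

def soft(u, t):
    """Soft-threshold denoiser: shrink u toward 0 by t."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

x = np.zeros(n)
z = y.copy()
for _ in range(30):
    pseudo = x + A.T @ z                           # effective observation
    theta = 1.5 * np.sqrt(np.mean(z ** 2))         # threshold ~ residual level
    x = soft(pseudo, theta)
    # Onsager correction: fraction of active coordinates, rescaled by 1/delta,
    # multiplies the previous residual (the term that distinguishes AMP from
    # plain iterative thresholding).
    b = (np.abs(x) > 0).mean() / delta
    z = y - A @ x + b * z
```

The Onsager term `b * z` is what makes the effective noise in `pseudo` approximately Gaussian, which is the heuristic behind the state evolution description.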
