Raphaël Berthier


Briefly

I am a first-year Ph.D. student under the supervision of Francis Bach and Pierre Gaillard. I work in the SIERRA team in Paris, which is a joint team between INRIA Paris, ENS Paris and CNRS.

My research interests lie mainly in statistics, optimization and probability theory. More precisely, my current work focuses on developing polynomial-based iterative methods to accelerate the sharing of information in decentralized networks, drawing inspiration from numerical analysis. Before that, I worked under the supervision of Andrea Montanari on a rigorous analysis of Approximate Message Passing algorithms with non-separable denoisers.

Here is a short CV.

Contact

Publications and Preprints

  • R. Berthier, F. Bach, P. Gaillard. Gossip of Statistical Values using Orthogonal Polynomials.
    [hal, arXiv], 2018. (A toy sketch of gossip averaging is given after the publication list.)

    Abstract: Consider a network of agents connected by communication links, where each agent holds a real value. The gossip problem consists in estimating the average of the values diffused in the network in a distributed manner. Current techniques for gossiping are designed to deal with worst-case scenarios, which is irrelevant in applications to distributed statistical learning and denoising in sensor networks. We design second-order gossip methods tailor-made for the case where the real values are i.i.d. samples from the same distribution. In some regular network structures, we are able to prove optimality of our methods, and simulations suggest that they are efficient in a wide range of random networks. Our approach to gossip stems from a new acceleration framework using the family of orthogonal polynomials with respect to the spectral measure of the network graph.

  • R. Berthier, A. Montanari, P.-M. Nguyen. State Evolution for Approximate Message Passing with Non-Separable Functions.
    [arXiv], 2017, accepted for publication in Information and Inference: A Journal of the IMA. (A toy sketch of an AMP iteration is given after the publication list.)

    Abstract: Given a high-dimensional data matrix A ∈ ℝ^{n×m}, Approximate Message Passing (AMP) algorithms construct sequences of vectors u^t ∈ ℝ^n, v^t ∈ ℝ^m, indexed by t ∈ {0, 1, 2, …}, by iteratively applying A or A^T, and suitable non-linear functions, which depend on the specific application. Special instances of this approach have been developed, among other applications, for compressed sensing reconstruction, robust regression, Bayesian estimation, low-rank matrix recovery, phase retrieval, and community detection in graphs. For certain classes of random matrices A, AMP admits an asymptotically exact description in the high-dimensional limit m, n → ∞, which goes under the name of ‘state evolution.’ Earlier work established state evolution for separable non-linearities (under certain regularity conditions). Nevertheless, empirical work demonstrated several important applications that require non-separable functions. In this paper we generalize state evolution to Lipschitz continuous non-separable non-linearities, for Gaussian matrices A. Our proof makes use of Bolthausen's conditioning technique along with several approximation arguments. In particular, we introduce a modified algorithm (called LAMP for Long AMP) which is of independent interest.
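
The following sketch is purely illustrative and is not taken from the gossip paper above: it runs plain first-order gossip, x_{t+1} = W x_t, on a cycle graph, together with a generic second-order (heavy-ball-style) recurrence, to give an idea of what second-order gossip and polynomial acceleration refer to. The cycle graph, the gossip matrix W and the momentum parameter omega are arbitrary choices for the example; the orthogonal-polynomial construction of the paper is not reproduced here.

```python
# Toy illustration of gossip averaging on a cycle graph (NumPy only).
# The orthogonal-polynomial method of the paper is NOT reproduced here; this
# shows plain "simple gossip" x_{t+1} = W x_t and a generic second-order
# (heavy-ball-style) recurrence, to illustrate what polynomial acceleration means.
import numpy as np

def cycle_gossip_matrix(n):
    """Symmetric doubly stochastic gossip matrix W for the cycle graph on n nodes."""
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 0.5
        W[i, (i - 1) % n] = 0.25
        W[i, (i + 1) % n] = 0.25
    return W

def simple_gossip(W, x, T):
    """First-order gossip: every node repeatedly averages with its neighbours."""
    for _ in range(T):
        x = W @ x
    return x

def second_order_gossip(W, x, T, omega=1.9):
    """Second-order recurrence x_{t+1} = omega * W x_t + (1 - omega) * x_{t-1}.
    omega is a hand-picked momentum parameter, not a value tuned as in the paper."""
    x_prev, x_cur = x.copy(), W @ x
    for _ in range(T - 1):
        x_prev, x_cur = x_cur, omega * (W @ x_cur) + (1 - omega) * x_prev
    return x_cur

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 100
    values = rng.normal(size=n)        # i.i.d. samples held by the agents
    W = cycle_gossip_matrix(n)
    target = values.mean()
    for T in (10, 50, 200):
        err1 = np.max(np.abs(simple_gossip(W, values, T) - target))
        err2 = np.max(np.abs(second_order_gossip(W, values, T) - target))
        print(f"T={T:4d}  simple gossip err={err1:.3e}  second-order err={err2:.3e}")
```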
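
Similarly, the sketch below is a toy AMP iteration with the classical separable soft-thresholding denoiser (in the style of Donoho, Maleki and Montanari), shown only to illustrate the structure of the vector sequences and the Onsager correction term mentioned in the abstract. It does not implement the non-separable setting or the LAMP algorithm studied in the paper; the threshold rule and problem sizes are arbitrary illustrative choices.

```python
# Toy AMP for compressed sensing with a separable soft-thresholding denoiser.
# This is NOT the non-separable setting analysed in the paper; it only
# illustrates the structure of an AMP iteration and the Onsager correction.
import numpy as np

def soft_threshold(x, theta):
    """Entrywise soft-thresholding denoiser eta(x; theta)."""
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def amp(y, A, n_iter=30, alpha=1.5):
    """Run AMP on y = A x0 + noise for an i.i.d. Gaussian A scaled by 1/sqrt(n)."""
    n, m = A.shape
    x = np.zeros(m)          # current signal estimate (a vector in R^m)
    z = y.copy()             # current residual (a vector in R^n)
    for _ in range(n_iter):
        theta = alpha * np.linalg.norm(z) / np.sqrt(n)   # heuristic threshold
        x_new = soft_threshold(A.T @ z + x, theta)
        # Onsager correction: residual times the average derivative of the denoiser
        onsager = (z / n) * np.count_nonzero(x_new)
        z = y - A @ x_new + onsager
        x = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, m, k = 250, 500, 25                     # measurements, dimension, sparsity
    A = rng.normal(size=(n, m)) / np.sqrt(n)   # i.i.d. Gaussian sensing matrix
    x0 = np.zeros(m)
    x0[rng.choice(m, size=k, replace=False)] = rng.normal(size=k)
    y = A @ x0 + 0.01 * rng.normal(size=n)
    x_hat = amp(y, A)
    print("relative error:", np.linalg.norm(x_hat - x0) / np.linalg.norm(x0))
```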