In this talk, we address the problem of privacy-preserving training and evaluation of neural networks in an N-party, federated learning setting.

We propose a novel system, POSEIDON, the first of its kind in the regime of privacy-preserving neural network training. Building on homomorphic encryption and secure multi-party computation, it employs multiparty lattice-based cryptography to preserve the confidentiality of the training data, the model, and the evaluation data, under a passive-adversary model that tolerates collusion between up to N−1 parties. We also introduce arbitrary linear transformations within the cryptographic bootstrapping operation, optimizing the costly cryptographic computations by distributing them across the parties.
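As a minimal illustration of the additively homomorphic property that such encrypted aggregation relies on, the toy sketch below uses the classic Paillier scheme, where multiplying two ciphertexts yields an encryption of the sum of the plaintexts. This is a stand-in for intuition only: POSEIDON itself uses multiparty lattice-based (CKKS) encryption via Lattigo, which this small number-theoretic scheme does not attempt to model, and the demo-sized key is far too small for real use.

```python
import random
from math import gcd

# Toy Paillier cryptosystem: Enc(a) * Enc(b) mod n^2 decrypts to a + b,
# so an aggregator can sum encrypted model updates without ever seeing
# the plaintexts. Illustrative stand-in only, NOT POSEIDON's scheme.

def _is_prime(n):
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

def _next_prime(n):
    while not _is_prime(n):
        n += 1
    return n

def keygen():
    # Demo-sized primes; a real deployment would use ~2048-bit primes.
    p = _next_prime(10 ** 6)
    q = _next_prime(p + 1)
    n = p * q
    lam = (p - 1) * (q - 1)        # multiple of lcm(p-1, q-1), works for g = n + 1
    mu = pow(lam, -1, n)           # modular inverse (Python >= 3.8)
    return (n,), (n, lam, mu)      # public key, private key

def encrypt(pk, m):
    (n,) = pk
    while True:                    # fresh randomness, coprime to n
        r = random.randrange(1, n)
        if gcd(r, n) == 1:
            break
    # With g = n + 1, g^m = 1 + m*n (mod n^2).
    return ((1 + m * n) * pow(r, n, n * n)) % (n * n)

def decrypt(sk, c):
    n, lam, mu = sk
    x = pow(c, lam, n * n)
    return ((x - 1) // n) * mu % n

pk, sk = keygen()
c1 = encrypt(pk, 17)                  # e.g. party 1's (scaled) update entry
c2 = encrypt(pk, 25)                  # party 2's update entry
c_sum = (c1 * c2) % (pk[0] ** 2)      # homomorphic addition on ciphertexts
assert decrypt(sk, c_sum) == 42       # sum recovered, inputs never decrypted
```

In a federated setting, each party would encrypt its local update under a shared public key, and only the aggregate would ever be decrypted; POSEIDON additionally keeps the model itself encrypted throughout training.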

We also mention Lattigo, our quantum-resistant open-source cryptographic library on which POSEIDON is based. Our experimental results show that POSEIDON achieves accuracy similar to centralized (or decentralized) non-private approaches and that its computation and communication overhead scales linearly with the number of parties.

Furthermore, we explain how we are using these techniques for the federated analysis of medical data, in particular for genome-wide association studies. Finally, we mention our joint work with lawyers demonstrating GDPR compliance, and discuss the creation of the start-up Tune Insight.