PhD Position F/M Neural Implicit Representation and - Strasbourg, France - Inria

Inria
Strasbourg, France

Posted 3 weeks ago

Posted by: Sophie Dupont (beBee Recruiter)


Description
The job description below is in English.


Contract type:

Fixed-term contract (CDD)

Required degree:
Master's degree (Bac+5) or equivalent

Position:
PhD student


Context and benefits of the position:


Location:

The PhD will be hosted at the University of Strasbourg, in the city center, and funded by the PDE-IA project of the PEPR IA programme.

Since this is a co-supervision with UKAEA Culham, the student is expected to visit the Culham campus (near Oxford) for 3 to 6 weeks per year.


PhD director:

Emmanuel Franck (INRIA, Unistra)


PhD supervisors:
Victor Michel-Dansac (INRIA, Unistra) and Stanislas Pamela (UKAEA Culham).


Potential Collaborators:

As part of this project, the student will interact with several researchers, including Laurent Navoret (Unistra) and Joubine Aghili (Unistra), and will be immersed in a stimulating ecosystem of researchers and students through two large projects: PEPR PDE-IA and Numpex (exascale computing).

UKAEA's existing links with Nvidia and Caltech will also be relevant to the PhD project.


Assignment:


Main activities:


In light of the significant successes achieved by deep learning methods in computer vision and language processing, new learning-based methods have emerged for the simulation and solution of PDEs.

We can mention PINN methods, which solve a PDE by replacing the finite-element approximation with a neural network trained to minimize the residual of the equation.
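To make the PINN idea concrete, here is a minimal sketch (not part of the project description; the network size, learning rate, and the model problem u' = -u with u(0) = 1 are all illustrative choices). A tiny tanh network is trained to make the equation residual vanish at collocation points, using numerical gradients to keep the code self-contained:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny one-hidden-layer network u(x; theta) = a . tanh(w x + b) + c
H = 8
theta = 0.5 * rng.standard_normal(3 * H + 1)

def unpack(t):
    return t[:H], t[H:2 * H], t[2 * H:3 * H], t[-1]

def u(t, x):
    a, w, b, c = unpack(t)
    return np.tanh(np.outer(x, w) + b) @ a + c

def du_dx(t, x):
    # Analytic derivative of the network with respect to x
    a, w, b, _ = unpack(t)
    s = np.tanh(np.outer(x, w) + b)
    return (1.0 - s ** 2) @ (a * w)

x_col = np.linspace(0.0, 1.0, 64)  # collocation points

def loss(t):
    residual = du_dx(t, x_col) + u(t, x_col)     # residual of u' = -u
    bc = u(t, np.array([0.0]))[0] - 1.0          # boundary condition u(0) = 1
    return np.mean(residual ** 2) + bc ** 2

def num_grad(t, eps=1e-6):
    # Central finite differences: fine for a 25-parameter sketch
    g = np.zeros_like(t)
    for i in range(t.size):
        e = np.zeros_like(t)
        e[i] = eps
        g[i] = (loss(t + e) - loss(t - e)) / (2.0 * eps)
    return g

loss0 = loss(theta)
for _ in range(1000):                # plain gradient descent
    theta = theta - 0.02 * num_grad(theta)
print(loss0, loss(theta))            # the physics-informed loss decreases
```

In a real PINN code the gradients would of course come from automatic differentiation, but the structure (residual plus boundary penalty, minimized over network parameters) is the same.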


A first approach will be to consider the Neural Galerkin method [BPVE24], which maintains an ODE structure in time but approximates the spatial part, as well as the parametric dependence of the PDE, with a neural network.

This method exploits the good high-dimensional approximation properties of neural networks to reduce the number of degrees of freedom.
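The parameter ODE behind Neural Galerkin can be shown on a deliberately tiny example (the ansatz, equation, and all settings here are illustrative, not from the project): with the one-parameter ansatz u(x; θ) = exp(-(x - θ)²) for the advection equation u_t + c u_x = 0, least-squares projection of the dynamics onto ∂u/∂θ recovers θ̇ = c exactly, so the bump advects at the right speed:

```python
import numpy as np

# One-parameter ansatz u(x; theta) = exp(-(x - theta)^2): theta is the bump centre.
def u(theta, x):
    return np.exp(-(x - theta) ** 2)

def du_dtheta(theta, x):
    return 2.0 * (x - theta) * u(theta, x)

def du_dx(theta, x):
    return -2.0 * (x - theta) * u(theta, x)

c = 1.0                        # advection speed in u_t + c u_x = 0
x = np.linspace(-5.0, 5.0, 201)  # collocation points

def theta_dot(theta):
    # Neural Galerkin step: solve min_s || (du/dtheta) s - rhs ||^2
    J = du_dtheta(theta, x)[:, None]
    rhs = -c * du_dx(theta, x)       # right-hand side u_t = -c u_x
    return np.linalg.lstsq(J, rhs, rcond=None)[0][0]

# Explicit Euler in time on the parameter ODE
theta, dt = 0.0, 0.01
for _ in range(100):
    theta += dt * theta_dot(theta)
print(theta)   # ≈ 1.0: the bump has advected a distance c * T = 1
```

With a genuine neural network the Jacobian ∂u/∂θ has many columns and the least-squares system becomes the Gram system M(θ) θ̇ = F(θ), but the projection principle is the one above.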

We propose to couple this approach with recent PINN techniques to handle general geometries. Secondly, we aim to study long-term stability, which is a critical problem, by incorporating the structure of the equations [Sun19], using splitting schemes to preserve that structure, or combining the scheme with "stabilization" methods [BP24]. One of the key points will be to determine robust neural network architectures.

The second approach will focus on neural operators. Early results have shown that this is a promising direction. However, long-term stability issues remain significant. We wish to explore several methods to improve long-term approximations [MHSB23, LVP+23] and extend them to multi-scale configurations.
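The long-term stability issue already shows up in a scalar caricature (purely illustrative numbers): when a surrogate is applied autoregressively, a slight bias in its one-step behaviour compounds exponentially over the rollout:

```python
# True dynamics u_{n+1} = a * u_n versus a surrogate with a slightly biased
# one-step multiplier: the relative error grows like (a_hat / a)^n - 1.
a_true, a_hat = 0.95, 0.96        # illustrative one-step multipliers
u_true = u_hat = 1.0
rel_err = []
for _ in range(100):              # autoregressive rollout: feed outputs back in
    u_true *= a_true
    u_hat *= a_hat
    rel_err.append(abs(u_hat - u_true) / abs(u_true))
print(rel_err[0], rel_err[-1])    # the 100-step error dwarfs the one-step error
```

This is why neural operator research targets the rollout error directly (for example through refinement or structure-preserving training) rather than only the one-step error.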

In addition to these general approaches, we can also study how to incorporate the structure of the physical problem into the architecture of the operators.

The obtained approaches will be coupled with methods capable of dealing with general geometries, such as [LKC+24] and [BET22], which use parameterized integral kernels in the physical domain.

Purely neural methods will remain limited in precision.

For this reason, we would ultimately like to couple them with more classical numerical methods, to obtain algorithms that are both faster than traditional approaches and reliable.

This type of coupling has already yielded very encouraging results [FMDN23].
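One generic way such a coupling can work (a sketch with made-up numbers, not the specific method of [FMDN23]): use the neural prediction only as an initial guess, and let a classical iteration provide the reliability. Here a stand-in "surrogate" warm-starts Jacobi iterations on a 1D Poisson system:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
# Standard 1D Poisson (tridiagonal Laplacian) system A x = b
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)

# Stand-in for a neural surrogate: the exact solution plus a small error
x_surrogate = x_star + 0.05 * rng.standard_normal(n)

def jacobi(x, iters):
    # Classical Jacobi iteration x <- D^{-1} (b - R x)
    D = np.diag(A)
    R = A - np.diag(D)
    for _ in range(iters):
        x = (b - R @ x) / D
    return x

err_cold = np.linalg.norm(jacobi(np.zeros(n), 50) - x_star)
err_warm = np.linalg.norm(jacobi(x_surrogate.copy(), 50) - x_star)
print(err_cold, err_warm)  # the surrogate-initialized run is far more accurate
```

The classical iteration guarantees convergence regardless of the surrogate's quality, while a good surrogate guess drastically cuts the number of iterations needed, which is the spirit of the hybrid algorithms targeted here.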

[BET22] Nicolas Boullé, Christopher J. Earls, and Alex Townsend. Data-driven discovery of Green's functions with human-understandable deep learning. Scientific Reports, 12(1):4824, 2022.

[BP24] Jules Berman and Benjamin Peherstorfer. Randomized sparse Neural Galerkin schemes for solving evolution equations with deep networks. Advances in Neural Information Processing Systems, 36, 2024.

[BPVE24] Joan Bruna, Benjamin Peherstorfer, and Eric Vanden-Eijnden. Neural Galerkin schemes with active learning for high-dimensional evolution equations. Journal of Computational Physics, 496:112588, 2024.

[CZP+24] N. Carey, L. Zanisi, S. Pamela, V. Gopakumar, J. Omotani, J. Buchanan, and J. Brandstetter. Data efficiency and long term prediction capabilities for neural operator surrogate models of core and edge plasma codes. arXiv preprint, 2024.

[FMDN23] Emmanuel Franck, Victor Michel-Dansac, and Laurent Navoret. Approximately well-balanced discontinuous Galerkin methods using bases enriched with physics-informed neural networks. arXiv preprint, 2023.

[GPZ+23] Vignesh Gopakumar, Stanislas Pamela, Lorenzo Zanisi, Zongyi Li, Ander Gray, Daniel Brennand, Nitesh Bhatia, Gregory Stathopoulos, Matt Kusner, Marc Peter Deisenroth, et al. Plasma surrogate modelling using Fourier neural operators. arXiv preprint, 2023.

[LKC+24] Zongyi Li, Nikola Kovachki, Chris Choy, Boyi Li, Jean Kossaifi, Shourya Otta, Mohammad Amin Nabian, Maximilian Stadler, Christian Hundt, Kamyar Azizzadenesheli, et al. Geometry-informed neural operator for large-scale 3D PDEs. Advances in Neural Information Processing Systems, 36, 2024.

[LVP+23] Phillip Lippe, Bastiaan S. Veeling, Paris Perdikaris, Richard E. Turner, and Johannes Brandstetter. PDE-Refiner: Achieving accurate long rollouts with temporal neural PDE solvers. In Thirty-seventh Conference on Neural Information Processing Systems, 2023.
