Efficient Transformers and Diffusion Models for Science

Professor: Aparna Chandramowlishwaran

Description: We are developing efficient neural operators based on transformer architectures to solve scientific problems that traditionally require expensive numerical simulations. Our focus is on creating scalable attention mechanisms that can handle varying resolutions of scientific data while maintaining the mathematical properties of neural operators, with applications ranging from fluid dynamics to molecular systems. The Cal-Bridge scholar would work on enhancing our domain decomposition-based attention mechanisms to handle complex boundary conditions and on implementing efficient versions of these operators for specific scientific applications, while gaining experience with both deep learning and scientific computing.
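To make the domain-decomposition idea concrete, below is a minimal, hypothetical PyTorch sketch of subdomain-restricted attention: the discretized field is split into fixed-size subdomains and self-attention is computed independently within each one, so the cost grows linearly rather than quadratically in the number of grid points. The names (SubdomainAttention, subdomain_size) and the simplifications are illustrative assumptions, not the group's actual implementation.

    import torch
    import torch.nn as nn

    class SubdomainAttention(nn.Module):
        # Hypothetical sketch: self-attention restricted to fixed-size
        # subdomains of a 1D discretized field. A real domain-decomposition
        # operator would also handle irregular subdomains, varying
        # resolutions, and coupling across subdomain boundaries.
        def __init__(self, dim, num_heads=4, subdomain_size=64):
            super().__init__()
            self.subdomain_size = subdomain_size
            self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

        def forward(self, x):
            # x: (batch, num_points, dim); this sketch assumes num_points
            # is evenly divisible by subdomain_size (pad otherwise).
            b, n, d = x.shape
            s = self.subdomain_size
            assert n % s == 0, "pad the field so subdomains tile it evenly"
            # Fold each subdomain into the batch dimension so attention
            # is computed within subdomains only, giving linear scaling
            # in the number of points.
            x = x.reshape(b * (n // s), s, d)
            out, _ = self.attn(x, x, x)
            return out.reshape(b, n, d)

    # Example: two fields sampled at 256 points with 32 channels each.
    layer = SubdomainAttention(dim=32, subdomain_size=64)
    u = torch.randn(2, 256, 32)
    print(layer(u).shape)  # torch.Size([2, 256, 32])

Restricting attention to subdomains is what allows scaling to fine discretizations; how information is exchanged across subdomain boundaries (for example via halo regions or shifted windows) is where questions about complex boundary conditions arise.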

Preferred Qualifications: Python is a must. Familiarity with PyTorch is a plus. Familiarity with transformer and diffusion models will accelerate the research but is not mandatory.
