How Neural Networks Transform Data


Professor: Alex Cloninger

Description: We will explore how the relationships and distances between data points change as they pass through the layers of a deep neural network, and how those relationships evolve as the network's weights change during training. From this perspective, we will look at questions involving transfer learning, data summarization for efficient training, and modeling training dynamics. These questions can be approached through several lenses, including building mathematical theory, designing computationally efficient algorithms and code, and examining the dynamics in a specific scientific domain. Several individual problems can branch off from here, depending on the strengths and interests of the REU participant.
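As a rough illustration of the kind of analysis involved, the sketch below passes a batch of points through a small network and records how the pairwise distance structure changes at each layer. The architecture, layer sizes, and random data are illustrative assumptions, not a prescribed setup for the project.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy data: 100 points in 20 dimensions.
x = torch.randn(100, 20)

# A small fully connected network; we record the representation after each layer.
layers = nn.ModuleList([
    nn.Sequential(nn.Linear(20, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 10)),
])

def pairwise_distances(z):
    """Euclidean distance matrix between the rows of z."""
    return torch.cdist(z, z)

with torch.no_grad():
    h = x
    for i, layer in enumerate(layers):
        h = layer(h)
        d = pairwise_distances(h)
        # One simple summary of how the geometry changes with depth.
        print(f"layer {i}: mean pairwise distance = {d.mean():.3f}")
```

Repeating this measurement at different points during training gives one window into how the network's weights reshape the geometry of the data over time.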

Preferred Qualifications: A linear algebra course; experience with Python (preferably including an NN package such as TensorFlow or PyTorch); a course on graphs/networks or deep learning is a plus.


