
WILLIAM T. REDMAN

I am a Senior Research Scientist at the Johns Hopkins Applied Physics Lab (APL), working on problems at the intersection of machine learning, neuroscience, complex systems, and robotics in the Intelligent Systems Center (ISC). My CV.


Before joining APL, I was a Dynamical Neuroscience (DYNS) PhD student at UC Santa Barbara (UCSB), where I worked in Prof. Michael Goard's systems neuroscience lab as a Chancellor's Fellow. Before UCSB, I earned bachelor's degrees in mathematics and physics from New York University (NYU).


UCSB is home to an exciting and growing neuroscience community, and DYNS is a wonderfully unique program for people interested in engaging with the many different facets of the field. Feel free to send any questions about DYNS my way!


NEWS: I have three papers accepted to NeurIPS 2024 (including one Spotlight). I'll be in Vancouver in December and would be happy to meet!


I have recently been invited to serve as an Action Editor for Transactions on Machine Learning Research (TMLR). I'm especially interested in supporting work at the intersection of Koopman operator theory and machine learning; feel free to reach out with any questions about submitting to TMLR!


RESEARCH

My research interests are in understanding how the brain performs computations that underlie and support critical cognitive processes, such as memory and learning, and in using this insight to expand the capabilities of machine learning methods. In particular, I have four broad focus areas: 1) neural computations; 2) sparse machine learning; 3) dynamics of learning; and 4) learning of dynamics.


See below, as well as my Google Scholar and ResearchGate profiles, for more information.


NEURAL COMPUTATIONS

Coming soon.


SPARSE MACHINE LEARNING

Coming soon.


DYNAMICS OF LEARNING

Coming soon.


LEARNING OF DYNAMICS

Coming soon.

PUBLICATIONS


PEER REVIEWED:


13) W. T. Redman, J. M. Bello-Rivas, M. Fonoberova, R. Mohr, I. G. Kevrekidis, and I. Mezić, Identifying Equivalent Training Dynamics. NeurIPS Spotlight (2024)

12) W. T. Redman, F. Acosta, S. Acosta-Mendoza, and N. Miolane, Not so griddy: Internal representations of RNNs path integrating more than one agent. NeurIPS (2024)

11) F. Acosta, F. Dinc, W. T. Redman, M. Madhav, D. Klindt, and N. Miolane, Global Distortions from Local Rewards: Neural Coding Strategies in Path-Integrating Neural Systems. NeurIPS (2024)

10) W. T. Redman*, S. Acosta-Mendoza*, X. X. Wei, and M. J. Goard, Robust Variability of Grid Cell Properties Within Individual Grid Modules Enhances Encoding of Local Space. eLife (2024) (* contributed equally)

9) E. R. J. Levy, S. Carrillo-Segura, E. H. Park, W. T. Redman, J. Hurtado, S. Y. Chung, and A. A. Fenton, A manifold neural population code for space in hippocampal coactivity dynamics independent of place fields. Cell Reports (2023)

8) W. T. Redman, M. Fonoberova, R. Mohr, Y. Kevrekidis, and I. Mezić, Algorithmic (Semi-)Conjugacy via Koopman Operator Theory. IEEE Conference on Decision and Control (CDC 2022)

7) W. T. Redman, N. S. Wolcott, L. Montelisciani, G. Luna, T. D. Marks, K. K. Sit, C.-H. Yu, S. L. Smith, and M. J. Goard, Long-term Transverse Imaging of the Hippocampus with Glass Microperiscopes. eLife (2022)

6) W. T. Redman, T. Chen, Z. Wang, and A. S. Dogra, Universality of Winning Tickets: A Renormalization Group Perspective. International Conference on Machine Learning (ICML 2022).

5) W. T. Redman, M. Fonoberova, R. Mohr, Y. Kevrekidis, and I. Mezić, An Operator Theoretic View on Pruning Deep Neural Networks. International Conference on Learning Representations (ICLR 2022).

4) W. T. Redman, On Koopman Mode Decomposition and Tensor Component Analysis. Chaos Fast Track (2021).

3) A. S. Dogra* and W. T. Redman*, Optimizing Neural Networks via Koopman Operator Theory. Advances in Neural Information Processing Systems 33 (NeurIPS 2020) (* contributed equally)

2) W. T. Redman, Renormalization group as a Koopman operator. Physical Review E Rapid Communication (2020)

1) W. T. Redman, An O(n) method of calculating Kendall correlations of spike trains. PLoS One (2019)

IN PROGRESS:

3) W. T. Redman, Z. Wang, A. Ingrosso, and S. Goldt, Sparsity Enhances Non-Gaussian Data Statistics During Local Receptive Field Formation.

2) W. T. Redman, D. Huang, M. Fonoberova, and I. Mezić, Koopman Learning with Episodic Memory.

1) N. S. Wolcott, W. T. Redman, M. Karpinska, E. G. Jacobs, and M. J. Goard, The estrous cycle modulates hippocampal spine dynamics, dendritic processing, and spatial coding.


ABOUT

I grew up in Hopewell, New Jersey, a little town outside of Princeton and Trenton. I found math class repulsive and, until I had a rather sudden change of heart my senior year, spent most of the period working on ways to avoid paying attention. Outside of research, I enjoy hiking and cycling (both of which are complicated by my innately poor sense of direction) and trying new beers.
