Welcome to my research page! My name is Gage and I’m a postdoc in the Princeton University Physics and ORFE departments, working with Boris Hanin and Dan Marlow. I’m interested in developing principles for initialization and architecture selection in machine learning and applying them to physics tasks, for example charged particle tracking and jet classification at the Large Hadron Collider and stellar stream identification in astronomical datasets. Outside of research, I love to teach and serve as a lecturer in the Physics Department during some semesters. I also organize the Machine Learning in Physics seminar (running jointly in Physics and ORFE) along with Boris, Dan, Mariangela Lisanti, and Peter Elmer. The focus is on interdisciplinary work, both physics for ML and ML for physics. We’ve been lucky to host excellent speakers from a broad array of subfields; check out our seminar webpage for more details.

Here’s what I’ve been up to recently:

[3-25-24] Taught a Princeton Research Computing workshop called Welcome to the Neural Network Zoo: A Survey of Architectures for Your Research. The Jupyter notebooks and slides are available at my neural-network-zoo repo.

[1-24] Co-taught the Introduction to Machine Learning course, which took place over 5 days during the Princeton Wintersession. The full course is available at our intro_machine_learning Git repo; I taught Days 4 and 5, which focused on a survey of deep learning architectures and a computer vision hackathon, respectively.

[12-6-23] Posted proceedings from CTD 2023 describing our updated object condensation pipeline, which adds learned graph construction and drops the explicit edge classification stage. Joint work with Kilian Lieret.
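As a loose illustration of what learned graph construction means here, the sketch below connects each hit to its nearest neighbors in a learned embedding space instead of using geometric heuristics. The embedding network, feature dimensions, and choice of k are placeholders, not the actual pipeline configuration:

```python
# Loose sketch of learned graph construction: embed each hit with a small
# network, then connect every hit to its k nearest neighbors in that learned
# space. All shapes and hyperparameters below are illustrative placeholders.
import torch

def build_knn_graph(hit_features: torch.Tensor, embed: torch.nn.Module, k: int = 8) -> torch.Tensor:
    """Return a (2, num_hits * k) edge index connecting each hit to its k
    nearest neighbors in the learned embedding space."""
    z = embed(hit_features)                       # (num_hits, embed_dim)
    dists = torch.cdist(z, z)                     # pairwise distances in latent space
    dists.fill_diagonal_(float("inf"))            # disallow self-edges
    nbrs = dists.topk(k, largest=False).indices   # (num_hits, k) nearest neighbors
    src = torch.arange(z.size(0)).repeat_interleave(k)
    return torch.stack([src, nbrs.reshape(-1)])

# Example with a random embedding network and fake hit features:
embed = torch.nn.Sequential(torch.nn.Linear(6, 32), torch.nn.ReLU(), torch.nn.Linear(32, 8))
edge_index = build_knn_graph(torch.randn(1000, 6), embed)
print(edge_index.shape)  # torch.Size([2, 8000])
```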

[10-30-23] Taught a Princeton Research Computing workshop called Graph Neural Networks for Your Research. The Jupyter notebook and slides are available in my prc_gnn_tutorial repo.

[10-28-23] Co-taught an eSTEAM event at Princeton Middle School focused on introducing students to artificial intelligence. This event was organized by the lovely folks at Princeton Research Computing.

[9-28-23] Posted proceedings from CHEP 2023 describing our object condensation-based particle tracking pipeline, which extended traditional edge classification methods with a downstream learned clustering stage. Joint work with Kilian Lieret.
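To give a sense of what that clustering stage does at inference time, here is a minimal sketch that assumes the upstream GNN has already mapped each hit to latent coordinates; the clustering algorithm, parameter values, and names below are illustrative stand-ins rather than the actual configuration:

```python
# Minimal sketch of the inference-time clustering step in an object-condensation
# style tracking pipeline: hits from the same particle are pulled close together
# in a learned latent space, so track candidates can be recovered by clustering
# there. Parameters and names are illustrative, not the real configuration.
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_hits(latent_coords: np.ndarray, eps: float = 0.3, min_hits: int = 3) -> np.ndarray:
    """Assign a track-candidate label to each hit given its latent coordinates.

    latent_coords: (num_hits, latent_dim) array produced by the upstream GNN.
    Returns one integer label per hit; -1 marks noise hits left unclustered.
    """
    return DBSCAN(eps=eps, min_samples=min_hits).fit_predict(latent_coords)

# Fake latent coordinates standing in for real GNN output: three "particles"
# with ten hits each, well separated in a 2D latent space.
rng = np.random.default_rng(0)
fake_latent = np.vstack([
    rng.normal(loc=center, scale=0.05, size=(10, 2))
    for center in ([0.0, 0.0], [1.0, 1.0], [2.0, 0.5])
])
print(cluster_hits(fake_latent))  # labels 0, 1, 2 in blocks of ten
```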

[9-5-23] First day of classes for Physics 103, Introduction to Mechanics for Engineers. I served as a lecturer in two sections in the Fall semester, including a 103M section designed for students with less prior exposure to physics.

[8-20-23] Successfully defended my Ph.D. thesis, Search for a Pseudoscalar Higgs Boson. I am incredibly grateful to my thesis committee (Dan Marlow, Isobel Ojalvo, and Herman Verlinde) and second reader (Jim Olsen) for seeing me through this process.

[6-20-23] Posted a preprint of Principles for Initialization and Architecture Selection in Graph Neural Networks with ReLU Activations to arXiv. It provides concrete principles (with both rigorous proofs and experimental validation) for initializing GNNs and mitigating oversmoothing. The paper has been submitted and is currently under review. Joint work with Boris Hanin.
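To give a rough feel for the kind of question the paper addresses, here is a toy sketch, assuming plain PyTorch, a random graph, and mean aggregation, of how the weight initialization scale changes the way activation magnitudes behave with depth in a ReLU GNN; it is an illustration only, not the paper’s actual prescription:

```python
# Toy illustration (not the paper's prescription) of why initialization scale
# matters in deep ReLU GNNs: without a careful choice of weight variance, the
# typical activation size can collapse or blow up as depth grows.
import torch

def mean_aggregate(x, adj):
    """Average each node over its neighborhood (adj is a dense 0/1 adjacency with self-loops)."""
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
    return (adj @ x) / deg

def depth_profile(num_layers=20, width=128, num_nodes=200, weight_gain=2.0):
    """Push random node features through a deep mean-aggregation + ReLU GNN at
    initialization and record the typical activation size after each layer.
    weight_gain=2.0 mimics He-style fan-in scaling; weight_gain=1.0 makes the
    activations decay with depth."""
    torch.manual_seed(0)
    adj = (torch.rand(num_nodes, num_nodes) < 0.05).float()   # random sparse graph
    adj = ((adj + adj.T) > 0).float()                          # symmetrize
    adj = (adj + torch.eye(num_nodes)).clamp(max=1.0)          # add self-loops
    x = torch.randn(num_nodes, width)
    scales = []
    for _ in range(num_layers):
        w = torch.randn(width, width) * (weight_gain / width) ** 0.5
        x = torch.relu(mean_aggregate(x, adj) @ w)
        scales.append(x.pow(2).mean().sqrt().item())
    return scales

for gain in (1.0, 2.0):
    print(f"gain={gain}:", [round(s, 3) for s in depth_profile(weight_gain=gain)][::5])
```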

[4-17-23] Published a Nature Reviews Physics article called Graph Neural Networks at the LHC. A lot of detailed work (and meticulous use of Inkscape) went into this paper, which is designed to be accessible to non-LHC researchers. Fun fact: the cover art of the May 2023 issue of Nature Reviews Physics was based on our article! Joint work with Catherine Biscarat, Jean-Roch Vlimant, and Peter Battaglia.

If any of this sounds interesting, I’d love to hear from you! Please feel free to contact me at jdezoort@princeton.edu.