I'm a PhD student working on computational methods to understand how the universe evolved from simple initial conditions to the complex cosmic web we see today. Most of my time is spent writing code to analyze massive datasets from telescopes and running simulations on supercomputers to test different theories about dark matter and cosmic structure formation.
Right now I'm building machine learning tools that can optimize telescope performance in real time and developing better ways to compress astronomical data without losing the scientific information we care about. I also work on extracting cosmological parameters from the Lyman-α forest: basically, using quasar light that's been absorbed by intervening gas to measure properties of the early universe.
I grew up in Bangladesh and have always been drawn to problems that require both mathematical thinking and computational tools. When I'm not debugging Python scripts or waiting for simulations to finish, I enjoy teaching and thinking about how to make complex astrophysics more accessible to students from all backgrounds.
My research combines computational physics, data analysis, and machine learning to study cosmic evolution.
Training deep learning models to automatically optimize telescope optics as atmospheric conditions change. It's like giving telescopes the ability to constantly adjust their focus for the best possible images.
Using mathematical tools called wavelets to extract more cosmological information from quasar absorption lines. It's like finding hidden signals in data that traditional methods miss.
Running large-scale computer simulations of how cosmic structure formed over billions of years. These virtual universes help us test our theories against real observations.
Using data from the cosmic microwave background to check whether fundamental constants like the fine-structure constant have changed over the universe's 13.8-billion-year history.
Writing Python and Julia code to process massive astronomical datasets efficiently. Good software is just as important as good physics for modern astronomy.
Developing algorithms that can compress huge datasets by factors of 15+ while keeping all the scientifically important information intact. Essential for handling next-generation survey data.
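To give a flavor of how lossy-but-science-preserving compression can work, here is a minimal sketch in Python. It is not the actual research pipeline, just a generic quantize, delta-code, and deflate recipe on a made-up spectrum-like signal: quantizing to a tolerance well below the noise floor discards only bits the noise already scrambled, and the tolerance gives a hard bound on the reconstruction error.

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)
# Mock spectrum-like data: a smooth signal plus weak noise, stored as float64.
x = np.linspace(0.0, 8.0 * np.pi, 100_000)
field = np.sin(x) + 0.01 * rng.standard_normal(x.size)
raw_size = field.nbytes  # 8 bytes per sample

# Lossy step: quantize to a tolerance well below the noise level (0.01 here).
tol = 1e-3
q = np.round(field / tol).astype(np.int32)

# Decorrelate neighboring samples with delta coding, then compress losslessly.
deltas = np.diff(q, prepend=0).astype(np.int16)
compressed = zlib.compress(deltas.tobytes(), level=9)

# Reconstruct and check the error bound guaranteed by the quantizer (tol / 2).
recon = np.cumsum(deltas.astype(np.int64)) * tol
max_err = np.abs(recon - field).max()
ratio = raw_size / len(compressed)
print(f"ratio ~{ratio:.1f}x, max abs error = {max_err:.1e} (tol = {tol:g})")
```

Real survey pipelines reach much higher ratios by exploiting domain structure (correlations across wavelengths, exposures, and detectors) rather than this one-dimensional toy, but the principle is the same: spend bits only where the science lives.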
I write about my projects without the jargon—what the actual problems are, why they're interesting, and what I learned along the way.
How I trained neural networks to help telescopes automatically correct for atmospheric disturbances and mechanical imperfections in real time.
How I use high-resolution simulations and neural networks to model what happened when the first stars lit up the universe, and why getting the small-scale physics right matters.
Building Julia code to find planetary configurations that repeat perfectly over time, using automatic differentiation and some clever optimization tricks.
The story of baryonic streaming velocity—a subtle effect from the early universe that turns out to have big consequences for how cosmic structure forms.
How I used wavelets (math tools from signal processing) to extract new cosmological information from Lyman-α forest data that traditional methods were missing.
Using the cosmic microwave background to test whether fundamental constants have evolved since the early universe. Spoiler: probably not, but the constraints are interesting.
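The wavelet idea behind the Lyman-α post above can be illustrated with a small toy example. This is a plain orthonormal Haar transform written in NumPy on a made-up "flux" signal, not the estimator used in the actual analysis: the point is that wavelets localize fluctuations in both position and scale, so a localized small-scale feature shows up at its characteristic scale instead of being averaged away in a single global statistic.

```python
import numpy as np

def haar_step(signal):
    """One level of the orthonormal Haar transform: averages and details."""
    pairs = np.asarray(signal, dtype=float).reshape(-1, 2)
    approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)
    detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)
    return approx, detail

def wavelet_power(signal, levels):
    """Mean squared detail coefficient at each dyadic scale."""
    power = []
    approx = np.asarray(signal, dtype=float)
    for _ in range(levels):
        approx, detail = haar_step(approx)
        power.append(float(np.mean(detail**2)))
    return power

rng = np.random.default_rng(1)
n = 4096
x = np.arange(n)
# Toy "flux": a large-scale mode, a localized small-scale burst, and noise.
flux = np.sin(2.0 * np.pi * x / 512.0)
flux[2000:2100] += 0.5 * np.sin(2.0 * np.pi * x[2000:2100] / 8.0)
flux += 0.05 * rng.standard_normal(n)

for level, p in enumerate(wavelet_power(flux, 6), start=1):
    print(f"scale ~{2**level:3d} px: mean squared detail = {p:.4f}")
```

Because the transform is orthonormal, the detail coefficients across levels plus the final coarse approximation carry exactly the signal's total energy, which makes per-scale power a well-defined bookkeeping of where fluctuations live.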