AI for Physics Students — Cluster 10: Career Roadmap
You’ve made it through nine clusters covering every major intersection of physics and machine learning — from fitting curves to discovering physical laws, from solving PDEs to controlling fusion reactors. Now the question is: what do you do with this? This final cluster is your practical guide to turning these skills into a career: what roles exist, where to find them, what to build for your portfolio, which conferences matter, and how to navigate the academia vs industry choice.
This is the final cluster in the series. The first nine built your technical foundation. This one answers the question every physics student with ML skills eventually asks: what do I do next? The answer is more varied — and more interesting — than most students expect.
- The Career Landscape: Roles That Exist Right Now
- Choosing Your Subfield
- Building a Portfolio That Gets Noticed
- Key Labs, Groups & Fellowships
- Conferences & Where the Field Happens
- Academia vs Industry: Honest Comparison
- The 12-Month Learning Roadmap
- Curated Resources: Books, Courses, Tools
Section 1 — The Career Landscape: Roles That Exist Right Now
The intersection of physics and AI has created a set of roles that did not exist ten years ago. They span academia, national laboratories, tech companies, startups, finance, and policy. Understanding the landscape before you start building toward it will save you years of misdirected effort.
**Scientific ML Researcher.** Develops new ML methods specifically for scientific applications. Publishes at NeurIPS, ICML, and in domain journals. Employers include DeepMind, Meta FAIR, Microsoft Research, the Allen Institute, and university groups.
Typical route: Physics PhD → postdoc → research scientist
**ML Engineer (Scientific Domain).** Builds and deploys ML pipelines for scientific data at scale. Optimises inference, maintains data infrastructure, and collaborates with researchers. Employers include CERN, ESA, NCAR, pharmaceutical companies, and climate-tech startups.
Typical route: Physics BSc/MSc + bootcamp or self-study
**Computational Physicist.** A traditional physics research role that now leans heavily on ML for simulation, data analysis, and parameter estimation; most physics research positions now expect at least basic ML literacy. National labs (Argonne, SLAC, Fermilab) are major employers.
Typical route: Standard academic physics path
**Quantitative Researcher.** A physics background translates powerfully to finance: statistical modelling, signal processing, uncertainty quantification, and simulation are all directly transferable. Hedge funds and banks actively recruit physics PhDs. High-paying, fast-paced, and commercially focused.
Typical route: Physics MSc/PhD + finance interest
**Climate & Earth-System AI.** One of the fastest-growing sectors: ML for climate modelling, renewable-energy optimisation, carbon capture, and grid management. Organisations include ECMWF, Climate Change AI, national weather services, and clean-energy startups such as Climeworks.
Typical route: Physics or atmospheric science PhD
**Drug Discovery & Molecular ML.** ML for molecular property prediction, protein structure, and drug–target binding. An enormous and growing sector in which physicists with ML skills are highly sought after. Companies include Isomorphic Labs (a DeepMind spinout), Recursion, Insilico Medicine, and major pharma AI groups.
Typical route: Physics/chemistry PhD, often via materials ML
Section 2 — Choosing Your Subfield: A Decision Framework
The most common mistake early-career physicists make is trying to learn everything. You’ve just read nine clusters covering quantum ML, particle physics, astrophysics, materials science, RL, and NLP. The temptation is to be fluent in all of them. That path leads to being mediocre in all of them.
The most successful people at the physics-AI intersection are deeply expert in one subfield and fluent in the ML tools relevant to it. The subfield gives you the domain knowledge that distinguishes you from a pure ML engineer. The ML gives you the computational capability that distinguishes you from a traditional physicist. The intersection is where you create unique value.
| Subfield | Core ML Tools | Hot Problems 2025 | Key Employers |
|---|---|---|---|
| Particle Physics | GNNs, anomaly detection, flows | Real-time triggers, model-agnostic searches | CERN, Fermilab, SLAC |
| Astrophysics | CNNs, SBI, symbolic regression | Rubin pipeline, GW multi-messenger | ESO, ESA, NASA JPL, LIGO |
| Materials Science | GNNs, force fields, generative | Battery materials, superconductors, catalysis | DeepMind, Toyota, NREL, startups |
| Quantum Physics | NQS, VQE, RL, ML-DFT | QEC decoding, FermiNet, QML | IBM, Google QAI, IQM, national labs |
| Plasma / Fusion | RL, PINNs, surrogate models | Tokamak control, fast ignition | Commonwealth Fusion, TAE, ITER |
| Climate / Earth | Diffusion, transformers, PINNs | AI weather models, downscaling | ECMWF, Google DeepMind, NCAR |
Section 3 — Building a Portfolio That Gets Noticed
Your CV says you know ML. Your portfolio proves it. In a field where everyone claims Python and PyTorch skills, a well-structured GitHub portfolio is often what gets a candidate through to interview. The key principle: each project should demonstrate both physics understanding and ML competence — not one or the other.
Five Portfolio Projects That Stand Out
**Project 1 — Crystal stability predictor.** Train a Crystal Graph Convolutional Neural Network (CGCNN) on the Materials Project database. Predict formation energies, evaluate MAE against DFT benchmarks, and build a simple web API that accepts a crystal structure and returns a stability prediction. Demonstrates: GNN implementation, materials domain knowledge, API deployment.
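To make the crystal-graph project concrete, here is a minimal numpy sketch of the message-passing idea at the heart of CGCNN-style models, on a hypothetical three-atom graph with random weights. A real implementation would use PyTorch/PyG and train the weights by backpropagation; everything here (feature size, graph, weights) is an illustrative assumption.

```python
import numpy as np

# Hypothetical "crystal graph": 3 atoms, 4 features each, all pairs bonded.
rng = np.random.default_rng(0)
node_feats = rng.normal(size=(3, 4))        # one row of features per atom
adjacency = np.ones((3, 3)) - np.eye(3)     # 1 where two atoms share an edge

# Random stand-ins for learned weight matrices.
W_self = rng.normal(size=(4, 4))
W_neigh = rng.normal(size=(4, 4))

def message_pass(h, A, W_s, W_n):
    """One graph-convolution update: mix each atom's own features
    with the sum of its neighbours' features, then squash."""
    neighbour_sum = A @ h                   # aggregate features over edges
    return np.tanh(h @ W_s + neighbour_sum @ W_n)

h1 = message_pass(node_feats, adjacency, W_self, W_neigh)

# Mean-pool atoms into a single crystal-level vector; a final linear
# layer on this vector would predict the formation energy.
crystal_vec = h1.mean(axis=0)
```

Stacking several such updates before pooling is what lets the model see beyond nearest neighbours.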
**Project 2 — Gravitational-wave signal vs glitch classifier.** Download real LIGO open data via gwpy, preprocess with whitening and bandpass filtering, train a 1D CNN to classify events as signals or noise glitches, and compare against matched filtering. Include a proper ROC-curve analysis and discuss false-alarm rates. Demonstrates: time-series ML, signal processing, binary classification, physics-appropriate evaluation metrics.
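In practice the preprocessing is done with gwpy's built-in `TimeSeries.whiten()` and `TimeSeries.bandpass()` methods; the numpy sketch below only illustrates the underlying idea on synthetic data. The coloured-noise model, burst shape, and cutoff frequencies are illustrative assumptions, not LIGO settings.

```python
import numpy as np

fs = 4096                                    # sample rate in Hz
t = np.arange(0, 4, 1 / fs)                  # 4 s of data
rng = np.random.default_rng(1)

# Synthetic "strain": red noise (power concentrated at low frequency)
# plus a weak swept-frequency burst centred at t = 2 s.
noise = np.cumsum(rng.normal(size=t.size)) * 1e-2
burst = 0.1 * np.sin(2 * np.pi * (50 + 30 * t) * t) * np.exp(-((t - 2.0) ** 2) * 20)
strain = noise + burst

def whiten(x):
    """Flatten the spectrum by dividing each Fourier component by an
    estimate of its own amplitude (real pipelines use a Welch PSD)."""
    X = np.fft.rfft(x)
    asd = np.abs(X) + 1e-12
    return np.fft.irfft(X / asd, n=x.size)

def bandpass(x, fs, lo, hi):
    """Crude brick-wall bandpass: zero Fourier bins outside [lo, hi] Hz."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    X[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(X, n=x.size)

clean = bandpass(whiten(strain), fs, 35, 350)
```

The cleaned array is what would be windowed into fixed-length segments and fed to the 1D CNN.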
**Project 3 — Physics-informed PDE solver.** Implement a Physics-Informed Neural Network from scratch in PyTorch to solve the 2D heat equation or the Burgers equation. Verify against the analytic solution, demonstrate inverse-problem capability (recover the diffusivity from noisy measurements), and visualise the physics residual. Demonstrates: PINNs, autograd, physics validation, inverse problems.
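For the 2D heat equation $u_t = \alpha\,(u_{xx} + u_{yy})$, the PINN training objective is a weighted sum of the PDE residual at collocation points and data losses on the boundary and initial conditions; the weights $\lambda_b, \lambda_0$ are tuning choices, not fixed by the method:

```latex
\mathcal{L}(\theta) =
\underbrace{\frac{1}{N_r}\sum_{i=1}^{N_r}
\bigl|\partial_t u_\theta(x_i,t_i) - \alpha\,\nabla^2 u_\theta(x_i,t_i)\bigr|^2}_{\text{PDE residual}}
+ \frac{\lambda_b}{N_b}\sum_{j=1}^{N_b}\bigl|u_\theta(x_j,t_j) - g(x_j,t_j)\bigr|^2
+ \frac{\lambda_0}{N_0}\sum_{k=1}^{N_0}\bigl|u_\theta(x_k,0) - u_0(x_k)\bigr|^2
```

For the inverse-problem part of the project, $\alpha$ is promoted to a trainable parameter and a data-misfit term over the noisy interior measurements is added to the same loss.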
**Project 4 — Symbolic regression on galaxy rotation curves.** Download rotation-curve data from the SPARC database, run PySR to discover the relationship between velocity and radius, and compare the discovered equation to MOND and NFW predictions. Write a clear narrative explaining what the equation means physically. Demonstrates: symbolic regression, astrophysics domain knowledge, scientific writing.
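Before reaching for PySR, a useful sanity-check baseline is a plain power-law fit $v = A\,r^{\beta}$ in log-log space. The data below are a synthetic stand-in for a flat rotation curve, not actual SPARC measurements; the numbers are assumptions for illustration only.

```python
import numpy as np

# Synthetic stand-in for a rotation curve (not real SPARC data):
# velocity rises at small radius, then flattens near 150 km/s.
rng = np.random.default_rng(2)
r = np.linspace(1.0, 20.0, 40)                            # radius in kpc
v = 150.0 * (1.0 - np.exp(-r / 2.0)) + rng.normal(0, 3, r.size)

# Fit v = A * r**beta, i.e. log v = log A + beta * log r.
beta, logA = np.polyfit(np.log(r), np.log(v), 1)
A = np.exp(logA)

# A small beta indicates a flat outer curve -- exactly the regime
# where comparing a discovered equation to MOND and NFW predictions
# becomes physically interesting.
```

Whatever PySR discovers should beat this two-parameter baseline on held-out radii, or the "discovered" equation is not earning its complexity.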
**Project 5 — RL control of a physical system.** Build a custom Gymnasium environment modelling a physical system (a double pendulum, coupled oscillators, or a simplified plasma-confinement toy model), train a PPO agent with stable-baselines3, and analyse the learned policy in terms of physical intuition: what strategy did the agent discover, and does it match the classical optimal-control solution? Demonstrates: RL, custom environments, physics interpretation.
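A sketch of what such an environment might look like: a torque-limited pendulum swing-up following the Gymnasium `reset`/`step` return convention. This is a hypothetical toy with made-up constants; a real project would subclass `gymnasium.Env` and declare `action_space`/`observation_space` so stable-baselines3 can consume it directly.

```python
import numpy as np

class PendulumSwingupEnv:
    """Torque-controlled pendulum, theta measured from upright.
    Mimics the Gymnasium reset/step return convention."""

    def __init__(self, dt=0.02, g=9.81, length=1.0, max_torque=2.0, horizon=200):
        self.dt, self.g, self.length, self.max_torque = dt, g, length, max_torque
        self.horizon = horizon
        self.state, self.steps = None, 0

    def reset(self, seed=None):
        rng = np.random.default_rng(seed)
        # Start hanging down (theta ~ pi) with a small random perturbation.
        self.state = np.array([np.pi + 0.1 * rng.standard_normal(), 0.0])
        self.steps = 0
        return self.state.copy(), {}

    def step(self, action):
        theta, omega = self.state
        torque = float(np.clip(action, -self.max_torque, self.max_torque))
        # Euler step of theta'' = (g/l) sin(theta) + torque / (m l^2), m = 1.
        omega += (self.g / self.length * np.sin(theta)
                  + torque / self.length**2) * self.dt
        theta += omega * self.dt
        self.state = np.array([theta, omega])
        self.steps += 1
        # Reward upright posture, penalise control effort.
        reward = float(np.cos(theta)) - 0.01 * torque**2
        terminated = False                       # no absorbing failure state
        truncated = self.steps >= self.horizon   # episode time limit
        return self.state.copy(), reward, terminated, truncated, {}

# Minimal interaction loop, as an RL library would drive it.
env = PendulumSwingupEnv()
obs, info = env.reset(seed=0)
obs2, reward, terminated, truncated, info = env.step(1.0)
```

Because `max_torque` is deliberately too small to lift the pendulum directly, a successful agent must discover energy-pumping swings — exactly the kind of physically interpretable strategy the project asks you to analyse.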
Section 4 — Key Labs, Groups & Fellowships
Knowing where the best work is happening — and how to get there — is as important as knowing the technical material. Here is the honest landscape of groups at the frontier.
Academic groups and institutes:
- Kyle Cranmer (NYU/UW) — SBI, ML for HEP, symbolic regression
- Phiala Shanahan (MIT) — ML for lattice QCD, nuclear physics
- Bryan Kolb / Ilya Mandel groups — GW astronomy ML
- IAIFI (MIT/Harvard/Tufts/Northeastern) — dedicated institute for AI + physics
- Argonne, SLAC, Fermilab ML groups — national lab ML initiatives
- ExaTrkX collaboration — GNN particle tracking at LHC scale
Industry research labs:
- Google DeepMind — AlphaFold, GNoME, tokamak RL, DM21
- Microsoft Research — scientific ML, quantum simulation
- Meta FAIR — open science ML, ESMFold protein models
- NVIDIA Research — physics simulations on GPU, FourCastNet
- Allen Institute for AI — scientific NLP, SPECTER, S2ORC
- Isomorphic Labs — drug discovery, AlphaFold spinout
Fellowships and programmes:
- CERN ML Fellowship — 1–2 year positions in HEP ML at CERN
- DOE SCGSR Program — PhD students at national labs
- IAIFI Fellowship — postdoc at the AI/physics boundary
- DeepMind Research Scientist — competitive, requires strong ML pubs
- Google Summer of Code — via ML4Sci, HEP-ML projects
- Fast.ai Scholar program — for self-taught practitioners
Communities:
- ML4Sci Slack — machine learning for science community
- Physics Meets ML community — physicsmeetsml.org
- HEP-ML Forum — hep-ml.github.io
- Climate Change AI — climatechange.ai community
- Materials Project Forum — materialsproject.org/forum
- arXiv hep-ph/cs.LG crosslistings — daily paper alerts
Section 5 — Conferences: Where the Field Actually Happens
Conference attendance — even virtually — is where you encounter ideas before they become mainstream, meet collaborators, and understand the actual frontier versus what textbooks and tutorials describe. Here is where to focus your energy.
**Major ML conferences.** The big ML conferences host dedicated workshops on machine learning for the physical sciences. The NeurIPS “ML4PS” and ICML “AI for Science” workshops are where many of the most important physics-ML papers first appear in public. Attending these workshops — even as a viewer of recorded talks — gives you an accurate picture of the state of the field. Proceedings are free online.
**The ML4PS and AI-for-Science workshops.** These dedicated physics-ML venues sit alongside the main conferences: submissions open after the main-conference paper deadline, and acceptance rates are friendlier to domain-specific work. This is where a physics PhD student should aim for a first ML publication — the physics context is valued, not just the ML novelty.
**Bridging workshop series.** Annual workshop series that specifically bridge the physical sciences and ML are excellent for finding collaborators who share both backgrounds, and for spotting cross-subfield connections you won’t see at domain-specific conferences.
**Domain physics conferences.** All major physics conferences now have dedicated ML sessions. The advantage: you present to physicists who understand your problem, not to ML researchers who may not. The disadvantage: the bar for ML methodology is lower, so publishing here doesn’t demonstrate ML credibility to pure ML employers. Know your audience.
Section 6 — Academia vs Industry: An Honest Comparison
This is the question every physics PhD student faces, and it has become significantly more complex as industry research labs have grown to rival or surpass academic groups in resources and output. Here is the honest comparison, without the usual cheerleading for either side.
Academia — advantages:
- Full intellectual freedom to choose problems
- Teaching, mentoring, and community building
- Long time horizons — work on 10-year problems
- Collaboration network across institutions
- Public contribution: all work published and open
Academia — drawbacks:
- Compute access often limited vs industry
- Postdoc pathway is long and uncertain
- Salaries significantly lower than industry
- Geographic constraints (university locations)
- Grant-writing overhead is substantial
Industry — advantages:
- Access to massive compute (TPU/GPU clusters)
- Salaries 2–5x higher than postdoc equivalent
- Large teams and diverse expertise
- Problems with real-world scale and impact
- No grant-writing — focus entirely on research
Industry — drawbacks:
- Publication subject to IP/legal review (slower)
- Research directions set by business priorities
- Less freedom to pursue fundamental questions
- Layoffs are a real risk (especially in downturns)
- No teaching or student mentorship
| Role | Typical Salary Range (USD, 2025) | Notes |
|---|---|---|
| Physics Postdoc (US) | $55k–$75k | NIH/NSF scale; 2–3 year contract |
| Assistant Professor (US R1) | $90k–$130k | + startup package; tenure-track |
| ML Engineer (scientific domain) | $130k–$200k | + equity; higher in Bay Area/NYC |
| Research Scientist (DeepMind/OpenAI) | $200k–$400k+ | total comp incl. equity; highly competitive |
| Quant Researcher (hedge fund) | $200k–$500k+ | total comp; bonus-heavy structure |
| CERN ML Fellow | ~€50k + benefits (Geneva) | Highly prestigious; PhD required |
Section 7 — The 12-Month Learning Roadmap
The most common question after reading a guide like this is: where do I start? The answer depends on your current level, but here is a concrete 12-month plan that works for a physics undergraduate or early graduate student with basic Python knowledge who wants to reach professional competency.
**Months 1–2: Foundations.** Complete fast.ai Part 1. Implement linear regression, logistic regression, and a simple feedforward network from scratch, without using high-level APIs. Finish Cluster 1 of this guide (curve fitting) and apply it to your own experimental data. Goal: publication-quality plots of ML-fitted data with proper uncertainty quantification.
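The "from scratch" part of the first two months can be as simple as closed-form least squares with parameter uncertainties. A numpy sketch on synthetic data (the true slope 2 and intercept 1 are assumed for illustration):

```python
import numpy as np

# Synthetic measurements: y = 2 x + 1 with Gaussian noise.
rng = np.random.default_rng(42)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.5, x.size)

# Design matrix for the model y = w[0] * x + w[1].
X = np.column_stack([x, np.ones_like(x)])

# Ordinary least squares: w = (X^T X)^{-1} X^T y.
XtX_inv = np.linalg.inv(X.T @ X)
w = XtX_inv @ X.T @ y

# Parameter covariance Cov(w) = sigma^2 (X^T X)^{-1}, with the noise
# variance sigma^2 estimated from the residuals (N - 2 dof).
resid = y - X @ w
sigma2 = resid @ resid / (x.size - 2)
w_err = np.sqrt(np.diag(sigma2 * XtX_inv))
```

The diagonal of that covariance matrix is exactly where the error bars in the "publication-quality plots with proper uncertainty quantification" goal should come from.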
**Months 3–4: Deep learning.** Work through Andrej Karpathy’s nanoGPT tutorial. Implement a CNN for galaxy morphology classification (Cluster 4) and a 1D CNN for GW signal detection. Read 2–3 papers from Clusters 3/4 and reproduce one result. Goal: one complete end-to-end ML project using real physics data, posted to GitHub.
**Months 5–6: Domain depth.** Go deep in one cluster relevant to your research area. If condensed matter: implement CGCNN from Cluster 5. If particle physics: build a jet tagger from Cluster 3. If astrophysics: implement SBI from Cluster 4. The goal is to understand the domain literature, not just run code. Read 10 papers, watch a relevant workshop recording, and start a domain-specific portfolio project.
**Months 7–8: Advanced methods.** Choose one advanced topic from Clusters 2, 5, or 7: implement a complete PINN for a PDE relevant to your research, train a normalizing flow on physics data, or build a VAE for your domain. Post the project with a detailed README and a short blog post explaining the physics motivation. This should be the centrepiece of your portfolio.
**Months 9–10: Research contribution.** Start a research project with a novel contribution. It does not have to be a full paper; it could be a benchmark on a new dataset, an application of method X to problem Y that has not been tried before, an ablation study, or a software tool. Post a preprint to arXiv. Even an unrefereed preprint demonstrates research initiative and gives you something to discuss in interviews.
**Months 11–12: Community and applications.** Join the ML4Sci Slack and Physics Meets ML communities. Submit a short paper or extended abstract to the NeurIPS ML4PS workshop. Apply to the CERN ML Fellowship, DOE SCGSR, or relevant industry internships. Update your GitHub profile and LinkedIn with your projects, and start cold-emailing groups you want to work with: a brief, specific email referencing their papers gets far more responses than a generic inquiry.
Section 8 — Curated Resources: The Essential List
These are the resources that practitioners at the physics-ML frontier actually use and recommend. Not an exhaustive list — a curated one.
Books:
- Mehta et al. — A high-bias, low-variance introduction to Machine Learning for physicists (free, arXiv:1803.08823)
- Goodfellow, Bengio, Courville — Deep Learning (free at deeplearningbook.org)
- Rasmussen & Williams — Gaussian Processes for ML (free online)
- Sutton & Barto — Reinforcement Learning (free online)
- Murphy — Probabilistic ML Vols 1&2 (free PDFs)
Courses and lectures:
- fast.ai — Practical DL (free) — best hands-on ML course
- Karpathy’s nanoGPT — Build a transformer from scratch
- Deep Learning Specialisation — Coursera/DeepLearning.AI
- Physics Meets ML YouTube — recorded seminar series
- MLSS (ML Summer School) — annual school, lectures online
Review papers:
- Mehta et al. (2019) — Physics Reports: ML for physicists
- Carleo et al. (2019) — Rev. Mod. Phys: ML & physical sciences
- Brehmer & Cranmer (2020) — ML in particle physics
- Deringer et al. (2019) — ML interatomic potentials review
- Cranmer, Brehmer & Louppe (2020) — Simulation-based inference
Software tools:
- PyTorch + PyG — deep learning and graph NNs
- PySR — symbolic regression for science
- sbi — simulation-based inference
- DeepXDE — physics-informed neural networks
- NetKet — neural quantum states (JAX)
- stable-baselines3 — RL algorithms
- PennyLane — quantum ML
Closing: What It Means to Be a Physicist in the Age of AI
We opened this series by saying that physics and AI share the same obsession: finding the deepest patterns in the universe. That remains true. But something has shifted in the last decade, and it is worth naming clearly.
For most of the history of physics, the bottleneck was theory and experiment. You needed a good physical model and the apparatus to test it. Computation was a supporting tool. Today, for an increasing fraction of frontier physics problems, computation is the bottleneck — not the model, not the experiment, but the ability to analyse, simulate, and learn from data at the scale that modern physics generates.
This means that the physicist who understands machine learning is not just more employable. They can do physics that was previously impossible. They can analyse datasets that would have taken decades. They can simulate systems that were computationally forbidden. They can discover equations from data that no human would have intuited. They can control physical systems with a precision that hand-tuned classical controllers often cannot match.
That is not a small thing. That is a genuine expansion of what physics can be.
You now have the foundations. What happens next is up to you — which problem you choose, which subfield you go deep in, and what you contribute that nobody else has yet contributed. Physics has always been driven by people who care deeply about understanding the universe and are willing to do hard technical work in service of that understanding. ML is a new and powerful tool in that tradition. Use it well.
External References & Further Reading
- Mehta et al. (2019) — A high-bias, low-variance introduction to Machine Learning for physicists. Physics Reports. arXiv:1803.08823 — The foundational review. Free and comprehensive.
- IAIFI — Institute for Artificial Intelligence and Fundamental Interactions. iaifi.org — NSF-funded institute at the MIT/Harvard/Tufts/Northeastern nexus. Fellowship applications open annually.
- ML4Sci — ml4sci.org — Community, GSoC projects, and workshop resources for ML in physical sciences.
- Physics Meets ML — physicsmeetsml.org — Online seminar series with recorded talks from the leading groups.
- CERN ML Fellowship — careers.cern/fellows — Fellowship applications typically open October–January each year.
- Carleo et al. (2019) — Machine learning and the physical sciences. Reviews of Modern Physics. arXiv:1903.10563
- fast.ai — fast.ai — Free, practical deep learning course. Genuinely the best entry point for physicists new to ML.
Key takeaways:
- Six distinct career paths exist: scientific ML researcher, ML engineer, computational physicist, quantitative researcher, climate AI, and drug discovery. Each requires a different balance of physics depth, ML breadth, and software engineering.
- Be T-shaped, not shallow. Deep in one physics subfield, broad across ML methods. The intersection is where you create unique value that neither a pure ML engineer nor a traditional physicist can match.
- Your portfolio is your proof. Five projects demonstrating both physics understanding and ML competence will do more for your career than any certification. Each should solve a real physics problem, report real metrics, and be documented well enough that a recruiter or collaborator can understand it in five minutes.
- The field is at the NeurIPS and ICML ML4PS workshops. Read those proceedings. Attend virtually. Submit an extended abstract. This is the fastest path to visibility in the community.
- Academia and industry both have serious advantages. Industry research labs now have access to resources that universities cannot match. But academic freedom and long time horizons are genuine, not imaginary. Neither path is universally better.
- The 12-month roadmap works. Foundation (months 1–2) → deep learning (3–4) → domain depth (5–6) → advanced methods (7–8) → research contribution (9–10) → community and applications (11–12). One year of focused work can take a physics student from Python basics to competitive for research positions at the physics-AI frontier.
