Tamara Norman is a software engineer based in London with nine years' experience, currently working at DeepMind. She specialises in ML infrastructure and core numerical libraries, contributing to high-profile open-source projects like DeepMind's Sonnet (testing, build configuration and CI) and JAX (implementing atrous and transposed convolutions and fixing padding/dilation edge cases). Her work uniquely blends MLOps rigor—dependency and Python-version handling—with low-level numerical implementation for GPU/TPU acceleration, helping move research code toward production-ready, developer-friendly tooling. A Cambridge computer science graduate with top-level mathematics credentials and early C#/mapping experience, she pairs strong theoretical grounding with practical engineering discipline.
10 years of coding experience
Bachelor of Arts (B.A.), Computer Science, II.i at University of Cambridge
A-levels, A*A*A*A* in Mathematics, Further Mathematics, Computer Science, Physics at King Edward VI Camp Hill School for Girls
Contributions: 2 reviews, 53 commits, 3 PRs in 1 year 10 months
Contributions summary: Tamara primarily focused on improving the testing and build processes for the Sonnet library. She added and refined testing scripts, including those covering Python and TensorFlow dependency combinations. Significant contributions included modifying the build configuration, handling Python versioning and dependencies, and integrating continuous-integration checks. This work helped ensure the library's reliability and ease of use for developers.
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
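The composable transformations named above (differentiate, vectorize, JIT-compile) can be illustrated with a minimal sketch; the loss function and values here are invented for illustration and are not taken from her contributions:

```python
# Minimal sketch of JAX's composable transformations.
import jax
import jax.numpy as jnp

def loss(w, x):
    # Toy scalar loss, purely illustrative.
    return jnp.sum((w * x) ** 2)

grad_loss = jax.grad(loss)                          # differentiate w.r.t. w
batched = jax.vmap(grad_loss, in_axes=(None, 0))    # vectorize over a batch of x
fast = jax.jit(batched)                             # JIT-compile for CPU/GPU/TPU

w = jnp.array(2.0)
xs = jnp.arange(3.0)
print(fast(w, xs))  # gradients 4 * x**2 for each x in the batch
```

The three transformations compose freely, which is the core design point the tagline refers to: `jit(vmap(grad(f)))` is just another Python function.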
Role in this project:
ML Engineer
Contributions: 2 commits, 3 PRs, 8 comments in 8 months
Contributions summary: Tamara contributed to the JAX library by implementing and improving convolution operations, including support for atrous and transposed convolutions. She addressed issues related to padding and dilation and added tests to verify functionality across different configurations. She also updated documentation to reflect current RNG behavior. This work shows a focus on core numerical operations within the JAX framework.
pytorch, python, jit, automatic-differentiation, gpu