Hamid Shojanazeri

San Francisco, California, United States

Summary

🤩 Rockstar · 🎓 Top School
Hamid Shojanazeri is an AI and PyTorch partner engineering manager based in San Francisco with nine years of experience building and productionizing large transformer models. He combines a PhD in computer vision with hands-on MLOps work at Meta, focusing on distributed training (FSDP), model parallelism, and meeting latency/throughput SLAs for inference and serving. An active open-source contributor, he has improved high-profile projects like pytorch/tutorials and pytorch/serve—adding FSDP T5 examples, sharded checkpoint loading, and Hugging Face/FasterTransformer integrations. He also contributed backend optimizations to the Llama cookbook (time/tensor parallel configs and pad-token handling), demonstrating a practical knack for turning cutting-edge research into deployable tooling. Hamid’s niche is translating complex parallelism and model optimization tactics into accessible patterns that partners and engineers can adopt in production.
9 years of coding experience
13 years of employment as a software developer
Doctor of Philosophy (PhD), Computer Vision, Federation University Australia

Stack Overflow Stats

31 reputation · 16k reached · 0 answers · 1 question

Github Skills (28)

torchscript (10)
continuous-deployment (10)
pytorch (10)
modelchecking (10)
distributed-training (10)
checkpoint (10)
python (10)
machine-learning (10)
sdp (10)
hugging-face-transformers (10)
ml-deployment (10)
checkpointing (10)
mlops (10)
model-optimization (10)
classification (8)

Programming languages (7)

Java, C++, Go, HTML, Jupyter Notebook, Python, CUDA

Github contributions (5)

meta-llama/llama-cookbook

Jul 2023 - Mar 2025

Welcome to the Llama Cookbook! This is your go-to guide for building with Llama: getting started with inference, fine-tuning, and RAG. We also show how to solve end-to-end problems using the Llama model family and how to use them on various provider services.
Role in this project: Back-end Developer
Contributions: 247 reviews, 110 PRs, 186 pushes in 1 year 7 months
Contributions summary: Hamid's contributions added key backend configurations and features to the Llama cookbook, including time and tensor parallel (TP) configurations and pad-token adjustments in the inference scripts. He also added FSDP (Fully Sharded Data Parallel) checkpoint loading, including sharded model loading, reflecting a focus on optimizing model deployment and management. His changes touched the code that loads models and prepares them for optimized execution.
Tags: ai, finetuning, langchain, llama, llama2
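The sharded-checkpoint work described above follows a general pattern: each rank persists only its own shard of the parameters, and loading reassembles the shards into the full state. A minimal pure-Python sketch of that idea, using plain lists in place of tensors and hypothetical helper names (this is not the actual FSDP or torch.distributed.checkpoint API):

```python
# Conceptual sketch of sharded checkpointing: split a flat parameter
# vector across ranks on save, reassemble it on load. Real FSDP uses
# torch.distributed.checkpoint; all names here are illustrative only.

def shard_params(params, world_size):
    """Split a flat list of parameters into one contiguous shard per rank."""
    shard_len = (len(params) + world_size - 1) // world_size  # ceiling division
    return [params[r * shard_len:(r + 1) * shard_len] for r in range(world_size)]

def save_sharded(params, world_size):
    """Each 'rank' saves only its own shard; collected here in a dict."""
    return {rank: shard for rank, shard in enumerate(shard_params(params, world_size))}

def load_sharded(checkpoint):
    """Reassemble the full parameter vector from the per-rank shards."""
    full = []
    for rank in sorted(checkpoint):
        full.extend(checkpoint[rank])
    return full

if __name__ == "__main__":
    weights = [0.1, 0.2, 0.3, 0.4, 0.5]
    ckpt = save_sharded(weights, world_size=2)
    assert load_sharded(ckpt) == weights
```

The payoff of this pattern in real systems is that no single host ever has to materialize the full model state in memory during save or load.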
pytorch/serve

May 2020 - Jan 2023

Serve, optimize and scale PyTorch models in production
Role in this project: ML Engineer & MLOps Engineer
Contributions: 372 reviews, 237 commits, 50 PRs in 2 years 8 months
Contributions summary: Hamid focused on integrating and generalizing Hugging Face transformer models within TorchServe. He developed custom handlers for sequence classification, question answering, and token classification, demonstrating knowledge of model serialization and deployment. His work added support for TorchScript, batch inference, and Captum explanations, along with integration with the FasterTransformer library. This involved modifying existing code, creating example files, and enhancing documentation, streamlining the process of serving large transformer models.
Tags: cpu, pytorch, pytorch-models, serving, in-production
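TorchServe custom handlers like those described above follow a fixed lifecycle: preprocess the raw request, run inference, then postprocess into a response. Real handlers subclass `BaseHandler` from the `ts` package; the simplified, dependency-free stand-in below (all names hypothetical) sketches that three-stage flow for a sequence-classification case:

```python
# Simplified sketch of the TorchServe handler lifecycle for sequence
# classification. A real handler subclasses ts.torch_handler.base_handler
# .BaseHandler; this standalone version only mirrors the stage ordering.

class SequenceClassificationHandler:
    def __init__(self, model, labels):
        self.model = model    # any callable: text -> list of class scores
        self.labels = labels  # class index -> human-readable label

    def preprocess(self, requests):
        """Extract the raw text payload from each request dict."""
        return [req["data"] for req in requests]

    def inference(self, texts):
        """Run the model on each preprocessed input."""
        return [self.model(t) for t in texts]

    def postprocess(self, outputs):
        """Map the highest-scoring index to its label, one result per request."""
        return [{"label": self.labels[scores.index(max(scores))]}
                for scores in outputs]

    def handle(self, requests):
        return self.postprocess(self.inference(self.preprocess(requests)))

if __name__ == "__main__":
    # Toy "model": the positive score rises with exclamation marks.
    toy_model = lambda text: [text.count("!"), 1]
    handler = SequenceClassificationHandler(toy_model, ["positive", "negative"])
    print(handler.handle([{"data": "great!!"}, {"data": "meh"}]))
    # → [{'label': 'positive'}, {'label': 'negative'}]
```

Keeping the three stages separate is what lets TorchServe batch requests between preprocess and inference without the handler author changing any code.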