Jonas Geiping

Research Group Leader at ELLIS Institute Tübingen

Germany

Summary

Jonas Geiping is a research group leader at the ELLIS Institute Tübingen and the Max Planck Institute for Intelligent Systems with nine years of experience bridging mathematical optimization and deep learning. He holds a PhD in computer science and a Master's in Applied Mathematics and was a postdoc at the University of Maryland. His work focuses on how optimization principles inform the design of safe, secure, and private ML systems. He also contributes hands-on engineering to open-source projects, notably a "cramming" effort to train BERT-style models on limited compute where he implemented performance-focused changes to threading, learning rate schedules, and data pipelines. By combining theoretical rigor with low-level performance tuning and deployment experience, he builds research that is both principled and practical.
9 years of coding experience
7 years of employment as a software developer
Master's degree, Applied Mathematics, University of Münster
Doctor of Science, Computer Science, Universität Siegen

GitHub Skills (13)

data-preprocessing (10)
pytorch (10)
machine-learning (10)
language-models (10)
nlp (10)
language-model (10)
python (10)
transformers (9)
huggingface (8)
huggingface-hub (8)
ci-cd (4)
tensorflow2 (3)
tensorflow (3)

Programming languages (4)

TypeScript, HTML, Jupyter Notebook, Python

GitHub contributions (5)

JonasGeiping/cramming

Dec 2022 - Mar 2023

Cramming the training of a (BERT-type) language model into limited compute.
Role in this project: ML Engineer
Contributions: 2 releases, 31 commits, 13 PRs in 2 months
Contributions summary: Jonas contributed to the development and improvement of a BERT-type language model. His work included refactoring code to reduce thread usage for performance, implementing new learning-rate cooldown schedulers, and modifying the data preprocessing pipeline to optimize dataset handling. He was also involved in model deployment, including the option to push models to the Hugging Face Hub.
nlp, language-model, transformers, bert, english-language
Contributions: 1 PR, 27 pushes, 2 branches in 4 years 5 months
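The "learning-rate cooldown scheduler" mentioned in the contributions summary refers to a schedule that holds the learning rate steady and then decays it to zero over the final stretch of training. The sketch below is hypothetical — it is not taken from the cramming repository, and the function name and parameters are illustrative — but it shows the general shape of a linear cooldown.

```python
def cooldown_lr(step, total_steps, base_lr=1e-3, cooldown_frac=0.2):
    """Linear cooldown: hold base_lr constant, then decay linearly to
    zero over the final `cooldown_frac` of training steps."""
    cooldown_start = int(total_steps * (1 - cooldown_frac))
    if step < cooldown_start:
        return base_lr
    remaining = total_steps - step
    span = total_steps - cooldown_start
    return base_lr * max(remaining / span, 0.0)

# LR stays at base_lr until the last 20% of steps, then decays linearly:
print(cooldown_lr(0, 100))    # 0.001
print(cooldown_lr(90, 100))   # 0.0005 (halfway through the cooldown)
print(cooldown_lr(100, 100))  # 0.0
```

In a PyTorch training loop, a function like this would typically be wrapped in `torch.optim.lr_scheduler.LambdaLR` so the optimizer applies it automatically each step.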