Artem Chumachenko is an AI Engineer based in Amsterdam with 9 years of experience building and deploying large language models and dialog systems. He drove model training and fine-tuning work at Yandex (YaLM up to 14B) and improved chatbot personalization at Neiro.ai, and now applies that expertise at Together AI. His open-source contributions to the well-known BigScience "petals" project added distributed generation, beam search, sampling, and adapter/prefix-tuning support, enabling BitTorrent-style LLM inference and faster fine-tuning. With a research background from MIPT in model compression for transformers, he blends efficiency-focused research with production-grade generative systems.
10 years of coding experience
4 years of employment as a software developer
Bachelor's degree, Applied Physics and Mathematics, Moscow Institute of Physics and Technology (State University) (MIPT)
🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
Role in this project:
ML Engineer
Contributions: 30 reviews, 97 commits, 40 PRs in 6 months
Contributions summary: Artem primarily contributed to the development of a distributed language model platform, specifically focusing on integrating generation capabilities within the `petals` framework. His contributions include implementing a `RemoteGenerationMixin` class for auto-regressive text generation, adding support for various decoding algorithms such as greedy search and sampling, and integrating prefix-tuned inference. He also introduced the `beam_search` algorithm and designed functionality for utilizing adapters. This work directly impacts the model's capabilities for generation tasks.
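To illustrate the simplest of the decoding algorithms mentioned above, here is a minimal sketch of auto-regressive greedy search. The `toy_model` and its 5-token vocabulary are hypothetical stand-ins for demonstration only; this is not petals' actual `RemoteGenerationMixin` API.

```python
def toy_model(tokens):
    """Hypothetical toy 'language model': scores each vocabulary id,
    favouring the token that cyclically follows the last one."""
    vocab_size = 5
    last = tokens[-1]
    return [1.0 if i == (last + 1) % vocab_size else 0.0
            for i in range(vocab_size)]

def greedy_generate(model, prompt, max_new_tokens, eos_id=None):
    """Greedy search: at each step, append the highest-scoring
    next token, stopping early on an optional end-of-sequence id."""
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        scores = model(tokens)
        next_id = max(range(len(scores)), key=scores.__getitem__)
        tokens.append(next_id)
        if next_id == eos_id:
            break
    return tokens

print(greedy_generate(toy_model, [0], 4))  # → [0, 1, 2, 3, 4]
```

Sampling-based decoding differs only in the selection step: instead of taking the arg-max, the next token is drawn from the (softmax-normalized) score distribution, trading determinism for diversity.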
Contributions: 68 commits, 47 PRs, 47 pushes in 3 months
deep-learning, catalyst, machine-learning