Harish Rao is a Manager at PwC Acceleration Centers in Bengaluru with a B.Tech in Information Technology and over a decade of hands‑on QA experience across insurance, investment banking and hospitality. He specializes in ETL and data warehouse/reporting QA, working with Informatica, SSIS, SAP BODS and BI tools like Tableau, Power BI, MicroStrategy and Cognos, and automates processes with UiPath. Harish blends broad testing expertise (UAT, regression, conversion, financial transaction and GUI testing) with practical data connectivity skills, enabling reliable reporting and data conversions on platforms such as Guidewire and EDW. He’s also an active open-source contributor to Apache Airflow, improving Snowflake SQLAlchemy engine support and Databricks SQL integrations—an uncommon mix of QA leadership and cloud data‑infrastructure experience. Colleagues describe him as a pragmatic manager who turns complex reporting requirements into auditable, production-ready QA processes.
7 years of coding experience
10 years of employment as a software developer
Bharat Matriculation School, Krishnagiri
Diploma of Education, Electrical, Electronic and Communications Engineering Technology/Technician at IRT Polytechnic College - India
Bachelor of Technology - BTech, Information Technology at Sri Venkateswara College of Engineering
Apache Airflow - A platform to programmatically author, schedule, and monitor workflows
Role in this project: Data Engineer
Contributions: 31 reviews, 6 PRs, 35 comments in 7 months
Contributions summary: Harish contributed to the Airflow project primarily by modifying and testing database connection hooks, specifically for Snowflake. His work improved `SnowflakeHook.get_sqlalchemy_engine`, adding support for session parameters and private-key authentication, and extended to the accompanying tests, reflecting a focus on data connectivity and workflow execution against SQL databases. He also made minor changes to example DAGs and to integrations with other providers, showing an understanding of how to integrate and test Databricks SQL connections.
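To make the session-parameter work concrete, here is a minimal, self-contained sketch of the idea behind it: Snowflake's SQLAlchemy dialect accepts session parameters as query-string arguments on the connection URI, which is the kind of URI that `SnowflakeHook.get_sqlalchemy_engine` ultimately passes to SQLAlchemy. The helper name and all argument values below are hypothetical, for illustration only; this is not the hook's actual implementation.

```python
from urllib.parse import quote_plus, urlencode

def build_snowflake_uri(user, password, account, database, schema,
                        session_parameters=None):
    """Build a Snowflake SQLAlchemy-style connection URI (illustrative only).

    Hypothetical helper: it only shows the URI shape; it does not
    reproduce the real SnowflakeHook logic.
    """
    # Credentials must be URL-escaped so special characters survive parsing.
    uri = (f"snowflake://{quote_plus(user)}:{quote_plus(password)}"
           f"@{account}/{database}/{schema}")
    if session_parameters:
        # Session parameters ride along as query-string arguments.
        uri += "?" + urlencode(session_parameters)
    return uri

# Example with a QUERY_TAG session parameter (all values are made up).
uri = build_snowflake_uri("harish", "s3cret!", "myacct", "analytics", "public",
                          session_parameters={"QUERY_TAG": "qa-run"})
print(uri)  # snowflake://harish:s3cret%21@myacct/analytics/public?QUERY_TAG=qa-run
```

In the real hook, such an engine would then be created with `sqlalchemy.create_engine(uri)`; private-key authentication instead travels through `connect_args` rather than the URI.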
Commit activity: 83 pushes, 5 branches in 6 months