Solution Architect in startup/hyper-growth environments
Designing and creating data models that define the structure and relationships of various data elements within the organization. This includes conceptual, logical, and physical data models, which help ensure data accuracy, consistency, and integrity.
Designing data integration solutions that allow different systems and applications to share and exchange data seamlessly. This may involve selecting appropriate integration technologies, developing ETL (Extract, Transform, Load) processes, and ensuring data quality during the integration process.
Creating and maintaining optimal data pipeline architecture
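The ETL process development mentioned above can be sketched minimally as follows. This is a toy illustration only: SQLite stands in for the target warehouse, and the `sales` table, field names, and sample records are invented for the example, not taken from any real pipeline.

```python
import sqlite3

def extract():
    # Extract: in a real pipeline this would pull from an API, file, or source database.
    return [
        {"id": 1, "amount": "10.50", "region": "emea"},
        {"id": 2, "amount": "7.25", "region": "apac"},
        {"id": 3, "amount": None, "region": "emea"},  # incomplete record
    ]

def transform(rows):
    # Transform: cast types, normalize casing, and enforce a data-quality gate.
    clean = []
    for row in rows:
        if row["amount"] is None:
            continue  # reject records that fail the quality check
        clean.append((row["id"], float(row["amount"]), row["region"].upper()))
    return clean

def load(rows):
    # Load: write the cleaned rows into a target table (SQLite as a stand-in).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    return conn

conn = load(transform(extract()))
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 17.75 (the bad record was dropped by the quality gate)
```

The same extract/transform/load split applies whether the target is SQLite, Snowflake, or any other store; only the connection and ingestion mechanics change.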
Good knowledge of cloud platforms like AWS/Azure/GCP
Good hands-on knowledge of Snowflake is a must, including experience with data ingestion methods (Snowpipe and others), Time Travel, data sharing, and other Snowflake capabilities
Good knowledge of Python/PySpark, including advanced Python features
Ability to write complex SQL queries
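As an illustration of the kind of complex SQL this role calls for, the sketch below runs a window-function query ranking each customer's orders by amount. The `orders` table and its sample data are hypothetical, and SQLite is used here only because it is runnable out of the box; the SQL itself is standard.

```python
import sqlite3

# Build a small hypothetical orders table to query against.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, order_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("alice", 1, 40.0), ("alice", 2, 90.0), ("bob", 3, 55.0)],
)

# Window function: rank each customer's orders by amount, largest first.
rows = conn.execute("""
    SELECT customer, order_id,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY customer, rnk
""").fetchall()
print(rows)  # [('alice', 2, 1), ('alice', 1, 2), ('bob', 3, 1)]
```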
Should be able to trace issues back to their source, perform deep analysis, and provide root cause analysis (RCA)
Good testing and documentation skills
Should be able to communicate effectively in a fast-paced, dynamic, client-facing role where delivering solid work products that exceed high expectations is the measure of success
Ability to be creative and analytical in a problem-solving environment.
Candidate should be based in Hyderabad and work from the customer's location