Description
• 7-15 years of experience as a data engineer in a data warehouse / big data setting
• Developing algorithms for converting basic data into high-level metrics
• Developing Python functions to load data into the database, and developing procedures, packages, and scripts for data migration (a minimal sketch follows this list)
• Writing SQL code to process raw data
• Good experience with Python ORMs, Flask, NumPy, Matplotlib, scikit-learn, pandas, SQLAlchemy, Alembic, and Django; Google Products API integration is a plus
• SQL programming – stored procedures, functions, triggers, algorithms
• Good hands-on skills in MySQL, PostgreSQL, Redis, RabbitMQ, MongoDB, and Kafka
• Airflow – development, scheduling and configuration
• Good knowledge of DevOps tools such as Docker, Docker Compose, GitHub, and Bitbucket
• Experience in BI development and deployment, ER modeling, and multidimensional modeling
• Familiarity with AWS services related to ETL (AWS Batch, AWS Glue, Lambda)
• Research and testing of APIs to identify useful data items
• Processing large amounts of structured and unstructured data (good to have)
• Agile/Scrum and communication skills (spoken English, clarity of thought) (good to have)
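Illustrative only: a minimal sketch of the kind of Python/SQLAlchemy data-loading and metric-derivation routine described in the bullets above. The connection URL, table names, and column names are hypothetical placeholders, and a PostgreSQL warehouse is assumed.

# Hypothetical sketch: load raw rows, then roll them up into a daily metric.
from sqlalchemy import create_engine, text

def load_raw_events(rows, db_url="postgresql://user:pass@localhost/warehouse"):
    """Insert raw event rows and derive a higher-level daily metric."""
    engine = create_engine(db_url)
    with engine.begin() as conn:  # single transaction: insert + aggregate
        # Pass raw data into the warehouse (table/columns are placeholders).
        conn.execute(
            text("INSERT INTO raw_events (event_ts, user_id, value) "
                 "VALUES (:event_ts, :user_id, :value)"),
            rows,
        )
        # Convert basic data into a high-level metric (daily totals);
        # assumes a unique constraint on daily_metrics.metric_date.
        conn.execute(text(
            "INSERT INTO daily_metrics (metric_date, total_value) "
            "SELECT event_ts::date, SUM(value) FROM raw_events GROUP BY event_ts::date "
            "ON CONFLICT (metric_date) DO UPDATE SET total_value = EXCLUDED.total_value"
        ))

if __name__ == "__main__":
    load_raw_events([{"event_ts": "2024-01-01 10:00:00", "user_id": 1, "value": 5}])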
Location: India
Experience Required (in years): Minimum 7, Maximum 15