Data Integration Engineer
The Data Integration Engineer is responsible for architecting and implementing data ingestion, validation, and transformation pipelines at Syndigo. In collaboration with the Product, Development, and Enterprise Data teams, the Data Integration Engineer will design and maintain batch and streaming integrations across a variety of data domains and platforms. The ideal candidate is experienced in big data and cloud architecture and is excited to advance innovative analytics solutions. The Data Integration Engineer should be able to effectively communicate ideas and concepts to peers and have experience leading projects that support business objectives and goals.
ESSENTIAL DUTIES AND RESPONSIBILITIES:
- Take ownership of building solutions and proposing architectural designs for efficient, timely data ingestion and transformation processes geared toward analytics workloads
- Manage code deployment to various environments
- Provide constructive critique and suggest improvements through code reviews
- Work with stakeholders to define and develop data ingestion, validation, and transformation pipelines
- Troubleshoot data pipelines and resolve issues in alignment with the SDLC
- Diagnose and troubleshoot data issues, recognizing common data integration and transformation patterns (a minimal validation sketch follows this list)
- Estimate, track, and communicate status of assigned items to a diverse group of stakeholders
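As a hedged illustration of the kind of ingestion and validation pipeline described above (the file paths, column names, and validation rule here are hypothetical, not Syndigo's), a minimal Spark job in Scala might split incoming records into valid and quarantined sets:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object IngestAndValidate {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ingest-and-validate")
      .getOrCreate()

    // Hypothetical source path and columns; a real pipeline would read from cloud storage.
    val raw = spark.read
      .option("header", "true")
      .csv("/mnt/raw/products.csv")

    // A simple validation rule: rows must carry a non-null id and a positive price.
    val isValid = col("product_id").isNotNull && col("price").cast("double") > 0

    val valid   = raw.filter(isValid)
    val invalid = raw.filter(!isValid)

    // Route valid rows onward and quarantine the rest for review.
    valid.write.mode("overwrite").parquet("/mnt/curated/products")
    invalid.write.mode("overwrite").parquet("/mnt/quarantine/products")

    spark.stop()
  }
}
```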
REQUIREMENTS:
- 2-4 years of experience developing and architecting large-scale data pipelines in a cloud environment
- Demonstrated expertise in Scala (object-oriented programming) or Python (Scala preferred) and Spark SQL
- Experience with Databricks, including Delta Lake
- Experience with Azure and cloud environments, including Azure Data Lake Storage (Gen2), Azure Blob Storage, Azure Tables, Azure SQL Database, Azure Data Factory
- Experience with ETL/ELT patterns, preferably using Azure Data Factory and Databricks jobs (a minimal sketch of this pattern follows this list)
- Fundamental knowledge of distributed data processing and storage
- Fundamental knowledge of working with structured, unstructured, and semi-structured data
- Excellent analytical and problem-solving skills
- Ability to effectively manage time and adjust to changing priorities
- Bachelor’s degree preferred but not required
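As one hedged sketch of the ETL pattern named above (the storage account, paths, and table names are illustrative assumptions, not Syndigo's), a Databricks job in Scala might land raw JSON from ADLS Gen2 into a Delta table:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object RawToDelta {
  def main(args: Array[String]): Unit = {
    // On Databricks a session already exists; getOrCreate reuses it.
    val spark = SparkSession.builder().getOrCreate()

    // Hypothetical ADLS Gen2 location (abfss://container@account.dfs.core.windows.net/...).
    val raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/events/")

    // Light transformation: normalize a column name and stamp the load time.
    val transformed = raw
      .withColumnRenamed("eventType", "event_type")
      .withColumn("loaded_at", current_timestamp())

    // Append into a Delta table; Delta Lake provides ACID guarantees on the lake.
    transformed.write
      .format("delta")
      .mode("append")
      .saveAsTable("analytics.events")
  }
}
```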
- Design, develop, and test highly scalable software applications using software engineering best practices and appropriate design patterns.
- Add new and robust reporting capabilities to the platform, drawing on various insights and analytics.
- Design, develop, and enhance REST APIs, testing them with tools such as Postman.
- Design and architect end-to-end solutions and own their deployment and maintenance in our AWS cloud.
- Implement configurable data mappings to allow custom data transformations (a minimal sketch follows this list).
- Troubleshoot software and resolve issues.
- Write automated unit tests and conduct code reviews.
- Collaborate with Product, DevOps and QA teams on requirements, operations, and automation.
- Practice Agile software development using Jira and Confluence.
- Collaborate with team members across multiple geographic locations as well as time zones.
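To illustrate what a configurable data mapping might look like (a minimal sketch with hypothetical field names and rules, not the platform's actual design), mappings could be expressed as data and applied generically:

```scala
// A mapping rule: copy a source field to a target field, optionally transforming it.
case class FieldMapping(source: String, target: String, transform: String => String = identity)

object Mapper {
  // Apply a configured list of mappings to one source record.
  def applyMappings(record: Map[String, String], mappings: Seq[FieldMapping]): Map[String, String] =
    mappings.flatMap { m =>
      record.get(m.source).map(v => m.target -> m.transform(v))
    }.toMap

  def main(args: Array[String]): Unit = {
    // Hypothetical configuration; in practice this might be loaded from JSON or a database.
    val config = Seq(
      FieldMapping("sku", "product_id"),
      FieldMapping("desc", "description", _.trim),
      FieldMapping("price_usd", "price", v => f"${v.toDouble}%.2f")
    )

    val source = Map("sku" -> "A-100", "desc" -> " Widget ", "price_usd" -> "9.5")
    println(applyMappings(source, config))
    // e.g. Map(product_id -> A-100, description -> Widget, price -> 9.50)
  }
}
```

Keeping the mapping rules as plain data means custom transformations can be added per customer or per feed without changing the pipeline code itself.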
TECHNICAL SKILLS AND EXPERIENCE:
- 2+ years of experience designing and developing software using Microsoft technologies
- Required:
- Proficient in JavaScript
- Experience with REST APIs
- Experience with GitHub and Azure DevOps
- Experience with SQL
- Plus:
- DevOps experience in AWS
- Knowledge of e-commerce platforms.
- Knowledge of data analysis.
- Experience with Elasticsearch.