Project Description
We are seeking hands-on Data Engineer consultants to build out the organization's next-generation data warehouses, solving data availability and access issues across the organization and enabling a graph of connectivity between hundreds of data sets. We need people who are enthusiastic about enabling internal and external clients by streamlining and facilitating easy access to their critical data, data that is well defined and has established, transparent levels of quality. This engineer will leverage different data platforms to achieve this while providing critical input to extend data platform capabilities. Familiarity with ETL and cloud-platform data pipeline solutions is critical, as is REST API authoring for data access.
Responsibilities
• Serve as a member of the Business Data Engineering team, working to deliver data ingest/enrich pipelines and access APIs using common cloud technologies.
• Work with consumers to understand their data requirements and deliver data contracts with well-defined SLIs to track SLAs.
• Harness modern application best practices: code quality, API test coverage, Agile development, DevOps, observability, and support.
• Maintain programming standards and ensure the usage of the pattern/template for API Proxy.
• Conduct code reviews and enforce automated test coverage.
• Standardize the CI/CD setup for API management tools and automated deployment.
• Utilize problem-solving skills to help your peers research and select tools, products, and frameworks that are vital to supporting business initiatives.
Skills
Must have
• 5+ years of proven industry experience; bachelor's degree in IT or a related field
• Senior-level hands-on development expertise in Python, SQL, Spark, Kafka
• Hands-on experience designing and developing high-volume REST APIs, including API protocols and data formats.
• Proven experience with Snowflake (SnowPro certification would be a plus)
• Proven experience working in cloud data platforms such as Azure (Databricks, ADF)
• Working with Azure API and DB Platforms
• Understanding of databases, API frameworks, and governance frameworks, with expertise in hosting and managing platforms such as Hadoop, Spark, and Kafka; BI tools such as Tableau and Alteryx; and governance tools such as Collibra and Soda
• Strong understanding of Twelve-Factor App Methodology
• Solid understanding of API and integration design principles and patterns, with experience in web technologies.
• Ability to design object-oriented, modularized, clean, and maintainable code and create policies in Python.
• Experience with test-driven development and API testing automation.
• Demonstrated track record across the full project lifecycle, including development and post-implementation support activities.
Nice to have
• Financial experience: public and alternative asset management or capital markets
• Familiarity with NoSQL/NewSQL databases
• Strong documentation capability and adherence to testing and release management standards
• Experience designing, developing, modifying, and testing databases that support Data Warehousing and BI business teams
• Familiarity with SDLC methodologies, defect tracking (JIRA, Azure DevOps, ServiceNow, etc.)
Soft Skills:
• An analytical and logical thought process for developing project solutions
• Strong interpersonal and communication skills; works well in a team environment
• Ability to deliver under competing priorities and pressures.
• Excellent organizational skills in the areas of code structuring & partitioning, commenting, and documentation for team alignment and modifications
Languages
English: C1 Advanced