Description
Data Integration Lead - Talend Developer Offshore
Responsibilities
• Leads the delivery of data extraction, transformation, and load processes that turn data from disparate sources into a form consumable by analytics, for projects of moderate complexity, applying strong technical capability and a keen sense of database performance
• Designs, develops, and produces data models of relatively high complexity, leveraging a sound understanding of data modelling standards to recommend the right model for each requirement
• Batch Processing - Capability to design an efficient way of processing high volumes of data where a group of transactions is collected over a period of time
• Data Integration (Sourcing, Storage and Migration) - Capability to design and implement models, capabilities, and solutions to manage data within the enterprise (structured and unstructured data, data archiving principles, data warehousing, data sourcing, etc.). This includes data models, storage requirements, and the migration of data from one system to another
• Data Quality, Profiling and Cleansing - Capability to review (profile) a data set to establish its quality against a defined set of parameters and to highlight data where corrective action (cleansing) is required to remediate it
• Stream Systems - Capability to discover, integrate, and ingest all available data from the machines that produce it, as fast as it is produced, in any format, and at any quality
• Excellent interpersonal skills to build a network across a variety of departments in the business, in order to understand data and deliver business value; may interface and communicate with program teams, management, and stakeholders as required to deliver small to medium-sized projects
• Understands the difference between on-premises and cloud-based data integration technologies
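To make the data quality, profiling, and cleansing capability above concrete, here is a minimal standard-library Python sketch; the field names, rules, and records are hypothetical examples, not part of the role's actual toolchain:

```python
# Hypothetical sample records; in practice these would come from a source system.
records = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": None,            "age": 29},
    {"id": 3, "email": "c@example.com", "age": -5},
]

# Quality rules (assumed for illustration): each field maps to a predicate
# that must hold for the value to be considered valid.
rules = {
    "email": lambda v: v is not None and "@" in v,
    "age":   lambda v: isinstance(v, int) and 0 <= v <= 120,
}

def profile(rows, rules):
    """Profile rows against the rules: return (summary, flagged), where
    summary counts failures per field and flagged lists the ids of rows
    that need corrective action (cleansing)."""
    summary = {field: 0 for field in rules}
    flagged = []
    for row in rows:
        bad = [f for f, ok in rules.items() if not ok(row.get(f))]
        for f in bad:
            summary[f] += 1
        if bad:
            flagged.append(row["id"])
    return summary, flagged

summary, flagged = profile(records, rules)
print(summary)   # {'email': 1, 'age': 1}
print(flagged)   # [2, 3]
```

A production profiler would add per-rule failure rates and route flagged rows to a remediation queue, but the profile-then-flag shape stays the same.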
The Role offers
• Opportunity to join a global team to do meaningful work that contributes to global strategy and individual development
• An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and operations
• An opportunity to showcase strong analytical skills and problem-solving ability
• Learning and growth opportunities in the cloud and big data engineering spaces
Essential Skills
• 6+ years’ experience developing large-scale data pipelines in cloud and/or on-prem environments
• Highly proficient in one or more market-leading ETL tools such as Informatica, DataStage, SSIS, or Talend
• Deep knowledge of data warehouse/data mart architecture and modelling
• Ability to define and develop data ingestion, validation, and transformation pipelines
• Deep knowledge of distributed data processing and storage
• Deep knowledge of working with structured, semi-structured, and unstructured data
• Hands-on experience with ETL/ELT patterns
• Extensive experience applying analytics, insights, and data mining to commercial, real-world problems
• Technical experience in at least one programming language, preferably Java, .NET, or Python
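The ingest/validate/transform pipeline pattern listed above can be sketched with the Python standard library alone; the CSV schema, stage names, and normalization choices below are assumptions made for the example:

```python
import csv
import io

# Hypothetical raw extract; row 2 has a malformed amount on purpose.
RAW_CSV = """id,amount,currency
1,100.50,USD
2,abc,EUR
3,250.00,usd
"""

def ingest(text):
    """Ingest: parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def validate(rows):
    """Validate: keep only rows whose amount parses as a number."""
    good = []
    for row in rows:
        try:
            float(row["amount"])
            good.append(row)
        except ValueError:
            pass  # a real pipeline would route rejects to an error store
    return good

def transform(rows):
    """Transform: normalize types and currency codes for the target model."""
    return [
        {"id": int(r["id"]),
         "amount": float(r["amount"]),
         "currency": r["currency"].upper()}
        for r in rows
    ]

loaded = transform(validate(ingest(RAW_CSV)))
print(loaded)
# [{'id': 1, 'amount': 100.5, 'currency': 'USD'},
#  {'id': 3, 'amount': 250.0, 'currency': 'USD'}]
```

Tools like Talend or Informatica express the same stages as graphical components, but the underlying ingest-validate-transform flow is identical.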
Essential Qualification
• BE/BTech in Computer Science, Engineering, or a relevant field