
Data Integration Lead


Description

 

Tools: Collibra or any similar data governance tool (a plus).

 

Roles:

• Looking for someone with data governance / metadata management experience (insurance data experience is a plus!).

• A data team member with good domain knowledge and familiarity with the data stack.

• Focused primarily on the business metadata side: creating business concepts, business terms, and definitions, and linking them to the appropriate technical metadata to connect lineage (a minimal illustration follows this list).

• A person who can perform a techno-functional role, bridging the gap between the business team and the technical team.

• Identify and prioritize data governance projects, which are prerequisites for data quality, data security, etc.
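To make the business-metadata bullet concrete, here is a minimal sketch of what linking business terms to technical metadata for lineage can look like. The data model below (BusinessTerm, TechnicalAsset, the example "Claim Amount" term, and the column names) is invented for illustration; a governance tool such as Collibra expresses the same idea through assets, domains, and relations rather than code.

```python
from dataclasses import dataclass, field

# Hypothetical mini-model of a business glossary linked to technical metadata.
# All names here are invented for this sketch; they are not Collibra's API.

@dataclass
class TechnicalAsset:
    name: str    # e.g. a physical column such as "DW.FACT_CLAIM.CLAIM_AMT"
    system: str  # the system that owns the asset

@dataclass
class BusinessTerm:
    name: str
    definition: str
    linked_assets: list = field(default_factory=list)

    def link(self, asset: TechnicalAsset) -> None:
        """Connect the business term to the technical metadata behind it."""
        self.linked_assets.append(asset)

# A business concept is captured as a governed term with a definition...
term = BusinessTerm("Claim Amount",
                    "The monetary amount requested on an insurance claim.")

# ...and linked to the physical columns that carry it, so lineage can be
# traced from the glossary down to the source system and the warehouse.
term.link(TechnicalAsset("CLM.CLAIM_AMT", "PolicyAdminDB"))
term.link(TechnicalAsset("DW.FACT_CLAIM.CLAIM_AMT", "Warehouse"))

for asset in term.linked_assets:
    print(f"{term.name} -> {asset.system}.{asset.name}")
```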

 

Responsibilities:

• Create and approve data standards, policies, and business rules.

• Resolve problems and issues that data stewards are unable to resolve on their own.

• Help standardize data definitions, rules, and descriptions.

• Contribute to and help manage metadata.

• Contribute to the enterprise business glossary.

• Oversee and review key standard reports and dashboards.

• Ensure that legal and other compliance standards are followed.

• Ensure data documentation is written and maintained.

• Bring project management skills, communication, organization, and a willingness to delegate.

• Implement changes to business processes or other data management practices decided on by the DGC.


Company

Hexaware Technologies

Job Posted

a year ago

Job Type

Full-time

Work Mode

On-site

Experience Level

8-12 years

Category

Technology

Locations

Mumbai, Maharashtra, India

Qualification

Bachelor


Related Jobs


Data Integration Lead

Hexaware Technologies

Bengaluru, Karnataka, India

+3 more

Posted: a year ago

Description

1. 6+ years' experience developing large-scale data pipelines in a cloud/on-prem environment.
2. Highly proficient in one or more market-leading ETL or application integration tools, such as Informatica Intelligent Cloud Services (preferred), Informatica, SSIS, Talend, etc.
3. Deep knowledge of data cleansing, data profiling, and data integration using data rules with API integration (thorough knowledge of connectors, process objects, variable mechanisms, and Postman).
4. Thorough knowledge of SQL Server queries and procedures.
5. Deep knowledge of data warehouse/data mart architecture and modelling.
6. Define and develop data ingestion, validation, and transformation pipelines.
7. Deep knowledge of distributed data processing and storage.
8. Deep knowledge of working with structured, unstructured, and semi-structured data.
9. Working experience with ETL/ELT patterns.
10. Extensive experience applying analytics, insights, and data mining to commercial "real-world" problems.
11. Technical experience in at least one programming language, preferably Java, .Net, or Python.


Data Integration Lead

Hexaware Technologies

Bengaluru, Karnataka, India

+2 more

Posted: a year ago

Description

Responsibilities

• Leads the delivery of data extraction, transformation, and load processes from disparate sources into a form consumable by analytics processes, for projects of moderate complexity, using strong technical capabilities and a sense of database performance.
• Designs, develops, and produces data models of relatively high complexity, leveraging a sound understanding of data modelling standards to suggest the right model for the requirement.
• Batch Processing - the capability to design an efficient way of processing high volumes of data where a group of transactions is collected over a period.
• Data Integration (Sourcing, Storage and Migration) - the capability to design and implement models, capabilities, and solutions to manage data within the enterprise (structured and unstructured, data archiving principles, data warehousing, data sourcing, etc.), including the data models, storage requirements, and migration of data from one system to another.
• Data Quality, Profiling and Cleansing - the capability to review (profile) a data set to establish its quality against a defined set of parameters and to highlight data where corrective action (cleansing) is required.
• Stream Systems - the capability to discover, integrate, and ingest all available data from the machines that produce it, as fast as it is produced, in any format and at any quality.
• Excellent interpersonal skills to build a network across a variety of departments in the business to understand data and deliver business value; may interface and communicate with program teams, management, and stakeholders as required to deliver small to medium-sized projects.
• Understands the difference between on-prem and cloud-based data integration technologies.

The Role Offers

• The opportunity to join a global team doing meaningful work that contributes to global strategy and individual development.
• An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and operations.
• An opportunity to showcase strong analytical skills and problem-solving ability.
• Learning and growth opportunities in the cloud and big data engineering spaces.

Essential Skills

• 6+ years' experience developing large-scale data pipelines in a cloud/on-prem environment.
• Highly proficient in one or more market-leading ETL tools, such as Informatica, DataStage, SSIS, Talend, etc.
• Deep knowledge of data warehouse/data mart architecture and modelling.
• Define and develop data ingestion, validation, and transformation pipelines.
• Deep knowledge of distributed data processing and storage.
• Deep knowledge of working with structured, unstructured, and semi-structured data.
• Working experience with ETL/ELT patterns.
• Extensive experience applying analytics, insights, and data mining to commercial "real-world" problems.
• Technical experience in at least one programming language, preferably Java, .Net, or Python.

Essential Qualification

• BE/BTech in Computer Science, Engineering, or a relevant field.

JD:

• Strong working knowledge of IICS (Informatica Cloud) and Informatica Designer transformations such as Source Qualifier, dynamic and static lookups, connected and unconnected lookups, Expression, Filter, Router, Joiner, Normalizer, and Update Strategy.
• Solid hands-on development experience in Informatica PowerCenter: reusable transformations, aggregators, lookups, caches, performance tuning, joiners, rank, router, update strategy, etc.
• Strong knowledge of Snowflake.
• Experience in data warehousing; must know data warehousing concepts such as SCD1, SCD2, etc. (a minimal SCD Type 2 sketch follows this description).
• Troubleshoot issues and identify bottlenecks in existing data workflows.
• Provide performance-tuning insight and create reusable objects and templates.
• Strong in migrating objects and processes from lower environments to higher environments.
• Strong in scheduling workflows, tasks, and mappings.
• Basic understanding of Informatica administration.
• Strong development skills in SQL; knowledge of AWS and HVR would be an added advantage.
• Good communication and customer-interaction skills.
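The JD above treats SCD1/SCD2 as must-know data warehousing concepts. As a refresher, the sketch below shows the core of SCD Type 2 in plain Python: a changed attribute closes out the current dimension row and opens a new one, preserving history. The table layout and column names (customer_id, city, valid_from, valid_to, is_current) are invented for illustration; in practice this would be a MERGE statement in the warehouse or an Informatica mapping with an Update Strategy transformation.

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # sentinel "end of time" for open rows

def apply_scd2(dimension, incoming, load_date):
    """Merge incoming source rows into an SCD Type 2 dimension (list of dicts)."""
    current = {row["customer_id"]: row for row in dimension if row["is_current"]}
    for src in incoming:
        cur = current.get(src["customer_id"])
        if cur is None:
            # New key: open a fresh current row.
            dimension.append({**src, "valid_from": load_date,
                              "valid_to": HIGH_DATE, "is_current": True})
        elif cur["city"] != src["city"]:
            # Tracked attribute changed: expire the old row, open a new one.
            cur["valid_to"] = load_date
            cur["is_current"] = False
            dimension.append({**src, "valid_from": load_date,
                              "valid_to": HIGH_DATE, "is_current": True})
        # Unchanged rows are left alone; SCD2 never overwrites history.
    return dimension

dim = [{"customer_id": 1, "city": "Mumbai", "valid_from": date(2023, 1, 1),
        "valid_to": HIGH_DATE, "is_current": True}]
dim = apply_scd2(dim, [{"customer_id": 1, "city": "Pune"}], date(2024, 6, 1))
for row in dim:
    print(row)  # the Mumbai row is expired; a current Pune row is added
```

(Under SCD Type 1, by contrast, the same change would simply overwrite the city in place, keeping no history.)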


Data Integration Lead

Hexaware Technologies

Chennai, Tamil Nadu, India

Posted: a year ago

Description

Data Integration Lead - Talend Developer (Offshore)

Responsibilities

• Leads the delivery of data extraction, transformation, and load processes from disparate sources into a form consumable by analytics processes, for projects of moderate complexity, using strong technical capabilities and a sense of database performance.
• Designs, develops, and produces data models of relatively high complexity, leveraging a sound understanding of data modelling standards to suggest the right model for the requirement.
• Batch Processing - the capability to design an efficient way of processing high volumes of data where a group of transactions is collected over a period.
• Data Integration (Sourcing, Storage and Migration) - the capability to design and implement models, capabilities, and solutions to manage data within the enterprise (structured and unstructured, data archiving principles, data warehousing, data sourcing, etc.), including the data models, storage requirements, and migration of data from one system to another.
• Data Quality, Profiling and Cleansing - the capability to review (profile) a data set to establish its quality against a defined set of parameters and to highlight data where corrective action (cleansing) is required.
• Stream Systems - the capability to discover, integrate, and ingest all available data from the machines that produce it, as fast as it is produced, in any format and at any quality.
• Excellent interpersonal skills to build a network across a variety of departments in the business to understand data and deliver business value; may interface and communicate with program teams, management, and stakeholders as required to deliver small to medium-sized projects.
• Understands the difference between on-prem and cloud-based data integration technologies.

The Role Offers

• The opportunity to join a global team doing meaningful work that contributes to global strategy and individual development.
• An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and operations.
• An opportunity to showcase strong analytical skills and problem-solving ability.
• Learning and growth opportunities in the cloud and big data engineering spaces.

Essential Skills

• 6+ years' experience developing large-scale data pipelines in a cloud/on-prem environment.
• Highly proficient in one or more market-leading ETL tools, such as Informatica, DataStage, SSIS, Talend, etc.
• Deep knowledge of data warehouse/data mart architecture and modelling.
• Define and develop data ingestion, validation, and transformation pipelines.
• Deep knowledge of distributed data processing and storage.
• Deep knowledge of working with structured, unstructured, and semi-structured data.
• Working experience with ETL/ELT patterns.
• Extensive experience applying analytics, insights, and data mining to commercial "real-world" problems.
• Technical experience in at least one programming language, preferably Java, .Net, or Python.

Essential Qualification

• BE/BTech in Computer Science, Engineering, or a relevant field.


Big Data Lead

Hexaware Technologies

Pune, Maharashtra, India

+2 more

Posted: a year ago

Description

Senior ADF Engineer

Work timing: 12pm to 9.30pm IST

As a Senior Azure ADF Engineer, you will be responsible for designing, developing, and maintaining data pipelines using Azure Data Factory (ADF). You will work closely with customers to understand their business requirements and translate them into scalable, performant, and secure data integration solutions. You will ensure that the data pipelines integrate seamlessly with other Azure services, such as Azure Blob Storage, Azure Synapse Analytics, SQL Database, and Azure Data Lake Storage.

Key Responsibilities:

- Design, develop, and maintain data pipelines using Azure Data Factory (ADF).
- Work closely with customers to understand their business requirements and translate them into scalable, performant, and secure data integration solutions.
- Collaborate with other architects, engineers, and technical teams to ensure that data pipelines align with the overall solution architecture and best practices.
- Optimize data pipelines for performance, scalability, and cost efficiency.
- Develop and maintain technical documentation, including pipeline diagrams, data flow diagrams, and code documentation.
- Participate in data-related aspects of pre-sales activities, including solution design, proposal development, and customer presentations.
- Stay current with emerging technologies and industry trends related to data integration and management in Azure, and recommend improvements to existing solutions and opportunities for new ones.

Qualifications:

- Bachelor's or master's degree in computer science, engineering, or a related field.
- At least 7 years of experience in data engineering, including at least 3 years with Azure Data Factory.
- Strong knowledge of Azure Data Factory, including data ingestion, transformation, and orchestration.
- Experience with data integration and ETL processes, and with writing SQL queries and scripts.
- Experience with Azure Data Lake and with building pipelines for batch and CDC (change data capture) loads (a minimal watermark-based sketch follows this description).
- Experience implementing ETL frameworks.
- Strong understanding of data security, including encryption, access control, and auditing.
- Excellent communication and presentation skills, with the ability to convey complex technical concepts to both technical and non-technical audiences.
- Strong problem-solving and analytical skills, with the ability to identify and resolve complex technical issues.

Good-to-Have Qualifications:

- Microsoft Certified: Azure Data Engineer Associate certification.
- Experience with other Azure data services, such as Azure Synapse Analytics and Azure Databricks.
- Experience with other data integration tools, such as Talend.
- Experience with programming languages such as SQL and PowerShell.
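The posting asks for experience building batch and CDC (change data capture) pipelines over Azure Data Lake. The sketch below illustrates the common high-watermark pattern behind incremental loads in plain Python; in ADF this is typically a Lookup activity reading a watermark table that feeds a parameterized Copy activity. The helper names and the updated_at column are invented for illustration.

```python
from datetime import datetime

# Stand-in for a source table that exposes a change timestamp column.
SOURCE = [
    {"id": 1, "value": "a", "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "value": "b", "updated_at": datetime(2024, 2, 1)},
]

def fetch_rows_since(watermark):
    """Stand-in for: SELECT * FROM src WHERE updated_at > @watermark."""
    return [r for r in SOURCE if r["updated_at"] > watermark]

def run_incremental_load(last_watermark):
    """One incremental run: pull only changes newer than the stored watermark."""
    rows = fetch_rows_since(last_watermark)
    if not rows:
        return last_watermark  # nothing new; keep the old watermark
    # ... load `rows` into the target here (a Copy activity in ADF terms) ...
    return max(r["updated_at"] for r in rows)  # persist for the next run

wm = run_incremental_load(datetime(2024, 1, 15))
print(wm)  # 2024-02-01 00:00:00 -> only row 2 was picked up this run
```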


ServiceNow Tech Lead

Hexaware Technologies

Noida, Uttar Pradesh, India

+4 more

Posted: a year ago

Job Title: ServiceNow ITOM Sr. Developer / Developer

Must-Have Experience

· Experienced in implementing ITOM solutions using industry best practices.
· Hands-on development including creation, configuration, and customization of patterns, discovery schedules, probes, sensors, and event rules.
· Integrate ServiceNow ITOM modules with a variety of enterprise monitoring tools, alert automation, and application management tools using OOB plugins and API methods.
· Experienced in implementing ServiceNow Discovery, Service Mapping, Event Management, and Orchestration use cases.
· Third-party software integrations with ServiceNow.
· Design and modify ServiceNow forms, workflows, scripts, transform maps, service maps, web services, inbound email actions, SLAs, and more.
· MID Server management experience.
· Experience across multiple ServiceNow implementations.

Good-to-Have Experience

· Experience in application architecture, infrastructure architecture, database architecture, networking, and distributed systems.
· Well-versed in modern web technologies and cloud computing architectural principles for cloud-based platforms, including SaaS, PaaS, multi-tenancy, and automation.
· Understanding of Application and Technology Portfolio Management.
· Experience with discovery technologies (ADDM, TADDM, UD, etc.).
· Familiarity with scripting technologies such as JavaScript, PowerShell, Perl, WMI, SSH, Python, XPath, and SNMP.
· Experience with custom application development.
· Experience working on Performance Analytics.
· Design and modification of the ServiceNow Service Portal.
· Domain separation and designing process and data flow across domains.
· Knowledge of LDAP/Active Directory/SSO.
· Review existing setups and provide best-practice recommendations in line with the OEM.
· Monitor the health, usage, and compliance of ServiceNow systems.
· Excellent knowledge of ServiceNow best practices and ongoing knowledge of the latest ServiceNow features.
· Experience with HTML coding and Jelly script strongly preferred.
· Experience with scripting in ServiceNow (business rules, UI pages, UI macros, etc.).
· Excellent collaboration skills, including analysis, brainstorming, communication, and teamwork.
· Self-starter and innovator.

Academic and Professional Experience

Professional
· 3+ years of experience with ServiceNow required.
· 5+ years of experience in the IT industry required.

Academics
· ServiceNow Certified System Administrator certification.
· CIS certifications in Discovery, Service Mapping, Event Management, and Cloud Management.
· ITIL Foundations certification preferred.
· Bachelor's degree or an equivalent combination of education and experience.