Big Data Lead

Description

Senior ADF Engineer

Work timing: 12:00 PM to 9:30 PM IST

As a Senior Azure ADF Engineer, you will design, develop, and maintain data pipelines using Azure Data Factory (ADF). You will work closely with customers to understand their business requirements and translate them into scalable, performant, and secure data integration solutions, and you will ensure that these pipelines integrate seamlessly with other Azure services such as Azure Blob Storage, Azure Synapse Analytics, Azure SQL Database, and Azure Data Lake Storage.

Key Responsibilities:

- Design, develop, and maintain data pipelines using Azure Data Factory (ADF) (see the pipeline sketch after this list).

- Work closely with customers to understand their business requirements and translate them into scalable, performant, and secure data integration solutions.

- Collaborate with other architects, engineers, and technical teams to ensure that data pipelines are aligned with overall solution architecture and best practices.

- Optimize data pipelines for performance, scalability, and cost efficiency.

- Develop and maintain technical documentation, including pipeline diagrams, data flow diagrams, and code documentation.

- Participate in data-related aspects of pre-sales activities, including solution design, proposal development, and customer presentations.

- Stay current with emerging technologies and industry trends related to data integration and management in Azure, and provide recommendations for improving existing solutions and implementing new ones.
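
To make the first responsibility above concrete, here is a minimal sketch of creating a one-activity copy pipeline with the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, and dataset names are hypothetical placeholders, and exact model signatures can differ between SDK versions:

    # Sketch: create a one-activity ADF copy pipeline programmatically.
    # All subscription, resource group, factory, and dataset names below
    # are illustrative placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        BlobSource, CopyActivity, DatasetReference, PipelineResource, SqlSink,
    )

    adf_client = DataFactoryManagementClient(
        DefaultAzureCredential(), "<subscription-id>"
    )

    # One copy activity: read from Blob Storage, write to Azure SQL Database.
    copy_activity = CopyActivity(
        name="CopyBlobToSql",
        inputs=[DatasetReference(
            reference_name="BlobInputDataset", type="DatasetReference"
        )],
        outputs=[DatasetReference(
            reference_name="SqlOutputDataset", type="DatasetReference"
        )],
        source=BlobSource(),
        sink=SqlSink(),
    )

    adf_client.pipelines.create_or_update(
        "my-resource-group", "my-data-factory", "CopyBlobToSqlPipeline",
        PipelineResource(activities=[copy_activity]),
    )

The same pipeline is more often authored in the ADF visual designer or as JSON; the SDK route shown here is simply one way to keep pipeline definitions in code.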

Qualifications:

- Bachelor's or master's degree in computer science, engineering, or a related field.

- At least 7 years of experience in data engineering, including at least 3 years of experience with Azure Data Factory.

- Strong knowledge of Azure Data Factory, including data ingestion, transformation, and orchestration.

- Experience with data integration and ETL processes, including writing SQL queries and scripts.

- Experience with Azure Data Lake and building pipelines for batch and CDC (change data capture) loads (see the incremental-load sketch after this list).

- Experience with implementing ETL frameworks.

- Strong understanding of data security, including encryption, access control, and auditing.

- Excellent communication and presentation skills, with the ability to effectively communicate complex technical concepts to both technical and non-technical audiences.

- Strong problem-solving and analytical skills, with the ability to identify and resolve complex technical issues.
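
As a concrete illustration of the batch/CDC experience asked for above (see the Azure Data Lake bullet), here is a minimal watermark-driven incremental extraction sketch in Python with pyodbc. The connection string, tables, and columns are all hypothetical; in a real ADF solution the same pattern is usually expressed declaratively with a Lookup activity, a Copy activity, and a watermark-update step:

    # Sketch: watermark-driven incremental (CDC-style) extraction.
    # All connection details, tables, and columns are placeholders.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=myserver.database.windows.net;DATABASE=mydb;"
        "UID=etl_user;PWD=<secret>"
    )
    cursor = conn.cursor()

    # 1. Read the high-water mark left by the previous run.
    cursor.execute(
        "SELECT last_modified FROM etl.watermark WHERE table_name = ?",
        "sales.orders",
    )
    last_watermark = cursor.fetchone()[0]

    # 2. Pull only the rows changed since that watermark (the delta).
    cursor.execute(
        "SELECT order_id, amount, modified_at FROM sales.orders "
        "WHERE modified_at > ? ORDER BY modified_at",
        last_watermark,
    )
    changed_rows = cursor.fetchall()

    # 3. (Load changed_rows into the lake or a staging table here.)

    # 4. Advance the watermark so the next run starts where this one ended.
    if changed_rows:
        cursor.execute(
            "UPDATE etl.watermark SET last_modified = ? WHERE table_name = ?",
            changed_rows[-1].modified_at, "sales.orders",
        )
        conn.commit()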

Good-to-Have Qualifications:

- Microsoft Certified: Azure Data Engineer Associate certification.

- Experience with other Azure data services, such as Azure Synapse Analytics and Azure Databricks.

- Experience with other data integration tools, such as Talend.

- Experience with scripting and query languages such as SQL and PowerShell.

Company: Hexaware Technologies
Job Posted: a year ago
Job Type: Full-time
Work Mode: On-site
Experience Level: 3-7 years
Category: Technology
Locations: Pune, Maharashtra, India; Mumbai, Maharashtra, India; Chennai, Tamil Nadu, India
Qualification: Bachelor's degree

Related Jobs

Big Data Engineer

Hexaware Technologies

Chennai, Tamil Nadu, India

Posted: a year ago

Description

Skill: Azure ADF, Synapse, SQL, PySpark, and any ETL tool (Informatica/SSIS/DataStage)

Must Have:
- 5+ years of IT experience in data warehousing.
- Hands-on data experience with cloud technologies on Azure: Synapse, ADF, Databricks, and PySpark.
- Prior experience with any of the ETL technologies, such as Informatica PowerCenter, SSIS, or DataStage.
- Ability to understand design and source-to-target mapping (STTM) and create specification documents.
- Flexibility and willingness to work on non-cloud ETL technologies as project requirements dictate, though the main focus of this role is cloud-related projects.
- Flexibility to operate from the office location.
- Able to mentor and guide junior resources as needed.
- Banking experience in Risk & Regulatory, Commercial, or Credit Cards/Retail.

Nice to Have:
- Any relevant certifications.

Dotnet Cloud Native Lead

Hexaware Technologies

Pune, Maharashtra, India

+2 more

Posted: a year ago

Description

Key Skills: .NET Core, Microservices, Docker, SQL, and Azure, with/without Angular 4
Total Experience Level: 7-11 years
Role: Technical Lead / Senior Developer
Grade: G5

Must have:
- Hands-on experience in RESTful HTTP service design and development using ASP.NET Web API 2, ASP.NET Core 2.2+, or .NET Core 3.0 Web API for 2-3+ years.
- Good hands-on experience using Entity Framework Core with LINQ.
- Strong hands-on experience in HTML5, CSS3, and Bootstrap.
- Experience implementing containerized solutions using Docker, Azure Container Service, and Azure Kubernetes Service for at least 2 years.
- Strong hands-on experience in SQL Server, PostgreSQL, or MySQL, including database design (e.g., writing complex stored procedures).
- Experience in cloud computing with Azure.
- Hands-on experience with some or all of the following Azure services: Load Balancer, DNS, Traffic Manager, Compute, Storage, Azure SQL Database, Cosmos DB, Azure Search, Azure Functions, Azure Logic Apps, Azure serverless programming, Azure API Management, Service Bus, Key Vault, Azure Batch, Azure Redis Cache, AAD, and Application Insights.
- Worked in .NET Framework using C# for 4+ years; should have worked in ASP.NET MVC and Web API.
- Hands-on microservices development experience and container hosting.
- Experience with Team Foundation Server or Git.
- Strong handle on SOLID principles and design patterns.

Nice to have:
- Hands-on experience in Angular 6+ for at least 2 years.
- Experience with RabbitMQ and any Azure queuing service.
- Exposure to monitoring and tracking of Azure services.
- Automating the hosting of Kubernetes clusters using Azure Kubernetes Service.
- Experience with ARM templates and PowerShell scripting.
- Design and implementation experience and deep knowledge of building Service Fabric services such as identity, authentication and authorization, service registration and discovery, and deployment and provisioning.
- Experience with and understanding of cloud security policies and infrastructure deployments in enterprise-wide environments.
- Automating end-to-end deployments using Visual Studio pipelines.

SharePoint Lead

Hexaware Technologies

Bengaluru, Karnataka, India

+3 more

Posted: a year ago

Description

Responsibilities:
- Hands-on experience in developing custom solutions on SharePoint.
- Has been part of SharePoint upgrade and migration projects.
- Experience designing and architecting SharePoint solutions, as well as developing workflows using Visual Studio and third-party workflow engines.
- Must have experience creating high-level design documents for functional and non-functional requirements, and estimating tasks.
- Architect and manage SharePoint solutions, streamline content management, and enhance collaboration across the organization. Customize sites, enforce compliance, and provide user support for seamless knowledge sharing.
- Good interpersonal communication, team-leading, and organization skills.
- Participate in design, code, and test review cycles.
- Assist in the creation of prototypes, POCs, presentations, collateral, etc.

The Role Offers:
- Work as an individual contributor who can create proofs of concept and work on complex technical designs independently.

Essential Skills:
- Well versed with SharePoint 2010/2013/2016 and SharePoint Online.
- Experience in design and development of solutions/portals for SharePoint 2010, 2013, 2016, and Office 365.
- SharePoint 2013 add-in design and development (SharePoint-hosted and provider-hosted apps).
- SharePoint 2010/2013/2016 development skills, including Server-Side Object Model, Office 365/SharePoint Online, web parts, Visual Studio, AJAX, SQL Server, web services, WCF, SharePoint Designer, SharePoint features, BDC/BCS, ASP.NET, and Search.
- Client-side development: CSOM/JSOM/REST API, PowerShell, and third-party front-end frameworks (Angular, React, etc.).
- Knowledge of Modern UI and its development methodologies: SPFx, SharePoint PnP.
- Basic knowledge of SharePoint admin activities: configuring service applications, managed metadata, backup and recovery, health and monitoring.
- Has developed SharePoint solutions using site definitions, list definitions, web parts, features, and event receivers.
- Understands the logical, physical, and information architecture principles of SharePoint.
- Understands OOPS, OOAD, and UML use cases.
- Experience in ASP.NET (Forms/MVC), HTTP handlers, modules, and IIS 6.0 and above.
- Well aware of SharePoint and service-oriented architecture best practices.

Essential Qualifications:
- A bachelor's degree is a must; candidate with a minimum of 6 years of software development experience using the Microsoft technology platform.
- 6-7 years of relevant experience on SharePoint.

Data Integration Lead

Hexaware Technologies

Chennai, Tamil Nadu, India

Posted: a year ago

Description

Data Integration Lead - Talend Developer (Offshore)

Responsibilities:
- Lead the delivery processes of data extraction, transformation, and load from disparate sources into a form consumable by analytics processes, for projects of moderate complexity, using strong technical capabilities and a sense of database performance.
- Design, develop, and produce data models of relatively high complexity, leveraging a sound understanding of data modelling standards to suggest the right model for the requirement.
- Batch processing: capability to design an efficient way of processing high volumes of data where a group of transactions is collected over a period.
- Data integration (sourcing, storage, and migration): capability to design and implement models, capabilities, and solutions to manage data within the enterprise (structured and unstructured data, data archiving principles, data warehousing, data sourcing, etc.), including data models, storage requirements, and migration of data from one system to another.
- Data quality, profiling, and cleansing: capability to review (profile) a data set to establish its quality against a defined set of parameters and to highlight data where corrective action (cleansing) is required.
- Stream systems: capability to discover, integrate, and ingest all available data from the machines that produce it, as fast as it is produced, in any format, and at any quality.
- Excellent interpersonal skills to build a network across business departments to understand data and deliver business value; may interface and communicate with program teams, management, and stakeholders to deliver small to medium-sized projects.
- Understand the differences between on-premises and cloud-based data integration technologies.

The Role Offers:
- Opportunity to join a global team doing meaningful work that contributes to global strategy and individual development.
- An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and operations.
- An opportunity to showcase strong analytical skills and problem-solving ability.
- Learning and growth opportunities in the cloud and big data engineering spaces.

Essential Skills:
- 6+ years' experience developing large-scale data pipelines in a cloud or on-premises environment.
- Highly proficient in one or more market-leading ETL tools such as Informatica, DataStage, SSIS, or Talend.
- Deep knowledge of data warehouse/data mart architecture and modelling.
- Ability to define and develop data ingestion, validation, and transformation pipelines.
- Deep knowledge of distributed data processing and storage.
- Deep knowledge of working with structured, unstructured, and semi-structured data.
- Working experience with ETL/ELT patterns.
- Extensive experience applying analytics, insights, and data mining to commercial real-world problems.
- Technical experience in at least one programming language, preferably Java, .NET, or Python.

Essential Qualification:
- BE/BTech in Computer Science, Engineering, or a relevant field.

Technical Architect (BIBA)

Hexaware Technologies

Pune, Maharashtra, India

+2 more

Posted: a year ago

Description

JD:
- 2-15 years of experience in various areas of data engineering
- 7+ years of experience as a data architect
- 4+ years of experience in data modelling

Key responsibilities:
- Support the development of enterprise data architecture by contributing to the models and standards by which data is sourced, stored, and distributed.
- Support data architectures that handle structured and unstructured data with schema-free design.
- Enhance/modify existing data pipelines to incorporate new requirements.

Skills and attributes for success:
- Solution architecture and design: capability to help clients design solution architectures and requirements that feed into package definition, solution build, etc., including technical system architecture and logical data models. Architectures cover conceptual data, logical data, physical data, and security. Capability to define the models and standards by which data is sourced, stored, and distributed within an organization.
- Create data architectures and design data models that include structured and unstructured data.
- Ability to design schema-free databases and data models.
- Has implemented at least one end-to-end data warehouse project.
- Implement and architect data load processes, including initial and incremental loads.
- Experience with NoSQL databases such as MongoDB and Neo4j.
- Experience in performance-tuning queries and modifying/enhancing existing data models.
- Experience in the design and implementation of ETL pipelines.
- Knowledge of ingesting large datasets from on-premises systems to Azure.

Tools & Technologies:
- Cloud: Azure
- Data access tools: Azure Data Explorer, Azure Stream Analytics, MS SQL Server Management Studio, SQL Developer, TOAD
- Data modelling tools: Erwin, Excel, PowerDesigner, Sparx, Visio
- Distributed systems: Azure ADLS Gen
- Graph databases: Neo4j or a similar database
- NoSQL databases