Big Data Engineer


Smart Summary
Join Hexaware Technologies as a Big Data Engineer working on Azure ADF, Databricks, and PySpark. The role requires hands-on experience in data transformations, DWH concepts, and writing complex SQL queries, plus knowledge of Azure services such as Azure SQL, Azure Blob Storage, and Azure Logic Apps. Coordination with business stakeholders and the ability to work independently on DevOps and Agile projects are crucial. Full-time, on-site opportunity in Bengaluru, Karnataka, India.

Job description 

Primary Skills: Azure ADF, Databricks with PySpark, Python

  • Data Engineer with a PySpark and Databricks skill set; 3+ years of data engineering experience, including a minimum of 2 years of hands-on experience using PySpark for data transformations (an illustrative sketch follows this job description).
  • Sound knowledge of Databricks and Logic Apps.
  • Experience with Delta Lake, Azure SQL, Azure Blob Storage, Azure Logic Apps, Azure Functions, Azure Synapse, and Azure Purview.
  • Extensive knowledge of big data concepts such as Hive and the Spark framework.
  • Ability to write complex SQL queries and a sound understanding of DWH concepts.
  • Strong hands-on experience with Python or Scala.
  • Hands-on experience with Azure Data Factory.
  • Able to coordinate independently with business stakeholders and understand the business requirements.
  • Knowledge of DevOps- and Agile-methodology-based projects; able to implement requirements using ADF/Databricks.
  • Knowledge of version control tools such as Git/Bitbucket.

Should have a basic understanding of Batch Account configuration and the various control and monitoring options.
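For illustration only, here is a minimal sketch of the kind of PySpark work described above: reading raw files from Azure Blob Storage, applying simple transformations, and writing the result to Delta Lake. All paths, column names, and settings are hypothetical, and the snippet assumes a Databricks-style environment with the source container mounted and the Delta format available.

```python
# Illustrative sketch only; paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("illustrative-transform").getOrCreate()

# Hypothetical source: a Blob Storage container mounted at /mnt/raw.
raw = (
    spark.read
    .option("header", True)
    .csv("/mnt/raw/sales/2024/*.csv")
)

# Typical transformation steps: type casting, a derived date column,
# de-duplication, and basic filtering before loading the curated layer.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
)

# Write the curated result to Delta Lake (hypothetical target path).
(
    clean.write
    .format("delta")
    .mode("overwrite")
    .save("/mnt/curated/sales_clean")
)
```

In an ADF/Databricks setup of the kind the posting describes, logic like this would typically live in a notebook or job triggered by an Azure Data Factory pipeline.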
 


Company: Hexaware Technologies
Job Posted: 5 months ago
Job Type: Full-time
Work Mode: On-site
Experience Level: 3-7 Years
Category: Software Engineering
Locations: Bengaluru, Karnataka, India
Qualification: Bachelor's or Master's degree


Related Jobs


Big Data Engineer

Adobe

Bengaluru, Karnataka, India

Posted: 6 months ago

You will work as a Big Data Engineer in Adobe Advertising Cloud to develop predictive models and optimization algorithms for the RTB platform, involving large-scale datasets and cutting-edge research. This on-site full-time role in Bengaluru, Karnataka, India, requires 5-10 years of experience with expertise in machine learning, NLP, DNN frameworks, and collaboration with cross-functional teams.


Big Data Developer

Infosys

Bengaluru, Karnataka, India

Posted: a year ago

Responsibilities

A day in the life of an Infoscion:
  • As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction.
  • You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain.
  • You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews.
  • You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes.
  • You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Educational Requirements: Bachelor of Engineering

Service Line: Data & Analytics Unit

Additional Responsibilities:
  • Knowledge of more than one technology
  • Basics of Architecture and Design fundamentals
  • Knowledge of Testing tools
  • Knowledge of agile methodologies
  • Understanding of Project life cycle activities on development and maintenance projects
  • Understanding of one or more Estimation methodologies; knowledge of Quality processes
  • Basics of the business domain to understand the business requirements
  • Analytical abilities, strong technical skills, good communication skills
  • Good understanding of the technology and domain
  • Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
  • Awareness of latest technologies and trends
  • Excellent problem solving, analytical and debugging skills

Technical and Professional Requirements:
Primary skills: Bigdata, Bigdata->Hadoop, Bigdata->Hive, Bigdata->Pyspark, Bigdata->Python, Bigdata->Scala, Bigdata->Spark, Opensource->Apache Kafka

Preferred Skills: Bigdata->Hive, Opensource->Apache Kafka, Bigdata->Hadoop, Bigdata->Pyspark, Bigdata->Scala, Bigdata->Spark, Bigdata