AWS Data Engineer

Join HARMAN as an AWS Data Engineer leveraging Python and Spark for data pipelines on AWS and Databricks. Work on AWS essentials like S3, IAM, EC2, Lambda, Glue, and streaming pipelines using Kinesis. Develop ETL processes using Python, PySpark, and DBT.

A Career at HARMAN

As a technology leader that is rapidly on the move, HARMAN is filled with people who are focused on making life better. Innovation, inclusivity and teamwork are a part of our DNA. When you add that to the challenges we take on and solve together, you'll discover that at HARMAN you can grow, make a difference and be proud of the work you do every day.

Mandatory Skills: Python, Spark, Azure Databricks, Azure Delta Lake, Azure Data Factory

Previous hands-on professional experience in developing and maintaining data pipelines on AWS and Databricks

Coding proficiency with Python

Good working experience with AWS essentials such as S3, IAM, EC2, and Lambda

Engineering and Orchestrating Batch Data Pipelines using AWS Glue Workflows

Knowledge of streaming pipelines using AWS Kinesis (see the streaming sketch after this list)

Tech stack: AWS, Databricks, Glue, Python, PySpark, DBT for developing ETL (a minimal batch ETL sketch follows this list)

Hands-on skills with SQL

Strong understanding of data, systems, and end-to-end data processes within the functional area.

Ability to work with agile delivery methodologies.

Good communication skills.
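To give a concrete sense of the batch ETL work described above, here is a minimal illustrative PySpark sketch: read raw files landed in S3, apply a simple transformation, and write a Delta table. The bucket names, paths, and column names are hypothetical placeholders, and on Databricks the Spark session and Delta Lake support are already provided by the runtime.

```python
# Minimal illustrative batch ETL sketch; buckets, paths, and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

# On Databricks the `spark` session already exists; getOrCreate() simply reuses it.
spark = SparkSession.builder.appName("orders_batch_etl").getOrCreate()

# Read raw CSV files landed in S3.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-raw-bucket/orders/2024-01-01/")
)

# Basic cleansing: de-duplicate, type the timestamp, derive a partition column.
cleaned = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("amount") > 0)
)

# Write a Delta table partitioned by date (Delta Lake ships with Databricks).
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("s3://example-curated-bucket/delta/orders/")
)
```

In practice a job like this would typically be scheduled as a step in an AWS Glue workflow or a Databricks job, with DBT handling downstream SQL models; those pieces are omitted from the sketch.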
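For the streaming requirement, this second sketch shows one way a Kinesis-to-Delta pipeline can look with Spark Structured Streaming. It assumes the Kinesis source connector bundled with the Databricks runtime; the stream name, region, event schema, and S3 paths are again hypothetical.

```python
# Minimal illustrative streaming sketch; stream name, schema, and paths are placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("orders_stream").getOrCreate()

# Assumed shape of the incoming JSON events.
event_schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read from Kinesis using the connector assumed to ship with the Databricks runtime.
events = (
    spark.readStream
    .format("kinesis")
    .option("streamName", "example-orders-stream")
    .option("region", "ap-south-1")
    .option("initialPosition", "latest")
    .load()
)

# Kinesis records arrive with a binary `data` column; decode it and parse the JSON.
parsed = (
    events.select(F.col("data").cast("string").alias("json"))
    .select(F.from_json("json", event_schema).alias("event"))
    .select("event.*")
)

# Append the parsed events to a Delta table, with checkpointing for fault tolerance.
query = (
    parsed.writeStream
    .format("delta")
    .option("checkpointLocation", "s3://example-curated-bucket/checkpoints/orders/")
    .outputMode("append")
    .start("s3://example-curated-bucket/delta/orders_stream/")
)
```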

Company: HARMAN International
Job Posted: a year ago
Job Type: Full-time
Work Mode: On-site
Experience Level: 3-7 Years
Category: Engineering
Location: Bengaluru, Karnataka, India
Qualification: Bachelor's degree

