
Senior Databricks Developer



Job description 

The Senior Databricks Developer will be responsible for implementing and maintaining solutions on the AWS Databricks platform. You will coordinate data requests from the various teams and review and approve efficient approaches to ingest, extract, transform, and maintain data in a multi-hop model. In addition, you'll work with team members to mentor other developers and grow their knowledge and expertise. You'll be working in a fast-paced, high-volume processing environment where quality and attention to detail are vital.
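
For context, the "multi-hop model" on Databricks is commonly described as the medallion pattern: a bronze layer of raw data, a silver layer of cleansed data, and a gold layer of aggregates. The sketch below illustrates that flow in PySpark with Delta tables; the table names, landing path, and columns are hypothetical examples, not details from this posting.

    # Minimal multi-hop (medallion) sketch; paths, tables, and columns are placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Bronze: land the raw feed as-is in a Delta table.
    raw = spark.read.json("/mnt/landing/events/")
    raw.write.format("delta").mode("append").saveAsTable("bronze_events")

    # Silver: deduplicate and conform types.
    bronze = spark.read.table("bronze_events")
    silver = (bronze
              .dropDuplicates(["event_id"])
              .withColumn("event_ts", F.to_timestamp("event_ts")))
    silver.write.format("delta").mode("overwrite").saveAsTable("silver_events")

    # Gold: aggregate for reporting.
    gold = (silver
            .groupBy(F.to_date("event_ts").alias("event_date"))
            .agg(F.count("*").alias("event_count")))
    gold.write.format("delta").mode("overwrite").saveAsTable("gold_daily_event_counts")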
 
PRIMARY RESPONSIBILITIES
•    Design and develop high-performance, secure Databricks solutions using Python, Spark, PySpark, Delta tables, UDP, and Kafka
•    Create high-quality technical documents, including data mappings, data process descriptions, and operational support guides
•    Translate business requirements into data model designs and technical solutions
•    Develop data ingestion pipelines using Python, Spark, and PySpark to support near real-time and batch ingestion processes (see the sketch after this list)
•    Maintain data lake and pipeline processes, including troubleshooting issues, tuning performance, and improving data quality
•    Work closely with technical leaders, product managers, and the reporting team to gather functional and system requirements
•    Perform effectively in a fast-paced, agile development environment
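
As referenced in the ingestion bullet above, a near real-time ingest on Databricks is typically built with Spark Structured Streaming reading from Kafka into a bronze Delta table. The sketch below is a minimal example under that assumption; the broker address, topic, checkpoint path, and table name are placeholders, and it assumes the Kafka connector available on Databricks clusters.

    # Minimal near real-time Kafka-to-Delta bronze ingest; options are placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Read the Kafka topic as a stream.
    stream = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker:9092")
              .option("subscribe", "events")
              .option("startingOffsets", "latest")
              .load())

    # Kafka delivers key/value as bytes; keep the raw payload plus ingest metadata in bronze.
    bronze = stream.select(
        F.col("key").cast("string"),
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp").alias("ingest_ts"))

    # Append to a bronze Delta table with a micro-batch trigger.
    query = (bronze.writeStream
             .format("delta")
             .option("checkpointLocation", "/mnt/checkpoints/bronze_events")
             .outputMode("append")
             .trigger(processingTime="30 seconds")
             .toTable("bronze_events"))
    query.awaitTermination()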
 
KNOWLEDGE AND SKILL REQUIREMENTS
•    Bachelor’s degree in Computer Science, Information Systems, or an equivalent degree
•    Must have 8+ years of experience developing applications using Python, Spark, PySpark, Java, JUnit, Maven, and the surrounding ecosystem
•    Must have 4+ years of hands-on experience with AWS Databricks and related technologies such as MapReduce, Spark, Hive, Parquet, and Avro
•    Solid experience with end-to-end implementation of DW/BI projects, especially data warehouse and data mart development
•    Extensive hands-on experience with RDD, DataFrame, and Dataset operations in Spark 3.x
•    Experience designing and implementing ETL/ELT frameworks for complex warehouses and marts
•    Knowledge of large data sets and experience with performance tuning and troubleshooting
•    AWS cloud analytics experience with Lambda, Athena, S3, EMR, Redshift, and Redshift Spectrum is a plus
•    Must have RDBMS experience: Microsoft SQL Server, Oracle, MySQL
•    Familiarity with Linux OS
•    Understanding of Data architecture, replication, and administration
•    Experience with real-time data ingestion using any streaming tool
•    Strong debugging skills to troubleshoot production issues
•    Comfortable working in a team environment
•    Hands on experience with Shell Scripting, Java, and SQL
•    Ability to identify problems and effectively communicate solutions to peers and management


Company: Labcorp
Job Posted: a year ago
Job Type: Full-time
Work Mode: On-site
Experience Level: 8-12 Years
Category: Software Engineering
Location: Bengaluru, Karnataka, India
Qualification: Bachelor's or Master's degree

