ETL Developer (Azure, Databricks)

Summary
Join us as an ETL Developer (Azure, Databricks) at Amdocs in Pune, Maharashtra, India. You will design, develop, modify, debug, and maintain software systems in a full-time, on-site role, working within a team and collaborating on peer code reviews. You will also provide technical support, propose technical solutions, and actively pursue innovation and continuous improvement. The role requires expertise in ETL tools, Oracle/SQL, Spark, Azure Data Factory, ADLS, Snowflake, Azure Repos, GitHub, and Unix shell scripting, along with good problem-solving and communication skills.

Job description 
In one sentence

Responsible for the design, development, modification, debugging, and maintenance of software systems.

What will your job look like?

•    You will design, develop, modify, debug, and/or maintain software code according to functional, non-functional, and technical design specifications.
•    You will follow Amdocs software engineering standards, the applicable software development methodology, and release processes to ensure code is maintainable, scalable, and supportable, and you will demo software products to stakeholders.
•    You will investigate issues by reviewing and debugging code, provide fixes and workarounds, and review changes for operability to maintain existing software solutions.
•    You will work within a team, adding value through participation in peer code reviews, providing comments and suggestions, and working with cross-functional teams to achieve goals.
•    You will assume technical accountability for your specific work products within an application and provide technical support during solution design for new requirements.
•    You will be encouraged to actively pursue innovation, continuous improvement, and efficiency in all assigned tasks.
 

All you need is...

Key responsibilities:
•    Perform development and support activities in the data warehousing domain using ETL tools and technologies.
•    Understand the high-level design and application interface design, and build the low-level design. Perform application analysis and propose technical solutions for application enhancements or production issues.
•    Perform development and deployment: code, unit test, and deploy (a minimal sketch of this loop follows the list).
•    Create the necessary documentation for all project deliverable phases.
•    Handle production issues (Tier 2 support, weekend on-call rotation) and ensure SLAs are met.
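
A minimal sketch of that code / unit test / deploy loop, assuming PySpark on Databricks; the table, column, and path names (raw_orders, order_ts, /landing, /curated) are hypothetical examples, not taken from this posting:

    # Hypothetical PySpark ETL job: read landed files, transform, publish curated output.
    from pyspark.sql import DataFrame, SparkSession
    from pyspark.sql import functions as F

    def transform_orders(raw: DataFrame) -> DataFrame:
        # Pure transformation kept separate from I/O so it is easy to unit test.
        return (
            raw.filter(F.col("status") == "COMPLETE")
               .withColumn("order_date", F.to_date("order_ts"))
               .groupBy("order_date")
               .agg(F.sum("amount").alias("daily_amount"))
        )

    if __name__ == "__main__":
        spark = SparkSession.builder.appName("orders-etl").getOrCreate()
        raw = spark.read.option("header", "true").csv("/landing/raw_orders")  # hypothetical path
        transform_orders(raw).write.mode("overwrite").parquet("/curated/daily_orders")
        spark.stop()

A unit test would call transform_orders on a small in-memory DataFrame and assert on the aggregated rows, with no storage dependency; deployment is then a matter of scheduling the script as a Databricks job.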
Technical Skills: 
Mandatory
•    Expert knowledge of Oracle/SQL: the ability to write complex SQL/PL-SQL and performance-tune it.
•    1+ years of hands-on experience with Spark and Databricks on Azure cloud platform services to build data pipelines (a sketch combining several of these skills follows this list).
•    A strong grounding in Azure Data Factory, with experience creating complex pipelines.
•    1+ years of hands-on experience with data ingestion into ADLS.
•    1+ years of hands-on experience developing against, performance tuning, and loading into Snowflake.
•    Experience working with Azure Repos or GitHub.
•    1+ years of hands-on experience with Azure DevOps, GitHub, or another DevOps tool.
•    Working experience with Azure Databricks/PySpark.
•    Hands-on Unix and advanced Unix shell scripting.
•    Openness to working in shifts.
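
Several of these mandatory skills (Spark on Azure, ADLS ingestion, Snowflake loading) typically meet in a single Databricks job. A minimal sketch, assuming a Databricks runtime with its built-in Snowflake connector; every account, container, table, and secret-scope name below is a hypothetical placeholder:

    # Hypothetical Databricks job: ingest from ADLS Gen2, aggregate, load into Snowflake.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

    # Read raw files landed in ADLS Gen2 (assumes storage credentials are configured).
    raw = spark.read.parquet("abfss://landing@examplelake.dfs.core.windows.net/orders/")

    daily = (
        raw.withColumn("order_date", F.to_date("order_ts"))
           .groupBy("order_date")
           .agg(F.count("*").alias("order_count"), F.sum("amount").alias("daily_amount"))
    )

    # Snowflake connection options; dbutils.secrets is available in Databricks
    # notebooks, and the scope/key names here are invented for illustration.
    sf_options = {
        "sfUrl": "example_account.snowflakecomputing.com",
        "sfUser": dbutils.secrets.get(scope="etl", key="sf-user"),
        "sfPassword": dbutils.secrets.get(scope="etl", key="sf-pass"),
        "sfDatabase": "ANALYTICS",
        "sfSchema": "PUBLIC",
        "sfWarehouse": "ETL_WH",
    }

    (daily.write.format("snowflake")
          .options(**sf_options)
          .option("dbtable", "DAILY_ORDERS")
          .mode("overwrite")
          .save())

In Azure Data Factory, a job like this would usually sit behind a Databricks activity in a pipeline, with ADF handling orchestration, retries, and scheduling.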
Good to have
•    Willingness to learn other data warehousing and ETL technologies (Oracle, Qlik Replicate, GoldenGate, Hadoop) and to work outside your comfort zone; hands-on experience with them is a plus.
•    Knowledge of job schedulers.
Behavioral skills:
•    Eagerness and hunger to learn.
•    Good problem-solving and decision-making skills.
•    Good communication skills within the team, across sites, and with the customer.
•    Ability to stretch working hours when necessary to support business needs.
•    Ability to work independently and drive issues to closure.
•    Willingness to consult relevant parties when necessary and to raise risks in a timely manner.
•    Ability to handle multiple, complex work assignments effectively while consistently delivering high-quality work.

Company

Amdocs

Job Posted

a year ago

Job Type

Full-time

Work Mode

On-site

Experience Level

0-2 Years

Category

IT Services and IT Consulting

Locations

Pune, Maharashtra, India

Qualification

Bachelor's or Master's degree

Related Jobs

ETL Development Specialist (Azure Data Factory)

Amdocs

Pune, Maharashtra, India

Posted: 7 months ago

Responsible for designing, developing, and maintaining software systems within a cloud-native data warehouse architecture using Azure Data Factory and other ETL tools. The role involves handling production issues, providing technical solutions, and ensuring SLAs are met. This is a full-time, on-site opportunity in Pune, Maharashtra, India.

Databricks Architect

Jade Global

Pune, Maharashtra, India

Posted: a year ago

Databricks Architect
•    Overall experience: 10+ years with the Databricks product suite and 10+ years in data preparation, including BI projects, understanding business requirements in a BI context and the data model needed to transform raw data into meaningful insights.
•    Design and create data models (conceptual, logical, and physical) that define the structure and relationships of data elements within the organization, helping ensure data accuracy, consistency, and integrity.
•    Design data integration solutions that allow different systems and applications to share and exchange data seamlessly, including selecting appropriate integration technologies, developing ETL (Extract, Transform, Load) processes, and ensuring data quality during integration.
•    Good knowledge of cloud platforms such as AWS, Azure, or GCP, and a strong understanding of the Databricks platform.
•    Responsible for developing database solutions, installing and configuring information systems to ensure functionality, analyzing structural requirements for new software, and migrating data from legacy systems to new solutions.
•    Design conceptual and logical data models; optimize database systems; define security procedures; collaborate with the Data Science department to identify future needs and requirements.
•    Knowledge of data analysis tools such as Power BI or Tableau is an added advantage.
•    Ability to write complex SQL queries; able to trace issues back, analyze them deeply, and provide root cause analysis (RCA).
•    Good testing and documentation skills.
•    Ability to thrive in a fast-paced, dynamic, client-facing role where delivering solid work products that exceed high expectations is the measure of success.
•    Excellent leadership and interpersonal skills; eager to contribute in a team-oriented environment.
•    Strong prioritization and multitasking skills with a track record of meeting deadlines.
•    Creative and analytical in a problem-solving environment, with effective verbal and written communication skills.
•    Adaptable to new environments, people, technologies, and processes; able to manage ambiguity and solve undefined problems.