

Big Data developer


Smart Summary (powered by Roshi)
Design, develop, modify, debug, and maintain software systems according to specifications. Investigate issues; provide fixes and workarounds. Collaborate with cross-functional teams. Experience implementing high-end software products and knowledge of Kubernetes and deployment methodologies. Sound problem-solving skills and the ability to work with minimal supervision. Basic knowledge of database principles and Unix commands.

Job description 

Responsible for the design, development, modification, debugging, and/or maintenance of software systems

What will your job look like?

•    You will design, develop, modify, debug and/or maintain software code according to functional, non-functional and technical design specifications.
•    You will follow Amdocs software engineering standards, the applicable software development methodology, and release processes to ensure code is maintainable, scalable, and supportable, and you will demo software products to stakeholders.
•    You will investigate issues by reviewing/debugging code, provide fixes and workarounds, and review changes for operability to maintain existing software solutions.
•    You will work within a team, adding value through participation in peer code reviews, providing comments and suggestions, and collaborating with cross-functional teams to achieve goals.
•    You will assume technical accountability for your specific work products within an application and provide technical support during solution design for new requirements.
•    You will be encouraged to actively look for innovation, continuous improvement, and efficiency in all assigned tasks.
 

All you need is...

Experience with:
•    2+ or 4+ years of experience implementing high-end software products.
•    Sound knowledge of Kubernetes and deployment methodologies.
•    Big Data (HDFS/Hive/HBase/Kafka); either Spark or PySpark is mandatory. Strong problem-solving skills and the ability to thrive with minimal supervision.
•    Knowledge of database principles and SQL, and experience working with large databases.
•    Basic Unix commands.
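As a purely illustrative sketch of the "database principles and SQL" requirement, the following uses Python's built-in sqlite3 module; the events table, its columns, and the sample rows are invented for the example:

```python
import sqlite3

# In-memory database; the events table and rows are hypothetical sample data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event_type TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "click"), (1, "view"), (2, "click"), (2, "click"), (3, "view")],
)

# A basic aggregation: count events per type, largest count first.
rows = conn.execute(
    "SELECT event_type, COUNT(*) AS n FROM events "
    "GROUP BY event_type ORDER BY n DESC"
).fetchall()
print(rows)  # [('click', 3), ('view', 2)]
conn.close()
```

The same GROUP BY / ORDER BY pattern carries over directly to Hive or any other SQL engine used with large tables.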

Key responsibilities:
•    Perform development and support activities for the data warehousing domain using Big Data technologies
•    Understand high-level design and application interface design, and build low-level design; perform application analysis and propose technical solutions for application enhancements or production issues
•    Perform development and deployment: should be able to code, unit test, and deploy
•    Create the necessary documentation for all project deliverable phases
•    Handle production issues (Tier 2 support, weekend on-call rotation) to resolve them and ensure SLAs are met

Technical Skills: 
Mandatory
•    Either Spark or PySpark is mandatory.
•    Should have a good programming background, with expertise in Scala, Java, or Python.
•    Should have worked on the Kafka-Spark streaming framework.
•    Experience with Big Data technologies such as Hadoop and its related ecosystem (Cloudera, Hortonworks).
•    Experience with the complete SDLC and exposure to build and release management.
•    Experience with, interest in, and adaptability to working in an Agile delivery environment.
•    Ability to select the right tool for the job.
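To give a feel for the Spark/PySpark requirement without assuming a Spark installation, here is a stdlib-Python stand-in for Spark's core transformation model on a word count; the data and function names are invented for illustration. In PySpark the equivalent RDD pipeline would be `sc.parallelize(lines).flatMap(str.split).map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b)`:

```python
from collections import defaultdict

def word_count(lines):
    """Stand-in for the flatMap -> map -> reduceByKey pipeline a Spark job runs."""
    counts = defaultdict(int)
    for line in lines:             # flatMap: split each line into words
        for word in line.split():
            counts[word] += 1      # map to (word, 1), then reduceByKey sums them
    return dict(counts)

result = word_count(["spark streaming demo", "spark demo"])
print(result)  # {'spark': 2, 'streaming': 1, 'demo': 2}
```

The point of the sketch is the shape of the computation: Spark distributes exactly this split-then-aggregate logic across partitions, which is why per-key aggregation (reduceByKey) rather than a global loop is the idiomatic form.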

Good to have
•    Cloud skills – should have knowledge of AWS, Azure, or GCP.
•    Should have worked on REST APIs using a Kafka cluster.
Behavioral skills:
•    Eagerness and hunger to learn
•    Good problem-solving and decision-making skills
•    Good communication skills within the team, across sites, and with the customer
•    Ability to extend working hours when necessary to support business needs
•    Ability to work independently and drive issues to closure
•    Consult with relevant parties when necessary and raise risks in a timely manner
•    Effectively handle multiple, complex work assignments while consistently delivering high-quality work


Company

Amdocs

Job Posted

a year ago

Job Type

Full-time

Work Mode

On-site

Experience Level

3-7 Years

Category

Software Engineering

Locations

Pune, Maharashtra, India

Qualification

Bachelor's or Master's degree


Related Jobs


Big Data Architect

Amdocs

Pune, Maharashtra, India

Posted: a year ago

As an Amdocs Software Architect, you will be responsible for analyzing and designing technical solutions for infrastructure and enterprise applications. You will work with software engineers and other architects to define and refine the product structure aligned with business needs. Additionally, you will research new technologies and propose improvements in processes and tools.


Big Data Developer

Infosys

Bengaluru, Karnataka, India

Posted: a year ago

Responsibilities
A day in the life of an Infoscion:
•    As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction.
•    You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain.
•    You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews.
•    You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes.
•    You will be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
Additional Responsibilities:
•    Knowledge of more than one technology
•    Basics of architecture and design fundamentals
•    Knowledge of testing tools
•    Knowledge of agile methodologies
•    Understanding of project life cycle activities on development and maintenance projects
•    Understanding of one or more estimation methodologies; knowledge of quality processes
•    Basics of the business domain, to understand the business requirements
•    Analytical abilities, strong technical skills, good communication skills
•    Good understanding of the technology and domain
•    Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
•    Awareness of the latest technologies and trends
•    Excellent problem-solving, analytical, and debugging skills
Technical and Professional Requirements:
Primary skills: Big Data (Hadoop, Hive, PySpark, Python, Scala, Spark); open source (Apache Kafka)
Preferred skills: Hive, Apache Kafka, Hadoop, PySpark, Scala, Spark