Big Data Engineer (Python, Kafka, Spark)

Smart Summary
Join NetApp India's R&D division as a Software Engineer responsible for designing, developing, and validating software for Big Data Engineering in cloud and on-premises environments. Work on the Active IQ DataHub platform using Kafka, Spark, and NoSQL databases to enable advanced AI and ML techniques for actionable intelligence. Collaborate with a technical team to contribute to code development, testing, and mentoring junior engineers.

Job description 

As a Software Engineer at NetApp India’s R&D division, you will be responsible for the design, development and validation of software for Big Data Engineering across both cloud and on-premises environments. You will be part of a highly skilled technical team named NetApp Active IQ. 
The Active IQ DataHub platform processes over 10 trillion data points per month, feeding a multi-petabyte data lake. The platform is built on Kafka, a serverless platform running on Kubernetes, Spark, and various NoSQL databases. It enables advanced AI and ML techniques to uncover opportunities to proactively protect and optimize NetApp storage, and then provides the insights and actions to make that happen. We call this “actionable intelligence.”
You will work closely with a team of senior software developers and a technical director, and will be responsible for contributing to the design, development, and testing of code. The software applications you build will be used by our internal product teams, partners, and customers.
We are looking for a hands-on lead engineer who is familiar with Spark and Scala, Java, and/or Python. Any cloud experience is a plus. You should be passionate about learning, be creative, and have the ability to work with and mentor junior engineers.
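To give a concrete flavor of the stack described above, here is a minimal sketch of a Kafka-to-Spark ingest pipeline in PySpark. The topic name, event schema, broker address, and data-lake paths are illustrative assumptions, not details of the actual Active IQ system.

```python
# Hypothetical Kafka -> Spark Structured Streaming ingest; all names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("telemetry-ingest").getOrCreate()

# Assumed shape of a telemetry event (illustrative only).
schema = StructType([
    StructField("system_id", StringType()),
    StructField("metric", StringType()),
    StructField("value", DoubleType()),
    StructField("ts", TimestampType()),
])

# Subscribe to a placeholder "telemetry" topic.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "kafka:9092")
       .option("subscribe", "telemetry")
       .load())

# Kafka delivers raw bytes; decode the value column and parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Land the parsed events in the data lake; the checkpoint is what gives the
# stream fault-tolerant, exactly-once semantics across restarts.
query = (events.writeStream
         .format("parquet")
         .option("path", "/datalake/telemetry")
         .option("checkpointLocation", "/checkpoints/telemetry")
         .start())
query.awaitTermination()
```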
 

Job requirements

Your Responsibilities
•    Design and build our Big Data Platform, with attention to scale, performance, and fault tolerance
•    Interact with Active IQ engineering teams across geographies to leverage expertise and contribute to the tech community
•    Identify the right tools to deliver product features by performing research, building POCs, and engaging with open-source forums
•    Build and deploy products both on-premises and in the cloud
•    Work on technologies related to NoSQL, SQL, and in-memory databases
•    Develop and implement best-in-class monitoring processes so that data applications meet their SLAs (see the monitoring sketch after this list)
•    Mentor junior engineers technically
•    Conduct code reviews to ensure code quality, consistency, and adherence to best practices
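One plausible way to approach the SLA-monitoring responsibility above is to track consumer lag on the pipeline's Kafka topics and alert when it exceeds a budget. The sketch below uses the kafka-python client; the topic, consumer group, and threshold are hypothetical placeholders.

```python
# Hypothetical SLA check: total consumer lag for a pipeline's consumer group.
from kafka import KafkaConsumer, TopicPartition

TOPIC = "telemetry"        # placeholder topic name
GROUP = "spark-ingest"     # placeholder consumer group
MAX_LAG = 100_000          # hypothetical SLA budget, in messages

consumer = KafkaConsumer(bootstrap_servers="kafka:9092",
                         group_id=GROUP,
                         enable_auto_commit=False)

partitions = [TopicPartition(TOPIC, p)
              for p in consumer.partitions_for_topic(TOPIC)]
end_offsets = consumer.end_offsets(partitions)  # latest offset per partition

total_lag = 0
for tp in partitions:
    committed = consumer.committed(tp) or 0     # last offset the pipeline committed
    total_lag += end_offsets[tp] - committed

# In production this number would be exported to a metrics system rather than printed.
print(f"lag={total_lag} messages, SLA {'BREACHED' if total_lag > MAX_LAG else 'OK'}")
```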

Our Ideal Candidate
•    You have a deep interest and passion for technology
•    You love to code; an ideal candidate has a GitHub repo that demonstrates coding proficiency
•    You have strong problem-solving and excellent communication skills
•    You are self-driven and motivated, with the desire to work in a fast-paced, results-driven agile environment with varied responsibilities

Education & Experience

•    5+ years of hands-on Big Data development experience
•    Demonstrated, up-to-date expertise in data engineering and complex data pipeline development
•    Design, develop, implement, and tune distributed data processing pipelines that handle large volumes of data, focusing on scalability, low latency, and fault tolerance in every system built
•    Awareness of data governance (data quality, metadata management, security, etc.)
•    Experience with one or more of Python, Java, or Scala
•    Proven working expertise with big data technologies: Hadoop, HDFS, Hive, Scala/Spark, and SQL (see the batch sketch after this list)
•    Knowledge of and experience with Kafka, Storm, Druid, Cassandra, or Presto is an added advantage
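To illustrate the Hadoop/Hive/Spark/SQL combination these bullets call for, here is a minimal batch rollup in PySpark (Python rather than Scala, for consistency with the earlier sketches). The Hive table, columns, and output path are invented for the example.

```python
# Hypothetical daily rollup over a Hive table; all names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("daily-rollup")
         .enableHiveSupport()   # lets Spark read managed Hive tables
         .getOrCreate())

# Aggregate raw telemetry into one row per system per day.
daily = (spark.table("telemetry_raw")
         .groupBy("system_id", F.to_date("ts").alias("day"))
         .agg(F.avg("value").alias("avg_value"),
              F.max("value").alias("peak_value")))

# Partition the output by day so downstream SQL engines can prune
# partitions instead of scanning the whole data lake.
(daily.repartition("day")
 .write.mode("overwrite")
 .partitionBy("day")
 .parquet("/datalake/rollups/daily"))
```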


Company: NetApp
Job Posted: 5 months ago
Job Type: Full-time
Work Mode: On-site
Experience Level: 3-7 Years
Category: Software Engineering
Locations: Bengaluru, Karnataka, India
Qualification: Bachelor or Master


Related Jobs

Spark and Kafka

Infosys

Bengaluru, Karnataka, India

Posted: 10 months ago

Responsibilities: A day in the life of an Infoscion
•    As part of the Infosys delivery team, your primary role would be to provide best-fit architectural solutions for one or more projects.
•    You would also provide technology consultation and assist in defining the scope and sizing of work.
•    You would implement solutions, create technology differentiation, and leverage partner technologies.
•    Additionally, you would participate in competency development with the objective of ensuring best-fit, high-quality technical solutions.
•    You would be a key contributor in creating thought leadership within your area of technology specialization, in compliance with Infosys guidelines, policies, and norms.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
Additional Responsibilities:
•    Knowledge of architectural design patterns, performance tuning, and database and functional designs
•    Hands-on experience in Service-Oriented Architecture
•    Ability to lead solution development and delivery for the designed solutions
•    Experience in designing high-level and low-level documents is a plus
•    A good understanding of the SDLC is a prerequisite
•    Awareness of the latest technologies and trends
•    Logical thinking and problem-solving skills, along with an ability to collaborate
Technical and Professional Requirements: Primary skills: Bigdata->Spark, Opensource->Apache Kafka
Preferred Skills: Bigdata->Spark, Opensource->Apache Kafka

Big Data Developer

Infosys

Bengaluru, Karnataka, India

Posted: a year ago

Responsibilities: A day in the life of an Infoscion
•    As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction.
•    You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain.
•    You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews.
•    You will lead and guide your teams toward developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes.
•    You would be a key contributor to building efficient programs and systems.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
Additional Responsibilities:
•    Knowledge of more than one technology
•    Basics of architecture and design fundamentals
•    Knowledge of testing tools
•    Knowledge of agile methodologies
•    Understanding of project life-cycle activities on development and maintenance projects
•    Understanding of one or more estimation methodologies; knowledge of quality processes
•    Basics of the business domain, to understand the business requirements
•    Analytical abilities, strong technical skills, and good communication skills
•    A good understanding of the technology and domain
•    Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
•    Awareness of the latest technologies and trends
•    Excellent problem-solving, analytical, and debugging skills
Technical and Professional Requirements: Primary skills: Bigdata, Bigdata->Hadoop, Bigdata->Hive, Bigdata->Pyspark, Bigdata->Python, Bigdata->Scala, Bigdata->Spark, Opensource->Apache Kafka
Preferred Skills: Bigdata->Hive, Opensource->Apache Kafka, Bigdata->Hadoop, Bigdata->Pyspark, Bigdata->Scala, Bigdata->Spark, Bigdata

Cloud/Python/Spark Developer Sr Software Engineer

Carelon

Bangalore Urban, Karnataka, India

Posted: a year ago

JOB DESCRIPTION
Job Title: Sr Snowpark/Pyspark Developer
Requirement Type: Fulltime
Job Location: Bangalore/Gurugram
Requirement Level: Software Engineer
Hiring Manager: Delivery Manager
Primary Skill: Snowflake, Python, Cloud
Business: EDA
Skill Category: General
Job Family Group: IFT - Information Systems & Technology
Job Family: IFT > Engineering/Dev

About Elevance: Elevance is a leading health company in America dedicated to improving lives and communities and making healthcare simpler. It is the largest managed health care company in the Blue Cross Blue Shield (BCBS) Association, serving more than 45 million lives across 14 states. A regular in the Fortune 500 list, Elevance ranked 20th in 2022. Gail Boudreaux, President and CEO of Elevance, has been a consistent name in the Fortune list of most powerful women and currently holds 4th rank on that list.

About Carelon: Carelon Global Solutions was founded in 2017 as a fully owned subsidiary of Elevance (previously Anthem Inc). At the center of Carelon is its philosophy of Think Limitless. This enables us to strive for operational excellence, design cutting-edge innovations and solutions, and deliver exceptional business value for our clients. Diversity is one of the cornerstone values at Carelon, and we are proud of harboring a rich and wholesome environment that embraces differences, is inclusive, values talent and creativity, and guards against any bias. Carelon received its ‘Great Place To Work’ certification in July 2021.

Our Mission & Values
Our Mission: Improving Lives and Communities. Simplifying Healthcare. Expecting More.
Our Values: Leadership | Community | Integrity | Agility | Diversity

Job Purpose: CGS India is seeking a highly skilled Senior Cloud Data Engineer (Tech Lead, TL) to join the team. The ideal candidate will have experience in designing, building, and maintaining cloud-based data solutions, and will be responsible for developing and implementing data pipelines, data storage solutions, and data processing frameworks. The candidate should have a strong understanding of cloud computing technologies and be able to work collaboratively with cross-functional teams.

Job Responsibility: The incumbent will be responsible for, but not limited to, the following key deliverables:
•    Design, build, and maintain cloud-based data solutions
•    Develop and implement data pipelines, data storage solutions, and data processing frameworks
•    Collaborate with cross-functional teams to ensure data solutions meet business requirements
•    Develop and maintain data models and data dictionaries
•    Ensure data quality and integrity
•    Monitor and optimize data solutions for performance and scalability
•    Develop and maintain documentation for data solutions
•    Stay up to date with emerging cloud computing technologies and data engineering best practices

Qualification: Bachelor's degree in Computer Science, Information Technology, or a related stream.
Experience: 5+ years of total IT experience; strong understanding of cloud computing technologies such as Snowflake and AWS.

Knowledge and Competencies:
•    Minimum 5 years of experience designing, implementing, and testing cloud computing solutions that migrate applications from Big Data to cloud platforms using Snowflake/Snowpark technology
•    Strong knowledge of Python, SQL, and Snowflake/Snowpark
•    A highly effective communicator, both orally and in writing
•    Problem-solving and architecting skills in cases of unclear requirements
•    Extensive experience with ETL and impact analysis
•    Ability to work independently and in a team environment
•    Experience with Agile development methodologies is a plus
•    A good understanding of CI/CD, DevOps, and Terraform will be an add-on

The Carelon Promise: Aligning with our brand belief that ‘limitless minds are our biggest asset’, we offer a world of limitless opportunities to our associates. It is our strong belief that one is committed to a role not just for what the role entails, but also for what lies in its periphery and completes the value circle for an associate. This world of limitless opportunities thrives in an environment that fosters growth and well-being, and gives you purpose and a feeling of belonging.

Life @ Carelon:
•    Extensive focus on learning and development
•    An inspiring culture built on innovation, creativity, and freedom
•    Holistic well-being
•    Comprehensive range of rewards and recognitions
•    Competitive health and medical insurance coverage
•    Best-in-class amenities and workspaces
•    Policies designed with associates at the center

Equal Opportunity Employer: Carelon is committed to a diverse and inclusive workplace and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender identity or expression, sexual orientation, national origin, genetics, pregnancy, disability, age, veteran status, or other characteristics.

Reasonable Accommodation: Our inclusive culture empowers Carelon to deliver the best results for our customers. We not only celebrate the diversity of our workforce, we also celebrate the diverse ways we work. If you have a disability and need accommodation such as an interpreter or a different interview format, please ask for the Reasonable Accommodation Request Form.

Disclaimer: Offered designation titles differ*