Software Development Engineer I - Data


About the Role

We are looking for an experienced Software Development Engineer I (Data) who will create prototypes and proofs-of-concept for iterative development in Java. In this role, you will also be responsible for fluently converting designs into code.

The cherry on top? You’ll be part of a team that will help you upskill and grow in your career. Safe to say, an exciting and rewarding journey awaits you in this role.

What you will do
  • Convert designs into code seamlessly
  • Develop proofs-of-concept and prototypes for iterative development in Java and Python
  • Work hands-on with Spark, using Scala or PySpark (see the sketch after this list)
  • Work hands-on with cloud platforms
  • Apply good working knowledge of data-lake architectures
  • Develop long-term strategies for growing junior engineers in ways that benefit the organization
  • Resolve bugs on time and make changes based on feedback
  • Stay up to date with emerging technologies and identify the right opportunities to adopt them
  • Ensure content quality and consistency of the brand
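
The Spark and data-lake items above are broad, so here is a minimal PySpark sketch of the kind of hands-on work they imply. It is an illustration only: the application name, data-lake paths, and column names are assumptions invented for this example, not details from the role.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical Spark session for a small proof-of-concept job.
spark = SparkSession.builder.appName("order-metrics-poc").getOrCreate()

# Read raw order events from an assumed data-lake location.
orders = spark.read.parquet("s3a://example-datalake/raw/orders/")

# Simple transformation: daily order counts and revenue per category.
daily_metrics = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "category")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("order_amount").alias("revenue"),
    )
)

# Write the curated output back to the lake, partitioned by date.
daily_metrics.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-datalake/curated/daily_order_metrics/"
)
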
What you will need
  • B.Tech, preferably from a premier institution
  • Excellent coding skills; should be able to convert designs into code fluently
  • Ability to create prototypes and proofs-of-concept for iterative development in Java / Python
  • Good understanding of data structures and algorithms and their space and time complexities
  • Strong hands-on and practical working experience with Java / Python
  • Strong problem-solving skills

Company

Meesho

Job Posted

5 months ago

Job Type

Full-time

Work Mode

On-site

Experience Level

0-2 Years

Category

Data & Analytics

Locations

Bengaluru, Karnataka, India

Qualification

Bachelor

Applicants

89 applicants

Related Jobs


Software Development Engineer II - Data Intelligence

Meesho

Bengaluru, Karnataka, India

Posted: a year ago

Are you passionate about turning data into impactful insights? Join us as a Software Development Engineer II - Data. You will oversee the team's work, direct programming activities, and design new programs to keep systems running smoothly. You will also curate data, plan strategies, and collaborate with other teams for holistic growth.


Data Engineer, I

Zebra Technologies

Bengaluru, Karnataka, India

Posted: 23 days ago

Job Description

Remote Work: Hybrid

Overview: At Zebra, we are a community of innovators who come together to create new ways of working that make everyday life better. United by curiosity and care, we develop dynamic solutions that anticipate our customers' and partners' needs and solve their challenges. Being a part of Zebra Nation means being seen, heard, valued, and respected. Drawing from our diverse perspectives, we collaborate to deliver on our purpose. Here you are part of a team pushing boundaries to redefine the work of tomorrow for organizations, their employees, and those they serve. You have opportunities to learn and lead at a forward-thinking company, defining your path to a fulfilling career while channeling your skills toward causes that you care about, locally and globally. We've only begun reimagining the future for our people, our customers, and the world. Let's create tomorrow together.

A Data Engineer will be responsible for understanding the client's technical requirements and for designing and building data pipelines to support those requirements. In this role, the Data Engineer, besides developing the solution, will also oversee other engineers' development. This role requires strong verbal and written communication skills and the ability to communicate effectively with the client and the internal team. A strong understanding of databases, SQL, cloud technologies, and modern data integration and orchestration tools such as Azure Data Factory (ADF), Informatica, and Airflow is required to succeed in this role.

Responsibilities:
  • Play a critical role in the design and implementation of data platforms for the AI products.
  • Develop productized and parameterized data pipelines that feed AI products, leveraging GPUs and CPUs.
  • Develop efficient data transformation code in Spark (in Python and Scala) and Dask.
  • Build workflows to automate data pipelines using Python and Argo.
  • Develop data validation tests to assess the quality of the input data (see the sketch after this listing).
  • Conduct performance testing and profiling of the code using a variety of tools and techniques.
  • Build data pipeline frameworks to automate high-volume and real-time data delivery for our data hub.
  • Operationalize scalable data pipelines to support data science and advanced analytics.
  • Optimize customer data science workloads and manage cloud service costs and utilization.

Qualifications:
  • Minimum Education:
    o Bachelor's, Master's, or Ph.D. degree in Computer Science or Engineering.
  • Minimum Work Experience (years):
    o 1+ years of experience programming with at least one of the following languages: Python, Scala, Go.
    o 1+ years of experience in SQL and data transformation.
    o 1+ years of experience developing distributed systems using open-source technologies such as Spark and Dask.
    o 1+ years of experience with relational or NoSQL databases running in Linux environments (MySQL, MariaDB, PostgreSQL, MongoDB, Redis).
  • Key Skills and Competencies:
    o Experience working with AWS / Azure / GCP environments is highly desired.
    o Experience with data models in the Retail and Consumer Products industry is desired.
    o Experience working on agile projects and an understanding of agile concepts is desired.
    o Demonstrated ability to learn new technologies quickly and independently.
    o Excellent verbal and written communication skills, especially in technical communications.
    o Ability to work toward and achieve stretch goals in a very innovative and fast-paced environment.
    o Ability to work collaboratively in a diverse team environment.
    o Ability to telework.
    o Expected travel: not expected.
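
As a hedged illustration of the "data validation tests" responsibility in the Zebra listing above, the sketch below shows one way such a check might look in PySpark. The input path, column names, and the 1% tolerance threshold are assumptions made up for this example, not details from the listing.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("input-validation").getOrCreate()

# Hypothetical input dataset; the path and schema are assumptions for illustration.
df = spark.read.parquet("/data/input/customer_events/")
total = df.count()

# Rule 1: the primary key must never be null.
null_keys = df.filter(F.col("customer_id").isNull()).count()

# Rule 2: event timestamps must not lie in the future.
future_rows = df.filter(F.col("event_ts") > F.current_timestamp()).count()

failures = []
if null_keys > 0:
    failures.append(f"{null_keys} rows with null customer_id")
if total > 0 and future_rows / total > 0.01:  # tolerate up to 1% apparent clock skew
    failures.append(f"{future_rows} rows with future event_ts")

# Fail the pipeline run loudly if any rule is violated.
if failures:
    raise ValueError("Input validation failed: " + "; ".join(failures))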


Associate Data Engineer I

Swiss Re

Bengaluru, Karnataka, India

Posted: 12 days ago

About the role:

The Performance Data Management (PDM) team is at the core of the Performance Management Hub (PMH) and focuses on providing data assets for analyzing and delivering insights on the financial performance of the Swiss Re Group. The core objective is enabling performance analytics by sourcing various internal and external data sets, maintaining a data environment, and gradually building out the scope of the analytics performed.

The core activities of the Data Engineer in the PDM team include:
  • Building a data lake leveraging Palantir and BI tools by performing data modelling and applying transformations to shape the data for reporting and analysis.
  • Building executive dashboards and creating visually appealing, storytelling reports for senior management.
  • Contributing to ad-hoc projects and POCs, and supporting other team members as required.
  • Performing change management activities as necessary.

About you:
  • Passionate about data engineering and programming, with 0-1 year of relevant experience.
  • Basic understanding of PySpark and SQL.
  • Able to convert a business problem into a technical implementation.
  • Willingness to upskill on advanced scripting skills, viz. PySpark, Python, JavaScript.
  • Knowledge of the reinsurance industry and finance domain is a plus.
  • Utilize and apply best practices on projects based on experience and in consultation with experts, appropriately tailored for the client and their culture.
  • University degree (or equivalent) in a quantitative field (e.g. Mathematics, Statistics, Computer Science Engineering, Information Technology, or Electronics & Communications).

Specific soft skills:
  • Excellent command of spoken and written English and the ability to present to senior management.
  • Standout colleague with the ability to build proactive, collaborative working relationships with peers and key partners based on respect and teamwork.
  • Inquisitive, proactive, and willing to learn new technologies and understand the insurance business and its economics.
  • Process and delivery mindset, aspiring for methodological and operational improvements.
  • Ability to drive the reporting aspects of multiple ad-hoc projects and manage partner expectations well.