
Senior Data Engineer


Smart Summary
We are looking for a Data Engineer with at least 5 years of relevant work experience. You should have hands-on programming experience and implementation experience with technologies such as Kafka, Kinesis, Spark, AWS Glue, and AWS LakeFormation, as well as expertise in data governance and data security implementation. Experience with scheduling tools like Airflow and knowledge of Python are required; knowledge of Java or Scala is a plus. Familiarity with systems handling high transaction volumes and experience with efficient data processing using open table formats are preferred. Certification in Data Engineering and AWS is a bonus. Join our team as we develop innovative data processing solutions.
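
Since the summary pairs Spark-based processing with Airflow scheduling, the sketch below shows a minimal, purely illustrative Airflow DAG that triggers a daily Spark job. The DAG id, schedule, and script path are hypothetical placeholders, not details from the posting, and a recent Airflow 2.x installation is assumed.

```python
# Minimal Airflow sketch (hypothetical DAG id, schedule, and script path):
# a daily DAG that submits a PySpark batch job via spark-submit.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_events_to_iceberg",   # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # Airflow 2.4+ parameter
    catchup=False,
) as dag:
    run_spark_job = BashOperator(
        task_id="run_spark_job",
        # spark-submit invocation of a hypothetical PySpark script.
        bash_command="spark-submit /opt/jobs/events_to_iceberg.py",
    )
```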

Description

5+ years of relevant work experience showing growth as a Data Engineer.
Hands-on programming experience.
Implementation experience with Kafka, Kinesis, Spark, AWS Glue, and AWS LakeFormation.
Experience with performance optimization in batch and real-time processing applications.
Expertise in data governance and data security implementation.
Experience with scheduling tools such as Airflow.
Good hands-on design and programming skills, building reusable tools and products.
Experience developing on AWS or similar cloud platforms. Preferred: ECS, EKS, S3, EMR, DynamoDB, Aurora, Redshift, QuickSight, or similar.
Good knowledge of Python; Java or Scala is a plus.
Familiarity with systems handling very high transaction volumes, microservice design, or data processing pipelines (Spark).
Experience with efficient data processing using open table formats such as Delta and Iceberg (a minimal sketch of this pattern follows the list).
Knowledge of and hands-on experience with serverless technologies such as Lambda, MSK, MWAA, and Kinesis Analytics is a plus.
Expertise in practices such as Agile, peer reviews, and continuous integration.
Capable of planning and executing both short-term and long-term goals, individually and with the team.
Design and architecture experience (high-level and low-level design).
Experience implementing data catalog solutions, data observability frameworks, and audit/balance/control frameworks.
Certification in Data Engineering, AWS, etc. is a plus.
Experience implementing microservice APIs that process thousands of events per second.
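
As a rough illustration of the Spark-with-open-table-formats item above, here is a minimal PySpark sketch that cleans raw events and appends them to an Apache Iceberg table registered in the AWS Glue Data Catalog. The catalog name, S3 paths, database, table, and the event_id column are hypothetical placeholders, not details taken from the posting.

```python
# Minimal sketch (hypothetical names throughout): batch-load raw JSON events
# into an Apache Iceberg table registered in the AWS Glue Data Catalog.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("events-to-iceberg")
    # Expose the Glue Data Catalog as an Iceberg catalog named "glue".
    .config("spark.sql.catalog.glue", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.glue.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.glue.warehouse", "s3://example-bucket/warehouse/")  # placeholder
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .getOrCreate()
)

# Raw events landed by an upstream Kafka/Kinesis consumer (placeholder path).
events = spark.read.json("s3://example-bucket/raw/events/")

# Light cleanup: de-duplicate on a hypothetical event_id key and stamp a
# processing date the table can be partitioned on.
cleaned = (
    events.dropDuplicates(["event_id"])
          .withColumn("processing_date", F.current_date())
)

# Append into an existing Iceberg table (catalog.database.table identifiers
# are placeholders; the table is assumed to have been created beforehand).
cleaned.writeTo("glue.analytics.events").append()

spark.stop()
```

In practice, logic like this would typically run as an AWS Glue or EMR job with the Iceberg integration enabled and be triggered by a scheduler such as the Airflow DAG sketched earlier; those deployment details are omitted here.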


Company: Virtusa
Job Posted: a year ago
Job Type: Full-time
Work Mode: On-site
Experience Level: 3-7 Years
Category: Engineering
Locations: Pune, Maharashtra, India
Qualification: Bachelor
Applicants: Be an early applicant

Related Jobs


Senior Data Engineer

Virtusa

Pune, Maharashtra, India

Posted: a year ago

Seeking a Data Engineer with 5+ years of relevant work experience showcasing growth. Must have hands-on programming experience and implementation experience with Kafka, Kinesis, Spark, AWS Glue, and AWS LakeFormation, plus expertise in data governance, data security implementation, and performance optimization. Proficiency in Python and familiarity with Java/Scala are desired. Knowledge of high-volume transaction systems, microservice design, and data processing pipelines is a plus. AWS experience, especially with ECS, EKS, S3, EMR, DynamoDB, Aurora, Redshift, and QuickSight, is preferred. Strong skills in design, architecture, and building reusable tools. Certification in Data Engineering and AWS is a bonus.


Advisory Data Visualization Senior

KPMG

Pune, Maharashtra, India

Posted: a year ago

About KPMG in India

KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets, and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused, and technology-enabled services that reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

EOE, KPMG India: KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability, or other legally protected status. KPMG India values diversity, and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary, and refusal to submit such information will not be prejudicial to you.

Responsibilities

Collaborate with stakeholders, including business analysts and end users, to gather requirements and understand business processes that can be automated or enhanced using Power Automate and Power Apps.
Design, develop, and implement customized workflows and business process automation solutions using Power Automate, ensuring seamless integration with existing systems and applications.
Develop user-friendly and intuitive applications using Power Apps, adhering to user experience (UX) and user interface (UI) best practices.
Utilize low-code development techniques to rapidly prototype and iterate on Power Apps solutions, ensuring timely delivery and high-quality applications.
Implement data validation and business rules within Power Apps, ensuring data accuracy and consistency.
Power BI report development, data structure definition and creation, report optimization, design, and development.
Integration and management of Power Platform components to support the end reporting solution (e.g., data stores, data marts, Power Flows, Power Automate, SharePoint, etc.).
Experience with advanced design and query capabilities such as DAX, measures, and memory usage.
Integration of appropriate security models into end-user reporting solutions (e.g., use of row-level security and distribution lists).
Connecting data sources, including use of gateways and XMLA endpoints.
Articulating design requirements for data structures necessary for efficient reporting design and/or optimization.
Knowledge of security and access rights management within Power Platform solutions.
Knowledge of, and experience working with, different data model structures within Power Platform.

Functional

An analytical mind, problem-solving aptitude, attention to detail, and a sense of ownership are critical to succeed in this role, along with experience working in small to medium-sized teams.
Self-motivated and eager to learn, keeping up to date with the latest advancements and features in the Power Platform ecosystem.
Experience working within a Scrum/Agile framework and methodology, with good communication skills and dexterity in managing client stakeholders.
Ability to spot opportunities for business development with existing and new clients and to work on proposals (with a data visualization lens).

Qualifications

Any Engineering Graduate