Sr. Data Engineer / Lead - AWS Job


Smart Summary (powered by Roshi)
Seeking a Senior Data Engineer/Lead with 7-12 years of experience in designing, implementing, and managing cloud solutions on AWS. The role involves developing and implementing various AWS services such as Glue ETL, Redshift, DynamoDB, Lambda, etc. Strong Python and PySpark skills are required for analyzing business requirements and providing technical solutions. The position includes conducting cloud readiness assessments, defining target architectures, cost estimations, and providing design guidance for cloud solutions.

Job description 

We are looking to hire AWS professionals in the following areas:

  • 7 to 12 years of experience in designing, implementing, and managing cloud solutions on AWS.
  • Design, develop, and implement cloud solutions on AWS, utilizing a wide range of AWS services, including Glue ETL, Glue Data Catalog, Athena, Redshift, RDS, DynamoDB, Step Functions, EventBridge, Lambda, API Gateway, ECS, and ECR.
  • Demonstrate expertise in implementing AWS core services, such as EC2, RDS, VPC, ELB, EBS, Route 53, S3, DynamoDB, and CloudWatch.
  • Leverage strong Python and PySpark data engineering capabilities to analyze business requirements, translate them into technical solutions, and drive successful execution.
  • Conduct cloud readiness assessments, define target architectures, and provide cost estimations for cloud solutions on AWS.
  • Expertise in the AWS Data and Analytics stack, including Glue ETL, Glue Data Catalog, Athena, Redshift, RDS, DynamoDB, Step Functions, EventBridge, Lambda, API Gateway, and ECS and ECR for containerization.
  • In-depth knowledge of AWS core services, such as EC2, RDS, VPC, ELB, EBS, Route 53, S3, DynamoDB, and CloudWatch.
  • Develop high-level designs (HLDs), low-level designs (LLDs), test plans, and execution plans for cloud solution implementations, including Work Breakdown Structures (WBS).
  • Provide design guidance and proof-of-concept (POC) support to our managed services team for continuous optimization of cloud solutions.
  • Lead architectural decisions and strategies for migrating and hosting applications on AWS.
  • Develop customer cloud strategies aligned with their business objectives, focusing on cloud migrations.
  • Interact with customers to understand cloud service requirements, transform requirements into workable solutions, and build and test those solutions.
  • Manage multiple cloud solution projects, demonstrating technical ownership and accountability.
  • Capture and share best-practice knowledge within the AWS solutions architect community.
  • Serve as a technical liaison between customers, service engineering teams, and support.
  • Possess a strong understanding of cloud and infrastructure components, including servers, storage, networks, data, and applications, to deliver end-to-end cloud infrastructure architectures and designs.
  • Demonstrate experience with AWS cloud environment sizing and disaster recovery architecture for applications.
  • Effectively collaborate with team members from around the globe, including experience working with offshore models.
  • Excellent analytical and problem-solving skills.
  • Strong communication and presentation skills.
  • Ability to work independently and as part of a team.
  • Experience working with onshore-offshore teams.

Company

YASH Technologies

Job Posted

6 months ago

Job Type

Full-time

Work Mode

On-site

Experience Level

8-12 Years

Category

Software Engineering

Locations

Pune, Maharashtra, India

Qualification

Bachelor or Master

Applicants

Be an early applicant

Related Jobs


Sr. Software Engineer - AWS Glue + Python Job

YASH Technologies

Hyderabad, Telangana, India

Posted: 6 months ago

Join as a Sr. Software Engineer specializing in AWS Glue + Python at YASH Technologies, Hyderabad, India. Engage in ETL work with cloud databases, Spark, SCM/Git, ELT/ETL processes, Python programming, REST APIs, SQL, DevOps, ITIL, PySpark, Postgres, and ElasticSearch.