Data Engineer (AWS + Python Developer)

It's fun to work in a company where people truly BELIEVE in what they are doing!

 

We're committed to bringing passion and customer focus to the business.

 

We need PySpark AWS Data Engineers with strong Python development experience.

Candidates will be evaluated on the following skills:

  • Development skills
  • Python skills
  • Analytical skills
  • Terraform, PySpark (see the sketch after this list)
  • AWS: S3, Glue, SageMaker, Lambda
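
To illustrate the kind of day-to-day work these skills cover, here is a minimal PySpark sketch of an S3-to-S3 batch job. The bucket, paths, and column names (user_id, event_timestamp) are hypothetical placeholders for illustration, not details from this posting.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("s3-daily-counts").getOrCreate()

    # Read raw events from S3 (bucket and prefix are illustrative only).
    events = spark.read.parquet("s3a://example-bucket/raw/events/")

    # Derive a date column and count events per user per day.
    daily_counts = (
        events
        .withColumn("event_date", F.to_date("event_timestamp"))
        .groupBy("user_id", "event_date")
        .count()
    )

    # Write the aggregate back to S3, partitioned by date.
    (daily_counts.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3a://example-bucket/curated/daily_event_counts/"))

    spark.stop()

On AWS, a script along these lines would typically run as a Glue job or on EMR, with Terraform used to provision the underlying buckets, jobs, and Lambda triggers.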

 

 

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

 

Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page, or create an account to set up email alerts as new job postings that match your interests become available!

Company: Fractal
Job Posted: 2 months ago
Job Type: Full-time
Work Mode: On-site
Experience Level: 0-2 Years
Category: Engineering
Locations: Bengaluru, Karnataka, India; Mumbai, Maharashtra, India; Chennai, Tamil Nadu, India; Gurgaon, Haryana, India; Pune, Maharashtra, India
Qualification: Bachelor's
Applicants: 47

Related Jobs

Development Engineer 2 (Python and AWS)

Comcast

Chennai, Tamil Nadu, India

Posted: a year ago

Join as a Development Engineer 2 focusing on Python and AWS, responsible for planning, designing, and implementing new software and web applications. Work within a newly formed team to expand departmental tooling capabilities using fault-tolerant and scalable solutions for internal customers across multiple territories. Utilize Agile and test-driven development approaches, infrastructure as code, and CI/CD pipelines. Evaluate new technologies, build innovative platform solutions, and support migration to the cloud. Location: Thoraipakkam, Chennai. Mode: Hybrid. Education: Bachelor's Degree. Experience: 2-5 years.

Cloud Data Architect

Hexaware Technologies

Chennai, Tamil Nadu, India

+2 more

Posted: a year ago

Description

Responsibilities:
  • Demonstrate knowledge of cloud architecture and implementation features (OS, multi-tenancy, virtualization, orchestration, scalability).
  • Act as a Subject Matter Expert to the organization for cloud/on-prem end-to-end architecture on any one of AWS, GCP, Azure and future providers, covering networking, provisioning, and management.
  • Communicate and provide thought leadership in data strategy, technologies, and engineering to business and IT leadership teams and stakeholders, bridging the gap between business and technology.
  • In partnership with Product Owner(s), stakeholders, and other Subject Matter Experts, work with technologies such as cloud data warehousing, data lakes, and analytical platforms and solutions.
  • Define a modern data ecosystem, considering current and future data needs.
  • Work closely with all business areas to capture BI (Business Intelligence, Data Analytics and Reporting) requirements based on business objectives, initiatives, challenges, and questions.
  • Translate user stories into wireframe designs of the dashboards.
  • Provide BI/reporting on all identified KPI areas specified in the KPI requirements document by creating, developing, and maintaining BI reports in visualization tools such as Microsoft Power BI; the dashboards will be updated and maintained by the various business teams (self-service).
  • Cover aesthetic design aspects in report/dashboard design.
  • Provide advice on technical aspects of BI development and integration, including the operational and maintenance aspects of systems under development and proposed system recovery procedures.
  • Ensure that relevant technical strategies, policies, standards, and practices are applied correctly.
  • Support the Reporting & Analytics team to respond positively to proposed data initiatives arising from the Information Governance Group, and provide the necessary support and expertise to deliver such initiatives.
  • Provide ongoing support and implementation of change for the ETL solution interfacing between the applications.
  • Apply enterprise data management practices and principles in ways of working.
  • Identify patterns, trends, and insights through use of the appropriate tools.
  • Identify data quality issues through data profiling, analysis, and stakeholder engagement.
  • Analyze data to provide insights for internal and external stakeholders on improving performance.

The role offers:
  • Opportunity to join a global team to do meaningful work that contributes to global strategy and individual development.
  • An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and operations.

Essential Skills:
  • Hands-on experience in delivering business intelligence solutions.
  • Hands-on experience with OLTP and OLAP database models.
  • Expertise in end-to-end delivery (database + ETL/pipeline + visualization/reporting) on a cloud or cloud-agnostic stack, e.g. Azure: Synapse, ADF, HDInsight.
  • Expertise in data governance and advanced analytics, with on-premise platform understanding covering one or more of: Teradata, Cloudera, Netezza, Informatica, DataStage, SSIS, BODS, SAS, Business Objects, Cognos, MicroStrategy, WebFocus, Crystal.
  • Programming: write computer programs and analyze large datasets to uncover answers to complex problems; fluent in relational database concepts and flat-file processing concepts; knowledgeable in software development lifecycles/methodologies, i.e. Agile.
  • Data storytelling: communicate actionable insights using data, often for a non-technical audience.
  • Business intuition: connect with stakeholders to gain a full understanding of the problems they are looking to solve.
  • Analytical thinking: find analytical solutions to abstract business issues.
  • Critical thinking: apply objective analysis of facts before concluding.
  • Interpersonal skills: communicate across a diverse audience at all levels of an organization; strong presentation and collaboration skills, able to communicate all aspects of the job requirements, including the creation of formal documentation.
  • Strong problem-solving, time-management, and organizational skills.
  • Database technology: SQL Server, Netezza, Hadoop, Cloudera.
  • ETL technology: SSIS, DataStage, Talend, CRON scripting, Perl.
  • BI technology: SSRS, SSAS, Tableau, MicroStrategy.
  • Cloud SaaS: Snowflake, Databricks, Matillion, HVR, etc.
  • Familiarity with data engineering and DataOps tools in on-prem and cloud ecosystems.
  • Good to have: certifications in TOGAF, PMP, CSM.

Essential Qualification:
  • BE or BTech in Computer Science, Engineering, or a relevant field.