
PySpark Developer


Summary
Looking for a candidate with expertise in PySpark, Spark SQL, and working with DataFrames. Must have knowledge of advanced data transformations and a good understanding of SQL. The candidate should be experienced in error handling, logging, and monitoring. Knowledge of Unix and a scheduling tool is necessary. Performance tuning and generic process development skills are required. Familiarity with the banking domain is preferred. Experience in ETL estimation and Teradata BTEQ is a plus.

Job Description

Competencies Required (Technical/Behavioral Competency)

Essentials:

- PySpark and Spark SQL
- Working with DataFrames using different APIs
- Spark functions
- Advanced data transformations
- Good knowledge of SQL
- Applying UDFs
- Error handling, logging, and monitoring
- Unix and a scheduling tool
- Performance tuning and generic process development
- Banking domain knowledge


Desirable:

- ETL estimation
- Teradata BTEQ
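For illustration only, a minimal PySpark sketch touching several of the essential skills listed above (DataFrame APIs, Spark functions, a UDF, Spark SQL, and basic error handling and logging). The sample data, column names, and application name are hypothetical and not part of this posting.

import logging

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("txn_pipeline")

spark = SparkSession.builder.appName("illustrative-transformation").getOrCreate()

# Hypothetical sample data standing in for a banking transactions feed.
df = spark.createDataFrame(
    [(1, "ACC-001", 2500.0, "2024-01-15"),
     (2, "ACC-002", -120.5, "2024-01-16"),
     (3, "ACC-001", 980.0, "2024-01-16")],
    ["txn_id", "account_id", "amount", "txn_date"],
)

# A simple Python UDF that labels each transaction as credit or debit.
@F.udf(returnType=StringType())
def flag_amount(amount):
    return "CREDIT" if amount is not None and amount >= 0 else "DEBIT"

try:
    # DataFrame transformation: apply the UDF, then aggregate per account and type.
    result = (
        df.withColumn("txn_type", flag_amount(F.col("amount")))
          .groupBy("account_id", "txn_type")
          .agg(F.sum("amount").alias("total_amount"),
               F.count("*").alias("txn_count"))
    )
    # Expose the result to Spark SQL and query it.
    result.createOrReplaceTempView("txn_summary")
    spark.sql("SELECT * FROM txn_summary ORDER BY account_id").show()
except Exception:
    logger.exception("Transformation failed")
    raise
finally:
    spark.stop()

In practice, built-in Spark functions are preferred over Python UDFs where possible, since they avoid serialization overhead and allow Catalyst optimizations, which is relevant to the performance tuning skill above.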


Company: Tata Consultancy Services
Job Posted: a year ago
Job Type: Full-time
Work Mode: On-site
Experience Level: 3-7 Years
Category: Software Engineering
Location: Chennai, Tamil Nadu, India
Qualification: Bachelor's or Master's degree


Related Jobs


SailPoint Developer

Tata Consultancy Services

Chennai, Tamil Nadu, India

Posted: a year ago

We are looking for an IAM SME with experience in SailPoint, Federation, LDAP directories, Active Directory, SOA, and Java/J2EE. The role is Pan India based and permanent.