IICS Developer

Smart Summary (powered by Roshi)
Seeking a candidate with 5 years of experience in Informatica Cloud (IICS) and strong technical skills in task flow design, dependencies, and scheduling. Must have hands-on experience with Lookup, Expression, Join, Normalizer, Hierarchical, Sort, Aggregator, and DB stages. AWS exposure and experience on migration projects are desirable, database experience (preferably Oracle) is mandatory, and Salesforce application knowledge is an added advantage. DataStage tool knowledge is a plus. The candidate will also provide technical assistance to junior team members.

Description

Candidate should have 5 years of experience in Informatica Cloud (IICS) and be technically strong in task flow design, dependencies, and scheduling.
Hands-on experience with the following stages: Lookup, unconnected Lookup, Expression, Join, Normalizer, Hierarchical, Sort, Aggregator, and DB stages to connect to AWS databases (such as Redshift).
Good to have experience on migration projects (any ETL tool to IICS, preferably DataStage to IICS) in a cloud environment (AWS).
AWS experience is a plus.
Any database experience (preferably Oracle) is mandatory.
Salesforce application knowledge is an added advantage (optional).
DataStage tool knowledge is also a plus (optional).
Should provide technical assistance to junior team members.
 

Primary Location

Pune, Maharashtra, India

Other Locations

Hyderabad, Telangana, India


Company

Virtusa

Job Posted

a year ago

Job Type

Full-time

Work Mode

On-site

Experience Level

3-7 Years

Category

Data & Analytics

Locations

Pune, Maharashtra, India

Qualification

Bachelor


Related Jobs


Bigdata Developer

Virtusa

Pune, Maharashtra, India

Posted: a year ago

Seeking an experienced Big Data Engineer with strong programming skills in Python. Must have expertise in Big Data processing frameworks such as HDFS, Apache Spark, PySpark, Hive, Sqoop, Impala, and CDP components. A solid understanding of distributed computing concepts and parallel processing is required. Hands-on experience with Cloudera on-premise cluster nodes and knowledge of data warehousing concepts and technologies is a must.