Cloud Data Architect

Smart Summary (powered by Roshi)
The Cloud Data Architect will demonstrate expertise in cloud architecture, data strategy, and BI reporting, bridging the gap between business and technology by defining modern data ecosystems and providing insights for stakeholders. The role involves working with technologies such as cloud data warehousing, data lakes, and analytical platforms, and offers the opportunity to contribute to global strategy and individual development.

Description

Responsibilities:

  • Demonstrate knowledge of cloud architecture and implementation features (OS, multi-tenancy, virtualization, orchestration, scalability).
  • Act as a Subject Matter Expert to the organization for cloud/on-prem end-to-end architecture on any one of AWS, GCP, or Azure (and future providers), covering networking, provisioning, and management.
  • Communicate and provide thought leadership on data strategy, technologies, and engineering to business and IT leadership teams and stakeholders.
  • Provide a vital function bridging the gap between business and technology, in partnership with Product Owner(s), stakeholders, and other Subject Matter Experts, working with technologies such as cloud data warehousing, data lakes, and analytical platforms and solutions.
  • Define a modern data ecosystem that considers both current and future data needs.
  • Work closely with all business areas to capture BI (Business Intelligence, Data Analytics and Reporting) requirements based on business objectives, initiatives, challenges and questions.
  • Translate user stories into wireframe designs of the dashboards.
  • Provide BI/reporting on all identified KPI areas specified in the KPI requirements document by creating, developing, and maintaining BI reports in visualization tools such as Microsoft Power BI. The dashboards will then be updated and maintained by the various business teams (self-service).
  • Cover aesthetic design aspects in report/dashboard design.
  • Provide advice on technical aspects of BI development and integration, including the operational and maintenance aspects of systems under development, and proposed system recovery procedures.
  • Ensure that relevant technical strategies, policies, standards, and practices are applied correctly.
  • Support the Reporting & Analytics team in responding positively to proposed data initiatives arising from the Information Governance Group, and provide the necessary support and expertise to deliver such initiatives.
  • Provide ongoing support and change implementation for the ETL solution interfacing between applications.
  • Apply enterprise data management practices and principles in ways of working.
  • Identify patterns, trends and insight through use of the appropriate tools.
  • Identify data quality issues through data profiling, analysis, and stakeholder engagement (see the profiling sketch after this list).
  • Analyze data to provide insights to internal and external stakeholders on improving performance.
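
As an illustration of the profiling responsibility above, the following is a minimal sketch of how per-column quality issues might be surfaced from a tabular extract. It assumes pandas; the column names and toy data are hypothetical, not part of the role's actual stack.

    # Minimal data-profiling sketch (hypothetical columns; any tabular source works).
    import pandas as pd

    def profile(df: pd.DataFrame) -> pd.DataFrame:
        """Summarize per-column quality indicators: type, missing share, cardinality."""
        report = pd.DataFrame({
            "dtype": df.dtypes.astype(str),
            "null_pct": (df.isna().mean() * 100).round(1),  # share of missing values
            "distinct": df.nunique(),                        # cardinality per column
        })
        report["constant"] = report["distinct"] <= 1         # flag columns with no variation
        return report

    if __name__ == "__main__":
        # Hypothetical extract; in practice this might come from SQL Server, a lake, etc.
        df = pd.DataFrame({
            "customer_id": [1, 2, 2, None],
            "region": ["EU", "EU", "EU", "EU"],
            "revenue": [100.0, 250.5, 250.5, None],
        })
        print(profile(df))
        print("exact duplicate rows:", int(df.duplicated().sum()))

Findings such as the constant region column or the duplicated row would then be raised with stakeholders for cleansing decisions, as the bullet above describes.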

The Role offers:

  • Opportunity to join a global team and do meaningful work that contributes to global strategy and individual development.
  • An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and operations.

Essential Skills:

  • Hands-on experience in delivering business intelligence solutions.
  • Hands-on experience with OLTP and OLAP database models (a brief contrast is sketched after this list).
  • End-to-end expertise (database + ETL/pipeline + visualization/reporting) on one cloud or cloud-agnostic stack, e.g. Azure: Synapse, ADF, HDInsight.
  • Expertise in data governance and advanced analytics, with on-premise platform understanding covering one or more of the following: Teradata, Cloudera, Netezza, Informatica, DataStage, SSIS, BODS, SAS, Business Objects, Cognos, MicroStrategy, WebFOCUS, Crystal Reports.
  • Programming: Write computer programs and analyze large datasets to uncover answers to complex problems. 
  • Fluent in relational database concepts and flat file processing concepts.
  • Must be knowledgeable in software development lifecycles/methodologies, e.g. Agile.
  • Data storytelling: Communicate actionable insights using data, often for a non-technical audience.
  • Business intuition: Connect with stakeholders to gain a full understanding of the problems they are looking to solve.
  • Analytical thinking: Find analytical solutions to abstract business issues.
  • Critical thinking: Apply objective analysis of facts before concluding.
  • Interpersonal skills: Communicate across a diverse audience across all levels of an organization.
  • Strong presentation and collaboration skills; able to communicate all aspects of the job requirements, including the creation of formal documentation.
  • Strong problem solving, time management and organizational skills.
  • Database Technology: SQL Server, Netezza, Hadoop, Cloudera.
  • ETL Technology: SSIS, DataStage, Talend, cron scripting, Perl.
  • BI Technology: SSRS, SSAS, Tableau, MicroStrategy.
  • Cloud SaaS: Snowflake, Databricks, Matillion, HVR, etc.
  • Familiarity with data engineering and DataOps tools in on-prem and cloud ecosystems.
  • Good to have: certifications such as TOGAF, PMP, CSM.
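
To make the OLTP/OLAP distinction above concrete, here is a minimal, self-contained sketch contrasting the two access patterns. The schema is hypothetical and SQLite is used only for portability; the platforms listed above (SQL Server, Snowflake, etc.) differ in syntax and scale.

    # Sketch contrasting OLTP and OLAP patterns (hypothetical schema; SQLite for portability).
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        -- OLTP: normalized table, optimized for many small transactional writes
        CREATE TABLE orders (
            order_id    INTEGER PRIMARY KEY,
            customer_id INTEGER,
            amount      REAL,
            order_date  TEXT
        );
        -- OLAP: denormalized fact table of a star schema, optimized for aggregate reads
        CREATE TABLE fact_sales (date_key TEXT, customer_key INTEGER, amount REAL);
    """)

    # OLTP workload: row-at-a-time inserts as transactions occur.
    con.execute("INSERT INTO orders VALUES (?, ?, ?, ?)", (1, 42, 99.90, "2024-01-15"))
    con.execute("INSERT INTO fact_sales VALUES (?, ?, ?)", ("2024-01", 42, 99.90))
    con.commit()

    # OLAP workload: scan-and-aggregate over the fact table for reporting.
    for date_key, total in con.execute(
        "SELECT date_key, SUM(amount) FROM fact_sales GROUP BY date_key"
    ):
        print(date_key, total)

In practice the fact table would be loaded from the OLTP side by ETL tools such as those listed above, on a batch schedule or a stream.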

Essential Qualification:

  • BE or BTech in Computer Science, Engineering, or a relevant field.

Company

Hexaware Technologies

Job Posted

a year ago

Job Type

Full-time

Work Mode

On-site

Experience Level

13-17 years

Category

Technology

Locations

Chennai, Tamil Nadu, India

Pune, Maharashtra, India

Mumbai Suburban, Maharashtra, India

Qualification

Bachelor
