
Databricks Engineer

Description

About the Role:

The Databricks Engineer works with Hadoop, Spark, Hive, HBase, Sqoop, and MapReduce, among other technologies. The engineer should be proficient in data warehousing and data analysis and have a strong understanding of the Hadoop ecosystem and distributed processing frameworks. The role requires good communication skills, flexibility, and an eagerness to upskill in modern data technologies across Big Data engineering and information delivery.

Responsibilities:

  • Work with Hadoop, Spark, Hive, HBase, Sqoop, and MapReduce, among other technologies, to build and maintain data pipelines, data lakes, and data warehouses (a minimal pipeline sketch follows this list).
  • Analyze data, troubleshoot issues, and develop solutions independently.
  • Communicate effectively with stakeholders, including team members, project managers, and business leaders.
  • Maintain a willingness to upskill and stay updated with modern data technologies.
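
To make the pipeline work concrete, here is a minimal, illustrative batch job in PySpark (Python and Spark are both listed skills). It is a sketch only: the paths, schema, and column names are hypothetical placeholders, not details from the posting.

    # Minimal ETL sketch: ingest raw CSV, clean and type the data, and write a
    # date-partitioned Parquet table. On Databricks the write step would more
    # likely target a Delta table (.format("delta")), but plain Parquet keeps
    # this runnable with stock PySpark.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-batch-pipeline").getOrCreate()

    # Extract: raw files landed by an upstream process (hypothetical path).
    orders = (
        spark.read
        .option("header", True)
        .option("inferSchema", True)
        .csv("/data/raw/orders/*.csv")
    )

    # Transform: drop malformed rows, type the amount, derive a partition key.
    clean = (
        orders
        .dropna(subset=["order_id", "order_ts"])
        .withColumn("amount", F.col("amount").cast("double"))
        .withColumn("order_date", F.to_date("order_ts"))
    )

    # Load: write a date-partitioned table into the curated zone of the lake.
    clean.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders")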

Essential Skills:

  • Hands-on experience with Hadoop and Spark
  • Proficiency in data warehousing and data analysis
  • Strong understanding of the Hadoop ecosystem and distributed processing frameworks
  • Good knowledge of Python and SQL
  • Experience working with cloud platforms
  • Flexibility and a can-do attitude
  • Excellent communication skills, both written and verbal

Essential Qualifications:

  • Minimum 4 years of experience with Hadoop, Spark, and related technologies
  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
  • Strong problem-solving skills and ability to work independently

Company: Hexaware Technologies
Job Posted: a year ago
Job Type: Full-time
Work Mode: On-site
Experience Level: 3-7 years
Category: Technology
Locations: Pune, Maharashtra, India; Mumbai Suburban, Maharashtra, India; Chennai, Tamil Nadu, India
Qualification: Bachelor


Related Jobs


Cloud Data Architect
Hexaware Technologies | Chennai, Tamil Nadu, India (+2 more locations)
Posted: a year ago

The Cloud Data Architect will demonstrate expertise in cloud architecture, data strategy, and BI reporting. They will bridge the gap between business and technology by defining modern data ecosystems and providing insights for stakeholders. The role includes working with technologies such as cloud data warehousing, data lakes, and analytical platforms, and offers the opportunity to contribute to global strategy and to individual development.


Data Integration Lead
Hexaware Technologies | Bengaluru, Karnataka, India (+2 more locations)
Posted: a year ago

Description

Responsibilities:

  • Leads the delivery processes of data extraction, transformation, and load from disparate sources into a form consumable by analytics processes, for projects of moderate complexity, using strong technical capabilities and a sense of database performance.
  • Designs, develops, and produces data models of relatively high complexity, leveraging a sound understanding of data modelling standards to suggest the right model for the requirement.
  • Batch Processing: capability to design an efficient way of processing high volumes of data where a group of transactions is collected over a period.
  • Data Integration (Sourcing, Storage and Migration): capability to design and implement models, capabilities, and solutions to manage data within the enterprise (structured and unstructured, data archiving principles, data warehousing, data sourcing, etc.), including the data models, storage requirements, and migration of data from one system to another.
  • Data Quality, Profiling and Cleansing: capability to review (profile) a data set to establish its quality against a defined set of parameters and to highlight data where corrective action (cleansing) is required to remediate it.
  • Stream Systems: capability to discover, integrate, and ingest all available data from the machines that produce it, as fast as it is produced, in any format, and at any quality.
  • Excellent interpersonal skills to build a network across business departments to understand data and deliver business value; may interface and communicate with program teams, management, and stakeholders as required to deliver small to medium-sized projects.
  • Understands the difference between on-premises and cloud-based data integration technologies.

The Role Offers:

  • The opportunity to join a global team doing meaningful work that contributes to global strategy and individual development.
  • An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and operations.
  • An opportunity to showcase strong analytical skills and problem-solving ability.
  • Learning and growth opportunities in the cloud and Big Data engineering spaces.

Essential Skills:

  • 6+ years' experience developing large-scale data pipelines in a cloud or on-premises environment.
  • Highly proficient in one or more market-leading ETL tools such as Informatica, DataStage, SSIS, or Talend.
  • Deep knowledge of data warehouse/data mart architecture and modelling.
  • Able to define and develop data ingest, validation, and transform pipelines.
  • Deep knowledge of distributed data processing and storage.
  • Deep knowledge of working with structured, unstructured, and semi-structured data.
  • Working experience with ETL/ELT patterns.
  • Extensive experience applying analytics, insights, and data mining to commercial "real-world" problems.
  • Technical experience in at least one programming language, preferably Java, .NET, or Python.

Essential Qualification:

  • BE/BTech in Computer Science, Engineering, or a relevant field.

JD:

  • Strong working knowledge of IICS (Informatica Cloud) and Informatica Designer transformations: Source Qualifier, dynamic and static lookups, connected and unconnected lookups, Expression, Filter, Router, Joiner, Normalizer, and Update Strategy.
  • Solid hands-on development experience in Informatica PowerCenter: reusable transformations, aggregators, lookups, caches, performance tuning, joiners, rank, router, update strategy, etc.
  • Strong knowledge of Snowflake.
  • Experience in data warehousing; must know data warehousing concepts such as SCD1 and SCD2 (see the sketch after this description).
  • Troubleshoots issues and identifies bottlenecks in existing data workflows.
  • Provides performance-tuning insight and creates reusable objects and templates.
  • Strong in migrating objects and processes from lower environments to higher environments.
  • Strong in scheduling workflows, tasks, and mappings.
  • Basic understanding of Informatica administration.
  • Strong development skills in SQL; knowledge of AWS and HVR is an added advantage.
  • Good communication and customer interaction skills.
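
Since the JD calls out SCD1/SCD2, here is a minimal PySpark sketch of a Type 2 update. The posting's own stack is Informatica and Snowflake; this Python version only illustrates the concept, with hypothetical columns and inline sample data.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

    # Current dimension: one open-ended row per customer (valid_to is NULL).
    dim = spark.createDataFrame(
        [(1, "Pune", "2023-01-01", None), (2, "Chennai", "2023-01-01", None)],
        "customer_id INT, city STRING, valid_from STRING, valid_to STRING",
    )

    # Incoming snapshot: customer 2 moved; customer 3 is brand new.
    src = spark.createDataFrame(
        [(2, "Mumbai"), (3, "Pune")],
        "customer_id INT, city STRING",
    )

    today = F.current_date().cast("string")

    # Keys whose tracked attribute changed since the last load.
    changed_ids = (
        dim.alias("d").join(src.alias("s"), "customer_id")
        .where(F.col("d.city") != F.col("s.city"))
        .select("customer_id")
    )

    # SCD2 step 1: close out the superseded versions.
    expired = (
        dim.join(changed_ids, "customer_id", "left_semi")
        .withColumn("valid_to", today)
    )

    # SCD2 step 2: keep the rows that did not change.
    unchanged = dim.join(changed_ids, "customer_id", "left_anti")

    # SCD2 step 3: open new versions for changed and brand-new keys.
    new_rows = (
        src.join(dim.select("customer_id"), "customer_id", "left_anti")
        .unionByName(src.join(changed_ids, "customer_id", "left_semi"))
        .withColumn("valid_from", today)
        .withColumn("valid_to", F.lit(None).cast("string"))
    )

    result = unchanged.unionByName(expired).unionByName(new_rows)
    result.show()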


Java Full Stack Engineer
Hexaware Technologies | Pune, Maharashtra, India (+2 more locations)
Posted: a year ago

Description

Responsibilities:

  • Understand requirements and translate them into product features.
  • Participate in Scrum ceremonies and clearly communicate the work done and the plan.
  • Develop applications using front-end, middleware, and database technologies.
  • Be hands-on in developing and implementing best practices and writing clean code; follow coding standards and keep the code highly performant.
  • Write fully automated unit test cases using any of the standard frameworks.
  • Apply strong exposure to REST API design and principles, adhering to RAML/Swagger or the OpenAPI specification.
  • Perform impact analysis and document the design of components.
  • Develop reusable components using the design patterns specified by the lead/architect so that they are extensible.

The Role Offers:

  • An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and operations.
  • End-to-end project exposure across multiple technical stacks and cloud platforms.
  • A fit for an individual with a passion to learn, adapt to new technologies quickly, and scale to the next level easily.
  • High visibility and the opportunity to interact with multiple groups within the organization, technology vendors, and implementation partners.

Essential Skills:

  • Total IT experience of 4 to 7 years.
  • 4+ years of in-depth knowledge of core Java, Spring Boot, Spring DI, Spring MVC, JMS, Hibernate, JDBC, and PL/SQL.
  • Experience in microservices is good to have (can be trained).
  • Hands-on experience in RESTful HTTP service design.
  • Experience in Angular or React JS is good to have.
  • Hands-on experience with JavaScript, jQuery, Bootstrap, HTML5, and CSS3.
  • Hands-on experience with SQL Server and PostgreSQL, including writing stored procedures.
  • Good communication and unit-testing knowledge.
  • Knowledge of AWS/Azure cloud platforms is good to have.
  • Familiarity with continuous integration methodologies and tools, including Jenkins.

Good to Have:

  • Exposure to Docker, Kubernetes, and cloud deployment.

Essential Qualification:

  • MCA or an equivalent master's degree in computers is a must.


Big Data Lead
Hexaware Technologies | Pune, Maharashtra, India (+2 more locations)
Posted: a year ago

Description

Senior ADF Engineer (work timing: 12 pm to 9.30 pm IST)

As a Senior Azure ADF Engineer, you will be responsible for designing, developing, and maintaining data pipelines using Azure Data Factory (ADF). You will work closely with customers to understand their business requirements and translate them into scalable, performant, and secure data integration solutions. You will ensure that the data pipelines integrate seamlessly with other Azure services, such as Azure Blob Storage, Azure Synapse Analytics, SQL Database, and Azure Data Lake Storage. A short sketch of triggering an ADF pipeline programmatically follows this description.

Key Responsibilities:

  • Design, develop, and maintain data pipelines using Azure Data Factory (ADF).
  • Work closely with customers to understand their business requirements and translate them into scalable, performant, and secure data integration solutions.
  • Collaborate with other architects, engineers, and technical teams to ensure that data pipelines align with the overall solution architecture and best practices.
  • Optimize data pipelines for performance, scalability, and cost efficiency.
  • Develop and maintain technical documentation, including pipeline diagrams, data flow diagrams, and code documentation.
  • Participate in data-related aspects of pre-sales activities, including solution design, proposal development, and customer presentations.
  • Stay current with emerging technologies and industry trends related to data integration and management in Azure, and recommend improvements to existing solutions and new implementations.

Qualifications:

  • Bachelor's or master's degree in computer science, engineering, or a related field.
  • At least 7 years of experience in data engineering, including at least 3 years with Azure Data Factory.
  • Strong knowledge of Azure Data Factory, including data ingestion, transformation, and orchestration.
  • Experience with data integration and ETL processes and with writing SQL queries and scripts.
  • Experience with Azure Data Lake and with building pipelines for batch and CDC (change data capture) loads.
  • Experience implementing ETL frameworks.
  • Strong understanding of data security, including encryption, access control, and auditing.
  • Excellent communication and presentation skills, with the ability to communicate complex technical concepts to both technical and non-technical audiences.
  • Strong problem-solving and analytical skills, with the ability to identify and resolve complex technical issues.

Good-to-Have Qualifications:

  • Microsoft Certified: Azure Data Engineer Associate certification.
  • Experience with other Azure data services, such as Azure Synapse Analytics and Azure Databricks.
  • Experience with other data integration tools, such as Talend.
  • Experience with programming languages such as SQL and PowerShell.
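
As a small, concrete complement, the following Python sketch triggers and polls a run of an existing ADF pipeline using the azure-identity and azure-mgmt-datafactory packages. The subscription, resource group, factory, pipeline name, and parameter are hypothetical placeholders.

    import time

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    subscription_id = "<subscription-id>"   # placeholder
    rg_name = "rg-data"                     # placeholder resource group
    df_name = "adf-demo"                    # placeholder data factory

    adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

    # Kick off an on-demand run of an already-deployed pipeline.
    run = adf_client.pipelines.create_run(
        rg_name, df_name, "CopyOrdersPipeline",
        parameters={"window_start": "2024-01-01"},
    )

    # Poll until the run leaves the Queued/InProgress states.
    while True:
        pipeline_run = adf_client.pipeline_runs.get(rg_name, df_name, run.run_id)
        if pipeline_run.status not in ("Queued", "InProgress"):
            break
        time.sleep(15)

    print("Run finished with status:", pipeline_run.status)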


Dotnet Cloud Native Lead
Hexaware Technologies | Pune, Maharashtra, India (+2 more locations)
Posted: a year ago

Description

Key Skills: .NET Core, Microservices, Docker, SQL, and Azure (with or without Angular)
Total Experience Level: 7-11 years
Role: Technical Lead / Senior Developer
Grade: G5

Must-Have Skills:

  • 2-3+ years of hands-on experience in RESTful HTTP service design and development using ASP.NET Web API 2, ASP.NET Core 2.2+, or .NET Core 3.0 Web API.
  • Good hands-on experience using Entity Framework Core with LINQ.
  • Strong hands-on experience in HTML5, CSS3, and Bootstrap.
  • At least 2 years of experience implementing containerized solutions using Docker, Azure Container Service, or Azure Kubernetes Service.
  • Strong hands-on experience with SQL Server, PostgreSQL, or MySQL, including database design work such as writing complex stored procedures.
  • Experience in cloud computing with Azure.
  • Hands-on experience with some or all of the following Azure services: Load Balancer, DNS, Traffic Manager, Compute, Storage, Azure SQL Database, Cosmos DB, Azure Search, Azure Functions, Azure Logic Apps, serverless programming, Azure API Management, Service Bus, Key Vault, Azure Batch, Azure Redis Cache, AAD, and App Insights.
  • 4+ years of work in the .NET Framework using C#; should have worked with ASP.NET MVC and Web API.
  • Experience in RESTful HTTP service design and development.
  • Hands-on microservices development experience and container hosting.
  • Experience with Team Foundation Server or Git.
  • Strong handle on SOLID principles and design patterns.

Nice-to-Have Skills:

  • At least 2 years of hands-on experience in Angular 6+.
  • Experience with RabbitMQ and any Azure queuing service.
  • Exposure to monitoring and tracking of Azure services.
  • Automated hosting of Kubernetes clusters using Azure Kubernetes Service.
  • Experience with ARM templates and PowerShell scripting.
  • Design and implementation experience and deep knowledge of building Service Fabric services such as identity, authentication and authorization, service registration and discovery, and deployment and provisioning.
  • Experience with and understanding of cloud security policies and infrastructure deployments in enterprise-wide environments.
  • Automating end-to-end deployment using Visual Studio pipelines.


Java Cloud Architect
Hexaware Technologies | Pune, Maharashtra, India (+2 more locations)
Posted: a year ago

Description

Responsibilities:

  • Understand requirements and translate them into product features.
  • Develop technical solutions for complex business problems using Java and related technologies.
  • Use design patterns to make applications reliable, scalable, and highly available.
  • Design microservices- and serverless-based architectures.
  • Work with the client architect to define top-notch solutions, and provide the reference architecture for the application in scope.
  • Work with vendors on the integration of multiple systems.
  • Design applications involving asynchronous programming, multithreading, mutability, and concurrency control/recovery when dealing with persistent data stores.
  • Drive the entire development team by defining and enforcing high coding standards, best practices, and principles aligned to the solution.
  • Work with the DevOps team to implement CI/CD architecture.
  • Develop applications using front-end, middleware, and database technologies.
  • Review other projects' architectures for flaws and suggest solutions.
  • Be hands-on in developing and implementing best practices and writing clean code.
  • Mentor the technical team.

The Role Offers:

  • An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and operations.
  • End-to-end project exposure across multiple technical stacks and cloud platforms.
  • A fit for an individual with a passion to learn and adapt to new technologies quickly and scale to the next level easily.
  • Exposure to multiple platforms and teams, collaborating to get the technical solution implemented.
  • High visibility and the opportunity to interact with multiple groups within the organization, technology vendors, and implementation partners.

Essential Skills:

  • In-depth knowledge of core Java, Spring, and Hibernate.
  • Hands-on experience with Spring Boot.
  • Hands-on experience developing microservices using Docker or a similar containerized platform.
  • Experience with concurrency in Java, including asynchronous programming, multithreading, mutability, and concurrency control/recovery when dealing with persistent data stores.
  • Hands-on experience in RESTful HTTP service design.
  • Exposure to or hands-on experience designing APIs using RAML or the OpenAPI Specification.
  • Proven experience technically mentoring teams and exposure to solution architecture.
  • Strong command of design patterns such as CQRS, Factory, Dependency Injection, IoC (Inversion of Control), and Aggregator.
  • Familiarity or hands-on experience with DevOps pipelines and with managing code repositories and releases using Git/Bitbucket.
  • Experience with Oracle or SQL Server: writing stored procedures, performance tuning, and identifying deadlocks, transactions, and data locking/blocking scenarios.
  • Knowledge of designing SPA applications using Angular 4/6/8 or React JS is good to have.
  • Hands-on experience with SQL Server and PostgreSQL: writing stored procedures, performance tuning, and identifying deadlocks, transactions, and data locking/blocking scenarios.
  • Localization, internationalization, and globalization for server and client applications.
  • Experience working in Agile development methodologies such as Scrum.

Essential Qualification:

  • MCA or an equivalent master's degree in computers is a must.
  • Java certification is a must.
  • Any cloud certification is a plus.