Data Engineer - Intern

As a Data Engineer - Intern at Pearson in Bengaluru, Karnataka, India, you will support database operations, develop software, and utilize reporting tools like OBIEE and Tableau. This full-time, on-site position requires proficiency in Java, Python, MS-SQL, Oracle, and cloud platforms (AWS/Azure/GCP).

Job Summary:

The intern will play a crucial role in supporting database operations.

Key Responsibilities:

Technical Skill Requirements:

Proficient in programming languages such as Java and Python

Familiar with database management systems such as MS-SQL and Oracle

Experience in software development and testing

Knowledge of reporting tools such as OBIEE, Tableau, or Power BI

Knowledge of any cloud platform (AWS/Azure/GCP)

Soft Skills (Good to Have):

Excellent problem-solving skills

Strong communication and interpersonal skills

Ability to work in a team environment

Capacity to learn quickly and adapt to new situations

Self-motivated and able to work independently

Qualifications:

  • Pursuing a degree in Computer Science, Information Technology, or a related field.

Company

Pearson

Job Posted

8 months ago

Job Type

Full-time

Work Mode

On-site

Experience Level

0-2 Years

Category

Data & Analytics

Locations

Bengaluru, Karnataka, India

Qualification

Bachelor

Applicants

81 applicants

Related Jobs


Data Engineer

Capgemini

Bengaluru, Karnataka, India

Posted: 4 months ago

At Capgemini Engineering, the world leader in engineering services, we bring together a global team of engineers, scientists, and architects to help the world's most innovative companies unleash their potential. From autonomous cars to life-saving robots, our digital and software technology experts think outside the box as they provide unique R&D and engineering services across all industries. Join us for a career full of opportunities, where you can make a difference and where no two days are the same.

Job Description:

• Expert knowledge of Python.
• Expert knowledge of popular machine learning libraries and frameworks, such as TensorFlow, Keras, and scikit-learn.
• Proficient understanding and application of clustering algorithms (e.g., K-means, hierarchical clustering) for grouping similar data points.
• Expertise in classification algorithms (e.g., decision trees, support vector machines, random forests) for tasks such as image recognition.
• Natural language processing and recommendation systems.
• Proficiency in working with databases, both relational and non-relational (e.g., MySQL), with experience in designing database schemas and optimizing queries for efficient data retrieval.
• Strong knowledge of object-oriented analysis and design, multi-threading, multi-process handling, and memory management.
• Good knowledge of model evaluation metrics and techniques.
• Experience in deploying machine learning models to production environments.
• Currently working in an Agile Scrum team and proficient in using version control systems (e.g., Git) for collaborative development.

Primary Skills:

• Excellent Python coding.
• Excellent communication skills.
• Good at data modelling and at working with popular machine learning libraries and frameworks.


Data Engineer Intern

Aspire

Bengaluru, Karnataka, India

Posted: 3 months ago

Aspire is the leading all-in-one finance operating system for growing businesses in APAC. We are on a mission to reinvent business finance for a new generation of entrepreneurs and business owners, empowering startups and MSMEs to realise their full potential.

Founded in 2018, Aspire has raised over USD 300M across equity and debt from world-class investors. In 2023, we successfully closed an oversubscribed USD 100 million Series C equity round led by Sequoia Capital and Lightspeed Ventures, with participation from Tencent, PayPal Ventures, LGT Capital Partners, Picus Capital, and MassMutual Ventures. To power our solutions, we have partnered with some of the best companies in the world, such as Visa and Wise, and have helped more than 15,000 businesses using our suite of products. For two consecutive years, in 2022 and 2023, Aspire was awarded Best Employer of the Year and Startup of the Year by the Asia FinTech Awards, as well as LinkedIn's Top Startup in Singapore. In 2023, we also made it onto CB Insights' Top 100 Global Fintech List.

You will be amazed by the energy and experience of our team! Aspire serves as an environment for you to innovate and drive change with our team of ex-entrepreneurs, ex-founders, and high-achievers with international and diverse backgrounds. Are you a top talent who is passionate about entrepreneurship? Join our rapidly growing team to make an impact in the fintech space!

About the Team:

At Aspire, Finance plays a strategic role in driving our business forward. Our Finance team consists of a wide range of crucial functions across four verticals: (i) traditional finance, covering accounting, tax, and reporting; (ii) treasury, a core function for a fintech; (iii) data, including business intelligence, data analytics, and data engineering; and (iv) strategy and planning, which covers the strategic roadmap, long-term planning, fundraising, and FP&A.

Our Finance team engages in frequent strategic data analysis, research, and modeling to provide the best financial insights for critical business decisions, helping Aspire navigate the competitive landscape and capitalize on emerging opportunities. The team also plays a key part as the first user of our Aspire software, pioneering finance transformation and reengineering our internal processes to stay agile while maintaining internal controls in a hyper-fast-growing environment.

About the Role:

At Aspire, we pride ourselves on building an amazing team, and we are looking for a Data Engineering Intern to join our very talented lineup. As a Data Engineering Intern, you will help develop scalable processes and grow the company's operations. The right person takes initiative, likes to learn, and deals with integrity. This is a great opportunity for someone who wants to make a difference in their community alongside talented and motivated team members in a diverse, energetic workplace, at a company dedicated to your success, growth, and advancement.

Responsibilities include but are not limited to:

• Support the team in managing and optimizing data infrastructure on AWS, including the design, development, and maintenance of data pipelines using Python.
• Assist in integrating various data sources with tools like Airbyte and managing workflows using Apache Airflow.
• Collaborate with data analysts and stakeholders to ensure data quality, reliability, and trustworthiness.
• Participate in code reviews, troubleshooting, and resolving data pipeline issues, contributing to coding standards and best practices.
• Document data engineering processes and workflows, and perform manual and automated data quality checks.
• Actively contribute to the observability, testability, and optimization of data pipelines.
• Engage in cross-departmental projects to deliver data solutions aligned with business and technical requirements.
• Participate in team meetings, including weekly updates, brainstorming sessions, and daily stand-ups.

Minimum Qualifications:

• Currently pursuing a Bachelor's or Master's degree in Computer Science or a related field.
• Basic proficiency in Python, object-oriented programming, and SQL, with a solid understanding of relational databases.
• Strong analytical and problem-solving skills.
• Working proficiency in verbal and written English.

Preferred Qualifications:

• Experience with data extraction, ingestion, and transformation, and with good coding practices, including clean code, error handling, logging, and testing.
• Proven ability to work independently and collaboratively, with minimal supervision.
• Active participation in extracurricular technical activities such as content writing, hackathons, or open-source projects is a strong plus.
• Enthusiastic about teamwork, continuous learning, and contributing effectively to team outcomes.

What we offer:

• Uncapped flexible annual leave.
• Hybrid work arrangement.
• Training subsidy for your professional growth.
• Wellness benefit.
• Team bonding budget to foster collaboration and a sense of belonging.
• Flexibility to work from anywhere (for up to 90 days per annum).


Data Engineer (Tableau & SQL)

NTT DATA

Bengaluru, Karnataka, India

Posted: 4 months ago

Job Description

Req ID: 293309

NTT DATA Services strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer (Tableau & SQL) to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties:

• Design, develop, conduct unit testing for, and maintain complex Tableau dashboards.
• Provide technical expertise in the areas of architecture, design, and implementation.
• Work with team members to create useful reports and dashboards that provide insight and improve or automate processes.
• Strong proficiency with Tableau Online.
• Fix performance issues in Tableau dashboards.
• Resolve support tickets related to Tableau reports.
• Performance tuning and server management of the Tableau Server environment.
• Create and manage groups, workbooks and projects, database views, data sources, and data connections.
• Manage and support the migration of objects (dimensions, measures, hierarchies, reports, workbooks, dashboards) from development to production environments.
• Provide appropriate training and support to business users on the use of Tableau interactive web reports, Tableau Online, and Tableau Desktop.
• Partner with the business to design Tableau KPI scorecards and dashboards.
• Assist in the identification and migration of data sources to Tableau Server.

Minimum Skills Required:

• Strong proficiency with SQL and its variations among popular databases.
• Skilled at optimizing large, complicated SQL statements.
• Good communication skills.