Data Engineer

Smart Summary (powered by Roshi)
Join a highly skilled team of innovators applying big data, AI, and ML to develop financial products. As a Data Engineer, you will work on ingesting and transforming large datasets for fraud detection, anomaly detection, and AI products using cutting-edge technology. Collaborate with ML and Software engineers to optimize pipelines, develop microservices, and build event-driven systems. Utilize Spark, Kafka, and explore new tech options to process and store data efficiently. We offer growth opportunities, an open work environment, and benefits.

JOB DESCRIPTION

Who are we?

We are a highly skilled team of innovators. We apply big data, artificial intelligence, and machine learning to bring the next generation of financial products and services to global markets. https://pi.paytm.com/

What is Pi?

Pi is the only fraud and risk management platform that orchestrates data from the entire customer journey, fighting fraud more effectively with configurable risk models in a single, easy-to-use platform.

Company Overview:

We are a leading provider of cutting-edge software solutions, specializing in fraud risk management. Our innovative SaaS platform helps businesses mitigate fraud risks, protect sensitive data, and maintain trust with their customers.

Position Summary:

If working with billions of events, petabytes of data, and optimizing for the last millisecond is something that excites you, then read on! We are looking for Data Engineers who have seen their fair share of messy data sets and have been able to structure them for fraud detection and prevention, anomaly detection, and other AI products.

You will be working on writing frameworks for real-time and batch pipelines to ingest and transform events from hundreds of applications every day. These events will be consumed by both machines and people. Our ML and Software engineers consume these events to build new models and optimize existing ones to detect and fight new fraud patterns. You will also help optimize the feature pipelines for fast execution and work with software engineers to build event-driven microservices.

You will get to put cutting-edge tech into production, with the freedom to experiment with new frameworks, try new ways to optimize, and the resources to build the next big thing in fintech using data!

What does this include?
  • Work directly with the Platform Engineering Team to create reusable experimental and production data pipelines and centralize the data store.
  • Understand, tune, and master the processing engines used day-to-day.
  • Keep the data whole, safe, and flowing with expertise on high-volume data ingest and streaming platforms (like Spark Streaming, Kafka, etc.).
  • Make the data available for online and offline consumption by machines and humans.
  • Maintain and optimize underlying storage systems to perform according to the set SLAs.
  • Shepherd and shape the data by developing efficient structures and schemas for the data in storage and transit.
  • Explore new technology options for data processing and storage, and share them with the team.
  • Develop tools and contribute to open source wherever possible.
  • Adopt problem-solving as a way of life – always go to the root cause.
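To give a flavour of the transforms these pipelines perform, here is a minimal, hypothetical sketch (pure Python, no Spark or Kafka dependency) of a windowed per-user event count, the kind of feature a real-time fraud pipeline might compute; all names and fields here are illustrative and not part of this posting.

```python
import json
from collections import defaultdict

def window_counts(raw_events, window_secs=60):
    """Count events per (user, tumbling window) -- a toy stand-in for a
    feature that a Spark/Kafka fraud pipeline would compute at scale."""
    counts = defaultdict(int)
    for line in raw_events:
        event = json.loads(line)               # ingest: one JSON event per record
        bucket = event["ts"] // window_secs    # tumbling-window assignment
        counts[(event["user"], bucket)] += 1   # transform: per-user count feature
    return dict(counts)

events = [
    '{"user": "a", "ts": 10}',
    '{"user": "a", "ts": 50}',
    '{"user": "b", "ts": 70}',
]
print(window_counts(events))  # {('a', 0): 2, ('b', 1): 1}
```

In production the same shape of logic would run inside a streaming engine (e.g. Spark Structured Streaming reading from Kafka) with event-time windows and state stores rather than an in-memory dict.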
Requirements:
  • Degree in Computer Science, Engineering, or a related field
  • You have previously worked on building serious data pipelines, ingesting and transforming more than 10^6 events per minute and terabytes of data per day.
  • You are passionate about producing clean, maintainable, and testable code as part of a real-time data pipeline.
  • You understand how microservices work and are familiar with concepts of data modeling.
  • You can connect different services and processes together even if you have not worked with them before and follow the flow of data through various pipelines to debug data issues.
  • You have worked with Spark and Kafka before and have experimented with or heard about Flink/Spark Streaming/Kafka Streams and understand when to use one over the other.
  • You have experience implementing offline and online data processing flows and understand how to choose and optimize underlying storage technologies. You have worked or experimented with Cassandra/DynamoDB/Druid/Ignite/Presto/Athena.
  • On a bad day, maintaining ZooKeeper and bringing up a cluster doesn’t bother you.
  • You may not be a networking expert, but you understand the issues with ingesting data from applications spread across multiple data centers, geographies, on-premises, and cloud environments, and will find a way to solve them.
  • Proficient in Java/Scala/Python/Spark
Why join us?
  • For the 5th year in a row, we are proud to announce that we have been certified as a ‘Great Place to Work’
  • We are an open work environment that fosters collaboration, ownership, creativity, and urgency
  • Flexibility with work schedule
  • Enrolment in the Group Health Benefits plan from day 1, no waiting period
  • Team building events on and off-site
  • Fuel for the day: weekly delivery of groceries, and all types of snacks
  • Catered lunches and desserts on a monthly basis
  • Daily fun in the office with our competitive games of Ping Pong, Chess, FIFA, and more
  • And of course, an unlimited amount of freshly made coffee! We’re pretty serious about our coffee beans

“We pooled our knowledge of the space and our world-class engineering talent to produce Pi. It’s everything we wanted in an FRM, saving us hundreds of millions of dollars.”   - Harinder Takhar, CEO

Go Big or Go Home!

Paytm Labs believes in diversity and equal opportunity, and we will not tolerate any form of discrimination or harassment. Our people are critical to our success, and we know the more inclusive we are, the better our work will be.

We thank all applicants; however, only those selected for an interview will be contacted.

Paytm Labs is committed to meeting the accessibility needs of all individuals in accordance with the Accessibility for Ontarians with Disabilities Act (AODA) and the Ontario Human Rights Code (OHRC). Should you require accommodations during the recruitment and selection process, please let us know. 


Company: Paytm
Job Posted: a year ago
Job Type: Full-time
Work Mode: On-site
Experience Level: 3-7 Years
Category: Software Engineering
Location: Toronto, Ontario, Canada
Qualification: Bachelor's or Master's
