Job description
On your first day, we'll expect you to have:
At least 5 years of professional experience as a software engineer or data engineer
A BS in Computer Science or equivalent experience
Strong programming skills (some combination of Python, Java, and Scala)
Experience writing SQL, structuring data, and applying data storage best practices
Experience with data modeling
Knowledge of data warehousing concepts
Experience building data pipelines and microservices
Experience with Spark, Airflow, and related technologies for processing large volumes of streaming data
A willingness to accept failure, learn and try again
An open mind to try solutions that may seem impossible at first
Experience working with Amazon Web Services (in particular EMR, Kinesis, RDS, S3, SQS, and the like)
It's preferred, but not required, that you have:
Experience building self-service tooling and platforms
Designed and built Kappa architecture platforms
A passion for building and running continuous integration pipelines
Built pipelines using Databricks and are well versed in its APIs
Contributed to open source projects (e.g., operators in Airflow)