Snowplow is the ideal platform for data teams who want to manage their data in real-time and in their own cloud. We collect, validate, enrich and load up to 5 billion events for our customers each day and help them on their data journey through our management console.
We are looking for an SRE to join our team. Experience in infrastructure automation and supporting high volume, highly available platforms on AWS or GCP is required. Any experience in doing this via the HashiCorp stack would be a bonus.
This is a super-interesting challenge. We're finding ways to deploy, tune, support and update extremely complex, distributed infrastructure centrally, and rapidly! We have a top team working on this that we want to grow.
There are tens of thousands of our open source data pipelines collecting events emitted from over half a million sites and apps worldwide.
At Snowplow, we have a long tradition of offering remote internships. Many of our full-time team interned with us previously. This year we're pleased to be able to offer two!
We're committed to making this a great experience for you. You'll very likely be contributing to our open source platforms too, which is a great way to showcase your work.
Please also take the opportunity to include a note with your application. We'd like to get a sense of what you're hoping to take from the internship so we can make sure we provide it.
We're currently hiring into the team developing our management console. Their mission is to improve customer onboarding and empower customers to derive more and more value from their Snowplow pipeline over time. We are building Scala services that expose pipeline metrics, configuration and controls to our React UI, putting them in the hands of our customers.
More information, and other open roles, can be found on our careers page: https://snowplowanalytics.com/company/careers/
Please note we're not hiring in the US for this particular role.