Data Engineer - Podcast
The “Podcast Mission” is the product and technology team at Spotify responsible for building the best-in-class tools for podcast creators. We own the platforms, tools, and features that deliver a great creator experience, allowing podcasters to host, create, and distribute their episodes while making money from their art and growing and engaging with their audience.
Our team plays a pivotal role in Podcast Mission’s data ecosystem. We develop and maintain data endpoints that empower creator tools and analytics, providing access to critical data and functionality. Additionally, we optimize data delivery by implementing efficient data serving infrastructure.
We are looking for a highly skilled Data Engineer to join our dynamic team and contribute to the development of a robust data platform.
What You'll Do
Design, develop, and maintain batch and real-time pipelines using data processing frameworks such as Scio and Apache Beam on Google Cloud Platform.
Deliver high-quality code that is scalable, testable, and maintainable.
Collaborate with diverse, cross-functional teams to define data requirements and translate them into actionable solutions.
Implement best practices for data quality, security and governance.
Continuously learn new tools and practices.
Support and learn from engineers in your domain and across the organization.
Who You Are
You have data engineering experience and know how to work with heterogeneous data, preferably with distributed systems such as Hadoop, Bigtable, Cassandra, or DynamoDB.
You have experience building data pipelines using Java and/or Scala.
You have experience with orchestration tools like Airflow or Luigi and storage technologies like Snowflake and BigQuery.
You understand data modeling concepts, data access patterns, and various data storage technologies.
You value agile methodologies and incremental delivery.
You value clear documentation and possess strong data debugging skills.
You value teamwork and effective collaboration.
Where You'll Be
We offer you the flexibility to work where you work best! For this role, you can be located anywhere within the Eastern (EST) Time Zone, as long as we have a work location there.
The United States base salary range for this position is $122,716 - $175,308, plus equity. The benefits available for this position include health insurance, six-month paid parental leave, a 401(k) retirement plan, a monthly meal allowance, 23 paid days off, and 13 paid flexible holidays. These ranges may be modified in the future.