Analytical Data Engineer
The Motley Fool is looking for a highly skilled Freelance Analytical Data Engineer with a strong focus on data warehousing to join our team on an independent contract basis, 40 hours per week for at least 12 months. This is a mid- to senior-level position and requires 4-5+ years of relevant experience. The role is flexible and 100% remote, but candidates MUST reside in the United States to be eligible for consideration.
Who are we?
We are The Motley Fool, a purpose-driven financial information and services firm with nearly 30 years of experience focused on making the world smarter, happier, and richer. But what does that even mean?! It means we’re helping Fools (always with a capital “F”) demystify the world of finance, beat the stock market, and achieve personal wealth and happiness through our products and services.
The Motley Fool is firmly committed to diversity, inclusion, and equity. We are a motley group of overachievers who have built a culture of trust founded on Foolishness, fun, and a commitment to making the world smarter, happier, and richer. However you identify or whatever winding road has led you to us, please don't hesitate to apply if the description above leaves you thinking, 'Hey! I could do that!'
What does this team do?
The Data Engineering team at The Motley Fool creates data pipelines to wrangle data from around the Fool. We collaborate with everyone, from third-party vendors to internal stakeholders, to build easily consumable data structures for reporting and business insights. Working closely with our business analysts and machine learning specialists, we serve the data needs of all Motley Fool teams!
What would you do in this role?
As a Freelance Data Engineer, you will be responsible for expanding and optimizing our data pipeline architecture, as well as the data flow and collection that serve cross-functional teams. You are an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The role involves collecting and integrating raw data from various vendors, transforming it into clean, structured data sets, and building a scalable, maintainable data mart. The ideal candidate will ensure the integrity and usability of data across downstream processes, working closely with stakeholders so that data models align with business needs and support future growth. Strong technical skills and experience in data architecture are essential.
But what would you actually do in this role?
Build data warehouses and data pipelines in Snowflake
Query and analyze large datasets in Snowflake using SQL
Debug and resolve data issues
Develop serverless data processing workflows using custom operators within Airflow
Leverage data assets to meet business needs, ensuring consistent data quality and establishing data standards and governance
Work in an agile, collaborative environment, partnering with stakeholders to develop and improve data solutions
Monitor cloud-based systems and components for availability, performance, reliability, security, and efficiency
Create and configure appropriate cloud resources to meet the needs of end users
As needed, document topology, processes, and solution architecture
Assist with the training and enablement of data consumers
Share your passion for staying on top of tech trends, experimenting with and learning new technologies
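To make the day-to-day work above concrete, here is a hypothetical sketch of a typical transformation step: turning raw vendor records into clean, typed rows. The field names and values are invented for illustration; real pipelines in this role would run in Snowflake and Airflow rather than plain Python.

```python
import json
from datetime import datetime, timezone

def normalize_record(raw: dict) -> dict:
    """Coerce one raw vendor record into a clean, typed row."""
    return {
        "user_id": int(raw["user_id"]),                      # vendor sends IDs as strings
        "email": raw.get("email", "").strip().lower() or None,
        "amount_usd": round(float(raw.get("amount", 0)), 2), # normalize to 2 decimal places
        "event_ts": datetime.fromtimestamp(                  # epoch seconds -> ISO-8601 UTC
            int(raw["ts"]), tz=timezone.utc
        ).isoformat(),
    }

# Simulated raw payload as it might arrive from a vendor API
raw_payload = json.dumps([
    {"user_id": "42", "email": "  Fool@Example.COM ", "amount": "19.999", "ts": 1700000000},
])
clean_rows = [normalize_record(r) for r in json.loads(raw_payload)]
```

Rows shaped this way load cleanly into a typed warehouse table and keep downstream reporting consistent.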
Required Experience:
Enterprise-level data warehousing experience - specifically within Snowflake.
Proficiency in SQL, including multi-table joins, window functions, and indexing strategies
Experience debugging data issues
Experience developing in Python in the context of data ingestion via REST APIs, manipulation of native data types, and database connections
Experience with AWS services, including Lambda functions, EC2/ECS instances, S3, SQS, and DynamoDB tables; familiarity with IAM roles and policies
Experience with development and deployment of data pipelines using Airflow; proficiency in base and third-party operators for complex DAGs.
Experience in Snowflake setting up storage integrations, external stages, data shares, Snowpipes, and RBAC; experience setting up tasks using the Snowpark API
Ability to work independently, deliver results, and drive projects with minimal supervision
Strong ability to communicate blockers and issues to management for escalation and timely resolution
Strong team player, with desire to learn new skills and broaden experience
Experience working with complex data sets
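As an illustration of the "window functions" requirement above (a hypothetical example, not part of the job description): a SQL running total such as `SUM(spend) OVER (PARTITION BY member ORDER BY day)` can be sketched in plain Python, which is a useful mental model when debugging windowed query results.

```python
from itertools import groupby
from operator import itemgetter

def running_total(rows, partition_key, order_key, value_key):
    """Python analogue of SQL: SUM(value) OVER (PARTITION BY part ORDER BY ord)."""
    out = []
    # Sort by partition, then ordering column, exactly as the window frame would
    rows = sorted(rows, key=itemgetter(partition_key, order_key))
    for _, group in groupby(rows, key=itemgetter(partition_key)):
        total = 0.0  # the accumulator resets at each partition boundary
        for row in group:
            total += row[value_key]
            out.append({**row, "running_total": total})
    return out

# Invented sample data: member spend per day
orders = [
    {"member": "a", "day": 2, "spend": 30.0},
    {"member": "a", "day": 1, "spend": 10.0},
    {"member": "b", "day": 1, "spend": 5.0},
]
result = running_total(orders, "member", "day", "spend")
```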
Nice to Have:
Experience with event tracking configuration in Google GA4 and analysis using BigQuery
Experience with data migration projects, including refactoring and optimizing complex SQL logic
Experience working with financial data
Experience investing and/or using The Motley Fool’s service offerings
Experience leveraging data quality tools to proactively address data discrepancies
Experience with DevOps, developing infrastructure-as-code via Terraform and/or CloudFormation