Freelance Data Engineer

This job is no longer open

The Motley Fool is looking for a highly skilled Freelance Data Engineer to join our team on an independent contract basis, working 30-40 hours per week for 6-12 months. This is a mid- to senior-level position and requires 4-5+ years of relevant experience.

Note: Though this role is 100% remote, candidates must reside in the United States for consideration. 

Who are we?

We are The Motley Fool, a purpose-driven financial information and services firm with nearly 30 years of experience focused on making the world smarter, happier, and richer. But what does that even mean?! It means we’re helping Fools (always with a capital “F”) demystify the world of finance, beat the stock market, and achieve personal wealth and happiness through our products and services.

The Motley Fool is firmly committed to diversity, inclusion, and equity. We are a motley group of overachievers who have built a culture of trust founded on Foolishness, fun, and a commitment to making the world smarter, happier, and richer. However you identify or whatever winding road has led you to us, please don't hesitate to apply if the description above leaves you thinking, "Hey! I could do that!"

What does this team do?

The Data Engineering team at The Motley Fool creates data pipelines to wrangle data from around the Fool. We collaborate with everyone, from third-party vendors to stakeholders, to build easily consumable data structures for reporting and business insights. Working closely with our business analysts and machine learning specialists, we serve the data needs of every team at The Motley Fool!

What would you do in this role?

As a Freelance Data Engineer, you will be responsible for expanding and optimizing our data and data pipeline architecture, as well as data flow and collection for cross-functional teams. You are an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. You will guide and support our software developers, database architects, data analysts, and data scientists on business initiatives while keeping the data delivery architecture consistent and optimal. Whether working on a solo project or with the team, you are self-directed and comfortable supporting the data needs of multiple teams, systems, and products.

But what would you actually do in this role?

  • Leverage data assets to meet mission needs, ensuring consistent data quality, establishing data standards and governance
  • Work in an agile, collaborative environment, partnering with client stakeholders, to develop and improve mission-based solutions
  • Monitor cloud-based systems and components for availability, performance, reliability, security and efficiency
  • Create and configure appropriate cloud resources to meet the needs of end users
  • Apply strong, proven problem-solving skills and critical/analytical thinking to deliver sustainable and creative solutions to complex requirements
  • Document topology, processes, and solution architecture as needed
  • Assist with the training and enablement of data consumers
  • Share your passion for staying on top of tech trends, experimenting with and learning new technologies

Required Experience:

  • Enterprise-level data modeling experience; proficiency in SQL, including multi-table joins, window functions, and indexing strategies
  • Experience developing in Python in the context of data ingestion via REST APIs, manipulation of native data types, and database connections
  • Experience with AWS services, including Lambda functions, EC2/ECS instances, S3, SQS, DynamoDB tables, and MWAA; familiarity with IAM roles and policies
  • Experience developing and deploying data pipelines with Apache Airflow; proficiency in base and third-party operators for complex DAGs (a minimal pipeline sketch follows this list)
  • Experience with Snowflake: setting up storage integrations, external stages, data shares, Snowpipes, and RBAC; setting up tasks using the Snowpark API (a setup sketch also follows this list)
  • Ability to work independently, deliver results, and drive projects with minimal supervision
  • Strong ability to communicate blockers and issues to management for escalation and timely resolution
  • Strong team player with a desire to learn new skills and broaden experience
  • Experience working with complex data sets
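
The Python, AWS, and Airflow items above typically come together in a single pipeline. Purely as an illustration (not a description of The Motley Fool's actual stack), here is a minimal sketch assuming Airflow 2.4+ with the TaskFlow API; the API endpoint, bucket, and key names are hypothetical placeholders.

```python
from datetime import datetime, timedelta
import json

import boto3
import requests
from airflow.decorators import dag, task

API_URL = "https://api.example.com/v1/orders"  # hypothetical REST endpoint
RAW_BUCKET = "example-raw-data"                # hypothetical S3 bucket


@dag(
    schedule="@hourly",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
)
def rest_to_s3():
    @task
    def extract(ds=None):
        # Pull one logical date's worth of records from the REST API.
        resp = requests.get(API_URL, params={"date": ds}, timeout=30)
        resp.raise_for_status()
        return resp.json()  # small payloads only; XCom is not meant for bulk data

    @task
    def load_to_s3(records, ds=None):
        # Land the raw payload in S3 for downstream loading (e.g., via Snowpipe).
        boto3.client("s3").put_object(
            Bucket=RAW_BUCKET,
            Key=f"orders/{ds}/payload.json",
            Body=json.dumps(records).encode("utf-8"),
        )

    load_to_s3(extract())


rest_to_s3()
```

In practice, provider-package operators (S3, SQS, Snowflake) would replace hand-rolled tasks wherever they fit.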
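On the Snowflake side, the storage integration, external stage, and Snowpipe mentioned above might be provisioned with DDL issued through the Python connector, roughly as sketched below. Every object name, bucket, and IAM role ARN is invented for illustration, and the target table is assumed to hold a single VARIANT column.

```python
import snowflake.connector

ddl = [
    # Storage integration delegating S3 access to a (hypothetical) IAM role.
    """
    CREATE STORAGE INTEGRATION IF NOT EXISTS raw_s3_int
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'S3'
      ENABLED = TRUE
      STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-raw-reader'
      STORAGE_ALLOWED_LOCATIONS = ('s3://example-raw-data/orders/')
    """,
    # External stage pointing at the bucket written by the Airflow sketch above.
    """
    CREATE STAGE IF NOT EXISTS raw.orders_stage
      URL = 's3://example-raw-data/orders/'
      STORAGE_INTEGRATION = raw_s3_int
      FILE_FORMAT = (TYPE = JSON)
    """,
    # Snowpipe that auto-ingests new files into a single-VARIANT-column table.
    """
    CREATE PIPE IF NOT EXISTS raw.orders_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw.orders_raw
      FROM @raw.orders_stage
      FILE_FORMAT = (TYPE = JSON)
    """,
]

# Connection details would come from your own config or secrets manager.
conn = snowflake.connector.connect(
    account="<account>", user="<user>", authenticator="externalbrowser"
)
try:
    cur = conn.cursor()
    for stmt in ddl:
        cur.execute(stmt)
finally:
    conn.close()
```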

Nice to Have:

  • Experience with DevOps
  • Experience with event tracking configuration in Google Analytics 4 (GA4) and analysis using BigQuery (a sample query sketch follows this list)
  • Experience with data migration projects, including refactoring and optimizing complex SQL logic
  • Experience working with Financial data
  • Experience investing and/or using The Motley Fool’s service offerings 
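
For the GA4/BigQuery item, analysis typically runs against the daily GA4 export tables. A small sketch using the google-cloud-bigquery client; the project and dataset IDs are made up, and credentials are assumed to come from the environment.

```python
from google.cloud import bigquery

client = bigquery.Client()  # picks up project and credentials from the environment

# Daily event and user counts from a (hypothetical) GA4 export dataset.
sql = """
    SELECT
      event_date,
      event_name,
      COUNT(*) AS events,
      COUNT(DISTINCT user_pseudo_id) AS users
    FROM `my-project.analytics_123456789.events_*`
    WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240107'
    GROUP BY event_date, event_name
    ORDER BY event_date, events DESC
"""

for row in client.query(sql).result():
    print(row.event_date, row.event_name, row.events, row.users)
```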