Flowcode

New York
51-200 employees
We connect the real world to the digital world, instantly and magically with Flowcode and Flowpage.

Data Engineer

Flowcode is the offline-to-online company, building direct connections for brands and consumers. By unifying data-driven design with the latest in QR technology, Flowcode enables contactless connection with speed, security, and ease. Privacy-compliant, ultra-fast to scan, and designed with intention, Flowcode is the number one trusted QR provider.

Founded by the former CEO of AOL and President of Google Americas, we are a team of large-company executives, startup founders, engineers, scientists, artists, and designers who are all data-obsessed. We are always looking to increase our potential as a company, and we are focused on building a powerfully diverse workforce, not just because it is the right thing to do, but because it expands the potential of our team exponentially.

About the role

Flowcode is seeking an experienced Data Engineer to join our dynamic team. In this role, you will design, build, and maintain scalable data pipelines that empower our data-driven decision-making. You will work with a modern tech stack, including Snowflake, Python, AWS, Kafka, Docker, Kubernetes, and Airflow, to optimize our data infrastructure and ensure efficient, reliable data flow across the organization. Beyond hands-on technical work, you will collaborate with cross-functional teams and contribute innovative solutions and best practices to continuously improve our data processes and tools.

Responsibilities:

  • Data Pipeline Development: Design, develop, and maintain data pipelines for ingesting, processing, and transforming large datasets using Python.
  • Data Warehousing: Leverage Snowflake to build and manage robust data warehouses, ensuring efficient storage and retrieval of data.
  • ETL Processes: Implement ETL workflows using Python, Fivetran, and Amazon DMS, ensuring data integrity, quality, and timely delivery.
  • API Integration: Develop and integrate APIs to facilitate seamless data exchange between internal systems and external data sources. Work with RESTful or GraphQL APIs to ensure reliable data ingestion.
  • Streaming Data Ingestion: Implement data ingestion solutions from streaming platforms, including Kafka, to handle real-time data processing.
  • Orchestration & Scheduling: Use Apache Airflow to schedule and monitor data workflows, ensuring consistent and reliable pipeline execution (see the brief sketch after this list).
  • Containerization & Orchestration: Deploy and manage applications using Docker and Kubernetes to create scalable, containerized solutions.
  • Optimization & Troubleshooting: Continuously monitor, optimize, and troubleshoot data processes and infrastructure for performance improvements and reliability.
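
As a rough illustration of the kind of orchestration work described above, here is a minimal sketch of a daily Airflow DAG with one Python task that copies staged files into a Snowflake table. This is not Flowcode's actual pipeline; the connection details, stage, and table names are placeholders.

    # Minimal sketch: a daily Airflow DAG that loads newly staged files into
    # a Snowflake table. Account, warehouse, database, schema, stage, and
    # table names are placeholders for illustration only.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    import snowflake.connector


    def load_staged_events():
        # Assumes a Snowflake stage "events_stage" and table "raw_events" already exist.
        conn = snowflake.connector.connect(
            account="my_account",
            user="etl_user",
            password="***",
            warehouse="ETL_WH",
            database="ANALYTICS",
            schema="RAW",
        )
        try:
            conn.cursor().execute("COPY INTO raw_events FROM @events_stage")
        finally:
            conn.close()


    with DAG(
        dag_id="raw_events_daily_load",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="load_staged_events",
            python_callable=load_staged_events,
        )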

Qualifications:

  • Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field, or equivalent practical experience in data engineering.
  • Experience: 3+ years of experience in data engineering or software development, with a strong background in building data pipelines and working with modern data architectures.
  • Technical Skills:
    • Programming: Proficient in Python.
    • Data Platforms: Hands-on experience with Snowflake, Fivetran, and Amazon DMS.
    • APIs & Streaming: Experience with RESTful/GraphQL APIs and Kafka.
    • Orchestration: Skilled with Apache Airflow, Docker, and Kubernetes.
  • Analytical Skills: Strong problem-solving abilities, with attention to detail and a focus on data quality.
  • Communication: Excellent verbal and written communication skills, with the ability to explain complex technical concepts to non-technical stakeholders.
  • Team Collaboration: Proven experience working collaboratively in agile, cross-functional teams.
  • Experience with additional programming languages such as Java or Scala.
  • Knowledge of additional data integration or processing tools.
  • Familiarity with cloud platforms (AWS, Google Cloud, or Azure) and their respective data services.
  • Big Data Processing: Experience processing large datasets using technologies like PySpark.

Why Flowcode?

  • Impact: At Flowcode, your work will directly impact the way millions of people experience the real world.
  • Innovation: Be part of a visionary company founded by leaders who have built some of the world’s most successful tech companies.
  • Growth: Join a rapidly scaling company with the opportunity to define the future of user experience in a nascent space.
  • Location: Work where you are most productive, whether that's remote or in our office in the heart of SoHo, NYC. Either way, you'll collaborate with some of the best minds in tech and design.
  • Culture: Our culture is unique and powerful. We are a high-energy, high-output team that gets 1% better every day by being consumed with our customers, data-obsessed, operationally rigorous, ruthless on ROI, relentless about simplifying complexity, and committed to moving with speed and intention. We win and lose as a team. Period. We don't go through the motions, ever.

Compensation

A successful candidate’s starting pay will be determined based on the role, job-related skills, experience, qualifications, work location, and market conditions. The current range for this role is $170,000 - $180,000 OTE plus equity. Flowcode also offers a comprehensive benefits package to all employees.

Ready to Join?

If you're ready to lead the charge at Flowcode and have the passion, drive, and skills to shape the future of offline-to-online experiences, we'd love to hear from you.
