Data Engineer

This job is no longer open

What's an average day like?

This is a net new position for Pliancy within the Data & Business Enablement team. You will work with our Lead Analytics Engineer to help design, implement, and operate our expanding data lakehouse and analytical reporting environment. You'll create data structures that make reporting and KPI tracking both possible and easier. In collaboration with Analytics Engineering, you'll identify internal and external data sources and use them to design and build table structures, data products, and ELT pipelines, utilizing tools such as Dataplex, Dataform, Airtable, and the rest of the GCP data stack.


Who are we looking for?

We need someone with advanced SQL skills and expertise in building cloud-based data pipelines. Experience with GCP, Dataform, Airbyte, and Looker is a plus. As a Data Engineer, you'll need to know our platform through and through and be able to solve challenges with data storage, processing, and access. You're fanatical about writing maintainable code that follows our coding conventions across various data integration platforms.

You're a team player who enjoys interacting with the individuals you support. You see the people you work with as connections to build, not problems to solve, and as a result you earn their trust. Few things make you happier at the end of the day than helping your team achieve a shared objective. You like addressing business challenges with new technology and typically lead the way in implementing it.

Why work with us?
  • Clients solving challenging problems with meaningful purpose
  • Top-of-their-game peers who have fun with what they do and take teamwork seriously
  • Rapid company growth (75% year over year), with the opportunity to see your efforts make an impact across the company at massive scale
  • Flexible schedule designed to empower your communication and time management skills
  • A great culture driven by empathetic people
  • Benefits and perks built to meaningfully support you and your family while we grow together

About us

Pliancy is fundamentally changing the way businesses value technology by empowering the next evolution of IT leadership. We provide white-glove consulting solutions to life science and finance organizations. Our employees and clients find that we’re starkly different from other IT organizations because we challenge the status quo in two major ways: by putting people first in every decision we make, and by innovating towards simplicity and sustainability. Whether streamlining a client’s race to a cancer cure or securing the fine details of data integrity, we’re driven to help people.


And as a people-first company that invests in the long-term success of our employees, we’re looking for creative thinkers who like to solve interesting problems. We have a culture of mentorship, and prioritize curiosity and empathy in our hiring decisions.


Responsibilities:

  • Build and maintain the company's core enterprise data lakehouse to support reporting, analysis, dimensional modeling, and data development across all Pliancy teams.
  • Integrate additional data sources to support subject-matter, activity, and process analysis.
  • Provide self-service data features that help everyone make use of data and analytics.
  • Contribute to the development and promotion of data quality procedures and initiatives for Pliancy data systems.
  • Deliver customized data services such as data modeling, data quality, and data integration.
  • Create and manage data pipelines between internal databases and SaaS apps.
  • Create and keep architecture and system documentation up to date.
  • Write maintainable, performant code.
  • Bring a DevOps mindset to everything you do.
  • Plan and carry out system expansion as needed to meet the company's growth and analytics requirements.
  • Collaborate with Analytics Engineers to make their work more efficient.
  • Collaborate with other departments to ensure that data requirements are met.

(these are recommendations, not requirements - we review all applicants)

Should have

  • Soft skills (personality, relationship-building, communication, stress management)
  • 2+ years of hands-on experience delivering high-quality code
  • Professional data processing expertise using Python, Java, or Scala (Python preferred)
  • Thorough mastery of SQL and analytical data warehousing (BigQuery preferred)
  • Hands-on experience implementing ELT best practices at scale
  • Hands-on familiarity with data pipeline tools (Dataflow/dbt, Airbyte/Singer, etc.)
  • Experience with data sources such as Salesforce, Zendesk, and MDM systems, and with ingesting data through SaaS application APIs
  • Data modeling and data structure design knowledge
  • Hands-on expertise with Google Cloud Platform (GCP) technologies such as BigQuery, Dataflow, Firebase/Firestore, Google Kubernetes Engine, and Looker

Nice to have
  • Prior experience in DevOps, Networking, InfoSec, and/or IT Consulting
  • Looker LookML Developer Certification, Google Data Engineering Certification, and/or Google Professional Cloud DevOps Engineer Certification

Benefits

  • Generous salary, above-average pay ($100-120k DOE)
  • “Cadillac” healthcare: Anthem Blue Cross Gold Plan (Premiums 100% covered for employees, 50% for dependents)
  • Medical HRA: Company-funded reimbursement account to help cover copays, deductibles, and coinsurance
  • Dental and vision coverage
  • 401K + 6% company matching (available from your first day)
  • Unlimited vacation policy
  • Paid leave for new parents
  • Wellness reimbursement up to $100 per month
  • Cell phone + Home internet reimbursement
  • Commuter benefits
  • Fully remote, hybrid, or in one of our offices. Whatever works for you.

As of September 27, 2021, Pliancy requires all employees to show proof of full COVID-19 vaccination before being admitted to any of our physical offices.
