Job Details

Data Engineer (East Coast Remote)

2026-02-04 | Empassion | All cities, AK
Description:

Empassion is a Management Services Organization (MSO) focused on improving quality of care and reducing costs for an often-neglected advanced illness and end-of-life patient population, which represents 4 percent of the Medicare population but 25 percent of its costs. The impact runs deeper for families, who are left with minimal options and less time with their loved ones. Empassion enables increased access to tech-enabled proactive care while delivering superior outcomes for patients, their communities, the healthcare system, families, and society.

$145,000 - $160,000 a year

The Opportunity

Join our high-impact Data Analytics team to shape a modern, flexible analytics platform that powers Empassion's mission. As a Data Engineer, you'll collaborate with analysts and cross-functional partners (Growth, Product, Operations, and Finance) to turn complex data into actionable insights. Using tools like SQL, dbt, and Looker, you'll build pipelines, models, and dashboards that decode patient care journeys and amplify our value to partners. This is a chance to influence both internal strategy and external impact from day one.

What You'll Do

- Partner with teams across the business and external partners to understand data needs and deliver reliable pipelines and models that solve real problems.
- Build and maintain scalable ingestion and egress pipelines in Airflow and dbt Cloud, ensuring high-quality, automated data flows across cloud environments.
- Implement unit tests and monitoring to guarantee data integrity and reproducibility.
- Model, transform, and structure healthcare datasets into usable formats that power data science models and other reporting marts.
- Enhance and scale data models with SQL and dbt, ensuring precision and adaptability for new partnerships.
- Write Python code for Apache Airflow DAGs, components, and utilities that orchestrate and monitor data workflows. Build complex pipelines that enable flexible scheduling, conditional logic, and smooth integration across multiple data sources.

What You'll Bring

- 1-4 years in data engineering or analytics engineering, with a proven ability to build pipelines and scalable workflows.
- Strong SQL skills for querying large, complex datasets.
- Proficiency in Python for data engineering tasks (transformations, APIs, automation).
- Experience with cloud data warehouses and storage (GCP preferred: BigQuery, Cloud Storage, Composer; AWS/Azure equivalents acceptable).
- Hands-on experience with dbt or similar data modeling tools.
- Comfort working in collaborative dev/staging/prod environments, partnering with Product and Tech to safely test, launch, and anticipate the impact of new changes.
- Curiosity about operational workflows and a drive to partner with non-technical teams, ensuring data and reporting align with how the business actually runs. You're not just a spec-taker; you're part of the solution.
- A proactive, problem-solving mindset and the ability to thrive in fast-paced, iterative environments.
- Strong communication skills to collaborate with analysts, engineers, and business stakeholders.

Bonus Points

- Knowledge of healthcare data (claims, ADT feeds, eligibility files).
- Familiarity with Git/GitHub for version control.
- Early-stage startup experience (seed/Series A), especially at mission-driven companies.
- Experience building semantic layers and data models in Looker (LookML).

Ready to make a difference? If you're driven by data, healthcare, and impact, apply and let's talk!


Apply for this Job

Please use the APPLY HERE link below to view additional details and application instructions.

Apply Here
