Director of Data Architecture

HealthPlan Data Solutions

IT
Columbus, OH, USA
Posted on Mar 10, 2026

Position: Director of Data Architecture

Company: Rivera

Focus: Pharmacy Payment Integrity SaaS Platform

Reports To: Chief Information Officer (CIO)

Location: Columbus, OH (Not remote)

Role Overview

The Director of Data Architecture will serve as the foundational data leader for the Company’s pharmacy payment integrity platform. This is a deeply hands-on, player–coach role responsible for architecting, building, and operating the company’s core data infrastructure during a critical growth phase. This position is an in-office role in Columbus, Ohio.

The Director will ensure the platform can reliably ingest, normalize, and analyze large volumes of pharmacy claims and related data, with a strong emphasis on speed to market, correctness, explainability, and auditability. Reporting to the CIO and working closely with the Product, Clinical, and Application Development teams, this role enables early customer success, supports fraud, waste, and abuse (FWA) detection, and lays the groundwork for future scale.

Key Responsibilities

Data Architecture & Platform Engineering

  • Assume architectural responsibility for the company’s data platform, balancing rapid delivery with long-term scalability.
  • Own end-to-end data pipelines for pharmacy claims, eligibility, pricing, formulary, and reference data.
  • Make pragmatic architecture decisions, favoring a batch-first ingestion approach.
  • Maintain data models that support analytics, explainability, and audit requirements.

Technical Execution (Hands-On Leadership)

  • Write and review production code while setting patterns and standards for the team.
  • Personally lead complex pipeline design, performance tuning, cost optimization, and production incident response.
  • Establish lightweight but effective standards for:
      - ELT/ETL frameworks (e.g., Spark, dbt, Airflow)
      - Incremental and idempotent data processing
      - Data quality checks, versioning, and lineage

Support for Payment Integrity & Analytics

  • Enable data capabilities that support:
      - Rule-based payment integrity logic and early ML experimentation
      - Near-real-time and batch analytics for claims monitoring
      - Historical claims analysis to support monetary recoveries and client reporting
  • Partner closely with the Pharmacy Analytics and Development teams to support feature engineering, model experimentation, and production scoring workflows.

Security, Compliance & Governance

  • Implement practical, right-sized controls to meet HIPAA requirements and prepare for future SOC 2 and HITRUST audits.
  • Ensure secure handling of PHI through access controls, encryption, and data segmentation.
  • Build defensible data lineage and reproducibility to support client trust, audits, and dispute resolution.

Team Building & Leadership

  • Hire and mentor a small, high-impact data engineering team.
  • Establish clear technical expectations, coding standards, and onboarding practices.

Cross-Functional & Executive Collaboration

  • Work closely with Product to translate early customer needs into data capabilities that unlock revenue.
  • Advise the CIO on trade-offs between speed, cost, technical debt, and long-term platform evolution.
  • Clearly communicate technical decisions and data limitations to executives, customers, and auditors.

Required Experience & Qualifications

Experience

  • 10 to 15+ years of experience in data engineering, analytics platforms, or distributed data systems.
  • Prior experience as a senior technical leader (Director, Lead Data Architect, Data Systems Architect, or similar).
  • Demonstrated success building data platforms from scratch or very early scale.
  • Experience with healthcare claims data; pharmacy, PBM, payer, or payment integrity experience is strongly preferred.

Technical Expertise

  • Strong hands-on experience with:
      - SQL Server
      - Warehouse/lakehouse platforms: Snowflake, Databricks, BigQuery, or Redshift
      - Orchestration: Airflow or Dagster
  • Excellent SQL and data modeling skills for complex, regulated datasets.
  • Familiarity with supporting ML workflows (feature stores, batch scoring, monitoring).
  • Solid understanding of distributed systems and data reliability.

Education

  • Bachelor’s or Master’s degree in Computer Science or Engineering, or equivalent experience, required.