Applied data scientist & analytics engineer.
6+ years building attribution models, audience LTV frameworks,
and measurement systems that inform spend and release decisions.
Applied data science and analytics professional with 6+ years building measurement systems, predictive models, and attribution frameworks across streaming media, audience analytics, and enterprise data.
My work spans designing multi-platform attribution dashboards and audience LTV models for music clients in Nashville, leading cross-functional engineering at Bosch, and deploying custom data integrity systems at the Brooklyn DA's Office.
I bridge raw data and real decisions: modeling attribution, forecasting audience value, and surfacing the signal that drives better spend and release choices.
Multi-platform streaming attribution system — isolating per-DSP performance contribution, modeling audience retention cohorts across releases, and generating the decision-layer data used to allocate promotional spend and time releases. Live demo below tracks "Wish You Back" by Clayton Johnson across its full 108-week run.
Music clients had no unified attribution view of their data. Streaming numbers lived in Spotify for Artists, revenue data in distributor portals, audience data in social tools — none of it connected.
As a result, promotional budget was allocated on anecdote rather than platform-level contribution data. Labels needed a single source of truth to answer three questions: which campaigns drove streams, which DSP is growing the highest-value audience, and where to put spend in the next release cycle.
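The core of that unification step can be sketched in a few lines: pull each platform's export into a common shape, stack them, and compute each DSP's weekly share of total streams. This is a minimal illustration with made-up numbers; the platform names, column names, and share calculation are assumptions, not the production schema.

```python
# Minimal sketch: unify per-DSP weekly stream exports into one
# attribution table. Platforms, columns, and values are illustrative.
import pandas as pd

spotify = pd.DataFrame({"week": [1, 2], "streams": [1200, 1500]})
apple = pd.DataFrame({"week": [1, 2], "streams": [400, 450]})

frames = {"spotify": spotify, "apple_music": apple}

# Tag each export with its source platform, then stack into one table.
unified = pd.concat(
    [df.assign(dsp=name) for name, df in frames.items()],
    ignore_index=True,
)

# Per-DSP share of total streams each week: the "contribution" view
# that lets spend be compared against platform-level performance.
unified["weekly_share"] = (
    unified["streams"] / unified.groupby("week")["streams"].transform("sum")
)

print(unified)
```

The real system layers revenue and social data on top of the same spine, but the pattern is the same: normalize, stack, and derive contribution metrics in one place.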
Automated data integrity and audit-trail system for the Brooklyn DA's Special Operations unit — replacing a manual evidence-tagging workflow with a fully logged, schema-enforced system. Demonstrates the same data provenance and lineage design principles that underpin reliable attribution infrastructure.
Prosecutors were manually labeling hundreds of exhibits per trial using handwritten tags and spreadsheets. The process was slow, inconsistent, and legally risky — any gap in data provenance can compromise an entire case.
The unit needed a system that enforced schema integrity, automated audit logging, and maintained full data lineage from retrieval through courtroom presentation — the same class of problems that make attribution infrastructure reliable in any data-critical environment.
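The same class of problem can be illustrated with a small sketch: enforce a schema before a record enters the system, then append it to a hash-chained audit log so any later tampering is detectable. The field names and the chaining scheme here are illustrative assumptions; the actual DA's Office system is not public.

```python
# Hypothetical sketch of schema enforcement plus an append-only,
# hash-chained audit trail. Field names are illustrative.
import hashlib
import json
from datetime import datetime, timezone

REQUIRED_FIELDS = {"exhibit_id": str, "custodian": str, "source": str}

audit_log = []  # append-only; each entry chains to the previous hash

def log_exhibit(record: dict) -> dict:
    # Schema enforcement: reject records with missing or mistyped
    # fields before they ever enter the evidence table.
    for field, ftype in REQUIRED_FIELDS.items():
        if not isinstance(record.get(field), ftype):
            raise ValueError(f"schema violation: {field!r} must be {ftype.__name__}")

    prev_hash = audit_log[-1]["entry_hash"] if audit_log else "genesis"
    payload = json.dumps(record, sort_keys=True)
    entry = {
        "logged_at": datetime.now(timezone.utc).isoformat(),
        "record": record,
        "prev_hash": prev_hash,
        # Hashing each entry together with the previous entry's hash
        # makes any later edit to the trail detectable.
        "entry_hash": hashlib.sha256((prev_hash + payload).encode()).hexdigest(),
    }
    audit_log.append(entry)
    return entry

log_exhibit({"exhibit_id": "EX-001", "custodian": "Det. Smith", "source": "bodycam"})
```

This is exactly the lineage discipline attribution pipelines need: every record carries its provenance, and the trail itself is verifiable.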
A custom-engineered rover designed and built from scratch to retrieve surveillance footage from physically inaccessible or hazardous locations during active field evidence operations.
Evidence retrieval near active crime scenes often required officers or technicians to physically enter unsafe or inaccessible areas — narrow crawl spaces, compromised structures, or active scene perimeters.
Standard equipment couldn't fit or couldn't be safely operated. The unit needed a remote-operated solution that could navigate tight spaces, capture footage, and extract device data without human exposure.
Open-source collection of attribution models, audience segmentation pipelines, and predictive analytics frameworks — built on real-world streaming and media schemas, with documented methodology for LTV modeling, multi-source data integration, and automated reporting.
Built to demonstrate applied analytics methodology on real-world problems (multi-platform attribution, audience LTV cohort modeling, and decision-layer dashboard design) using real-world schemas rather than synthetic or classroom data.
Each project demonstrates a specific measurement discipline: attribution pipeline design, LTV model deployment, audience segmentation, or warehouse architecture built for analytical querying — not just notebooks.
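The LTV cohort discipline mentioned above can be sketched in miniature: group listeners by the month of their first stream, measure activity by period since joining, and accumulate value per cohort at an assumed payout rate. The toy events table and the $0.004-per-stream rate are illustrative assumptions, not the repository's actual data or rates.

```python
# Hypothetical sketch of cohort-based audience LTV: cohort listeners
# by first-stream month, then accumulate value per cohort over time.
import pandas as pd

events = pd.DataFrame({
    "listener": ["a", "a", "b", "b", "c"],
    "month":    [0,   1,   0,   2,   1],
    "streams":  [10,  6,   8,   4,   12],
})

# Cohort = the month of each listener's first stream; period = months
# elapsed since that listener joined their cohort.
events["cohort"] = events.groupby("listener")["month"].transform("min")
events["period"] = events["month"] - events["cohort"]

PAYOUT_PER_STREAM = 0.004  # assumed flat rate, for illustration only

# Streams per cohort per period, valued and accumulated: each row is
# a cohort's cumulative value through that period.
ltv = (
    events.groupby(["cohort", "period"])["streams"].sum()
    .mul(PAYOUT_PER_STREAM)
    .groupby(level="cohort").cumsum()
)
print(ltv)
```

In practice the same cumulative-value-by-cohort curve is what gets projected forward to estimate which platforms and releases are growing the highest-value audience.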
Whether you're building out marketing measurement capabilities, working on attribution modeling or LTV frameworks, or need applied analytics that connects data to real spend and release decisions — reach out.