The Reporting Renaissance
engagement metrics design
data product leadership
cycle time compression
I was brought in to define analytics requirements for a high-stakes drug launch, but the organization lacked the capacity to build reporting infrastructure on its new Salesforce Health Cloud foundation. My role expanded into building the organization's first self-serve dashboard, replacing manual reports that lagged by a month and cutting reporting cycle time by 85%.
The Problem
The client was in the final stretch of an accelerated product launch during the COVID-19 pandemic while simultaneously implementing Salesforce Health Cloud for the first time. A late organizational change had forced the team to abandon their original global strategy and rush to assemble a US-only patient support program. Leadership wanted to collect real-time, self-reported patient data to personalize support and drive strategic decisions, but this was the organization's first initiative of its kind: there were no established systems, no benchmarks, no data pipelines, and no data governance processes.

I was asked to define KPIs for post-launch tracking, but there was no clear path to actually accessing or analyzing the data the program would collect. The brand team relied entirely on manual reports with a one-month lag, and the five-person offshore engineering team had no product leadership to translate business needs into technical requirements.
The Solution
I started by creating a dashboard mockup in PowerPoint to make the abstract concept concrete, visualizing how we would track weekly performance from acquisition through conversion to engagement. This mockup became a critical discussion tool: it sparked cross-functional conversations about which metrics mattered to the business, secured approval from Privacy and Compliance to build analytical dashboards, and established agreement on what data granularity could be stored and used.

With stakeholder alignment secured, I stepped into the role of de facto data product manager: writing user stories, building the data dictionary, validating data ingestion through user acceptance testing (UAT), and transitioning the offshore development team to working in sprints to accelerate delivery. I led the team to build the brand's first self-serve analytics dashboard with weekly refreshes, then created a real-time dashboard for the sales team to track healthcare provider campaign performance at the rep level, enabling sales leadership to give reps direct feedback on progress toward incentive goals during active campaigns.

Because all data was synthetic before launch, I also became the domain expert the team needed during UAT, educating engineers on what realistic patient data would look like across dozens of fields.
As part of this project, I also secured buy-in from Legal, Privacy, and Compliance; that effort is detailed in "Building New Organizational Capability."
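To make the weekly acquisition-to-conversion-to-engagement view described above more concrete, here is a minimal, hypothetical sketch in pandas. The real dashboard ran on Salesforce Health Cloud data; every column name here (`patient_id`, `enrolled_week`, `converted`, `engaged`) is an illustrative stand-in, not the actual schema.

```python
"""Illustrative funnel rollup, assuming hypothetical field names.

Mirrors the spirit of the weekly-refresh dashboard: count acquired
patients per enrollment week, then derive conversion and engagement
rates from per-patient flags.
"""
import pandas as pd

# Synthetic patient records, similar in spirit to pre-launch test data.
patients = pd.DataFrame({
    "patient_id": [1, 2, 3, 4, 5, 6],
    "enrolled_week": ["2021-W01", "2021-W01", "2021-W01",
                      "2021-W02", "2021-W02", "2021-W02"],
    "converted": [True, True, False, True, False, False],
    "engaged":   [True, False, False, True, False, False],
})

# Acquisition -> conversion -> engagement, rolled up per week.
funnel = patients.groupby("enrolled_week").agg(
    acquired=("patient_id", "count"),
    converted=("converted", "sum"),
    engaged=("engaged", "sum"),
)
funnel["conversion_rate"] = funnel["converted"] / funnel["acquired"]
funnel["engagement_rate"] = funnel["engaged"] / funnel["converted"]
print(funnel)
```

In the actual build, each of these derived columns corresponded to a user story with its own acceptance criteria, which is what let engineers without business context validate the pipeline.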
Core Skills Leveraged
- Bias for Action
I demonstrated a bias for action by recognizing critical gaps and stepping in to fill them before they derailed progress. When I created the initial dashboard mockup, I assumed the client would bring in a product manager to translate vision into execution and that my role would end at securing resources to contract a development team. But as questions kept coming back to me (about data fields, user stories, acceptance criteria), I realized there was no product leadership in place. Rather than escalate the issue or wait for someone to be assigned, I stepped into the role myself. I became the data product manager: writing user stories, building the data dictionary, validating data ingestion through UAT, and transitioning the offshore team to sprints to accelerate delivery. I recognized that without someone driving clarity and decisions, the entire build would stall or be executed poorly and fail to prove its value. I took ownership to ensure the dashboard initiative succeeded, knowing it would serve as a critical proof point for the organization to continue investing in data-driven decision-making.
- Dealing with Ambiguity
This project was a world of firsts: the drug was the first of its kind on the market, the client was implementing Salesforce Health Cloud for the first time, and instead of contracting with an existing vendor, the organization was building and running its own patient support program from scratch. There was no playbook for anything. Rather than wait for clarity that wasn't coming, I created it. The dashboard mockup forced critical conversations about what metrics actually mattered, what data we could realistically access given Privacy and Compliance constraints, and what priority each metric held for immediate business decisions versus long-term strategic insights. When development started, I discovered another layer of ambiguity: all the data was synthetic because the program hadn't launched yet. Without real-world patterns to validate against, the engineers had no way to know if their pipelines were working correctly. I embedded myself in UAT and became their domain expert, explaining what realistic patient data would look like based on my knowledge of the program design. Throughout the project, I operated without clear ownership structures, adapted to shifting requirements, and made decisions based on incomplete information, always keeping the team moving forward while building in flexibility to adapt as the program matured.
- Translating Business Strategy into Technical Requirements
This project started with a deceptively simple request to "think about" what the client should track after launch. My challenge was translating ambiguous business aspirations into concrete, measurable analytics requirements. While everyone discussed the promise of collecting data to "personalize support" for patients, no one had outlined what types of personalization were envisioned, how personalization would be executed, or whether the planned data capture would even facilitate it. The brand team frequently articulated high-level questions ("Are patients staying engaged?" "Which touchpoints drive persistence?"), but no one had determined how those questions would be answered with actual metrics.

I bridged that gap by creating a visual mockup that made the abstract tangible and facilitated discussions to refine which metrics mattered, assigning priority based on each metric's potential to inform actionable decisions. Once the dashboard design was finalized, I wrote detailed user stories with analytical requirements for each visualization, breaking down metric calculations step-by-step for engineers who had no business context. I specified which data columns fed into each calculation and walked business owners through interactive features, explaining the computations behind each metric to ensure alignment with their objectives. I acted as the interpreter between business strategy and technical execution, ensuring what we built would answer the questions that mattered.
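As one illustration of that translation work, a question like "Which touchpoints drive persistence?" only becomes buildable once every term is pinned down. The sketch below shows the kind of step-by-step breakdown the user stories contained; the column names and the 90-days-on-therapy persistence threshold are hypothetical choices for illustration, not the program's actual definitions.

```python
"""Hypothetical breakdown of 'Which touchpoints drive persistence?'
into an explicit, column-level calculation. All names and the 90-day
threshold are illustrative assumptions."""
import pandas as pd

# Synthetic interaction log: one row per patient touchpoint.
interactions = pd.DataFrame({
    "patient_id":      [1, 1, 2, 2, 3, 4],
    "touchpoint":      ["nurse_call", "sms", "sms",
                        "sms", "nurse_call", "email"],
    "days_on_therapy": [10, 95, 5, 40, 120, 15],
})

# Step 1: define "persistent" as reaching >= 90 days on therapy.
persistence = interactions.groupby("patient_id")["days_on_therapy"].max() >= 90

# Step 2: attribute each patient to their first recorded touchpoint.
first_touch = interactions.groupby("patient_id")["touchpoint"].first()

# Step 3: persistence rate per first touchpoint.
rate = persistence.groupby(first_touch).mean()
print(rate)
```

Writing the metric this way forces the exact decisions (which column defines persistence, how attribution works) that engineers cannot make on their own, and gives business owners a computation they can sanity-check.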