Reporting visibility reset
Rebuilt scorecards and reporting surfaces so leaders could trust the data enough to act on it.
Context
The business had been growing for several quarters and had accumulated dashboards in every tool along the way: GA4, HubSpot, a couple of Looker Studio reports, and a spreadsheet the finance team maintained separately. But there was no actual reporting system. Numbers existed in every tool and trust existed in none of them, because definitions were inconsistent and nobody owned the reconciliation. Marketing reported one set of lead numbers, sales reported another, and the weekly leadership meeting spent more time debating which numbers were right than deciding what to do about them. The engagement started after a board review where leadership was asked basic questions about pipeline health and could not produce consistent answers across departments.
Problems
Too many dashboards with no canonical view or single-source-of-truth layer
Conflicting numbers across marketing, sales, and finance because each team defined metrics differently
Low confidence in weekly reporting rhythm because leaders treated the numbers as directional at best
No clear ownership over reporting definitions, update cadence, or data quality
Historical data too inconsistent to establish reliable baselines or measure improvement over time
Approach
Audited every active dashboard and report across GA4, HubSpot, Looker Studio, and internal spreadsheets to map which numbers were actually being used in decisions versus which were just noise
Defined a shared metric dictionary with explicit definitions for lead, qualified lead, opportunity, and closed-won so every team worked from the same language, documented in a single reference sheet accessible to all departments (see the dictionary sketch after this list)
Reduced the reporting layer to a decision-grade scorecard system built in Looker Studio that surfaced acquisition health, follow-up performance, and pipeline movement in one view, pulling from HubSpot and GA4 as canonical sources
Aligned reporting definitions across acquisition, follow-up, and outcomes so the same lead counted the same way everywhere, with validation checks to flag definition drift before it contaminated downstream numbers (see the drift-check sketch after this list)
Created a stable weekly review surface tied to actual operating decisions, with a standing agenda that mapped directly to scorecard sections and assigned clear owners for each metric category
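To make the shape of the metric dictionary concrete, here is a minimal sketch of it as structured data. The stage criteria, owners, and field names are assumptions for illustration; the actual dictionary lived in a shared reference sheet, not in code.

```python
# Hypothetical sketch of a shared metric dictionary. Definitions, sources,
# and owners here are illustrative assumptions, not the client's criteria.
from dataclasses import dataclass

@dataclass(frozen=True)
class MetricDefinition:
    name: str        # canonical metric name used in every report
    source: str      # the one system allowed to answer for this metric
    definition: str  # plain-language rule every team agrees to
    owner: str       # who arbitrates edge cases and definition changes

METRIC_DICTIONARY = {
    m.name: m
    for m in [
        MetricDefinition("lead", "HubSpot",
                         "Contact created by inbound form fill or reply; list imports excluded",
                         "marketing"),
        MetricDefinition("qualified_lead", "HubSpot",
                         "Lead meeting fit criteria and accepted by sales",
                         "sales"),
        MetricDefinition("opportunity", "HubSpot",
                         "Qualified lead with an open deal attached",
                         "sales"),
        MetricDefinition("closed_won", "HubSpot",
                         "Deal moved to closed-won; revenue confirmed by finance",
                         "finance"),
    ]
}

print(METRIC_DICTIONARY["lead"].definition)
```

The point of the structure is that every metric carries its canonical source and a named owner, so disagreements route to a person instead of to a meeting.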
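The validation checks for definition drift can be sketched in the same spirit. This is a hedged illustration: drift_check, the 2% tolerance, and the counts are invented for the example; only the comparison logic is the point.

```python
# Minimal sketch of a cross-source drift check: compare the same metric from
# the canonical source and a cross-check source, and flag the gap before it
# reaches the scorecard. Tolerance and counts are illustrative assumptions.

def drift_check(metric: str, canonical: int, cross_check: int,
                tolerance: float = 0.02) -> bool:
    """Return True if the two counts agree within tolerance."""
    if canonical == 0 and cross_check == 0:
        return True
    relative_gap = abs(canonical - cross_check) / max(canonical, cross_check)
    if relative_gap > tolerance:
        print(f"DRIFT: {metric} differs by {relative_gap:.1%} "
              f"(canonical={canonical}, cross-check={cross_check})")
        return False
    return True

# Weekly lead counts: HubSpot as the canonical source, GA4 conversions as
# the cross-check. A gap this size would be flagged before the scorecard
# refreshed.
drift_check("lead", canonical=412, cross_check=447)
```

A check like this does not decide which number is right; it only guarantees that a disagreement surfaces as an alert instead of as two different slides in the same meeting.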
Outcomes
The reporting surface became clear enough to support real decisions without side conversations about whether the numbers were right. Weekly meetings shifted from debating data accuracy to discussing what to do about the trends the data showed.
A single scorecard system replaced fragmented dashboard review, giving leadership one place to assess business health. The Looker Studio surface became the default reference for all cross-functional performance conversations.
Marketing, sales, and leadership started using the same numbers in the same meetings for the first time. The shared metric dictionary eliminated the recurring translation exercise that had consumed the first fifteen minutes of every weekly review.
