
Research Interview
FEATURED GUEST

Michael Wolbert
Interim Director & Principal Engineer @ werockit
A recent industry report reveals a troubling inconsistency: nearly 30% of platform engineering teams say they don't measure success at all, yet only 24% admit they don't know whether their metrics have improved. If you don't measure, you can't know - so some teams are evidently reporting improvements they never measured. The gap exposes a deeper challenge in how platform teams approach measurement, credibility, and the evidence needed to justify their existence.
Main insights
Self-reporting bias creates a credibility gap - teams report improvements without systematic measurement
Evidence-based decision making through proper experimentation is essential for platform team survival
Adoption metrics matter as much as technical metrics, but require intentional strategy beyond mandates
Product mindset and measurement practices are strongly correlated - teams without one typically lack the other
Michael, a platform engineering ambassador and contributor to platformengineering.org, recently joined a discussion to unpack these findings and share practical approaches to measurement, experimentation, and adoption in platform engineering.
The measurement paradox: Vibes versus evidence
The report's most striking finding centers on measurement practices. When asked which metrics they use to prove success, 40.8% cited DORA metrics, 31% focused on time to market, and 14.1% used SPACE. But 29.6% reported they don't measure at all.
The inconsistency emerges in the next question. When asked whether their metrics had improved since introducing platform engineering, only 24% said "I don't know." That leaves a delta of roughly 5.6 percentage points (29.6% minus 24%): teams reporting improvements without actually measuring them.
Michael explains this gap through what he calls "vibes-based assessment": teams infer success from qualitative feedback in their environment rather than from instrumentation. "Let's say the 6% had meetings that went absolutely smooth. 'Everybody was smiling, everybody had fun, the business was not complaining, everything was on time,'" he notes.
While he doesn't entirely discount qualitative signals - fewer support tickets, happier developers, smoother deployments - Michael emphasizes that when it's time to make informed decisions, evidence-based data doesn't lie.
Building credibility through experimental design
Michael's approach to measurement centers on proper experimental design. In his widely read article on platform engineering ROI, he documented reducing pipeline lead time from 20 minutes to 8 minutes - a 60% improvement. "You have to have a before and an after to create an experiment," Michael explains.
The experimental setup must include baseline observation, statistical analysis of results, and triangulation across frameworks like SPACE and DORA.
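To make the before-and-after idea concrete, here is a minimal sketch of such an analysis in Python. The lead-time samples, the choice of a Mann-Whitney U test, and the 0.05 threshold are all illustrative assumptions, not prescriptions from Michael's article.

```python
# Minimal before/after experiment sketch; all numbers below are
# illustrative, not data from Michael's article.
from statistics import median
from scipy.stats import mannwhitneyu

# Hypothetical pipeline lead times in minutes.
baseline = [21, 19, 23, 20, 18, 22, 20, 24]  # observation phase ("before")
after = [9, 7, 8, 10, 8, 7, 9, 8]            # post-intervention ("after")

# Mann-Whitney U: a non-parametric test, a cautious default for small,
# skewed samples like pipeline timings.
stat, p_value = mannwhitneyu(baseline, after, alternative="greater")

reduction = 1 - median(after) / median(baseline)
print(f"median lead time: {median(baseline)} -> {median(after)} min "
      f"({reduction:.0%} reduction), p = {p_value:.4f}")

# Only report the win if the difference is unlikely to be noise.
if p_value < 0.05:
    print("evidence supports claiming the improvement")
```

A non-parametric test is used here because pipeline timings are rarely normally distributed; any honest before/after comparison with a stated baseline would serve the same purpose.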
When entering a new organization, Michael's first step is observation - collecting quantitative data from CI/CD platforms, ticketing systems, PR merge rates, and developer onboarding metrics. "You need to get into the nitty-gritty details of all the reports that the system currently generates and then cross-reference them with each other to create experiments and see where bottlenecks are and low-hanging fruit to create small improvements," he says.
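As a rough illustration of that cross-referencing, the sketch below joins two weekly exports on a shared key. The file names and column names are hypothetical, since the interview doesn't name specific tools.

```python
# Hypothetical weekly exports; file names and columns are illustrative.
import pandas as pd

ci = pd.read_csv("ci_lead_times.csv")         # columns: week, median_lead_time_min
tickets = pd.read_csv("support_tickets.csv")  # columns: week, ticket_count

# Cross-reference the two reports on a shared key: weeks where pipelines
# are slow AND support load is high are candidates for small experiments.
merged = ci.merge(tickets, on="week")
hotspots = merged[
    (merged["median_lead_time_min"] > merged["median_lead_time_min"].median())
    & (merged["ticket_count"] > merged["ticket_count"].median())
]
print(hotspots.sort_values("median_lead_time_min", ascending=False))
```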
The five dimensions of metric selection
Michael introduces a framework for prioritizing measurement across five dimensions: velocity, security, quality, people, and cost. The selection depends entirely on business strategy. "If a go-to-market strategy is we need to have our features in production at least 400 times a day with that speed, then we select velocity and quality probably," Michael explains.
This creates trade-offs; teams must prioritize standardized metrics like DORA before exploring more experimental signals such as IDE telemetry. For teams new to measurement, he recommends starting with spreadsheets to learn instrumentation before investing in analytics platforms.
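A minimal sketch of that starting point, assuming one possible mapping of the five dimensions to starter metrics (the metric names and values are invented for illustration, not a canonical list from the interview):

```python
# One possible mapping of the five dimensions to starter metrics;
# the metric names and numbers below are illustrative.
import csv
from datetime import date

DIMENSIONS = {
    "velocity": ["deployment_frequency", "lead_time_min"],
    "security": ["open_critical_vulns"],
    "quality":  ["change_failure_rate"],
    "people":   ["onboarding_days"],
    "cost":     ["infra_spend_usd"],
}

# A ship-fast go-to-market strategy would select velocity and quality.
selected = [m for dim in ("velocity", "quality") for m in DIMENSIONS[dim]]

# This week's readings (placeholder values).
week_values = {
    "deployment_frequency": 42,
    "lead_time_min": 8,
    "change_failure_rate": 0.04,
}

# Append one row per week to a plain CSV before buying an analytics platform.
with open("platform_metrics.csv", "a", newline="") as f:
    csv.writer(f).writerow(
        [date.today().isoformat()] + [week_values[m] for m in selected]
    )
```

The point is the habit - one row per week in a file anyone can open - not the tooling.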
The product mindset correlation
An interesting pattern emerges when comparing measurement practices with product mindset adoption. The report shows 25.4% of teams lack a product mindset - almost exactly matching the 24% who don't know their metrics.
The product mindset naturally emphasizes user research, adoption tracking, and iterative improvement - all of which map directly onto measurement practices. Without it, teams default to "infrastructure-as-works" thinking and lose both adoption and credibility.
The adoption challenge: Beyond mandates
The report shows adoption is still often driven by mandates (36.6%) or is erratic (16.9%). Michael describes the S-curve adoption model: slow MVP uptake, rapid ramp, then saturation; teams must measure users, voluntary usage, features delivered, and services migrated.
"If there's deviation from the S-curve, intervene - workshops, docs, onboarding. If trust doesn't follow, consider sunsetting the platform," he warns.
The credibility imperative
Michael returns to credibility as the central theme. "These platform initiatives are more about credibility, and this credibility is based on the data that is the evidence to exist as a platform team," he argues.
His typical project cycle: two to four weeks of observation, then analytics, conclusions, MVP experiments, and a decision to pivot or productize. Regular reporting and transparent evidence build leadership trust in both regulated and fast-moving contexts.
Key takeaways
Close the measurement gap: If you're reporting improvements without systematic measurement, you're undermining your credibility. Implement proper experimental design with clear before-and-after comparisons, even if you start with simple tools like spreadsheets.
Prioritize metrics strategically: Don't measure everything - select metrics across the five dimensions (velocity, security, quality, people, cost) based on your business strategy and accept the trade-offs that come with your choices.
Treat adoption as a product problem: Track your adoption S-curve and intervene when it stalls. Voluntary usage matters more than mandates, and sometimes the right decision is to kill a platform that can't gain trust.
Build credibility through evidence: Platform teams exist based on the evidence they provide. Establish regular reporting cycles, triangulate your research across multiple frameworks, and let data guide your decisions rather than relying solely on qualitative vibes.
Check out Michael's ROI article here: https://platformengineering.org/blog/platform-roi-showcase-how-2m-emerged-from-one-platform-shift