ATD, association for talent development

TD Magazine Article

A Laddered Approach to Measuring Change

Show impact by tracking the four levels of the change life cycle.

September 1, 2025


Change management teams are under increasing pressure to demonstrate that their work drives real results, not just activity. But common metrics don't reveal whether employees are ready, whether new behaviors are taking hold, or whether those behaviors are making a meaningful impact.

As transformation becomes ubiquitous, executives want to know that change teams are doing more than checking boxes. The C-suite wants proof that change leaders are driving results. Yet, too often, change teams report surface-level activity metrics: How many people attended a training event, how many emails the team sent, and how many user sessions staff logged.

When metrics stay disconnected from outcomes, executives begin to question the value of change efforts. And when transformation fatigue sets in or budgets tighten, change management becomes an option for the chopping block—not because it doesn't matter, but because it hasn't made its impact visible. As a change leader put it during a recent roundtable discussion, "Change is always the scapegoat when things go wrong, but we're rarely set up to show we made a difference."

What's missing is a way to connect day-to-day change activities to business outcomes. A laddered approach to measurement bridges that gap, helping change teams track readiness, adoption, and impact in a structured, actionable way. By clarifying what's happening at each stage of change, the framework provides a stronger foundation for strategy, coaching, and course correction.

The measurement framework

Imagine rolling out a new performance management system. Managers complete the training program, log in to the system, and usage is high. But six months later, nothing has changed: Attrition is still rising, feedback quality is flat, and employees feel no clearer about their goals.

That scenario isn't uncommon. A common fallacy is that high adoption guarantees success. The truth is that adoption is a necessary metric for transformation—but, alone, it's not sufficient. True success means behaviors are changing in ways that improve performance, and that takes a different kind of measurement.

To link activities and outcomes, organizations need a structured measurement model that evolves over time. A laddered framework provides that structure by tracking four tiers of change:

  • Engagement. Are people aware and showing positive sentiment?

  • Learning. Have they demonstrated knowledge of what's changing?

  • Behavior. Are they adopting the new way of working?

  • Business outcomes. How is the change delivering impact?
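
As a hypothetical sketch of the framework, the four tiers lend themselves to a simple ordered check: Because each rung depends on the one before it, a tracker can walk the ladder in sequence and flag the first weak rung. The tier names come from the article; the scores and the 75 percent threshold are invented for illustration.

```python
# Illustrative tracker for the four rungs of the measurement ladder.
# Tier names follow the framework; scores and threshold are hypothetical.
LADDER = ["engagement", "learning", "behavior", "business_outcomes"]

def ladder_status(scores, threshold=0.75):
    """Walk the rungs in order and report the first one below threshold.
    Later rungs rarely succeed when an earlier rung is weak."""
    for tier in LADDER:
        if scores.get(tier, 0.0) < threshold:
            return f"intervene at: {tier}"
    return "all rungs on track"

# Example: strong engagement and learning, but weak behavior adoption.
scores = {"engagement": 0.82, "learning": 0.78,
          "behavior": 0.61, "business_outcomes": 0.55}
print(ladder_status(scores))  # -> intervene at: behavior
```

The sequential check mirrors the article's logic: A weak lower rung (here, behavior) is where to intervene before interrogating business outcomes.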

Preconditions for effective measurement

Before measurement can add value, put the right foundations in place. Without them, metrics risk becoming noise—or worse, a misleading story that erodes trust. Companies must meet five conditions to enable meaningful, credible measurement.

Defined business outcomes. Measurement begins with purpose. What is the organization trying to achieve through the change? Lower attrition? Better compliance? Faster onboarding? Such goals become the anchors against which the change team evaluates success. If the team doesn't define the outcome, the metrics have nothing to compare against.

Access to reliable data. Change teams need visibility into how employees engage with communications, complete training, adopt new systems, and perform over time. That may include tools such as learning management systems, customer relationship management systems, intranet analytics, Microsoft Graph data, and pulse survey platforms. Where access doesn't yet exist, change teams must work across functions to prioritize it or start with simple manual tracking mechanisms.

Leadership sponsorship and investment. True sponsorship goes beyond approving the project plan. Leaders must commit to reinforcing change messages, reviewing metric dashboards, and acting on results. Without their engagement from launch to go-live and into hypercare (post-implementation support such as office hours or just-in-time training for people who feel lost in the change), data becomes a report instead of a catalyst.

Robust change infrastructure. You can't measure what doesn't exist. Measurement assumes a change management program is already delivering targeted communications, training, coaching, and reinforcement. Those activities create the basis for behavioral change—and for tracking readiness, adoption, and outcomes.

A commitment to laddered measurement. Change, business, and functional leaders must agree on what success looks like, which leading and lagging indicators (both qualitative and quantitative) to track, and when to collect and review data (before, during, and after implementation).

What to track, when, and why

A laddered approach to measurement works because it tracks change over time, from initial awareness to final results. It doesn't rely on a single point-in-time metric or a one-size-fits-all dashboard. Instead, the approach builds a chain of evidence that helps teams understand not just whether a company has implemented a change, but also how well it absorbed the change and what impact the change had.

Each rung of the ladder builds on the previous one: Engagement without learning doesn't move behavior. Behavior without business impact can feel performative. Measured together, the rungs build a compelling narrative about how change is progressing and whether it's delivering value.

The approach doesn't claim causation. Change is complex, involving numerous decisions and levers. Many variables shape results. But laddered metrics build executives' and change leaders' confidence by showing strong correlation. They give leaders insight into what's working and where to intervene.

By tracking the layers sequentially, change teams move from reporting activity to realizing performance insights. And by aligning metrics to each project phase, change teams make it easier to intervene early, adjust in real time, and forecast longer-term success. Each rung of the ladder becomes relevant at a different stage in the change timeline.

Engagement opens the door. It's the first indication that employees are aware change is coming.

  • Measures: awareness, sentiment, attention

  • Examples: email open rates; survey response rates regarding awareness about the change, general feelings about it, or feedback; intranet traffic; town hall attendance

  • Timing: prelaunch through early rollout

  • Why it matters: Low engagement is an early warning sign. If employees aren't paying attention, they won't retain or act.

Learning solidifies understanding.

  • Measures: comprehension and readiness

  • Examples: LMS completions, knowledge checks, self-assessments

  • Timing: during enablement and up to the go-live date

  • Why it matters: Without learning, there's no behavior change. Low scores may point to weak training design.

Behavior signals readiness and execution. It confirms whether new ways of working are taking hold.

  • Measures: observable adoption of new processes or tools

  • Examples: system logins, process adherence, quality audits, manager feedback

  • Timing: post-go-live through stabilization (30–90 days or longer)

  • Why it matters: This is the core of change management. Behavior shift is the bridge between enablement and performance.

Business outcomes reveal whether the change is worth it.

  • Measures: results linked to the change

  • Examples: attrition, cycle time, productivity, customer satisfaction, error reduction, Net Promoter Scores

  • Timing: three to 12 months after the go-live date, depending on impact window

  • Why it matters: This is the why behind the change. Did it deliver?

Build the bridge to outcomes

The main goal of change management is to influence business performance, which is also the hardest part to measure—and where many change management efforts fall short. That's because outcomes are lagging indicators. They take time to show up, and many factors can influence them.

Even so, change teams shouldn't avoid outcome measurement. The key is to treat it as a credible correlation, not a clean causation. Laddered metrics help build that bridge by showing how engagement, learning, and behavior stack up ahead of the results. If behavior adoption is high and sustained and key performance indicators improve, the relationship is clear. If outcomes lag but behavior data is strong, teams can explore other root causes with credibility.

To connect to business outcomes, start with three elements.

  • A clear baseline: What were the KPIs before the change?

  • A defined target: What does success look like? (For example, a 10 percent increase in productivity or a 5 percent drop in error rates)

  • An appropriate window: Depending on the change, it may be three months for a process shift or 12 months for a mindset or culture change. To choose an appropriate window, determine whether it's a change individuals engage in daily. If so, the window should probably be three months. If the change is more periodic, provide more time—up to a year—to ensure change stickiness can contribute to business outcomes.
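
To illustrate how the baseline, target, and window fit together, here is a minimal sketch of a progress calculation. The helper name is invented; the figures mirror the attrition example later in the article (a 15 percent baseline, a 10 percent target, an observed 11 percent after the window).

```python
# Hypothetical helper: given a KPI baseline, a target, and the value
# observed at the end of the measurement window, report what fraction
# of the intended improvement was actually achieved.
def progress_to_target(baseline, target, observed):
    """Fraction of the intended improvement achieved (1.0 = target met)."""
    intended = baseline - target
    achieved = baseline - observed
    return achieved / intended

# Attrition fell from 15 percent to 11 percent against a 10 percent target:
print(round(progress_to_target(15.0, 10.0, 11.0), 2))  # -> 0.8
```

A value of 0.8 reads as "80 percent of the way to the target," which is a more useful framing for a leadership review than a raw before-and-after pair.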

Choose outcomes that matter to the business. For instance, an HR KPI could be attrition; procurement KPIs could be spend requests or customer satisfaction; and IT KPIs could be support tickets. You don't need dozens of KPIs, just one or two that link directly to the change's intent.

You also shouldn't aim for perfection when comparing groups or timeframes. Instead, use available data to draw smart comparisons. For example:

  • Teams with high behavior adoption versus those with low adoption

  • Regions where sponsorship was strong versus weak

  • Pre- and postchange metrics in a controlled group
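
A comparison of that kind needs nothing more than the standard library. The sketch below compares a KPI across high- and low-adoption teams; the team names, adoption cutoff, and error rates are all invented for illustration.

```python
# Sketch of the pattern analysis above: compare a KPI (error rate)
# across teams with high versus low behavior adoption.
# All team names and figures are hypothetical.
from statistics import mean

teams = [
    {"team": "A", "adoption": 0.92, "error_rate": 2.1},
    {"team": "B", "adoption": 0.88, "error_rate": 2.4},
    {"team": "C", "adoption": 0.45, "error_rate": 4.0},
    {"team": "D", "adoption": 0.51, "error_rate": 3.7},
]

# Split on an illustrative 75 percent adoption cutoff.
high = [t["error_rate"] for t in teams if t["adoption"] >= 0.75]
low = [t["error_rate"] for t in teams if t["adoption"] < 0.75]

print(f"high-adoption mean error rate: {mean(high):.2f}")  # -> 2.25
print(f"low-adoption mean error rate: {mean(low):.2f}")    # -> 3.85
```

The output doesn't prove causation, but a consistent gap between the cohorts is exactly the kind of credible correlation the laddered approach relies on.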

Such pattern analysis helps change leaders move from explaining what happened to shaping what happens next.

When measurement drives action

Measurement only matters if it informs action. Teams often collect and report data but never use it. To make laddered metrics useful, change teams must embed measurement into the rhythm of project delivery and align it with decision-making moments.

Start by anchoring metrics to known milestones.

  • Thirty days post-go-live: Use pulse behavior metrics and feedback (such as via town halls, surveys, focus groups, field observations, or digital suggestion boxes) to identify early adoption gaps.

  • Sixty to 90 days: Check for sustained use and link behavior data (such as support tickets; system usage; or intranet clicks, opens, and visits to the page of frequently asked questions) to team-level KPIs.

  • Quarterly or semiannually: Evaluate outcomes (the business KPIs) in leadership reviews or quarterly business reviews.

Those checkpoints create space to ask the right questions: What's working? What's not sticking? Where do we need to reinforce?

Expectations should shift depending on the type of change. For technical rollouts, adoption metrics may stabilize quickly, and change teams can measure impact within 30–60 days. For process shifts, behavior data may take a quarter of the year to normalize. For cultural or mindset changes, change leaders may need to track readiness and sentiment over a full performance cycle.

Measurement strategies must also reflect change complexity. Not all behavior is equally visible or easy to quantify, such as work interactions among teams, culture programs and tactics, customer behavior in stores, and staff use of workaround tools (such as using an old process after implementation of a new system). In those cases, feedback from frontline managers, qualitative pulse responses, and small-scale pilots can provide early signals.

Finally, measurement is a tool for influence. The most effective change teams go beyond reporting; they guide leaders in interpreting metrics, identifying roadblocks, and taking action. The latter may entail revisiting training delivery, strengthening sponsor visibility, or shifting communications cadence.

Case study: Performance management shift

Consider, for instance, a company that launched a new performance management framework to improve retention. Prior to the rollout, the company established that baseline turnover was 15 percent and sought to decrease it to 10 percent within nine to 12 months post-implementation.

Engagement was strong regarding the change (70 percent positive sentiment). Training completions reached 87 percent, but only 65 percent of learners passed a knowledge check. The change team responded with a two-minute explanatory video and follow-up manager sessions. Within two months, behavior tracking showed 92 percent of managers submitting feedback in the new system. Nine months later, turnover dropped to 11 percent.

Scale measurement maturity

Building maturity in laddered measurement takes time. Use each project to spot patterns, such as "When learning scores fall below 75 percent, behavior rarely passes 80 percent" or "Behavior adoption is slower when sponsor messaging drops off."

Over time, such insights help teams forecast risk, justify resource requests, and speed up future delivery.

Improve value

High-performing organizations use laddered metrics to build strategy, not just prove delivery. They're asking "Are staff doing what we intended, and is it working?" rather than simply "Did we train them?"

The companies seeing the most value treat change metrics as integral to business metrics. They present updates alongside operational dashboards and use data to inform resource allocation, stakeholder engagement, and process design.

In one technology company, the laddered model helped change teams identify that program leaders and data analysts understood the new workflows but weren't following them consistently. Digging deeper, change leaders found that frontline systems still defaulted to old processes. By revising both the system defaults and the training, they brought the last remaining holdouts into the new process and achieved measurable productivity gains.

As another example, behavior data at a healthcare organization showed that certain units adopted new scheduling practices faster than others. By analyzing those outliers via field observations, interviews, and focus groups, the change team surfaced best practices that led to effective coaching strategies, which they scaled across regions.

As those organizations and others have found, when change teams embed laddered metrics into their ways of working, change stops being a soft science. Instead, it becomes a source of strategic advantage. Such change teams create a line of sight between communication and capability, action, and impact. They also give company leaders confidence that change efforts are not just busywork—they're moving the business forward.


Five Change Measurement Pitfalls to Avoid

  • Having no baseline for key performance indicators. Without a starting point, change teams can't show how much the program moved the needle when they first collect data post-implementation.

  • Calculating sentiment scores without behavior follow-up. Sentiment is only the first step; it signals how users may behave in the system, but it doesn't confirm that they do.

  • Stopping tracking at go-live. Behavior is just starting to take root at go-live. Behavior is a repeatable, recurring action that change teams must measure for months after the rollout.

  • Measuring only reach, not comprehension. Change requires learning, so teams cannot rely solely on metrics such as open rates or training completions. Change teams must document that learning occurred.

  • Using only metrics for reporting. The reason to measure is to inform decision making.


September 2025 - TD Magazine


Copyright © 2025 ATD
