
TD Magazine Article

All That Glitters Is Not Gold

Differentiate between emerging technologies that are just hype and those that can lead to business impact.

November 1, 2025


When's the last time you had a truly futuristic learning experience? The kind that made you lean in, blink twice, and think, "Wait—did that just happen?"

Maybe it was an artificial-intelligence chatbot that nailed your tone so well you wondered whether it had been reading your texts. Or perhaps it was a fully immersive headset demonstration during which you leaned away from a virtual cliff because your brain said, "Nope, that's too real."

Here's the truth: The future of learning isn't just around the bend. It's already at your desk, in your onboarding programs, and probably on your learners' phones.

Every year, forces far outside our L&D bubble reshape workplace learning—for example, shifting workforce expectations; hybrid and remote work; and the way consumer technology has trained people to expect instant, visual, personalized experiences. We've all felt that subtle shift where yesterday's "wow" feature suddenly becomes today's baseline expectation.

Emerging technologies—such as AI, immersive tools, and wearables—are here, delivering scalable, personalized learning that meets both business goals and learner needs. The challenge is that between hype videos, vendor promises, and sudden leadership drive-bys ("Why aren't we doing this yet?"), it's easy to feel as if you're constantly recalculating your route with a mix of excitement and trepidation. The answer isn't to try everything. Instead, it's to know what's worth your L&D team's time, how to test the technology, and when to go all in for results that matter.

Technology now moves faster than old-school L&D cycles. What once took years to evaluate can happen in months or weeks. That pace demands sharper decision-making tools and the courage to experiment.

The organizations thriving today aren't the ones stockpiling every tool but rather those that are aligning new capabilities directly to measurable business results. Tools don't drive transformation—people do, and the smartest individuals know exactly where they're headed.

Driving vs. drifting

Adoption of emerging technology doesn't just happen—real pressures and opportunities drive it forward. Picture this: You are midproject, everything's on track, and suddenly a shiny new tool drops. Leadership says, "Let's try it." You test it, it tanks, and now you're untangling the mess. That's drifting. Compare it with driving in the examples that follow.

Skills gaps and digital transformation pressures. Driving entails spotting where capabilities need to grow and using technology to close the gap before it hurts performance. Drifting involves waiting until the gap starts costing you results and then scrambling.

Hybrid work and remote onboarding challenges. Driving entails designing rich, interactive experiences that build culture and capability from anywhere. Drifting involves sending new hires a PDF and hoping they figure it out.

Learner expectations shaped by consumer technology. Driving entails matching the intuitive, personalized feel learners get from their favorite apps. Drifting involves offering clunky, outdated systems that frustrate and disengage.

The need for scalable, data-informed, engaging solutions. Driving entails using analytics to refine, improve, and prove value. Drifting involves counting completions and smile sheets and calling it a win.

Most L&D practitioners have had experiences driving as well as drifting—sometimes in the same quarter. The difference between the two is whether you're steering with purpose or simply letting the current push you around.

For instance, consider a global logistics company that saw a 30 percent surge in seasonal hiring, most of it for fully remote roles. Instead of letting chaos take the wheel, the L&D team rolled out a 360-degree virtual onboarding tour paired with mobile microlearning. The result? Time to competence dropped 25 percent compared to the previous onboarding process. That's driving.

And after you've felt that kind of result, it's hard to go back to PDFs and crossing fingers.

The Gartner Hype Cycle

Once you know what's pushing a technology tool forward, the next big question is when to hit the gas and when to tap the brakes—because timing matters just as much as direction.

Every new technology comes with a hype wave. At first, there are glossy videos, breathless headlines, and that one person in a meeting who swears the tool is going to change everything. Gartner's Hype Cycle model is a great way to see where a technology sits on its journey from shiny idea to stable, valuable tool.

Since 1995, the Gartner Hype Cycle has been the map for navigating new technology from "wow" to "works." Created by analyst Jackie Fenn, it draws on innovation research and theories—such as Everett Rogers's Diffusion of Innovations and Amara's Law ("We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run")—to track how emerging technologies mature over time. The resource is a simple curve that maps a technology's maturity, marking the route of everything from the early days of cloud computing to the blockchain boom to today's generative AI, helping leaders decide when to explore, pilot, or fully adopt a technology.

Using Gartner's Hype Cycle is like checking the weather before a road trip—you'll still make your moves, but you won't get caught in a storm you could have avoided. It plays out in five phases:

  1. Innovation Trigger. A breakthrough, big announcement, or prototype gets everyone buzzing.

  2. Peak of Inflated Expectations. This is when over-the-top promises emerge, such as "We'll all be working in the metaverse by next Tuesday."

  3. Trough of Disillusionment. The tool doesn't deliver, adoption slows, and critics pile on.

  4. Slope of Enlightenment. Teams figure out what the tool is actually good for and start using it effectively.

  5. Plateau of Productivity. The technology becomes a normal, reliable part of work.

Remember the first corporate chatbots? Most launched at the Peak phase, with generic, irrelevant answers and low adoption. Fast-forward a few years, and AI-powered bots, trained on real company data, reached the Slope of Enlightenment. Now they're delivering helpful, contextual answers and saving people time.

The Hype Cycle isn't just a theory—it's a risk management tool. It helps you scope what your L&D team could do, budget for exploration, and avoid scaling too soon. Before investing in a technology, ask yourself: Where is it on the Hype Cycle, and are we ready for it at this stage? If it's still in phases 1–3, brace for a few wrong turns before you find your lane.

Evaluate emerging technologies for your company

Before you put the pedal down on any new technology, you need to know whether it's roadworthy for your organization, even if it's in the latter phases of the Hype Cycle. Otherwise, you're just test-driving every flashy model on the lot and wondering why you still have to walk home. A solid evaluation framework will save you from driving straight into a ditch.

Think of it like the television show The Voice for technology—you're not spinning your chair just because a demo hit the high note. You're listening for the whole package: voice, stage presence, and staying power. BUILDS and Waterfall are emerging-technology evaluation frameworks that can serve as your coaches, helping you discover whether a technology tool deserves a place on your team.

Without a framework—relying on trial and error instead—you're letting the flashiest demo in the room drive the bus. The BUILDS and Waterfall frameworks enable you to remain in the driver's seat, steering toward value, readiness, and fit instead of drifting into another dead-end project.

BUILDS: The "Is it worth it?" test

Chad Udell and Gary Woodill introduced BUILDS (business value, user experience, impact, learning models, dependencies, and signals) in their book Shock of the New. Rather than borrowing a model from another industry and awkwardly repurposing it for L&D, Udell and Woodill designed their framework for the messy, fast-moving world of workplace learning.

BUILDS walks you through six lenses to figure out whether that impressive demo will lead to solving a real problem in your organization or just eating your L&D budget.

Business value. Will this technology move a key performance indicator that leaders care about, such as revenue, safety, time to competence, or retention? If the only "value" is that it looks cool, it's not ready for your budget.

User experience. Would your company's employees want to use it, or would they only touch it under duress? Think ease of use, accessibility, and whether it naturally fits into staff's workday.

Impact. Whether political, ethical, societal, or organizational, what ripple effects could adopting the technology create? Could it raise ethical concerns? Improve inclusion? Change workflows in ways you didn't expect?

Learning models. What kind of learning is this technology good at doing? Conceptual? Procedural? Hands-on? Empathy building? Match the method to the goal or you'll have a great solution to the wrong problem.

Dependencies. What must be in place for this to work? Think hardware, bandwidth, integrations, and executive support, for example. This point is where a great idea often meets "Oh, that's why we can't launch yet."

Signals. What's the market saying? Is the vendor actively improving? Are integrations solid? Are peers adopting it successfully?

Pro tip: Score BUILDS for each audience—for instance, frontline versus office workers and salaried versus hourly employees. Just because one department loves it doesn't mean it'll play well company-wide.

Waterfall analysis: The no-surprises route

Born in 1970s software development and introduced by Winston Royce, the Waterfall method is a slow roll, ironically. You move step by step, locking each stage before starting the next one. While the framework involves the steps of requirements, analysis, design, implementation, testing, and maintenance, I've made a few tweaks to adapt the steps to L&D:

  1. Problem. Start with the why before the wow. Deeply understand how the technology solves the problem.

  2. Concept. Map what it will do, whom it helps, and how it fits into your learning ecosystem. Pay attention to the clunky connections in your ecosystem—for example, a complicated process to get completions into the learning management system can derail your solution.

  3. Feasibility. Conduct an IT, legal, and security gut check. If you sense it may not work, follow that instinct.

  4. Pilot. Target something small, controlled, and measurable. Consider it a minimum viable product that will provide the necessary data to move ahead.

  5. Measurement. Look at behavior change, error reduction, or time saved. If you cannot measure change, nothing changed.

  6. Scale or sunset. If it works, go big. If not, exit gracefully with lessons learned. Either result is progress.

Getting future ready

While knowing the BUILDS and Waterfall models is one thing, getting your organization to use them is another. To move from theory to traction:

Start small but start now. Run a test drive. Pick one team, one workflow, or one learning program. View it as a spin around the block instead of an all-night drive.

Measure what matters. Use BUILDS or Waterfall to define success metrics before launch. If you can't track it, you can't prove it.

Get champions onboard early. Identify leaders and learners who will advocate for the technology. They will be your best marketing department because nothing sells like someone's genuine "this made my life easier" story.

Plan for scale. If the pilot works, be ready to expand fast. Have your infrastructure, training, and support in place.

Stay agile. Technology changes and needs shift. Keep your evaluation framework handy so you can pivot without losing momentum.

Rather than chasing every shiny object, build the muscle that enables you to spot the right ones, test them smartly, and scale what works.

Remember the Choose Your Own Adventure book series? Instead of reading every page, you made a decision, flipped to the corresponding page number, and lived with the outcome. That's what being future ready feels like. Pick a path, take it, and be prepared for the next fork in the road.

Instead of guessing the entire journey, the point is to make your next turn with purpose, armed with enough data and clarity to keep you driving—not drifting—when the next shiny object rolls by.

The future of learning technologies isn't a distant destination—it's the road we're already on. And before we know it, it will be in the rear-view mirror. Like any good road trip, there will be detours, scenic overlooks, and the occasional "are we there yet?" moment.

The organizations that will thrive aren't the ones with the biggest technology stacks but rather the ones with the clearest sense of direction. They know why they're adopting a tool, how they will measure its success, and when it's time to trade it for something better.

So, buckle up. Keep your evaluation tools in the glove box, your champions in the passenger seat, and your eyes on the road ahead. The journey has already started, so make sure you're the one driving.


Future-Proofing Drill

Run this exercise quarterly to keep your L&D technology strategy sharp.

Spot a spark. Pick one new tool, feature, or update worth watching. Don't overthink it; simply select one that made you go "Hmm."

Perform a BUILDS flash check. Run it through the BUILDS framework's six questions:

  • Will it move a business key performance indicator?

  • Will people use it?

  • What ripple effects could it create?

  • Will it teach what matters?

  • Will it work here?

  • Will it last in the market?

Take a champion pulse. Ask two trusted champions: Would you use this? Avoid the slides and pitch—just get their gut reaction.

Conduct a learner vibe check. Glance at your latest learner data. Are they craving speed, simplicity, or depth? Does this technology fit?

Cut the dead weight. Drop one outdated tool, process, or report. Make room before you bring in something new.


Signs You're Stuck in the Trough

Conduct a Hype Cycle reality check when the glitter fades from a technology you've adopted.

  • Vendor emails start to feel like spam. You used to open the messages immediately, but now they sit in "later" status indefinitely.

  • The "why did we buy this?" moment. No one can clearly remember the original business case.

  • Pilot-group witness-protection program. Your test users have gone mysteriously silent.

  • Dashboard dust. You haven't opened the analytics dashboard in a month, and you had to hunt for the login.

  • The secret workaround. People quietly stop using the tool and invent their own solutions instead.


Copyright © 2025 ATD
