ATD Blog
Rise of the Human‑Machine Performance Analyst
Part data analyst, part strategist, part L&D expert, the HMPA is the translator between business goals, data, and AI systems.
Thu Nov 06 2025
Do you want to move from producing content to orchestrating the learning process and showing results? If so, you’re not alone.
Instead of hand-building every interaction and every bit of text and managing endless review emails, learning design professionals can enter objectives and key information into generative AI and see a draft learning asset appear in minutes. That frees you to focus on what to create, what to measure, and how to prove it makes a difference.
Now, you’re a human-machine performance analyst (HMPA). Part data analyst, part strategist, part L&D expert, the HMPA is the translator between business goals, data, and AI systems. The role exists to turn a torrent of signals into targeted interventions that move the metrics leaders actually care about.
In Applying AI in Learning and Development: From Platforms to Performance, I examine how this hybrid role becomes the linchpin of AI‑powered learning, not by replacing human judgment but by putting it where it matters most.
Why does this matter right now?
Employers expect significant skill churn, and many organizations plan to hire specifically for AI skills. AI literacy is moving from novelty to baseline. L&D cannot meet that moment with static courses or vanity metrics. It needs a role that links learning to performance and governs AI responsibly.
To become an HMPA, you need to become fluent in data and analytics, learn how AI works well enough to be a smart consumer, strengthen your business acumen, elevate your communication and narrative skills, and stay relentlessly curious. These are not nice‑to‑haves. They are the skills that enable an HMPA to judge when to keep the human in the loop and when to trust the machine.
| Proficiency Level | Upskill on Data and Analytics | Become Proficient in AI Literacy | Develop Business Acumen | Hone Communication and Storytelling |
| --- | --- | --- | --- | --- |
| Foundation: Basic literacy and safe use | Reads simple charts and dashboards. Understands common L&D vs. business metrics. Knows why human interpretation is required even when AI generates analytics. | Explains key terms like model, prompt, token, and NLP. Chooses the right tool for a task and follows basic safety rules. | Names the business problem, stakeholders, and success criteria. Translates learning outputs into plain-language outcomes. | Turns findings into a clear narrative with a problem, evidence, and next step. Tailors the message to the audience. |
| Applied: Hands-on use tied to tasks | Blends two to three data sources to answer a business question. Creates a basic KPI baseline and tracks change over time. | Uses prompting patterns to create draft content, insights, or summaries. Identifies when outputs need human review. | Maps initiatives to KPIs executives care about. Writes a one-page business case for a pilot. | Builds simple visuals and uses verbal framing to handle objections. Facilitates decision meetings. |
| HMPA Core: Role-level capability across projects | Builds decision-ready views that connect learning signals to performance outcomes. Runs small A/B tests or cohort comparisons. | Orchestrates AI services in a workflow. Places human checks at the riskiest steps. Documents prompts and review criteria. | Prioritizes the learning backlog by business value. Tells a solid ROI story with assumptions stated and risks noted. | Publishes monthly learning-to-performance updates that leaders trust. Coaches peers on evidence-based storytelling. |
| Strategic: Organization-level influence and outcomes | Defines a measurement blueprint and data standards with HR, IT, and Ops. Socializes a regular evidence cadence for leaders. | Partners on governance and risk controls. Chooses platforms that align with privacy, security, and audit needs. | Co-creates portfolio-level goals with leaders and allocates capacity to the highest-value use cases. | Shapes enterprise narratives about capability building and AI adoption. Elevates wins without hype. |
As an HMPA, you must also stay curious about AI advancements. Schedule regular time to explore new tools and emerging features, and be open to testing potential use cases in safe sandboxes.
This curiosity must be paired with governance and credibility. An HMPA must be a steward of responsible AI and use established frameworks to structure oversight and risk controls.
The title of human-machine performance analyst may sound futuristic, but the work itself is familiar. It’s about asking good questions, gathering the right data, running careful experiments, and clearly showing the value. The main change is scale. With AI as a foundation, one analyst can impact thousands. The work remains human—the reach is just much greater.