Less Than a Penny for Learning

Stanley E. Malcolm, Ph.D.

January 3, 2002

I've been giving some thought to training as it is commonly practiced in corporations. It's not a pretty picture.

I'll let you be the judge of how much any of this applies to your organization - but even if this sounds familiar, don't despair: you're not alone, and there is much you can easily do to address the issues. All you need is the vision, and the will.

Three Eighty-Twenties:

One might think that I'm obsessed with eighty-twenty rules. I'm not. They just seem to pop up with disconcerting frequency. The three eighty-twenty rules that follow are "nested." That is, for every dollar spent on training, only eight-tenths of a penny actually affects business performance through increased staff competency. (The calculation is 20% of 20% of 20% of a dollar.) And that's not counting student salaries, travel, or opportunity costs - all additional big-ticket items that, in most cases, dwarf the training department budget.
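The nested arithmetic is easy to check. A minimal sketch, using the three 20% figures exactly as estimated above (they are the article's guesses, not measured data):

```python
# Nested eighty-twenty rules: at each level, only 20% of the prior
# level's value actually contributes to business performance.
# All three 20% figures are the article's own estimates.
design_share = 0.20   # share of budget reaching core design and delivery
impact_share = 0.20   # share of delivered training that affects performance
formal_share = 0.20   # share of critical job learning that happens off the job

effective_per_dollar = design_share * impact_share * formal_share
print(f"${effective_per_dollar:.3f} per training dollar")  # prints "$0.008 per training dollar"
```

Multiplying the three shares gives $0.008 - the "eight-tenths of a penny" in the title.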

The means to address the three eighty-twenty rules are interrelated. I'll describe the means together, after laying out the rules as I see them.

1. Overhead: 80% of a training department's budget is overhead to the core instructional design process. Just as in the military where only 20% or so of soldiers actually fight - while the rest provide supporting services like food, laundry, transportation, medical care, etc. - so too in corporate training where the time and money going directly to design tasks is a small proportion of total resources. Who knows what the proportion actually is - I'm only using 20% as a guess. But it would be instructive to add up the costs of learning management systems, administrative and management staff, time spent by designers doing non-design tasks, delivering "stand-up" instruction where other means are available, rent, furniture, computers, equipment, etc.

2. Impact: 80% of the training that training departments deliver makes no significant difference to business performance. Putting it more bluntly, much of what we offer as training is useless, feel-good crap. As anecdotal evidence, I cite a corporation I know well that discarded its entire general skills curriculum - and had not a single complaint. The reasons why so much is bad are many. Among them: ineffective instructional designs, unimportant subjects (in terms of business performance), off-the-mark content, training when the performance gap has another root cause, no measurement or consequences, no commitment by participants or supervisors to on-the-job application, training as a reward or "vacation," bad timing, and hodge-podge aggregations of students rather than intact, natural workgroups.

3. The "Teachable Moment": 80% of critical job learning happens, always has happened, and always will happen, on the job. This is the story I've been telling for years: Think about the critical skills you apply every day on the job. Now think about where you learned those skills. Most people admit that no more than 20% were learned in a classroom or other traditional setting. (I include technology-based initiatives delivered remotely as "traditional", especially when structured as courses.) How can a training function hope to have impact on business performance if it affects only a small fraction of skill development, specifically the fraction most distant from application to real work?

Summing up:

  • little of what we spend on training goes to design and delivery;
  • little of what we do spend on design and delivery is effective or relevant to job performance; and,
  • even that ignores the vast majority of critical job learning, which happens out of our usual sight - unstructured and inconsistent, on the job.

The Solution:

The solution is to attack the three rules in reverse order, starting with the on-the-job component.

First, we need to ask how we can influence that 80% of critical job learning that occurs on the job - without trying to take it off the job. The answer lies not in traditional classes, but in other means for learning and direct support of performance. I include electronic performance support systems (EPSS) and knowledge management initiatives that provide just-in-time access to critical understanding in a work process context. Coaching models fall here too, assuming they have a designed-in structure. Learn as you do, or simply do (to the extent that EPSS, for instance, diminishes the need to learn some kinds of information). Allow the need to perform a real business task "in the moment" to provide the most natural motivation to learn.

We should also think in terms of initiatives targeting intact, natural work groups, e.g., functional or project teams and their management learning and applying together. As a colleague once said, you can't put a changed individual into an unchanged organization and expect change to happen. Unfortunately, that's exactly what we do when we send individuals away to class - assuming they learn something of value, they come back and are swamped by the status quo.

Also, think of measures in terms of business performance, not learning. After all, performance is the only goal that matters in a business context. You may be able to infer learning from task performance (if you care to), but you can't infer performance from learning. So… measure performance, and don't consider your design to be complete until you're confident that performers will be able to achieve some challenging real-work metric - say, selling 500 of the new widgets, or handling 100 customer calls without escalation to a supervisor. Then, measure that!

Next, we need to think about jettisoning the courses we offer that don't demonstrably and dramatically affect business performance. Think lean and mean, not all things to all people. Dump the general skills entirely, or suggest external sources, or provide competency descriptions and means to assess skill levels - but don't waste precious resources on low-impact, "commodity" training. Years ago, Corning set a policy of offering no training for which a clear ROI was not demonstrable. It's a good rule. Existing courses that didn't meet the ROI test were off-loaded to a local community college - available to those who really wanted them, possibly reimbursed through tuition assistance, but not a burden to the training department.

By all means, assure that what you offer is linked to business plans. Not casually, but directly - by a strong feedback loop to senior executives that addresses the question: to achieve the business results you seek, what performance/skill gaps must be addressed… and in what quantity? Consider this rule: we only address existing or anticipated business performance gaps.

By and large, the notion of a "course" does more harm than good. Courses imply registration, scheduling, start dates and end dates, learning in advance of performance "just in case", and time away from work - all incompatible with a sense of learning urgency in a dynamic, complex environment. Presenting courses to intact work groups at "teachable moments" (e.g., presenting usability testing techniques and coaching at a critical stage in a systems development project) diminishes the downsides considerably.

Where courses are part of the right solution, assure that they include rigorous testing. Students who don't master the material should fail, period. That rule alone would do much to change the classroom atmosphere where, it seems to me, the students often sit back and expect to be taught - resting the burden of learning on the instructor. It would also do much to eliminate people who have no business being there in the first place, and who drain energy from the instructor and group.

Finally, if you've done a good job with the above, it should be possible to diminish your overhead expenses. For one thing, you won't be paying to administer or deliver a lot of dud courses. If you adopt the philosophy that learning doesn't necessarily require teaching, there are enormous cost savings possible. If you see coaching natural work groups as part of the solution, you may be able to distribute staff into key customer areas - shifting off some rent and equipment expenses at the same time you get better linkage to customer needs. If you're paying for a learning management system, and have purchased a technology-based curriculum to populate it, for Pete's sake cut back on registrars and other support staff, and don't offer costly, redundant, classroom alternatives. If you haven't bought a costly technology-based curriculum, consider negotiating a pay-per-utilization contract where you're not locked into (often unrealistic) demand forecasts.

What I've suggested hardly scratches the surface of what could be done. There's also much more that could be said by way of amplifying important principles like "learning without teaching." But the simple conclusion I'll leave you with is this: the potential is there to do far better than today's penny-on-the-dollar results, and the steps to achieve superior results are generally simple - you just need the right frame of reference which the eighty-twenty rules provide.

Additional Reading: Of the articles I have available on-line, these three are most relevant:

  • Stating the Obvious: Most Training is Boring, Ineffective, or Both: Written September 9, 2000 in response to Michelle Delio's report, "Online Training 'Boring'" (Wired News, August 30, 2000). Michelle is right, as far as she goes - but we should go a lot further! Human Resources Executive published a version of this piece in their January 2001 issue, but I feel my original makes a stronger case.

  • The 100 Percent Solution: Published as a Viewpoint in Training in 1998, this editorial advocates designing programs so that task performance is demonstrated on the job before training is considered complete.

  • Reengineering Corporate Learning: Published in Training in 1992, this article builds the case for a radical rethinking of the goals and activities relevant to learning and job performance in a corporate setting. Its conclusions flow from the simple observation that more than 80% of critical job learning occurs on the job, not in the classroom or other formal setting.

Acknowledgements: Thanks to my friends in the EPSS and Performance Centered Design community, my clients, and members of the STEP Consortium, for many engaging discussions of philosophy and design, both practical and abstract.

About the Author: Stan Malcolm serves as a coach to companies wishing to develop or review their strategies for performance support, learning technologies, and/or corporate learning issues generally. He also manages the STEP Consortium, a group of senior training and organizational development professionals in large corporations who have come together to share non-proprietary best practices. For eleven years prior to forming his consulting practice in 1994, Stan headed learning technologies and performance support initiatives at Aetna, Inc. With a mixture of pride and embarrassment, he admits that he's never had any formal training as an educator. What he does or doesn't know, he learned on the job.

Stan Malcolm, Ph.D., Principal
Performance Vision
17 Caffyn Drive
Marlborough, CT 06447