Stating the Obvious: Most Training is Boring, Ineffective, or Both
"This really makes you sleepy, doesn't it?" Those words from my 8-year-old
daughter back in 1983 were my wake-up call. The circumstances were these:
her Daddy had brought her to work and was proudly showing off his latest
computer-based training course. Out of the mouths of babes, as they say.
She was right:
the CBT I'd created was boring, even granting that she wasn't representative
of the target audience. It's tempting to blame the tools of the era -
mainframe CBT in a whopping four colors, ASCII character graphics, and
constrained interactions - but the fact remains that CBT was far more
fun to create than it was to experience as a student.
The sad truth is that not very much has changed: with rare exceptions, CBT and distance
learning in their current incarnations - web-based training, e-Learning,
CD-ROM multimedia, and the like - still elicit yawns and rebellion. Many people
resist signing up and still more drop out before completing courses. Despite
fancy graphics and interactions, the fundamental model for technology-based
training remains the outdated and flawed "tell, tell, test"; that is,
a string of text-based or narrated instructional screens followed by an
occasional half-hearted question. Ostensibly new course models emerging
on the web are based on a book metaphor: "tell, tell,
tell" without even the pretense of interaction. The word on the street is that
on-line training is boring.
The word on the street is right, of course, at least as far as it goes. By and large, online training
is boring. I'm only surprised that people seem shocked to hear it.
But we can
go a lot further - and look at corporate training in general. I think
it is about time that we stated the obvious:
1. The vast
majority of corporate training initiatives, technology-based or traditional
classroom, are boring or ineffective - usually both.
2. The reasons
for item 1 are many. While hardly an exhaustive list, they include:
- The subjects
are rarely critical to business success. Have you ever asked yourself
who would complain if your company canned 80% of current course offerings?
We did it at the company where I worked and there was nary a peep. If
there was any sound at all, I think it might have been a sigh of relief.
- In design, objectives are watered down to what the trainers/vendors think
they can achieve - and then even that is compromised in delivery. We
lower expectations when we should be stretching them.
- Objectives, if we write them at all, are framed in terms of learning, not individual
performance, and certainly not business performance.
- In technology-based
learning, designers frequently try to address the boredom factor by
adding glitz, by which I mean gratuitous graphics and audio that do
not support learning, much less doing. In classroom events, designers
do much the same thing with cute, but mostly meaningless, exercises
and facilitation gimmicks - if not more blatantly with food and drink.
While these techniques may reduce boredom, they do so at the cost of
learning.
- The events
are not targeted to intact work teams where group learning reinforces
application and performance measurement. As a former boss, Barry Leskin,
once said, "You can't put a changed individual back into an unchanged
organization and expect change to happen."
- The events
do not acknowledge (or design to accommodate) the 80% or more of critical
job learning that always has, and always will, occur on the job. If
you don't believe that figure, just perform this simple exercise: First
reflect on the critical skills you apply on the job every day. Got them?
Okay, now ask yourself where you learned those skills: was it in a classroom
or CBT course, or in the school of hard knocks? Generally, people say
that most learning occurred on the job.
- The events
are not integrated with other interventions to form a comprehensive
solution. Other interventions might include, for example, redesigned
performance measures or improvements to work processes and work environment.
- Likewise, events are not integrated with each other even when they support the
same performance need. For example, it is common to teach customer service
representatives telephone skills and application system skills independently;
not recognizing that the real job requires performers to practice these
skills simultaneously. Paraphrasing performance support guru Gloria
Gery, that's like giving people the threads of learning and expecting
them to knit together competent on-the-job performance. Instead, we
should do the weaving for performers, giving them experience doing the
interwoven tasks that represent the real work we'll expect of them.
- We are fixated on inappropriate paradigms: the classroom experiences we had
as children, and reference books. Doesn't anyone remember how frustrating
these were? There are other models - apprenticeship, simulation, portfolio
assessment, and performance support among them - but we too rarely adopt
them. Frankly, we too often don't understand them.
- We still think in terms of the course, rather than flexibly in terms of
whatever unit of learning or support would best address a performer's
need in the context of doing their job. Along with the term "course"
comes a lot of dysfunctional baggage: courses generally require registration;
they imply a beginning and an end; they suggest that you must learn
before you can do; and they are perceived by students as something quite
different from work. If you've ever taught in a classroom, surely you've
experienced that first-day-of-class look from your students - you know,
that laid-back-in-their-chairs, "Okay, teach me" look?
- Training sponsors and participants have been conditioned to expect little
of training. In fact, a common trainee expectation is to be entertained.
Participants ought to expect to be made to perform real work during
the event and as a measure of their success - but that's an editorial
in itself. Still, ask yourself how the design of training would change
if designers adopted this simple rule: The "course" isn't over until
the learning has been successfully applied on the job. For example,
the sales course isn't over until the learner has sold 500 of the new
model widgets. As simple as that rule sounds, its effects are radical.
How many of the courses you currently offer conform to it? If designers
adopted it, wouldn't much of the frustration of measurement vanish?
Wouldn't the results in terms of business performance speak for themselves?
- Managers and supervisors typically abdicate their role in staff development,
throwing it over the wall to the training department. Training departments
typically collude by allowing it, albeit from a spirit of helpfulness.
Putting it plainly, trainers must not become responsible for development;
rather, they are support to management, who "own" development and the
business results it is meant to achieve.
- On the
rare occasions where we measure anything, we measure the wrong things:
trainee satisfaction and/or learning, not on-the-job application and
improved business performance. Spare me the objection that the latter
are impossible to measure; they can be measured if you set out initially
to measure them.
- Training departments are often driven by dysfunctional financial measures:
"asses in classes" to put it crudely. We should be measuring performance,
period. If we measure "seat time," we should be looking for reductions.
More importantly, we should be looking for reductions in cycle time
- particularly the time necessary for all affected staff to achieve
competence.
- Vendors are driven to reduce development costs, resulting in uninspired
(to put it mildly), cookie-cutter designs to standards we knew were
ineffective 15 years ago. This is particularly apparent to me in technology-based
designs, though I have no doubt that it applies to classroom training
as well. For example, my pet peeve in technology-based learning is audio
narration over identical on-screen text. I can read or I can listen,
but I can't read and listen simultaneously. (The right answer is audio
narration over bulleted text, with a full-text option - without audio
- for those with hearing impairments or a wish to print content for
later reference.)
- Most corporate
trainers are promoted from within, never having seen alternatives to
the way it has always been done. Their models are the schools they attended
as children and young adults, and the corporate training they experienced -
hardly innovative in most cases. Often their peers and managers find
innovation threatening. With some notable exceptions, graduate programs
in instructional design still depend on the same time-worn models. Even
technology-based learning has only taken off now that training departments
have found means to model and measure it traditionally: witness the
"success" of learning management systems and computer-mediated distance
learning based on course and classroom metaphors, while electronic performance
support is rarely understood, much less adopted. It seems to me that
some trainers are finally adopting technology only because they can
now do so without fundamentally rethinking what they've always done.
3. The rare
inspired, effective designs usually result from the efforts of inspired,
reflective individuals. However, these individuals:
- Have limited effectiveness beyond what they personally control - and
their span of influence is typically narrow.
- Are rarely
effective in the same setting for very long before their impact is neutralized
by factors outside their control (reorganizations, "politics," etc.).
I don't see the pace of change slowing, so the answer must be to create
solutions faster - and reap performance rewards within what Gloria Gery
has termed your sponsors' "window of enthusiasm."
4. The problems of practice I've described can be overcome - in fact, they have been overcome,
albeit uncommonly. In most cases, I've included the solutions in this
article. Where I haven't, it's because the solution is so obviously the
opposite of what I've described as the problem.
The models are there - I've seen them. In fact, I've watched talented,
imaginative designers build them. In my previous company, a large insurer,
we went far beyond "tell, tell, test" in technology-based applications.
We experimented with designs where every screen included a meaningful
interaction, and with designs where we used questions to engage learners'
interest (not expecting them to get the answers right), then satisfied
their curiosity with content. We built "reference-able CBT" - which in
standard practice would seem an oxymoron since typically one uses a CBT
course then throws it away, since it's inefficient to refer back to it.
We also designed what I called "whole-job simulations" where we allowed
learners to experience all aspects of a job, for instance, settling a
bodily injury claim - including both system and face-to-face tasks. Heck,
we even had learners taking pictures of a malingering claimant. In a satellite-based
distance learning event, we really stretched things by designing an event
with absolutely no teaching - but a heck of a lot of learning. Too much
to describe here, but very, very successful and a challenge I urge you
to consider. We did online newsletters (pre-internet/intranet) that combined
the urgency of bulletins with the archive and search functionality of
a database - and we used performers in the field as content sources since
they were nearest to the action (but not near to each other). Most importantly,
we experimented with electronic performance support before the term was
coined: only after we'd built our first one did we retroactively bless
it with the EPSS moniker. Other companies were innovating then, and still
others are innovating now. I don't mean to imply that what we accomplished
was unique. And while I've focused on technology-based innovation, it's
clearly true that imaginative designs for classroom-based learning are
out there too.
The shame is that the good designs are not more common. Training need not be boring
or ineffective in realizing business performance goals - if we recognize
the obvious, but choose not to accept the obvious as inevitable or adequate.
Thanks to my friends in the EPSS and Performance Centered Design community,
my clients, and members of the STEP Consortium, for many engaging discussions
of philosophy and design, both practical and abstract.
About the Author: Stan Malcolm serves as a coach to companies wishing to
develop or review their strategies for performance support, learning technologies,
and/or corporate learning issues generally. He also manages the STEP Consortium,
a group of senior training and organizational development professionals
in large corporations who have come together to share non-proprietary
best-practices. For eleven years prior to forming his consulting practice
in 1994, Stan headed learning technologies and performance support initiatives
at Aetna, Inc. With a mixture of pride and embarrassment, he admits that
he's never had any formal training as an educator. What he does or doesn't
know, he learned on the job.
17 Caffyn Drive
Marlborough, CT 06447