Working With Subject Matter Experts and Action Mapping

Working with a subject matter expert (the individual who has applicable content knowledge for the course you are developing) and action mapping (a training design process that helps determine what actions, behaviors, or skills should be the “end game” of the course you are developing) are both meaty topics that could each fill multiple blog posts. We will likely revisit both subject matter experts and action mapping at a later time, but I want to share a tool I developed based on concepts found on Cathy Moore’s blog, Let’s Save the World From Boring Training. (More on Cathy Moore later, but you need to give her site a “look see” if you have never had the pleasure.)

Cathy Moore has some wonderful eBooks, resources, and blog entries about action mapping and working with subject matter experts. I used her resources early and often while determining how to frame content for my first e-learning courses. I was sold on her process of guiding subject matter experts through an intentional series of questions to determine:

  1. Whether a course is needed for the content/topic being proposed (Is the content/topic necessary to achieve a business outcome? Could it be communicated in a job aid or email instead of through training?)
  2. Whether there are specific, necessary actions, skills, or behaviors that a learner needs to be able to perform at the conclusion of the course.
  3. What activity, scenario, or content should be developed to help learners perform that needed action, skill, or behavior.

I wanted a simple tool for applying the concepts Cathy Moore teaches so I could stay on track while working with subject matter experts (SMEs), lest I fall down the rabbit hole of what a learner needs to “know” rather than the much more productive question of what the learner needs to “do.” Click on the image below to access a PDF file. Leave comments if you have suggestions or other resources that you like to use!

SME and Mapping

How Challenging Should e-Learning Be?

photo from enspire.com

There is a balance to engaging learners. It is tempting to believe that if we throw in some animations, add audio, and cap things off with a Trivial Pursuit-themed game, learners will be engaged and will therefore retain the knowledge we intend to impart, but the reverse can be true. And let’s not even quibble over building amazingly interactive training that helps learners “know more” instead of “do more”; that is another discussion entirely.

The balance lies in deciding what content to include and incorporating activities that will facilitate learning. You might be tempted to hand learners the answers, making the conclusions they should draw obvious, on the theory that if learners have success or can “demonstrate” that they know the material, learning has occurred. However, research supports the idea that more challenging learning experiences are related to higher retention and deeper learning.

Bjork and Bjork have a wonderful scholarly article that examines the difference between learning and performance, but the main thrust of the article centers on “desirable difficulties.” It speaks to the idea that while cramming for a test can be effective for short-term recall, it fails for long-term retention. The answer to this dilemma lies somewhere in the structure and difficulty of the learning experience.

Dorothy Leonard writes for Harvard Business Review about the need to make organizational learning a bit more challenging in her article ‘Why Organizations Need to Make Learning Hard’. She states, “Both learners and teachers confuse performance during training (termed ‘retrieval strength’) with long-term retention and the ability to apply the lessons (‘storage strength’).”

Not only does this concept necessitate structuring training for longer-term retention, it may also mean rethinking how we evaluate the impact of training on organizational performance. Is it enough to test a learner immediately following a training exercise (which may only demonstrate “cram and test” retrieval strength), or are there opportunities to evaluate individual performance in the wake of learning opportunities (storage strength)? More challenging for instructional designers? Yes. More impactful for learners? Hmmm…guess we need to figure out a way to measure that.