Tag Archive | Evaluation

Informal learning… Maybe I Can Informally Assess Its Impact #LCBQ

March Big Question

This month’s Big Question at Learning Circuits is regarding assessing the impact of informal learning. Or more specifically:

How do you assess whether your informal learning, social learning, continuous learning, performance support initiatives have the desired impact or achieve the desired results?

The extent of my experience with evaluation has focused on applying Kirkpatrick’s model to classroom training and e-learning. Although some elements of the model may lend themselves to evaluating informal learning, I do not see the model as a whole working well for assessing its impact.

I wish I could present a straightforward model that works well for assessing the impact of informal learning, but I do not have one to offer. What I do have are some off-the-cuff ideas on how to assess impact. Much of the information collected will be anecdotal, but it nonetheless has value in assessing impact.

  • Survey staff regarding what they have learned and how they applied it. It makes sense to use a social media tool to do this (e.g., use hashtags in Twitter). Please don’t think smile sheet; instead, think individual questions delivered via social media.
  • Participate, participate, participate, and see firsthand what they are learning. They will probably also be talking about how they apply what they learned… That’s some good anecdotal evidence of applied behavior.
  • If you have identified specific things staff have learned and applied, look for how it has impacted the organization (results). Oops, easing into Kirkpatrick’s model, but if you can, you can.
  • Measure the “buzz.” Are people talking about it and/or encouraging others to use social media and informal learning? What are they saying that is convincing others? Is it because it has made a difference in their abilities or led to successes?
  • Find the leaders. Who is leading discussions, being quoted, retweeted, yammed about, followed, liked, etc.? Recruit them to help you measure the impact. They have pull and can help you gather much of the aforementioned information.
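To give a concrete feel for the last two ideas, here is a minimal sketch of measuring "buzz" and spotting leaders from exported social media posts. The post data, field names (`author`, `text`), and the `#learned` hashtag are all hypothetical placeholders, not a real Twitter or Yammer API:

```python
# A minimal sketch, assuming posts have been exported from a social media
# tool (e.g., a hashtag search) into a list of dicts. The field names and
# the "#learned" hashtag are hypothetical, not a real API.
from collections import Counter

posts = [
    {"author": "amy", "text": "Fixed our report macro after a tip #learned"},
    {"author": "ben", "text": "RT @amy: Fixed our report macro after a tip #learned"},
    {"author": "amy", "text": "Sharing a shortcut from today's chat #learned"},
]

hashtag = "#learned"
buzz = [p for p in posts if hashtag in p["text"].lower()]

# Volume of buzz: how many posts mention the hashtag over the period.
print(len(buzz))  # 3

# Candidate "leaders": who posts (or gets retweeted) most often.
leaders = Counter(p["author"] for p in buzz)
print(leaders.most_common(1))  # [('amy', 2)]
```

Nothing fancy, but even counts like these, tracked over time, turn anecdotal buzz into something you can chart.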

Keep in mind, the above are ideas I “blue skied,” but if you have additional ideas, I would love to hear them in the comments section. Don’t forget to check the ever-increasing posts and comments on the LCBQ blog and add your 2 cents there too.


If You Are Reviewing It Online, Why Do You Want Me to Print the Course?

All too often, people reviewing a web-based training (WBT) course, including subject matter experts (SMEs), request a printed copy. If that is absolutely the only way they will review it, then I do accommodate them, but only after I have exhausted all other attempts at getting them to do a proper online review.

Here are reasons not to print courses for a review:

  • It is important for anyone reviewing a course to not just look at content, but to review the entire learning experience including the delivery medium.
  • If they themselves are not willing to participate online, how can they expect, or request, our audience to participate?
  • Online courses are very often non-linear and thus do not fit a printed, linear format.
  • Courses are interactive. They may contain anything from simple rollovers to complex games or simulations. Interactivity does not translate to a printed page.
  • Once printed, a course is occasionally handed around for others to review without the designer’s knowledge. This can make it impossible to identify the origins of edits, and it can also result in draft content mistakenly being distributed to end users. All of this can be prevented by setting appropriate access in an LMS.
  • Depending on the authoring tools used, it can be time-consuming to print a course. For example, a course that contains many interactive Flash elements will require many screenshots to be taken. That time is better spent on design and development.
  • It is more environmentally friendly to review online. As a fellow e-learning designer said to me recently, “I killed many trees with ‘WBT to be printed out’ for SMEs, higher ups, etc.”

The reality is that people reviewing courses will push for a printed version, and sometimes the only way to get them to review the course is to comply. However, I will not comply without at least explaining the importance of an online review. In the end, even if I send a printed version or screenshots, I always supply easy access to the online course, along with several reminders of how important it is to also review it online.

Want to Learn More About Beta Testing?

Benjamin Martin has published Beta Testing an Online Course in Learning Solutions Magazine. It details his approach to beta testing online courses and provides practical advice for what is a very important stage of e-learning development. If you are creating e-learning, then you are probably involved in beta testing and will find this article helpful. If you are not beta testing your courses, then you should be and this article can help you get started.

You will need a subscription to Learning Solutions Magazine or a membership in the e-Learning Guild to read the article in its entirety. However, associate membership is free and, in my opinion, an absolute must for anyone in the e-learning field.

you have to fight for the right TO BE ENGAGING

I just read a comment on a blog where someone was very frustrated by bland, unengaging page turners. It got me thinking: how do you get an organization out of the rut of making page turners and start creating more engaging and effective courses? Here are my first thoughts:

  • Put on your instructional designer hat and do everything you can to educate all involved (SMEs, clients, managers, and audience too) on what effective e-learning is and how all involved can benefit from it.
  • Show all involved what effective e-learning looks like, with actual examples. Here is just one place where you can find them: http://minutebio.com/blog/free-e-learning/ (this is my Free e-Learning collection).
  • Find case studies, articles, evaluations, etc. that support your case.
  • Create a prototype to demonstrate the level of interactivity and engagement your organization can produce in a course. Get your co-workers involved so they will be vested in the “new approach.” This will earn you supporters and people who can rally against the archaic page turners the organization still wants to produce.
  • When you launch your prototype/course and your audience provides positive feedback, be quick to send that feedback to the powers that be, along with any evaluation you have done. They will have a hard time arguing against more interactive courses then.
  • Continue to evaluate your courses even after you have been given the go-ahead and resources to create more interactive courses. If you can demonstrate positive results at all four levels of evaluation, especially “results,” they will have little argument for ever implementing a page turner again.

What else can be done to address the organization stuck in page turner mode? Please feel free to make suggestions. Thanks.

What I Ask During a Course Review

I just released the first draft of a new WBT course, and as usual I have a slew of people reviewing it, including subject matter experts (SMEs) among others. In the past I have provided a general list of what aspects of the course should be reviewed (e.g., grammar, accuracy of content, navigation, technology). This time around I compiled a far more detailed list of concerns reviewers should be attentive to during their review. It is meant more as a guide to what they should be looking for, but it can also be used as a questionnaire.

Here is what I included:

Grammar

  • Check spelling, grammar, and consistency of language.

Objectives/Learning needs

  • Does the course answer your questions/concerns about the subject?
  • Do you feel prepared to begin applying the new knowledge/skills learned?
  • Does the course meet the objectives presented at the beginning of the course?
  • Do you feel you now have a better understanding of the subject at hand?

Navigation

  • Were there any links or buttons that did not work?
  • Were all navigational elements marked appropriately?
  • Were you able to navigate through the course with ease?

Graphics

  • Do you find the graphics helpful?
  • Do the graphics appear properly?
  • Was text in the graphics clear and visible?

Animation

  • Does the animation appear properly?
  • Was text in the animation clear and visible?
  • Do you find the animation helpful?

Simulations/Interactivity

  • Are the soft skill simulations reflective of realistic scenarios?
  • Do the simulations, interactive exercises and/or pop-ups function properly?
  • Are the software simulations/demonstrations realistic, and do they appear to reflect the actual “live” system?

Assessment

  • Do the questions measure your understanding of the content presented?
  • Are there questions that address content not presented in the course?
  • Are the questions and answers accurate, with no potential exceptions that could make an answer incorrect?
  • Is the feedback provided helpful?
  • Does the assessment provide correct scoring results?

Misc. technology

  • Does the audio function properly?
  • Do the videos function properly and appear professional?

I am sure as time goes on questions will be added and some will be eliminated. What would you include, eliminate or change on this list? Any input would be great.

Overview of Kirkpatrick's 4 Levels of Evaluation

Here is a nice, quick overview of Donald Kirkpatrick’s Four Levels of Evaluation, provided by Kirkpatrick Partners, LLC. It is also a nice addition to the Free e-Learning page.

Update on 1/26/2010 – the slides are no longer available online. Sorry for the inconvenience.

Slide 12 has a helpful matrix of tools that can be used for measuring each level.

Where Organizations Go Wrong With e-Learning

A post on e-Learning Guild’s group on LinkedIn asked, “In What Ways Do Organizations Get eLearning Wrong?” This question really got me thinking. There are many organizations that get it right, but many get it wrong too. Here are my thoughts on pitfalls organizations must avoid.

  • Not knowing the difference between an e-learning designer and an e-learning developer. Companies that only employ staff with development skills will end up with courses that look nice, but are not instructionally sound. Note: If the budget does not afford both a designer and developer, then find someone with both sets of skills.
  • Letting subject matter experts write courses. Again, an e-learning/instructional designer is needed to create engaging, instructionally sound courses.
  • Not conducting a needs assessment prior to creating a course. A needs assessment, even if done informally, is the only way to identify the audience’s learning needs, if any. Too many training departments act as order takers and end up creating courses that do not address the actual need.
  • Not identifying if e-learning is an appropriate medium for specific training needs. Some things ARE best delivered in the classroom.
  • Creating page turners, ugh! Courses can be non-linear. And they should also be interactive and engaging.
  • Not evaluating courses.
  • Thinking e-learning is only __________. e-Learning encompasses a breadth of training approaches. It is not just asynchronous online courses; it is also synchronous courses, blended learning, informal or social learning, etc. And within these there are many, many types of delivery mediums and approaches, with more on the horizon.

Feel free to add to the list.