Wait! Before you go…

Our young pup, ADDIE, has grown up and is ready to go out into the big world on her own. However, before we send her off to embark on her new journey, there are a few things we would like to share about the last stages of the ISD model, ADDIE.

We’ve briefly touched upon the topic of evaluation; however, that had more to do with evaluation throughout the design process as a whole. Before we get to evaluation as a phase, we must cover implementation. The last blog discussed the importance of pilot-testing, which essentially gives the designer a chance to complete some dry runs before administering the real thing. In short, the implementation phase is when the actual course is put into play and the learners learn. They complete the various activities, sessions, and assignments along the way and, hopefully, meet the objectives outlined at the beginning of the course.

As Hodell points out, there are some sticky situations you can encounter in the implementation phase if you are both the designer and the facilitator: “Designers who do not teach are not necessarily following a faulty line of reasoning. In fact, it is probably a good idea for designers not to teach. Designers who are also facilitators have a tendency to believe they can improvise a fix for missing or faulty design elements on the spot. This is usually not the case. The ability to make alterations on the fly is normally the domain of the designer. Facilitators are not always experienced or capable of making a faulty lesson plan work as designers might want them to be” (Hodell, 2011).

This can be a difficult concept to wrap your mind around, especially if you are a teacher like me. I find that most teachers pride themselves on being able to adapt a lesson on the fly and make changes. However, when it comes to training and a more formal approach like ADDIE, adapting a lesson mid-delivery might not be the best thing to do. As Hodell says, sometimes the facilitator is not trained in or skilled enough with the design tools to modify the lesson, which in turn could create more issues for the learner than already exist.

Once the implementation phase has concluded, the last step is to evaluate the whole shebang and determine whether it was effective in meeting the objectives and goals. A guy named Donald Kirkpatrick divided evaluation into four parts: reaction, learning, behavior, and results (Hodell, 2011). The learning level requires the designer to assess learning in relation to the established objectives by using evaluation tasks. More simply put, items like formative and summative assessments can help provide the designer with important information about whether the objectives were met. The other level that is really important is reaction, which acts like a survey. Kirkpatrick includes topical questions about the whole training, like:

    • “Was your time well spent in this training?
    • Would you recommend this course to a co-worker?
    • What did you like the best?
    • What did you like the least?
    • Were the objectives made clear to you?
    • Do you feel you were able to meet the objectives?
    • Did you like the way the course was presented?
    • Was the room comfortable?
    • Is there anything you would like to tell us about the experience?” (Hodell, 2011, p. 67)

These questions are great building blocks for your own survey; however, they do not provide enough guidance when it comes to the specifics of your training. As you develop your own evaluation, you must consider the smaller aspects. For example, if you embedded websites the participants must explore, or texts they must read, consider asking how effective participants felt those resources were. Ask them whether they felt the assignments and tasks were authentic and helped them better understand the concepts. Another thing to consider is what type of evaluation you will administer. Evaluations are typically done in a survey format, but what format will your questions be in? A Likert scale, open-ended items, or something else? Nowadays, the go-to place to develop surveys is http://www.surveymonkey.com, which is great because you can make your own or use a template. It crunches the numbers and organizes the data for you, which can make your job easier in the long run. They also provide a ‘Survey 101’ crash course on developing your survey: https://www.surveymonkey.com/mp/survey-guidelines/?ut_source=header.
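If you are curious what that number-crunching actually involves, here is a minimal sketch of how five-point Likert responses could be tallied by hand in Python. The questions and ratings below are invented purely for illustration, and this is not a description of how SurveyMonkey or any other tool works internally:

    # Minimal sketch: tallying invented five-point Likert responses
    # (1 = strongly disagree ... 5 = strongly agree).
    from collections import Counter
    from statistics import mean

    responses = {
        "Was your time well spent in this training?": [5, 4, 4, 3, 5, 4],
        "Were the objectives made clear to you?": [4, 4, 5, 2, 4, 3],
        "Did you like the way the course was presented?": [3, 4, 4, 4, 5, 5],
    }

    for question, scores in responses.items():
        counts = Counter(scores)  # how many respondents chose each rating
        breakdown = ", ".join(f"{rating}: {counts.get(rating, 0)}" for rating in range(1, 6))
        print(question)
        print(f"  average = {mean(scores):.1f}  |  {breakdown}")

A survey platform does the same kind of averaging and counting for you, plus charts, which is exactly the drudgery you want to hand off when you are busy revising the course itself.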

ADDIE and I would like to wish you the best of luck with your instructional design endeavors! Keep in touch, and design like you have never designed before; just remember ADDIE’s name and you’re good to go!

References:

Hodell, C. (2011). ISD From the Ground Up (3rd ed.). Alexandria, VA: ASTD Press.
