Week Eight: Assessment and Testing

This week we will use the objectives you developed in the first part of Assignment Three to begin writing assessment items. In instructional design, we write assessment items after objectives to ensure we are testing what we plan to deliver instructionally to learners.

Learning Objectives

  1. Describe the purpose of criterion-referenced tests.
  2. Describe how entry behavior tests, pretests, and posttests are used by instructional designers.
  3. Name four categories of criteria for developing criterion-referenced tests and list several considerations within each criterion category.
  4. Given a variety of objectives, write criterion-referenced, objective-style test items that meet quality criteria in all four categories.
  5. Develop instructions for product development, live performance, and attitude assessments, and develop a rubric for evaluating learners' work.
  6. Evaluate instructional goals, subordinate skills, learner and context analyses, performance objectives, and criterion-referenced test items for congruence.

Overview

We have all been subjected to testing in various formats since the beginning of grade school. The majority of these tests were probably criterion-referenced, that is, tests designed to determine how well we learned a particular subject. Most of us have also had experience with norm-referenced tests, such as the SAT or ACT, which are designed to rank or compare us with the other test-takers. While it is important that you understand the difference between the two, for the purposes of this course, we will focus on criterion-referenced assessment instruments.

Think back on a test you took in school where it seemed as though the material on the test had not been covered in class or in the textbook. Sound familiar? Very often this occurs when the test items are not congruent with the learning objectives. Constructing assessment instruments immediately after you have written the learning objectives helps ensure that congruency. In fact, if you are unable to design an appropriate assessment instrument, you may want to go back and revisit your learning objectives. As instructional designers, you should become comfortable moving back and forth between the various phases of the design process, recognizing that it is an iterative, not a linear, process.

This week's activity involves generating assessment items for each of your performance objectives. These assessment items constitute the second part of Assignment 3.1.

Comments

One of the first decisions in this phase of the instructional design process is to choose which type of test, or combination of tests, you will create: entry behavior tests, pretests, practice tests, and/or posttests. See the table on pages 148-149 for a concise summary of each. In addition, you must choose between an objective test format, for verbal and intellectual skills objectives, and an instrument that evaluates intellectual skills that result in a product or performance.

Creating assessment instruments is cognitively challenging and much more difficult than it might appear. Below is an example of a procedure for constructing a test item. Include this analysis as part of Assignment 3.1, as it will help your peer reviewer (as well as me) follow your rationale for constructing the test item. Then, once you have constructed the test item, include it in the appropriate location in your version of the table that lists your objectives.

Example

Objective 4.0 - Given a research topic and a list of ten Google search results, select the three sites most appropriate to the research topic.

  1. What will they need to do? The learners should be able to select web sites from a list of search results.
  2. What conditions will need to be provided? The learners will need to be given a predetermined research topic and a list of actual Google search results related to that topic.
  3. Domain - Intellectual Skills: Rules. Students have to apply a set of criteria in order to make a decision.
  4. This objective requires an objective-style fill-in-the-blank test item, as the students will have to write down the three most appropriate sites based on certain criteria.

Test Item: Take a look at the following Google search results: (show screen capture of search results). Which three web sites are likely to have specific and relevant information dealing with the subject of life on Mars?

Source: Virginia Tech

If you are having trouble deciding what type of test to use, Table 7.1 on p. 154 has a nice matrix correlating the types of test items with the type of behavior stated in the objective.

Note that rubrics are becoming increasingly popular as assessment instruments because, if constructed correctly, they can provide a descriptive, holistic characterization of the quality of a student's work. Portfolios are also gaining popularity as a means of "authentic testing," or assessing performance in realistic contexts.

Required Reading

  1. Chapter 7

Additional Reading

  1. Developing tests - tips from Don Clark; an excellent resource
  2. Writing test items - Michigan State; another excellent resource
  3. Authentic assessment toolbox - Jon Mueller, professor of psychology
  4. Alternative assessment - North Central Regional Educational Laboratory (NCREL)
  5. A taxonomy of assessment formats - University of Oregon

Assignment

  1. Write at least one congruent assessment item for each of the learning objectives in your instructional unit.
  2. Look at the grading rubric for this assignment to assess whether you are on the right track.

Online Class Discussion

Monday night from 8-9:30 p.m., Eastern time via Webex.

Agenda:

  • 8:00-8:10 Introductory discussion
  • 8:10-8:25/30: Drew's discussion on assessment
  • 8:30-9:20 Working discussion of your objectives and assessment items; be prepared to present 2 or 3 draft objectives and assessment items to the class. These can be very incomplete and conceptual and are for discussion only.
  • 9:20-9:30 Preview of next week

Please email me with any questions.

Contact Me

codone_s@mercer.edu

Suggested Reading

Chapter 7 Case Study (for examples)