Instructional Systems Design - ISD
05/04/09
Introduction to Instructional Design
Topic: 1- Introduction

Any teacher, school, company, or institution interested in improving learning and/or performance, in moving from good to great, and in meeting quality and productivity goals (p.16) needs Instructional Design!

Conducting a performance analysis, a needs assessment, or a job analysis; identifying problems; listing solutions; setting goals... all of these definitely lead to success! The key, however, is actually reaching the goals we set, and that is what this blog is about: the steps we need to take in order to reach our goals.

This blog -- which will always be a "Work in Progress" -- is based on Dick, Carey & Carey's book "The Systematic Design of Instruction" and on Dr. Ryan Watkins's lectures. I created it not only because I want all this information to be handy when I need it, but also because I know how useful it is to educators and business people alike.

If you are interested in seeing how ISD can be applied:
- Click HERE for a Short Story Unit (by Nada S.A.)
- Click HERE for a Running Training Program (by Carol L.P.)

If you have any suggestions or comments, feel free to use the "Post Comment" link at the end of each section or to join our group on Facebook.

Regards,

Nada S.A.
http://www.nadasisland.com


When is Instructional Design used?
Instructional Design is used in the development of instructional events such as:

  • courses
  • training units
  • lessons
  • seminars
  • workshops
  • computer-based training

These events serve to reach goals that were set in order to close a gap identified through front-end analysis.

Why are instructional events developed?

Instructional events are developed in order to provide learners with the required SKAAs:

  • Skills to accomplish a task
  • Knowledge to accomplish a task
  • Attitudes to accomplish a task
  • Abilities to accomplish a task

How is an instructional event best measured?

An instructional event is best measured by

  • its ability to assist learners in mastering the required SKAAs for accomplishing a given task, and
  • the application of those SKAAs and the value their application adds to the organization.

Systematic instructional design is:

  • performance based
  • learner focused
  • interactive
  • data driven
  • systematic

=> a process for creating effective instructional events.

It is an effective way of facilitating and replicating the design and development of instruction that achieves results.

Components of Dick, Carey & Carey's systems approach model (p.6)

  1. Identifying Instructional Goals
  2. Conducting Instructional Analysis > determining, step by step, how to reach the goal; this requires identifying entry behaviors (skills, knowledge, and attitudes).
  3. Analyzing Learners & Contexts
  4. Writing Performance Objectives
  5. Developing Assessment Instruments
  6. Developing Instructional Strategy
  7. Developing and Selecting Instructional Materials
  8. Designing and Conducting Formative Evaluation
  9. Revising Instruction
  10. Designing and Conducting Summative Evaluation

["As you begin designing instruction, trust the model. As you grow in knowledge and experience, trust yourself." (p.5)]

[What it means to practice a discipline > To practice a discipline is to be a lifelong learner. You "never arrive"; you spend your life mastering disciplines. (Peter Senge; 1990) (p.5)]


Posted by Nada at 12:01 AM EDT
Updated: 07/25/11 5:47 PM EDT
05/03/09
Front-End Analysis to Identify Instructional Goals
Topic: 2- Front-End Analysis

Front-end analysis consists of:

  1. Performance Analysis
  2. Needs Assessment / Analysis
  3. Job Analysis

According to instructional designers, the best approach for identifying instructional goals is the performance technology approach, in which instructional goals are set in response to problems or opportunities within an organization. (p.16)

Designers engage in performance analysis and needs assessment processes in order to:

  • Identify problems
  • Find causes of problems
  • List possible solutions
    • Identify instructional goals
    • Identify changes needed (seldom is instruction the single answer to a problem)

1. Performance analysis (p.18)

According to Dick & Wager (1995), performance analysis is the use of analytical tools for

  • identifying organizational performance problems
  • developing the most appropriate solutions

Purpose of a performance analysis study (p.20)

The purpose of a performance analysis study is to

  • acquire information in order to
  • verify problems and
  • find solutions

Outcome of a performance analysis study

  • Clear description of the problem: failure to achieve desired results (desired vs. actual employee performance)
  • Evidence of problem causes
  • Suggested cost-effective solutions

2. Needs assessment: Ways to conduct one (p.23) (Example)

  • surveys (past and present)
  • insightful interviews (individual or group interviews)
  • direct observations
  • questionnaires
  • other data collection techniques

Components of needs assessment

  • desired status
  • actual status
  • gap > need
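
As a toy illustration of the gap idea just listed (desired status minus actual status equals the need), here is a minimal Python sketch; the metric and the numbers are invented for the example, not taken from the book.

```python
# Minimal sketch (illustrative only): a "need" expressed as the gap between
# desired status and actual status. The metric and figures are hypothetical.
desired_status = 0.95   # e.g., 95% of service calls resolved within 24 hours
actual_status = 0.70    # measured current performance

gap = desired_status - actual_status
if gap > 0:
    print(f"Need identified: close a {gap:.0%} performance gap")
else:
    print("No gap: the desired status is already met")
```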

Results of effective needs assessment

  • description of need
  • evidence of its validity
  • possible solutions

3. Job analysis (p.23)

  • The people who perform the job and the environment surrounding the job provide the characteristics of the job.
  • Task inventory: the tasks that comprise the job > duties
  • Screen the task inventory by asking
    • SMEs (Subject Matter Experts)
    • job incumbents
      whether the tasks are really part of the job
  • Revise
  • Format tasks as survey, response scales, directions
  • Pilot test the survey
  • Final revision
  • Distribute survey to a sample of job incumbents
  • Summarize responses on a task-by-task basis
  • Choose high priority tasks for further review
  • THEN, conduct a TASK ANALYSIS
    1. break down tasks for review into component elements
    2. explain in detail the relationships among elements
    3. describe tools + conditions involved in performing each element
    4. write standards for successful performance
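
To make the survey-summarizing steps above more concrete (distribute the survey, summarize responses task by task, choose high-priority tasks), here is a minimal Python sketch. The task names, the 1-to-5 importance ratings, and the 4.0 cutoff are hypothetical assumptions, not values from the book.

```python
# Illustrative sketch: summarize a job-analysis survey task by task and
# short-list high-priority tasks for the subsequent task analysis.
# Tasks, ratings, and the cutoff are invented for the example.
from statistics import mean

responses = {
    "greet the customer": [5, 5, 4, 5],
    "log the service request": [4, 3, 4, 4],
    "escalate unresolved cases": [2, 3, 2, 3],
}

def high_priority_tasks(responses, cutoff=4.0):
    """Return tasks whose mean rating meets the cutoff, highest first."""
    summary = {task: mean(ratings) for task, ratings in responses.items()}
    return sorted(
        (task for task, score in summary.items() if score >= cutoff),
        key=lambda task: summary[task],
        reverse=True,
    )

print(high_priority_tasks(responses))  # candidates for the task analysis
```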

[What does critical thinking entail? (p.18) 

  1. be open-minded
  2. be objective
  3. seek root causes
  4. view problem from multiple perspectives
  5. give a fair hearing to evidence on multiple perspectives
  6. suspend judgment until all pertinent information has been heard
  7. listen to contrary views
  8. change a conclusion in the face of compelling information]

Relevant Links:

What is Performance Analysis?

NEEDS ASSESSMENT- the first step

THE NEEDS ASSESSMENT- TOOLS FOR LONG-TERM PLANNING

Needs Assessment Training

Needs Assessment Strategies for Community Groups and Organizations

Needs Assessment Tutorial

Job Analysis: Overview

Job Analysis

Job Analysis Tools (and sample worksheets)

Personnel Manager: Job Analysis

How to Write a Job Analysis and Description


Posted by Nada at 12:01 AM EDT
Updated: 05/04/09 11:50 PM EDT
05/02/09
Instructional Goals
Topic: 3- Instructional Goals

Instructional goals are ideally derived through:

  • Performance analysis > it gives broad indications of problems that can be solved by providing instruction
  • Needs assessment > it determines more specifically what performance deficiencies will be addressed

A complete goal statement should describe the following:

  • who learners are
  • what they will be able to do
  • the context in which they will use the skills
  • the tools that will be available to the learners in the performance context

The instructional goal should be: (p.31)

  • clear/complete > a general statement of learner outcomes (includes the aforementioned)
  • related to the identified problem + needs assessment
  • achievable through instruction

Criteria for establishing instructional goals: (p.32)

  • acceptable to administrators
  • sufficient resources (time, money, personnel) to develop instruction
  • stable content
  • available learners (not too busy)

How to conduct a Goal analysis (p.24)

  1. write down goal
  2. indicate, step-by-step, what people need to do to achieve the goal
  3. sort through statements
  4. indicate what learners will be able to do
  5. ask: "If learners achieve each performance, will they have achieved the goal?"

Two fundamental steps to conduct a goal analysis: (p.40)

  • classify the goal statement according to the kind of learning that will occur (Domains of learning)
  • identify and sequence the major steps required to perform the goal

Domains of learning:

  1. Intellectual skill
  2. Verbal Info
  3. Psychomotor skill
  4. Attitude

Relevant Links:  

Mission, Goals, Objectives (Bloom's Taxonomy)

Example of an Instructional Goal


Posted by Nada at 12:01 AM EDT
Updated: 05/05/09 12:59 AM EDT
05/01/09
Instructional Analysis: Subordinate Skills & Entry Behaviors
Topic: 4- Subordinate Skills

Subordinate Skills

After the steps in the goal have been identified, it is necessary to examine each step to determine what learners must know or be able to do before they can learn to perform that step in the goal => this is referred to as Subordinate Skills Analysis. 

Purpose: Identify the appropriate set of subordinate skills for each step.
If required skills are omitted and many students do not have them => instruction will be ineffective!
If superfluous skills are included => they might interfere with learning the required skills.

A Subordinate skill is "a skill that must be achieved in order to learn a higher level skill. Also known as a subskill, prerequisite, or enabling skill."

 

Approaches to Subordinate Skills Analysis:

A- Hierarchical analysis technique: Used for analyzing intellectual or psychomotor skills. (p.62) Sometimes sequences of procedural steps will be included in a hierarchical analysis. (p.91)

In order to identify critical subordinate skills, ask the question: 

What must students already know so that, with a minimal amount of instruction, this task can be learned?

In order to identify additional subordinate skills, ask the question: 

What must students already know how to do, the absence of which would make it impossible to learn this subordinate skill?

What mistake might students make if they were learning this skill? (p.66)

In order to learn how to perform problem-solving skills, learners must first know how to apply the rules that are required to solve the problem => the immediate subskills of the instructional goal are the rules that must be applied in the problem situation.

In order to learn the relationship among "things," you must be able to classify them => the subordinate skills required for any given rule are the classification of the concepts used in the rule.

If the step is a problem-solving skill, the subskills should include

  1. the relevant rules
  2. concepts
  3. discriminations (discriminating whether examples are relevant to the concept or not)

The whole point of using the hierarchical approach is to identify just what the learner must know to be successful. (p.67)

B- Procedural analysis technique: Used with the steps for intellectual or psychomotor skills, when those contain an additional set of mental or physical steps. (p.67)

The subskills go in the same direction as the main steps => they are analyzed in the same step-by-step manner as was done for the goal analysis.

C- Cluster analysis technique: Used when analyzing verbal information goals. (p.68)

The most meaningful analysis of a verbal information goal is to identify the major categories of information that are implied by the goal. (Are there ways that the information can be clustered best?)

D- Subordinate skills analysis for attitude goals (p.69)
=> combine with psychomotor or intellectual skills.

In order to determine the subordinate skills for an attitudinal goal, ask the following questions:

  • What must learners do when exhibiting this attitude?
    • psychomotor or intellectual
    • => hierarchical analysis
  • Why should learners exhibit this attitude?
    • verbal information
    • => cluster


Entry Behaviors

The instructional analysis process helps the designer identify entry behaviors: what learners will already have to know or be able to do BEFORE they begin the instruction.

Entry behaviors are defined as "specific competencies or skills a learner must have mastered before entering a given instructional activity."

In order to identify entry behaviors, examine the hierarchy or cluster analysis and identify those skills that a majority of the learners will have already mastered before beginning your instruction. Draw a dotted line above these skills in the analysis chart.  The skills that are above this line are those you must teach in your instruction. Those that fall below the line are the entry behaviors.
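
The "dotted line" decision above can also be pictured as a simple partition. Here is a minimal sketch, assuming you have an estimate of how many target learners have already mastered each skill in the chart; the skill names, mastery rates, and the majority threshold are hypothetical.

```python
# Illustrative sketch: split the skills in an analysis chart into entry behaviors
# (already mastered by a majority of learners) and skills to teach.
# Skill names and mastery rates are hypothetical.
mastery = {
    "read a simple bar chart": 0.90,
    "compute a percentage": 0.85,
    "choose an appropriate chart type": 0.40,
    "justify the chart choice in writing": 0.20,
}

entry_behaviors = [skill for skill, rate in mastery.items() if rate > 0.5]
skills_to_teach = [skill for skill, rate in mastery.items() if rate <= 0.5]

print("Below the dotted line (entry behaviors):", entry_behaviors)
print("Above the dotted line (to be taught):", skills_to_teach)
```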


Relevant Links:

Example of an Instructional Analysis

http://edtech.ced.appstate.edu/class/isd/modules/mod_4/


Posted by Nada at 12:01 AM EDT
Updated: 05/05/09 1:01 AM EDT
04/30/09
Learner and Context Analyses
Topic: 5- Learners & Contexts

- Learner Analysis: it involves determining the characteristics of the learner (p.117)

  1. Educational and ability levels / Reading level
  2. Attention span
  3. Previous experience / prior knowledge of topic
  4. Academic motivation level (ARCS)
  5. Attitudes towards school/work/organization
  6. Attitudes towards content and potential delivery system-- expectations
  7. Previous performance level
  8. Entry behaviors
  9. Group characteristics

- Learning Context Analysis: it involves determining the contexts in which the instruction will be delivered

  •  Resources that support instruction:
    • room
    • environment
    • tools needed
    • tools provided
  • Constraints that could inhibit or limit instruction:
    • finances
    • personnel
    • time
    • facilities
    • equipment
    • local culture
  • Compatibility of learning site with instructional needs and learners' needs
  • Adaptability of site to simulate workplace (feasibility of simulating performance site)

- Performance Context Analysis: it involves determining the context in which the skills will eventually be used

  • when they will use the skills
  • the environment where they will use the skills / the information
  • how motivated they are
  • the managerial support they will receive (recognition? praise?)
  • the physical aspects of the site: will the use of skills depend on:
    • equipment?
    • facilities?
    • tools?
    • timing?
    • resources?
  • the social aspects:
    • alone?
    • in teams?
    • independently?
    • will they make presentations?
  • relevance of skills to workplace

How to Conduct the Analyses:

  1. Meet with learners / instructors / managers
  2. Talk to learners / instructors / managers
  3. Observe learners / instructors / managers

Tools/Methods:

  • Interviews about learners'
    • entry behaviors
    • personal goals
    • attitudes
    • + self-report skill levels
  • Surveys
  • Questionnaires
  • Pretests

Relevant Link:

Example of Learner & Context Analyses


Posted by Nada at 12:01 AM EDT
Updated: 05/05/09 1:04 AM EDT
04/29/09
Writing Performance Objectives
Topic: 6- Performance Objectives

A performance objective is a detailed description of what students will be able to do when they complete a unit of instruction (p.125). Performance objectives are also sometimes called behavioral objectives or instructional objectives.

Performance objectives are derived from the skills in the instructional analysis. One or more objectives should be written for each of the skills identified in the instructional analysis. For every unit of instruction that has a goal, there is a terminal objective (p.131). The terminal objective has the three components of a performance objective, but it describes the conditions for performing the goal at the end of the instruction.

For each objective we write, we need to have a specific assessment of that behavior (that must be observable!).

Important question to ask ourselves when writing objectives: (p.133)
"Could I design an item or task that indicates whether a learner can successfully do what is described in the objective?"

Components of an Objective:

  1. Performance: A description of the skill or behavior identified in the instructional analysis. (What the learner will be able to do-- both action and content or concept)
  2. Conditions: A description of the conditions that will prevail while a learner carries out the task > what the learner will be given/ what will be available to him/her (also check pp. 127-129)
    • computer to use?
    • calculator?
    • paragraph to analyze?
    • story to read?
    • content?
    • from memory?
  3. Criteria: A description of the measures, or standards that will be used to evaluate learner performance (also check pp. 130-131)
    • the tolerance limits for a response
    • range
    • qualitative judgment (what needs to be included in an answer or physical performance judged to be acceptable)
    • given time period/circumstance
      • use a checklist of behaviors or a rubric to define complex criteria for acceptable responses.
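
A small sketch may help keep the three components listed above distinct. The wording of the example objective is invented; it simply shows conditions, performance, and criteria held separately and then assembled into one statement.

```python
# Minimal sketch (illustrative, not a prescribed template): the three components of a
# Mager-type objective kept as separate fields and assembled into one statement.
from dataclasses import dataclass

@dataclass
class PerformanceObjective:
    conditions: str   # what the learner will be given / the situation
    performance: str  # the observable behavior (action + content)
    criteria: str     # the standard used to judge the performance

    def statement(self) -> str:
        return f"{self.conditions}, the learner will {self.performance}, {self.criteria}."

objective = PerformanceObjective(
    conditions="Given a one-page short story and a plot diagram template",
    performance="label the exposition, climax, and resolution of the story",
    criteria="with all three elements identified correctly",
)
print(objective.statement())
```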

The steps in writing performance objectives: (p.132)

  • Edit goal to reflect eventual performance context.
  • Write terminal objective to reflect context of learning environment.
  • Write objectives for each step in goal analysis for which there are no substeps shown.
  • Write objectives that reflect the substeps in one major objective, or write objectives for each substep.
  • Write objectives for all subordinate skills.
  • Write objectives for entry behaviors if some students are likely not to possess them.

 Some important guidelines for writing performance objectives: (Dr. Watkins)

  • Always state objectives from the point of view of the learner (i.e., what the learner/trainee will be able to do) NOT from the point of view of the instructor (i.e., what the teacher/trainer will teach)
  • Always include at least the three components (conditions, performance, criteria) that a Mager-type objective calls for (please read on below for more information...)
  • Make sure the Performance part of your objective is observable
  • Make sure the standards (criteria) are measurable, clear and not open for interpretation
  • Do not confuse conditions and standards
  • Take conditions into consideration when determining realistic criteria  

Relevant Links:  

Example of Performance Objectives

Mission, Goals, Objectives (Bloom's Taxonomy)

Relevant Book: 

Preparing Instructional Objectives: A Critical Tool in the Development of Effective Instruction by Robert F. Mager (1977)


Posted by Nada at 12:01 AM EDT
Updated: 05/05/09 1:05 AM EDT
04/28/09
Assessment Instruments
Topic: 7- Assessment

A- Criterion-referenced instruments
B- Student work samples
C- Student Narratives
D- Performance Standards and Assessment Rubrics


A- Criterion or objective-referenced assessment instruments:
(Learner-Centered) (p.145)
> Assessment that enhances student learning (Baron, 1998)

Criterion-referenced assessments are

  • linked to instructional goals
  • linked to objectives derived from goals

The purpose of this type of assessment is to evaluate:

  1. students' progress: they enable learners to reflect on their own performances > they ultimately become responsible for the quality of their work
  2. instructional quality: they indicate to the designer which components of the instruction went well and which need revision

+ it contains a criterion that specifies how well a student must perform the skill in order to master the objective.
> Therefore, performance criteria should be congruent with the objectives, learners, and context.

The performance required in the objective must match the performance required in the test. (p.146)

Two uses of criterion-referenced tests:

  • pretests: they have 2 goals
    • to verify that the student possesses the anticipated entry behaviors
    • to measure the student's knowledge of what is to be taught.
  • posttests: they are used primarily to measure the student's knowledge of what was taught.

4 Types of Criterion-Referenced Tests & Their Uses (p.146)

a- Entry behaviors test: It is given to learners before they begin instruction in order to assess those learners' mastery of prerequisite skills.
This entry behavior test should cover the skills that are more questionable than others in terms of being already mastered by the target population.
From entry behaviors test scores, designers decide whether learners are ready to begin the instruction.

b- Pretest: It is given to learners before they begin instruction in order to determine whether they have previously mastered some or all of the skills that are to be included in the instruction.
If all the skills have been mastered, then the instruction is not needed.
The pretest includes one or more items for key skills identified in the instructional analysis, including the instructional goal.
Since both entry behaviors tests and pretests are administered prior to instruction, they are often combined into one instrument.
From pretest scores, designers decide whether the instruction would be too elementary for the learners and, if not, how to develop instruction most efficiently for a particular group.
A pretest is valuable only when it is likely that some of the learners will have partial knowledge of the content.

c- Practice/rehearsal tests: They are focused at the lesson level rather than the unit level. The purposes of practice tests are to:
- provide active learner participation
- enable learners to rehearse new knowledge and skills
- enable learners to judge for themselves their level of understanding and skill
- enable the instructor to provide corrective feedback
- enable the instructor to monitor the pace of instruction

d- Posttests: They are administered following instruction, and they are parallel to pretests (excluding entry behaviors items).
Posttests measure objectives included in the instruction. They assess all of the objectives, with a special focus on the terminal objective.

Designing a Test (p.149)

A criterion-referenced test is designed by matching the learning domain with an item or assessment task type.

a- Objectives in the Verbal Information Domain: they require objective-style test items which include the following formats:

  • short answer
  • alternative response
  • matching
  • multiple-choice items

b- Objectives in the Intellectual Skills Domain: they require one of the following:

  • objective-style test items
  • the creation of a product (essay, research paper...)
  • a live performance of some type (act in a play, make a presentation, conduct a business meeting...)

If an objective requires the learner to create a unique solution or product, it would be necessary to 

  1. write directions for the learner to follow
  2. establish a set of criteria for judging response quality
  3. convert the criteria into a checklist or rating scale (rubric) that can be used to assess those products

c- Objectives in the Affective/Attitudinal Domain: They are concerned with the learner's attitudes or preferences. Items for attitudinal objectives require one or both of the following:

  • that the learners state their preferences
  • that the instructor observes the learners' behavior and infers their attitudes from their actions

d- Objectives in the Psychomotor Domain: They are usually sets of directions on how to demonstrate the tasks and require the learner to perform a sequence of steps that represents the instructional goal. Criteria for acceptable performances need to be identified and converted into a checklist or rating scale that the instructor uses to indicate whether each step is executed properly.
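
The domain-to-item-type pairing described in this section can be summarized as a simple lookup. The sketch below only restates the mapping above in Python; it is not an authoritative taxonomy.

```python
# Illustrative sketch: typical item or assessment-task types for each learning domain,
# as summarized from the discussion above.
item_types_by_domain = {
    "verbal information": ["short answer", "alternative response", "matching", "multiple choice"],
    "intellectual skills": ["objective-style items", "product creation", "live performance"],
    "attitudes": ["stated preferences", "observed behavior with inferred attitudes"],
    "psychomotor skills": ["performance of steps scored with a checklist or rating scale"],
}

def suggested_item_types(domain):
    """Return candidate item types for a learning domain (empty list if unknown)."""
    return item_types_by_domain.get(domain.lower(), [])

print(suggested_item_types("Verbal Information"))
```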

Writing Test Items (p. 151)

There are 4 categories of criteria that should be considered when creating test items:

  1. Goal-Centered Criteria: The test items and assessment tasks should be congruent with the terminal and performance objectives; they should provide learners with the opportunity to meet the criteria necessary to demonstrate mastery of an objective.

    No rule states that performance criteria should or should not be provided to learners. Sometimes it is necessary for them to know the performance criteria and sometimes it is not.

    (Based on my personal experience, I would say that it is very beneficial for learners to receive the criteria before they are given the test; this helps them prepare for it much more efficiently.)
  2. Learner-Centered Criteria: The test items and assessment tasks must be tailored to the characteristics and needs of the learners. Criteria in this area include considerations such as learners'
    1. vocabulary and language levels
    2. developmental levels (for setting task complexity)
    3. motivational and interest levels
    4. experiences and backgrounds
    5. special needs

      Designers should consider how to aid learners in becoming evaluators of their own work and performances. Self-evaluation and self-refinement are two of the main goals of all instruction since they can lead to independent learning.
  3. Context-Centered Criteria: The test items and assessment tasks must be as realistic or authentic to the actual performance setting as possible. This criterion helps to ensure transfer of the knowledge and skills from the learning to the performance environment.
  4. Assessment-Centered Criteria: The test items and assessment tasks must include clearly written and parsimonious directions, resource materials, and questions + correct grammar, spelling, and punctuation!

    To help ensure task clarity and to minimize learners' test anxiety, learners should be given all the necessary information to answer a question before they are asked to respond.

    Items should not be written to trick learners! Ideally, learners should err because they do not possess the skill, not because the test item or assessment is convoluted and confusing!

Evaluating Tests and Test Items (p.156)

When writing test directions and test items, the designer should ensure the following:

  1. test directions are clear, simple, and easy to follow
  2. each test item is clear and conveys the intended information
  3. conditions under which responses are made are realistic
  4. the response methods are clear to learners
  5. appropriate space, time, and equipment are available

Using Portfolio Assessments (p. 162)

Portfolios are collections of criterion-referenced assessments that illustrate learners' work. These assessments might include:

  • objective-style tests that demonstrate progress from the pretest to the posttest
  • products that learners developed during instruction
  • live performances
  • assessments of learners' attitudes about the domain studied or the instruction

There are several features of quality portfolio assessment:

  1. the work samples must be anchored to specific instructional goals and performance objectives
  2. the work samples should be the criterion-referenced assessments collected during instruction (the pretests and posttests)
  3. each assessment is accompanied by its rubric with a student's responses evaluated and scored, indicating the strengths and problems within a performance.

The assessment of growth is accomplished at two levels:

  1. learner self-assessment: learners examine their own materials and record their judgments about the strengths and problems + what they might do to improve the materials
  2. instructor assessment: instructors examine the set of materials without looking at the learner's evaluations, and record their judgments.

Then both the learner and the instructor compare their evaluations, discussing any discrepancies between the two.

As a result, they plan together next steps the learner should undertake to improve the quality of his/her work.


B- Student work samples:

  • written reports
  • rough drafts
  • notes
  • revisions
  • projects
  • Journals
  • peer reviews
  • self-evaluations
  • anecdotal records
  • class projects
  • reflective writings
  • artwork; graphics
  • photographs
  • exams
  • computer programs
  • presentations

 C- Student Narratives: A narrative description that

  • states goals
  • describes efforts
  • explains work samples
  • reflects on experience.

 

D- Performance Standards and Assessment Rubrics:

A variety of scoring criteria can be used in developing performance standards or assessment rubrics. Standards and rubrics should include:

  • performance levels (a range of various criterion levels)
  • descriptors (standards for excellence for specified performance levels)
  • scale (range of values used at each performance level)

Examples of Rubric:

  • analytic assessment rubric
  • holistic assessment rubric: it provides a general score for a compilation of work samples rather than individual scores for specific work samples
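
As an illustration of the elements just listed (performance levels, descriptors, and a scale), here is a minimal sketch of an analytic rubric represented as data, with a simple scoring helper. The criteria and descriptor wording are invented.

```python
# Illustrative sketch: an analytic rubric as data. Each criterion has numbered
# performance levels (the scale) with a descriptor for each level.
# The criteria and descriptors are hypothetical.
rubric = {
    "organization": {
        3: "ideas are logically ordered with clear transitions",
        2: "ideas are mostly ordered; some transitions are missing",
        1: "ideas are difficult to follow",
    },
    "use of evidence": {
        3: "every claim is supported by a relevant example",
        2: "most claims are supported",
        1: "claims are largely unsupported",
    },
}

def analytic_score(ratings):
    """Sum the level awarded on each criterion (analytic scoring)."""
    return sum(ratings[criterion] for criterion in rubric)

print(analytic_score({"organization": 3, "use of evidence": 2}))  # 5 out of a possible 6
```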


Relevant Links:

Assessment Instruments

Rubrics

Teaching Writing: Approaches & Activities


Posted by Nada at 12:01 AM EDT
Updated: 05/05/09 1:09 AM EDT
04/27/09
Instructional Strategy
Topic: 8- Instructional Strategy

Instructional Strategy involves a huge variety of teaching/learning activities (microstrategies) such as:

  • group discussions
  • independent reading
  • case studies
  • lectures
  • computer simulations
  • worksheets
  • cooperative group projects

Microstrategies: they are pieces of an overall macrostrategy that must take learners from a motivational introduction to a topic all the way through their mastery of the objectives.

A textbook is a microstrategy that serves primarily as a source of information (and is incomplete instruction).

Macroinstructional strategy: the complete instruction created by an instructor; it involves:

  • defining objectives
  • writing lesson plans and tests
  • motivating learners
  • presenting content
  • engaging students as active participants in learning process
  • administering and scoring assessments (providing feedback)

A well-designed set of instructional materials contains many strategies and procedures.

Three of the major components in the learning process that facilitate learning: (according to psychologists)

  • motivation
  • prerequisite and subordinate skills
  • practice & feedback

Psychologists whose work influences approaches to instructional design:

  • Behaviorists (30 to 40 years ago)
  • Cognitivists (who later modified behaviorists' views)
  • Constructivists (more recent; they suggested new approaches)

HOW to TEACH
The term Instructional Strategy covers the following:

  1. selection of a delivery system
  2. sequencing and grouping content
  3. describing learning components that will be included in instruction
  4. specifying how students will be grouped during instruction
  5. establishing lesson structures
  6. selecting media for delivering instruction

1- Selection of a delivery system: (instruction) (p.184)

The delivery system is the general methodology used for managing and delivering the teaching and learning activities called "instruction."

A delivery system is only part of an overall instructional strategy. Here are some examples of some common delivery systems:

  • traditional model
  • correspondence
  • large-group lecture with small-group Q&A follow-up
  • telecourse by broadcast or videotape
  • two-way, interactive videoconference
  • computer-based instruction
  • Internet or intranet web-based instruction
  • self-paced programs
  • combinations and unique, custom systems

In an ideal instructional design process, one would consider the following:

  • goal
  • learner characteristics
  • learning and performance contexts
  • objectives
  • assessment requirements

 Ideal path for choosing a delivery system:

  1. review instructional analysis and identify logical clusters of objectives that will be taught in appropriate sequences
  2. plan the learning components that will be used in the instruction
  3. choose most effective student groupings
  4. specify effective media and materials that are within the range of cost, convenience, and practicality for the learning context
  5. assign objectives to lessons and consolidate media selections
  6. select or develop a relevant delivery system 

However, in real life, things do not really happen this way. Then, the designer must be flexible and get everything out of the system that it is capable of delivering. The designer must make appropriate adaptations or propose an alternative system.

2- Sequencing and grouping content: (p.186)

A- Sequencing content

What sequence should we follow in presenting content? We should follow our instructional analysis, beginning with the lower-level skills. The instructional sequence tends to be a combination of bottom to top and left to right.

B- Clustering content

We may decide to present information
- on an objective-by-objective basis with intervening activities, or
- on several objectives prior to any learner activities.

Five factors need to be considered when determining the amount of information to be presented:
  1. age level
  2. complexity of the material
  3. type of learning
  4. whether the activity can be varied
  5. time required
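
One way to picture the "bottom to top and left to right" sequencing described under A above is as a walk of the skills hierarchy in which each step's subordinate skills are visited before the step itself. The sketch below is only an analogy; the hierarchy and skill names are invented.

```python
# Illustrative sketch: sequence instruction by visiting subordinate skills (left to right)
# before the step that depends on them (bottom to top). The hierarchy is hypothetical.
hierarchy = {
    "write a performance objective": [
        "state the performance",
        "state the conditions",
        "state the criteria",
    ],
    "state the performance": [],
    "state the conditions": [],
    "state the criteria": [],
}

def teaching_sequence(skill, hierarchy, sequence=None):
    """Post-order walk: subordinate skills first, then the dependent step."""
    if sequence is None:
        sequence = []
    for subskill in hierarchy.get(skill, []):  # left to right
        teaching_sequence(subskill, hierarchy, sequence)
    if skill not in sequence:
        sequence.append(skill)                 # bottom to top
    return sequence

print(teaching_sequence("write a performance objective", hierarchy))
```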

3- Describing Gagne's 9 learning components that will be included in instruction: (p.189)

Nine events represent external teaching activities that support mental processes of learning (cognitive psychology):

  1. gaining attention
  2. informing learner of objective
  3. stimulating recall of prerequisite learning
  4. presenting stimulus material
  5. providing learning guidance
  6. eliciting performance
  7. providing feedback
  8. assessing performance
  9. enhancing retention and transfer

The purpose for developing an instructional strategy is planning how to guide learners' intellectual processing through the mental states and activities that psychologists have shown will foster learning.

Here are Gagne's events of instruction re-organized into 5 major learning components that are part of an overall instructional strategy:

a- Preinstructional activities
b- Content presentation
c- Learner participation
d- Assessment
e- Follow-through activities

____________________________

a- Preinstructional activities: 3 factors to consider prior to beginning instruction:

- Motivating learners: John Keller (1987)- ARCS model

  • Attention: gaining and sustaining attention (by using emotional or personal information, asking questions, creating mental challenges, using human-interest examples)
  • Relevance: learners must perceive the instruction as relevant to them; instruction must be related to important goals in their lives > there should be congruence between learners' expectations and the instruction.
  • Confidence: learners must be confident that they can master the objectives for the instruction. If the learners
    • lack confidence > less motivated
    • are overconfident > they will see no need to attend to the instruction

      The challenge is to create the appropriate level of expectation for success (> zone of proximal development)
  • Satisfaction: learners must derive satisfaction (extrinsic rewards or intrinsic feelings of accomplishment?) from the learning experience

 - Informing learners of the objectives > so that they know what to memorize, solve, or interpret.

Providing learners with the objectives helps them to

  • focus their study strategies on these outcomes
  • use more efficient study strategies
  • determine the relevance of the instruction

- Informing learners of prerequisite skills: this will prepare them for the instruction to follow. Two purposes for this component:

  • make sure learners view the relationship between new content and what they already know (can be done through brief test of entry behaviors or by providing learners with a brief description of required entry behaviors)
  • promote learners' active recall of relevant mental contexts in which the new content can be integrated.

b- Content presentation: we should determine what information, concepts, rules, and principles need to be presented. (Avoid presenting too much information, especially that unrelated to the objective)

It is important to

  • define new concepts
  • explain their interrelationship with other concepts

We also need to determine the types and numbers of both examples and nonexamples (deliberate attempt to point out why an example is wrong).

Forms of examples and nonexamples:

  • illustrations
  • diagrams
  • demonstrations
  • model solutions
  • scenarios
  • case studies
  • sample performances

c- Learner participation: Practice with feedback! Learners should be provided an opportunity to practice what we want them to be able to do + they should be provided feedback about their performance.

d- Assessment:

  • Entry behavior tests
  • Pretests
  • Practice tests
  • Posttests 

Refer to the following page: http://www.nadasisland.com/isd/index.blog?topic_id=1112721

e- Follow-through activities:

  • Memory aids for retention (job aids such as checklists are very useful)
  • Transfer of learning: research indicates that learners transfer only some of what they learn to new contexts. The designer must be aware of that and use every means possible to promote the transfer of learning.
    Instruction is effective if learners can use it to further their study of more advanced topics or to perform skills on the job that make a difference in their organization's effectiveness.

4- Specifying how students will be grouped during instruction: (p.207)

The type of student grouping (individual, pairs, small group, large group) depends on specific social interaction requirements and is often mixed within and among the learning components in a lesson or unit.

5- Establishing lesson structures: (pp.198-199)

Matching learning components with the amount of guidance needed by the intended learners.

Check Moore & Kearsley's Theory of Transactional Distance (1996)

  • Level of course structure: flexible vs. rigid
  • Level of course dialogue: little interactive communication vs. lots of it
  • Transactional distance: greater vs. lesser
  • Suitability for learner autonomy level: highly autonomous learner vs. less autonomous learner who has not learned how to learn

6- Selecting media for delivering instruction: (p.209)

Media are useful to the extent that they effectively carry required learning components of an instructional strategy.

According to Clark (1983), it is the design of instruction, rather than the medium used to deliver instruction, that determines student learning. 

Detailed Outline for an Instructional Strategy: (p.197)

a- Preinstructional activities

  1. Gain attention and motivate
  2. Describe objectives
  3. Describe and promote recall of prerequisite skills

b- Content presentation

  1. Content
  2. Examples

c- Learner participation

  1. Practice
  2. Feedback

d- Assessment

  1. Entry behavior test
  2. Pretest
  3. Posttest

e- Follow-through activities

  1. Memory aids for retention
  2. Transfer considerations

Educational Activities (Dr. Ryan Watkins)

Once you have selected the appropriate instructional strategies for your objectives, it is time in the ID process to consider the educational activities that will be included in the lesson. Each of the activities should promote learner interactions, as discussed in last week's readings. A cursory list of potential educational experiences offered by Atsusi Hirumi at the University of Houston-Clear Lake includes:

  • listen to a lecture
  • read a journal article
  • conduct a survey
  • complete handouts
  • conduct library research
  • participate in a debate
  • assess the work of other students
  • handle manipulatives
  • watch a film
  • create a presentation
  • analyze current events
  • participate in a panel discussion
  • interview others
  • conduct experiments
  • visit community resources
  • interact with a laserdisc
  • participate in a simulation
  • create or make a graphical presentation
  • write a research, concept, or position paper
  • write a reflective paper
  • watch a demonstration

Active participation of students is essential for the learning process. One challenge, however, is that active learning is not always the easiest instruction to design or implement. Most current education and training programs are based primarily on relatively passive forms of education. And while passive instructional events are appropriate to accomplish some objectives, you will also want to consider some active learning events as well.

Graf and Albright (1994) characterize active learning as:

  • students who are involved in more than listening;
  • less emphasis on transmitting information and more emphasis on developing skills;
  • involvement in higher order thinking skills;
  • engagement in activities;
  • the exploration of learners' prior skills, knowledge, attitudes, and abilities.

In creating active learning in your instruction you may want to consider:

  • the development of learning communities;
  • InterActivities (Internet-based activities, webquest, etc.);
  • information gathering activities (information searches, electronic appearances, electronic mentoring, electronic publishing, database creation, etc.);
  • information sharing (keypals, sequential creations, discussion groups, impersonations, generating websites, brainstorming, site surveys, etc.);
  • collaborative problem solving (polled data analysis, parallel problem-solving, simulations, social action plans).


Different Strategies + Lesson Plans (Worth considering, but disregard typing-- mainly punctuation-- mistakes)

Community Language Learning + Lesson Plan

Direct Instruction Model + Lesson Plan (pdf)

Inductive Model + Lesson Plan  (pdf)

Inquiry Model + Lesson Plan  (pdf)

Cooperative Learning + Lesson Plan

Reading- Lesson Plan (under construction)  


Relevant Links:

Example of Instructional Strategy based on ISD

Keller’s ARCS model of motivation

Instructional Design Knowledge Base: Models/Theories

Instructional Design Knowledge Base: Instructional Strategies & Tactics

Design and Sequence Your Way to WBT Interactivity By Atsusi Hirumi and Kathryn Ley

Reigeluth's Elaboration Theory  

Merrill's Instructional Transaction Theory (ITT)


Relevant Books:

A Handbook of Job Aids by Allison Rossett and Jeannette Gautier-Downes (1991)

Electronic Performance Support System by Gloria Gery (1991)

Designing and Developing Electronic Performance Support Systems by L. Brown (1996)

Transfer Of Training: Action-packed Strategies To Ensure High Payoff From Training Investment by Broad and Newstrom (1992)

Distance Education: A Systems View by Moore and Kearsley (2004)


Posted by Nada at 12:01 AM EDT
Updated: 05/06/09 12:30 AM EDT
04/11/09
Instructional Materials
Topic: 9-Instructional Materials

In individualized instruction, many of the instructional events carried out by the instructor with a group of students are now presented to the individual student through instructional materials.

The instructor's role is different, and even more important than in lockstep instruction. The instructor is still

  • the motivator
  • the counselor
  • the evaluator
  • the decision maker
  • responsible for each student's mastery of the objectives.

The delivery system and media selections: (p.238)

Three factors often cause compromise in media selections and the delivery system:

  1. availability of existing instructional materials
  2. production and implementation constraints
  3. amount of instructor facilitation during instruction

Summary by Susanne Hoepfl-Wellenhofer

When developing instructional materials consider:

 

1) The three major components of an instructional package:

  • Instructional Materials: They contain the content – either written, mediated or facilitated by an instructor (the content includes materials for the major objectives, the terminal objective, and any materials for enhancing memory and transfer).  Instructional materials refer to any preexisting materials that are being incorporated, as well as to those that will be specifically developed for the objectives. The materials may also include information that the learners will use to guide their progress through the instruction.
  • Assessments: All instructional material should be accompanied by objective tests or by product or performance assessments. These may include a pretest and/or a posttest.
  • Course Management Information: There is often a general description of the total package, typically called the instructor’s manual, which provides the instructor with an overview of the materials. It might include the following:
    • tests and other information considered important for implementing the course.
    • student guidance templates
    • automated class listing
    • student tracking
    • online testing
    • project monitoring
    • grade book
    • a variety of communication and messaging mechanisms

      Special attention should be paid to the ease with which course management information can be used by the instructor or course manager.

2) The evaluation criteria when selecting existing instructional materials:

a- Goal-Centered Criteria for Evaluating Materials: They are focused on the content of the instruction. Specific criteria in this area include:

  • congruence between content in materials and objectives
  • adequacy of content coverage and completeness
  • authority
  • accuracy
  • currency
  • objectivity

b- Learner-Centered Criteria for Evaluating Materials: They are focused on the appropriateness of instructional materials for the target group. The learner analysis documentation should provide the foundation for this evaluation. Specific criteria in this area include the appropriateness of the materials for the learners with regards to their:

  • vocabulary and language levels
  • developmental, motivation, and interest levels
  • backgrounds and experiences
  • special language or other needs 

c- Learning-Centered Criteria for Evaluating Materials: They are focused on the adequacy of existing materials (do they need to be adapted or enhanced prior to use?).  Materials can be evaluated to determine whether the following items are included and adequate/complete:

  1. preinstructional materials
  2. content sequencing and presentation
  3. student participation and congruent practice exercises
  4. feedback
  5. assessments
  6. follow-through directions for enhancing memory and transfer
  7. delivery system and media formats
  8. learning guidance to move students from one component/activity to the next.

d- Context-Centered Criteria for Evaluating Materials: They are focused on the appropriateness of existing materials for the instructional and performance contexts. Judge whether existing material can be adopted; if not, you are in the instructional materials development business. Criteria in this area include:

  • the authenticity of the materials for context and learners
  • the feasibility of the materials for settings and budget. Here, examine the technical quality of existing materials with regard to:
    • packaging
    • graphic design and typography
    • durability
    • legibility
    • audio and video quality
    • interface design
    • navigation
    • functionality

A recent development in selecting existing instructional materials is the SCORM: Sharable Content Object Reference Model, which is a set of e-learning standards for interchangeability of learning objects (i.e. lessons or modules).

The theory of SCORM is that cost savings could be realized by distributing learning objects across agencies that teach the same learning outcomes.

The theory is promising and bears watching, but practice currently lags well behind theory. 

 

3) Which types of learning components you would like to include:

  • Preinstructional activities (including objectives and review materials + motivational materials and activities)
  • Content (including examples and nonexamples of information, concepts, or skills that need to be learned)
  • Participation activities (for practice) and feedback on students' performance
  • Assessment of learners' mastery of new information and skills
  • Activities that enhance memory and transfer

 

4) Which types of material you want to include in an instructor's guide:

  • Information about target population
  • Suggestions on how to adapt materials (for older, younger, higher achieving, or lower achieving students)
  • Content overview
  • Intended learning outcomes
  • Suggestions for using the materials in a certain context or sequence
  • Suggestions for materials management for
    1. individualized learning
    2. small-group learning
    3. learning-center activities
    4. classroom activities
  • Retention and transfer activities:
    1. tests that evaluate performance on terminal objectives
    2. evidence of effectiveness of materials
    3. suggestions for evaluating students' work and reporting progress
    4. estimation of time required to use the materials properly
    5. equipment or additional facilities needed for the materials

5) If the designer

  • is the developer and the instructor: the whole process of materials development is rather informal.
  • is not the instructor: there might be a team – manager, ID designer, SME (Subject Matter Expert), materials developer, and evaluator. Here a premium is placed on precise specifications, and the work requires communication and collaboration skills.


Great summary, Susanne! The only thing I would add to your summary, if I were going to give another designer advice for moving into the development of instructional materials, is to have the performance objectives handy. I found myself constantly referring back to them to keep focus on the conditions I had set and the specific content needed to meet the overall goal. Also, I would tell them to make sure the materials facilitate the learning (measured by the info gathered in the learner analysis) and that they are conducive to transfer of skills to the actual workplace (or setting where the new skills will be used). As a last thought, I would add (only because D, C & C mentioned it a few times throughout chapter nine) to remember that the drafts are just that, drafts; so, as D, C & C explain, there is no use in investing a lot of time and money on materials that will most likely be revised and updated (even after they are used in the instruction, but definitely before).

Patricia Parada


Technical and Instructional Alignment  (By Dr. Ryan Watkins)

The focus of this lesson is on the development of "rough draft materials" for formative evaluation. It is often useful to think of these rough drafts with either "technical" or "instructional" alignment in mind. Let me explain...

If we want to create a rough draft (or prototype) to assess the technical alignment in the formative evaluation, then we want to develop a technical template that may be used throughout the instruction. For example, if we were creating web-based instruction this may be the "home page" that identifies features (e.g., discussion area, online help) that will be available to learners throughout the instruction. In the development of instruction not all of these features have to be fully functioning at the time of the formative evaluation, though at least partially functioning examples should be developed and available to learners during the formative evaluation. Similarly, in text-based instruction items such as a glossary or self-check assessment do not have to be fully developed, but they should be identified in detail within the introduction and/or instructions of the lesson and partially developed for the formative evaluation participants. During the formative evaluation you will assess the technical alignment by considering if these resources are required by learners, can learners access the necessary resources, what other resources may be required, and other related questions.

When we create a rough draft (or prototype) to assess the instructional alignment in the formative evaluation, then we want to develop a functioning module (or selection) of the instruction to be evaluated. In the formative evaluation we will assess this module to determine if it serves as an adequate template for the remainder of the lesson. The module may be related to a discrete objective or to multiple instructional objectives if a variety of instructional strategies were utilized. For web-based instruction this would include all of the educational materials as well as other instructional features available to learners (e.g., glossary of words, online resources, online help related to the module). Likewise, in text-based instruction you will want to develop a portion of your instructional materials to determine if instructional strategies, events, and activities work as desired.

In assessing both technical and instructional alignment you will want to consider issues related to visual literacy and graphic design as well.


Relevant Link:

Example of an Instructor's Package

Example of a Student Guide


Posted by Nada at 12:43 AM EDT
Updated: 05/05/09 1:15 AM EDT
04/10/09
Formative Evaluations
Topic: 91- Formative Evals

In the past, instructors were too often blamed for poor teaching, and learners for poor learning, when in fact the materials were not sufficient to support the instructional effort. This is why formative evaluation of newly developed materials, of selected and adapted materials, of instructor-delivered instruction, and of combinations of these three presentation modes is nowadays a must in order to ensure that instruction is properly implemented and managed!

Formative evaluation is the collection of data and information during the development of instruction that can be used to improve the effectiveness of the instruction (Dick and Carey, p. 277; based on Cronbach, 1975 and Scriven 1967).

=> Formative evaluation is the process designers use to obtain data that can be used to revise their instruction to make it more efficient and effective (gathering information from learners in order to revise the materials before proceeding with the design process).

The emphasis in formative evaluation is on the

  1. collection and analysis of data
  2. revision of the instruction

[Two related activities that share many of the same principles as formative evaluation:  usability testing and rapid prototyping]

Three basic phases of formative evaluation: (p.279)

  1. One-to-one or clinical evaluation: the designer works with individual learners to obtain data to revise the materials
  2. Small-group evaluation: a group of 8 to 20 learners (who represent the target population) study the materials on their own and are tested to collect the required data
  3. Field trial: often 30 learners are sufficient. The emphasis here is on the testing of the procedures

The three phases of formative evaluation are typically preceded by the review of instruction by interested specialists who are not directly involved in the instructional development project, but have relevant expertise.

Designing Formative Evaluations (FEs): (p.279)

The purpose for the formative evaluation is to pinpoint specific errors in the materials in order to correct them.

The best anchor or framework for the design of the FE is the instructional strategy: create a matrix that lists the components of the instructional strategy on one side and the major areas of questions about the instruction on the other.

The 5 areas of questions that would be appropriate for all materials:

  1. Type of learning: are the materials appropriate for the type of learning outcome? (ask expert)
  2. Content: do the materials include adequate instruction on the subordinate skills, and are these skills sequenced and clustered logically? (ask expert)
  3. Clarity: are the materials clear and readily understood by representative members of the target group? (ask target learners)
  4. Motivation: do learners find the materials relevant to their needs and interests? (ask target learners)
  5. Management: can the materials be managed effectively in the manner they are mediated? (ask expert & target learners)
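
The matrix mentioned above (instructional strategy components on one side, the five question areas on the other) can be sketched as a simple grid to be filled in during planning. This is only an illustration of the idea; the example cell entry is invented.

```python
# Illustrative sketch: a formative-evaluation planning matrix with instructional
# strategy components as rows and the five question areas as columns. Each cell
# would record the data source or finding for that combination.
components = [
    "preinstructional activities",
    "content presentation",
    "learner participation",
    "assessment",
    "follow-through activities",
]
question_areas = ["type of learning", "content", "clarity", "motivation", "management"]

matrix = {component: {area: None for area in question_areas} for component in components}

# Hypothetical entry: clarity of the content presentation will be checked in
# one-to-one evaluations with target learners.
matrix["content presentation"]["clarity"] = "one-to-one learner interviews"
print(matrix["content presentation"])
```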

In designing instrumentation for gathering information from learners, you must consider:

  1. the phase: one-to-one, small-group, field trial
  2. the setting: learning or performance context
  3. the nature of the information being gathered

The types of data that we need to collect:

  1. data collected on entry behavior tests, pretests, posttests, and performance context
  2. learners' comments
  3. data collected on attitude questionnaires and/or debriefing comments (reveals learners' overall reactions to instruction)
  4. time required
  5. SME's reactions
  6. manager or supervisor's reactions

Relevant Link:

Example of a Formative Evaluation


Posted by Nada at 12:01 AM EDT
Updated: 05/05/09 1:17 AM EDT
