Instructional Systems Design - ISD (Under Construction)
Introduction to Instructional Design
Topic: 1- Introduction

(Sources: Dick, Carey & Carey + Prof. Ryan Watkins)

Any school, teacher, company, or organization that wants to improve learning and/or performance, to move from good to great, and to meet quality and productivity goals (p.16) needs Instructional Design!

Conducting a performance analysis, a needs assessment, or a job analysis; identifying problems; listing solutions; setting goals: all of these pave the way to success. The key, however, is actually reaching those goals, and that is what this blog is about: the steps we need to take to get there.

I created this blog because I know how useful it will be to anyone with high aspirations!

When is Instructional Design used?
Instructional Design is used in the development of instructional events such as:

  • courses
  • training units
  • lessons
  • seminars
  • workshops
  • computer-based training

These events serve to reach goals that were set in order to close a gap identified through a front-end analysis.

Why are instructional events developed?

Instructional events are developed in order to provide learners with the required SKAAs:

  • Skills to accomplish a task
  • Knowledge to accomplish a task
  • Attitudes to accomplish a task
  • Abilities to accomplish a task

How is an instructional event best measured?

An instructional event is best measured by

  • its ability to assist learners in mastering the SKAAs required to accomplish a given task, and
  • the application of those SKAAs and the value their application adds to the organization.

Systematic instructional design is:

  • performance based
  • learner focused
  • interactive
  • data driven
  • systematic

These qualities make it an effective approach for creating instructional events.

It is an effective way of facilitating and replicating the design and development of instruction that achieves results.

Components of Dick, Carey & Carey's systems approach model (p.6)

  1. Identifying instructional goals
  2. Conducting instructional analysis > determining, step by step, how to reach the goal. This also requires determining entry behaviors (skills, knowledge, and attitudes).
  3. Analyzing Learners & contexts
  4. Writing performance objectives
  5. Developing assessment instruments
  6. Developing instructional strategy
  7. Developing and selecting instructional materials
  8. Designing and conducting formative evaluation
  9. Revising instruction
  10. Designing and conducting summative evaluation
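One way to keep the ten steps straight is to sketch the model as an ordered checklist in code. This is purely my own illustration, not anything from Dick, Carey & Carey; the step names come from the list above, and the helper function is hypothetical:

```python
# A toy sketch of the Dick, Carey & Carey systems approach as an ordered
# checklist. Step names follow the list above; the tracking logic is
# illustrative only.
DCC_STEPS = [
    "Identify instructional goals",
    "Conduct instructional analysis",
    "Analyze learners and contexts",
    "Write performance objectives",
    "Develop assessment instruments",
    "Develop instructional strategy",
    "Develop and select instructional materials",
    "Design and conduct formative evaluation",
    "Revise instruction",                       # feeds back into earlier steps
    "Design and conduct summative evaluation",
]

def next_step(completed):
    """Return the first step not yet completed, or None if all are done."""
    for step in DCC_STEPS:
        if step not in completed:
            return step
    return None
```

For example, `next_step({"Identify instructional goals"})` points the designer to conducting the instructional analysis next.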

["As you begin designing instruction, trust the model. As you grow in knowledge and experience, trust yourself." (p.5)]

[What it means to practice a discipline > To practice a discipline is to be a lifelong learner. You "never arrive"; you spend your life mastering disciplines. (Peter Senge; 1990) (p.5)]

Posted by Nada at 12:01 AM EDT
Updated: 04/01/09 8:57 AM EDT
Front-End Analysis to Identify Instructional Goals
Topic: 2- Front-End Analysis

(Sources: Dick, Carey & Carey + Prof. Ryan Watkins)

Front-end analysis consists of:

  1. Performance Analysis
  2. Needs Assessment / Analysis
  3. Job Analysis

According to instructional designers, the best approach for identifying instructional goals is performance technology in which instructional goals are set in response to problems or opportunities within an organization. (p.16)

Designers engage in performance analysis and needs assessment processes in order to:

  • Identify problems
  • Find causes of problems
  • List possible solutions
    • Identify instructional goals
    • Identify changes needed (seldom is instruction the single answer to a problem)

1. Performance analysis (p.18)

According to Dick & Wager (1995), performance analysis is the use of analytical tools for

  • identifying organizational performance problems
  • developing the most appropriate solutions

Purpose of a performance analysis study (p.20)

The purpose of a performance analysis study is to acquire information in order to

  • verify problems and
  • find solutions

Outcome of a performance analysis study

  • Clear description of problems
    • the failure to achieve desired results, i.e.
    • the gap between desired and actual employee performance
  • Evidence of problem causes
  • Suggested cost-effective solutions

2. Needs assessment: Ways to conduct one (p.23)

  • surveys (past and present)
  • insightful interviews (individual or group interviews)
  • direct observations
  • questionnaires
  • other data collection techniques

Components of needs assessment

  • desired status
  • actual status
  • gap > need
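The gap logic above can be made concrete with a small sketch. The performance measures and figures below are entirely made up for illustration:

```python
# Needs assessment sketch: a need is the gap between desired status and
# actual status. All measures and numbers here are invented.
desired = {"calls_resolved_per_day": 40, "customer_satisfaction": 0.90}
actual  = {"calls_resolved_per_day": 28, "customer_satisfaction": 0.88}

# A positive gap (desired exceeds actual) signals a need
gaps = {k: desired[k] - actual[k] for k in desired}
needs = {k: g for k, g in gaps.items() if g > 0}
```

Whether each need is best addressed by instruction is a separate question; as noted above, instruction is seldom the single answer to a problem.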

Results of effective needs assessment

  • description of need
  • evidence of its validity
  • possible solutions

3. Job analysis (p.23)

  • The characteristics of a job are provided by the people who work in it and the environment surrounding it.
  • Task inventory: the tasks that comprise the job > duties
  • Screen the task inventory by asking SMEs (Subject Matter Experts) and job incumbents whether the tasks are really part of the job
  • Revise
  • Format tasks as survey, response scales, directions
  • Pilot test the survey
  • Final revision
  • Distribute survey to a sample of job incumbents
  • Summarize responses on a task-by-task basis
  • Choose high priority tasks for further review
  • THEN, conduct a TASK ANALYSIS
    1. break down tasks for review into component elements
    2. explain in detail the relationships among elements
    3. describe tools + conditions involved in performing each element
    4. write standards for successful performance
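The survey-summary and prioritization steps above can be sketched in code. The tasks, the rating scale, and the cut-off are all my own invented examples:

```python
# Job analysis sketch: summarize incumbent survey responses task by task,
# then choose high-priority tasks for task analysis. All data is invented.
responses = {
    "Greet customer":     [3, 4, 3],
    "Process refund":     [5, 5, 4],
    "Escalate complaint": [4, 5, 5],
    "File daily report":  [2, 1, 2],
}

def mean(ratings):
    return sum(ratings) / len(ratings)

# The task-by-task summary: average importance rating per task
summary = {task: mean(r) for task, r in responses.items()}

# Keep tasks rated at or above a cut-off for further task analysis
CUTOFF = 4.0
high_priority = sorted(t for t, m in summary.items() if m >= CUTOFF)
```

The tasks that survive the cut-off are the ones broken down into component elements in the task analysis that follows.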

[What does critical thinking entail? (p.18) 

  1. be open-minded
  2. be objective
  3. seek root causes
  4. view problem from multiple perspectives
  5. give a fair hearing to evidence on multiple perspectives
  6. suspend judgment until all pertinent information has been heard
  7. listen to contrary views
  8. change a conclusion in the face of compelling information]

Posted by Nada at 12:01 AM EDT
Updated: 04/01/09 5:31 PM EDT
Instructional Goals
Topic: 3- Instructional Goals

(Sources: Dick, Carey & Carey + Prof. Ryan Watkins)

Instructional goals are ideally derived through:

  • Performance analysis > it gives broad indications of problems that can be solved by providing instruction
  • Needs assessment > it determines more specifically what performance deficiencies will be addressed

A complete goal statement should describe the following:

  • who learners are
  • what they will be able to do
  • the context in which they will use the skills
  • the tools that will be available to the learners in the performance context

The instructional goal should be: (p.31)

  • clear/complete > a general statement of learner outcomes (includes the aforementioned)
  • related to the identified problem + needs assessment
  • achievable through instruction

Criteria for establishing instructional goals: (p.32)

  • acceptable to administrators
  • sufficient resources (time, money, personnel) to develop instruction
  • stable content
  • available learners (not too busy)

How to conduct a Goal analysis (p.24)

  1. write down goal
  2. indicate, step-by-step, what people need to do to achieve the goal
  3. sort through statements
  4. indicate what learners will be able to do
  5. ask: "If learners achieve each performance, will they have achieved the goal?"

2 fundamental steps to conduct a goal analysis: (p.40)

  • classify the goal statement according to the kind of learning that will occur (Domains of learning)
  • identify and sequence the major steps required to perform the goal

Domains of learning:

  1. Intellectual skill
  2. Verbal Info
  3. Psychomotor skill
  4. Attitude

Posted by Nada at 12:01 AM EDT
Updated: 04/01/09 5:30 AM EDT
Instructional Strategy
Topic: 8- Instructional Strategy

(Sources: Dick, Carey & Carey + Prof. Ryan Watkins)

Instructional Strategy involves a wide variety of teaching/learning activities (microstrategies), such as:

  • group discussions
  • independent reading
  • case studies
  • lectures
  • computer simulations
  • worksheets
  • cooperative group projects

Microstrategies are pieces of an overall macrostrategy that must take learners from a motivational introduction to a topic through to mastery of the objectives.

A textbook is a microstrategy that serves primarily as a source of information (and is incomplete instruction).

Macroinstructional strategy: the complete instruction created by an instructor. It involves:

  • defining objectives
  • writing lesson plan and tests
  • motivating learners
  • presenting content
  • engaging students as active participants in learning process
  • administering and scoring assessments (providing feedback)

A well-designed set of instructional materials contains many strategies and procedures.

According to psychologists, three major components of the learning process facilitate learning:

  • motivation
  • prerequisite and subordinate skills
  • practice & feedback

Psychologists whose work influences approaches to instructional design:

  • Behaviorists (30 to 40 years ago)
  • Cognitivists (who later modified behaviorists' views)
  • Constructivists (more recent; they suggested new approaches)

The term Instructional Strategy covers the following:

  1. selecting a delivery system
  2. sequencing and grouping clusters of content
  3. describing learning components that will be included in the instruction
  4. specifying how students will be grouped during instruction
  5. establishing lesson structures
  6. selecting media for delivering instruction

1- Selection of a delivery system: (instruction)


Relevant Links:

Instructional Design Knowledge Base: Models/Theories

Instructional Design Knowledge Base: Instructional Strategies & Tactics

Design and Sequence Your Way to WBT Interactivity
By Atsusi Hirumi and Kathryn Ley


Posted by Nada at 1:09 AM EDT
Updated: 04/01/09 5:30 AM EDT
Assessment Instruments
Topic: 7- Assessment

(Sources: Dick, Carey & Carey + Prof. Ryan Watkins)

A- Criterion-referenced instruments
B- Student work samples
C- Student Narratives
D- Performance Standards and Assessment Rubrics

A- Criterion or objective-referenced assessment instruments:
(Learner-Centered) (p.145)
> Assessment that enhances student learning (Baron, 1998)

Criterion-referenced assessments are

  • linked to instructional goals
  • linked to objectives derived from goals

The purpose of this type of assessment is to evaluate:

  1. students' progress: the assessments enable learners to reflect on their own performance, making them ultimately responsible for the quality of their work
  2. instructional quality: the assessments indicate to the designer which components of the instruction went well and which need revision

In addition, this type of assessment contains a criterion that specifies how well a student must perform the skill in order to master the objective.
> Performance criteria should therefore be congruent with the objectives, learners, and context.

The performance required in the objective must match the performance required in the test. (p.146)

Two uses of criterion-referenced tests:

  • pretests: they have 2 goals
    • to verify that the student possesses the anticipated entry behaviors
    • to measure the student's knowledge of what is to be taught.
  • posttests: they are used primarily to measure the student's knowledge of what was taught.
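Because these tests are criterion-referenced, each learner is judged against a fixed criterion rather than against other students. A minimal sketch of that judgment, where the 80% criterion and the scores are my own assumptions:

```python
# Criterion-referenced sketch: mastery is decided against a fixed
# criterion per objective, not by comparison with other learners.
# The criterion and the item counts below are invented.
CRITERION = 0.80  # learner must answer 80% of items correctly

def mastered(correct, total, criterion=CRITERION):
    """True if the learner's proportion correct meets the criterion."""
    return (correct / total) >= criterion

pretest  = mastered(5, 10)   # entry check before instruction
posttest = mastered(9, 10)   # measure of what was taught
```

A learner who fails the pretest but passes the posttest gives the designer evidence that the instruction, not prior knowledge, produced the mastery.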


B- Student work samples:

  • written reports
  • rough drafts
  • notes
  • revisions
  • projects
  • peer reviews
  • self-evaluations
  • anecdotal records
  • class projects
  • reflective writings
  • artwork; graphics
  • photographs
  • exams
  • computer programs
  • presentations

C- Student Narratives: A narrative description that

  • states goals
  • describes efforts
  • explains work samples
  • reflects on experience.


D- Performance Standards and Assessment Rubrics:

A variety of scoring criteria can be used in developing performance standards or assessment rubrics. Standards and rubrics should include:

  • performance levels (a range of various criterion levels)
  • descriptors (standards for excellence for specified performance levels)
  • scale (range of values used at each performance level)

Examples of Rubric:

  • analytic assessment rubric
  • holistic assessment rubric: it provides a general score for a compilation of work samples rather than individual scores for specific work samples
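The three rubric ingredients above (performance levels, descriptors, and a scale) can be sketched as a small analytic rubric in code. The criteria, descriptors, and sample scores are all invented for illustration:

```python
# Analytic rubric sketch: each criterion is scored separately on the same
# scale, unlike a holistic rubric, which yields one overall score.
# Criteria, descriptors, and the sample work's scores are invented.
SCALE = (1, 2, 3, 4)  # performance levels: 1 = beginning ... 4 = exemplary

RUBRIC = {  # descriptors define the standard at each level
    "organization": {1: "no structure", 2: "partial", 3: "clear", 4: "polished"},
    "accuracy":     {1: "many errors", 2: "some errors", 3: "minor errors", 4: "error-free"},
    "mechanics":    {1: "unreadable", 2: "rough", 3: "clean", 4: "flawless"},
}

def score(work):
    """Validate each per-criterion score against the scale; return the total."""
    for criterion, level in work.items():
        assert criterion in RUBRIC and level in SCALE
    return sum(work.values())

sample = {"organization": 3, "accuracy": 4, "mechanics": 2}
```

Keeping the per-criterion scores (rather than reporting only the total) is what makes the rubric analytic: the learner sees exactly which criterion needs work.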

Posted by Nada at 1:23 AM EDT
Updated: 04/01/09 5:31 AM EDT