| Section | Page |
| --- | --- |
| Preface | p. xvii |
| To the Instructor | p. xxi |
| Introduction to Instructional Design | p. 2 |
| The Dick and Carey Systems Approach Model for Designing Instruction | p. 2 |
| Components of the Systems Approach Model | p. 6 |
| Assess Needs to Identify Goal(s) | p. 6 |
| Conduct Instructional Analysis | p. 6 |
| Analyze Learners and Contexts | p. 7 |
| Write Performance Objectives | p. 7 |
| Develop Assessment Instruments | p. 7 |
| Develop Instructional Strategy | p. 7 |
| Develop and Select Instructional Materials | p. 7 |
| Design and Conduct the Formative Evaluation of Instruction | p. 8 |
| Revise Instruction | p. 8 |
| Design and Conduct Summative Evaluation | p. 8 |
| Using the Systems Approach Model | p. 9 |
| What Are the Basic Components of Systematically Designed Instruction? | p. 9 |
| For Which Instructional Delivery System Is the Systems Approach Appropriate? | p. 10 |
| Does the Use of the Systems Approach Imply that All Instruction Will Be Individualized? | p. 10 |
| Why Use the Systems Approach? | p. 11 |
| Who Should Use the Systems Approach? | p. 12 |
| Assessing Needs to Identify Instructional Goal(s) | p. 16 |
| Concepts | p. 19 |
| Performance Analysis | p. 19 |
| Clarifying Instructional Goals | p. 22 |
| Learners, Context, and Tools | p. 22 |
| Criteria for Establishing Instructional Goals | p. 23 |
| Examples | p. 25 |
| Leading Group Discussions | p. 25 |
| Needs Assessment | p. 25 |
| Clarifying the Instructional Goal | p. 25 |
| Criteria for Establishing Instructional Goals | p. 26 |
| Providing Customer Service | p. 27 |
| Conducting a Goal Analysis | p. 36 |
| Concepts | p. 38 |
| Verbal Information | p. 39 |
| Intellectual Skills | p. 39 |
| Psychomotor Skills | p. 40 |
| Attitudes | p. 40 |
| Goal Analysis Procedures | p. 42 |
| Analysis of Substeps | p. 45 |
| More Suggestions for Identifying Steps within a Goal | p. 46 |
| Examples | p. 47 |
| Intellectual Skills Goals | p. 48 |
| Psychomotor Skills Goals | p. 49 |
| Attitudinal Goals | p. 49 |
| Verbal Information Goals | p. 51 |
| Typical First Approach to Goal Analysis | p. 51 |
| Identifying Subordinate Skills and Entry Behaviors | p. 58 |
| Concepts | p. 60 |
| Hierarchical Approach | p. 60 |
| Cluster Analysis | p. 65 |
| Subordinate Skills Analysis Techniques for Attitude Goals | p. 66 |
| Combining Instructional Analysis Techniques | p. 67 |
| Instructional Analysis Diagrams | p. 68 |
| Entry Behaviors | p. 70 |
| The Tentativeness of Entry Behaviors | p. 73 |
| Examples | p. 74 |
| Hierarchical Analysis of an Intellectual Skill | p. 74 |
| Topic | p. 74 |
| Instructional Goal | p. 74 |
| Cluster Analysis for Verbal Information Subordinate Skills | p. 76 |
| Topic | p. 76 |
| Subordinate Skills | p. 76 |
| Subordinate Skills Analysis of an Additional Goal That Requires Both Intellectual Skills and Verbal Information | p. 79 |
| Topic | p. 79 |
| Instructional Goal | p. 79 |
| Analysis of a Psychomotor Skill | p. 79 |
| Topic | p. 79 |
| Instructional Goal | p. 79 |
| Subordinate Skills Analysis for an Attitudinal Goal | p. 82 |
| Topic | p. 83 |
| Instructional Goal | p. 83 |
| Identification of Entry Behaviors | p. 84 |
| Analyzing Learners and Contexts | p. 94 |
| Concepts | p. 96 |
| Learner Analysis | p. 96 |
| Entry Behaviors | p. 97 |
| Prior Knowledge of Topic Area | p. 97 |
| Attitudes Toward Content and Potential Delivery System | p. 97 |
| Academic Motivation (ARCS) | p. 97 |
| Educational and Ability Levels | p. 98 |
| General Learning Preferences | p. 98 |
| Attitudes Toward Training Organization | p. 98 |
| Group Characteristics | p. 98 |
| Collecting Data for Learner Analysis | p. 99 |
| Output | p. 99 |
| Context Analysis of Performance Setting | p. 99 |
| Managerial or Supervisor Support | p. 99 |
| Physical Aspects of the Site | p. 99 |
| Social Aspects of the Site | p. 100 |
| Relevance of Skills to Workplace | p. 100 |
| Collecting Data for Context Analysis in the Performance Setting | p. 100 |
| Output | p. 100 |
| Context Analysis of Learning Environment | p. 100 |
| Compatibility of Site with Instructional Requirements | p. 101 |
| Adaptability of Site to Simulate Workplace | p. 101 |
| Adaptability for Delivery Approaches | p. 101 |
| Learning-Site Constraints Affecting Design and Delivery | p. 101 |
| Collecting Data for Context Analysis in the Learning Environment | p. 102 |
| Output | p. 102 |
| Public School Contexts | p. 102 |
| Evaluation and Revision of the Instructional Analysis | p. 103 |
| Examples | p. 104 |
| Learner Analysis | p. 104 |
| Performance Context Analysis | p. 106 |
| Learning Context Analysis | p. 108 |
| Writing Performance Objectives | p. 120 |
| Concepts | p. 123 |
| Performance Objective | p. 123 |
| Components of an Objective | p. 124 |
| Derivation of Behaviors | p. 125 |
| Derivation of Conditions | p. 126 |
| Derivation of Criteria | p. 128 |
| Process for Writing Objectives | p. 129 |
| Evaluation of Objectives | p. 130 |
| The Function of Objectives | p. 131 |
| Examples | p. 132 |
| Verbal Information and Intellectual Skills | p. 132 |
| Verbal Information | p. 134 |
| Intellectual Skills | p. 134 |
| Psychomotor Skills | p. 136 |
| Attitudes | p. 136 |
| Developing Assessment Instruments | p. 144 |
| Concepts | p. 146 |
| Four Types of Criterion-Referenced Tests and Their Uses | p. 146 |
| Entry Behaviors Test | p. 147 |
| Pretest | p. 147 |
| Practice Tests | p. 148 |
| Posttests | p. 148 |
| Designing a Test | p. 149 |
| Determining Mastery Levels | p. 150 |
| Writing Test Items | p. 151 |
| Goal-Centered Criteria | p. 151 |
| Learner-Centered Criteria | p. 152 |
| Context-Centered Criteria | p. 153 |
| Assessment-Centered Criteria | p. 153 |
| Setting Mastery Criteria | p. 153 |
| Types of Items | p. 154 |
| Sequencing Items | p. 155 |
| Writing Directions | p. 156 |
| Evaluating Tests and Test Items | p. 156 |
| Developing Instruments to Measure Performances, Products, and Attitudes | p. 157 |
| Writing Directions | p. 158 |
| Developing the Instrument | p. 158 |
| Identify, Paraphrase, and Sequence Elements | p. 158 |
| Developing the Response Format | p. 159 |
| Checklist | p. 159 |
| Rating Scale | p. 160 |
| Frequency Count | p. 161 |
| Scoring Procedure | p. 161 |
| Using Portfolio Assessments | p. 162 |
| Evaluating Congruence in the Design Process | p. 163 |
| Examples | p. 165 |
| Test Items for Verbal Information and Intellectual Skills | p. 165 |
| A Checklist for Evaluating Motor Skills | p. 168 |
| Instrument for Evaluating Behaviors Related to Attitudes | p. 170 |
| Materials for Evaluating the Design | p. 171 |
| Developing an Instructional Strategy | p. 182 |
| Concepts | p. 184 |
| Selection of Delivery System | p. 185 |
| Instructional Strategies | p. 186 |
| Content Sequence and Clustering | p. 187 |
| Content Sequence | p. 187 |
| Clustering Instruction | p. 188 |
| Learning Components of Instructional Strategies | p. 189 |
| Preinstructional Activities | p. 190 |
| Motivating Learners | p. 190 |
| Informing the Learner of the Objectives | p. 192 |
| Informing the Learner of the Prerequisite Skills | p. 192 |
| Content Presentation and Examples | p. 193 |
| Learner Participation | p. 193 |
| Assessment | p. 194 |
| Follow-Through Activities | p. 195 |
| Memory Skills | p. 195 |
| Transfer of Learning | p. 195 |
| Detailed Outline of Learning Components | p. 196 |
| Learning Components for Learners of Different Maturity and Ability Levels | p. 197 |
| Learning Components for Various Learning Outcomes | p. 198 |
| Intellectual Skills | p. 198 |
| Verbal Information | p. 201 |
| Motor Skills | p. 202 |
| Attitudes | p. 203 |
| Student Groupings | p. 205 |
| Selection of Media and Delivery Systems | p. 205 |
| Media Selection for Domains of Learning | p. 206 |
| Media Selection for Certain Task Requirements Found in Objectives | p. 207 |
| Practical Considerations in Choosing Media and Delivery Systems | p. 207 |
| Alternative Views About Developing an Instructional Strategy | p. 209 |
| Developing an Instructional Strategy | p. 209 |
| Evaluating an Instructional Strategy | p. 212 |
| Examples | p. 214 |
| Sequence and Cluster Objectives | p. 214 |
| Plan Preinstructional, Assessment, and Follow-Through Activities | p. 215 |
| Plan Content Presentation and Student Participation | p. 216 |
| Allocate Activities to Sessions | p. 221 |
| Consolidate Media Selection and Confirm or Select Delivery System | p. 221 |
| Developing Instructional Materials | p. 240 |
| Concepts | p. 242 |
| The Delivery System and Media Selections | p. 242 |
| Availability of Existing Instructional Materials | p. 242 |
| Production and Implementation Constraints | p. 243 |
| Amount of Instructor Facilitation | p. 243 |
| Components of an Instructional Package | p. 245 |
| Instructional Materials | p. 245 |
| Assessments | p. 245 |
| Course Management Information | p. 245 |
| Selecting Existing Instructional Materials | p. 246 |
| Goal-Centered Criteria for Evaluating Materials | p. 246 |
| Learner-Centered Criteria for Evaluating Materials | p. 246 |
| Context-Centered Criteria for Evaluating Materials | p. 246 |
| Learning-Centered Criteria for Evaluating Materials | p. 247 |
| The Designer's Role in Material Development and Instructional Delivery | p. 247 |
| When the Designer Is Also the Materials Developer and the Instructor | p. 247 |
| When the Designer Is Not the Instructor | p. 250 |
| Developing Instructional Materials for Formative Evaluation | p. 251 |
| Rough Draft Materials | p. 251 |
| Rapid Prototyping | p. 252 |
| Materials Development Tools and Resources | p. 253 |
| Beginning the Development Process | p. 254 |
| Steps in the Development of Instruction | p. 254 |
| Examples | p. 255 |
| Preinstructional Activities | p. 257 |
| Mediation of Preinstructional Activities | p. 257 |
| Motivation Materials and Session Objectives | p. 257 |
| Pretest | p. 258 |
| Mediation of Pretest | p. 259 |
| Content Presentation | p. 260 |
| Mediation of Instruction | p. 260 |
| Instruction | p. 260 |
| Learner Participation | p. 260 |
| Mediation of Learner Participation and Feedback | p. 260 |
| Learner Participation Script | p. 265 |
| Feedback | p. 265 |
| Designing and Conducting Formative Evaluations | p. 282 |
| Concepts | p. 284 |
| Role of Subject-Matter, Learning, and Learner Specialists in Formative Evaluation | p. 285 |
| One-to-One Evaluation with Learners | p. 286 |
| Criteria | p. 286 |
| Selecting Learners | p. 286 |
| Data Collection | p. 287 |
| Procedures | p. 288 |
| Assessments and Questionnaires | p. 289 |
| Learning Time | p. 290 |
| Data Interpretation | p. 291 |
| Outcomes | p. 291 |
| Small-Group Evaluation | p. 291 |
| Criteria and Data | p. 291 |
| Selecting Learners | p. 292 |
| Procedures | p. 292 |
| Assessments and Questionnaires | p. 293 |
| Data Summary and Analysis | p. 293 |
| Outcomes | p. 293 |
| Field Trial | p. 294 |
| Location of Evaluation | p. 294 |
| Criteria and Data | p. 294 |
| Selecting Learners | p. 294 |
| Procedure for Conducting Field Trial | p. 295 |
| Data Summary and Interpretation | p. 295 |
| Outcomes | p. 295 |
| Formative Evaluation in the Performance Context | p. 295 |
| Criteria and Data | p. 296 |
| Selecting Respondents | p. 297 |
| Procedure | p. 297 |
| Outcomes | p. 297 |
| Collecting Data on Reactions to Instruction | p. 297 |
| Formative Evaluation of Selected Materials | p. 300 |
| Formative Evaluation of Instructor-Led Instruction | p. 301 |
| Data Collection for Selected Materials and Instructor-Led Instruction | p. 302 |
| Concerns Influencing Formative Evaluation | p. 302 |
| Context Concerns | p. 302 |
| Concerns about Learners | p. 303 |
| Concerns about Formative Evaluation Outcomes | p. 304 |
| Concerns with Implementing Formative Evaluation | p. 304 |
| Problem Solving During Instructional Design | p. 305 |
| Examples | p. 305 |
| Formative Evaluation Activities | p. 305 |
| One-to-One Evaluation | p. 305 |
| Small-Group Evaluation | p. 307 |
| Field Trial | p. 309 |
| Formative Evaluation of Selected Materials and Instructor-Led Instruction | p. 309 |
| Instruments for Assessing Learners' Attitudes about Instruction | p. 310 |
| Revising Instructional Materials | p. 322 |
| Concepts | p. 324 |
| Analyzing Data from One-to-One Trials | p. 324 |
| Analyzing Data from Small-Group and Field Trials | p. 325 |
| Group's Item-by-Objective Performance | p. 326 |
| Learners' Item-by-Objective Performance | p. 327 |
| Learners' Performance Across Tests | p. 327 |
| Graphing Learners' Performances | p. 329 |
| Other Types of Data | p. 330 |
| Sequence for Examining Data | p. 330 |
| Entry Behaviors | p. 330 |
| Pretests and Posttests | p. 330 |
| Instructional Strategy | p. 331 |
| Learning Time | p. 331 |
| Instructional Procedures | p. 331 |
| Revision Process | p. 332 |
| Revising Selected Materials and Instructor-Led Instruction | p. 332 |
| Examples | p. 333 |
| Summarizing Item-by-Objective Data Across Tests | p. 334 |
| Summarizing and Analyzing Data Across Tests | p. 336 |
| Summarizing Attitudinal Data | p. 337 |
| Determining How to Revise Instruction | p. 340 |
| Designing and Conducting Summative Evaluations | p. 348 |
| Concepts | p. 350 |
| Expert Judgment Phase of Summative Evaluation | p. 352 |
| Congruence Analysis | p. 352 |
| Organization's Needs | p. 352 |
| Resources | p. 353 |
| Content Analysis | p. 353 |
| Design Analysis | p. 354 |
| Utility and Feasibility Analysis | p. 354 |
| Current User Analysis | p. 354 |
| Field-Trial Phase of Summative Evaluation | p. 356 |
| Outcomes Analysis | p. 356 |
| Planning | p. 356 |
| Preparing | p. 358 |
| Implementing/Collecting Data | p. 358 |
| Summarizing and Analyzing Data | p. 359 |
| Reporting Results | p. 359 |
| Comparison of Formative and Summative Evaluation | p. 359 |
| Examples | p. 361 |
| Data Summary Form for the Congruence Analysis | p. 361 |
| Checklist for Content Analysis: Evaluating the Completeness and Accuracy of Materials | p. 361 |
| Checklists for Design Analysis: Evaluating the Learning and Instructional Strategies in Materials | p. 362 |
| Motivation | p. 364 |
| Types of Learning | p. 365 |
| Instructional Strategies | p. 367 |
| Form for Utility and Feasibility Analysis: Expert Judgment | p. 368 |
| Form for Current Users' Analysis | p. 368 |
| Glossary of Terms | p. 373 |
| Appendixes | p. 377 |
| Description of Problem (Need), Purpose of Instruction, Target Group, and Delivery System | p. 378 |
| Goal Analysis of the Instructional Goal on Story Writing | p. 380 |
| Hierarchical Analysis of Declarative Sentence Portion of Story-Writing Goal with Entry Behavior Lines | p. 381 |
| Design Evaluation Chart Containing Subskills, Performance Objectives, and Parallel Test Items | p. 382 |
| Instructional Strategy for Objective Sequence and Clusters, Preinstructional Activities, and Assessment Activities | p. 385 |
| Instructional Strategy for the Content Presentation and Student Participation Components and the Lesson Time Allocation Based on the Strategy | p. 387 |
| Session 1: Motivational Materials, Unit Objectives, and Assessment of Entry Behaviors | p. 390 |
| Session 2: Pretest Story and Rubric to Evaluate Stories | p. 392 |
| Session 3: Pretest and Instruction in Subordinate Skills 5.6 through 5.11 | p. 394 |
| Group's and Individuals' Achievement of Objectives and Attitudes About Instruction | p. 398 |
| Materials Revision Analysis Form | p. 406 |
| Index | p. 409 |