
MTS Section 500: CURRICULUM

Describe the Phases of Instructional Systems Design
1. Analysis. Identification of training needs, including knowledge, skills, and attitudes.

2. Design. Specification of training requirements in terms of measurable, demonstrated trainee achievements.

3. Development. Specification of teaching and learning activities; selection or development of instructional media; and field testing new materials.

4. Implementation. Conducting the training program, including validation during a pilot implementation.

5. Evaluation/Maintenance. Specification and implementation of procedures for measuring and maintaining the effectiveness of the instruction provided.
Describe the curriculum documentation, approval and maintenance procedures.
1. Documentation.
a. A validated Training Requirements Inventory (TRI). The TRI will be completed by subject matter experts (SMEs) from UNIT.

b. An approved Curriculum Outline. The COMMAND Training Department in cooperation with the Course Curriculum Model Manager (CCMM) will complete course outlines as Lesson Topic Guide (LTG) revisions are completed.

c. Course Schedule Summary. The course schedule summary will be completed by each training site. Any course hours requiring less than a 25:1 student-to-instructor ratio will be justified on the schedule.

d. Training Materials List. The training materials list will be completed by course managers. If individual Echelon 4 activities add additional support materials to a course, that activity must add the support materials to the Master Training Materials List on file at the activity.

e. Approved LTG with course support materials and student guide/handouts to be
completed by course managers.

f. Class G courses taught under COMMAND cognizance must include a
Student Evaluation Plan.

2. Approval. COMMAND is the Training Program Manager for all courses conducted at COMMAND and has approval authority for curricula. Echelon 4 activities are responsible for developing, revising, and maintaining all courses for which they are CCMM.

a. CCMMs will revise current courses to meet the requirements of BUMEDINST
1553.1 Series and BUMED Curriculum Development Guide.
b. Revised course materials will be provided to each activity teaching a course for SME review and comment and to COMMAND for SME and format review and comment.
c. Echelon 4 activities teaching a course and COMMAND SME and
Training Department representatives will provide review comments within one month of receiving a revision.

d. The CCMM will incorporate pertinent review comments into the master LTG and
training support material.

e. CCMMs will provide to COMMAND and to each activity teaching a
course, a complete copy of all course materials, including the LTG, student guides/handouts, and all other course support material. This may be sent by electronic mail or diskette. The form of transmission must provide podium ready copies, properly formatted and ready for instructors’ personalization and use.

f. COMMAND will provide a Letter of Promulgation and a Course Outline within one month of receiving the master LTG and training support material.

g. New course development may be mandated by BUMED and Navy Training Plan
action items. New courses may be recommended by Echelon 4 training activities, COMMAND, or as a result of customer input. New course recommendations may be submitted to the Commanding Officer as a point paper. The COMMAND Training Department will distribute for review and comment.

3. Maintenance.
a. Training activities will collaboratively decide on CCMM responsibilities and inform the COMMAND Training Department of any changes to the Course Curriculum Model Manager List.
b. By 30 August annually, Echelon 4 activities will provide COMMAND with proposed annual review dates for each of their courses.
c. COMMAND will compile a master annual review schedule and provide it to Echelon 4 activities by 30 September annually.
d. Training activities will use the Annual Course Review Guidelines to complete annual course reviews on each course they teach and forward the results to the CCMM and COMMAND.
e. After incorporating course review changes, the CCMM will provide an electronic transfer of course material changes to all Echelon 4 activities teaching a course and to the COMMAND Training Department.
f. Any revisions that impact the TRI or Course Outline must be reviewed by SMEs at training sites and approved by COMMAND.
(1) Any training facility may propose changes not impacting the TRI or Course Outline to the CCMM. The CCMM will review and approve the change and notify all activities teaching the course to make the change and record it in the LTG Change Record.
Discuss the item(s) developed in Phase I (Analyze) of Task-Based Curriculum Development.
The analysis phase answers two basic questions: What are the requirements of the job or function? Which requirements must be included in the training program to ensure competency at entry level? The primary responsibility for the analysis phase rests with the training program managers, who may involve instructional staff in developing the preliminary training requirements inventory. For existing courses, the training site compares validated training requirements to curriculum materials and determines what revisions, if any, need to be made to the program.
1. Analyze Similar Programs/Courses.
a. Existing Training. For mandatory review projects (such as Cyclical Curriculum Reviews or Navy Training Requirements Reviews) involving current programs, review similar courses for possible consolidation or gaps in the present curriculum.
b. New Development. Existing courses provide shortcuts in the analysis process if enough overlap exists, particularly if you can obtain documentation of the previous analysis. The analysis should be current. Look for documentation that includes the following: general description of the job, including major functions performed, a description of the target audience, and a list of job or training requirements.
(1) Description of the Job. Make sure that the functions included in the analysis or other curriculum materials are basically the same as the functions required in the new job.

(2) Target Audience. The group of people who will receive training makes up the “target audience.” The target audience for a technical education and training program consists of the personnel who could be recruited for the specialty.
(3) Job or Training Requirements. The analysis documentation should include a list of the job or training requirements identified. Job requirements usually reflect everything that is done on the job, although they may exclude requirements that are part of previous training. Training requirements form a subset of job requirements, and include only the elements that must be trained to provide entry-level competency.
2. Analyze Job Structure. This step provides a framework for constructing a TRI. The job structure is made up of the major areas of responsibility in the job and the major supporting requirements for competent performance.
a. Major areas of responsibility consist of the duties or functional areas where specialists or technicians are assigned.
b. Supporting requirements for competent performance consist of the cognitive, affective, and soft skill areas that enable the technician or specialist to carry out the duties that make up the job.
c. Review available job information.
(1) Occupational Standards/NOBC/Subspecialty descriptions (NAVPERS 18068E/BUPERS).
(2) Accreditation/Certification Requirements. Accreditation requirements are useful in determining cognitive requirements.
(3) Curriculum Documentation. Curriculum outlines and lesson topic guides provide information about job requirements even if no analysis documentation could be obtained.
(4) Job descriptions. Local job descriptions may exist even when a function or group of functions is not recognized as an NEC, NOBC, or sub-specialty code.
(5) Technical documents. For jobs that rely heavily on particular types of equipment, technical, and operator manuals give valuable information on the operation and maintenance requirements that need to be covered in training.
d. Outline Job Structure. Outlining the structure serves as an organizing tool for more detailed analyses. It should also serve as a summary of the major requirements for the job; duties or functional areas, knowledge, and attitudes and skills needed.
e. Outline Duty Structures. Develop an outline for each duty shown on the outline of job structure. This process expands on the information provided in the outline of job structure by creating similar outlines for each duty.
3. Construct Preliminary Training Requirements Inventory. Construct detailed lists of the specific cognitive, affective, and skill requirements, as well as a list of the specific tasks that must be performed under each duty.
(1) Application. Develop a preliminary TRI for new program development and for review projects where the existing curriculum is not supported by a TRI.
(2) Format and Conventions. There is no required format for the preliminary TRI. For new program development, follow the organization of the outlines of job and duty structures. For existing programs, follow the organization of the units and lessons from the current curriculum.
(3) Cognitive Elements. These include the concepts, principles, and theories that graduates must know, understand, and apply in each of the cognitive areas included in the outline of job structure, plus additional cognitive areas identified for the duties included in the job.
(a) Stating Cognitive Elements. State using one of three verbs: knows, understands, or applies. As used in the cognitive inventory, these verbs indicate increasingly complex levels of knowledge or learning levels. When drafting cognitive elements:
1. Include a verb indicating the learning level.
2. Make the statement as specific as possible.
3. Use clear, unambiguous language - avoid jargon.
4. Stick to major concepts, rules, principles, and theories.
(b) Affective Elements. These elements should include the attitudes and
behaviors graduates should possess on the job.
1. Stating Affective Elements. The majority of affective elements will be covered in an education and training program by direct teaching about the element, with evaluation limited to students' comprehension of the elements (Knows, Understands, Applies). State such affective elements as you would state cognitive elements; e.g., “Understands the Patient's Bill of Rights.”
(4) Soft Skill Elements. Effective use of soft skills, or human skills, depends first on developing a repertoire of strategies that can be used in variable situations and second on the ability to decide which strategies will work best in each situation.
(5) Task Elements. Task elements will form the great majority of elements on any TRI for a technical education and training program. A learning level and a performance level must be assigned to each task element.
(a) Task Learning Levels.
1. Knows. Ability to recall or recognize factual information
about the task.
2. Understands. Ability to generalize about the task across familiar applications.
3. Applies. Ability to analyze variable situations and make substantive decisions regarding completion of the task.
(b) Task Performance Levels.
1. Guided Performance. Can perform the task using standard procedure with assistance.
2. Standard Performance. Can perform the task using standard procedure without assistance.
3. Adaptive Performance. Can perform the task including modifying standard procedures to meet requirements of changing situations.
(6) Writing Task Statements. A task statement must include an action
verb and an object.
(a) Avoid combining two or more actions in a single task statement.
(b) Avoid multiple objects unless the action remains constant.
(c) Include references in the task statement only if the inclusion changes the character of the task.
(d) Include restrictions or equipment in the task statement if the inclusion changes the character of the task.
(e) Avoid using trade names in task statements.
4. Validate Training Requirements.
a. Selection Factors. Multiple factors, including commonality, consequence of inadequate performance/knowledge, delay tolerance, and learning difficulty; these factors are combined into definitions for priority rankings.
b. Data Gathering. Mail-out survey; respondents mark a learning/performance level and priority for each element on the TRI (priorities include “delete”); space is provided for recommendations or comments on individual items and on the TRI as a whole, and for recommending additions to the TRI.
c. Selection Criteria. Determined by jury-of-experts during validation conference (typically consensus among attendees that items belong to specified priority levels), plus group agreement on learning/performance level.
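The survey tally behind the data-gathering step can be sketched in code. This is an illustration only; the guide does not prescribe a tallying method, and all names and data below are hypothetical:

```python
from collections import Counter

def tally_tri_responses(responses):
    """Tally survey priority marks for each TRI element.

    `responses` maps TRI element IDs to the list of priority marks
    returned by survey respondents ("delete" is a valid mark).
    Returns each element's most common priority and its share of
    the votes, as a starting point for the jury-of-experts.
    """
    summary = {}
    for element, votes in responses.items():
        top_priority, top_count = Counter(votes).most_common(1)[0]
        summary[element] = (top_priority, top_count / len(votes))
    return summary

responses = {
    "TRI-001": ["high", "high", "high", "medium"],
    "TRI-002": ["delete", "delete", "low", "delete"],
}
print(tally_tri_responses(responses))
# {'TRI-001': ('high', 0.75), 'TRI-002': ('delete', 0.75)}
```

The jury-of-experts conference, not the tally, makes the final selection; code like this only summarizes the raw survey returns for the conference to review.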
5. Construct Training Proposal. A training proposal is required for all new program development. For review projects, a training proposal is required if changes indicated in the analysis require an increase or decrease in resources to maintain student load, or if the analysis indicates a need to change the scope of the training program as a whole. The training proposal consists of a cover page, an abstract, a summary of projected resources required, a POA&M, and an analysis report.

a. Course Data. Consists of length, locations, class capacity, convening rate, and instructor requirements.

b. Student Data. Student data provide information about who may attend the proposed program. Entries include: personnel eligible, personnel physical requirements, security clearance required, and prerequisite training.

c. Curriculum Content. This section provides a synopsis of the proposed program.

(1) New Programs. List and describe major areas of the proposed training.

(2) Existing Programs. Summarize the proposed additions and/or deletions to the course or program.

d. Summary of Resource Requirements. Includes sections for facility requirements, one time start-up costs, and annual costs.

(1) Facility Requirements. Describe the classroom, office, and laboratory spaces that will be needed for the program or course, including special requirements.

(2) One Time Start-up Costs. Cost of modifications to facilities; office, classroom, or lab furniture or equipment, teaching materials, and staff training.

(3) Annual Costs. Include manpower requirements for instructor and staff, printing and duplication of student materials, expendable supplies, and reference materials. Estimated costs for berthing and/or messing should be included if government berthing is not available.

e. Plan of Action and Milestones (POA&M). Start the POA&M with the approval of the training proposal. Instead of showing estimated start and completion dates for subsequent events, indicate the completion time required after completion of some prior event.

f. Analysis Report. The analysis report includes a brief description of the process used to
develop and validate the TRI.
Discuss the item(s) developed in Phase II (Design).
The major tasks of the design phase are to establish the basic structure, scope, and sequence of training, develop learning objectives and test items, and complete the curriculum documentation for course approval.

1. Structure and Scope (Curriculum Abstract).

a. Designate Units and Lesson Topics. Most units reflect a job duty. Lesson topics are the divisions within each unit. A lesson topic has a single, finite outcome, expressed as a terminal objective. Lesson topics within the unit cover individual items included under the duty on the TRI.

b. Construct Curriculum Abstract. The curriculum abstract establishes the tentative scope and content for each segment (units and lesson topics) of the program or course. The Training Program Manager (TPM) is responsible for development of the curriculum abstract for a new course and may delegate it to one of the training sites. The curriculum abstract does not require approval by the TPM; however, it should be retained until the curriculum outline is approved.

c. Unit Synopsis. The unit synopsis is a brief statement of the intent and scope of the unit, similar to a course description in a college catalog. The unit synopsis should be no longer than a single paragraph.

d. Lesson Topic Overview. The lesson topic overview includes a list of the items in the validated TRI that will be covered in the lesson topic and an estimate of the number of didactic, lab/practical, and/or clinical contact hours required for the lesson. One or two sentences stating the purpose of the lesson may also be included.

2. Conduct Learning Analysis (Optional).

3. Write/Revise Learning Objectives. All learning objectives must state behavior, conditions, and standards. The learning objective must reflect the training level (learning and performance level) assigned to the item(s) from the TRI that it covers. Write only one unique terminal objective for each lesson topic.

4. Develop Draft Evaluation Tools/Items. Criterion-referenced testing is used for all BUMED courses. This means that students' performance is evaluated against the criteria established by the learning objectives being tested. Make sure the objective is tested, not the content.
a. Written Tests. Written tests measure cognitive abilities and are appropriate for measuring student performance on most learning objectives written at the knowledge or understanding learning levels.

b. Performance Checklists. Performance checklists are used for most learning objectives that require students to demonstrate a skill, task, or procedure. A performance checklist evaluates performance of the steps in a process. Performance checklists serve a number of purposes. Prior to evaluation, they serve as performance guides for students. During evaluation, they minimize subjectivity in grading. After evaluation, they provide detailed feedback to students.

c. Product Evaluation. Product evaluation forms are similar to performance checklists except they focus on the characteristics of a product rather than steps in a process. These are generally used when all students are producing the same product and the focus is on the physical characteristics of the product.

5. Complete Curriculum Documentation. All technical education and training programs under BUMED must have an approved curriculum outline and course schedule summary.
a. Curriculum Outline. The curriculum outline is a summary document, in outline form, for each course of instruction. The outline lists pertinent course and student data, units, lesson topics, learning objectives, evaluation methods, contact hours, and training materials. The curriculum outline is the primary document used to describe and catalog courses and programs under BUMED cognizance. The curriculum outline consists of three major sections: the front matter, the outline of instruction, and the annexes.
(1) The front matter for the curriculum outline includes the cover sheet, change record, course data, student data, foreword, unit synopses, contact hours outline, and table of contents.
(2) The outline of instruction is the longest section of the curriculum outline. It includes all of the learning objectives for the course or program, arranged by units and lesson topics. The outline of instruction includes course and/or unit conventions pages (optional), unit pages, and lesson topic pages.

(3) There are two required annexes to the curriculum outline: a training materials list and the TRI.
b. Student Evaluation Plan. The student evaluation plan (SEP) defines how all students will be evaluated and graded and ensures that students at all training sites are evaluated in the same way in multiple-site programs. It details the number and types of evaluations for each unit in a program, the methods for computing unit and course grades, the policies for remediation and retesting, and information on college credits recommended for completing the course of instruction.
c. Course Schedule Summary. The course schedule summary documents instructor requirements for each course. It includes course data such as convening schedules and summarizes the number of contact hours at each student-to-instructor ratio throughout the course.
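The grade-combination rules a Student Evaluation Plan must spell out can be modeled, for illustration, as a weighted average of unit grades. The weighting scheme and values below are hypothetical assumptions, not figures from the guide:

```python
def course_grade(unit_grades, unit_weights):
    """Combine unit grades into a course grade by weighted average.

    A real Student Evaluation Plan defines its own computation
    method; this weighting scheme is purely hypothetical.
    """
    if set(unit_grades) != set(unit_weights):
        raise ValueError("every unit needs both a grade and a weight")
    total = sum(unit_weights.values())
    return sum(unit_grades[u] * unit_weights[u] for u in unit_grades) / total

grades = {"Unit 1": 88.0, "Unit 2": 92.0, "Unit 3": 75.0}
weights = {"Unit 1": 2, "Unit 2": 1, "Unit 3": 1}  # Unit 1 counts double
print(course_grade(grades, weights))  # 85.75
```

Whatever the actual method, the point of the SEP is that every training site applies the same computation, so students in multiple-site programs are graded identically.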
State the difference between:

(a) Course Learning Objectives and Terminal Objectives.

(b) Topic Learning Objectives and Enabling Objectives.
1. Course Learning Objectives. Course learning objectives reflect the specific skills and knowledge required in a job.

2. Terminal Objectives. A terminal objective is a specific statement of the performance expected from a student as the result of training. There is only one unique terminal objective for each lesson topic.

3. Topic Learning Objectives. The topic learning objectives support course learning objectives. They state behaviors, conditions, and standards for knowledge and skills student must acquire.

4. Enabling Objectives. An enabling objective is a specific statement of the behavior to be exhibited, condition(s) under which it is to be exhibited, and the standard to which it will be performed. Enabling objectives cover all the cognitive, affective, and skill elements students need to master to reach the terminal objective. Enabling objectives support the terminal objectives.
State the difference between the Course Mission Statement and a Terminal
Objective.
The primary difference is that a Terminal Objective relates to trainee behavior, while the Course Mission Statement is descriptive of the course.

1. The Terminal Objective for a lesson reflects a final outcome – some culminating performance that demonstrates mastery of the lesson. The required performance must be observable and measurable by the completion of the lesson.

2. The Course Mission Statement provides the “who,” “what job,” “degree of qualifications,” “where,” and “conditions” for training.
Discuss the item(s) developed in Phase III (Develop).
1. During this phase of curriculum development the training site(s) develop the content of the education and training program or course; specify the instructional delivery system, learning activities and methods of presentation that will be used; and select or develop the media that will be used. The following items are developed during Phase III:

a. Content Outline. Content outline covers the information that must be presented to students if they are to master the learning objectives in the lesson topic. This includes all of the facts, concepts, procedures, rules, theories, or principles that students must learn to master each learning objective. Include enough detail in the content outline for it to serve as the instructor's primary source for information about the content of the lesson topic.

b. Learning Activities. "Learning activities" is a generic term covering almost any structured event where the student is doing something that aids in mastering a learning objective. There are two key characteristics of learning activities. The first key is that students actively participate in the event. The second key is that the learning activity must aid in mastering the learning objective(s) targeted. Learning activities range from drill and practice exercises (e.g., to improve typing skills or learn symbols) to complex projects requiring students to integrate a wide range of knowledge and skills. Whether the learning activity is simple or complex, students must receive feedback on their performance.

c. Program/Course Instructional Delivery System. The program/course instructional delivery system refers to the way the course as a whole is presented: either as group-paced or self-paced instruction.

d. Methods of Presentation. This refers to the ways that the content (i.e., new material or information) of a lesson topic (or a segment of the lesson topic) can be passed to students. Common methods of presenting new information in a group-paced program or course are lectures, instructor demonstrations, and reading assignments.

e. Select/Develop Instructional Media. "Instructional media" are the means of presenting new information and learning activities. Books, audio and video tapes, film, computer programs, student handouts, flash cards, anatomical models, and drawings on a chalkboard are all media. A combination of media can be used in conjunction with a single presentation.

(1) Select or develop instructional media to meet the objectives of the education and training program.

(2) Do not reinvent the wheel – use existing materials or revise existing materials to meet the needs of the program.

f. Lesson Topic Guide. Lesson Topic Guides, also called instructor guides, are used by instructors to implement the education and training program.

g. Review and Field Test Materials.

(1) Review the material for each lesson topic with subject matter experts.

(2) Individual Trials. Individual trials involve using the materials in a one-to-one situation. The person acting as the student should have the same level of background knowledge and skills as the regular students would have.

(3) Group Trials. Group trials are similar to individual trials, but students go through the material without interruption. Check student comprehension on each learning objective. Review key points, concepts, and skills.

Ref(s): BUMED Curriculum Development Guide (1996), pp. 4-1 to 4-17.


Discuss the purpose of Instructional Media Materials (IMM).

Instructional media are the means of presenting new information and learning activities. Books, audio and video tapes, film, computer programs, student handouts, flash cards, anatomical models, and drawings on a chalkboard are all media.
State and discuss the elements of the Lesson Plan.
Front Matter and Lesson Topics

1. Front Matter. Consists of these elements in this order: Cover Page (optional), Title Page, Change Record, Table of Contents, Security Awareness Notice Page, Safety/Hazard Awareness Notice Pages, How to Use the Lesson Plan (optional), and Terminal Objectives Page(s).

2. Lesson Topics. Lesson Topics are logically grouped and make up a unit. Instruction is presented at the lesson topic level. Lesson topics contain two parts:

a. Topic Pages list:

(1) Allocation of classroom and lab time.

(2) Enabling objectives.

(3) Trainee preparation materials.

(4) Instructor preparation materials.

(5) Training materials required.

b. Discussion-Demonstration-Activity Pages show:

(1) Discussion points (all points to be covered in proper sequence).

(2) Related Instructor Activity (gives the instructor specific directions).
Discuss the item(s) developed in Phase IV (Implement).
1. General Planning. Develop a detailed Plan of Action and Milestones (POA&M) for preparing for the pilot implementation as early as possible. For new courses, the training program manager may require a POA&M. As a minimum, establish estimated start and completion dates in the POA&M for the following:

a. Ordering equipment, supplies, and materials.

b. Scheduling use of classroom and laboratory spaces, instructors, equipment, and media.

c. Conducting staff development.

d. Duplicating or printing materials for instructors and students.

e. Collating materials for students.

f. Developing a validation plan.

2. Staff Development. The function of staff development for pilot implementation is to ensure that every instructor is familiar with all parts of the course or program and prepared to instruct his/her assigned portion of the course. Staff development plans do not require approval from higher authority.

3. Validation Plan. The validation plan identifies the data collection and analysis methods that will be used to monitor training. At a minimum, the validation plan will include data collection and analysis methods for class achievement, validity and reliability of evaluation instruments, student feedback, and instructor feedback.

a. Class Achievement. Collecting and analyzing data on class achievement helps to identify problem areas in a course. These problem areas may involve the successful or unsuccessful accomplishment of the course objectives, the effort that was exerted by students to achieve the criteria, or the number of students who finished the course or were set back. As a minimum, the validation plan will include a definition of acceptable achievement and a statement of what areas will be studied if the achievement goal is not met. Acceptable class achievement is usually defined in terms of a percentage of the class achieving a specific level of performance for each section (i.e., unit or lesson topic). This designated level of class performance signifies accomplishment of the training objectives. Class achievement may be defined at two levels, one for the percentage of students passing a unit or lesson topic and one for the percentage of students attaining a specified level of performance above passing.
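The two-level definition above lends itself to a simple computation. A sketch follows, with hypothetical 70/90 cutoffs that an actual validation plan would define for itself:

```python
def class_achievement(scores, passing=70.0, excellence=90.0):
    """Summarize class achievement at two levels: the share of
    students passing, and the share performing well above passing.
    The 70/90 cutoffs are illustrative, not BUMED policy.
    """
    n = len(scores)
    percent_passing = 100 * sum(s >= passing for s in scores) / n
    percent_excelling = 100 * sum(s >= excellence for s in scores) / n
    return percent_passing, percent_excelling

scores = [95, 88, 72, 65, 91, 78, 84, 90]
passing, excelling = class_achievement(scores)
print(passing, excelling)  # 87.5 37.5
```

Comparing these figures against the plan's stated definition of acceptable achievement (e.g., a set percentage of the class passing each unit or lesson topic) identifies the areas that need further study.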

b. Validity and Reliability of Evaluation Instruments. The validation plan should state the criteria for deciding to conduct a detailed analysis of individual tests and test items. The primary purpose of test item analysis is to detect bad test items. Include descriptions of the following in the validation plan:

(1) Criteria for conducting a test item analysis.

(2) Method(s) for establishing reliability of a suspect test or item.

(3) Method(s) for establishing validity of a suspect test or item.
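One common way to detect bad test items, offered here only as an assumption since the guide does not prescribe a specific method, is classical item analysis: an item's difficulty index (the proportion of students answering correctly) and discrimination index (difficulty among the top-scoring students minus the bottom-scoring students). A sketch, with rule-of-thumb cutoffs:

```python
def item_analysis(item_responses):
    """Compute difficulty and discrimination indices for one item.

    `item_responses` pairs each student's result on the item
    (True = correct) with that student's total test score.
    The 0.2/0.95 cutoffs are common rules of thumb, not policy.
    """
    ordered = sorted(item_responses, key=lambda r: r[1], reverse=True)
    half = len(ordered) // 2
    upper, lower = ordered[:half], ordered[-half:]
    difficulty = sum(c for c, _ in item_responses) / len(item_responses)
    discrimination = (sum(c for c, _ in upper) - sum(c for c, _ in lower)) / half
    suspect = difficulty < 0.2 or difficulty > 0.95 or discrimination < 0.2
    return difficulty, discrimination, suspect

# (correct?, total test score) for six students on one item
data = [(True, 95), (True, 88), (True, 80), (False, 62), (True, 58), (False, 50)]
difficulty, discrimination, suspect = item_analysis(data)
print(round(difficulty, 2), round(discrimination, 2), suspect)  # 0.67 0.67 False
```

An item flagged as suspect is then examined with whatever reliability and validity methods the validation plan names, rather than being deleted automatically.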

c. Student Feedback. Student feedback provides additional evaluation of the effectiveness of the instructional materials and strategies from the students' point of view. This feedback may be gathered through questionnaires or through meetings with students. As a minimum, the validation plan will state how student feedback on the course will be gathered. If possible, criteria should also be established for when student responses indicate a need for further investigation of the effectiveness of the instructional materials or presentation.



d. Instructor Feedback. Instructor feedback is also useful in evaluating the effectiveness of instructional materials and strategies. As a minimum, the validation plan will state how instructor feedback will be gathered. As with student feedback, questionnaires may be used to obtain instructor feedback or meetings may be held with the instructors to let them express their reactions to the curriculum.

4. Final Approval. During the pilot implementation, follow the validation plan to monitor the effectiveness of the curriculum, instructional materials, and instructional methods. If the curriculum requires substantial further revisions, submit a validation report detailing the problems that were found during the pilot implementation and the revisions that will be made to address the problems. Include a POA&M for completion of the revisions with the validation report. If the curriculum requires only minor adjustments that can be put in place with the next convening class, no validation report is required. Simply send a letter to the training program manager advising him/her that a second pilot implementation will be needed (he/she may ask for additional information). Once pilot implementation is complete, submit the documentation for the course to the training program manager for final approval.
Discuss the item(s) developed in Phase V (Evaluate/Maintain).
The intent of this phase is to ensure that an education and training program or course remains effective.

1. Review.

a. In-House Course Reviews. These cover the same elements as those in the validation plan, but focus on maintaining the effectiveness and consistency of the training materials, testing instruments, and instruction. The results are summarized in a short report referred to as an “after-instruction report.” These reports should include:

(1) A summary of class achievement.

(2) A summary of responses from student questionnaires.

(3) Comments or suggestions from instructors (if the segment is taught by more than one instructor).

(4) Disposition of any suspect test items or evaluation tools.

b. Additionally, review lesson topic guides, instructional methods, and instructional materials for the course as a whole at least annually.



c. Each training program should have an established monitoring plan, detailing the data to be gathered and the frequency of reporting.

d. Include provisions for ensuring consistency, currency, and efficient use of technology across the course as a whole.

2. Meet the Medical Department Training Needs.

a. External Training Feedback. Feedback is obtained from recent graduates and/or their supervisors. Feedback may also be obtained from sites where graduates are assigned. Feedback is used to make ongoing adjustments to the course.

(1) Surveys of Graduates and their Supervisors. Use information received to make adjustments to the instruction and to identify areas for curriculum revisions. Survey results should be kept on file for at least two years.

(a) Survey a sample of graduates and/or their supervisors 3 to 6 months following graduation.

(b) Ask graduates pertinent questions to ascertain if training was relevant to their position, prepared them for their job, etc.

(c) Similar questions about the graduate’s performance may be asked of the supervisor.

c. Mandatory Review Programs. The Navy Training Requirements Review (NTRR) is mandated by CNO and is required for all shore-based training. BUMED is integrating the Cyclical Curriculum Review (CCR) and the NTRR processes. The CCR process has been applied to all Medical Department enlisted education and training programs leading to the award of an NEC and to the Hospital Corps and Dental Technician Basic (Class “A”) education and training programs. In each cycle of the CCR, a complete review of each program is conducted. Training requirements are developed or reviewed and compared to existing curriculum.

(1) A curriculum change process is used to propose, review, and approve changes to an education and training program that do not change the resources required to conduct the training nor change the scope of the program as a whole.

(2) Changes may be proposed in a letter or point paper format, or using the formats supplied in the BUMED Curriculum Development Guide.
Explain the purpose and function of a Multi-Training Site Standardization
Conference.
1. Multiple Training Site Conferences. Multiple training site conferences are scheduled for all enlisted formal school programs (Class "A" and "C" schools) conducted at more than one training command. For most such programs, conferences are held annually, rotating the meeting site among the commands conducting the program. The training command hosting the MTS meeting is responsible for the following: establishing the dates of the conference; soliciting agenda items from all associated training sites and the training program manager; providing copies of the tentative agenda and any documents that will be reviewed during the conference; providing a meeting room; providing a recorder to take minutes of the proceedings (draft minutes of the previous day's proceedings are reviewed each morning); and completing the conference report and submitting it to the training program manager and all associated training commands.

a. The following elements of any program or course taught at two or more sites must be identical:

(1) Unit and lesson topic titles.

(2) Terminal and enabling objectives.

(3) Didactic, laboratory/practical, and clinical contact hours assigned to each lesson topic.

(4) Student references (i.e., training materials for which students are held directly responsible; the references cited in the learning objectives).

(5) Student Evaluation Plan.

2. Minor variations are allowed in scheduling and in evaluation tools as noted below:

a. Scheduling. Lessons within one unit may not be shifted to another unit, nor may material in one lesson be shifted to another lesson. Within these limitations, schedules may be adjusted to allow the most efficient use of facilities, equipment, and instructors as long as dependent relationships within the curriculum are not compromised.

b. Performance Checklists. Performance checklists will be identical at all sites, except where variation is required by equipment differences. Equipment used at different sites must be functionally identical with the same level of complexity, but does not have to be the same brand and model.

c. Written Tests. Written tests must be equivalent but need not be identical. Written tests must cover the same span of material, include the same blend of types of test items, and be equal in terms of complexity and difficulty. Where feasible, a single test item bank will be used by all sites.

d. Lesson Topic Guides. Lesson topic guides must reflect the same essential information, but do not have to be identical. Exception: Lesson topic guides for Hospital Corps School are identical except for instructor personalization.

3. Provided that the above requirements are not compromised, differences may occur at the various sites in the following:

a. Instructor and student activities.

b. Guest lecturers.

c. Homework assignments.

d. Instructor reference materials.

e. Supplementary materials (e.g., audiovisuals or instructor developed student handouts used for enhancement, Learning Resource Center acquisitions).

f. Materials used for remediation.
501.13 Every activity has a different process for scheduling, conducting, and completing course packages. Explain the in-house process (including courses receiving continuing education credits) at your activity.
(Each activity shall write in their own process here)
State the purposes of testing.
To determine whether or not a student has sufficient knowledge or skill to meet the requirements established by the learning objectives; that is, whether or not the student has learned the material.
List and describe the three types of performance tests.
1. Product. A product is an observable result – something you can see, hear, or touch. Product testing is possible when:

a. The objective specifies a product.

b. The product can be measured as to the presence or absence of certain characteristics, e.g., does it look right, have the right texture, sound the way that it should?

c. Procedural steps may be performed in a different order or sequence without affecting the product.

2. Process. A process consists of step-by-step procedures required to produce a product or complete a task. Process testing is appropriate when:

a. The product and the process are the same thing – such as teaching a lesson.

b. There is a product, but safety, high cost, or other constraints prevent the product from being measured.

c. It is necessary to examine each step of the process in order to diagnose the reason for performance failure.

d. There may be a product, but there are critical points in the process which must be performed correctly because of the possibility of damage to personnel or equipment.

e. The objective specifies a sequence of steps that can be observed.

f. The process does not result in a product.

g. Your interest is in the actual behavior itself.

3. Combination (Product and Process). This performance test is concerned with both an observable result, and the step-by-step process leading to the result. Combination testing is appropriate when:

a. Both product and process are equally important to the final result, or it is required so as to avoid hazards to personnel or equipment.

b. Safety considerations almost always dictate that the operation or maintenance of a device, i.e., the process, be done in a certain way; however, the outcome, i.e., the product, is just as important to successful job performance.
Describe the format, advantages, and disadvantages of each of the
following types of knowledge test items:

(a) Matching.
(b) True/False.
(c) Multiple-Choice.
(d) Short-Answer.
(e) Essay.
1. Matching.

a. Format. Consists of directions to inform the trainee how to match the listed items. The standard matching format consists of two lists containing related words, phrases, or symbols. The student is required to match elements on one list with associated elements on the other list according to specific instructions. Matching format is usually used when it is difficult to develop plausible distracters for individual items and the items are closely related.

b. Advantages:

(1) Easier to grade.

(2) Works well for items like terms and definitions.

c. Disadvantages.

(1) As a selected-response item, it cannot be used to test recall.

(2) A good selected-response item is difficult to write for testing learning at the higher levels.

2. True/False.

a. Format. True-false test items present a statement to the student, who must then determine whether or not the statement is correct.

b. Advantages. Easier to grade.

c. Disadvantages.

(1) As a selected-response item, it cannot be used to test recall.

(2) Students have a 50-50 chance of guessing the correct answers.

(3) A good selected-response item is difficult to write for testing learning at the higher levels.

3. Multiple-Choice.

a. Format. Format consists of two parts: the problem statement, followed by a set of alternatives. Typically either 4 or 5 alternatives are given. The problem statement is called the “stem,” the correct response is called the “key,” and the incorrect alternatives are the “distracters.”

b. Advantages. Easier to grade.

c. Disadvantages.

(1) As a selected-response item, it cannot be used to test recall.

(2) A good selected-response item is difficult to write for testing learning at the higher levels.

(3) If wording in the problem statement is not clear or is ambiguous, more than one answer could possibly be correct.

(4) If not well written, distracters can be easily discarded by students, allowing them to more easily guess the correct answer.

4. Short-Answer.

a. Format. Short-answer items may be phrased either as questions or as directives. Each item should be clear and complete. The student must construct a short response to a question. The response is usually no more than one or two sentences and may be a single word or phrase.

b. Advantages.

(1) Minimizes the effectiveness of guessing.



(2) Provides an indication of students’ ability to communicate ideas in a coherent and logical manner.

c. Disadvantages. Grading. Scoring keys must be carefully prepared. Even with good scoring keys, grading takes longer and may be more subjective than with selected-response items.

5. Essay.

a. Format. Essay items are normally phrased as directives. Each item should be clear and complete.

b. Advantages.

(1) Minimizes the effectiveness of guessing.

(2) Provides an indication of students’ ability to communicate ideas in a coherent and logical manner.

c. Disadvantages. Grading. Scoring keys must be carefully prepared. Even with good scoring keys, grading takes longer and may be more subjective than with selected-response items.
Describe the function of test item analysis.
After the test items have been reviewed for content validity and administered to the students, statistics will be kept by the course personnel to complete the validation process. These statistics include discrimination, difficulty, and, for multiple-choice items, the effectiveness of alternatives.
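As an illustration only, the difficulty and discrimination statistics named above can be sketched in Python. This is not an official BUMED formula: the 1/0 scoring convention and the cut at thirds of the class are assumptions for the example.

```python
# Illustrative sketch of two common item-analysis statistics (assumptions:
# each response is scored 1 = correct, 0 = incorrect; class is split into
# thirds by overall test score).

def difficulty_index(item_scores):
    """Proportion of students answering the item correctly (higher = easier)."""
    return sum(item_scores) / len(item_scores)

def discrimination_index(students):
    """Compare item performance of the top and bottom thirds of the class.

    `students` is a list of (overall_test_score, item_correct) pairs.
    Returns p_upper - p_lower; values near +1 discriminate well, while
    values near zero or negative flag a suspect item.
    """
    ranked = sorted(students, key=lambda s: s[0], reverse=True)
    third = len(ranked) // 3
    upper = [item for _, item in ranked[:third]]
    lower = [item for _, item in ranked[-third:]]
    return sum(upper) / third - sum(lower) / third
```

For example, an item answered correctly by every student in the top third and missed by every student in the bottom third has a discrimination index of 1.0.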
Define the following terms as used in evaluating instruments and items for
measuring student achievement:
(a) Content validity.
(b) Reliability.
(c) Interrater reliability
1. Content Validity. An evaluation item or tool has content validity if it accurately measures what it purports to measure. Content validity for all student evaluation tools and items must be established prior to pilot implementation, by review with SMEs and education experts. To be valid, an evaluation tool or item must be reliable. The validity of evaluation tools and items must be monitored throughout the life of the course. For written tests, statistical measures can be used to establish the reliability of the test as a whole and of individual items on the test.

2. Reliability. Refers to how consistently an evaluation tool or item measures performance. Reliability for performance checklists and product evaluation forms relies heavily on staff training. For written tests, statistical measures can be used to establish the reliability of the test as a whole and of individual items on the test.

3. Interrater Reliability. The key to establishing interrater reliability is to compare ratings assigned to a single performance or product by a number of raters using the same evaluation tool. If the evaluation tool used is reliable and instructors have been trained in its use, variability among raters will be minimal. Optimally, 3 to 5 raters or evaluators will participate, and multiple performances or products should be used to compare the ratings assigned by individual raters to different performances or products. Results from all raters are compared. In general, if there is no majority agreement or if raters differ by two or more points on the scale, the step should be reviewed.
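The comparison just described can be sketched in Python. The two-point spread and majority-agreement tests follow the paragraph above; the step names, rating scale, and data layout are hypothetical.

```python
# Illustrative sketch of an interrater-reliability screen. `ratings` maps each
# checklist step to the scores assigned by each rater for a single observed
# performance (step names and the 1-5 scale are assumptions).
from collections import Counter

def flag_steps(ratings, max_spread=2):
    """Return steps needing review: rater scores spread two or more points
    apart, or no single score assigned by a majority of the raters."""
    flagged = []
    for step, scores in ratings.items():
        spread = max(scores) - min(scores)
        majority_count = Counter(scores).most_common(1)[0][1]
        if spread >= max_spread or majority_count * 2 <= len(scores):
            flagged.append(step)
    return flagged
```

With three raters scoring a step 4, 4, 5, the step passes; scores of 2, 4, 5 (a three-point spread and no majority) flag the step for review.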
Program Evaluation

503.1 List the elements included in a validation plan and explain the application
of each.
1. Class Achievement. Collecting and analyzing data on class achievement helps to identify problem areas in a course. These problem areas may involve the successful or unsuccessful accomplishment of the course objectives, the effort that was exerted by students to achieve the criteria, or the number of students who finished the course or were set back. As a minimum, the validation plan will include a definition of acceptable achievement and a statement of what areas will be studied if the achievement goal is not met. Acceptable class achievement is usually defined in terms of a percentage of the class achieving a specific level of performance for each section (i.e., unit or lesson topic). This designated level of class performance signifies accomplishment of the training objectives. Class achievement may be defined at two levels, one for the percentage of students passing a unit or lesson topic and one for the percentage of students attaining a specified level of performance above passing.
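The two-level definition of class achievement might be computed as in the sketch below. The specific thresholds (75 passing, 90 for the higher level, 90% and 25% of the class) are invented for illustration; in practice they come from the validation plan.

```python
# Illustrative two-level class-achievement check. All thresholds are
# assumptions for the example, not prescribed values.

def class_achievement(scores, passing=75, honors=90,
                      min_pass_pct=0.90, min_honors_pct=0.25):
    """Report the fraction of the class passing, the fraction above a
    higher performance level, and whether both achievement goals are met."""
    pct_passing = sum(s >= passing for s in scores) / len(scores)
    pct_honors = sum(s >= honors for s in scores) / len(scores)
    return {
        "pct_passing": pct_passing,
        "pct_honors": pct_honors,
        "goal_met": pct_passing >= min_pass_pct and pct_honors >= min_honors_pct,
    }
```

A class where 9 of 10 students pass and 3 of 10 exceed the higher level would meet both example goals.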



2. Validity and Reliability of Evaluation Instruments. The validation plan should state the criteria for deciding to conduct a detailed analysis of individual tests and test items.

a. The primary purpose of test item analysis is to detect bad test items. Include descriptions of the following in the validation plan:

(1) Criteria for conducting a test item analysis.

(2) Method(s) for establishing reliability of a suspect test or item.

(3) Method(s) for establishing validity of a suspect test or item.

b. Analyze responses to all high-miss items (i.e., items missed by 20% or more of the class).

(1) High vs. low scoring students. Compare the number of misses from the top and bottom thirds of the class based on overall test scores; review for content validity and revise or delete the item if 30% or more of incorrect responses come from high-scoring students.

(2) Common patterns for incorrect responses. Determine the number of hits on the same incorrect response; review for content validity and possible revision if the same response accounts for 30% or more of incorrect responses.

c. Analyze test structure (item mix, item construction, comparison to LTG) for all written tests with a retest rate of 10% or higher.

d. Analyze for interrater reliability on any performance or product evaluation with a retest rate of 10% or higher.

e. Review LTG content and instructional methods for any segment of a unit with a retest rate on any written test or performance/product evaluation of 10% or higher.
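The high-miss screening rules in paragraph b above can be sketched as follows. The data layout (students labeled "high", "middle", or "low" by overall-score third, lettered answer options) is an assumption for illustration.

```python
# Illustrative screen for one test item. `responses` is a list of
# (student_group, chosen_option) pairs, `key` is the correct option, and
# `class_size` is the number of students tested (layout is an assumption).

def screen_item(responses, key, class_size):
    """Apply the 20% high-miss rule, the 30% high-scorer rule, and the
    30% common-wrong-answer rule to a single item."""
    misses = [(grp, opt) for grp, opt in responses if opt != key]
    report = {"high_miss": len(misses) / class_size >= 0.20}
    if misses:
        high_share = sum(1 for grp, _ in misses if grp == "high") / len(misses)
        # Most frequently chosen incorrect option and its share of all misses.
        top_wrong = max(set(opt for _, opt in misses),
                        key=lambda o: sum(1 for _, opt in misses if opt == o))
        wrong_share = sum(1 for _, opt in misses if opt == top_wrong) / len(misses)
        report["review_high_scorers"] = high_share >= 0.30
        report["review_common_pattern"] = wrong_share >= 0.30
    return report
```

An item missed by 3 of 10 students, with one miss from the top third and two misses sharing the same wrong option, trips all three review criteria.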

3. Student Feedback. Student feedback provides additional evaluation of the effectiveness of the instructional materials and strategies from the students' point of view. This feedback may be
gathered through questionnaires or through meetings with students. As a minimum, the validation plan will state how student feedback on the course will be gathered. If possible, criteria should also be established for when student responses indicate a need for further investigation of the effectiveness of the instructional materials or presentation.

a. Administer student questionnaires at the end of each unit; investigate any item with a negative response from more than 15% of the class.

b. Elicit input from students during post-test review sessions.


4. Instructor Feedback. Instructor feedback is also useful in evaluating the effectiveness of instructional materials and strategies. As a minimum, the validation plan will state how instructor feedback will be gathered. As with student feedback, questionnaires may be used to obtain instructor feedback or meetings may be held with the instructors to let them express their reactions to the curriculum. Have instructors complete the lesson evaluation form at the completion of each lesson; summarize and review all negative comments during staff meetings.

5. Final Approval. During the pilot implementation, follow the validation plan to monitor the effectiveness of the curriculum, instructional materials, and instructional methods. If the curriculum requires substantial further revisions, submit a validation report detailing the problems that were found during the pilot implementation and the revisions that will be made to address the problems. Include a POA&M for completion of the revisions with the validation report. If the curriculum requires only minor adjustments that can be put in place with the next convening class, no validation report is required. Simply send a letter to the training program manager advising him/her that a second pilot implementation will not be needed (he/she may ask for additional information). Once pilot implementation is complete, submit the documentation for the course to the training program manager for final approval.
List the elements included in an after instruction report and explain the
application of each.
1. Class Achievement. State the number of test and performance checklist failures within the segment (with results of retesting) and the number of academic/non-academic drops or setbacks that occurred during the segment. You may also want to include the class average and the lowest and highest averages for the segment.

2. Student Feedback.

a. In each course, students provide anonymous feedback on the effectiveness of the
instruction from their perspective, including problems they experience with the curriculum, the instructors, the training aids, the tests, and the equipment. A questionnaire should focus on information to improve instruction, and should be administered immediately following the class or unit being evaluated, when students' memory of the experience is fresh.

b. In the after-instruction report, summarize the feedback and report actions taken on adverse findings. In the sample report, the same negative comment from a relatively small number of students was sufficient to include the item in the report. Please note that not all things that students dislike need to be changed.



c. There are no specific requirements for summarizing student questionnaires. That process depends on the way you set up the student questionnaires and the resources available for analyzing the input. Some schools compute average responses on the student questionnaires; others use simple tallies to summarize responses. Some only count responses that indicate problems. Thus, on a 1 to 5 scale, with 5 being highly positive, a school could decide to count the responses at the 1 or 2 level only. This is certainly faster than tallying each response, but you lose the chance to take a pat on the back if most of the responses are at the 4 or 5 level.

3. Instructor Input. Instructor input may come from questionnaires, in which case you should tally the responses much the same way as you would from student questionnaires. Most schools use less formal mechanisms for instructor input. In the sample report, instructor input deals with timing. Other items that may surface from instructor input are the need to revise the content or structure of a lesson topic guide, to clarify some student materials, or recommendations to switch to a different student text. You need to report any problems instructors encountered with instructional materials, sequence of material, or student response to instruction.

4. Test Validity/Reliability. This section usually focuses on high-miss items and their disposition. It may also be used to point out changes needed in the schedule for evaluating student performance.

5. Review of Materials. The after-instruction report focuses on the delivery of instruction, usually in a single segment of the course, and critiques of the delivery by students and instructors. The second part of in-house monitoring focuses specifically on materials used in the course as a whole. Are they current? Have changes in lesson topic guides (LTGs) created discrepancies in student handouts, test items, or practice materials? Are all materials consistent? Have changes in one part of the course created discrepancies with another part of the course? Can technology be used to provide more effective or efficient instruction?

6. Frequency and Documentation. The key to in-house monitoring of training is that it is a continuous evolution. For long courses, complete an after-instruction report for each block of instruction (e.g., each unit or set of closely related units) within the course. For shorter courses presented several times a year, complete an after-instruction report for each iteration, with an annual or semi-annual summary to detect trends across presentations. Keep the reports on file for at least two years. If filing space is a problem, consider keeping the files on computer disks.
State the purpose of student critiques of instruction
Questionnaires may be developed to assess the effectiveness of instruction from the students' perspective. These questionnaires may be administered after each lesson topic or unit, at the conclusion of a group of units, and at the conclusion of the course. Administer questionnaires frequently enough to ensure that responses can be tied to specific instructional materials or methods. When a program is going through pilot implementation, student feedback is normally obtained for each unit (new programs) or for each revised unit (existing programs with extensive revisions).
Describe the Cyclical Curriculum Review Process
1. The cyclical curriculum review process has been applied to all Medical Department enlisted education and training programs leading to the award of an NEC and to the Hospital Corps and Dental Technician basic (Class "A") education and training programs.

2. In each cycle of the cyclical curriculum review process, a complete review of each program is conducted. Training requirements, including accreditation or certification requirements, for the program as a whole are developed or reviewed and compared to the existing curriculum. A broad range of subject matter experts from outside the training community participate in the analysis phase and may be involved in part of the design phase. Cyclical curriculum reviews cover all aspects of the training program and include all phases of the curriculum development model.
Explain the purpose of conducting surveys of graduates and their
supervisors.
1. Instructors should survey a sample of course graduates and/or their supervisors 3 to 6 months following graduation, to learn whether the training was adequate to the requirements of the job. The survey may be conducted by mail, electronic mail, telephone, or personal interview. Include questions such as the following:



a. What are the main duties of your job?

b. How well are you able to perform your job?

c. How well did the course prepare you for your job?

d. What portions of the instruction were relevant to your job?

e. What portions were irrelevant?

f. In your job, how often do you use the skills taught?

g. In your job, what tasks give you the most difficulty?

h. In your job, for which tasks do you feel least adequately prepared?

i. What parts of the instruction do you think could be changed to better prepare students for their job?

2. Similar questions about the graduate's performance can be asked of the supervisor. Use information from the survey to make adjustments to the instruction and to identify areas for curriculum revisions. Keep survey results on file for at least 2 years, as such documentation is often required in the accreditation process.
