Evaluating Skills

1. How?: Examine actions or behaviours that can be directly observed.

1.1. Performance assessment

1.2. A learner's ability to perform a particular task OR series of related tasks that make up a particular skill

1.3. E.g.: Speaking in an interview

1.3.1. Coherence

1.3.2. Body Language

1.3.3. Fluency

1.3.4. Grammar

1.3.5. Cohesion

1.3.6. Tone

1.3.7. Lexical resources

1.4. General skill = instructional goal; precise skill = instructional objective

2. Guidelines

2.1. 1. Determine what will be evaluated. Both process and product can be evaluated.

2.2. 2. Evaluating process involves: Following a proper series of steps, using tools or instruments properly, completing the skill in a certain timeframe.

2.3. 3. Evaluating product: Quality and quantity

2.4. 4. Evaluate under the most realistic conditions possible

3. Assessment Techniques

3.1. Direct Testing

3.1.1. Certain skills can be directly tested. E.g.: Drawing blood, sorting letters, running, using a power tool (physical)

3.1.2. Primary evaluation is of the final outcome that results from performing the skill (product). How the learner performed can also be evaluated (process)

3.1.3. Guidelines

3.1.3.1. 1. Review task analysis results to determine the steps needed to perform the skill; each step becomes an individual criterion

3.1.3.2. 2. Determine level of proficiency required

3.1.3.3. 3. Determine where the test will be conducted and what it will require

3.1.3.4. 4. Write test instructions that explain how the test will be conducted

3.1.3.5. 5. Establish how results will be recorded

3.1.3.6. 6. Conduct test, judge proficiency, provide feedback
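The guidelines above can be sketched in code: criteria come from the task analysis (step 1), a proficiency threshold is set (step 2), results are recorded per criterion (step 5), and proficiency is judged (step 6). This is a minimal illustration; the skill, criteria, and 80% threshold are hypothetical, not from the source.

```python
def judge_proficiency(results, threshold=0.8):
    """Return (score, passed) given {criterion: True/False} results."""
    score = sum(results.values()) / len(results)
    return score, score >= threshold

# Hypothetical criteria derived from a task analysis for "drawing blood"
results = {
    "verifies patient identity": True,
    "applies tourniquet correctly": True,
    "selects appropriate vein": True,
    "maintains sterile technique": False,
    "labels sample at bedside": True,
}

score, passed = judge_proficiency(results)
print(f"{score:.0%} of criteria met; proficient: {passed}")
```

Recording pass/fail per criterion (rather than a single overall mark) also supports the feedback step, since the learner can be told exactly which steps fell short.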

3.2. Performance Ratings

3.2.1. Similar to direct testing, but focus is different.

3.2.2. Focus on process, not product (e.g. cooking techniques to make a soufflé, developing plans to print a 3D model, debating skills)

3.2.3. Checklists to verify the sequence of steps; rating scales (e.g. Likert scales) to judge quality

3.2.4. Rating scales must be interrater reliable (produce consistent results across raters): pilot-test the scale, and agree on its criteria and how it works

3.2.5. Guidelines

3.2.5.1. 1. Use a five-point Likert scale (or another odd number of points)

3.2.5.2. 2. Include a verbal description for each rating point; descriptions must not overlap

3.2.5.3. 3. Clear and distinct wording to express items

3.2.5.4. 4. One idea for each item

3.2.5.5. 5. Pilot test from target group or closely representative group
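Interrater reliability, as required above, can be checked after a pilot test with a chance-corrected agreement statistic such as Cohen's kappa. A minimal sketch, assuming two raters have scored the same eight performances on a five-point scale (the sample ratings are invented for illustration):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    n = len(rater_a)
    # Proportion of performances where the raters gave the same rating
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's rating distribution
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

rater_a = [5, 4, 3, 4, 2, 5, 3, 4]
rater_b = [5, 4, 3, 3, 2, 5, 3, 5]
print(f"kappa = {cohens_kappa(rater_a, rater_b):.2f}")
```

A low kappa signals that the raters have not yet agreed on the scale criteria and how the scale works, so the pilot and discussion steps should be repeated before the scale is used for real evaluation.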

3.3. Observation and Anecdotal Records

3.3.1. A record resulting from observing the performance of a skill (e.g. a teacher observation, an instructor conducting training, riding along on shift with police)

3.3.2. Steps

3.3.2.1. Carried out in the exact setting, or a representative one, in which the skill will be used

3.3.2.2. Notes taken to reflect how well skill was performed

3.3.2.3. Notes are expanded into a narrative describing the performance and areas for improvement

3.3.3. Guidelines

3.3.3.1. 1. Determine what skills to focus on

3.3.3.2. 2. Determine whether recording devices will be used - gain permission

3.3.3.3. 3. Decide whether the observation will be announced or unannounced

3.3.3.4. 4. Notes taken should be brief and direct, with no interpretation. Expand on notes as soon as possible after the observation

3.3.3.5. 5. Determine whether follow-up observation is required

3.3.3.6. 6. If there are multiple observers, ensure interrater reliability

3.4. Portfolios

3.4.1. Collection of artifacts depicting ability. E.g.: an artist's paintings, songs, short stories, creative works, learning products, an ISD model

3.4.1.1. Represents important contextualized learning that requires complex thinking and expressive skills

3.4.2. Focus: the products the learner creates as a result of skills acquired over a given period of time, showing the progression of learning

3.4.2.1. Indicate whether skills learnt have been applied

3.4.2.2. Indicate strengths and weaknesses

3.4.2.3. Richer and more valid assessment of students' competencies compared to traditional testing

3.4.3. Guidelines

3.4.3.1. 1. Determine the skills to cover, the period of time, and the relation to instructional goals and objectives

3.4.3.2. 2. Identify how artifacts will be evaluated; develop rubrics

3.4.3.3. 3. Identify how the entire portfolio will be evaluated and how change in skill will be determined

3.4.3.4. 4. Determine which artifact samples are to be included; discuss and meet to give feedback

3.4.3.5. 5. Judge the completed portfolio

3.5. Rubrics

3.5.1. Criterion-referenced assessment

3.5.1.1. Can be used in conjunction with various other techniques

3.5.1.2. Can be used with knowledge, skill, and attitude assessment

3.5.1.3. Guide for learner AND instructor

3.5.2. Benefits

3.5.2.1. Can be matched with standards and objectives

3.5.2.2. Provide much greater level of detail than letter grades or numerical scores

3.5.2.3. Save evaluation time

3.5.2.4. Help make evaluation consistent and objective

3.5.2.5. Guide learners on what to focus on
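A criterion-referenced rubric can be represented as a plain data structure: each criterion maps rating levels to the verbal descriptors that make scoring consistent and guide the learner. The criteria and descriptors below are hypothetical (borrowing the interview-speaking example from earlier); a real rubric would be matched to the instructional objectives being assessed.

```python
# Hypothetical rubric for two criteria of interview speaking
rubric = {
    "coherence": {
        1: "Ideas are disconnected",
        3: "Ideas mostly follow a logical order",
        5: "Ideas form a clear, logical progression",
    },
    "fluency": {
        1: "Frequent long pauses",
        3: "Occasional hesitation",
        5: "Speech flows naturally",
    },
}

def score_performance(rubric, ratings):
    """Print each criterion's descriptor and return the average rating."""
    for criterion, level in ratings.items():
        print(f"{criterion}: {level} - {rubric[criterion][level]}")
    return sum(ratings.values()) / len(ratings)

overall = score_performance(rubric, {"coherence": 5, "fluency": 3})
print(f"overall: {overall:.1f}")
```

Because each level carries a descriptor, the learner receives far more detail than a letter grade or numerical score alone, which is the benefit noted above.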

3.5.3. Rubric generator

3.5.3.1. RubiStar Home

3.5.3.2. Rubric Maker - Create custom assessments

3.5.3.3. Quick Rubric :)