OT 530 MIDTERM STUDY GUIDE
Screening - Answers - -first part of evaluation
-initial identification of strengths and weaknesses and need for further evaluation
-not sufficient for diagnostic and intervention planning
-usually not more than 15 minutes
Evaluation - Answers - -comprehensive process of obtaining and interpreting data
necessary to understand the person, system, or situation
-interpretation leads to an in-depth understanding of occupational performance
Assessment - Answers - -a specific tool or systematic interaction (observation, interview
protocol) used to collect data
-component of evaluation
-will use a variety of assessments in evaluation
Reevaluation - Answers - -process of critical analysis of client response to intervention
-it enables the therapist to make any necessary changes to the intervention plan
-see if there is a change in occupational performance
Top-Down Approach - Answers - -functional
-occupation based
-contextual
-more client-centered because it is based on the client's needs
Bottom-Up Approach - Answers - -focuses on underlying components (body functions)
Reliability - Answers - -degree of consistency between scores
Interrater Reliability - Answers - -degree of agreement between 2 raters following
observation of the same subject
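A simple way to quantify interrater agreement is percent agreement, often supplemented with Cohen's kappa to correct for chance agreement. A minimal sketch with hypothetical pass/fail ratings from two raters (the data are made up for illustration):

```python
# Percent agreement and Cohen's kappa for two raters scoring the
# same 10 observations (hypothetical 1 = pass, 0 = fail ratings).
from collections import Counter

rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]

n = len(rater_a)

# Observed agreement: proportion of items both raters scored identically.
p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Expected chance agreement, from each rater's marginal rates per category.
p_e = sum((Counter(rater_a)[c] / n) * (Counter(rater_b)[c] / n)
          for c in set(rater_a) | set(rater_b))

# Kappa: agreement beyond chance, scaled by the maximum possible beyond chance.
kappa = (p_o - p_e) / (1 - p_e)
print(round(p_o, 2), round(kappa, 2))
```

Note that kappa is lower than raw agreement because some agreement is expected by chance alone.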
Intrarater Reliability - Answers - -evaluator consistently administers and scores an
assessment the same way each time and gets similar or same results
Standard Error of Measure - Answers - -estimates how much an observed score may
differ from the true score due to measurement error (an observed score should be
treated as an estimate, since it is not always the person's "true score")
-example: a child scores 72 on an IQ test, but the manual states the test has a
SEM of 6; the child's actual IQ based on this test would fall somewhere between
66 and 78 (observed score ± 1 SEM)
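The SEM arithmetic from the card above can be sketched directly (same numbers: observed score 72, SEM of 6; ± 1 SEM corresponds to roughly a 68% confidence band if error is normally distributed):

```python
# SEM band: the true score is estimated to lie within +/- 1 SEM
# of the observed score.
observed = 72
sem = 6
low, high = observed - sem, observed + sem
print(f"true score likely between {low} and {high}")
```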
Test/Retest Reliability - Answers - -measure of test score stability on the same version
of the assessment over 2 occasions
Internal Consistency - Answers - -degree of agreement or commonality between items
in an assessment that measures a single concept or skill
Split Half Reliability - Answers - -split the test items into two halves and
correlate scores on the halves; can indicate whether the assessment could be shortened
Validity - Answers - -degree to which a test accurately measures the specific construct,
trait, behavior, or performance it was designed to measure
Face Validity - Answers - -items appear related to the purpose of the assessment
-not based on statistical proof
-some tests may seem non-purposeful to patients
Content Validity - Answers - -how well an assessment represents all aspects of the
phenomenon being evaluated
-look at specific components of the test
-there is evidence to back it up
Concurrent Validity (Criterion-Related) - Answers - -compares a new assessment with
one that is considered the "gold standard"
Predictive Validity (Criterion-Related) - Answers - -extent to which scores on an
assessment forecast future behavior
-example: going from facility to home to predict safety
Ecological Validity - Answers - -degree to which the test mimics and measures similar
activities that can be observed in a typical, daily living context
Pie (P) - Answers - -evaluator and test taker
-rater bias:
-low intrarater/interrater reliability
-coaching the client
-severity/leniency: tendency to use higher or lower ratings
-central tendency: tendency to evaluate as "average" when applying a rating scale
-halo effect: evaluator's general impression of the test taker affects accurate
rating of performance
-test taker bias: physical status (discomfort, pain, anxiety, fatigue), motivation,
previous testing, regional or language barriers, lack of the patient's devices or aids
Item (I) - Answers - -when people of similar abilities perform differently on a given
assessment because of age, gender, ethnicity, cultural, or socioeconomic differences
-item bias also includes asking the client to do something they never learned to
do, using the item incorrectly, or the item not working correctly