Various Types of Validity and Reliability
by Chariel Dye
1. Validity
1.1. Content- the extent to which the content of the test matches instructional objectives
1.2. Criterion- the extent to which scores on the test agree with (concurrent validity) or predict (predictive validity) an external criterion; this is reported as a correlation coefficient (see the sketch after the outline)
1.3. Construct- the extent to which an assessment corresponds to other variables, as predicted by some rationale or theory
2. Reliability
2.1. Stability or Test-Retest- Give the same assessment twice, separated by days, weeks, or months. Reliability is stated as the correlation between scores at Time 1 and Time 2
2.2. Alternate Form- Create two forms of the same test (items should vary slightly). Reliability is stated as the correlation between scores on the two forms
2.3. Internal Consistency (Alpha, α)- Compare one half of the test to the other half, or use a method such as Kuder-Richardson Formula 20 (KR-20) or Cronbach's Alpha (see the alpha sketch after the outline)
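
Criterion validity (1.2), test-retest reliability (2.1), and alternate-form reliability (2.2) are all reported as correlation coefficients. A minimal sketch in Python, assuming hypothetical score arrays and that NumPy and SciPy are available:

import numpy as np
from scipy.stats import pearsonr

# Hypothetical data: scores on the new test and on an external criterion
# measure collected at the same time (concurrent validity).
test_scores = np.array([72, 85, 90, 65, 78, 88, 95, 70])
criterion_scores = np.array([70, 82, 91, 60, 75, 85, 97, 68])

# The validity coefficient is simply the Pearson correlation.
r, p = pearsonr(test_scores, criterion_scores)
print(f"criterion validity coefficient r = {r:.2f} (p = {p:.3f})")

The same call gives a test-retest coefficient when the two arrays hold Time 1 and Time 2 scores, or an alternate-form coefficient when they hold scores on Form 1 and Form 2.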
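
For internal consistency (2.3), Cronbach's Alpha can be computed as alpha = (k / (k - 1)) * (1 - (sum of item variances) / (variance of total scores)), where k is the number of items; KR-20 is the special case for items scored 0/1. A minimal sketch, using a hypothetical helper and made-up item responses (one row per examinee, one column per item):

import numpy as np

def cronbach_alpha(item_scores):
    # item_scores: one row per examinee, one column per item
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]                              # number of items
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses from six examinees to four right/wrong (0/1) items;
# for dichotomous items like these, Alpha reduces to KR-20.
responses = [[1, 1, 1, 0],
             [1, 0, 1, 1],
             [0, 0, 1, 0],
             [1, 1, 1, 1],
             [0, 1, 0, 0],
             [1, 1, 0, 1]]
print(f"alpha = {cronbach_alpha(responses):.2f}")

Values closer to 1 indicate that the items hang together consistently as a measure of the same thing.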