Title: Construct Validity in Psychological Measurement

Authors: Carmen Wilson, Bill Cerbin, Melanie Cary, Rob Dixon, University of Wisconsin-La Crosse

Contact: Carmen Wilson, [email protected]

Discipline or Field: Psychology

Course Name: Psychological Measurement

Date: January 15, 2007


Course Description. The class examines the principles and procedures of psychological measurement. It is a required course for psychology majors, who take it in their junior or senior year. Graduate students in school psychology also take the class. The class meets twice a week for 85 minutes per session. There were 44 students enrolled when the lesson was taught on October 24, 2005. The lesson came at the end of a unit on validity.


Executive Summary

The goal of the lesson is to develop students' understanding of construct validity, as measured by their ability to (1) explain the methods used to determine construct validity for psychological measures and (2) design a study to determine the construct validity of a given measure.

Prior to the lesson. In the two class days prior to the lesson, the instructor presented information on content, criterion, and construct validity. Each type of validity was presented in terms of the question it answers and how it might be assessed. Content validity answers the question, "Do the items represent the domain of interest?" It can be assessed by having an expert in the topic review the test. Criterion validity answers the question, "Do scores on the test predict some non-test behavior?" It can be assessed by correlating scores on the test with some other measure of the behavior (e.g., behavioral observation). Construct validity answers the question, "Does the test measure what it claims to measure?" The lecture highlighted several processes for assessing construct validity. The answer to the construct validity question depends on what is known about the construct being measured. For example, if the theory about the construct suggests that two groups of people should have different levels of the construct, and the test actually measures the construct, then the groups' scores should differ.
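To make these assessment procedures concrete, the brief sketch below illustrates the two statistical checks described above: correlating test scores with a non-test behavior (criterion validity) and comparing known groups (one common approach to construct validity). It is a hypothetical illustration only; the data, variable names, and group labels are invented and are not part of the lesson materials.

```python
# Hypothetical illustration: checking criterion and construct validity
# for a 5-item depression measure. All data and names are invented.
import numpy as np
from scipy import stats

# Total scores on the new 5-item depression measure for 10 respondents
test_scores = np.array([4, 7, 12, 15, 9, 3, 18, 11, 6, 14])

# Criterion validity: correlate test scores with a non-test measure of
# the behavior, e.g., clinician-rated symptom counts for the same people.
clinician_ratings = np.array([2, 5, 9, 13, 7, 1, 16, 8, 4, 12])
r, p = stats.pearsonr(test_scores, clinician_ratings)
print(f"Criterion validity (Pearson r): r = {r:.2f}, p = {p:.3f}")

# Construct validity (known-groups method): if theory says a clinical
# group should score higher than a comparison group, and the test truly
# measures the construct, the groups' mean scores should differ.
clinical_group = np.array([14, 17, 12, 19, 15])
comparison_group = np.array([5, 8, 4, 7, 6])
t, p = stats.ttest_ind(clinical_group, comparison_group)
print(f"Known-groups comparison: t = {t:.2f}, p = {p:.3f}")
```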

We evaluated three versions of the lesson across three semesters. In Version 1 (lesson, no lecture - A), students developed a 5-item measure of depression and then designed three research studies to evaluate the validity of their measure prior to receiving any instruction about construct validity. In Version 2 (lesson, no lecture - B), we made minor modifications, but the lesson remained essentially the same. In Version 3 (lesson after lecture), we made significant modifications: the instructor lectured about construct validity first, and in a subsequent class, students analyzed three validity studies and then designed a validity study based on information provided by the instructor.

In Versions 1 and 2, students became bogged down in the details of their proposed research studies and missed the more important goal of predicting results that would support the validity of their measure. The team therefore restructured the lesson so that in Version 3 students first heard a lecture, then read summaries of real validity studies and predicted the results of those studies, assuming the tests were valid. In the last part of the lesson, students designed a study to determine whether a given test was valid and predicted the results of that study. Interestingly, students who participated in Version 1 of the lesson (lesson, no lecture - A) generally performed better than students who participated in Versions 2 and 3.


Printer Friendly Version of Complete Report

Construct Validity Lesson Complete Report

The Lesson

Construct Validity Lesson Plan

Individual Worksheet

Group Worksheet

Lesson Plan Versions 1 & 2

Design Validity Study Exercise

Convergent Divergent Scales

The Study

Study of the Construct Validity Lesson

Student Perceptions of the Lesson

Observation Guidelines

Think Aloud Problem

Observation Guidelines Versions 1 & 2


Copyright UW System

This electronic portfolio was created using the KEEP Toolkit™, developed at the
Knowledge Media Lab of The Carnegie Foundation for the Advancement of Teaching.