A new study examines the accuracy of tools used to measure teacher professional development.
In professional development studies, teachers are typically asked to evaluate the change in their own knowledge or practices (self-reported learning). Many studies, however, also test that change with direct assessments. For this study, researchers examined the relationship between self-reported learning and direct assessments.
“Each year, millions of dollars and a great deal of time are spent on professional development, with the assumption that PD will lead to improvements in teaching and student learning,” said Yasemin Copur-Gencturk, an assistant professor of education at USC and the study’s principal investigator. “Yet not knowing what makes PD effective limits the efficient use of funds and teachers’ time.”
Another reason for the study, Copur-Gencturk said, is that many conversations around professional development may make claims about effectiveness that can’t really be supported.
Self-learning vs. assessment
Copur-Gencturk’s new study collected data from 545 teachers who participated in content-focused professional development programs. The programs were provided by a professional development organization supported by a Mathematics and Science Partnership grant from the U.S. Department of Education.
Teachers completed an assessment before and after their programs. On average, the results showed a moderate and statistically significant increase in teachers’ mathematical knowledge for teaching (MKT).
Participants also completed a survey provided by the researchers to self-report their learning. The survey asked teachers to rate how much their understanding of mathematical concepts had deepened, how much their understanding of how children think about and learn mathematics had increased, and how much their attention to children’s thinking and learning when planning mathematics lessons had increased.
Comparing the results, however, revealed that the correlation between teachers’ gain scores on the direct assessment and their self-reported gains was almost zero. For example, teachers who self-reported that they had learned more did not see that growth reflected in the direct assessment.
Such a result suggests that teachers’ self-reports and direct assessments captured different underlying constructs; in other words, the two instruments were not measuring the same thing. Thus, a program identified as effective based on teachers’ self-reports might not be considered effective if the outcome measure were a direct assessment of teachers’ learning.
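The finding above comes down to correlating two gain measures per teacher: an assessed gain (post-test minus pre-test) and a self-reported rating. As a minimal illustration with made-up numbers (not the study’s data, and not its actual analysis code), a Pearson correlation between the two measures can be computed like this:

```python
# Illustrative sketch only: synthetic values, not the study's data.
# Each teacher has a direct-assessment gain and a self-reported
# learning rating; a Pearson r near zero means the two measures
# vary independently of each other.
from statistics import mean, stdev

assessed_gain = [0.4, -0.1, 0.9, 0.2, 0.5, 0.0, 0.7, 0.3]  # hypothetical MKT gains
self_reported = [3, 5, 2, 4, 5, 4, 3, 5]                    # hypothetical 1-5 ratings

def pearson_r(xs, ys):
    """Sample Pearson correlation between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

print(pearson_r(assessed_gain, self_reported))
```

A value close to zero, as the researchers report for the real data, indicates that knowing a teacher’s self-reported gain tells you essentially nothing about their assessed gain.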
That’s a problem because many of the large-scale studies of PD effectiveness rely on teacher self-reports, suggesting teacher educators and researchers need to use caution when studying PD efficacy.
Better PD approaches
Those results weren’t surprising to Copur-Gencturk, who has long studied how well assessments align with what they purport to track. But she was surprised in other ways: teachers who felt they had learned more also reported more frequently using, in their own teaching, strategies similar to those used by the PD facilitators. Yet the direct assessment did not detect any such differences in teachers’ learning.
“The teachers felt they had learned more when they’d already been using strategies similar to those used in the PD programs in their teaching more frequently,” she said. “But the more frequent use of teaching strategies that were not used by the PD facilitators was linked to less learning, as measured by the direct assessment.”
Copur-Gencturk urges administrators not to see the study as reason to be skeptical of professional development writ large, but rather to be methodical in selecting a good program. She urged administrators to find PD in the area of teachers’ need that is supported by empirical research showing the program can improve teachers’ knowledge and skills, as captured by valid, reliable and meaningful measures.
Said Copur-Gencturk: “The devil is in the details.”
Read the study: (Journal of Teacher Education)