Using five critical levels of evaluation, you can improve your school's professional development program. But be sure to start with the desired result—improved student outcomes.
Educators have long considered professional development to be their right—something they deserve as dedicated and hardworking individuals. But legislators and policymakers have recently begun to question that right. As education budgets grow tight, they look at what schools spend on professional development and want to know, Does the investment yield tangible payoffs or could that money be spent in better ways? Such questions make effective evaluation of professional development programs more important than ever.
Traditionally, educators haven't paid much attention to evaluating their professional development efforts. Many consider evaluation a costly, time-consuming process that diverts attention from more important activities such as planning, implementation, and follow-up. Others feel they lack the skill and expertise to become involved in rigorous evaluations; as a result, they either neglect evaluation issues completely or leave them to “evaluation experts.”
Good evaluations don't have to be complicated. They simply require thoughtful planning, the ability to ask good questions, and a basic understanding of how to find valid answers. What's more, they can provide meaningful information that you can use to make thoughtful, responsible decisions about professional development processes and effects.
What Is Evaluation?
In simplest terms, evaluation is “the systematic investigation of merit or worth” (Joint Committee on Standards for Educational Evaluation, 1994, p. 3). Systematic implies a focused, thoughtful, and intentional process. We conduct evaluations for clear reasons and with explicit intent. Investigation refers to the collection and analysis of pertinent information through appropriate methods and techniques. Merit or worth denotes appraisal and judgment. We use evaluations to determine the value of something—to help answer such questions as, Is this program or activity achieving its intended results? Is it better than what was done in the past? Is it better than another, competing activity? Is it worth the costs?
Some educators understand the importance of evaluation for event-driven professional development activities, such as workshops and seminars, but forget the wide range of less formal, ongoing, job-embedded professional development activities—study groups, action research, collaborative planning, curriculum development, structured observations, peer coaching, mentoring, and so on. But regardless of its form, professional development should be a purposeful endeavor. Through evaluation, you can determine whether these activities are achieving their purposes.
Critical Levels of Professional Development Evaluation
Effective professional development evaluations require the collection and analysis of the five critical levels of information shown in Figure 1 (Guskey, 2000a). With each succeeding level, the process of gathering evaluation information gets a bit more complex. And because each level builds on those that come before, success at one level is usually necessary for success at higher levels.
Figure 1. Five Levels of Professional Development Evaluation

| Evaluation Level | What Questions Are Addressed? | How Will Information Be Gathered? | What Is Measured or Assessed? | How Will Information Be Used? |
|---|---|---|---|---|
| 1. Participants' Reactions | Did they like it? Was their time well spent? Did the material make sense? Will it be useful? Was the leader knowledgeable and helpful? Were the refreshments fresh and tasty? Was the room the right temperature? Were the chairs comfortable? | Questionnaires administered at the end of the session | Initial satisfaction with the experience | To improve program design and delivery |
| 2. Participants' Learning | Did participants acquire the intended knowledge and skills? | Paper-and-pencil instruments; simulations; demonstrations; participant reflections (oral and/or written); participant portfolios | New knowledge and skills of participants | To improve program content, format, and organization |
| 3. Organization Support & Change | Was implementation advocated, facilitated, and supported? Was the support public and overt? Were problems addressed quickly and efficiently? Were sufficient resources made available? Were successes recognized and shared? What was the impact on the organization? Did it affect the organization's climate and procedures? | District and school records; minutes from follow-up meetings; questionnaires; structured interviews with participants and district or school administrators; participant portfolios | The organization's advocacy, support, accommodation, facilitation, and recognition | To document and improve organization support; to inform future change efforts |
| 4. Participants' Use of New Knowledge and Skills | Did participants effectively apply the new knowledge and skills? | Questionnaires; structured interviews with participants and their supervisors; participant reflections (oral and/or written); participant portfolios; direct observations; video or audio tapes | Degree and quality of implementation | To document and improve the implementation of program content |
| 5. Student Learning Outcomes | What was the impact on students? Did it affect student performance or achievement? Did it influence students' physical or emotional well-being? Are students more confident as learners? Is student attendance improving? Are dropouts decreasing? | Student records; school records; questionnaires; structured interviews with students, parents, teachers, and/or administrators; participant portfolios | Student learning outcomes | To focus and improve all aspects of program design, implementation, and follow-up; to demonstrate the overall impact of professional development |