Professional Development Evaluations

This was written for a recent train-the-trainers workshop I led for educational technology professional developers. I thought it might be worth sharing here as well. I hope some of you find it helpful – and I’d love to hear your thoughts on the topic in the comments.

Participants’ evaluations of a professional development session are an
important way to collect feedback regarding what went well (that you
can build on for the future) and what went poorly (that you can do
differently in the future). It is also an opportunity for participants
to request additional follow up. Be sure to include an evaluation after
each professional development session you lead.

If participants will be sitting in front of a computer, you can offer the evaluation online using a service such as SurveyMonkey. (Effective surveys can also be created using forms on Google Docs, but dedicated services like SurveyMonkey still provide more survey-specific features and analysis.) Even if participants are not in front of a computer, the value of having evaluation data in an easy-to-use electronic format might make it worthwhile to email participants a link to the survey to complete at their convenience – rather than handing out paper forms at the end of an event.

Evaluations are often best kept to a few very specific and very short questions using a Likert-type scale. Ten is probably the maximum number of questions you should expect participants to answer. Be sure that each question asks only one thing (so that participant responses are not ambiguous) and that each question will provide data you will actually use. Otherwise, don’t waste participants’ time with the question. Ideally, of course, you should ask questions that focus on the things that are important for you to get right – or that you are working on. A glance at the questions in the sample evaluation below will reveal something about what I value and what I think makes good professional development.

Also, always be sure to leave an opportunity for open-ended feedback at the end of the evaluation. The chance to express themselves in their own words is the best opportunity for participants to share suggestions for improvement – or praise for what you did well. I always find the open-ended responses the most valuable at the end of the day… both for making changes in the future and for feeling good about a job well done. These responses will also often shed light on otherwise confusing responses to the closed (multiple choice) questions.

Finally, give participants the opportunity to leave you their contact information (particularly their email address). This can be invaluable for following up – and makes it possible to contact them about future professional development opportunities. It is best, however, if leaving contact information is optional, so that participants can still complete the survey anonymously, which will make it more likely you will receive honest (and helpful) feedback.

The sample evaluation (and results) below are representative of the
online evaluations I use for all workshops that I lead or arrange.

Sample Evaluation

Sample Results (based on what is entered into the sample evaluation above)

I’d love to hear your comments or feedback on this sample evaluation (or on this post in general). How do you handle evaluations for workshops you’ve led? What have been some of your challenges, and how did you overcome them (if you did)? Or… what have you seen as a participant that you’ve appreciated or disliked? In other words, what makes a good professional development evaluation? I look forward to learning from what you might share.

5 Replies to “Professional Development Evaluations”

  1. I really like the work done by Thomas Guskey at the University of Kentucky – it’s a must-read for anyone interested in evaluating professional development efforts. It’s based on a five-part approach:

    1. Assess the workshop demographics (did you like the room, was the coffee hot, etc.)
    2. Assess what teachers learned.
    3. Assess the organization’s readiness to support what the professional learning experience was about and what the teachers learned. (Done later, after the workshop, this is important in assessing how well the organization can support and sustain the key elements of the learning.)
    4. Assess what teachers have changed about their practice. Did the professional learning experience result in teachers using what they learned in #2? In other words, did it get to the kids? In my opinion, high-quality PD results in a change in teacher practice and an increase in student learning.
    5. Assess learning. (How to measure it specifically is the key.)

    Guskey does a much better job in explaining it…

    Thanks for the post.

  2. I think around 5–7 questions is appropriate (and is what participants would be willing to answer after training time). Anonymous and online is the key to a successful assessment. Sorry David, but I don’t want to know if they liked the coffee or the room. I want to know if they learned anything and if they are able to transfer that knowledge to their real world very soon. KWL charts sometimes help me to help my teachers with this process. I do think that feedback is just as important as the training. Good work, as usual, Mark!

  3. I might also like to see questions specific to content that are somewhat reflective of the learning state of the participant as a result of the workshop. For example:

    1. Assess your learning state prior to this workshop for (enter content here).
    o Null o Knowledge o Understanding o Performance o Accomplished

    2. Assess your learning state following this workshop for (enter content here).
    o Null o Knowledge o Understanding o Performance o Accomplished


  4. Thanks for the comment, David. I noted your mention of Guskey during your keynote in Monterey and meant to look into it… I’m thankful for the reminder. Here’s what looks like a useful link if anyone else is interested:

    Cindy, I’m sympathetic to most of the points you make. I’d like my evaluations to be shorter… and you’re right, I don’t assess their actual learning much. Jeff actually offers a pragmatic solution to that problem (in the comment after yours).

    Thanks, Jeff. I might actually be able to implement a “fill in the blank” question similar to what you recommend. This is a very pragmatic solution to assessing specific content in an evaluation format that I’ll likely use over and over.

  5. We always “throw out” the best evaluation and the worst evaluation when we aggregate our data. Although I still like to look at the outliers’ comments, trimming them gives you better data to make decisions with.

    I also like to use these open-ended questions:

    What is the most valuable lesson you learned today?
    What is the biggest question in your mind now?
    What training would assist you in the future?
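    The “throw out the best and worst” approach described above amounts to a trimmed mean. A minimal sketch in Python (the function name and sample ratings are my own, for illustration):

    ```python
    def trimmed_mean(scores):
        """Average the scores after dropping the single highest and lowest.

        With fewer than three scores there is nothing sensible to trim,
        so fall back to a plain average.
        """
        if len(scores) < 3:
            return sum(scores) / len(scores)
        trimmed = sorted(scores)[1:-1]  # drop lowest and highest
        return sum(trimmed) / len(trimmed)

    # Hypothetical overall ratings from six evaluations (1-5 scale).
    ratings = [5, 4, 4, 1, 5, 3]
    print(trimmed_mean(ratings))  # drops the 1 and one 5 -> 4.0
    ```

    Dropping only one evaluation from each end keeps the summary robust to a single outlier without discarding much real signal.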

Comments are closed.