This was written for a recent train-the-trainers workshop I led for educational technology professional developers. I thought it might be worth sharing here as well. I hope some of you find this helpful – and I’d love to hear your thoughts on the topic in the comments.
Participants’ evaluations of a professional development session are an
important way to collect feedback about what went well (so you can
build on it in the future) and what went poorly (so you can do it
differently next time). An evaluation is also an opportunity for
participants to request additional follow-up. Be sure to include an
evaluation after each professional development session you lead.
If participants will be sitting in front of a computer, you can offer the evaluation online using a service such as surveymonkey.com. (Effective surveys can also be created using forms in Google Docs,
but a dedicated service like SurveyMonkey still provides more
survey-specific features and analysis.) Even if participants are not in
front of a computer, the value of having evaluation data in an
easy-to-use electronic format might make it worthwhile to email
participants a link to the survey to complete at their convenience –
rather than handing out paper forms at the end of an event.
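If you do collect responses electronically, most survey tools (including SurveyMonkey and Google Forms) will let you export them as a spreadsheet or CSV file. As a rough illustration of why that format is so handy, here is a minimal sketch in Python that computes the average rating for each Likert-style question in an exported CSV – the file name and question wording below are hypothetical placeholders, not part of any particular tool:

```python
# A rough sketch: summarize Likert-style ratings from a survey tool's CSV export.
# The file name and question/column names are hypothetical placeholders;
# ratings are assumed to be coded 1 (strongly disagree) through 5 (strongly agree).
import csv
from statistics import mean

with open("workshop_evaluation.csv", newline="") as f:
    responses = list(csv.DictReader(f))

questions = [
    "The pacing of the workshop was appropriate",
    "The content was relevant to my work",
    "I can apply what I learned",
]

for question in questions:
    ratings = [int(r[question]) for r in responses if r.get(question, "").strip()]
    if ratings:
        print(f"{question}: average {mean(ratings):.2f} (n={len(ratings)})")
```

Even a quick summary like this can be dropped into a follow-up email or planning document – something that is much harder to do with a stack of paper forms.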
Evaluations are probably best kept to a few very specific and very short
questions using a Likert-type scale. Ten is about the maximum number of
questions you should expect participants to answer. Be sure that each
question asks only one thing (so that participant responses are not
ambiguous) and that each question will provide data you will actually
use – otherwise, don’t waste participants’ time with it. Ideally, of
course, you should ask questions that focus on the things that are
important for you to get right – or that you are working on. A glance at
the questions in the sample evaluation below will reveal something about
what I value and what I think makes good professional development.
Also, always be sure to leave an opportunity for open-ended feedback at
the end of the evaluation. The chance to respond in their own words is
the best way for participants to share suggestions for improvement – or
praise for what you did well. I always find the open-ended responses the
most valuable at the end of the day… both for making changes in the
future and for feeling good about a job well done. These responses will
also often shed light on otherwise confusing responses to the closed
(multiple-choice) questions.
Finally, give participants the opportunity to leave you their contact
information (particularly their email address). This can be invaluable
for following up – and it makes it possible for you to contact them
about future professional development opportunities. It is best,
however, if leaving contact information is optional, so that
participants can still complete the survey anonymously – which makes it
more likely you will receive honest (and helpful) feedback.
The sample evaluation (and results) below are representative of the
online evaluations I use for all workshops that I lead or arrange.
Sample Results (based on what is entered into the sample evaluation above)
I’d love to hear your comments or feedback on this sample evaluation (or on this post in general). How do you handle evaluations for workshops you’ve led? What have been some of your challenges, and how did you overcome them (if you did)? Or… what have you seen as a participant that you’ve appreciated or disliked? In other words, what makes a good professional development evaluation? I look forward to learning from what you might share.