Dr Delwyn Goodrick is a psychologist who undertakes program evaluation and social research with a wide range of government and private sector clients. She is an experienced trainer, having conducted specialist workshops in research and program evaluation in Australia, Singapore, the US, the UK, Canada and New Zealand. Most of Delwyn's work relates to health and education contexts. She maintains her own private consultancy practice and is currently based in New Zealand.
The course is structured to take participants through the stages of an evaluation, from conceptualisation through to negotiating the design, data collection, synthesis and reporting. You will be introduced to key theories and approaches to evaluation and will then have the opportunity to apply them to real examples of evaluations. Examples from published studies and from the facilitator's applied evaluation practice will be drawn on to illustrate and reinforce key concepts.
This course is designed for public sector workers and academics who are interested in commissioning, managing or conducting evaluations of public policy or programs.
Day 1
- Program Evaluation: What is it? How does it differ from social research?
- The role of valuing in evaluation.
- Key evaluation types and approaches.
- The roles of internal and external evaluation.
- Key evaluation design considerations.
- Fostering the use and influence of evaluations.
- Using evaluation models to inform practice – social programming, valuing, knowledge construction, use and influence, and practice.
Day 2
- Planning Evaluation: Working with Key Stakeholders.
- The role of program logic and program theory in evaluation: What is program logic and when is it helpful? How to facilitate a program logic mapping session.
- Program theory for complicated and complex social programs.
- From initiation to final evaluation brief (or plan).
Day 3
- Designing an evaluation plan: Key components of an evaluation plan.
- The importance of identifying key audience information needs, and evaluation scope.
- Budgeting for an evaluation.
- Defining a limited set of key evaluation questions (KEQs).
- Managing an evaluation: some principles and tips for practice.
- Designs relevant for evaluation of emergent and/or multifaceted public sector programs (complex and complicated programs).
- Types of evaluation: process and outcome evaluation approaches.
- The emerging role of developmental evaluation.
- The roles of experimental, quasi-experimental, non-experimental and qualitative designs, including case studies.
Day 4
- Data collection and analysis: major methods of data collection.
- Topics to be reviewed will include interviews, Most Significant Change technique (MSC), photographs and other visual methods, focus groups, nominal group technique and Delphi technique, Q sort, surveys, and goal attainment scaling.
- The role of mixed methods in evaluation – strategies to integrate qualitative and quantitative methods.
- Analysis principles and approaches will be emphasised on Days 4 and 5 in relation to the major methods addressed.
Day 5
- Reporting an evaluation and supporting use and influence.
- The 1, 5, 25 rule.
- Considerations in generating valid claims.
- Issues of rigour and relevance.
- The role of program evaluation standards and core competencies in evaluation.
- Disseminating evaluation findings and influencing the use of evaluation.
- Practical tips to enhance the value of evaluation.
- Building a culture of evaluative thinking into public sector agencies – the role of monitoring and evaluation frameworks.
This course is run in a classroom. No equipment is required.
This course assumes a basic understanding of methods of data collection, including surveys, focus groups and other group methods, interviews, document analysis, and data retrieval for secondary data.
- Patton, M.Q. (2008). Utilization-focused evaluation (4th ed). Thousand Oaks, CA: Sage.
- Torres, R.T., Preskill, H., and Piontek, M.E. (2005). Evaluation strategies for communicating and reporting: Enhancing learning in organizations (2nd ed). Thousand Oaks, CA: Sage.
Q: Are there any prerequisites for this course?
A: You should have a basic understanding of data collection methods – see the recommended background above.
Highly relevant to clients' needs; a lot of the research we conduct informs service improvement and evaluation of client programs. (Summer 2019)
It's been really useful to learn about the profession more generally, but also specifically about logic maps and how these relate to an evaluation framework. It has been invaluable to my job, and Del was very knowledgeable. (Summer 2019)
A great opportunity to step back from day-to-day work and pick up tips, checklists and better processes for designing evaluations. (Summer 2017)
Delwyn is a dynamic presenter with a wealth of subject knowledge. It was really useful that Delwyn was able to illustrate the topics we discussed with examples from her past work. Really enjoyed the course and learnt a lot about evaluation processes. (Summer 2017)
Good content, engaging discussion, useful activities. (Summer 2014)
1) Strong emphasis on the professional dilemmas I encounter, and not simplistic. 2) Clarification of some issues I'm having with these methodologies. (Summer 2014)
This has given me some clear theories for designing my programs and for understanding and considering evaluations; it has given me great insight into knowledge paradigms and production. (Summer 2014)
The instructor's bound, book-length course notes will serve as the course texts.