There are 47 tools and processes used in Project Management. Which ones are best suited to your evaluation practice? Join us this April for a workshop that will expand on four key project management tools you can use to advance and hone your evaluation practice.
The sheer number of tools and processes in the Project Management Body of Knowledge (PMBOK® Guide) can make it challenging for evaluators to know which tools to apply, and when. This workshop provides theory and hands-on experience applying four PMBOK®-derived, light-touch project management tools that have been tailored for evaluation projects. All tools can be applied using the standard Microsoft Office suite of applications.
Participants will work with two case studies that illustrate project management in practice, with time to share their own learnings and experiences. The workshop concludes with a facilitated session using a modified version of the PMBOK® risk management processes to help participants identify, and begin to develop mitigation strategies for, the risks most likely to affect evaluation projects. The course will enhance participants' knowledge and application of project management tools and processes that have been tailored for evaluators, with a focus on:
// Understanding the project management mindset of “define then deliver”
// Hands-on experience applying light-touch project management tools and processes; specifically:
- Work breakdown structure
- Deliverable-based Gantt chart schedule
- Deliverable-based budget
- Deliverable-based project tracking and reporting
// Sharing experience and knowledge on the topic of common risks for evaluation projects
This course presupposes a basic understanding of evaluation theory and practice, and is best suited for those with a firm grounding in evaluation concepts.
The Credentialed Evaluator competencies addressed through this workshop include:
Management Practice: skills and tools to manage a project/evaluation, such as work plans, coordination of accountabilities, and management of resources (including budget)
Technical Practice: inclusion of core technical elements of evaluation, such as program theory and evaluation design, in the PM tools as frameworks for technical practice
Interpersonal Practice: attention to processes and tools that promote stakeholder involvement, inclusion and discussion
This course is organized by the CES Ontario Chapter and is applicable towards your application to, and maintenance of, the Credentialed Evaluator (CE) designation.
This workshop is also eligible for 5.5 contact/training hours towards the Project Management Professional (PMP) certification. Attendees are encouraged to bring their laptops for interactive exercises.
About the Facilitator
Alison Paprica, PhD, PMP, is the Principal and founder of RPM, a boutique consulting firm that provides strategic, light-touch project management services and training. Through RPM, Alison has provided workshops to build project management capacity in diverse audiences including evaluators, graduate students, and principal investigators leading large research grants. As adjunct faculty at the Institute of Health Policy, Management and Evaluation at the University of Toronto, Alison developed and delivers a graduate course on Strategic Project Management for Research.
Register before March 8 for a 10% early bird discount!
This 4-day course was developed by the Canadian Evaluation Society to provide participants with a basic understanding of the main models and practices associated with the profession of evaluation. Prior knowledge of evaluation and of social science research methods is not required. There are no examinations associated with the course, nor is there any homework. Participation in all 15 course modules entitles you to a Certificate of Completion from the Canadian Evaluation Society.
Participants will gain basic levels of knowledge, skill and appreciation with respect to the essential elements of evaluation, equipping them to enter and participate in the evaluation field and develop as evaluation professionals.
Participants will be able to describe in basic terms:
• the uses and benefits of evaluation (1)
• common settings in which evaluation takes place (1)
• where evaluations and performance monitoring fit in the program cycle (1)
• underlying theories related to evaluation, and their implications (1)
• the historical and technical context in which evaluation takes place (1)
• the range of terminology used in evaluation (1)
• causes of, and responses to, resistance (4)
• potential misuses of evaluation (4)
• cultural considerations in conducting evaluations (4)
• ethical considerations with respect to evaluation (6)
• evaluation standards (6)
• evaluation as a profession (9)
• the role of CES, including the CE system (9)
• special applications of evaluation (15)
• the relevance of, and processes associated with:
o identifying evaluation objectives (2)
o identifying and engaging evaluation clients and stakeholders, and establishing roles (2)
o developing a program profile (3)
o creating evaluation questions (5) and indicators (7)
o common data collection methods (8, 10)
o creating an appropriate and valid evaluation design, including the selection of data collection methods (11)
o collecting data, including performance monitoring (12)
o data management (12)
o assessing data quality (12)
o data analysis, including cost effectiveness analysis (12)
o determining evaluation scale and budget (13)
o project management (13)
o synthesis of evidence leading to the development of conclusions and recommendations (14)
o evaluation reporting and dissemination (14)
Participants will also gain elementary experience in:
• creating logic models (3)
• developing indicators (7)
• question construction (5)
• creating an evaluation matrix (11)
• writing findings statements (12)
• creating a reporting plan (14)
About the Facilitator
John Allen has worked in the areas of program evaluation, performance measurement and strategic planning for more than 35 years, serving governments at the national, provincial/state and municipal levels in Canada and the United States, as well as the not-for-profit sector. Before becoming a consultant, John was an Ontario public servant with the Treasury Board Secretariat and the Ministry of Municipal Affairs and Housing. While with Treasury Board, John implemented Ontario's first formal performance measurement and evaluation process, Managing By Results. That system remains the conceptual framework of Ontario's results-based management initiatives to this day.
John works on client projects on his own and in partnership with Power Analysis Inc. of London, Ontario. As well as undertaking project assignments in all his areas of speciality, John provides training on those subjects. He is a vendor of record with the provinces of Ontario, Nova Scotia, Manitoba and British Columbia, the federal government, and Nunavut Territory. He is an instructor for the Canadian Evaluation Society, the Association of Municipal Managers, Clerks and Treasurers of Ontario, and the Ontario Municipal Social Services Association.
John is an instructor in public sector management topics with the University of Alberta’s School of Business, the Sprott School of Business at Carleton University and the Schulich School of Business at York University.