IMPACT Summer Institute

Evaluating Together: Expanding Thinking and Strengthening Practice


The IMPACT Summer Institute was a unique opportunity for participants who completed courses as part of the IMPACT Program to delve deeper into the monitoring and evaluation topics most relevant to their work, and to connect their monitoring and evaluation practice to broader issues within the non-profit sector in Ontario and beyond.

The IMPACT Summer Institute took place in Toronto from July 6-8, 2016. The Institute brought together participants from three streams of training held in the Fall of 2015 and the Winter/Spring of 2016 to explore topics identified for further study, ranging from using evaluation for internal learning and change to mastering qualitative and quantitative tools and communicating impact to diverse audiences.

At the Summer Institute, seasoned experts, learners and non-profit leaders networked, shared experiences, deepened collective understanding and situated monitoring and evaluation work within organizational and sectoral learning, practice and impact.

The Institute kicked off on Wednesday afternoon with a plenary session featuring representatives from organizations leading the way in capacity building for non-profit monitoring and evaluation practice in Ontario. The plenary was followed by a catered networking social.

Parallel breakout workshops ran on Thursday and Friday, offering a mix of conceptual and hands-on practice sessions. The Institute closed with a Visualizing Collective Impact exercise in which participants mapped their organizational impact statements onto our Collective Impact Mural.


OCIC’s IMPACT Summer Institute
When: July 6-8, 2016
Where: Metro Hall, 55 John Street, Toronto ON


Parallel Workshop Session Topics and Suggested Background Readings:
Excel Basics for Monitoring & Evaluation

In this workshop participants walked through the data analysis process from start to finish: merging several datasets to prepare for analysis, checking the merged dataset for duplicate entries, running descriptive statistics and calculating frequencies. We concluded with time-saving techniques like using =PROPER() to transform short phrases like ANN EMERY into Ann Emery.
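
For readers who want to see the same workflow outside of Excel, here is a minimal Python/pandas sketch of the steps above. The file names and column names (registrations.csv, attendance.csv, participant_id, region, name) are invented for illustration and were not part of the workshop materials.

import pandas as pd

# Load two hypothetical datasets (file and column names are illustrative only)
registrations = pd.read_csv("registrations.csv")
attendance = pd.read_csv("attendance.csv")

# Merge the datasets on a shared participant ID to prepare for analysis
merged = registrations.merge(attendance, on="participant_id", how="left")

# Check the merged dataset for duplicate entries, then drop them
print("Duplicate rows:", merged.duplicated(subset="participant_id").sum())
merged = merged.drop_duplicates(subset="participant_id")

# Run descriptive statistics and calculate frequencies
print(merged.describe())
print(merged["region"].value_counts())

# The equivalent of Excel's =PROPER(): turn "ANN EMERY" into "Ann Emery"
merged["name"] = merged["name"].str.title()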

Financing Monitoring & Evaluation
Many organizations wonder how to budget adequately for the M&E activities their programs require, and how to make the case to funders for investing in M&E. This workshop focused on ways to integrate M&E activities into program planning so that they are adequately financed.
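
As a rough sketch of what costing M&E into a program plan can look like, the toy calculation below uses entirely invented activities and dollar figures; it simply shows M&E line items rolled up and expressed as a share of an overall program budget.

# Toy M&E budget: all activities and figures are invented for illustration
mne_activities = {
    "Baseline survey": 8000,
    "Mid-term focus groups": 4000,
    "Endline survey and analysis": 10000,
    "Reporting and dissemination": 3000,
}

program_budget = 400_000  # hypothetical total program budget

mne_total = sum(mne_activities.values())
for activity, cost in mne_activities.items():
    print(f"{activity}: ${cost:,}")

# Expressing M&E as a share of the overall budget can support the case to funders
print(f"M&E total: ${mne_total:,} ({mne_total / program_budget:.1%} of program budget)")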

Focusing in on Focus Groups
Focus group discussion is one of the most widely used methods for collecting qualitative data. While focus groups may look simple, they can easily become unwieldy without the right skills and proper steps. In this session participants learned what focus group discussions are, when to use them and when not to, how to conduct them, common do's and don'ts, and how to analyze focus group data.

Informal Data, Real Insight
This workshop helped participants understand how informal data and data collection relate to more formal approaches. We examined when and why informal approaches are beneficial and appropriate, and reviewed a variety of approaches to informal data collection. We also reviewed key data quality, ethical and methodological issues associated with informal data, while allowing for discussion of the needs and capacities of different stakeholders. Finally, we discussed where informal data fits in the evaluation context, and how it can be analysed, triangulated and validated to strengthen evaluation findings.

More than Words: Illustrating Data
In More than Words we focused on practical considerations: designing with M&E stakeholders’ information needs front and centre, using readily available software like Excel, and thinking through a dozen chart types—dot plots, small multiples, heat maps and more—that can be applied to participant datasets. Participants left with a better understanding of the critical thinking and technical skills needed to illustrate their data, and learned through doing.
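
To make one of those chart types concrete, here is a minimal Python/matplotlib sketch of a dot plot; the outcome labels and percentages are invented for illustration and are not participant data.

import matplotlib.pyplot as plt

# Hypothetical M&E results: share of respondents reporting each outcome
outcomes = ["Improved skills", "New partnerships", "Increased funding", "Policy influence"]
shares = [0.72, 0.55, 0.38, 0.21]

# A dot plot keeps category labels readable and avoids the visual weight of bars
fig, ax = plt.subplots(figsize=(6, 3))
ax.scatter(shares, range(len(outcomes)))
ax.set_yticks(range(len(outcomes)))
ax.set_yticklabels(outcomes)
ax.invert_yaxis()  # list outcomes top-to-bottom in the order given
ax.set_xlim(0, 1)
ax.set_xlabel("Share of respondents (illustrative data)")
fig.tight_layout()
plt.show()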

Survey, Survey, Survey
This workshop focused on the quality of survey data. Quality assurance includes all activities aimed at guaranteeing quality, i.e. preventing, reducing or limiting the occurrence of errors in a survey, so that it is done right the first time. Both sampling and non-sampling errors introduce bias into survey data, and the workshop reviewed the different sources of error. Participants were shown good practices in survey design that ensure a rigorous methodology and reliable survey data that can be used to make statistical inferences and provide evidence of the results and changes brought about by interventions.
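
To illustrate the sampling-error side of that picture, here is a minimal Python sketch of a 95% confidence interval for a survey proportion using the normal approximation. The respondent counts are invented, and the key caveat is noted in the comments: a larger sample shrinks sampling error but does nothing for non-sampling errors.

import math

# Hypothetical survey: 384 of 500 respondents report a positive change
n = 500
p = 384 / n

# 95% confidence interval for a proportion (normal approximation).
# This quantifies sampling error only; non-sampling errors (question
# wording, non-response, data-entry mistakes) are not reduced by a larger n.
z = 1.96
margin = z * math.sqrt(p * (1 - p) / n)
print(f"Estimate: {p:.1%} +/- {margin:.1%}")  # about 76.8% +/- 3.7%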

More Information

  • Course materials from previous IMPACT: Building Organizational Capacity in Comprehensive Program Evaluation courses are available in the Public Resources and Community Discussion Forum on our eLearning platform. Log in at learn.ocic.on.ca.