CESBCY 2015 Conference – Collaboration, Contribution, and Collective Impact

Today was the Canadian Evaluation Society BC and Yukon (CESBCY) chapter’s conference. Now, I may be biased given that I was the conference Program Chair, but I think we had an outstanding program of presentations this year! But before you think I’m being too arrogant, I will state for the record that the outstanding program was 100% due to the fantastic presenters – my job as program chair was easy given the incredible proposals we received from evaluators and non-profit organizations from around the region.

This year’s conference theme focused on the non-profit sector: “Collaboration, Contribution, and Collective Impact.” The only complaint I had about the conference was that there were so many good sessions that I couldn’t attend all the ones I wanted to see!

Here are my notes from the sessions that I did attend:

Social Return on Investment (SROI) for Aunt Leah’s Place

-worked with Sametrica – a company from Ontario with a proprietary SROI framework – and students from UBC economics
-Aunt Leah’s Place works with youth transitioning out of foster care, as well as low-income mothers who are struggling to keep their children
-40% of homeless youth have been in government care (foster care is a “pipeline to homelessness”)
-people are in school longer (post-secondary education) and salaries are relatively flat, but housing prices have skyrocketed over a generation
-70% of parents with 19-28 year olds at home provide free rent and groceries (essentially a subsidy for the youth)
-this is something that youth in foster care don’t get – they “age out” of foster care at 19 years
-Aunt Leah supports these youth after 19 years of age (trying to provide what many parents provide for their children in this age range), but does not provide housing
-700 youth “age out” of foster care per year in BC
-Aunt Leah – 10% of youth receiving support were homeless vs. 32% of the control group (not getting Aunt Leah support)
-SROI – estimated a $7 return for every $1 invested (a small study; didn’t include some of their newer programs) – they wanted to do a more robust study and look at their different programs
-they logic modelled their programs and created indicators, then assigned a dollar amount to each indicator
-ROI – typically focuses only on financial returns
-SROI – attempts to be more holistic (including social, environmental, and broader economic perspectives)
-financial proxies – they looked at similar activities and what people were willing to pay for them (e.g., what’s the “value” of building social connections through an activity Aunt Leah’s offers? find out how much people pay to join similar social activities for this age group)
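To make the proxy arithmetic concrete, here’s a minimal sketch of how an SROI ratio gets computed (my own illustration, not part of the presentation). All of the outcome names, proxy values, attribution shares, and the investment figure below are hypothetical; a full SROI study would also discount multi-year outcomes and adjust for deadweight and displacement.

```python
# Minimal SROI sketch: monetize each outcome with a financial proxy,
# sum the social value, and compare it to the investment.
# All numbers are hypothetical illustrations, not Aunt Leah's figures.

# outcome -> (people affected, proxy value per person in $,
#             attribution: share of the change credited to the program)
outcomes = {
    "stable housing maintained": (50, 12_000, 0.60),
    "social connections built":  (80,    500, 0.70),  # proxy: fee for similar social activities
    "employment income gained":  (30,  8_000, 0.50),
}

investment = 250_000  # total program cost in $ (hypothetical)

# social value = sum of people x proxy x attribution across outcomes
social_value = sum(n * proxy * share for n, proxy, share in outcomes.values())

print(f"Total social value: ${social_value:,.0f}")
print(f"SROI: ${social_value / investment:.2f} of social value per $1 invested")
```

With these made-up numbers the ratio works out to about $2 of social value per $1 invested; the study described above estimated $7.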

Promoting social innovation in vulnerable populations – a Developmental Evaluation

-shared challenge that funders (Community Action Initiative, City of Vancouver, Vancouver Foundation) were facing: low quality of innovation proposals coming forward
-thought maybe they needed to do something differently to develop an environment that encourages experimenting and testing to support innovation
-traditionally, funders don’t really engage with projects until their proposal is funded
-but innovations don’t lend themselves to fully formed proposals
-so thought about reaching out to applicants with good project ideas, but weak or non-existent innovation plans, to help them develop the innovation side of things
-project went a bit sideways as many funders wanted in on the project across the province
-decided they needed to evaluate if this would really lead to more socially innovative proposals and if this could be done well in a partnership model (multiple funders)
-engaged an external evaluator, decided on a Developmental Evaluation approach
-logic modelled to show how resources –> planned outcomes (increased knowledge of social innovation, application of this knowledge to create more innovative project proposals, spread of knowledge throughout organizations, and changes in the relationship between funders and participants (and among the funders))
-open to adapting LM (because it was developmental evaluation approach), but didn’t need to in the end
-mixed methods for data collection
-interviews with funders at the start (e.g., “how did you come to this project?”)
-surveys with participants – they got survey fatigue, so changed to in-depth interviews with project teams at the end
-found that the process –> more socially innovative proposals (sounds like this was based on the opinion of the funders?); partnerships –> able to leverage each organization’s resources
-allowed organizations to take on more risk than they normally would (permission to be innovative, which comes with risk of project maybe not coming to fruition)
-found increased understanding and application of social innovation; learnings spread throughout the funding organizations; increased understanding of organizational readiness for system change; increased acceptance of uncertainty in the innovation process

ECLIPS: An Innovative and Cost-Effective Evaluation Capacity Building Initiative for Not-for-Profits

-an evaluation capacity building model used at UBC, but it can also be used at other organizations
-the presenters are an internal evaluation unit at the UBC Faculty of Medicine, but the model can also be used by external evaluators
-CLIPs – Communities of Learning, Inquiry and Practice – developed by B. Parsons in a higher education context (but it applies to a wide range of settings)
-allows self-selected teams to build evaluation capacity by conducting a project they are interested in
-3-6 people in the team, they own the project and conduct and report out on it
-small amount of funding (e.g., they did $1000) – e.g., for transcription, honoraria, etc.
-resources provided to guide participants to do their work
-evaluator to provide assistance/guidance to the team
-UBC wanted to build a strong culture of evaluation in their Med School (called ECLIPS because there is another medical project called CLIPS)
-very few teams used any of the $1000 they were eligible to get
-20 applications, 18 approved, 11 fully completed (in time for the evaluation of ECLIPS + one completed later)
-86% of projects were faculty-led
-hired Kylie Hutchinson to evaluate ECLIPS
-methods: interviews with 10 team leaders + doc review + 1 focus group with Evaluation Specialists (tried online survey with team members, but didn’t get good response)
-findings:
-evaluation capacity building occurred at the level of Faculty of Med, Evaluation Studies Unit, programs, and participant level
-participant level: increased appreciation of, knowledge & skills in evaluation, and increased evaluation activity
-program-level: increased use of evidence for program decision making
-ESU level: the Evaluation Studies Unit got a better understanding of themselves, and others developed a better understanding of the ESU
-FoM level – increased number of evaluation champions
-Evaluation Specialists seen as a major help to the teams
-some teams didn’t use the resources at all (overwhelming, too idealistic and not practical for their limited time)
-5 projects didn’t get off the ground due to lack of time; some teams felt 1 year not long enough to do a project (but some really liked the 1 year deadline)
-making some changes to ECLIPS based on findings:
-more flexible timelines and intake during the year
-consultation with Evaluation Specialists before they submit their proposals
-more training for Evaluation Specialists on how to be a good coach
-revising resources
-some of the key ingredients:
-program teams chose evaluation projects that are meaningful to them
-tailored coaching for the team
-www.insites.org/clip – some free resources
-upcoming article in New Directions journal

Mobile Learning in Evaluation for Health Leaders: Evaluation of an Innovative Capacity Building Tool

-a course to help senior health leaders become informed users of evaluation
-can also be used with other sectors, including non-profits
-public health tends to educate people about evaluation by teaching them how to *do* it, whereas this course teaches people how to *use* evaluation findings in their work
-mobile learning because it’s anywhere/anytime; engaging; personalized; interactive; highly focused; informal
-you can also collect mobile analytics (to see how people are using it)
-gamification – more engaging for learners
-evaluation to inform improvements in course design and content and to look at if this could be an effective way to provide this kind of learning
-beta testers – 15 health leaders, 1 physician, 9 internal evaluators, 2 mobile tech experts
-data collection – end of unit and end of course surveys, phone interviews with health leaders; phone focus group with evaluators, unstructured interviews with tech experts (observed them going through the course to get their perspective)
-findings:
-users like short and succinct units
-some difficulties navigating through the course (some inconsistencies in the buttons)
-expected more interactivity (people like what was there and wanted more)
-expected more personalization
-many people disliked stock photos
-mixed opinions on use of audio (some liked, others felt it did not add value)
-units 1-3: too simplistic for senior health leaders
-units 4-7: engaging, relevant and practical (topics were: systems thinking, enhancing use, managing evaluation, supporting evaluation)
-valuable way to increase knowledge and interest in evaluation
-majority would recommend the course to their colleagues (if improvements were made in content & design (as per the above findings))
-recommendations:
-revise the target audience to mid-level managers and directors (not just senior leaders)
-revise units 1-3 to be more engaging and relevant (so you don’t lose them before they get to the more engaging, relevant stuff in the later modules)
-increase interactivity, more gamification
-improve course navigation
-evaluationforleaders.org – revised version will be available spring 2016, for free!

The Doctor is In: Evaluation Therapy for All (Forum Theatre)

This session was absolutely brilliant! It involved the presenters running through a skit with some common evaluation errors/issues and then running through the same skit again, but this time allowing anyone in the audience to yell “Stop!” when they saw an issue that they felt could be handled better. The audience member then joins the scene and “corrects” the problem. It’s hard to capture in writing, but it was absolutely hysterical!

Move Over Accountability, We’re Putting Learning in the Driver’s Seat
Fostering Change, Vancouver Foundation
-pilot project – transition worker to give support to those transitioning out of foster care (support provided to youth up to age 20)
-just started in May 2015
-referrals come in from youth, probation officers, social workers, etc.
-have a Youth Advisory Circle – young people age 17-24, engaging the youth in “adult” conversations (where they haven’t traditionally been included)
-principle-based (rather than expecting people to implement certain models)
-value the lived experience – they have expertise, knowledge, and wisdom
-shared learning agenda – iteratively created with grantees
-have a shared learning and evaluation working group – frontline staff and managers from grantee agencies
-learning/knowledge exchange days
-grantees have a strong relationship with the funder and with other grantee agencies
-a grantee was worried that the expectation that they contribute to the working group might be really time consuming, but it’s actually been very beneficial to them
-rather than relying on “misery porn” (e.g., images of sad looking youth to try to get funding), focus on images of empowering youth – how things can be positive, how issues are due to the system, rather than blaming youth for the circumstances they are in

10 Plus Ways to Stretch Your Evaluation Budget
-“I want to do myself out of a job” – I want to give evaluation away!

1. Lower your rate
-when you are starting out, to build a portfolio
-when working with a repeat client (or getting them to become a repeat customer)
-when you want to try something new (you get learning out of it)
-pro bono work (not to be done off the side of your desk or at poor quality, but because you want to support the work – it’s a volunteer contribution)
-you can get a charitable receipt (for your personal taxes) if working with a nonprofit
-include the full cost and show the discount (so they see the true value of the evaluation)
-if they have a funder that requires, e.g., 10% of the budget to go to evaluation, you can charge your full rate

2. Act as an evaluation coach
-coach program staff to go through the evaluation process
-use a really good evaluation “how to” guide

3. Leverage an evaluation course
-program staff go through an evaluation course (e.g., the CES Essential Skills Series) and you act as their coach

4. Become a case or student project
-SFU, UBC, UVic all have evaluation courses that use real programs as case studies for the students to develop an evaluation plan
-CES Case Competition
-Timing needs to be right

5. Use existing evaluation frameworks or systems
-RE-AIM
-Vancity Demonstrating Value
-IHI Triple Aim
-etc
-helps you be more efficient
-has to work for your program, of course
-your value-add service as an evaluator is focused on what’s not included in the existing framework

6. Choose data collection tools from online tool repositories
-engage staff in selecting the tools
-e.g.:

[Conference slide re: online data collection tool repositories]

7. Support program staff or participants to collect data
-provide training
-do routine checkins
-especially helpful in multi-lingual environments where staff speak the language of participants (and you don’t have budget for interpreters)

8. Use the analysis packages built into online survey tools
-e.g., FluidSurveys
-qualitative data: invest in a qualitative data analysis program (really speeds up the process)
-http://cognitive-edge.com/sensemaker/ – engage program participants in doing the analysis

9. Hold data interpretation sessions
-present analyzed data and have stakeholders do the interpretation and generate recommendations
-can use a simple framework like: What? So What? Now What?
-saves you a lot of time (helps you develop the story of what the findings told you – and helps you write the report)

10. Simplify the report
-talk to client in planning stages about what kind of report is wanted/needed
-sometimes they only want a PowerPoint (the title of the slide is the conclusion of what the graph shows)
-even when a narrative report is wanted: follow the 1-3-25 page rule from CFHI (a 1-page summary of main messages, a 3-page executive summary, and no more than 25 pages of findings)
-can put all the gory details in appendices (that are typically only ever read by other evaluators!)
-www.piktochart.com
-www.canva.com (not free) – slide docs

Tips from the audience (this was the “Plus” in the title!)

-clarify language – make sure that you are using words the same way
-listen to how the client talks and use the language they use (e.g., you can say “What changes do you want to see?” instead of “What are your intended outcomes?”, and “How will you know if you are seeing those changes?” instead of asking about “indicators”)
-make sure you are a good fit for the project
-frequent check ins (rather than revealing stuff only at the end)
-Vantage Point – a nonprofit that has a subsidiary (Go Volunteer) through which you can offer up discounted services
-start with a small project to give them a taste of what evaluation can offer; they will still have unanswered questions that you can address in the next evaluation

Funder Panel

-Bryn (Vancity Community Foundation), Trilby (Vancouver Foundation), Cathy (Telus)
-standardized metrics are a challenge to develop
-as soon as you start tracking indicators, people focus their attention on that (and you don’t want to unintentionally drive them to do stuff that will cause a negative effect)
-process measures vs. outcome measures – e.g., do you care that you reached 1000 students this year or that you prevented 3 suicides?
-big corporations (like Telus, IBM, Accenture) often have employees who want to give back to community, but want to use their skills to do that, so why not match them up with non-profits to provide pro bono services? (knowledge philanthropists)
-you have to recognize the realities of the non-profits you are working with (e.g., VF offered to send staff from their grantee agencies to a 5 day conference and thought it would be an amazing opportunity, but the non-profits said “We can’t send our staff to a conference for 5 days! Who do you think will run the program??”)
-people want to get to outcomes and impact, but often we only get to activities and outputs (i.e., running the program)
-balance between the value the data will bring to you and the cost of getting the data
-there’s been a shift towards funding projects instead of operations, so people are having to show that their program is “new” and “innovative” (and may just position something they are already doing as “innovative”)
-funders get way more requests for funding than they could ever fund (and it’s sad to have to say “no” to projects that could be really great)
