71 Community Engagement and Collaboration | Curated Links 2


The Community Tool Box, a free online resource for individuals working towards social impact and community engagement, run by the Center for Community Health and Development at the University of Kansas, offers 46 chapters on community building. Chapters 1, 2, 36, 37, 38 and 39 all address evaluating community programs and initiatives.

  • Goodman, M. S., Thompson, V. L. S., Arroyo Johnson, C., Gennarelli, R., Drake, B. F., Bajwa, P., … & Bowen, D. (2017). Evaluating community engagement in research: Quantitative measure development. Journal of Community Psychology, 45(1), 17-32. Retrieved from https://onlinelibrary.wiley.com/doi/epdf/10.1002/jcop.21828

After an in-depth review of community-engaged research (an umbrella term for the numerous forms of research that have community engagement at their core), Goodman and colleagues (2017) noted that tools for measuring and evaluating community-engaged research and community engagement have been primarily qualitative. In response, the team began developing a quantitative measure for assessing the level of engagement among collaborating community members, built around 11 engagement principles, each with multiple items and Likert-scale response options. Take some time to review the above article by Goodman and colleagues and their quantitative community engagement measure in Appendix A of the article.

Innoweave, which is a network of coaches and tools for social innovation, has created the Developmental Evaluation self-assessment tool to help teams determine their readiness and suitability for a developmental evaluation approach. Developmental evaluation is well-suited for teams developing KMb and impact work within multifaceted contexts.

This resource provides examples of different strategies and tools that fall under each of the measurement areas that emerged from the authors' mapping review of contexts, processes, and outcomes within the measurement of community-engaged research.

The National Collaborating Centre for Methods and Tools, based at McMaster University’s School of Nursing, coordinates the Registry of Methods and Tools for Evidence-Informed Decision Making, “a repository of curated resources to support evidence-informed decision-making” (n.d.). This searchable repository includes numerous resources on evaluation. One important note: the resources included relate mainly to KMb in public health, and so may not apply across different disciplines.

The NHS Institute for Innovation and Improvement, now a part of Public Health England, has developed The Good Indicators Guide: Understanding how to use and choose indicators (2008), a resource for anyone in the healthcare sector looking to use indicators to monitor and improve performance. The authors note that the two most critical sections are Section 2: Indicators – the important principles (page 5) and Section 7: Criteria for good indicators and good indicator sets (page 23) – “If you only read two sections from this guide, read these!”.

The Ontario Centre of Excellence for Child and Youth Mental Health has developed the Program evaluation toolkit: Tools for planning, doing and using evaluation, a collection of practical resources covering each of those three stages of program evaluation.

The Tamarack Institute, an organization focused on supporting and developing community engagement around major social issues in Canada, has curated numerous resources on evaluating impact at a community level. You will note that the majority of the resources available via the Tamarack Institute focus on collective impact, assessed through a range of impact metrics.

The University of Calgary’s Knowledge Exchange team has developed the Knowledge Engagement (KE) Impact Assessment Toolkit (n.d.), which is based on the REAP Self-Assessment Model from the University of Bradford. The REAP Self-Assessment Model focuses on four principles (Reciprocity, Externalities/Reach, Access and Partnership), and the KE Impact Assessment Toolkit presents impact indicators based on these four principles in a matrix for researchers to review and complete at different stages of their projects, KMb, and community engagement work.

The University of Wisconsin-Madison and the National Center for the Dissemination of Disability Research have developed the Knowledge Translation: Introduction to Models, Strategies, and Measures (2007) resource. It includes a section on measures of knowledge use that offers considerations and a framework for measuring knowledge use.

  •  Phipps, D.J., Johnny, M. and Poetz, A. (forthcoming) Demonstrating impact – considerations for collecting and communicating the evidence of impact. In The Impactful Academic. Ed. Wade Kelley. Emerald Publishing, Bingley, UK. Retrieved from TBD link to be inserted when available.

Research Impact Canada, a network that supports researchers, students and their partners in demonstrating the contribution and impact of research excellence, has developed a report entitled Demonstrating impact – considerations for collecting and communicating the evidence of impact (forthcoming) to “[allow] the user to collect the evidence that describes the narrative of the research impact” (p. 1). The report focuses on engaging key stakeholders – community members, partners and others – for their input on the engagement process and KMb work. Appendix A of the document contains a stakeholder interview guide that can be used to interview community partners about their involvement in a research and KMb project; it has been developed to capture rich qualitative data and statements from the various stakeholders within a project. After reviewing the interview guide, how do you think you could incorporate qualitative interviews such as this into your project?

Worton and colleagues (2017) present a community-based framework for evaluating “short-term knowledge use in community settings” (p. 125), after noting that “despite the existence of numerous knowledge-to-action strategies, minimal attention has been directed towards evaluating knowledge mobilisation” (p. 124). While the Community Knowledge Mobilization Evaluation (CKME) Framework presented within this work focuses mainly on evaluating KMb and knowledge use in the community, it also offers several interesting lines of inquiry that could be posed to community partners, particularly those underpinning the final CKME model in Figure 2 of the above-cited article (p. 135). After reviewing the CKME model, how do you think you could rephrase the questions included in the model to evaluate your KMb and community engagement work with partners?
