ARCHIVED - Integrated Strategy on Healthy Living and Chronic Diseases - Knowledge Development and Exchange Functional Component
The conceptualization of a knowledge cycle that guides the KD&E Functional Component in ISHLCD is shown in this diagram:
The Knowledge Cycle Framework
Figure 2 - The Knowledge Cycle Framework Text Equivalent
This image shows the knowledge cycle conceptual framework that guides the work of the KD&E Functional Component. The framework supports evidence-informed practice and practice-based learning through capacity-building interactions among practitioners, researchers and policymakers across the inter-related component processes of needs assessment, knowledge creation, knowledge translation, dissemination, adoption and uptake, and evaluation.
Definitions of Key Terms in the Knowledge Cycle (Footnote 13)
Evidence: The best research and evaluation information available, based on a systematic analysis of the effectiveness of an intervention, strategy or service and its use. Evidence is gathered in order to produce the best outcome, result or effect, and may be generated from a range of rigorously implemented and appropriate quantitative and qualitative research and evaluation methodologies.
Needs assessment: The process of identifying the learning and practice needs of policy makers, practitioners, and researchers engaged in health promotion and chronic disease prevention in Canada. This is most often accomplished through subjective survey methods and informal feedback methods, such as meetings and conversations, though the process may also include objective measures.
Knowledge: A fluid mix of framed experience, values, contextual information, evidence interpretation and expert insight that provides a framework for decision making, evaluating and incorporating new experiences and information. It may be explicit or tacit, and individual or collective. In organizations, it often becomes embedded not only in documents or repositories, but also in organizational routines, processes, practices, and norms.
Source: adapted from Davenport, T.H. & Prusak, L. Working Knowledge: How Organizations Manage What They Know, Harvard Business School Press, 1998 and European Committee for Standardization, 2004.
Knowledge creation: A process that results in the generation or collection of new knowledge. Knowledge creation is not limited to research activities, but also results from evaluation of practice or policy and the collection and sharing of tacit knowledge in order for it to become explicit knowledge.
Examples include performing basic or applied research, attaining expert consensus, and gathering and documenting evidence.
Knowledge translation: The integration, reformatting and ethically sound application of knowledge through interactions among policy makers, practitioners and researchers, in order to accelerate the realization of the benefits of research and of the evaluation of practice.
Examples include preparing a policy brief/report, synthesizing research findings into accessible and practical formats, documenting a treatment program, and repackaging information prepared for one audience for another.
Capacity building: Increasing the ability of individuals, organizations or systems to effectively plan, implement, evaluate and sustain public health promotion and protection efforts. Improved capacity is understood to lead to better decisions, informed by multiple sources of data and information, and to enhanced practice.
Source: Goodman RM, Speers MA, McLeroy K, Fawcett S, Kegler M, Parker E, Smith SR, Sterling TD, Wallerstein N. 1998. Identifying and defining the dimensions of community capacity to provide a basis for measurement. Health Education & Behavior. 25(3): 258-278.
Context: The settings, circumstances, conditions and factors influencing the way in which knowledge is developed, shared, adapted and implemented. This may include consideration of processes, structures, resources and environments, as well as interactions between researchers, policymakers, practitioners, the public and media.
Source: adapted from McCormack, B., Kitson, A., Harvey, G., Rycroft-Malone, J., Titchen, A., Seers, K. 2002. Getting evidence into practice: the meaning of ‘context’. Journal of Advanced Nursing. 38(1): 94-104.
Evidence-informed practice: Practice that is attentive to evidence, including research, experiential knowledge of the organization, cultural context, and educational, symbolic/political and process uses, and that uses knowledge syntheses of summarized findings to inform practice, decision-making and implementation.
Source: adapted from Avis, J. 2002. Really useful knowledge? Evidence-informed practice, research for the real world. Post 16 Educator (8): pp. 22-24.
Knowledge dissemination: An active and strategically planned process whereby new or existing knowledge, interventions or practices are communicated to targeted groups in a way that encourages them to factor the implications into their work. Dissemination goes well beyond simply making research available through the traditional vehicles of journal publication and academic conference presentations.
Examples of knowledge dissemination include research literature, best practices documents, presentations, policy development, Web materials, training, and pilot studies or trial use of an intervention.
Sources: adapted from Kiefer, L., Frank, J., Di Ruggiero, E., Dobbins, M., Manuel, D., Gully, P., Mowat, D. 2005. Fostering Evidence-based Decision-making in Canada. Canadian Journal of Public Health. May-June: 11-119; and the Canadian Health Services Research Foundation.
Knowledge adoption and uptake: The acceptance by a profession or organization of disseminated knowledge. This includes organizational policies and practices, as well as the decision to adopt an innovation. Uptake refers to the utilization and implementation of knowledge in practice, which includes several types of use: direct/instrumental, conceptual/enlightening, symbolic/political and process.
Sources: adapted from Organization for Economic Co-operation and Development. 2000. Knowledge Management in the Learning Society, p. 40; and Pelz, D.C. 1978. Some Expanded Perspectives on Use of Social Science in Public Policy. In Major Social Issues: A Multidisciplinary View, eds. J.M. Yinger and S.J. Cutler, 346-57. New York: Free Press.
Evaluation: In the context of the Cycle, it entails both the evaluation of interventions and knowledge exchange activities, including the assessment of processes, outcomes, related facilitators and barriers, as well as context. Evaluation provides both evidence of effectiveness and practice-based learning that contribute to knowledge creation.
Examples of topics of interest for evaluation include: perceptions of stakeholders and participants, reach and participation rates, competency, communication and interaction change, rate of knowledge uptake, nature of decision-making changes (research, policy and practice), behavioural change, health system outcomes and cost-benefit issues.
Practice-based learning: A systematic and collaborative cycle of inquiry and feedback related to the context, design, implementation and outcomes of population health policies and programs. It produces evidence relevant to the application setting and is primarily improvement- and learning-oriented.
Sources: adapted from Potter MA, Quill BE, Aglipay GS, et al. 2006. Demonstrating excellence in practice-based research for public health. Public Health Reports, 121(1), A1-A16; and Green LW, Glasgow R. 2006. Evaluating the relevance, generalization, and applicability of research: Issues in external validation and translation methodology. Evaluation & the Health Professions, 29(1), 126-153.