This presentation described the development of indicators for measuring KT within the KBHN NCE's KT Core, and the associated database created to track the data and generate reports, both for progress reporting to the NCE and to inform internal management decision-making. A handout (also uploaded to SlideShare) was provided with this presentation so attendees could create their own indicator definitions.
Development of Indicators for Measurement at Each Stage of Knowledge Translation from Research to Impact
1. www.kidsbrainhealth.ca
Anneliese Poetz, PhD – Manager, KT Core
David Phipps, PhD, MBA – Lead, KT Core
Renee Leduc, MSc – Program Officer, NCE Secretariat
Mobilizing Science Knowledge and Research: NCE Sharing of Best Practices Symposium
Thursday January 29, 2015 11:00am – 12:00pm (Sable Room A -D)
Halifax, Nova Scotia
2. Outline
1. What are KBHN and the KT Core?
2. Evaluation and the KT Core
3. Tool Development Process & Lessons Learned
5. KT Core
“KT maximizes the impact of research and
training in neurodevelopmental disorders”
Services for researchers and trainees in NDN
1. Knowledge Brokering
2. KT Events
3. KT Products
4. Evaluation
5. KT Planning
7. Role of Evaluation
How are the KT Core services helping KBHN
researchers & trainees?
How have we made a difference for children with
NDDs and their families, in Canada?
10. Anatomy of an indicator
Definition components
1) Name/title of indicator
2) Definition (including N/D if
applicable)
3) Type
4) Rationale
5) Strengths/Limitations
6) Data Source(s)
7) Stage in CPPI
8) Date
9) Responsibility for collecting
data
Example
1) [CB] % increase in knowledge
2) Sum of scores/#attendees x 100
3) Impact
4) Measure knowledge gained
5) S: measures change/impact,
L: bias & missing data
6) Pre-Post questionnaires
7) All
8) May 28, 2014
9) KT Core
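The nine definition components above can be sketched as a simple record type. This is an illustrative sketch only: the field names are hypothetical (the deck does not specify a data model), and "N/D" is assumed here to mean numerator/denominator.

```python
from dataclasses import dataclass

# Hypothetical sketch of the nine indicator-definition components
# as a record. Field names are illustrative, not taken from the
# KT Core's actual database.
@dataclass
class Indicator:
    name: str             # 1) Name/title of indicator
    definition: str       # 2) Definition (incl. numerator/denominator if applicable)
    type: str             # 3) Type (input, output, impact, ...)
    rationale: str        # 4) Why collect/measure this?
    strengths: str        # 5) Strengths
    limitations: str      # 5) Limitations
    data_sources: list    # 6) Data source(s)
    cppi_stage: str       # 7) Stage in CPPI framework
    date_defined: str     # 8) Date the definition was set
    responsibility: str   # 9) Who collects the data

# The worked example from the slide:
knowledge_gain = Indicator(
    name="[CB] % increase in knowledge",
    definition="Sum of scores / # attendees x 100",
    type="Impact",
    rationale="Measure knowledge gained",
    strengths="Measures change/impact",
    limitations="Bias & missing data",
    data_sources=["Pre-Post questionnaires"],
    cppi_stage="All",
    date_defined="2014-05-28",
    responsibility="KT Core",
)
```

Writing the definition down as a typed record makes the "fully define indicators" step concrete: any component you cannot fill in (especially the rationale) flags an indicator worth dropping.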
12. Process
• Map services onto CPPI evaluation framework
• Define indicators for tracking each service
• Design and develop database for tracking
• Fully define indicators
18. Updated (current) System
-Access-
Database/system requirements:
1) To enable tracking of service data (indicators)
2) To map services onto the CPPI framework
3) Must be easy to enter data
4) To generate useful reports (analysis)
i. To report data within date constraints
ii. To capture and report “Dan #” vs “David #”
iii. To capture and report what we did for each service (part of def’n)
5) Must be able to add services/indicators as the KT Core & its services evolve (long-term req’m)
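The KT Core's actual system was built in MS Access, which isn't shown here. As a hedged sketch, the requirements listed on this slide could map onto a relational schema like the following SQLite equivalent; all table and column names are hypothetical.

```python
import sqlite3

# Illustrative only: an SQLite sketch of how the slide's five
# requirements might translate into tables. The real system was
# an MS Access database; names here are made up.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Req 5: services and indicators are rows, so new ones can be
    -- added later without redesigning the schema
    CREATE TABLE service (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE indicator (
        id INTEGER PRIMARY KEY,
        service_id INTEGER REFERENCES service(id),
        name TEXT,
        cppi_stage TEXT          -- Req 2: map services onto CPPI
    );
    -- Reqs 1 & 3: one simple row per tracked activity
    CREATE TABLE activity (
        id INTEGER PRIMARY KEY,
        indicator_id INTEGER REFERENCES indicator(id),
        broker TEXT,             -- Req 4.ii: per-broker counts
        description TEXT,        -- Req 4.iii: what we did
        activity_date TEXT       -- Req 4.i: date constraints
    );
""")

conn.execute("INSERT INTO service VALUES (1, 'Knowledge Brokering')")
conn.execute(
    "INSERT INTO indicator VALUES (1, 1, '# requests handled', 'All')")
conn.execute(
    "INSERT INTO activity VALUES (1, 1, 'David', 'Intake meeting', '2014-06-02')")

# Req 4.i and 4.ii together: a report within date constraints,
# broken down by broker
rows = conn.execute("""
    SELECT broker, COUNT(*) FROM activity
    WHERE activity_date BETWEEN '2014-01-01' AND '2014-12-31'
    GROUP BY broker
""").fetchall()
```

The point of the sketch is requirement 5: because services and indicators are data rather than schema, adding a new service is an INSERT, not a redesign.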
22. Take away messages
1) Have an evaluation framework to position your
indicators
2) Define indicators to the fullest extent possible
3) Define requirements for your database/system
4) Be willing to revise the system as new
requirements become apparent
What is NeuroDevNet? What we do, why we’re here (high level)
Federally funded NCE
~80 researchers/trainees (FASD, ASD, CP)
Knowledge and Technology Exchange and Exploitation
“…seek to shed light on the causes of neurological disorders, and to share this knowledge to health care professionals, policy makers, and communities of interest…
NeuroDevNet works across traditional disciplinary boundaries and sectors to ensure our findings are translated into tangible diagnostic, preventative, therapeutic,
social, economic, and health benefits for all.”
What is the role of the KT Core within NeuroDevNet?
We (KT Core) support NeuroDevNet’s HQP mandate through workshops like the KT/Social Media workshop.
What are the indicators/data used for?
Reporting (value/impact) of: services, events, products, etc.
Comparison across services, events, products, programs, etc., possibly across NCEs (KTEE standards?)
Inform decision-making
Succession planning
Explain CPPI, what it is and why we are using it (important planning tool, also for monitoring progress toward impact – CPPI is best suited for a system of research or ‘set of projects’ rather than a single project that may never reach ‘impact’).
Need to revise these numbers with the current # based on MS Access data
Indicator Type (input, process, output, outcome, impact, performance, behaviour change, etc.)
Talk about an interesting thing about the indicator (value of a good indicator) such as the story about % increase in knowledge at the KT/Social Media workshop (e.g. 28% increase in knowledge – infographic on handout)
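The "% increase in knowledge" figure comes from pre/post workshop questionnaires. The deck does not spell out the exact scoring formula, so the sketch below assumes one common approach (comparing mean pre- and post-workshop scores); the scores themselves are made up for illustration.

```python
# Hedged sketch: pre/post questionnaire scores are hypothetical.
# Assumes "% increase in knowledge" compares mean post-workshop
# score to mean pre-workshop score; the deck doesn't confirm
# this exact formula.
pre_scores = [5, 6, 4, 5]    # hypothetical pre-workshop quiz scores
post_scores = [7, 7, 5, 7]   # hypothetical post-workshop quiz scores

mean_pre = sum(pre_scores) / len(pre_scores)
mean_post = sum(post_scores) / len(post_scores)
pct_increase = (mean_post - mean_pre) / mean_pre * 100
```

Under this assumption, a figure like the 28% reported on the handout infographic would mean attendees' average post-workshop scores were about a quarter higher than their pre-workshop scores.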
For Rationale – I actually got rid of some indicators because I couldn’t think of a good reason why we should collect data/measure it.
Whenever you design a system (database) you need to define the requirements – what is it supposed to do?
You have to define indicators fully, so that you know what is included/excluded in the measure, who is responsible for collecting the data, and where the data come from. Clear indicator definitions matter for measurement and management purposes, but most importantly for comparison across time (as the program evolves over the years) and across programs.
It takes time, but you have to think through and articulate the requirements for the system you need to create; otherwise it costs time and money to redo it. Having said that, requirements will evolve as you learn more about what you are measuring and as your program evolves, so you should revisit the design of your system in the future to ensure it still meets the requirements you have at that time. And you have to be willing to revise the system as new requirements become apparent (as I did by migrating our system from Excel to Access).
We had this evaluation framework and this set of services, and needed to map the services onto the evaluation framework.