For many years, we could proudly report the cost of learning (cost per learning hour, the ratio of L&D professionals per 1,000 employees, or how many people our high-priority programs reached); however, it was very difficult to demonstrate the actual value of learning to the business.
After years of focused efforts, we now use a pragmatic and cost-effective approach to measure the business impact of learning. This document covers:
• The pillars of our effectiveness evaluation framework
• Moving the conversation with stakeholders from cost to value
• Developing a pragmatic measurement approach for a learning program: challenges to expect and goals to set
“By sending employees to this EM practitioner program, you as business leaders can develop your project managers: they are twice as likely to become certified project managers, AND you are 43% more likely to have profitable projects and happy clients”... or....
“By sending employees to this PeopleManager program, you will have more effective managers who better manage the performance of their teams and develop their people by up to 50% more, ultimately resulting, in 95% of cases, in significantly more satisfied and engaged employees (+7 points).”
In short: we need to assess the effectiveness of a learning solution, not just its efficiency. The question is no longer cost per learning hour, but how well we understand the effects of building new capabilities.
Objectives of the approach
• Four levels of measurement, with an emphasis on both summative and formative evaluation: we not only measure impact, but also learn how to improve the program (actionable measurements)
• The type of data collected at each level, with examples of how it is collected
• A conscious decision NOT to include ROI, focusing instead on individual performance and business impact
In 2014, our Capgemini University launched a new effectiveness evaluation program: “EvaluateIT”.
The goal of EvaluateIT is to understand the business impact of training programs, optimize and refine them, and validate the financial investments with the Business.
EvaluateIT was designed to:
• Assure the business that it is getting the best value for its learning investment, through evidence collected in the program
• Help the University and local learning and development organizations run a continuous improvement program rather than expensive and infrequent re-design cycles
• Close the gap in our development process by providing a consistent evaluation framework
• Help participants continue to enhance their learning through assessments, and give them a channel to provide timely and effective feedback on the quality of the learning
• Help the Capgemini learning network implement a consistent and robust evaluation method for its training programs
The overall focus is on assessing the effectiveness of a learning solution, rather than its efficiency. The framework is based on industry best practice (Kirkpatrick, Bersin) and external expert advice; it was piloted over 18 months and applied systematically. It is now the standard approach that our University applies to all learning programs.
Not all findings lead to ‘Request Management’ actions (e.g., course enhancements, re-design). Some instead feed back into ‘Relationship Management’: working with stakeholders or other affected business functions (e.g., HR) to address them.
key lessons learned as a list