Cover Story
Here’s a story about doing the right things today, while thinking long-term about tomorrow.
At the LTEN Annual Conference in June 2024, Novartis presented a workshop session after working with IC Axon on a measurement strategy and ACTO on learning dashboards. The session focused on helping people determine the current state of “measurement maturity” at their organizations, and it presented a Level 1 evaluation tool consisting of questions about the learning experience, content and job impact.
At Novartis, Brezden Peter, associate director, learning technology & measurement, prepares a slide for trainers with speaking notes that encourage healthy response rates, and she makes a point of meeting with every trainer. During the meeting, she reviews details such as whether the training is face-to-face or virtual, the different modalities that may be used and the number of attendees.
From there, Peter offers a recommendation and often revises the talking points to reinforce the importance of completing a course evaluation.
Her contact with trainers is one of many communications across multiple channels to raise awareness of the measurement strategy and engage people in the evaluation process. “This can be challenging for field personnel conducting training, compared to the trainers in our organization who understand the vision,” Peter said. “Frequent and consistent communications to all stakeholders are a critical part of our success.”
Currently, the primary focus is human performance. For Novartis, that’s the first step. Many people try to connect learning to business outcomes that are influenced by multiple variables. “We’re certainly not afraid of that challenge, but we’re resolute in our belief that the clearest connection to learning and development is human performance,” Peter said.
Every standard evaluation administered at Novartis offers the program owner an opportunity to ask specific questions about the learning experience. Program-specific questions are always tied to learning objectives. They establish the foundation for assessing learner confidence with course content and identify observable behaviors for a manager evaluation that is administered 60 days post-training.
The Novartis standard evaluation utilizes a forced choice, four-point scale. Learners respond to questions about the learning experience, content, job impact and the relevant learning modality. The selections are “strongly disagree,” “somewhat disagree,” “somewhat agree” and “strongly agree.”
The standard evaluation and manager evaluation both include options for program-specific questions that are directly linked to learning objectives. Program-specific questions in the manager evaluation are translated into observable behaviors. Managers use the same four-point scale (from the standard evaluation) for their observations at 60 days post-training.
Using the same four-point scale in both of these evaluations helps Novartis maintain consistency in its dashboard and simplifies the user experience.
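As an illustration of how a shared scale keeps dashboard reporting consistent, the four response labels from the article could be mapped to numbers and averaged. This is a hypothetical sketch, not Novartis’ actual implementation; the function name and the 1-to-4 mapping are assumptions.

```python
# Hypothetical sketch: map the four-point forced-choice responses to
# numbers so standard and manager evaluations share one dashboard scale.
# The labels come from the article; the 1-4 values are an assumption.
SCALE = {
    "strongly disagree": 1,
    "somewhat disagree": 2,
    "somewhat agree": 3,
    "strongly agree": 4,
}

def average_score(responses):
    """Average a list of response labels on the shared 1-4 scale."""
    scores = [SCALE[r.lower()] for r in responses]
    return sum(scores) / len(scores)
```

Because managers and learners answer on the same scale, the same averaging logic can serve both evaluations in one dashboard view.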
Novartis is exploring adaptive learning solutions and determining the learning paths needed to accommodate the needs of a large learner population.
In the new hire training process, a significant number of people come with experience from one or more companies. Another large group includes associates who are newer in their careers. Collecting this information not only supports a longer-term adaptive learning strategy, but also helps Novartis make any necessary modifications to subsequent phases of new hire training.
Novartis values learner feedback and includes optional comment fields with each section; this approach generates excellent feedback.
Peter created a user guide and action plan to help trainers use the dashboard and interpret data, including verbatim comments. The three-step process begins with key words that can be searched and quantified in Microsoft Excel.
Next, trainers look for multiple related comments. An internal ChatGPT instance can quickly synthesize workshop feedback, find common themes and identify improvement opportunities.
Sometimes, Peter said, they compare comments to numerical scores to see whether the feedback is balanced. The action plan offers suggestions for handling both positive and constructive comments; outliers tend to matter less to the team than verifiable trends.
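The first step of that process, searching for and quantifying key words, could be sketched as follows. This is a hypothetical Python illustration standing in for the Excel workflow the article describes; the sample keywords and function name are invented for the example.

```python
from collections import Counter

# Hypothetical sketch of step one: count how many free-text comments
# mention each searchable key word, so themes can be quantified
# (shown in Python purely for illustration; the article's workflow
# uses Microsoft Excel).
def keyword_counts(comments, keywords):
    """Return how many comments contain each keyword (case-insensitive)."""
    counts = Counter()
    for comment in comments:
        text = comment.lower()
        for kw in keywords:
            if kw.lower() in text:
                counts[kw] += 1
    return counts
```

Counting mentions per keyword gives trainers a quick, verifiable view of trends across comments, which supports the action plan’s emphasis on trends over outliers.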
Making progress on your measurement strategy isn’t just about being productive; it’s about being productive on the right things. It’s also important to “do the little things,” like meeting with trainers, because they all add up to big accomplishments.
Measurement isn’t always easy and getting started can be the hardest part. When you know where you’re headed and have a few tools to help you get there, taking the first step will be easier.
David Solomon is a learning architect with IC Axon. Email David at dsolomon@icaxon.com or connect through https://www.linkedin.com/in/dlsolz/.