- Automating data-related tasks is highly valuable, and starting with a manual process helps ensure you are picking the right solution for the right results.
- The learning and development industry underestimates the importance of baselining, and calculating ROI from learning solutions is something we should all prioritize.
- The future of learning will see the rise of adaptive learning and more microlearning opportunities within teams.
Transcript (edited for clarity)
Claudia: How do you think about automation, and about using data to get things off your plate while still producing high-quality outputs?
Dan: We have been having some interesting discussions with some of our clients on this exact topic. They are looking for cutting-edge ways of doing things. We know there is a lot coming down the pipeline, but what would you do with it? Before we automate something, or before we require a level of quality that might be more than is needed, we need to know what functionality you are interested in, whether we can get the data, and what we are going to do with it. If we can get the data easily, and you are able to use it and it is beneficial, then let us automate it. That is how we look at automation: we need to prove out the usefulness first and make sure there is a benefit. I would say our process is manual first, to get the quality right and make sure the benefits are coming through. Once we confirm those things, we can fine-tune the process. We train where the team lacks the necessary skills, we substitute technology where we can, and that is how we get to automation.
Claudia: One thing that you notice when you work closely with learning and development is that there is a lot of data being collected, but there is not always a set of processes in place to do something with that data. There are companies that use data very positively and productively, and they have large teams of analysts, but in many cases, learning and development is a 1-2 person team. What do you think is missing within the learning and development industry in terms of teams’ ability to use data effectively?
Dan: When I was with McGraw-Hill, we were working with very large organizations that face this exact problem: how do we measure the effectiveness of training? There are some basic answers that I found lacking. One of them is baselining: are we baselining anything? We cannot measure improvement if we do not have a baseline. The other challenge concerns the end results: are we thinking about ROI? In some cases, L&D may say it is not necessarily about ROI; for some safety companies, for example, compliance training was simply a requirement. Typically, the disconnect is in the middle. For example, we want to avoid a risk because it has a cost associated with it, so we create training. What about the middle? How are we tying those together? If we have incidents that are costing us money and we have training in place, then we should see a decrease in those incidents. So how can we start to bring those things together, to say that these people had the incidents, and we were able to account for any other variables and hopefully narrow it down to the coursework they were taking? But this is unrealistic in practice, because there are a lot of variables, and if you have multiple variables, you must be cautious about the assumptions and conclusions that you make. I do not think there is an easy silver bullet for any of it, other than trying and starting somewhere: baseline it and put together a logical chain that connects the training to the ROI, or to the financial aspect. If you are a nonprofit, see whether some of the more ambitious folks are getting voluntary training. Because of that training, are they able to demonstrate skills faster, and are you able to reduce churn because your workforce is up-skilling in areas where they did not have a skillset? Depending on what you are looking for, it could be strictly financial, or it could be a mix.
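The logical chain Dan describes — baseline the incident rate, measure it again after training, and attach a cost — can be sketched in a few lines. This is a purely illustrative example: the function names, rates, and dollar figures are invented, and it deliberately ignores the confounding variables Dan warns about.

```python
# Hypothetical sketch: compare a baseline incident rate to the rate after
# training, and translate the difference into an estimated dollar impact.
# All numbers and names here are illustrative assumptions, not real data.

def incident_rate(incidents: int, exposure_hours: float) -> float:
    """Incidents per 1,000 hours of exposure."""
    return incidents / exposure_hours * 1000

def estimated_savings(baseline_rate: float, post_rate: float,
                      exposure_hours: float, cost_per_incident: float) -> float:
    """Estimated cost avoided, assuming the rate change is attributable
    to training (a strong assumption -- other variables matter)."""
    avoided_incidents = (baseline_rate - post_rate) * exposure_hours / 1000
    return avoided_incidents * cost_per_incident

baseline = incident_rate(incidents=24, exposure_hours=40_000)  # before training
post = incident_rate(incidents=15, exposure_hours=40_000)      # after training

savings = estimated_savings(baseline, post, exposure_hours=40_000,
                            cost_per_incident=12_500)
print(f"baseline: {baseline:.2f}/1k hrs, post: {post:.2f}/1k hrs, "
      f"estimated savings: ${savings:,.0f}")
```

The point is not the arithmetic itself but that each link in the chain — baseline, post-training measurement, cost per incident — must exist before any ROI claim can be made.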
Also, one of the things we do for some of our clients is work out an adaptive learning strategy. The concept behind adaptive learning is recognizing that each individual comes to the table with different experiences and skillsets, and we want to optimize their path to mastery of a particular topic. Even though connecting coursework and learning to financial outcomes may be challenging, you can still optimize and maximize the efficiency of the learning time itself. If we can do that, we are making gains in employee time alone. If a course normally takes 10 hours on average, and after adopting an adaptive learning strategy we can move everybody to less than 8 hours on average, then we have saved 2 hours on average. Some folks may still take 15 hours to get through a particular course, and others may take only 3 hours, but at least we are moving the needle, and it is measurable.
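The time-savings arithmetic above can be made concrete with a small sketch. The per-learner hours below are made up for illustration, but they show the point: individual times still vary widely (3 to 15 hours), yet the average drop from a fixed 10-hour course is measurable.

```python
# Illustrative sketch of the adaptive-learning time savings described above.
# The per-learner completion times are invented example data.

fixed_course_hours = 10.0  # everyone sits through the full linear course
adaptive_hours = [3, 5, 6, 7, 7, 8, 8, 9, 10, 15]  # adaptive per-learner times

adaptive_avg = sum(adaptive_hours) / len(adaptive_hours)
saved_per_learner = fixed_course_hours - adaptive_avg
total_saved = saved_per_learner * len(adaptive_hours)

print(f"adaptive average: {adaptive_avg:.1f} h")            # under 8 hours
print(f"saved per learner: {saved_per_learner:.1f} h")      # about 2 hours
print(f"total across {len(adaptive_hours)} learners: {total_saved:.0f} h")
```

Even a simple aggregate like this gives a baseline-friendly metric: hours saved per learner per course, which can then be multiplied out across a workforce.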
Claudia: How would you personally hope that adaptive learning is going to evolve in the future?
Dan: I would love to see more of an open-source approach to the algorithms in adaptive learning. A lot of companies are doing adaptive now, and most of them tout their algorithm as their special sauce. If you have 15 or 20 companies all saying they have the best special sauce, you start to wonder whether 19 out of 20 are possibly lying. The trend I would love to see does not necessarily require technology; it is more of a mindset trend of moving toward one hundred percent mastery. There are too many scenarios where you take training and then take a test at the end, and if you get 70% or 80% you pass. Whether you are an accountant or a physician, if you take a one-hour course and pass the multiple-choice test at the end with 7/10, that does not seem right to me. Once you say you only need a 70% or 80% score to pass, what you are effectively saying is that nothing is essential, because you can get any 20% or 30% wrong. You end up with a course where none of it is essential; you just have to get most of it. So I would love to see an evolution where folks put more value on 100% mastery, as opposed to the approach of taking a test and getting most of it right. There is also the concept of microlearning, where I take a very broad topic and break it down into sub-units so that I can target any required remediation very specifically. That is a huge component of what we do at Allarium.
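The microlearning idea Dan describes — breaking a broad topic into sub-units so remediation can be targeted precisely — can be sketched as a simple mapping from quiz items to micro-units. Everything here (the unit names, the item tags, the mapping) is invented for illustration and is not Allarium's actual approach.

```python
# Hypothetical sketch: map each quiz item to the micro-unit it assesses,
# then route a learner back only to the sub-units they missed. Mastery
# here means a clean bill on every sub-unit, not 70% overall.
from collections import defaultdict

# Assumed item-to-unit tagging (invented names for illustration).
item_to_unit = {
    "q1": "ledger-basics", "q2": "ledger-basics",
    "q3": "accruals",      "q4": "accruals",
    "q5": "depreciation",
}

def remediation_plan(answers: dict) -> dict:
    """Return missed items grouped by micro-unit; empty dict means mastery."""
    plan = defaultdict(list)
    for item, correct in answers.items():
        if not correct:
            plan[item_to_unit[item]].append(item)
    return dict(plan)

# A learner scoring 4/5 (80%) passes a conventional threshold, yet the
# plan still flags exactly one sub-unit needing targeted remediation.
plan = remediation_plan({"q1": True, "q2": True, "q3": True,
                         "q4": False, "q5": True})
print(plan)  # {'accruals': ['q4']}
```

This captures the contrast Dan draws: a percentage threshold says "most of it is enough," while a per-unit view says exactly which essential piece is still missing.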
Claudia: How do you see microlearning in the future being woven into the typical person’s workday, and what are some triggers that should tell somebody that there is a microlearning opportunity?
Dan: I think it is contextual, and it is about making sure there is a learning mindset throughout the organization. For example, someone who is building processes needs to be aware enough, and have a learning strategy in place, to tag something while they are working on it. I think we are going to see a lot of contextual learning, and we already do. In software this has been around for a long time: the little exclamation point you can hover over, where sometimes there is a tooltip or a trick. It has been around for a while; it just tended to be specific to software tips and tricks. The more we integrate a learning mindset into all departments, the more those contextual learning triggers are going to become relevant to soft skills and to elements beyond software.