The loss from ineffective training is staggering: $13.5M per 1000 employees [1]. And that is just corporate education. Articles like "Problems With Today's Education System" come as no surprise to anyone following the broader educational discourse. The question is, how can we improve? Let me attempt to answer.

There is a lot of "innovation" going on in education. The startup community provides solutions within the category of EdTech, short for Education Technology. But is there significant progress? A quick search for innovation in education turns up many different things: teacher and curriculum autonomy in Finland, individualised learning plans in the Netherlands, embracing the dangerous side of creativity in the US, forest kindergartens in Denmark, novelties like 3D learning in the United Arab Emirates, robot teachers in South Korea, the paperless classroom in England and many more [2]. The same topics come up at every education conference one can attend these days (plus VR tools). The question is, do these "innovations" actually work? Whenever an innovation is presented, we mostly hear the marketing message about the best experiences with the new tool, never the worst.


There are a lot of new tools for educators, yet the progress is still not obvious. If those technologies are not the answer, what is? What if we had clarity on whether these innovations work, what their advantages and disadvantages are, and how they compare in bringing results? When exposed to the idea, many people agree that the single biggest thing that could drastically advance education (corporate, university and school education alike) is clarity about what works and what doesn't.

So, what makes good learning?

Isn't this the question at the centre of everything? The keys to the door of a completely new world are an information processing system that lets us track what works and what doesn't in education, and an information sharing system that lets us distribute that knowledge. To achieve this we need to collect and analyse data.

I know, by now we are all tired of hearing about AI this and data that. At the same time, studying data helps us find answers to complex questions and solve difficult problems. Maybe data analysis could be the right tool here?

If you are a trainer or educator, data analysis offers a number of benefits. It could give you the chance to:

  • Learn how to improve trainings, whether it's a logistical detail or a skill that needs to be developed.

  • Have proof of the direct impact of your services for your clients.

  • Certify and promote your trainings to new clients on the basis of service quality and measurable impact, not just good design or PR.

Similar benefits apply to HR Development and Learning & Development professionals, usually with a focus on measuring impact and making better investment decisions in learning.


Despite all the promise of data science, we haven't seen much of it in education yet; the biggest changes are still invisible. The Kirkpatrick model of training evaluation has been the dominant tool for corporations since 1959. The model states that in order to evaluate the effectiveness of a training one has to examine four consecutive levels, one by one: the reaction of participants, the things they learned, the behaviour they adopted and the observable results for the organisation that can be attributed to the training. Kirkpatrick encourages evaluating all four levels for better understanding: for example, a lack of behavioural change might not mean the training was ineffective, but could be caused by a toxic environment. A lot of research has been done on the model, which led to other scientific models being introduced, mostly built on the work and fame of the original. Today 92% of companies scrutinise at least level one, reaction [3].
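
To make the model concrete, here is a minimal sketch in Python of one training scored against the four levels and a check of how deep the evaluation actually goes. The field names and scales are my assumptions for illustration, not a standard schema.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative only: one training scored against Kirkpatrick's four levels.
# Field names and scales are assumptions, not a standard schema.

@dataclass
class KirkpatrickEvaluation:
    training_id: str
    reaction: Optional[float] = None   # Level 1: average participant rating (e.g. 1-5)
    learning: Optional[float] = None   # Level 2: knowledge or skill gain (e.g. post-test minus pre-test)
    behaviour: Optional[float] = None  # Level 3: observed change on the job
    results: Optional[float] = None    # Level 4: organisational outcome attributed to the training

    def depth_evaluated(self) -> int:
        """How many of the four consecutive levels were actually evaluated."""
        depth = 0
        for value in (self.reaction, self.learning, self.behaviour, self.results):
            if value is None:
                break
            depth += 1
        return depth

evaluation = KirkpatrickEvaluation("negotiation-workshop", reaction=4.2, learning=0.8)
print(evaluation.depth_evaluated())  # 2 -- in practice most evaluations stop at level 1 or 2
```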

At the same time, as someone who has been selling training evaluation to companies, let me tell you a secret. Companies collect reaction feedback using Likert scales (1 to 5) and free-form feedback. The free-form answers are read and then mostly disregarded in the long term; the scaled answers are summed up and divided by the number of answers to calculate averages, which are then reported up. In most cases there is no big-picture analysis. Instead, the data is kept inside the company and benchmarks are non-existent.
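
To show how little that typical process tells us, here is a minimal sketch in Python, with made-up numbers, of the usual aggregation: the single reported average, and the distribution of answers it hides.

```python
from statistics import mean
from collections import Counter

# Illustrative only: reaction feedback as it is typically reported.
# "responses" are Likert-scale answers (1-5) to one post-training question.
responses = [5, 4, 4, 2, 5, 1, 4, 5, 3, 5]

# The usual report: sum the answers, divide by their number.
average = mean(responses)
print(f"Average reaction: {average:.1f}")  # Average reaction: 3.8

# What the average hides: the distribution of answers.
print(Counter(responses))  # Counter({5: 4, 4: 3, 2: 1, 1: 1, 3: 1})
```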

The problem with academic models like the Kirkpatrick model is that they require big tailor-made projects to evaluate training effectiveness, which is expensive. Most of the companies that can afford it are huge corporations. At the same time, startups with Software-as-a-Service products can't yet offer a proper alternative. A Finnish EdTech startup, Claned, promises AI-driven e-learning education. Yet, according to company representatives, the main value they can provide so far remains the UX of the platform; the AI is still a work in progress. Claned and many other so-called "learning analytics" companies often offer a dashboard-like solution. Even if the data is relevant and well collected in the first place, the challenge for the end users is to make sense of all the pie charts and connect the dots. What would be ultimately helpful are predictive and prescriptive analytics, but those are a lot more challenging.
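
To make the distinction concrete, here is a hedged sketch in Python of what a basic predictive step could look like: instead of charting past feedback, a simple model estimates the outcome of a planned training from the data of previous ones. The features and numbers are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical sketch of the "predictive" step a dashboard alone does not give you:
# given past trainings, estimate the behaviour-change score of a planned one.
# All features and numbers are made up for illustration.

# Each row: [average reaction (1-5), hours of training, weeks of follow-up coaching]
past_trainings = np.array([
    [4.5, 8, 4],
    [3.2, 4, 0],
    [4.8, 16, 6],
    [3.9, 8, 2],
    [2.7, 4, 0],
])
# Observed behaviour-change score for each past training (e.g. manager-rated, 0-10).
behaviour_change = np.array([6.5, 3.0, 8.2, 5.1, 2.4])

model = LinearRegression().fit(past_trainings, behaviour_change)

# Predict the outcome of a planned training before running it.
planned = np.array([[4.0, 8, 4]])
print(f"Predicted behaviour change: {model.predict(planned)[0]:.1f}")
```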


I believe that we are not making enough progress because we don't have faith in the advanced possibilities of data within education. "The important stuff can't be measured," some try to argue, like Liz Ryan [4]. In his book Sapiens [5], Yuval Noah Harari mentions that a big change that enabled modern-day capitalism was the belief in a better future. The idea is that you get a loan because the person lending you the money assumes tomorrow will be better than today and that you will be able to repay with interest. Where the trust in this idea comes from is a difficult question.

Our approach combines the scientific-consulting and the startup Software-as-a-Service ways of viewing the challenge.

We don't believe that we know the answers to questions like "what makes good learning?", at least not yet. It takes rigorous work and iterative research and development projects in collaboration with many stakeholders to get to the truth. It's hard but important to stay critical: it's easy to look for confirmations as a researcher, even more so as a business. Therefore, it's important to have experts look at the data before the process is automated; if experts don't draw the generalisations and conclusions first, validity cannot be ensured. That being said, AI supporting an expert is a huge asset we want to look into when we have the resources to do so.

We believe in standardisation over tailor-made approaches. The advantages of the former are that it is inexpensive and, in the long term, leads to even better performance than consulting thanks to big-data intelligence. Making in-depth analytics economical might be the small but critical step that gets the industry to adopt the technology.


Another important thing we've noticed is the role companies like Airbnb and TripAdvisor play in their respective markets: they connect the market and build transparency. We realised that once we get better at understanding what makes good learning, what worked and what didn't, it's important to spread that knowledge. Therefore, we are working on a training feedback and rating platform, which we will release once we have enough data. A beta version is already in the works and will be presented to our first clients before the market introduction. Stay tuned for updates!

We believe that data is the key to improving education. Check out our website if you are curious to know more about how we work. We are not only entrepreneurs but also researchers, and we would love to collaborate, also on a pro-bono basis when we are able to, to figure out the important questions.

If you are a trainer or educator, get in touch and let's see if we can help you improve your learning programs. We are currently offering 10 free trials (including a post-training and impact report), so be quick to claim yours here!

If you are an HRD or L&D professional, get in touch and let's see if we could help you optimise investment in learning programs.

Thanks,

Dima Syrotkin, CEO at Panda Training Oy


Footnotes:

[1] Bughin, J., Dobbs, R., Roxburgh, C., Sarrazin, H., Sands, G., & Westergren, M. (2012). The social economy: Unlocking value and productivity through social technologies. McKinsey.com.

O’Boyle, E., & Harter, J. (2013). State of the American workplace: Employee engagement insights for US business leaders. Washington, DC: Gallup.

Boushey, H., & Glynn, S. J. (2012). There are significant business costs to replacing employees. Center for American Progress, 16.

[2] Matcher, E. (2015, June 5). Seven Inspiring Innovations In Education From Around the Globe. Retrieved from https://www.smithsonianmag.com/innovation/seven-inspiring-innovations-in-education-from-around-the-globe-180955484/

Nelson, K. (2018, January 17). 5 Inspiring Innovations In Education From Around the Globe. Retrieved from https://knect365.com/innovation/article/a4a325ca-a038-4744-968c-aff796ecb0b9/5-inspiring-innovations-in-education-from-around-the-globe

[3] Wentworth, D., Tompson, H., Vickers, M., Paradise, A., & Czarnowsky, M. (2009). The value of evaluation: Making training evaluations more effective. Alexandria, VA: American Society for Training & Development.

[4] Ryan, L. (2015, August 9). 'If You Can't Measure It, You Can't Manage It' (False). Retrieved from https://www.forbes.com/sites/lizryan/2015/08/09/if-you-cant-measure-it-you-cant-manage-it-false/#680209c15b6b

[5] Harari, Y. N. (2014). Sapiens: A brief history of humankind. Random House.
