Have you ever listened to a story told around a campfire? Whether it inspired a backwoods Blair Witch sense of foreboding or belly laughs brought on by cringeworthy dad antics, chances are you still remember the person who told it, as well as the story in its entirety.
The experience of sharing stories around the campfire is memorable for one reason above all: social connection is at the heart of engagement, and at Fuse, it is at the heart of learning and development as well.
Yet today, it’s easy to forget how integral people are to learning and engagement, particularly in a corporate setting. Around the world, COVID-19-enforced lockdowns have removed the human element so essential to learning, engagement and emotional connection. Much of the global workforce is now working remotely, and people aren’t getting the social support in the workplace that they traditionally have.
The transition to lockdown has been a bumpy ride for learning. People have moved from a recommendation and learning culture rooted in nurturing environments like the campfire, classroom and community, and now find themselves turning more and more to AI for recommendations. Think about this scenario: historically, if you’d moved into a flat, you might have asked a friend to recommend a good takeaway, or you might have Googled it and read takeaway reviews. Today, Deliveroo knows where you are and simply generates a list of recommendations based on what you’ve ordered in the past. The human connection is missing.
Of course, AI is important and it can help by prioritising or recommending elements of learning. In this way AI can help get people into the right conversations. But if relied upon too much, AI could end up taking the social out of learning, which is as bad as willingly leaving learning in lockdown.
Surely the right combination is AI plus human. Remember the six-game chess matches between world chess champion Garry Kasparov and the IBM supercomputer Deep Blue? Kasparov won the first match in 1996, but Deep Blue took victory in the 1997 rematch. Yet had they worked together, combining the best of human and machine, they would have been unbeatable at chess.
Machine learning can help: recommendation engines can provide learning recommendations based on data points linked to communities’ interactions with content. However, recommendation engines need to go beyond simply looking at what users have consumed and recommending more of the same - what you needed to learn last month is not what you need now. At Fuse, we build a picture of a user’s current situation, including their job role, skills, preferences, goals and objectives. The more we know about a user, the better a recommendation can be. Using all of this, we can recommend content with a future-focused approach rather than one based on past consumption.
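To make the idea concrete, here is a minimal, hypothetical sketch of a future-focused scorer. All field names, weights and data are illustrative assumptions for this example, not Fuse’s actual model: it simply weights goal-aligned and skill-gap topics above topics the user has already consumed.

```python
# Hypothetical sketch: rank content by future goals and skill gaps,
# not by past consumption. Weights and fields are illustrative only.
from dataclasses import dataclass

@dataclass
class UserProfile:
    role: str
    skills: set     # skills the user already has
    goals: set      # skills the user (or their manager) wants to build
    history: set    # content topics already consumed

@dataclass
class ContentItem:
    title: str
    topics: set

def score(user: UserProfile, item: ContentItem) -> float:
    """Favour goal-aligned, skill-gap content; penalise repetition."""
    goal_match = len(item.topics & user.goals)
    skill_gap = len(item.topics - user.skills)     # new ground
    repetition = len(item.topics & user.history)   # more of the same
    return 2.0 * goal_match + 1.0 * skill_gap - 0.5 * repetition

user = UserProfile(role="account manager",
                   skills={"crm"},
                   goals={"negotiation"},
                   history={"crm"})

items = [ContentItem("Advanced CRM tips", {"crm"}),
         ContentItem("Negotiation basics", {"negotiation"})]

best = max(items, key=lambda i: score(user, i))
print(best.title)  # "Negotiation basics" outranks more of the same
```

A purely consumption-based engine would keep serving CRM content; scoring against goals and skill gaps surfaces what the user needs next instead.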
However, it’s crucial to preserve choice, balance and variety in learning recommendations. Most people don’t want, and don’t trust, AI to simply make choices for them. In corporate learning and development, as in life, people want the freedom to search and explore, and to feel confident that they are seeing everything on offer.
It’s a view shared by many experts in the learning and development industry, including Erin Streeter, VP of Learning Solutions, Forrester, as explained in this article in HR Technologist:
“Interestingly, our research shows that a significant percentage of consumers prefer no personalization at all in their interactions with a company, which is antithetical to where everything from marketing to learning has been trending. If you unpack that and consider the “why,” you understand that a lot of individuals don’t like the idea of having their choices mediated (and possibly limited) by an AI. That’s something we’re watching closely for its application to learning, where we think there could be a risk of over-rotation with platforms that promise complete personalization of the learning experience. We need to also preserve choice and variety.”
It’s about finding the happy medium between open choice and a few well-positioned recommendations. This human approach is part of what it means to put social at the heart of learning and engagement.
Does Learning Tech Have a Social Dilemma?
Have you seen the 2020 docu-drama The Social Dilemma? Described by the producers as ‘a documentary-drama hybrid that reveals how social media is reprogramming civilization, with tech experts sounding the alarm on their own creations,’ it’s enough to make a person want to unplug and run.
Yet the parallels we see in the real world, even very recently, are shockingly similar to the tactics described throughout the film. Take one of Trump’s recent social media election campaigns, for example. The campaign successfully used social media platforms to micro-target different groups of Latino voters in South Florida by playing on fears of communism in their home countries, proving yet again - for good or bad - how significantly behaviours can be changed through AI-driven micro-targeting.
While these sorts of tactics may seem the realm of underhanded election campaigns, they are unlikely to surprise most people. We’ve seen one personal data-misuse story too many in the news over the past few years, and far too many algorithm-driven social media horror stories. Understandably, most of us take AI-driven anything with a pinch of salt.
The same extends to learning platforms. One of the biggest lessons from documentaries like The Social Dilemma is that companies need to be open about how data is used; otherwise, they risk sparking genuine fear that technology, even learning platform technology, could be used in the wrong way. Beyond this, vendors have a moral obligation to use only the data that is needed, and only when it is needed, to solve problems that help users and their organisations perform better. Without trust, people may fear they are being influenced to think or act differently than they would choose.
The problem that arises when fear enters the learning platform equation is the risk of people opting out. And when people opt out, they generally lose out on career-building performance and engagement, as well as the chance to make a real impact and contribute to company goals.
We’re not going to give you an example of what happens when employees opt out due to fear or distrust of their learning platform. Instead, we’re going to give you the Avon example, where opting in to deep learning experiences in-flow and in-context showed superior learning outcomes, fantastically improved sales and impressive representative retention rates.
Fact: when Avon beauty reps showed even an incremental increase in monthly visits to Avon’s learning and development platform, deployed by Fuse, the difference between low frequency (1 to 2 visits per month) and medium frequency (3 to 4 visits per month) created a dramatic uplift of +320% in aggregate sales over a six-month period.
It gets even more interesting. Andy Stamps, Avon’s Digital Experience Manager, discovered a marked filter-down effect between sales leaders and beauty reps:
“When we mapped sales leader engagement against beauty rep engagement in any one country, we saw a near carbon copy correlation. In markets such as the UK and Italy where sales leaders were using Avon Connect regularly, so were the beauty reps in those markets. In countries like Taiwan and India though, where sales leader engagement was low, we could see that engagement amongst our entrepreneurs was also low.”
Learn more about how Avon managed to create positive and habitual learning behaviours at scale - with a direct impact on the bottom line - by watching this on-demand webinar with Andy Stamps, Avon’s Digital Experience Manager.
It really does go to show the importance of social at the heart of learning, as well as the power of emotional connection. People like to learn from other people, and people feel motivated and engaged by other people.
So, What Can Companies Do to Take Learning out of Lockdown?
We might not be able to end government-mandated lockdowns, but we can still put social at the heart of learning. As a natural vehicle for storytelling, video learning can help. It may not be face to face, but it offers a similar level of human connection, and it generates far better social engagement. On social media, for example, a video attracts many more comments than an article does, because seeing another person creates a different emotional response.
Humans are great and AI is wonderful, but humans + AI is best when it comes to creating deep learning experiences in-flow and in-context. While recommendations can be powerful, transparency in recommendations is key. Companies must be able to show when a recommendation is generic and when there is specific reasoning behind it (e.g. a goal set by your manager around a particular skill).
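One simple way to build that transparency in is to attach a human-readable reason to every suggestion. The sketch below is a hypothetical illustration of the idea, with invented names throughout; it is not any platform’s actual API.

```python
# Illustrative sketch of transparent recommendations: every suggestion
# carries a reason the user can see, so generic picks and goal-driven
# picks are clearly distinguished. All names here are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Recommendation:
    content_id: str
    reason: str  # displayed to the user alongside the suggestion

def recommend(content_id: str, manager_goal: Optional[str] = None) -> Recommendation:
    if manager_goal:
        reason = f"Recommended because your manager set a goal: {manager_goal}"
    else:
        reason = "Popular with people in similar roles"
    return Recommendation(content_id, reason)

rec = recommend("course-42", manager_goal="improve negotiation skills")
print(rec.reason)
```

Surfacing the reason lets users judge each suggestion for themselves, which is exactly the awareness and choice that trust depends on.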
Discovery is also powerful. While recommendations may be appreciated, we humans are natural explorers: we want to see what else is out there, and often we want to discover it for ourselves. In this sense, strong search capability in a learning technology platform is key.
Trust is everything. People have to trust that your learning platform is not being used as a channel to micro-target content in an unwanted and manipulative way. Ultimately, people need awareness, education and choice. With those in place, learning platforms are a great way of achieving better performance, better career opportunities and better engagement across the board.