“Learning analytics”: a red herring?

A key message from my recent Kirkpatrick program was to start with the “top of the mountain”.  In this metaphor the top – the peak, the target to reach – is the organizational vision.  Strategic learning programs should, therefore, help the org reach this vision and be evaluated as such.

My reflection during the program was that this, of course, is common sense.  We should be working to support the organizational goals.  The challenge then becomes prioritizing multiple needs – only by forecasting potential impact up front can prioritization be done correctly.  This is one of the areas where KirkP is commonly misunderstood – it should be about starting with the end in mind and working backwards (not just dealing with level 1 in a standard way and then carrying on from there).

In terms of evaluating success, LEO have recently discussed the role of learning analytics (LA).  Now, like a lot of things in L&D, I would say the problem with LA is that it has meant different things to different people.  One of the earliest examples I saw, sold as LA, was Starfish Solutions (SS), which had a clear solution and goal – use existing LMS/VLE data to improve student retention.  SS makes perfect sense for organizations where educational outcomes and student retention are the core objectives.  I liked SS’s solution (in part discussed with them at the BBWorld Conference back in ’09) but it also faced the challenge that, for many university courses, there was/is less need for ‘big’ data solutions – in traditional models, lecturers know their students.  It only made real sense for large-scale education – the problem then, again, is that ‘large scale’ means different things to different people 😉

The LEO article does a good job of articulating the problems I have always had with L&D impact – especially how to assess it when there are so many other uncontrolled variables.  As mentioned in my previous post on the KirkP Cert, this was the main challenge I wanted clarity on from the course.  The recommended KirkP approach – identifying multiple ‘indicators’ (signs that behaviors are on track for the desired result[s]) that together show a correlation – was a key learning point for me.  In this model, therefore, we are building a ‘chain of evidence’ akin to a court of law – “data, information and testimonies at each of the four levels that, when presented in sequence, demonstrate the value obtained from a business partnership initiative”.

What I really liked about this is the clarity of the evidence/steps up the ladder/mountain, from bottom to top:

  1. Individual Outcomes
  2. Team Outcomes
  3. Departmental/Divisional/Unit Outcomes
  4. Customer/Client Response
  5. Organizational Outcomes
  6. Customer Satisfaction
  7. Market/Industry Response

It is this breakdown of the benefit chain that I will likely adopt in my existing project-planning documents.

Let L&D then take the message the KirkP course made clear: stop trying to tackle issues through direct causation (limited at best, impossible at worst) and instead work with correlations.  I would say this is a much better approach than simply seeing “measuring learning impact as a waste of time and money”, as the LEO article mentions many people argue.

Therefore, I would argue, let us (i.e. L&D professionals) not worry about learning analytics but instead about organizational analytics (i.e. goals and metrics) that can be tracked as trends over time, and aim to see where our investments have an impact.  As recommended in the KirkP programme, do not reinvent the wheel: borrow metrics from elsewhere, as they will already be used by stakeholders and those same stakeholders (should) understand the logic.  This should then allow us to, as I’ve hinted at before, not worry about ROI but instead (as recommended by KirkP) Return on Expectations.
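
To make that concrete, here is a minimal sketch of what trending a borrowed stakeholder metric against an L&D ‘indicator’ could look like – all of the data below is hypothetical, and a strong coefficient only supports (never proves) the chain of evidence:

```python
# Minimal sketch (all data hypothetical): correlation, not causation.
from statistics import correlation  # available in Python 3.10+

# Monthly % of target staff who completed the program (the indicator)
completion_rate = [5, 12, 25, 40, 55, 70, 80, 85]
# A metric borrowed from stakeholders (e.g. a customer satisfaction score)
csat_score = [71, 72, 74, 75, 78, 80, 82, 83]

r = correlation(completion_rate, csat_score)
# A strong positive r is one more link in the chain of evidence
print(f"Pearson r = {r:.2f}")
```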

So what do I think of KirkP following the course and certification?  Well, I’d have to agree with the LEO suggestions:

  1. It’s better than doing nothing… which is what most organisations are doing.

  2. Think about what level of evidence will be good enough for you. As the Kirkpatricks have pointed out, a chain of evidence that would not stand up to scientific rigour may be enough to convict someone in a court of law. If a level of evidence is good enough for a court then it’s probably good enough to convince your board to invest in L&D.

  3. Good enough develops into great.

Microsoft Teams: The platform we’ve been waiting for?

What is a ‘learning platform’?  It, perhaps, needs to support behavior change and knowledge sharing.  It has therefore been good this week to try out what Microsoft have launched with Teams and think about how it might be used.

Now, it could simply be used for team communication and sharing.  However, my mind has wandered to how it might work as more of an ESN/LMS if you went for a topic focus – creating open/public teams per topic wherever the business feels it has a need.

Now there are possible problems – not least that the rather unhelpful banner prompts you to download a desktop app.  Hi Microsoft – it’s 2017 calling, where is the mobile app prompt?!

Microsoft Teams Desktop Download Prompt

As for Microsoft – this might be the way to add structure to your sharing of documents and conversations.  However, there are clearly problems with how this should work alongside Delve, Yammer and the other options.

So what about the LMS?  Well, there has, of course, been the “LMS/VLE is dead” narrative for a while; add to this a renewed discussion around disruption.  So, can Teams act in place of the LMS – for example, as a “learning experience platform”?  Whilst you could argue with a lot of that article, this piece certainly resonates:

A disruptive change has occurred. Companies no longer look at their LMS as the core of their learning infrastructure. It’s now the back-end, and they are searching for a new employee experience, which demands a new set of tools.

There are many exciting things happening in the learning technology space: tools like Workplace by Facebook, Slack, and Skype are becoming enterprise-class, and these tools will likely become primary destinations for learners too. Now we need a new class of learning platforms that bring all this content together, deliver it in a compelling way, and give us the social and mobile experience we use every day throughout our life at home.


Kineo Connect: progress with purpose (event)

Well, I’ve finally had my first go on a HoloLens – after failing at BETT – thanks to MakeReal at the latest Kineo client event.

It really confirmed to me the potential value of HoloLens for multiple solutions.  The example available to try out was an interactive site map (i.e. a building complex, not a website) – great for not having to carry scale models around with you!  For learning, there are some great examples starting to emerge – such as this one for healthcare.  Cost, of course, remains an issue – with the development edition currently a tasty £2719.  It was also an opportunity to try an activity via SteamVR – building a reactor in VR.

The presentations/workshops, away from MakeReal’s demo space, focused on:

  1. Kineo’s Learning Insights report: a move for L&D from “deliverers of training” to “facilitators of career development”.
    1. I would agree with this.  I tend to describe my role as creating scaffolds for the organization to succeed and its people to be empowered.
    2. Report here.
  2. Social Learning.
    1. This was a nice recap of some of the logic and reminded me of the need to continue to encourage people to share and shift cultures – for example, the value in people sharing their ‘bibles’.
    2. Ideas included “find an expert”, something we are trying to do via badges for people at the top of our competency model.
    3. Slides here. White paper.
  3. Content Curation.
    1. I’m always torn on this topic: whilst I agree there is information overload, I’m also conscious that information teams have fallen away in industries like law – in part because the WWW has made a lot of information free, and thus the ‘value add’ from a professional team is tricky to identify/articulate.
    2. The challenge, as I see it, is how you combine a world of personal information management (the presentation mentioned bookmarking as an example most people would recognize) and PLNs with opportunities to add value (context, metadata, descriptions, etc.) centrally.
    3. Recommendation for a search > aggregate > filter > add value > promote model (akin to Jarche’s seek > sense > share) – I’ve sketched what this pipeline might look like after this list.
    4. A couple of examples were run through – the use of existing content to support BDO’s competency model (i.e. not needing to author new content) and Anders Pink as a tech solution (which I’ve trialed and like, but which raises questions over how it fits into your wider ecosystem).
    5. Slides here.
  4. Interactive Video.
    1. Have to agree there is a lot of value in interactive video where you can find the right use case, authoring and deployment approach.
    2. It was interesting to see/hear the Kineo approach – including the blending of ‘learning’ content with existing promotional video, for example, with Rolls Royce.
    3. Some nice ideas – like weighting questions where there is ambiguity (no true/false), GoPro cameras for POV-style footage, setting up secured YouTube channels for user-generated content submission, and the oft-shared Australian Deloitte video.
    4. Some of this could be done in Storyline – the advantage of their authoring being multi-device SCORM publishing.
    5. Slides here. Guide here.
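
As promised above, here is a hypothetical sketch of the search > aggregate > filter > add value > promote curation model – the sources, topics and helper names are all made up for illustration:

```python
# Hypothetical sketch of the search > aggregate > filter > add value >
# promote model from the curation session; all names/data are made up.
from dataclasses import dataclass, field

@dataclass
class Item:
    title: str
    url: str
    note: str = ""                            # centrally added context/description
    tags: list = field(default_factory=list)

def aggregate(sources):
    """Pull candidate items from several feeds/searches into one list."""
    return [item for fetch in sources for item in fetch()]

def filter_items(items, topic):
    """Keep only items relevant to a topic the business has asked for."""
    return [i for i in items if topic.lower() in i.title.lower()]

def add_value(items, note, tags):
    """The central curation step: add context, metadata and descriptions."""
    for i in items:
        i.note, i.tags = note, tags
    return items

def promote(items):
    """The 'share' step: push to wherever your audience already looks."""
    for i in items:
        print(f"[{'/'.join(i.tags)}] {i.title} – {i.note} ({i.url})")

# Usage, with a dummy feed standing in for a real search/aggregation API
dummy_feed = lambda: [Item("Negotiation skills primer", "https://example.com/a")]
items = filter_items(aggregate([dummy_feed]), "negotiation")
promote(add_value(items, "Good starter read for our competency model", ["negotiation"]))
```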

Another reading catchup

Another in the occasional series looking at some of the reading I’ve stockpiled with reflections for anyone who might be interested in the topics covered:

It’s never too late to learn [Journal of Workplace Learning; Vol. 27 Iss 6; Russell Paul Warhurst and Kate Emma Black; 2015]

An attention-worthy approach (from Newcastle Business School) to the empirical study of workplace learning – as the article puts it:

The informal, incidental, tacit and social nature of much workplace learning poses methodological challenges in ascertaining from respondents exactly what they have learnt and particularly, how their learning has occurred (Fuller and Unwin, 2005; McNair, 2013). Therefore, visual elicitation techniques were deployed in advance of interviews to assist participants in exploring such tacit learning accomplishments and implicit learning processes. Three visual techniques were used:

(1)  Timelines: Participants were requested to depict their work over the preceding five years as a horizontal line showing the degree of change, and learning, as gradients on that line against a vertical axis scaled from “set backs” through “stability” to “rapid”.

(2)  Sociograms: Participants were requested to indicate whom they interacted with over a typical month, the nature of this interaction (e.g. face-to-face or electronically) and how significant they felt these interactions were for them.

(3)  Pictors: Participants were asked to produce a visual representation of how they viewed themselves in their social worlds, in response to the question “How do I see myself as a later-career manager?”

I thought the three approaches above were interesting, considering the challenge of getting professionals to reflect on and think about how they have developed.  Indeed, they could also be used to visually represent the impact of programmes and so, in part, tackle the evaluation challenge.

I have recently been running a series of workshops at my organisation on career development opportunities – in part around the advantages that the May changes to apprenticeships will bring.  The ability to retrain, even if you have a degree or other qualification, via an apprenticeship is a key message.  Thus I liked the idea that “ageing populations need to be seen as a key, growing, natural asset rather than, as typically construed today, a liability”.  The article itself focuses on informal learning but realistically there will be opportunities to think again about skills development via multiple routes.  “Later-career workers need to be alerted to the learning potential within their jobs and the capabilities to leverage this potential” – yep!

How to enhance the impact of training on service quality?: Evidence from Malaysian public sector context [Journal of Workplace Learning; Vol. 27 Iss 7; Abdul Rahim Zumrah; 2015]

Whilst it is common sense to presume training needs to be reinforced/transferred to actually have an impact on quality, this article argues for a positive relationship via a measurement approach.  Overall, it is useful in reinforcing that training alone will not impact performance:

The finding of this study is an important outcome that has not been empirically determined previously in the literature, which highlight the significance of transfer of training as a mechanism to enhance the impact of training on employees’ performance (service quality). This finding provides support for the social exchange approach, suggesting that where employees perceive the support from their organization (they have been sponsored to attend training programs by organization), then feel an obligation to engage in behaviors that benefit the organization (transfer the training outcomes to the workplace) and are also willing to expend more effort to fulfill their organizational goals (delivering quality service to organization’s customers). The finding of this study also responds to calls for research to investigate the mediating factor between HRM practices and employees’ performance (Tremblay et al., 2010). The finding also helps to clarify the ambiguity in the literature in regard to the relationship between training and service quality (Beigi and Shirmohammadi, 2011; Chand and Katou, 2007; Cook and Verma, 2002; Hung, 2006; Schneider and Bowen, 1993; Zerbe et al., 1998). Specifically, this study extends the literature by providing empirical evidence that transfer of training has a mediating effect on the relationship between training and employee service quality in the context of the Malaysian public sector.

How to Solve the Content Discovery Problem [Brandon Hall Group, Presentation Deck]

A topic of interest to me with my library/content/information route into the world of learning.  Unsurprisingly, it identified the challenge of content volume:

94% of companies say that managing the expanding content library is a challenge presented by today’s learning environment

It also picked up on a point I regularly make – and one that I have previously argued should be increasingly irrelevant given decent organizational design:

content overload is only getting worse with siloed organizations using their own tools

In other words, departments such as KM, marketing and L&D should be working together but instead buy their own solutions and thus overwhelm internal and external clients.

One of the solutions for the learner experience is federated search – the hot topic 10ish years ago when I did my library/information MA (especially for HE orgs), but something corporates seem to have failed with.  Indeed, in some ways – such as expert location and peer-based collaboration – the corporate world is still catching up with academia.
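
For anyone unfamiliar with the idea, federated search is simply one query fanned out to several repositories with the results merged and de-duplicated – an illustrative sketch only, with stand-in connectors for real APIs (LMS, intranet, document store, etc.):

```python
# Illustrative only: the two 'connectors' below are stand-ins for real
# repository APIs; a production version would call those instead.
def search_lms(query):
    return [{"title": f"Course: {query} basics", "url": "https://lms.example/c/1"}]

def search_intranet(query):
    return [{"title": f"Policy on {query}", "url": "https://intranet.example/p/2"}]

def federated_search(query, connectors):
    seen, results = set(), []
    for connector in connectors:
        for hit in connector(query):
            if hit["url"] not in seen:        # de-duplicate across sources
                seen.add(hit["url"])
                results.append(hit)
    return results

for hit in federated_search("data protection", [search_lms, search_intranet]):
    print(hit["title"], "->", hit["url"])
```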

The filtering is then the challenge.  However, whilst the presentation suggests a “personalized feed” integrated with other business applications, I wonder if this ignores the UX of some tools.  Content, generally speaking, will have some level of motivation/gamification built into its platform.  So is the solution, instead, a combination of apps by audience/need?  The presentation instead suggests Edcast as the aggregating “knowledge cloud”; the traditional model is probably the topic/course-centric LMS.

The takeaways are fair, though, and are the kind of issues I have tackled over the last 15ish years:

  • Traditional content models cannot keep up with modern business needs
  • Keeping content current and fresh can be a challenge
  • Legacy content can also become unusable
  • The modern learner wants choices, personalization and a familiar interface
  • Subject matter expertise exists farther and wider than most organizations can reach

12th annual Keeping Pace report [Evergreen; 2015]

Very late catching up with this one, but another document on a topic of interest – online “K-12” education – and this report helps highlight how the US (in this area at least) is considering a variety of approaches to improve outcomes and meet demand.  IMO, a form of UK online free school is surely needed.  Indeed, the report identifies the kind of needs, met by the American online ‘charter’ school equivalents, that the UK could tackle:

In the case of elementary and middle school students, many attend an online school due to temporary reasons (illness, injury, behavioral issues, allergies). In high schools, many students move to an online school because they are behind and at risk of dropping out of school altogether.

This dispels some of the myths – such as that US demand stems purely from geography and that there would not be equivalent demand in the UK.  Indeed, there are some positive messages on impact too – such as in Florida, where “students outperform state average in end-of-course exams”.

The Disruption of Digital Learning: Ten Things We Have Learned [Josh Bersin]

A newer piece, as it was published this week, with a lot of good stuff summarizing the state of play for those of us who like to say we work in online/digital learning.  It is a little US-centric and knowledge-worker-focused in places (which, to be fair, it acknowledges) but it is one of the better things I’ve seen from Bersin for quite a while:

http://joshbersin.com/2017/03/the-disruption-of-digital-learning-ten-things-we-have-learned/

Overall, it really drives home that the myriad tools now available mean the boundaries between home/work/learning have effectively gone.  Indeed, I’d challenge the point about people spending so much time on email always being a negative – yes, a lot will be junk keeping people away from “specific” work, but how many of those emails are learning experiences?  Even if the learning is just related to clandestine organizational cultures or rabbit holes of bureaucracy?

I especially like that Microsoft Teams got a mention, having had a first look this week.  MT does an awful lot of what your traditional LMS would – discussion groups for communities of practice, VOIP/video for tutoring, task lists for activities/assessments, content sharing for reading/watching/reflecting, wikis for collaborative authoring, etc.  SCORM, I guess, would be the elephant in the room for most LMS deployments, but there you’d be looking at whether that content is really what you want going forward.  How we might combine MT and xAPI is very much on my “to do” list.
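
As a first thought on that “to do” item, here is a sketch of recording a Teams share as an xAPI statement – the LRS endpoint and credentials are placeholders and the Teams trigger is imagined, but the actor/verb/object structure and version header follow the xAPI specification:

```python
# Sketch: record "someone shared a resource in a Teams channel" as an
# xAPI statement. LRS URL/credentials are placeholders; the statement
# structure and version header follow the xAPI spec.
import requests

statement = {
    "actor": {"name": "A Learner", "mbox": "mailto:learner@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/shared",
             "display": {"en-US": "shared"}},
    "object": {"id": "https://teams.microsoft.com/l/message/example",
               "definition": {"name": {"en-US": "Resource shared in Teams"}}},
}

resp = requests.post(
    "https://lrs.example.com/xAPI/statements",    # placeholder LRS endpoint
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_key", "lrs_secret"),               # placeholder credentials
)
resp.raise_for_status()  # on success the LRS returns the new statement ID(s)
```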

There are things to criticize – for example, a lot of this isn’t new, just delivered differently (e.g. microlearning) – but, as mentioned earlier, there is a lot to like too.

I’ve got a whole ‘reading’ folder on my desktop going back many months so expect some more of these posts soon…

Evaluating my impact: preparing for Kirkpatrick certification

Ever since I started attending workplace learning related events, it has been clear there is an undercurrent in the industry of criticizing the Kirkpatrick model. There are views that it is too old, too simplistic, outdated, etc.

Personally, I try to avoid criticizing anything I am not in a position of authority on – for example, I feel I can criticize politicians for lying, their interpersonal skills, etc. but not necessarily their skill at the practicalities of being an MP, such as passing laws.

Whilst I have worked with different evaluation models I have opted to pursue Kirkpatrick certification (as mentioned previously) to really try and crack evaluation/impact for my team and my wider organization. Thus, I will be in a more authoritative position to consider the pros and cons of Kirkpatrick.

Unlike some L&D teams, I would say we are closely aligned to business objectives and do a reasonable job of challenging top-down and bottom-up requests for programs, performance improvement, etc. However, evaluating success in tackling the issues is tricky and, as I inherited an L&D policy specifically saying we will use the Kirkpatrick levels to evaluate, we are following that route.

I’m hoping the two-day program will empower me around evaluation in general but specifically being able to make use of the appropriate models. Or it may well make me a more vocal opponent.

The pre-work includes some nice detail behind the levels. Indeed, there are references that deal with some of those regular topics of criticism. However, whilst ‘the 70’ (of 70:20:10) is referenced as covered in the ‘new’ Kirkpatrick model, the template forms still reference evaluation of “training”.

A particular challenge for me in the past has been evaluating solutions’ real impact on the bottom line; in a world of multiple factors (KSME), how can we (L&D or a wider project team) claim an impact? Indeed, I raised this at the Kineo/Boost evaluation session. The pre-read refers to using ‘indicators’ and this will be particularly interesting.

Expect a blog later in the month once I’ve been on and reflected upon the certification program!