Some more on what learning needs to pick up from gaming

So another post on the lessons from the world of gaming.

This one was sparked by an article considering whether the latest Legend of Zelda game is the greatest ever in terms of design.  I’ve spent quite a bit of time already in this iteration of the world of Hyrule and it is difficult to disagree with the arguments in the article.

The closing paragraph should particularly resonate with learning professionals thinking about how to support their organisations:

the job of the designers is not to hold your hand and guide you around a set path. It is [to] reach out hundreds of hands and leave it up to you which you grab first.

Wow! There’s a topic starter for instructional/learning design debate!

Whilst in the past people may have talked about things like “learning styles” to justify different approaches, we are now in a position where we consider the different approaches through which we might drive performance and support learning for people at different starting points and existing levels of competence.

Now the counter argument would be that the multiple, even unlimited, permutations of many games are not feasible in instructional design.  Instead we end up with versions of relatively simple board game constructs, or fairly restricted ‘serious games’.  However, with dynamic, algorithmically driven learning there is the potential for an explosion in personalisation.
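As a hedged sketch of what that algorithmically driven personalisation might look like (all names, bands and thresholds below are invented for illustration, not any product's actual API), the core idea is simply routing learners to different options depending on where they are starting from:

```python
# Illustrative sketch only: a minimal adaptive "next step" picker.
# CONTENT, the bands and the 0.4/0.75 cut-offs are all hypothetical.

CONTENT = {
    "novice": ["guided walkthrough", "worked example"],
    "intermediate": ["scenario practice", "branching case study"],
    "expert": ["open sandbox", "peer-review challenge"],
}

def competence_band(score: float) -> str:
    """Map an assessment score (0-1) onto a coarse competence band."""
    if score < 0.4:
        return "novice"
    if score < 0.75:
        return "intermediate"
    return "expert"

def next_activities(score: float) -> list[str]:
    """Offer the 'hundreds of hands': options matched to the starting point."""
    return CONTENT[competence_band(score)]
```

Even a toy rule like this shows the shift: rather than one set path, the system offers a menu shaped by existing competence, and real adaptive engines just make the banding and content mapping far richer.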


Winning a battle with only your general left may not be recommended. But a win is a win.

Now the above image is an example of a counterfactual gaming experience, crusading as the Byzantine Empire.  Traditional L&D has of course made use of just such counterfactuals, through role plays, business modeling, simulations, etc.  If you can create an appropriate model then the variations are possible – with different focuses possible across, say, finance, marketing, etc. – all in the ‘safe’ environment of not impacting actual bottom lines, patients, customers, etc.

By thinking through game constructs there is the potential to think about what you want to achieve in a different way.  For example, battle-focused historical counterfactuals (such as the Total War games in the image above) and more character-focused grand strategy titles such as Crusader Kings 2 (images below) effectively give you the same goal (rebuilding the empire) through very different experiences.


Expanding (and renaming) the Byzantine Empire across c.100 years (of game time)

The storytelling in a scenario such as the above is prompted by certain actions (for example Byzantium becoming large enough to reclaim the title of “Rome” as an achievement) but is not as structured as, say, a linear first-person-shooter game like Call of Duty.  That more linear style offers up the potential for set storytelling, with some games much better at this than others.  Which leads to an argument that future instructional designers would be best sourced from graphic communication or creative writing backgrounds.

Traditionally simulation has, of course, taken many forms in workplace development – from table top games to computer scenarios.  The challenge with simulations remains the balance between ‘keeping it real’ (i.e. actually useful in the workplace environment) and maintaining interest through the storytelling/fun and other components.  Meanwhile this post makes good points about balancing complexity versus needing to know ‘now’.

So what to take away?

  1. Think about how much hand holding is appropriate – it’s not always a bad thing.
  2. Have the plot/narrative/story drive motivation.
  3. Reward with hidden achievements.
  4. Use users/learners to determine if you are hitting the right balance between reality and gaming elements.

CIPD L&D Show May 2017: Part 2

Following on from my last post about the CIPD show, a couple of interesting things from the show floor that (I think) were new to me:

Zoho Showtime

I used Zoho apps years ago when they were as well developed as any SaaS solution (guess they would inevitably be called ‘cloud apps’ now).  This included their rather excellent database tool (Forms I think) that allowed my team, at the time, to create what was (effectively) an institutional digital repository for past exam papers.  Now this was limited in functionality, compared to what you might get with, say, a full library management system, but it did the job of record keeping with a searchable interface for end users.  So it was interesting to see Zoho ShowTime being advertised at the show.

ShowTime seems like a really interesting attempt to combine presenter elements (Slideshare-esque sharing/presentation), audience interaction (a busy marketplace considering the remaining audience response system players – mobile-oriented tools like Socrative, virtual tools like Blackboard Collaborate that can be used f-2-f too, etc.) and, of course, data and analytics.

Certainly something to try next time I do a presentation and a shame I missed out at my conference presentation last week.


Ixxus

I have some history with LCMS projects so it was interesting to see a new player on the exhibition floor.  Ixxus is, seemingly, a well developed product moving into the learning market (presumably with the likes of Xyleme and Exact in their sights) from a publishing background.

For big content authoring teams it certainly seems to offer a lot – although pitching the tool as a way to “move away from monolithic courses…toward flexible, personalized and ‘bitesize’ content” is a bit of a hard sell given the challenge many organisations would face in moving to a CMS approach.  However, it was certainly interesting to see a player I was not aware of in this space.

CIPD L&D Show May 2017

Not too much to report from this year’s show as I left after only about 4 hours, having attended only a couple of the learning sessions.  However, I did have some good chats with people on the floor and it was pretty clear what the main growth area was since last year – apprenticeship providers, levy support, etc.

I left early partly as I was full of cold but also as the first session I sat through, from the Open University, had a narrower focus than their sessions in the past and did not really answer the topic (“How to Make Your Digital Learning More Engaging”).  A nice cartoon fell into the trap of many such eLearning examples – production values that are not repeatable.  Yes, the concepts are sound – humor, animation, bite-size, storytelling, etc. – but it worked as an example because of the quality: most of us are not going to get David Mitchell to do voice-overs on our eLearning.  It then reverted to being very much a pitch for making use of their content platforms – OpenLearn and FutureLearn – i.e. use their digital learning, not your own.  Which is an argument you could make, of course.

The second session I attended, from Bolt From the You Ltd, was on “Making it Stick – Turning learning into real change”.  This was better and considered why change does not happen: lack of engagement (needs to be fixed by linking to the individual and thinking about how to measure for impact), lack of a learning culture (needing ownership and partnership from sponsors to drive through) and, to close the gap:

Attention density:

Exposure + repetition + application = permanent change.

#thelmsdebate with Saba (event)

A really impressive venue (my first trip within ‘The Shard’) for this breakfast briefing hosted by Saba.

View from 31st floor of The Shard

A morning of reflection and blue sky thinking

The event was basically made up of three parts: a presentation on the old chestnut of “Is the LMS Dead?” from Donald Taylor, a presentation from Saba on the evolution of the Learning Management System market and some networking.

Is the LMS Dead?

@donaldhtaylor ran through some of the core arguments in the whole debate, a week before the launch of his new book (attendees getting their mitts on a copy ahead of the inevitable bank holiday bad weather).

Learning Technologies book from Donald H Taylor

Learning technologies in the workplace 101

Particular points that jumped out for me are below.  As tends to be the case with Don, it was difficult to disagree with the key messages; some comments/thoughts are in square brackets:

  1. The purpose of L&D has to come first – not the tech.
  2. LMSs are evolving with new ways to support learning and supporting the workplace move to digital.
  3. Expectations are changing [e.g. consumerisation of workplace tech – I’d say this has been one of the major challenges for those of us who have worked closely with learning tech: evolution keeping pace with expectations.  The point was made that perceptions of the LMS lag behind actual system development – this is true in part but also points to how tricky corporate systems have been to update].
  4. Whilst there are growing expectations to learn elsewhere (outside the LMS – which you can), there remains a need sometimes for the LMS, especially for compliance in heavily regulated industries [yes, certainly my current workplace would effectively be forced to close without the reporting the LMS provides.  However, your learning culture should make it clear that sharing learning makes sense].
  5. Referenced an article from Saba “the-lms-is-dead-long-live-the-lms” [worth a read].
  6. The LMS has always really just been part of the landscape [again, yes, but the challenge here for me is that there are many ‘competing’ systems – for example LMS, Intranet, ESN, email, Office365, etc. – and how you balance these.  Whilst they might not always be obviously competing, they are competing for people’s time, often the most precious resource in an organization].
  7. A reminder of the 5 moments of learning need…push is still needed for the unconsciously incompetent [‘people don’t know what they don’t know’].
  8. Users dislike their LMS often due to compliance topics.  However, often needed and L&D can make that better – not always the LMS’s fault [obviously good instructional design should minimize this].
  9. Another criticism is the ‘clunky’ nature, but many systems are moving away from this with better UIs [some anyway] and shifting from the classroom information-distribution model to more engaging [and personalized] systems.
  10. LMS often really a “workforce working tool” and Saba’s own first use of their tool was as a social platform [this is one of my main issues with modern LMS – you could use it as intranet, ESN, learning, VC and many more things but it depends on the nature of work and where your people spend their time].
  11. Before any of the tech though – what is the role of L&D?  Too tactical a view leads to a tactical LMS.  It should be about what you make possible – individuals and organizations fulfilling their potential [this is the “empowerment” piece I bang on about].  A nod to the St Paul’s and NASA stories of everyone contributing to vision and organizational goals.


@dipak1p followed up with some of what Saba are doing…20 years after basically kicking off the LMS market:

  1. The system continues to lead and evolve [some nice ‘then and now’ screenshots to show how things have changed].
  2. The LMS needs to be at the center of the engagement model.  It cannot sit in isolation – it needs to be core to the talent strategy.  Use it to help people feel they are moving forward in a way aligned to the company strategy.
  3. Can be part of a single vendor suite but also, with cloud and other changes, easier to integrate with other things.
  4. Compliance is a big part of the picture for many clients.  It needs to be frictionless, for example automatically dealing with requirements when people’s roles change [to an extent we’ve got this in place with a non-Saba system and I’d agree it is 100% essential].  Just show the user what they have to do [I agree and we do this, but it should also explain how it fits and what the expected behaviors are] as “consumption isn’t competence” [which is a great phrase I’m stealing from now on].  Real-time analytics and dashboards [looked good – again an example where lots of systems will do it but the fine detail and UI are all important].
  5. Mobile interface – think about your content again, for example target mobile learners with video, e-sign tracking of policies on the go, etc.
  6. Peer to peer – capture and use that data: comments, ratings, likes, etc.  Grow out into Communities of Practice [again I’d say this is a challenge depending on how and where people work and what other systems are in play].
  7. Crowd source and social – follow people in the system [i.e. a people-first, not topic-first, focus].  Socially driven machine learning takes this and other data for more personalized recommendations [some of this is similar to what Library Management Systems – the other LMS – introduced c.10 years ago, e.g. ‘people like you have liked this’].
  8. Systems becoming more intelligent via interaction data, not based on organizational structures or expansive competency frameworks as they change regularly [this is always a tricky one and I really feel is based on the nature of the organization].
  9. LRS – emerging as the answer to some issues but not going to capture everything.  Create a record of when you recognize you have learned something and another source of data for the LMS [but why bother if you’re an individual?  I’m still really not sure on LRS – it is an L&D solution for a problem learners do not recognize].
  10. “Battle for profile” – what holds your data and what they do with it [again my worry would be that multiple systems compete for the data and our colleagues’ time – how much you can realistically integrate everything is questionable].
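The ‘people like you have liked this’ recommendation idea mentioned in the list above can be sketched in a few lines.  This is a toy illustration with invented users and course names, not how any vendor actually implements it – real systems blend far richer signals (ratings, completions, role data):

```python
# Toy "people like you have liked this" recommender.
# All data below is invented for illustration.
from collections import Counter

likes = {
    "ana":   {"course_a", "course_b"},
    "ben":   {"course_a", "course_c"},
    "carol": {"course_a", "course_b", "course_d"},
}

def recommend(user: str) -> list[str]:
    """Suggest items liked by users with overlapping tastes, best match first."""
    mine = likes[user]
    scores = Counter()
    for other, theirs in likes.items():
        if other == user:
            continue
        overlap = len(mine & theirs)  # shared likes act as a crude similarity score
        for item in theirs - mine:    # only suggest things the user hasn't liked yet
            scores[item] += overlap
    return [item for item, _ in scores.most_common()]
```

For "ana", carol is the closest match (two shared likes), so carol's extra pick ranks ahead of ben's – the same weighting-by-similarity logic, scaled up, is what drives those socially driven recommendations.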

“Learning analytics”: a red herring?

A key message from my recent Kirkpatrick program was to start with the “top of the mountain”.  In this metaphor the top, the peak, the target to reach, is the organizational vision.  Strategic learning programs are, therefore, helping the org reach this vision and should be evaluated as such.

My reflection during the program was that this, of course, is common sense.  We should be working to support the organizational goals.  The challenge then becomes prioritizing multiple needs – so only by forecasting potential impact up front can prioritization be done correctly.  And this is one of the areas where there is misconception about KirkP – it should be about starting with the end in mind and working backwards (not just dealing with level 1 in a standard way and then carrying on from there).

In terms of evaluation of success, LEO have recently discussed the role of learning analytics (LA).  Now, like a lot of things in L&D, I would say the problem with LA is that it has meant multiple things to different people.  One of the earliest examples I saw, sold as LA, was Starfish Solutions (SS) who had a clear solution and goal – use existing LMS/VLE data to improve student retention.  SS makes perfect sense for organizations where educational outcomes and student retention are the organization’s objectives.  I liked SS’s solution (in part discussed with them at the BBWorld Conference back in ’09) but it also faced the challenge that, for many university courses, there was/is less need for ‘big’ data solutions – lecturers know their students in traditional models.  It only made real sense when talking about large scale education – the problem then, again, is that ‘large scale’ means multiple things to different people 😉

The LEO article does a good job of articulating the problems I have always had with L&D impact – especially how to assess it when there are so many other uncontrolled variables.  As mentioned in my previous post on the KirkP Cert, this was the main challenge I wanted clarity on from the course.  The recommended KirkP approach – identifying multiple ‘indicators’ (suggesting behaviors are on track for the desired result[s]) that can show a correlation – was a key learning point for me.  In this model, therefore, we are building a ‘chain of evidence’ akin to a court of law – “data, information and testimonies at each of the four levels that, when presented in sequence, demonstrate the value obtained from a business partnership initiative”.
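That correlation-of-indicators idea can be made concrete with a simple sketch.  All the figures below are invented for illustration (not from the article or any real program); the point is only that a behavioural indicator trending alongside a business metric is evidence you can put in the chain:

```python
# Hedged sketch: does a behavioural 'indicator' trend with a business outcome?
# Figures are invented for illustration only.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# e.g. monthly % of staff applying a new checklist vs. errors per 1,000 cases
indicator = [20, 35, 50, 65, 80]   # behaviour adoption (%)
outcome = [12, 10, 9, 7, 5]        # process errors

r = pearson(indicator, outcome)    # strongly negative: errors fall as adoption rises
```

A strong correlation like this does not prove causation – which is exactly the point of the chain-of-evidence framing – but stacked alongside the other levels it builds a persuasive case.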

What I really liked about this is the clarity of the evidence/steps up the ladder/mountain, from bottom to top:

  1. Individual Outcomes
  2. Team Outcomes
  3. Departmental/Divisional/Unit Outcomes
  4. Customer/Client Response
  5. Organizational Outcomes
  6. Customer Satisfaction
  7. Market/Industry Response

It is this breakdown of the benefit chain that I will likely adopt in my existing project planning documents.

Let L&D then be clear, as the KirkP course made clear: stop trying to demonstrate impact through limited/impossible direct causation and instead look for correlations.  I would say this is a much better approach than simply seeing “measuring learning impact as a waste of time and money”, which the LEO article mentions many people argue.

Therefore, I would argue, let us (i.e. L&D professionals) not worry about learning analytics but instead organizational analytics (i.e. goals and metrics) that can be seen as trending over time and aim to see where our investments have an impact.   As recommended in the KirkP programme, do not reinvent the wheel, borrow metrics from elsewhere as they will already be used by stakeholders and those same stakeholders (should) understand the logic.  This should then allow us to, as I’ve hinted at before, not worry about ROI but instead (as recommended by KirkP) Return on Expectations.

So what do I think of KirkP following the course and certification?  Well I’d have to agree with the LEO suggestions:

  1. It’s better than doing nothing… which is what most organisations are doing.

  2. Think about what level of evidence will be good enough for you. As the Kirkpatricks have pointed out, a chain of evidence that would not stand up to scientific rigour may be enough to convict someone in a court of law. If a level of evidence is good enough for a court then it’s probably good enough to convince your board to invest in L&D.

  3. Good enough develops into great.