Reflecting on: “Here’s why you’re failing to create a learning culture”

Another great article from Laura Overton and the Towards Maturity team got me thinking this week.  The article considers “five common mistakes” that can stifle a learning culture.

Below are some of my reflections on these points – both from my own experience and what I’ve read, seen at shows, conferences, etc.

  1. You don’t trust staff to manage their own learning
    • I totally agree that everyone needs to own their part in continuous improvement and the part learning plays in that.
    • We are doing a lot to empower managers to coach and facilitate their team’s development.  The challenges I see are twofold:
      1. People feel they are too busy to take this on, although I tend to feel they are ‘doing this already’ without perhaps realising it; and
      2. How can that ‘day to day’ learning be amplified?  Amplification across silos is a particular challenge.
    • The “trust” point is an interesting one, as I wonder how many L&D organisations are happy to trust individuals on what they need (with the risk of veering into solution-centric models rather than analysing issues) but not on how money is spent.  In some ways this is fair, as it is where L&D has a governance role to play – consistency, economies of scale and consistent outcomes, with controlled pilots/innovation, etc.  However, there is the risk of being a blocker…
  2. You are stifling staff contribution
    • “91% of learners like being able to learn at their own pace and they are more than capable of searching for the information they need” – my experience would suggest people generally struggle to search and retrieve (information skills are limited and overload is a problem).  This is where information systems are key: L&D needs to be embedded with comms and KM, architecture is all-important, and a lot depends on what is already in place for an online internal profile – for example, ESNs.
    • I would, though, agree with the main points: it is all-important to get people to share what they find, and user-generated content is part of this – so too is getting people to feed back after external training or conferences.  The latter have been known issues for a long time and, I presume, remain issues in most organisations from what I have seen and heard.
    • Perhaps the issue here is with “personal development planning” and career development more generally.  Yes, it is a personal journey and one which will become more personalised via analytics, customisation and technology like Filtered.  However, the fundamental reason an organisation wants to invest in you (be it direct funding or just funding your time away from work) is to see a performance improvement now or in the future (see Degreed for a definition of learning culture) – so do we drop the “personal” to stress that it is a co-investment?  We could say “performance improvement plan”, but that sounds rather draconian, as if people are on their “final warning”.  Anyone out there got a better name?  Perhaps “plans” just needs to be dropped altogether for ongoing small-scale development?  Then what about required accreditations (which are not going away any time soon)?  There are lots of issues here for the workplace in general, beyond L&D departments – for example, how do you budget for these more flexible requirements?
  3. Your content is inaccessible
    • Yep, a real problem with the traditional model of hiding things from search via SCORM, etc.  This ties in with some of what I’ve written under ethos about trying to change L&D to an open web approach – do we really need to hide behind logins?  Often it’s about having everything in one place, but that is, in part, due to poor architecture and a lack of hyperlinking.
    • There remains, to me, a question over how much of the best content is inaccessible.  Yes, the open web hosts enough to get by on most topics, but do we still need to license larger libraries from vendors – nice solutions like getAbstract?  I would say yes, even if many publishers have gone to the wall in the digital age.  The challenge then remains what it has been for probably 20 years or more – federated search across multiple resources.
  4. You take learning away from work
    1. Again, my ethos page stresses the need to consider learning as work and work as learning.  I ran a session last week for people in my organisation who have formal “learning” responsibilities in their roles.  The interesting outcome of the session – which was the first such event and therefore deliberately navel-gazing about how we work (via me picking various articles and thought pieces from Jane Hart, Donald Taylor, Saffron Interactive and others) – was our consideration of where we are on some of these spectrums.  Effectively a benchmarking reflection exercise for the wider group.  I still doubt many organisations are actively giving people such time to reflect on external learning and bring it back in a productive way to influence behaviour.
    2. The growing importance given to apprenticeships in the UK is in some ways reinforcing problems here, but it is also targeting learning at workplace performance.  It remains to be seen if the government’s approach with the Levy can survive the Brexit fallout and other challenges.
  5. You don’t reward learning
    1. Agreed, this can be a major problem.  I’ve previously left organisations frustrated at a lack of opportunities to make use of my skills, and I suspect many others have had this problem.  I recently spoke to a colleague who had even been through a formal development programme only to have no role to go into at the end – again, apprenticeships should help here, with the formal development options leading to rewards.
    2. Sharing success can be driven via internal comms channels, and we’re also using a combination of Open Badges and competency models to drive recognition.

Overall, some really interesting points to reflect on and try to tackle going forward!

[Image: “Learning Technologies in the Workplace” book by Donald H Taylor]

Thoughts on: “Learning Technologies in the Workplace”

So, after picking up a copy a while back, I’ve now had a skim through Donald Taylor’s book and thought I would capture a few thoughts here:

  1. I really like that it goes back to the origins of some of our key concepts (e.g. eLearning and technologies).  No doubt due to my background studying history, I have a soft spot for books that consider historical perspectives.
  2. It does a nice job of linking those historical issues to the current state of play, with recommendations from and for the usual suspects: Jennings, Harrison, etc.
  3. It feels like the kind of book that could become somewhat seminal – the kind of history/good-practice balance that often acts as an entry point for people coming into an industry (or, in this case, HR generalists up-skilling in this area).  What makes me say this are various, perhaps unintentional, attempts to establish standards – such as a move for the use of ‘e-learning’ over ‘eLearning’ and other variations.  I know that example is basic semantics, but it is indicative of the industry that such things have never really been agreed – I’ve certainly tended to always use eLearning.  A lot of Don’s webinars/presentations around the book’s launch have stressed that this kind of text has never really been done before for learning tech, and the question for him in authoring it was really “why not?”.  My view would be that it has just been presumed you can pick up bits and pieces from conferences, blogs, etc. rather than needing a ‘go to’ text.  I am certainly going to treat it as such and pass my copy around my team!
  4. The book adopts the approach of Clive Shepherd in using e-learning as the generic term, under which it includes the traditional self-study model but also virtual classrooms, social tools, etc.  Personally I prefer ‘online’ or ‘digital’ as the umbrella, under which ‘click next’ style content is what we call ‘e-learning’.  Again it is semantics, but you do often get misunderstandings if you are not explicit – for example, a static PPT file is IMO a resource (or ‘piece of content’), not eLearning [oops, there I go again].
  5. The book also makes the point that much of “learning” technology is really about being inventive with workplace and commercial tech.  This includes categorisations such as those in the below image.  Personally this is an area that has always interested me – the scope to be more productive and innovative with tools beyond their initial design, while avoiding what the book refers to as “magazine management” (i.e. just running with the latest ideas without proper analysis).
  6. [Image: the book’s categorisation of learning, workplace and commercial technologies]
  7. The introduced APPA model (an aim; a people focus; a wide perspective; a pragmatic attitude) is sensible and gives a structure to the case studies and main arguments.  Obviously there are lots of other ways you could classify successful projects, but it’s a useful mnemonic.
  8. There are a few big jumps in logic – for example, the section on justifying the aim/evaluating through metrics is strong, but then there is this statement: “If the Hawthorne effect can be accounted for, and if we have historical data and a control group, then ROI should be calculable in instances where we can calculate the value of the work of the employees concerned”.  The statement is fine, but easier said than done for many people!  The chapter goes on to a pet peeve of mine – people using “ROI” as a term when it’s not actually ROI – and it’s good to see Don call this out (see the rough worked sketch after this list).
  9. The classification of typical aims very nicely simplifies the complexities of different (particularly LMS) projects as being one of the below (with appropriate metrics):
    1. “Organisational infrastructure – Effective business-as-usual, risk avoidance, compliance
    2. More efficient L&D delivery – Cost savings in L&D, reduced admin, faster delivery
    3. More effective learning – Faster time to competence, better retention
    4. Part of organizational change – Defined by the sponsors of change”
  10. The above is a really nice way to consider whether the LMS is more than compliance (point 1) through to fulfilling options such as being THE social platform in an organisation (an example of cross-organisation change like point 4).
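
On the ROI point in item 8 above, a quick worked sketch may help show why so many reported “ROI” figures are not really ROI.  This uses the standard return-on-investment formula with entirely made-up numbers, purely for illustration (none of these figures come from the book):

    # Hypothetical figures for illustration only - not from the book
    programme_cost = 40000      # design, delivery, licences and learner time
    monetised_benefit = 70000   # e.g. value of a productivity gain vs a control group

    net_benefit = monetised_benefit - programme_cost
    roi_percent = (net_benefit / programme_cost) * 100
    print(f"ROI = {roi_percent:.0f}%")  # ROI = 75%

The hard part, as the quoted statement implies, is the benefit line: unless you can put a credible monetary value on the change (relative to historical data or a control group), whatever you are reporting – completions, satisfaction scores or raw activity stats – is not ROI.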

To summarize, the book reads a little like a “greatest hits” album – a compilation of my 10-ish years of going to Learning Technologies shows and LSG webinars – with Don calling on his experience as chair to mention players like Jane Hart and their contributions to the industry (such as her top tools for learning), as well as key concepts towards good practice.

Overall, it is a great primer on development within and of organisations, covering introductions to Performance Consulting, Agile, network analysis and more – not just learning tech [which of course is the point – learning tech cannot survive if it just acts in a ‘learning bubble’].  I also attended his session at the Summer Forum and will post my notes on that one soon.  There is even more from Don on topics around the book in this podcast.

CGS Corporate Learning Trends, Observations, & Prediction 2017

Notes:

  1. A decent report based on a survey of L&D leaders.
  2. The headline takeaway is that digital has truly arrived, with increased use of video, mobile, social and micro formats.  Perhaps more interesting is the strong intention to use “instructors” the same amount (c.65%) or more (10%), suggesting a general increase in the mix/blend rather than wholesale shifts.
  3. The “greatest challenge” was budget (surprise surprise) with 47% reporting this issue.
  4. When considering the metrics to justify that budget, “employee engagement” was identified as the “most important”.  I wonder how much this is due to the fact that employee engagement surveys are relatively easy?  The KP programme would argue for combining different metrics and, indeed, the catch-all of “business metrics” was a close second to EE in the survey response list.  A separate quote from Jack Welch reverts to three measures for overall company performance: employee engagement, customer satisfaction and cash flow.
  5. I had more issues with the question of “So how do L&D professionals get employees excited about learning?” being answered “by giving them what they want”.  The report here is talking about “speed, efficiency, relevance and usability”, which is all well and good, but only if it actually helps improve their performance.  We can give wonderful ‘development’ experiences that people want, yet that are completely irrelevant and fail to stop someone being made redundant a few weeks down the line.
  6. There’s a run-out for the old quote (included below).  Lots of issues with this I’d say, including:
    1. retraining might be to solve original failures,
    2. the upgrade needs a clear as-is/to-be message,
    3. development in this context = performance and outcomes?

“People need maintenance and upgrades even more than machines do.  Retraining is maintenance.  Training is an upgrade.  Development is the next generation model.”

Some more on what learning needs to pick up from gaming

So another post on the lessons from the world of gaming.

This one was sparked by an article considering if the latest Legend of Zelda game is the greatest ever in terms of design.  I’ve spent quite a bit of time already in this iteration of the world of Hyrule and it is difficult to disagree with the arguments in the article.

The closing paragraph should particularly resonate with learning professionals thinking about how to support their organisations:

the job of the designers is not to hold your hand and guide you around a set path. It is [to] reach out hundreds of hands and leave it up to you which you grab first.

Wow! There’s a topic starter for instructional/learning design debate!

Whilst in the past people may have talked about things like “learning styles” to warrant different approaches, we are now, instead, in a position where we consider the different approaches we might use to drive performance and support learning for people at different starting points and existing levels of competence.

Now, the counter-argument would be that the multiple, even unlimited, permutations of many games are not feasible in instructional design.  Instead we end up with versions of relatively simple board-game constructs when gamifying, or with fairly restricted ‘serious games’.  However, with dynamic, algorithmically driven learning there is the potential for an explosion in personalisation.
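
To make that slightly more concrete, here is a deliberately simple, hypothetical sketch (in Python, with an invented catalogue and scoring scale) of what algorithmically driven selection could look like – offering the next resource based on a learner’s estimated competence rather than marching everyone down the same linear path:

    # Hypothetical catalogue: (title, difficulty on a 0-1 scale, topic)
    CATALOGUE = [
        ("Budgeting basics", 0.2, "finance"),
        ("Variance analysis walkthrough", 0.5, "finance"),
        ("Scenario-modelling stretch task", 0.8, "finance"),
    ]

    def next_resource(competence, topic):
        """Suggest the easiest item that still stretches the learner -
        a crude stand-in for adaptive sequencing."""
        candidates = [item for item in CATALOGUE
                      if item[2] == topic and item[1] > competence]
        return min(candidates, key=lambda item: item[1]) if candidates else None

    print(next_resource(0.4, "finance"))
    # -> ('Variance analysis walkthrough', 0.5, 'finance')

A real system would of course update the competence estimate from evidence (assessments, manager input, activity data) and weigh far more than difficulty, but even this toy version hints at how the ‘hundreds of hands’ idea could translate into branching, learner-specific paths rather than one set route.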

[Image: a Total War battle, playing as the Byzantine Empire]

Winning a battle with only your general left may not be recommended. But a win is a win.

Now, the above image is an example of a counterfactual gaming experience – crusading as the Byzantine Empire.  Traditional L&D has of course made use of just such counterfactuals, through role plays, business modelling, simulations, etc.  If you can create an appropriate model then the variations are endless – with different focuses across, say, finance, marketing, etc. – all in the ‘safe’ environment of not impacting actual bottom lines, patients, customers, etc.

By thinking through game constructs there is the potential to think about what you want to achieve in a different way.  For example, the battle-focused historical counterfactual (such as the Total War games in the above image) and the more character-focused grand strategy of Crusader Kings 2 (images below) effectively give you the same goal (rebuilding the empire) but through very different experiences.

[Image: Crusader Kings 2 map of a restored and renamed Roman Empire]

Expanding (and renaming) the Byzantine Empire across c.100 years (of game time)

The storytelling in a scenario such as the above is prompted by certain actions (for example, Byzantium becoming large enough to reclaim the title of “Rome” as an achievement) but is not as structured as, say, a linear first-person-shooter game like Call of Duty.  The latter, more linear style offers up the potential for set storytelling, with some games much better at this than others.  This leads to an argument that future instructional designers would be best sourced from graphic communication or creative writing backgrounds.

Traditionally, simulation has of course taken many forms in workplace development – from tabletop games to computer scenarios.  The challenge with simulations remains the balance between ‘keeping it real’ (i.e. actually useful in the workplace environment) and maintaining interest through storytelling, fun and other components.  Meanwhile, this post makes good points about balancing complexity versus needing to know ‘now’.

So what to take away?

  1. Think about how much hand holding is appropriate – it’s not always a bad thing.
  2. Have the plot/narrative/story drive motivation.
  3. Reward with hidden achievements.
  4. Use users/learners to determine if you are hitting the right balance between reality and gaming elements.

CIPD L&D Show May 2017

Not too much to report from this year’s show, as I left after only about four hours and a couple of the learning sessions.  However, I did have some good chats with people on the floor, and it was pretty clear what the main growth area was from last year – apprenticeship providers, levy support, etc.

I left early partly because I was full of cold, but also because the first session I sat through, from the Open University, had a narrower focus than their sessions in the past and did not really answer the topic (“How to Make Your Digital Learning More Engaging”).  A nice cartoon fell into the trap of many such eLearning examples – production values that most of us cannot repeat.  Yes, the concepts are true – humour, animation, bite-size, storytelling, etc. – but it worked as an example because of the quality: most of us are not going to get David Mitchell to do voice-overs on our eLearning.  It then reverted to being largely a pitch for making use of their content platforms – OpenLearn and FutureLearn – i.e. use their digital learning, not your own.  Which is an argument you could make, of course.

The second session I attended was from Bolt From the You Ltd, on “Making it Stick – Turning learning into real change”.  This was better and considered why change does not happen: lack of engagement (which needs to be fixed by linking to the individual and thinking about how to measure for impact) and lack of learning culture (which needs ownership and partnership from sponsors to drive change through).  To close the gap:

Attention density:

Exposure + repetition + application = permanent change.