Some more on what learning needs to pick up from gaming

So another post on the lessons from the world of gaming.

This one was sparked by an article considering whether the latest Legend of Zelda game is the greatest ever in terms of design.  I've spent quite a bit of time already in this iteration of the world of Hyrule and it is difficult to disagree with the article's arguments.

The closing paragraph should particularly resonate with learning professionals thinking about how to support their organisations:

the job of the designers is not to hold your hand and guide you around a set path. It is [to] reach out hundreds of hands and leave it up to you which you grab first.

Wow! There’s a topic starter for instructional/learning design debate!

Whilst in the past people may have invoked things like "learning styles" to justify different approaches, we are now in a position to consider the different approaches through which we might drive performance and support learning for people with different starting points and existing levels of competence.

Now the counter-argument would be that the multiple, even unlimited, permutations of many games are not feasible in instructional design.  Instead we tend to end up either with versions of relatively simple board-game constructs, when games are used at all, or with fairly restricted 'serious games'.  However, with dynamic, algorithmically driven learning there is the potential for an explosion in personalisation.
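
As a purely illustrative sketch of that kind of personalisation (the activity names, topics and competence thresholds below are all invented for the example, not taken from any real system), the "hundreds of hands" idea can be as simple as offering next activities based on a learner's current competence rather than marching everyone down one set path:

```python
# Illustrative sketch only: a toy "many hands" selector that offers the next
# learning activities based on a learner's current competence per topic.
# Activity names and thresholds are invented for the example.

from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    topic: str
    min_competence: float  # only offer once the learner is ready
    max_competence: float  # stop offering once it is too easy

CATALOGUE = [
    Activity("Guided walkthrough", "negotiation", 0.0, 0.4),
    Activity("Branching scenario", "negotiation", 0.3, 0.8),
    Activity("Live role play with peer feedback", "negotiation", 0.7, 1.0),
]

def next_activities(competence: dict[str, float]) -> list[Activity]:
    """Return every activity the learner could sensibly 'grab' next."""
    return [
        a for a in CATALOGUE
        if a.min_competence <= competence.get(a.topic, 0.0) <= a.max_competence
    ]

# A learner part-way along sees more than one hand reaching out.
print([a.name for a in next_activities({"negotiation": 0.35})])
```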

[Image: battle results screen from a Total War game]

Winning a battle with only your general left may not be recommended. But a win is a win.

Now the above image is an example of a counterfactual gaming experience, crusading as the Byzantine Empire.  Traditional L&D has of course made use of just such counterfactuals, through role plays, business modeling, simulations, etc.  If you can create an appropriate model then countless variations are possible – with different focuses across, say, finance, marketing, etc. – all in the 'safe' environment of not impacting actual bottom lines, patients, customers, etc.

By thinking through game constructs there is the potential to think about what you want to achieve in a different way.  For example, the battle-focused historic counterfactual (such as the Total War games in the above image) and the more character-focused approach of a grand strategy title such as Crusader Kings 2 (images below) effectively give you the same goal (rebuilding the empire) but through very different experiences.

[Image: Crusader Kings 2 map of the expanded empire, renamed "Rome"]

Expanding (and renaming) the Byzantine Empire across c.100 years (of game time)

The storytelling in a scenario such as the above is prompted by certain actions (for example Byzantium becoming large enough to reclaim the title of "Rome" as an achievement) but is not as structured as, say, a linear first-person-shooter game like Call of Duty.  The latter, more linear, style offers up the potential for set storytelling, with some games much better at this than others.  This leads to an argument that future instructional designers would be best sourced from graphic communication or creative writing backgrounds.
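
To make the "prompted by certain actions" point concrete, here is a minimal, hypothetical sketch of an achievement-style trigger; the state key and the threshold of 100 provinces are invented stand-ins for "Byzantium becomes large enough to reclaim the title", but the pattern (check a condition, fire a one-off reward) is the generic one such games use:

```python
# Minimal sketch of an achievement trigger: a one-off reward that fires when
# the game/learning state crosses a condition. The numbers are made up.

achievements = {
    "Reclaim Rome": {
        "condition": lambda state: state["provinces_held"] >= 100,
        "unlocked": False,
    },
}

def update_achievements(state: dict) -> list[str]:
    """Return any achievements newly unlocked by the current state."""
    newly_unlocked = []
    for name, achievement in achievements.items():
        if not achievement["unlocked"] and achievement["condition"](state):
            achievement["unlocked"] = True
            newly_unlocked.append(name)
    return newly_unlocked

print(update_achievements({"provinces_held": 102}))  # ['Reclaim Rome']
print(update_achievements({"provinces_held": 150}))  # [] - it only fires once
```

The same shape works for the "hidden achievements" takeaway below: the learner never sees the condition, only the reward when it fires.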

Traditionally simulation has, of course, taken many forms in workplace development – from tabletop games to computer scenarios.  The challenge with simulations remains the balance between 'keeping it real' (i.e. actually useful in the workplace environment) and maintaining interest through storytelling, fun and other components.  Meanwhile this post makes good points about balancing complexity versus needing to know 'now'.

So what to take away?

  1. Think about how much hand holding is appropriate – it’s not always a bad thing.
  2. Have the plot/narrative/story drive motivation.
  3. Reward with hidden achievements.
  4. Use users/learners to determine if you are hitting the right balance between reality and gaming elements.

The (Work)Force Awakens

There has been a lot of interest recently in the importance of engagement in the workplace.  My view would be that this is not as generational as some commentators would believe and that it has to be looked at as part of the bigger picture.

Emergent trends such as the rise of holacracy, and apparent disappointment with it, can be seen as part of a growth in thinking, again, about the nature of work – even if it is easy to see holacracy, itself, as the latest management fad.

The rise of the ‘manager class’, seen in many fields (including Chinese Higher Education), seems to be slowing through association with unnecessary bureaucracy.  Therefore, we are left with valid questions about what the alternatives may be.

Some politicians would have you believe that workers are no longer exploited; the argument from many quarters would no doubt be that, without some kind of partnership model for all staff, there remains inequality and a lack of engagement.

If we consider organizational knowledge management, at least in the format it has emerged around SharePoint solutions, as reinforcing silos in organizations through endless permission setting, then the 'circles' of holacracy and alternative structures offer an appealing alternative.  Indeed, if we consider the future to be that of 'learner workers', not 'knowledge workers', then we can perhaps go so far as to say the individual finally moves to a position of prominence beyond any kind of team structure.

There would be additional options here: data can now be gathered and presented in so many ways that an appeal by the workforce for more engaging workplaces and better representation will likely come at the cost of closer (and often automatic) scrutiny.

This is all in an environment where the 'war for talent' might be hotting up, with demand outstripping labor supply in some markets.  In the UK at least this will likely result in a further brain drain from public sector austerity, and then more finger pointing when public projects come in over budget, run late and rely on seemingly never-ending streams of temporary staff (from high-end consultants to the large volume of agency nurses plugging NHS staffing gaps).

There are plenty of suggestions for ways to engage the workforce, such as opening the books so that people better understand their influence on the bottom line.  The challenge is that many options come back, again, to the ownership model and whether that supposed end to exploitation sees a future of joint ownership rather than one of zero-hours contracts, freelancing and uncertainty.

This all obviously has huge implications for any local learning and for how fit for purpose models such as the PLC will be going forward.  L&D can play its part, but the post-recession awakening in high-demand jobs is only likely to lead to your people following the dark side (of more money at your competitors) if you cannot fundamentally consider them as equals.

Can there be ‘original thought’ in the era of the knowledge-age organization?

I think I have only ever applied for one temporary 'professional' role.  My logic is normally that, with constraints such as a mortgage, I would not want to risk a period of unemployment.  However, this particular role sounded fantastic so I thought I would apply.  I was pleasantly surprised to be offered an interview even though I did not meet one of the key 'essential' criteria of the person specification.  When interview day came I, for some reason, developed horrendous hiccups and generally did not do very well.

Anyway, one particularly awkward point was where I started describing past achievements and relating them to some of the prevalent best-practice theory in the discipline (eLearning).  Now I think I might have come across as suggesting that I (or rather my employer of the time and team) were ahead of the game.  At one point, I think, I even suggested it was a little 'chicken and egg', in that practice and theory become so intertwined that it is difficult to remember what came first – theory, you implementing an idea, recognized best practice, etc.  At best I suggested I was an original thinker and innovator, but without really backing it up as a reflective practitioner should be able to; at worst I appeared egomaniacal, claiming 'I was first' to do various things.

Whilst I did not deal with it very well on that interview day, I would now suggest that 100% 'original thinking' is incredibly difficult in our networked world.  In other words, we are products of our environments, and if one has a particularly active personal learning ecosystem the 'original' source of an idea is difficult to track.  The challenge then should be to ensure you 'stand on the shoulders of giants' rather than reinventing the wheel.  Whilst the Internet has accelerated the growth and sharing of ideas, contributing to a world that is rapidly changing [I wouldn't agree with all of this video but it is at least useful for seeing the prevailing mood], it also means that you can quickly appear out of date or seem to be merely rehashing the work of others.  This has been particularly highlighted in the last week or so…

  • This article on big data in Higher Education, for example, makes a number of valid points but few are original.  Where it mentions work being done to track student achievement by their library use, many in HE will already be familiar with examples of such work.  Indeed, some institutions already track devices (certainly those of guests) connecting to their networks, and LMS/VLE data has been (or should have been) used in the ways mentioned in the article.  Perhaps the issue the author really alludes to is the potential value in linking data and (I would argue) warehousing data from multiple institutions to see bigger trends – a toy illustration of that kind of linking follows after this list.  Indeed this cross-pollination would help improve the data usage, for organisational effectiveness purposes, mentioned in the article.
  • In the L&D space, this week I watched the recording of a recent LSG webinar from Jane Hart.  Now I have followed Jane online for years, which makes it tricky to pick out how much she is confirming my hunches and ways of doing things as opposed to leading my thinking with original ideas, but this recording really hit home.  Whilst I agree it is fair to say 'let's not kid ourselves, people are not going to adopt all of this' (a point approximately made in the presentation), I have to feel that in an office/knowledge/people-based business there has to be a much smarter coming together of learning, sharing, collaboration, knowledge/information/resource management, etc. in the kind of ways Jane mentions.  I tweeted at an event earlier in the year that Salesforce-centric employees always seemed to be the example given of where some of this works, but surely there are other leaders out there implementing appropriate organisational development(s)?
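
As a hedged illustration of the 'linking and warehousing data' point in the first bullet above, here is a toy sketch of joining library and LMS/VLE activity once the extracts sit in one place.  The file names, columns and the crude 'low engagement' rule are entirely invented for the example, and any real learning-analytics work needs far more care around validity, ethics and consent:

```python
# Illustrative sketch: joining per-student library and LMS/VLE activity from
# hypothetical institutional extracts to surface simple engagement trends.
# File names, columns and the flag rule are invented for the example.

import pandas as pd

library = pd.read_csv("library_loans.csv")   # columns: student_id, loans_last_term
vle = pd.read_csv("vle_activity.csv")        # columns: student_id, logins_last_term

combined = library.merge(vle, on="student_id", how="outer").fillna(0)

# A deliberately crude flag - a real model would be validated, not hard-coded.
combined["low_engagement"] = (
    (combined["loans_last_term"] < 2) & (combined["logins_last_term"] < 5)
)

print(combined.sort_values("logins_last_term").head())
```

Warehousing the same extracts from multiple institutions would, in this toy framing, simply mean stacking per-institution tables with an extra institution column, which is where the bigger cross-institutional trends the article hints at start to become visible.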

I would argue that joined-up systems and data are one thing but, realistically, you need an enterprise where learning is fully embedded culturally.  Here is where education organizations have an advantage, as learning is their mission, and they should also be able to use the LMS/VLE as their organizational platform; alas, I would imagine too many break that shared social hub by using separate Intranets, etc.  Yes, there remain specialist functions that need certain software (arguably a Library Management System would be an example), but to be an employee of a collaborative organization that shares, reflects, learns and adapts as one, I really do feel we need to move away from breaking things into silos of learning, knowledge, resource, etc. management.

Perhaps it is my own environment and 'professional genetics' of training and beliefs that send me down the above road, but surely the above should be the case and I am not straying into 'original thinking'.  However, when you see so many project management, L&D, learning technology and other advertised posts which are clearly based around old models, it does make me wonder.

Capitalism 4.0

Anatole Kaletsky’s 2010 book has a question that I had not really thought about before – when did the 21st Century start?

1815 and 1918 are the dates that, as a historian, you often associate with the real start of the previous two centuries. Kaletsky considers the key dates for the 21st.

1989 is identified, with the collapse of the Berlin Wall and the arrival of the WWW being two of five "major transformations".

However, the era of “Capitalism 3.0” ended in 2008 (with the collapse of Lehmans) and thus the 21st century began.

For the record:
– Capitalism 1.0 = laissez faire (1776-1932)
– Capitalism 2.0 = state involvement (1931-1980)
– Capitalism 3.0 = Thatcher-Reagan led (1979-2008)

As a history graduate I like to try and step back from issues and consider these trends. In particular, the point is made that it was not just cheap credit that caused the 2008 problems; it was a wider self-destruction of "market fundamentalism", with growth driven by 30m communist souls opened up to western goods.

The argument is that we now face a period of balance between state and free-market.

So will we look back and consider a banking crash to be the great apocalyptic moment of our generation – when we moved into a new century of new concepts? Perhaps – it's something to keep in mind going forward though.