As part of my job hunt (new role starts soon!) I have been going back through various old notes, resources, etc. This is partly because the role will see me using knowledge and skills that I have not exercised for a couple of years or more, so I have been giving myself a bit of a refresher.
One thing I came across was my MSc dissertation’s initial literature review, methodology and themed findings. In hindsight an interesting bit of this was the attempt to define instructional design as a baseline to what my dissertation was focusing on (copied below). The emphasis (that I have added to the original text in the final paragraph) is to show an interesting conclusion I made – a few years before the mainstream conversational shift to LXP platforms and the multiplication of “learning experience designers”.
Towards a definition of instructional design
ID is a term that can be seen as having various meanings, having been associated with “process…discipline…science…[as well as practical] reality” and sometimes synonymous with “instructional system[s]…technology…[and/or] development” (Berger and Kam, 1996). This complexity has contributed towards there arguably having been a “poor research basis…[for] a lot of instructional design practice” (Elen and Clarebout, 2001, p.4). Indeed the ADDIE model, often seen as the “typical” ID process (Anagnostopoulo, 2002), has been criticised for its lack of evidence-base, especially when used outside of higher education (Ruark, 2008). ADDIE’s pervasiveness extends beyond practice to use as the template for ID literature, including in the chapters of Armstrong ed. (2004). Thus, ADDIE can be seen as the “conventional core” from which other models are built (Tan, 2010). However, whilst ADDIE is often central to ID practice it should not be seen as synonymous with ID as a discipline, instead simply as one of the systematic tools and processes in the designers’ toolbox.
The works of Charles M. Reigeluth remain key texts for instructional designers, such as Cammy Bean, in defining what their profession relates to, including the demarcation of curriculum design and ID (Bean, 2010). Reigeluth's definitions, of 'curriculum' as "what to teach" as opposed to 'instruction' that is "concerned primarily with how to teach", are important in establishing what the discipline of ID focuses upon (Reigeluth, 1983, p.6). This focus on teaching has been seen as one reason why 'ID' is a term that has not appealed universally outside of the United States, where it "was conceived during the period when the behaviorist paradigm was dominant in American psychology" (Molenda, 1994, p.3) and is closely associated with the "objectivist tradition" (Duffy and Jonassen, 1992, p.2). More recent shifts from behaviourist models in education to constructivist models can be seen as having encouraged the use of 'learning design', and other terminology, although some ID theorists would see constructivism as simply part of the "wide variety of instructional-design theories" that influence instructional designers in their work (Molenda, Reigeluth, and Nelson, 2001, p.5). Indeed Reigeluth identified many of the criticisms of traditional ID definitions and advocated changes in educational practice, which have since become mainstream. The changes called for included a need for a "'learning-focused' paradigm" in which "instruction must be defined more broadly as anything that is done to facilitate purposeful learning" (Reigeluth, 1999, pp.19-20). Thus we can take instructional design as being the discipline of developing valid learning experiences following systematic theories and models, in the way that has been "accepted in business and industry" and is increasingly acknowledged in education (Gustafson and Branch, 2002, p.23).
This plays into some of the problems in society too – for example, paternity leave still being seen as odd by some, or those who have had time out to care for loved ones struggling to get back into the labour market. We have messy lives, and a current role will rarely be a perfect description of what you are about – even full histories on LinkedIn and CVs struggle to do this.
Now in some areas it may be correct that you are judged on your latest role, for example you would hope the Thomas Cook management team will be judged on that company’s failure and any past success is not used to self promote into other top paying jobs (thus ignoring their role in a brand’s collapse). However, as we’ve seen from most of the senior figures involved in the 2008 crash, mud often does not stick to those at the top.
I've recently finished the back catalogue of the L&D Podcast. The first episode, an interview between the host (David James) and Nigel Paine, resonated a lot with me. There were elements of Learning Reducer concepts at play here, but on a personal level Nigel's explanation that he "knows a lot on many things" [possibly sic] feels largely where I am. I have been keen to develop skills and experience in a broad way, in fields where I feel that once you have done something once or twice, you have done it and can move on to learn other things.
As Nigel mentions, he could have specialized and so could most of us, especially in the learning industry.
I could, at different points in my career, have specialized by spending more hours on things ranging from delivering information skills (which I've actually picked back up just recently) to developing eLearning modules (which I've picked up a bit recently too), LMS management, LCMS management, classroom facilitation, virtual classroom facilitation, etc. I fully recognise that there are instructional designers, eLearning developers, etc. who will be super specialist in such areas or in a particular tool (say the Articulate Suite) where I would say I am 'competent'. However, I can assist organisations across a broad range of learning (or Learning and Development) related areas.
Overall I feel I have a good spread of skills when one considers things such as the LPI Capability Map, and leveraging my background in libraries and information means I tend to come at learning issues from a different perspective. To simplify, my default position is often for social, collaborative learning built around curated resources. However, I am also aware that what the 'social' within this means will change and increasingly be driven by AI. In other words, my default position is not for anything that resembles a sage-on-the-stage 'classroom'. I would rather see people discussing this mentality piece in learning than saying we need a "developer", "facilitator" or other narrow skill set.
There is also a related point that I've complained about before: you can register for webinars for your own personal development but cannot access them with a personal email address (such as outlook.com, Gmail, Yahoo, etc.). Here vendors are being short-sighted, serving their current sales pipeline rather than long(er)-term branding and customer development. Indeed free webinars may well be a good recruitment tool for them – for example, I would happily say I would love a job at Mercer, Brightwave and many other companies due to the quality of their work that comes out at free conferences and webinars.
So, when I introduce myself at an event or in the pub, what do I describe myself as? Good question. I struggle, and I'd happily accept suggestions from readers! One thing is certain: it would not be restricted to any of my previous jobs – we are more than the sum of our parts. Overall, let's not pigeonhole people and let's not forget the power of experience and the "whole person".
FINAL UPDATE MARCH 19th 2020: obviously this is a pretty mammoth task now that more US and UK organisations have got involved (and the increase in HEIs closing). Overall, I would recommend looking at what you are due to cover from a learning perspective and working out best approaches from there. Obviously tools should come after topics/tasks/outcomes. Here’s hoping that the digital learning world gets some credit out of this and continues to evolve.
Yes, I know this would be better as a shared Google Doc, Google Sheet or Wiki but I was trying to avoid false advertising from opening up the editing to those pushing products.
Note these are where there seem to be clear attempts at offering longer free trials than normal or specific, short-term, free upgrades/accounts. This goes over and above 'always free' tools such as ft.com (via their schools programme) and YouTube (free to all). I will try to add to this over time:
If you do not know Clint Clarkson's work I would recommend his podcast. One of the features of his pod/YouTube is the #SundayRants, where he lets rip on the (e)learning industry. I have found some of these to be funny, others quite familiar from what I have seen in my 15-ish years in the industry, but also some where he plays to long-standing tropes that (I would hope) are somewhat out of date. Overall, they are 'close to the bone' criticisms, many of which industry pros will have heard before at events like Learning Technologies, the Learning and Skills Group webinars, etc.
On his pod feed was a link to a recent, five-minute video. For the rest of this post I am going to deconstruct that video, titled "We suck at training" (link/embed below):
Claim 1: We have dehumanised learning
I think this is less about learning and more about the dehumanisation of big business. Humans are increasingly a cost to be justified in many industries, with growing options for offshoring or automating even traditionally 'professional' roles. This is the context that led to the CIPD show back in 2015 calling for a more 'humane' HR experience. As L&D often makes up just part of the 'People'/HR offering it can be lost within that bigger picture, not least if the focus is primarily on compliance reporting. Thus the call here, for me, is to re-personalise our organisations overall – ESNs, Microsoft Teams, Skype for Business, etc. have repaired some of the damage done by digitisation, but forcing people through automated recruitment processes and then minimal human contact with distant leaders needs to stop. Advice from an old colleague comes to mind – most business problems would be solved if "people stopped acting like d*cks".
Claim 2: “For the most part, learning doesn’t lead to better organisational performance”
It is noticeable that the slide shown in the video does not have this claim in quotation marks, the nearest I could find in the quoted source (Harvard Business Review) were a couple of different articles, neither of which I would necessarily disagree with:
(Clint does have the specific article on the slide but I cannot see what it says from the video).
As for the claim itself, every learning professional (should)/will know this to be true. There need to be opportunities to apply learning, learning needs to be reinforced, and individual engagement (and thus contribution to organisational performance) may not drop thanks to learning – but that simply means a steady hand on the tiller rather than improvement. This latter point is why I try to talk about "performance issues", as it may not always be about improvement per se. Indeed, if we believe in an age of a 'reskilling revolution' then L&D is really about transforming people's lives and careers – no longer simply seeking improvement in existing roles. Even if you do not go along with the version of the future where large %s of people need reskilling, we can hopefully agree that part of L&D's role is to help and support people. That may be helping colleagues avoid stress-related illness, or feeling like they need to leave the organisation, or a multitude of other scenarios – L&D (can) rock! Too often, though, L&D ends up being the 'good guys' in HR, and to be humane we need other functions to come with us – it is not about L&D catching up with others.
Claim 3: We need Ingenuity, Creativity and Courage
Yes, I think this really is fair and effectively mirrors the professional discourse. These could be seen as being aligned to (amongst other things):
doing more with less,
doing more better,
and learning from the mistakes of the past to challenge the future.
I have met very few L&D folks who would ever say they are happy with their offering and, in my experience at least, it feels old hat to hint that we have legions of L&D folks out there rehashing solutions without considering what is right for their organisations. This seems to be a trope on a lot of podcasts and other L&D media, but it does make me wonder how this can be the case – surely the hundreds of people who attend Learning Technologies and other events are not just going back to the office to deliver whatever the business told them to?
Claim 4: “We start training today” with boring stuff
Really? Is this still the case? I see good examples where learning objectives are outlined, and I see good examples where they are not. I know from personal experience that some organisations have insisted on learning objectives being outlined at the start of events, but there can be justification for that – not least when learning management systems were poor and you had to access courses just to see what they were about, so objective lists acted as a form of table of contents. As for the reference to Gagne: whilst that was a theory covered in both of my learning-related masters programmes, I think we all know, in terms of both learning theory and practice, that the 'grab attention' rule is very nuanced. Indeed, the rules around this are often enforced by regulators and accrediting bodies; it is not good learning, but a single L&D team is unlikely ever to have the power to drag such bodies into the 21st century. But we can try through organisational channels.
Claim 5: It is a smart phone world
Unfortunately, having worked in a school for a while, my views on smartphones have changed. A year or so back I would have said that people should be self-managing in their usage and that, as learning professionals, you should never ignore the power of having such a tool available to many. Unfortunately, observing behaviour in a school, you see the addictive behaviours that come from the device and the negative impact that follows. Where I perhaps differ from Clint is that I doubt learning will ever be able to claim a learner's attention in this environment; from a young age we are creating our bubbles of interests and training our brains to hope for dopamine shots from notifications, comments and messaging. I once ran a training session in Russia where the culture was that people could come in and out for phone calls and emails, as the expectation was that people were still contactable despite being in face-to-face training. I feel now we really need to be harsh on the rules of what we are aiming for – for example, I know I can double- or triple-task, but I also know that if I am sat concentrating, making notes, etc. I am more engaged in that webinar, call or meeting than when I am doing more than one thing at once. If the phone, tablet or laptop is there and on, we have lost the battle – ground rules and contracting are more important for learning, meetings, etc. than ever before.
I can be very self critical in this area – I know I revert to games, YouTube, Twitter, LinkedIn and other things when I should be doing something else. We all do it, L&D is not the only thing that (appears to) suck in the face of constant connectivity.
Claim 6: Shift toward human-centred (not just “rules-based”) learning
Now I suspect the establishment of "rule-based" learning is really an attempt at 'evidence-based' good practice, but, yes – as mentioned above – a slide of learning objectives is not good practice unless it makes sense in that human context. As someone who has authored guidelines, instructional design instructions, eLearning standards and more, it is also the case that "rule-based" can simply be an attempt to aim for what "works" in the deployment environment you are working in.
Probably my worst ever workshop facilitation experience was one where I had presumed the team I was with had been told the objectives/purpose by their manager. When I arrived (late thanks to the trains) I jumped straight into reflection-sharing exercises, only for most of the reflections to come back to me as variations of "sorry, why are we here?" It only got worse from there thanks to dynamics in the room – knowing your audience, the human relationships, etc. is important (as mentioned in the video). This is tough at scale.
The need to be human-focused goes to the humane point above, but I feel the real challenge is that we are all different – again, learning design at scale is tough. There are ways to tackle this and technology continues to evolve to help, so let's move on to how Clint thinks it can be done and I'll address that there…
Claim 7: Stories are more human than facts
The storytelling bandwagon has been rolling for quite some time and in general I do not disagree – real examples, interesting experiences, scary examples (especially for where things go wrong), etc. are all applicable learning experiences and tend to get the message across in memorable ways. However, I do think it dismisses the fact that some people are far more likely to remember facts and figures. Again, the experience of working in a school has been illuminating for my thoughts on adult learning – from looking at younger learners (remember we were all there once), there are people who simply work and learn differently to others. Simply calling for more storytelling, more gamification, etc. is saying we are going to target the %s of people we think this will help. I am obviously trying to avoid saying there are different learning styles at play here, but numbers and facts are memorable – let's just think about some of L&D's favourite theories, such as 70/20/10 and the faux %s related to learning by doing: these are easy to remember and stick in the collective industry consciousness.
Claim 8: Pictures are more human than text
I get the logic here, but pictures (and other visualisations) are difficult to get right. Worse, they can distract from the main message. Text can be hugely powerful, not least in storytelling, and the librarian in me is always keen to point out to eLearning folks that we learned fine from books for a long time. In some ways, things have not changed since medieval times – you need people and/or resources to add value to your text and get across the messages you need to get across. Then you can build into application, behaviour change, etc. In this regard traditional university eLearning (i.e. a series of resources, often using the tutor's voice, across a mix of media) can be superior to traditional corporate eLearning (packaged click-next SCORM stuff).
Is text "torment", as in the video? I admit I am certainly the world's worst librarian for actually reading stuff – again, we are distracted by our bubbles, smartphones, etc. I would actually advocate that part of formal education and L&D's mission needs to be to recreate in-depth study for the modern age – and this may well include reading, a lot.
Claim 9: Fun is more human than drudgery
Purpose is the key thing here for me – do people feel like they are contributing, are they aligned to what they are contributing to, and is training going to help them progress and contribute more? Fun is secondary to feeling valued and feeling you add value, IMO. Indeed, we know the science tells us that things being hard can often improve retention – and that hard is not always fun.
"People show up for fun" – well, this reminds me of the book 'There will be donuts': I think I was given a copy for going to a meeting once. I would recommend a read.
I get that DisruptHR is a deliberately funny and controversial event, but I thought I would use it the other way to question whether learning/Learning is really as bad as (it is often) made out to be.
Well we’ve reached the year that many an organisation had set as the future – the year for ‘visions’ and forward planning – yep, it’s 2020 time. So with our ‘2020 vision’ hindsight here is a look back at the last decade – the 2010s:
My own decade
Looking back at 2010 it does make one feel a little better about life in that I, personally, have at least achieved a few things…
In terms of career moves I have followed perhaps an odd path but it has followed trends in technology, not least the rise of Web 2.0 in the mid to late 00s leading me into working on eLearning, LCMS and other more general L&D areas since.
Brandon Hall actually had a recent webinar on LCMS platforms and there do remain arguments for them, at least in theory (see image below). I thought this was interesting given that their 'buzz' certainly seems to have subsided (although 2020 will be a year where I do not make it to the Learning Tech show or BETT, so I might be a little out of the loop).
Back in 2009 my primary tech focus at work was on the learning management system (LMS aka VLE) and BBWorld 09 remains the last time I went to the USA – although the 2010s brought plenty of travel to Canada and elsewhere. On the LMS front it is pretty depressing to recently see research and case study outcomes such as:
“the LMS implemented in the university is not being utilised to an optimum level”
Yet more depressing in the above article is the ‘solution’ to the problem – namely to be “adaptive” by classifying “learners into three main categories, namely, visual learner, an auditory learner, and kinesthetic learner.” Maybe by 2030 such lingering love for learning styles will finally be debunked and gone?
Personally, whilst I loved my Windows XP touchscreen netbook at the start of the decade, ultimately a powerhouse PC/laptop is really still the tool to have. Whilst Chromebooks have probably not picked up as much outside the USA, their online-first style is probably suited to the 5G world we are moving towards (even if fast broadband for all will not be happening in the UK without a Labour government). I would argue that, in reality, little has been achieved by the focus on the hardware side; although tablets/iPads have allowed for early applications of AR, there has been little transformation of learning via these routes. Instead, media consumption is increasingly easy and can of course be leveraged for learning, but it also offers us huge distractions.
In workplace learning we have seen various 'buzz' topics such as mobile learning, Tin Can, AI and social learning. All in all, these have probably been worked into most organisations' approaches to digital learning, to at least some degree, even if not necessarily by the organisation itself, as the digital transformation of learning increasingly sees it democratised and moved into the learner's control (in a similar way to what has happened with IT in general).
In many ways this feels like a lost decade – regimes in some parts of the world have cemented their power whilst the UK has effectively stagnated on most measures. Overall, it's a depressing picture and no surprise to see lots of people on social media welcoming the 20s as something that, hopefully, can be a fresh start.
The end of my decade: Lessons from Star Trek (TNG)
I've spent the last few weeks of the decade on a Star Trek: The Next Generation binge – rewatching all seven series ahead of the launch of the new Picard show. TNG holds a soft spot in my heart, having originally been watched in the post tea-time slot on BBC television with my family. I was always behind friends who had watched episodes on Sky, but it is a show I have fond memories of.
Re-watching TNG there are lots of lessons that can be taken from it and doing a quick Google search predictably shows vast numbers of articles that are devoted to this in terms of ‘best episodes’, ‘best Picard moments’, etc. All in all there were some particular things that jumped out from my binge:
Don't be afraid to work 'under yourself' or hire an unexpected candidate. Jean-Luc Picard develops across the series into a wonderful character with a considerable amount of depth. However, it would surely have been easy for Patrick Stewart to turn down the role given his experience as a stage actor and that Trek has often been looked down upon (along with a lot of the rest of sci-fi). Having just also watched Logan, you have to admire PS for taking on iconic roles, really running with them and making them his own. Many of us will find ourselves needing to work for money at times rather than for 'passion' or obvious career choices (personally I've worked in a call centre, Burger King, B&Q, Somerfield supermarket and other jobs because I needed the money and/or experience) – unfortunately it feels, in 2020, that recruiters are too often looking for 'perfect' candidates and ignoring the realities of people's lives.
80s body horror and cultural acceptability. There are a few early episodes that now feel very '80s, particularly in their special effects. A few of these are quite grim in their effects – raising interesting questions over what was appropriate for a family-friendly show in the late 80s and early 90s and what you might deem appropriate today. Indeed this runs through other media from the time – for example the violence in Spielberg movies and the special effects in Indiana Jones. Have we regressed here? What might be appropriate in developing workplace (learning) media? Would 'not safe for work' have changed too in that time? Similarly there are clear demonstrations of where what is culturally appropriate/correct has changed – for example a late episode about "North American Indians" (who have set up a settlement on a contested planet) would surely be "Native Americans" if written today – this goes some way to show how quickly things can change and that we should perhaps be less harsh (as a society) on those who do not keep up with changes to what is deemed culturally acceptable.
Performance reviews – hated then, hated now? It probably didn't feel like it when watched as a weekly serial but, watching in binge mode, it is surprising how often performance reviews are mentioned on the show. Almost universally these mentions are negative – and often tied to emotions around getting a promotion through the ranks. Clearly, considering the corporate world's ongoing challenges with talent management, this is something that seems to have stuck around from the early 90s, even if we've seen a decrease in focus on hierarchy towards matrix and other models.