Time to stop the snobbery in L&D

L&D departments need to support their organisation in valuable ways.  Simples.

Yet I increasingly feel that the L&D industry takes a snobbish approach to the world of work – far too often talking only about what we might call ‘knowledge workers’ or, at least, office work.

Yes, knowledge work is obviously a large part of the workforce. However, this focus ignores the large numbers in UK PLC working in hospitality, healthcare and other areas where the workplace and workforce are relatively ‘low tech’, ‘low skilled’ (in the traditional graduate-workforce kind of sense) and, unfortunately, often low paid.

Part of the problem seems to be that multiple traditional support departments (IT, KM, L&D and more) all seem to be chasing the same middle ground around productivity – largely as identified by the DWG’s 2018 research agenda:

Digital Workplace Group (DWG) embarks on an exciting research programme to deliver focused insights across both intranet and digital workplace good practice….

1. Collaborating in the digital workplace: how to have and to measure impact
2. Taking a strategic approach to the digital workplace: teams, structures, methods
3. Office 365: a detailed look at the wider suite
4. Digital literacy in the workplace: how to raise the organization’s digital IQ
5. Successful intranet migrations: strategies, approaches, tactics
6. The intelligent DW assistant: what teams need to know now about artificial intelligence
7. Digital workplace trends, themes and statistics: insights from DWG research and benchmarking.

The above list is pretty close to the buzz in L&D circles – at least if you swap out intranet for LMS or another system.  The reality on the ground for L&D professionals – especially in those low-paid sectors mentioned above – is instead apprenticeships, the post-Brexit skills agenda, basic skills training (even JISC are saving ‘citizen’ resources from closure) and more.  The positive is that, at least via mobile, AR and VR, we are seeing some practical workplace L&D buzz away from the knowledge workers who are tied to a desk and Outlook.

Yes, digital workplaces exist and many support departments will be made up of digital-first workers (even if their parent market or industry is not).  However, let’s not forget everyone else.

After starting this post I caught up with TJ podcasts and hit upon the Donald Clark interview from Online Educa, which really hits many nails on their heads.

Why I’ve changed my mind on award nights

In the past I’ve been quite snobby about award nights in the HR/L&D field.  I’d argued the only real recognition you should need is from your board/C-suite that you are doing a good job.  However, I’m increasingly being won over by award nights for a number of reasons.

My new-found enthusiasm stems partly from the fact that (well organised and robust – not like these) awards are one of the few times that we see robust evaluation of L&D.  For example, I was quite shocked to see how Brandon Hall asked the second of the below questions in their “Learning Measurement 2018” survey.  In my opinion the third option may be perfectly valid, i.e. “not all our…initiatives get measured” – this can be perfectly fine as evaluation requires resourcing (like other activities), and the ‘top’ two options wouldn’t necessarily be correct in that context:

[Image: Brandon Hall “Learning Measurement 2018” survey questions]

Awards are also increasingly important in sharing best practice in what has become a more transient environment.  We’re not realistically seeing books or journal articles on good practice but shorter-format blogs, tweets, conference presentations, etc., where an award can act as a stamp of quality based on people taking the time to be reflective and analytical about their practice.

There is also the bonus that L&D teams can too often be firefighting problems.  Taking the opportunity to reflect on actually solving a workforce or performance issue can be a positive benefit, in line with the “celebrate achievement” piece of good team building.

Adapting your Personal Brand in a Volatile World (Event)

A thought-provoking breakfast session at TeleTech consulting.

Whilst I attended to think about my own brand and how I can support others with theirs, it was really an event that worked at multiple levels – individual, team, department, organisation, etc.

The three recommended strategies:

  1. A growth mind-set vs. fixed
  2. Clarity of self, strengths, passions, differentiation (vs. other, technology…)
  3. Understanding value (the market…)

nicely align to some of my recent work, including via the strengths-based positive mentality.

This brought my mind back to the Mercer/HBR paper I picked up at the Leadership Symposium on “Bottom-Up Leadership” and their own Venn diagram showing the need to combine personal strengths, personal interest and business needs.

I was particularly interested in attending after a recent event where I saw some old colleagues for the first time in c.5 years.  Those interactions highlighted the long term perceptions people hold and the TeleTech event described this as the weighting of perception on one trait rather than taking a balanced view.  The personal brand was described as the “story people tell about you behind your back” so I guess we all need to get back to basics and reflect on our expertise.  I also thought this paralleled with the idea of the weight we give to first impressions.

The three fundamentals of the brand were outlined as:

  1. Credentials
  2. Passion
  3. Market Needs

The sweet spot – the trait to focus on – is the middle point of these three.  The challenge here was to “go big” on the sweet spot, which posed a test for me as I would like to be seen as being good at a number of areas of L&D competency.  However, when I was looking for work a couple of years ago, I suspect I was not “selling” myself well enough due to too broad an interest.  I also thought there were challenges around what I could do versus what I have actively done a lot of – the two do not automatically line up, but that is not necessarily a bad thing if it can be justified in the middle ground of the three brand items.

“Learning analytics”: a red herring?

A key message from my recent Kirkpatrick program was to start with the “top of the mountain”.  In this metaphor the top, the peak, the target to reach, is the organizational vision.  Strategic learning programs are, therefore, helping the org reach this vision and should be evaluated as such.

My reflection during the program was that this, of course, is common sense.  We should be working to support the organizational goals.  The challenge then becomes prioritizing multiple needs – so only by forecasting potential impact up front can prioritization be done correctly.  And this is one of the areas where there is misconception with KirkP: it should be about starting with the end in mind and working backwards (not just dealing with Level 1 in a standard way and then carrying on from there).

In terms of evaluation of success, LEO have recently discussed the role of learning analytics (LA).  Now, like a lot of things in L&D, I would say the problem with LA is that it has meant multiple things to different people.  One of the earliest examples I saw, sold as LA, was Starfish Solutions (SS), who had a clear solution and goal – use existing LMS/VLE data to improve student retention.  SS makes perfect sense for organizations where educational outcomes and student retention are the organization’s objectives.  I liked SS’s solution (in part discussed with them at the BBWorld Conference back in ’09), but it also faced the challenge that, for many university courses, there was/is less need for ‘big’ data solutions – lecturers know their students in traditional models.  It only really made sense when talking about large-scale education – the problem then, again, is that ‘large scale’ means multiple things to different people 😉

The LEO article does a good job of articulating the problems I have always had with L&D impact – especially how to assess it when there are so many other uncontrolled variables.  As mentioned in my previous post on the KirkP Cert, this was the main challenge I wanted clarity on from the course.  The recommended KirkP approach is to identify multiple ‘indicators’ (suggesting behaviors are on track for the desired result[s]) that can show a correlation – a key learning point for me.  In this model, therefore, we are building a ‘chain of evidence’ akin to a court of law – “data, information and testimonies at each of the four levels that, when presented in sequence, demonstrate the value obtained from a business partnership initiative”.

What I really liked about this is the clarity of the evidence/steps up the ladder/mountain, from bottom to top:

  1. Individual Outcomes
  2. Team Outcomes
  3. Departmental/Divisional/Unit Outcomes
  4. Customer/Client Response
  5. Organizational Outcomes
  6. Customer Satisfaction
  7. Market/Industry Response

It is this breakdown of the benefit chain that I will likely adopt in my existing project planning documents.
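To make that planning-document idea concrete, here is a minimal sketch of how the seven-rung benefit chain could be tracked per project – the `EvidenceChain` class and the sample indicators are my own hypothetical illustration, not anything prescribed by the Kirkpatrick programme:

```python
from dataclasses import dataclass, field

# The seven rungs of the benefit chain, bottom to top (as listed above).
LEVELS = [
    "Individual Outcomes",
    "Team Outcomes",
    "Departmental/Divisional/Unit Outcomes",
    "Customer/Client Response",
    "Organizational Outcomes",
    "Customer Satisfaction",
    "Market/Industry Response",
]


@dataclass
class EvidenceChain:
    """Collects indicators per rung so evidence can be presented in sequence."""
    evidence: dict = field(default_factory=lambda: {lvl: [] for lvl in LEVELS})

    def add(self, level: str, indicator: str) -> None:
        # Guard against typos so every indicator lands on a real rung.
        if level not in self.evidence:
            raise ValueError(f"Unknown level: {level}")
        self.evidence[level].append(indicator)

    def gaps(self) -> list:
        """Rungs with no evidence yet – the weak links in the chain."""
        return [lvl for lvl in LEVELS if not self.evidence[lvl]]


# Hypothetical usage for one L&D initiative:
chain = EvidenceChain()
chain.add("Individual Outcomes", "Post-course skills assessment scores")
chain.add("Team Outcomes", "Reduced rework flagged in team reviews")
print(chain.gaps())  # rungs still needing an indicator before presenting
```

The point of `gaps()` is simply that a chain of evidence is only persuasive “when presented in sequence” – so a planning document should flag rungs with nothing against them early.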

Let L&D then take the lesson the KirkP course made clear: stop trying to tackle issues through limited (or impossible) direct causation and instead work with correlations.  I would say this is a much better approach than simply seeing “measuring learning impact as a waste of time and money”, as the LEO article mentions many people argue.

Therefore, I would argue, let us (i.e. L&D professionals) not worry about learning analytics but instead organizational analytics (i.e. goals and metrics) that can be trended over time, and aim to see where our investments have an impact.  As recommended in the KirkP programme, do not reinvent the wheel: borrow metrics from elsewhere, as they will already be used by stakeholders and those same stakeholders (should) understand the logic.  This should then allow us, as I’ve hinted at before, to not worry about ROI but instead (as recommended by KirkP) Return on Expectations.
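As a purely illustrative sketch of what “borrow a metric and look for a correlation, not causation” might look like in practice – all figures below are made up for the example, and the metric names are my own hypothetical choices:

```python
def pearson(xs, ys):
    """Plain Pearson correlation coefficient (no external libraries needed)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)


# Hypothetical monthly figures: an L&D indicator alongside a metric
# "borrowed" from stakeholders, trended over the same six months.
training_completions = [12, 25, 40, 55, 70, 82]    # cumulative completions
customer_satisfaction = [71, 73, 76, 78, 81, 83]   # existing org metric (%)

r = pearson(training_completions, customer_satisfaction)
print(f"Pearson r = {r:.2f}")  # a strong correlation supports the chain of
                               # evidence – it does not prove causation
```

Because the satisfaction metric already belongs to stakeholders, a trend like this slots straight into the chain of evidence without L&D inventing a new measure.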

So what do I think of KirkP following the course and certification?  Well I’d have to agree with the LEO suggestions:

  1. It’s better than doing nothing… which is what most organisations are doing.

  2. Think about what level of evidence will be good enough for you. As the Kirkpatricks have pointed out, a chain of evidence that would not stand up to scientific rigour may be enough to convict someone in a court of law. If a level of evidence is good enough for a court then it’s probably good enough to convince your board to invest in L&D.

  3. Good enough develops into great.

Apprenticeships: more following the October guidance

I have continued to reflect since my last post on this topic, as well as taking in the October guidance (both through reading and a couple of related events).  It’s starting to look a lot clearer now…my current view on the three main options for employers:

Ignore it altogether…

A lot of companies will continue to ignore apprenticeships, as the 20% off-the-job requirement and the new division between providers and assessment organisations will not be as efficient as what can be done via other development approaches.

It’s not just about the levy – companies’ existing training will have some level of value and quality.  I’ve always felt workplace learning, FE and HE need to be much more joined up, and it’s good that the levy is starting to make people look wider than their existing silo – for example, the OU working with people consultants from KPMG for a wider solution.

There remains though a lot of snobbery in learning, including:

  • from apprenticeship providers about the quality of non-accredited workplace learning
  • from FE and employers about the lack of skills in HE
  • from HE around degrees being of most value.

That the levy seems to be breaking down at least the last of these, via degree apprenticeships, and getting some cross-sector conversations going can only be a good thing.  However, as mentioned in this article, if the model is to be employer-led, why force funding for apprenticeships only?  And will degree apprenticeships get very far if even the BBC refers to them as ‘degree apprenticeships‘ as if they were a non-recognised qualification?

…or sub-contract…

There’s a logic in presuming subcontracting will be the most popular route with companies who have existing L&D teams but little/no experience of apprenticeships.

You would expect few will have met the short Skills Funding Agency deadlines at this enrolment window, and even fewer will attempt the full employer-provider model this time around.  The October guidance suggests sub-contracting is a valuable (up to £500k per annum) way for L&D teams to save their companies from some of the levy ‘hit’ whilst putting existing learning into more formal structures.  Indeed, it also became clearer in October that the SFA sees investment in management information systems as essential for employer-providers.  This and other logistics may be a big ask for all but the biggest employers, and you suspect sub-contracting will allow many employers to deliver the training they deem appropriate whilst leveraging a provider’s economies of scale for systems, standards management, Ofsted requirements, etc.

…or wait and see.

The late-November registration deadlines were challenging (SFA employer engagement events were fully booked in the run-up), so the ‘big bang’ of the levy introduction (the event I went to said 500 companies had attended/booked nationwide) may well become a whimper for a year or two.

That the SFA needed to send the below note out on the day of the registration deadline shows that there’s interest – even if organisations have not been totally clear on who is responsible for what in this new world!

The SFA has noticed that some organisations have submitted multiple PQQs despite clear guidance.

Organisations are reminded that only one PQQ route must be submitted. Please check that this is the case.