Should learning pros shift from sector-specific tools? #3 : “creator” all-in-one platforms (over an LMS)

A bit of background on the learning side

This bit is an attempt to think about where we (i.e. the learning industry/industries) currently are with ‘platforms’…

Learning management systems (LMSs), love them or hate them, remain the core component of many learning environments – be that in workplace learning and development (L&D) or in education (where virtual learning environment, or VLE, may be the preferred term depending on geography). Many LMSs/VLEs are built, at least initially, on the premise that a common product, a ‘course’, is at their heart.

Even the rise of the Learning Experience Platform (LxP/LXP) has not shifted the need, for many, for an LMS. Indeed the lines that separate an LMS and an LXP are blurred at best – “there is no hard and fast distinction”. Continuing use cases for an LMS include that they often remain the single source of truth for compliance records, the ‘one-stop shop’ for organisational learning opportunities, a walled garden for education and much more. Whilst LMSs/LXPs/VLEs come in many shapes and sizes they have, in many ways, replaced or reinforced the ‘learning by location’ model – i.e. you used to learn in a classroom; now you learn in/via this technology, or simply register for in-person events through it.

Coaching, mentoring and other less formal, less location-defined learning experiences are supported to differing degrees by the wide range of LMSs/VLEs/LXPs out there in the market. Over the last c.20 years the learning sector has arguably fluctuated in how much an LMS should be an ‘all-in-one’ tool: at times it has been simply a launch pad for SCORM and other content, whilst at other times there has been a push for talent management, discussion, social learning, knowledge management and much more (each with its own specialist tools/markets) to be included in core LMS functionality. Today we tend to see platforms that may not offer all this functionality themselves but will integrate with other tools, for example via APIs and pre-built connectors – integrations often developed via acquisitions by the tech vendors themselves.
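As a purely illustrative aside on what such an integration can look like in practice, the sketch below pulls completion records from a hypothetical LMS REST API and forwards them to a hypothetical HR/talent system – the base URLs, endpoints, field names and token handling are all my own assumptions for the example, not any specific vendor’s API.

```python
# Illustrative only: a minimal LMS -> HR system sync.
# All URLs, endpoints, field names and the token are hypothetical.
import requests

LMS_API = "https://lms.example.com/api/v1"   # hypothetical LMS base URL
HR_API = "https://hr.example.com/api"        # hypothetical HR/talent system
TOKEN = "replace-with-a-real-api-token"


def fetch_completions(course_id: str) -> list[dict]:
    """Fetch completion records for one course from the (hypothetical) LMS."""
    resp = requests.get(
        f"{LMS_API}/courses/{course_id}/completions",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["completions"]


def push_to_hr(records: list[dict]) -> None:
    """Forward each completion record to the (hypothetical) HR system."""
    for record in records:
        resp = requests.post(
            f"{HR_API}/training-records",
            json={
                "employee_id": record["user_id"],
                "course": record["course_id"],
                "completed_at": record["completed_at"],
            },
            headers={"Authorization": f"Bearer {TOKEN}"},
            timeout=30,
        )
        resp.raise_for_status()


if __name__ == "__main__":
    push_to_hr(fetch_completions("fire-safety-101"))
```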

Creator platforms

So, on to the point of this blog post – some reflection on another market segment I am new to, having only just realised in the last couple of weeks that such things exist.

Previously, I presumed most people monetised their ‘creator’ work through the relevant platform (YouTube, a photo site, WordPress or whatever) in combination with services such as Patreon. However, there appears to be another model that has developed via ‘creator platforms’.

Podia is one of the tools that clearly see themselves in this space. They split their own functionality into “sell your work” and “market your work”; interestingly, the ‘sell’ aspects include online courses, webinars and other functionality that will be familiar to many learning pros. The below video is seemingly a pretty honest self-assessment on their part, comparing themselves to another major player in this space (do let me know if there are other tools worth looking at):

So is Podia an option for learning pros?

As is often the case, it really depends on what you already have and use. However, if you were looking to break from, say, a Moodle with sales plugins, plus a separate CRM, video hosting, etc., and move into a (potentially) simpler all-in-one solution, this seems like a realistic option.

Your online course, your way.

Choose from a variety of online courses that fit your business and customers’ needs; we support every file type, host all of the content, and never place limits on how much content you can upload and sell to your students.

https://www.podia.com/features/sell-online-courses

Is Podia really a learning platform?

In many ways Podia is quite a traditional solution for learning pros in that a “course” remains the unit of transaction, i.e. you buy access to a course. The platform lets you sell these via a number of time-based formats such as standalone, timed release, cohort-based, etc.

In terms of content you are effectively talking about adding links and uploading media. This is clearly not SCORM-focused like a traditional LMS, but it is no different to how many LMSs are used – think of the traditional complaint that university online learning platforms are used as “file stores”. That said, the Podia-as-file-store approach is quite appealing considering there are “zero limits on content…as much as you want to as many people as you can.” Many of us will have horror stories about eLearning systems falling over when used at scale, so Podia seems to have a lot of potential here.

There is other functionality beyond static content, including some borrowed from the traditional approaches of the learning space:

Ensure your course students are truly understanding your material with a multiple-choice quiz at the end of your lessons.

https://www.podia.com/features/sell-online-courses

Obviously an MCQ is far from ensuring ‘true understanding’, but I guess it at least separates a tool like Podia from something like, say, WordPress for hosting your content (yes, I know there are lots of WordPress plugins!).

Beyond the content and MCQs, the more interesting element may well be that more social learning is possible via community features and webinar integration. It is perhaps worth noting, though, that Podia’s “all-in-one” would still need to connect to external accounts for the webinar functionality itself:

Podia integrates with both YouTube Live (all plans) and Zoom (Shaker/Earthquaker) to let you offer webinars to your audience.

https://www.podia.com/features/sell-webinars

Conclusions

Overall, these platforms seem to offer another option for those who work in learning. By creating a ‘community’ around our work there is a different model here to the traditional LMS – whether these platforms are used internally within an organisation or for the sales-based models they are clearly intended for. One market that could clearly look in this direction is membership-based organisations, who effectively have existing communities and are, in some ways, the traditional “creator” organisations given their model of sharing useful resources, links and courses, and encouraging knowledge sharing, for people interested in a topic. Whether the costs are worthwhile for such an organisation will clearly depend on the alternatives and the existing IT setup, resources, platforms in use, etc.

Should learning pros shift from sector-specific tools? #2 : Weet for video-based communication

That I have not been to any conferences or other in-person events for at least a couple of years might be why I seem to be stumbling across a few new tools of late. In the last couple of weeks this has included Weet.

Are you already using Weet for learning related projects? If so, let me know.

Weet offers browser-based video recording with very easy webcam-over-screen-share functionality. Obviously similar functionality exists in Teams, Zoom and elsewhere – indeed one of my few paid-for apps is Screencastify, which has some similar functionality. Where Weet is powerful is that the webcam video can make use of their virtual backgrounds (à la Teams and Zoom) within the browser-based recording.

Weet themselves do a good job here of explaining its benefits for learning, so education/learning is clearly a market they are aiming for. Therefore, they are targeting market share from some of the more sector-specific tools such as Camtasia.

The Teams integration (I haven’t tried this) offers an easy way to communicate, share video, etc. without having to use the slightly counter-intuitive Teams approach of holding single-person “meetings” to record video messages.

A very nice feature of Weet is that even on the free plan you can download your recordings. Therefore, if you have concerns over their hosting (which can be a private or public link) you can download and host/share via your own systems. The free plan caps you at an 8-minute video which, in all honesty, is probably as long as most videos should be anyway.

There are obviously lots of use cases as part of async communication – in education, video-based feedback would be an obvious example.

Here’s a quick video I did for my LinkedIn profile.

More generally, are we seeing a move back towards browser-based tools? It feels a little like that to me, with less emphasis on apps – or maybe I am just imagining it?

StreamYard: the streaming tool we’ve been waiting for?

So, I am not sure how I became aware of it initially but I have been giving StreamYard a go.

What is StreamYard?

Basically, you can stream live, from your browser, to a number of different platforms (YouTube, Facebook, etc.). This is all super easy and cheaper than more traditional methods:

Why might you be interested?

From an education and L&D background, I have used various online classroom/meeting tools over the years. These normally involve some level of friction for the user, be it logins, tracking, plugin installation, etc.

I only recently realised that LinkedIn has a beta “live” feature – details here. StreamYard is one of the tools you can use to publish to your profile and/or page(s) via this – if you have LinkedIn groups this might be the way to go, being able to broadcast directly to your groups rather than needing a separate meeting tool (at least for low-interaction broadcasts).

The alternative to a browser-based model is a tool like https://obsproject.com/ – an open-source tool, but one that requires installation and setup. When I tried to use Twitch in the past I might have used OBS, but I must admit I can’t actually remember what I used! Either way, it was not as easy as StreamYard makes things.

StreamYard trial

So, what I have found with StreamYard is that it is very easy to use. I ended up deleting most of what I created in the trial, but it is noticeable that there are still hoops to jump through – for example, YouTube requires your phone number before it will allow you to live stream to your channel. Therefore, StreamYard might be the best tool in its space (considering ease and cost) but the platforms you stream to might still be a problem.

More on educational games : the example of mission1point5

Using mobile gaming technology, Mission 1.5 educates people about climate solutions and asks them to vote on the actions that they want to see happen.

https://www.mission1point5.org/about

This new climate-change-related online activity is an interesting idea, combining a series of what are basically multiple-choice questions (that give the user options for what their government should do to meet the 1.5-degree challenge) with calls to action for individual and national-level behaviour change.

Responses from your selected country will be aggregated and submitted to your government as your “vote”…

What will we do with the results?

Your vote, and those from your country, will be compiled and presented to your government to encourage bolder climate action. Votes will also be counted in a global tally. So stay tuned for the results!

https://www.mission1point5.org/about

Presumably this vote piece is only prearranged with the 12 countries (plus the EU) that are listed. In addition, the game mechanics themselves are a little odd given your choice for each point is really between two items, as one is clearly a ‘red herring’. The onscreen results from the ‘quiz’ record 10, 700 or 1000 points depending on your answer to a question, and these combine into a total score for tackling the 1.5 challenge across multiple areas such as “farms and foods”, “transport”, etc.

Screenshots from the activity:

  • Example question from the “Farms & Food” topic.
  • A section’s “vote” (which acts like a summary/debrief of the ‘correct’ answers for each section).
  • Overall scoring in keeping temperature change down.

Does it educate?

The first quote included above specifically states the resource “educates people”. Obviously I could write a lot here about what educating someone actually means versus learning something, etc. What I will try to focus on is whether someone is likely to learn anything from the activity. The answer, of course, will be “it depends”.

If we take the cattle example, in the above screenshot, there is a lot of prerequisite knowledge required – for example, a reading level sufficient to comprehend “livestock” and “plant-based diet”, albeit with friendly mobile-style graphics as visual clues. Beyond reading ability, there is no real information on the different options and what they mean – thus the light-touch approach to any kind of knowledge content could be confusing, and if you really wanted/needed to learn something from this you would likely have to do some research away from the resource. This is not helped by the text being image-based, which means you cannot simply select text and ask your browser to search the web for more information.

Therefore, I am tempted to say this resource might be quite useful for a school to run through in a group, i.e. with a teacher/facilitator in place to use it to foster discussion, rather than as a learning resource per se.

How could it be improved?

The point values of 10, 700 and 1000 don’t obviously relate to the 1.5-degree target, and it is not very clear from the onscreen graphics how many ‘points’ are needed as a minimum for your choices to meet your country’s requirements. Indeed there is a contradiction between not wanting to add to the temperature while also needing a high score. It would be better if the scoring were somehow reversed – e.g. starting with your country’s high carbon total and then cutting it towards a percentage reduction target consistent with 1.5 degrees.
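To make that suggestion concrete, here is a rough, purely illustrative sketch of a ‘reversed’ scoring model: start from a country’s baseline emissions, subtract the estimated savings of each choice, and report progress against a percentage reduction target. All of the numbers and choice labels are invented for the example – nothing here reflects Mission 1.5’s actual data.

```python
# Illustrative only: 'reversed' scoring where choices cut a carbon baseline
# towards a reduction target, instead of accumulating arbitrary points.
# All figures and choice labels are invented for this sketch.

BASELINE_EMISSIONS = 500.0   # hypothetical national total, MtCO2e per year
REDUCTION_TARGET = 0.45      # e.g. a 45% cut needed to stay on a 1.5C pathway

CHOICE_SAVINGS = {           # hypothetical saving per player choice, MtCO2e
    "shift towards plant-rich diets": 60.0,
    "electrify public transport": 90.0,
    "do nothing (red herring)": 0.0,
}


def score(choices: list[str]) -> None:
    saved = sum(CHOICE_SAVINGS.get(choice, 0.0) for choice in choices)
    remaining = BASELINE_EMISSIONS - saved
    cut_achieved = saved / BASELINE_EMISSIONS
    print(f"Remaining emissions: {remaining:.0f} MtCO2e "
          f"({cut_achieved:.0%} cut vs a {REDUCTION_TARGET:.0%} target)")
    print("On track for 1.5C" if cut_achieved >= REDUCTION_TARGET else "Not yet on track")


score(["shift towards plant-rich diets", "electrify public transport"])
```

Presented this way, the player’s running total moves in the same direction as the goal (emissions coming down), rather than an abstract points total going up.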

There is also a risk here of oversimplifying as, presumably, the carbon impact of some choices would be greater in some countries than in others (this complexity might be built in, but I doubt it).

The “none of the above” option on the vote does not really work either as a form of learning summary or as a mock-up of democratic action – particularly if the intention of the resource is to gather actual democratic input…

Reliable information on public opinion on climate action

This is given, in a related YouTube video’s description, as a reason for the website’s vote element:

Mission 1.5 YouTube introduction

However, it is clearly a limited activity, with just three (well, two) options to consider per question, and the user is then very heavily prompted to select the ‘best’ option from each section’s three questions as their vote. I must admit I cast a few “none of the above” votes in a Brewster’s Millions-style mood.

Summary

Overall this feels like one of those examples where someone wants to achieve educational outcomes but has limited content and a desire to reduce instruction (to the point of irrelevance), and really only manages to apply the gaming expertise involved (which seems considerable, judging by the “about” page) to graphics/UI and little else. It also highlights the incredible difficulty of building content for a global audience with no personalisation or clear target audience.

“Totally unrealistic”? Reflecting on categorising learning topics within games

This post was triggered by the below Twitter thread. Nuance is of course often lost in Twitter character limits, but was my immediate response on reading @DTWillingham’s article fair, or was I being too emotional (given my work in learning and time spent in the world of video games)?

Trigger thread

Firstly, let’s all agree games are hugely powerful for learning. Indeed, I often blame Sid Meier for my choice of History for undergraduate studies (although, of course, a number of good teachers and other factors were at play).

Second, I would recommend you look at the original article. The idea is a really interesting one. The numbered points below are mostly where I disagreed with the article on first read-through, with some reflections included below each point. Many of these have little to do with the (knowledge and skills) learning specifically but are important in terms of the framing of the learning environment and motivation (if we consider this through KISME). “Design for motivation” is arguably a skill in itself, as articulated in this new learning designer competency framework.

  1. “if my kids are representative”
    1. I appreciate this is a newspaper opinion piece but anecdotal starting points are not great. I also appreciate most of my views are very anecdotal, based on my own experiences 🙂
  2. “I gave in to increased gaming time but gravely told my children they should choose educational games”
    1. This is a hugely “first world problem” problem statement. When I was in the age bracket being discussed (8 to 18) I got one game for my birthday and one for Christmas. If gaming is a concern for a parent then I would rather see an article encouraging them to be active in those choices – either choosing the games themselves or being involved with their children in the selection.
  3. “it’s usually impossible to know what, if anything, kids will learn from a video game based on a simple description of it”
    1. I really like the opening of this part but not the bit I have italicised. Yes, a description will not likely cover this, but a gaming experience is intensely personal. There are so many levels of competence to gaming skill, many games are non-linear and players will pay differing levels of attention. Therefore, just like in an education environment, it is incredibly difficult to say what people “will learn” – only what we are encouraging and supporting them to learn. This also counters some game design – for example, the deliberately open design of the latest Zelda game.
  4. “The Entertainment Software Rating Board rates games for objectionable content like sex and violence. That’s helpful, but it should be as easy for parents to guide their kids toward enriching games as it is to shield them from unacceptable ones.”
    1. Surprisingly, given the author, this massively oversimplifies learning. The ESRB, the BBFC, etc. are dealing with a very small taxonomy – for example, I just looked at GTA V on the ESRB site (presuming it would be the game with the most ‘warnings’) and it is only rated on 7 items, albeit there are levels within this model (“intense”, “strong”, etc., which is probably how we get to the 30 categories the article mentions). If we were to map “topics” as mentioned earlier, what would be the appropriate taxonomy? Cataloguers and librarians the world over would be quick to tell you this is difficult; video games themselves were used in my Librarianship MA as an example of how difficult it is to fit things into the Dewey Decimal Classification – under games, technology, etc.?
  5. “boring”, education-first, games
    1. I previously considered whether podcasts were the rebirth of “edutainment”. I don’t think we would say that concept is entirely bad. Indeed most people will remember their more “fun” teachers over some of the others. However, I would agree that “chocolate-covered broccoli” learning design isn’t very helpful in general, similarly to forced gamification in workplace learning. At the most recent school I worked at, most made-for-education “games” tended to frustrate the kids, as they are the first to see when learning is being ‘forced’ into a game environment. Similarly, potentially educational games, like Minecraft, were misused for what can probably best be described as ‘di**king about’. However, the experience of course varied enormously between the games and the children in terms of preference and practice. That said, some serious games undoubtedly do work and the science has been worked on for a long time, even if just thanks to the age-old learning paradigm of simulation and practice of activities in safe(r) environments.
  6. “To make them fun, game creators either make the content less academic (and claim education will still benefit) or remove the tests (and claim kids will still learn). But the effect of either change on learning is unpredictable.”
    1. “learning is unpredictable” – I think this is the nub of the matter. It is unpredictable and difficult, which is really why I was saying it is unrealistic to try and rate learning in such media. Indeed the article references the evidence that some games designed to help with memory do not work (which is in part why I said the vast majority of game-driven learning is really accidental).
  7. “playing Tetris, do improve spatial thinking skills, an ability linked to success in math and science”
    1. But the designers probably did not anticipate this, and the evidence only becomes clear over time. It would be very difficult to classify such outcomes at the point of publication.
  8. “not quiz players on it”
    1. This is of course a very education-centric way to talk about learning (going back in part to the original reason this site was called what it is). It probably doesn’t help to reinforce parental expectations of testing everything. The article does double back to say learning happens “because you thought about it, not because you were quizzed”, but I would say it is weak on the fact that repetition to counter the forgetting curve is key here. For example, I learned Caribbean geography from Pirates! (like the other article mentioned in the thread, but with Black Flag rather than Pirates!) as I played for many hours over a long period of time; however, I also had that knowledge reinforced through following football/soccer, looking at maps, watching the Olympics, etc. We know who “Prince Harry is married to” due to constant exposure to that content; I know very little about less exposed celebrities/royals.
  9. “They have to think about it, and that’s guaranteed only if the information is crucial to playing. Even then, “will they think about it?” isn’t always obvious.”
    1. I wouldn’t say it is guaranteed even in that case: repetition, interest, existing level of knowledge, etc. would all impact this. Also, you do not necessarily think consciously about spatial thinking skills – that benefit is more incidental in the Tetris example, etc.
  10. Roller Coaster Tycoon
    1. As the article suggests, the gamer would need an interest to pick up on the more scientific elements rather than playing for fun/crashes. It would also depend a lot on existing knowledge, which would be impacted by age, literacy levels, etc.
    2. This could revert to something like sticking a recommended reading level on a game. For example, I loved Shadowrun but got bored with Shadowrun Returns as there was far too much text to read. A text rating would help parents and gamers of all ages. The text could also potentially be exported from the game’s code and analysed away from the game (a rough sketch of this kind of analysis is included after this list). This might help people determine if the game is too complex, for example if they are going to have to sit through a huge tutorial reading activity. That said, in another context I would happily play more ‘interactive book’ type experiences.
  11. “Someone who understands cognition needs to evaluate gameplay. The video gaming industry could arrange for that.”
    1. This is the really difficult bit from a practical perspective. You may understand cognition, but could you get through the game? Your analysis is unlikely to map to the possible variations in the experience. Would you be better off analysing pro players (for example on Twitch or YouTube)? I doubt “Game makers submit a detailed description of a new game, which is then evaluated by three professional raters”, as with the ESRB, would be anywhere near sufficient for the complexity of knowledge, skills and behaviours a game may change.
    2. There would also be potential cost implications – gaming is notoriously a low-price-inflation industry (even though the tech involved and the size of games have been transformed), with small and big developers regularly disappearing into bankruptcy.
  12. “they owe parents that much.”
    1. A nice way to wrap up the article. However, if we take it that a parent would have to be at least 16 years old, I would say the industry does not really owe you anything unless you have chipped in by playing games yourself within those years. As with film ratings and Parental Advisory labels, it would also only be of use to the small number of parents who care.
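
On the reading-level idea in point 10 above, here is a very rough, purely hypothetical sketch of what analysing exported game text might look like: it computes a Flesch–Kincaid grade level over a text dump using a naive syllable counter. The file name and the heuristics are my own assumptions – publishers do not, as far as I know, actually provide such exports.

```python
# Illustrative only: estimate a reading level for exported game text.
# The file path and the crude syllable heuristic are assumptions for this sketch.
import re


def count_syllables(word: str) -> int:
    """Very rough syllable estimate: count runs of vowels, minimum of one."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))


def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade level from word, sentence and syllable counts."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59


if __name__ == "__main__":
    # "dialogue_dump.txt" is a hypothetical export of a game's on-screen text.
    with open("dialogue_dump.txt", encoding="utf-8") as f:
        print(f"Estimated reading grade level: {flesch_kincaid_grade(f.read()):.1f}")
```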

The ease with which this information would appear to parents/purchasers is also perhaps giving more credit than is due to some of the systems involved. The PlayStation store, for example, does not even offer a ‘wish list’ or ‘save for later’ type of option. The Steam store allows various tagging, but again we come back to how difficult such a taxonomy would be. The article and Twitter thread both mentioned Assassin’s Creed; if we take Valhalla, you could argue you would gain a rough idea of:

  • English and Norwegian geography
  • some (stereotyped) Anglo-Saxon and Norse cultural aspects
  • elements of medieval religious practice
  • different weapon types
  • and probably some other knowledge pieces.

However, as with learning from films and other media, perhaps the most interesting point lies away from such obvious content. Instead, Valhalla’s approach to same-sex relationships could be a transformational learning experience – for example, if a sexist homophobe played the game then maybe, just maybe, they might have some of their beliefs and resulting behaviours changed. That said, did Ubisoft consult with relevant bodies to ensure their representation was appropriate? This is a challenge that could of course be cast at many sources of information – for example, whether The Crown should come with a health/education warning.

As I tweeted, I would love to work in gaming at some point; indeed, one of those ‘sliding doors’ moments in my younger years was turning down a job at Codemasters. However, on reflection, I still don’t think the article’s suggestion is the best way to go. Indeed, education consultants working for the developers would seem preferable to external rating and verification. DTWillingham is, of course, a luminary in this area (hell, the LA Times publishes his articles!) but, whilst I love the idea of this job existing, I still feel it would be incredibly difficult to bring to fruition in a way that is of value to parents or anyone else.