How playful design is transforming university education

Written by David Chandross, Ryerson University. Photo credit Sergey Galyonkin, Epic Games Berlin via Wikimedia Commons, CC BY-SA. Originally published in The Conversation.

Video games have inspired a revolution in university teaching. Pictured here is a scene from the popular game Fortnite Battle Royale.

A group of 25 interns sit at Baycrest Health Sciences, a research centre for aging in Canada, their eyes glued to their smartphones. They are playing SOS, an award-winning game that simulates real-world gerontology practice, in which they compete with other students to earn virtual currency.

Across town, a group of professors sit around a table at George Brown College, designing a role-playing game set in a virtual hospital called The Grid, based on a Matrix-like theme of saving the world from ignorance, for an accredited program in health sciences. Yet another team of game programmers is hard at work at Humber College, building a virtual reality experience of a subway car after a bomb incident. Players wear goggles, moving from person to person, saving some and tagging others for care later on.

Welcome to the new world of serious games and mixed reality. Serious educational games (SEGs) are games designed for learning. Mixed reality is a blend of virtual reality, augmented reality and what are called immersive technologies.

When you combine these, you get a sense of where university education is going. It is called playful design, and it is a multi-billion-dollar industry that was used by over 40 per cent of the top 1,000 companies in 2015. It is also likely coming to a college near you.

Better than lectures

There are thousands of peer-reviewed studies on the effectiveness of SEGs, showing that they do three things better than conventional teaching in higher education.

Serious educational games harness the addictive quality of consumer video games such as The Legend of Zelda series. Photo credit Flickr/Tofoli.douglas.

First, they enable skill acquisition: they encourage students to apply what they have learned and to repeat it many times until they master it.

Second, they engage and motivate students more strongly than most lectures.

Third, they reward the learner with achievements every time they play. As far as other comparisons with conventional learning go, they are at least as effective as lectures in most cases for teaching the basics.

In short, there are no compelling reasons not to use games for learning, but there are understandable questions about when and how to do so.

Achievement, exploration, social connection

SEGs are divided into categories. They range from card games based on a pharaoh’s empire in which you build an organization, to mobile addiction treatment systems approved by the United States Food and Drug Administration, to training simulations for ambulance drivers and paramedics created by Montreal-based SimLeader.

“Gamification” researchers, such as David Kaufman from Simon Fraser University, have held multi-million-dollar grants to investigate the use of games in education over the past decade.

Some of the ideas used in gamification are simple. In “Point, Badge, Leaderboard (PBL)” systems, points lead to badges, which lead to a position in a group. You might earn points for learning about the biology of a moth, then more points when you also learn about the biology of arachnids. This provides the kind of moment-to-moment feedback you get in video games.
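To make the mechanic concrete, here is a minimal sketch in Python of how a PBL system might be wired together. The class, point values, badge names and thresholds are all hypothetical, invented for illustration rather than drawn from any particular platform.

```python
# A minimal, hypothetical sketch of a Point-Badge-Leaderboard (PBL) system.
# Point values, badge names and thresholds are illustrative only.

BADGE_THRESHOLDS = {"Novice Naturalist": 10, "Field Biologist": 25}

class Learner:
    def __init__(self, name):
        self.name = name
        self.points = 0
        self.badges = set()

    def complete_topic(self, topic, points):
        """Award points for a finished topic, then check for new badges."""
        self.points += points
        print(f"{self.name} earned {points} points for '{topic}'")
        for badge, threshold in BADGE_THRESHOLDS.items():
            if self.points >= threshold and badge not in self.badges:
                self.badges.add(badge)
                print(f"{self.name} unlocked badge: {badge}")

def leaderboard(learners):
    """Rank learners by points: points and badges feed group position."""
    return sorted(learners, key=lambda l: l.points, reverse=True)

alice, bob = Learner("Alice"), Learner("Bob")
alice.complete_topic("biology of a moth", 10)      # first badge unlocks here
alice.complete_topic("biology of arachnids", 15)   # more points, next badge
bob.complete_topic("biology of a moth", 10)
print([(l.name, l.points) for l in leaderboard([alice, bob])])
```

Every call to complete_topic produces immediate, visible feedback, which is the moment-to-moment reward structure the paragraph above describes.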

In fact, most of the ideas that underlie this field came from video games. There is an impressive body of research showing that video games do improve our brains, despite what well-meaning parents might say when they see children glued to the screen.

And when we take these elements present in good video games — of achievement, exploration, competition and social connection — and use them for learning, an entirely new way of thinking about education emerges.

An addictive quality

It is called playful design in some fields, based on the idea that humans genuinely like to play, and that play is human, not merely childish.

But what is meant by play in this context? It is making learning fun, something you want to do more of, not less of. It is seen as a remedy for “training malaise,” the disappointing finding that the majority of what we teach people is not remembered. We forget things that are banal and remember things that have significance.

A teaching application showing the interface screen for game-based learning in medicine in partnership with ARC Reach, an Edmonton-based technology company. Photo credit Baycrest Health Sciences, author provided.

Serious games work by having learners practise skills and track achievement, but also by giving learning an addictive quality. The “one more move” thinking that keeps video gamers up all night is harnessed for learning. It is based on the release of the neurotransmitter dopamine. What dopamine does is make us want the “next good thing.” Once we get it, we lose interest and want “another good thing.”

This is called a compulsion loop in game design, and can be the basis of making university courses not only relevant, but engaging. It is also a form of active learning, which has been shown to increase grades and decrease failure rates in a major review of research in the field.
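As a rough illustration of that loop, the sketch below simulates a hypothetical quiz game that pays out a small, variable reward after every answer and immediately queues the next question. The questions and reward values are invented for illustration.

```python
# A hypothetical sketch of a compulsion loop: each completed action pays a
# small reward and immediately surfaces the "next good thing".
import random

questions = ["Q1: name a neurotransmitter", "Q2: define gerontology",
             "Q3: what is mixed reality?"]

score = 0
for question in questions:
    print(question)
    # In a real game the player would answer here; we simulate success.
    reward = random.choice([5, 10, 20])   # variable rewards sustain interest
    score += reward
    print(f"Correct! +{reward} points (total {score}). One more question...")
print("Session over; the next session's rewards are already waiting.")
```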

Playgrounds of learning

The best SEGs now are called “open world games” and are a blend of mixed reality and game design. Imagine living in an alternate reality, where you are not you, but an image of you called an avatar, and this avatar is having adventures building something, defeating a plague or combating addiction.

This is called a “conglomerate of player satisfaction loops” and, in short, it gives you so many things to do in this alternate world that you don’t know where to start. These are truly open, exploratory playgrounds of learning: you can study things in any order you want, and there is no scripting of learning in the best SEGs. You simply enter an imaginary digital world and begin to engage with it.
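A minimal sketch of that unscripted structure, with quest names and scores invented for illustration: the learner, not the designer, chooses which activity to tackle next, and each activity closes its own small satisfaction loop.

```python
# Hypothetical sketch of an open-world learning design: no fixed script;
# the learner picks any available activity, each with its own reward loop.
activities = {
    "build a clinic": 15,
    "defeat a plague": 25,
    "combat addiction": 20,
}

completed, score = [], 0
while activities:
    # The player, not the designer, decides what to do next.
    choice = min(activities)        # stand-in for a real player's choice
    score += activities.pop(choice)
    completed.append(choice)
    print(f"Finished '{choice}' (score {score}); {len(activities)} quests left")
```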

Preliminary internal studies by professors Deb Fels, Rob Bajko and other faculty at Ryerson University have shown that the majority of students prefer to learn using games.

Students learning to manage care of the frail elderly using a card-based game. Photo credit Baycrest Health Sciences, author provided.

One long-term study tracked a class of students using game-based learning over a three-year period and found there were many types of players.

Some students were thrilled by the game, some less so, but in all cases, the better they did in the game, the better they did in the courses. This has been replicated many times by the author and other teams.

We learn a lot when we love what we are learning. It’s a basic benchmark of achievement in higher education. Human beings love doing certain things, and learning to become a master of their own world, however fantastical it might be, is one of them.

Game worlds and mixed reality are rapidly developing fields that any educator with even a passing interest in improving student success would be well advised to track as they unfold.

How to get culture right when embedding it into AI

Written by William Michael Carter. Photo credit MIT. Originally published in The Conversation.

MIT’s experiment with a serial-killing AI called Norman, based on Psycho’s Norman Bates, underscores the importance of ensuring we get it right when embedding AI with culture.

If, like Rip Van Winkle, you’ve been asleep for the last decade and have just woken up, that flip phone you have has become super-popular among retro technologists and survivalists alike, and, oh yeah, Artificial Intelligence (AI) is either going to kill you or save you.

AI is the latest in a long line of technology buzzwords that have gripped society, and if we are to believe the people at the respected technology analyst firm Gartner Inc., 2018 will be the year in which AI is truly integrated into our daily lives. As unnerving as the surreal robotics being cooked up at Boston Dynamics or the deployment of facial-recognition AI in Chinese public schools may seem, this technology is a product of the human condition, and as such we are embedding our own culture within its coded DNA.

Debates about AI currently focus on the notion of ethics. In the study of culture, ethics are embedded within values, and they’ve become an important part of the deliberations about how AI will integrate into our lives. What hasn’t been discussed is whose ethics, and ultimately whose values, we are talking about.

Is it Western versus Eastern, or is it American versus everyone else? As values within culture are influenced by the community and larger society, ethics are dependent on the cultural context in which communal values have developed.

‘Enculturation’

Thus, culture plays an important role in the formation of AI through what’s known as the enculturation of its data.

Anthropologist Genevieve Bell, a former Intel vice-president and cultural visionary, was able to steer the tech giant towards a more profound understanding of how culture and AI interplay with each other.

Bell’s research indicated that human interaction with technology is not culturally universal. It is neither the same nor objective, and we encode culture within and throughout technology at a conscious and unconscious level.

Genevieve Bell is seen in this 2015 photo at the Women Innovation & Technology summit in Miami Beach, Fla. Photo credit AP Photo/Wilfredo Lee.

If this is true, what happens in the eventual development of culture in AI?

For anthropologists, human cultural evolution has many markers: The manipulation of tools, the development of abstract thought, and more fundamentally, the creation of language in which to communicate.

Culture begins when two or more living entities start to communicate and exchange information and, with more complexity, ideas. Cultural development among non-human AI entities is something that hasn’t been discussed yet, let alone the melding of human and AI culture.

Bots developed their own language

Recently, Facebook’s AI research group (FAIR) made brief mention of an experiment in which two bots were tasked with negotiating with each other. It was reported at the time that the bots began to develop a more efficient language to communicate with one another.

Facebook computer science researchers quickly pulled the plug on what was rapidly becoming the development of a more efficient AI language between the two bots, not because they were frightened of the emergence of AI self-creation, but because the bots did not return expected results — a negotiation in English.

In a world where code is essentially made up of zeroes and ones, yes or no commands, there isn’t much room for the unexpected. But at times, we should embrace the opportunity and explore the possibilities, as culture does not manifest itself in a singular fashion.

Culture is what we make it. It is a set of norms that we as a society agree upon, consciously or unconsciously, and it frames how we operate within our daily lives.

AI can absorb cultures

AI has the unique ability in the future to absorb all of the world’s cultural norms and values, developing a potentially true pan-global culture. But first, we, the creators of AI, must understand our roles and how we impact that ability to absorb. AI represents, after all, a microcosm of the culture of the people who build it as well as those who provide input into AI’s foundational data framework.

Science-fiction novelist Alastair Reynolds, in his book Absolution Gap, describes a planet in which the only intelligent creature is a vast sea that absorbs information from the beings and creatures that swim in it. The sea learns from that information and redistributes that knowledge to other beings.

The book calls this “pattern juggling.” The current manifestation of AI as we know it is very much like that fictional sea, absorbing knowledge and selectively redistributing it along with its own enculturated data.

Using Reynolds’ knowledge-absorbing ocean as an example, AI is currently like the separated salt and fresh water bodies of Earth — each with its own ecosystem, isolated and independent.

What happens when these very unique ecosystems begin to communicate with each other? How will norms and values be determined as the various AI entities begin to exchange information and negotiate realities within their newly formed cultures?

Norman is a warning

MIT’s Norman, an AI personality based on a fictional psychopath, produced a singular example of what we have long known in humans: with prolonged exposure to violence comes a fractured view of cultural norms and values. This represents a real danger should that exposure be transmitted to other AIs in the future.

How so?

An example of personalities going awry when brought together? Photo credit The Associated Press.

Envision Norman and Alexa hooking up. Both AIs are representative of the people who made them, the human data that they consume and a built-in need to learn. So whose cultural values and norms would be more persuasive?

Norman was built to see all data through the lens of a psychopath, while Alexa, as a digital assistant, is just looking to please. There are countless human examples of similar personalities going awry when brought together.

Social scientists argue that the debate over AI is set to explode and, as a result, that multiple versions of AI are bound to co-exist.

As philosophers, anthropologists and other social scientists begin to voice their concerns, the time is ripe for society to reflect on AI’s desired usefulness, to question the realities and our expectations, and to influence its development into a truly pan-global cultural environment.