Ban children from loot box gambling in video games, say MPs

12 September 2019, 11:09

MPs say that in-game spending should be regulated by gambling laws. Picture: PA

By Maddie Goodfellow

MPs are calling for tougher regulation of in-game spending and a total ban on loot boxes for children.

The UK trade body for the gaming industry has said it will review the MPs' recommendations "with the utmost seriousness".

The comments come after MPs accused those who gave evidence to the committee of "a lack of honesty and transparency".

MPs have criticised free video games for encouraging players, including children, to buy virtual loot boxes. These boxes contain an unspecified selection of items that may improve gameplay.

The MPs also called for both social media platforms and game-makers to establish effective age-verification tools. Currently both rely on an honesty system and, as a result, there are large numbers of under-age users on social media and playing games.

The report also calls upon games companies to accept responsibility for addictive gaming disorders and protect their players from potential harms due to excessive play-time and spending.

Some games have also been criticised for having online marketplaces where these items can be bought and sold.

The committee has called for more stringent age checks in games. Picture: PA

The Digital, Culture, Media and Sport Committee's inquiry into addictive and immersive technologies heard stories of young adults who had built up debts of thousands of pounds through in-game spending.

Gaming companies such as Jagex, the company behind Runescape, admitted players can spend up to £1,000 per week or £5,000 per month in the game.

Representatives of major games including Fortnite maker Epic Games and social media platforms Snapchat and Instagram also gave evidence on the design of their games and platforms.

MPs criticised the industry for being "wilfully obtuse" in answering questions about their games and not giving clear answers regarding data collection, usage and the psychological underpinning of games.

They also said that the industry is reluctant to accept responsibility for intervening in player overspending, or to put a figure on what counts as overspending.

DCMS Committee Chair Damian Collins said: "Their business models are built on this but it's time for them to be more responsible in dealing with the harms these technologies can cause for some users."

"Gaming disorder based on excessive and addictive game-play has been recognised by the World Health Organization. It's time for game companies to use the huge quantities of data they gather about their players to do more to proactively identify vulnerable gamers."

In response Dr Jo Twist, the chief executive of UK Interactive Entertainment, said: "The video games industry has always, and will continue to put the welfare of players at the heart of what we do."

"The industry does not dispute that, for a minority, finding balance is a problem. This is why we are vocal in supporting efforts to increase digital literacy and work with schools and carers on education programmes."

Companies such as Epic Games, the company that created Fortnite, attended the committee. Picture: PA

The main recommendations of the report are:

- Sale of loot boxes to children should be banned

- Government should regulate ‘loot boxes’ under the Gambling Act

- Games industry must face up to responsibilities to protect players from potential harms

- Industry levy to support independent research on long-term effects of gaming

- Serious concern at lack of effective system to keep children off age-restricted platforms and games

- MPs on the Committee have called for a new Online Harms regulator to hold social media platforms accountable for content or activity that harms individual users

- Social media platforms must have clear procedures to take down misleading ‘deep-fake’ videos – an obligation they want to be enforced by a new Online Harms regulator.
