We spoke in detail with Natalia Vozyan, VP of Legal & Operations at Xsolla, about how countries around the world are starting to tighten verification rules in games.
Alexander Semenov, App2Top: We're seeing more and more news reports about the tightening of online verification tools. Let's start with the basics: when did this topic actually begin?
Natalia Vozyan, Xsolla: The idea of user verification is rooted in the discussion about protecting children in the digital space. This agenda started gaining momentum around 2020 — at that time, every reputable legal journal was writing about the risks for children online.
Who was the first to seriously take on verification? China?
Natalia: No, although China is among the pioneers, the first significant step was the Children's Online Privacy Protection Act (COPPA) in the US, which came into effect on April 21, 2000. China followed, and only relatively recently, in 2018, the European Union's General Data Protection Regulation (GDPR) came into force.
Did it become a milestone?
Natalia: Absolutely. GDPR effectively introduced the concept of "know your customer" (KYC) regarding personal data, including children's data. After it, many countries took the European experience as a model and adopted local regulations based on it.
Where is verification being actively implemented today?
Natalia: In the UK, provisions of the Online Safety Act came into force in 2025, aimed at regulating content for children and requiring verification when selling digital content. Brazil has introduced the most radical measures, literally this year: it's no longer enough to simply click "I confirm I am 18"; verification through real documents or biometrics is required, and self-declaration is banned outright.
You just mentioned self-declaration. Before we move on, let's stop there for a moment. What did you mean by that?
Natalia: Self-declaration is one of the verification tools. They can be divided into three types. The first is self-declaration: the user simply checks the "I am 18" box. It's quick and convenient but absolutely unreliable. The second is verification through data: credit card, phone number, ID. Slightly more reliable. The third is verification through documents or biometrics: uploading a passport, Face ID, cross-referencing with government databases. It's the most reliable.
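The three tiers Natalia describes form an ordered scale of assurance, which regulators effectively raise or lower by banning weaker methods. A minimal sketch of that ordering in Python (all names here are illustrative, not part of any real regulation or API):

```python
from enum import IntEnum


class VerificationTier(IntEnum):
    """Hypothetical ranking of verification methods by assurance level."""
    SELF_DECLARATION = 1        # "I am 18" checkbox: quick, unreliable
    DATA_CHECK = 2              # credit card, phone number, ID number
    DOCUMENT_OR_BIOMETRIC = 3   # passport upload, Face ID, state database


def meets_requirement(available: VerificationTier,
                      required: VerificationTier) -> bool:
    """A method satisfies a regulatory floor if it is at least as strong."""
    return available >= required


# Banning self-declaration, as Brazil did, raises the floor to tier 2:
assert not meets_requirement(VerificationTier.SELF_DECLARATION,
                             VerificationTier.DATA_CHECK)
assert meets_requirement(VerificationTier.DOCUMENT_OR_BIOMETRIC,
                         VerificationTier.DATA_CHECK)
```

The point of the ordering is that each tightening of the law simply moves the required tier upward, and any method below it stops counting as verification at all.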
So, is there a move away from self-declaration now?
Natalia: Yes, and it's quite logical: modern children start using phones before they can talk and understand perfectly well that to get access to content they just need to answer "yes" to the question "are you 18?". Regulators have realized this and are moving toward more trustworthy tools. An additional trigger was the wave of parental complaints about unauthorized spending by children: in-game purchases on parental credit cards for significant amounts. That created public pressure and accelerated the shift to stricter methods of identity confirmation.
Including in Russia?
Natalia: The situation in Russia is different: user verification remains nominal, and the protection of certain groups is regulated mainly by the law "On Advertising". That said, public initiatives proposing mandatory identification for internet access are appearing more and more often.
Earlier, you mentioned that tightening verification is logical. From the state's perspective as an institution, yes. But I'm interested in your personal take on the current situation?
Natalia: If we talk about my stance — there's an obvious imbalance of interests here. There are industries where KYC is completely justified. But the gaming industry isn't the financial sector. A game developer doesn't need to know their user's passport details to offer their services. However, in Brazil, they're now required to store biometrics, adding operational burdens and enormous regulatory risks.
Regulators are essentially shifting the responsibility of raising and controlling children from parents to businesses. Monitoring what content a child consumes and how long they spend in games is primarily a parental function. Businesses undoubtedly should observe reasonable restrictions, but when a game developer is required to collect biometrics and verify every user's age — it's an excessive burden not aligned with the nature of the business.
I don't question the importance of compliance and protecting vulnerable groups — these are fundamental values. The issue is proportionality: regulation should consider the industry's specifics and not create barriers where lighter tools suffice.
You've made a good point that all of this is an excessive burden for businesses. But if I understand correctly, verification primarily affects gaming platforms rather than developers, right?
Natalia: Look, the current wave of verification restrictions has primarily affected two major categories.
The first is financial services: banks, payment systems, cryptocurrency exchanges. Here, user verification has long been the norm, and KYC along with other procedures is the industry standard, so there's little surprise.
The second category is entertainment services. Here, it's complicated. It includes streaming platforms, social networks, online gambling, and, of course, video games. These are currently under the most regulatory pressure, as regulators see them as the main risks for children and teenagers — whether it's access to inappropriate content, excessive time online, or in-game purchases.
The logic of regulators is generally understandable: if a platform allows minors access to content or sells them something, it must control that. The question is whether the proposed tools are proportionate to this task.
What do regulators demand from these services, and what are they trying to achieve?
Natalia: The declared goals are noble: protecting minors from inappropriate content, combating fraud, ensuring transparency of the digital environment. On paper, verification appears to be a tool for public good.
But in practice, the picture is more complex. By mandating businesses know their users, the regulator essentially acquires a ready-made data collection infrastructure on citizens without incurring costs themselves. Upon request from a state body, any company is obliged to provide the accumulated information. It's convenient.
Add to this the penalty element: non-compliance with verification requirements leads to severe sanctions, becoming a notable revenue source for the budget.
I wouldn't say the legal aim is just pretty words. But for the regulation to truly work in society's interest — not just on paper — a dialogue between the regulator and business is crucial. This balance is clearly lacking for now.
Talking about changes in verification specifically in the gaming sector, I'd like to focus on China, where restrictions were implemented earlier than anywhere else (not about legislative activity, but about where developers and users initially encountered restrictions on the ground). Tell us about the experience of Chinese game developers, what challenges they faced and how they addressed them?
Natalia: China can indeed be considered a real pioneer in this subject. As early as 2007, several ministries jointly proposed a system of restrictions for minors in video games to protect their physical and mental health. A time classification was introduced: how many hours a day representatives of different age groups could spend in games.
As is often the case, developers were left to figure out implementation on their own, without specific tools. Over several years, major market players experimented independently: collecting data from identification documents, implementing biometrics, all at a cost to conversion and with enormous operational expenses.
It wasn't until 2021 that the government systematically engaged with the problem. The NPPA (National Press and Publication Administration) launched a centralized national verification infrastructure, fundamentally changing the model. Now, the developer makes a request to the government database, which returns user age in real time and automatically applies appropriate restrictions. Thus, the burden of storing sensitive data was removed from businesses.
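The flow Natalia describes, where the developer queries a state database and receives the user's age in real time instead of storing documents itself, can be sketched roughly as follows. Everything below is a hypothetical mock: the actual NPPA interface, field names, and IDs are not public in this form.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class AgeCheckResult:
    found: bool
    age: Optional[int] = None


# Stand-in for the centralized state registry. In the real model the
# developer only receives the lookup result and never stores the
# underlying documents or biometrics.
_MOCK_REGISTRY = {"ID-0001": 15, "ID-0002": 34}


def government_age_lookup(national_id: str) -> AgeCheckResult:
    """Hypothetical real-time query to the centralized database."""
    age = _MOCK_REGISTRY.get(national_id)
    return AgeCheckResult(found=age is not None, age=age)


def apply_restrictions(national_id: str) -> str:
    """Developer-side logic: query the registry, then gate access."""
    result = government_age_lookup(national_id)
    if not result.found:
        return "deny"        # unverified users get no access
    if result.age < 18:
        return "restricted"  # playtime limits apply to minors
    return "full"
```

The key design point is visible in the sketch: the developer's system holds only the returned decision, so the burden of safeguarding sensitive identity data stays with the state, not the business.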
However, one cannot say the problem is fully solved. Workarounds remain: children use their parents' documents, and some parents consciously help their children bypass verification. Developers identify such users after the fact, for example, through support service inquiries or refund requests, identifying minors by their communication style and phrasing.
I've also heard that South Korea attempted a similar model but eventually decided to abandon the practice?
Natalia: Korea is a telling example of how strict regulation can evolve into a more balanced approach. In 2007, the country introduced an internet verification system based on real names: users of major sites had to confirm their identity via a resident registration number. Then, in 2011, the so-called Shutdown Law prohibited children under 16 from playing online games from midnight to six in the morning.
However, by 2012, the Constitutional Court overturned the internet verification system, recognizing it as an excessive restriction on free speech and citizens' right to anonymity.
The Shutdown Law lasted longer, but in 2021 the government scrapped it as well. Compulsory restrictions were replaced by a parental-choice system: now parents decide when and how long their child can play.
Thus, Korea transitioned from state control to parental responsibility, which I think is a logical direction for the gaming industry.
We began our conversation essentially with the European Union. Now I'd like to return to it in light of the measures taken in Asia. Compared to China, or to the earlier situation in South Korea, the gaming sector in Europe doesn't yet seem to feel a significant shift in verification approaches. Is that the case?
Natalia: Overall, yes. I think European businesses are still digesting the impact of implementing GDPR. The regulation imposed a huge load: the documentation alone, describing all data processing procedures, requires significant time and financial resources. The industry isn't experiencing any new major upheavals yet.
In the gaming field, PEGI — the Pan-European Game Information age rating system — acts as a protective buffer. Along with personal data protection requirements and refund rules, it currently maintains reasonable order without enforcing strict verification.
Moreover, GDPR already includes a crucial provision: processing data of children under 16 without explicit parental consent is prohibited. Formally, this implies age verification requirements. Yet, unless these are enshrined in specific binding acts for the gaming industry, businesses are balancing these norms without reaching the point of checking each user's passport.
You also mentioned the British Online Safety Act. It seems verification is stricter in the British Isles than in mainland Europe. I've heard that even Steam has complied with its requirements.
Natalia: The UK is one of the most serious examples of a regulator truly intending to enforce what's written on paper.
New provisions of the Online Safety Act prohibit self-declaration, verification through payment methods that don't guarantee adulthood, and restrictions via user agreements — according to the regulator, these don't meet the "highly effective verification" threshold.
And, yes, Valve responded to the law and updated its Steam policy: British users must now undergo verification to access games with an 18+ rating. Valve opted for an approach based on a linked credit card — an elegant solution with minimal impact on conversion. Although many experts doubt if it will be seen as "highly effective".
The British regulator also recently fined AVS Group, managing 18 adult sites, £1 million for lacking proper age verification. The issue wasn't the absence of verification per se — the AVS system allowed uploading a photo without a "live check", meaning a child could merely present someone else's photo to the camera.
The law is already altering the behavior of major players: YouTube, Spotify, and several other platforms have introduced verification via ID cards.
Stern measures. What's happening in the States? You said they were the first to address verification, but what's the situation now?
Natalia: It's challenging to talk about the US because the situation is unfolding on two levels — state and federal — and they are advancing in different directions.
At the state level, the process is swift. In 2025, age verification laws came into effect in nine states, including Florida, Georgia, and Missouri. For now, the primary targets for regulators are platforms with adult content and social networks. The gaming industry hasn't been addressed directly yet, but the direction is clear: the boundaries are constantly expanding.
On the federal level, there are attempts to create a unified standard — for instance, in November 2025, a congressman introduced the Safer GAMING Act, mandating online game developers to incorporate parental control tools, particularly the ability to disable chats between a child and other users. A national standard would be a wise solution since keeping track of each state's requirements and adapting to them individually is nearly impossible.
As for prospects — the picture is ambiguous. The First Amendment to the Constitution, guaranteeing free speech, already acts as a real barrier: courts have blocked several state laws on this ground. Research also shows that verification isn't achieving its goals — users simply move to other platforms or use VPNs. After the law was enacted in Florida, demand for VPNs increased by more than a thousand percent.
Against the backdrop of tightening practices — should we expect full passport verification in games with a high age rating?
Natalia: Full passport verification in games with a high age rating is no longer a question of "if" but of "when" and "where". Brazil has effectively already gone down this path. The UK is moving in the same direction. China has implemented a centralized model in which the state itself acts as the verification operator.
Technically, it's feasible; the question is at what cost. Imagine a platform with a long-standing user base, where people have built up accounts and invested real money for years, suddenly asking everyone to show a passport. The audience's reaction is predictable, and conversion losses are inevitable. Beyond that, developers become operators of sensitive personal data, with all the ensuing responsibilities: storage systems, leak protection, compliance with local data processing regulations.
The issue isn't whether it will happen — but how prepared the industry will be when it does.
Oh, we are in for trying times. Thank you for the conversation!
