Dragonnord wrote: »
allochthons wrote: »Just for info purposes, it appears the new EULA screen is only popping up on EU.
I logged into my alt NA account this morning, no EULA. I switched PSN accounts, logged into EU on my main, EULA.
NA on my main, no EULA.
You are likely accepting the EULA for the account as a whole, which doesn't require accepting it for each server.
I say this because I did have to accept it on PC/NA.
Also, I doubt many have left the game due to chat monitoring for offensive chatter. Most of us do not say stuff that would be an issue, so we do not care. The chatter in forums does not reflect the player base. Most of the player base hardly visits the forums to start with.
From https://www.zenimax.com/en/legal/terms-of-service:

Content Moderation
To the extent that ZeniMax performs any content moderation of UGC to ensure its compatibility with these Terms of Service (including the Code of Conduct or any relevant EULA), such content moderation may be carried out via human review as well as through the use of AI-powered proactive and reactive moderation methods including without limitation, software that uses algorithmic decision making.
ZeniMax's proactive content moderation includes without limitation using tools to block and filter UGC [User Generated Content -Allo] that is illegal and/or incompatible with these Terms of Service.
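For readers unfamiliar with the jargon: "proactive" moderation screens content before it reaches anyone, while "reactive" moderation acts on it after the fact (e.g. on a player report). A minimal toy sketch of that distinction, assuming a simple blocklist - nothing here is ZeniMax's actual code, and all names are invented:

```python
# Purely illustrative sketch of proactive vs. reactive moderation.
# BLOCKLIST, function names, and terms are all invented for this example.

BLOCKLIST = {"badword1", "badword2"}  # hypothetical placeholder terms

def proactive_filter(message: str) -> str:
    """Screen a message BEFORE delivery, masking blocklisted words."""
    masked = []
    for word in message.split():
        stripped = word.lower().strip(".,!?")
        masked.append("*" * len(word) if stripped in BLOCKLIST else word)
    return " ".join(masked)

def reactive_review(reported_message: str) -> bool:
    """AFTER a player report, decide whether the message breaks the rules."""
    text = reported_message.lower()
    return any(term in text for term in BLOCKLIST)

print(proactive_filter("this contains badword1 somewhere"))  # masks the term
print(reactive_review("no rule-breaking here"))              # -> False
```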
Dragonnord wrote: »Also, I doubt many have left the game due to chat monitoring for offensive chatter. Most of us do not say stuff that would be an issue, so we do not care. [...]
I know several players who quit the game specifically over AI monitoring of private chats, and where they saw it leading.
It's not that different from sitting down in a restaurant with a microphone in the flower centerpiece, which in turn isn't that different from an employee of the restaurant standing next to every table, listening just in case something untoward might be uttered by the patrons, because 'hey, it's their restaurant...'.
This makes me wonder: what is going on in some of these chats that is criminal enough to get the police involved?
I'm pretty naive about these things.
I don't know if something like that has actually already happened somewhere, but one thing certain people in my country have made quite a fuss about - usually to justify surveillance - is that terrorists could potentially use game chats to discuss their evil plans and then nuke (insert name of random famous politician here) to the moon or something.
Of course, it is doubtful that people discussing criminal plans would do so in plain language and without encryption of any kind. Also, I'm not sure how a filter against swear words relates to this topic.
Dragonnord wrote: »Most of us do not say stuff that would be an issue, so we do not care. [...]
I don't use slurs and usually don't swear either, so normally, I would not be bothered at all. But AI moderation carries the risk of false positives, which does make me a little concerned. What if it flags a harmless word I use in a conversation in another language because it looks like a slur in English? A human moderator would see that it's not the English-language slur; can the AI (or rather the simple word filter, from what I've seen so far) recognize this? I doubt it.
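That cross-language risk is easy to demonstrate. A toy sketch, with a one-word blocklist I invented for illustration, showing how both substring and whole-word filters produce exactly this kind of false positive:

```python
# Toy word filter illustrating the false-positive problem described above.
# The blocklist entry and example sentences are mine, purely for illustration.

import re

BLOCKLIST = ["dick"]  # vulgar in English; also an ordinary German word ("thick")

def naive_filter(message: str) -> bool:
    """Flags a message if any blocklisted string appears anywhere inside it."""
    text = message.lower()
    return any(term in text for term in BLOCKLIST)

def word_boundary_filter(message: str) -> bool:
    """Flags whole-word matches only -- fewer false positives, but not zero."""
    text = message.lower()
    return any(re.search(rf"\b{re.escape(term)}\b", text) for term in BLOCKLIST)

# Substring matching flags innocent English words (the classic "Scunthorpe problem"):
print(naive_filter("Benedick is my favourite Shakespeare character"))  # True
# Even whole-word matching flags a harmless German sentence ("the coat is thick");
# only a reader who knows German can tell it is not the English vulgarity:
print(word_boundary_filter("Der Mantel ist dick"))                     # True
```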
allochthons wrote: »The actual AI monitoring text is below. Someone posted a screenshot on the last page, but this should allow text searches: From https://www.zenimax.com/en/legal/terms-of-service [...]
"AI-powered proactive and reactive moderation" directly contradicts the information provided here by @ZOS_Kevin. Whether they actually use proactive AI moderation is unanswered AFAIK. But it is specifically in the EULA, now.
It is a legal requirement in some of the countries the game operates in.
https://lewissilkin.com/en/insights/digital-commerce-creative-101-online-safety-act-navigation-for-the-video-games-industry
The OSA imposes a number of compliance requirements which will be regulated, investigated and enforced by Ofcom, the regulator for online safety.
- Risk assessments: In-scope game studios will need to carry out regular OSA risk assessments to identify the illegal harms which may arise in the context of their games – for example, whether chat rooms could support dissemination of hateful content, or if any illegal in-game behaviour might encourage real-world illegal harms.
- Child risk assessments: Additional – and more rigorous – risk assessments will be required wherever children are a likely game audience. Specific guidance from Ofcom on best practice is expected this Spring.
- Content moderation: All in-scope platforms must actively monitor and remove illegal content, which may include human, manual or automated content moderation tools.
- In-game reporting systems & complaints procedures: Players must be able to easily report illegal and harmful content within games, with straightforward complaint procedures.
- Updated Terms of Service: games’ EULAs and Terms of Service must specify how users are protected from illegal content, explain relevant complaint handling policies and new breach of contract rights for users. Where relevant, Terms of Service will also need to clarify how children are prevented from coming across illegal or harmful content, including in respect of any proactive technology used to do so.
- Reporting and record-keeping: operators of online games must keep records of risk assessments and outcomes and report findings to Ofcom.
So it may be legally required. But I'm wondering now how exactly this is supposed to work in a fantasy game set in a world where violence is commonplace and also a normal game objective ("Kill 15 innocent people with the Blade of Woe."). How are we supposed to talk to other players about game content under these rules?
PrincessOfThieves wrote: »Yeah, it might be, but people allegedly get punished for things that are not illegal, such as consensual RP.
I would really like ZOS to clear things up and explain what really is allowed and what isn't, because with the evidence we have, it seems like they are taking a very heavy-handed approach to this.
How are we supposed to talk to other players about game content under these rules?
Thousands do it every day without a problem. The requirement to moderate what happens on their chat system was imposed on ZOS, and there will be a learning curve for the A(-not-so-)I as well as for the human oversight. That there are not dozens of forum posts complaining about being banned or punished would indicate that the problem of mis-punishing is not widespread.

PrincessOfThieves wrote: »[...] it seems like they are taking a very heavy-handed approach to this.
ZOS are heavily reliant on automated monitoring - AI is really not that intelligent. There are going to be mistakes, and ZOS absolutely need to minimize them, but the nature of that specific UK law does not give ZOS any leeway at all.
ZOS would be very hard pressed to eliminate all mistakes, because they have to be heavy-handed.
Edit: Just to add as well that ESO's age rating varies by country, with some allowing 14+, others 18+, BUT these are legal requirements on purchasing, not playing. And you can moan about parental responsibility as much as you like, but the UK OSA doesn't care about that - and ZOS HAVE to take account that children may be playing.
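One common way to handle that learning curve - a hypothetical sketch, not ZOS's actual pipeline - is to route automated flags by confidence, so only clear-cut cases are actioned automatically and ambiguous ones go to a human:

```python
# Hypothetical routing of automated moderation flags with human oversight.
# The Flag class, thresholds, and scores are all invented for this sketch.

from dataclasses import dataclass

@dataclass
class Flag:
    message: str
    score: float  # the model's confidence that the message breaks the rules

def route(flag: Flag, auto_threshold: float = 0.95,
          review_threshold: float = 0.50) -> str:
    """Decide what happens to an automated flag; thresholds are invented."""
    if flag.score >= auto_threshold:
        return "auto-action"         # clear-cut violation, handled automatically
    if flag.score >= review_threshold:
        return "human-review-queue"  # ambiguous: RP, other languages, in-jokes
    return "no-action"               # too weak to act on

print(route(Flag("Kill 15 innocent people with the Blade of Woe!", 0.72)))
# -> "human-review-queue": exactly the kind of in-game phrase a human should clear
```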
PrincessOfThieves wrote: »Nowhere does it say that they have to be super heavy-handed, aside from situations where something illegal is happening. But I don't think any sane person would be against them intervening when someone is grooming a child in their game or something like that.
They have to monitor everything. Some content may or may not be illegal - but ZOS have no way of knowing which. Example: two players are RPing and the language becomes sexualised. Is this illegal? Well, that depends: how old are the players? How does ZOS know one of them isn't 14, or 12, or 10? They don't. They are required to presume that one or both may be, though.
Edit: Typos
spartaxoxo wrote: »This makes me wonder: what is going on in some of these chats that is criminal enough to get the police involved?
Predators preying on children, school shootings, swatting, terrorist/extremist groups, and drug dealing. Also cyber bullying and the non-consensual sharing of intimate images. Some of this is also something you can flag for on the forums.
The game is rated M, so there shouldn't be anyone under the age of 18 in game, at least not in the USA. Other nations have varying standards, but the point remains the same. There is an effective ignore function in ESO, so that should take care of most of the bullying. And there is no means for sharing images via ESO, so that's not an issue.
An M rating in the US means 17+ - and that's a suggestion, not a requirement. 17-year-olds are legally minors.
PrincessOfThieves wrote: »Then they would have to remove all violence from the game on the off chance a 6-year-old starts playing it and gets scared by all those corpses and demons in the tutorial. And more adult-oriented games (BG3, for example) would just be banned worldwide.
"Think of the children" is an emotional and, frankly, manipulative argument. It should of course be forbidden to intentionally expose kids to adult content, but the mere chance of kids clicking "I am 18" when installing a game doesn't mean that all games must be E-rated. It is the parents' responsibility to control their kids' internet access.
And even if you were correct... How come a newly added companion can tell me "Is that all that silver tongue can do?" out of the blue, and I can't even tell them it makes me feel uncomfortable, yet consensual RP between adults can get people banned? This doesn't make any sense. This game has a ton of violence and innuendos.
They have to monitor everything. [...] They are required to presume that one or both may be, though.
This would be the end of all "private" (as in chat whispers and "private message" systems) online communication, because every place where people can enter text of some kind could potentially be used to write something lewd, and one or more of the participants could potentially be a child.
The only way to avoid this would be to require a passport to register or log in (although even then it wouldn't be 100% safe, as a child could use an older sibling's or parent's passport).
What about communication by phone, btw? How would the phone provider know that a person saying something lewd to their partner over the phone isn't a child using a voice changer?
PrincessOfThieves wrote: »Then they would have to remove all violence from the game [...] "Think of the children" is an emotional and, frankly, manipulative argument. [...]
They are required to assess whether or not game violence can lead to real-world harm. That is not the same risk assessment as "Can an adult use our chat system to exchange sexually explicit messages with a minor?".
"Think of the children" is the entire basis of the UK's OSA and the EU's DSA. ZOS have to comply if they want UK/EU players; it's that simple. The issue isn't ZOS, it's the laws passed.
Dragonnord wrote: »No one appreciates being spied on.
Spied on? How can we be spied on? We are in THEIR house.
It's like you enter my house and then accuse me of controlling or checking what you are doing or saying in MY house.
We own NOTHING in this game and we accepted that when we installed it and agreed to the ToS.
Game chat belongs to ZOS, not to us, so they can do whatever they want.
The fact that you use some company's infrastructure doesn't change the fact that you are a human being with your own rights (human rights, citizen rights - guaranteed by law).
Using your logic: when you use a mobile phone, you are in your mobile network provider's "house" - I assume you don't have your own satellites, antennas, and whatever other devices they use - so does that mean the provider can spy on all your phone communication?
The same goes for the internet: you also use your provider's cables and all their infrastructure - does that mean they should be able to spy on you?
PrincessOfThieves wrote: »You are misrepresenting these laws. Nowhere does it say that everything must be E-rated in case a child gets access to it.
I live in the EU, and adult chats and websites are very much legal here even though there is of course a chance that a kid clicks on that "I am 18" button. There is a difference between trying to groom a child online or exposing them to adult content and, well, adult content and conversations existing.
You also conveniently ignored the second part of my message.
dk_dunkirk wrote: »If ZOS were required to actively monitor private messaging by law, then why doesn't Discord? AFAICT, they are not doing this. Discord operates globally as well. If they don't have to do it, then ZOS doesn't either. So ZOS is doing this because they want to, not because they have to.
Let me just be clear: I think the DSA is badly crafted and the OSA is massive government overreach, but that doesn't stop them being the law, or allow companies to decide they won't follow it.
If Discord chooses not to follow the law, it will find itself on the hook for a hefty fine or prosecution from the UK government. "They broke the law, so I don't have to follow it" is not a defense anywhere.
A couple of things, though: Discord requires age verification - according to Discord. They are also currently looking at implementing the requirements of the OSA - according to Discord.
Oh, and it's also worth noting that Discord is not a computer game, so different aspects of the OSA apply.
The Act’s duties apply to search services and services that allow users to post content online or to interact with each other. This includes a range of websites, apps and other services, including social media services, consumer file cloud storage and sharing sites, video sharing platforms, online forums, dating services, and online instant messaging services.