Personofsecrets wrote: »
- I literally only paid for a key that gives me access to this game's server, which is owned by ZoS.
- I agreed to the ToS before playing.
Or you could try hacking the game and setting up your own private servers where you are responsible for your own sensitive data and can do whatever you want there.
> Suggests that the concept of ownership and TOS matters
> Suggests hacking the game
???
- Is there a law in one or more countries requiring ZOS to monitor private chat in an M rated game for illegal acts?
- Is there a law in one or more countries requiring ZOS to monitor private chat in an M rated game for strong language, sexual content?
Veryamedliel wrote: »
- Is there a law in one or more countries requiring ZOS to monitor private chat in an M rated game for illegal acts?
- Is there a law in one or more countries requiring ZOS to monitor private chat in an M rated game for strong language, sexual content?
There is no such thing as 'private chat'. Chat is chat. Period. The only difference for ZoS is the amount of people it reaches, but the rules don't (legally) change because of that.
As to your questions:
1: yes. Everywhere in the EU at any rate. Not sure about the US and other regions.
2. Law? No. Can a ToS legally state that strong/sexual language and the like is not allowed within the game? Yes. Their service, their rules (within the applicable law). Does the ToS state anything of the sort? Yes. Read the rules of conduct. Does ZoS break any law by stating and enforcing such rules? Not that I know of.
AI requires a huge amount of computing power. ZOS using their own server capacity for AI would explain the current state of the game, with a noticeable delay between pushing a key and the ability firing, with potions and weapon swapping having a 50:50 chance to work, overland enemies bringing you into combat before you even see them, and NPCs and players slowly loading in 10-20 seconds after traveling to a quest hub.
But in reality, ZOS is not the FBI, and AI is just the latest buzzword for systems that have existed since the 1970s.
What the "AI" is likely doing is looking for messages that contain "bad words" and storing only those, instead of storing everything in case CS needs it later to look for evidence.
Veryamedliel wrote: »
1: yes. Everywhere in the EU at any rate. Not sure about the US and other regions.
Which EU law is that? I'm wondering because the much discussed "chat control law" has been proposed, but it has not been adopted yet. There was a directive in 2021 which allowed for chat and email providers to scan for severe criminal content, but it was not mandatory.
That doesn't bother anyone, but being monitored in a game where there is NO private chat, that bothers people.
SteveCampsOut wrote: »
People know that their communications on the internet are not secure. You are only fooling yourself if you have any expectations of privacy on the internet. By its very nature it is the transmission and reception of data, and by its transmission methods it can and most likely will be intercepted.
JemadarofCaerSalis wrote: »
SteveCampsOut wrote: »
People know that their communications on the internet are not secure. You are only fooling yourself if you have any expectations of privacy on the internet. By its very nature it is the transmission and reception of data, and by its transmission methods it can and most likely will be intercepted.
This is what I find strange.
Ever since I first got online, 20+ years ago, I realized that what I put out there is both pretty permanent AND not private.
Because I am using someone else's site to put those thoughts, whatever thoughts, out there, and every TOS I have seen has some clause about them storing those thoughts, and some even have clauses that say they can use the things you put on their website for advertising purposes.
I was *always* told to be careful of what I put online for this very reason.
As for an analogy, ESO is like going to ZOS's house for DnD and wanting to have a private conversation in a different room, when you have already been told all conversations are monitored in the house. It is ZOS's house, they can make their rules, as long as they comply with the laws of the countries they are available in, AND they can enforce those rules and change them when they want.
To me, this has always been the reality of using the internet and signing up for sites.
TybaltKaine wrote: »
Equating 1984 with signing up to play a game that tells you that it is logging your actions in order to protect themselves is missing the plot.
You have no expectation of privacy on the internet. Someone, somewhere has access to everything you watch, say, and do. They even tell you that they are keeping that information. Your ISP even tells you this.
There is a system in place in which a customer may report another customer. This system should be sufficient because it includes a first-hand witness of the purported harm. The action was initiated by the customer present at the time.
AI looking for words and then sending information to someone who will look into it is not the same. The action was not initiated by a customer or a human being.
According to those who have been banned, this system has handed out punishment when there is no injured party. Typing the words appears to be the crime; words, which the company will not disclose, could shut down a customer's gameplay for a few days or longer.
You agree that You have no ownership right or title in or to any such Downloadable Content, including, but not limited to, the virtual goods appearing or originating in the Services (such as a Game) or any other attributes associated with any Account or Services. ZeniMax does not recognize any purported transfers of virtual property executed outside of the Game, or the purported sale, gift, or trade in the "real world" of anything that appears or originates in a Service or a Game. Accordingly, You may not sell, and You may not assist others in selling, Service(s) or in-Game items for real currency, or exchange those items for value outside of the Services. Evidence of any attempt to redeem Downloadable Content for a purported exchange, sale, gift, or trade in the "real world" will result in the immediate suspension or termination of Your Account or Membership. You acknowledge and agree that all virtual items represent a limited license right for Your personal, private, non-commercial, non-transferable, and limited use governed by these Terms of Service and are not redeemable for any sum of money or monetary value from ZeniMax at any time. ZeniMax reserves the right to refuse Your request(s) to acquire Downloadable Content, and reserves the right to limit or block any request to acquire Downloadable Content for any or no reason.
ZOS_Kevin wrote:
That being said, we have been iterating on some processes recently and are still learning and training on the best way to use these tools, so there will be some occasional hiccups.
WHY is this training happening on the live servers? ZoS must have terabytes of logs they could train on. Training on live data, and incorrectly banning people, leads to just this, a PR disaster.
karthrag_inak wrote: »
Their system. Their responsibility. Their privilege.
I sure hope you never share anything personal in your 'private' chats with online friends, then.
Is it acceptable for zenimax to scour through your messages and take account actions at random?
Yes it is, but only if we agreed to the change when we were told about it.
Is it acceptable for zenimax to scour through your messages and take account actions at random?
Any action is the result of an ESO staff member following some unpublicised protocol.
To the people that voted no... you agreed to Zenimax's terms of service when you created your account.
https://account.elderscrollsonline.com/en-us/terms-of-service
You agree that You have no ownership right or title in or to any such Downloadable Content, including, but not limited to, the virtual goods appearing or originating in the Services (such as a Game) or any other attributes associated with any Account or Services.
This is Zenimax's game... it's their platform... their sandbox. We pay them to use their service.
While scouring through player messages or account activity without transparency may seem intrusive at first, there are several reasons why it could be legitimate and, in some cases, reflect positively on a company like ZeniMax:
1. Security and Fraud Prevention
Protection against cheating and fraud: One of the most common reasons for monitoring player activity is to prevent cheating, hacking, or other forms of manipulation. In competitive or multiplayer environments, cheaters can ruin the experience for legitimate players. By keeping an eye on unusual or suspicious behavior, the company can maintain a fair playing field.
Account security: Scanning player activity can help identify when accounts are compromised. If someone logs in from an unusual location or engages in suspicious transactions, the company can step in to protect the account (see the illustrative sketch at the end of this post).
2. Toxicity and Harassment Prevention
Enforcing community standards: Games often have rules or codes of conduct to foster positive experiences. By monitoring chat and interactions, companies can detect harassment, hate speech, or other violations of these standards. Ensuring a safe and respectful environment is crucial for retaining players and keeping the community healthy.
Proactive moderation: Instead of relying solely on player reports, monitoring allows the company to be proactive in removing toxic players, thus improving the overall quality of the player base and game environment.
3. Compliance with Laws and Regulations
Legal requirements: Companies may need to comply with various laws related to online communication, such as those dealing with child safety or illegal activities. Monitoring player activity can help ensure compliance with laws that protect users, particularly in certain jurisdictions.
Data protection and privacy laws: Ironically, by keeping track of account activity, the company can also ensure that personal data is not being misused or that players are not engaging in illegal activities that could expose other players' data.
4. Improvement of User Experience
Data-driven improvements: Analyzing player behavior and interactions can provide insights into gameplay patterns, preferences, and frustrations. This data can help developers identify areas that need balancing, content that is underused, or features that are especially popular. It ultimately leads to a better user experience and more informed game design decisions.
Tailored support: By knowing more about how players interact with the game, customer support teams can be more effective. For example, if they can see where a player struggled before they submitted a help ticket, they can offer more personalized support.
5. Intellectual Property Protection
Preventing unauthorized use: Scanning accounts and messages may help a company detect attempts to distribute unauthorized copies, leak confidential information about upcoming releases, or engage in behavior that violates the terms of service.
Protecting in-game economies: In games with virtual currencies or economies, monitoring helps ensure that players are not engaging in activities like real-money trading or botting, which could destabilize the game economy and harm the player experience.
6. Minimizing Legal Risk and Liability
Avoiding lawsuits: In cases of harassment, illegal activities, or other harmful behaviors, a company may be held liable if it fails to act. Regular monitoring can provide the company with evidence that it is being proactive in enforcing policies and preventing harm.
Clear terms of service: Many companies outline in their terms of service the scope of their monitoring. If ZeniMax or any other company clearly communicates these terms when players agree to them, the practice becomes legally justifiable and protects the company from potential legal disputes.
7. Preserving the Integrity of the Game World
Maintaining immersion and narrative: For some players, a cohesive and immersive game world is a key part of their experience. Monitoring and enforcing rules against disruptive behavior (like trolling or griefing) helps to preserve this integrity, ensuring that players' immersion is not interrupted by negative elements.
While transparency is important, players also tend to expect a certain level of protection and fairness in their gaming environment. If done ethically, monitoring can be a valuable tool to ensure a positive player experience, protect the community, and enhance the overall quality of the game.
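For illustration, here is a minimal sketch of the kind of "unusual location" login check mentioned under point 1 above. Everything in it (the data kept per account, the flagging rule, the response) is an assumption made for the example, not anything ZeniMax has documented.

```python
from dataclasses import dataclass, field

@dataclass
class AccountHistory:
    account: str
    usual_countries: set[str] = field(default_factory=set)  # countries seen before

def review_login(history: AccountHistory, country: str) -> str:
    """Decide what to do with a new login based on where the account usually plays."""
    if not history.usual_countries or country in history.usual_countries:
        history.usual_countries.add(country)
        return "allow"
    # A never-before-seen country is only *flagged* for step-up verification;
    # it is not treated as proof that the account is compromised.
    return "require_verification"

acct = AccountHistory("player@example.com", {"DE"})
print(review_login(acct, "DE"))  # allow
print(review_login(acct, "BR"))  # require_verification
```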