
Bots - The Technical Reasons Behind Them

  • aleister
    ✭✭✭✭
    HarryWolfe wrote: »
    just want to point out that for all its vaunted systems, WoW was still compromised.

    Floating gnome corpses spelling out gold seller site addresses.

    Warsong Gulch flag captures in the first 3 seconds of a match.

    Underground farm bots.

    Yes, there were bots in WoW too, but I played it from day 1 for years and was never personally impacted by them -- polar opposite experience in this game.

    ESO will become the textbook MMO design counter-example.
  • starkerealm
    ✭✭✭✭✭
    ✭✭✭✭✭
    aleister wrote: »
    krix_ost wrote: »
    ZeniMax has chosen to use client-side trust for much of the game's workload. This does have advantages, because every computer can do its own calculations and send the results to the server for updating. However, it seems this data is not encrypted (encryption adds overhead and complicates network delivery), as hinted above.

    If true, this is astonishingly naive of them. And besides, if the client-side approach was for performance reasons, it sure as hell isn't working. This is the laggiest, most unresponsive MMO I've played in the last 10 years.

    From what I've read - from Zenimax, from users, and from ZOS's own behavior... so, take this one with a grain of salt - the largest source of lag in the game is probably some offshoot of the guild system.

    I'm not sure if it just wasn't coded for the volume it receives or if the servers are really that fragile. I have the suspicion something's really broken under the surface here that we're just not seeing.
    And "trusting" is not the same as designing with client-side movement - the client can do the movement and collision calcs and still not be "trusted" by the server.

    (BTW - most of my background is in client-server design.)

    Then how come you can't see the overwhelming evidence that Zenimax trusts the client far too much, and your caveats and nuance just don't apply here?

    I suspect the real reason they coded it this way was for the consoles. Done right, client-side movement requires far less network traffic than a conventional setup. They can kinda trust the client on a console, because the whole system is already locked down and secured. And, unlike the previous generation of consoles, the XBone and PS4 can certainly take the hit of handling a lot of stuff locally.

    Historically, Bethesda has not liked pushing out different engine builds for different platforms (part of what happened with Skyrim on the PS3, IIRC, and the entirety of what happened when they released Rage for the PC). Normally, I couldn't blame them, but by launching ESO on the PC first, we're seeing all the ways this can go catastrophically wrong in an unsecured environment. Well... technically not all the ways. I'm sure there are more, lurking.
    Edited by starkerealm on 21 May 2014 14:03
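For illustration, the distinction drawn above - the client computing movement without being blindly trusted - can be sketched as a server-side plausibility check. This is a hypothetical sketch, not ESO's actual netcode; the class name, the speed cap, and the 10% tolerance are all assumptions:

```python
import math

# Server-side sanity check for client-reported positions. The client
# still does its own movement and collision maths (cheap for the
# server), but each reported position is checked against the fastest
# legal movement since the last accepted update.
class MovementValidator:
    def __init__(self, max_speed=12.0):
        self.max_speed = max_speed   # hypothetical cap, metres/second
        self.last_pos = None
        self.last_time = None

    def accept(self, pos, now):
        """Return True if the reported position is plausible."""
        if self.last_pos is None:
            self.last_pos, self.last_time = pos, now
            return True
        dt = now - self.last_time
        dist = math.dist(pos, self.last_pos)
        # Reject teleports and speed hacks; allow 10% jitter tolerance.
        if dt <= 0 or dist > self.max_speed * dt * 1.1:
            return False
        self.last_pos, self.last_time = pos, now
        return True
```

A check like this keeps the bandwidth savings of client-side movement while denying the most obvious teleport exploits, which is presumably what "not trusted by the server" means in practice.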
  • aleister
    ✭✭✭✭
    Chirru wrote: »
    Right now the bots are still confined to the lower- to mid-range levels. But this will change with the new expansion. The general outcry following this might damage the reputation of the game greatly.

    It's already happened. Do a Google News search on Elder Scrolls. If you ignore the press releases and paid placement articles about Craglorn, at least half of the real articles and blog discussions are about the unprecedented bot infestation.
  • rowdog
    When I first realized that ESO trusts the client to do movement calculations, it blew my mind, because "never trust the client!" After I calmed down a bit and realized that whoever wrote ESO's netcode couldn't possibly be that stupid, I started to wonder why. Assuming they're not idiots, why would ESO intentionally toss out what is probably THE prime tenet of secure network programming?

    Traditionally, security is a trade-off against convenience, but security can suffer for other reasons as well. My theory is that it's one or more of...

    1) Large scale PVP performance.
    2) Economics: more work on the client means smaller server bills.

    Other people have said

    3) Lag, especially at launch
    4) Consoles are a semi-secure client so maybe they just ignored security on the PC side of things.

    As I seriously doubt they said "eh, screw the players, it's a million bucks more to have an authoritative server", I'm inclined to think that large scale PVP is the driving force.
  • starkerealm
    ✭✭✭✭✭
    ✭✭✭✭✭
    rowdog wrote: »
    Assuming they're not idiots, why would ESO intentionally toss out what is probably THE prime tenet of secure network programming?

    I'm going to go with naivete. I know I said it was probably for the consoles, and I stand behind that, but across the board, the game made some remarkably naive assumptions that players wouldn't try to break the game.

    For some reason the example that comes to mind is the guy, a couple weeks after launch, who rolled up a character named Gamemaster-something-or-other and was trying to scam players. Because ZOS never put "Gamemaster" in the name filter for players, and didn't give actual GMs any distinguishing characteristics in the text box.

    To be honest, the chat system itself is another example. No one thought someone would generate ten lines of text to deliberately block out the entire chat window with an ad and hide their user name.

    Even if the chat box just snapped to the top of the last post instead of the bottom, we would have had a much more usable system at launch. But, again, no safeguards, including against people manually editing item hyperlinks, or pretending to be GMs.

    And, of course, once people did start reporting the bots, they stuck fake names in brackets, which... I mean, I've seen them pull this stunt in other games, hoping to confuse you, but I've yet to see another game where it worked. People were trying to report [superman] and the like, because until you got a "no such user" response there was no way in the interface to tell it was a fake ID, and that the real spammer was the previous poster babbling something unintelligible, just like 90% of the users in zone. As far as I know, the spammers never started using actual player names, but...

    Anyway, I'm going with naivete. They forgot that users exist to break everything you code as quickly as possible.
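The missing safeguard described in the post above is cheap to add. A minimal sketch of a reserved-name filter, assuming made-up patterns (this is not ZOS's actual list, which was never published):

```python
import re

# Hypothetical reserved patterns: reject names that impersonate staff.
RESERVED_PATTERNS = [
    re.compile(r"game\s*[-_.]?\s*master", re.IGNORECASE),
    re.compile(r"\bgm\b", re.IGNORECASE),
    re.compile(r"zenimax|zos", re.IGNORECASE),
]

def name_allowed(name: str) -> bool:
    """Return False if the character name matches a reserved pattern."""
    return not any(p.search(name) for p in RESERVED_PATTERNS)
```

A filter like this runs once at character creation, so the cost of being thorough is essentially zero - which is what makes its absence at launch read as naivete rather than a trade-off.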
  • ciannait
    ✭✭✭
    Aci wrote: »
    In other games there are a lot of instanced areas. In this game there is no such thing for the public dungeons. Maybe the number of bots is no higher than it was in other games; I truly think it's this painful because we actually see them now. There's not much place for them to hide.

    Don't get me wrong, I like the open world and the megaserver. If only ZOS could find a way through all that...

    Are you familiar with the "phasing" aspect of the game? There are many instances (called phases) on this megaserver. (Don't believe me? Try grouping with someone - chances are good they won't be in your phase, but that's a rant for another day.)

    So it's even worse than all that. For all the ones you do see, there are hundreds or thousands more that you don't.
  • BrassRazoo
    ✭✭✭✭✭
    Well, technically many of us are lazy.
    That is why Bots and their associated programs exist.
  • Loco_Mofo
    ✭✭✭✭✭
    BrassRazoo wrote: »
    Well, technically many of us are lazy.
    That is why Bots and their associated programs exist.

    I wish Zeni could target the actual gold buyers more. Banning those tossers would help starve the gold sellers of profits, thus cutting down on the amount of bots/gold spam.
  • Publius_Scipio
    ✭✭✭✭✭
    ✭✭✭
    I say the gold sellers and bots should be nuked with extreme prejudice. Take away some of the client side stuff giving them their abilities and let them rot.
  • thjudgeman1142ub17_ESO
    Best thread on this topic I have read since launch!!!!
  • Anex
    ✭✭✭
    Really interesting thread even if the reason for it is not so pleasant -_-

    In a case like this I would be more apt to blame marketing/managers than the devs. Being married to a developer, I have lost count of the number of times he has told me stories of shoddy work forced through to reach a deadline, or wild promises marketing made that the team then had to uphold despite their better judgment. It seems to crush his spirit a bit, because who wants to make something crappy?

    Anyway this client-side business certainly confirms the suspicions I had when I lost my internet connection while playing. I was kind of disturbed by the fact that I could still run around (albeit not interact with anything) and that when I got my connection back, all was as if I had never disconnected.

    I guess they decided to go that route for lag reasons etc, but probably never really thought or considered how big the botting problem could become (if they considered it at all, but I am sure SOMEONE thought of it even if was scoffed at).

    Assassination/ Dual Wield Specced Stamina-based Nightblade, because I like Hardmode apparently
  • liquid_wolf
    ✭✭✭✭
    Anex wrote: »
    I guess they decided to go that route for lag reasons etc, but probably never really thought or considered how big the botting problem could become (if they considered it at all, but I am sure SOMEONE thought of it even if was scoffed at).

    Without a doubt, bots were brought up. Ideas were discussed, and plans were made...

    But ultimately every MMORPG is different. They likely decided on a "wait and see" approach, watching how their system would be compromised.

    I'm certain plenty of protests were made about the design decisions here... but in the end certain parts were pushed back to make sure the project completed on time.

    Could they expect bots were going to go invisible and manipulate the X/Y/Z coordinates? A possibility, but they could just as easily have done some kind of speed adjustment.

    When it comes to manipulating things on the client side, you will never be able to combat it until it starts.

    They probably could have not put so much on the client, but in all reality their whole server setup was something brand new.

    Now that it appears everything is working server side, I am confident they will work on correcting and securing things on the client.

    After reading a number of the posts and replies on this thread, I can tell a great many of the people responding know what they are talking about.

    But know very little about the project itself.

    You guys see a wall. You see the cracks, and the poor construction of the wall.

    But completely ignore what is inside the wall... where most of the time and energy was spent.

    This is coming from a project manager.
    I get into these arguments and discussions with developers, code monkeys, and analysts all the time. I've had to fire people because they were right. They were very intelligent, and an incredible asset... but couldn't work with the plan we had.

    They were absolutely right... but it doesn't matter.

    You do what you can, in the time that you can, and correct it later. Always plan to correct it later.

    Because after all... this isn't like erecting a building. You can make sure you can pull pieces out without the whole thing collapsing.
    Edited by liquid_wolf on 22 May 2014 17:07
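Catching the underground, coordinate-manipulating variant mentioned above only requires the server to compare the client's reported height against its own terrain data. A hedged sketch; the heightmap class is a stand-in for real per-zone terrain data, not ESO's API:

```python
# Stand-in for the server's terrain data; real maps vary per zone.
class FlatHeightmap:
    def __init__(self, ground_z=0.0):
        self.ground_z = ground_z

    def height_at(self, x, y):
        return self.ground_z

def below_terrain(x, y, z, heightmap, tolerance=2.0):
    """Flag a reported position sunk below the ground
    (the classic underground farming bot)."""
    return z < heightmap.height_at(x, y) - tolerance
```

The tolerance allows for stairs, basements, and interpolation error; anything metres below the mesh is a strong signal worth logging even if it isn't auto-banned.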
  • starkerealm
    ✭✭✭✭✭
    ✭✭✭✭✭
    Anex wrote: »
    Really interesting thread even if the reason for it is not so pleasant -_-

    In a case like this I would be more apt to blame marketing/managers than the devs. Being married to a developer, I have lost count of the number of time he has told me stories of forced shoddy work to reach a deadline, or wild promises marketing made that they have to uphold despite desires/better judgment etc. Seems to crush his spirit a bit because who wants to make something crappy?

    I can relate, really. :(
    Anex wrote: »
    Anyway this client-side business certainly confirms the suspicions I had when I lost my internet connection while playing. I was kind of disturbed by the fact that I could still run around (albeit not interact with anything) and that when I got my connection back, all was as if I had never disconnected.

    There was actually an even more hilarious version of this during the stress-test betas. Areas like Deshaan weren't technically accessible. When you walked through the transition between zones you'd get a "can't transfer" message, but you could keep walking, and walking, and walking. Eventually you'd hit the edge of the loaded terrain (which was about half the zone, if you're wondering). But you'd never actually desync, because you were still connected to the server, even though the server had no idea where you were.

    It gets better.

    The next weekend they opened up the 15+ zones, and the character I'd wandered halfway to the capital of Morrowind on, while the map was technically not there, loaded in fine, right where she'd been standing when I logged out. Deshaan loaded correctly that time, because I was authorized to be there and could be transferred.

    And then I was promptly murdered by an enraged Kagouti that was about 10 levels over me.
    Anex wrote: »
    I guess they decided to go that route for lag reasons etc, but probably never really thought or considered how big the botting problem could become (if they considered it at all, but I am sure SOMEONE thought of it even if was scoffed at).

    Yeah, when I was scoffing at the client side elements earlier, it honestly didn't occur to me just how major an achievement the game actually is. (I'm going to lay that one at the feet of the cold I'm still recovering from; my brain is a mess right now.)

    Now all we need is some real client side cheat detection. At least for the PC version.
  • Grageeky
    ✭✭✭
    So, in summary, putting most of the trust in the client side was a good decision in order to minimize lag, and a bad decision in that we now have rampant manipulation of client chatter to the server facilitating a bot utopia?
    "Perhaps his egg spent too much time in the shade before his hatching." -Wareem-
  • ciannait
    ✭✭✭
    liquid_wolf wrote: »
    This is coming from a project manager.
    I get into these arguments and discussions with developers, code monkeys, and analysts all the time. I've had to fire people because they were right. They were very intelligent, and an incredible asset... but couldn't work with the plan we had.

    They were absolutely right... but it doesn't matter.

    "Code monkeys"? Where is it you work, so I know never to apply?
  • lecarcajou_ESO
    ✭✭✭✭
    Not to be off-topic, or too hyperbolic, but this is seriously one of the threads most worth reading, ever.
    "Morally Decentralized."
  • starkerealm
    ✭✭✭✭✭
    ✭✭✭✭✭
    Grageeky wrote: »
    So, in summary, putting most of the trust in the client side was a good decision in order to minimize lag, and a bad decision in that we now have rampant manipulation of client chatter to the server facilitating a bot utopia?

    Putting the processing on the client was a good idea. Trusting the client to relay good data back was a bad idea. There are a lot of ways you could verify the data coming in, and to be honest, they're probably using a few of those, but, obviously, the botters have found ways around them.
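One hedged sketch of how several such verification methods could be combined: honest clients trip individual checks under lag, so rather than rejecting single anomalies, weak signals accumulate into a score and only a sustained pattern flags the account for review. Every threshold here is invented for illustration:

```python
# Accumulate failed plausibility checks per client; decay the score
# each tick so occasional lag spikes never reach the threshold.
class CheatScore:
    def __init__(self, threshold=10, decay=1):
        self.threshold = threshold
        self.decay = decay
        self.score = 0

    def report(self, violations):
        """violations: failed checks this tick. True => flag for review."""
        self.score = max(0, self.score - self.decay) + violations
        return self.score >= self.threshold
```

Scoring like this is a common anti-cheat pattern in general, not something ZOS has confirmed using; it fits the post's point that botters route around any single check.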
