HarryWolfe wrote: »Just want to point out that, for all its vaunted systems, WoW was still compromised:
Floating gnome corpses spelling out gold seller site addresses.
Warsong Gulch flag captures in the first 3 seconds of a match.
Underground farm bots.
ZeniMax has chosen to rely on client-side trust for much of the game's workload. This does have advantages: every client can run its own calculations and just send the results to the server for updating. However, as hinted above, it seems this data is not even being encrypted (encryption adds overhead and complicates network delivery).
If true, this is astonishingly naive of them. And besides, if the client-side approach was for performance reasons, it sure as hell isn't working. This is the laggiest, most unresponsive MMO I've played in the last 10 years.
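To make that concrete, here's a rough sketch of what a fully client-trusted update path looks like (purely hypothetical, none of these names come from ZeniMax's actual code or protocol): the server applies whatever the client reports, so a bot never has to pathfind, fight, or even render the world; it just sends the end state it wants.

```python
# Hypothetical example of a server that trusts client-reported state wholesale.
# Nothing here is ESO's real code; it's only an illustration of the design worry.

def handle_client_update(world, player_id, msg):
    """Apply a client-reported update with no server-side checks at all."""
    player = world.players[player_id]
    player.position = msg["position"]          # speed/teleport hacks welcome
    for item in msg.get("gathered", []):
        world.grant_loot(player_id, item)      # server takes the client's word for it
```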
mike.crewsb14_ESO wrote: »j.frank.nicholsb14_ESO wrote: »And "trusting" is not the same as designing with client-side movement - the client can do the movement and collision calcs and still not be "trusted" by the server.
(BTW - most of my background is in client server design).
Then how come you can't see the overwhelming evidence that ZeniMax trusts the client far too much? Your caveats and nuance just don't apply here.
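For anyone wondering what that distinction looks like in practice, here's a hedged sketch (again hypothetical, not ESO's code): the client still simulates its own movement so the game feels responsive, but the server sanity-checks every reported position against what is physically possible before accepting it.

```python
import math

MAX_SPEED = 12.0  # assumed world units per second; a real game would tune this per mount/buff

def validate_move(last_pos, last_time, new_pos, now):
    """Reject client-reported positions that imply impossible speed."""
    dt = max(now - last_time, 1e-3)                      # avoid divide-by-zero
    distance = math.hypot(new_pos[0] - last_pos[0],
                          new_pos[1] - last_pos[1])
    if distance / dt > MAX_SPEED:
        return last_pos                                  # snap back and flag for review
    return new_pos
```

The client did all the work; the server just refuses to believe anything the player couldn't have done legitimately.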
Right now the bots are still confined to the lower- to mid-level zones, but this will change with the new expansion. The general outcry that follows could greatly damage the game's reputation.
Assuming they're not idiots, why would ESO intentionally toss out what is probably THE prime tenet of secure network programming?
In other games there are a lot of instanced areas; in this game there is nothing like that for the public dungeons. Maybe the number of bots is no higher than it was in other games. I truly think it feels this painful because we actually see them now; there isn't much of anywhere for them to hide.
Don't get me wrong, I like the open world and the megaserver. If only ZeniMax could find a way through all that...
BrassRazoo wrote: »Well, technically many of us are lazy.
That is why Bots and their associated programs exist.
Really interesting thread even if the reason for it is not so pleasant -_-
In a case like this I would be more apt to blame marketing/managers than the devs. Being married to a developer, I have lost count of the number of times he has told me stories of shoddy work forced through to hit a deadline, or of wild promises marketing made that the team then has to uphold against their better judgment. It seems to crush his spirit a bit, because who wants to make something crappy?
Anyway, this client-side business certainly confirms the suspicions I had when I lost my internet connection while playing. I was kind of disturbed by the fact that I could still run around (albeit not interact with anything), and that when I got my connection back, everything was as if I had never disconnected.
I guess they decided to go that route for lag reasons etc, but probably never really thought or considered how big the botting problem could become (if they considered it at all, but I am sure SOMEONE thought of it even if was scoffed at).
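That behaviour is exactly what local simulation with late reconciliation looks like. A rough, hypothetical sketch of the idea (not ESO's actual code): the client keeps ticking its own world every frame and only syncs when it can; on a server that trusts the client, whatever happened offline is simply accepted when the link returns.

```python
class LocalSim:
    """Toy client-side movement loop that survives a dropped connection."""

    def __init__(self, position):
        self.position = list(position)
        self.pending = []                 # inputs the server hasn't seen yet

    def tick(self, move, dt, connected, send):
        # Simulate locally every frame so the player can keep running around.
        self.position[0] += move[0] * dt
        self.position[1] += move[1] * dt
        self.pending.append((move, dt))
        if connected:
            send(self.pending)            # a trusting server just applies these
            self.pending.clear()
```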
liquid_wolf wrote: »This is coming from a project manager.
I get into these arguments and discussions with developers, code monkeys, and analysts all the time. I've had to fire people because they were right. They were very intelligent, and an incredible asset... but they couldn't work with the plan we had.
They were absolutely right... but it doesn't matter.
So, in summary: putting most of the trust in the client was a good decision in that it minimizes lag, and a bad decision in that we now have rampant manipulation of client-to-server chatter, facilitating a bot utopia?