YautjaLanu wrote: »A little depressing.....

This game was always doomed right from the get-go. The software and hardware support is so old it would have been better off running DX9.
Why do I say that? Well, I guess what you should be asking yourself is: what do you actually know about technology?
We're in a transition stage at the moment, and most people don't even know which API will be the next one used in PCs or consoles. DX12 isn't all it's cracked up to be, and any PC enthusiast will tell you Win 10 is worse than 8.1 with Open Shell, and Win 7 takes the cake. DX12 only runs on 10, so is that really what people want? And that's not even the real reason developers wouldn't want to use it: it's not the super low-level API most think it is. AMD are looking into opening up the GPU to a whole new level that makes DX12 look dated.
So I conclude that any game released today just isn't going to survive the long-term performance gains we have in store. These games will die out so fast you won't bother to look back. Current gaming is so stale and lacks anything worth caring about, which is why most big titles are just milking it dry.
This is why I no longer care about the lack of creative, worthwhile games out today. There are better things to look forward to. So much interesting technology, software and hardware.
I think most of you are missing my point. Once a game has been developed, it can't just change the way it uses hardware without a ground-up rebuild. That's my point about current games: they can't take advantage of things like OpenGPU or even DX12. They are stuck and cannot evolve, so a game that performs badly will always perform badly.
That's simply not true. The game performs badly because it's issuing a large number of draw calls, without any semblance of batching.
If they make a new renderer, the game will perform stellarly, as the several thousand draw calls being issued will be but a droplet in the bucket, as far as performance is concerned. With the current renderer, those same several thousand draw calls are elephants standing upon the back of a deaf, dumb, blind, crippled mute.
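To illustrate what batching means here, a minimal sketch (not ESO's actual renderer; the object/material/`submit_draw` names are made up): instead of issuing one draw call per object, a renderer can group objects that share a material and submit each group once.

```python
from collections import defaultdict

def naive_draw(objects, submit_draw):
    """One draw call per object: thousands of calls per frame."""
    for obj in objects:
        submit_draw(material=obj["material"], meshes=[obj["mesh"]])

def batched_draw(objects, submit_draw):
    """Group objects by shared material, one draw call per group."""
    groups = defaultdict(list)
    for obj in objects:
        groups[obj["material"]].append(obj["mesh"])
    for material, meshes in groups.items():
        submit_draw(material=material, meshes=meshes)

# Toy scene: 9,000 objects sharing only 30 materials.
scene = [{"material": i % 30, "mesh": i} for i in range(9000)]

calls = []
naive_draw(scene, lambda **kw: calls.append(kw))
print(len(calls))   # 9000 draw calls

calls = []
batched_draw(scene, lambda **kw: calls.append(kw))
print(len(calls))   # 30 draw calls
```

Same scene, same meshes drawn; the CPU-side call count drops by whatever factor objects share materials.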
Rohamad_Ali wrote: »They are doing something with performance and it is noticeable. For the last two weeks I've been able to PvP on a custom ultra-high graphics setting with 45-50 FPS and low ping. I am the first one to jump on ZOS with others when performance tanks here, so this is no fan post at all. It's important to give constructive feedback, and here it finally is. Whoever is fixing the performance issues is onto something, @ZOS_GinaBruno — please let them continue with what they are doing on the PC platforms. It is noticeable.
Woah now, I'm not sure you understand the importance of Mantle/Vulkan/Direct3D 12.
In the traditional, driver-oriented APIs (Direct3D 11 and older), the driver is performing insane amounts of validation, error checking and correction. As a result, when a draw call is issued by the 3D application's renderer, it takes up a large amount of CPU time.
For example, in Direct3D 9, your framerate will suffer severely once you go past 3.5k draw calls on an AMD CPU (sub-30 fps), or past 10k draw calls on an Intel CPU (again, sub-30 fps). On Direct3D 11, roughly double those numbers in the best possible scenario.
For those that don't know, each individual object in a game, whether it be a single plant, a character's hair, a ring, a boulder, or a house, will issue around 4 or 5 draw calls minimum. Lights and shadows on an object (and those the object is casting) increase the draw calls roughly 1:1: five lights on an object? Five more draw calls, at least. Five shadows on an object? Five more draw calls, at least.
And with models that have multiple textures, the draw calls multiply. Seven draw calls would be issued on an object that has one set of diffuse, normal and specular maps? Well, this one has two sets, so that's 14 draw calls. Three sets of textures? That's 21 draw calls. Etc., etc. The solution to that conundrum is to use texture atlases, but these gobble up vRAM, overburden the GPU with all the extra shader instructions required for anisotropic filtering on the large (often >8192x8192) textures, use up large amounts of drive space, and are a pain in the ass for the artists to create.
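As a back-of-the-envelope sketch of the counting above (illustrative only — real renderers vary, and the function name is mine), the per-object estimate follows the 1:1 lights/shadows rule of thumb and multiplies by texture sets:

```python
def estimate_draw_calls(base=5, lights=0, shadows=0, texture_sets=1):
    """Rough per-object draw-call estimate: ~5 calls minimum,
    +1 per light, +1 per shadow, multiplied per texture set."""
    return (base + lights + shadows) * texture_sets

# A bare object with one texture set: around 5 calls.
print(estimate_draw_calls())                        # 5
# An object worth 7 calls, but with two texture sets: 14.
print(estimate_draw_calls(base=7, texture_sets=2))  # 14
# Three texture sets: 21.
print(estimate_draw_calls(base=7, texture_sets=3))  # 21
# Five lights and five shadows on a basic object: 15.
print(estimate_draw_calls(lights=5, shadows=5))     # 15
```

Multiply that by every plant, rock and character on screen, and the totals in the thousands follow quickly.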
There is little that can be done to alleviate this bottleneck on said APIs. There are two solutions, looking solely at the issue of draw-call performance. One is to create an API that allows for parallel issuing and validation of draw calls. This is what AMD demonstrated a short while before DICE collaborated with AMD to bring it to the consumer space: an embarrassingly parallel graphics API.
However, this still has the limit of several thousand draw calls being issued per core. Same per-call CPU cost, but with four cores (in effect, four CPUs) handling draw calls. In other words, if we take D3D9's 3.5k draw-call ceiling and add parallel submission, we would ideally get 14k draw calls on a quad-core CPU.
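A toy model of that arithmetic, with each worker thread recording its own command list (hypothetical names — nothing here resembles a real driver, it just shows the per-core ceiling multiplying by core count):

```python
from concurrent.futures import ThreadPoolExecutor

PER_CORE_CEILING = 3500   # D3D9-era draw calls per core before sub-30 fps

def record_command_list(core_id, calls_per_core):
    """Each core independently records its own list of draw commands."""
    return [f"core{core_id}:draw{i}" for i in range(calls_per_core)]

cores = 4
with ThreadPoolExecutor(max_workers=cores) as pool:
    lists = list(pool.map(record_command_list,
                          range(cores),
                          [PER_CORE_CEILING] * cores))

total = sum(len(cl) for cl in lists)
print(total)  # 14000 draw calls recorded across four cores
```

The per-call cost hasn't changed at all; the budget simply scales with however many cores can record commands in parallel.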
That's still exceedingly paltry. Compare Oblivion to Fallout 4: Fallout 4 issues many more draw calls and has a moderately higher object density, but it's still using the exact same 5x5 active game area system (ugrids), as well as the same LOD system. Why? Because even with Direct3D 11's more efficient draw calls, it's nowhere near enough to accommodate moderate-to-long draw distances.
So what must be done? Exactly what Mantle, Direct3D 12 and Vulkan do: parallel draw-call submission, with the graphics driver being a fairly "raw" interface between the game and the graphics card. Rather than the driver performing fail-safe, catch-all, better-safe-than-sorry, we-don't-know-what-will-happen-so-we'll-play-it-safe, CPU-crunching error checking on every call, the engine programmers perform the necessary error checking themselves.
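A crude sketch of that trade-off (made-up names; real drivers — and Vulkan's optional validation layers — are far more involved): the driver-oriented model re-validates state on every call, while the explicit model validates once up front and trusts the hot path.

```python
def validate(state):
    """Stand-in for expensive checks: formats, bindings, lifetimes..."""
    assert state["pipeline"] is not None
    assert state["vertex_buffer"] is not None

def draw_d3d11_style(state, n):
    """Driver-oriented model: validation is paid on every draw call."""
    validations = 0
    for _ in range(n):
        validate(state)      # the driver plays it safe, every time
        validations += 1
    return validations

def draw_vulkan_style(state, n):
    """Explicit model: the engine validates once, then submits raw calls."""
    validate(state)          # paid once, up front, by the engine authors
    return 1

state = {"pipeline": "opaque_pbr", "vertex_buffer": "scene_vb"}
print(draw_d3d11_style(state, 10_000))  # 10000 validations
print(draw_vulkan_style(state, 10_000)) # 1 validation
```

Same 10,000 draws reach the GPU either way; what changes is how much CPU time is burned proving, over and over, that nothing is wrong.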
Whilst that sounds scary to the uninitiated, that's mostly a lack of perspective. A good way to look at it is the difference between, say, Lua and C++. Lua is a scripting language that is absolutely not what a realtime program (e.g. Skyrim, 3ds Max, Photoshop, Cubase, etc.) should be written in; it's exceedingly wasteful of CPU time. The upside? It's extremely easy to get a handle on, and it takes care of a large amount of error-prone work for you (memory management and threading, mainly).
But if you want a realtime application whose framerate isn't stuck in the low twenties on the beefiest consumer hardware available, you will need C++; at the cost of having to do error checking and fail-safes manually, your program will run magnitudes faster.
That's the difference between Direct3D 11 and these newer APIs: more manual renderer code in exchange for magnitudes more performance.
If we were to compare the most intensive match on Trueflame with a Direct3D 11 renderer versus a Vulkan renderer, you would find that a venerable old Phenom, like yours truly has, would have its framerate thrust into the low teens. On Vulkan? I'd be surprised if the framerate wasn't close to 50, if not 60.
That is actually one of my few gripes overall, to be honest: the totally unnecessary use of Lua. Sure, it's easier to get to grips with, easier to make addons for, and so on — the list goes on — but Lua has never been reliable performance-wise. There was a time when it was, but not anymore.
Fact is, they should never have built the engine on a pre-existing engine to begin with. It takes time and money, but they should have built their own just for ESO.
Rohamad_Ali wrote: »They are doing something with performance and it is noticeable. For the last two weeks I've been able to PvP on a custom ultra-high graphics setting with 45-50 FPS and low ping. I am the first one to jump on ZOS with others when performance tanks here, so this is no fan post at all. It's important to give constructive feedback, and here it finally is. Whoever is fixing the performance issues is onto something, @ZOS_GinaBruno — please let them continue with what they are doing on the PC platforms. It is noticeable.
I just switched from wifi to cable less than two weeks ago to see if that made a difference. So unless you are playing on the test server, I don't believe that for one second.
A good ping is below 50 ms; I've never had anything close to that in this game, and the server is in Germany. I used to play with 8 ms ping in Germany 10 years ago, so don't talk BS.
YautjaLanu wrote: »A little depressing.....
No, not at all. If anything, things are going to become so much better. The sooner DirectX is put in the bin, the better.
We have some good things ahead.
What's depressing for me is playing games that are of such a substandard quality.
We all came here looking for the next big MMO, but the truth is, it isn't here yet.