Seriously, we're now on to "LED christmas lights"?
This thread is about giving feedback on the CPU improvements. If you didn't test the latest build or don't want to share your opinion on THAT, leave the thread.
personman_145 wrote: »If you are running ANY game and getting lower fps than your refresh rate, you are either maxing your CPU or GPU. The good ones are made to handle it. My 770 still works after several years and ran hot like it was a Fermi. Your explanations just come off as kinda condescending. I'm not new to the tech game. I don't need to think of it as redlining an engine, I can think of it as a gfx card.
There are a lot of things wrong with that analogy. Redlining an engine would be more like overclocking the card too far.
profundidob16_ESO wrote: »Mystrius_Archaion wrote: »profundidob16_ESO wrote: »personman_145 wrote: »From the 4.0.1 notes:
Improved issues with framerate hitching when running through larger areas (such as cities) on quad-core or better PCs and consoles.
Improved multithreading usage on quad-core PCs.
Improved the framerate stability when spending a lot of time in the same zone.
Now that you mention this... I recreated my 2 chars yesterday and took one for a trip overland, and indeed I didn't experience that stalling bug (massive fps drop and the game hanging for 1-2 sec) so far. Looks like the patch might have fixed that.
Framerate was pretty stable and good: a steady 85-100 fps in the start area of Summerset where all new chars spawn, and an effortless 140-165 fps in places with no people.
That's with max settings for everything except "water reflection", which I set to low (helps boost fps as the dev suggested), and view distance, which I set to 60.
I find it funny that anybody describes 140-165fps as "effortless".
Honestly, if I had the PC power you do, I would set a hard 60 fps limit if I could.
Why? You're pushing the graphics card harder than it needs to be pushed because you can't visually see that many frames per second. If it were constantly smooth, the human eye couldn't tell any difference between 45 or 60 fps and anything above. I would link the study if I could, but I read it years ago so I have no clue where the link is or even if it's on this same PC.
FYI, movies tend to be around 30 fps, even those super HD Blu-ray ones, but nobody notices issues with them. We only notice issues if the fps drops by 15 fps or more, according to the above-mentioned study, which actually happens much more often when you push the frame rate as high as you possibly can.
I've had overheated graphics cards before, mainly when I had a laptop for gaming years ago, and it just kept dropping lower and lower in fps. You're just asking for a shorter life for your hardware.
Everybody just wants higher numbers, but what if you literally had 1000 fps? Your eye takes something like 0.2 seconds to blink. That means you would lose 200 frames in the blink of an eye, literally. Magicians move their hands "faster than the eye can see" to perform many tricks.
We're just not built for the frames per second you're running at.
Edit:
Also, you want your fps setting to match your TV/monitor refresh rate to avoid any potential tearing. Even with G-Sync and FreeSync monitors, you want the fps capped at the max refresh rate of the monitor. Anything above that is completely wasted anyway, and anything below can result in blank frames on non-G-Sync/non-FreeSync monitors.
Sorry, but I'm not taking the bait. This dead horse has been beaten over and over for years, in many games, on numerous forums. That party is long over, though, and if the fps gaming community read your comment they would have a field day, I assure you.
In short:
-yes my monitor supports 165hz g-sync
-yes I clearly see, notice and FEEL a difference above 60fps that makes it impossible to go back to my 4K 60hz monitor
-no my graphics card during 165fps doesn't go any higher than 70% load at its highest peak and mostly hovers around 50% usage all the time
-no I have no interest in meaningless higher numbers
-no my custom watercooled graphics card never exceeds 49°C GPU temperature during 165fps (FYI it idles at 28°C), which is a joke considering they can handle up to 90°C
-no I don't have any tearing at all since the G-Sync module in my monitor supports up to 165hz. It's butter-smooth in fact
-yes I'm lucky enough to have a bleeding-edge machine and apologize if that offended you somehow or triggered you into your response
-yes I have a clue of what I'm saying and doing after over 20 years professional experience in custom hardware and performance testing
-no I don't mean you any harm or disrespect. Just trying to stop this derailing which is irrelevant to the devs and other people reading this thread
Please guys, let's not derail and litter this informational thread by dragging the irrelevant dead-horse subject into it. If you can't resist the urge, please take it to a separate thread or private discussion. In other words, @Mystrius_Archaion please don't start the derail as it's clickbait, and @personman_145 please don't take the bait, even when you're right on the subject.
I invite you both to get back on track and keep this thread about multicore experiences and feedback on the PTS.
Mystrius_Archaion wrote: »Also if you didn't know, there is a reason we have 30 fps and 60fps and 90 and 120 for most monitors and TVs. That is caused by the frequency of US power distribution in hertz. Anything off those multiples will see artifacts, so 144hz is not ideal.
Electronic equipment typically converts that AC power to a smooth DC current using a power supply. Otherwise your computer would simply not function at all, since your CPU (amongst other components) wouldn't do so well with a gap in power 60 times per second.
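For anyone who wants to try the hard frame-rate cap suggested a few posts up: the client reads its settings from UserSettings.txt (under Documents\Elder Scrolls Online\live). A minimal sketch, assuming the MinFrameTime.2 entry still means "minimum seconds per frame" (the entry name and value format here are from memory, so treat them as assumptions and check your own file before editing) would be to set it to 1 divided by the target fps, e.g.

SET MinFrameTime.2 "0.01667"

for a 60 fps cap, or "0.00606" for a 165 fps cap. Edit the file while the game is closed, since the client rewrites it on exit.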
Mystrius_Archaion wrote: » …
But your TV/monitor DOES have a refresh rate or they wouldn't advertise "60hz/120hz/144hz", and they are vulnerable to this effect of the power supply frequency. Most of those also do not have that "converter brick" on the power cord either.
So yes, the screen will show issues with the picture if the frame rate isn't right for it, which is why G-Sync and FreeSync exist.
All monitors and TVs will use DC. If there is not a block somewhere from the outlet to the monitor, then it will be built in.
The inner electronics simply cannot run on AC. The power needs to be converted.
That doesn't say we can actually notice a difference if the motion blur effect weren't present, though. That explains the noticeable difference.
Why 240Hz matters
At a minimum, to do motion interpolation or black frame insertion, you need 120Hz. Trying to do this with a 60Hz TV means that with a lot of content, the TV would be throwing away information. Also, the backlight flashing would be visible to most people.
240Hz is better, as you can flash the backlight much faster (so it's less noticeable).
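For what it's worth, the arithmetic behind that: 24 fps film divides evenly into 120 Hz (120 ÷ 24 = 5, so each frame can be shown 5 times), while on a 60 Hz panel 60 ÷ 24 = 2.5, which forces the uneven 3:2 cadence; and flashing the backlight at only 60 Hz leaves the screen dark long enough per cycle for most people to notice the flicker, exactly as the quoted text says.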
It's no harm for your computer to throttle up and down; it's designed for that and it does it all the time.
Mystrius_Archaion wrote: » …
I was commenting originally about the people who obviously didn't have their high frame rate stable. Look back at the ones saying "I get 120 fps (or whatever) and 75 fps (or some other lower number) in trials".
It is definitely not healthy for a machine to be "chugging" like that. It's like a car going uphill burning more gas just to maintain the same speed, the difference being that the car can do it and so holds its speed. A computer that can't maintain the same frame rate in all situations in a game is being pushed beyond its limits, which wears the parts and makes it less likely to sustain even the performance it has been capable of over time.
I also hate it when I see my own frame rate suddenly drop. Why don't the people who leave everything maxed out and uncapped, AND get drops like that even more often, hate it too? They do nothing on their end to prevent sudden drops and instead count on the game developers to somehow magically rewrite how the game handles graphics and improve their performance by 50% or more.
I need some help understanding one option.
What exactly does SET GPUSmoothingFrames "10" give me?
Your GPU is smoothing the next 10 frames.
If you set it to 0 your GPU won't smooth frames at all.
At least that's what I think it is.
But that doesn't draw CPU resources.
It would only be a problem if your GPU can't handle the load.
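For reference, that line lives in the same UserSettings.txt (Documents\Elder Scrolls Online\live) as the frame cap mentioned earlier; what it does is my reading of the reply above, not anything confirmed in the patch notes:

SET GPUSmoothingFrames "10"
SET GPUSmoothingFrames "0"

The first is the value from the question (smoothing over 10 frames); the second would turn the smoothing off entirely. Same caveat as before: edit the file with the game closed.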
then I suggest you leave the internet.
It will only work with V-Sync on?
I will buy some christmas LEDs on Amazon before I go.
I've also got kind of an old CPU: i5-4440
The weird stuttery frame-massacre that used to happen every time I left a major city is gone (this has only been happening to me since Clockwork City, though, so it's really just fixing a recent performance bug, more than anything).
The average framerate is the same for me, however. The game still runs like crap, and it still only seems to get worse with every update. Before Thieves' Guild I used to always stay around 60 fps (I use vsync to avoid screen-tearing), with the exception of Cyrodiil and a few effect-heavy trial situations, but now I'm always between 10 and 20 fps in trials, and the same in Cyrodiil. It's pretty awful, especially considering ESO hasn't really made any changes to graphics and such, at least not in any way that would warrant such a massive shift in performance. It also doesn't seem like performance optimisation is a focus either.
The only people I know who the game runs well for are the ones with really new rigs, and it just makes me feel like ZOS is pushing performance optimisation aside and hoping their players will keep upgrading their PCs to deal with the growing pile of performance hitches. It does make me wonder how the consoles fare.
I was really hoping multicore support would help, but it hasn't. I honestly can't even tell a difference between the PTS and Live.
@Saturn have you tried trashing your shader file?
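In case anyone else wants to try that: the "shader file" presumably means the cached shader data the client rebuilds on launch. The path below is an assumption based on a default install, so verify it before deleting anything:

Documents\Elder Scrolls Online\live\ShaderCache.cooked

Delete it with the game closed and the client should regenerate it on the next start.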
I went to an empty spot on Live in Stormhaven (no people / no NPCs); my fps was 89 staring at one door, with CPU usage at 24%.
Went to the same empty spot on the PTS and my fps was 84, with CPU usage of 36%.
i7-7740X @ 4.9 GHz OC.
You draw your own conclusion about the "optimisation".