Maintenance for the week of September 22:
· [COMPLETE] NA megaservers for maintenance – September 22, 4:00AM EDT (8:00 UTC) - 10:00AM EDT (14:00 UTC)
· [COMPLETE] EU megaservers for maintenance – September 22, 8:00 UTC (4:00AM EDT) - 14:00 UTC (10:00AM EDT)

How many people with lag and stutter are not using an HDMI 2.0 cable, or maybe have a bad cable?

  • Nemesis7884
    ✭✭✭✭✭
    ✭✭✭✭✭
    No, I'm sure my cable is HDMI 2.0 and that it's good, because I tested it.
    Listen, you can't have 16 times the detail without expecting some lag.
  • CleymenZero
    ✭✭✭✭✭
    Lady_Linux wrote: »
    https://www.asrock.com/Graphics-Card/AMD/Phantom Gaming X Radeon RX570 8G OC/

    This is the exact graphics card I have, and it will not connect on Linux or Windows at 60 Hz.

    And since my AirTV player will connect at 4K 60 Hz, it is not the TV or the cable but the card, and the card is supposed to be capable of 8K.

    8K on a 570... this is the joke thread of the year.

    This is infuriating, because your complete lack of understanding of technology is overshadowing the genuinely bad performance of the game.

    You will never be able to run a game at 4K or 8K with this GPU. The best you could do is have the GPU render at 1080p (the best performance you're likely to get on this GPU) and let the screen upscale to 4K (at which point you're basically playing at 1080p).

    I have a GTX 1080 Ti, which is probably 8 times more powerful than an RX 570, and I wouldn't attempt 4K with it in any game unless I'm willing to lower graphics significantly to be able to play at more than 40 fps.

    If you're trolling, I applaud you, but if you're serious, you need to educate yourself.

    BTW, HDMI 2.0 is 18 gigabits per second (a data transfer RATE), as opposed to 18 GB, which would refer to a quantity of data, which still wouldn't make sense...
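For anyone checking the math on that: the 18 Gbit/s figure is easy to sanity-check. Here is a minimal sketch in Python, assuming the standard CTA-861 timing for 4K60 (4400x2250 total pixels including blanking) and HDMI's TMDS encoding, which puts 10 bits on the wire for every 8 bits of data:

```python
# Rough check: can a given HDMI version carry 4K at 60 Hz with 8-bit RGB?
# Assumes the standard CTA-861 timing for 2160p60 (4400x2250 total pixels
# including blanking) and TMDS 8b/10b encoding (10 wire bits per 8 data bits).

def tmds_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
    """Raw on-wire bit rate in Gbit/s, including blanking and TMDS overhead."""
    pixel_clock = h_total * v_total * refresh_hz          # pixels per second
    return pixel_clock * bits_per_pixel * 10 / 8 / 1e9    # 10/8 = TMDS overhead

needed = tmds_rate_gbps(4400, 2250, 60)
print(f"4K60 8-bit RGB needs ~{needed:.1f} Gbit/s on the wire")   # ~17.8
print("HDMI 1.4 tops out at 10.2 Gbit/s -> 4K only at 30 Hz")
print("HDMI 2.0 provides 18.0 Gbit/s   -> 4K60 just fits")
```

Which is also why a link that only negotiates HDMI 1.4 silently falls back to 4K at 30 Hz, the exact symptom reported in this thread.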
  • CleymenZero
    ✭✭✭✭✭
    Oh wow, this thread is still going...

    I think some things need to be mentioned though...

    The only functional reason to ever have a refresh rate over 60 is not visual quality. It has been pretty darn conclusively proven that while humans can tell the difference between a 240 Hz display and a 60 Hz display... they can't tell nearly as reliably when they are actually PLAYING.

    Refresh rate has exactly ONE impact on performance. If your refresh rate is less than your FPS, in some circumstances you will get a yo-yo effect with VSync on. Beyond this, there is no effect. Higher refresh rates DO NOT IN ANY WAY INCREASE PERFORMANCE.

    The final thing: what we, as gamers, actually get high refresh rates for is INPUT LATENCY. That is, the time between seeing, clicking, and seeing the result of the click. Higher refresh rates allow more images to be drawn and thus allow your eye to see and register action faster. In an FPS this is very important... in ESO, less so (it helps with weaving).

    Fact checks:
    A bad cable is a bad cable. Most of you probably have bad cables. You've bent them, rolled a chair over them, abused them, stuck them into dirty a** ports. They are "less optimal".
    GOOD NEWS!
    Having a bad cable with a crap monitor means NOTHING! Cables are DIGITAL now. Digital has this really cool thing where it either works... or it doesn't. Unless you have hardware that actually costs more than a car, cable choice isn't going to bother you one whit. Rejoice at no longer having to physically shield your monitor cable from your power cable or get artifacts!

    2.0 vs. 1.4 HDMI. It doesn't matter. I've seen really cheap cables handle the 2.0 spec... I've seen really expensive cables fail at the 2.0 spec. You can make a passing 2.0-spec cable fail by bending it wrong. Unless you are truly 100% videophile hardcore, you probably won't have a setup that cares. If you do have a setup that cares, you won't use HDMI, because of how finicky it is.

    Hands down the stupidest thread I've seen, to the point where I think she's trolling.

    Few things though:

    - Some people are clearly better at discerning higher vs. lower framerates. I can tell better than my brother, and we've tested it an unreasonable number of times. I do play a lot more games than he does, so that's a potential bias right there.
    - you forgot to mention adaptive sync

    You seem to know your way around so process this: she believes her RX 570 is capable of 8k gaming... I nearly lost it there...
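To put rough numbers on the input-latency point above, here is a quick sketch. The half-interval average wait is a simplifying assumption; it ignores render time, input sampling, and the panel's own processing:

```python
# Frame interval and average display-added latency at common refresh rates.
# Simplified model: a finished frame waits, on average, half a refresh
# interval before the next scanout sends it to the panel.

for hz in (60, 120, 144, 240):
    interval_ms = 1000.0 / hz
    avg_wait_ms = interval_ms / 2
    print(f"{hz:>3} Hz: new image every {interval_ms:5.2f} ms, "
          f"~{avg_wait_ms:4.2f} ms average wait before scanout")
```

So going from 60 Hz to 240 Hz saves on the order of 6 ms of average display latency, which matters in a twitch shooter and, as noted above, mostly helps weaving in ESO.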
  • KoeKhaos
    ✭✭
    @Lady_Linux
    I didn't read every post, but I'm going to answer what I did read. I sometimes run ESO on Linux (Pop!_OS) on my laptop on a 4K monitor via HDMI, and it runs fine at 60 Hz.

    1. HDMI or DisplayPort won't matter, as in the end your TV is only HDMI, so the cable will still be HDMI even with DisplayPort on one end. Cable standard (version), cable quality, and cable length all do matter, though. You will need version 2.0 of HDMI to get 4K at 60 Hz. The other issue is whether the actual connecting devices support the higher HDMI versions as well. There are controllers in both the TV and your graphics card, and if they are only programmed for HDMI 1.4 they won't do 4K at 60 Hz.

    2. Stuttering isn't due to the frame limit (Hz is just how many screen refreshes are pushed/rendered per second, i.e. fps) as much as drops in frame rate from your computer (graphics card/CPU/RAM maxed out). Think of it this way: movies play at only 24 fps and they look smooth. You will notice a difference between 60 Hz and 30 Hz, but if you are staying at a stable 30 fps the game would still be playable and not be horrible. What actually happens in ESO, though, is that the game sometimes renders super heavy. I get anywhere from 80 fps in 4K to a mere 6 fps in Vivec City on my GTX 1080, depending solely on where I look. The game is not very well optimized, which is why they are supposed to be fixing it.

    3. Here are some things to consider. Enable the FPS counter in the game and see whether your FPS is dropping when you see a stutter. If your card can only render at 20 fps, you will always have issues. The cable will still carry exactly 30 or 60 Hz and your TV will still refresh at 30 or 60 Hz, but it will only receive those 20 new frames, so some refreshes will just repeat the same unchanged image.

    4. SOLUTION! (Possibly.) Just set your resolution to Fullscreen (not windowed) at 1920x1080 (1080p) in game. It should work; if not, set your operating system to 1920x1080 (1080p) when you want to play. You'll be able to play at 60 Hz and your graphics card will stutter a lot less. 4K is exactly twice 1080p in each dimension, so every 2x2 block of four pixels should just group up into a single pixel and act like a decent 1080p monitor. Even on my beast 2080 Ti desktop I can drop to low FPS at times on my 144 Hz 1440p monitor. It's rare, but it happens. The game needs to be optimized, but you also don't have a good enough video card for graphics-intensive games to run at 4K well. Light games and desktop apps/video should run fine, though.
    Edited by KoeKhaos on November 17, 2019 8:12AM
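Points 3 and 4 above are easy to sanity-check with arithmetic. A small sketch; the 20 fps figure is the example from point 3, and real stutter is never this evenly spaced:

```python
# Point 4: 4K is exactly twice 1080p in each dimension, so integer scaling
# works out cleanly.
w4k, h4k, w1080, h1080 = 3840, 2160, 1920, 1080
sx, sy = w4k // w1080, h4k // h1080
print(f"scale factor {sx}x{sy}: each 1080p pixel becomes a {sx}x{sy} block of TV pixels")

# Point 3: a 60 Hz panel fed only 20 new frames per second shows each
# frame for 3 consecutive refreshes.
refresh_hz, fps = 60, 20
print(f"{refresh_hz} Hz panel at {fps} fps: each frame held for {refresh_hz // fps} refreshes")
```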
  • CleymenZero
    ✭✭✭✭✭
    KoeKhaos wrote: »
    @Lady_Linux
    I didn't read every post, but I'm going to answer what I did read. [...] You also don't have a good enough video card for graphics-intensive games to run at 4K well. Light games and desktop apps/video should run fine, though.

    She is running an AMD RX 570 and thinks it is 8K capable.

    Just thought you should know since you're taking your time to answer her...
  • KoeKhaos
    ✭✭
    Technically, it could, though in actuality even high-end cards are a pain to get 8K actually working. But it would probably be at like 5 fps, on the desktop only. :wink: Doesn't hurt me to be helpful any which way. Have a nice day. :)
    Edited by KoeKhaos on November 17, 2019 8:27AM
  • Lady_Linux
    ✭✭✭✭✭
    I'm sure my cable isn't 2.0
    CleymenZero wrote: »
    Oh wow, this thread is still going... [...] You seem to know your way around so process this: she believes her RX 570 is capable of 8k gaming... I nearly lost it there...

    https://www.asrock.com/Graphics-Card/AMD/Phantom Gaming X Radeon RX570 8G OC/index.asp#Specification
    I simply must protest. There are no Penguin avatars for me to use in the forums.

    BTW, I use arch too
  • Lady_Linux
    ✭✭✭✭✭
    I'm sure my cable isn't 2.0
    CleymenZero wrote: »
    Lady_Linux wrote: »
    This is the exact graphics card I have, and it will not connect on Linux or Windows at 60 Hz. [...]

    8K on a 570... this is the joke thread of the year. [...] You will never be able to run a game at 4K or 8K with this GPU. [...] If you're trolling, I applaud you, but if you're serious, you need to educate yourself.


    Who cares what your card games at. What matters is whether your desktop can display 4K at 60 Hz or not.


    https://www.asrock.com/Graphics-Card/AMD/Phantom Gaming X Radeon RX570 8G OC/index.asp#Specification
    Edited by Lady_Linux on November 17, 2019 8:55AM
    I simply must protest. There are no Penguin avatars for me to use in the forums.

    BTW, I use arch too
  • daemonios
    ✭✭✭✭✭
    ✭✭✭✭✭
    CleymenZero wrote: »
    I have a GTX 1080 Ti, which is probably 8 times more powerful than an RX 570, and I wouldn't attempt 4K with it in any game unless I'm willing to lower graphics significantly to be able to play at more than 40 fps.

    I must be trolling too, since I'm running the game at 4k and max settings on a 980TI.
    Edited by daemonios on November 17, 2019 8:52AM
  • Lady_Linux
    ✭✭✭✭✭
    I'm sure my cable isn't 2.0
    KoeKhaos wrote: »
    Technically, it could, though in actuality even high-end cards are a pain to get 8K actually working. But it would probably be at like 5 fps, on the desktop only. :wink: Doesn't hurt me to be helpful any which way. Have a nice day. :)

    Yeah, well, the card does 4K at 60 Hz and the TV does 4K at 60 Hz and the HDMI cable is rated for 4K at 60 Hz, but I only get 30.

    Though I can't test the card... I tested the monitor by connecting a different 4K device (the Sling TV Android device), and the desktop defaulted to 4K at 60 Hz. And if a low-end Android device can connect at 4K 60 Hz without a hitch on the same HDMI cable, then certainly a computer with an RX 570 should get a desktop to connect at the same resolution and refresh rate.

    I have put in a support request with ASRock and with Sceptre, given that neither Windows nor Linux seems to connect to the TV correctly, and neither Linux nor Windows 10 seems capable of identifying the TV correctly...

    And to the elitists ragging on me about the 8K resolution: it's supposed to be able to display it. I wouldn't expect ESO to run at it, but it should do video at least, and if it can't play an 8K video they shouldn't be allowed to advertise it.

    Heck, they shouldn't be allowed to advertise 4K 60 Hz if you can't game at that rate or even get the desktop to show that resolution and refresh rate.
    I simply must protest. There are no Penguin avatars for me to use in the forums.

    BTW, I use arch too
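On the Linux side, one way to see whether the driver is even offering 3840x2160 at 60 Hz (as opposed to the TV or cable refusing it) is to read the mode list that xrandr reports. A minimal sketch, assuming an X11 session with the xrandr utility installed; output names such as HDMI-A-0 vary by driver:

```python
# List the 4K modes X11 exposes and flag whether a ~60 Hz rate is offered.
# Assumes an X session with xrandr available; under Wayland, or with a
# different driver, the tooling and output names differ.
import re
import subprocess

out = subprocess.run(["xrandr"], capture_output=True, text=True).stdout
for line in out.splitlines():
    if "3840x2160" in line:
        rates = [float(r) for r in re.findall(r"\d+\.\d+", line)]
        ok = any(59.0 <= r <= 61.0 for r in rates)
        print(line.strip())
        print("~60 Hz offered" if ok else "no ~60 Hz rate -> EDID/driver issue")
```

If 60 Hz never shows up in that list, the card or driver never offered the mode, which points at EDID negotiation rather than at the game or the cable.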
  • Kadoin
    ✭✭✭✭✭
    ✭✭
    It's shadows and reflections that cause that in nearly every game. They are expensive and probably the most FPS-draining effects, because they usually run on the CPU and depend on the CPU's floating-point units, which, by the way, have not improved anywhere near as much as GPU performance from generation to generation, and that is why you don't and won't see massive improvements from one CPU to another in games like ESO.

    I personally would use a different method of doing shadows, but eh...
  • KoeKhaos
    ✭✭
    @Lady_Linux

    That's not entirely accurate. Think of it like this: your graphics card has to write a letter to an audience (the monitor). The longer the letter, the longer it takes to write. More skilled writers can write faster (this is your frames per second). The Hz is the gopher running between you and the audience. The gopher ALWAYS comes 60 times a second at exact intervals, even if you have not finished your letter. So if you have not finished your letter by the time the gopher comes, it returns to the audience and simply tells them there is nothing new to report. This is basically how monitors and graphics cards work. The monitor is still running at that Hz*, but you may not be able to work fast enough on the content to provide it new updates. So technically it is always running at 60 Hz when set to that, but your frames might not be sent, which causes a dip in performance and stutter. You will be able to run desktop apps at 4K 60 Hz fine, and depending on the game and settings, you'll even be able to game fine at 4K on your graphics card.

    *I add a note here because FreeSync and G-Sync monitors will actually change their Hz based on your video card, so they might only update 20 times a second or less if they know you are being slow at writing, basically. The thing is that a change in speed is more noticeable than a single steady speed. That's why 30 fps that never wavers will look a LOT better than a variance of 20 fps to 40 fps. It comes down to human perception.

    I personally think you should try a new 2.0 cable. Yes, it might work with the Android box, but the Android box might also be more tolerant. There have been so many times where simply buying a new cable has helped people. Go with as short a cable as you can for the best tolerance, typically. We use the Twisted Veins brand at work, but even a cheap AmazonBasics cable should work, and you can always return those super easily if they don't. That, and I still think you should just try playing at 1080p resolution... You didn't really answer whether you tried that.
    Edited by KoeKhaos on November 17, 2019 9:50AM
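The gopher analogy maps directly onto a tiny simulation. Here is a sketch under the same simplified model: a fixed 60 Hz pickup, hypothetical per-frame render times, and a repeated image whenever no new frame is ready:

```python
# Fixed-refresh "gopher": the display picks up a frame every ~16.7 ms.
# If no new frame is ready by then, it shows the old one again -- stutter.

REFRESH_MS = 1000.0 / 60.0
render_ms = [10, 12, 30, 9, 40, 11, 10]   # hypothetical per-frame GPU times

t_ready, shown, repeats = 0.0, 0, 0
deadline = REFRESH_MS
for cost in render_ms:
    t_ready += cost                 # when this frame finishes rendering
    while deadline < t_ready:       # gopher arrives before the letter is done:
        repeats += 1                # ...the audience sees the old image again
        deadline += REFRESH_MS
    shown += 1                      # frame goes out at this refresh
    deadline += REFRESH_MS
print(f"{shown} new frames shown, {repeats} refreshes repeated an old frame")
```

The two slow frames (30 ms and 40 ms) each miss a pickup, so two refreshes repeat an old image; that uneven delivery, not the average frame rate, is what reads as stutter.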
  • Lady_Linux
    ✭✭✭✭✭
    I'm sure my cable isn't 2.0
    KoeKhaos wrote: »
    @Lady_Linux

    That's not entirely accurate. Think of it like this: your graphics card has to write a letter to an audience (the monitor). [...] I personally think you should try a new 2.0 cable. [...] That, and I still think you should just try playing at 1080p resolution... You didn't really answer whether you tried that.

    I DID buy a new cable. That's the problem. The same NEW cable works with a dated Android 6.something box while the powerful GPU doesn't. Crazy-making.
    I simply must protest. There are no Penguin avatars for me to use in the forums.

    BTW, I use arch too
  • Lady_Linux
    ✭✭✭✭✭
    I'm sure my cable isn't 2.0
    This PC SHOULD be connecting at 60 Hz in 4K, irrespective of how it gets its games on.
    I simply must protest. There are no Penguin avatars for me to use in the forums.

    BTW, I use arch too
  • Aurielle
    ✭✭✭✭✭
    ✭✭✭✭✭
    Sweet Jesus. This thread, and the snippy back-and-forth, is a pretty good summation of why I stopped PC gaming a while back. PC bros are always going on and on about how it’s “easy” to build a PC, and then have condescending meltdowns on video game forums in which they try to one-up one another over technical minutiae. You all seriously need to take a step back and evaluate the way you’re communicating with one another over something as inconsequential as an HDMI cable.
  • CleymenZero
    ✭✭✭✭✭
    Lady_Linux wrote: »
    [...] Who cares what your card games at. What matters is whether your desktop can display 4K at 60 Hz or not.

    https://www.asrock.com/Graphics-Card/AMD/Phantom Gaming X Radeon RX570 8G OC/index.asp#Specification

    The capacity to display a certain resolution at a certain refresh rate does not make it functional. Have you ever looked at benchmarks in your life? A 570 does not have the horsepower to properly push 4K unless you're running a SNES emulator. How oblivious are you to that?

    The speedometer on your Corolla says it can get to 200 mph, but it'll never even get close to 160 mph. Is that an analogy you can understand?

    What you are quoting is the "Maximum Display Resolution". It doesn't mean the card can functionally operate outputting those resolutions; it's just the maximum resolution it could display based on the bandwidth of the connectors.

    https://www.eurogamer.net/articles/digitalfoundry-2019-05-01-amd-radeon-rx-570-benchmarks-7001
    https://www.tomshardware.com/reviews/amd-radeon-rx-570-4gb,5028-8.html

    Your card can't do 1440p (2.5K) at 60 fps. In fact, none of the reviewers for your card have ever tested it in 4K, because they know better. Your best bet, as someone else stated, is to run it at 1080p and let the screen upscale (if it can even do a proper job of that).

    And here, let me paste my card specs from the manufacturer as if it made me more credible at anything:
    https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1080-ti/
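The render-at-1080p advice comes down to pixel counts. A rough sketch for the GPU-bound case; real scaling is rarely this linear, and CPU-bound scenes (cities, big fights) won't speed up much at all:

```python
# 4K pushes four times the pixels of 1080p, so in GPU-bound scenes expect
# very roughly a quarter of the frame rate, all else being equal.
pixels_4k = 3840 * 2160
pixels_1080p = 1920 * 1080
ratio = pixels_4k / pixels_1080p          # 4.0
fps_at_1080p = 60                         # hypothetical 1080p result
print(f"4K draws {ratio:.0f}x the pixels -> ~{fps_at_1080p / ratio:.0f} fps "
      f"if the same scene ran at {fps_at_1080p} fps in 1080p (naive estimate)")
```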
  • CleymenZero
    ✭✭✭✭✭
    Aurielle wrote: »
    Sweet Jesus. This thread, and the snippy back-and-forth, is a pretty good summation of why I stopped PC gaming a while back. [...]

    It's the most infuriating yet entertaining thread I've seen in a long time.

    The HDMI cable is inconsequential indeed, but not for the reason you think it is. An RX 570 is not a powerful GPU. A maximum display resolution stated by a manufacturer is also inconsequential.

    So many other things could be said. I'm gonna say the OP appreciates the attention, good or bad, and doesn't mind being 'splained stuff even though there is no willingness to understand. The only issue is that what she says might be taken as true by people who also don't understand much about technology.
  • CleymenZero
    ✭✭✭✭✭
    daemonios wrote: »
    CleymenZero wrote: »
    I have a GTX 1080 Ti, which is probably 8 times more powerful than an RX 570, and I wouldn't attempt 4K with it in any game unless I'm willing to lower graphics significantly to be able to play at more than 40 fps.

    I must be trolling too, since I'm running the game at 4k and max settings on a 980TI.

    At 9 fps with all low settings? Sure.
  • CleymenZero
    ✭✭✭✭✭
    daemonios wrote: »
    CleymenZero wrote: »
    I have a GTX 1080 Ti, which is probably 8 times more powerful than an RX 570, and I wouldn't attempt 4K with it in any game unless I'm willing to lower graphics significantly to be able to play at more than 40 fps.

    I must be trolling too, since I'm running the game at 4k and max settings on a 980TI.

    And your card is high-end compared to an RX 570 btw... It is at least 30% more powerful than hers.
  • Kadoin
    ✭✭✭✭✭
    ✭✭
    Lady_Linux wrote: »
    This PC SHOULD be connecting at 60 Hz in 4K, irrespective of how it gets its games on.

    Yes, but they never said whether you need a DisplayPort cable to do it instead of HDMI.

    Regardless, contacting the manufacturer is the right decision, because it could be a bug in the BIOS on the card or some other defect. What they might tell you is that you can only use 4K with a DisplayPort cable due to additional hardware limitations related not to the AMD chip but to their GPU board design.

    You already know the TV is capable, so that leaves only the card as the culprit.

    Though no company will ever willingly admit they cheaped out on parts to maximize profits...
  • daemonios
    ✭✭✭✭✭
    ✭✭✭✭✭
    CleymenZero wrote: »
    daemonios wrote: »
    I must be trolling too, since I'm running the game at 4k and max settings on a 980TI.

    At 9 fps with all low settings? Sure.

    I get 50+ FPS most of the time at max settings, apart from ambient occlusion and particle systems. I'll post a screenshot if I can remember and be bothered when I get home. But sure, keep doubling down.
    CleymenZero wrote: »
    daemonios wrote: »
    I must be trolling too, since I'm running the game at 4k and max settings on a 980TI.

    And your card is high-end compared to an RX 570 btw... It is at least 30% more powerful than hers.

    Irrelevant. I was replying to your claim that a 1080 Ti still isn't enough to play at 4K. I wasn't replying to the OP, and I don't have a clue where AMD cards place regarding performance, since I haven't had one in years.
    Edited by daemonios on November 17, 2019 3:13PM
  • CleymenZero
    ✭✭✭✭✭
    daemonios wrote: »
    [...] I get 50+ FPS most of the time at max settings, apart from ambient occlusion and particle systems. I'll post a screenshot if I can remember and be bothered when I get home. [...] Irrelevant. I was replying to your claim that a 1080 Ti still isn't enough to play at 4K. [...]

    Make sure it's a screenshot of Rakkhat mid-fight or at the beginning of the fight, or Cloudrest, or any fight in Cyrodiil with 10+ total players, or else it is irrelevant.

    I run the game at 1440p and yes, I keep the default frame limiter at 100 fps. I can get 180+ fps, but any of the fights mentioned will get you a nice dip into the 30s or 40s. It's not because my system is not adequate; it's because the game is poorly optimized. Now, at 4K, I'd probably get 80+ fps most of the time, but it would dip into the 20s during those crucial moments, which is less than ideal.

    What's worse, depending on whether you use adaptive sync or not, it could make the situation even worse. If you dip below the adaptive sync range, you'll get a phat stutter as the signal drops out of and back into the adaptive sync range, making matters worse.

    So in essence, you MAY be able to run it at 4K, but you'll run into issues at crucial moments in the game. To avoid this, my solution is to use the worst-performing instance as my benchmark when setting my graphics, so I avoid shooting myself in the foot. My mistake here was assuming people run the same content I do or have the same performance standards I have. Since this thread is about an imagined performance issue blamed on a cable, I was not so wrong to assume some kind of expected performance standard, at which point it is foolish to complain about performance issues while running a poorly optimized game at 4K on a mid-tier graphics card.

    Notice, though, that in your reply to my original statement, you seem to assume I said you can't run it at 4K, whereas I said I would not attempt to. The reasons I would not attempt to are stated above: not because it can't run well sometimes, but because it will go very bad often enough to justify NOT running the game at 4K. Is that nuance clear enough now?

    And the bottom line is, in all the reviews you can read and all the testing you can do, the 1080 Ti is still not considered a 4K gaming card unless you want to give up significant graphical fidelity or frame rate. So no, the 1080 Ti is not the 4K workhorse you seem to believe it can be.
    Edited by CleymenZero on November 17, 2019 4:53PM
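The adaptive-sync dip described above is mechanical enough to sketch. The 48-144 Hz range here is a hypothetical example; the frame-multiplication behaviour is AMD's low framerate compensation (LFC), and monitors without it simply fall out of the variable-refresh range:

```python
# What a VRR monitor does at a given fps, for a hypothetical 48-144 Hz range.
# Below the floor, an LFC-capable setup repeats each frame to stay in range;
# without LFC you fall back to fixed-refresh behaviour (the "phat stutter").

VRR_MIN, VRR_MAX = 48, 144

def vrr_behaviour(fps, lfc=True):
    if VRR_MIN <= fps <= VRR_MAX:
        return f"{fps} fps: panel refreshes at {fps} Hz (inside VRR range)"
    if fps < VRR_MIN and lfc:
        mult = (VRR_MIN + fps - 1) // fps   # smallest multiplier back into range
        return f"{fps} fps: LFC shows each frame {mult}x -> panel at {fps * mult} Hz"
    return f"{fps} fps: outside VRR range -> fixed refresh and visible stutter"

for fps in (100, 40, 20):
    print(vrr_behaviour(fps))
```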
  • daemonios
    ✭✭✭✭✭
    ✭✭✭✭✭
    You claimed 9 fps, which you clearly simply made up. You brought up the RX 570's performance, which I hadn't addressed. Yet you keep digging in your heels and doubling down. Learn to argue. I'm not bothering to reply to you anymore.
  • CleymenZero
    ✭✭✭✭✭
    daemonios wrote: »
    You claimed 9 fps, which you clearly simply made up. You brought up the RX 570's performance, which I hadn't addressed. Yet you keep digging in your heels and doubling down. Learn to argue. I'm not bothering to reply to you anymore.

    Ohh please, you took the 9 fps seriously, discredited an entire set of arguments based on it, and you tell me to learn to argue?
  • CleymenZero
    ✭✭✭✭✭
    daemonios wrote: »
    You claimed 9 fps, which you clearly simply made up. You brought up the RX 570's performance, which I hadn't addressed. Yet you keep digging in your heels and doubling down. Learn to argue. I'm not bothering to reply to you anymore.

    Ohh please, you took the 9 fps seriously, discredited an entire set of arguments based on it, and you tell me to learn to argue?

    Actually, in the situations where I suggested you take your screenshots, you probably could get 9 fps. This is not a joke this time.
  • CleymenZero
    ✭✭✭✭✭
    daemonios wrote: »
    You claimed 9 fps, which you clearly simply made up. You brought up the RX 570's performance, which I hadn't addressed. Yet you keep digging in your heels and doubling down. Learn to argue. I'm not bothering to reply to you anymore.

    What my comments boiled down to, though I failed to clearly explain why I made them, is that the game has too many instances where running it at 4K on a 570, a 980 Ti, or a 1080 Ti is really not a great choice.
  • itsfatbass
    ✭✭✭✭✭
    Lady_Linux wrote: »
    itsfatbass wrote: »
    First of all.... HDMI is laughable for a true gamer. We all know DISPLAY PORT is the way to go... and please... stuttering and lag are nearly NEVER related to your cable OMEGALULZ

    If you're having performance issues, refer to this MEGA thread that has some VERY solid tips and tweaks to really improve performance AND graphics quality

    Why would I do any of that when I have already determined the cable to be the issue? In addition, the advice to use a DP-to-HDMI cable has also been debunked, as the only ones I could find that would run 4K did so only at a 30 Hz refresh rate, when the HDMI 2.0 cable does so at twice that rate. Oh, and lookie here: here's an 8K HDMI cable that is HDMI 2.1:

    https://smile.amazon.com/gp/product/B07KNRXGW4/ref=ox_sc_act_title_2?smid=A1CJUFQIP79W99&psc=1


    So much for DisplayPort elitism

    DisplayPort is VASTLY superior. Your link provided an HDMI 2.0 cable, and DP still has higher-bandwidth options than even that. Again, I will repeat: YOUR CABLE IS NOT WHY YOU HAVE STUTTERING AND PERFORMANCE ISSUES.

    I'm not sure if you're trying to surf the dark web for DisplayPort information, but you are largely misinformed. DP can EASILY do more than 60 fps at 4K.
    Edited by itsfatbass on November 17, 2019 5:06PM
    ~PC/NA~ Magblade, Tankanist, Healplar, Stamcro, Oakensorc, Healden, Tanknight ~PLUR~
  • SeaGtGruff
    ✭✭✭✭✭
    ✭✭✭✭✭
    And she said back in post #133 of this thread that she now realizes the cable wasn't the issue. People can stop beating her up about that now that it's 46 posts later.
    I've fought mudcrabs more fearsome than me!
  • CleymenZero
    ✭✭✭✭✭
    itsfatbass wrote: »
    Lady_Linux wrote: »
    [...] Why would I do any of that when I have already determined the cable to be the issue? [...] So much for DisplayPort elitism

    DisplayPort is VASTLY superior. [...] Again, I will repeat: YOUR CABLE IS NOT WHY YOU HAVE STUTTERING AND PERFORMANCE ISSUES. [...] DP can EASILY do more than 60 fps at 4K.

    Truly getting trolled here. DisplayPort elitism???

    Never mind that a quick and simple search will tell you that DisplayPort has an 80 Gbps bandwidth vs. her touted 18 Gbps for HDMI 2.0. Never mind the fact that DisplayPort was doing 20 Gbps back in 2012...

    Never mind that nobody here could have any reason to have a vested interest in pushing DisplayPort, which might otherwise justify the elitism charge. I mean, the only reason you'd push the DisplayPort suggestion is to solve the problem that was misdiagnosed by Miss Ubuntu in the first place...

    This is not a DisplayPort vs. HDMI argument, Lady, it's a "HDMI 1.4 vs. HDMI 2.0 is a non-issue in this game" thing.
    Edited by CleymenZero on November 17, 2019 5:23PM
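For the bandwidth numbers being thrown around, here is a sketch comparing the published raw link rates with the payload left after line encoding (8b/10b for HDMI and DP 1.x, 128b/132b for DP 2.0):

```python
# Raw link rate vs. usable payload after line-coding overhead.
# (version, raw Gbit/s, encoding efficiency)
links = [
    ("HDMI 1.4", 10.2, 8 / 10),
    ("HDMI 2.0", 18.0, 8 / 10),
    ("DP 1.2",   21.6, 8 / 10),
    ("DP 1.4",   32.4, 8 / 10),
    ("DP 2.0",   80.0, 128 / 132),
]
for name, raw, eff in links:
    print(f"{name:9} raw {raw:5.1f} Gbit/s -> ~{raw * eff:5.2f} Gbit/s payload")
```

The 80 Gbps figure is the DP 2.0 spec announced in mid-2019; hardware actually shipping at the time of this thread topped out at DP 1.4's 32.4 Gbit/s, which is still comfortably above HDMI 2.0.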
  • NBrookus
    ✭✭✭✭✭
    ✭✭✭✭
    @Lady_Linux Video cards tend to be the limiting factor in gaming systems, and the one you have is insufficient for what you want to achieve. I suggest returning the TV, as you mentioned, and, provided your system otherwise supports it, making your next upgrade a video card.

    I always end up wishing I had a better video card not long after building a new comp. You'd think I'd learn by now.