How many people with lag and stutter are not using an HDMI 2.0 cable, or maybe have a bad cable?

  • SeaGtGruff
    NBrookus wrote: »
    I always end up wishing I had a better video card not long after building a new comp. You'd think I'd learn by now.

    Technology is racing ahead so quickly that by the time you click "Buy now" on the latest-greatest computer, it's already pretty much outdated. :(
    I've fought mudcrabs more fearsome than me!
  • Aurielle
    Aurielle wrote: »
    Sweet Jesus. This thread, and the snippy back and forth, is a pretty good summation of why I stopped PC gaming a while back. PC bros are always going on and on about how it’s “easy” to build a PC, and then have condescending meltdowns on video game forums in which they try to one-up one another over technical minutiae. You all need to seriously take a step back and evaluate the way you’re communicating with one another over something as inconsequential as an HDMI cable.

    It's the most infuriating yet entertaining thread I've seen in a long time.

    The HDMI cable is indeed inconsequential, but not for the same reason you think it is. An RX 570 is not a powerful GPU. A maximum display resolution stated by a manufacturer is also inconsequential.

    So many other things could be said. I'm gonna say the OP appreciates the attention, good or bad, and doesn't mind being 'splained stuff even though there is no willingness to understand. The only issue is that what she says might be taken as true by people who also don't understand much about technology.

    It’s not just the OP...
  • precambria
    DisplayPort.
  • Lady_Linux
    I'm sure my cable isn't 2.0
    Kadoin wrote: »
    Lady_Linux wrote: »
    This PC SHOULD be connecting at 60 Hz in 4K irrespective of how it gets its games on.

    Yes, but they never said if you need a DisplayPort cable to do it instead of HDMI.

    Regardless, contacting the manufacturer is the right decision, because it could be a bug in the card's BIOS or some other defect. What they might tell you is that you can only use 4K with a DisplayPort cable due to additional hardware limitations related not to the AMD chip but to their GPU board design.

    You already know the TV is capable, so that only leaves the card as the culprit.

    Though no company will ever willingly admit they cheaped out on parts to maximize profits...

    That's true, the card manufacturer never said if I would need to use a DisplayPort. This is so.

    However, I do have it running in Windows now at 4K 60 Hz, and my Android TV box defaults to 4K 60 Hz and I get the most amazing videos, btw... but I can't get 4K 60 Hz out of Linux to save my life. I'll post a video of me playing in Windows at 4K 60 Hz... not native, but it's running... on high settings...

    I have an adapter that is supposed to do 120 Hz at 4K from DP to HDMI, but obviously the HDMI is only 60 Hz on the TV. It comes tomorrow. We'll see then if that can help. If not, I'll resign. Too much crazy making.
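    For what it's worth, the cable question itself can be sanity-checked with plain arithmetic. A minimal sketch in Python, assuming 8-bit RGB and the standard CTA-861 4K60 timing (HDMI 1.4 tops out around a 340 MHz pixel clock, HDMI 2.0 around 600 MHz):

        # Can an HDMI link carry 3840x2160 @ 60 Hz in 8-bit RGB?
        # HDMI versions are bounded by the max TMDS pixel clock they
        # support: ~340 MHz for HDMI 1.4 vs. ~600 MHz for HDMI 2.0.
        TOTAL_PIXELS = 4400 * 2250   # full frame incl. blanking (CTA-861 4K60 timing)
        REFRESH_HZ = 60

        pixel_clock_mhz = TOTAL_PIXELS * REFRESH_HZ / 1e6
        print(f"4K60 RGB needs a {pixel_clock_mhz:.0f} MHz pixel clock")  # 594 MHz

        for version, max_mhz in {"HDMI 1.4": 340, "HDMI 2.0": 600}.items():
            verdict = "OK" if pixel_clock_mhz <= max_mhz else "out of spec"
            print(f"{version} (max {max_mhz} MHz): {verdict}")

    In other words, 4K60 RGB genuinely does not fit through an HDMI 1.4 link, so the port, the card, and the cable all have to be 2.0-class, or the signal drops to 30 Hz or 4:2:0.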
    I simply must protest. There are no Penguin avatars for me to use in the forums.

    BTW, I use arch too
  • Lady_Linux
    I'm sure my cable isn't 2.0
    Actually, it's not true:

    https://www.amd.com/en/support/graphics/radeon-500-series/radeon-rx-500-series/radeon-rx-570

    Connectivity
    DisplayPort 1.4 HDR
    HDMI™ 4K60 Support
  • Bucky_13
    I think playing ESO through a wireless internet connection is a far bigger reason for bad lag and latency in ESO than HDMI cables.
  • Lady_Linux
    I'm sure my cable isn't 2.0
    Bucky_13 wrote: »
    I think playing ESO through a wireless internet connection is a far bigger reason for bad lag and latency in ESO than HDMI cables.

    Cat 6 cable all the way... (or better)
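    If anyone wants to actually measure the wireless-versus-wired difference, a rough way is to time TCP handshakes once on Wi-Fi and once on Ethernet. A minimal sketch in Python; the host below is a placeholder, not an ESO server, so swap in something near the game's region:

        # Time TCP handshakes to a host; compare Wi-Fi vs. wired runs.
        import socket
        import time

        def tcp_rtt(host, port=443, samples=5):
            times = []
            for _ in range(samples):
                start = time.perf_counter()
                # completing connect() costs roughly one network round trip
                with socket.create_connection((host, port), timeout=3):
                    pass
                times.append((time.perf_counter() - start) * 1000)
            return min(times), sum(times) / len(times)

        best, avg = tcp_rtt("example.com")  # placeholder host
        print(f"best {best:.1f} ms, avg {avg:.1f} ms")

    A wired run usually comes back both lower and far more consistent, and the consistency is what matters for stutter.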
  • Lady_Linux
    I'm sure my cable isn't 2.0
    UPDATE WOOT WOOT!

    The folks on the Manjaro forums just helped me finish it today. My Linux desktop now connects to my Sceptre 50-inch TV in 4K 60 Hz sweetness. Will post a video of that later...
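    For anyone else fighting the same thing on an X11 desktop, the usual endgame is forcing the mode with xrandr once you know the right output name. A minimal sketch (this is the general approach, not the exact steps from the Manjaro thread; "HDMI-A-0" is only an example name, so check `xrandr --query` first):

        # Force a 4K60 mode on an X11 desktop via xrandr.
        import subprocess

        OUTPUT = "HDMI-A-0"  # assumption: replace with your real output name

        subprocess.run(
            ["xrandr", "--output", OUTPUT, "--mode", "3840x2160", "--rate", "60"],
            check=True,  # raise if xrandr rejects the mode
        )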
  • DaNnYtHePcFrEaK
    wut, wut, wut?
    The game doesn't even hit 30 fps on consoles hahaha, so 60 Hz is a waste... I use a DisplayPort cable, still get lag and stutters 🎉😂 go figure lol
  • Jem_Kindheart

    I have a GTX 1080ti which is probably 8 times more powerful than an RX 570
    If you're trolling, I applaud you but if you're serious, you need to educate yourself.

    WHOA wait a second now lmao. Time out. The 1080ti is a fantastic card, but it's not 8x more powerful lololol. Depending on which metric you look at, it is between 25% and 45% more powerful. It also uses twice the wattage, cost you 5x more, and is two years old. So, before cutting down strangers online, look up the numbers. The 570 is still an entry-grade card, though new; it's a $150-200 card and might struggle at 8K. It should do 4K halfway acceptably.

    https://gpu.userbenchmark.com/Compare/Nvidia-GTX-1080-Ti-vs-AMD-RX-570/3918vs3924

    And

    https://versus.com/en/asrock-phantom-gaming-x-radeon-rx-570-oc-8gb-vs-nvidia-geforce-gtx-1080-ti
    Longtimer since beta, the usual. 26 CP toons. ~1700cp on main account, 1000cp on 2nd account. Endgame-ish lol. Most Vets / some HM's cleared.
  • thorwyn
    Turns out Linux folks are not as techy as we thought. Another urban myth debunked!
    And if the dam breaks open many years too soon
    And if there is no room upon the hill
    And if your head explodes with dark forebodings too
    I'll see you on the dark side of the moon
  • Lady_Linux
    I'm sure my cable isn't 2.0
    thorwyn wrote: »
    Turns out Linux folks are not as techy as we thought. Another urban myth debunked!
    Lady_Linux wrote: »
    UPDATE WOOT WOOT!

    The folks on the Manjaro forums just helped me finish it today. My Linux desktop now connects to my Sceptre 50-inch TV in 4K 60 Hz sweetness. Will post a video of that later...


    Are we reading the same thread?


  • Sinolai
    wut, wut, wut?
    #PC EU.
    I am guessing OP is playing on console.
  • CleymenZero

    I have a GTX 1080ti which is probably 8 times more powerful than an RX 570
    If you're trolling, I applaud you but if you're serious, you need to educate yourself.

    WHOA wait a second now lmao. Time out. The 1080ti is a fantastic card, but it's not 8x more powerful lololol. Depending on which metric you look at, it is between 25% and 45% more powerful. It also uses twice the wattage, cost you 5x more, and is two years old. So, before cutting down strangers online, look up the numbers. The 570 is still an entry-grade card, though new; it's a $150-200 card and might struggle at 8K. It should do 4K halfway acceptably.

    https://gpu.userbenchmark.com/Compare/Nvidia-GTX-1080-Ti-vs-AMD-RX-570/3918vs3924

    And

    https://versus.com/en/asrock-phantom-gaming-x-radeon-rx-570-oc-8gb-vs-nvidia-geforce-gtx-1080-ti

    There is no way of signaling deliberate exaggeration, so I'll take the blame for that.

    I also shot back at a guy saying he was playing in 4K that he was probably hitting 9 fps (a way of saying he didn't hit great numbers, but in an exaggerated fashion). He offered a rebuttal saying he was hitting 50+ fps in 4K, and I asked him to send a pic during the Rakkhat fight or something similarly demanding (I'm sure it was THEN 9 fps).

    The reason why I made a gross exaggeration of my card's power is because OP said she possessed a card capable of beautiful 8K according to the manufacturer's website, which I found to be completely out of this world. She took the maximum possible display resolution as a guarantee of performance...

    And the trick to getting an MSI 1080ti Tri-X for around $500 USD (probably less, since it was 600 CAD) is to wait right before the launch of the RTX cards and make a ton of offers. A few bit, and the best option by far was this beast of a 1080ti for that price. Problem is, I had JUST bought a Vega 64, and what a sore loss that was. I should've sold it, 'cause I got it for list price when it was selling for at least twice the list price on Newegg...
  • Lady_Linux
    I'm sure my cable isn't 2.0

    I have a GTX 1080ti which is probably 8 times more powerful than an RX 570
    If you're trolling, I applaud you but if you're serious, you need to educate yourself.

    The reason why I made a gross exaggeration of my card's power is because OP said she possessed a card capable of beautiful 8K according to the manufacturer's website, which I found to be completely out of this world. She took the maximum possible display resolution as a guarantee of performance...

    OP is only responsible for what OP says, not for how you choose to interpret it. Please take some time and learn the difference.

    AND the reality is that if the device says it will do 8K at 60 Hz, then it ought to be able to connect the desktop at that regardless of its performance. Likewise, when it says it can connect at 4K at 60 Hz, it should connect the desktop at that resolution, irrespective of its performance and especially irrespective of your chest pounding about your own equipment - which btw is in no way relevant to anything in this thread.
    Edited by Lady_Linux on November 20, 2019 10:15AM
  • CleymenZero
    Lady_Linux wrote: »

    I have a GTX 1080ti which is probably 8 times more powerful than an RX 570
    If you're trolling, I applaud you but if you're serious, you need to educate yourself.

    The reason why I made a gross exaggeration of my card's power is because OP said she possessed a card capable of beautiful 8K according to the manufacturer's website, which I found to be completely out of this world. She took the maximum possible display resolution as a guarantee of performance...

    OP is only responsible for what OP says, not for how you choose to interpret it. Please take some time and learn the difference.

    AND the reality is that if the device says it will do 8K at 60 Hz, then it ought to be able to connect the desktop at that regardless of its performance. Likewise, when it says it can connect at 4K at 60 Hz, it should connect the desktop at that resolution, irrespective of its performance and especially irrespective of your chest pounding about your own equipment - which btw is in no way relevant to anything in this thread.

    I'd take the time to look for your comment about your "beautiful 8k capable GPU" to quote you verbatim, but you've wasted everyone's time enough; it's unfathomable.

    You started this thread trying to equate the performance issues we are all having to a cable, and it derailed into a stupid saga about you being able to connect to a cheap TV in 4K 60 Hz. This is no chest pounding; it's a resounding astonishment at the pinnacle of ignorance that you display.

    To what end is it so relevant to this game that your TV connects in 4K 60 Hz if you will realistically never be able to run it properly? Is it not then about performance?

    Before this thread, you were oblivious to the concept of input lag, and I don't think you fully understand how simple and silly this thread is a thousand posts later.

    So, to your original question of "how many people with lag and stutter are not using an HDMI 2.0 cable":

    The answer is: the chances of it having an influence are slim to none, and that very question exposes a grand lack of technical understanding.

    After that, you followed up with your GPU being capable of 8K (why would it even matter that it CAN connect in 8K when it cannot perform at all in 8K?).

    There was the "18 GB cable" (which is a quantity of data and not a data flow rate) and a myriad of other stupidities that had many completely baffled.

    The fact is, instead of a reasonable "oopsy, I messed up and I didn't really know what I was talking about, sorry guys" (which I would've wholeheartedly respected), we are subjected to a wallowing pit of technical ignorance that has nothing to do with the game.

    The main point is: who cares if it can or cannot connect in 4K 60 Hz, your GPU can't do it. And even if it could, it wouldn't be so advisable unless all you do is decorate your house... There are enough technical issues with this game; you don't need to add all these stupid hurdles, it'll just give it one more reason to mess the bed.

    I'm not sorry for being a total jerk, because you argued with people who clearly explained how off track you were.
  • Noldornir
    While the correct cable is needed, lag is NOT caused by it and hardly can be.

    60 FPS means that 60 images are processed (and sent to the screen) in a second; of course, a fast, proper cable is needed for this to happen.

    30 FPS (half of it) halves the required effort.

    Fact is: human eyes can only "catch" 24 images/sec. All the others are "invisible" to us, since you'll only "see" 24 of them (25 FPS is the minimum required to make a smooth image).

    30 FPS puts in 5 more "extra" frames, since one can't really tell which ones your eyes are gonna "pick" and which ones they are gonna "drop".

    60 FPS is definitely redundant, but it ensures that we have more than twice the minimum required to make a smooth image. So if something happens and we experience some "drop", we have more than enough images anyway.

    Now if you can go to 60 with a proper cable

    AND

    you drop to 30 with an improper one, you'll hardly be able to tell the difference (unless FPS drop suddenly).

    No lag with 30 FPS, as that's still more than we can see; video lag like the ones you talk about could only be caused by performance < 24 FPS (a poor video card COULD actually deliver less than this value).
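    Whatever one makes of the 24-images figure, the frame-interval arithmetic is easy to check: the gap between frames is what you actually perceive, and it grows quickly as FPS falls. A quick sketch:

        # Frame interval at common framerates.
        for fps in (24, 30, 60, 120):
            print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
        # 24 -> 41.7 ms, 30 -> 33.3 ms, 60 -> 16.7 ms, 120 -> 8.3 ms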
  • Lady_Linux
    I'm sure my cable isn't 2.0
    Lady_Linux wrote: »

    I have a GTX 1080ti which is probably 8 times more powerful than an RX 570
    If you're trolling, I applaud you but if you're serious, you need to educate yourself.

    The reason why I made a gross exaggeration of my card's power is because OP said she possessed a card capable of beautiful 8K according to the manufacturer's website, which I found to be completely out of this world. She took the maximum possible display resolution as a guarantee of performance...

    OP is only responsible for what OP says, not for how you choose to interpret it. Please take some time and learn the difference.

    AND the reality is that if the device says it will do 8K at 60 Hz, then it ought to be able to connect the desktop at that regardless of its performance. Likewise, when it says it can connect at 4K at 60 Hz, it should connect the desktop at that resolution, irrespective of its performance and especially irrespective of your chest pounding about your own equipment - which btw is in no way relevant to anything in this thread.

    I'd take the time to look for your comment about your "beautiful 8k capable GPU" to quote you verbatim, but you've wasted everyone's time enough; it's unfathomable.

    You started this thread trying to equate the performance issues we are all having to a cable, and it derailed into a stupid saga about you being able to connect to a cheap TV in 4K 60 Hz. This is no chest pounding; it's a resounding astonishment at the pinnacle of ignorance that you display.

    To what end is it so relevant to this game that your TV connects in 4K 60 Hz if you will realistically never be able to run it properly? Is it not then about performance?

    Before this thread, you were oblivious to the concept of input lag, and I don't think you fully understand how simple and silly this thread is a thousand posts later.

    So, to your original question of "how many people with lag and stutter are not using an HDMI 2.0 cable":

    The answer is: the chances of it having an influence are slim to none, and that very question exposes a grand lack of technical understanding.

    After that, you followed up with your GPU being capable of 8K (why would it even matter that it CAN connect in 8K when it cannot perform at all in 8K?).

    There was the "18 GB cable" (which is a quantity of data and not a data flow rate) and a myriad of other stupidities that had many completely baffled.

    The fact is, instead of a reasonable "oopsy, I messed up and I didn't really know what I was talking about, sorry guys" (which I would've wholeheartedly respected), we are subjected to a wallowing pit of technical ignorance that has nothing to do with the game.

    The main point is: who cares if it can or cannot connect in 4K 60 Hz, your GPU can't do it. And even if it could, it wouldn't be so advisable unless all you do is decorate your house... There are enough technical issues with this game; you don't need to add all these stupid hurdles, it'll just give it one more reason to mess the bed.

    I'm not sorry for being a total jerk, because you argued with people who clearly explained how off track you were.

    It can and does, and I've posted a video in the forum... look for the thread.

    As far as derailment of the OP, that is largely due to people like you.
    Edited by Lady_Linux on November 20, 2019 12:35PM
  • lagrue
    No, I'm sure my cable is HDMI 2.0 and that it's good because I tested it.
    Noldornir wrote: »
    Fact is: human eyes can only "catch" 24 images/sec. All the others are "invisible" to us, since you'll only "see" 24 of them (25 FPS is the minimum required to make a smooth image).

    This is one of those "facts" that those versed in psychology, like me, cringe at seeing posted around - because no, it's not a fact. It's a rumour people made up because film used to be shot at 24 frames per second due to storage limitations (and nothing to do with your eyes).

    Firstly, the human eye doesn't see in FPS to begin with. But the truth is, the nerves responsible for relaying visual data to our brain fire ~300-1000 times a second; i.e., your brain reads and interprets ~300-1000 "frames" a second. There are diminishing returns on what you will see in the end "image", because your brain naturally filters out useless data - air force pilots have been tested and cap out at about 220 unique frames - but no, the human eye does not cap out at 24 frames; that's ridiculous. There is no definitive count of how many frames the human eye can see, but between 220 and 300 seems to be the magic spot.

    What they did with the pilots was play a video file where there was a picture on only one single frame out of 220, and they could detect it. Keep in mind that's not the limit of perceivable frames - just the limit of identifiable single frames. Your eyes interpret hundreds more frames than even that. As far as we know, nobody has made technology yet which supersedes the human eye's ability to interpret.
    Edited by lagrue on November 20, 2019 1:00PM
    PSN ID (NA only): Zuzu_With_a_Z
    *GRAND MASTER CRAFTER*

    "You must defeat me every time. I need defeat you only once"
  • Lady_Linux
    I'm sure my cable isn't 2.0
    Here's an interesting article... while old, it seems to suggest that using a YCbCr pixel format of 4:2:2 uses less bandwidth, which sounds like a good thing. Using less bandwidth sounds like it would reduce load on the GPU for sure and may help to improve performance.

    http://www.sensoray.com/support/appnotes/frame_grabber_capture_modes.htm
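    The savings are real, but they are on the link rather than in rendering: the GPU still draws full RGB frames, and chroma subsampling only shrinks what travels down the cable. A minimal sketch of the pixel-data math, assuming 8 bits per sample:

        # Average bits per pixel for 8-bit YCbCr subsampling modes, and
        # the raw pixel-data rate at 4K60 (link overhead ignored).
        modes = {
            "4:4:4 (full chroma)": 24,  # 8 bits each for Y, Cb, Cr
            "4:2:2": 16,                # chroma halved horizontally
            "4:2:0": 12,                # chroma halved both ways
        }
        pixels_per_sec = 3840 * 2160 * 60
        for name, bpp in modes.items():
            print(f"{name}: {bpp} bpp -> {pixels_per_sec * bpp / 1e9:.1f} Gbit/s")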
    Edited by Lady_Linux on November 20, 2019 1:38PM
  • CleymenZero
    Noldornir wrote: »
    While the correct cable is needed, lag is NOT caused by it and hardly can be.

    60 FPS means that 60 images are processed (and sent to the screen) in a second; of course, a fast, proper cable is needed for this to happen.

    30 FPS (half of it) halves the required effort.

    Fact is: human eyes can only "catch" 24 images/sec. All the others are "invisible" to us, since you'll only "see" 24 of them (25 FPS is the minimum required to make a smooth image).

    30 FPS puts in 5 more "extra" frames, since one can't really tell which ones your eyes are gonna "pick" and which ones they are gonna "drop".

    60 FPS is definitely redundant, but it ensures that we have more than twice the minimum required to make a smooth image. So if something happens and we experience some "drop", we have more than enough images anyway.

    Now if you can go to 60 with a proper cable

    AND

    you drop to 30 with an improper one, you'll hardly be able to tell the difference (unless FPS drop suddenly).

    No lag with 30 FPS, as that's still more than we can see; video lag like the ones you talk about could only be caused by performance < 24 FPS (a poor video card COULD actually deliver less than this value).

    The 24 fps thing is a meme. I don't know why people still propagate that idea. Because some game company CEO mentioned that a 24 fps framerate is more cinematic, as a justification for locking fps due to console limitations?

    The more
    lagrue wrote: »
    Noldornir wrote: »
    Fact is: human eyes can only "catch" 24 images/sec. All the others are "invisible" to us, since you'll only "see" 24 of them (25 FPS is the minimum required to make a smooth image).

    This is one of those "facts" that those versed in psychology, like me, cringe at seeing posted around - because no, it's not a fact. It's a rumour people made up because film used to be shot at 24 frames per second due to storage limitations (and nothing to do with your eyes).

    Firstly, the human eye doesn't see in FPS to begin with. But the truth is, the nerves responsible for relaying visual data to our brain fire ~300-1000 times a second; i.e., your brain reads and interprets ~300-1000 "frames" a second. There are diminishing returns on what you will see in the end "image", because your brain naturally filters out useless data - air force pilots have been tested and cap out at about 220 unique frames - but no, the human eye does not cap out at 24 frames; that's ridiculous. There is no definitive count of how many frames the human eye can see, but between 220 and 300 seems to be the magic spot.

    What they did with the pilots was play a video file where there was a picture on only one single frame out of 220, and they could detect it. Keep in mind that's not the limit of perceivable frames - just the limit of identifiable single frames. Your eyes interpret hundreds more frames than even that. As far as we know, nobody has made technology yet which supersedes the human eye's ability to interpret.

    Glad someone took the time to write a response. Didn't feel like explaining the concept of a continuous electrical signal (a nerve impulse) not being completely equatable to frames per second.

    As for the diminishing returns, I remember seeing that, for gaming, above 100 frames is where the curve flattens and the benefit shrinks as the fps increases. In essence, you'll see more improvement going from 60 to 100 fps than from 100 to 144, or from 144 to 240 Hz.
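    That flattening is just frame-time arithmetic: each step up in fps shaves fewer milliseconds off the frame interval than the last. A quick sketch:

        # Milliseconds saved per frame by each fps step-up.
        for lo, hi in [(60, 100), (100, 144), (144, 240)]:
            saved = 1000 / lo - 1000 / hi
            print(f"{lo:>3} -> {hi:>3} fps saves {saved:.1f} ms per frame")
        # 60->100: 6.7 ms; 100->144: 3.1 ms; 144->240: 2.8 ms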