Lady_Linux wrote: »https://www.asrock.com/Graphics-Card/AMD/Phantom Gaming X Radeon RX570 8G OC/
This is the exact graphics card I have, and it will not connect at 4K 60 Hz on Linux or Windows.
And since my Air TV player will connect at 4K 60 Hz, it is not the TV or the cable but the card, and the card is supposed to be capable of 8K.
nafensoriel wrote: »Oh wow, this thread is still going...
I think some things need to be mentioned though...
The only functional reason to ever run a refresh rate over 60 is not visual quality. It has been pretty conclusively shown that while humans can tell the difference between a 240 Hz display and a 60 Hz display in side-by-side tests, they can rarely tell while actually PLAYING.
Refresh rate has exactly ONE impact on performance: if your refresh rate is lower than your FPS, you can in some circumstances get a yo-yo effect with Vsync on. Beyond that, there is no effect. Higher refresh rates DO NOT IN ANY WAY INCREASE PERFORMANCE.
The real reason we, as gamers, chase high refresh rates is INPUT LATENCY: the time between seeing, clicking, and seeing the result of the click. Higher refresh rates allow more images to be drawn, so your eye can see and register action faster. In an FPS this is very important; in ESO, less so (though it helps with weaving).
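To put rough numbers on that latency point, here's a quick Python sketch (the refresh rates are just common examples, and the "extra wait" figure is simply one full refresh interval, the worst case before the next screen update):

```python
# Higher refresh rate = shorter gap between screen updates, which is
# what trims input latency; it does not make the GPU render any faster.
for hz in (30, 60, 144, 240):
    frame_time_ms = 1000 / hz
    print(f"{hz:>3} Hz: a new image every {frame_time_ms:.1f} ms "
          f"(up to ~{frame_time_ms:.1f} ms extra wait before you see your click)")
```

Going from 60 Hz to 240 Hz only shaves roughly 12 ms off the worst case, which is why it matters most in twitch shooters and far less in ESO.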
Fact checks:
A bad cable is a bad cable. Most of you probably have bad cables. You've bent them, rolled a chair over them, abused them, stuck them into dirty a** ports. They are "less optimal".
GOOD NEWS!
Having a bad cable with a crap monitor means NOTHING! Cables are DIGITAL now. Digital has this really cool property where it either works... or it doesn't. Unless your hardware costs more than a car, cable choice isn't going to bother you one whit. Rejoice at no longer having to physically shield your monitor cable from your power cable to avoid artifacts!
HDMI 2.0 vs. 1.x: it doesn't matter much. I've seen really cheap cables handle the 2.0 spec... I've seen really expensive cables fail at it. You can make a passing 2.0-spec cable fail just by bending it wrong. Unless you are a truly 100% hardcore videophile, you probably won't have a setup that cares. If you do have a setup that cares, you won't use HDMI, because of how finicky it is.
@Lady_Linux
I didn't read every post, but I'll answer what I did read. I run ESO on Pop!_OS Linux on my laptop, sometimes on a 4K monitor via HDMI, and it runs fine at 60 Hz.
1. HDMI vs. DisplayPort won't matter, since in the end your TV only has HDMI, so the cable will still be HDMI even with DisplayPort on one end. Cable standard (version), cable quality, and cable length all do matter though. You will need HDMI 2.0 to get 4K at 60 Hz. The other issue is whether the connecting devices themselves support the higher HDMI versions: there are controllers in both the TV and your graphics card, and if they are only programmed for HDMI 1.4, they won't do 4K at 60 Hz.
2. Stuttering isn't due to the frame limit (Hz is just how many screen refreshes are pushed per second, i.e. FPS) as much as to drops in frame rate from your computer (graphics card/CPU/RAM maxed out). Think of it this way: movies play at only 24 fps and they look smooth. You will notice a difference between 60 Hz and 30 Hz, but if you are staying at a stable 30 fps the game would still be playable and not horrible. What actually happens in ESO, though, is that some scenes render extremely heavily. On my GTX 1080 I get anywhere from 80 fps in 4K down to a mere 6 fps in Vivec City, depending solely on where I look. The game is not very well optimized, which is why they are supposed to be fixing it.
3. Here are some things to consider. Enable the FPS counter in the game and see if your FPS is dropping when you see a stutter. If your card can only render 20 FPS, you will always have issues. The cable will still carry a 30 Hz or 60 Hz signal and your TV will still refresh at that rate, but it only receives those 20 new frames per second, so some refreshes just repeat the same unchanged image.
4. SOLUTION! (Possibly.) Just set your resolution to Fullscreen (not windowed) at 1920x1080 (1080p) in game. It should work; if not, set your operating system to 1920x1080 when you want to play. You'll be able to play at 60 Hz and your graphics card will stutter a lot less. 4K is exactly double 1080p in each dimension, so every four 4K pixels group up into a single 1080p pixel and the TV acts like a decent 1080p monitor. Even on my beastly 2080 Ti desktop I can drop to low FPS at times on my 144 Hz 1440p monitor. It's rare, but it happens. The game needs to be optimized, but you also don't have a good enough video card to run graphically intensive games well at 4K. Light games and desktop apps/video should run fine though.
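On point 4, the pixel math behind the 1080p suggestion is easy to check in Python (this is just resolution arithmetic, with no assumptions about any particular game or TV model):

```python
# 4K UHD is exactly twice 1080p in each dimension, so the TV can map
# each 1080p pixel onto a clean 2x2 block of 4K pixels when upscaling.
uhd_w, uhd_h = 3840, 2160
fhd_w, fhd_h = 1920, 1080
print(uhd_w / fhd_w, uhd_h / fhd_h)        # 2.0 2.0 -> clean integer scaling
print((uhd_w * uhd_h) // (fhd_w * fhd_h))  # 4 -> the GPU renders 4x fewer pixels at 1080p
```

A quarter of the pixels is a huge rendering-load reduction, which is why dropping to 1080p helps so much on a mid-range card.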
CleymenZero wrote: »nafensoriel wrote: »Oh wow, this thread is still going...
Hands down the stupidest thread I've seen, to the point where I think she's trolling.
A few things though:
- Some people are clearly better at discerning higher vs. lower framerates. I can tell better than my brother, and we've tested it an unreasonable number of times. I do play a lot more games than he does, so that's a potential bias right there.
- You forgot to mention adaptive sync.
You seem to know your way around, so process this: she believes her RX 570 is capable of 8K gaming... I nearly lost it there...
CleymenZero wrote: »Lady_Linux wrote: »https://www.asrock.com/Graphics-Card/AMD/Phantom Gaming X Radeon RX570 8G OC/
8K on a 570... This is the joke thread of the year.
This is infuriating because your complete lack of understanding of the technology is obscuring the genuinely bad performance of the game.
You will never be able to run a game at 4K or 8K with this GPU. The best you could do is have the GPU render at 1080p (the best performance you're likely to get on this GPU) and let the screen upscale to 4K (at which point you're basically playing at 1080p).
I have a GTX 1080 Ti, which is probably 8 times more powerful than an RX 570, and I wouldn't attempt 4K with it on any game unless I'm willing to lower the graphics significantly to get more than 40 fps.
If you're trolling, I applaud you, but if you're serious, you need to educate yourself.
BTW, HDMI 2.0 is 18 gigabits per second (a data transfer RATE), as opposed to 18 GB, which would refer to a quantity of data, which still wouldn't make sense...
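For anyone curious where that 18 Gbit/s figure comes from, here's a back-of-the-envelope check in Python (it assumes the standard CTA-861 4K60 timing, which includes blanking intervals, and HDMI's 8b/10b TMDS encoding):

```python
# 4K60 with blanking intervals uses a 594 MHz pixel clock.
h_total, v_total, refresh = 4400, 2250, 60   # 3840x2160 active + blanking
pixel_clock = h_total * v_total * refresh    # 594,000,000 pixels/s
data_rate = pixel_clock * 24                 # 24-bit RGB -> 14.256 Gbit/s of video data
line_rate = data_rate * 10 // 8              # 8b/10b encoding overhead on the wire
print(line_rate / 1e9)                       # 17.82 -> just fits HDMI 2.0's 18 Gbit/s
# HDMI 1.4 tops out around 10.2 Gbit/s, which is why it only manages 4K at 30 Hz.
```

So 4K60 RGB sits right at the edge of HDMI 2.0's capacity, and comfortably beyond anything a 1.4-era controller can do.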
CleymenZero wrote: »I have a GTX 1080ti which is probably 8 times more powerful than a RX 570 and I wouldn't attempt 4k with it on any game...
Technically, it could, though in practice even high-end cards are a pain to get 8K actually working, and it would probably run at something like 5 fps, on the desktop only. Doesn't hurt me to be helpful either way. Have a nice day.
@Lady_Linux
That's not entirely accurate. Think of it like this: your graphics card has to write a letter to an audience (the monitor). The longer the letter, the longer it takes to write, and more skilled writers can write faster (this is your frames per second). The Hz is the gopher running between you and the audience. At 60 Hz the gopher ALWAYS comes 60 times a second, at exact intervals, even if you have not finished your letter. If you have not finished by the time the gopher arrives, it returns to the audience and simply tells them there is nothing new to report. This is basically how monitors and graphics cards work: the display is still running at its refresh rate*, but you may not be able to produce content fast enough to give it new updates. So technically it is always running at 60 Hz when set to that, but your frames might not be ready to send, which is what causes dips and stutter. You will be able to run desktop apps at 4K 60 Hz fine, and depending on the game and settings, you may even be able to game fine at 4K on your graphics card.
*I add a note here because FreeSync and G-Sync monitors will actually change their refresh rate based on your video card, so they might only update 20 times a second or less if they know you are being slow at writing. The thing is that a change in speed is more noticeable than a single constant speed. That's why 30 fps that never wavers will look a LOT better than a variance between 20 fps and 40 fps. It comes down to human perception.
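The gopher analogy can be sketched as a toy simulation in Python (the per-frame render times below are made-up numbers, just to show slow frames getting repeated across several refreshes of a fixed 60 Hz display):

```python
import math

refresh_ms = 1000 / 60                 # the "gopher" arrives every ~16.7 ms
render_ms = [12, 30, 12, 45, 12]       # hypothetical time to "write" each frame

# A frame that isn't ready when the gopher arrives means the display
# repeats the previous image; slow frames get shown for multiple refreshes.
repeats = [max(1, math.ceil(ms / refresh_ms)) for ms in render_ms]
print(repeats)   # [1, 2, 1, 3, 1] -> the 45 ms frame lingers for 3 refreshes
```

The display never stops refreshing at 60 Hz; it just keeps re-showing stale frames, which is exactly the stutter being described.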
I personally think you should try a new 2.0-rated cable. Yes, the current one might work with the Android device, but the Android might also be more tolerant. There have been so many times where simply buying a new cable has helped people. Go as short as you can for the best tolerance, typically. We use the Twisted Veins brand at work, but even a cheap Amazon Basics cable should work, and you can always return those super easily if they don't. That, and I still think you should just try playing at 1080p resolution... You didn't really answer whether you tried that.
Lady_Linux wrote: »CleymenZero wrote: »8k on a 570... This is joke thread of the year...
Who cares what your card games at. What matters is whether your desktop can display 4K 60 Hz or not.
https://www.asrock.com/Graphics-Card/AMD/Phantom Gaming X Radeon RX570 8G OC/index.asp#Specification
Sweet Jesus. This thread, and the snippy back and forth, is a pretty good summation of why I stopped PC gaming a while back. PC bros are always going on and on about how it's "easy" to build a PC, and then have condescending meltdowns on video game forums in which they try to one-up one another over technical minutiae. You all need to seriously take a step back and evaluate the way you're communicating with one another over something as inconsequential as an HDMI cable.
CleymenZero wrote: »I have a GTX 1080ti which is probably 8 times more powerful than a RX 570 and I wouldn't attempt 4k with it on any game...
I must be trolling too, since I'm running the game at 4K and max settings on a 980 Ti.
Lady_Linux wrote: »This PC SHOULD be connecting at 60 Hz in 4K, irrespective of how it gets its games on.
CleymenZero wrote: »I must be trolling too, since I'm running the game at 4K and max settings on a 980 Ti.
At 9 fps with all low settings? Sure.
CleymenZero wrote: »I must be trolling too, since I'm running the game at 4K and max settings on a 980 Ti.
And your card is high-end compared to an RX 570 btw... It is at least 30% more powerful than hers.
CleymenZero wrote: »At 9 fps with all low settings? Sure.
I get 50+ FPS most of the time at max settings, apart from ambient occlusion and particle systems. Will post a screenshot if I remember and can be bothered when I get home. But sure, keep doubling down.
CleymenZero wrote: »And your card is high-end compared to an RX 570 btw... It is at least 30% more powerful than hers.
Irrelevant. I was replying to your claim that a 1080 Ti still isn't enough to play at 4K. I wasn't replying to the OP, and I don't have a clue where AMD cards place performance-wise, since I haven't had one in years.
You claimed 9 fps, which you clearly made up. You brought up RX 570 performance, which I hadn't addressed. Yet you keep digging in your heels and doubling down. Learn to argue. I'm not bothering to reply to you anymore.
CleymenZero wrote: »You claimed 9 fps, which you clearly made up... Learn to argue. I'm not bothering to reply to you anymore.
Oh please, you took the 9 fps seriously, discredit an entire set of arguments based on that, and tell me to learn to argue?
Lady_Linux wrote: »itsfatbass wrote: »First of all.... HDMI is laughable for a true gamer. We all know DISPLAY PORT is the way to go... and please... stuttering and lag are nearly NEVER related to your cable OMEGALULZ
If you're having performance issues, refer to this MEGA thread that has some VERY solid tips and tweaks to really improve performance AND graphics quality
Why would I do any of that when I have already determined the cable to be the issue? In addition, the advice to use a DP-to-HDMI cable has also been debunked: the only ones I could find that would run 4K did so only at a 30 Hz refresh rate, while the HDMI 2.0 cable does so at twice that rate. Oh, and lookie here: here's an 8K HDMI cable that is HDMI 2.1:
https://smile.amazon.com/gp/product/B07KNRXGW4/ref=ox_sc_act_title_2?smid=A1CJUFQIP79W99&psc=1
So much for DisplayPort elitism.
itsfatbass wrote: »Lady_Linux wrote: »So much for display port elitism
DisplayPort is VASTLY superior. Your link provided an HDMI cable, and DP still has higher-bandwidth options than even that. Again, I will repeat: YOUR CABLE IS NOT WHY YOU HAVE STUTTERING AND PERFORMANCE ISSUES.
I'm not sure if you're surfing the dark web for DisplayPort information, but you are largely misinformed. DP can EASILY do more than 60 fps at 4K.