Kneighbors wrote: »
I already observed new bugs in ICP and Spindl II. In ICP, the door after the first boss stays closed. In Spindl II, on the final boss, the adds spawn and then just stand at their spawn point. Every time I see something like that I'm like, "wtf, you weren't even supposed to touch that part of the game."

I witnessed an invisible wall on the final boss in BC II. Pretty fun when neither Rilis nor the Daedroths can move.
So, since the Morrowind release:
- the group finder does NOT work
- random loading screens in BGs, leading to kicks, leading to long wait times
- Trials are totally screwed up right now (DSA: last boss invisible to the party. HRC: the warrior jumps into the roof and resets. And tons more.)
- random FPS drops EVERYWHERE!

So why did you decide on early access? It was done, you said? Why is it so *** buggy then?!
Kneighbors wrote: »
I can easily tell the difference between 70 and 100 fps. You simply have no idea what you are talking about.

No, you can't. That's like saying you can tell the difference between a 40W and a 100W bulb, and you'd be outright lying if you said "Yes, I can."
Thank you, PC Master Race; your beta testing will greatly benefit us console peasants.
All electronics operate on phases, or frequencies. Light bulbs in the US run at 60Hz, no matter what type of bulb is used. The US power grid is based on 60Hz (50Hz in Europe and other parts of the world).
This prevents humans from seeing the "flicker" of electrical phases. It's also why you see TVs measured in frequencies which are divisible by 60 (120, 240, etc).
FPS, or frames per second, isn't a measure of power but of graphics-card output. GPUs run at their own clocks but still adhere to the monitor's 60Hz refresh. FPS is merely a measurement of how many frames the GPU is producing, not of what you can see.
There's a huge difference between frequency and clarity. The human eye is often quoted as having an fps "limit", but this isn't true and never will be. Our eyes do much more than see "frames". We also see depth, light, color, shadows, and, shockingly, motion.
Our eyes do not see continuously, which is where this misconception comes from. Our eyes break images apart much like fps, but do so at a rate that's impossible to measure.
This is why we cannot see everything on screen, yet pretend to notice a "difference" in changes. That's just not true. As you read this line, your peripheral vision is blurry, but if you look at it, it instantly sharpens while this line turns blurry.
FPS is, and always will be, a *** poor metric for judging visual acuity. It was a buzzword created by an industry pushing memory size and CPU power. Even today, we still measure "perfection" in "GHz", "cores", and "RAM". Now FPS is part of the lexicon when it comes to GPUs.
The majority of people out there aren't using their video cards correctly. Those who bought monitors specifically matched to their GPUs are doing it right.
There's a reason it's called "sync": it's critical to match the monitor's refresh with the GPU's output, even if that means a loss of FPS.
This is why most people who say they can "tell the difference" are clearly conflating visual acuity with "fps", and that's just inaccurate.
Your monitor will always dictate what you can perceive; no amount of CUDA cores will change that, and no monitor on the planet matches the human eye at perceiving "frames per second".
It's just not that simple.
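The "sync" idea above can be sketched as a simple frame-rate limiter that paces a render loop to the monitor's refresh interval, so a GPU capable of far more fps still delivers one frame per refresh tick. This is a minimal illustrative sketch, not any game's actual implementation; the refresh value and function name are assumptions:

```python
import time

def run_frames(refresh_hz: float, n_frames: int) -> float:
    """Pace a render loop to a monitor's refresh interval (hypothetical sketch).

    Returns the measured average frame time in seconds.
    """
    frame_interval = 1.0 / refresh_hz  # e.g. 1/60 s per frame at 60 Hz
    start = time.perf_counter()
    next_deadline = start
    for _ in range(n_frames):
        # ... the frame would be rendered here ...
        next_deadline += frame_interval
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)  # wait for the next "refresh" tick
    elapsed = time.perf_counter() - start
    return elapsed / n_frames

# Even if the GPU could render faster, pacing holds output near the refresh rate:
avg = run_frames(50.0, 20)  # ~0.02 s per frame at a 50 Hz "refresh"
```

Real vsync is done by the driver against the display's vertical blanking interval, not with `time.sleep`, but the effect is the same: frame delivery is capped at the monitor's refresh rate.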
Averya_Teira wrote: »I don't know about 70 FPS and 100 FPS, but I (and most other people) can EASILY differentiate a 30 FPS video and a 60 FPS video...
There are loads of split screen videos with one side showing 30 FPS and one side showing 60 FPS, and it's easily noticeable.
Also, I play the game mostly on PS4, but when I log in on PC, the smoothness and increased FPS are obvious to me.
Motion, and the speed of that motion, plays a significant role in frame rate and in what the human eye can perceive.
In my field of engineering, we acquire video of the arteries and veins of the heart while dye is being injected, so that the vessel walls can be visualized.
Video of an adult is generally acquired at 30 fps. However, pediatric cases are acquired at 60 fps. Why? Because a child's heart rate is faster than an adult's, and if a lower frame rate is used, critical information is lost. We also display this video on monitors capable of a 120Hz refresh.
Your monitor is another limiting factor. If the frame rate exceeds 60, you probably can't see the extra frames, because most monitors are limited to a 60Hz refresh.
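The pediatric-imaging point above is really a sampling argument, and a little arithmetic makes it concrete. The heart rates below are illustrative round numbers, not clinical figures:

```python
def frames_per_beat(fps: float, heart_rate_bpm: float) -> float:
    """Number of video frames captured during one cardiac cycle."""
    beat_duration_s = 60.0 / heart_rate_bpm  # seconds per heartbeat
    return fps * beat_duration_s

# Adult at ~70 bpm, filmed at 30 fps:
adult = frames_per_beat(30, 70)        # ~25.7 frames per beat
# Child at ~140 bpm at the same 30 fps:
child_30 = frames_per_beat(30, 140)    # ~12.9 frames per beat: detail lost
# Doubling to 60 fps restores the sampling density:
child_60 = frames_per_beat(60, 140)    # ~25.7 frames per beat again
```

Doubling the capture rate for a heart beating twice as fast keeps the number of frames per cycle constant, which is exactly why the faster acquisition is used.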
Back to the original topic: Morrowind has sucked the life out of the game for me. I just can't get excited about playing like I did before the Morrowind update. I really, really want to enjoy this game. I'm just not enjoying it.
What do you expect from ZOS? The only thing keeping people in this game is the fact that it's a TES game.
ProfessorKittyhawk wrote: »
Yes, I know. And I understand why people want to distinguish the two, but what's the first word I see when I start the game? Like it or not, Bethesda is all over this game. I wonder how much praise they'd get for their version of Morrowind from the Bethesda that made the original Morrowind.