"
Aldonés wrote:
"
Rexeos wrote:
I'm not sure what you're trying to say - that 30 fps is OK, or what did you want to tell us here?
I’m saying if your screen refreshes 180 times a second, then it refreshes the exact same image 6 times before the servers calculate a new image. 30 fps and 180 fps provide the exact same experience on the PoE servers (technically 30.3 fps).
You forgot that the human eye can only see 24 fps! /s
|
Posted by Celestriad#0304 on Sep 10, 2020, 11:10:46 AM
|
"
Aldonés wrote:
I’m saying if your screen refreshes 180 times a second, then it refreshes the exact same image 6 times before the servers calculate a new image. 30 fps and 180 fps provide the exact same experience on the PoE servers (technically 30.3 fps). Most games have a server tick rate (the game’s fps) of 64-128 times per second.
wow
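For reference, here's the arithmetic behind the quoted claim, as a quick sketch (the 33 ms tick interval is an assumption inferred from the "30.3 fps" figure above; later replies dispute the conclusion, not the numbers):

```typescript
// If the server recalculates state every 33 ms, that's 1000 / 33 ≈ 30.3
// updates per second (the "technically 30.3 fps" figure quoted above).
const tickMs = 33;                    // assumed server tick interval
const ticksPerSecond = 1000 / tickMs; // ≈ 30.3

// A 180 Hz display then draws about 180 / 30.3 ≈ 6 frames per server
// tick, i.e. roughly 6 consecutive refreshes of the same *server* state.
const refreshHz = 180;
console.log(refreshHz / ticksPerSecond); // ≈ 5.94
```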
Last edited by Tyrion2#1155 on Sep 10, 2020, 11:23:44 AM
|
Posted by Tyrion2#1155 on Sep 10, 2020, 11:22:57 AM
|
"
Crimnor wrote:
"
Aldonés wrote:
"
Rexeos wrote:
I'm not sure what you're trying to say - that 30 fps is OK, or what did you want to tell us here?
I’m saying if your screen refreshes 180 times a second, then it refreshes the exact same image 6 times before the servers calculate a new image. 30 fps and 180 fps provide the exact same experience on the PoE servers (technically 30.3 fps).
You forgot that the human eye can only see 24 fps! /s
Not entirely correct. The base theory puts it between 24 and 48 fps, but that doesn't tell the entire story either. A lot more comes into play, which is why people can actually see the difference between lower fps and extremely high fps, even beyond what the eye can supposedly see. You have to get into refresh rates, and into the fact that the eyes perceive motion and frames differently in different parts of your vision. Peripheral vision is especially sensitive to motion, and at lower frame rates it can cause your central vision to perceive more flicker, lag, etc.
It's quite obvious if you play a game at 24 fps and change it to, say, 60+ fps. The human eye can absolutely tell the difference, and gamers who play a lot are even more sensitive to this. Another thing the human eye can detect is a change in fps: if it drops or rises by a big enough percentage during motion, it's very easy to tell.
In the end the fps theory is only a theory, and there are many other theories that counter it.
A common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools.
|
Posted by DS_Deadman#3546 on Sep 10, 2020, 1:28:08 PM
|
"
DS_Deadman wrote:
Not entirely correct. The base theory puts it between 24 and 48 fps, but that doesn't tell the entire story either. A lot more comes into play, which is why people can actually see the difference between lower fps and extremely high fps, even beyond what the eye can supposedly see. You have to get into refresh rates, and into the fact that the eyes perceive motion and frames differently in different parts of your vision. Peripheral vision is especially sensitive to motion, and at lower frame rates it can cause your central vision to perceive more flicker, lag, etc.
It's quite obvious if you play a game at 24 fps and change it to, say, 60+ fps. The human eye can absolutely tell the difference, and gamers who play a lot are even more sensitive to this. Another thing the human eye can detect is a change in fps: if it drops or rises by a big enough percentage during motion, it's very easy to tell.
In the end the fps theory is only a theory, and there are many other theories that counter it.
https://www.urbandictionary.com/define.php?term=%2FS
|
Posted by Celestriad#0304 on Sep 10, 2020, 1:38:16 PM
|
"
DS_Deadman wrote:
Not entirely correct. The base theory puts it between 24 and 48 fps, but that doesn't tell the entire story either. A lot more comes into play, which is why people can actually see the difference between lower fps and extremely high fps, even beyond what the eye can supposedly see. You have to get into refresh rates, and into the fact that the eyes perceive motion and frames differently in different parts of your vision. Peripheral vision is especially sensitive to motion, and at lower frame rates it can cause your central vision to perceive more flicker, lag, etc.
It's quite obvious if you play a game at 24 fps and change it to, say, 60+ fps. The human eye can absolutely tell the difference, and gamers who play a lot are even more sensitive to this. Another thing the human eye can detect is a change in fps: if it drops or rises by a big enough percentage during motion, it's very easy to tell.
In the end the fps theory is only a theory, and there are many other theories that counter it.
https://www.youtube.com/watch?v=FktsFcooIG8
|
Posted by Rexeos#3429 on Sep 10, 2020, 1:47:06 PM
|
"
Aldonés wrote:
"
Rexeos wrote:
I'm not sure what you're trying to say - that 30 fps is OK, or what did you want to tell us here?
I’m saying if your screen refreshes 180 times a second, then it refreshes the exact same image 6 times before the servers calculate a new image. 30 fps and 180 fps provide the exact same experience on the PoE servers (technically 30.3 fps). Most games have a server tick rate (the game’s fps) of 64-128 times per second.
You are, of course, wrong. Very wrong.
Can the server fuck up your performance? Yes. But does a 30-tick server limit your FPS to 30? No. Animations and rendering don't happen on the server; they're a cooperation between the client, your GPU, and your screen. The server tells the client that Sirus walks from A to B 30 times a second, but the drawing and animation of Sirus update (and play) much faster than that: the client tells your GPU what to work on, and the screen tries to show the result as fast as possible. So animations, the camera, and the overall flow will of course be much smoother than "30 FPS".
You can see this in a lot of games when the server crashes. A lot of animations are still being drawn smoothly, and you can still move your camera smoothly.
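For the curious, here's roughly what that client-side smoothing can look like. This is a minimal, hypothetical sketch (the snapshot shape, the ~30 Hz tick, and all names are invented for illustration; it is not PoE's actual client code): the authoritative position only changes about 30 times a second, but the renderer blends between the last two snapshots on every display frame.

```typescript
// Hypothetical client-side interpolation between server snapshots.
interface Snapshot {
  time: number; // client receive time in ms
  x: number;
  y: number;
}

let previous: Snapshot = { time: 0, x: 0, y: 0 };
let latest: Snapshot = { time: 33, x: 1, y: 0 };

// Called whenever a server update arrives (~30 times per second).
function onServerUpdate(snap: Snapshot): void {
  previous = latest;
  latest = snap;
}

// Called once per display frame (60, 144, 180... times per second).
// Blends between the two known server positions, so on-screen motion
// stays smooth even though the server state changes only ~30 times/s.
function renderPosition(nowMs: number): { x: number; y: number } {
  const span = latest.time - previous.time || 1; // avoid divide-by-zero
  const t = Math.min(Math.max((nowMs - previous.time) / span, 0), 1);
  return {
    x: previous.x + (latest.x - previous.x) * t,
    y: previous.y + (latest.y - previous.y) * t,
  };
}
```

This is also why, as described above, animations can keep playing smoothly for a moment after the server stops responding: the client keeps rendering from the data it already has.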
Bring me some coffee and I'll bring you a smile.
|
Posted by Phrazz#3529 on Sep 10, 2020, 2:02:08 PM
|
I doubt even a 3080 will let you play PoE at 100 fps in super juiced T16 maps.
IGN: Arlianth
Check out my LA build: 1782214
|
Posted by Nephalim#2731 on Sep 10, 2020, 2:07:25 PM
|
"
Nephalim wrote:
I doubt even a 3080 will let you play PoE at 100 fps in super juiced T16 maps.
Then is a newer/faster CPU the way to go? Or is it just PoE that's unoptimized?
|
Posted by DiGG#5743 on Sep 10, 2020, 6:25:57 PM
|
I play on a Surface Pro with Intel Iris Plus.
|
|
"
Aldonés wrote:
"
Rexeos wrote:
I'm not sure what you're trying to say - that 30 fps is OK, or what did you want to tell us here?
I’m saying if your screen refreshes 180 times a second, then it refreshes the exact same image 6 times before the servers calculate a new image. 30 fps and 180 fps provide the exact same experience on the PoE servers (technically 30.3 fps). Most games have a server tick rate (the game’s fps) of 64-128 times per second.
The devs built the servers from scratch when creating PoE rather than spending money they didn’t have getting a professionally designed one. I’m no engineer and haven’t the faintest clue what the intricacies are, only that budget was the reason for their initial server choice and they cannot simply replace the server because it would involve recreating the game in some fashion. Personally I think half the reason they are pushing templates in the new design modules is so they can provide new servers in patch 5.0 (PoE 3). But again, I haven’t a clue what is what when it comes to design, lol.
Ever heard of frame time? Try locking your game to 30 fps, then back to the highest frame rate you can lock it to, and tell me you don't feel an immense difference in smoothness.
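To put numbers on that (a quick illustrative calculation, not something from the thread): frame time is just the reciprocal of the frame rate, and the per-frame gap is biggest at the low end.

```typescript
// Frame time: how long each frame stays on screen, in milliseconds.
function frameTimeMs(fps: number): number {
  return 1000 / fps;
}

console.log(frameTimeMs(30));  // ~33.3 ms per frame
console.log(frameTimeMs(60));  // ~16.7 ms per frame
console.log(frameTimeMs(144)); // ~6.9 ms per frame
// Going from 30 fps to 144 fps shaves ~26 ms off every frame,
// which is the smoothness difference you can feel.
```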
|
Posted by CAPSLOCK_ON#7907 on Sep 10, 2020, 7:11:36 PM (On Probation)
|