Benchmarking CPU/GPU Load

Final update

I upgraded my PC:

From:
i5-3570k 4C/4T @ 4.3 GHz
8 GB DDR3 @ 1600 MHz

To:
Ryzen 5 1600 6C/12T @ 3.875 GHz
16 GB DDR4 @ 2933 MHz

PoE performs much as it did in my initial video on the original hardware. FPS still drops into the 30s in worst-case scenarios, and CPU usage never really goes above 40%. Performance in PoE is therefore unchanged; however, this could differ slightly depending on what you run alongside PoE (listening to a Twitch stream, for example).

This suggests the issue is on GGG's side rather than in local hardware, which is disappointing.
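One caveat worth adding: the "never above 40%" CPU figure doesn't by itself rule out a client-side bottleneck, because Task Manager averages utilization across all logical cores. A quick sketch of that averaging (the per-core loads below are invented for illustration, not measured values):

```python
# Hypothetical illustration: overall CPU% can look low even when one
# thread (e.g. a game's main/render thread) is fully saturated.

def overall_cpu_percent(per_core_loads):
    """Average per-core utilization, as Task Manager reports it."""
    return sum(per_core_loads) / len(per_core_loads)

# Ryzen 5 1600: 12 logical threads. Suppose the main thread pegs one
# core at 100% while worker threads sit at modest loads.
loads = [100, 60, 45, 30, 30, 25, 20, 15, 10, 10, 5, 5]
print(round(overall_cpu_percent(loads), 1))  # ~29.6% overall, yet one core is maxed
```

On a 6C/12T chip, a single fully saturated thread contributes only about 8% to the overall figure, so a sub-40% reading is still compatible with one maxed-out game thread.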

Thanks.
I came to the same conclusion. Even though by modern standards my gaming station is absolute junk (4 GB DDR3 RAM, an NVIDIA 820M with 2 GB of memory), I remain convinced that PoE has some serious optimization issues. No matter how you look at it, a hack-and-slash Diablo-like game should never exceed the GPU requirements of a full-blown open-world RPG (that's only my opinion, though). Yet all the recent ones run fine on my system.


One possible explanation is that GGG prioritizes rapid development of graphical improvements to draw more casual, MTX-heavy players into the game; that seems like a good way to profit. Since they want players with money to spend on MTX, they can assume those players also have enough disposable income for a top-notch gaming rig, so their load testing targets that kind of hardware. Anything short of it is bound to suffer.

Sticking to the above, I can't really see myself spending days of research to set up some top-end gaming rig. I'm also a fan of laptops, which are at a significant disadvantage compared to regular desktops. I also don't like the idea of going through an upgrade cycle every year just to keep up with GGG dishing out 2% smoother particle effects at the cost of 30% more GPU load (I made those numbers up, to be honest, but I think that's pretty much what actually happens).

I think it would be good if GGG took the time to build really good settings customization that actually affected your FPS significantly. In the current state, the difference between potato settings (bare-minimum 800x600, no shadows, monochrome rectangular body parts) and max settings is only about 50% in frame time, which suggests the bottleneck lies in resource utilization and the optimization around it rather than in the graphics options themselves.

I'm totally fine running the game with mediocre graphics. After all, PoE is mostly about grinding, planning, testing, number crunching, and hacking and slashing, not about how pretty something looks. Most of the time there are too many flying bones, body parts, loot drops, and other colorful visual noise on screen to admire the pink glowing shade of your sword being reflected by water onto a nearby tree.

P.S.: Please, anyone talking about FPS drops caused by server-side load: stop. It doesn't work like that. At most you would see freezes (in lockstep mode) or latency spikes (in predictive mode); client FPS has nothing to do with server load, since all the processing that affects rendering happens on the client.
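To make the lockstep/predictive distinction concrete, here is a toy model. The mode names follow PoE's network options, but the numbers and logic are purely illustrative, not GGG's actual implementation:

```python
# Toy model of why server delay shows up differently in the two
# network modes. All timings here are assumptions for illustration.

def updates_per_second(mode, server_delay_ms, frame_ms=16.7):
    """Fresh state updates the player sees per second in each mode."""
    if mode == "lockstep":
        # The client may not advance past the last confirmed server tick,
        # so a slow server manifests as freezes between updates.
        return int(1000 // server_delay_ms)
    if mode == "predictive":
        # The client keeps rendering its predicted state every frame;
        # a slow server shows up as rubber-banding, not low FPS.
        return int(1000 // frame_ms)
    raise ValueError(f"unknown mode: {mode}")

print(updates_per_second("lockstep", 100))    # 10 fresh updates/s: feels like stutter
print(updates_per_second("predictive", 100))  # 59: full frame rate, stale data
```

Either way, the bottleneck shows up as stale or frozen game state, not as the GPU failing to draw frames.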
and I've only died 39 times. (c) Libbritania ( tokimeki ).
Last edited by ZL0J on Jun 24, 2017, 6:45:30 PM
"
ZL0J wrote:
I came to the same conclusion. Even though by modern standards my gaming station is absolute junk (4 GB DDR3 RAM, an NVIDIA 820M with 2 GB of memory), I remain convinced that PoE has some serious optimization issues. No matter how you look at it, a hack-and-slash Diablo-like game should never exceed the GPU requirements of a full-blown open-world RPG (that's only my opinion, though). Yet all the recent ones run fine on my system.

One possible explanation is that GGG prioritizes rapid development of graphical improvements to draw more casual, MTX-heavy players into the game; that seems like a good way to profit. Since they want players with money to spend on MTX, they can assume those players also have enough disposable income for a top-notch gaming rig, so their load testing targets that kind of hardware. Anything short of it is bound to suffer.

Sticking to the above, I can't really see myself spending days of research to set up some top-end gaming rig. I'm also a fan of laptops, which are at a significant disadvantage compared to regular desktops. I also don't like the idea of going through an upgrade cycle every year just to keep up with GGG dishing out 2% smoother particle effects at the cost of 30% more GPU load (I made those numbers up, to be honest, but I think that's pretty much what actually happens).

I think it would be good if GGG took the time to build really good settings customization that actually affected your FPS significantly. In the current state, the difference between potato settings (bare-minimum 800x600, no shadows, monochrome rectangular body parts) and max settings is only about 50% in frame time, which suggests the bottleneck lies in resource utilization and the optimization around it rather than in the graphics options themselves.

I'm totally fine running the game with mediocre graphics. After all, PoE is mostly about grinding, planning, testing, number crunching, and hacking and slashing, not about how pretty something looks. Most of the time there are too many flying bones, body parts, loot drops, and other colorful visual noise on screen to admire the pink glowing shade of your sword being reflected by water onto a nearby tree.

P.S.: Please, anyone talking about FPS drops caused by server-side load: stop. It doesn't work like that. At most you would see freezes (in lockstep mode) or latency spikes (in predictive mode); client FPS has nothing to do with server load, since all the processing that affects rendering happens on the client.


To be fair, they have incorporated multi-threading into their game engine, made some optimizations, and added DX11 support (beta). I think GGG will continue to improve in this area, but as you said, I am not sure what priority it has. I wouldn't mind dropping my settings if it actually made a difference, but it doesn't. It is disappointing to see major frame-rate dips while GPU and CPU utilization sit under 70% and 50% respectively. I also agree with you about server-side load: it would be unheard of for the server to limit frame output on the client. I would expect server-related lag, but the FPS counter should be unaffected.
"
Finkster06 wrote:
Final update

I upgraded my PC:

From:
i5-3570k 4C/4T @ 4.3 GHz
8 GB DDR3 @ 1600 MHz

To:
Ryzen 5 1600 6C/12T @ 3.875 GHz
16 GB DDR4 @ 2933 MHz

PoE performs much as it did in my initial video on the original hardware. FPS still drops into the 30s in worst-case scenarios, and CPU usage never really goes above 40%. Performance in PoE is therefore unchanged; however, this could differ slightly depending on what you run alongside PoE (listening to a Twitch stream, for example).

This suggests the issue is on GGG's side rather than in local hardware, which is disappointing.

Thanks.


From the front page :

"
Unlike when we usually record video, we recorded this on a normal computer in order to demonstrate the difference that a regular user might be able to expect. Here are the specs:
Intel i5-6500
GeForce 1050 Ti
8 GB of RAM


I guess we should buy those.
Spreading salt since 2006
"



Yes, I appreciated that too.
A GPU in the top five on the market counts as a "normal computer"? I guess they are testing on some stable NVIDIA prototypes scheduled for release in 2020?
and I've only died 39 times. (c) Libbritania ( tokimeki ).
I am pleased to see this news so soon after my post! They seem confident in their solution, and the machine they used for the video is fairly modest, so I will be interested to see how this goes. My benchmark character isn't on the beta, so I will have to use end-game party play as a benchmark in the meantime.
"
Finkster06 wrote:
"
Arrowneous wrote:
Spoiler
The conclusion I draw from your testing is proof of what we have all known for a long time: the client side is not where the performance slowdowns (and thus our frame-rate drops) occur. In high-density mob packs, or in areas with lots of cold/fire/shock ground effects, the slowdowns are all due to server calculation overload. GGG has known about this for over a year, and Chris even commented briefly on the problem last fall in a State of Exile podcast. Further proof is that just one necromancer can kill frame rates quickly if there are many dead that can be resurrected.

We have no way of knowing how many game sessions GGG assigns to each realm server (I'm assuming GGG uses servers with dual Intel Xeon 10+ core CPUs in some kind of blade configuration, but we'll never know what SoftLayer provides for hosting PoE), so there could easily be dozens of game sessions assigned to each CPU thread on a server. No surprise there, and we can infer it from observing more latency at the start of a new league than after three months, when there are physically fewer players on at any given time of day. More game sessions on a finite number of server cores means more sessions per core. When GGG forces more sessions per core than a server can handle, something has to give, and that something is a longer time for the server simulation to complete its next calculation iteration and send the results back to our PoE client to update the next frame on our screen.

So as long as you are above the minimum specs needed to run PoE, anything more just isn't needed; local CPU or GPU load won't increase much just because you have aggroed several mob packs together before killing them. However, the server simulation is now dealing with a much larger number of individual creatures whose damage must be calculated, and that takes more processing time. We must all remember that GGG intentionally does all this server-side to minimize cheating: if the client isn't performing these (and all other) important calculations, there is nothing to hack and cheat.

Chris and company know that the only solution, beyond a few design or coding changes (such as limiting the total number of minions we can have), is to add more game servers. He stated last fall that GGG will be doing that, but it is expensive and takes time, so we can't expect an overnight fix. I just hope GGG is ramping up realm-server capacity fast enough that when PoE 3.0.0 launches next month we don't get total realm overload, or end up waiting in a queue forever like D3 players did back in 2012. Hopefully GGG can avoid a disaster like that.

I would be surprised to hear that server-side operation is limiting my frame rate. I can understand slow server-side processing causing latency lag and longer delays between updates to NPC and player positions, states, effects, etc. I am not an industry expert, but it does not seem right that my hardware must wait for a server update before outputting the next frame.

It's easy for your frame rate to go down if the server simulation is taking too much time. I ran into a frame-rate-killing experience with a two-golem Elementalist build in the Crypt map that absolutely wrecked my frame rate, down to single digits. Here's my post on it:

Crypt map appears to be bugged?

The key information from my own bad experience was the frame-time graph. As frame time goes way up, PoE client frame rates go down. If you think about it, this makes sense if the client is almost always waiting for server information before it can do its processing and display the next frame. If the server takes 100 ms to do all its processing and send the data back to our PoE client, then we only get 10 fps. The magic number is 16.7 ms: I have vsync enabled (since I can only see 60 fps on my 60 Hz monitor, anything higher is wasted processing), and the PoE in-game stats show a 16.7 ms frame time at 60 fps (1000 ms / 60 ≈ 16.7 ms). So if the server-side processing takes 20 ms I get an effective 50 fps, a 50 ms frame time gives 20 fps, and so on.

So pay attention to the frame time when your frame rates plummet, and see if it is high. My 8 fps in Crypt came from a frame time averaging 128 ms according to my snapshot (1000 / 128 ≈ 7.8 fps). Why the two golems push frame times (server-side processing time, in this theory) over 100 ms is a mystery only GGG's engine coders can answer (though they will probably never admit to poor performance from inefficient code). From my limited programming experience (6502 and x86 assembly, Pascal, C++) back in my 20s, I know that writing tight, efficient code is a black art and difficult to do. Compound that with multi-threading (many code threads executing simultaneously on our CPU cores) and code optimization becomes a major headache.
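The frame-time arithmetic in the posts above is easy to verify. A minimal sketch, pure unit conversion with no game-specific assumptions:

```python
# Frame time (ms per frame) and FPS (frames per second) are reciprocals,
# scaled by 1000 ms per second.

def fps_from_frame_time(frame_time_ms):
    """Frames per second achievable at a given frame time."""
    return 1000.0 / frame_time_ms

def frame_time_from_fps(fps):
    """Per-frame time budget needed to hit a given frame rate."""
    return 1000.0 / fps

print(round(frame_time_from_fps(60), 1))   # 16.7 ms budget for 60 fps
print(round(fps_from_frame_time(128), 1))  # 7.8 fps, matching the Crypt snapshot
print(fps_from_frame_time(20))             # 50.0 fps at a 20 ms frame time
print(fps_from_frame_time(50))             # 20.0 fps at a 50 ms frame time
```

Whatever the cause of a slowdown, these two numbers always move together; the open question in the thread is only whether the extra milliseconds are spent on the client or waiting on the server.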
"You've got to grind, grind, grind at that grindstone..."
Necessity may be the mother of invention, but poor QoP in PoE is the father of frustration.

The perfect solution to fix Trade Chat:
www.pathofexile.com/forum/view-thread/2247070
