I honestly didn't expect a performance increase when my 1080 arrived.

Upsample to 4K or above and you should see a difference. I'm using 980 SLI and running 5120x3200, and the game looks beautiful at 100+ fps in town and when I'm not using skills lol
edit: just ran Dried Lake to see how far they go; they hit 200 fps, but when I use a skill it goes down to 40 fps or less
IGN: Detrium
Last edited by Kindred1#3951 on Jun 11, 2016, 5:35:38 PM
"
Kindred1 wrote:
Upsample to 4K or above and you should see a difference. I'm using 980 SLI and running 5120x3200, and the game looks beautiful at 100+ fps in town and when I'm not using skills lol
edit: just ran Dried Lake to see how far they go; they hit 200 fps, but when I use a skill it goes down to 40 fps or less

Now we're not comparing apples to apples. If you want to run at 4K resolution, then a high-end 1070 or 1080 is most likely needed to keep frame rates up. But then my lowly HD 6870 would do over 100 fps without mobs, and since I couldn't see the extra frames, I turned vsync on to cap video at 60 fps to match my 24" monitor's 60 Hz refresh rate and keep my GPU temperature lower. There isn't any reason (except bragging rights) for 200 fps when our brains can't process visual input anywhere near that fast. We routinely watch movies on the big screen at 30 fps and are fine with that.
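For what it's worth, a minimal sketch (hypothetical render_frame callback, nothing from PoE's actual engine) of what a 60 fps cap does in principle: render, then sleep off whatever is left of the ~16.7 ms frame budget instead of letting the GPU spin.

import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def run_capped(render_frame, seconds=5.0):
    """Call render_frame repeatedly, never faster than TARGET_FPS."""
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()                                   # draw one frame
        spare = FRAME_BUDGET - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)                            # idle instead of heating the GPU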

I will state again that most of us are probably thinking incorrectly about PoE QoP: like mine, our stuttering and erratic performance has more to do with server/client communication latency than it does with the GPU. To be sure, GGG has some serious FPS slowdowns due to old DirectX 9.0c engine coding, so updating to at least DX11 should help some. I don't know anyone who doesn't have a DX11-capable GPU, but DX12 would be too limiting on video hardware. Then again, GGG was forever saddled by the lack of funds to license a commercial game engine back in 2006, and by the engine companies asking licensing fees so high that only companies like Blizzard with deep pockets could afford them. So they rolled their own game engine for PoE (ancient history now) and have to muddle along the best they can using it.

So the OP didn't think upgrading to a 1080 GPU would do much to improve performance, and then posted that he was right. I don't need a computer science degree to understand the limitations of the home-grown engine behind an "old school" ARPG started in 2006 (the XP years), or that it can't hold a candle to any currently available commercial game engine. But then the trade-off is that PoE is "free to play".
"You've got to grind, grind, grind at that grindstone..."
Necessity may be the mother of invention, but poor QoP in PoE is the father of frustration.

The perfect solution to fix Trade Chat:
www.pathofexile.com/forum/view-thread/2247070
Why do you assume the GPU is the culprit?

The issues you're having could very well be a software problem on your end.
Burning Ground FPS: [screenshot]

Normal FPS (capped at monitor refresh rate, otherwise 400+): [screenshot]
It's time to get rid of the burning ground map mod; it doesn't do anything other than shit on FPS, unless that's the "danger" you intended it to add.
Last edited by txd#1856 on Jun 12, 2016, 12:58:02 AM
"
txd wrote:
I was right.

Random FPS spikes, ground effects still shitting on FPS (especially burning ground); no idea why it's still even in the game.

The 12 or so crashes I've gotten this morning are making me want to quit this game I love so much.

TL;DR: No difference in performance from a GTX 980 to a GTX 1080.


Even a supercomputer 10 years in the future will get shit on hard by ground effects. They cut FPS in half for no apparent reason, and GGG refuses to remove them even though they serve no purpose other than being an eyesore and an FPS eater.


Exploding a large pack with Herald of Ice will also dip your FPS, so the FPS graph looks like a really sadistic roller coaster.
IGN: Arlianth
Check out my LA build: 1782214
Last edited by Nephalim#2731 on Jun 12, 2016, 1:39:40 AM
"
Kindred1 wrote:
Upsample to 4K or above and you should see a difference. I'm using 980 SLI and running 5120x3200, and the game looks beautiful at 100+ fps in town and when I'm not using skills lol
edit: just ran Dried Lake to see how far they go; they hit 200 fps, but when I use a skill it goes down to 40 fps or less


How do you upsample with Nvidia's DSR? I've been on AMD cards for years and it was simple with VSR. Got a 1080 last night, and in GeForce Experience PoE has no option to upscale with DSR.


"
Bipolartuna wrote:
"
Kindred1 wrote:
Upsample to 4K or above and you should see a difference. I'm using 980 SLI and running 5120x3200, and the game looks beautiful at 100+ fps in town and when I'm not using skills lol
edit: just ran Dried Lake to see how far they go; they hit 200 fps, but when I use a skill it goes down to 40 fps or less


How do you upsample with Nvidia's DSR? I've been on AMD cards for years and it was simple with VSR. Got a 1080 last night, and in GeForce Experience PoE has no option to upscale with DSR.


AMD is currently dominating true DX12 games at its price point.
Though I am very tempted by the 1080, it's too expensive right now, so I am seriously considering the 1070.

I am most likely going to wait till the end of the month for the Radeon RX 480 and see some benchmarks; it's said to be only $199, and supposedly 2x 480 could wreck a 1080 for under $400. Two more weeks... and then we will see: it's either gonna be a 1070 or a 480 (and later crossfired, ofc).
The real hardcore PoE players and the elites sit in town, zoning in and out of their hideouts, trading items. Noobs that don't know how to play PoE correctly kill monsters for items. It's pure fact, and it will never change.

Welcome to PoE.
"
Pewzor wrote:
"
Bipolartuna wrote:
"
Kindred1 wrote:
Upsample to 4K or above and you should see a difference. I'm using 980 SLI and running 5120x3200, and the game looks beautiful at 100+ fps in town and when I'm not using skills lol
edit: just ran Dried Lake to see how far they go; they hit 200 fps, but when I use a skill it goes down to 40 fps or less


How do you upsample with Nvidia's DSR? I've been on AMD cards for years and it was simple with VSR. Got a 1080 last night, and in GeForce Experience PoE has no option to upscale with DSR.


AMD is currently dominating true DX12 games at its price point.
Though I am very tempted by the 1080, it's too expensive right now, so I am seriously considering the 1070.

I am most likely going to wait till the end of the month for the Radeon RX 480 and see some benchmarks; it's said to be only $199, and supposedly 2x 480 could wreck a 1080 for under $400. Two more weeks... and then we will see: it's either gonna be a 1070 or a 480 (and later crossfired, ofc).


My main reason to move away from AMD is that ever since they moved from CCC to Crimson, none of the few games that do support Crossfire have worked any more. So I ended up taking one card out to stop wasting power.



"
txd wrote:
"
Entropic_Fire wrote:
"
txd wrote:
I was right.

Random FPS spikes, ground effects still shitting on FPS (especially burning ground); no idea why it's still even in the game.

The 12 or so crashes I've gotten this morning are making me want to quit this game I love so much.

TL;DR: No difference in performance from a GTX 980 to a GTX 1080.


Why would you bother replacing a 980 at this point anyway, unless you just have more money than you know what to do with?

How many watts does your power supply have on the 12V rail, by the way?


Because 1080 > 980 when you're playing on an ultrawide 1440p 100 Hz monitor.

If you're still using ancient technology like a 1080p 60 Hz monitor, then a GTX 980 is just fine, and I'm sure you won't notice a significant FPS drop if you're capping your FPS at 60.

It's noticeable when it drops from 100 to 30-40.


entitled ameritard much?
"
There isn't any reason (except bragging rights) for 200 fps when our brains can't process visual input anywhere near that fast. We routinely watch movies on the big screen at 30 fps and are fine with that.


This part is absolutely false: the brain can and will perceive the difference in fluidity between 30 and 200 fps on a monitor adapted for that refresh rate. A 60 Hz monitor will not show the difference.

We watch 30 fps routinely, yes, but are we fine with it? Absolutely not. The brain sees the missing pictures in the refresh rate and compensates for them. Some TVs also have options to insert pictures between two frames with a kind of motion blur, creating a pseudo 45-60 fps, but it's not perfect.

Don't get me wrong, the optic nerve probably couldn't see the difference between 196 and 200 fps, or 92 and 100. But 30 to 200 is like night and day on a monitor supporting that refresh rate; you just couldn't tell, because you do not have one!


I have proof of this: The Hobbit movies were recorded in 48 fps and shown at 48 fps.
People noticed it; some disliked it because it's unusual, and some loved it:

"

On the technology side I thought the 48 frames worked as advertised: The images were clearer and sharper, the movement and action more fluid and engaging, and the 3D far smoother and rather less headache inducing. I understand a number of film critics (and some audiences) don’t like the higher frame rate because it looks less “film-like,” and the adjectives they use to describe the 48fps experience reflect that — I’ve seen it compared to television, video games, thrill rides and so forth.



ALSO, there is a big difference between movies and video games:

30 fps in movies is fluid motion, a constant rate that never changes; the brain can adapt and is not surprised by "frame spikes".

A video game rendered on a GPU is constantly "spiking" frames; only v-sync will resynchronize the frames with the refresh rate of the monitor.

Spiking frames? What does that even mean? It's simple: at 30 fps you can get 20 frames in the first 0.5 seconds and only 10 frames in the last 0.5 seconds. It's not constant.
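A minimal sketch of that, with made-up frame times: both one-second runs below report the same "30 fps" average, but the second one crams its frames together exactly like the example above.

# Hypothetical frame times: same 30 fps average, very different pacing.
even_pacing  = [1 / 30] * 30                      # 30 frames, ~33.3 ms each
spiky_pacing = [0.5 / 20] * 20 + [0.5 / 10] * 10  # 20 fast frames, then 10 slow ones

for name, frame_times in (("even", even_pacing), ("spiky", spiky_pacing)):
    avg_fps = len(frame_times) / sum(frame_times)
    worst_ms = max(frame_times) * 1000
    print(f"{name}: {avg_fps:.0f} fps average, worst frame {worst_ms:.0f} ms")

# even: 30 fps average, worst frame 33 ms
# spiky: 30 fps average, worst frame 50 ms   <- the stutter you actually feel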

Because of this behaviour, you can get tearing lines across the screen, "breaking" the picture shown on it.


G-Sync and AMD FreeSync were made to counter this problem by synchronizing the monitor's refresh rate with the rate of frames being sent to it. It's still not perfect, but it's already better.

Anyway, a game at 30 fps is far less smooth than a movie at 30 fps because of those "frame spikes", and at 60 fps we reduce this perceived effect by two, at 120 by four.

It's not comparable at all. And even a 120 fps movie on a 120 fps monitor/projector is a noticeable improvement ;)
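To put rough numbers (idealised, assumed timings) on the "by two / by four" point above, here is a tiny sketch of the gap a single late frame leaves at different frame rates:

# Rough, idealised arithmetic: one dropped/late frame roughly doubles the gap
# between two displayed images, and higher fps shrinks both numbers.
for fps in (30, 60, 120):
    frame_ms = 1000 / fps       # normal time between frames
    hitch_ms = 2 * frame_ms     # gap when one frame is missed
    print(f"{fps} fps: normal gap {frame_ms:.1f} ms, one-frame hitch {hitch_ms:.1f} ms")

# 30 fps: normal gap 33.3 ms, one-frame hitch 66.7 ms
# 60 fps: normal gap 16.7 ms, one-frame hitch 33.3 ms
# 120 fps: normal gap 8.3 ms, one-frame hitch 16.7 ms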
I will never be good but always I try to improve.
Last edited by Geisalt#1772 on Jun 15, 2016, 5:05:53 AM
