Overall Game Performance state - and why is GGG in silent mode about it?

"
"
Frostball73 wrote:
"
Because GPUs are literally many times faster than CPUs at handling mass calculations?

That statement is highly inaccurate.


Says the guy writing:

"
Frostball73 wrote:
In a GPU any calculations must be performed between frame rates.

Man. I was programming GPUs to solve fluid simulations for aerospace since the days of PhysX, while you were playing WoW. I am a co-author of some of those GPU Gems articles from Nvidia.
Last edited by Frostball73 on Apr 10, 2024, 8:14:26 PM
"
Frostball73 wrote:
Man. I was programming GPUs to solve fluid simulations for aerospace since the days of PhysX, while you were playing WoW. I am a co-author of some of those GPU Gems articles from Nvidia.


A NASA (?) programmer who doesn't know how to use simple forum tags? Misquoting me in the process and making it a mess to read.

A co-author for Nvidia developer blogs who can't articulate himself properly, let alone use the correct technical terms? "between frame rates"?

I find that very hard to believe.
Original creator of the "Poor Man's Ward Loop" build: https://www.pathofexile.com/forum/view-thread/3480922
Windows 11 Enterprise 64-bit, i7-13700K 5.30GHz
PNY RTX 4080 16GB GDDR6X, 32GB DDR5-6000 CL36
Samsung 980 Pro, Seasonic Prime GX 850W Gold
What is the point of developing your own renderer if not to get astonishing performance, taking advantage of the fact that it is an isometric game that is mostly dungeons, zombies and fireworks, and could even run smoothly on a mobile phone?
For a game that requires an Nvidia 4090 I would rather use Unreal or Godot.
I want to see 100 zombies burning and blowing up, not fancy light reflections.
Last edited by B00b on Apr 10, 2024, 8:31:14 PM
"
cursorTarget wrote:
"
Phrazz wrote:

This is just stupid.

No one is talking about "max settings".

I have one of those 4070s. A modern, mid-to-high range card that runs 'all' titles out there on high/ultra @1440p (and even some at 4K).

In PoE, I have most settings on low, except my resolution, which is 3440x1440. The game brings me down to 20 fps every time I meet a Harbinger. Hell, it activates dynamic resolution in freakin' town. If you think that is OK in any way, shape or form on a modern, mid-to-high range GPU and a solid CPU, you are just... Wrong.


This is not stupid; this is the reality we live in. I don't know why you call a 4070 "modern", it's an almost two-year-old card. As you said, you're trying to run the game in 4K on a 4070. Duuuuude... REALLY? On a 4070??? You can't be serious. I bet you don't even have a modern CPU like a 13700K / 7800X3D, ideally a 14900KS / 7950X3D. 2K is acceptable, but not 4K.

"
Phrazz wrote:

mid-to-high range GPU

It's the budget version of a gaming GPU. Mid-to-high is the 4080 (Super). For PoE you need at least a 4090, a good CPU and DDR5-8000+. Or don't play in 4K at all, because the 4070 is just not good for that purpose.

"
Phrazz wrote:

You are indirectly saying that players SHOULD have a better card than a 4070 to get over 40 FPS, you know that, right?

Yes, exactly. If you play video games on PC, you're supposed to buy a GAMING GPU. That means only the xx90 series. Not xx80, not xx70, or even office trash like the xx60. Strictly xx90. At this moment that is the 4090. Everything below a 4090 is considered a compromise between your pocket and your wishes. For some reason you decided to buy a weaker card and expect good performance from a budget card in 4K. No way.



So.
Why doesn't GGG update their requirements?
How can you explain them being so shameless?
Either it is incompetence in development, failing to keep what they deliver in line with what they say will be enough, since the game keeps getting worse where lag is concerned (as most people are saying). Or it is simply shamelessness: they don't update what is required from the PC, because doing so would put them to shame or discredit them as a company, even while bringing in new players. An attempt at deceit.
Those are the only explanations for straight-up lying about what PoE asks of your PC.
Last edited by fenixsemcinzas on Apr 10, 2024, 8:40:57 PM
"
B00b wrote:
What is the point of developing your own renderer if not to get astonishing performance, taking advantage of the fact that it is an isometric game that is mostly dungeons, zombies and fireworks, and could even run smoothly on a mobile phone?
For a game that requires an Nvidia 4090 I would rather use Unreal or Godot.
I want to see 100 zombies burning and blowing up, not fancy light reflections.


It doesn't require a 4090. What is this argument even? Read my flair, check out my build and specs, then go watch a video. If I can play that in 4K at max settings and simultaneously record in 4K with the Slow High Quality 150 Mbps preset, how on earth would your argument ever make any sense, let alone have any merit at all?

They also made their own engine, and iirc they started working on the game in 2010. Unreal Engine 4 was released in 2014, and this game probably couldn't have been made in Unreal Engine 3. And you cannot convert UE3 to UE4, unlike UE4 to UE5.

You are always welcome to start making your own ARPG. Unreal Engine is free, afaik.
Original creator of the "Poor Man's Ward Loop" build: https://www.pathofexile.com/forum/view-thread/3480922
Windows 11 Enterprise 64-bit, i7-13700K 5.30GHz
PNY RTX 4080 16GB GDDR6X, 32GB DDR5-6000 CL36
Samsung 980 Pro, Seasonic Prime GX 850W Gold
"
"
Frostball73 wrote:
Man. I was programming GPUs to solve fluid simulations for aerospace since the days of PhysX, while you were playing WoW. I am a co-author of some of those GPU Gems articles from Nvidia.


A NASA (?) programmer who doesn't know how to use simple forum tags? Misquoting me in the process and making it a mess to read.

A co-author for Nvidia developer blogs who can't articulate himself properly, let alone use the correct technical terms? "between frame rates"?

I find that very hard to believe.


I can tell you that I see people solving soft-body deformations on the GPU, which basically means solving a Laplacian field with the Jacobi algorithm, and getting zero performance out of it, because all you compute is a very simple addition while doing many random memory accesses to VRAM. The only advantage of not calculating on the CPU is avoiding sending the mesh data over the bus to the GPU, and yet people send the mesh back to RAM before using the deformation anyway. Why are you doing all the calculations on the GPU just to send them back, fool? People hear that GPUs are the best for calculations, because that is Nvidia marketing, and miss even the most basic concepts.
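To make that concrete, here is a minimal sketch of the kind of kernel being described: one Jacobi relaxation step for a Laplace field on a regular 2D grid (illustrative only; a real soft-body solver works on a mesh rather than a regular grid, so treat the names and layout as placeholders). Each thread does a few additions and one multiply; the rest of the cost is memory traffic, and the point is that both buffers should stay in VRAM between iterations instead of being copied back to system RAM every step.

#include <cuda_runtime.h>

// One Jacobi step for the Laplace equation on an n x n grid:
// each interior cell becomes the average of its four neighbours.
__global__ void jacobi_step(const float* in, float* out, int n)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x > 0 && x < n - 1 && y > 0 && y < n - 1) {
        out[y * n + x] = 0.25f * (in[y * n + x - 1] + in[y * n + x + 1] +
                                  in[(y - 1) * n + x] + in[(y + 1) * n + x]);
    }
}

// Host side: both buffers stay on the device; only the pointers are swapped
// between iterations. The mistake described above is copying the field back
// to host RAM every step instead of once at the very end.
void relax(float* d_a, float* d_b, int n, int iters)
{
    dim3 block(16, 16);
    dim3 grid((n + block.x - 1) / block.x, (n + block.y - 1) / block.y);
    for (int i = 0; i < iters; ++i) {
        jacobi_step<<<grid, block>>>(d_a, d_b, n);
        float* tmp = d_a; d_a = d_b; d_b = tmp;  // ping-pong in VRAM
    }
    cudaDeviceSynchronize();
}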
"
"
Frostball73 wrote:
Man. I was programming GPUs to solve fluid simulations for aerospace since the days of PhysX, while you were playing WoW. I am a co-author of some of those GPU Gems articles from Nvidia.


A NASA (?) programmer who doesn't know how to use simple forum tags? Misquoting me in the process and making it a mess to read.

A co-author for Nvidia developer blogs who can't articulate himself properly, let alone use the correct technical terms? "between frame rates"?

I find that very hard to believe.


Honestly, when you try to separate a phrase from the quotation, these tags become a mess.
And actually, I think that not caring about editing it in a forum about a game actually gives the dude credit. Anyone who does that doesn't really value their own time. Neither does anyone who spends too much time on this forum. And if you don't care about your free time and cherish it, it is very unlikely that you will go far or achieve things in life.
"
Frostball73 wrote:
People hear that GPUs are the best for calculations, because that is Nvidia marketing, and miss even the most basic concepts.


Oh, yeah. Nvidia marketing. That's why CUDA is vastly superior for video rendering and Nvidia's market value now equals China's entire stock market. Purely marketing.

No one is saying GPUs are the best at all calculations. There are different types of workloads. However, they are vastly superior at parallel computations, with 100-250x better performance.
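To give an idea of what "parallel computations" means here, a minimal illustrative sketch (placeholder names, not taken from any engine): the same multiply-add over an array, first as a serial CPU loop, then as a CUDA kernel where each element gets its own thread and the GPU runs tens of thousands of them concurrently.

#include <cuda_runtime.h>

// CPU version: one core walks the array element by element.
void saxpy_cpu(int n, float a, const float* x, float* y)
{
    for (int i = 0; i < n; ++i)
        y[i] = a * x[i] + y[i];
}

// GPU version: the loop disappears; thread i handles element i,
// and the hardware schedules huge numbers of threads in parallel.
__global__ void saxpy_gpu(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

// Launch for one million elements, 256 threads per block:
// saxpy_gpu<<<(1000000 + 255) / 256, 256>>>(1000000, 2.0f, d_x, d_y);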
Original creator of the "Poor Man's Ward Loop" build: https://www.pathofexile.com/forum/view-thread/3480922
Windows 11 Enterprise 64-bit, i7-13700K 5.30GHz
PNY RTX 4080 16GB GDDR6X, 32GB DDR5-6000 CL36
Samsung 980 Pro, Seasonic Prime GX 850W Gold
"
"
Frostball73 wrote:
People hear that GPUs are the best for calculations, because that is Nvidia marketing, and miss even the most basic concepts.


Oh, yeah. Nvidia marketing. That's why CUDA is vastly superior for video rendering and Nvidia's market value now equals China's entire stock market. Purely marketing.

No one is saying GPUs are the best at all calculations. There are different types of workloads. However, they are vastly superior at parallel computations, with 100-250x better performance.


Nvidia makes its money from bitcoin farming.
"
Honestly, when you try to separate a phrase from the quotation, these tags become a mess.


It's text. It requires reading. Separating [quote] and [/quote] isn't hard, especially not if you claim to be a programmer who works with this stuff. Probably everyone who learned programming came into contact with HTML at some point, which uses similar tags. CTRL+C and CTRL+V are your friends.
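For reference, splitting a quote is just a matter of closing one tag and opening another; a minimal sketch, assuming the usual [quote="Name"]...[/quote] markup this forum uses (the quoted sentences are placeholders):

[quote="Frostball73"]first sentence you want to answer[/quote]
Your reply to the first part.

[quote="Frostball73"]second sentence you want to answer[/quote]
Your reply to the second part.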

It's also quite rude and straight up disrespectful to NOT take the time to fix it, because no one's gonna dig through pages to get the actual correct quotation.

"
And actually, I think that not caring about editing it in a forum about a game actually gives the dude credit. Anyone who does that doesn't really value their own time.


Gives them credit for what? For being a slacker?

Let's see how you like it:

"

Anyone who does that doesn't really value their own time. Neither does anyone who spends too much time on this forum. Why are you doing all the calculations on the GPU just to send them back, fool? Why doesn't GGG update their requirements? How can you explain them being so shameless? I see the arguments about PC hardware and stuff, don't care.
Original creator of the "Poor Man's Ward Loop" build: https://www.pathofexile.com/forum/view-thread/3480922
Windows 11 Enterprise 64-bit, i7-13700K 5.30GHz
PNY RTX 4080 16GB GDDR6X, 32GB DDR5-6000 CL36
Samsung 980 Pro, Seasonic Prime GX 850W Gold
