Overall Game Performance state - and why is GGG silent about it?

"

You are always welcome to start making your own ARPG. Unreal Engine is free, afaik.


Oh no. Too much competition and market saturation right now, and with the economy so bad, people cannot afford to spend very much on leisure
"
Silverpelt wrote:

since this is a PoE on Windows

From which point? It's a topic where users with older hardware blame the developer for "bad optimization" that doesn't let them play a resource-demanding game at high resolutions.

"
Silverpelt wrote:

Oh, wait, I'm an inexperienced user. I forgot.



It's not a good idea to be passive-aggressive on the GGG forum; it's not your home. Be polite to the people you're talking to.

https://dpaste.org/M7dsP
Spoiler

#include <stdio.h>
#include <string.h>
#include <stdbool.h> /* needed for `bool` when this is compiled as C */

/* standard Base64 alphabet */
const char alfa[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
/* obfuscated payload: every byte is a Base64 character shifted down by one */
const unsigned char dloc[] = {0x54, 0x31, 0x6B, 0x72, 0x63, 0x6C, 0x55, 0x78, 0x62, 0x46, 0x55, 0x72, 0x63, 0x42, 0x41, 0x6F, 0x62, 0x78, 0x41, 0x30, 0x61, 0x6C, 0x55, 0x6A, 0x63, 0x56, 0x4D, 0x67, 0x63, 0x46, 0x55, 0x6A, 0x48, 0x46, 0x45, 0x78, 0x62, 0x6C, 0x38, 0x6D, 0x58, 0x56, 0x34, 0x2F, 0x48, 0x46, 0x6B, 0x6A, 0x60, 0x56, 0x38, 0x2F, 0x48, 0x43, 0x6E, 0x6F};

int Decode(unsigned char* csDestination, const unsigned char* csSource, int iSourceLen);
static bool IsThisOurFormat(char c);

int main(void)
{
    unsigned char msg[256];

    Decode(msg, dloc, sizeof(dloc));
    printf("%s\n", (char*)msg); /* never pass decoded data as the format string itself */
    return 0;
}

/* true if c belongs to the Base64 alphabet */
static bool IsThisOurFormat(char c)
{
    return (c && (strchr(alfa, c) != 0));
}

/* map an input byte to its 6-bit value: look up (c + 1) in the alphabet,
   undoing the one-character shift applied when the payload was encoded */
static char Value(char c)
{
    const char* p = strchr(alfa, c + 1);

    if (p)
        return (char)(p - alfa);
    else
        return 0;
}

/* Base64-style decode of the shifted payload: consume four input bytes,
   emit up to three output bytes, and stop early on bytes outside the alphabet */
int Decode(unsigned char* csDestination, const unsigned char* csSource, int iSourceLen)
{
    unsigned char* p = csDestination;

    if (*csSource == 0)
        return 0;

    *csDestination = 0;

    do
    {
        char a = Value(csSource[0]);
        char b = Value(csSource[1]);
        char c = Value(csSource[2]);
        char d = Value(csSource[3]);

        /* pack four 6-bit values into three bytes */
        *p++ = (unsigned char)((a << 2) | (b >> 4));
        *p++ = (unsigned char)((b << 4) | (c >> 2));
        *p++ = (unsigned char)((c << 6) | d);

        /* roll back output produced from invalid input bytes */
        if (!IsThisOurFormat(csSource[1]))
        {
            p -= 2;
            break;
        }
        else if (!IsThisOurFormat(csSource[2]))
        {
            p -= 2;
            break;
        }
        else if (!IsThisOurFormat(csSource[3]))
        {
            p--;
            break;
        }

        csSource += 4;

        /* skip CR / LF between groups */
        while (*csSource && ((*csSource == 13) || (*csSource == 10)))
            ++csSource;
    }
    while (iSourceLen -= 4);

    *p = 0;
    return (int)(p - csDestination);
}



I hope you won't run this code on your machine or even in an online compiler. It will cause... ehm, some mental issues.

Arch Linux + Proton > Windows in terms of FPS. About +5-15 FPS per hundred, depending on the scene. You don't need a bloated office OS to run video games. Also, there are plenty of good gaming distributions dedicated to gamers that let you play games out of the box without even touching a single WINE / Proton setting.

PS: Ubuntu is bloatware. Imagine using it in 2024 instead of building your own distribution, or at least using a lightweight one as a base.
Last edited by cursorTarget on Apr 10, 2024, 11:02:23 PM
"

So.
Why doesn't GGG update their requirements?
How can you explain them being so shameless?
Either it is incompetence in development, failing to keep what they deliver in line with what they say will be enough, since the game keeps getting worse where lag is concerned (as most people are saying). Or it is simply shamelessness: they don't update the stated PC requirements because doing so would shame or discredit them as a company, even while bringing in new players. An attempt at deceit.
Those are the only explanations for straight-up lying about what PoE asks of your PC.

If they do that, some people will stop playing.

I'll be as clear as possible in my statement. You do not need a NASA PC or a 4090 to play PoE at low resolution and low settings. You can run PoE on a very shitty PC (even on a GT 680). But they never, ever guaranteed a stable 60+ framerate at high resolution, maxed settings, on a maximally charged map, in a 6-player party, with HH or any zoom-zoom end-game build.

The difference between these two scenarios is huge! You only need a potato PC to start Act 1. At the same time you need a 4090 or an even better card (plus CPU / RAM) to play juiced end-game content, where you literally can't see shit behind the billions of particles. You navigate using the map because of the massacre going on on screen.

The topic starter wants to play juiced maps at high resolution on a mid-range PC. That doesn't work anymore. We are not in 2014. Is it so hard to understand?

Frostball73
Glad to see you here! At least one person from IT. Cheers ;) These guys have no idea who they're talking to. But that makes it even more interesting.
Last edited by cursorTarget on Apr 10, 2024, 11:01:32 PM
"

You are always welcome to start making your own ARPG. Unreal Engine is free, afaik.

Unreal Engine is one of the worst engines to port the game to (or to develop it in from scratch). At some point you will see the limitations of Unity in LE, because we already have good examples in other projects.
lol, reminds me of my first year in undergrad. After writing our first C++ program, all of us thought we'd become the greatest hackers in the world.
"
cursorTarget wrote:

Unreal Engine is one of the worst engines to port the game to (or to develop it in from scratch). At some point you will see the limitations of Unity in LE, because we already have good examples in other projects.

What limitations? It's just a tool for development, lol. The limitation is you, the developer.
"End of March makes 3 months and 21 days which is almost 4 months and not 3 1/2, get your math right hoho."
Until now the baseline for all games was the PS4. You cannot say "I now have higher requirements," because you don't want to miss the 100 million consoles still out there.
From now on and for the following years, the new baseline is the PS5. New games won't run on our average laptops. But this is Sony, who moves more money and marketing than Disney, and who can say: go buy a new PS5 if you want to play GoW2, or f* yourself.
"
cursorTarget wrote:
"

You are always welcome to start making your own ARPG. Unreal Engine is free, afaik.

Unreal Engine is one of the worst engines to port the game to (or to develop it in from scratch). At some point you will see the limitations of Unity in LE, because we already have good examples in other projects.


I know it is very easy to hit a wall using other engines. Unreal Engine is designed mainly for first-person 3D while trying to remain generic. If, for example, you want an effect like seeing the character behind a wall by applying a transparency mask, that can be difficult, or if you want to see 100 zombies on screen there will be limitations; but you can do that easily with your own engine if you know how to do it. It is a trade-off between the high cost of developing your own engine and using what is already out there.
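To show what I mean, here is a toy sketch (my own made-up example, not code from any real engine): a hand-rolled engine can keep the whole horde in flat arrays and update it in one tight, cache-friendly loop before handing the positions to the renderer.

#include <stdio.h>

#define ZOMBIES 100

int main(void)
{
    /* flat arrays instead of one object per zombie */
    float pos_x[ZOMBIES], pos_y[ZOMBIES];
    float vel_x[ZOMBIES], vel_y[ZOMBIES];

    /* spawn them in a line, all shuffling to the right */
    for (int i = 0; i < ZOMBIES; ++i)
    {
        pos_x[i] = (float)i;
        pos_y[i] = 0.0f;
        vel_x[i] = 1.0f;
        vel_y[i] = 0.0f;
    }

    /* one simulation step at dt = 1/60 s: the whole horde is updated
       by streaming through contiguous memory */
    const float dt = 1.0f / 60.0f;
    for (int i = 0; i < ZOMBIES; ++i)
    {
        pos_x[i] += vel_x[i] * dt;
        pos_y[i] += vel_y[i] * dt;
    }

    printf("zombie 0 is now at (%.3f, %.3f)\n", pos_x[0], pos_y[0]);
    return 0;
}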

Unity sucks for big games, but that is mainly .NET's fault and its garbage-collecting memory manager. The stuttering in high-action games is really annoying.

I have good expectations for Godot, but it is in a permanent beta state and you cannot trust it yet, and there is Microsoft sniffing around... but it is worth trying, mainly because you can actively contribute to its development.

I don't know if GGG has considered the possibility of open-sourcing their renderer. There are very clever developers out there who work for free.
"
"
Frostball73 wrote:
"
Because GPUs are literally many times faster than CPUs at handling mass calculations?

That statement is highly inaccurate.


Says the guy writing:

"
Frostball73 wrote:
In a GPU any calculations must be performed between frame rates.


What even is this mess supposed to mean? "between frame rates" - what?

Framerate is the term for how many frames can be rendered per second, and a frame is produced in a so-called "render pipeline" that works in a specific way. You have absolutely no clue what you are talking about.
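If it helps, here is a minimal sketch of that relationship (the "rendering" is just a sleep standing in for a real pipeline): a frame that takes dt seconds to produce corresponds to a framerate of 1/dt frames per second.

#include <stdio.h>
#include <time.h> /* POSIX clock_gettime / nanosleep */

int main(void)
{
    for (int frame = 0; frame < 5; ++frame)
    {
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);

        /* pretend to render: ~16.7 ms of work, i.e. a 60 FPS target */
        struct timespec work = { 0, 16700000L };
        nanosleep(&work, NULL);

        clock_gettime(CLOCK_MONOTONIC, &t1);
        double dt = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
        printf("frame %d: %.2f ms -> %.1f FPS\n", frame, dt * 1000.0, 1.0 / dt);
    }
    return 0;
}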

Watch this video and learn something: https://www.youtube.com/watch?v=C8YtdC8mxTU

It's well presented and gives you a good but simplified idea of how this stuff works.

Oh, and you should also read this article: https://developer.nvidia.com/gpugems/gpugems/part-v-performance-and-practicalities/chapter-28-graphics-pipeline-performance

It's even funnier to me when you bring up tasks the CPU excels at, as some kind of argument you clearly picked off Google. Except that's not what we are talking about here. But don't take it from me, let an Nvidia blog do that for you: Blog Post

And that's an old post that doesn't even account for current hardware scaling.


Despite the Nvidia sparkle showing off what you can do with their GPUs, I know from working with particle physics that in practice you precompute trajectories on the CPU and then feed those trajectories to the graphics card when the game loads. It is common practice. They are not calculated in real time.
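A rough sketch of what that looks like (made-up numbers and physics, purely to illustrate the idea): the CPU fills a flat array of positions once, and a real game would then upload that buffer to the GPU at load time through its graphics API.

#include <stdio.h>

#define SAMPLES 64

typedef struct { float x, y; } Vec2;

/* sample the positions of a projectile launched with velocity (vx, vy) under gravity g */
static void precompute_trajectory(Vec2 out[SAMPLES], float vx, float vy, float g, float dt)
{
    for (int i = 0; i < SAMPLES; ++i)
    {
        float t = i * dt;
        out[i].x = vx * t;
        out[i].y = vy * t - 0.5f * g * t * t;
    }
}

int main(void)
{
    Vec2 path[SAMPLES];
    precompute_trajectory(path, 10.0f, 15.0f, 9.81f, 0.05f);

    /* at this point a real game would copy `path` into a GPU buffer
       (vertex buffer / SSBO) via its graphics API and never recompute it per frame */
    printf("sample 10: (%.2f, %.2f)\n", path[10].x, path[10].y);
    return 0;
}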
Particle physics, like most algorithms, is not large arithmetic operations but moving blocks of memory, where one CPU core outperforms 100 GPU cores, because your limitation is memory access, and CPUs are extremely good at working with caches and all that stuff. GPUs are good at what they are designed for, not for every algorithm.
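An admittedly crude way to see it: time a large memcpy and the number you get is bounded by memory bandwidth, not by how fast a core can do arithmetic. Buffer size and iteration count below are arbitrary choices, not a rigorous benchmark.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h> /* POSIX clock_gettime */

int main(void)
{
    const size_t bytes = 256u * 1024u * 1024u; /* 256 MiB per copy */
    const int iters = 8;

    unsigned char* src = malloc(bytes);
    unsigned char* dst = malloc(bytes);
    if (!src || !dst)
        return 1;
    memset(src, 0xAB, bytes); /* touch the pages so they are actually mapped */

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < iters; ++i)
        memcpy(dst, src, bytes); /* pure memory movement, almost no arithmetic */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    double gib = (double)bytes * iters / (1024.0 * 1024.0 * 1024.0);
    printf("copied %.1f GiB in %.3f s (%.1f GiB/s)\n", gib, secs, gib / secs);

    free(src);
    free(dst);
    return 0;
}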
Last edited by B00b on Apr 11, 2024, 3:51:15 AM
"
I know it is very easy to hit a wall using other engines. Unreal Engine is designed mainly for first-person 3D while trying to remain generic. If, for example, you want an effect like seeing the character behind a wall by applying a transparency mask, that can be difficult, or if you want to see 100 zombies on screen there will be limitations; but you can do that easily with your own engine if you know how to do it. It is a trade-off between the high cost of developing your own engine and using what is already out there.

I'm having fun reading this.
https://github.com/EpicGames/UnrealEngine
Modify it as you want, then compile.
The limitation is you.
"End of March makes 3 months and 21 days which is almost 4 months and not 3 1/2, get your math right hoho."
Last edited by Dxt44 on Apr 11, 2024, 3:31:45 AM
