POE WILL BE THE NEW VANGUARD IF THEY DON'T CHANGE SHIT UP

How much is actually desynch and how much is packet loss and/or network latency?

I know, for a fact, that most of the times I have experienced what many in these forums term "desynch", network latency and/or packet loss was at the root of my issue.

Unless your computer is the only device on the network, you cannot guarantee unbroken data transmission.

Nobody likes to see the "RIP" panel. It is easier to blame something over which we have no control than to simply accept that something happened which caused us to respawn.

But, I suppose I should rage against GGG and demand that they upgrade my home network infrastructure with more robust tech, since they have to be responsible, right?
Desync is really there so the Diablo 3 team can think they did at least one thing right.

Good on GGG!
Path of PEW PEW PEW PEW PEW PEW
"
Sinnesteuer wrote:
How much is actually desynch and how much is packet loss and/or network latency?

I know, for a fact, that most of the times I have experienced what many in these forums term "desynch", network latency and/or packet loss was at the root of my issue.

Unless your computer is the only device on the network, you cannot guarantee unbroken data transmission.

Nobody likes to see the "RIP" panel. It is easier to blame something over which we have no control than to simply accept that something happened which caused us to respawn.

But, I suppose I should rage against GGG and demand that they upgrade my home network infrastructure with more robust tech, since they have to be responsible, right?


lol, whiteknighting again

watch how streamers desync. they have incredibly robust and tested infra (to prevent some scumbags ddos'ing them) and they desync like crazy

you can desync 'on demand' with cyclone or whirling blades. i recently added glacial hammer to the list.

i have 50-60 ping to eu servers and what happens when glacial hammer'ing is so atrocious that i simply cannot stand playing that char anymore.

you can use 'smart' words like packets etc but anyone with more than a month of practical software experience knows it is just deception and a lame way of obscuring the problem

"
morbo wrote:
"
ScrotieMcB wrote:
More difficult compared to what? Do you really think monster AI is a computationally intensive process? Well, it's not.


More difficult compared to predicting player location / path. Simply because there are a lot more mobs than players, they use skills that are equivalent to players' (all the juicy desyncing ones), and they usually move faster than players.

The AI has to properly simulate 10 - 20 mobs on average that are all rushing at you, trying to fit through narrow passages and avoid inter-mob collision. These computations become exponentially more difficult (lengthy) the more entities you have in the scene (because their paths depend on each other), the more complex the scene layout is and the faster the entities' actions are. Multiply this by global or local sources of haste or similar effects that add complexity (bringer of bones, lol).

Yes, mob AI is a computationally really intensive process (probably the most intensive task that is solely the job of the CPU), so much so that rendering graphics and AI are done in separate threads, to take advantage of hardware multi-threading where it's available.


there is NOTHING difficult about predicting 20 or even 20000 mobs. this game - with the current architecture - would not desync at all if it wasn't for player input. because both client and server have the same info about start locations and know 'how to behave', without any external input the game would 'play' perfectly identically on both ends (with an added sanity sync once every 10 or 15 seconds). 20000 mobs is a lot, but their ai and pathing are also simple. you can ofc parallelize it and send it to the gpu if you need to (possible and quite easy to implement with a fallback for non-cuda/non-amd hardware)
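to illustrate (a rough python sketch with made-up names, obviously not ggg's actual code): seed both sides identically and a deterministic simulation cannot diverge without external input. the 10-15 second sanity sync would only be there to catch implementation drift.

    import random

    def make_world(seed, num_mobs=3):
        # client and server build the same world from the same seed
        rng = random.Random(seed)
        mobs = [{"id": i, "x": rng.randint(0, 100)} for i in range(num_mobs)]
        return rng, mobs

    def step(rng, mobs):
        # dumb 'swarm' ai: each mob wanders; same rng -> same moves on both ends
        for mob in mobs:
            mob["x"] += rng.choice([-1, 0, 1])

    client_rng, client_mobs = make_world(seed=42)
    server_rng, server_mobs = make_world(seed=42)
    for _ in range(1000):
        step(client_rng, client_mobs)
        step(server_rng, server_mobs)
    assert client_mobs == server_mobs  # no input, no divergence, no desync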

AI routines are easy for dumb ARPG mobs. it is not starcraft (that also plays a 'scenario'). this is simple 'swarm' type AI that 'detects', 'gets to' and 'fights'. if that counts as complex... no, it is an easy, computationally cheap non-issue area.

what is an issue is predicting PLAYER movement. this is tough, and can be done cheaply/poorly or a bit better. the cheap way is a 'rubber man' that calculates the 'mass' and 'center of mass' and predicts movement in that direction; another is a per-player statistically-created 'movement style'. the second one IS expensive to prepare and maintain. much more difficult to create than fixing the current network model.

anyway, desync for some skills/maps is absolutely atrocious and made me literally close poe when playing my glacial hammer char more often than i'd like to admit.

continued silence about this topic (walls-of-excuses are just that - excuses) does not bode well
"
Sinnesteuer wrote:

But, I suppose I should rage against GGG and demand that they upgrade my home network infrastructure with more robust tech, since they have to be responsible, right?


You miss the point. It's obvious that GGG can't do anything if a player's internet connection/PC can't handle PoE. Desync, however, is a bug caused exclusively by some of GGG's design decisions. You could own the best PC in the world and sit directly next to the server and you would still desync.
I'd rather deal with lag than desynch.
It's worth noting that +2 maps are a dangerous thing.
They can cause players to get out of their depth -
playing maps that are too hard for the items they currently have. Herp Derp.
"
RickyDMMontoya wrote:
I'd rather deal with lag than desynch.


I think that too many players don't really know the difference, but are already playing with both.

Is that too much whiteknighting? I don't want to upset the rodent cart.

And I never, at any point in time, suggested that the netcode was flawless. Yes, DESYNCH DOES EXIST IN POE. What I was trying to suggest is that it may not be the only suspect involved in these crimes.

Damn me for suggesting something more than GGG's netcode being the evil bane of all things PoE.
Last edited by Sinnesteuer#7507 on Feb 14, 2014, 11:31:08 AM
"
sidtherat wrote:
there is NOTHING difficult about predicting 20 or even 20000 mobs. this game - with the current architecture - would not desync at all if it wasn't for player input.


Well, PoE can't even fit 10 mobs through a doorway without failing at prediction, i.e. causing desynced ghosts popping in from rooms. Especially in the case of fast mobs & maps with narrow doorways (vaal pyramid, sceptre of god type).

The only input in this case is that the player aggroed them, plus the player's location. If this is not a case of failed AI / client-side prediction, then I dunno what it is...

Necromancers are a type of mob that desyncs a lot, because they have a different AI than most: they actively search for corpses (avian retch too). In this case there is no player input at all, just the location of corpses. (though the location of corpses is a direct result of player input...)
When night falls
She cloaks the world
In impenetrable darkness
Last edited by morbo#1824 on Feb 14, 2014, 11:51:17 AM
Desynch is caused by a failure of the prediction model, run client-side, to align with the data delivered from the server.

The game doesn't update appropriately, leading to situations where your local client isn't properly displaying your position or the positions of enemies.

It leads to the worst situations when, due to inaccurate position display, the player's actions are inappropriate. For example, if you think you are in the hall, and the enemy is in the hall, and you target the enemy with an attack, but both of you are actually in the room (filled with other enemies), then you just committed suicide. The game will generally re-synch as soon as you're dead, giving the appearance of teleporting you into the room that you had no intention of entering.
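To make that concrete, here is a minimal sketch (hypothetical names and threshold; not PoE's actual netcode) of how a client-side prediction error turns into that "teleport":

    # The client displays a predicted position; the server holds the true one.
    predicted_pos = (10, 0)  # the hall, where you think you are
    server_pos = (10, 5)     # the room, where you actually are

    def on_server_update(client_pos, authoritative_pos, threshold=3):
        # Once the error passes the resync threshold, the client snaps you to
        # the server's position. That snap is the visible "teleport".
        error = (abs(client_pos[0] - authoritative_pos[0])
                 + abs(client_pos[1] - authoritative_pos[1]))
        return authoritative_pos if error > threshold else client_pos

    predicted_pos = on_server_update(predicted_pos, server_pos)
    print(predicted_pos)  # (10, 5): you were in the room all along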

Lag is just a delay in the actions directed by the player. It would be much more tolerable. Yes, you could die due to lag, but not in such an annoying fashion.

Lag kills you because your input is too slow. Desynch kills you because your input is turned into nonsense that doesn't reflect the actual state of the game. It's much less frustrating to die because you had no input compared to dying because your input was essentially bullshit.

Change no mechanics of the game. Leave in stun, dodge, etc. Just run it all 100% client side. Yes, players with a fast ping will have an advantage. So what? Players with better systems already have an advantage because you can't turn off weather effects.

"I died because of a lag spike when enemies attacked me and I couldn't respond."

"I died because I wasn't where I thought I was, the enemy wasn't where I thought the enemy was, the other enemies weren't where I thought the other enemies were, I didn't know how much health I had, and didn't know where the enemies attacks were."
It's worth noting that +2 maps are a dangerous thing.
They can cause players to get out of their depth -
playing maps that are too hard for the items they currently have. Herp Derp.
Last edited by RickyDMMontoya#7961 on Feb 14, 2014, 11:47:18 AM
"
HellGauss wrote:
I seriously doubt that 'better prediction algorithms' would solve the desync issue with PoE's architecture in its current state.

I'm quoting an excellent explanation by MrMisterMissedHer in another thread (bold is mine):

"
MrMisterMissedHer wrote:
Yay, desync discussions again.

Firstly, you are delusional if you think prediction to the extent that PoE uses it can work. An utterly laughable concept in a game (unless, of course, you're okay with the mechanics fundamentally suffering because of it, which quite a few people seem to be okay with).

Any person who has more than basic knowledge of dynamics and has had to practically deal with this sort of thing knows this. You cannot predict for discontinuous and essentially random state and behaviour.

Games that use prediction to good effect do it to hide latency. The extent of prediction (or at least the governing thresholds for something like resync) in PoE is 1-2 orders of magnitude higher than what is typically successful (that is, games tend to do local prediction relative to latency, NOT on the order of seconds).

Considering how error as a result of desync accumulates non-linearly (it can easily be worse than exponential, again due to discontinuities), letting the state go uncorrected for as long as PoE does leads to the wonderful artifacting that we get.

Prediction is an attempt to make state converge; the problem is, it's not provably correct in any general sense and in fact easily causes divergence instead. It is essentially nothing more than hoping it works out.

You can only accurately predict for state without future undetermined factors; this happens to be true of trivial behaviour and low-complexity state (eg. walking unobstructed in a straight line), sometimes.

In closing, since GGG are unable or unwilling to implement real fixes, learn to accept PoE's desync or move on. It's not going to change much.
I agree wholeheartedly regarding one point: uncorrected desync does accumulate non-linearly, tends towards pseudo-exponential increases as the flaw butterfly-effects, and as such necessitates some kind of re-synchronization method as a fallback.

However

it's important to understand that this is precisely what is currently happening when you desync. Why is it that you and a bunch of monsters suddenly teleport into a room you never meant to enter? Butterfly effect: the simulation had an error and the error grew in the system non-linearly. It's an utter fallacy to pretend this kind of problem exists solely for proposed suggestions and has nothing to do with the current state of the game itself.
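As a toy illustration of that butterfly effect (a generic sketch, nothing to do with PoE's actual simulation code): give two copies of the same chaotic update rule a microscopic initial difference and watch them diverge:

    def update(x):
        # a simple chaotic update rule (logistic map in its chaotic regime)
        return 3.9 * x * (1.0 - x)

    a, b = 0.500000000, 0.500000001  # differ by one part in a billion
    for tick in range(40):
        a, b = update(a), update(b)
    print(abs(a - b))  # the microscopic error is now macroscopic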

Thus, good prediction is not about utterly eliminating the possibility of a flaw. It's about experiencing those flaws less often by minimizing the window during which they can occur. Not eliminating, minimizing; not zero desync, but far less. Which brings me to my next quote...
"
morbo wrote:
"
sidtherat wrote:
there is NOTHING difficult about predicting 20 or even 20000 mobs. this game - with the current architecture - would not desync at all if it wasn't for player input.
Well, PoE can't even fit 10 mobs through a doorway without failing at prediction, i.e. causing desynced ghosts popping in from rooms. Especially in the case of fast mobs & maps with narrow doorways (vaal pyramid, sceptre of god type).
That's because it doesn't even try.

Perhaps if you actually knew how PoE currently handles pathing, you'd understand how much GGG have neglected predictive systems.

Go to a room-based level with slow-moving mobs (such as Lower/Upper Prison), get near a doorway, and do two things in quick succession: disconnect your networking (preferably using a firewall to block PoE's HTTPS-based communications) and move into the room, which will probably have some monsters. You will notice they don't move towards you, at all. That's because monster AI is entirely on the server, so the monsters can't make decisions regarding your presence in any way without connectivity.

Now start over, and try the same thing but move into the room first. After the monsters slowly begin shambling over to you, move to a new location. You'll see they straight-line to your previous location and then stop.

That is the full extent of client-based prediction in PoE.
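Put in code, the client's entire monster "prediction" amounts to something like this sketch (hypothetical names, inferred from the experiment above; obviously not GGG's actual code):

    import math

    def predict_monster(monster_pos, last_known_player_pos, speed, dt):
        # Straight-line toward the last position the server reported,
        # then stop: no pathing, no reaction to where the player went next.
        dx = last_known_player_pos[0] - monster_pos[0]
        dy = last_known_player_pos[1] - monster_pos[1]
        dist = math.hypot(dx, dy)
        if dist < 1e-6:
            return monster_pos  # arrived; stand still until the server says otherwise
        step = min(speed * dt, dist)
        return (monster_pos[0] + step * dx / dist,
                monster_pos[1] + step * dy / dist)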

Thus, assuming a latency of 100ms, you end up with this as the current situation:

Client (simulation of) gamestate:
- Current player animations (movement and skills): zero delay
- Monster animations (movement and skills): 200ms delay
- Other player animations (movement and skills): 200ms delay

Server (true) gamestate:
- Current player data (movement and skills): 100ms delay (waits on player)
- Monster data (movement and skills): 100ms delay (immediately upon receipt of player input)
- Other player data (movement and skills): 100ms delay

Note that on the server everything is happening with simultaneity, but on the client there is a 1/5-second difference between monster position and player position. When blockages happen during that 1/5 of a second (or whatever the value of twice the latency is), then you've got a blockage on the server which isn't a blockage on the client. This is a massive cause of desync.

What I'm suggesting is more like this:

Client (simulation of) gamestate:
- Current player animations (movement and skills): zero delay
- Monster animations (movement and skills): zero delay*
- Other player animations (movement and skills): 100ms delay, or 200ms if opting out**

Server (true) gamestate:
- Current player data (movement and skills): 100ms delay (waits on player)
- Monster data (movement and skills): 100ms delay (immediately upon receipt of player input)
- Other player data (movement and skills): 100ms delay

* This is achieved by simply putting monster AI on the client. I'm not talking damage calculations or anything like that; just let the monster choose what it's doing and start animating it, then hopefully the results of the monster's choice will arrive from the server before the action completes.
** This is achieved by adding an option to give your address to other players to allow them to send their action data to you directly, rather than relying on the server as a middleman. This is a slight security risk, similar to using torrents in both nature and severity. Failure to provide your direct address will not hurt other players, but will mean you get their packets more slowly, increasing multiplayer desync on your end.

Under my suggestion, there is simultaneity on the client, meaning the window for such blockage-based desync is virtually nil.

The only issue is ensuring monster AI on the client provides the same results as monster AI on the server. When monsters need to "flip a coin" to decide what to do, shared pseudorandom seeds can be used by both client and server to ensure the client's coinflips always match the server's. Thus, the only thing which is unpredictable is fluctuation in latency; this is a different concept than latency itself. By which I mean: if your latency is a constant 400ms, the system still works for single-player, because the client is consistently ahead of the server by 400ms. However, if your latency shifts from 400ms to 300ms, you've just shrunk the amount of time the client is ahead, which could lead to desync from missed predictions during that 100ms.
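Here is a minimal sketch of the shared-seed idea (illustrative names only):

    import random

    SHARED_SEED = 0xC0FFEE  # sent to the client when the area instance is created

    server_rng = random.Random(SHARED_SEED)
    client_rng = random.Random(SHARED_SEED)

    actions = ["attack", "flee", "cast"]
    for _ in range(100):
        # Both sides draw from the same deterministic sequence, so the
        # client's coinflips always match the server's.
        assert client_rng.choice(actions) == server_rng.choice(actions)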

Proof that the suggestion is better than the current system
The time window for desync under the current model is round-trip latency; the time window for desync under my model is variation in single-trip latency. Therefore, assuming L1 and L2 are two consecutive latency values...
Current system: L1 + L2
Suggestion: | L1 - L2 |
Since L1 + L2 > | L1 - L2 | whenever both latencies are positive, the window for blockage-based desync is always greater under the current system; the suggestion narrows the window considerably.
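Plugging in some numbers as a quick sanity check (same definitions as above):

    def current_window(l1, l2):
        return l1 + l2  # round-trip latency

    def suggested_window(l1, l2):
        return abs(l1 - l2)  # variation in single-trip latency

    # e.g. two consecutive one-way latencies of 100ms and 80ms
    print(current_window(100, 80))    # 180ms window for desync today
    print(suggested_window(100, 80))  # 20ms window under the suggestion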
When Stephen Colbert was killed by HYDRA's Project Insight in 2014, the comedy world lost a hero. Since his life model decoy isn't up to the task, please do not mistake my performance as political discussion. I'm just doing what Steve would have wanted.
Last edited by ScrotieMcB#2697 on Feb 14, 2014, 1:10:53 PM
We have some legit Client-Server Architects/Database Architects/Software Engineers up in here.

Please apply at GGG gaming and fix issues pls, I'll give you all of my exalts.
Last edited by RagnarokChu#4426 on Feb 14, 2014, 1:18:57 PM
