*Technical* Idea regarding desync

I've been reading up on online multiplayer games (as a computer programmer who only gets to work on boring projects, the idea of working on an MMO is somewhat alluring) and found an interesting article about desync in RTS games.

http://www.altdevblogaday.com/2011/07/09/synchronous-rts-engines-and-a-tale-of-desyncs/

After reading this, it became obvious that GGG uses a more FPS-style engine (think Counter-Strike, Team Fortress 2) for its netcode.

How much work would it be to implement PoE as a completely deterministic simulation, similar to the RTS games mentioned in the article above?

Pros:
Desync is minimised --> game crashes if clients/server get out of sync
Replays are easy --> just a list of inputs that can be played back anywhere
...

Cons:
Game is as slow as its laggiest player
Joining an instance that's already in progress is difficult to implement.
...
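For concreteness, here's a rough sketch of the lockstep tick loop the article describes. All type and function names here are mine, invented purely for illustration (this is not PoE or SupCom code). The key property is that the simulation refuses to advance until every player's input for the current tick has arrived, which is where both the determinism and the "laggiest player" con come from:

```cpp
#include <cstdint>
#include <map>
#include <vector>

// Hypothetical types for illustration only.
struct PlayerInput { uint32_t player_id; uint32_t command; };

struct LockstepSim {
    uint64_t tick = 0;
    // Inputs received from the network, keyed by the tick they apply to.
    std::map<uint64_t, std::vector<PlayerInput>> pending;
    int player_count = 0;

    // True once every player's input for tick `t` has arrived.
    bool inputs_ready(uint64_t t) const {
        auto it = pending.find(t);
        return it != pending.end() &&
               static_cast<int>(it->second.size()) == player_count;
    }

    void apply(const PlayerInput& in) { /* deterministic game logic */ }

    // Called once per frame. The whole simulation waits on the slowest
    // connection: if any player's input is missing, nobody advances --
    // this is the "as slow as its laggiest player" con above.
    void try_advance() {
        while (inputs_ready(tick)) {
            for (const PlayerInput& in : pending[tick]) apply(in);
            pending.erase(tick);
            ++tick;
        }
    }
};
```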

Personally, I would rather play a slower game where my actions and their consequences are clearly linked than a faster game where I can be punished for network events that are outside my control.

Interesting read, but I disagree with your conclusions.

One big red flag is how the example program handles desync. SupCom, a game that uses this synchronous lockstep engine architecture, closes out the game with an error message for all clients when a desync is detected! This implies that using such an architecture makes desync recovery difficult to impossible. Is this really a desync solution?

No, it's not. The beginning of the article makes it plenty clear that this is a bandwidth solution. You have thousands of units per game, a need to keep positional data on all of them, and that is simply too much information to send on a continuous basis. Bandwidth, not desync. Actually, the programmers of games like these then need to go the extra mile to program out all possible desyncs, to the point that the author refers to desyncs as "usually programmer error."
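For what it's worth, the desync *detection* the article alludes to usually amounts to periodic state checksumming: each client hashes its simulation state every N ticks and the hashes are compared. A hedged sketch (the `Unit` fields and the choice of FNV-1a are my own illustration, not anything from the article or SupCom):

```cpp
#include <cstdint>
#include <vector>

// Invented state for illustration; a real game would hash every
// field that feeds the simulation, in a fixed order.
struct Unit { int32_t x, y, hp; };

// FNV-1a over the raw simulation state. Any divergence between
// clients (a stray uninitialized value, an out-of-order RNG call)
// changes the hash.
uint64_t state_checksum(const std::vector<Unit>& units) {
    uint64_t h = 14695981039346656037ull;
    auto mix = [&h](int32_t v) {
        for (int i = 0; i < 4; ++i) {
            h ^= static_cast<uint8_t>(v >> (i * 8));
            h *= 1099511628211ull;
        }
    };
    for (const Unit& u : units) { mix(u.x); mix(u.y); mix(u.hp); }
    return h;
}

// Each client sends its checksum every N ticks. If they ever
// disagree, the simulations have already diverged, and (as in SupCom)
// there is usually nothing left to do but abort the match.
```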

So the question is: is PoE using too much bandwidth (meaning this might make things better), or does it have too many problems recovering from desync (meaning this might make things worse)?

Pretty much everyone's going with the latter here. This isn't a solution. The game crashing if there is desync is not a pro, it's a con.

Nevertheless, I am impressed with your inquisitiveness.
Last edited by ScrotieMcB#2697 on Jun 27, 2013, 12:14:17 AM
You're correct that PoE does not use a fully deterministic model, which would force the client not to act until it had notified the server and the server had sent back a confirmation.

That kind of model is relatively standard for RTS games with hundreds of units and less fast-paced gameplay, but it is decidedly not the standard for ARPGs like Path of Exile, and for good reason: that much input lag on every action makes such fast-paced games unplayable.

Instead, ARPGs like PoE run without assuming that everything is perfectly in sync, and so can handle cases where it isn't without crashing, along with other benefits like easily joining existing instances and genuinely immediate responses to player input.
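For illustration, the pattern being described here, where the client acts on input immediately and later reconciles against the authoritative server state, is commonly called client-side prediction. A minimal sketch, with every name invented (this is not GGG's actual code):

```cpp
#include <cstdint>
#include <deque>

// All names here are invented for illustration -- this is the
// general client-side prediction pattern, not PoE's implementation.
struct Input { uint32_t seq; float dx, dy; };
struct State { float x = 0, y = 0; };

State simulate(State s, const Input& in) {
    s.x += in.dx; s.y += in.dy;   // stand-in for real movement logic
    return s;
}

struct PredictingClient {
    State predicted;               // shown on screen immediately
    std::deque<Input> unacked;     // inputs the server hasn't confirmed

    // Runs the moment the player clicks; no round-trip wait.
    void on_input(const Input& in) {
        unacked.push_back(in);
        predicted = simulate(predicted, in);
    }

    // When the server's authoritative state arrives, drop the inputs
    // it has seen and replay the rest on top of it. If prediction and
    // server disagree, the replay snaps us back -- that's desync.
    void on_server_state(State authoritative, uint32_t last_acked_seq) {
        while (!unacked.empty() && unacked.front().seq <= last_acked_seq)
            unacked.pop_front();
        predicted = authoritative;
        for (const Input& in : unacked)
            predicted = simulate(predicted, in);
    }
};
```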

Chris goes into more detail about such topics in this Development Manifesto Thread.
Last edited by Mark_GGG#0000 on Jun 27, 2013, 12:21:35 AM
Full determinism also happens to be impossible once you have multiple players and Internet connections: they are a source of randomness. Although theoretically you could do it partially, which must be what the original article is referring to.
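As an aside on the in-game randomness part: lockstep games keep their game logic deterministic despite random elements by having every client draw from the same seeded generator in the same order, so only network timing remains non-deterministic. A sketch, assuming the seed is agreed at match start (all names invented):

```cpp
#include <cstdint>
#include <random>

// Every client constructs SharedRng with the same match seed and
// makes draws in the same order, so every "random" roll matches.
// (Seed handling and the damage-roll API are invented for illustration.)
struct SharedRng {
    std::mt19937 gen;
    explicit SharedRng(uint32_t match_seed) : gen(match_seed) {}

    int damage_roll(int lo, int hi) {
        return std::uniform_int_distribution<int>(lo, hi)(gen);
    }
};

// Caveat: std::uniform_int_distribution's output is not specified by
// the standard and can differ between compilers, so real lockstep
// games roll their own integer reduction for exactly this reason.
```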
