GZDoom Multiplayer Support
Moderator: Graf Zahl
- doomydoomer
- Posts: 6
- Joined: Sun Jan 31, 2016 18:57
GZDoom Multiplayer Support
I like the way GZDoom runs on my Mac better than Zandronum. It runs smoothly, without small jumps when moving, and there are no screen artifacts or tearing, which even v-sync doesn't fix in Zandronum. I think GZDoom looks and runs better and has helpful features.
However, I was not able to play multiplayer like I could with Zandronum using Doomseeker. Could something be done with GZDoom to create broader multiplayer support and a larger community?
Could Zandronum's network code be added to GZDoom, perhaps? Could there be another solution to create an easy, well-performing multiplayer experience and community using the GZDoom client?
Thank You
- Rachael
- Developer
- Posts: 3651
- Joined: Sat May 13, 2006 10:30
Re: GZDoom Multiplayer Support
doomydoomer wrote:Could Zandronum's network code be added to GZDoom, perhaps? Could there be another solution to create an easy, well-performing multiplayer experience and community using the GZDoom client?
Thank You
The short answer is: no, it is not possible - at least not right out of the gate.
The long answer is - Zandronum will catch up with GZDoom's features soon enough anyhow. Everything that GZDoom is today, Zandronum will be within two years at most, if not sooner, minus a few GZDoom-only features that very few people know about or use.
To make GZDoom multiplayer-capable to the level that Zandronum is would be a huge undertaking. Possible? Yes, it is - but most ACS and actor functions would have to be hooked. The savegame and net code would have to change. In fact it would be such an effort that in the end you would be left with pretty much a Zandronum clone anyhow, and then you have to ask yourself: was it worth all that effort to essentially duplicate a project that has already been done?
The system we have in place, where Zandronum and GZDoom are both sibling ports of ZDoom (with Zandronum borrowing GZDoom's OpenGL renderer), is far better than most other possibilities right now. There is currently a very active dev who works on two of the source ports at once - he gets very little recognition for his work, but he's been instrumental in advancing all three source ports, one way or another. There is another dev who has worked on Zandronum since it was Skulltag, taking over when Skulltag's creator decided to pursue more entrepreneurial hobbies. He's been instrumental in making Zandronum what it is today, and he leads a very successful dev team on that project. GZDoom took in some of his contributions back in the Skulltag days, too.
Basically, what it comes down to is: if you want something in Zandronum, ask the Zandronum devs. Asking GZDoom to take on Zandronum's netcode is something that just can't easily be done right now. It would be nice, but it's better to let Zandronum catch up - because trust me, it will, eventually - and that's going to happen a lot sooner than it would if GZDoom tried to take Zandronum's netcode.
- Graf Zahl
- GZDoom Developer
- Posts: 7148
- Joined: Wed Jul 20, 2005 9:48
- Location: Germany
Re: GZDoom Multiplayer Support
Eruanna wrote:but most ACS and actor functions would have to be hooked. The savegame and net code would have to change. In fact it would be such an effort that in the end you would be left with pretty much a Zandronum clone anyhow, and then you have to ask yourself: was it worth all that effort to essentially duplicate a project that has already been done?
Actually, if I ever wanted to do something like that, I'd probably use Zandronum as a roadmap for how NOT to do it. The netcode's invasiveness is amplified enormously by a poorly thought-out system inherited from Skulltag.
What I'd do first is encapsulate all network-critical variables and access them only through getter and setter functions, which can then take care of the gory details transparently. What I'd try to avoid at all costs is 'hooking all functions'. Such a system is broken by design, and we can see how much maintenance it requires.
(That is, if I ever were to design a C/S port - which I won't!)
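For illustration only, here is a minimal sketch of that encapsulation idea, assuming a hypothetical NetInt wrapper and ReplicationHook callback - none of these names exist in GZDoom. The point is just that game code always goes through Get()/Set(), so a future net layer could handle replication in exactly one place instead of hooking every function that touches the field:

```cpp
// Hypothetical sketch only - none of this is actual (G)ZDoom code.
// It shows a network-critical variable wrapped behind getter/setter
// functions, so a C/S backend could handle replication in one place.
#include <cstdint>
#include <functional>
#include <utility>

// Stand-in for whatever the net layer would do on a change
// (queue a delta for clients, mark the owning actor dirty, ...).
using ReplicationHook = std::function<void(const char *name, int32_t newValue)>;

class NetInt
{
public:
    NetInt(const char *name, int32_t value, ReplicationHook hook = nullptr)
        : mName(name), mValue(value), mHook(std::move(hook)) {}

    int32_t Get() const { return mValue; }

    void Set(int32_t value)
    {
        if (value == mValue) return;       // no traffic for no-op writes
        mValue = value;
        if (mHook) mHook(mName, mValue);   // gory details handled transparently
    }

private:
    const char     *mName;
    int32_t         mValue;
    ReplicationHook mHook;
};

// Game code never assigns the raw field, so it never needs to know
// whether it is running single-player, P2P or client/server.
struct ExampleActor
{
    NetInt health{ "health", 100 };
};
```

Game code would then write something like actor.health.Set(actor.health.Get() - damage), and the same call works whether or not a net layer is attached.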

- Rachael
- Developer
- Posts: 3651
- Joined: Sat May 13, 2006 10:30
Re: GZDoom Multiplayer Support
Well - obviously, you are right, there are alternate ways to do C/S'ing, and I had been thinking of some myself, but my methods may rely too heavily on the P2P system that is already present in (G)ZDoom and would not be very network-tolerant, anyhow.
That being said, P2P could be used for absent-prediction, while the server would constantly send updated actor flags and stats to clients - but the concern that comes to mind is how quickly that could consume bandwidth.
Needless to say, I've been thinking about it ever since I first had access to the Skulltag source.
I don't know if Zandronum has a new way of handling weapons, but I do know I really never liked the way Skulltag handled it - it was extremely hacky and it was way too trusting of the Client-side.

- doomydoomer
- Posts: 6
- Joined: Sun Jan 31, 2016 18:57
Re: GZDoom Multiplayer Support
I'm hearing it would be a massive undertaking to add network support for online play. I guess GZDoom must be very much targeted at single-player, like Quakespasm is. Cooperative gameplay modes would be interesting, though.
So it wouldn't even help to take the network code from the best open source out there, say Quake 3, and make some modifications to retrofit it for a Doom port like GZDoom, or for QuakeSpasm?
I was reading a comparison of the network architectures of Quake 1, Quake 2 and Quake 3, and the author said Quake 3 is the one that finally "got it right". I thought that since the network code already exists, perhaps it could be reused with much less effort.
It would be interesting if a new netcode standard based on Quake 3 were adopted for both Doom and Quake 1.
Also, would it be possible to get something developed if money were contributed, or maybe even as a commercial product? To me, spending some money on a robust Quake or Doom client would provide a lot of value because of all the free content people could develop. If there were a robust multiplayer system, and perhaps even a robust level editor with new features closely tied in somehow, that would be a great product.
- Graf Zahl
- GZDoom Developer
- Posts: 7148
- Joined: Wed Jul 20, 2005 9:48
- Location: Germany
Re: GZDoom Multiplayer Support
doomydoomer wrote:So it wouldn't even help to take the network code from the best open source out there, say Quake 3, and make some modifications to retrofit it for a Doom port like GZDoom, or for QuakeSpasm?
No. If the game isn't written to handle this, all the best netcode in the world won't work.
There's a reason why C/S Doom ports are separate projects that get developed at their own pace.
- doomydoomer
- Posts: 6
- Joined: Sun Jan 31, 2016 18:57
Re: GZDoom Multiplayer Support
I wonder why something that used the original networking wouldn't work. What about DWANGO, and how it was put into the Microsoft Internet Gaming Zone years ago? Maybe an older solution would be easier. I notice Chocolate Doom has an old DOS-looking setup window that pops up with options to join a multiplayer game on a local or internet server. There aren't any players there, but it does find servers. Maybe they are doing something much simpler, and there is an easier solution that isn't as robust or doesn't support as many players or whatever, but would work fine.
- Rachael
- Developer
- Posts: 3651
- Joined: Sat May 13, 2006 10:30
Re: GZDoom Multiplayer Support
doomydoomer wrote:I wonder why something that used the original networking wouldn't work. What about DWANGO, and how it was put into the Microsoft Internet Gaming Zone years ago? Maybe an older solution would be easier. I notice Chocolate Doom has an old DOS-looking setup window that pops up with options to join a multiplayer game on a local or internet server. There aren't any players there, but it does find servers. Maybe they are doing something much simpler, and there is an easier solution that isn't as robust or doesn't support as many players or whatever, but would work fine.
The only thing such projects have going for them is that P2P is very light on bandwidth, so it is possible to achieve much lower pings than with a regular C/S project, even on ye olde 56k modem.
Doom, at its very core, is a P2P game. P2P has a huge number of inherent weaknesses, and it expects that everyone is using the exact same copy of the game - at the exact same time - because the only thing that is ever broadcast in a P2P game is player movement. Each client expects the others to know exactly where all the monsters are supposed to be - the exact state of every stairwell, every floor, every door, every room - on their own, without correction from neighboring clients. Literally, every copy of the game is expected to guess what the game state is supposed to be. All it takes is one lag spike to desynchronize the whole circus and pony show, and then the game is broken - you can't play it anymore.
Even worse, P2P relies on a very low ping environment to function properly. If you have high ping (such as connecting to someone halfway across the world), the game either runs extremely slowly, or there is a huge delay between your input and player actions. (Starcraft 1 had a great example of "delayed" P2P - every order given to your units was delayed by 2 seconds on the "high latency" setting, in order to keep all copies of the game in sync while still letting the game play at a somewhat decent speed.)
This is why projects such as CSDoom, ZDaemon, and Zandronum even exist. C/S is fault and lag tolerant - if your net screws up, no problem, the server can send you a functional copy of the game state pronto and you're right back into the action in less time than it takes for you to type "oops, lag spike" to your friends. But Doom was not built as C/S - which is why it would take so much effort to make GZDoom into a C/S game.
It also doesn't help matters that Doom's game processing rate is 35 tics per second. Modern C/S games are around 10, or sometimes even less. 35 tics per second can consume a ton of bandwidth, especially when there is a lot of action going on, which is why Doom worked so well with P2P, but presents such a challenge for C/S port authors.
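As a purely back-of-envelope illustration of that last point (the actor count and per-update size below are invented assumptions, not measurements from any port), naive full actor updates at Doom's 35 Hz cost several times what the same traffic would at a 10 Hz update rate:

```cpp
// Back-of-envelope only; the numbers are made-up illustration values,
// not measurements from Doom, GZDoom, Zandronum or any other port.
#include <cstdio>

int main()
{
    constexpr int ticsPerSecond  = 35;   // Doom's fixed game rate
    constexpr int modernTicRate  = 10;   // the "modern C/S" rate mentioned above
    constexpr int activeActors   = 150;  // assumed busy firefight
    constexpr int bytesPerUpdate = 20;   // assumed position + state delta per actor

    constexpr int doomBps   = ticsPerSecond * activeActors * bytesPerUpdate;
    constexpr int modernBps = modernTicRate * activeActors * bytesPerUpdate;

    std::printf("naive updates at 35 Hz: ~%d KB/s per client\n", doomBps / 1024);   // ~102 KB/s
    std::printf("same updates at 10 Hz:  ~%d KB/s per client\n", modernBps / 1024); // ~29 KB/s
}
```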
- doomydoomer
- Posts: 6
- Joined: Sun Jan 31, 2016 18:57
Re: GZDoom Multiplayer Support
I'm starting to understand now. That does sound bad. They seemed to make it work back then somehow, though - probably by just limiting games to low latency. I imagine if we did the same thing as was done back then, we would have the same success rate. It is complicated by all the add-ons we have now, though, and you did mention everything needs to be the same, so maybe without add-ons it would be possible without getting too complicated. Maybe it would just make more sense for Zandronum to catch up. I have no idea why it doesn't visually run as well as GZDoom. I'll have to ask them about that. Will GZDoom go away once Zandronum gets to that point, do you think?
- Rachael
- Developer
- Posts: 3651
- Joined: Sat May 13, 2006 10:30
Re: GZDoom Multiplayer Support
doomydoomer wrote:I have no idea why it doesn't visually run as well as GZDoom. I'll have to ask them about that. Will GZDoom go away once Zandronum gets to that point, do you think?
Zandronum does lag a release or two behind GZDoom. Sure - they'll catch up to GZDoom, but GZDoom itself will never go away. Zandronum still depends on GZDoom for its OpenGL code, and many people prefer GZDoom.
Plus - when new features get released, GZDoom will always sprint right ahead of Zandronum. Both ZDoom and GZDoom are under constant development, and Zandronum does not pick up their code until it's been tested thoroughly. In fact, Zandronum's release schedules are intentionally staggered with beta testing (which, from what I've seen, players fail to properly take advantage of) to make sure that what does get added to Zandronum doesn't break the engine.
-
- Developer
- Posts: 197
- Joined: Sun Nov 29, 2009 16:36
Re: GZDoom Multiplayer Support
doomydoomer wrote:I have no idea why it doesn't visually run as well as GZDoom.
Can you check with the latest Zandronum 3.0 beta build from here? It contains the GZDoom 1.8.6 code base.
- Gez
- Developer
- Posts: 1399
- Joined: Mon Oct 22, 2007 16:47
Re: GZDoom Multiplayer Support
Eruanna wrote:I don't know if Zandronum has a new way of handling weapons, but I do know I really never liked the way Skulltag handled it - it was extremely hacky and it was way too trusting of the Client-side.
Heh.
I remember that back then I used to autoload a resource file that had pretty much every sprite and sound from all the core games (Doom, Heretic, Hexen, Strife), in PNG format for the sprites. This sometimes had a few weird side effects. Anyway, one time I tried Skulltag with that, in the game mode where you're given all weapons (Last Man Standing, IIRC?), and the result was funny. I could cycle through all the non-Doom weapons - I was perfectly able to select the Sigil or the Wraithverge - but the server wouldn't let me actually fire them. Too bad, it would have been funny.
- Edward850
- Posts: 63
- Joined: Fri Mar 20, 2009 21:48
Re: GZDoom Multiplayer Support
Eruanna wrote:All it takes is one lag spike to desynchronize the whole circus and pony show, and then the game is broken - you can't play it anymore.
Who keeps perpetuating this rumor? The way the input framework functions makes this a logical impossibility. You could pull your network cable out for 5 minutes and still resume the game once it's plugged back in, perfectly in sync.
Granted, at a 5-minute delay the UDP endpoint is likely to close if you're talking between two gateways (the current RFC spec is 2 minutes), but that's not a sync issue.

Eruanna wrote:Even worse, P2P relies on a very low ping environment to function properly. If you have high ping (such as connecting to someone halfway across the world), the game either runs extremely slowly, or there is a huge delay between your input and player actions. (Starcraft 1 had a great example of "delayed" P2P - every order given to your units was delayed by 2 seconds on the "high latency" setting, in order to keep all copies of the game in sync while still letting the game play at a somewhat decent speed.)
Doom does have a designed latency cap to prevent singleplayer games from getting trapped in a process loop. If you extend the loop, the game can run at a higher latency across networks, but at the cost of more processing time needed to recover lost frames. At the moment G/ZDoom allows about 420 ms, which is enough for globally spanned games to run unimpeded, even between 8 players on different continents.
However, here's the thing: latency is not a P2P problem, it's universal. Async C/S has just as many issues at high latency as P2P does (input latency is not something you can just remove - you can hide it, but you can never actually remove it), and sometimes worse, depending on your backend framework (anybody remember the original Quake networking?). ZDoom implements an input prediction model to account for exactly this, and even at high latencies of 1-2 seconds (satellite internet will do this) it continues to function.
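As a rough, hypothetical sketch of what "input prediction" means here (the names below are invented for illustration and are not ZDoom's actual prediction code): the local player's not-yet-simulated inputs are replayed on top of the last state the deterministic simulation produced, so your own movement is drawn immediately even while the lockstep tic is still waiting on remote input.

```cpp
// Illustration only - hypothetical names, not ZDoom's prediction code.
#include <deque>

struct PlayerState { int x = 0; };          // toy stand-in for a player mobj

// The same movement code the real simulation would run for one tic.
static PlayerState ApplyInput(PlayerState s, int forwardMove)
{
    s.x += forwardMove;
    return s;
}

struct LocalPredictor
{
    PlayerState     confirmed;              // last state the deterministic sim produced
    std::deque<int> pending;                 // our inputs the sim has not run yet

    // What the renderer should draw right now: the confirmed state plus a
    // replay of everything we have pressed since then.
    PlayerState PredictedNow() const
    {
        PlayerState s = confirmed;
        for (int cmd : pending)
            s = ApplyInput(s, cmd);
        return s;
    }

    void OnLocalInput(int cmd) { pending.push_back(cmd); }

    // Called when the real gametic finally runs with our oldest input.
    void OnSimAdvanced(const PlayerState &authoritative)
    {
        confirmed = authoritative;
        if (!pending.empty())
            pending.pop_front();
    }
};
```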
It still exists in a P2P form because the input buffer was designed for it - notably, DWANGO needed it this way for its custom 4-player modem driver - but it doesn't need to stay P2P. This very specific aspect can be changed, still exclusively as a deterministic framework; there's no reason why you can't make a deterministic framework C/S. Seeing as people commonly confuse the two (and I cannot understand how), it does make me wonder which of them you're actually trying to talk about.
- Rachael
- Developer
- Posts: 3651
- Joined: Sat May 13, 2006 10:30
Re: GZDoom Multiplayer Support
Edward850 - it was not my intent to spread misinformation.
That being said - on point one, I've had bad experiences with running GZDoom multiplayer over the internet. I was in the United States, my partner was in Italy. We were testing Stronghold multiplayer at the time, trying to make sure that the game functioned properly. Everything went well until - well, a lag spike, and the clients were out of sync. The more we played, the more lag spikes and desyncs there were, and in the end we just couldn't play it at all. We later had to switch to Skulltag in order to do proper testing. It may be that ZDoom's multiplayer code just wasn't as mature or as well debugged then as it is today, but multiplayer was never a major focus for ZDoom, so if it has changed I wouldn't know.
On point two, I am fairly certain I am not confusing P2P and C/S. When I say P2P, I am referring to a technology where only player movements are broadcast, and the games are kept in sync by each running its own copy of the simulation and AI with a shared random number seed, so that the results of all random calculations come out the same. When I say C/S, I am referring to a technology by which actor and game positions and states are forcefully replicated from a server to all of its clients, allowing the clients to run their own game sims that get updated by the server. Player movement commands are often not replicated in C/S; rather, the actors and environment objects that players control are.
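To make the shared-seed idea above concrete, here is a toy, purely hypothetical sketch (not Doom code, and all names are invented): both peers run the same simulation from the same seed and the same broadcast inputs, and their states stay identical even though no actor data is ever sent over the wire.

```cpp
// Toy illustration of a deterministic shared-seed simulation; not Doom code.
#include <cassert>
#include <cstdint>
#include <vector>

struct GameState
{
    uint32_t rngState;              // identical seed on every peer
    int32_t  playerX       = 0;
    int32_t  monsterHealth = 60;

    explicit GameState(uint32_t seed) : rngState(seed) {}

    uint32_t Random()               // same PRNG, same call order, same results everywhere
    {
        rngState = rngState * 1664525u + 1013904223u;
        return rngState >> 16;
    }

    void RunTic(int8_t forwardMove) // the input is the only thing sent over the wire
    {
        playerX += forwardMove;
        if (Random() % 4 == 0)      // "AI" decision driven by the shared RNG
            monsterHealth -= 10;
    }
};

int main()
{
    const std::vector<int8_t> broadcastInputs = { 1, 1, 0, -1, 1 };

    GameState peerA(1234), peerB(1234);
    for (int8_t cmd : broadcastInputs)
    {
        peerA.RunTic(cmd);
        peerB.RunTic(cmd);
    }
    // No positions or health values were ever replicated, yet both copies agree.
    assert(peerA.playerX == peerB.playerX);
    assert(peerA.monsterHealth == peerB.monsterHealth);
    return 0;
}
```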
- Edward850
- Posts: 63
- Joined: Fri Mar 20, 2009 21:48
Re: GZDoom Multiplayer Support
Eruanna wrote:I've had bad experiences with running GZDoom multiplayer over the internet. I was in the United States, my partner was in Italy. We were testing Stronghold multiplayer at the time, trying to make sure that the game functioned properly. Everything went well until - well, a lag spike, and the clients were out of sync. The more we played, the more lag spikes and desyncs there were, and in the end we just couldn't play it at all.
That's sort of the problem: "impossible" was not a suggestion; it's as impossible as eating the sun. The input framework functions on a reverse-acknowledgement concept, where all data is assumed delivered until a node says otherwise. This works because a node will not run any gametics while it is missing input for that gametic, which is where latency is introduced. This affects singleplayer in just the same way. The core concept is that if you miss a packet, the game will not run and will ask for a retransmit. If you're running gametics faster than the network buffer wants to build them, the game will also not run, which is exactly how the interpolation functions (tryruntics just freeruns without delay, letting the renderer and the network figure out the rest). Your own input is treated just the same as the other nodes', so the game cannot physically advance until the input for the next gametic has been successfully received from all nodes. The buffer itself will stop building gametics if it has too many queued, so that it doesn't overrun old tics that may still need resending.
This is how it has been since vanilla, so no kind of lag spike at all can cause a desync.
Eruanna wrote:On point two, I am fairly certain I am not confusing P2P and C/S. When I say P2P, I am referring to a technology where only player movements are broadcast, and the games are kept in sync by each running its own copy of the simulation and AI with a shared random number seed, so that the results of all random calculations come out the same. When I say C/S, I am referring to a technology by which actor and game positions and states are forcefully replicated from a server to all of its clients, allowing the clients to run their own game sims that get updated by the server. Player movement commands are often not replicated in C/S; rather, the actors and environment objects that players control are.
So you are confusing the two.

Peer-to-peer is a communication framework in which no server exists as a central endpoint or single point of failure. This is commonly used when you need a set of data to be transferred independently of any third party, which you may notice sounds exactly like how a torrent works. That's exactly the communication model torrenting uses to make sure a file can exist and be transferred regardless of the state of whoever originally hosted it. Doom uses it to make sure that no one datagram or stream can be backlogged by other nodes, although it still uses an arbitrator to dictate the speed of the game (the arbitrator, however, only hosts its own input). It also makes it incidentally compatible with the concept of supernodes, which in a way is also how torrents work.
Client/server is pretty much the opposite. A centralized system or host maintains all connections and communication between all connected clients. It has a lower bandwidth footprint overall, while introducing an added delay in communication between any two clients. It also makes re-establishing connections a problem if the host vanishes: the session would either need to migrate to a new host (if one could exist) or be discarded entirely.
The funny thing about both of those explanations is that neither of them actually explains how Doom, ZDoom or Zandro works. The concepts are actually interchangeable. Notably, your explanation of P2P has nothing to do with P2P, but rather describes what's called a deterministic model. While that is indeed what G/Z/Doom use, it has nothing to do with P2P at all, other than the fact that that happens to be how player inputs are routed. The two don't otherwise rely on each other - see demos, which are just a file stream of inputs piped into each gametic, allowing them to be played back at any speed regardless of the speed of the original recording (which isn't kept either, as demos are notably timeless apart from their overall number of inputs).
It's also what allows Doom to frameskip without changing the physics. Frame took too long to render? 5 inputs constructed during that time? It just runs 5 gametics in one cycle.
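Sketched very roughly (hypothetical function names, not the actual tryruntics code), that loop just runs however many fixed-length gametics have become due before drawing the next frame, so slow rendering never changes the physics:

```cpp
// Rough shape only - hypothetical names, not the real d_net.cpp logic.
#include <chrono>

constexpr int TICRATE = 35;                      // fixed 1/35 s gametics

static void RunGametic()  { /* advance the deterministic simulation by one tic */ }
static void RenderFrame() { /* draw; however long this takes, physics are unaffected */ }

// How many whole gametics have elapsed since the given start time.
static int TicsElapsed(std::chrono::steady_clock::time_point start)
{
    using namespace std::chrono;
    const auto ms = duration_cast<milliseconds>(steady_clock::now() - start).count();
    return static_cast<int>(ms * TICRATE / 1000);
}

static void GameLoop()
{
    const auto start   = std::chrono::steady_clock::now();
    int        ticsRun = 0;

    for (;;)
    {
        // Frame took too long and 5 inputs piled up? Run 5 gametics now.
        while (ticsRun < TicsElapsed(start))
        {
            RunGametic();
            ++ticsRun;
        }
        RenderFrame();
    }
}

int main() { GameLoop(); }
```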
The more you know:
Peer-to-peer is actually still used these days for racing games, typically kart racers (it's also used in Elite Dangerous to keep server costs down via instancing, which would be fine if the instancing weren't trash). Each node sends the current state of its vehicle, including speed, position, state, damage model, whether it fires its weapon, who it attempts to hit... basically everything about its own actions and what state it is in. Still peer-to-peer, and notably not even remotely deterministic. It's done this way to combat the latency-to-laptime problem by providing the best possible average for all players. And considering the typically low number of players per race, it's very efficient.