Reject P2P, Return to Quake III Netcode
I think the scope and other things have changed too.
Modern games have and do so much more; and need to account for many more scenarios (plus cheating, griefing, chat, voice chat, etc)
there's definitely less micro-optimization; but that's not really a bad thing either as it comes with many other benefits when you only optimize the actual hot paths
and make everything else more scalable/maintainable
I disagree. It varies from game to game, of course, but take a single genre like arena shooters. Quake III, Quake Champions, Counter-strike, CS:GO, Halo:CE, Halo 4.
Players move around with simplified physics in a constrained map, shooting and being shot. There's grenades and shields and power-ups but unless it involves physics the computation involved is simple buffs/debuffs, maybe a barrier or special explosion, but that's just an extension of existing netcode entities.
Chat and cheating have always been a thing. Anti-cheat is a little newer, but was definitely present in the early-2000s where servers would simply kick clients who reported impossible inputs. Modern anti-cheat goes a lot farther doing things like locking your GPU driver, disabling antiviruses, and streaming system telemetry like process IDs in realtime to a central server, but that's all pretty separate. Usually a whole separate program.
Griefing isn't really relevant because there's no automated way to control that. It's like the Scunthorpe problem, only orders of magnitude more complex.
Similarly, voice chat is a whole separate system. You're not putting voice data into the same packets as game data. The actual voice chat is provided by third parties more often than not, the same hosts doing matchmaking and such.
"when you only optimize the actual hot paths and make everything else more scalable/maintainable"
Id was maintaining their games with fewer than 10 people. Quake III today is maintained by a tiny handful of volunteers and it runs as flawlessly now as it ever did. Blizzard has thousands of developers and can't reliably maintain a comparably complex arena shooter. If games were significantly more complex, like if clients were sending full-body motion capture or receiving streamed environment data or video or something, it'd make sense to require 100x more bandwidth. The thing is that, within comparable genres, they really aren't. Why would Civilization VI require more bandwidth than Civilization III? The game states and actions haven't gotten faster or more complex; the netcode is just less efficient and much buggier. Look at Age of Empires II versus Age of Empires II: HD. Game logic is the same, game mechanics are the same. They replaced the renderer and the netcode so it runs on modern GPUs and has P2P (TIL) netcode with Steam for matchmaking. The modern netcode is orders of magnitude less efficient, and it randomly crashes/desyncs so badly that I never play online, because it's not worth investing effort into a game that randomly desyncs after an hour. Same game, modern netcode, buggier, slower. It's not that games have gotten more complex. It's that developers aren't under the constraint of making their games playable on dial-up, so if the game requires 200mbps up/down within 10ms latency, it's acceptable.
You're specifically talking about networking here; when in actuality the overall game is doing a lot more.
The actual calculations are more complex, as are the environments, bounding boxes, the number of assets in the world, etc
all the meshes and bounding boxes in a Quake III game, put together, are less than some single models/rooms in modern games
not to mention things like destructible (even partially) environments and stuff that exist in many modern FPS
Quake III, for example, didn't let you shoot a hole through any "wood mesh", see what's on the other side, and continue firing through it to hit your enemy
something which say Rainbow Six Siege or other modern shooters have as a "default" now
nor could you use grenades to blow open walls or destroy railing posts
or really "peek" corners or anything
so the overall computation power, the number of vertices and bounding boxes involved, etc
were vastly simpler and could be handled on the lowest end of machines
that, combined with average available network bandwidth basically forced games to come up with clever tricks so they could run and do stuff
we could probably do a lot more with games if developers did the same nowadays; and there are some games which do
but, its also not required nor necessarily good to do that since it greatly increases the complexity in an already vastly more complex world
Code takes up barely any room in an application unless you are obfuscating it or adding anti-piracy and anti-cheat protections
For instance, the launch executable of Doom Eternal is IIRC 90% smaller than the Denuvo version
Aside from that, as we've decided to cram 4K textures on everything, we have more demanding and more expensive materials with more mapping and more dynamic environments
My Minecraft clone is tiny in codebase; that being said, the test scene Bistro is nearly a gig in size when it comes to glTF and FBX files
You wanna know where most of the slowness and bloat came from and where fast lean and bloat free went to? It went to shit like that. Back then only AI and players with a small amount of lighting were dynamic
Well, that's simply assets which doesn't really add to the slowness of network nor "much" to some other costs.
More complex and/or dynamic meshes do add cost to computations though; particularly if you do mesh level (or near mesh level) collisions
And that has nothing to do with networking. You're not streaming geometry between clients, just game state and user input.
IdTech3 had that, as does Source.
Cool feature. not involved in netcode, though.
Actually you could. All of that was pretty standard in IdTech3 and Source games.
Just to keep this in mind: Source was a baby with its first few lines of code during idtech 3's time
Yes, but it's astonishingly efficient compared to modern multiplayer games.
HL had just released
Source is also derived from GoldSrc which is itself derived from IdTech2.
If by idtech 2 you mean quake 1 yes
Netcode, generally speaking, is only concerned with game state and user actions.
There's no reason to stream the geometry of a level in realtime during a multiplayer game. That should be coming from a file.
No I mean IdTech2. The engine.
This one time in particular is flakey
Quake II engine
The Quake II engine is a game engine developed by id Software for use in their 1997 first-person shooter Quake II. It is the successor to the Quake engine. Since its release, the Quake II engine has been licensed for use in several other games. One of the engine's most notable features was out-of-the-box support for hardware-accelerated graphics,...
Quake 1 and Quake 2 both are considered idtech 2, but very different
or at least parts are very different
Goldsrc was forked from the Quake 1 variant
I disagree. They're related but were released separately at separate times. Id considers them separate engines.
not the Quake 2 variant
No id considers DOOM idtech 1
You're welcome to correct the Id folks.
LOL
Obviously not. Quake was IdTech1
Except it wasnt
Modern game state includes geometry changes in destructible environments
If I shoot a hole through an object, or blow up a wall, everyone else sees that change (and generally speaking, the same change)
Doom engine
id Tech 1, also known as the Doom engine, is the game engine that powers the id Software games Doom and Doom II: Hell on Earth. It is also used in Heretic, Hexen: Beyond Heretic, Strife: Quest for the Sigil, Hacx: Twitch 'n Kill, Freedoom, and other games produced by licensees. It was created by John Carmack, with auxiliary functions written by ...
Quake was idtech 2
the exact geometry might not be streamed; but the change that caused it is and is reproducible enough for them to have the same view you get
Oof, got me there.
Okay. Quake was the Quake Engine. Quake II was IdTech2.
LOL thats why I like to consider this period by name
there is no confusion
We didn't need to do that in Source or IdTech3 games. Don't need to do that now in Unreal.
Are you talking about Digital Molecular Matter?
You do need to for it to be consistent on client and server and enemies
And the change that caused it would be a shot like any other.
You don't need to stream the animation of wood pieces splintering and whatnot. Just the player shooting in that direction at that time.
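Concretely, the event you replicate can be tiny. Something like this (a hypothetical layout, not any engine's actual wire format):
```c
#include <stdint.h>

/* Hypothetical destruction event -- replicate the cause, not the geometry.
   Every client runs the same deterministic destruction code on receipt.
   All names and fields here are made up for illustration. */
typedef struct {
    uint32_t tick;                /* server tick the hit landed on       */
    uint16_t shooter_id;          /* which player fired                  */
    uint16_t surface_id;          /* which destructible surface was hit  */
    int16_t  hit_x, hit_y, hit_z; /* impact point, quantized map units   */
    uint8_t  weapon_id;           /* determines hole size / penetration  */
} DestructionEvent;               /* ~15 bytes vs. re-sending a mesh     */
```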
no, I'm talking about things in games like CoD or R6 Siege; where part of the gameplay revolves around reinforcing, destroying, or modifying the environment
which involves a lot more overall state tracking and data sharing than simply player positions
We are talking PARTIALLY DESTRUCTIBLE
I disagree. Can you give an example of a game that streams graphics in this way?
As in I can put a hole inside without tearing it all down
State and player tracking, yes.
Level geometry, no.
What do you mean stream graphics? That's not what we are saying. Also, Minecraft
Yes. That's how destructible maps work.
Now you have to have enough info to recreate the destroyed geo, so there is a bit more information and state needed
1. Minecraft doesn't stream the graphics either, just player actions and game state.
2. I'm talking about this notion of sending graphics over the network. Modern graphics do not add extra burden to netcode unless you're streaming meshes and textures.
and the hole appears the same on all player maps
and was influenced based on many factors; such as character, tool used, position they were facing, where the action that caused the destruction occurred, etc
Yes. That's game state. No different than physics entities.
its a lot more state to share; even if its not sharing the actual geometry
Yes. Sending that information is a LOT more efficient than streaming textures to every client in real time.
No one said we were streaming textures
Yes, I know. My point is the amount of data shared and the requirement that it be deterministic makes a huge difference when comparing a modern game to something from 2005
Also yeah, not everyone wants to do a shitload of bitpacking
sorry
we are simply doing, processing, and sharing more
The argument was that modern graphics require additional bandwidth because visual changes have to be synced across clients, so sending player actions and game state isn't sufficient. I disagree.
I didnt say that and neither did he
No, the argument was that it requires additional state and information to be shared; so it takes more bandwidth overall
there are many more entities at play
I still disagree.
Doing more, sure. Sharing more? Not significantly. Maybe a few kilobytes per second more.
Not enough to justify hundreds of megabytes per second.
No one is sending megabytes per second
Millions more? I doubt it.
Who the hell is sending actual hundreds of megs per second
Xept steam
Modern multiplayer games where you need 100mbps+ to keep up?
No game actually uses that
thats more for latency and tolerance
so they can say you have shit internet if you dont keep up
Aight, so why on earth does Halo 4 need so much more bandwidth than Halo:CE did?
Bandwidth has nothing to do with latency.
Think about it this way; the average map size on a game from 2005 vs a modern FPS game is significant
The average game from 2005 had less than a hundred interactable environmental pieces and generally no more than 4-16 players
Modern games have the same 4-16 players, sometimes more
much larger maps; and almost every environmental feature can now be interacted with or modified
More entities
more state info that needs to be shared
and yes
That's... completely unrelated. What?
less bitpacking
...
Bitpacking is a matter of bandwidth, not latency.
in R6 siege; every single painting and railing piece is now interactable
so you must be able to identify every one of these uniquely, even when simply sharing the state with everyone else
Meant to respond to the top one
you can shoot and destroy these things individually
not the bottom one
Right, so how many thousands of paintings are there?
its thousands of individual objects total; not just paintings
How many bits does it take to uniquely identify an entity? 256? 512?
shorts is the least I would actually do
I haven't played R6.
I mean, I played the Windows 95 one, but not the modern one.
HL:A has an entity count
16834 or whatever
16bit integer limit
signed
assuming you have at least 65k objects; then a ushort
and you have to be able to stream every object that's been interacted with
for each player
so that quickly adds up
not to mention player positioning; weapon changes, texture ids, model ids, etc
their input changes
in vastly larger maps
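Even granting all of that, do the math. Here's a hypothetical per-object delta (made-up layout):
```c
#include <stdint.h>

/* Hypothetical per-object delta -- one entry per object whose state
   changed this tick. Layout is made up for illustration. */
typedef struct {
    uint16_t entity_id;  /* ushort covers 65,535 unique objects       */
    uint8_t  state;      /* intact / damaged / destroyed / reinforced */
    uint8_t  variant;    /* which damage variant to display           */
} EntityDelta;           /* 4 bytes per changed object: even 1,000
                            changes in a single tick is only ~4 KB    */
```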
If you have 65k objects in a 4-player arena shooter you're doing something weird. This isn't Fortnite.
Neither is R6siege or COD
this isn't unusual for modern games where the entire environment is interactable
Or quake
Neither of those are arena shooters.
Quake is actually efficient, except for the new one.
also, the average FPS game has about 10 people per "session"
5 per team
Not my point. WoW has tons of players at once.
and doesn't have destructible environments
It also does not load the entire level at once
WoW is largely static and doesn't sync much more than player data
ents in a radius I guess
You guys are missing the point. You're comparing arena shooters to bigger games with more players.
who made the comparison?
A 4-player FPS with a small arena map does not have 65,000 entities.
we're talking about why modern games use as much bandwidth as they do
and its literally because they have more data to share
You guys? I'm not talking about R6 or COD or WoW or PUBG or Fortnite.
I'm talking about Quake.
No, you're comparing big modern games to tiny old games. Of course they have different needs.
Uhh YOU mentioned WOW and Fortnite
You guys mentioned R6 and COD.
And so are you
the entire discussion was around why modern games (i.e. things like R6 and CoD) use more bandwidth as compared to games from 2005?
I'm comparing Quake III and Quake Champions. What's wrong with that?
At least they are arena shooters...
Pretty sure they aren't, unless R6 changed a LOT in the reboot.
Neither of them are Quake, more importantly.
The new one is a class-based shooter; player state grew quite a bit
at least
Let's keep it apples to apples.
Quake III could manage 4 players with under 1mbps.
Quake Champions needs ~100mbps for the same number of players.
Same game, effectively. Only a few players at a time. One old, one modern.
One made to be efficient by a tiny group of people, one built by a massive studio.
How?
it also uses a completely different engine
But still using networking. Pay attention.
and tracking/sharing more state and data; including costumes, textures, and other customization options
By having shit like abilities, cooldowns, different movement code for each
^
Quake Champions is streaming textures to clients?
Yeah, none of that should be in netcode. That can be calculated client side.
NEVER DO ANY OF THAT CLIENTSIDE
no, its likely sharing the ids for the textures; which is still a lot more than the single set of static textures everyone got in 2005
What, you're telling me your game loads all resources and assets at runtime off a server?
yeah, games don't do this stuff client side unless you are P2P, and then its part of the shared state validation
because otherwise you end up with cheaters being able to activate abilities before cooldown has happened, etc
Wait. I thought the whole argument was that modern games were P2P, no central server computing these things.
You said abilities and cooldown and movement should be done client side
Yes, but you don't need a server to render your freaking scene.
Thats... not how that works
Yes. Are you running flipping PhysX on a server?
no one ever said you did; we're saying that there is literally more state being shared
Most game engines have a client server model even in P2P
And yes...
No, Redhacker said to do that stuff on the server to prevent cheaters.
Because you want to be Server Authorative
you dont trust the client
Exactly, so it doesn't matter if a "cheater" triggers an effect client-side. Only they can see it.
modern P2P systems are structured and there is a shared state management and validation
so unless all players are cheating; once you activate an ability, activating it again before the cooldown occurs won't work for cheaters
Exactly my point.
Abilities, movement, cooldowns are not client-side
because the overall P2P model prevents it
So why bother rendering it on the server-side?
so its not client side still
No, it's client-side.
You dont render any of that
they are state that needs to be shared
so the server can validate and send the data accordingly
...you render everything. It's a game.
no, client side means "happening on my machine"
validation and consistency being asserted by multiple machines in a structured P2P system is not "client side"
Bullshit
Pretty sure it can be client-side and still be validated server-side.
Unless you're calculating all your mesh transforms on a server or something.
Quake does not render AABBS
Okay so you simulate a particle effect but don't render it?
info_player_starts
That's physics, not graphics.
info_player_teleport_dest
etc
Those aren't graphics. They're markers for game logic.
Movement, Abilities, Cooldowns aren't graphics either
yes, actions occur on client-side; but the eventual state and validation that the action was correct/etc happens via the shared system (whether server or p2p)
so if you do 'x' and the rest of the system says "no"; your action is "undone"
the most common form of this is movement, in the form of "rubber banding"
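sketched out, the usual shape of it (generic pattern, all names made up):
```c
#include <stdint.h>

/* Sketch of client-side prediction with authoritative correction
   ("rubber banding"). Generic pattern; all names are made up. */
typedef struct { float x, y, z; } Vec3;
typedef struct { uint32_t seq; Vec3 move; } Input;

static Vec3  predicted_pos;
static Input pending[64];   /* inputs sent but not yet acknowledged */
static int   pending_count;

void on_authoritative_state(uint32_t last_acked_seq, Vec3 server_pos) {
    /* The authority disagreed: snap to its position... */
    predicted_pos = server_pos;
    /* ...then replay every input it hasn't processed yet, so the
       correction is invisible unless the client was way off (or lying). */
    for (int i = 0; i < pending_count; i++) {
        if (pending[i].seq > last_acked_seq) {
            predicted_pos.x += pending[i].move.x;
            predicted_pos.y += pending[i].move.y;
            predicted_pos.z += pending[i].move.z;
        }
    }
}
```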
Yes, they are. How are players aware of them if there's no visual indication?
The visualization does not matter
but it also applies to cheaters and stuff, in the form of saying "I activated an ability" when a "cooldown" was still pending
it could be console
Right. THat's the old-school way back when 10mbps was plenty.
on the side of the screen
Yes.
And its still used
even in UE4
It's sort of important. Games are a visual medium.
and S2
We are talking about network traffic remember
Right. It's a good system.
This notion of streaming graphics and such is malarkey though.
the old school shared a lot less data overall; because the environment, game world, number of interactable entities, and customizable options (models, textures, # of available weapons, etc) differed
it was simply less stuff because everything was simpler
No one said we are streaming graphics
I wasn't the one who claimed better graphics increase bandwidth consumption.
stop pulling out of your ass
You literally said the netcode had to send textures in Quake Champions.
No
I did not
not even the other guy did
He said they had to be referenced
it was said that there are more textures and customization options
and therefore you need to send IDs indicating which option is in use
Texture IDs work very well
Please keep your argument straight.
that's not saying textures have to be streamed
that's saying data about the textures in use has to be streamed
I talked about movement code, abilities and cooldowns
Yes. and how many megabytes do those require?
In the old days we could do it in 4 and it'd be overkill.
well many games use GUID
so 64 bits?
maybe 128
I dont remember
which means, if you have 4 possible textures; you need at least 2 bits to determine which one is in use
but modern games have a lot more than 4 skins per thing; generally speaking 😄
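i.e. you need ceil(log2(option count)) bits; a quick sketch:
```c
#include <stdint.h>

/* Bits needed to index N options: ceil(log2(N)).
   4 textures -> 2 bits; 1,024 skins -> 10 bits; a million -> 20 bits. */
static uint8_t bits_for(uint32_t n_options) {
    uint8_t bits = 0;
    while (bits < 32 && (1u << bits) < n_options) bits++;
    return bits;
}
```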
Also again no game is sending multiple megs a second in data consistently
Okay, so a whole 128-bit GUID because apparently there's more textures than there are atoms in the universe.
Still that's 16 bytes added to each player's state. 4 players, that's 64 bytes added.
A tiny part of that 100mbps.
except maybe MC
Minecraft's not even that bad. The modern chunk format is wonderfully efficient.
No game sends that much
they might recommend it to ensure you aint using dial up
I mean, it's a lot but not that bad. I can play Minecraft well on 40mbps down, and even host at 1mbps up.
Have you tried implementing a Minecraft client? The netcode is a bit messy but extremely efficient.
It's the rubber-banding implementation that's terrible.
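The trick is palettes. Roughly this idea (a simplified sketch of the concept, not the exact wire format):
```c
#include <stdint.h>

/* Simplified sketch of palettized chunk storage (the idea behind
   Minecraft's modern section format, not its exact wire layout).
   Each 16x16x16 section lists the distinct block states it contains,
   then stores 4096 small indices into that list. A section with 16
   distinct states needs only 4 bits per block:
   4096 * 4 / 8 = 2048 bytes, instead of 4096 full block-state IDs. */
typedef struct {
    uint16_t palette[16];    /* distinct block-state IDs (size varies)  */
    uint8_t  palette_len;
    uint8_t  bits_per_index; /* ~ceil(log2(palette_len)), floored at 4  */
    uint64_t packed[256];    /* 4096 indices * 4 bits / 64 bits each    */
} ChunkSection;
```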
now expand that out to be state about every unique/interactable entity in the game
which often includes every floor, wall, or ceiling
sometimes with modifications on them
what does that have to do with any of what I said other than except maybe MC
plus the overhead of the packets themselves
Listen, I can play Quake III flawlessly with my Internet.
Quake Champion I lag everywhere.
Halo 4 I lag everywhere.
Halo:CE is fine.
identifying which packet is which and what order they came in
timestamps
You said no game sends as much as MC. I will insist that MC sends a LOT less than Quake Champions.
it quickly adds up; especially when playing at 15-30fps
and no, not every game is the most efficient here
I said this
Multi-megabit timestamps?
Also, this was a concern for IdTech3 and was dealt with in only a few bytes.
but its very easy to see how a game from 2005 is "more efficient" than 2020
because its literally doing less
If you're playing at 30fps you have other problems.
I'm talking about the general minimum frequency networking packets get sent at; considering typical ping is 30-200ms
Except when it's not.
Explain AoE II.
Exact same game, just a new renderer and new netcode. Suddenly laggy and unstable with terrible rubber-banding unless you have exceptional Internet.
Look, there probably is less efficiency, but in order to be that bad they would have to be sending the packet 10-20 times over
sorry I dont want to pack all my 10 0-4 state data into like 3 bytes
and make it a fuckin mess to decrypt
If you can show me a 4-player arena shooter a la Quake where the game state is more than 10MB for a good reason, I'll concede.
Otherwise, I cannot fathom any small game needing that much data.
It's not decryption. It's just pointers. If you know what you're doing it's one line of code.
the fuck? data alignment is done in bytes
you can only go as small as a byte
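For the record, ten values in 0-4 really do fit in 3 bytes, and there's nothing to "decrypt". A sketch:
```c
#include <stdint.h>

/* Ten values in 0-4 fit in 24 bits: 5^10 = 9,765,625 < 2^24 = 16,777,216. */
static uint32_t pack10(const uint8_t v[10]) {
    uint32_t packed = 0;
    for (int i = 9; i >= 0; i--)
        packed = packed * 5 + v[i];   /* base-5 digits */
    return packed;                    /* fits in 3 bytes */
}

static void unpack10(uint32_t packed, uint8_t v[10]) {
    for (int i = 0; i < 10; i++) {
        v[i] = (uint8_t)(packed % 5); /* digits come back in order */
        packed /= 5;
    }
}
```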
games from 1990-2005 also took every shortcut they could get and we+them are paying for it later
see also "unix tried to use 32-bit timestamps and those expire in 2038; so a breaking change to support 64-bit timestamps was required"
64-bit, even at 1ns, is good enough for any foreseeable future though (584.94 years); the actual datetime stamps are likely much less precise; but still likely 64-bits
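quick sanity check on that figure, using 365-day years:
```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* Range of a 64-bit counter at 1 ns resolution, in 365-day years. */
    double ns_per_year = 365.0 * 86400.0 * 1e9;
    printf("%.2f years\n", (double)UINT64_MAX / ns_per_year); /* ~584.94 */
    return 0;
}
```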
to get the values out I have to pull the data out of those bytes
Bruv, my day job is embedded systems development. I work with 1024 bytes of RAM, sometimes less, and data communication needs to fit in under 9600 baud.
Data structure optimization isn't hard. A few weeks ago I wrote a function in C89 to decode USB packets into a specific structure. One line of code, just casting an array with an offset to a struct type.
the most important thing to optimize for is developer time; at least until something else shows to be a problem under a hot spot profiler
Bitshift and bitwise operations. Not complicated stuff.
Its unreadable... especially to a human
Ever used a union?
I know
It's a struct. It's exceptionally readable. One sec.
dear god
would you look at that
that's also incredibly specialized environments that do basically nothing except for a given thing
they aren't running 50-100 (or more) services in the background to support a modern environment
They also arent generalized networking solutions
like UE4
and would never scale to something like say Discord even; no matter how optimized it gets 😄
Unity
I'm not the best at everything; but I'm decent enough at lowlevel optimization
enough so that I own and manage the numerics and hardware intrinsics code in .NET
I do understand things here
And by the magic of modern time-sharing OSes, this is true of games too! You don't need to combine other process' data into your packets.
Behold, the "unreadable... especially to a human" struct.
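Reconstructing from memory here, so exact field types and order are a guess:
```c
#include <stddef.h>
#include <stdint.h>

/* Reconstruction from memory -- field types and order are assumptions;
   only usbAlength and usbWaddress are the names actually in the code. */
typedef struct {
    uint8_t  usbRequestType;  /* hypothetical */
    uint8_t  usbAlength;      /* length of the payload that follows */
    uint32_t usbWaddress;     /* target address word                */
    uint8_t  usbData[8];      /* hypothetical payload               */
    uint16_t usbCrc;          /* hypothetical checksum              */
} UsbPacket;

/* The "one line" decode: overlay the struct on the raw buffer.
   Relies on byte alignment, which holds on this 8-bit target. */
static const UsbPacket *decode(const uint8_t *buf, size_t offset) {
    return (const UsbPacket *)(buf + offset);
}
```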
that's not optimally bitpacked either 😉
What does this mean!? I don't know!! Let's replace it with 10MBs of human-readable markup.
You're talking about using specialized netcode, yet you fail to realize many games just use the netcode library they find on GitHub
you've got at least 2 bytes padding between usbAlength and usbWaddress
Heck, doesn't even need to be. It replaced a contractor's code that was 1088 bytes long for every packet. T_T
I wrote the bloody driver for it so of course it was my fault it was so slow.
What? No.
This code is for an 8-bit CPU.
Struct packing doesn't apply here.
EXACTLY MY POINT.
Studios with 1000+ programmers should be able to spare one to work on netcode.
Bethesda can't manage netcode as efficient as Id produced in the 90s.
that generally doesn't matter; the natural alignment of a uint32_t is 4 bytes, not 1 byte
so unless you're using a custom compiler or explicitly have a pack directive somewhere, you have padding there
And Bethesda makes huge dynamic open world games
and outsourced netcode development to another studio
your point?
Sorry a world with 4000+ entities around is not as small as an engine that had 1000 at most on even the most populated maps
An internal build of avr-gcc. I don't know what black magic the compiler team puts in there, and I'm afraid to look. I eat lunch with one of the compiler devs. 2 MS's from MIT and almost never says anything. Nice guy, though.
Anyhow, actually iterating over the bytes shows there's no padding. We validated the whole thing closely.
I used to teach university classes on embedded C. This is my area of expertise.
Quake Champions is a huge dynamic open world game?
Shoot, there's my whole problem. Here I thought it was an arena shooter with a small number of players per session.
So you have a highly customized scenario designed for a very specific piece of hardware and environment
not one that runs on any modern device, including x86, x64, ARM32, ARM64, PowerPC, etc
Didn't know Quake Champions had 4000+ entities either. Crazy. Can you show me which map has all of that?
so you can optimize around xplat considerations enough so that a custom compiler was built
If I compiled this for x86 it'd waste 4 bytes. I could fix the packing if needed.
Are you suggesting that compiling for x86 makes it reasonable for KBs of data to balloon to MB?
I mean, it's literally the manufacturer's officially supported compiler.
That's like calling MSVC a custom compiler.
I was talking about the game made by bethesda, not id, there is a distinction
id works with a level of autonomy
Id is owned by Bethesda and made Quake Champions.
No, Saber Interactive and id make Quake Champions
I'm saying that it adds up in modern games where everything needs to be considered and with all the features and functionality provided
its simply not comparable to a game from 2005
and while some games are likely terribly inefficient; and perhaps Quake Champions is one of them; its often not the case
Okay. Fine.
Id in 2020 can't produce netcode as efficient as Id in 1995. Better?
its really not; one is the de-facto compiler for the operating system and software that runs on some 90% of desktop computers in the world
A few bytes per entity doesn't add up to billions.
Does your game's netcode take MBs per player?
Yeah but it's a custom compiler made by Microsoft.
Show me a game using MB of data for netcode specifically
Quake Champions and Halo 4, in my specific experience.
I will sit here waiting for you to actually show me it happening...
no; its one of the 3 standard and most used C/C++ compilers
not really "custom" in that it wasn't designed for a specific scenario and isn't only used for that scenario
I dont wanna hear about it, I wanna SEE it
How? An OBS recording of my lagging out on a 40mbps connection?
And avr-gcc is literally the only compiler for AVR. I don't see the problem.
most networking is in terms of Mb, not MB
so if you have 40mbps, that's 5 megabytes per second; or only 5 million bytes
Yes but that's 60+ game state updates per second.
Not my problem, I dont believe netcode itself is using that, the only reason I could think of them doing it like that would be for encryption and I doubt that is the case
sure, that's 85 thousand bytes per frame
when accounting for heartbeat, state sharing, and receiving/sending data to up to 16-32 people, that adds up quickly
and that's only if saturating a 40mbps stream
Most strong encryption cyphers don't add more than a few dozen bytes, unless they're doing something weird.
even at just 4 people, in an unstructured P2P system that's 20kb per person; so it doesn't seem unfeasible to hit that
you could probably go smaller; but how much less maintainable is that over time; particularly if you have to account for new features in the game
and again, maybe Quake Champions is a case of some really bad code here; but modern games in general do a lot and its completely conceivable that such limits are hit
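spelling that arithmetic out (assuming the link actually gets saturated):
```c
/* Back-of-the-envelope for a fully saturated 40 mbps link at 60 ticks/s:
     40,000,000 bits/s / 8          = 5,000,000 bytes/s
     5,000,000 / 60 ticks           ~= 83,333 bytes per tick
     83,333 / 4 peers (full mesh)   ~= 20,833 bytes per peer per tick
   i.e. ~20 KB per peer per tick is the ceiling the link allows. */
```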
Salting can affect that
now the point is
and security/protection in general against mitm attacks (or other vulns)
Yes, you could feasibly hit that limit nowadays; sorry your internet does not keep up with the modern expectation of feasibility
You don't need to account for new features unless you're accepting clients being different versions.
And, again, a compact data format isn't unmaintainable. If you don't know how to do memory management, sure, but if you really don't want to deal with that yourself you can at least use something like protobuf or flatbuffers. Fast, efficient, the format is defined semantically. I use FlatBuffers in a C# blockchain implementation and the overhead is virtually nil, and the structure is literally just a class with a decorator.
Don't really need your own encryption for that. Normal SSL has that covered.
If you're really paranoid and don't want the ISP or government to tamper with your packets in realtime, then a rolling stream cipher is orders of magnitude more efficient and virtually impossible to crack in realtime unless the original key exchange was compromised too. But honestly, that's new levels of paranoia.
I'm almost used to it.
It's annoying because the flipping Republicans said that making ISPs a competitive free market would somehow be Communism, but legally mandating a monopoly so that citizens in my state have exactly one ISP to choose from is Capitalism.
I've been protesting about Net Neutrality for too long. It's on my (long) list of reasons why I'm looking at immigrating elsewhere in a few years when I no longer have family attachments.
But that's politics.
IDK