Thank you for actually getting in contact with the FR guys and telling the entire history behind the creation of .kkrieger. Quality work! More demoscene-related videos would be awesome. :)
Hey, one of the Kkrieger devs here... There's a bunch of stuff that comes up a lot in the comments, so let's address it :)

"Where is this technology nowadays?" It exists. It's called Substance Designer (made by different people than us, and now owned by Adobe), and it's pretty much the same thing we did back then: artist-driven procedural texture generation, used everywhere in games, animation and VFX. But usually it's used to generate the textures offline, which are then stored as normal images in the game data along with everything else. Which brings us to the next question:

"Why aren't games nowadays using it?" Several reasons. The biggest one: it's a giant load of hard, unintuitive work. You can, for example, easily make a concrete texture with it, but it takes hours and hours of painstakingly tweaking numbers until you arrive at anything worthwhile. You know what's way faster? Going outside, taking a picture of a concrete wall, loading that into Photoshop and tweaking it a bit. And that's not lazy, by the way - game devs are already working overtime for months and years, and y'all probably wouldn't want games to take three times as long to develop just to save some HD space. And there's also a technical reason: this tech (ours, Substance, doesn't really matter) is way, WAY slower than just loading an asset from disk. You'd pay for the saved space with ridiculous loading times, and you also can't stream stuff in while the game is running, because generating the textures takes a lot of GPU and would seriously impact frame rate. Imagine the micro stutters everyone's complaining about, just ten times worse.

"You're just using the DirectX libraries, that's cheating" Oh great, THAT "point" again. Er, no. As in yes, we're using DirectX (as in Direct3D, DirectSound and DirectInput), but that's how you get your GPU to draw polygons and run shaders, your sound card to output sound, and your input from the keyboard and mouse. That's it. DirectX isn't an "engine"; all the meshes, textures, lighting, shadows, effects, music, SFX, etc. come 100% from our code, and the only asset that comes out of Windows is the font used for the menu and HUD (Arial with a bunch of effects on top :) ). If this had still been the DOS days, where you had to program the hardware directly, Kkrieger perhaps wouldn't be 96k, but it still wouldn't be significantly bigger, and it would only run on one specific model of GPU and sound card. So, no. There's definitely a full engine in there that works similarly to what Doom 3 did back in the day (direct per-pixel Phong lighting, stencil shadows, etc.), just slower, because optimizing for performance would mean way more code.

"That sound clearly isn't MIDI" Yes and no. We're not talking about General MIDI as in "let's play a .mid file", but MIDI the protocol, which sends musical notes to a synthesizer. The actual audio comes out of a realtime software synthesizer (you might call it "virtual analog") that's part of the Kkrieger code, and MIDI is only used to store the musical score. I chose MIDI back then because it let us use a normal audio tool (a DAW) for creating the audio, with the synthesizer running as a plugin. Export a .mid file from it, add the synth code and the sound bank, enjoy the music and sound effects.

"Where are those guys now?" We exist, living our lives, all outside actual game development nowadays (because we wanted to have lives), but still quietly working away on various things that you might or might not have seen. Some of us are still active in the demoscene, and we're all fine, thank you :)
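The operator-stack approach the dev describes (the idea behind .werkkzeug and Substance Designer alike) can be sketched in a few lines: a texture is just a short recipe of operators applied in sequence. This is a toy illustration with invented function names, nowhere near the real generator:

```python
# Minimal sketch of operator-based procedural texture generation.
# Illustrative only; the actual .kkrieger / Substance generators are
# far more sophisticated. All names here are invented for this sketch.
import random

SIZE = 64  # texture edge length in pixels

def noise(seed):
    """Fill a SIZE x SIZE grayscale buffer with pseudo-random values."""
    rng = random.Random(seed)
    return [[rng.random() for _ in range(SIZE)] for _ in range(SIZE)]

def blur(tex):
    """3x3 box blur with wrap-around edges (textures usually tile)."""
    out = [[0.0] * SIZE for _ in range(SIZE)]
    for y in range(SIZE):
        for x in range(SIZE):
            s = sum(tex[(y + dy) % SIZE][(x + dx) % SIZE]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = s / 9.0
    return out

def contrast(tex, k):
    """Push values toward 0 or 1 around the midpoint, clamped to [0, 1]."""
    return [[min(1.0, max(0.0, 0.5 + (v - 0.5) * k)) for v in row]
            for row in tex]

# The whole "asset" is just this short recipe; the pixels are derived.
texture = contrast(blur(noise(seed=42)), k=3.0)
```

The point of the comment holds in miniature here: the stored data is three function calls and two constants, and the artist's "hard, unintuitive work" is picking those operators and numbers until the result looks like concrete.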
Being able to hear first-hand experiences makes me wonder what side project someone else is working on right now that could open the door to new opportunities.
A legend here, by himself. I am very impressed with the work you've done, and I hope you are all doing well nowadays. Do you folks from the team still keep in touch with each other today?
Draw hundreds of unique textures and your game will fit on a CD. Write a game that can draw its own textures and it will fit on a floppy. Honestly kinda brilliant.
True enough, of course as is the case with anything procedural (at least currently), after a while you notice the limits of the algorithm, the similarities in elements etc.
It definitely was and still is impressive to be able to create something like this, but ultimately, I think, it's useless, especially for a video game - a handcrafted experience just wipes the floor with anything a computer will try to spit out with current technology.
@@yellowblanka6058 I remember some game where they were at least talking about using modern large texture maps and procedural generation just to break the texture size limits. By using a number of, say, grass textures and bump maps, they could generate procedural grass textures to cover huge areas without having to repeat them anywhere in sight of the player. Same with wall textures and just about any other kind of texture. This was back when a large texture was 256 x 256 pixels. Repeating patterns were a common theme on walls, roads and so on. I still find myself looking for them occasionally. The last one I remember actually noticing was in Shadow of the Tomb Raider, where I saw some textures that kept repeating. Using procedural generation this could have been avoided, at the cost of processing power. With ever-increasing texture sizes and memory on graphics cards, the necessity for procedurally generated textures is smaller than it used to be. Also, just adding more textures is low-hanging fruit when it comes to increasing graphical fidelity. It's easy to do compared to inventing new graphical algorithms or techniques, and all it requires is that users replace their graphics cards with new hardware that has more memory - a cost which doesn't fall on the game companies but on the customers. And even if a lot of people don't really upgrade this generation, they will be checking out the reviews, and the reviewers will be using the large texture packs and thus experience the highest graphical fidelity, and that is what they will review. Texture resolution can always be made larger, but the fidelity gain decreases the larger they get. Currently I think the limit is 4K, which means you could blow up a texture to fill a 4K display at 1-to-1 pixel mapping. The only reason to make them larger would be to match an 8K display or to make it less obvious that a texture is repeating. The latter can be minimized by using several textures for the same material and varying them.
Or you can implement procedural generation of texture maps. Guess which is easier, faster and cheaper to implement?
Actually, although the music *score* is MIDI, it does not just send MIDI commands to the sound card. There's an entire softsynth in the Farbrausch engine, so basically the sound is pretty much like the visuals: generated procedurally via algorithms.
@@noor-rx1ij Yea, and like .werkkzeug, it was released to the public as a free tool. This is called the V2 Synthesizer System, by Farbrausch, and can be used as a VSTi in most DAWs.
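The score-plus-synth split described in this thread can be illustrated with a toy sketch: the stored data is just a handful of note events (what MIDI carries), and the audible samples are synthesized from them at load time. This is invented illustrative code, not how V2 actually works:

```python
# Toy "score + softsynth" sketch. A few bytes of note events expand
# into tens of thousands of PCM samples. Function names are invented;
# the real V2 synth is a full virtual-analog synthesizer.
import math

SAMPLE_RATE = 44100

def midi_to_freq(note):
    """Standard MIDI note number to frequency (A4 = note 69 = 440 Hz)."""
    return 440.0 * 2.0 ** ((note - 69) / 12.0)

def render(score, length_sec):
    """Mix a simple decaying sine 'voice' per note event into one buffer."""
    buf = [0.0] * int(SAMPLE_RATE * length_sec)
    for note, start, dur in score:
        freq = midi_to_freq(note)
        begin = int(start * SAMPLE_RATE)
        for i in range(int(dur * SAMPLE_RATE)):
            t = i / SAMPLE_RATE
            env = max(0.0, 1.0 - t / dur)  # linear decay envelope
            buf[begin + i] += 0.2 * env * math.sin(2 * math.pi * freq * t)
    return buf

# The entire stored "music asset" is this list of (note, start, duration).
score = [(60, 0.0, 0.5), (64, 0.5, 0.5), (67, 1.0, 0.5)]  # C, E, G
audio = render(score, length_sec=1.5)
```

Swap the sine for a bank of oscillators, filters, and effects and you get something closer in spirit to the real thing; the size win is the same either way, because only the score and the synth code ship on disk.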
Finally somebody realized it too... what morons these developers are. I can still find a calculator app on the Play Store from 2012. Download size? Eh, just about 900 KB; 1.02 MB installed.
yeah, the damn issue today is that ego + cred means developers want to SAY "we have the biggest game ever", like Gears of War and similar games I played lately that were 100+ GB downloads, which is ridiculous. It's laziness and carelessness. Basically they make you redownload the whole game every time they make a small patch, just to be "impressive". The reason the game is so big is that they use UNCOMPRESSED VIDEOS to make you believe it's the console rendering those graphics (and it works on non-technical people; people do believe that). So every patch, you just redownload the game's videos for no reason. It's laziness + marketing.
@@joelalain also, skins for weapons and characters are just sitting there uncompressed, along with DLC sitting there behind a paywall - just wasted space (why do you think you can play the DLC you bought without downloading it?)
The silly thing is that a lot of modern games use procedural textures, but they're all baked and sampled to huge resolutions so they have lots of detail. If they could get the procedural texture generator that the texture painting software uses into the game, they could create the textures in real time at the resolution they needed. They'd be able to get the game really small, but then they'd need a lot more horsepower, RAM, and load time.
@@AltimaNEO You wouldn't even affect the performance or requirements in any way if you generate the assets locally and keep them stored (yes, this isn't much different from just downloading the assets, but it allows for much greater customization and optimization). Only the first launch would take extra time. This is commonly done with shaders and such, where they're computed and then just loaded from the cache (until the game or graphics driver is updated), cause generating them takes a bit of time, but there's no need to do it for each launch from scratch. Textures could similarly be generated in the desired resolution depending on the machine's specs and user preferences. If you later get a better GPU, you could just regenerate the textures in a higher resolution whenever.
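The generate-once-then-cache idea described above is the same pattern shader caches use, and might look something like this sketch (the cache path, key scheme, and the `generate()` body are all invented for illustration):

```python
# Sketch of "generate on first launch, then load from cache": the
# expensive procedural generator runs only when the cache misses,
# keyed by a hash of its parameters. All names/paths are invented.
import hashlib
import os
import pickle
import tempfile

CACHE_DIR = os.path.join(tempfile.gettempdir(), "texture_cache")

def generate(params):
    """Stand-in for an expensive procedural texture generator."""
    size = params["size"]
    return [[(x * y * params["seed"]) % 256 for x in range(size)]
            for y in range(size)]

def get_texture(params):
    """Return a cached result if present, otherwise generate and store it."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    key = hashlib.sha256(repr(sorted(params.items())).encode()).hexdigest()
    path = os.path.join(CACHE_DIR, key + ".pkl")
    if os.path.exists(path):            # cache hit: cheap disk load
        with open(path, "rb") as f:
            return pickle.load(f)
    tex = generate(params)              # cache miss: slow generation path
    with open(path, "wb") as f:
        pickle.dump(tex, f)
    return tex

tex = get_texture({"size": 32, "seed": 7})
```

Because the parameters are part of the key, regenerating at a higher resolution after a GPU upgrade (as the comment suggests) is just a call with different `params`, which misses the cache and fills a new entry.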
@@AltimaNEO This procedural texture generator, in most cases as far as games are concerned, is called Substance (nowadays from Adobe, sadly), and you can easily put that into your game, but most developers opt against it for the reasons you described.
You can blame 4k textures for that as they can easily make for >80% of the file size. Most people can't even run 4k textures but the developers won't give you the option to not download them because most people are sheep and they think file size = value. For consoles there's even an incentive to make them so large because if you can't fit another game in your console then you'll have to play their game for longer.
2:28 I was there in person when they ran this demo. The entire place was stunned. Everything before it suddenly felt obsolete. I remember it just going on and on and on with these complex 3D scenes that even had animation in them. And a fantastic soundtrack to boot. With every new scene it felt like I was watching real magic happening right before my eyes. I am SO glad I was there to experience it.
thanks for sharing your memories! Is the demoscene still alive? Although it seems like current gamedev doesn't care about technical excellence anymore, it has a reincarnation on mobile VR devices, where performance is everything and even the most advanced games look like they're straight out of the 2000s.
So the reason it's so small, is that it's not a game at all - it's the instructions to create the game. A dream of a game, plugged into reality. Amazing.
it took advantage of preinstalled softwares from the computer's chipset. current Intel chipset have prebuilt 3d software but back then as I remember GPUs aren't really a thing when you have a CPU that is also a good GPU.
@@user-ti6ix5tn2o you remember weirdly and incorrectly. I think either your English or historical computer literacy (or both) is lacking. "current Intel chipset have prebuilt 3d software " - what? No. Just no. I think you're talking about processors with iGPUs. But he wasn't talking about integrated. He literally said in the video that it requires at least a GeForce 4. Which in 2004, was about 2 years old at the time. The GeForce 6 series came out in 2004. I personally had a GeForce FX 5600 (and another machine with a Radeon 9600) at the time. "CPU that is also a good GPU" - What? APUs weren't a thing back then. It was the era of the Athlon XP and Pentium 4, neither of which had integrated graphics on the die. Maybe on the motherboard, but if you think APUs are weak, you don't remember motherboard graphics (shudder). Onboard graphics chips were not even close to current; utilizing previous generation technology as opposed to AIBs, sharing system memory and just generally sucking. No, GPUs were actually more of a thing back then than now. Nowadays, you can game with integrated graphics. Back then, not as much. At least, in my world. I always built systems with SOME kind of video card. Even if it was low-end (like a GF4MX) it was miles better than onboard graphics, which some motherboards lacked anyway. I built more systems for myself and wife during the naughties (like half a dozen or more over the span of the decade, several coexisting) than any other time in my life, so I'm pretty familiar with some of the hardware from that era. All AMD (XP/64/Opteron) chips, no P4s. I did have a P3 celeron 1200 though.
@@SeeJayPlayGames "I personally had a bla bla bla" As opposed to you figuratively having it? That's redundant. I think your English literacy is lacking. Also you just attempted to clown someone who might not have English as a first language with a list of computer shit you have owned. Wtf is that?! Grow up, man.
Hmm. I think he meant more like in software. You could run many games in CPU (software) mode OR 3D mode, and all the details changed a lot between them. When you used CPU rendering, only the monitor's resolution and refresh rate changed how much raw power it needed. I don't know why, but I think that nerd knows pretty well what he is talking about. And I know I'm not on the same level.
I remember one of the local magazines writing an article about "a game which is smaller than its screenshot" - must have been this game, since I don't remember the details...
@@SockyNoob definitely huge in Sweden and Finland, where most of the major game developers have roots in the demoscene (DICE, Remedy, Avalanche, Starbreeze and Massive Entertainment to name a few). Got the impression that many New Zealand and Australian developers have scene roots as well, with NZ in particular having a really big scene.
I'm always amused by how these small teams are like the geek version of a band. Everyone has a specialization. And once in a while, the right combination of personalities, expertise, and tastes comes together in their spare time in a garage somewhere to make something amazing.
I heard something interesting from an old guy who lived and worked in the Soviet Union. He flew remote controlled planes and manufactured most of the parts himself, and when I asked him how he learned and got the tools (molds) to do that, he replied: when you had an innovative hobby, the state-run factory encouraged you to develop it and provided the tools you needed. It's impressive, because nowadays it's like a crime to do anything besides your work in the eyes of your employer, which is stupid because it makes work seem like a chore, thus reducing productivity. And I believe that's why you almost never see original ideas coming out of giant corporations. Instead, they buy them ready-made from garage people.
@@perseusarkouda Apple, Ford, Tesla, AMD, etc... None of them produced any original ideas, for sure. You know, the fact that a company isn't listening to engineering advice from some random warehouse guy does not mean that there are no internal and external competitions/events in giant corporations to support creativity. Corporations might be capitalist evil bloodsucking satan reincarnates, but they still understand the value of innovation.
@@danielduncan6806 If he's 40 - which is likely, since most folks go to uni at 18 - 24, then you could very well argue that he is at his half way mark.
Wow, so for the textures they literally created a miniature version of Substance Designer working in real time *IN 2004*. That is incredibly impressive
To be fair that kind of procedural texture generation was very common in the demoscene by at least late 1990s. Farbrausch were not the first, many groups did it before, famously Aardbei released their Texture Gevaarte II in 1999 and published articles about how all of that works (referencing The Black Lotus 64k intros from as far back as 1997, so who knows how deep that really goes). Texture/mesh generation is also usually not in realtime, there's a precalc at launch. I don't know the history behind Substance Designer, but I do know that there are active demosceners working on it. There are other major products that grew from demoscene, e.g. Notch was born from a demotool.
@@provod_ ah yes, real time definitely was not the right word choice. Not even substance works in 'real time', you still have to wait for it to process. Nevertheless, I was still very impressed with this project. I'll have to look into some of these other demoscene works!
I'm not a native speaker, but I found myself wondering "Doesn't that mean 'tool' in English?". Guess they must have been a bit stumped for names, called it by what it did, and never got around to coming up with a better name.
It's kinda impressive, but at the same time kinda not. The whole time I thought this was a very impressive video game from 1994, sort of in line with the likes of Quake and other early 3D games, which were usually on PC and installed from floppy disks. So to hear that this was done by a talented handful of enthusiasts really impressed me. But then I heard 2004. And my mind went rushing to that era. We had games like Doom 3, Half-Life 2, Halo 2, GTA SA, NFS Underground 2, MGS3, Fable, and KotOR. These games had such great visuals, sound, design, variety, script, etc. that they leave a lasting impression on you. I don't think the same could be said about kkrieger, but that's just my opinion.
@@ekinteko Everyone's a critic. It's impressive - I couldn't have done it. Optimisation at its best, possibly; it is, at the end of the day, a demo created by a small, talented team, and a very good one. Not a massive AAA title.
@@Lainyyyyy . That's my point. Fitting on a floppy disk would've been useful, heck crucial, back during the early-90s era. But this was 2004, when the computing scene (and technology in general) had evolved way past that. Just note, games made the leap from 1.4MB floppy disks, to 64MB cartridges, to 700MB CDs, then 4.7GB DVDs, to 25GB BluRays, 32GB microSD cards, and then to 128GB SSDs. Not to mention the elephant in the room, the HDDs of those days, ranging from mostly 128GB to above 512GB in the market. During this era, most people (95%+) were way past Windows 95 and floppy disks. Most PCs were in the Windows XP and 128GB-HDD landscape. It soon evolved into Windows 7, 128GB SSDs, and 1TB HDDs. Meanwhile on the home console front, people were upgrading their DVD-based PS2 and Xbox to HDD/BluRay-based gaming on the Xbox 360 and PS3. It's akin to developing new carburetor technology today, when automobiles have evolved past it to turbo-diesels, efficient unleadeds, hybrids, and lastly EVs.
@@Nostalgianerd That is exactly how I felt reading this. Those poor guys who probably also worked their asses off, finishing a minute before the deadline etc... to then compete with this fucking Nobel-prize 5D-warp-drive of software.
@@Nostalgianerd No need to feel sad! There were 3 others - T$ made a funny top-down game about running over moles with a lawnmower (I've sat next to 'em at a couple of parties in Germany - great times!), r1911 made a basic game where you dodge cubes (I had huge hopes for some slamming acid music and was quite disappointed at the time), and "John Trapolka Memorial Krew" and friends made B-Clopd 3D - a humorous production which is actually quite fun to play and had some decent enough 3D code behind it. You gotta remember these are "games" categories at a "demoparty" - so while a good game will be appreciated, most peeps (esp. 20 years ago) were there for the demos. It's quite a miracle that this got made at all, to some degree - but I guess if you were testing an engine for its gaming capability and needed a deadline, then Breakpoint is the party to release it at. As you say, it was on the edge of the 64-bit and fast internet explosion.
I remember back in 2009 or so, I was at a LAN party called "the sleepless lan", and breakfast cost $20 extra on top of the ticket price. A friend of mine wanted to just pay, but I said hold on... We went to a local store and bought some bread, then placed it on top of my GTX 275 before launching an "8kb demo" file, cranking temperatures inside up to 85 Celsius. It took about 15 minutes before we had crispy toasted bread to melt cheese on top of. Good times. A lot of nerds arrived and commented on how I was going to damage my top-of-the-range graphics card and that we were mad etc, but everything went well and the bread was delicious.
A lot of people don't realize how resilient electronics really are. You can literally do ANYTHING to them... as long as you do not break a circuit on the board, pull components off, or "bridge" two circuit paths together, it will work. That sounds like a lot of "ifs" that can screw it up, but really, if you don't lay any metal on top of it while it's running or scratch the hell out of it, it works. Remember the "static wristband" craze? Whoever "invented" that useless crap made TONS of $$ and contributed to the mass illusion that electronics are these super fragile things that blow up if you even TOUCH them. A static spark from your finger - even after dragging your 100% wool-covered feet across the deepest shag carpet you can find on the lowest humidity ever - will not FRY something that was DESIGNED to carry and manipulate electrical currents. Lightning bolts are literally the same thing as the spark that shoots from your finger, and yeah, THAT can fry something, but it will also split a tree down the middle. I mean, the water you wash your hands with will also cut through steel. I remember back in the day when the Atari was the console and when the original Nintendo came out. Sometimes the game cartridges would not work, because we kids did EVERYTHING you can think of to them... they were spilled on, sat on, stepped on, kicked around, and may even have served as a makeshift hockey puck at one time... and if you are old enough to remember those things, you also remember the "universal fix" that worked damn near every time: take the deepest breath you can, then blow it out faster than you ever exhaled before while shaking your head back and forth, directing the hurricane-level winds you produce at the connector end of the cartridge... throw it back in the console and fire up your game.
Those things were essentially the same thing as a PCIe card in your system now, just enclosed in a hard plastic box with the gold fingers exposed at the bottom. And somehow "electronics" went from indestructible to breaking if you even LOOK at them... I've got a "junk box" of parts, full of cards and stuff just thrown in without static bags or anything. I guarantee I can dig down to the bottom of it and pull a card out and it will work just fine... unless my system no longer has a compatible slot or I cannot find a driver that will work with it... but the "hardware" will work just fine. If I had seen you toasting bread off your GPU... I'd be like, can you toast me one too?? :)
I must admit, having a screenshot from a game contain more data than the game itself is kind of poetic. It's like an abstraction of the holographic principle.
To be fair, it's only larger if you take an uncompressed screenshot (such as BMP format). A JPG, which is compressed, is smaller than 96KB. Still, I remember testing that very thing out way back in the day, and it's STILL impressive to me.
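The arithmetic behind that comparison is quick to check (assuming a hypothetical 800x600 24-bit screenshot, since the actual resolution isn't stated in the thread):

```python
# Uncompressed 24-bit screenshot size vs. the 96 KB executable.
# The 800x600 resolution is an assumption for illustration.
WIDTH, HEIGHT = 800, 600       # assumed screenshot resolution
BYTES_PER_PIXEL = 3            # 24-bit BMP, no alpha channel

bmp_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL   # + a ~54-byte BMP header
game_bytes = 96 * 1024

ratio = bmp_bytes / game_bytes  # the raw screenshot is ~14.6x the game
```

A JPG of the same frame would typically compress well under 96 KB, which is the commenter's point: the "bigger than the game" claim only holds for uncompressed formats.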
@@dandare6865 the amiga demo scene is still going strong.. this one from last year is amazing :) ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-iD9xk3SDSYc.html
HOLY SHIT I HAVE EVERYTHING THEY'VE EVER RELEASED LITERALLY SITTING ON THE SYSTEM I'M TYPING THIS ON. I love blowing my friends' minds with these things. Even if they do often kick off antivirus software something fierce XD
@@HappyBeezerStudios ... especially because most demoscene executable packers, and definitely kkrunchy, unfortunately made it into the hands of malware authors, and a lot of AV software now flag everything compressed with it by default. It's a stupid game that we can't win :/
@@kb1337 It'd be a matter of just changing the algorithm enough to not be recognized by the malware software, since chances are that the antivirus is only looking for a specific pattern related to kkrunchy, or something
@@Gogglesofkrome This would help a bit, yes, but it would probably only result in a game of cat and mouse against the AV/Malware scene, and furthermore AV software also employs dynamic analysis techniques - essentially, run the exe in a sandbox and check if it tries to do something that looks sus - and those usually run into timeouts because the PPM(ish) decompression in eg. kkrunchy or Crinkler takes really long, and then decide to err on the side of caution and flag the executable.
Farbrausch inspired me to become a software developer. The art this team produced is amazing and truly impressive. It's software magic, even though nowadays I actually understand what the software does. I have one fireproof safe in the house, that holds passports and a CD. A CD with a backup of almost all of the things Farbrausch has made in those days. Those guys rock!
@@EbonyPope All the computation done to create or decompress assets requires CPU time, so you either have to let it run before the game/level can be played (not the best user experience) or do it all on the fly (which uses up CPU cycles). Optimisation is about balancing size vs performance, and with storage space growing in size and dropping in price (as well as how much such heavy size optimisation complicates making a good game), developers don't prioritise size so much outside of such demos. Games still use techniques like this, especially for generation of the world; look up the game Fuel if you want a good example.
@@birdfacemd Yeah, disc rot's a bitch. Even the black discs aren't immune to it, even though it was designed to combat it. I've had gold ones but I lost them. Do those rot as well?
You know, things like this make me feel like when a boomer goes "ah, classics, not like that new trash" I actually think "you know what? They're right on this one."
@Nexxol Yeah pretty much. There's just been a drop in quality as far as creativity is concerned. We still get the occasional gem now and then but for the most part creativity is dying off and realism is king. For some reason even a lot of games that should have more cartoony artstyles are coming out these days with realistic lighting engines and realistic looking textures. At least we have the indie crowd to pick up the slack a bit though.
@@Crow_Rising There are more creative games than ever. Only the large AAA games tend to stick to proven designs, but that is pretty much a must, since they are much more expensive to make than 20 years ago. Now they need teams of hundreds not just 30. For such a high investment, they want to make sure that it sells. Not much experimentation here in AAA games, but the thousands of Indy games can cover that.
@@vast634 the games of yesteryear aren't as fun as they used to be. But they were super fun at the time. Blows the pants off the experience you get now, despite the amount of games available. Too bad the younger generation can't just play old games to feel what it was like. At least there were no IAP's or dlc. You just bought the game and had fun. It was yours.
I remember back in the day my older brother, being really into programming, followed the demoscene closely. When I saw him playing this game I was nothing more than curious; even being told it was 96k didn't really tell me much. Looking back at it now, while I am struggling to be an indie game dev in an unforgiving economy, with about 10 years' worth of halfhearted self-taught programming experience (I do have issues), I can finally recognize and appreciate the amount of dedication and labor put into these.
"Pentium 3 1.5 Ghz, 512 MB ram, GeForce 4" Well fuck that was my actual set up that I got in 2004-ish. Was stuck with it until 2012, so THAT wasn't much fun, but still!
LoL, don't feel too bad man. I still have an Intel Core i7 920 with a Radeon R5 260X OC 2GB card (thankfully I don't use it anymore; replaced it with my Ryzen 3600 and GTX 1080 Ti).
2004: Entire game fit in 96K.
2021: Turn on my PS4 and have to download a 5GB update just so I'm allowed to play my game again.
That's progress for you!
The quirk is that this was an experiment: commercial games from 2004 took about 2-3GB, where over 95% of the space was occupied by assets (like textures, models, music and voice). Today's "patch" is often the whole game (with all assets), not the difference between your version and the latest version, so in the case of PS4/PS5 versions the patch is often 15-50GB...
@@PiotrPilinko Because they pack a single highly compressed (and often encrypted) file using dedupe. Not the entire game is in there, but all textures are. Encrypted to make life more difficult for cheaters; compressed so it all takes up way less space (imagine if it were uncompressed) and because decompressing and loading it into memory takes less time than loading it in uncompressed from the drive (when you have more than enough CPU to spare); and the dedupe sometimes makes it take even less space. For consoles specifically (HDDs specifically) they will at times put more than one copy on the drive; some earlier PC titles did it too. It saves seek time if you put things close together or have multiple copies, and defrag is predictable (it works better with one single file, again, so you can order it the way you want). These days all consoles have an SSD, so that is less of an issue.
@@PiotrPilinko you think it is normal? No, it is lazy developers and designers who could have done a much better job but didn't. I'm a developer so I'm a bit more qualified than most to say this.
I ran this on a PC which was outdated at the time and it ran fine. It did seem a bit slower than what's shown here and the initial loading took forever, but it was still fast moving and visually better than most of the games the system could run. Amazing what such small code can produce.
Back then you had FPS games trying to fit in the smallest sizes conceivable, today developers are competing to see how much disk space they can waste...
In current game developers' defense, RAM is much more expensive than storage. Unless you can afford more than 128 GB, which I know most non-enterprise people won't have, developers had to bake all the textures and keep them in storage, which is way cheaper and more accessible. Not to mention the cost of processing the procedural stuff can be a burden on the CPU, tanking the framerate, and the loading time will be longer - it's still loading even after you're done taking a shit. I'm not saying that .kkrieger isn't impressive, but people shouldn't take the prod as the expectation, and that bugs me as a game developer myself. What .kkrieger did is basically store all the assets in RAM. And .kkrieger is a one-of-a-kind prod; it's not that contemporary devs weren't trying to fit into the smallest size conceivable, even back then.
@@Gogglesofkrome That's quite a naive imagination... But wait, *_there's more!_* How about procedural texture and mesh generation, which can take a lot of CPU power and can prolong load times by a lot. And by a lot, I really mean *A LOT*. The CPU would have to generate the PBR texture layers and calculate the meshes out of small instructions. But wait, *_there's more!_* How about generating lightmaps for static, more accurate lighting? Not everyone has a realtime-raytracing-capable GPU (and yes, I care a lot about low-spec gamers), and generating lightmaps can take a long, long time. We're talking about minutes, so the load time would be even longer. Even software-only low-resolution GI tracers for quadratic polys can take 3 or so minutes. But wait, *_there's more!_* Generating humanoid characters procedurally, just using math, from scratch, is borderline impossible. The time and manpower would be better spent creating artistic character models, instead of coming up with imprecise math algorithms to get a shoddy-looking character model. Not even the folks at Shadertoy have managed a good-looking, production-ready, full-body human model (though math-wise, it's still an impressive achievement). And let's not get into clothing. But wait, *_there's more!_* RAM is volatile memory, so if you shut down your computer with 128+ GB of RAM, you will have to endure the extremely long load time again if you want to run the game. Love it or hate it, the way games are shipped now is the most efficient method: put all the pre-made stuff into storage, give it a bit of compression, and let the RAM focus on holding loaded assets while the CPU focuses on executing game threads, thus reducing load times. Not to mention the advancement in SSDs, cutting load times further while still being reasonably inexpensive. (I really should turn this writing into a Medium article.)
fr-08 is still my favourite demoscene produkkt. I still remember just how jaw-dropping I found it when I first saw it. Frankly, I still haven't been able to pick my jaw up. It still gives me shivers and I still think it looks just as amazing as I thought it did back then. Seeing a video on .kkrieger and farbrausch made me all mushy and warm on the inside.
This is excellent! I used to do 6502 machine language programming on the Commodore 64 when I was in elementary school, and I always valued efficient code optimized both for size and efficiency. It was always a fun challenge, and I wish more developers valued it today.
@@djancak: Even if you only do simple optimizations, you'll already be ahead of the game. Compilers these days are pretty smart, but they still don't understand the larger picture of what you're ultimately trying to accomplish, and so you, as the human being who can perceive it, have the final advantage. There are many great articles to be found with Google about optimizing code in most, if not all, programming languages. This article came up first for me (for C++) and could serve as a great starting point -- just focus on one technique at a time, and gradually you may find you're writing optimized code on the fly as a newly developed habit. 🙂
@@mrkitty777 Also thanks to abundant storage and internet speeds. Back then I was desperate for highly compressed games and such to make the most of what I had.
It's not compression. It's procedural generation. Basically, rather than store the textures at all, you store a list of instructions for generating them at runtime. Only the executable is 96k; it takes up substantially more memory while it's running.
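The idea in that comment can be sketched in a few lines. This is a hypothetical toy, not kkrieger's actual operator set: instead of storing a 64x64 pixel grid, we store a three-entry "recipe" of instructions and expand it into full pixel data at load time.

```python
import math

W, H = 64, 64  # texture dimensions

def gen_noise(seed):
    """Deterministic pseudo-random grayscale layer grown from a tiny seed."""
    state = seed
    pixels = []
    for _ in range(W * H):
        state = (state * 1103515245 + 12345) & 0x7FFFFFFF  # LCG step
        pixels.append(state % 256)
    return pixels

def gen_sine(freq):
    """Horizontal sine-wave gradient layer."""
    return [int(127.5 * (1 + math.sin(freq * (i % W))))
            for i in range(W * H)]

def blend(a, b):
    """Average two layers pixel by pixel."""
    return [(x + y) // 2 for x, y in zip(a, b)]

# The entire "texture" on disk is just this instruction list...
recipe = [("noise", 42), ("sine", 0.3), ("blend",)]

# ...which a tiny interpreter expands into full pixel data at runtime.
stack = []
for op, *args in recipe:
    if op == "noise":
        stack.append(gen_noise(*args))
    elif op == "sine":
        stack.append(gen_sine(*args))
    elif op == "blend":
        b, a = stack.pop(), stack.pop()
        stack.append(blend(a, b))

texture = stack[0]
print(len(texture))  # 4096 pixels reconstructed from a 3-entry recipe
```

The recipe is a handful of bytes; the expanded texture is 4096 bytes, and the ratio only gets better at real texture sizes like 1024x1024.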
This wasn't compression, more like practical code golfing. This is not to diminish the dev's achievement, on the contrary. I think kkrieger is very impressive.
In the year 2050: haha remember when we had to ship games with all the textures on a disc? Now we just ship the DRM protected AI on a liquid diamond discplate and they pay 18.99 dogecoin a month to unlock it on their Facebook account with the voice command "alexa, play cyberpunk 6". Wasn't technology so anti-consumer and archaic back then? Anyway, my break's almost over; I'm behind on my Tesla funded thermonuclear missile daily manufacturing quota, and my drm sustenance device needs upgrading to support version 12.7 of the OS without crashing and failing to crush the maggots into paste.
The fundamentals for those were already laid in the early 80s, so procedural texturing was already around and in broad use during the 90s. I used different implementations of procedural noise in most of my renderman and mental ray shaders at the time.
@Darren Munsell That's all dumped into memory and onto the GPU/CPU. It's more efficient to have the rendered texture output flattened so it just uses GPU memory. I can easily slow my system down, in both memory and performance, using Substance Designer. Procedural files sometimes take minutes to open as it renders the final output.
@Darren Munsell in particular, it generally takes several orders of magnitude (think thousands to possibly millions of times nowadays!) longer to generate a procedural texture than to load one from disk. Especially with the hilariously underpowered CPUs in last gen consoles, which were actually the limiting factor in load times for most titles just moving the data into memory. Seemingly about half of that file size, in fact, goes away on the next gen console versions, where they can depend on fast storage to avoid storing redundant data and fast CPUs (or custom hardware) to use more aggressive compression. It's quite annoying when people that clearly have no idea what they're talking about call incredibly talented, overworked, and underpaid people lazy and stupid.
I was doing procedural texturing (and constructive solid geometry, for that matter) years before that. It had already existed for a while (and I've got ancient computer science literature to prove it), though only recently has procedural generation become mainstream. But kkrieger nevertheless blew me away when I first saw it back then. It's not the techniques themselves, but how expertly and efficiently they're executed. Just an amazing feat of programming and design.
@@Etobio I have no idea what that specific song is. I was speaking to the type of music in general. Techno and other forms of EDM was constantly showing up on the radio and in commercials, tv shows, and movies. To this day I still listen to the Groove and Hackers soundtracks.
Well, technically 20 GB or more shouldn't be treated as readily available storage and bandwidth, but some game developers seem to casually ignore that, seeing it as the customer's problem. And there are so many files that aren't even used on any given system running the game. A pre-download system checker and download optimizer that removes files that won't be used feels like a reasonable bare minimum.
Wow this is amazing! I'm majoring in CS right now, and this level of optimization is so far from where I'm at. Hopefully I'll be able to achieve this ability in the future.
What blew my mind back then was the realization that this 96 KB demo, despite its limitations, looked better than F.E.A.R. texture-wise. It was crazy! I spent years wondering how the hell that small team managed to do it.
It’s not actually that complicated if you understand coding on even a basic level. Most coding is built on the fundamentals of algebra and mathematical operations. In essence, what they did, put extremely simply, was feed a small set of numbers through many successive operations. If you’re familiar with math, the basic idea is that if you take 2 to the 100th power (2x2, done a hundred times), the resulting number is exponentially more massive than 2 by itself. By no means does this mean that such an accomplishment isn’t incredible, because it is. But it makes logical sense how they did it. It’s a testament to their command of coding, their understanding of algorithms, and their knowledge of machine pathways. Simply put: while it makes sense and is logical, it is also a testament to the brilliance and cunning of the minds that worked on it.
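The expansion idea in that comment can be checked directly: a description that fits in one short loop (a hundred doublings) produces a number far larger than anything you'd want to write out by hand, which is the same asymmetry procedural generation exploits between a recipe and its output.

```python
# One hundred doublings, stored as a few bytes of "code"...
value = 1
for _ in range(100):
    value *= 2

# ...produce 2**100, a 31-digit number.
print(len(str(value)))  # 31
```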
To explain just how small 96 KB is: if all of the code were written down in a book, it would be only about 32 pages long (assuming an average of 3000 letters/page). You could read that in just half an hour. Other things you could fit into 96 KB: - 6 seconds of low quality .mp3 audio (128 kbps) - a 160x200px .bmp image - about 3 frames (125 milliseconds) of 240p h.264 video
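Those comparisons check out with quick arithmetic. The page and bitrate figures below are the rough averages the comment assumes (3000 letters per page, 128 kbps MP3, 24-bit BMP):

```python
SIZE = 96 * 1024  # 96 KB in bytes

pages = SIZE / 3000                 # ~3000 letters per page
mp3_seconds = SIZE / (128_000 / 8)  # 128 kbps = 16000 bytes per second
bmp_pixels = SIZE // 3              # 24-bit BMP: 3 bytes per pixel

print(round(pages, 1))        # ~32.8 pages
print(round(mp3_seconds, 1))  # ~6.1 seconds
print(bmp_pixels)             # 32768 pixels, roughly a 160x200 image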
I just wanted to thank you for putting all this information together and introducing this extraordinary game. I remember when I was a kid my brother told me there was a game less than 100 KB that required top-spec hardware to run, but I wasn't able to find it until I saw your video.
Thanks for doing a video on one of my favourite pieces of software engineering ever. I simply worship .farbrausch; what they did is legendary. That said, you are doing modern game development a huge disservice. While they tend to throw around memory and disc space like there's no tomorrow, part of that is down to the limitations of modern hardware. If you look at Uncharted 1 vs Uncharted 3, the difference is startling, and most of it is due to modern data streaming. The technology behind it all is striking. And the algorithms for certain screen-space anti-aliasing solutions make my head spin... in parts. And still we have an army of technical artists who make sure that everything you see fits precisely in the given performance envelope of the console... or at least, when it's not pushed beyond. While making an indie game nowadays allows immense leeway and lets you be pretty spendthrift with resources, when you want what you call graphic beauty, it is still a huge task, even if the work is not readily apparent. And we are still not done: I am eagerly awaiting the tech breakdown of Nanite in Unreal 5, which is a realtime implementation of a REYES renderer! Brian Karis and his team are my current programming heroes in that regard, and it fills me with just as much wonder as kkrieger did almost 20 years ago. What a time to be alive.
Well, we still have to deal with memory constraints nowadays; it's not like we have unlimited RAM and CPU ^^ But it's true that in terms of file size there's no effort at all to decrease it. Games are often larger than 100 GB.
It's been like that for years now, at least for PC gaming. Can't get playable fps? It's not my game's fault, get better hardware mate! With closed architectures you're forced to optimise to make a game run properly, but when you can upgrade individual parts of your system the devs will get lazy and skip any polishing and optimisation. There's a reason you can run Doom on everything while you can't run Crysis on anything.
@@abadenoughdude300 There's probably a bit of that too. Personally, for my little video game side project, it's something I think about. I don't develop it on my best setup for this reason; I want the game to be playable on older machines too.
However, faster storage is changing how that RAM is used. Asset streaming really boosted the amount of texture data it's useful to include in a game. If the entire area of a game had to be in RAM at the same time, you'd be severely limited in how high-resolution (or numerous) your textures could be, because they'd take up massive amounts of RAM. With ultra-fast asset streaming, you can fill the entire RAM with just the textures visible in the immediate surroundings of your character, and load new textures as needed. Now you could spend the entire RAM budget on viewing just 10% of the area, which means you'd need to store 10x the amount of data on disk to construct the entire stage at the level of detail your RAM allows. With NVMe gen 4 drives, you could conceivably pull textures from storage so fast that you could unload what's behind your character when you turn around and re-load it faster than you can turn back, which means the full 10+ GB of RAM can be spent on a 90-180 degree cone in front of you (depending on how fast the devs want to allow you to turn around) rather than the entire 360 degrees around you. But again, this means you increase the texture detail by 2-4x, which means potentially 2-4x the storage requirements too, if you want to keep the entire game as detailed as your VRAM technically allows.
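The trade-off described above comes down to one multiplication. These are illustrative assumptions pulled from the comment (a 10 GB texture budget, a tenth of the level visible at once), not measured figures from any real game:

```python
vram_budget_gb = 10  # texture memory available at any instant (assumed)
visible_share = 10   # only 1/10 of the level is in view at once (assumed)

# Without streaming, the whole level must fit in the budget, so its
# total texture detail is capped at the budget itself.
static_detail_gb = vram_budget_gb

# With streaming, the budget only has to cover the visible tenth, so
# the same level can carry 10x the detail -- but all of it must be
# stored on disk, multiplying the install size accordingly.
streamed_on_disk_gb = vram_budget_gb * visible_share

print(static_detail_gb, streamed_on_disk_gb)  # 10 100
```

Same RAM, ten times the detail, ten times the disk footprint: higher fidelity through streaming is paid for in install size.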
It's like a sound card or synthesizer, a few simple functions like sine, triangle, and square waves can be combined and filtered to make pretty much any sound you can imagine. A few simple images can be combined and filtered in the exact same way to produce any image you can imagine. Storing those base images and the list of transformations takes much less space than storing the final image.
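That analogy can be sketched concretely. This is a toy, with an arbitrary assumed sample rate: two basic waveforms are mixed and run through a simple one-pole low-pass filter, producing a more complex signal from a tiny "recipe" of functions.

```python
import math

RATE = 8000      # samples per second (assumed, arbitrary)
N = RATE // 10   # 100 ms of audio

def sine(freq, n=N):
    return [math.sin(2 * math.pi * freq * i / RATE) for i in range(n)]

def square(freq, n=N):
    # Square wave derived from the sign of a sine wave.
    return [1.0 if s >= 0 else -1.0 for s in sine(freq, n)]

def mix(a, b, wa=0.5, wb=0.5):
    # Weighted sum of two equal-length signals.
    return [wa * x + wb * y for x, y in zip(a, b)]

def lowpass(signal, alpha=0.1):
    """One-pole low-pass filter: smooths out the square wave's hard edges."""
    out, prev = [], 0.0
    for s in signal:
        prev = prev + alpha * (s - prev)
        out.append(prev)
    return out

# A 440 Hz sine layered with a filtered 220 Hz square: the recipe is
# four tiny functions, the output is 800 samples of audio.
tone = lowpass(mix(sine(440), square(220)))
print(len(tone))  # 800
```

Storing the four function names and their parameters takes a few dozen bytes; storing the rendered samples takes kilobytes, and the same asymmetry applies to images.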
I really love the design of the levels and enemies, though. I would love to see it remade and updated in an engine like Unreal, with models, textures, and lighting fully realized. It's ominous and a bit scary, with a strong feeling of horror/fantasy/steampunk.
It still amazes me what developers did back in the day to make mind-blowing games. Nowadays games barely run stably and require beefy computers due to a lack of optimization and technique.
Ah yes, The Party 2000. I was there, coding, listening to Press Play on Tape. My friend and I were glad I'd brought a UPS. Suddenly people wondered how we could still be coding when the whole area had gone dark. Oh... power cut! Heh, memories. Also, can the real Karl Koder please stand up?!
This is extremely impressive. If only COD were that way... But if they had a competition for making 96kb games, imagine what they could do with a 5mb limit!
I remember my roommate showing me their first big work in 2001 (I was a year behind) As a 17 year old kid it blew my mind. 20 years later and still blowing me away.
OH, another cool kinda tiny demo. There was this cool Operating System called MenuET that was a full OS with web browser and server, ftp client, IRC client, etc, that all fit on a 1.44 floppy.
Jon Burton: Welcome to coding secrets, and here we're gonna show you how we managed to fit the Sonic 3D blast intro sequence on a 4 MB cartridge. These guys: hold my beer!
For those who struggle to find the song from 1:38, the song used is Typekast - Pushing Walls. link: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE--eXI7CEry1Q.html I actually liked the song when I first heard it.
I'm happy to say that I was there at that time. I'm from Venezuela and I was discovering the demoscene. I remember playing the demo on my brand new GeForce 8800 GT; it was the kind of computer experience that happens only a few times in life. Thank you for the memories.
This is an impressive journey. I love tales from the demo scene. I remember checking out the demo scene back in the 80's on my C64 and again on the Amiga 500. These guys were making computers do things that not even the engineers thought would be possible. Awesome stuff.
Anyone who lived through and enjoyed the demos of the 90s and early 2000s knows that, besides being beautiful and small, the demos miraculously managed to run on low-performance hardware. Assembler was just that good! The skill of those guys was immense! Besides PCs and the Amiga, by the late 80s there were already demos for the TK-90 / 90X (ZX Spectrum), and they were incredible!
@@spudwish Anyone who lived through and enjoyed the demos in the 90s and early 2000s knows that, in addition to being beautiful and small, the demos miraculously managed to run on low-performance equipment. Assembler was too good! These guys had great skill! In addition to PCs and the Amiga, in the late 80s there were demos for the TK-90 / 90X (ZX Spectrum) and they were amazing! (Sorry about my poor English)
Oh, I remember when the demo party took place and we were astonished by that "game" demo. And the LAN parties afterwards, where everyone checked how well it would run on their PC.
Microsoft: Our team has been working hard to ensure to deliver efficiency and optimised code. We now have notepad.exe down to only 198K in size!! kkrieger: lol.. hold my beer
Your explanation of Farbrausch's "trickery" leaves out a part (which I think is quite genius): it requires DirectX (8.0), and they hook into not only the DirectX routines for rendering, animation, lighting, etc. (i.e., they don't need to write those routines and fit them into their own code), they also hook into primitives and, more importantly, *textures* provided by the DirectX standard libs. It's still super impressive, but it does take away some of the "magic" behind the file size.
@@drojf Not on me, just what I remember the talk being about when farbrausch first came out with these demos. It might not apply to all of them. Or I might remember wrong, and the buzz was all about them using DX's routines, shaders, etc. instead of writing them from scratch.
That's incredible! What a wonder. Their solution seems like they actually wrote a kind of DNA code for the end model (in this case, a game). Maybe the same approach, in reverse, could be used to build libraries of perceived textures from camera input for spatial recognition, and stacked for object recognition. That would be a jump ahead in robotics.
This is exactly what a neural network does, but instead of using manually set parameters for the nodes, it uses convolution filters to derive representations and arrive at conclusions, with feedback adapting both the representations and the fully connected part of the network. The problem is mostly the lack of generalization: such networks need thousands of examples instead of just a few, the way kids' brains manage.
I remember the later years of the various Spectrum magazines; they had some brilliant music and image demos. I had the ZX 48K, which was given to me in 1992.
0:32 "Somewhere in Europe" -- You know, Norway is still a part of Europe even though we're not a member of the EU. The EU and Europe are two different things.
@@andreaslaroi8956 LOL - so you're Swiss I take it? To be fair, Switzerland has always been the odd one out on the world map. I, myself, sometimes have to check if you're still a thing :p But really - you're a beautiful country inhabited by lovable weirdos. Greets from those other weirdos who, apparently, are no longer a part of Europe.
@@djancak I don't know, but for years after I first saw it in a gaming magazine, I heard pretty much nobody talking about it. I can understand that people brushed it aside as some sort of novelty, but technologically, it is very impressive!
I haven't thought of these demos in a long time! Demos took a while to load and many were like music videos. Some were interactive but kkrieger was the first game that I remember. I think that some demo developers went on to work on the procedural elements of Spore.
Damn. It's kinda like, why didn't this team work on their tools and release them commercially? This is so clever, it could be used in a lot of games, but obviously it's not. Also, what a genius way to create a rail shooter as opposed to an FPS.
@@WillowEpp Yeah, I found that out just now after having a look at their repository. My point still stands, though, regarding innovations in software development in general.
5:25 The music was similar to MIDI in the sense that the instruments, notes, and various modulations were just instructions, but they were all sent to code, included in that 63.5 kB, that emulated an entire synthesizer. Imagine all the components that make up a high-end FM synth (and no, the OPL and OPN chips were no match for this thing), all included in the software that was running. It's even more impressive.
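The principle can be sketched in a few lines. This is a hedged toy, not fr-08's or kkrieger's actual synthesizer: the "song" on disk is just a list of note instructions, and the synth code expands them into raw samples at load time.

```python
import math

RATE = 8000  # sample rate in Hz (assumed, arbitrary)

def render(note_list):
    """Expand (midi_note, seconds) instructions into raw audio samples."""
    samples = []
    for midi_note, duration in note_list:
        freq = 440.0 * 2 ** ((midi_note - 69) / 12)  # MIDI note -> Hz
        n = int(RATE * duration)
        for i in range(n):
            env = 1.0 - i / n  # simple linear decay envelope
            samples.append(env * math.sin(2 * math.pi * freq * i / RATE))
    return samples

# Four notes: a handful of bytes of "score"...
score = [(60, 0.25), (64, 0.25), (67, 0.25), (72, 0.25)]

# ...expand into a full second of audio at load time.
pcm = render(score)
print(len(pcm))  # 8000 samples
```

A real softsynth replaces the lone sine oscillator with oscillators, filters, envelopes, and effects per instrument, but the storage asymmetry is the same: the score stays tiny while the rendered audio grows with duration and sample rate.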
I initially thought the soundtrack was the data that generates the models and animations. If you look into a deep neural network, the neurons have activation maps that look somewhat like the seeded maps; that's quite an interesting point. I was hoping to learn about the gameplay, though: what weapons, what enemies, how long, etc.