People are scatterbrained these days, especially when they're younger. I listen to so much verbal diarrhea in my day-to-day. Even professional talking heads often misspeak or stutter or struggle to get their words out. Watch old talk shows, lectures, interviews, or programs like this and you'll notice people used to be much more put together.
@@peterjszerszen Don't forget about Linux; I transitioned over about 3 months ago and am really enjoying it. The command line is still a great way to work.
It's amazing to realize that someone as successful as Kildall became an alcoholic ... shows that prestige and success do not bring purpose and meaning. We have to reach higher for that. John 5:24.
The device that is now showing you this video is way more powerful than the machines they are talking about. But a lot of respect for the people in the video who paved the path to get where we are today. They made it possible.
Random fact: if you read the book Jurassic Park (1990), in the beginning the government had no idea what was happening on the island and was investigating the company. They knew that the company InGen had imported three Cray X-MP supercomputers onto the island. The ones in this video were the earlier versions used to design nuclear weapons. One computer was enough to do most of the work anyone would need, yet this company InGen had imported three of them to a remote island. The government was baffled as to why someone would need that much computing power out there; little did they know that the park was using them to reconstruct the genome of the dinosaurs.
Little did they also know the park boasted some silicon graphics machines as well. Screw the dinosaurs, they should have sold tickets for people to come and see the computers.
The fun thing about this technology in this video is that, over time these supercomputers with their specialized parallel and high performance hardware started using more standard operating systems (like, eventually, Linux...), meaning those operating systems had to add support for this kind of specialized hardware. Later on, when typical consumer hardware like desktop and phone CPUs became more like these old supercomputers (with lots of pipelining, and multiple cores), the OSes already had the support in place to be able to run in those kinds of environments.
Well, to put that into some perspective, the little battery-powered pocket-sized Nintendo 3DS handheld system is 5 gigaflops. Half what those powerful supercomputers were at the time.
<a href="#" class="seekto" data-time="447">7:27</a> If you watch the SF film _The Last Starfighter_ which came out this same year, its CG is just streets ahead of anything from around the same time. These were rendered on a Cray X-MP super. The cost of the computing time was a third of the entire movie budget, and the result took up a similar portion of the movie running time.
And worth every penny! Of course a modern PC and someone who knows how to use Blender would blow it out of the water nowadays. But the CGI holds up surprisingly well, and it has even aged with charm instead of the uncanniness you see even in modern movies. The models are very complex and wouldn't be easy to handle, even for a supercomputer of the time.
In most cases (the practical kind, like processor speeds and RAM amounts) they far surpassed the predictions. In the case of AI, etc., they usually underdelivered. That's the nature of futurism, though.
It is easy to scoff at these early supercomputers, but thinking out loud here, they were the foundation of knowledge and experience that brought us what we have today. The road to today was built by these machines and people.
SARSteam Railways - so true what you say. These days the term 'hypervisor' is the big buzzword, but in reality IBM coined the word back in 1970 when they were designing virtual computers.
It's fun to read about these older computers. Back then, there were several companies designing supercomputers, and each one was very different. They all had different ideas.
It's ironic that I am casually watching this show on the internet, on a modest modern PC (i5, 2GB HD and 12GB RAM) that was not even dreamed of only 30 years ago. Where will this technology take us 30 years from now?
Ray Kurzweil in his book Age of Intelligent Machines wrote about stuff like current laptops, smartphones or tablets 30 years ago. In his opinion, in 10 years machines will be able to do everything humans can now.
@@janruudschutrups9382 by the mid 1980s consumer desktop hard drives were available in the 10 to 20MB range. A 2GB drive would not be sold until 1992, but it cost an arm and a leg. 2GB would become normal in the consumer space more towards the mid 1990s.
Well, the smartphone is the product of all this work, plus almost 25 years more of work. It's what makes these shows so damned interesting in the modern day.
Gary mentions Grace Hopper at <a href="#" class="seekto" data-time="535">8:55</a>. As I write this in 2023, NVidia has introduced their Grace Hopper superchip for AI workloads almost 4 decades after she's mentioned. The world is so different than in 1984.
Summit, the US's new supercomputer, is more than twice as powerful as the current world leader. The machine can process 200,000 trillion calculations per second, so yeah, we have come a hell of a long way over the last 36 years.
My first computer (Olivetti M24 SP running at 10 MHz and with a 20MB hard drive) cost more than my first car (a nice big luxury 6-cylinder)... both bought in the same year.
Pluck Gary from 1995 (a week before he dies) and put him in 2020. He'd be in frikkin tech heaven! And he'd be gobsmacked it was only 25 years. He thought the 80s tech evolution went fast...
I recall reading an article in New Scientist back in the 1990s about the race to build the first teraflops supercomputer. The latest Xbox supposedly has around 12 times that performance.
Not really the same numbers. For supercomputers they always quote the FLOPS in terms of double-precision math (64-bit), as that is the standard for high-quality simulations where the tiniest bit of accuracy makes a difference. In the world of consumer GPUs, games don't need that type of precision. The 12 TFLOPS of the current Xbox is single precision (32-bit); its double precision is only 759 GFLOPS, just short of a teraflop. So that "old" supercomputer you read about is still faster at what it was designed to do than the current Xbox.

Keep in mind, though, that the crippling of the double-precision performance is done intentionally so that people don't buy these cheap consumer units and stack them up for serious scientific work; they want you to buy the super-expensive commercial/industrial GPUs. And those are wicked fast: the current Nvidia A100, a standard "budget" server-farm GPU, does 10 TFLOPS double precision, for example.

With that said, modern GPUs have one massive thing these old supercomputers didn't have, and that is super-fast memory access. The Xbox has 320-bit GDDR6 memory at half a terabyte per second, which so far exceeds those old supercomputers that the throughput would make it effectively faster despite the slower double-precision computational speed. It is in fact for this reason that you can run a neural-network LLM on today's moderate-spec hardware, something that would never be close to achievable on the old supercomputers of the 90s.
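A quick way to see why HPC quotes double precision: in 32-bit floats, small increments that matter over a long simulation simply vanish. A minimal sketch using NumPy (the specific values are just for illustration):

```python
import numpy as np

# A tiny update that a long-running simulation might apply each step.
delta = 1e-8

# In float32 (~7 significant decimal digits) adding it to 1.0 is
# rounded away entirely, while float64 (~16 digits) preserves it.
x32 = np.float32(1.0) + np.float32(delta)
x64 = np.float64(1.0) + np.float64(delta)

print(x32 == np.float32(1.0))  # True  -> the update was lost
print(x64 == np.float64(1.0))  # False -> float64 keeps it
```

Accumulate a few million of those lost updates and a single-precision simulation drifts visibly, which is why FP64 throughput is the headline number for scientific machines.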
I wish they had waited a year and done a special on the Cray-2 supercomputer that came out the following year, in 1985. That was something I loved as a kid; I was born the same year the Cray-2 became operational.
I used a Cray to calculate stresses in structures around this time. Never saw it, but did recently see one in a museum. I was surprised how small it looked, even so there were masses of wires at the back. The speed of light problem went away when everything was put on microchips.
@@gregorymalchuk272 : Stresses in bus bodies, using NASTRAN. Buses take a hell of a pounding, on lousy road surfaces and a large difference between empty and full weights. Even had a [double decker] load case of full on top and empty downstairs
Now that transistors are closing in on the molecular structure of silicon, we are still at the dawn of processor-based computing. As quantum computing becomes more viable, moving from possible to probable, we could soon make far larger leaps than ever before. If this is supercomputing, what will my eight-year-old granddaughter see in her life?
Quantum computers are a _lot_ of hype. Don't get me wrong, they're a real technology with real applications, however popscience seems to imply that quantum computing is the next generation of computing in general, and that it will "replace" our current CPUs or whatever. That's not the case, because quantum computing does not provide any benefit to any of the tasks we normally run on "normal" computers. Instead they are able to run a whole new set of tasks that you _can't_ do on a conventional computer.
It is still amazing how much each generation of computer systems has progressed beyond the earlier ones, in less than a decade per generation. Even the new supercomputers make the older supercomputers look like antiques by comparison.
Just to be clear, the system was not 100 MHz but rather performed 100 million floating-point operations per second (100 MFLOPS). Processor clock speed and number of operations per second are not necessarily the same thing.
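The usual back-of-the-envelope relation is: peak FLOPS = clock rate × FLOPs issued per cycle × number of processors, so two machines at the same clock can have very different FLOPS ratings. A rough sketch (the example figures below are illustrative assumptions, not exact specs of any machine):

```python
def peak_gflops(clock_ghz: float, flops_per_cycle: int, cores: int) -> float:
    """Theoretical peak: clock rate x FLOPs per cycle x processor count."""
    return clock_ghz * flops_per_cycle * cores

# A ~100 MHz machine issuing 1 floating-point result per cycle:
print(peak_gflops(0.1, 1, 1))   # 0.1 GFLOPS = 100 MFLOPS
# A modern 8-core 3 GHz chip with wide SIMD units (16 FLOPs/cycle):
print(peak_gflops(3.0, 16, 8))  # 384.0 GFLOPS
```

The vector pipelines in a Cray were exactly a way to push up the FLOPs-per-cycle factor without raising the clock.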
I remember the panic over the Japanese computer threat. Then Linux came along, Intel developed the Pentium, Donald Becker developed Beowulf cluster software, and the whole notion of the "supercomputer" was fundamentally changed.
Yes, it's quite funny to see how afraid they were of Japanese computers. In the end, the Japanese computer manufacturers just began to build Windows machines.
@@acmenipponair It was a real fear at that time since their government subsidized the cost of R&D and they had the existing manufacturing capacity, as well as customer base to deliver to. In the end, it became irrelevant since the consumer market drove high performance computing to an exponentially higher degree. Now in the current world we see total proof of this as supercomputers are just the same microprocessors in your ordinary devices, just tons of them well balanced together.
@@oldtwinsna8347 Another thing that may have contributed is that there was a bit of a trade war with Japan in the 80s over this sort of stuff. Some US government officials even smashed a Japanese radio as part of a demonstration claiming that the Japanese sold precision CNC machine tools to the USSR. Though other countries had already sold the USSR similar equipment. But even before that, the US was trying to limit the amount of imports from Japan. The strange thing is that the US never seemed to do this with China. Stuff coming from China has American brand names on it, and it seems they are OK with that, even if it de-industrialized the country. Apparently competition from Japan was a serious issue, but they had no issues with just completely shutting factories down and making stuff in China.
<a href="#" class="seekto" data-time="1074">17:54</a> What seems to have happened is that FORTRAN has evolved to include features to take advantage of vector units and highly-parallel processing.
Love this show, because it really shows the development of the IT field and computer tech from 1983 till 2002. These things were the ultra-modern stuff back then... I can only imagine what we have nowadays, not talking about the military black projects or DARPA; they are like 50 or 60 years ahead of current technology.
Back when the people who actually developed the hardware also had to sell it. Not like today, where any marketing wanker can sell something he hasn't touched.
Some people seem to be getting confused over the speeds of these computers. The Cray was working at 100 MIPS, that's millions of instructions per second, not 100 MHz, which is machine cycles per second. Not every instruction completes in a single cycle. Also, in terms of catching up to Moore's law, a Cray of the time would be about as powerful as an iPad 2.

Also, these supercomputers were NOT general purpose. They were mainly vector-processor based, which is a somewhat different form of math from what most modern processors use most often (GP processors have vector units but tend to rely on their integer and floating-point units).

MegaFLOPS were mentioned too: 100 megaFLOPS is equivalent to maybe a Pentium or late 486. 10 gigaFLOPS would be more akin to a PowerPC G5 or Pentium 4; most modern ARM (mobile) processors are about that fast.
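For a ballpark feel of the gap, you can estimate your own machine's floating-point throughput with a quick matrix-multiply timing. A crude sketch using NumPy; real benchmarks like LINPACK are far more careful, and the Cray figure below is only a rough range:

```python
import time
import numpy as np

# A dense n x n matrix multiply performs about 2*n^3 floating-point ops.
n = 1000
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
a @ b
elapsed = time.perf_counter() - start

gflops = 2 * n**3 / elapsed / 1e9
print(f"~{gflops:.1f} GFLOPS vs. the Cray X-MP's roughly 0.1-0.8 GFLOPS")
```

Even a laptop typically lands in the tens of GFLOPS on this test, orders of magnitude past the 1984 machines.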
But what matters is how fast these old supercomputers would be with today's type of applications we can relate to. For example, would you be able to create a program on a Cray to do real time h.264 decoding at HD resolution?
About the vector processors: we have such processors in our computers today, but not as the main general-purpose CPUs. The GPUs are the vector processors. And there you can say that a Cray-1 is comparable to the GPU of a small smartphone.
@@acmenipponair No, we've had vector processing in CPUs ever since Intel MMX and successors like 3DNow, SSE, AVX, AVX-512. PowerPC had AltiVec/VMX and VSX, ARM had VFP, Neon, SVE, SPARC had Vis...
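The idea behind all those SIMD extensions is the same one the Crays pioneered: apply a single instruction to a whole vector of data instead of one scalar at a time. NumPy gives a feel for it, since its array operations dispatch to SIMD-capable loops under the hood (a conceptual sketch, not actual MMX/SSE code):

```python
import numpy as np

a = np.arange(8, dtype=np.float32)  # [0, 1, ..., 7]
b = np.ones(8, dtype=np.float32)

# Scalar style: one element handled per "instruction".
scalar = [a[i] + b[i] for i in range(len(a))]

# Vector style: the whole 8-wide add is a single array operation,
# executed with SIMD instructions where the hardware supports them.
vector = a + b

print(vector)  # [1. 2. 3. 4. 5. 6. 7. 8.]
```

Same result either way; the difference is how many instructions the machine has to issue to get there.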
Not quite. A P4-2.53 (2002) had a performance of 0.4 GFLOPS, non-overclocked, which is about 1,000 times the speed of an 8088 (1981). Time interval: 21 years. A Ryzen 5 3600 has a raw speed of around 500 GFLOPS. So that's roughly a 1,000-fold increase in CPU calculation speed in 18 years' time. So the speed increase per time interval is itself increasing. Slightly.
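Those two data points can be turned into a doubling time, the usual yardstick against Moore's law. A quick check taking the commenter's figures above (0.4 GFLOPS in 2002, 500 GFLOPS in 2020) at face value:

```python
import math

# Figures from the comment above, taken at face value.
gflops_2002, gflops_2020 = 0.4, 500.0
years = 2020 - 2002

doublings = math.log2(gflops_2020 / gflops_2002)  # ~10.3 doublings
doubling_time = years / doublings

print(f"{doubling_time:.2f} years per doubling")  # ~1.75
```

About 1.75 years per doubling, right in line with the classic 18-to-24-month Moore's-law cadence.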
The Cray X-MP sold for $15 million - the 1.2GB worth of drives sold for an additional million dollars. The processor in most modern smartphones is about as fast if not faster.
Sunway TaihuLight, the current fastest system, is able to do about 93 PetaFLOPS (93,000,000,000,000,000 floating-point operations per second). So it's 93 quadrillion calculations, which is roughly 3.1x the speed you've mentioned. I'm sure that by 2020-2025 we'll be looking at early quantum computers that might be able to scale up to the ExaFLOPS level by truncating the floating points into smaller block chains. (Though memory models are currently an issue, since we would basically need analog variability in each cell to represent the variation between 0-10, for instance, rather than 0-1. Programming something like that is also likely going to be a bastard of a job, since the level of complexity is going to increase by a magnitude, I'd think.)
The US dominates the Top500. China is a heavy hitter these days too. The top of the list bops around: the current top supercomputer is Japan's, the US has spots 2 and 3, and China is in spot 4. Everyone leapfrogs each other; it's always interesting to see what the next fastest machine in the world is.
<a href="#" class="seekto" data-time="1243">20:43</a> Not long after this, I think it was, one major US university was about to choose to buy a Fujitsu super, until Government pressure made them change their minds and go back to a good old all-American Cray instead.
100,000,000 operations/second, or 100 MIPS, used to be the pinnacle of computer technology in 1984 via the Cray computer. Now we have an i7-4790K microcomputer which can perform 144,550,000,000 operations/second, or 144,550 MIPS. A high-end microcomputer made in 2014 is 1,445.5 times faster than a supercomputer made in 1984.
800 million operations per second... My GTX 1080 can do 9 trillion floating point operations per second... it would take 11,250 of those supercomputers to match my graphics card.
I was just thinking that..these supercomputers that they are speaking of in this video are WEAK compared to what is available for the consumer market today. Hell, the GPU in my phone could run circles around what they had in 84.
Some supers today use programmable GPUs, too. How high do you think your gaming rig would score on a list of the world's top supers www.top500.org/ ? Wouldn’t make it anywhere near the list...
The Cray-2 supercomputer came out the following year (1985); it was liquid cooled, could perform 1.9 billion floating-point operations per second, and consumed 200 kW of power! A smartphone today is hundreds of times more powerful, consumes a very small fraction of the power, and runs cool. But it's still not smart enough to correct software bugs on its own, nor can it program itself. But maybe it's better that way?
People that comment on these videos always say some variation of "The hardware I have today is way better than what they showed!", but they leave out that without the "building blocks" of previous technology, what we have today would not exist.
It would be hard-pressed to show a 144p video on a 3-inch screen, even if you added a specialized graphics unit in line with the CPU's capabilities and the top end of what was available at the time. If you wanted 480i or 576i colour video at full frame rates, analogue TV was still your only choice.
It's funny to see the fastest supercomputer in 1984 being able to do an awesome 100 million operations per second... And today, a top cell phone is able to compute 16 trillion. So an iPhone 13 is 160 thousand times faster than this supercomputer. Actually, an iPhone 13 is faster than the #1 supercomputer from 2001, and it's faster than ALL the computation power on Earth, the sum of ALL COMPUTERS ON EARTH (super, mainframe, enterprise, domestic) in 1980. That's amazing.
It's a marvel alright. But that iPhone is probably being used by someone watching YouTube while pooping, yet the archaic supercomputer would've been doing something more useful.
Journalism was very different back then because it was specialized and funded easily. Now everyone with a youtube channel is essentially a journalist and funding is infinitesimally small.
Well, professional TV stations often took their footage not from news agencies but from partner TV stations; CNN, for example, takes its footage from Germany from n-tv.
It's so interesting to hear the one salesman talk about how the petroleum industry is investing heavily in super-computing. I'm sure the viewers at the time assumed that the petroleum industry was forward thinking and looking to benefit from advances in technology to benefit all of society. Obviously in hindsight we know that wasn't the case.
I've worked in the petroleum industry; they still invest a lot in computers for very number-crunching-intensive tasks like seismic data processing and reservoir simulation. The biggest energy companies still own their own data centers and process their own data. However, starting around 1995, most of these tasks were outsourced to data-processing companies specialized in taking the gathered field data and turning it into seismic models. The energy companies realized that they were not computer specialists and didn't want to spend the money and time to develop in-house software solutions or to own IT infrastructure and personnel. Also, most seismic data sets which required supercomputer-level power to process in 1984 can now be done in a few hours on a robust PC on a geophysicist's desktop, or on a compute server.
You’re a ridiculous koolaid drinker. Whiners about muh climate change have no solutions beyond solar and wind fairy dust which any physicist for decades could’ve told you cannot supply our energy needs. Your ilk are shooting us all in the foot because you have a toe ache.
Supercomputer dual core? HA! Smartphone chips nowadays like Apple's A14 Bionic and Qualcomm's Snapdragon 888 can provide octa-core CPU power and a 16-core GPU! If this technology had been available back in the 80s, the Cold War could have, and might have, started World War 3.
It's not what you think it is. I also thought I saw what it is not, but it is not what I saw. PS: very interesting picture indeed, but take a CLOSE look at it and you will see what it really is.
No, modern desktops are 100-10,000 times faster. A quad-processor Cray X-MP was 800 MFLOPS. Modern top desktop processors are at about 100 GFLOPS, and GPUs handle 7 or more TFLOPS.
Well, seeing as they use fractal dimensions to describe the complexity of any object... it's how you got the PC graphical user interface, with objects displayed on your screen. Not sure what the next leap will be. Maybe 3-dimensional objects like in Star Wars that you can walk around and interact with, without the need for any glasses or Oculus hardware. But as they said, you need some serious compiler overhaul to take advantage of the hardware already present.
@<a href="#" class="seekto" data-time="918">15:18</a> : Scary to see that guy messing with liquid nitrogen with no protection. One splash would kill your hand.
You can pour liquid nitrogen over your skin with minimal effect. Your hand is so comparatively hot that the nitrogen starts to evaporate before it can make close contact, effectively floating it away from your skin (the Leidenfrost effect). You have to actively plunge your hand into the liquid nitrogen for a few seconds just to cool it enough to start causing some damage.