
Let's Look At Some Big, Expensive Old Servers! 

This Does Not Compute
404K subscribers
631K views

Recently at work I decommissioned some servers from around 2012 and 2013. But even compared to modern servers, they still have some crazy specifications!
----------------------------------------------------------------------------
Please consider supporting my work on Patreon: / thisdoesnotcompute
Follow me on Twitter and Instagram! @thisdoesnotcomp
TDNC t-shirts and apparel can be found at www.redbubble.com/people/this...
This Does Not Compute
PO Box 131141
St. Paul, MN 55113
----------------------------------------------------------------------------
Music: "Osaka", "Noreste", "Merlot" and "Jon Gordo" by Birocratic (birocratic.lnk.to/allYL).
Additional music by Lakey Inspired (/lakeyinspired).
Intro music by BoxCat Games (www.box-cat.com).

Science

Published: 23 Jul 2018

Comments: 1.5K
@RangieNZ 6 years ago
You know someone is talking about something they love when they can just spit out nearly half an hour of highly technical detail without stumbling, repeating themselves, or pausing. Excellent.
@BlueRice 3 years ago
Passion motivates people to learn about the things they like. If only I had that passion for school.
@webserververse5749 3 years ago
This is actually pretty base-level knowledge... either that, or I learned too much without any "formal" education. I'm tier 1 technical support who tells people to restart their computer or how to use their email, yet I own my own server rack, "built" my own servers, host my own web server, built my own website, set up compute clusters, etc. Sadly I never found an opportunity to put any of this "highly technical data" to meaningful use. The knowledge isn't hard to come by; the opportunity is.
@BlueRice 3 years ago
@@webserververse5749 I agree to some extent. Everyone starts from a different position in life. Someone born smart but poor may never get the right education or influences to unleash their potential, while someone born average but better educated gets to maximize what their school offers, and their environment pushes them to strive for success, etc. This could be base-level knowledge, but gear like this was harder to manufacture with the technology they had back then.
@BlueRice 3 years ago
To me it's fascinating that he can recall parts of the calculations and numbers in real time. He knows more than just the hardware; it's more than just component assembly. It's like he can debug information in his head in real time. Fascinating
@DaxtonAnderson 6 years ago
>30 minute lunch break
>28 minute video
That's some true dedication right there!
@gorillaau 6 years ago
Daxton Anderson You then head into the office, grabbing a coffee and sugar snack on the way.
@brukernavn46 5 years ago
+2 minutes he cut away at 17:45
@RandoWisLuL 5 years ago
Probably an hour lunch.
@rebeccarainharrod 4 years ago
I typically skip breakfast and lunch, just eat dinner. So I don't feel so bad about lunch break videos!
@JaredConnell 2 years ago
How is that dedication? It'd be dedication if the video was an hour and your lunch was 30 mins but you stayed out to finish it. But the way you put it, you were just lucky to find a video that matches your lunch break...
@RavenHawkTech 5 years ago
I work at Lenovo, formerly IBM. One thing to note is that the blue and orange coloring on the handles typically has a meaning: on this server, blue means the system has to be powered down before removal, and orange means hot-swappable.
@StormyHotwolf88 5 years ago
Is that still true for current servers?
5 years ago
StormyHotWolf88 I believe so
@joeprestera2239 5 years ago
I used to work with only IBM gear and loved the gear and support! I couldn't stand the original Lenovo servers they started with. It was like the Black & Decker version of power tools... I miss the IBM gear and people.
@nicokammel 4 years ago
Not only Lenovo/IBM: blue = cold-swap, and amber (as another major vendor calls it)/orange = hot-swappable. Btw, I wasn't amazed by the video since I've worked with bigger servers for a long time. It's sad that the 8s systems are gone; they were really cool at the time... All major vendors build systems with the same specs. You want a 4h/4s system with an assload of RAM and cores? You can get it from any major vendor. The only real differences are: 1. RAID controller features (they all build in the same OEM cards with different names and subsets of functions), 2. remote access via BMC or higher level (like iDRAC, iLO or XClarity), 3. noise (some vendors cool their servers really quietly; others are called HPE ^^), 4. price tag.
@retsam1721 4 years ago
@@joeprestera2239 I always thought their support was horrible, and don't get me started on their server designs compared to HP. HP makes the best-designed servers out there.
@alexanderstanza 4 years ago
I was one of the engineers for that DL580 G7. The project code name was Hydrazine, indicating 8 risers in that drawer. Your servers were fully loaded; that's why you needed 4 power supply units (PSUs). If I recall correctly, there was a SKU with only 2 hot-swap PSUs. iLO (the lights-out module) boots up automatically when the PSUs are connected to power. The reason the manageability module has VGA and PS/2 is that these are usually connected to a KVM (or vKVM). And you make a good point about the weight; in fact, during our QA testing there was an accident because of it. That's why our mechanical engineer pasted that "weight warning" sticker on top of the chassis. Looking at this machine brings back a lot of memories. Nice video.
@2501bproject 1 year ago
I was one of your customers in Saudi Arabia! 2TB of RAM fully loaded in 2012 for bioinformatics applications
@dreammfyre 6 years ago
This is how I imagined consumer PCs looking in the future when I was a kid. Everything hot swappable and built like lego blocks you just popped in. But here we are, 20 years later, still using the same ATX standards and cases...
@FloppydriveMaestro 6 years ago
Same here. I mean, a lot of hardware is hot-swappable, but it's not practical. For instance, in my PC the hard drive is hot-swappable, but to get the hard drive out you have to remove the graphics card, so it's not ideal lol
@LasstUnsSpielen 6 years ago
Hey, no, you are completely wrong! We innovated computers a whole lot in recent years by soldering and gluing PC components together so they are unswappable :)
@soviut 6 years ago
You saw how much the HP machine weighed. The more modular you make something, the heavier it tends to get, not to mention the cost of having hot-swappable couplings on everything. The average consumer really doesn't care about, or want, hot-swapping PC components on a regular basis; turning off your computer when you get a new hard drive or video card every few years seems pretty reasonable. On server hardware the expense, weight and added design complexity are warranted because in most cases the machines cannot be allowed to shut down. Most consumers, even enthusiasts, wouldn't be willing to pay that kind of premium for a very occasional convenience.
@soviut 6 years ago
And we have the Raspberry Pi, an entire PC on a board that you can't upgrade at all but that is cheap as hell. Swappable parts tend not to be anywhere near as thin, light, cheap or efficient.
@imbadwrench 6 years ago
I have always wanted a server tower with multiple PCs inside it. Like, a Windows PC and a Linux PC networked together and used via remote access. Of course, I went to college and learned how to build Token Ring networks and got Novell certified, so I'm really not the go-to guy for making wise decisions.
@CommentCritic 5 years ago
Ah yes, finally a machine that can comfortably run Google Chrome.
@bloodytears4you 4 years ago
I'm enjoying the Chrome jokes.
@WahyuSetiawan-sz4lc 3 years ago
How do you even remember what's where with that many tabs?
@HenryLoenwind 2 years ago
@@WahyuSetiawan-sz4lc Same way you remember where stuff is in your house. Remembering physical locations is something humans are good at. I can tell you that I have youtube on a tab that's about 1/4 of the screen from the left in the browser window that's halfway down my virtual screens (between the window with the tabs of my devices on the left and the one with my email program on the right). I don't have any idea how I named that bookmark I made 3 days ago. I actually don't even remember what I bookmarked, so how am I to find it again when I need it? But I wrote a little browser extension that can replace the content of a tab with a minimal page that just has the link to the real content and the same title+favicon. Really helps the browser with resource consumption.
@Nord72 2 years ago
@@WahyuSetiawan-sz4lc: that's the point. When you close it and later reopen it, you don't need to concentrate on remembering the tabs' contents.
@travisnelson9104 2 years ago
Try running Chrome on one of those mini PCs. We have clients that use them and it's a nightmare. Constant crashes. Trying to figure out how to make Chrome run as barebones as possible.
@Jonathan_O 4 years ago
Just retired 4 of those machines (but w/ less RAM) for a client. The client still has one left up and running, but it's only a non-critical file server. They were good machines! Built like a tank and heavy as hell!!
@redsquirrelftw 5 years ago
IBM stuff is super expensive, but also super well built. Great to work with. It's crazy how companies don't keep hardware for long though. If I paid that kind of money I'd be keeping it for 10+ years or until it fails completely. That is a beast that has more than enough power even by today's standards. I know why they do it (warranty, support etc) but it still seems like such a waste to replace everything so often. I saw the same sort of thing when I worked in IT. The fun part is I sometimes got to bring some of it home though. :D
@NUCLEARARMAMENT 5 years ago
Power costs don't matter, because I can buy a 1 MW diesel genset housed in a trailer in perfect working condition for $50,000 to $100,000 and buy No. 2 diesel fuel and generate my own electricity for 10 cents per kWh. If I want to go cheaper I produce my own biofuel and reduce the cost of electricity by 5 to 10 times.
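A rough back-of-the-envelope check of that 10 cents/kWh figure, as a minimal sketch; the fuel price, energy content and genset efficiency below are assumptions, not numbers from the comment:

```python
# Fuel-only cost of diesel-generated electricity (assumed figures).
fuel_price_per_gal = 3.00        # USD/gal, assumed No. 2 diesel price
btu_per_gal = 137_000            # typical energy content of No. 2 diesel
kwh_thermal_per_gal = btu_per_gal / 3_412    # ~40 kWh of heat per gallon
genset_efficiency = 0.35         # assumed efficiency of a large genset

kwh_electric_per_gal = kwh_thermal_per_gal * genset_efficiency   # ~14 kWh
fuel_cost_per_kwh = fuel_price_per_gal / kwh_electric_per_gal

print(f"{kwh_electric_per_gal:.1f} kWh/gal -> ${fuel_cost_per_kwh:.2f}/kWh")
# ~$0.21/kWh under these assumptions; hitting 10 cents/kWh needs cheaper
# fuel or better efficiency, and this ignores genset purchase and upkeep.
```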
@slowtony2 4 years ago
Thanks for the tour through some "big" top-end Intel server systems. They may be the last of the breed. As with your replacements, I think the trend is away from putting so much in one box. This also brought back memories. I worked with mid-range dual-CPU HP and Dell rackmount systems from 2002 to 2008, mainly for database work in an imaging system. People don't realize that we called their location a machine room for a reason. Lots of power, heat and sound (even though it was quieter than when we had an IBM mainframe with 14 inch drives and three-phase motors). These systems were built like tanks for a specific kind of heavy use. They really don't fit in a home environment, starting with their power demands. Then you have the need for an external storage system. We used a fiber Storage Area Network (SAN) back then because the fastest Ethernet we had was 1 Gigabit. They DO look cheap on eBay, but when you put the whole system together with rack and memory and storage and networking it gets pretty expensive -- just like a fixer-upper yacht. When it breaks, it's like paying for parts to fix a Ferrari. I loved working with them though. It was an honor and a great memory.
@rwdplz1 6 years ago
I bought several of these when I lived in Silicon Valley from recyclers, and installed them in a 42U server rack with some networking equipment. Frequently the mounting rails cost more than the servers.
@rocktheworld2k6 6 years ago
How much do mounting rails generally go for?
@calaphos 6 years ago
How much do you pay for power?
@nigratruo 6 years ago
I'm curious, how much did you pay for it?
@NoorquackerInd 6 years ago
I got a Dell PowerEdge 2950 II with two 4-core Xeons, 12 GB of ECC and a 750 GB hard drive, basically for free from my friend: $100. Mounting rails _are_ more expensive ;-; I just use a shelf instead
@danstone_0001 6 years ago
Yeah, that's true. HP DL380 G5/G6 rails are like $350.00, and that's not even the ball-bearing kind, just the standard non-quick-deploy ones.
@protonjinx 6 years ago
I still wish that some day I could get my hands on some beefy exotic server hardware to experiment and learn new things.
@ytmadpoo 2 years ago
Ebay around and you can get something pretty similar to these... Proliant DL580 (or DL380 if you just want dual socket) for a couple hundred... maybe more if you want drives. They are fun to work on, for sure. Make sure you have enough electricity wherever you plug it in. What the video failed to mention is that these were probably running off 240V and at least a 30A circuit, I'd guess. For the amount of equipment on that cart, there were probably a couple of those and the load was split between the circuits and separate PDUs. And they can be VERY loud. I know the HP servers for sure do a "power on blow out" when you first turn it on. All the fans run at 100% for a few seconds before going to the normal automatic (based on temp) mode. It'll sound like a jet engine taking off. Always a crowd pleaser when I'd have a new one on my desk before taking it to the server room...
@tarajoe07 1 year ago
Watch eBay for something you can drive to and pick up. Deals can be had.
@BrainboxccGames 4 years ago
When you said 'old', I thought you meant like, 1990s... I still have a server of this era running my dev/test setup at home. :)
@thewhitescatpack8653 5 years ago
I work in DC / Infrastructure and appreciate your videos. Most people do not know about this type of stuff, especially when you get into chassis and blades. Those are still viable servers for dev / test / qa. :) It's a classic case of Moore's Law :)
@MatthewHill 4 years ago
So it's basically half a Threadripper. :-) Still though, I always admired the build quality on those old IBMs. Shame they've been getting out of the hardware business.
@DarkVeilGaming 6 years ago
Love seeing servers! Would really enjoy more videos of this nature, there's something so satisfying about seeing commercial equipment.
@thetaleteller4692 6 years ago
Old? That's still a beefy lot of computing power!
@MrBillrookard 6 years ago
Yes it is. The nice thing is you can get that stuff pretty damn cheap. I have a few Xeon servers in my basement - one cost me about $250 for twin hex-core Xeons, chassis, motherboard, fans, heatsinks, and 36GB of ECC RAM. I use it for a storage server and it runs just fine. The HDDs actually cost more than the server did. (about 40TB worth of ZFS storage)
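For anyone pricing a similar build, a quick usable-capacity sketch; the drive count, drive size and RAIDZ2 layout here are assumed examples, not the commenter's actual pool:

```python
# Usable capacity of a hypothetical ZFS RAIDZ2 pool (assumed layout).
drives = 8            # assumed drive count
size_tb = 6           # assumed drive size in TB (decimal)
parity = 2            # RAIDZ2 spends two drives on parity

raw_tb = drives * size_tb                      # 48 TB raw
usable_tb = (drives - parity) * size_tb        # ~36 TB before overhead
usable_tib = usable_tb * 1e12 / 2**40          # decimal TB -> binary TiB

print(f"raw {raw_tb} TB, usable ~{usable_tb} TB ({usable_tib:.1f} TiB)")
# A pool in the "40TB class" like the one above; as the comment notes,
# the drives end up costing more than the decommissioned server itself.
```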
@chunkyburp 6 years ago
It's all about maintenance: it's "old" because there's no warranty. You need 4-hour support on mission-critical systems.
@danielh4995 6 years ago
We have some at work that are newer versions of a top-of-the-line (TOTL) single-server platform like this: 4-socket with 8-core procs, 3TB of RAM per server, and 4 direct-attached storage units totaling about 500TB. While this unit is very impressive and was TOTL in its day, the TOTL stuff today is mind-boggling.
@geebee123 5 years ago
I know.....right ! ! !
@lpseem3770 5 years ago
1U and 2U servers from around 5-10 years ago are becoming really cheap, just because they are too small to handle multiple GPUs. Aside from missing a couple of CPU instruction sets (used mainly for AI), they're eating home computers alive while running 24/7 for months.
@alecjahn 6 years ago
Ah I miss working around things like that. Really gets my brain salivating! Maybe I'll get back in it someday. Cool stuff! Nice machines and thanks for sharing them in depth!
@pcsproshop8972 5 years ago
Thank you for the quick tour! Love the server side of things; spent many years at it. Don't want to return, but it's still way cool to see what used to be the "ultimate" solution. BTW, for future reference: "hot swap" means the ability to remove & replace a part while the machine is OPERATING (thus the HOT). I believe what you're referring to here is "tool-less" replacement of components. Super cool tour, would love to see more!!!
@ytmadpoo 2 years ago
You really could hot-swap a lot of the components (fans, drives, power supplies). I don't know about the more recent HP servers, but I remember on some older ones you could even hot swap memory. You'd have to configure it for that ahead of time and have a spare bank, but it was possible. The Proliant generations I worked on no longer had that... maybe it was only on the old, clunky Compaq branded Proliants that had that feature. Probably wouldn't make much sense now... if your server has bad RAM, you'd move the workloads off and do offline maintenance. Not worth the hassle and cost of having spare banks of RAM.
@reggiebenes2916 6 years ago
That was really interesting. I'm always amazed how well servers are made. I love your normal videos, but that was a cool change of pace.
@AnonymousFreakYT 6 years ago
Oh, yeah, IBM QPI ports... You can connect multiple of those servers together, and they act as a single system. In 2011, having a "16-socket", 128-core system with 8 Terabytes of RAM...
@williamfernandez5170 6 years ago
It's awesome how we've transitioned from this type of configuration to hyper-converged configurations. Still an awesome and very pricey/high-end IBM. Thanks for posting!
@nrdesign1991 4 years ago
The thing is still _crazy_ when compared to consumer machines, especially if compared to consumer tech of 2012
@aldosansan2335 5 years ago
Nice, this gives me the inspiration to share some old vintage machinery. Where I work we have servers from back in the '90s collecting dust, but they're fully operational machines. :)
@philipbrindle867 6 years ago
Thanks for sharing with us. The build quality of that server is amazing, and no doubt it was pretty expensive too...
@mashzmash 6 years ago
Wow, look how clean and dust-free they are after 6 years of service!
@harmstrongg 5 years ago
Controlled climate, my dude. All the dust is handled at the environment level.
@mashzmash 5 years ago
@@harmstrongg I work in well controlled datacenter climates too, but they're not clean rooms. After 5+ years we usually see at least some basic buildup. Not even enough to warrant cleaning. These look like they just came out of a bath.
@vipervidsgamingplus5723 5 years ago
mashzmash these were probably cleaned more recently before decommission
@scottcol23 3 years ago
Server rooms are "clean rooms": usually negative-updraft HVAC systems that keep everything clean.
@tomonabudget 4 years ago
I'd love to see the lights-out interface. I work at a company making hardware and my manager insists on using SSH because he doesn't believe in UIs, yet IBM did this in high-end servers 9 years ago.
@late.student 6 years ago
Got a new job working in a data center and this is a really cool breakdown of some server gear. Thanks for taking the time.
@mlmmt 6 years ago
That was so cool to watch. It's always cool seeing just what is inside some of those larger servers, and all the special stuff that goes with them. Wish I had all of that RAM...
@nono_ct200 4 years ago
Thank you for spending your lunch break showing us these servers, sir. Brilliant service ❤️👍
@steve101968 4 years ago
vMotion still blows my mind. I just refer to it as magic. I've only used Dell, HP and Supermicro. It's nice to see the quality of those IBMs.
@jasona8396 2 years ago
Found this video pretty fascinating. Never really seen the inside of rackmount servers, didn't know I was interested, but turns out I was. Thanks for the look!
@tylertc1 6 years ago
I'm a huge hardware nerd - very good walkthrough! Well done and by all means share whenever you have the chance. Really enjoyed watching.
@ThatGuyInVegas 6 years ago
The title of the video had me thinking I was going to see older servers, like the HP G4 series or the Dell PE cheese-grater series. Thank you for not taking us back to the Compaq ProLiant :D Nice video.
@jonnyo2121 4 years ago
We still have a couple HP G4s hanging around.
@bloodytears4you 4 years ago
I was thinking of the ProLiant G series just as I clicked on this. I'm happy to see IBM.
@odemata87 6 years ago
Thank you Colin for this. Really enjoyed this video
@samohtw1 2 years ago
Great presentation. I have been retired from the IT field for 15 years now and really enjoyed your video. Thanks for the memories.
@MatthewRulla 4 years ago
I installed and managed hundreds of these boxes, and as you said, mostly for VMs. But we always used SAN, not NAS. I actually worked for Big Blue at the time, but I preferred the HP machines. Coming from a DEC VMX background and getting my administrator feet wet in the early NT4 days of Windows (circa pre-Y2K), these 2011(ish) boxes were a dream come true toward the end of my (30+yr) IT support career.
@Saghetti 3 years ago
I actually have a DL580 G7 that I use to run my VMs. It's also retired equipment. It has 128GB of RAM and 40 cores @ 2.4GHz. It was previously used as a mirror database server, meaning it has some really fast PCIe SSDs. BTW: I tried it, and yes, it will run off of 2 power supplies.
@genericgreensquid6669 6 years ago
Oh my god dude! 500 gigs of RAM. Don't know if you mentioned it, but I hope those things don't get recycled and instead go to good hands.
@dood9245 6 years ago
Generic Green Squid it's DDR3 server RAM. It wouldn't work in modern servers or most consumer boards
@genericgreensquid6669 6 years ago
but the rest of the stuff could be useful to someone, I'm sure.
@chaddoebelin6931 6 years ago
It's too loud to run at home.
@dood9245 6 years ago
Generic Green Squid not likely, since there are thousands of hours of wear on all that equipment, and hard drives can never be used again because you can't ever completely get rid of the data without destroying them. Any company with money to buy it wouldn't anyway, because they need the latest equipment to handle network traffic and future expectations. The commercial world doesn't purchase like the consumer one.
@NickShvelidze 6 years ago
Tyler Have you watched the video? There are literally no storage drives in there.
@TheLastAnalogJunkie 6 years ago
Thanks for the inside look at some really killer gear!
@zerocks88 4 years ago
You're obviously incredibly intelligent to be able to walk through all this, seemingly and presumably unscripted, without stopping to think almost at all for half an hour. I really enjoy this kind of content; I work with a lot of this stuff, but generally not directly with the hardware.
@DarenPage 6 years ago
I love the engineering that goes into these things.
@JeffDeWitt 5 years ago
I miss working on these things. I do have a few of those memory sticks; they work fine as little flash drives.
@humanbeing_ 5 years ago
This was A FANTASTIC video. I recently purchased an HP ProLiant DL380 G6 with only 64GB of RAM, but 12 x 146GB 10K SAS drives, a RAID controller, dual 2.5GHz Xeon CPUs (I forget the exact versions), *AND* a Cisco SF220-48 48-port 10/100 plus 2x RJ-45 GB / 2x SFP GB managed switch, *AND* a D-Link DES-1250G 48-port plus 1000BASE-T 2x RJ-45 / 2x mini-GBIC managed switch... AAAND 12 random mostly dual VGA/DVI or dual DVI/DVI circa-2012 video cards, all for $200!!! So nothing as beastly nor awesome as the ones you showed, but still. I'm a one-man-show IT & compliance consultant, and for that price I just couldn't pass this gear up. Happy to say I've subscribed and turned on notifications for your channel, and plan on (right now) starting to watch your complete back catalogue of content. Again: great video! Cheers, -H.B
@MichiganPeatMoss 4 years ago
2011? Old? (Here in 2020.) OK, I will concede to feeling old and realize technological cycles continue to turn ever faster. For "truly old", check out Mr. Carlson's Lab and his vacuum tube equipment. Truly a master of his craft.
@zyxwvutsrqponmlkh 6 years ago
Classic LGA 2011, and you gotta love the prices of used DDR3 ECC; it's like $1.50/gig. I love dabbling in higher-end workstation hardware because I can buy the scraps off servers like this; you can oftentimes find a six-year-old Xeon for 1/10th or 1/20th of the original MSRP.
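At that price, filling even a half-terabyte box like the one in the video is surprisingly cheap; a two-line check, taking the $1.50/GB figure above and the roughly 512GB config mentioned elsewhere in the thread:

```python
# Cost of populating a decommissioned box with used DDR3 ECC memory.
price_per_gb = 1.50   # USD/GB, the going rate quoted above
ram_gb = 512          # roughly the "500 gigs" config from the thread

print(f"${price_per_gb * ram_gb:,.0f} for {ram_gb} GB")   # -> $768
```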
@linushyper300 6 years ago
- Dangit, is Colin playing with the servers again? I lost all my work, again!
@tvhistory3397 2 years ago
Awesome equipment. I just decommissioned an IBM Netfinity 5100 from 2001 last week. It went home with me.
@user-mp9rd4hg8b 5 years ago
I bought a 2-year-old Dell R805 2U rack server from a liquidator on eBay. It was a VMware ESX host and was loaded. $3000 new, paid $800, still under warranty. I powered it up, created my VMs, and put them to work. That was 12 years ago. Still running like a champ.
@seanmchughnt 6 years ago
Dude! Nice showing of a DC server. I work with these every day. The one you showed is a monster even nowadays.
@mingchenzhang3113 6 years ago
You can say those indeed do not compute anymore
@SummonerArthur 6 years ago
Mingchen Zhang LMAO
@tess4647 4 years ago
I used to like Sun's internal hardware design, but that first IBM box is beautifully set up :)
@bertholtappels1081 4 years ago
This was awesome. Fascinating. Maybe the best content you’ve done so far.
@slughead 4 years ago
You've decommissioned better gear than I have in production. The extended warranties from IBM are killer.
@grayrabbit2211 2 years ago
Same here. Just bought some Dell PowerEdge R720s for my office this year to replace the Dell 2650 and 2950 servers we are running. Our 2650s are from 2004 and have been running continuously since then. Truly remarkable. Almost sad to shut them down.
@moconnell663 2 years ago
@@grayrabbit2211 A word on your 2950s: I had a capacitor failure on the motherboard in one of those. Nothing convinces the boss to give up some money to replace 12-year-old mission-critical hardware like a cloud of magic smoke. Luckily that server was able to soldier on with a single CPU and a single bank of memory for a few more weeks.
@grayrabbit2211 2 years ago
@@moconnell663 I'm old school - there's still a soldering iron and scope on my desk. Still running some 2650s in the rack.
@holyravioli5795 5 years ago
Damn, I loved these things. So much cooler looking than modern servers.
@hariranormal5584 3 years ago
Servers haven't changed their looks much, in my opinion; they still look pretty amazing and nicely built.
@44Bigs 4 years ago
3:11 a cool thing with IBM/Lenovo servers is the colour coding of components. Red components, such as the fan assembly, are hot swappable. Blue components, such as the RAM card, are not hot swappable (or at least not without some decommissioning beforehand).
@chrisnyc4688 4 years ago
I support/work on those IBM x3850 X5 servers. Looking at the rear, the 4 QPI ports let you connect two of the 4U chassis with 4 QPI cables; you can then configure the "IMM" software to make both boxes run one operating system, with 8 sockets and all the memory/PCIe cards from both. And with a stop, an IMM reconfigure and a start, you're back to running 2 independent OSes (leaving the QPI cables in place for a future reconfig).
@rahikkala 4 years ago
Ah, I remember installing and deploying identical models "back in the day". They were monsters back then.
@JimFortune 5 years ago
Thanks for donating your lunch hour to the public good.;->
@cageordie 5 years ago
I worked for a couple of server equipment manufacturers. We used a hydraulic cart, so we jacked the table up to the level of the equipment and then just pulled them out onto the table. Same at the other end, just jack it up to the bench or shelf and shove it off again. Amazon sells them, lift tables. Just thank your luck you never had to mess with an AN/WSC3, and that wasn't the worst thing I ever had to shove in a rack.
@cgln8760 5 years ago
You used to be able to pair two of these using a QPI cable, and the result was a single system from the OS point of view. Massive. We had 10 of these in this configuration connected to a pair of clustered V7000 storage units (fibre-connected), as a pair of vSphere clusters split across 8km of 1Gb dark fibre (separate datacentres).
@brownmanbeaniehead 6 years ago
Google Chrome ram requirements: 1 terabyte
@ThisDoesNotCompute 6 years ago
truth
@killertruth186 5 years ago
SkyGamer It sounds like you have more RAM for Chrome?
@cosmo1494 5 years ago
One blade server per tab
@PrinceWesterburg 5 years ago
Wow, they've really optimised that down XD
@akeiai 5 years ago
@@SkyGamer911 Nope, not true. In my experience, Edge opens up faster than Chrome, but browsing speed is faster with Chrome than Edge. Chrome has used almost 2GB with 16 tabs; Edge doesn't even get to that amount.
@muskaos 6 years ago
You would need a 20A circuit per box to run them at full tilt. I like old gear, but no way would I get one of those; just too costly in electricity.
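A minimal circuit-sizing sketch behind that claim; the load and PSU efficiency here are assumptions (four 1975W supplies are mentioned elsewhere in the thread, but actual draw depends on configuration and redundancy):

```python
# Wall-draw sketch for one box (assumed figures).
typical_draw_w = 2000  # assumed load, well under the PSUs' combined ceiling
efficiency = 0.90      # assumed PSU efficiency at this load

wall_w = typical_draw_w / efficiency          # ~2.2 kW from the wall
for volts in (120, 240):
    print(f"{volts} V: {wall_w / volts:.1f} A")
# 120 V: ~18.5 A -- right at the limit of a 20 A circuit (16 A continuous
# rating), which is why racks of these usually run on 208/240 V PDUs.
```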
@bentendo6464 6 years ago
I got a couple of old IBM 6U servers that were circa 2003 and used them as a coffee table. Pretty awesome hardware!
@rbmwiv 5 years ago
The IBM, that is a beautiful machine; lots of incredible engineering of many kinds went into building it. Truly a work of art.
@PearComputingDevices 6 years ago
I like HP business-class hardware; it makes their consumer lines look pretty stupid once you've experienced the quality. And IBM... nobody got fired for going with IBM, but this comes at a cost. I bet that server, being a genuine IBM server, was about $20k, if not more. When the 1U and IBM tower servers at our company were replaced, I bought up the IBM PowerPC-powered server. I think it was a 604e, and our company had spent nearly $50k on it at the time. Our boss warned us about scratching the goods... lol. But I love it. Nice stuff.
@BlackEpyon 5 years ago
I can usually price out a generic server of similar specs for half the cost. But the reliability and customer support may just be worth it, especially if you're a large business that can afford no downtime.
@MrKillswitch88 6 years ago
It is always sad to see stuff like this go to scrap knowing there's still a lot of life left in it.
@paulstubbs7678 4 years ago
Considering how much power they suck, going to newer, much more efficient hardware is probably environmentally a good idea, although I'd love the power supplies.
@buffuniballer 4 years ago
It's usually cheaper to buy newer, faster enterprise servers that come with a year or two of service than it is to keep a support contract on older gear. At some point, the parts to service these become more difficult to find. (I have a customer who insists on still running a late-1990s Sun E10000 on 250MHz procs and SBus cards. They could probably get the same or faster performance using less power if they simply got something from 2010.) Finding parts for older servers, my 1990s-vintage machine being an extreme example, helps make the case for upgrades to newer, more powerful and usually more efficient hardware.
@johnrauner2515 4 years ago
@@buffuniballer If they're in an air-conditioned server room with constant temperature and running 24/7, they will last for years without any hardware failures. I have 2 servers in my home server room with motherboards from 2007 still going strong, and I expect them to still be going in another 10 years. I've worked in the broadcasting industry (radio and television), and equipment there is left running 24/7 so it lasts. 90% of the damage is caused by the thermal shock of turning things on and off.
@buffuniballer 4 years ago
@@johnrauner2515 It's more about the odds and costs associated with a failure for some enterprise customers. What you and I can tolerate in our homes (and I have some 1990s Sun gear that still runs) is quite different from what a financial firm can tolerate while the markets are open, or other such circumstances. The odds of that 1990s or 2000s era server going down are greater, and the difficulty of finding parts is a real concern. Not all enterprise customers are willing to take those risks.
@buffuniballer 4 years ago
@@johnrauner2515 The other thing is that spinning components do fail; fans and hard drives immediately come to mind. Not to mention that the sub-100GB drives of a generation ago are being replaced by multi-terabyte drives. You mentioned (then apparently edited out) the broadcast industry. I don't know if the storage requirements of broadcast are increasing, but typical IT storage growth seems exponential, so the drives from the 1990s and early 2000s just don't cut it for the volume of data collected today. So while I don't disagree with your comments about equipment lasting, there are also very good reasons for enterprise users to upgrade to modern gear: more capability in the same space and lower power consumption come to mind, as well as better availability of repair parts.
@bubba1984 4 years ago
The clarity and no-BS explanation throughout this entire video is unbelievable; it's almost as if you, sir, did not get the memo on what it takes to be in this profession, which is 100% BS around every simple piece of IT. The few dinosaurs left in this field who are willing to tell the truth are what keeps me in IT to this day. Good work sir!
@marvk 6 years ago
Great video Colin, I've enjoyed this! Different from your usual stuff but interesting nonetheless!
@LEVELMotorsports 6 years ago
It's funny to see this. When this was purchased everyone was probably booting from either DOM or SD card to boot a hypervisor and then running storage from a SAN. Now, everyone is coming back to having storage inside the box with products like Nutanix or Cisco HyperFlex. We're now seeing multiple 10Gb, 40Gb, and even 100Gb Ethernet uplinks from hyper converged server nodes to the storage fabric and intelligent and redundant distribution of server storage on local disk within the hyper-converged clusters. Tons of cool stuff, but it's all so expensive. Source: I work for one of the US' largest Cisco resellers ;-)
@raykall 6 years ago
Interesting stuff man!
@comput3rman77 6 years ago
My company just finished migrating off of Cisco blades and an EMC SAN to Nutanix nodes. We installed about $1,000,000 worth of new hardware to do it.
@jamesmulroy59 5 years ago
What's the benefit of separating the CPU from the storage?
@nbenci9005 5 years ago
The benefit is having separate nodes for different tasks. A hypervisor node can be buffed up with memory capacity and CPU power, and the space that would have been used by drives goes to better part arrangement and cooling. With storage nodes you don't have to sacrifice much memory/CPU space or cooling, so you can shove in more drives and cool them adequately. I have 2 of IBM's x3650 M3s; they have 18 RAM slots (128GB max capacity), 2 CPU sockets, and 16 drive slots with a DOM port inside, so you could say they're good all-round servers for both compute and storage. But if you want more processing power, even cramming in the 2 strongest Xeons of that era only gets you so far, whereas the first server in this video takes 4 CPUs and a shitload of RAM.
@bassman87 5 years ago
So true. I also work for a Cisco partner. I suspect Cisco went with HyperFlex because they felt they lost out during the storage wars; many of the top storage companies got bought up by Cisco's competitors, so they feel they finally need to try to compete.
@Browningate 6 years ago
This most definitely does compute. Quite a lot.
@NoorquackerInd 6 years ago
But not anymore ;-; they've been decommissioned
@TheEulerID 3 years ago
I used to design a lot of systems with this generation (and earlier kit), and one very good reason to upgrade to modern stuff is the power consumption. Not just the electricity used by the servers, but the tremendous amount of heat, all of which has to be removed; air conditioning is a huge cost. Then there is the requirement to provide UPS and auxiliary power supplies. When retrofitting an old computer centre which used to contain mainframes, the heat generation went through the roof. Imagine having a row of a dozen 19" racks, so something approaching a hundred servers, each pumping out over 1kW. That's just one row in a data centre room that might have had 20 or more such rows. In practice, we used to have heat budgets of, perhaps, only 4kW per rack. Even that is a huge amount. There were tricks that had to be pulled, like alternating "hot row, cool row" layouts (you do not want your server pulling in the hot air emitted by another server). SAN storage arrays also produce a huge amount of heat (not to forget that something like a giant EMC or HDS array is incredibly noisy). So all these super-high-density servers that put massive amounts of computing power into a rack were great in principle, but for very large data centres, especially older ones, the power consumption and heat generation could be crippling. Fortunately the vendors came to understand this, and the energy efficiency of modern servers is considerably better.
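To put those numbers in perspective, a worked heat-load example using the row and rack figures from the comment above; the conversions are the standard 3412 BTU/h per kW and 12,000 BTU/h per ton of cooling:

```python
# Heat load for the data-centre scenario described above.
racks_per_row = 12
rows = 20
kw_per_rack = 4        # the comment's per-rack heat budget

total_kw = racks_per_row * rows * kw_per_rack   # 960 kW of heat
btu_per_hr = total_kw * 3412                    # 1 kW = 3412 BTU/h
cooling_tons = btu_per_hr / 12_000              # 1 ton = 12,000 BTU/h

print(f"{total_kw} kW -> {cooling_tons:.0f} tons of cooling")
# ~960 kW of IT load needs roughly 273 tons of cooling capacity
# before any safety margin, and every watt of it shows up on the bill.
```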
@coldhardwick 6 years ago
New to the channel. That was fascinating! Thanks!
@mentalplayground 4 years ago
Cool video. Thanks to AMD, those core counts are not that impressive since 2019, but before that... very impressive.
@misfitsman805 6 years ago
Man I'd love to have that first IBM server shown lol
@matthewpalin 5 years ago
Awesome video, very informative. I learned a few things I didn't know about servers.
@mossup- 4 years ago
This video actually kept me interested the whole time where others haven't. I love server stuff, always have since the days of the 386; I just wish I could afford it. Thank you for this video and your time in making it.
@pinkipromise 4 years ago
Decommissioned stuff is cheap. I got an 8-core Dell 720 server for $50
@memadmax69 6 years ago
Dude, I would've absolutely killed for this setup back in the day... Still will. =P
@kyle_mk17 4 years ago
Business machines are surprisingly good. I'm typing this from my HP EliteBook 8770w.
@wildbill23c 6 years ago
They're built to be upgraded, updated, repaired, etc. That's something many of today's computers lack, since manufacturers have started soldering parts to the motherboard so the general consumer can't upgrade anything, forcing people to go back to the store and open their wallets again.
@LudgerPeters 6 years ago
Well, if you buy a normal PC, everything is upgradable. If you buy a laptop or an all-in-one PC, then nothing is upgradable.
@wildbill23c 6 years ago
Weird, I can upgrade my laptops. On many of the newer ones, though, everything is soldered to the board, as are the batteries. Why? To make the consumer buy a new one. On my Lenovo ThinkPad T430, I upgraded the processor to an i7 and the RAM to 16GB, and it's running a 512GB SSD rather than its old spinning drive. You are somewhat limited on peripheral upgrades, but I haven't really thought of any port it's missing that I'd need or that you can't get a USB version of. I understand what you are saying, to a point, though. Depending on the system, some all-in-one PCs can be upgraded somewhat, but once again you're kind of limited by available parts in some cases. With motherboards in laptops and all-in-one PCs, yes, you are often limited by the design of the case due to port locations.
@LudgerPeters 6 years ago
Well, the idea with laptops is that they are trying to go smaller and smaller. On any laptop that is not an ultrabook you can still upgrade the RAM and HDD. You get some weird non-mainstream laptops that take desktop CPUs and are still upgradable; Electronics is one of the more unique brands. But going forward, even with the upgradable parts on my desktop, I still normally have to do a complete overhaul when I do the CPU + RAM + motherboard.
@wildbill23c 6 years ago
Very true. Usually when I upgrade it's to a new machine, or new to me. I run them till they die most of the time, unless I find a great deal on one somewhere that I can't pass up.
@compfreak530 6 years ago
A lot of non-Apple products are still upgradeable. I can upgrade the RAM, hard drive, and even in some cases the CPU on my Dell and Lenovo laptops. But Apple... nope, just toss it.
@ferrer985 5 years ago
I must say I learned a great deal from this vid. Very clear explanation of relevant information in basic language. Nice vid
@nitto999 5 years ago
Loved the video! God, that was some hardware (and still is) for 2012!
@thespreeman401 5 years ago
I remember when having a 1MB SIP memory module was considered amazing tech. Yes, I am old!
@wellsilver3972 2 years ago
Maybe in a year or two a petabyte will be what 1 megabyte was like back then
@therealcanadagaming 2 years ago
comic book guy
@JasonLeaman 6 years ago
I'll take the IBM :) :) Nice video. Definitely do more!
@meatloaf666999 5 years ago
When I was younger I had a Compaq quad Pentium III Xeon server, at 800MHz each if I remember right, with, I want to say, about 4GB of RAM, and it had four 18.2GB SCSI drives. That thing was a boat anchor! I think the spec sheet said it weighed 130lbs! The front panel was like 1/4" metal plate. lol
@stephenroberts7554 1 year ago
I worked on similar servers back in college in the late 2000s. The one we started working on was an older Dell server that had 4 CPUs in it. It was the loudest server I've ever seen; when you first booted it up, it sounded like a jet taking off 😂
@tripjet999 5 years ago
"Next time: We shut down all the servers at Google headquarters."
@ahweikun 4 years ago
tripjet999 google: LULZ, heard of data centre redundancy?
@erkinalp 4 years ago
_sigh_ *YouTube deletes the parent comment* _sigh_
@PicaDelphon 5 years ago
My question: how much for the old used units? I still run a ProLiant 7000.
@ElNeroDiablo 4 years ago
Found this vid after being referenced from your newer decommissioned server vid, and I gotta say - running that at ~240V 50Hz like at home in the EU/UK/Aus/NZ you get 1975W out of each PSU?! daaang, that's beefy!
@Billy_bSLAYER 5 years ago
Lol, I love it. I just worked on one of these last year for Charter (Spectrum). Unfortunately, I would lose my access if I were to record there at all! Thanks for the great info.
@bfgtech48 5 years ago
I'd love to build a gaming PC in one of these, the chassis looks so cool and retro.
@spiral9316 4 years ago
OMG, very cool explanation. I want one
@Locutus 5 years ago
Good video. I was very pleasantly surprised by the depth and knowledge of this video.
@cheetobambito9724 2 years ago
Awesome video! Loved learning about a server used back in 2011! Subbed for more old server videos! :D
@cyberjack 5 years ago
Still pretty fast servers even in 2019... I'd gladly still use them
@MegaMarcusred 4 years ago
I use the x3850 X5 as a student for SAP HANA dev. Really awesome machine!
@llothar68 3 years ago
You can afford the power cost? You must live somewhere in Kentucky or China. The new AMD Epyc can be run even in student dorms on student budgets (at least in 8 years, when they make their way to eBay and can be bought third-hand).
@GreenLinuxPenguin 6 years ago
I am disappointed; I was expecting like LGA 771-era servers. Having one of these in my homelab would be a dream!
@h0ll1s 5 years ago
Thanks for showing this to us, so interesting! Are there any uses for servers outside of server duty? Like maybe processing power for research or other personal use?
@timburton1080 2 years ago
Deployed a load of x3950s for a geoscience company years back, with the scaling cables etc. Each one had a TB of RAM, which was insane for the era.
@krist0sh 6 years ago
I'd love to get something similar for our local LAN-party. Compared to modern servers these become less efficient to run 24/7, but for LAN-parties that occur a couple of times a year they are perfect. I currently run a DL585 G2 upgraded to 48GB RAM and 4x 6-core CPU (which essentially makes it a G6) for virtual game-servers.
@FireAlert 4 years ago
That server has more RAM than my hard drive has space
@paulstubbs7678 4 years ago
Crazy project: turn one of those servers into an external RAM-based drive for your PC! However, considering the power draw, it would be highly advisable to scrap it shortly after verifying it works (and making a YouTube vid of it)
@hariranormal5584 3 years ago
Lel, the biggest servers can go up to 48TB of RAM. And also, IBM's Power10 architecture can somehow support up to 2PB of RAM, yes, 2000TB of RAM. We thought 1PB of storage was epic; SSD density per full rack is around 70PB.
@joejohnson1213 4 years ago
To take it one step further, the first system (IBM x3850 X5) supports connecting 2 systems together through the QPI ports in back to create a single 8-CPU system. Two of the machines he described would then run as a single system with 8 CPUs, twice the cores and threads, and 1TB of RAM.
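A quick sanity check of that scaling math; the per-socket core count below is an assumption (Westmere-EX E7 parts ranged from 6 to 10 cores), and the per-node RAM is the thread's roughly-512GB figure:

```python
# Spec doubling when two 4-socket x3850 X5 nodes are QPI-scaled into one.
sockets_per_node = 4
cores_per_socket = 8     # assumed; actual E7 parts had 6-10 cores
ram_per_node_gb = 512    # the "500 gigs" config from the thread
nodes = 2

sockets = sockets_per_node * nodes          # 8 sockets
cores = sockets * cores_per_socket          # 64 cores
threads = cores * 2                         # with Hyper-Threading
ram_tb = ram_per_node_gb * nodes / 1024     # 1.0 TB

print(f"{sockets} sockets, {cores} cores, {threads} threads, {ram_tb} TB RAM")
```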
@dj9choco 4 years ago
Very crazy configuration on those servers. I have a pair of ML350s with dual Xeons, also running VMware... Excellent video explaining a bit about enterprise hardware.