
INSANE PetaByte Homelab! (TrueNAS Scale ZFS + 10Gb Networking + 40Gb SMB Fail) 

Digital Spaceport
18K subscribers · 45K views

Check out our INSANE 1PB network share powered by TrueNAS Scale!
FEATURED GEAR:
Netapp DE6600 geni.us/netapp_de6600_60bay
NetApp DE6600 digitalspaceport.com/netapp-d...
DE6600 SAS2 Modules geni.us/6600_SAS2_EMM
DE6600 PSU X-48564-00-R6 geni.us/DE6600_PSU
DE6600 Replacement Tray X-48566-00-R6 geni.us/DE6600_tray
DE6600 Fan Canister X-48565-00-R6 geni.us/DE6600_fan_canister
👇HOMELAB GEAR (#ad)👇
RACK - StarTech 42U Rack geni.us/42u_Rack
DISK SHELF (JBOD) + CABLE
Netapp ds4246 geni.us/netapp_4246_caddies
Netapp ds4243 geni.us/netapp-ds4243-wCaddy
QSFP to 8088 (SAS Cable needed for 4246 & 4243 JBODs) geni.us/mCZCP
HARD DRIVES shop.digitalspaceport.com
RAM
DDR4 RAM geni.us/DDR4_ECC_8x32GB
SERVER
Dell r720 geni.us/OAJ7Fl
Dell r720xd geni.us/5wG9n6
Dell t620 geni.us/dell_t620_256gb
SERVER RAILS + CABLE MANAGEMENT
APC Server Rails geni.us/APC-SERVER-RAILS
Cable Zip Ties geni.us/Cable_ZipTies
Monoprice 1U Cable Mgmt geni.us/Monoprice_1UCableMgmt
Cable Mgmt Tray geni.us/ServerRackCableMgmt
Dymo Label Maker geni.us/DYMO_LabelMaker
HBA
LSI 9207-8e geni.us/LSI-9207-8e
ENCLOSURE
Leviton 47605-42N geni.us/leviton_47605-42N
SWITCH
Dell 5548 Switch geni.us/Dell_5548
Mellanox sx6036 Switch geni.us/Mellanox_SX6036
Brocade icx6610 Switch geni.us/Brocade_ICX6610
UPS
Eaton 9PX6K geni.us/Eaton9PX6K
Eaton 9PX11K geni.us/Eaton9PX11K
Be sure to 👍✅Subscribe✅👍 for more content like this!
Join this channel to get Store discounts + more perks www.youtube.com/@digitalspace...
Shop our Store (receive 3% or 5% off unlimited items w/channel membership) shop.digitalspaceport.com/
Please share this video to help spread the word and drop a comment below with your thoughts or questions. Thanks for watching!
☕Buy me a coffee www.buymeacoffee.com/gospaceport
🔴Patreon / digitalspaceport
🛒Shop
Check out Shop.DigitalSpaceport.com for great deals on hardware.
DSP Website
🌐 digitalspaceport.com
Chapters
0:00 TrueNAS Scale PetaByte Project
0:48 Unboxing a PetaByte
1:55 Putting drives in NetApp DE6600
4:22 JBOD Power Up
4:47 Wiring Up 40Gb Network
7:00 ZFS SSD Array Install
8:10 TrueNAS Scale Hardware Overview
9:24 Create ZFS Flash Array
10:00 Create PB ZFS Array
11:00 Setup SMB Share TrueNAS Scale
12:30 Map 1PB Network Share
13:05 Moving Files over 40Gb
14:30 40Gb network SMB Windows 11
16:20 Troubleshooting SMB Windows networking performance
19:35 Could it be the EPYC CPU?
#homelab #datacenter #truenas #zfs #homedatacenter #homenetwork #networking
Disclaimers: This is not financial advice. Do your own research to make informed decisions about how you mine, farm, invest in and/or trade cryptocurrencies.
*****
As an Amazon Associate I earn from qualifying purchases.
When you click on links to various merchants on this site and make a purchase, this can result in this site earning a commission. Affiliate programs and affiliations include, but are not limited to, the eBay Partner Network.
Other Merchant Affiliate Partners for this site include, but are not limited to, Newegg, Best Buy, Lenovo, Samsung, and LG. I earn a commission if you click on links and make a purchase from the merchant.
*****

Science

Published: Jun 9, 2024

Comments: 173
@HomeSysAdmin · 1 year ago
2:36 Ooooh that perfect drive cube stack!! Wow, 1PB in a single array - you're making my 8x 18TB look tiny.
@DigitalSpaceport · 1 year ago
I had a hard time committing to taking them and putting them in the JBOD; the stack looked so good.
@CaleMcCollough · 9 months ago
He must be single. There is no way the wife would allow that much server hardware in the house.
@quochung9999 · 5 months ago
He paid off the house, I guess.
@GaryFromIT · 4 months ago
He does in fact have a wife, she even videos him with it for hours at a time.
@slug.racing · 7 days ago
Maybe you should borrow some big boy pants.
@BigBenAdv · 1 year ago
You probably need to look into NUMA and QPI bus saturation being the issue on your TrueNAS box, since it's an older dual-socket Xeon setup. Odds are the QPI bus is saturated when performing this test.

For some context: I successfully ran single-connection sustained transfers up to 93Gbit/s (excluding networking overheads on the link) between two Windows 2012 R2 boxes in a routed network as part of an unpaid POC back in the day (2017). Servers used were dual-socket Xeon E5-2650 v4 (originally) w/ 128GB of RAM, running StarWind RAMdisk (because we couldn't afford NVMe VROC for an unpaid POC). Out of the box without any tuning on W2012R2, I could only sustain about 46-50Gbit/s. With tuning on the Windows stack (RSC, RSS, NUMA pinning & process affinity pinning), that went up to about 70Gbit/s (the QPI bus was the bottleneck here). Eventually, I took out the 2nd socket proc from each server to eliminate QPI bus saturation and the pinning/affinity issues and obtained 93Gbit/s sustained (on the Arista switches running OSPF for routing, the actual utilization with the networking overheads was about 97Gbit/s). The single 12C/24T Xeon was only about 50% loaded with non-RDMA TCP transfers. The file transfer test was done with a Q1T1 test in CrystalDiskMark (other utilities like diskspd or Windows Explorer copies seem to have some other limitations/inefficiencies).

For the best chance at testing such transfers, I'd say you should remove one processor from the Dell server running TrueNAS:
1) Processes running on cores on socket 1 will need to traverse the QPI to reach memory attached to socket 2 (and vice versa).
2) If your NIC and HBA are attached to PCIe lanes on different sockets, that's also traffic that will hit your QPI bus.
3) Processes on socket 1 accessing either the NIC or HBA attached to PCIe on the 2nd socket will also hit your QPI bus.
All of these will potentially end up saturating the QPI and 'artificially' limit the performance you could get.
By placing all memory, the NIC, and the HBA on only one socket, you can effectively eliminate QPI link saturation issues.
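[Editor's note] Point 2 above can be checked on Linux (TrueNAS Scale is Debian-based) by reading each device's NUMA node from sysfs. A minimal sketch; the interface name and PCI address below are placeholders for your actual NIC and HBA:

```python
from pathlib import Path

def numa_node(sysfs_device_dir: str) -> int:
    """Return the NUMA node of a PCIe device from Linux sysfs, or -1
    if the kernel did not report one (or the path does not exist)."""
    try:
        return int((Path(sysfs_device_dir) / "numa_node").read_text().strip())
    except (OSError, ValueError):
        return -1

# Hypothetical paths -- substitute your real NIC name and HBA PCI address
# (the HBA address can be found with `lspci | grep -i sas`).
nic = numa_node("/sys/class/net/eth0/device")         # NIC, via its net interface
hba = numa_node("/sys/bus/pci/devices/0000:03:00.0")  # HBA, via its PCI address

# If both report a node and they differ, NIC<->HBA traffic crosses QPI/UPI.
cross_socket = (nic >= 0 and hba >= 0 and nic != hba)
print(f"NIC node={nic} HBA node={hba} cross-socket traffic={cross_socket}")
```

A value of -1 simply means the platform did not report a node (e.g. a single-socket board).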
@DigitalSpaceport · 1 year ago
Incredible info. Thanks for writing this up. I will remove a processor and test this out with the NIC and HBA attached.
@punx4life85 · 1 year ago
Awesome vid! Thanks g! Picked up another 66tb for my farm
@thecryptoecho · 1 year ago
Love catching up on your build. You never stop building.
@DigitalSpaceport · 1 year ago
I'm going to get into software more at some point here, but man machines are fun!
@rodrimora · 1 year ago
I believe the Windows Explorer copy/paste is limited to 1 core, so that would be the bottleneck. Also, I think at 14:40 you said "write cache", but the RAM in ZFS is not used for write cache as far as I know, only for read cache.
@DigitalSpaceport · 1 year ago
Yeah, I'm checking into robocopy GUI here. I spent a day trying to get SMB multichannel to work with the other 10Gb NIC in the computer, so I hope to be able to track it down soon. In the past my Ryzen 3600 stomped this transfer speed, so you're spot on. I thought RAM buffered writes in ZFS?
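[Editor's note] Since SMB multichannel keeps coming up in this thread: on the Samba side (which TrueNAS Scale uses for SMB) it is gated by a single smb.conf option, exposed in TrueNAS as an SMB service auxiliary parameter. A sketch only; the Samba project has long marked this feature experimental on Linux:

```
# smb.conf fragment -- server side of SMB multichannel (experimental in Samba)
[global]
    server multi channel support = yes
```

The Windows client side additionally needs multiple NICs or an RSS-capable NIC; `Get-SmbMultichannelConnection` in PowerShell shows whether multichannel actually engaged for an open share.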
@xxcr4ckzzxx840 · 1 year ago
@@DigitalSpaceport Rodri is right here. Single core only for Windows Explorer copies. SMB multichannel is a PAIN between Windows and Linux. If you ever get that to work reliably, make a dedicated video about it PLEASE!
For the buffered writes: you are right here. ZFS buffers ~5s of writes in RAM and moves it to the disks. That's btw the reason disk benchmarks on ZFS are useless, and also why the copying speed fluctuates quite a bit on most spinning-rust setups. There is an option to tune it ofc, so that you buffer exactly as much data in RAM as your drives can write until the RAM buffer is full again. If I could only remember the name...
EDIT: If SMB multichannel is a no-no, then try NFS v4 with a Linux system. It will perform substantially better, as SMB is single-threaded too, iirc.
EDIT2: openzfs.github.io/openzfs-docs/Performance%20and%20Tuning/Module%20Parameters.html#zfs-txg-timeout - Have a look at that. It's the cache thing above and might need tuning in your setup. BEWARE! This is a deep, deep rabbit hole!
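[Editor's note] The tunable the EDIT2 link points at is `zfs_txg_timeout` (seconds between transaction-group commits); the amount of dirty data ZFS will hold in RAM is bounded separately by `zfs_dirty_data_max`. A sketch of where these live on Linux OpenZFS, with illustrative values only, not recommendations:

```
# /etc/modprobe.d/zfs.conf -- illustrative values only
options zfs zfs_txg_timeout=5                # commit a txg at least every 5 s (the ~5 s buffering mentioned above)
options zfs zfs_dirty_data_max=4294967296    # cap in-RAM dirty (unwritten) data at 4 GiB
```

Both can also be changed at runtime via `/sys/module/zfs/parameters/`, which is safer for experimenting than a reboot-persistent modprobe file.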
@BloodyIron · 8 months ago
Thanks for showing these examples!
@DigitalSpaceport · 8 months ago
I try man I try. It's a lot of guesswork over here so appreciate the feedback on things you find valuable.
@BloodyIron · 8 months ago
@@DigitalSpaceport Well I've more been watching a bunch of your eps for IT specific infra, not the crypto stuff myself. So thanks again!
@chrisumali9841 · 11 months ago
thanks for the demo and info, MegaUpload lol... Have a great day
@pfeilspitze · 9 months ago
19:38 "now we have this set up in a much more common-sense [...]" -- I'm a ZFS noob, but is 60 disks in a single Z2 really a good idea? Seems like the odds of losing 3/60 disks would be relatively high, particularly if they all come from one batch of returned drives. What if it was 6x (RaidZ2 | 10 wide) instead, say? Then it could stripe the reads and writes over all those vdevs too...
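[Editor's note] The intuition in this comment can be made concrete with a back-of-the-envelope model: independent drive failures within one rebuild window, pool lost when any single vdev loses more than its parity count. The per-drive failure probability p=0.02 is arbitrary, and real ZFS padding/metadata overhead is ignored:

```python
from math import comb

def usable_fraction(vdevs: int, width: int, parity: int) -> float:
    """Usable share of raw capacity for a pool of identical raidz vdevs
    (ignores ZFS metadata, padding, and slop-space overhead)."""
    return vdevs * (width - parity) / (vdevs * width)

def pool_loss_probability(vdevs: int, width: int, parity: int, p: float) -> float:
    """Chance the pool dies if each drive independently fails with
    probability p inside one rebuild window: the pool is lost as soon
    as any single vdev loses more than `parity` drives."""
    vdev_survives = sum(comb(width, k) * p**k * (1 - p)**(width - k)
                        for k in range(parity + 1))
    return 1 - vdev_survives**vdevs

# Same 60 drives: one 60-wide raidz2 vs six 10-wide raidz2 vdevs.
# The wide layout keeps ~97% of raw capacity vs 80%, but is far more
# likely to lose the whole pool; the 6-vdev pool also stripes IOPS.
for layout in [(1, 60, 2), (6, 10, 2)]:
    print(layout,
          f"usable={usable_fraction(*layout):.0%}",
          f"loss_risk={pool_loss_probability(*layout, p=0.02):.2%}")
```

Under this crude model the single 60-wide raidz2 trades a few percent of extra capacity for a roughly order-of-magnitude higher pool-loss risk, which is the trade the comment is pointing at.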
@christiandu7771 · 1 year ago
Thanks for your video. Can you tell me where you buy these disks (not available in your shop)?
@DigitalSpaceport · 1 year ago
Hard drives do sell fast. We notify paid channel members first of new stock as soon as it is posted to the store. Members also receive a 3% or 5% discount code depending on which level they sign up for (a code is generated for each month which has unlimited use while the code is active.) ru-vid.comjoin Another way you can get notified is to subscribe to the e-mail list on shop.digitalspaceport.com. We send out an e-mail notification when hard drives come back in stock if there are any left after the channel members have been notified.
@arigornstrider · 1 year ago
That stack of 20TB drives! 🤤
@DigitalSpaceport · 1 year ago
#density
@TheSouthernMale · 1 year ago
He is just trying to ensure he keeps ahead of me in the pool 🤣🤣🤣🤣
@DigitalSpaceport · 1 year ago
That word you use there, "trying"... are you sure you're using it right?
@TheSouthernMale · 1 year ago
@@DigitalSpaceport Of course I am. Be careful, or one day you will feel a strong wind as I pass you by. 😛 Maybe someday you and I will be number 1 and 2 in the pool, you of course being the latter. 😛😛
@TheSouthernMale · 1 year ago
@@DigitalSpaceport You also seem to forget that while you are using compressed plots to stay ahead of me, I am not, so far. Just imagine that strong wind as I pass you by once I compress them. 🤣🌪🌪🌪🌪
@laughingvampire7555 · 11 months ago
that sound of fans is just relaxing to me
@DigitalSpaceport · 10 months ago
10 hour server white noise video hummm yeah
@JasonsLabVideos · 1 year ago
Nice setup !! Looking skookum man !! Keep going !
@DigitalSpaceport · 1 year ago
Thanks appreciate it 👍
@notmyname1486 · 7 months ago
Just found this channel, but what is your use case for all of this?
@TheSasquatchjones · 1 year ago
My God! it's been a while. Great video!
@DigitalSpaceport · 1 year ago
Howdy 🤠
@mitchell1466 · 1 year ago
Hi, love your videos. I noticed when you were in the iDRAC that you were on a Dell 720XD. I am looking at going to 10Gb for my setup and was wondering what 10Gb NIC you have installed?
@DigitalSpaceport · 1 year ago
Mellanox ConnectX-2
@mitchell1466 · 1 year ago
@@DigitalSpaceport Hey, thanks for the reply. Did you have any difficulty getting it to work with Scale, or did you just plug it in and TrueNAS picked it up?
@user-ve9ju8zg1q · 1 year ago
Hello, is it possible to update the ESM/EMM firmware?
@TannerCDavis · 1 year ago
Aren't you limited to 6Gbps SAS cable connections? Do you have the multipath option on to get above 6? The speeds above 12Gbps are probably due to writing to RAM; then it slows down writing to disk through the wire connections.
@DigitalSpaceport · 1 year ago
SAS2 is pretty decent if you have wide mode running. The DE6600 can do that on the first connected device but not on the second daisy chained (that I have been able to figure out at least)
@thumbnailqualityassurance7853
How does TrueNAS know how to light the disk failure LED on the NetApp if a disk fails?
@ernestoditerribile · 1 year ago
It's not TrueNAS doing that. It's the HBA/RAID/disk controller of the NetApp checking the S.M.A.R.T. status.
@TVJAY · 1 year ago
I am new to your channel, is there any chance you can do an overall tour of your setup and how you got to where you are?
@DigitalSpaceport · 1 year ago
What a great idea. I'll get one in the works here. Welcome and thanks!
@Mruktz · 11 months ago
I have a humble homelab, but what would you even realistically need a petabyte storage system for?
@DigitalSpaceport · 11 months ago
Video on all that soon.
@samishiikihaku · 1 year ago
Not sure of the differences, but Dell, before EMC, used the same enclosure style as well: PowerVault MD3060E and other varieties. Though the prices may be a bit different.
@DigitalSpaceport · 1 year ago
NetApp made both of these variants and there are not meaningful differences that I have seen at all provided you use the SAS controller for the management node. Just stickers and of course the Dell front bezel. I do have some of the Dell branded EMMs and they work out perfect.
@samishiikihaku · 1 year ago
@@DigitalSpaceport Yep. Just wanted to show another option, in case people can't find the NetApp version.
@xtlmeth · 11 months ago
What SAS card and cables did you use to connect the JBOD to the server?
@DigitalSpaceport · 11 months ago
LSI 9207-8e and 8088-8088 cables. Linked in description to exact ones I bought.
@Firesealb99 · 1 year ago
You had me at "caddies not needed"
@DigitalSpaceport · 1 year ago
It feels good to not need to screw in caddies for sure
@maximloginov · 11 months ago
Hello. What kind of RAID are you finally using for plotting? Stripe? Raidz2?
@DigitalSpaceport · 11 months ago
RAID0 of 12 disks. So far none have blown out and the performance is awesome. Worst case one blows out and I need to replot it, not a huge deal. I can do that in a few days.
@fisherbu · 1 year ago
Nice job! How do you make a plot only 74GB?
@DigitalSpaceport · 1 year ago
Gotta create it with the Gigahorse or bladebit cuda. Bladebit isn't farmable yet, but Gigahorse is
@watb8689 · 1 year ago
You have some insane homelab. How is the energy bill coming?
@DigitalSpaceport · 1 year ago
Runs 225-250 per month. Not really that bad.
@juaorok · 1 year ago
That's awesome. Right now I have a Supermicro SC836 16-bay with 7x 12TB HDDs and 96GB of RAM. I'm upgrading little by little, saving money to upgrade my network.
@DigitalSpaceport · 1 year ago
I'm fairly unhappy with my 40Gb performance but attribute it to the EPYC not having a high core speed + smb multichannel not working as I was hoping. The 10Gb nic is an easy win on almost any machine and fully maxes out here however.
@electronicparadiseonline2103
That's freakin insane. You're out of your mind, DS. That's a ton of storage, and you look like you just came home from the grocery store or something.
@linmal2242 · 10 months ago
Does this array run 24/7 or is it powered down most of the time? What is your power bill like?
@DigitalSpaceport · 10 months ago
Runs 24/7 and is around 2.2 amps at 245V, so 540W per JBOD. Per disk that is 9 watts, which is among the best per-tray efficiency of the JBODs I have measured.
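[Editor's note] The arithmetic in this reply checks out; as a quick worked example:

```python
amps, volts, drives = 2.2, 245.0, 60   # figures quoted in the reply above

watts_per_jbod = amps * volts          # 2.2 A * 245 V = 539 W (~540 W as stated)
watts_per_drive = watts_per_jbod / drives

print(f"{watts_per_jbod:.0f} W per JBOD, {watts_per_drive:.1f} W per drive slot")
```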
@user-qs6ws3sw5p · 11 months ago
Quick question: can the DE6600 handle SATA HDDs or only SAS? Could I just buy a SAS HBA card and plug it into my Ubuntu server? Thanks
@DigitalSpaceport · 11 months ago
It handles SAS or SATA. Get like a 9207-8e vs say a 9205 or 9200
@user-qs6ws3sw5p · 11 months ago
@DigitalSpaceport Thanks for the reply. Can one controller handle all 60 drives with an acceptable response time? (Chia farming ;-))
@trousersnake1486 · 11 months ago
This is waaaay above my knowledge base of PC hardware, but what I do understand is impressive. Looking to upgrade from my Ryzen 5900X X570 system to an EPYC system when finances allow.
@DigitalSpaceport · 10 months ago
It's funny how things get out of control in life lol
@visheshgupta9100 · 11 months ago
I was wondering if you could do a video on TrueNAS Scale with multiple nodes. There is no video on YouTube that discusses this in detail. In layman's terms: I would like to deploy 3 different servers and control them all from one place. My question is, do we need to install TrueNAS Scale on every server? Or do we have 1 TrueNAS Scale server while the others run TrueNAS Core?
@DigitalSpaceport · 10 months ago
If you need TrueCommand (the software that manages multiple nodes) I think you need to contact them about a license. I have heard it's affordable but I don't really know that. You don't really mix and match Core + Scale.
@visheshgupta9100 · 10 months ago
@@DigitalSpaceport Thank you, that is exactly what I was looking for.
@visheshgupta9100 · 10 months ago
@@DigitalSpaceport I am planning on building a homelab, and I was thinking of having multiple servers for different kinds of media: for instance, one for game storage, another for critical data & backup, and so on. So essentially I won't need all the NAS boxes running at all times. I want to be able to power on just the system I need, so it would work as a standalone system, and also have the ability to control all the systems from one place so that I don't have to configure users/permissions/shares for each and every system individually.
@visheshgupta9100 · 10 months ago
​@@DigitalSpaceport I was considering TrueNAS at first, but now I am kind of leaning towards UnRaid due to their recent implementation of ZFS. So essentially I could use UnRaid known for its parity, or use ZFS known for its speed and reliability or use both at the same time. In terms of flexibility, to mix and match drives, ease of use, low hardware requirements, I believe UnRaid has an upper hand. What are your thoughts?
@MHM4V3R1CK · 10 months ago
@@visheshgupta9100 I think TrueCommand is free now. TrueNAS Scale is the Linux/Debian flavor of TrueNAS and Core is the older but very stable BSD flavor, btw.
@gustcol1 · 1 year ago
I have the same problem with my 100Gbps network..
@ewenchan1239 · 1 year ago
Unless you're writing to an array of enterprise grade NVMe U.2 SSDs, 100 Gbps for storage for a home lab user -- you'll never be able to hit more than a few percent of the 100 Gbps line speed/capacity. (I have 100 Gbps as well (Infiniband).) Even if you enable NFSoRDMA, if you're going to be using spinning rust, it's not going to make THAT much of a difference. (The highest I've been able to momentarily get is about 32 Gbps kernel/cached writes. More often than not, my system hovers around 16 Gbps nominal max.)
@user-ve9ju8zg1q · 1 year ago
How do you solve the problem of the SATA hard disk's indicator light not being on?
@DigitalSpaceport · 1 year ago
The SATA light is technically on, it's just very, very faint. It does not show well on the JBOD with the camera. If it is important to you to have the full blinking light, then a SATA-to-SAS interposer would be needed.
@trillioncap · 1 year ago
Wow, incredible.
@StenIsaksson · 11 months ago
I heard something about Windows not being supported on EPYC CPUs. Wrong, apparently.
@DigitalSpaceport · 11 months ago
Yeah, so just use the Ryzen Master drivers and it works great. I was discouraged initially also by what I had been seeing others say.
@rfitzgerald2004 · 7 months ago
On your shop, do you ship to the UK?
@DigitalSpaceport · 7 months ago
Yes just make sure to add your phone number on the form. Customs will stop the shipment w/o that.
@jonathan.sullivan · 1 year ago
I'm interested in seeing what you can get for performance with multiple vdevs. Tom Lawrence has a good breakdown video on logically how many disks per vdev one should have for performance. 1 raidz2 vdev across 60 disks def isn't it, but it's fun to see. #subscribed
@DigitalSpaceport · 1 year ago
For sure it was just to show the size lol but I am working on a followup here. Looks like I have a solution for the pathetic SMB transfer speeds also.
@apdewis · 22 days ago
The power bill on that setup must be monstrous. My more modest setup of R630s and 320s costs me enough. I am envious.
@DigitalSpaceport · 21 days ago
One would think but I'm base rate 10c/kWh and so it isn't that bad. I also shed load dynamically via HA and proxmox and don't run all the machines at once very often, unless needed. Usually about 250 total with the house for the electric bill. Cooling is consistently the largest user in the garage.
@apdewis · 21 days ago
I can only envy that per-kWh rate as well. Mine is somewhere around A$0.22. That said, it still ends up being a lot better value than AWS is at work...
@capnrob97 · 8 months ago
For a home lab, how could you even begin to use a petabyte worth of storage?
@LampJustin · 1 year ago
You don't plan on using a single Rz2 in production, right? Right? One Rz2 shouldn't be much wider than 8 drives for optimal performance and redundancy. Recovering from a failure with a 60 drive z2 would take a freaking long time and chances are really high that other drives will go boom as well. It has to read all 1PiB after all...
@DigitalSpaceport · 1 year ago
Nooooo. I have a second video filmed with the same array, and it's much more common sense and safe in its layout.
@LampJustin · 1 year ago
@@DigitalSpaceport good good, the other setup would have been pure insanity :D
@TheDropForged · 6 months ago
Serious question: I see people with home servers equivalent to enterprise gear. Do people really need that size?
@TheDropForged · 6 months ago
Ah, you do crypto stuff.
@DigitalSpaceport · 6 months ago
Yeah, I had a smaller half rack prior, which is more than enough nowadays for most homelabbers.
@skyhawk21 · 1 year ago
Need help: got a WHS server with 50TB of drives, need a cheap good-quality 2.5Gb switch, maybe with a 10Gb port, and also a cheap quality 10Gb card for the server. 1Gb don't cut it.
@DigitalSpaceport · 1 year ago
Skip the 2.5G and just roll out 10Gbit. You will be much happier. This mikrotik switch is great to start with. geni.us/goNi9C
@carbongrip2108 · 1 year ago
I hope you enabled Jumbo Frames on all your NICs...
@DigitalSpaceport · 1 year ago
Yes I do. I have a home-side segment that bridges to these devices, but the big gear is all on jumbo.
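[Editor's note] "Jumbo frames" here means raising the MTU from the default 1500 to 9000 on every NIC, bridge, and switch port in the path; a single 1500-byte hop quietly fragments or drops the large frames. A small sketch for checking interface MTUs on Linux (the interface names are placeholders):

```python
from pathlib import Path
from typing import Optional

def mtu(interface: str) -> Optional[int]:
    """Read a network interface's MTU from Linux sysfs; None if the
    interface does not exist on this machine."""
    path = Path("/sys/class/net") / interface / "mtu"
    return int(path.read_text()) if path.exists() else None

# Placeholder interface names -- check every host on the path; a single
# 1500-MTU hop between two 9000-MTU endpoints quietly ruins throughput.
for name in ("eth0", "enp1s0f0"):
    print(name, "->", mtu(name))
```

To verify the whole path rather than one host, a don't-fragment ping of 8972 bytes (9000 minus 20 IP and 8 ICMP header bytes) must succeed end to end: `ping -M do -s 8972 <nas-ip>` on Linux.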
@contaxHH · 9 months ago
Will the floor collapse?
@DigitalSpaceport · 9 months ago
No but I did static load calculations on the 4" 6 slump reinforced slab in the garage and decided to put plate steel squares under the risers to distribute the load slightly better. So far zero issues. It should be okay even fully loaded, which it's not. Calculated to 800, 1200 and 2000 lbs per rack from left to right facing from front. Good question. Also don't put racks on wooden subflooring if they approach 1000 lbs without additional load deflection bracing.
@bokami3445 · 1 month ago
OMG! How long would a scrub take on this monster!
@502RetailPartners · 10 months ago
Are the drives you sell used or new?
@DigitalSpaceport · 10 months ago
Refurbished from Seagate, but they don't have power-on hours like used pulls do.
@philippemiller4740 · 4 months ago
60-wide raidz2 doesn't make much sense haha. Try 10-wide raidz2 x 6; that would make much more sense, no? Maybe you're limited by SMB. Have you tried using iSCSI or NFS?
@Murr808 · 1 year ago
What do you do about Windows updates?
@DigitalSpaceport · 1 year ago
Suffer through them eventually. They only allow you to postpone so much unfortunately. I try to do them when behind KB
@jondonnelly4831 · 1 year ago
You can block it by adding some entries to the hosts file.
@marcelovictor3031 · 1 year ago
How many 20TB HDDs do you have?
@DigitalSpaceport · 1 year ago
The real question is how many of 22TB do I have😜
@blueprint4221 · 11 months ago
watch that bend radius, sir
@DigitalSpaceport · 11 months ago
so about that.... Needed this comment before the video lol
@trininox · 2 months ago
How's the electric bill?
@DigitalSpaceport · 2 months ago
250/mo
@hescominsoon · 1 year ago
Try it over iSCSI instead of SMB. Also, Scale does not perform as well as Core does across the board in my own testing. If you want to run VMs, then Scale is the way to go; if you want only storage, then Core is your best bet.
@DigitalSpaceport · 1 year ago
I've had a lot of folks tell me to go core for performance and I'm going to check it out. I use proxmox for VMs also so no reason really. I will do a video on iSCSI I think after I learn some more about it and give a "noobs perspective" to iSCSI. I used it only once in the past via a Dell MD1000i and it was a painful thing as a result of that. Time for another round!
@hescominsoon · 1 year ago
@@DigitalSpaceport iSCSI in Core is easy... I do not use Proxmox as my hypervisor, so I don't know about setting up iSCSI on that end.
@ewenchan1239 · 1 year ago
I couldn't find the comment where you asked me about SMB Direct in Windows 11, but if you go to Control Panel -> Programs and Features -> Turn Windows Features On and Off -> SMB Direct, it does appear that in Windows 11 you can enable it. Just thought that I would pass that along to you.
@sm8081 · 1 year ago
Envy….Bad feeling, I know…😅
@seelook1 · 9 months ago
I'm going to sound like a kid. THIS IS SO COOL! I WANT IT. My wife will kill me if I ever add something like this lol. 😅
@DigitalSpaceport · 9 months ago
You have 1 life. Live your best one. (advice that gets folks divorced hahaha)
@jimmerin5034 · 11 months ago
5 axis stabilization makin me dizzy ahaha
@DigitalSpaceport · 11 months ago
Ill try to use less 🚁
@GreatVomitto · 1 month ago
You know your RAID is fast when you are limited by the CPU.
@DigitalSpaceport · 21 days ago
A never ending problem except I got some RDMA going now!
@DennisJlg-cu1vw · 1 year ago
Hello, my tip: don't use Win 10 or 11, use a Windows Server. Why? Alex from The Geek Freak found out that there is a network limit in the normal systems that is not active in the server editions. For info.
@DigitalSpaceport · 1 year ago
This is easy to test out! Will toss a server 2022 on and see what kinda perf benefits I can get
@TheSouthernMale · 1 year ago
Considering I have found 8 blocks so far this month and lost a lot of Chia, I am going to try solo farming for a while and see how it goes. Of course, with my luck I will stop winning blocks now. LOL
@DigitalSpaceport · 1 year ago
LMK! Chia cat 🔮
@Alan.livingston · 1 year ago
That’s a lot of adult material you seem to be hoarding there, sir.
@DigitalSpaceport · 1 year ago
The backups backup has backups
@nexovec · 10 months ago
Uhh 60 drives in one vdev :D this video is so wrong :D Good job.
@franciscooteiza · 5 months ago
Not if you are using dRAID (not the case in this video).
@TheSouthernMale · 1 year ago
Hey, if you see me leave the pool it's because I am losing money on it. I calculated that if I was solo farming I would have earned 4.5 additional XCH in the last 10 days. I will wait and see if the winning trend continues.
@DigitalSpaceport · 1 year ago
I have gone that path indeed sir. GL to you, if you show back up I'll know why 😂
@TheSouthernMale · 1 year ago
@@DigitalSpaceport I am not saying I will leave yet, but if I do leave and then come back, it will only be to jump ahead of you in the pool. 😛
@pudseugenio4118 · 11 months ago
looks like a gold bar from a cave
@DigitalSpaceport · 11 months ago
YARRRR 🏴‍☠️🦜
@Paberu85 · 1 year ago
I wonder why somebody would do such a thing to his own house, and wallet?..
@ajandruzzi · 1 year ago
I assume you’re looking for an answer better than “because he can”.
@DigitalSpaceport · 1 year ago
you and me both
@jondonnelly4831 · 1 year ago
Why do you need a petabyte as a home user/small office user?
@DigitalSpaceport · 1 year ago
I have a lot of isos
@rsqq8 · 1 year ago
This man ISO’s 😂
@leo_craft1 · 7 months ago
They cost 1000€+
@linegik · 1 year ago
Chia dump
@FakeName39 · 1 year ago
This dude running amazon from his home
@charliebrown1947 · 9 months ago
You need a higher MTU. You're breaking these files into too many 1500-byte packets (think DDoS attack). Btw, I hope you didn't keep that 60-wide raidz2 configuration... that's silly. Oh, you also have your ARC set to use 50% of your RAM (the default), so half your RAM is literally doing absolutely nothing at all. Change the default to allow it to use 90% or even more.
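[Editor's note] The ARC ceiling referred to here is the `zfs_arc_max` module parameter on Linux OpenZFS (0 means the built-in default, roughly half of RAM). An illustrative fragment only, not a recommendation; on TrueNAS Scale check whether the middleware manages this value before overriding it:

```
# /etc/modprobe.d/zfs.conf -- illustrative value for a 256 GiB box
options zfs zfs_arc_max=206158430208   # ~192 GiB, in bytes

# Or apply at runtime without a reboot:
#   echo 206158430208 > /sys/module/zfs/parameters/zfs_arc_max
```

Leaving headroom below total RAM matters: ARC shrinks under memory pressure, but services and SMB buffers still need space of their own.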
@InSaiyan-Shinobi · 1 year ago
Is this at your house Wtf?
@jj-icejoe6642 · 1 year ago
All that wasted money…
@ewenchan1239 · 1 year ago
The other problem that you might be running into: according to wiki, your Intel Xeon E5-2667 v4 are only 8-core/16-thread processors each, which means that having to deal with/manage a 60-wide raidz2 ZFS pool is going to tax the processors quite heavily when trying to manage that many drives with only 16 cores/32 threads total (possibly). Keep an eye out on your %iowait.
@DigitalSpaceport · 1 year ago
I have part 2 filmed here, of me trying out other, more logical configurations than 60 wide. Out this week!
@ewenchan1239 · 1 year ago
@@DigitalSpaceport I look forward to watching that video. (The reason why I mention to keep an eye out on your %iowait is because my server has 36 drives, 32 of which are handled by ZFS under Proxmox, and under heavy disk load tasks, the %iowait will jump/start to climb until those tasks are finished, and then the %iowait will fall back down.)
@16VScirockstar · 11 months ago
The throughput is constrained by the PCIe 3.x bottleneck: de.wikipedia.org/wiki/PCI_Express
@DigitalSpaceport · 11 months ago
I think SAS2 is bottlenecking even before PCIe 3 here. I have another video in the works that will check into it, but pushing the limits of my spindles is a topic I am diving into more and more.
@16VScirockstar · 11 months ago
@@DigitalSpaceport Nice! Just out of curiosity, what do you need this setup for? It must be immensely expensive. As a former SAN admin, I can tell you: striping over more disks doesn't give you linear performance gains. I remember the threshold, 17 years ago, was at around 10 disks per stripeset.