
What is the best GPU, the A6000 or the RTX 3090? The truth about rendering: Workstation GPU vs. Gaming GPU

Mediaman Studio Services
12K subscribers
229K views

#Nvidia #RTX #WorkstationGPU
In this video I look at rendering with the RTX A6000 vs. the RTX 3090 in Blender, Maya and C4D to see which is the king of the GPUs.
Disney Data set download by Darby Edelen:
www.redshift3d.com/forums/vie...
Redshift render:
www.redshift3d.com/
Blender demo files: www.blender.org/download/demo...
AMD Threadripper Pro:
www.amd.com/en/processors/ryz...
Discord: / discord
Facebook: / mediamanstudioreview

Category: Film

Published: 7 Sep 2021

Comments: 1K
@nickcifarelli8887
@nickcifarelli8887 2 года назад
I'm a 3ds Max arch-viz artist working with V-Ray, and I was watching this video praying that you would not overlook the HUGE importance of VRAM capacity in poly-heavy, complex scenes. Thank you for your review. Yes, the A6000 finished that render in half the time compared to the 3090, due in large part to its 48GB of VRAM. If the GPU doesn't have the VRAM necessary to load the scene with its textures, poly counts, particle simulations, etc., it has to page the textures out to disk, which makes for a far slower workflow. Also, with NVLink you can essentially double your VRAM to 96GB (assuming, of course, you can afford two A6000s!), and that would allow you to load entire cityscapes, forests, etc. HAVING SAID THAT, your point is very valid: for the large majority of content creators out there, on performance per dollar you can get 2 or 3 3090s (at MSRP) for the price of a single A6000. So realistically, you have made a very valid point. I personally am still using a Titan Xp with 11GB and it cannot load heavy scenes with carpets and V-Ray fur. I would gladly upgrade to 2x 3090s rather than shell out for an A6000; you can set one as the dedicated interface card and the other for rendering. Nvidia caters to high, high-end post houses (ILM, Weta Digital, etc.) with the Quadro series, hence the hefty price tag. Yes, the drivers are ISV-certified and the RAM is ECC, but for the average guy out there, or even for a more professional prosumer like myself, an A6000 is a pipe dream. I'd be happy with dual 3090s, tbh. Thank you for the video, and you've got yourself a new subscriber.
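To put rough numbers on the VRAM point above, here is a back-of-the-envelope Python sketch of how quickly uncompressed textures alone can outgrow a 24GB card. The texture counts and resolutions are invented for illustration; none of these figures come from the video or the comment.

```python
# Rough VRAM estimate for uncompressed textures held in GPU memory.
# All counts/resolutions below are hypothetical; real renderers compress,
# mip-map and stream textures, so treat this as an upper-bound sanity check.

BYTES_PER_CHANNEL = 1              # 8-bit maps; use 2 or 4 for 16/32-bit float
CHANNELS = 4                       # RGBA

def texture_bytes(width, height):
    base = width * height * CHANNELS * BYTES_PER_CHANNEL
    return base * 4 // 3           # ~33% extra for a full mip chain

total = (60 * 4 * texture_bytes(4096, 4096)    # 60 materials x 4 maps at 4K
         + 10 * texture_bytes(8192, 8192))     # 10 hero maps at 8K

print(f"Textures alone: {total / 2**30:.1f} GiB")
# ~23 GiB before geometry, BVH, particle caches and framebuffers are counted,
# which is why a 24GB card ends up paging while a 48GB A6000 does not.
```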
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Thanks, Nick. I totally agree with you. Having worked at ILM, I know that these houses use the workstation cards, but the sad fact is that even at the A-list studios not all artists need the pricey workstation GPUs; the bulk of CG work is not that complex: set extensions, some simple 3D additions and FX. But for those complex Avatar-type scenes they need the VRAM in the A6000, and even then all the shots are comped at the end in Nuke, so they use layers.
@nickcifarelli8887
@nickcifarelli8887 2 года назад
@@MediamanStudioServices Nuke is one of my favourite programs. Still, exporting all the render passes for Nuke to composite still requires a hefty GPU. But I am glad we are on the same page. Well said, sir.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Hi Nick, thanks for your comments. I agree that the A6000 is only for a smaller subset of users, but for 70% of all content creators an RTX 3xxx card is all they need. As you stated, you are using an 11GB card, and the 24GB 3090 would be a great upgrade for you. Thanks for watching.
@blackgamingstudio5104
@blackgamingstudio5104 2 года назад
NVLink does not double VRAM, or so many people have said. Are you sure NVLink doubles VRAM?
@blackgamingstudio5104
@blackgamingstudio5104 2 года назад
Is 96GB of VRAM enough for 4K character creation and 4K interior/exterior design in 3ds Max, Substance Painter and Redshift?
@JetCooper3D
@JetCooper3D Год назад
I work at Pinewood Studios UK and work for Marvel, Disney, Lucas, etc. We switched to GeForce cards back on Star Wars Ep. 7 and have never looked back. The GeForce cards are stable and good to go, and the money saved can be routed to other hardware. Great advice to all and great video - subscribed. Thank you. (We use the new RTX 4090 in all of our workstations now; 3090s before.)
@MediamanStudioServices
@MediamanStudioServices Год назад
thanks for sharing your experiences with the channel
@The0zx
@The0zx Год назад
Hi, bro! Do you create 3D models for Marvel movies and Disney Lucas movies? Can you tell me about the computer specs you and your team are currently using? I dream of working on 3D models like you, but right now I don't know how complex the 3D models I will make will be. I need information about RAM capacity, processor, etc.
@goldenheartOh
@goldenheartOh Год назад
@@The0zx Is it still true that Blender 3D is so optimized it can run on a potato? I had a similar dream 20 years ago and Blender 3D was awesome, and I did have a potato for a PC. My point is, I strongly suggest you get a feel for it in Blender before building a PC for it.
@dazrelixs
@dazrelixs 6 месяцев назад
But do you guys render locally or on a farm?
@rupasree8055
@rupasree8055 Год назад
Thanks for doing this video, we really appreciate it, as there are only a few videos regarding workstation GPUs.
@concinnity1240
@concinnity1240 Год назад
This video helped me out so much and answered all the questions I've had while I'm trying to build a workstation for CAD. Thank you so much! Excellent video.
@mikebrown9826
@mikebrown9826 Год назад
Glad I could help
@CrimsonKing666
@CrimsonKing666 Год назад
Something important about the price is that GeForce cards are more unstable. I used to do deep learning work on a GeForce 3090, and it was pretty common to see my computer crash or stop the training. I'm using an RTX A5000 now and I've never had that issue since.
@wonderwonder9027
@wonderwonder9027 2 года назад
I'm a civil engineer. First things first, I'd like to congratulate you on the way you put the video together: really straight to the point and full of just the important information, instead of wasting time talking about things an average viewer won't understand. Second, there is no decent reviewer out there doing tests at the professional level that you do on computer parts, whether GPUs or CPUs, for engineering tasks. Yes, there are a lot of artists out there who would like to know how fast their artwork will render, but as far as I know many professional workloads are not being tested. I don't want to get too technical, but how do these cards handle:
- MATLAB AI workloads
- BIM applications like Revit and Robot, and maybe SAP2000, ETAP and SAFE for structural analysis
- GIS analysis and clustering
- Analysis of aerial photos for environmental, hydraulic and structural purposes
Yes, there are some workloads in 3ds Max and Maya, and some video production, that can push artists to the limits of what this hardware can and can't do, but the way I see it those cases are very rare, and in those rare cases they have big studios and production companies behind them, i.e. they can address Nvidia directly about technical matters. But we engineers reach those limits on a daily basis, and we are the ones who really need to make educated decisions about the hardware we use. Thank you for your time reading this.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Hi Wonder Wonder, thanks for the kind words. I worked on a project for the Warner Bros. World theme park in Abu Dhabi and was amazed at how well a lower-end GPU could handle such complex geo in Revit. I wish we had that kind of viewport performance in Maya or Max. As for how the RTX 3090 or A6000 will handle civil-engineering-type workloads, well, I could not say; I have zero experience with those kinds of projects and apps. You can check out this channel, maybe he can help: ru-vid.com. Thanks for watching my channel.
@andrewfischer247
@andrewfischer247 2 года назад
This was really well done and I appreciate how you compared several scenarios. Subscribed!
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Thanks Andrew
@rapatouille
@rapatouille Год назад
great comparison presentation! love your kitchen studio! very nice concept
@TrueMathSquare
@TrueMathSquare Год назад
I just found your channel and I love it.
@fanimations2363
@fanimations2363 2 года назад
Something i've been searching a lot on RU-vid, great comparison , loved it, thanks!
@MediamanStudioServices
@MediamanStudioServices 2 года назад
thanks F Animations
@originor4751
@originor4751 Год назад
Very useful. Thank you for putting this out!
@yushkovyaroslav
@yushkovyaroslav Год назад
Very good video, it really shows what matters. Honestly an underrated channel, with a lot more relevant content than some of the "bigger" channels out there.
@MediamanStudioServices
@MediamanStudioServices Год назад
Thanks, Y Y. I am looking to do some new videos soon. Just need to get the equipment; that is the hard part.
@yubawang7652
@yubawang7652 2 года назад
Thank you sir! Finally I see someone who knows what he's talking about and shows actual production scenes.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
thank you Yuba Wang. I am trying to provide relevant data.
@NimaCn
@NimaCn 2 года назад
Thanks a lot for the comprehensive video. Subbed for the future videos!
@MediamanStudioServices
@MediamanStudioServices 2 года назад
thanks Nima Chegini
@webdesign6776
@webdesign6776 2 года назад
I always enjoy your videos; in this one I especially liked learning that the Studio-ready drivers have the same bug fixes as the "Quadro" drivers.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Glad you like it! Thanks for watching.
@oscarcampbellhobson
@oscarcampbellhobson 9 месяцев назад
Thank you for being blunt and to the point, not babbling about everything nobody cares about
@craigvfx
@craigvfx Год назад
Would you do a comparison of the new 4090 vs. the A6000 Ada Lovelace cards?
@surroundrive
@surroundrive Год назад
Excellent production design: your set, lighting, audio, video, dialogue...liked and sub'd.
@tyrannicpuppy
@tyrannicpuppy 2 года назад
Very nice. As someone starting to dip their toes into 3D content creation for fun, but only currently has the 4GB 1650 Super I could afford when putting the tower together midway through last year, it's nice to see a video addressing the content usability of the 30 series cards. LTT and that ilk make great videos, but they barely give the render benchmarks a mention and they certainly don't go into this level of detail. I know the new fancier cards are on the horizon, but this has helped convince me to grab a 30 series now and enjoy stable and yet powerful rendering compared to what I'm getting now. I can always splurge again in a few years if the newer ones are really so much better, but by then I might be doing far more complex stuff with it thanks to a few years of practice and need the extra horses.
@kitewinds663
@kitewinds663 2 года назад
Thanks for video, very helpful! A comparison of Solidworks assembly and drawings performance between the A6000 and the W6800 AMD-card would be interesting. Also the A5500 is of interest. Thanks again.
@kentharris7427
@kentharris7427 Год назад
You can rent the cards for $1.50 an hour for one card or $6.00 an hour for 4 RTX6000 cards or $1,000 per month per card, cloud based. I personally have the 3090 card in my PC which is good for most applications. If I need raw speed for any given time I will rent the cards.
@graphguy
@graphguy 2 месяца назад
You said exactly what I wanted to hear. I play zero games, but I do a lot of amateur work in Blender 3D and have been perplexed about whether to go with a new RTX or a studio-ready graphics card. Thanks!
@rahulkamath6984
@rahulkamath6984 2 месяца назад
so what did you actually go with?
@graphguy
@graphguy 2 месяца назад
@@rahulkamath6984 haha decided to go to Italy for 2 weeks, then decide!
@rahulkamath6984
@rahulkamath6984 2 месяца назад
@@graphguy hahaha you don’t need any benchmarking to decide that I guess 😅
@noth606
@noth606 Месяц назад
"You said exactly what I wanted to hear." - Eh, you don't seem to realize, but that is a very bad thing. It means the video isn't only useless to you, it does you a disservice, when you're evaluating options and have a preference, the input you need is the opposite side of yours. If your criteria survives unscathed, you had and still have the right idea, if not - reconsider. I you instead watch things that confirm your preference, you're invalidating your previous preference to a degree because you're just reinforcing it which is worse than doing nothing.
@mikemora6410
@mikemora6410 2 года назад
Thank you for all your content, I truly appreciate it. This video was very helpful.
@armalik11
@armalik11 23 дня назад
Excellent video. Gave some vital information about the cards and how memory has been useful for different situations
@user-xj1ll7qu8l
@user-xj1ll7qu8l 20 дней назад
the rtx 4090 beats the a6000 haha
@steve55619
@steve55619 11 месяцев назад
Don't forget about AI and ML work. Larger LLMs benefit from more VRAM. Also note how much heat you produce and how much power you consume with 2x 3090 in NVLink vs. an RTX A6000.
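For the AI/ML angle, a rough rule-of-thumb sketch of how parameter count maps to VRAM (weights only for inference; naive Adam training adds gradients and optimizer states). The bytes-per-parameter figures are the usual approximations, not measurements of any specific model.

```python
# Back-of-the-envelope VRAM sizing for LLM work; rule of thumb only.
# Real frameworks add activation memory, KV cache and fragmentation on top.

def inference_gib(params_b, bytes_per_param=2):      # fp16/bf16 weights
    return params_b * 1e9 * bytes_per_param / 2**30

def naive_training_gib(params_b):                    # fp16 weights + grads
    return params_b * 1e9 * 16 / 2**30               # + fp32 Adam moments (~16 B/param)

for p in (7, 13, 30, 70):
    print(f"{p:>3}B params | fp16 inference ~{inference_gib(p):6.1f} GiB"
          f" | naive Adam training ~{naive_training_gib(p):7.1f} GiB")

# A 7B model in fp16 (~13 GiB) fits a 24GB 3090; by 13B you are already at the
# limit, and 30B+ is where 48GB cards, NVLinked pairs or offloading tricks matter.
```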
@yayandeleon
@yayandeleon Год назад
Finally, a more sensible benchmark on actual workflow usage for these cards. I'm sick and tired of those gaming benchmarks that treat gaming as the only actual use for GPUs and complain about the high price tag.
@oscaroscar9941
@oscaroscar9941 2 года назад
Just the things that I want to see. Well done!
@MediamanStudioServices
@MediamanStudioServices 2 года назад
thanks Oscar Please check out the rest of the videos on my channel
@nigeldawson5960
@nigeldawson5960 2 года назад
Thanks for the info. I’ve watched many of your vids and they helped me build the best machine for me. Much appreciated.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Thanks for watching Nigel
@qkayaman
@qkayaman 2 года назад
It depends on what you need them for, but for me I need an A6000/A5000, not an RTX 3090. Why? A multi-GPU setup, where I need peer-to-peer (P2P) access between all GPUs; memory transfer between GPUs through host memory is a no-go for me. P2P is part of Nvidia GPUDirect, and on the 3090 it is only possible over NVLink (i.e. only between pairs), so if you want a 4-GPU setup and need P2P, forget about it. With the A6000/A5000, P2P is possible over PCIe, which means running 4 of them is possible. Also, the dual-slot profile makes them easier to stack (I know a blower-type 3090 is available, but it's hard to find). You may also be interested to know that in Windows, in order to enable P2P over NVLink, SLI mode needs to be enabled in the Nvidia Control Panel. Fun fact: Nvidia disabled this in the latest Windows drivers, so if you want it to work, you need to roll back to a pre-Jan 2021 driver! Linux drivers don't have this issue, though.
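A quick way to check what the comment above describes on your own machine: whether GPU pairs can reach each other peer-to-peer over NVLink or PCIe, or whether transfers have to bounce through host memory. This is a sketch that only assumes the standard nvidia-smi tool (and, optionally, PyTorch); the topology matrix legend can vary slightly between driver versions.

```python
# Inspect GPU-to-GPU connectivity. NV1/NV2/NV4 entries mean an NVLink path
# between that pair; PIX/PXB/PHB mean a PCIe path; SYS means the path crosses
# the CPU interconnect, where P2P is typically unavailable.
import subprocess

print(subprocess.run(["nvidia-smi", "topo", "-m"],
                     capture_output=True, text=True, check=True).stdout)

# Optional: ask the CUDA runtime directly (assumes a recent PyTorch build).
try:
    import torch
    n = torch.cuda.device_count()
    for i in range(n):
        for j in range(n):
            if i != j and torch.cuda.can_device_access_peer(i, j):
                print(f"GPU{i} -> GPU{j}: peer access available")
except (ImportError, AttributeError):
    pass  # PyTorch missing, or an older build without this helper
```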
@Oldyellowbrick
@Oldyellowbrick 2 года назад
I think the cost difference is pretty insane, BUT I use Octane Render and I am always maxing out VRAM and having to reduce scenes. Not only does it slow you down considerably when you reach the 'ceiling', but you also tend to get a lot of issues, including system crashes, once you pass the 90% mark of VRAM capacity. 48GB would be very welcome in my workflow, but I will wait to see what the 40 series will offer.
@felixjen3208
@felixjen3208 2 года назад
Thanks for the great video! You got yourself a new subscriber! I would love to see some possible benchmarks on Keyshot and whether the additional VRAM actually makes a difference there. My guess would be no, given the typical lack of crazy complexity in most Keyshot scenes.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Thanks for the sub, i will look into Keyshot for a video
@billywilliam7747
@billywilliam7747 10 месяцев назад
Very good insights I learn from your channel - THANK YOU
@MediamanStudioServices
@MediamanStudioServices 10 месяцев назад
thanks for watching
@thewizardsofthezoo5376
@thewizardsofthezoo5376 Год назад
One thing is power consumption and the lack of VRAM; whether it takes a couple of minutes or hours more to run is less critical. For LLM fine-tuning, those consumer cards are useless because of the lack of memory, since it's the size of the VRAM that determines what you can load for training.
@jeremiahMndy
@jeremiahMndy Год назад
Keep making these, please. I'm a pro 3D artist and your videos have really helped.
@MediamanStudioServices
@MediamanStudioServices Год назад
thanks for watching, I hope to make new videos soon. Sorry for the long delay in making content
@Sitrec
@Sitrec 2 года назад
Just wanted to say that I really appreciate you and your content. There is so much misinformation when it comes to hardware in the creative space and content like this has been really missing.
@mistrrhappy
@mistrrhappy Год назад
I'll second the request for the A6000 vs 4090 comparison! Interesting to see the results!
@EdinGacic
@EdinGacic Год назад
Do you have any tests with dual 3090s with NVLink on big scenes like the one you showed, where the doubled VRAM made a huge difference? I am debating whether adding another RTX 3090 FE to my workstation at 750-800 EUR used is better than selling the RTX 3090 and buying the new RTX 4090. I am leaning more towards two 3090s if NVLink actually works and scales well. It would be cool if you could do a test like this :)
@thomasrichter1219
@thomasrichter1219 Год назад
I have exactly the same thoughts. Have you already made a decision?
@Andbar93
@Andbar93 Год назад
Thanks for the video, I hardly found comparisons between the Quadro and the rtx in a professional environment, I wish you could make comparisons in AI tasks such as generated images.
@tanguero2k7
@tanguero2k7 Год назад
Hi there! Let me save you some time (TL;DR): the results in both rendering and AI, given the same prompt, parameters AND SEED, are the same (on both a 6GB RTX 3060 (mobile) and a much faster 24GB Quadro RTX 5000). The long version: I bought an RTX 3060-based laptop because I would never pay more than 300-400€ (approx. $300) for an 8GB card for both rendering (Blender) and AI (local implementations of Stability AI and BERT-related workloads) work. When I get to where I want, I move everything over to an RTX 5000 at my workplace. Other than the size of the generated images, I can only say the 5000 (naturally) returns results faster: my 6GB 3060 often crashes when attempting to render 4K (Blender), or simply refuses to generate textures/images (Dream Textures in Blender / Stability AI on the shell) above 512x512. This, however, might change in the near future due to a recent paper by Nvidia themselves, where the model used for ray tracing was said to be below 1 MB (yes, you're reading that right: 1 megabyte). Have a look at the Two Minute Papers channel and see for yourself. Oh! BTW, if you'd like to test some workload before buying, let me know. (Edited to add that I also do some occasional photogrammetry work with the free and open-source Meshroom.)
@pedrorivera1892
@pedrorivera1892 2 года назад
Thank you for the video. When doing 3D renders, the biggest difference I found between the A SERIES vs. GEFORCE is temperature.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
thanks for sharing your experiences with the channel
@H2ydrogen
@H2ydrogen Год назад
thanks man, well explained and showcased
@kuhan333
@kuhan333 2 года назад
Hi, great video! I have a couple of questions. 1: What are your thoughts on a GPU for Unreal Engine (content creation / virtual camera / maybe green screen, but no LED wall): RTX 3090 vs. A5000? 2: There are many makers of 3090 cards; which one would you recommend, the Founders Edition or other makers? (I was looking at the 3090 FE vs. the 3090 ASUS ROG Strix, but if you have another recommendation please do share.) Thanks in advance.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
For UE4 I would use the RTX 3090, as UE requires a lot of processing power and the 3090 has more than the A5000. As for brands, sorry, I have not done a comparison of the different brands. I have a Gigabyte Turbo and it has been great for me. I also have a Strix 3060 Ti and that is also a good GPU model for me. So you will have to find what is best and available in your market. Thanks for watching.
@CreativeAudience
@CreativeAudience Год назад
Thank you for your test, I agree with you. I'm a 3D motion graphics designer and animator. I have been working with C4D and Octane Render for many years, and in my experience, working on both Quadro and GeForce, the 3D preview frame-rate performance and rendering of Quadro and GeForce are no different. Quadro is just their marketing and product positioning, but the price is too cruel. Quadro only has more RAM, yet it's five times more expensive than GeForce. For me, that's a huge cost. I have tried to argue with other people over the years about Quadro vs. GeForce, but no one believed me, especially the computer sellers and people who are not graphic designers.
@abzaman77
@abzaman77 2 года назад
Great content! I was wondering if workstation GPUs would make any difference in a complex 2D workflow, like a complex scene or drawing in Adobe Illustrator. I'm currently using a GTX 980, and sometimes when drawings get too complex it becomes slow and sometimes unresponsive to changes; I don't see any measurable GPU usage, but it utilizes the CPU more (mostly 1 or 2 cores). I couldn't find anything about this online, so that's why I'm asking: would a mid-range Quadro / RTX A-series card make my 2D workflow faster in terms of viewport performance and rapid changes to complex drawings in Adobe Illustrator? TIA
@E_Clip
@E_Clip 2 года назад
The memory pooling since the 2080 Tis has been great for production, and I really don't see myself buying a Quadro (or "A", as they are called now) ever again. The pooled VRAM from 2x 3090s is more than enough for my workloads (mostly ArchVis). Great content mate, glad I found you! Keep it up :)
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Glad I could help
@dittofarmers9007
@dittofarmers9007 Год назад
Hi, would you please do the same test, but for computation rather than rendering? Try software such as Ansys Mechanical or Abaqus; these programs have the option to utilize the GPU in the computation. We would really like to know whether GeForce can work as well as Quadro for computing in double-precision mode. Thanks. That would help a lot of engineers out there!
@ronniecoleby
@ronniecoleby Год назад
Just what I was looking for thanks so much. I think this solved my dilemma I'll go for the 3090 and maybe add a second card in the future - with prices tumbling down now that seems more sensible! I'm looking to purchase a workstation for a personal project which uses Metahumans in Unreal Engine. Would be great to see how the two hold up in the viewport in Unreal - in a filmmaking (24fps) context. I know this is a more niche use case though! :-)
@mikebrown9826
@mikebrown9826 Год назад
You may want to research more. I believe Unreal can only use one GPU, and I am not sure if NVLink will work for Unreal. But you could render on one GPU while using the second to continue working in the program.
@kszanika7782
@kszanika7782 Год назад
Thank you for this very informative video. Now that GPU prices are down, I would love to see a test using dual Nvidia RTX 3090 Ti cards, and if possible please test them with video editing software like DaVinci Resolve 18 and Adobe Premiere.
@muser7935
@muser7935 Год назад
Very informative 👏 and quick
@essa07
@essa07 2 года назад
professional comparison …just what I need
@mikebrown9826
@mikebrown9826 2 года назад
Glad I could help
@NarekAvetisyan
@NarekAvetisyan 2 года назад
Great review just what I was looking for! Q: Can you test the memory pooling of 2 RTX 3090's with NVLink in Blender? I'd really like to see that.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Hi Narek Avetisyan, I would love to, but I do not have the equipment anymore; I only had it for a short time to make the videos. Thanks for watching.
@cadetsparklez3300
@cadetsparklez3300 2 года назад
Can you still spoof them into Quadros? Up until the 2000 series, I know you could just remove a resistor and it changes the device ID.
@loganpenciu7317
@loganpenciu7317 2 года назад
You, sir, are speaking my language! I've been looking for a channel that talks about computers in a 3D production studio setting. Subscribed! :)
@MediamanStudioServices
@MediamanStudioServices 2 года назад
thanks for watching Logan
@wonderwonder9027
@wonderwonder9027 2 года назад
Can you please do the following tests on the A6000:
- An Autodesk Revit architecture render
- A MATLAB heat-exchange simulation
- An Autodesk Advance Steel stress and displacement calculation
- An Autodesk Robot wind-load simulation and seismic-load calculation
I know they are outside the scope of this channel, but being a civil engineer I have no idea what to expect from this kind of investment if I'm going to make it, and no other channel is nice enough to read through the comments section, let alone give an answer. Thanks for your time.
@oliverleemans6363
@oliverleemans6363 Год назад
Can You test the A5000 against the RTX 3090 or the RTX 4070 ?
@Betoromero22
@Betoromero22 Год назад
Finally, someone serious who makes videos for creators!!! Thanks for sharing.
@TheKevphil
@TheKevphil Год назад
Glad I found your channel and am a new subscriber! I was looking around with wide eyes the past couple days regarding the A4000 and all I could find were "tests" using games as the benchmark. So I guess I'm looking much more favorably now at the 3070 and 3080. In either case, however, I am looking at the used market, specifically eBay. Any comment on going this route? And finally (don't laugh!), I have a Core i7 970. That was the first 6-core chip. Would a 3070 work in my system (decently) or would the CPU throttle it? (Blender modeling and renders.) Thanks!
@mikebrown9826
@mikebrown9826 Год назад
Glad I could help out. As for the CPU, I don't really know, but since most GPU rendering does not use the CPU for much of the process, I would guess a 3070 would work well. I am also guessing that your system is only PCIe Gen 3, so that is also a bottleneck, but the upgrade to a 3070 will be an improvement.
@Livingston3d
@Livingston3d Год назад
Great, sir! Please help me: will these GPUs (Radeon RX 6800 XT, Radeon RX 6600 XT, Radeon RX 7000 XT) be good for Maya, 3ds Max, ZBrush, Painter, Blender, etc.? Are they good for modeling, viewport navigation in heavy files, and rendering purposes?
@Amarthir
@Amarthir Год назад
They will ;')
@Pixel_FX
@Pixel_FX Год назад
Radeon GPUs render slower in Blender Cycles compared to Nvidia because of OptiX. For every other program they are fine; it's rendering where Radeon cards are slower. I don't know about the upcoming 7000-series performance. I have a 5700 XT and 3080s; my 5700 XT was faster than a 2070 until OptiX was introduced. After OptiX landed in Blender, RTX cards became way faster.
@The0zx
@The0zx Год назад
@@Pixel_FX How about Intel ARC A770 for the same work?
@wagnerdesouza6512
@wagnerdesouza6512 2 года назад
Very interesting channel, testing hardware with professional software and not with games. It would be nice to see GPU tests with Substance Painter.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
that's a great idea for a video. Thanks for watching
@faceless_ghost
@faceless_ghost 2 года назад
You're replying to every comment, really great! You're putting a lot of time into it! Thank you so much 💗
@MediamanStudioServices
@MediamanStudioServices 2 года назад
yes Imran Mohd Abdul, I reply to all comments. Thanks for watching
@ukaszgaluba8981
@ukaszgaluba8981 Год назад
Very good content. Keep it going, mister!
@SteveGrin
@SteveGrin Год назад
Speaking from experience, my A4000 outperforms my 3080 in AutoCAD and Revit in two ways. First, the 3080 lags during certain operations - for example "override graphics in view" in Revit or "layers" in CAD. The second thing is the artifacts you get with the GeForce card when rotating a model, which are annoying. Every time I get a new workstation, I try the current top-of-the-line GeForce card, and every time I end up back on the workstation card.
@animhaxx
@animhaxx Год назад
So what you're saying is the A4000 is better at viewport handling?
@SteveGrin
@SteveGrin Год назад
I guess you could say that.
@bravestbullfighter
@bravestbullfighter Год назад
Interesting! How about follow-up with A6000 vs 4090?
@Royameadow
@Royameadow Год назад
We don't get a lot of testing with the Quadro cards as it is, so it truly is welcome whenever somebody takes the time to compare these products to their GeForce counterparts. The added software compatibility and considerably higher video memory truly do make a difference, and today clearance has become ever more important: the RTX 4090 barely has any dual-slot options compared to the Quadro L6000 (RTX 6000 Ada Generation), so many people are beginning to flat-out ditch GeForce because of the lack of smaller options to fit in a smaller case (I've been using Micro ATX since 1999; never have I used a larger form factor). In this era, the core audience that will benefit heavily from products such as the L6000 is definitely the AI and deep learning space. Now, I don't know what your history is with memory-intensive AI software such as OpenAI Jukebox, but it would be incredibly welcome to see how much faster music continuations render on the L6000 compared to the 4090. Even the Tesla cards hit a major handicap at the 16 and 24 GB thresholds in this particular workload, so jumping to a card with 48 GB or more truly does help the process run smoother without becoming sluggish or crashing with an out-of-memory error. Jukebox is something I truly hope more people in the techtuber scene will showcase as part of their benchmark suite; it doesn't get a lot of attention outside of a select few, and having concrete numbers on how fast it runs under generous conditions would show the Quadros' true worth over GeForce. We're still a few generations away from being able to render a sixty-second sample in the same time as its length or shorter, and it'll be nice to see which cards get the most out of it until that time ultimately comes. (:
@joaoalexdias
@joaoalexdias 2 года назад
Hi, thank you for your review! I'm a 3D character animator using mostly Maya. I worked in the major studios using workstations with both cards you mentioned, and my feeling is that a Quadro card is more efficient in the viewport and compute processes than a GeForce. Even on my personal rigs with lower-end cards I've noticed that; for example, I had a workstation with a Quadro K2000D with 2GB of VRAM and a laptop 5 years newer with a GTX 1060, and I got better viewport speeds on the workstation. The workstation CPU was a bit better, but that wouldn't justify the difference in performance. You're totally right about that comparison with the A4000; I would definitely choose that one over an RTX 3070 or even a 3080. I guess it all comes down to the production usage: if you're going for renders, or even as a 3D generalist, a GeForce might be the better choice budget-wise, but if you're doing things like character animation, VFX and simulations, I would definitely choose a mid-range Quadro.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Hi Joao Dias, I would not compare a K2000 and a laptop 1060; that is not a good comparison at all. You do know that laptop GPUs are rated very differently from desktop GPUs? An RTX 3080 Laptop is not the same as a desktop 3080; the laptop version of a 3080 is more like a 3060 desktop GPU. Laptops just cannot deliver the power required to drive these GPUs. But thanks for sharing your comments and watching the channel.
@rashdanml
@rashdanml Год назад
I think the key point here is that Nvidia only recently started releasing Studio-ready drivers for GeForce cards, as of the 3000 series. It USED to be true that GeForce wasn't suited for workstation usage because of the lack of driver optimizations for the GeForce line. The underlying hardware has pretty much always been the same, with differences in the numbers. Weaker workstation GPUs (i.e. with fewer CUDA cores than GeForce) were still preferred for workstation use because the Studio drivers were better optimized to use that hardware.
@technicallyme
@technicallyme Год назад
I got an A4000 for 400 (what a change a year makes), but it solved my problem with the 3070: not enough memory.
@subhaprakashbeura5120
@subhaprakashbeura5120 2 года назад
You did a great job 💯 Thanks for suggestions
@MediamanStudioServices
@MediamanStudioServices 2 года назад
thanks for watching and the comments
@samfallon2568
@samfallon2568 2 года назад
I would be very interested to know what kind of difference workstation GPUs make in CAD programs such as SolidWorks or Fusion.
@arjayjalmaani
@arjayjalmaani 2 года назад
For real-time video rendering using Unreal Engine, would I still need an A6000, or would two 3090s be as good, if not slightly better?
@MediamanStudioServices
@MediamanStudioServices 2 года назад
I would go with the single A6000, as Unreal does not support multi-GPU rendering.
@arjayjalmaani
@arjayjalmaani 2 года назад
@@MediamanStudioServices Good to know. Thanks for the reply!
@capezonmyback
@capezonmyback Год назад
I can't find a recent NVLink 2x 3090 benchmark. It would be really helpful!
@mohamedsakka2338
@mohamedsakka2338 Год назад
ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-jw_mnwo9Nag.html - you can skip the building part, it might be boring.
@12pure
@12pure 2 года назад
Finally found your video. That is what I was looking for, thanks! Do you still have the 2 GPUs? I would be glad if you could run some AI and molecular dynamics workloads on both cards; that's why I need such powerful GPUs. I'm also an enthusiast of space simulators, so a bit of gaming benchmarks would not hurt. Thank you.
@arjunmohurle
@arjunmohurle 2 года назад
I like a master like you; nowadays people count views without understanding the technology's advantages or disadvantages. Thank you for sharing your knowledge 🙏🏻
@epsilonplus3514
@epsilonplus3514 2 года назад
Can I use multiple mixed GPUs in Blender, like a GTX 1070 and a 1050 Ti?
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Yes, you can, but renders will be limited to the lowest amount of VRAM in any single GPU. This is the limitation of multi-GPU rendering: the render package is bundled to fit within the VRAM limit of a single card. So say you have a card with 6GB and one with 12GB; the render package will only be set up for 6GB, and the second, 12GB card will leave 6GB of its VRAM unused. I hope you understand my response.
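For anyone wanting to try this, here is a minimal Blender Python sketch for enabling every GPU for Cycles. Property names follow recent Blender releases and may differ in older builds; the per-card VRAM ceiling described in the reply above applies regardless of how many devices you enable.

```python
# Run from Blender's scripting workspace: enable all CUDA devices for Cycles.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"        # "OPTIX" is also an option on RTX cards
prefs.get_devices()                       # refresh the detected device list

for dev in prefs.devices:
    dev.use = (dev.type != "CPU")         # turn on every GPU, leave the CPU off
    print(dev.name, dev.type, "enabled" if dev.use else "disabled")

bpy.context.scene.cycles.device = "GPU"
# Per the reply above: the scene still has to fit in each card's own VRAM,
# so a 6GB + 12GB pair is effectively capped by the 6GB card.
```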
@epsilonplus3514
@epsilonplus3514 2 года назад
@@MediamanStudioServices thank you for answering.
@sideffect8
@sideffect8 2 года назад
You should benchmark the 2x NVLinked 3090s vs. the A6000. Graph the memory usage and bandwidth during the pre-render. I'm curious to know how the complex scenes saturate the extra memory.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Hi Sideffect8, I totally would, but I do not have two 3090s anymore. Thanks for watching.
@siminc7905
@siminc7905 2 года назад
NVLink chops off about 10% of performance, so do the math.
@acoenterres2164
@acoenterres2164 2 года назад
Very informative, and it depends on the industry or what people need. Thanks a lot 👍👍👍
@Damuskinous
@Damuskinous 2 года назад
Amazing video! Could I ask about the storage unit on top at the back? It looks really nice and I'm looking for something to store some of my gear.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Hi Damuskinous, thanks for the kind comments. The storage racks in the background of my videos are just IKEA shelves; I put them on a platform to hold them up.
@Mr_i_o
@Mr_i_o 2 года назад
Perhaps you would consider doing performance per dollar with dual NVLink 3090 vs dual a6000 vs single a6000?
@MediamanStudioServices
@MediamanStudioServices 2 года назад
I would love to, but getting the GPUs to make the video is very hard. I am still looking for them, but once I get the cards, you bet I will make a video. Thanks for watching.
@sameerkadam4956
@sameerkadam4956 10 месяцев назад
Nobody is talking about the TDP comparison between workstation and GeForce cards: the cost of running the system and the power bills in a commercial setup, or when running a workstation or server 24x7. The RTX 3090's TDP is 350 W, whereas the A4000's TDP is only 140 W.
@MediamanStudioServices
@MediamanStudioServices 10 месяцев назад
Hi sameerkadam4956, I agree that power usage is a big topic, and I will make a video on this subject. However, using slower GPUs for a render does not necessarily reduce the overall power consumption; it just takes longer to render the frame. Still, looking closely at power utilization is a big factor that gets overlooked when purchasing GPUs for projects. Thanks for watching and for the video topic idea.
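To make the energy point concrete, here is a tiny sketch comparing energy per frame rather than wattage alone. Only the two TDP figures come from the comment above; the per-frame render times are invented placeholders.

```python
# Energy per frame = board power x render time.
cards = {
    "RTX 3090 (350 W)": (350, 10.0),   # watts, minutes per frame (hypothetical)
    "RTX A4000 (140 W)": (140, 26.0),  # slower card, hypothetical longer frame
}
for name, (watts, minutes) in cards.items():
    print(f"{name}: {watts * minutes / 60:.0f} Wh per frame")
# ~58 Wh vs ~61 Wh: a low-TDP card that renders much slower can burn roughly
# the same energy per frame, just spread over more wall-clock time.
```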
@emanggitulah4319
@emanggitulah4319 2 года назад
Great to see this content. As you said, a lot of other channels say that you have to have a Quadro.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Hi Emang, I have worked in so many studios that do not use "Quadro" GPUs. Thanks for watching.
@hamedkhadivi
@hamedkhadivi 2 года назад
Thanks for the really useful data. There is one more question that bothers my mind: what about GPU-based simulations? I wonder if you would mind taking a look at the Houdini 19 showreel; there are roughly a billion voxels in the scene. Which card can handle it better?
@MediamanStudioServices
@MediamanStudioServices 2 года назад
hi Hamed Khadivi, yes, I would like to do some houdini testing in the future. Please stay tuned to the channel for future videos thanks for watching
@m1sterv1sual
@m1sterv1sual 2 года назад
If you are using Unreal Engine, go with the A6000. You can put twice as many textures in the streaming pool, and that's a big deal since you are running real-time with no multi-GPU support.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Hi Mister Visual, yes, you are correct. I wish Unreal were multi-GPU, but if you are doing Unreal productions the A6000 is the best GPU.
@scottsturm7327
@scottsturm7327 2 года назад
@@MediamanStudioServices So if one were making a movie in Unreal, 2 NVLinked A6000s won't make a difference?
@MediamanStudioServices
@MediamanStudioServices 2 года назад
@@scottsturm7327 I would suggest you do your research on how Unreal uses the GPU before you invest in 2 A6000 GPUs; that is a lot of money to spend on graphics processing. It is my understanding that Unreal only uses one GPU to process and render, but don't take my word on this - check Unreal's support page for more information.
@GifCoDigital
@GifCoDigital 2 года назад
@@MediamanStudioServices Just say you have no idea what you are talking about. You borrowed a card for a few days. Why are you putting these videos out if you have no real long-term work experience using these cards?
@MediamanStudioServices
@MediamanStudioServices 2 года назад
@@GifCoDigital Well hello, I love it when someone makes such negative comments on my channel; it really reflects the kind of person they truly are. When you have worked for 25 years for companies like Lucasfilm, ILM and DreamWorks, on movies like Transformers 1, 2, 3, 4 and 5 and three of the Star Wars movies, as well as a few animated feature films, I think that gives me the experience to do RU-vid videos on GPUs for the creative industry. What have you worked on, GifCo? Maybe this channel is not for you.
@Eneeki
@Eneeki 6 месяцев назад
I currently run an RTX 4090 in my Blender workstation and have one simple question: is it better to build a 2nd PC with another RTX 4090 to do network renders, or can I put 2x 4090s in the same system, or, lastly, should I spend big money and get dual A6000s in the same workstation to speed up rendering time for more complex renders? A duplicate of my current system is going to cost about 4k less than dual A6000s, and adding a 2nd RTX 4090 is a fraction of the cost, but I do not know if it is possible without SLI. A work associate of mine has dual A5000s and renders really fast, but I do not know whether it would be better, or possibly even faster, to have 2 computers with 4090s vs. 1 computer with dual A6000s, or even dual RTX 4090s. If anyone has a good answer, or other questions I don't know to ask, please let me know.
@mikebrown9826
@mikebrown9826 6 месяцев назад
I would get a second system if your 4090 can fit all of your projects into its 24GB of VRAM. Also, you do not need a super powerful system to host a 4090 for rendering; a 6-8 core system with 32GB of RAM is good enough for a render node. Adding a second GPU to your current system is also an option, but having a second system dedicated to rendering is a good thing. You can also use Deadline, which is free, to set up network rendering.
@Eneeki
@Eneeki 6 месяцев назад
@@mikebrown9826 Thank you. The RAM is definitely needed; I currently run 128GB of DDR4 3200 and can fill the scene enough that I am often close to 90% RAM usage during rendering. My current bottleneck is definitely VRAM. To give you an idea of my current system: the CPU is a 5950X, the motherboard is an Asus B550 Plus, 128GB of DDR4 3200 RAM, an Asus TUF Gaming 4090, and all drives are solid state. I will be upgrading the CPU soon and will either get a 2nd box or a 2nd video card in this system. I just desperately need to cut down render times, to the point that I am considering spending 10k on dual A5000s or A6000s. Thank you for your input. I will look further into network rendering options.
@Eneeki
@Eneeki 5 месяцев назад
@@mikebrown9826 Well, the VRAM is not enough currently, which is why I am looking for a 2nd GPU. I currently have a 5950X and was going to upgrade to the 7950X3D to prevent another upgrade a year or two down the line. A lower-end Threadripper is also an option, but I know they run much hotter than the socket 5 CPUs. Network rendering is something to consider, but I am worried I will trip the breakers constantly with 2 systems in the office; there are a lot of things in that room that require a lot of power, like the 3D printer. I would probably do well adding another circuit to the room for the 3D printer; that would solve all of those issues. I will look further into network rendering. Thank you.
@rahulkamath6984
@rahulkamath6984 2 месяца назад
Sorry, but I am new to the term "network rendering". Does that mean that you render things over the network? And how is the GPU being used in that case?
@Eneeki
@Eneeki 2 месяца назад
@@rahulkamath6984 Yes. I have a network of 2 computers using Daz 3D's bridge, which allows both computers to dedicate their video cards' resources to rendering. It is faster for rendering, but know that it needs a really fast network: a 10G router and network cards at a minimum for it to really make a difference. This is a very layman's way of saying it, but it allows both computers' video cards to work together to speed up a 3D render.
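For those without a bridge or render manager, the DIY version of this is simply splitting the frame range and launching Blender's command-line renderer on each machine. A minimal sketch follows; the hostnames, paths and the 50/50 split are hypothetical placeholders, and it assumes the .blend file and its assets are reachable from both nodes.

```python
# Poor man's network rendering: give each machine half the frame range and
# launch Blender headless over SSH. Hostnames and paths are placeholders.
import subprocess

BLEND = "/projects/shot010.blend"            # must exist on both machines
OUT = "/renders/shot010/frame_####"
nodes = {"render-node-a": (1, 120), "render-node-b": (121, 240)}

procs = []
for host, (start, end) in nodes.items():
    cmd = ["ssh", host, "blender", "-b", BLEND, "-o", OUT,
           "-s", str(start), "-e", str(end), "-a"]   # -s/-e set the range, -a renders it
    procs.append(subprocess.Popen(cmd))

for p in procs:
    p.wait()     # block until both halves of the animation are done
```

A dedicated manager such as Deadline (mentioned above) adds queuing, retries and progress tracking on top of this basic idea.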
@dustinjenkins8215
@dustinjenkins8215 2 года назад
Excellent content! Definitely subscribing.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
thanks Dustin
@ShepperdOneill
@ShepperdOneill 2 года назад
Good video! Thanks for sharing! Any chance of testing these cards with Arnold? I use Arnold (C4DtoA) and I'd love to see some comparisons when rendering with the GPU. Arnold has slowly been making progress with its GPU function.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
that is something I can look into for a future video. Thanks for watching
@Nekko_X
@Nekko_X Год назад
Very interesting, but now I have a question. I plan to upgrade the graphics cards in all my computers, but I could only afford to upgrade to an A4000 and not an A6000; however, I could upgrade all of them to a 3090. All the computers are used only for 3D modeling and animation, not for rendering or anything else, but usually heavy scenes are created, generally with many polygons. So my question is: would the A4000 or the 3090 be better for that type of work? The only thing that interests me is that the scene displays in real time when modeling or animating, and that the viewport doesn't stutter, whether loading a preview or simply animating. It would be very helpful if someone could answer this, because I can't find videos or information about it, only tests in video games, which are of no use to me at all. Which would be better? I'm not looking for the best on the planet either, but I do want to be sure I make the best decision... I've been thinking for 3 weeks and I still don't know. The worst thing is that I have to make the decision before next week. Damn... :/
@khayelihlemngoma2393
@khayelihlemngoma2393 Год назад
Thank you for this comment. I can relate to it. I am building a workstation to run ArchiCAD and I am undecided on the graphics card. By initial assessment it seems the 3090 has more memory than the A4000 and both have ray tracing. I hope he answers your question.
@mikebrown9826
@mikebrown9826 Год назад
The 3090 is by far the better choice. It has way more CUDA and RT cores than the A4000, and the increase in VRAM is better for complex/large scene files. Get the 3090 for sure.
@Nekko_X
@Nekko_X Год назад
@@mikebrown9826 Yes, I know that the 3090 has more VRAM and CUDA cores, but my fear is that Quadro cards are specially made to work with many polygons without problems, which keeps the editor viewport and the animation preview in the work area smooth; well, they were made for that. And I have seen that the "gaming" cards, which in this case means the 3090, may have a lot of power but are not so good once a high polygon count is in use, because the work area and the real-time previews no longer look smooth. That is my biggest concern: I don't want to buy the 3090 and have the work area not work well, or buy the A4000 and not have enough power or VRAM. All my life I have used Quadro cards in my computers, and supposedly their power improves with each generation, but now I am torn between the power of the Nvidia 3000 series and the A-series Quadros. I have also heard and read that the Nvidia "gaming" series tends to crash in programs where many polygons are used, while with Quadro cards that almost never happens (and I know). That's another one of my fears... I think I will have to take a chance and choose to purchase the 3090 cards; I sincerely hope that I do not regret my choice. Thanks for the help!
@mikebrown9826
@mikebrown9826 Год назад
@@Nekko_X I think you will find the 3090 does just as good a job as the workstation GPUs. Good luck with your purchase.
@goldenheartOh
@goldenheartOh Год назад
Was your deadline to put the expense on 2022 for tax filing? Without the deadline, worst case scenario would be to buy one 3090 and try it out 1st.
@checkmate8015
@checkmate8015 2 года назад
Which one is better for me as a game dev and 3D artist
@faradaysinfinity
@faradaysinfinity 2 года назад
I too am asking this. Intuition says a6000. But still. I mainly use Unreal 5
@mikebrown9826
@mikebrown9826 2 года назад
Get the RTX 3070, as this is the middle-tier GPU, so your dev work needs to play well on this card - and it is still powerful.
@mikebrown9826
@mikebrown9826 2 года назад
@@faradaysinfinity If you're doing Unreal, then get the 3090 or A6000.
@RafiAnimates
@RafiAnimates 2 года назад
Thank you, this was very informative. I'm curious how these cards compare when running real-time production software e.g. Unreal 5, Nvidia Omniverse and Masterpiece Studio Pro. A similar video testing those would be great.
@MediamanStudioServices
@MediamanStudioServices 2 года назад
hi Rafi, thats a great idea for a future video Thanks for watching
@josephmascarenhas5001
@josephmascarenhas5001 Год назад
Thank you for the informative video. I would like to know which is the cheapest graphics card for workstation purpose. Thanks
@Psalm_23
@Psalm_23 Год назад
What about a6000 Vs 4090 comparison test?
@gustavopaco2433
@gustavopaco2433 Год назад
but the new ada a6000
@itsaman96
@itsaman96 Год назад
I always install studio drivers on my 1660 super 6gb
@duh4293
@duh4293 Год назад
Another 1660 super user in the wild. Heck yeah.
@57kod
@57kod 2 года назад
Excellent video. This channel is gold for creators!!! So, you are using the same Studio driver for both cards? Could we then run the two in the same system?
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Hi 57kod, thanks for the kind words. Please help grow the channel and share on Facebook. I am not using the same drivers for both cards: I am using the GeForce Studio drivers for the 3090 and the A6000 Studio drivers for the workstation GPU. You could run both cards at the same time, but I have not tested performance when using the same workstation drivers on the GeForce cards.
@zaydraco
@zaydraco Год назад
For AI workloads the memory and Tensor cores are important due to the size of the data sets. If you have bigger data sets, it becomes more important to have more memory, or a better algorithm to partition the exchange between system RAM and GPU RAM. If I remember the CUDA API correctly, you move memory from one to the other explicitly; the thing with normal memory in a program is that it can be paged into swap or virtual pages, but it is not the same with GPUs - at least it is not automatically handled by the OS, as far as I know.
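A small PyTorch illustration of that point: GPU memory is not demand-paged by the OS the way host RAM is, so spilling to the host or moving data to another card is an explicit copy you issue yourself. The tensor size is arbitrary, and the second copy assumes a machine with two GPUs.

```python
# Explicit data movement between host RAM and GPU VRAM (and between GPUs).
import torch

x = torch.randn(4096, 4096, device="cuda:0")   # ~64 MiB of fp32 in GPU 0's VRAM

x_host = x.to("cpu")       # explicit device-to-host copy into host RAM
x_gpu1 = x.to("cuda:1")    # explicit GPU0 -> GPU1 copy; uses NVLink/PCIe P2P if
                           # available, otherwise it is staged through host memory

print(torch.cuda.memory_allocated(0) / 2**20, "MiB allocated on GPU 0")
print(torch.cuda.memory_allocated(1) / 2**20, "MiB allocated on GPU 1")
```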
@andreasfjellborg1810
@andreasfjellborg1810 2 года назад
Here (Sweden) you can get 3x 3090s for the price of one A6000; going with 3x 3090s would be quite a lot faster than a single 3090...
@MediamanStudioServices
@MediamanStudioServices 2 года назад
Hi Andreas, I would love to have 3 RTX 3090s, but it's so hard to find them on the market right now at a good price. Thanks for watching.
@scottsturm7327
@scottsturm7327 Год назад
Hoping for a 4090 vs A6000 video ; - )
@enmanuel7112
@enmanuel7112 Год назад
Duuude, that's not even a comparison; those GPUs are generationally different. The Axxxx is Ampere-based, whereas the 4xxx series is Ada-based. The 4090 will beat the crap out of the Quadro card immediately.
@scottsturm7327
@scottsturm7327 Год назад
@@enmanuel7112 You do realize a new Ada-based A6000 card is coming out soon, yes?
@enmanuel7112
@enmanuel7112 Год назад
@@scottsturm7327 didn't know that, ty
@jdonaldjr
@jdonaldjr Год назад
@@scottsturm7327 What would you recommend for CFD simulations, CAD and some 3D rendering: a used RTX Quadro 4000, any RTX from the 3000 series starting from at least a 3080, or the current A4000?
@paolol4267
@paolol4267 2 года назад
Hi, thanks for the video; this channel has a lot of truly interesting information that's not easy to find elsewhere. If possible, I'd like to ask a question: would you recommend an A4000 or an RTX 3080 for doing 3D animation in Maya and Unreal Engine (nowadays they are priced basically the same)?
@MediamanStudioServices
@MediamanStudioServices 2 года назад
The RTX 3080, as it has more CUDA cores for processing power. But see if you can get the 12GB version.
@paolol4267
@paolol4267 2 года назад
@@MediamanStudioServices thank you very much!
@Gettutorials123
@Gettutorials123 2 года назад
Thank you again for such an informative video!
@MediamanStudioServices
@MediamanStudioServices 2 года назад
You're most welcome, GET. Thanks for watching.
@renanmonteirobarbosa8129
@renanmonteirobarbosa8129 Год назад
Two things: VRAM and NVLink. That's the main difference.
@mtz1085
@mtz1085 Год назад
I'm an industrial designer. I use SolidWorks, and I usually work on furniture, kitchens and lighting. It's rare for me to work with more than 100 parts at the same time in one file. As you probably know, SolidWorks recommends the use of Quadro, and I saw that only with a Quadro, unless you cheat, can you render within the SolidWorks program. I usually use KeyShot for rendering. My question is: do I really need a Quadro, or is a GeForce 4080 fine? Thanks
@romarivideo6067
@romarivideo6067 Год назад
Hi! :) I had an RTX 3080 and also work in KeyShot, making animations of welding machines for a company. One frame with the machine on a metal floor, in a dark scene with light sources, rendered in about 7-8 minutes (7000 passes). The clip was supposed to be about 2 minutes long, meaning the render would have taken me a month. The other day I upgraded to an RTX 4090 Palit GameRock (500 W TGP) and, to my surprise, the render time barely changed 😂 Apparently it's all because of the reflections and light bounces off that metal floor 😢 Now I'm afraid to buy cards like the RTX A6000, since it might not give much of a performance boost in my case either..
@entique4216
@entique4216 11 месяцев назад
I see my clients use gaming laptops (RTX 3060 to 4060) and they run fine with 3ds Max and SolidWorks. Unless you are a professional and your budget is ''unlimited'', in which case you can buy a Quadro or a workstation in general.
@teamEP789
@teamEP789 5 месяцев назад
You definitely don't need a Quadro. I've been a professional 3D artist for over 25 years, and Quadros, except in some rare cases, always perform worse than "game" cards.
@golemmoon6220
@golemmoon6220 Год назад
What monitor resolution do you use: HD, 4K or 8K?
@brunochapeton9213
@brunochapeton9213 Год назад
Can you please do a comparison with the NVIDIA GPUs when using unreal engine 5.2 for virtual production? Thank you!