
Jetson Nano + RealSense D400 Depth Camera 

JetsonHacks
36K subscribers
31K views

Published: 12 Sep 2024

Comments: 111
@rftulak 5 years ago
Jazz, brilliant instruction with oblique levity... all smiles here :-)
@Skyentific 5 years ago
Great video! I will use this info for my future video on my channel. I plan to put the realsense camera on two axis gimbal (head of the robot). Thank you!
@JetsonHacks 5 years ago
thank you for the kind words. Good luck on your project, and thanks for watching!
@jemo_hack 5 years ago
Jim, Awesome series you have put together. Thank you for your time and for sharing...
@JetsonHacks 5 years ago
You're welcome, and thanks for watching!
@mattizzle81 5 years ago
Just got my first Jetson Nano and been playing with it all weekend (used up my whole long weekend!!). At first I was a little disappointed with the performance, but I don't have a proper power supply yet. So far only have a ZED Mini camera, which for my purposes I don't like that much. I tried to do some 3D mapping and I found it would run out of memory and start swapping like crazy. I was very disappointed. Seemed like this little thing would be useless for me and I'd be better off streaming over wifi or coughing up much more $$$$ for a Xavier. Then I decided to try adding a Samsung T5 USB SSD and put a huge 64GB swap file on there. Night and day difference, now it runs flawlessly. I'm happy with this little thing. Definitely one of the best deals around if you are into this 3D computer vision stuff. I was able to run the same 3D mapping apps on a smartphone under a Linux chroot, and the processor on my Note9 is faster, but the Note 9 cost me $1500, and the Nano less than $200. The only other possible alternative I can imagine is something like a Vivostick with a Coral USB TPU, but that thing has only 2GB of RAM, probably not even worth trying. Then again that SSD made a huge difference with the Nano.
@JetsonHacks 5 years ago
Glad to hear that you were able to bend the little Nano to your will. Thanks for watching!
@antoniodourado 5 years ago
Poor shark. Can't watch Jaws and now can't even wear a RealSense camera. Thank you for another quality video!
@JetsonHacks 5 years ago
I'm glad you like the video. As for Shark, Mako Plush, believe me when I tell you that I have to pay for that off camera. Thanks for watching!
@rickhunt3183 5 years ago
It would have been nice for Intel to provide a more secure hard wired option for the camera, but that's nothing a little surgery can't fix. This camera seems to be a big step up from lidar. There could be an advantage to using both technologies at the same time for navigation and object detection, or maybe not. Could be an interesting weekend project to find out. I've been looking at the TI mmwave sensor boards and I think the Jetson would be a perfect platform to take advantage of that technology. If you're wanting to track multiple targets or a single target among many. The TI mmwave sensor board will do it without any optics. You might consider taking a look. Anyway, I enjoyed your video. I always find them refreshing and relaxing. I think it's because you're always relaxed. I'm always intensely focused when working on a project and wanting as much solitude as possible. Have a great night Jim.
@JetsonHacks 5 years ago
Thank you for the kind words. It's not apparent, but it took several weeks to get the RealSense camera to work with the Nano. The videos tend to be after everything is working, so it's easier to be relaxed. I'll note that the cameras are really developer kits, similar to the Jetsons. There are OEM modules that people can integrate into their products. I'm interested in the mmWave stuff, but I'm not quite sure how easy it is to get it to work with the Jetson. Thanks for watching!
@rickhunt3183 5 years ago
Interfacing the mmwave board might not be so much like working with Legos, but persistence and confidence will take you a long way. I haven't made it work with the Nano, but I'm confident there's a way to do it. Make sure the words "I can't" aren't in your vocabulary. There is always a way. It just has to be found. Thank you for your time Jim and for being so nice.
@mattizzle81 5 years ago
I just bought a Structure Core sensor. The depth performance on it looks amazing, but sadly for some reason I can't get decent framerates on the Jetson Nano. Depth only is at 30 FPS, but as soon as I add RGB the framerate drops to 9 FPS, and IR? Forget it. I see you are getting a full 60 FPS there with all streams enabled! What the heck. The Structure sensor works great on my high end laptop and desktop. 30 FPS with all streams enabled, and the depth performance looks very good, I think it looks better than what I see in this video. It seems I can't use it properly with the Nano though! That is what I bought this thing for, in fact I bought two!!! Bah. I'll have to convert them to other uses like smart security cameras or something, or get one of these cameras. Structure sensor looks superior quality wise though, but I'll have to find another portable device to connect it to. There has to be some kind of USB 3 bandwidth issue on the Jetson Nano.
@vvwording4844 5 years ago
I'm having so much fun trying to learn AI. You and some of the other tutors on the Network seem to be having fun too.
@JetsonHacks 5 years ago
Good to hear! Thanks for watching.
@wpegley 3 years ago
Subscribed, but a branding suggestion: vicious shark jaws would match your skill level better.
@JetsonHacks 3 years ago
Thanks for the suggestion. Mako Shark has been clamoring behind the scenes for 24/7 coverage, and I'm afraid that would just encourage him more. Thanks for watching!
@NikolaosTsarmpopoulos 5 years ago
As always, useful video to get started.
@JetsonHacks 5 years ago
Thank you for the kind words, and thanks for watching!
@Wastl42 5 years ago
Great post - as always. Is there a particular reason that you built librealsense without CUDA?
@JetsonHacks 5 years ago
Thank you for the kind words. The CUDA code is used for mapping purposes, color conversion, and image rectification of the RGB stream when displaying in 3D and so on. To compile with CUDA and the librealsense library, you also need to compile a newer version of CMake than the one in the Ubuntu repository (takes about another 30 minutes). In the case of the Nano, there's really not enough horsepower to do a real texture mapped point cloud (or at least I could not get better than a frame or two per second with the realsense-viewer application). If you are just using the depth map and color stream in a robotics application, for example, the CUDA code isn't that useful. For a faster machine, like a Jetson TX2 or Xavier, it makes sense to use the CUDA enhancements. Hope this makes sense, and thanks for watching!
@gerokatseros 4 years ago
Wonderful! One question though: if I want to use just pyrealsense2 with Python, do I have to install the SDK, or is there a wheel for it?
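For readers landing on this question later: whether a wheel exists depends on the platform, and on ARM boards like the Nano the usual route is building librealsense from source with the Python bindings enabled. A minimal, hedged sketch (pure Python, no camera needed) that detects which situation you are in; the CMake flag name comes from the librealsense build documentation, everything else is illustrative:

```python
import importlib.util


def pyrealsense2_available():
    """Return True if the pyrealsense2 module is importable.

    On x86_64 Linux, `pip install pyrealsense2` usually works. On ARM
    boards such as the Jetson Nano there is typically no prebuilt wheel,
    so librealsense must be built from source with the Python bindings
    enabled (cmake ... -DBUILD_PYTHON_BINDINGS=true) instead.
    """
    return importlib.util.find_spec("pyrealsense2") is not None


if pyrealsense2_available():
    import pyrealsense2 as rs
    pipeline = rs.pipeline()  # only start streaming if the SDK is present
else:
    print("pyrealsense2 not found: build librealsense with "
          "-DBUILD_PYTHON_BINDINGS=true, or try `pip install pyrealsense2` "
          "on x86_64")
```

This keeps scripts portable between a desktop (wheel installed via pip) and a Nano (bindings built from source).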
@hk2780 5 years ago
Could you make a video on connecting a laptop and the Nano wirelessly? For example, I'd like to see the Nano work as a WiFi hotspot and stream video, with the laptop receiving the stream.
@zhencanwang7248 1 year ago
Hello, have you figured this out yet?
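One common way to do what the wireless streaming question above asks, assuming GStreamer is installed on both machines, is to push H.264 over RTP/UDP. A minimal sketch that only builds the two pipeline strings; the device node /dev/video2 and the bitrate are illustrative assumptions, not values from the video:

```python
def sender_pipeline(host, port=5000, device="/dev/video2",
                    width=640, height=480, fps=30):
    """GStreamer pipeline string for the Nano side: grab frames from a
    V4L2 device, H.264-encode them, and send over RTP/UDP to `host`."""
    return (
        f"v4l2src device={device} ! "
        f"video/x-raw,width={width},height={height},framerate={fps}/1 ! "
        "videoconvert ! x264enc tune=zerolatency bitrate=2000 ! "
        "rtph264pay ! "
        f"udpsink host={host} port={port}"
    )


def receiver_pipeline(port=5000):
    """Matching pipeline for the laptop side: receive RTP/H.264 and display."""
    return (
        f"udpsrc port={port} caps=\"application/x-rtp,encoding-name=H264\" ! "
        "rtph264depay ! avdec_h264 ! videoconvert ! autovideosink"
    )
```

Each string can be passed to `gst-launch-1.0` on the respective machine, with the laptop's IP address as `host`.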
@MohamedAshrafZahana 5 years ago
Thanks for the video. You saved me a lot of time. I appreciate your effort.
@JetsonHacks 5 years ago
You are welcome, and thanks for watching!
@gennadyplyushchev1465 5 years ago
Thank you, great video!
@JetsonHacks 5 years ago
You are welcome. Thank you for the kind words, and thanks for watching!
@macargnelutti 5 years ago
Thanks a million for this.
@JetsonHacks 5 years ago
You are welcome, and thanks for watching!
@Drawbotify 5 years ago
Awesome videos as always! I have a question. As I started to patch Ubuntu I got a "Jetson Board Mismatch" telling me that the processor appears to be an NVIDIA Jetson Nano Dev Kit. Do you have a patch for this version, or is there a way to make it work with the patch you already provided? Thank you!
@JetsonHacks 5 years ago
The current release in the Github repository is for L4T 32.1. You appear to be using a newer version. Thanks for watching!
@kestergascoyne6924 5 years ago
Excellent info. Thank you.
@JetsonHacks 5 years ago
Thank you for the kind words. Thanks for watching!
@SgtSiff 5 years ago
Hi Jim, great videos. Please could you consider doing a video comparing the Zed camera with a T265+D435 setup? Looks like the Intel solution is not only cheaper, but potentially performs much better due to the hardware IMU processor of the T265 combined with infrared depth tracking on the D435. It would also be nice to see how this combination then compares with the D435i, which as far as I am aware, has the same IMU as the T265, but lacks the hardware processing (and has no pose stream). Thanks
@JetsonHacks 5 years ago
Thank you for the kind words. I don't really do comparisons right now as there are so many different applications that people use these devices for that it's unfair to the various vendors to say "This one is better than another". It all depends on what the selection criteria is, and how it is being used. Also, there are several manufacturers in this space with competing products. These include Structure.IO with their Structure Core product, econ-systems makes a stereo vision camera, MYNT Eye too. Each has their advantages, disadvantages, indoor, outdoor, range, accuracy, FOV, and so on. Also, it depends on how you intend to use the Nano. If you are using it simply as a depth processor, or for occupancy mapping, or a tracking camera that's one thing. For example, the ZED uses the computing capabilities of a host to build its depth map. That may be good (you're using the Nano as a depth processor), or bad (you need as much processing as you can get so you can run neural nets/CUDA code). Like all engineering, there's tradeoffs and design considerations that you have to take into account in any project. It's a good idea to study for your component selection and fully understand your project needs. With that said, the Intel Tracking Camera T265 is not like the others in the group. It's difficult to find two global shutter cameras with fish eye lenses for $200, more or less with an integrated IMU and a vision processor that straight out gives a stream of poses. But obviously that's of little value if you are not on a mobile base of some sort. If the camera is stationary (for example, tracking people in a space), that's a completely different use case than on a mobile base (a robot mapping a room as an example). If you are trying to visualize a point cloud, that's another thing, and so on. The only thing I try to do here is share "How do you get this thing to work to begin with", and let others figure out how they would actually use them in their project.
@SgtSiff 5 years ago
@@JetsonHacks Perfectly reasonable! I wasn't even aware of the offerings from MYNT or Structure. I've emailed support from both and it looks like the MYNT is more suitable for my needs. Thanks for your detailed response.
@JetsonHacks 5 years ago
@@SgtSiff I'm glad you found the other vendors. Lots of interesting offerings out there.
@dukathneu 5 years ago
Thank you very much for your videos.
@JetsonHacks 5 years ago
You're welcome, and thanks for watching!
@EricRohlfs 5 years ago
Can you run the D435 and the T265 at the same time on the Jetson Nano?
@JetsonHacks 5 years ago
Sure. You have to be a little careful if you are streaming all the cameras (that's 5 cameras + the depth stream and pose stream) as you start approaching the bandwidth limits of the USB 3 Hub. If you have other USB devices connected on the Jetson, things can get a little laggy. Thanks for watching!
@EricRohlfs 5 years ago
Sounds like I might have to use VNC or ssh from another computer for keyboard and mouse then. Thanks for the quick response.
@JetsonHacks 5 years ago
@@EricRohlfs You should try it with the keyboard and mouse first, they don't use much bandwidth. If you are trying to use them with a USB drive at the same time, then you're probably over the limit.
@arslanhayat1400 1 month ago
Hi. Is it possible to connect a RealSense D455 camera to a Jetson Orin Nano?
@JetsonHacks 1 month ago
I would think so, but I haven't tried it. Thanks for watching!
@sy2532 5 years ago
I know you do not believe in a syllabus...but are you going to lead the install of ROS and a RealSense wrapper for the Nano?
@JetsonHacks 5 years ago
I have no plans to do so at the present time.
@sy2532 5 years ago
@@JetsonHacks I understand... I'm just not sure where to go. As far as RealSense goes, they only say they support Ubuntu 16.04, yet we are running the Nano on 18.04. Only ROS Kinetic runs on 16.04... if I want 18.04 that means ROS Melodic... but will the RealSense ROS wrapper install on Melodic? Thought you might have a solution.
@AliKutluozen 5 years ago
@@sy2532 Hi, we are experiencing the same exact issue, have you had any luck with it?
@sy2532 5 years ago
@@AliKutluozen I have a few problems that I am working on with the Nano... you need to be more specific. If you mean getting the T265 working, I have not gotten it to work on the Nano. I have gotten the T265 working on the TX2 (using JetPack 4.2)... so I am hopeful that the same process will work for the Nano. I need to be able to rebuild the kernel... I am hoping Jim will soon give that to us for the Nano, but I am currently doing that line by line, similar to the TX2.
@AliKutluozen 5 years ago
@@sy2532 Hey, thanks for answering back! Here is my situation: I have a Jetson Nano with Ubuntu 18.04 and ROS Melodic installed... I also have a RealSense D435i that I have installed thanks to the tutorial, and it works. Yet, when installing realsense-ros, ROS can't find the Intel SDK 2.0... So here is the missing link: ROS and the RealSense SDK.
@glikar1 5 years ago
Great project! In your opinion, is the camera worth the 400+ dollars? I'd love to see a comparison to cameras used in your other camera-focused videos.
@JetsonHacks 5 years ago
Thank you for the kind words. I only do videos on products that I think are worthwhile. The RealSense Depth Camera is $200. I don't really do comparisons right now as there are so many different applications that people use these devices for that it's unfair to the various vendors to say "This one is better than another". It all depends on what the selection criteria is, and how it is being used. Also, there are several manufacturers in this space with competing products. These include Structure.IO with their Structure Core product, econ-systems makes a stereo vision camera, MYNT Eye too. Each has their advantages, disadvantages, indoor, outdoor, range, accuracy, FOV, and so on. Also, it depends on how you intend to use the Nano. If you are using it simply as a depth processor, or for occupancy mapping, or a tracking camera that's one thing. For example, the ZED uses the computing capabilities of a host to build its depth map. That may be good (you're using the Nano as a depth processor), or bad (you need as much processing as you can get so you can run neural nets/CUDA code). Like all engineering, there's tradeoffs and design considerations that you have to take into account in any project. It's a good idea to study for your component selection and fully understand your project needs. With that said, the Intel Tracking Camera T265 is not like the others in the group. It's difficult to find two global shutter cameras with fish eye lenses for $200, more or less with an integrated IMU and a vision processor that straight out gives a stream of poses. But obviously that's of little value if you are not on a mobile base of some sort. If the camera is stationary (for example, tracking people in a space), that's a completely different use case than on a mobile base (a robot mapping a room as an example). If you are trying to visualize a point cloud, that's another thing, and so on. 
The only thing I try to do here is share "How do you get this thing to work to begin with", and let others figure out how they would actually use them in their project.
@glikar1 5 years ago
@@JetsonHacks Sorry, missed the May 6th video on the T265. Both videos have the same title. The T265 is tempting at $200. After doing some research into global vs rolling shutters I realised how a phone snaps a QR code so quickly. Thanks for a thoughtful reply, good info!
@JetsonHacks 5 years ago
@@glikar1 I'm glad you found it useful!
@timhoksong5608 5 years ago
I've been working on the Jetson Nano Developer Kit and I tried to follow the instructions in the video. When I get to the point of running patchUbuntu.sh, the result gives me "Jetson Board Mismatch!" I really need your help.
@l.ittlethings6 4 months ago
Thanks for the video! I have a question: can the RealSense interface be opened using SSH?
@JetsonHacks 4 months ago
I'm not sure what you mean. SSH is usually a text-only interface. Do you mean using it through an SSH tunnel?
@jackflynn3097 5 years ago
laser shark!
@JetsonHacks 5 years ago
Hard not to be a fan of the Laser Shark!
@CuteLittleMiku 5 years ago
Never mind, your shark is cute.
@JetsonHacks 5 years ago
Now he's blushing. Thanks for watching!
@user-xu5vv4un7x 5 years ago
How can I change the settings from the infrared depth camera to the RGB camera? I want to do real-time object detection, but my webcam is recognized as an infrared depth camera. How can I make it be recognized as an RGB camera? I am using a RealSense D435i camera.
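Some context for this question: a RealSense camera exposes each stream (depth, infrared, RGB) as its own UVC interface, so several /dev/video* nodes appear and a generic webcam application may open the infrared one first. A minimal sketch, assuming a Linux system, that lists the candidate nodes so each index can be tried in turn; the ordering of the nodes is not guaranteed and varies by kernel and firmware:

```python
import glob


def list_video_nodes():
    """Return the V4L2 device nodes on this machine, sorted by name.

    A RealSense D435i typically exposes multiple /dev/video* nodes,
    one per UVC interface, so the first node is often NOT the RGB
    stream. Open each index in turn (e.g. with cv2.VideoCapture)
    until the color image appears.
    """
    return sorted(glob.glob("/dev/video*"))


for node in list_video_nodes():
    print(node)
```

Alternatively, with the pyrealsense2 bindings installed, the color stream can be requested explicitly instead of guessing device indices.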
@MrBigmachines 5 years ago
Great video and install script! My RealSense D435i is now running on my Nano, but the RGB overlay is not working. Which firmware do you use? I've already tried the last 3 versions without success. In 3D mode it shows the RGB image for one frame and then turns something between gray and green. In 2D mode everything works fine.
@JetsonHacks 5 years ago
The firmware version on this camera is 05.10.13.00. You should lower the resolution to 848x800 @ 30fps on the cameras. The Jetson Nano doesn't have the horsepower to run the point clouds in the realsense-viewer application at higher resolutions. I don't know if this is related to the task or the application. Thanks for watching!
@sy2532 5 years ago
Mounting the laser works, but had problems with super capacitors and water.
@JetsonHacks 5 years ago
Cameras like the RealSense D400s are for use with Land Sharks only. Thanks for watching!
@Diamond_Hanz 5 years ago
"laserrrr"
@JetsonHacks 5 years ago
Yes, I will have to get some Dr. Evil instructional materials so I can pronounce it correctly. Thanks for watching!
@baddad4414 3 years ago
How are you interfacing the D435i to the Nano when it's got a USB-C connection and the Jetson Nano doesn't?
@JetsonHacks 3 years ago
The D435i ships with a USB C to USB A cable. Thanks for watching!
@baddad4414 3 years ago
@@JetsonHacks Thanks for replying. That's odd, but at least I can now buy one. Looking forward to playing with it. Thanks.
@kbuilds7287 3 years ago
Should I get this or buy a used Kinect v2?
@JetsonHacks 3 years ago
You should get whatever meets your needs and use case.
@Osmanity 5 years ago
You are awesome!!!
@CuteLittleMiku 5 years ago
Hi do you know when the new release will be out? I know there's a patch but I don't think I'm ready for messing with the kernel...
@JetsonHacks 5 years ago
New release of what software?
@CuteLittleMiku 5 years ago
I guess L4T ..
@JetsonHacks 5 years ago
I would guess around the time they release the Jetson Nano production module in June.
@ZhijingShao 5 years ago
How many RealSense cameras can work with one Nano at the same time? Each USB 3.0 port should have about 5 Gbps, but maybe the 4 ports share the 5 Gbps, not 20 Gbps?
@JetsonHacks 5 years ago
That's correct. There is only one USB hub for the 4 USB 3.0 ports, so you have the 5 Gbps limit. Thanks for watching!
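The shared 5 Gbps budget can be sanity-checked with simple arithmetic. A rough sketch that sums raw, uncompressed payload rates for one D435-style camera; the stream formats (Z16 depth, YUY2 color, Y8 infrared) are typical assumptions, and USB protocol overhead, which lowers the usable throughput well below 5 Gbps, is ignored:

```python
def stream_bits_per_second(width, height, bits_per_pixel, fps):
    """Raw payload rate of one uncompressed video stream, in bits/s."""
    return width * height * bits_per_pixel * fps


# Typical D435 streams (assumed formats: Z16 depth, YUY2 color, Y8 IR)
streams = [
    stream_bits_per_second(848, 480, 16, 30),   # depth, Z16
    stream_bits_per_second(1280, 720, 16, 30),  # color, YUY2
    stream_bits_per_second(848, 480, 8, 30),    # left IR, Y8
    stream_bits_per_second(848, 480, 8, 30),    # right IR, Y8
]

total_gbps = sum(streams) / 1e9
print(f"aggregate: {total_gbps:.2f} Gbps of the shared 5 Gbps bus")
```

One camera with all streams enabled comes to roughly 0.83 Gbps of raw payload, which shows why two cameras plus other USB traffic can start to crowd the single shared bus.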
@muchamadshofiudin7344 5 years ago
Can it be combined with an Intel stick?
@patrickweber83 2 years ago
Even if this is 2 years old, I'm trying to get a D435 running on JetPack 4.6 without much success. I was able to build everything from source and it seems to run fine. However, as soon as I start realsense-viewer or the ROS wrapper for it, the performance of the Jetson is incredibly poor. I don't get more than 1 fps with CPU usage at 100% on all cores. I built it with CUDA support but the GPU only gets used a bit. I powered everything with the barrel jack and a beefy (200W, 5V) DC/DC converter. Any ideas how I can improve things?
@JetsonHacks 2 years ago
Intel updated their instructions for installing RealSense cameras on Jetson. Here's a video, please read the article in the description of that video. Make sure that you also upgrade the firmware in the camera. Also, make sure that the camera is running USB 3.2 which is shown in the RealSense Viewer application. Good luck!
@patrickweber83 2 years ago
@@JetsonHacks thank you for your answer. Could you please share a link to the video you mentioned?
@JetsonHacks 2 years ago
@@patrickweber83 Oops, sorry about that: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-lL3zxwN5Lnw.html
@antoniodourado 5 years ago
Hey Jim. Could you help me here? I installed my D435i following the same steps and your script. I'm having this annoying issue: when I shut down my Nano and then turn it back on, the D435i is recognized normally as intended. However, if I reboot instead of turning it off and on, it stops being recognized after the reboot, with a really long list of "unable to enumerate USB device" errors. The same behavior applies to my desktop PC, not just the Nano, though.
@JetsonHacks 5 years ago
This sounds like a broader issue than just the Nano install. You may want to ask on the librealsense Github repository. Thanks for watching!
@christianealford8071 4 years ago
Thank you so much! Question: I'm trying to run the rs-dnn.cpp example on the Nano with the D435 camera. Can this be done? I'm getting errors when it comes to using CMake.
@JetsonHacks 4 years ago
I have not tried it. It's difficult to tell what your issue is from your description. Thanks for watching!
@christianealford8071 4 years ago
Sorry for the vagueness! It seems as though most of the demos from RealSense are able to work on the Nano; however, the DNN demo does not appear in the demo folder and remains as a .cpp file in the librealsense/wrappers/opencv/dnn folder. I was wondering if this was typical on the Nano? Thanks
@JetsonHacks 4 years ago
@@christianealford8071 Not quite sure what "typical on the Nano" means. The OpenCV DNN is sample code which requires a properly configured development environment to build. There is a reliance on the version of OpenCV and the installation particulars. Looking at /librealsense/wrappers/opencv/dnn you can see a short explanation of how to build the sample. It feels like they make the assumption that the user has a good working knowledge of OpenCV and understands how to use the DNN module. I haven't tried it, but it is certainly beyond the scope of what I could answer in a YouTube comment. I would guess that the major issue is in configuring the dev environment. The typical answer to "getting errors when it comes to using CMake" is usually to fix the errors, but that presumes that you know what the actual errors you encounter might mean. It's not very forgiving or helpful.
@olalekanisola8763 4 years ago
Great work, I used this tutorial to install my RealSense D435 on the Jetson Nano. It worked well in the terminal, but when I launch it with Cheese, it doesn't show anything. Kindly help with a solution.
@JetsonHacks 4 years ago
Cheese requires a UVC backend for the librealsense drivers. You will either need to compile using the UVC flag, or you can use the branch 'easyInstall' which loads a compatible version from a librealsense repository. Thanks for watching!
@olalekanisola8763 4 years ago
@@JetsonHacks Thank you for your response. Could you please direct me to the appropriate link for executing this? Thank you
@JetsonHacks 4 years ago
@@olalekanisola8763 The easyInstall branch of the Github repository installLibrealsense on the Github JetsonHacksNano account.
@olalekanisola8763 4 years ago
@@JetsonHacks I tried installing all over again and it generated an error. The Jetson Nano hangs each time I launch Cheese or the RealSense viewer. How can I uninstall everything so that I can reinstall from scratch?
@JetsonHacks 4 years ago
@@olalekanisola8763 I do not know what you have done, it is difficult to advise you. You should reflash the SD card and start again. The UVC version is now the default version in the repository.
@kryptonic010 4 years ago
When running your scripts a bunch of errors are thrown during the make process. Do you have updated scripts for L4T 32.2.3? I get the following message when applying the patch: "Jetson BSP Version: L4T R32.2.3 Jetson_L4T=32.2.3 ==== L4T Kernel Version Mismatch! ==== This repository is for modifying the kernel for a L4T 32.2.1 system. You are attempting to modify a L4T NVIDIA Jetson Nano Developer Kit system with L4T 32.2.3. The L4T releases must match! There may be versions in the tag/release sections that meet your needs." Great video, and the theme music still rings in my head... ;-)
@JetsonHacks 4 years ago
Intel recommends using a .deb package for newer versions of librealsense. See the 'easyInstall' branch in the repository.
@kryptonic010 4 years ago
@@JetsonHacks Thanks for the insight. I stepped back to L4T 32.2.1 and brought up RealSense. After loading, the application did recognize I had an outdated firmware version, so I went on and updated it. So far, so good. Deeper into the rabbit hole I go.
@JetsonHacks 4 years ago
@@kryptonic010 I'm glad you got it to work.
@SpaceBound-1 5 years ago
I wish I had money for all these toys.
@srikanthn6404 5 years ago
Can we use an UP board instead of the Jetson Nano?
@JetsonHacks 5 years ago
I don't understand the question. Do you mean use these scripts on the Up-board? Or just use the Up-board with these cameras?