
Intel Realsense T265 tracking camera test - ROS driver 

Karol Majek
6K subscribers
38K views

If you want to support me, buy me a coffee:
bit.ly/Coffee4K...
I prefer espresso
Is anyone reading this?

Published: 25 Aug 2024
Comments: 76
@longwang8670 · 5 years ago
Your videos are all very useful. I just keep watching them one by one, on and on, and never want to leave or go to bed. Thank you for your videos.
@KarolMajek · 5 years ago
Thank you so much for this comment! I really need comments like this; it really helps me to continue this work :-)
@rickykim4193 · 1 month ago
Hi, can I also access the camera's linear velocity along the x, y, z axes? If so, on which topic can I access it?
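A note on this question: the realsense-ros driver publishes the T265 pose as nav_msgs/Odometry on /camera/odom/sample, and that message's twist field carries linear (and angular) velocity, so subscribing there should be enough. If you only have logged positions, velocity can also be approximated by finite differences; a minimal sketch with made-up samples:

```python
# Estimate linear velocity from timestamped positions by finite differences.
# This is a fallback for logged pose data; live, the T265's odometry topic
# (/camera/odom/sample, nav_msgs/Odometry) already reports twist directly.
def velocity_from_poses(samples):
    """samples: time-ordered list of (t_seconds, (x, y, z)) tuples.
    Returns a list of (t, (vx, vy, vz)) between consecutive samples."""
    vels = []
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        dt = t1 - t0
        vels.append((t1, tuple((b - a) / dt for a, b in zip(p0, p1))))
    return vels

# Hypothetical 10 Hz samples: 0.01 m along x per 0.1 s step, i.e. ~0.1 m/s.
samples = [(0.1 * i, (0.01 * i, 0.0, 0.0)) for i in range(5)]
print(velocity_from_poses(samples)[0])  # ~0.1 m/s along x
```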
@masterlitheon · 5 years ago
Very useful!
@benpiriz4386 · 5 years ago
Cool review. What software are you using to view the XYZ plane? Thank you.
@KarolMajek · 5 years ago
I'm using RViz from ROS. It works mainly on Linux, but in theory it can be run on Windows 10.
@breggs · 4 years ago
I need to do indoor positioning with Aruco markers for a virtual reality experience. Do you have any recommendations on how I might accomplish this (is there a more appropriate sensor than the T265, etc.)? I would really appreciate any help, as you seem to be very knowledgeable in this field.
@cesarhcq · 4 years ago
To use an Aruco marker you don't need the RealSense camera specifically. You can use a webcam, for example, but it depends on the application.
@breggs · 4 years ago
@@cesarhcq Thanks for the response! I'm trying to do something like this video: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-SruntQFFefI.html. Place Aruco markers around a physical environment and strap a sensor to the front of an all-in-one virtual reality headset to determine exactly where the player is within the environment (based on the visible markers). In other words, I need to use Aruco markers to figure out what I'm looking at, but also a sensor that can tell where I am relative to the marker(s). The headset can do most of the tracking; I just need to use the markers + sensor to correct any drift that happens over time.
@cesarhcq · 4 years ago
@@breggs Thanks for the explanation! The Aruco marker can return the position and rotation of the user. In practice, Aruco markers are very useful for establishing ground truth in mobile robotics. I recommend you watch this video: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-wlT_0fhGrGg.html. It makes it easy to understand how to use an Aruco marker with a single camera (webcam or RealSense). With the RealSense camera you track the user based on the trajectory from the inertial sensor and images; with the Aruco marker you get ground truth with more precision. Don't forget that Aruco markers have different IDs, and you'll need to calibrate the camera.
@KarolMajek · 4 years ago
If you're working with ROS: I tested this library: wiki.ros.org/aruco_detect. It was able to detect Aruco markers from 3 m using a very cheap webcam from Creative. They also have Aruco SLAM, but I didn't test that.
@ChopLabalagun · 4 years ago
Do you know if we can do SLAM with 3D mesh reconstruction with the T265 camera?
@jerryhan9401 · 5 years ago
In what fields might this camera be useful? Do you mean tracking the positions of drones or cars, or recording motion traces? For drones or cars, other accessories can be used to do that without being as processing-intensive.
@KarolMajek · 5 years ago
Processing happens on this 1.5 W device, not on a PC. It can be used for VR/AR and mobile robots. Yes, drones already have good visual odometry.
@jerryhan9401 · 5 years ago
Karol Majek Yeah! I got you. But I still think the better part of this camera is tracking the camera's facing direction, while the position tracking is also pretty cool.
@jerryhan9401 · 5 years ago
Karol Majek Because I don't get how the position tracking can be useful for VR games or other stuff.
@jerryhan9401 · 5 years ago
Karol Majek Also, I think the software itself is more important and much more useful, if it can be used on other cameras.
@KarolMajek · 5 years ago
Large-scale VR. I was walking with this sensor over 200x200 meters and had only a 30 closed-loop error. But it was outdoors, so a pretty hard environment. It can be a good solution for multiplayer VR games in large hangars.
@bikcrum · 1 year ago
I am trying to mount them on the bipedal robot Cassie (ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-a_YGPbWJO5g.html), but since the motion is too jerky, the camera is losing its tracking. Do you recommend any method that can help with stabilization?
@laurafaresin6575 · 5 years ago
I prefer Macchiato coffee ^_^ but the camera is super cool... How did you get it, from Intel directly?
@KarolMajek · 5 years ago
Bought it from Intel. Just clicked buy/preorder/i_dont_remember_exactly. Sadly I don't have any support (there were 3-4 people who bought me coffee, but regular, not macchiato :-) ) and of course YT ads (hope to get $5). I paid for the sensor + duty (approx. $100), but it's still worth it! Now I'm buying from Germany. So this isn't paid promotion or anything; my honest opinion. Macchiato next time :-)
@KarolMajek · 5 years ago
Now I get $70 monthly from YT ads in total (all videos). Just checked this. I need to eat something from time to time, so I'm doing a few other things as well :-)
@SARVOTARZANVFX · 5 years ago
Can we export its camera position data into another 3D application like Houdini or 3ds Max? In what format? Will it work if we place the T265 on top of a DSLR camera to fetch the position data of the DSLR?
@KarolMajek · 5 years ago
I don't know if there are any plugins yet. Yes, you can place it on a camera to track its position.
@SARVOTARZANVFX · 5 years ago
@@KarolMajek If it can export position data in FBX format, then it will work for 3D applications. Can you share a screen where it shows the export formats, please?
@SARVOTARZANVFX · 5 years ago
@@KarolMajek my email ID is sarvobusiness(at the rate)yahoo.co.in
@KarolMajek · 5 years ago
FBX is a proprietary format. What you can get now is a list of XYZ/RPY values with timestamps, e.g. in CSV format. The T265 is aimed at developers. There's an API, but no exporters to any popular format. Are there any more open formats, or a possibility to import a spreadsheet?
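A sketch of such a CSV exporter, assuming pose samples have already been collected as timestamped position + quaternion tuples (the T265 reports orientation as a quaternion, so roll/pitch/yaw has to be derived); the column layout here is my own choice, not an official format:

```python
import csv
import math

def quat_to_rpy(x, y, z, w):
    """Convert a unit quaternion to roll/pitch/yaw in radians (ZYX convention)."""
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return roll, pitch, yaw

def export_csv(path, samples):
    """samples: iterable of (t, (x, y, z), (qx, qy, qz, qw)) tuples."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t", "x", "y", "z", "roll", "pitch", "yaw"])
        for t, pos, quat in samples:
            writer.writerow([t, *pos, *quat_to_rpy(*quat)])

# One made-up sample with identity orientation:
export_csv("t265_poses.csv", [(0.0, (1.0, 2.0, 0.5), (0.0, 0.0, 0.0, 1.0))])
```

The resulting file imports directly into any spreadsheet application.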
@SARVOTARZANVFX · 5 years ago
@@KarolMajek I guess a spreadsheet will work. Thanks for your response.
@karankatiyar5414 · 3 years ago
So I see that you are using the tracking camera for its orientation pose, but can we use a D-series camera, run SLAM on it, and get the same pose and orientation of the camera? Is that possible? How accurate will that be? Your videos are really good, thank you.
@KarolMajek · 3 years ago
Thanks. There's a T265 + D435 RTAB-Map demo. Yes, you can use the T265 pose as an initial pose guess for SLAM.
@karankatiyar5414 · 3 years ago
@@KarolMajek Can I use a D435, implement SLAM on it, and get the pose of the camera just like the tracking camera gives by default? Will the result be the same?
@KarolMajek · 3 years ago
@@karankatiyar5414 Do you know ROS? Check this launch file: github.com/IntelRealSense/realsense-ros/blob/development/realsense2_camera/launch/rs_rtabmap.launch. It runs the D435, the T265, and RTAB-Map. You need to mount the sensors together; there is a part to 3D print. Then you can build a 3D map by moving the cameras.
@hothaifatariq11 · 5 years ago
Did you test it outdoors? Vehicle navigation?
@KarolMajek · 5 years ago
I already tested it outdoors, in a car and handheld. In the first scenario it lost tracking; I don't know when, as I have only the video, no rosbag. Second scenario: one kilometer, 4 loops (2 different figure-8s), urban area, final error 20-30 meters. In the first, I mounted it inside the car, so a lot of the car was visible in the 160° FOV. In the second, the sensor was seeing only the environment.
@hothaifatariq11 · 5 years ago
@@KarolMajek Thanks for the information. Can you access the synchronized IMU data in addition to the two cameras?
@KarolMajek · 5 years ago
You get the two cameras at 30 fps and the IMU at 200 Hz. I was using only the ROS driver.
@alexr5300 · 5 years ago
Can you describe how to generate the trajectory with the axes as in your video? Can you share the .rviz file or suggest topics to add? Thanks.
@KarolMajek · 5 years ago
Do you have the sensor? Add /camera/odom/sample (the odometry topic) and change the visualization from Arrow to Axes.
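For anyone reconstructing this view without the original file, an RViz display entry along these lines should reproduce it. This is a hand-written sketch, not the actual .rviz from the video; the key names follow the ROS 1 rviz config format, and the lengths and tolerances are guesses:

```yaml
# Fragment of an .rviz config: an Odometry display on the T265 pose topic,
# drawn as axes instead of arrows.
Visualization Manager:
  Global Options:
    Fixed Frame: camera_odom_frame
  Displays:
    - Class: rviz/Odometry
      Name: T265 Odometry
      Topic: /camera/odom/sample
      Shape:
        Value: Axes
        Axes Length: 0.1
        Axes Radius: 0.01
      Keep: 1000
      Position Tolerance: 0.05
      Angle Tolerance: 0.05
      Enabled: true
```

Keep controls how many past poses stay on screen, which is what produces the trajectory of axes seen in the video.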
@alexr5300 · 5 years ago
@@KarolMajek Hi, thanks. I have the camera and just tested it. Works perfectly; thanks for the tip. I can confirm that the performance is not so good with vibration. It's more or less best suited for ground robots, rather than UAVs, for accurate tracking and mapping when fused together with wheel odometry. The good thing is that this sensor has a very good, factory-calibrated IMU, which makes it pretty easy to work with and integrate into robotics projects. Keep posting more interesting videos. Thanks.
@KarolMajek · 5 years ago
Thanks! Check also how static objects in the FOV change the results. I think there's a difference, but I'm not sure. Now I have it mounted as a bumper, so it can see only the stuff around; I'm going to place it somewhere else and will see. I already made an experiment over a 1 km distance with 4 loops (2 different 8-shapes) going down 15 m and back up, and the error was as little as 20-30 meters (no other source of odometry), which I consider a good input for 6D SLAM.
@jacekstankiewicz9002 · 5 years ago
@@KarolMajek Are you going to make a video with results from your car? Maybe, at least, you can share the results somehow?
@KarolMajek · 5 years ago
I can't make it during the next 2 weeks. The hardware and software are ready, but I'm not :-)
@mechaliomar2447 · 5 years ago
Thanks so much for the video. But I'm still confused about using it for building a dense map with another RGB-D camera (using the T265 for the 6DOF pose estimate), because we have many pose estimation algorithms like OKVIS, ROVIO, VINS... All those algorithms require a visual-inertial sensor with the IMU and stereo camera hardware-synchronized and timestamped. I think this camera is a solution for that issue, but I'm still not sure, since no one has tried it on a real UAV. What do you think?
@KarolMajek · 5 years ago
I'm not working with UAVs. The lack of at least PPS can be a problem. For ground robots it's enough.
@mechaliomar2447 · 5 years ago
@@KarolMajek Please, what do you mean by PPS?
@KarolMajek · 5 years ago
Pulse per second from GPS.
@mattizzle81 · 5 years ago
DVO SLAM is meant for use with RGB-D cameras; it uses the full depth frame for tracking instead of just sparse features. I have not tried it yet, though; it is made for an old version of ROS.
@alzalame · 3 years ago
Hello, good video, a lot of details come to light. It's visible that the IMU has weak inertial performance. I'm curious whether the internal IMU can be swapped for an external one with better inertial performance, e.g. Xsens sensors?
@KarolMajek · 3 years ago
I recommend Xsens; RealSense has some synchronization option, so fusion is possible. The bigger problem is that the T265 sometimes gets lost and outputs NaNs. You can't add an external IMU to the T265.
@NikolaosTsarmpopoulos · 5 years ago
Any robot can have vibrations and, to be honest, I consider a 1% closed loop error to be significant.
@KarolMajek · 5 years ago
Handheld, without GPS or any other sensor: 20 meters of error after a 1600 meter indoor/outdoor multilevel walk (start and end on the 4th floor). Do you still think that is bad? And after fusing this VO with wheel odometry in a mobile robot, do you think the result would be worse (of course, no stairs...)? Such accuracy for $200 is the best price/quality ratio.
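For comparison with the 1% threshold mentioned above, the figures quoted here can be put through the same formula:

```python
# Closed-loop drift expressed as a percentage of distance travelled.
def drift_percent(error_m, distance_m):
    return 100.0 * error_m / distance_m

# Figures quoted in this thread: 20 m error after a 1600 m walk.
print(drift_percent(20, 1600))  # -> 1.25 (percent)
```

So the handheld walk comes out at 1.25%, in the same ballpark as the 1% figure being debated.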
@mattizzle81 · 5 years ago
@@KarolMajek Better than using DVO Slam with an RGBD camera? That is what I was planning on using for 3D mapping, however maybe I should just get this. It would be easier anyway.
@KarolMajek · 5 years ago
Easier, and 1.5 watts, without using your CPU/GPU.
@mattizzle81 · 5 years ago
@@KarolMajek OK, I am sold on this thing, buying one today :) I will just have to use a gimbal, maybe, to make sure vibrations or the rotation you demonstrated aren't a problem. I noticed the same sensitivity to that kind of motion on the ZED Mini camera.
@KarolMajek · 5 years ago
Hope you will be happy with it :-)
@tarekhemia1938 · 5 years ago
Hello, can this camera work for indoor positioning?
@KarolMajek · 5 years ago
Relative only. So you will know what distance you walked, but not exactly where you are in the building (it depends on where you start the sensor). For global positioning you will need an external source, e.g. Aruco markers. Between markers you navigate on the T265's VSLAM, and the markers help you localize in the global map.
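The marker-assisted global localization described above boils down to composing two transforms: the marker's known pose in the map, and the camera's pose relative to the detected marker. A 2D sketch with made-up poses (not from the video), using numpy homogeneous transforms:

```python
import numpy as np

def se2(x, y, theta):
    """2D homogeneous transform: rotation by theta (rad), translation (x, y)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

# Made-up map: a marker at (5, 2), rotated 90 deg in the global frame.
T_map_marker = se2(5.0, 2.0, np.pi / 2)
# Made-up detection: camera 1 m along the marker's -y axis, aligned with it.
T_marker_cam = se2(0.0, -1.0, 0.0)
# Composing the two yields the camera's global pose:
T_map_cam = T_map_marker @ T_marker_cam
print(T_map_cam[:2, 2])  # global camera position, approximately [6, 2]
```

The same composition works in 3D (4x4 matrices); whenever a marker is visible, this absolute fix can be used to correct the drift accumulated by the relative T265 trajectory.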
@tarekhemia1938 · 5 years ago
@@KarolMajek Thank you. I work with drones and have a D435 camera, but it needs powerful hardware and is not easy to use for SLAM. Does the T265 have the same problem, or can it work on a small board like an ODROID or Raspberry Pi? What tools do you use in this video?
@KarolMajek · 5 years ago
It requires 1.5 W over USB 3. Processing is on the device. I'm using ROS with the RealSense ROS driver, and RViz for visualization.
@KarolMajek · 5 years ago
So it doesn't require computing power, but it does need USB 3 with 1.5 W. I have one ITX computer which is not compatible even though it has USB 3. I will try an external USB 3 hub with an external power source.
@tarekhemia1938 · 5 years ago
@@KarolMajek I used the D435 with an ODROID XU4 and it worked without an external hub. I see that this camera is very sensitive to vibration and is not easy to use on a drone; maybe it's better with a stabilized gimbal.
@shivanibaldwa2064 · 5 years ago
Hey, how do you solve the visual odometry problem? Is there any detailed description, with code, of what you showed in this video?
@KarolMajek · 5 years ago
You get this result just by buying the Intel RealSense T265 sensor. Really! Look for librealsense and realsense-ros on GitHub.
@theappliedcoder9824 · 5 years ago
Sir, I want to learn machine learning. Can you teach me or show me the path? Another thing: can I do some projects with you?
@zhmai5424 · 5 years ago
Is this a normal camera or an XYZ camera? It's very cool.
@KarolMajek · 5 years ago
As output you get two grayscale streams at 848x800 px @ 30 fps, raw IMU data at 200 Hz, and a 6D pose (XYZ + RPY).
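A side note on these specs: the raw image payload alone helps explain the USB 3 requirement mentioned elsewhere in the thread. Assuming 1 byte per grayscale pixel and ignoring the IMU, pose packets, and protocol overhead:

```python
# Raw bandwidth of the two grayscale fisheye streams (1 byte/pixel assumed).
width, height, fps, streams = 848, 800, 30, 2
bytes_per_s = width * height * fps * streams
print(bytes_per_s / 1e6)  # -> 40.704 (MB/s)
```

Roughly 40.7 MB/s, which is close to the practical throughput of USB 2, so USB 3 headroom is a reasonable requirement.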
@zhmai5424 · 5 years ago
@@KarolMajek THX :)