
Custom Depth Map from Kinect & RealSense Point Clouds in TouchDesigner - Tutorial 

The Interactive & Immersive HQ
30K subscribers
13K views

Get access to 200+ hours of TouchDesigner video training, a private Facebook group where Elburz and Matthew Ragan answer all your questions, and twice-monthly group coaching/mastermind calls here: iihq.tv/Trial
It can be difficult to create visual effects with a sensor's depth map when it's full of walls, floors, and ceilings. With many of the new operators in TouchDesigner focused on working with point clouds, there are effective new techniques we can use with only a handful of operators. Using Point Transform TOPs, Math TOPs, a little bit of GLSL, and new instancing setups, we can clean up and process our point cloud before rendering it back into a new custom depth map.
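For a sense of what the GLSL step looks like, here is a minimal sketch of a GLSL TOP pixel shader that crops a point cloud by depth. The uniform names (uMinZ, uMaxZ) and the wiring are illustrative assumptions, not the exact code from the video.

```glsl
// Minimal sketch: mask out point-cloud samples outside a chosen depth range.
// uMinZ / uMaxZ are custom uniforms (set on the GLSL TOP's Vectors page).
// Input 0 is the 32-bit float point-cloud position texture. Depending on the
// sensor, z may be negative, so adjust the range to match your data.
uniform float uMinZ;
uniform float uMaxZ;

out vec4 fragColor;

void main()
{
    vec4 pos = texture(sTD2DInputs[0], vUV.st);           // xyz per pixel
    float keep = step(uMinZ, pos.z) * step(pos.z, uMaxZ);  // 1 inside range
    // Zero out rejected points and carry the mask in alpha so later
    // instancing or compositing stages can discard them.
    fragColor = vec4(pos.xyz * keep, keep);
}
```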

Published: 2 Oct 2024

Comments: 21
@blickgeist9026 2 years ago
I want a baby from you, Elburz
@borjonx 7 months ago
Any tips on putting the Kinect v2 color data back onto the point cloud pixels? I have it working, but the point cloud doesn't match up perfectly with the color image.
@TheInteractiveImmersiveHQ 7 months ago
For accurate alignment, you could try using either the Color Point Cloud option of the Kinect TOP's Image parameter, or you could instead set the Image parameter to Color and the Camera Remap switch to On. Elburz covers this functionality in the Kinect Azure Point Cloud video (ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-P_PjAr2Yzao.html). Although the video is focused on the Azure, you can get similar results with the Kinect v2. Hope that helps!
@deimovprojects 2 months ago
Hello, excellent tutorial! In this video you use GLSL to crop by depth, but how can you crop by Player Index?
@TheInteractiveImmersiveHQ 2 months ago
The Player Index texture is a little different, but somewhat simpler to deal with since it's more like a mask rather than a 3D point cloud -- try using a Threshold TOP and adjusting the Threshold parameter until you are able to isolate the person's outline. Hope that helps!
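If you'd rather keep the cropping inside the GLSL pass from the video, here is a rough sketch that masks the point cloud by the player index texture wired into a second input. Treating index 0 as background is an assumption about the sensor's output; the simpler Threshold TOP approach above works without any code.

```glsl
// Sketch: crop the point cloud using the Kinect's Player Index texture
// (second input). Assumes index 0 = background, anything above = a body.
out vec4 fragColor;

void main()
{
    vec4 pos = texture(sTD2DInputs[0], vUV.st);           // point positions
    float playerIdx = texture(sTD2DInputs[1], vUV.st).r;  // player index mask
    float keep = step(0.001, playerIdx);                   // keep body pixels
    fragColor = vec4(pos.xyz * keep, keep);
}
```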
@AKM0D 2 years ago
Hi, hope you're doing great. I have a question: is this useful when you want to use the Kinect for an interactive floor installation, i.e. working with the distance between the person and the sensor? Appreciate your reply.
@TheInteractiveImmersiveHQ 2 years ago
Yes, this can be useful because you can re-orient how your tracking space is defined regardless of how your camera is set up. So you could have a camera facing forward and then turn it into something that feels like top-down tracking. Does that answer your question?
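As a rough illustration of that re-orientation, here is a GLSL sketch that rotates the point positions so a forward-facing camera's depth axis behaves like a height axis. The uPitch uniform is an assumption for illustration; in practice the Point Transform TOP's Rotate parameters do the same job without any code.

```glsl
// Sketch: rotate point positions around the X axis so the camera's depth
// axis becomes a height axis, approximating top-down tracking from a
// forward-facing sensor. uPitch is in radians (e.g. ~1.5708 for 90 degrees).
uniform float uPitch;

out vec4 fragColor;

void main()
{
    vec4 p = texture(sTD2DInputs[0], vUV.st);
    float c = cos(uPitch);
    float s = sin(uPitch);
    vec3 rotated = vec3(p.x,
                        c * p.y - s * p.z,
                        s * p.y + c * p.z);
    fragColor = vec4(rotated, p.a);
}
```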
@C0llinsW0rth. 3 months ago
Does this use the Azure? I have a Kinect 2 and didn't know if it would work as well.
@TheInteractiveImmersiveHQ 2 months ago
Elburz is using a Kinect Azure in the video, but the Kinect 2 also has point cloud support. You can access a _Color Point Cloud_ and _Depth Point Cloud_ via the Image parameter of the Kinect TOP. Hope that helps!
@RusticRaver 2 years ago
If we needed to create a dense point cloud of a static building or sculpture using a Kinect Azure plugged into a laptop (as a mobile scanner), could you show us how to scan with several passes and accumulate points for next-level 3D mapping? Or maybe that would not work in TD due to a limit on the number of points, I have no idea. Thanks!
@TheTretistaKelverian 8 months ago
Hello, did you do it?
@RusticRaver 8 months ago
Hi man, well, I got a Kinect Azure for a bit; that one actually works on a laptop over USB. But even with a strong laptop there just isn't enough processing power to really do that; the best way would probably be using lidar on a drone. I had much more success with photogrammetry to scan things, then animating a portion of the points where the Kinect detects your hand and projecting onto the wall, which is now made of point cloud. Actually I think the Kinect is good for the bin: it costs a fortune and might be fine for a small live point cloud, but now you can track skeletons from video alone. I also think TouchDesigner itself has serious limits unless you handle point clouds with GLSL. I sort of gave up; too much hassle, and even if you do it right it's kind of slow to work with. Perhaps when computing power is multiplied by 3x or 4x it will be worth it. (I've wasted enough time in front of computers for a lifetime.) Good luck to you still @TheTretistaKelverian
@Sosigarets 2 years ago
Love this tutorial, thank you so much!! I would love to make a point cloud map with data from two cameras. Do you have any sources on this, or could you consider making a tutorial on it?
@rolandomotta6354 1 year ago
Incredible tutorial! What I'm trying to do is maintain the colorization range of the depth map even if the body moves a meter forward or backward, so that it isn't tied to a specific place but instead renders the depth map relative to wherever the person is standing.
@TheInteractiveImmersiveHQ 1 year ago
You might be able to achieve this with the Remap TOP, which would allow you to apply/map a specific texture to the points in the scene. Might also be worth trying to normalize the point position data with the Limit TOP, but that might cause some color shifting artifacts of its own.
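For the GLSL route, here is a minimal sketch of the same idea: shading depth relative to a moving reference so the gradient stays put as the person walks forward or back. The uCenterZ and uRange uniforms are assumptions for illustration (uCenterZ could be the tracked body's average depth exported from a CHOP), not the Remap/Limit TOP approach described above.

```glsl
// Sketch: shade depth relative to a moving reference so the colorization
// stays constant as the person moves. uCenterZ = reference depth (e.g. the
// body's average z), uRange = depth span mapped onto the 0-1 gradient.
uniform float uCenterZ;
uniform float uRange;

out vec4 fragColor;

void main()
{
    vec4 p = texture(sTD2DInputs[0], vUV.st);
    float shade = clamp((p.z - (uCenterZ - 0.5 * uRange)) / uRange, 0.0, 1.0);
    fragColor = vec4(vec3(shade), 1.0);
}
```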
@Sosigarets 2 years ago
This was a great tutorial and helped me out a lot. Do you know if it's possible to do this with several cameras? I know the Kinect v1 has that possibility; does it work with the Azure?
@TheInteractiveImmersiveHQ 1 year ago
That's great, we're glad to hear it! Yes, the Azure Kinect SDK allows you to connect as many Kinects to your computer as you have bandwidth for. The Kinect V2 was the version that only allowed one Kinect to be connected to the computer at a time.
@mateafriend7550 2 years ago
Thank you for this tutorial! I was wondering if it would be possible to use this custom depth map for 3D projection mapping, instead of recreating the model inside Cinema 4D, so that you could 3D projection map in real time?
@taj_ninny 1 year ago
Yes, I've used this technique for live human-body projection mapping and it works well. You just need to calibrate correctly between your point cloud and your rendering camera (VP intrinsics/extrinsics), and it's also a bit demanding on the computer (I was using NVIDIA Flow to set the performer's body on fire at the same time). For correct projection mapping you need the actual XYZ values, so don't use the custom depth map; keep the GLSL-filtered point cloud instead.
@-303- 2 years ago
This is awesome! I have been looking for this info for a while. Thanks for sharing! BTW - Azure rhymes with measure…
@TheInteractiveImmersiveHQ 2 years ago
No problem! It's tricky to wrap your head around the first time but then becomes such an incredible workflow to keep in the back pocket. Haha I've been corrected so many times it might be one of those things I say wrong forever *face palm*