
WWDC23: Integrate with motorized iPhone stands using DockKit | Apple 

Apple Developer

Discover how you can create incredible photo and video experiences in your camera app when integrating with DockKit-compatible motorized stands. We’ll show how your app can automatically track subjects in live video across a 360-degree field of view, take direct control of the stand to customize framing, directly control the motors, and provide your own inference model for tracking other objects. Finally, we’ll demonstrate how to create a sense of emotion through dynamic device animations.
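As context for the session, here is a minimal sketch of discovering a DockKit stand and enabling system-driven subject tracking. The flow (await accessory state changes, then turn on system tracking) follows the session's outline, but exact property and method signatures here are assumptions and may differ from the shipped DockKit API.

```swift
import DockKit

// Hedged sketch: watch for a compatible motorized stand docking,
// then let the system automatically keep subjects framed.
func observeDockedAccessories() async throws {
    for await stateChange in try DockAccessoryManager.shared.accessoryStateChanges {
        // When an accessory docks, system tracking keeps the subject
        // centered across the stand's 360-degree field of view.
        if stateChange.state == .docked, let accessory = stateChange.accessory {
            try await DockAccessoryManager.shared.setSystemTrackingEnabled(true)
            print("Docked accessory: \(accessory)")
        }
    }
}
```

The session also covers disabling system tracking to take direct control of framing and motor positions; that path starts from the same `DockAccessoryManager` entry point.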
To learn more techniques for image tracking, check out "Detect animal poses in Vision" from WWDC23 and "Classify hand poses and actions with Create ML" from WWDC21.
Explore related documentation, sample code, and more:
DockKit: developer.appl...
Create ML: developer.appl...
Vision: developer.appl...
Detect animal poses in Vision: developer.appl...
Classify hand poses and actions with Create ML: developer.appl...
Detect Body and Hand Pose with Vision: developer.appl...
More Apple Developer resources:
Video sessions: apple.co/Video...
Documentation: apple.co/Devel...
Forums: apple.co/Devel...
App: apple.co/Devel...

Published: 29 Sep 2024
