
FTC 18225 High Definition WA State Control Award Video Submission (Freight Frenzy 2021-2022) 

High Definition
183 subscribers · 14K views

Published: Oct 4, 2024

Comments: 56
@cool_syder_4207 2 years ago
If anyone deserves the control award, it has got to be you guys!
@highdefinition6017 2 years ago
Thanks!
@ToeDexterity 2 years ago
This is really off-the-charts amazing! Well done!!
@rjhuang7650 2 years ago
This is awesome. The robot is intelligent.
@robosapiens7051 2 years ago
Hey guys! This is easily one of the coolest robots I have seen this season. The only question I had is what you guys used for your intelligent claw: do you all use regular cameras, or any special programs? Thanks again, and congrats on making worlds!
@highdefinition6017 2 years ago
We use a regular Logitech webcam (I'm not sure what specific model), and we don't use any libraries for detection; all the code used for processing the image is developed by our team.
@elementalsb8112 2 years ago
This is some crazy vision! I was wondering if you knew where I could go to learn more about computer vision, TensorFlow, and/or OpenCV, like the commands and how to really understand and utilize the vision. I guess what I'm asking is how you learned to use TensorFlow Lite and what I could do to be able to master it like you. Thanks a lot :)
@highdefinition6017 2 years ago
Hi! We learned TensorFlow Lite basically through trying it out and seeing what it was capable of. There are plenty of helpful resources in the external samples in the FtcRobotController folder, for example this one: github.com/FIRST-Tech-Challenge/FtcRobotController/blob/master/FtcRobotController/src/main/java/org/firstinspires/ftc/robotcontroller/external/samples/ConceptTensorFlowObjectDetection.java We've found that TensorFlow works well for game elements (a new TFLite model is provided every year), but we've found it difficult to use for custom objects or other use cases like warehouse freight detection. That's why we decided to create a custom vision algorithm, which you can see at 0:33. If you want to learn more about our custom vision algorithms, feel free to contact our lead programmer at null_awe#0184 on Discord. There are probably great videos on how to use OpenCV for FTC on RU-vid, but we haven't really used OpenCV, at least not yet.
@spidernh 2 years ago
Congrats on worlds!
@highdefinition6017 2 years ago
Thank you so much! :D
@yotamdubiner2545 2 years ago
Wow! Amazing! I'm definitely going to try to implement that kind of intelligence in our bot.
@highdefinition6017 2 years ago
Feel free to reach out if you have any questions!
@yotamdubiner2545 2 years ago
@@highdefinition6017 Actually, can you give me an idea of where to start? Maybe some theory, what to Google, etc.
@highdefinition6017 2 years ago
@@yotamdubiner2545 Try looking at the sample class in external samples called "ConceptWebcam". This teaches you how to retrieve image frames from the webcam directly. Then, the most important thing is converting the RGB color format of the pixels to HSV, which you can work more with (filtering is much easier with HSV).
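For anyone following along, the RGB-to-HSV conversion mentioned above is a standard formula you can implement by hand on each pixel. Here is a minimal self-contained sketch of that formula (an illustration, not the team's actual code):

```java
// Minimal RGB -> HSV conversion, the standard formula commonly used
// before color filtering. A sketch for illustration, not the team's code.
public class RgbToHsv {
    /**
     * Converts r, g, b in [0, 255] to
     * { hue in [0, 360), saturation in [0, 1], value in [0, 1] }.
     */
    public static double[] rgbToHsv(int r, int g, int b) {
        double rf = r / 255.0, gf = g / 255.0, bf = b / 255.0;
        double max = Math.max(rf, Math.max(gf, bf));
        double min = Math.min(rf, Math.min(gf, bf));
        double delta = max - min;

        double hue;
        if (delta == 0) {
            hue = 0; // gray: hue is undefined, use 0 by convention
        } else if (max == rf) {
            hue = 60 * (((gf - bf) / delta) % 6);
        } else if (max == gf) {
            hue = 60 * (((bf - rf) / delta) + 2);
        } else {
            hue = 60 * (((rf - gf) / delta) + 4);
        }
        if (hue < 0) hue += 360;

        double saturation = (max == 0) ? 0 : delta / max;
        return new double[] { hue, saturation, max };
    }
}
```

With hue, saturation, and value in hand, filtering for a target color is usually just a range check on the hue channel (plus minimum saturation/value thresholds to reject gray and dark pixels), which is why HSV is so much easier to filter with than RGB.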
@yotamdubiner2545 2 years ago
@@highdefinition6017 I know how to do that; I'm familiar with EOCV. I just need to know how you determine the distance to the object from its pixel position in the image
@yotamdubiner2545 2 years ago
@@highdefinition6017 and how you determine the angle
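The thread never got an answer to the distance and angle questions, but one common approach is the pinhole-camera model: the horizontal pixel offset from the image center gives a bearing, and for an object sitting on the floor, the vertical pixel offset combined with the camera's known height and downward tilt gives a ground distance. The sketch below is a generic illustration with made-up camera parameters, not the team's method:

```java
// Generic pinhole-camera estimate of bearing and ground distance from a
// detected object's pixel coordinates. NOT the team's method; all camera
// parameters below are illustrative assumptions.
public class CameraGeometry {
    static final int WIDTH = 640, HEIGHT = 480;          // image size (assumed)
    static final double HFOV = Math.toRadians(60);       // horizontal field of view (assumed)
    static final double CAM_HEIGHT_IN = 10.0;            // camera height above floor, inches (assumed)
    static final double CAM_PITCH = Math.toRadians(20);  // downward tilt of the camera (assumed)

    // Focal length in pixels, derived from the horizontal FOV.
    static final double FOCAL_PX = (WIDTH / 2.0) / Math.tan(HFOV / 2.0);

    /** Horizontal angle to the object, in radians; 0 means straight ahead. */
    public static double bearing(int px) {
        return Math.atan2(px - WIDTH / 2.0, FOCAL_PX);
    }

    /** Ground distance (inches) to an object on the floor, from its pixel row. */
    public static double groundDistance(int py) {
        // Angle below horizontal to the object's base: camera pitch plus
        // the angular offset of the pixel row from the image center.
        double angleDown = CAM_PITCH + Math.atan2(py - HEIGHT / 2.0, FOCAL_PX);
        return CAM_HEIGHT_IN / Math.tan(angleDown);
    }
}
```

In practice the FOV, height, and tilt have to be measured (or calibrated by placing an object at a known distance and solving backwards), and lens distortion adds error near the image edges.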
@elywickander4666 2 years ago
most advanced clawbot ever made
@divinaabiodun 1 year ago
Good job, keep it up 😜😜
@timmyytjr1131 2 years ago
Omg you guys are absolutely crazy! Do you guys use a motion planning library like Roadrunner or did you custom make the movements? And as for the hybrid PID, how did you create a tuner for that? I can't wait to see you guys at worlds! I'm definitely going for y'alls pins if you have any lol
@highdefinition6017 2 years ago
Nope, everything is custom made! For the PID tuner, we have a base class that handles all the normal PID logic which applies to all PID models (for any subsystem), and then all we have to do for a new subsystem is just implement a few methods (getError, setPower, cancel) and then it's ready to be tuned in tele-op.
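The base-class pattern described above could be sketched like this; getError, setPower, and cancel are the method names from the comment, while everything else (the constructor, gains, and update loop) is an illustrative assumption:

```java
// Sketch of an abstract PID base class shared across subsystems.
// getError/setPower/cancel are the method names from the comment above;
// the rest is an illustrative assumption, not the team's actual code.
public abstract class PidSubsystem {
    private final double kP, kI, kD;
    private double integral, lastError;
    private boolean first = true;

    protected PidSubsystem(double kP, double kI, double kD) {
        this.kP = kP;
        this.kI = kI;
        this.kD = kD;
    }

    /** Current error between target and measured state (subsystem-specific). */
    protected abstract double getError();

    /** Apply the computed power to this subsystem's actuator. */
    protected abstract void setPower(double power);

    /** Stop the subsystem (e.g. zero the motors). */
    public abstract void cancel();

    /** One control-loop iteration; dt is seconds since the last call. */
    public double update(double dt) {
        double error = getError();
        integral += error * dt;
        double derivative = first ? 0 : (error - lastError) / dt;
        first = false;
        lastError = error;
        double power = kP * error + kI * integral + kD * derivative;
        setPower(power);
        return power;
    }
}
```

A turret subsystem, for example, would implement getError() as target angle minus encoder angle and setPower() as a motor write; the shared update() logic then works unchanged for every subsystem, which is what makes a single tele-op tuner possible.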
@useruseruseruser-i6s 2 years ago
wowww!!
@calvinz9126 1 year ago
that claw really has aimbot
@pjwetherell9414 2 years ago
Did you run your vision in a separate thread? How long did it take to get a picture and fully process it? Did you attempt to find the real-world coordinates of the freight? If so, how accurate was that?
@tharunkumara.r229 2 years ago
DANG, that's amazing! Both your software and hardware are just top notch! I have a quick question: what motors did you all use for your turret?
@highdefinition6017 2 years ago
Thanks! We use a goBILDA Yellow Jacket motor.
@brandon-mz1vs 2 years ago
Nice robot. Can your intelligent claw detect the individual blocks at the start of the game when they are all clumped together?
@highdefinition6017 2 years ago
Yep! We don't actually detect all of the blocks, only the closest one (lowest in the image). However, it does pick out one individual lowest block.
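"Lowest in the image" is a cheap proxy for "closest to the robot" with a forward-facing camera, and selecting it can be as simple as scanning a binary color mask from the bottom row up. A hedged sketch of that idea, not the team's actual code:

```java
// Sketch: find the lowest (largest row index) pixel that passed a color
// filter. For a forward-facing camera, lower in the image usually means
// nearer to the robot. An illustration, not the team's actual code.
public class LowestBlock {
    /**
     * mask[row][col] is true where a pixel matched the target color.
     * Returns { row, col } of the lowest matching pixel, or null if none.
     */
    public static int[] lowestMatch(boolean[][] mask) {
        for (int row = mask.length - 1; row >= 0; row--) { // bottom row first
            for (int col = 0; col < mask[row].length; col++) {
                if (mask[row][col]) {
                    return new int[] { row, col };
                }
            }
        }
        return null; // no pixel matched the color filter
    }
}
```

A real detector would typically also require a minimum blob size around the hit to reject single-pixel noise, but the bottom-up scan is the core of picking out one individual lowest block.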
@robotkg6540 2 years ago
How did you create the rotating platform? 3D print? Custom made from some material? Does it use a servo or a motor to rotate?
@highdefinition6017 2 years ago
Cool question! Our turret is a lazy susan bearing powered by a motor. There is a carbon fiber plate attached on top of the lazy susan bearing so that we can easily attach our delivery arm. We also have another lazy susan bearing on the intake arm but that is custom 3D printed.
@acronicosftc 2 years ago
duuude
@honeykohms8345 2 years ago
This is absolutely insane. Congrats on getting to worlds! Quick question: I noticed that in your detection code you have set it up to use a webcam. Is it possible to achieve this with the phone camera as well?
@highdefinition6017 2 years ago
We actually haven't tried it with a phone camera before, although we've been asked this same question before. Our guess is you could look at the sample classes provided in the FtcRobotController and see if there is something that can retrieve camera frames. When we tried this last year for ring detection (counting orange pixels instead), the only way we could retrieve an image from the phone camera was to use Vuforia to return an image (a process we've kind of forgotten the specifics of... but you could probably find it online). We're not aware of another way currently.
@honeykohms8345 2 years ago
@@highdefinition6017 Alright thank you. I'll try it out and let you know how it goes.
@LeontinHainaru 2 years ago
How cool is that! Congrats, guys. How do you do the real-time detection? OpenCV?
@highdefinition6017 2 years ago
All of our vision algorithms are completely original this year. The logic described in the video runs in a background thread, and when highly optimized, it can cycle camera frames quickly enough to allow real-time detection. If you are in the FTC Discord, there was another example of real-time detection that also used a highly optimized version of our code: discord.com/channels/225450307654647808/771188718198456321/946616121451249725
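A background vision thread like the one described is often structured around an AtomicReference that always holds the newest result, so the control loop can read the latest detection without blocking. This is a generic sketch of that pattern, not the team's code; processFrame is a hypothetical stand-in for the actual frame grab and pixel processing:

```java
import java.util.concurrent.atomic.AtomicReference;

// Sketch of a background vision loop: a worker thread repeatedly processes
// the newest camera frame and publishes the latest result, which the main
// control loop reads without blocking. Not the team's actual code;
// processFrame() is a hypothetical stand-in for real image processing.
public class VisionLoop {
    private final AtomicReference<double[]> latestResult = new AtomicReference<>();
    private volatile boolean running = true;

    /** Stand-in for grabbing and processing one camera frame. */
    private double[] processFrame() {
        return new double[] { 0.0, 0.0 }; // e.g. { bearing, distance }
    }

    /** Starts the worker thread that continuously refreshes the result. */
    public Thread start() {
        Thread worker = new Thread(() -> {
            while (running) {
                latestResult.set(processFrame()); // always overwrite with newest
            }
        });
        worker.setDaemon(true); // don't keep the process alive
        worker.start();
        return worker;
    }

    public void stop() {
        running = false;
    }

    /** Non-blocking read of the most recent detection (null before the first frame). */
    public double[] getLatest() {
        return latestResult.get();
    }
}
```

The key property is that the consumer never waits for a frame: it simply uses whatever result is freshest, which is what lets a fast vision loop feel "real-time" to the rest of the robot code.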
@LeontinHainaru 2 years ago
@@highdefinition6017 Congrats and keep going.
@fundooguy316 2 years ago
One of the best robots I've seen this season. I don't see the link to your portfolio. Can you share it?
@highdefinition6017 2 years ago
Hi there! We have not made the portfolio public yet as we are still mid-season. Following the World Championships, we will most likely make it public - so stay tuned for then!
@fasvi1285 2 years ago
@@highdefinition6017 Have you made the code public? You mentioned in the video that you were considering doing that.
@mateusbernart2101 2 years ago
Hi guys, how did you make the rotating platform?
@highdefinition6017 2 years ago
The rotating platform in both the delivery arm and the intake arm is a lazy susan bearing. For the delivery arm, it was big enough that we could buy it online. However, since the intake arm was smaller, we used Fusion 360 to CAD our own lazy susan bearing and 3D printed it.
@mateusbernart2101 2 years ago
@@highdefinition6017 But how did you motorize it?
@highdefinition6017 2 years ago
@@mateusbernart2101 I'll explain the intake arm's lazy susan bearing; the delivery arm works similarly, with a motor in place of a servo. The lazy susan bearing has two parts: an inner part (mounted to the drivetrain) and an outer part. In between is a set of balls that allows the bearing to rotate. We attached a gear connecting the outer part to the plate above, controlled by the servo you see on top of the bearing. When the servo rotates, it turns that gear against another gear located on the inner part of the bearing, so both gears move and the intake arm rotates! If my explanation isn't clear, we can also share the CAD file so you can take a better look at the components.
@VeerNanda 2 years ago
@@highdefinition6017 would it be possible for you to share the cad?
@highdefinition6017 2 years ago
​@@VeerNanda Added the link to the video description! :)
@leohai6700 2 years ago
Simply awesome. Do you have a repository for your code?
@highdefinition6017 2 years ago
Yeah, the repo is private for now because it's not cleaned up, which we'll probably do if we release the code to the public. Our shipping element detector code, however, is public here: github.com/HiiDeff/ShippingElementDetector
@leohai6700 2 years ago
@@highdefinition6017 thank you
@kaushikreddy2775 2 years ago
Woah... what kinds of mecanum wheels do you use?
@highdefinition6017 2 years ago
6” mecanum wheels!
@jebsho 2 years ago
I'm just curious, but you said your robot has 25 sensors!? Could you list the ones you used?
@highdefinition6017 2 years ago
1 IMU, 2 Logitech webcams, 5 REV 2m distance sensors, 6 motor encoders, 12 servo encoders
@17viKing17 2 years ago
Hello, please tell me: are you using Dynamixel 12A servos? If so, how did you manage to reflash them to use with the Control Hub, and which library did you use?
@highdefinition6017 2 years ago
We're using 11 goBILDA servos and one Savox servo, not Dynamixel servos. I'm not sure what you mean by reflashing them using the Control Hub.