In this Robot Mark II, we added hooks and ultrasonic sensors to find out whether multi-step commands can be applied to the AI robot, and whether it can make decisions based on feedback from the installed sensors.
Omg, how are you able to program this? I have had both the NXT and EV3 sets and never been able to speak to them; the microphone on the NXT wouldn't even support something like this.
I actually talk through a laptop that functions as the server side: it sends the speech to a large language model from Google, and the generated output is then sent on to the robot via Bluetooth.
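The laptop-side relay described above could be sketched roughly like this. This is only an illustration under assumed names: `parse_llm_reply`, `encode_command`, and the command vocabulary are hypothetical, the actual Google LLM call is elided, and the Bluetooth serial details depend on the robot setup.

```python
# Hypothetical sketch of the laptop relay:
# speech -> LLM (e.g. a Google model, call not shown) -> command -> Bluetooth serial.

# Assumed command vocabulary for the robot (illustrative only).
VALID_COMMANDS = {"forward", "backward", "left", "right", "grab", "release", "stop"}

def parse_llm_reply(reply: str) -> str:
    """Extract the first recognized robot command word from the LLM's free-text reply."""
    for word in reply.lower().replace(".", " ").replace(",", " ").split():
        if word in VALID_COMMANDS:
            return word
    return "stop"  # fail safe: an unrecognized reply halts the robot

def encode_command(cmd: str) -> bytes:
    """Frame the command for the Bluetooth serial link (newline-terminated ASCII)."""
    return (cmd + "\n").encode("ascii")

# A real run would open the robot's Bluetooth serial port, e.g. with pyserial:
#   import serial
#   link = serial.Serial("/dev/rfcomm0", 9600)  # port name is setup-specific
#   link.write(encode_command(parse_llm_reply(llm_reply)))
```

Keeping the LLM on the laptop side and sending only short framed commands over Bluetooth keeps the robot's own program simple: it just reads a line and dispatches on the command word.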