Many thanks for this tutorial. It has helped me a lot with the "easy" way to link Python with V-REP. I've saved a lot of time trying to do it. Well explained and structured. Thanks again for your time ;-)
This is a very in-depth and detailed walkthrough tutorial; it helped me a lot. Only one error popped up: since I'm using a Mac, the streaming opmode didn't work for the left and right motor handles for whatever reason. I switched to oneshot_wait and that got me through.
Great work. Thanks a lot for sharing this information in a coherent fashion. Also, I appreciate the effort put on the description section. It helped me do a quick second pass on the video for details.
This was a very helpful tutorial on V-REP. At first I tried to follow along with the latest version of CoppeliaSim, but after a while I installed the V-REP version. In the end I found out it doesn't matter at all. One thing about the prismatic joint part: you have to keep the prismatic joint as a child of another parent object.
Thank you very much, sir. If possible, kindly upload a Lua (Torch + LuaJIT) robot simulation with V-REP tutorial, and a tutorial on image display and saving operations using the Lua remote API functions. Thank you once again for such a nice and easy way of explaining. It's so useful.
Great video! Do you know a lot about ROS (Robot Operating System)? It seems like a very robust system that a lot of robots are using. Unfortunately there are not a lot of good tutorials; you should make some if you have a background in it!
Well, you technically know everything you need on the V-REP side. All you need is a Python fuzzy logic library and a fuzzy logic control scheme (which will depend heavily on what you are trying to do). The fuzzy logic library will contain functions that take inputs (sensor readings) and decide what the best output is (actuator commands).
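To make the idea concrete, here is a minimal, self-contained sketch of that sensor-in/command-out pattern with hand-rolled triangular membership functions. All names (`tri`, `fuzzy_speed`) and the membership breakpoints are hypothetical illustrations, not part of any specific library; in practice you would wire the distance reading from the V-REP proximity sensor in and the returned speed out to the wheel motors.

```python
# Minimal fuzzy-control sketch (all names and tuning values hypothetical).
# Input: a distance reading from a proximity sensor (metres).
# Output: a speed command for the wheel motors.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_speed(distance):
    # Fuzzify: how "near", "medium", "far" is the obstacle?
    near = tri(distance, -0.5, 0.0, 0.5)
    medium = tri(distance, 0.2, 0.6, 1.0)
    far = tri(distance, 0.8, 1.5, 10.0)
    # Rules: near -> crawl (0.1), medium -> cruise (1.0), far -> fast (2.0)
    weights = [near, medium, far]
    speeds = [0.1, 1.0, 2.0]
    total = sum(weights)
    if total == 0:
        return speeds[-1]  # nothing sensed: treat as clear path
    # Defuzzify with a weighted average (centroid of singleton outputs)
    return sum(w * s for w, s in zip(weights, speeds)) / total

print(fuzzy_speed(0.1))  # close obstacle -> slow
print(fuzzy_speed(2.0))  # clear path -> fast
```

A real project would likely replace the hand-rolled parts with a library such as scikit-fuzzy, but the structure (fuzzify readings, evaluate rules, defuzzify to a command) stays the same.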
Hello, thanks for the great tutorial; it has helped me a lot. I would like to ask: will it work if I want to extract the data from the Hokuyo laser range finder from V-REP into Python using the method you showed for the proximity sensor on the Pioneer robot? Or is there another method to extract the data from the Hokuyo laser sensor? Your assistance is greatly appreciated.
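For reference, the bundled Hokuyo models in V-REP typically publish their scan points from the child script as a packed string signal, which the Python side reads with `vrep.simxGetStringSignal` and decodes with `vrep.simxUnpackFloats` (both are real remote API functions; the signal name varies by model script, so check yours). The decode step amounts to interpreting the buffer as little-endian 32-bit floats, which can be sketched without a running simulator:

```python
import struct

# What simxUnpackFloats does, in essence: the string signal is a byte
# buffer of little-endian 32-bit floats (x, y, z per scanned point).
def unpack_floats(raw):
    n = len(raw) // 4
    return list(struct.unpack('<%df' % n, raw[:n * 4]))

# Simulated signal payload: two 3-D points, packed the same way a
# Hokuyo child script would pack them with sim.packFloats / simPackFloats.
payload = struct.pack('<6f', 1.0, 0.5, 0.0, 2.0, -0.5, 0.0)
data = unpack_floats(payload)
points = [tuple(data[i:i + 3]) for i in range(0, len(data), 3)]
print(points)  # [(1.0, 0.5, 0.0), (2.0, -0.5, 0.0)]
```

Against a live scene you would instead call `vrep.simxGetStringSignal(clientID, signal_name, opmode)` and pass the returned buffer to `vrep.simxUnpackFloats`; the proximity-sensor method from the video also works, but the string-signal route returns the whole scan in one call.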
Hello sir, first of all thank you for this great tutorial. I am pursuing my PhD in intelligent networked robotic systems, and I want to use fuzzy logic and neural networks to make my robot autonomous. Kindly upload a video, or give me some links, on how to connect MATLAB to V-REP through the API (I have done that already but still want to learn from you), and on how to use fuzzy logic and neural networks in V-REP. Please sir, I really need your help.
Hi, Nikolai. Your video was a good start for me; you have helped me a lot. But I have a problem now: I type the commands just as you show us, but my vision sensor handle is 57 while the error code is still 1. I have been stuck on this for a few days and a Google search turned up no answer. Could you help me?
Hi. First of all, I want to apologize for my English; I'm not using a translator. I only want to know whether V-REP is good training for later developing a real robot with my own Python scripts. I know about ROS, but it's Linux-only, and it's hard to work in a VM on my notebook.
Thanks for sharing your knowledge! What version of V-REP do you use? My version (Education 3.3) uses scientific notation (so numbers look like 0.001E+02) to show all floating-point numbers, which annoys me very much. I'm wondering which version I should install to get regular notation. Thanks.
Hello Nikolai, what a great tutorial; it's really detailed, step by step, and really clear, thanks a lot. I did what you describe in the tutorial and managed to program other robots like the dr20 and Pioneer by first disabling the demo scripts that come with the models by default. However, I haven't been able to program the e-puck: even if I disable the four scripts it has by default (speaker, body, light, and the e-puck child script), the robot keeps being controlled directly by V-REP, not allowing me to send commands from Python. How can I solve this? Is there any way of importing the model in a "blank" manner, without any default scripts?
+David Alvarez I already solved the problem. To do so, follow these steps: 1. Open the e-puck's branch in the scene hierarchy (where all the e-puck's components are displayed). 2. Double-click one of the wheel joints, e.g. 'ePuck_leftJoint' (you have to repeat the procedure for both wheels), and the Joint Properties window will appear (you can also reach it through [Menu bar --> Tools --> Scene object properties]). 3. On the bottom left, click the 'Dynamic properties' button. 4. When the Joint Dynamic Properties window pops up, change the wheel's 'Target velocity [deg/s]' to zero :D
Thanks for sharing the video. I have a question. I simulated a simple kinematic control of the KUKA youBot's arm in V-REP using the Jacobian inverse control method. I wrote all my code in MATLAB, interfacing with V-REP. Everything works fine except for the speed of the program. It is so slow that every loop takes about 0.8-1 s to execute, and this delay in turn degrades the performance of the controller. So my question is: will I experience the same speed issue if I write my programs in Python? Thank you!
Is V-REP good for swarm robotics simulation (fewer than 15 robots in the scene)? I want to create an academic project implementing some SI algorithms. If I add more robots to the scene, what should my strategy for writing the code be? Could you please make a video on that using the Python remote API?
No, it's "bad" at swarm robotics simulations. In theory you can dial down the dynamics solver to improve simulation speed, but of all the robot simulation software I've used, V-REP is the most resource-heavy, and that will be your main limitation with swarm robotics.
Hello, good morning. First of all, thank you very much for all the V-REP tutorials. I would like to ask whether the integration of Python with V-REP makes it possible to call Python libraries such as OpenCV and use them to process information from the camera in V-REP.
Hello, in this video I showed you how to export the image from V-REP into Python and store it in a NumPy variable. You can absolutely use OpenCV in Python to process these variables and do any computation with them. I recommend that you look at some OpenCV Python tutorials that show how to work with NumPy arrays.
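As a small illustration of the handoff described above: `simxGetVisionSensorImage` returns a flat list of pixel values plus the sensor resolution, and turning that into an OpenCV-friendly array is one reshape. The 2x2 "sensor" below is simulated data standing in for the remote API call, so the shapes can be seen without a running simulator.

```python
import numpy as np

# simxGetVisionSensorImage returns (errorCode, resolution, image), where
# image is a flat list of RGB values. A 2x2 sensor is simulated here.
resolution = [2, 2]
image = [255, 0, 0,   0, 255, 0,
         0, 0, 255,   255, 255, 255]  # four RGB pixels

# Force uint8 (some client versions return signed values) and reshape to
# (rows, cols, channels), the layout OpenCV functions expect.
img = np.array(image, dtype=np.uint8).reshape(resolution[1], resolution[0], 3)
print(img.shape)  # (2, 2, 3)
```

From there, `cv2.cvtColor(img, cv2.COLOR_RGB2BGR)` gives OpenCV's channel order, and any OpenCV operation (thresholding, blob detection, etc.) applies directly.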
+Nikolai K. I would like to read the image data from a camera (not a vision sensor). However, I can't find any remote API functions for it. Any suggestions? Thanks
Hello Andrei, I have not had a chance to use C++ with V-Rep, but it is possible. There is a bit more information in the following link: www.coppeliarobotics.com/helpFiles/en/remoteApiClientSide.htm I hope this helps.
Hello Nikolai, I am having some problems importing vrep into Spyder; I think there is a problem with the remote API bindings, so I would appreciate your help with my query. Also, I am using Windows 10 Home.
@@gregbaker8971 You probably already solved it, but I found this: iakashp.wordpress.com/2017/08/06/the-import-vrep-problemsolved-on-spyder-2-0/ Also, I want to thank both Nikolai and Akash. The tutorial and the instructions provided helped me a lot.
@@rodrigolobo6351 Great to have the answer there for people encountering the same problem! I can't remember exactly how I solved it, but it was probably similar.
Peter Patel Hello Peter. You should take a look at the earlier V-REP tutorials. Select the robot in the model tree and then use the "Object/item shift" or "Object/item rotate" tool. If you want to change the position of a robot's links, expand the model tree and double-click the icon for the actuators (a blue cylinder with a yellow cylinder through it). You can change the position in the new dialog window.
Dear Nikolai, your video was extremely helpful and guided me through my initial steps in simulating the problem, but I'm currently facing an issue and would really like to hear your suggestions. Please follow this link: www.forum.coppeliarobotics.com/viewtopic.php?f=9&t=2901&p=11641#p11641 My best regards
"The final python script and the corresponding v-rep scene file can be downloaded at: www.edisondev.net/VREP/04Pytho..." — this link doesn't work. Please, I need it.