Browsing by Author "Rybalskii, Igor"
Now showing 1 - 2 of 2
Item
Augmented reality (AR) for enabling human-robot collaboration with ROS robots (Tartu Ülikool, 2022) Rybalskii, Igor; Tartu Ülikool. Loodus- ja täppisteaduste valdkond; Tartu Ülikool. Tehnoloogiainstituut
With current industrialization trends reintroducing humans into the manufacturing process and with the development of augmented reality, I propose an interface that uses augmented reality to let an operator interact with robotic systems such as manipulators and mobile robots. Two interfaces were created: one for manipulators and one for mobile robots. Both were developed to run on Microsoft HoloLens 2 augmented reality glasses and to work with robots running ROS. The manipulator interface allows the user to control the robot by sending end-effector goals and previewing the resulting goal joint states of the manipulator. The mobile-robot interface allows the user to send navigation goals and preview the robot's movement trajectory. The interfaces were developed in the Unity game engine and tested with a UR5e manipulator and a Robotont mobile robot.
Item
Gesture Detection Software for Human-Robot Collaboration (2020) Rybalskii, Igor; Valner, Robert; Kruusamäe, Karl
As robots become more complex machines with more actions at their disposal, it becomes harder for humans to control them without prior training. I propose a gesture detection system that uses OpenPose and ROS (Robot Operating System) to control mobile robotic platforms. The output of OpenPose is normalized into a joint-angle form, which is also used to describe gestures in the system. The proposed normalization method, combined with the ability to change the described gestures in a separate YAML configuration file, makes the system scalable: a developer can add, remove, or modify gestures described in the angle notation. The developed system detects static gestures and was tested on three sets of 5 gestures each, used to control a Clearpath Jackal mobile robot.
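The abstract does not give implementation details of how the AR interface transmits navigation goals. As a minimal sketch of the idea, assuming the standard ROS convention of expressing a planar goal as a position plus a yaw angle encoded as a quaternion (the form carried by a geometry_msgs/PoseStamped message), a goal could be assembled as follows; the function names and dictionary layout are illustrative, not taken from the thesis:

```python
import math

def yaw_to_quaternion(yaw):
    """Convert a planar heading (radians) into the (x, y, z, w) quaternion
    that a ROS Pose orientation field expects for rotation about the z-axis."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

def make_nav_goal(x, y, yaw, frame="map"):
    """Build a dictionary mirroring the fields of a PoseStamped navigation
    goal; a real interface would publish this through rospy or actionlib."""
    qx, qy, qz, qw = yaw_to_quaternion(yaw)
    return {
        "header": {"frame_id": frame},
        "pose": {
            "position": {"x": x, "y": y, "z": 0.0},
            "orientation": {"x": qx, "y": qy, "z": qz, "w": qw},
        },
    }

# A goal 1.5 m forward, 0.5 m to the right, facing 90 degrees left.
goal = make_nav_goal(1.5, -0.5, math.pi / 2)
print(goal["pose"]["orientation"])
```

The quaternion step is the only non-obvious part: ROS stores all orientations as quaternions, so even a purely 2D navigation goal picked on an AR display has to be converted before publishing.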
In Estonian (translated): Robots are becoming technically ever more complex and can carry out an ever-growing range of tasks, which also makes controlling them increasingly difficult for humans. This thesis builds a body-language-based system that uses the OpenPose and ROS software libraries to control a mobile robot platform. The OpenPose output is normalized into an angle representation, in which the usable gestures are also described. The resulting system is scalable, because a robot-system developer can add, modify, and remove gestures in normalized form as needed in a YAML configuration file. To demonstrate the finished solution, three different 5-gesture sets were implemented and used to control a Clearpath Jackal mobile robot.
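The gesture abstract describes two mechanisms: normalizing OpenPose keypoints into joint angles, and matching those angles against gesture templates kept in a YAML file. A minimal sketch of both, with the angle computed from three 2D keypoints; the gesture names, template values, and tolerance are illustrative assumptions, not the thesis's actual gesture set (which would live in the YAML configuration file rather than a Python dictionary):

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at keypoint b formed by keypoints a and c,
    e.g. the elbow angle from shoulder-elbow-wrist pixel coordinates.
    This normalization makes the gesture independent of scale and position."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    ang = abs(math.degrees(math.atan2(v1[1], v1[0]) - math.atan2(v2[1], v2[0])))
    return 360.0 - ang if ang > 180.0 else ang

# Gesture templates in angle notation (hypothetical example entries;
# the thesis stores these in a separate YAML configuration file).
GESTURES = {
    "arm_extended": {"right_elbow": 180.0},
    "arm_bent": {"right_elbow": 90.0},
}

def match_gesture(angles, gestures, tol=20.0):
    """Return the first gesture whose every template angle is within
    `tol` degrees of the measured angle, or None if nothing matches."""
    for name, template in gestures.items():
        if all(abs(angles.get(joint, 1e9) - target) <= tol
               for joint, target in template.items()):
            return name
    return None

# A nearly straight arm: shoulder, elbow, wrist roughly collinear.
measured = {"right_elbow": joint_angle((0, 0), (1, 0), (2, 0.05))}
print(match_gesture(measured, GESTURES))  # -> arm_extended
```

Because the matcher only reads a mapping from joint names to target angles, swapping the dictionary for one loaded from YAML is what makes gestures editable without touching the detection code.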