Human-robot interaction based on gestures for service robots

dc.contributor.author de Sousa, P en
dc.contributor.author Esteves, T en
dc.contributor.author Campos, D en
dc.contributor.author Duarte, F en
dc.contributor.author Santos, J en
dc.contributor.author Leão, J en
dc.contributor.author Xavier, J en
dc.contributor.author de Matos, L en
dc.contributor.author Camarneiro, M en
dc.contributor.author Penas, M en
dc.contributor.author Miranda, M en
dc.contributor.author Silva, R en
dc.contributor.author Neves, AJR en
dc.contributor.author Luís Filipe Teixeira en
dc.date.accessioned 2018-01-13T12:23:45Z
dc.date.available 2018-01-13T12:23:45Z
dc.date.issued 2018 en
dc.description.abstract Gesture recognition is very important for human-robot interfaces. In this paper, we present a novel depth-based method for gesture recognition to improve the interaction with a service robot, an autonomous shopping cart mostly used by people with reduced mobility. In the proposed solution, identification of the user is already implemented by the software present on the robot, which extracts a bounding box focused on the user. Based on the analysis of the depth histogram, the distance from the user to the robot is calculated and the user is segmented from the background. Then, a region growing algorithm is applied to remove all other objects in the image. A threshold technique is applied again to the original image to obtain all the objects in front of the user. Intersecting the threshold-based segmentation result with the region growing result, we obtain candidate objects for the user's arms. After applying a labelling algorithm to obtain each object individually, Principal Component Analysis is computed for each one to obtain its centre and orientation. Using that information, we intersect the silhouette of the arm with a line; the upper point of the intersection indicates the hand position. A Kalman filter is then applied to track the hand, and gesture recognition is performed using state machines that describe the gestures (Start, Stop, Pause). We tested the proposed approach in a real-case scenario with different users and obtained an accuracy of around 89.7%. © 2018, Springer International Publishing AG. en
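
To make the pipeline in the abstract concrete, here is a minimal Python/OpenCV sketch (not the authors' code) of the depth-histogram distance estimate, the threshold segmentation, and the per-object PCA that yields each candidate arm's centre and orientation. The function names, the 300 mm margin, the minimum blob size, and the assumption of a depth image in millimetres with the user's bounding box already extracted are all illustrative choices, not taken from the paper.

```python
import cv2
import numpy as np

def user_distance(depth_roi):
    """Estimate the user's distance as the dominant peak of the depth histogram."""
    valid = depth_roi[depth_roi > 0]               # ignore missing depth readings
    hist, edges = np.histogram(valid, bins=256)
    peak = int(np.argmax(hist))
    return 0.5 * (edges[peak] + edges[peak + 1])   # bin centre of the peak

def segment_user(depth_roi, distance, margin=300.0):
    """Keep pixels within +/- margin (mm, assumed unit) of the user's distance."""
    mask = (depth_roi > distance - margin) & (depth_roi < distance + margin)
    return mask.astype(np.uint8)

def candidate_arm_axes(mask, min_pixels=50):
    """Label each candidate object and compute its centre and principal axis
    with PCA, as the abstract describes for the arm candidates."""
    n_labels, labels = cv2.connectedComponents(mask)
    axes = []
    for lbl in range(1, n_labels):                 # label 0 is the background
        ys, xs = np.nonzero(labels == lbl)
        if len(xs) < min_pixels:                   # drop tiny blobs (assumed filter)
            continue
        pts = np.column_stack([xs, ys]).astype(np.float64)
        centre = pts.mean(axis=0)
        cov = np.cov((pts - centre).T)
        eigvals, eigvecs = np.linalg.eigh(cov)
        axis = eigvecs[:, int(np.argmax(eigvals))]  # dominant orientation
        axes.append((centre, axis))
    return axes
```

The hand-tracking and gesture steps could look like the following sketch: a standard constant-velocity Kalman filter built with OpenCV's cv2.KalmanFilter, followed by a small state machine. The abstract does not give the actual gesture definitions for Start/Stop/Pause, so the transition condition used here (hand held above a reference line for a number of frames) is purely a placeholder assumption.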
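```python
import cv2
import numpy as np

def make_hand_tracker():
    """Constant-velocity Kalman filter over the hand position (x, y, vx, vy)."""
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.eye(2, 4, dtype=np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
    return kf

class GestureStateMachine:
    """Illustrative only: the paper's Start/Stop/Pause gesture definitions are
    not in the abstract, so 'hand raised for hold_frames frames' is assumed."""
    def __init__(self, hold_frames=15):
        self.state = "idle"
        self.count = 0
        self.hold_frames = hold_frames

    def update(self, hand_y, reference_y):
        raised = hand_y < reference_y             # image y grows downwards
        self.count = self.count + 1 if raised else 0
        if self.state == "idle" and self.count >= self.hold_frames:
            self.state = "start"
        elif self.state == "start" and self.count == 0:
            self.state = "idle"
        return self.state

# Per-frame usage: predict, then correct with the detected hand position.
# kf = make_hand_tracker()
# prediction = kf.predict()
# kf.correct(np.array([[x], [y]], np.float32))
```
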
dc.identifier.uri http://repositorio.inesctec.pt/handle/123456789/6016
dc.identifier.uri http://dx.doi.org/10.1007/978-3-319-68195-5_76 en
dc.language eng en
dc.relation 4357 en
dc.rights info:eu-repo/semantics/openAccess en
dc.title Human-robot interaction based on gestures for service robots en
dc.type article en
dc.type Publication en
Files
Original bundle
Name: P-00N-5VX.pdf
Size: 1.96 MB
Format: Adobe Portable Document Format