dc.contributor.author | Luneckas, Mindaugas | |
dc.contributor.author | Luneckas, Tomas | |
dc.contributor.author | Udris, Dainius | |
dc.contributor.author | Plonis, Darius | |
dc.contributor.author | Maskeliūnas, Rytis | |
dc.contributor.author | Damaševičius, Robertas | |
dc.date.accessioned | 2023-09-18T20:34:56Z | |
dc.date.available | 2023-09-18T20:34:56Z | |
dc.date.issued | 2021 | |
dc.identifier.issn | 1861-2776 | |
dc.identifier.other | (SCOPUS_ID)85097164886 | |
dc.identifier.uri | https://etalpykla.vilniustech.lt/handle/123456789/151063 | |
dc.description.abstract | Walking robots are considered a promising solution for locomotion across irregular or rough terrain. While wheeled or tracked robots require a flat surface such as a road or driveway, walking robots can adapt to almost any terrain type. However, overcoming diverse terrain obstacles remains a challenging task even for multi-legged robots with a high number of degrees of freedom. Here, we present a novel obstacle-overcoming method for walking robots based on tactile sensors and a generative recurrent neural network for positional error prediction. Using tactile sensors positioned on the front side of the legs, we demonstrate that a robot can successfully overcome obstacles close to the robot's height in terrains of different complexity. The proposed method can be used by any type of legged machine and can be considered a step toward more advanced walking robot locomotion in unstructured terrain and uncertain environments. | eng |
dc.format | PDF | |
dc.format.extent | p. 9-24 | |
dc.format.medium | tekstas / txt | |
dc.language.iso | eng | |
dc.relation.isreferencedby | Dimensions | |
dc.relation.isreferencedby | Scopus | |
dc.relation.isreferencedby | Professional ProQuest Central | |
dc.relation.isreferencedby | Science Citation Index Expanded (Web of Science) | |
dc.source.uri | https://doi.org/10.1007/s11370-020-00340-9 | |
dc.subject | H600 - Elektronikos ir elektros inžinerija / Electronic and electrical engineering | |
dc.title | A hybrid tactile sensor-based obstacle overcoming method for hexapod walking robots | |
dc.type | Straipsnis Web of Science DB / Article in Web of Science DB | |
dcterms.accessRights | This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. | |
dcterms.license | Creative Commons – Attribution – 4.0 International | |
dcterms.references | 75 | |
dc.type.pubtype | S1 - Straipsnis Web of Science DB / Web of Science DB article | |
dc.contributor.institution | Vilniaus Gedimino technikos universitetas | |
dc.contributor.institution | Vytauto Didžiojo universitetas | |
dc.contributor.institution | Silesian University of Technology | |
dc.contributor.faculty | Elektronikos fakultetas / Faculty of Electronics | |
dc.subject.researchfield | T 001 - Elektros ir elektronikos inžinerija / Electrical and electronic engineering | |
dc.subject.vgtuprioritizedfields | MC0505 - Inovatyvios elektroninės sistemos / Innovative Electronic Systems | |
dc.subject.ltspecializations | L106 - Transportas, logistika ir informacinės ir ryšių technologijos (IRT) / Transport, logistics and information and communication technologies (ICT) | |
dc.subject.en | hexapod robot | |
dc.subject.en | obstacle overcoming | |
dc.subject.en | tactile sensors | |
dc.subject.en | bio-inspired robotics | |
dc.subject.en | recurrent neural network | |
dcterms.sourcetitle | Intelligent service robotics | |
dc.description.issue | iss. 1 | |
dc.description.volume | vol. 14 | |
dc.publisher.name | Springer | |
dc.publisher.city | Heidelberg | |
dc.identifier.other | (SCOPUS_EID)2-s2.0-85097164886 | |
dc.identifier.other | (WOS)000599014900001 | |
dc.identifier.doi | 10.1007/s11370-020-00340-9 | |
dc.identifier.elaba | 77679717 | |