dc.contributor.author | Furmonas, Justas | |
dc.contributor.author | Liobe, John Charles | |
dc.contributor.author | Barzdėnas, Vaidotas | |
dc.date.accessioned | 2023-09-18T16:12:16Z | |
dc.date.available | 2023-09-18T16:12:16Z | |
dc.date.issued | 2022 | |
dc.identifier.issn | 1424-8220 | |
dc.identifier.uri | https://etalpykla.vilniustech.lt/handle/123456789/112328 | |
dc.description.abstract | Event-based cameras have become increasingly commonplace in the commercial space as their performance has continued to increase, to the point where they can exponentially outperform their frame-based counterparts in many applications. However, instantiations of event-based cameras for depth estimation are sparse. After a short introduction detailing the salient differences and features of an event-based camera compared to those of a traditional, frame-based one, this work summarizes the published event-based methods and systems known to date. An analytical review of these methods and systems is performed, justifying the conclusions drawn. This work is concluded with insights and recommendations for further development in the field of event-based camera depth estimation. | eng |
dc.format | PDF | |
dc.format.extent | p. 1-26 | |
dc.format.medium | tekstas / txt | |
dc.language.iso | eng | |
dc.relation.isreferencedby | Science Citation Index Expanded (Web of Science) | |
dc.relation.isreferencedby | Scopus | |
dc.relation.isreferencedby | DOAJ | |
dc.relation.isreferencedby | INSPEC | |
dc.relation.isreferencedby | CABI (abstracts) | |
dc.relation.isreferencedby | Gale's Academic OneFile | |
dc.source.uri | https://www.mdpi.com/1424-8220/22/3/1201 | |
dc.title | Analytical review of event-based camera depth estimation methods and systems | |
dc.type | Straipsnis Web of Science DB / Article in Web of Science DB | |
dcterms.accessRights | This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). | |
dcterms.license | Creative Commons – Attribution – 4.0 International | |
dcterms.references | 66 | |
dc.type.pubtype | S1 - Straipsnis Web of Science DB / Web of Science DB article | |
dc.contributor.institution | Vilniaus Gedimino technikos universitetas | |
dc.contributor.faculty | Elektronikos fakultetas / Faculty of Electronics | |
dc.subject.researchfield | T 001 - Elektros ir elektronikos inžinerija / Electrical and electronic engineering | |
dc.subject.studydirection | E09 - Elektronikos inžinerija / Electronic engineering | |
dc.subject.vgtuprioritizedfields | IK0202 - Išmaniosios signalų apdorojimo ir ryšių technologijos / Smart Signal Processing and Telecommunication Technologies | |
dc.subject.ltspecializations | L106 - Transportas, logistika ir informacinės ir ryšių technologijos (IRT) / Transport, logistic and information and communication technologies | |
dc.subject.en | event-based camera | |
dc.subject.en | neuromorphic | |
dc.subject.en | depth estimation | |
dc.subject.en | monocular | |
dcterms.sourcetitle | Sensors | |
dc.description.issue | iss. 3 | |
dc.description.volume | vol. 22 | |
dc.publisher.name | MDPI | |
dc.publisher.city | Basel | |
dc.identifier.wos | 000755485800001 | |
dc.identifier.doi | 10.3390/s22031201 | |
dc.identifier.elaba | 118606870 | |