Image processing and vision techniques for smart vehicles
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ul Haq, Ehsan | - |
dc.contributor.author | Hussain, Pirzada Syed Jahanzeb | - |
dc.contributor.author | Piao, Jingchun | - |
dc.contributor.author | Yu, Teng | - |
dc.contributor.author | Shin, Hyunchul | - |
dc.date.accessioned | 2021-06-23T09:43:13Z | - |
dc.date.available | 2021-06-23T09:43:13Z | - |
dc.date.issued | 2012-05 | - |
dc.identifier.issn | 0271-4302 | - |
dc.identifier.issn | 2158-1525 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/36165 | - |
dc.description.abstract | The idea of safe and smart vehicles has been thoroughly researched over the past decades to ensure drivers' safety in potentially dangerous situations. This paper presents a brief review of different applications of image processing and computer vision techniques in smart vehicles. To detect other on-road vehicles, researchers have approached the problem from various angles, with solutions ranging from active sensors, such as radar, to passive sensors, such as cameras. Recently, researchers have been working to create a panoramic 360-degree view of the vehicle's environment by merging images from the sides, rear, and front of the car using passive sensors. There has also been work on constructing high-resolution images from low-cost, low-resolution cameras to reduce the final cost of the system. In this paper, we present a new algorithm for mono-camera-based vehicle detection systems that incorporates both low-level features (edges) and high-level features (bag-of-features). To extract edge information reliably, we present a new edge detection method, namely Difference of BiGaussian (DoBG). Experimental results show an average recognition rate of 98.5%, which is among the best results achieved so far. © 2012 IEEE. | - |
dc.format.extent | 4 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | IEEE | - |
dc.title | Image processing and vision techniques for smart vehicles | - |
dc.type | Article | - |
dc.publisher.location | United States | - |
dc.identifier.doi | 10.1109/ISCAS.2012.6271453 | - |
dc.identifier.scopusid | 2-s2.0-84866634804 | - |
dc.identifier.wosid | 000316903701108 | - |
dc.identifier.bibliographicCitation | 2012 IEEE International Symposium on Circuits and Systems (ISCAS), pp 1211 - 1214 | - |
dc.citation.title | 2012 IEEE International Symposium on Circuits and Systems (ISCAS) | - |
dc.citation.startPage | 1211 | - |
dc.citation.endPage | 1214 | - |
dc.type.docType | Conference Paper | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
dc.subject.keywordPlus | RECOGNITION | - |
dc.subject.keywordPlus | SYMMETRY | - |
dc.subject.keywordAuthor | Recognition rates | - |
dc.subject.keywordAuthor | High-level features | - |
dc.subject.keywordAuthor | Vision technique | - |
dc.subject.keywordAuthor | Research | - |
dc.subject.keywordAuthor | Vehicles | - |
dc.subject.keywordAuthor | Active sensor | - |
dc.subject.keywordAuthor | Image processing and computer vision | - |
dc.subject.keywordAuthor | Edge detection methods | - |
dc.subject.keywordAuthor | Edge information | - |
dc.subject.keywordAuthor | High resolution image | - |
dc.subject.keywordAuthor | Dangerous situations | - |
dc.subject.keywordAuthor | Sensors | - |
dc.subject.keywordAuthor | Low costs | - |
dc.subject.keywordAuthor | Computer vision | - |
dc.identifier.url | https://ieeexplore.ieee.org/document/6271453/ | - |
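The abstract above describes a Difference of BiGaussian (DoBG) edge detector. The paper's exact formulation is not given in this record, but the name suggests a close relative of the classic difference-of-Gaussians filter: subtract two Gaussian-blurred copies of the image (a narrow and a wide kernel) and threshold the magnitude. The sketch below is a generic difference-of-Gaussians edge map, not the authors' DoBG; the sigma values and threshold are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(sigma):
    """1-D Gaussian kernel truncated at 3*sigma, normalized to sum to 1."""
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def blur(img, sigma):
    """Separable Gaussian blur: convolve each row, then each column."""
    k = gaussian_kernel(sigma)
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    out = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)
    return out

def dog_edges(img, sigma_narrow=1.0, sigma_wide=2.0, thresh=0.02):
    """Difference-of-Gaussians edge map: |G_narrow * I - G_wide * I| > thresh.

    The narrow blur preserves the step transition better than the wide blur,
    so their difference peaks near intensity edges and is ~0 in flat regions.
    """
    diff = blur(img, sigma_narrow) - blur(img, sigma_wide)
    return np.abs(diff) > thresh
```

For a synthetic 32x32 image with a vertical step edge at column 16, `dog_edges` flags pixels around the step while leaving the flat interior untouched; in a vehicle-detection pipeline such an edge map would feed the low-level feature stream described in the abstract.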