An Efficient Object Augmentation Scheme for Supporting Pervasiveness in a Mobile Augmented Reality
- Authors
- Jang, Sung-Bong; Ko, Young-Woong
- Issue Date
- Oct-2020
- Publisher
- KOREA INFORMATION PROCESSING SOC
- Keywords
- Augmented Object Similarity; Context Awareness; Mobile Augmented Reality; Object Augmentation
- Citation
- JOURNAL OF INFORMATION PROCESSING SYSTEMS, v.16, no.5, pp. 1214-1222
- Journal Title
- JOURNAL OF INFORMATION PROCESSING SYSTEMS
- Volume
- 16
- Number
- 5
- Start Page
- 1214
- End Page
- 1222
- URI
- https://scholarworks.bwise.kr/kumoh/handle/2020.sw.kumoh/18508
- DOI
- 10.3745/JIPS.04.0192
- ISSN
- 1976-913X
- Abstract
- Pervasive augmented reality (AR) technology can be used to efficiently search for required information about products in stores through text augmentation in an Internet of Things (IoT) environment. The evolution of context awareness and image processing technologies is the main driving force behind this type of AR service. One of the problems to be addressed in such a service is that augmented objects are fixed and cannot be replaced efficiently in real time. To address this problem, a real-time mobile AR framework is proposed. In this framework, the optimal object to be augmented is selected based on object similarity comparison, and augmented objects are efficiently managed using distributed metadata servers to adapt to user requirements in a given situation. To evaluate the feasibility of the proposed framework, a prototype system was implemented and a qualitative evaluation based on questionnaires was conducted. The experimental results show that the proposed framework provides a better user experience than existing smartphone features and that, through fast AR service, users can conveniently obtain additional information about products or objects.