Detailed Information


User generated interactive audio-visual communication system using Braille method

Authors
Yang, J.; Jeong, J.; Kim, K.; Legrady, G.
Issue Date
2014
Publisher
International Information Institute Ltd.
Keywords
Audio-visual communication; Braille; Interactive media art; Sensor network
Citation
Information (Japan), v.17, no.9B, pp.4391 - 4398
Journal Title
Information (Japan)
Volume
17
Number
9B
Start Page
4391
End Page
4398
URI
http://scholarworks.bwise.kr/ssu/handle/2018.sw.ssu/11037
ISSN
1343-4500
Abstract
Technological advances in digital media have diversified user-generated content in audio-visual communication as well as in artistic expression. User-generated content can help maximize interaction with the user through various methods of communication. The most effective interactive communication is achieved by integrating the user-generated experience with physical reaction at both the perceptual and the mechanical level. The user's physical experience through the sense of touch is not yet common practice in art or in HCI [1]. This research focuses on designing an interactive media artwork that uses the Braille method as a user-generated audio-visual communication system, providing a synesthetic platform where users can create and share their texts and sounds with others through audiovisual and tactile senses. Users can control and create new communication patterns in this system, beyond a traditional single-medium conversation. This experimental work also proposes the potential use of integrated sensory elements to overcome the limits of interaction between the artwork and users, including impaired persons, because generating texts and sounds with the Braille method can ease the communication barriers between visually or hearing impaired persons and non-impaired persons. The proposed system consists of Braille recognition modules, which act as dialogue boxes according to user participation. Each module is composed of six input LED devices, a CLCD (character LCD) display, and a PC for data processing and sensor networking. The audio-visual communication between two modules is implemented with an MFC program and a ZigBee network, which collect incoming information, transmit text and audio data in real time, and allow users, including impaired persons, within a short distance (50 m) of one another to chat in a single space. © 2014 International Information Institute.
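
For illustration only (this sketch is not from the paper): the six input devices described above imply a mapping from a sensed 6-dot Braille cell to a character. A minimal C++ sketch of that decoding step follows, assuming the standard 6-dot Braille letter patterns; the names kBraille and decodeCell, and the bitmask input format, are hypothetical, and the actual sensor reading, CLCD output, and ZigBee transmission are omitted.

#include <array>
#include <cstddef>
#include <cstdint>
#include <iostream>

// Standard 6-dot Braille patterns for 'a'..'z'.
// Dot i of the cell maps to bit (i - 1): dot 1 = bit 0, ..., dot 6 = bit 5.
constexpr std::array<uint8_t, 26> kBraille = {
    0b000001, 0b000011, 0b001001, 0b011001, 0b010001,  // a b c d e
    0b001011, 0b011011, 0b010011, 0b001010, 0b011010,  // f g h i j
    0b000101, 0b000111, 0b001101, 0b011101, 0b010101,  // k l m n o
    0b001111, 0b011111, 0b010111, 0b001110, 0b011110,  // p q r s t
    0b100101, 0b100111, 0b111010, 0b101101, 0b111101,  // u v w x y
    0b110101                                           // z
};

// Translate one sensed dot pattern into a letter; '?' if unrecognized.
char decodeCell(uint8_t mask) {
    for (std::size_t i = 0; i < kBraille.size(); ++i)
        if (kBraille[i] == mask) return static_cast<char>('a' + i);
    return '?';
}

int main() {
    uint8_t sensed = 0b000011;                // dots 1 and 2 raised
    std::cout << decodeCell(sensed) << '\n';  // prints: b
}

In the described system, the decoded character would then be shown on the module's CLCD display and forwarded as text (and synthesized audio) to the paired module over the ZigBee network.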
Appears in Collections
College of Information Technology > Global School of Media > 1. Journal Articles



Related Researcher

Kim, Kyu jung
College of Information Technology (Global School of Media)
