Scalable transfer learning framework for capturing human perceptions of place through visual-aural data integration
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Le, Quang Hoai | - |
dc.contributor.author | Dinh, Nguyen Ngoc Han | - |
dc.contributor.author | Kim, Byeol | - |
dc.contributor.author | Ahn, Yonghan | - |
dc.date.accessioned | 2025-07-30T05:00:26Z | - |
dc.date.available | 2025-07-30T05:00:26Z | - |
dc.date.issued | 2025-11 | - |
dc.identifier.issn | 0264-2751 | - |
dc.identifier.issn | 1873-6084 | - |
dc.identifier.uri | https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/126223 | - |
dc.description.abstract | Understanding how people perceive urban environments is critical to creating sustainable, engaging, and inclusive cities, particularly amid rapid, economically driven urban expansion. Although various methods of measuring Human Perception of Place (HPP) have been developed by adopting computer vision and street-view images, these approaches are solely vision-based and neglect the influence of other senses on human subjective perception, introducing visual bias. Additionally, the limited generalizability of predictive models poses a challenge when applying them across diverse urban contexts. In response, this study proposes a scalable framework for capturing HPP using a transfer learning-based Feedforward Neural Network (FNN) combined with cross-modal techniques that integrate both visual and auditory data. Leveraging the Place Pulse dataset, the proposed models incorporate visual-aural experiences to mitigate visual bias and achieve improved prediction accuracy. The results indicate that the proposed approach significantly outperforms traditional tree-based and margin-based regression models, achieving an average R² improvement of 27% over Gradient Boosted Regression Trees (GBRT) and offering stronger alignment with public consensus. These findings also highlight how architectural diversity, active street life, and vibrant soundscapes positively influence perceptions of beauty, liveliness, and wealth. Conversely, areas with high traffic and chaotic noise are often perceived as less safe, despite their vibrancy. This research underscores the value of multisensory data in capturing the complexity of human place perception and provides practical guidance for urban planners and policymakers, supporting the design of data-driven, human-centered planning strategies that foster livability and well-being in diverse urban settings. © 2025 Elsevier Ltd | - |
dc.format.extent | 16 | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | Elsevier Ltd | - |
dc.title | Scalable transfer learning framework for capturing human perceptions of place through visual-aural data integration | - |
dc.type | Article | - |
dc.publisher.location | United Kingdom | - |
dc.identifier.doi | 10.1016/j.cities.2025.106286 | - |
dc.identifier.scopusid | 2-s2.0-105011069587 | - |
dc.identifier.wosid | 001538648500003 | - |
dc.identifier.bibliographicCitation | Cities, v.166, pp. 1-16 | - |
dc.citation.title | Cities | - |
dc.citation.volume | 166 | - |
dc.citation.startPage | 1 | - |
dc.citation.endPage | 16 | - |
dc.type.docType | Article | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | ssci | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Urban Studies | - |
dc.relation.journalWebOfScienceCategory | Urban Studies | - |
dc.subject.keywordPlus | URBAN | - |
dc.subject.keywordPlus | SOUNDSCAPE | - |
dc.subject.keywordPlus | LANDSCAPE | - |
dc.subject.keywordPlus | ENVIRONMENTS | - |
dc.subject.keywordPlus | QUALITIES | - |
dc.subject.keywordAuthor | Deep learning | - |
dc.subject.keywordAuthor | Environmental psychology | - |
dc.subject.keywordAuthor | Place perception | - |
dc.subject.keywordAuthor | Street-view image | - |
dc.subject.keywordAuthor | Urban analytics | - |
dc.subject.keywordAuthor | Urban soundscape | - |
dc.identifier.url | https://www.sciencedirect.com/science/article/pii/S0264275125005876?pes=vor&utm_source=scopus&getft_integrator=scopus | - |
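The abstract describes a transfer-learning pipeline in which pretrained visual and aural encoders feed a feedforward regression head over fused embeddings. Below is a minimal, hypothetical PyTorch sketch of that cross-modal fusion pattern; the embedding dimensions, layer sizes, and the class name `VisualAuralFNN` are illustrative assumptions, not the authors' published architecture.

```python
# Minimal sketch (assumptions, not the paper's code): frozen pretrained
# encoders are presumed to produce fixed-length visual and aural
# embeddings, which a feedforward head maps to a single HPP score.
import torch
import torch.nn as nn

class VisualAuralFNN(nn.Module):
    def __init__(self, visual_dim=512, aural_dim=128, hidden_dim=256):
        super().__init__()
        # Feedforward regression head over the fused embedding.
        self.head = nn.Sequential(
            nn.Linear(visual_dim + aural_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(0.2),
            nn.Linear(hidden_dim, 1),  # one perception score, e.g. "safe" or "lively"
        )

    def forward(self, visual_emb, aural_emb):
        # Early fusion: concatenate the two modality embeddings.
        fused = torch.cat([visual_emb, aural_emb], dim=-1)
        return self.head(fused)

# Usage with placeholder embeddings standing in for frozen encoders:
model = VisualAuralFNN()
visual_emb = torch.randn(8, 512)   # e.g. from a pretrained street-view CNN
aural_emb = torch.randn(8, 128)    # e.g. from a pretrained audio embedding model
scores = model(visual_emb, aural_emb)  # shape: (8, 1)
```

Only the lightweight head is trained here, which reflects the transfer-learning framing in the abstract: reusing pretrained encoders keeps the approach scalable across cities where labeled perception data is scarce.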