Detailed Information


Tabular Transfer Learning via Prompting LLMs

Full metadata record
dc.contributor.author: Yun, Sukmin (윤석민)
dc.date.accessioned: 2025-01-21T01:30:22Z
dc.date.available: 2025-01-21T01:30:22Z
dc.date.issued: 2024-09
dc.identifier.uri: https://scholarworks.bwise.kr/erica/handle/2021.sw.erica/122025
dc.description.abstract: Learning with a limited number of labeled data is a central problem in real-world applications of machine learning, as it is often expensive to obtain annotations. To deal with the scarcity of labeled data, transfer learning is a conventional approach; it suggests learning transferable knowledge by training a neural network on multiple other sources. In this paper, we investigate transfer learning for tabular tasks, which has been less studied and less successful in the literature than in other domains, e.g., vision and language. This is because tables are inherently heterogeneous, i.e., they contain different columns and feature spaces, making transfer learning difficult. On the other hand, recent advances in natural language processing suggest that the label scarcity issue can be mitigated by utilizing the in-context learning capability of large language models (LLMs). Inspired by this, and by the fact that LLMs can also process tables within a unified language space, we ask whether LLMs can be effective for tabular transfer learning, in particular under scenarios where the source and target datasets are of different formats. As a positive answer, we propose a novel tabular transfer learning framework, coined Prompt to Transfer (P2T), that utilizes unlabeled (or heterogeneous) source data with LLMs. Specifically, P2T identifies a column feature in a source dataset that is strongly correlated with a target task feature to create examples relevant to the target task, thus creating pseudo-demonstrations for prompts. Experimental results demonstrate that P2T outperforms previous methods on various tabular learning benchmarks, showing good promise for the important, yet underexplored, tabular transfer learning problem. Code is available at https://github.com/jaehyun513/P2T.
dc.format.extent: 18
dc.language: English
dc.language.iso: ENG
dc.publisher: COLM Organizing Committee
dc.title: Tabular Transfer Learning via Prompting LLMs
dc.type: Article
dc.identifier.doi: 10.48550/arXiv.2408.11063
dc.identifier.bibliographicCitation: Conference on Language Modeling (COLM), pp. 1-18
dc.citation.title: Conference on Language Modeling (COLM)
dc.citation.startPage: 1
dc.citation.endPage: 18
dc.type.docType: Proceeding
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: foreign
dc.identifier.url: https://arxiv.org/abs/2408.11063
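
The mechanism the abstract describes (pick a source column strongly correlated with the target task feature, then serialize source rows as pseudo-demonstrations in a prompt) can be illustrated with a short Python sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: the plain Pearson correlation over a shared row index, the `select_transfer_column` and `build_prompt` helpers, and the "column = value" serialization format are all hypothetical.

```python
# Minimal sketch of the pseudo-demonstration idea; all helper names and the
# correlation-based column selection are illustrative assumptions, not P2T itself.
import pandas as pd

def select_transfer_column(source: pd.DataFrame, target_feature: pd.Series) -> str:
    """Pick the numeric source column most correlated (in absolute value)
    with the target task feature. Assumes a shared row index for simplicity."""
    numeric = source.select_dtypes("number")
    return numeric.corrwith(target_feature).abs().idxmax()

def build_prompt(source: pd.DataFrame, transfer_col: str,
                 query_row: pd.Series, n_demos: int = 4) -> str:
    """Serialize source rows as pseudo-demonstrations, then append the query
    row for the LLM to complete in-context."""
    lines = []
    for _, row in source.head(n_demos).iterrows():
        feats = ", ".join(f"{c} = {row[c]}" for c in source.columns if c != transfer_col)
        lines.append(f"Given {feats}, {transfer_col} is {row[transfer_col]}.")
    feats = ", ".join(f"{c} = {v}" for c, v in query_row.items())
    lines.append(f"Given {feats}, {transfer_col} is")
    return "\n".join(lines)

if __name__ == "__main__":
    # Toy source table and a target feature aligned row-by-row, for illustration only.
    source = pd.DataFrame({"age": [25, 40, 33],
                           "bmi": [22.1, 30.5, 27.0],
                           "risk": [0.1, 0.8, 0.4]})
    target_feature = pd.Series([0.2, 0.9, 0.5])
    col = select_transfer_column(source, target_feature)  # -> "risk" on this toy data
    print(build_prompt(source, col, pd.Series({"age": 51, "bmi": 29.3})))
```

The sketch stops at prompt construction: in the paper the resulting prompt would be passed to an LLM for in-context prediction, and the querying interface is model-specific.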
Files in This Item: Go to Link
Appears in Collections: COLLEGE OF COMPUTING > DEPARTMENT OF ARTIFICIAL INTELLIGENCE > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Yun, Sukmin
ERICA College of Computing (DEPARTMENT OF ARTIFICIAL INTELLIGENCE)
