Please use this identifier to cite or link to this item: https://doi.org/10.21256/zhaw-19637
Publication type: Conference paper
Type of review: Peer review (publication)
Title: Entity matching with transformer architectures - a step forward in data integration
Authors: Brunner, Ursin
Stockinger, Kurt
et al.: No
DOI: 10.5441/002/edbt.2020.58
10.21256/zhaw-19637
Proceedings: Proceedings of EDBT 2020
Page(s): 463
Pages to: 473
Conference details: 23rd International Conference on Extending Database Technology, Copenhagen, 30 March - 2 April 2020
Issue Date: Mar-2020
Publisher / Ed. Institution: OpenProceedings
ISBN: 978-3-89318-083-7
Language: English
Subjects: Entity matching; Data integration; Machine learning; Neural networks; Transformers; BERT
Subject (DDC): 006: Special computer methods
Abstract: Transformer architectures have proven to be very effective and provide state-of-the-art results in many natural language tasks. The attention-based architecture, in combination with pre-training on large amounts of text, led to the recent breakthrough and a variety of slightly different implementations. In this paper we analyze how well four of the most recent attention-based transformer architectures (BERT, XLNet, RoBERTa and DistilBERT) perform on the task of entity matching - a crucial part of data integration. Entity matching (EM) is the task of finding data instances that refer to the same real-world entity. It is a challenging task if the data instances consist of long textual data or if they are "dirty" due to misplaced values. To evaluate the capability of transformer architectures and transfer learning on the task of EM, we empirically compare the four approaches on inherently difficult data sets. We show that transformer architectures outperform classical deep learning methods in EM by an average margin of 27.5%.
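For illustration, the following is a minimal sketch of how EM can be framed as transformer-based sequence-pair classification, in the spirit of the setup the abstract describes. The record serialization scheme, the bert-base-uncased checkpoint, and the example records are assumptions for demonstration, not the authors' exact configuration; any of the four architectures could be substituted via the model name.

```python
# Sketch: entity matching as sequence-pair classification with a pretrained
# transformer. The serialization and checkpoint below are illustrative
# assumptions, not the paper's exact setup.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-uncased"  # assumption: XLNet/RoBERTa/DistilBERT are drop-in

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

def serialize(record: dict) -> str:
    """Flatten a record's attribute/value pairs into one string (illustrative)."""
    return " ".join(f"{k}: {v}" for k, v in record.items())

a = {"title": "iPhone 11 64GB black", "price": "699"}
b = {"title": "Apple iPhone 11 (64 GB) - black", "price": "699.00"}

# Encode the two records as one sequence pair: [CLS] record_a [SEP] record_b [SEP]
inputs = tokenizer(serialize(a), serialize(b), truncation=True, return_tensors="pt")

model.eval()
with torch.no_grad():
    logits = model(**inputs).logits

# Class 1 = "match". The classification head is randomly initialized here and
# gives arbitrary scores until fine-tuned on labeled record pairs with a
# standard cross-entropy objective.
match_prob = torch.softmax(logits, dim=-1)[0, 1].item()
print(f"P(match) = {match_prob:.3f}")
```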
URI: https://digitalcollection.zhaw.ch/handle/11475/19637
Fulltext version: Published version
License (according to publishing contract): CC BY-NC-ND 4.0: Attribution - Non commercial - No derivatives 4.0 International
Departement: School of Engineering
Organisational Unit: Institute of Computer Science (InIT)
Appears in collections:Publikationen School of Engineering

Files in This Item:
File: Entity_Machting_with_Transformers_edbt_2020__Camera_Ready.pdf
Description: Entity Matching with Transformers EDBT 2020
Size: 1.12 MB
Format: Adobe PDF
Brunner, U., & Stockinger, K. (2020). Entity matching with transformer architectures - a step forward in data integration [Conference paper]. Proceedings of EDBT 2020, 463–473. https://doi.org/10.5441/002/edbt.2020.58
Brunner, U. and Stockinger, K. (2020) ‘Entity matching with transformer architectures - a step forward in data integration’, in Proceedings of EDBT 2020. OpenProceedings, pp. 463–473. Available at: https://doi.org/10.5441/002/edbt.2020.58.
U. Brunner and K. Stockinger, “Entity matching with transformer architectures - a step forward in data integration,” in Proceedings of EDBT 2020, Mar. 2020, pp. 463–473. doi: 10.5441/002/edbt.2020.58.
BRUNNER, Ursin and Kurt STOCKINGER, 2020. Entity matching with transformer architectures - a step forward in data integration. In: Proceedings of EDBT 2020. Conference paper. OpenProceedings. March 2020. pp. 463–473. ISBN 978-3-89318-083-7
Brunner, Ursin, and Kurt Stockinger. 2020. “Entity Matching with Transformer Architectures - a Step Forward in Data Integration.” Conference paper. In Proceedings of EDBT 2020, 463–73. OpenProceedings. https://doi.org/10.5441/002/edbt.2020.58.
Brunner, Ursin, and Kurt Stockinger. “Entity Matching with Transformer Architectures - a Step Forward in Data Integration.” Proceedings of EDBT 2020, OpenProceedings, 2020, pp. 463–73, https://doi.org/10.5441/002/edbt.2020.58.

