Transfer language space with similar domain adaptation: a case study with hepatocellular carcinoma

Amara Tariq, Omar Kallas, Patricia Balthazar, Scott Jeffery Lee, Terry Desser, Daniel Rubin, Judy Wawira Gichoya, Imon Banerjee

Research output: Contribution to journal › Article › peer-review

Abstract

Background: Transfer learning is common practice in image classification with deep learning, where the available data are often too limited to train a complex model with millions of parameters from scratch. Transferring language models, however, requires special attention, since cross-domain vocabularies (e.g., between two different modalities such as MR and US) do not always overlap, whereas pixel intensity ranges largely overlap across imaging domains.

Method: We present a concept of similar domain adaptation in which we transfer inter-institutional language models (context-dependent and context-independent) between two different modalities (ultrasound and MRI) to capture liver abnormalities.

Results: Using MR and US screening exam reports for hepatocellular carcinoma as the use case, we apply the transfer-language-space strategy to automatically label imaging exams, with and without a structured template, achieving > 0.9 average F1-score.

Conclusion: We conclude that transfer learning, along with fine-tuning of the discriminative model, is often more effective for shared targeted tasks than training a language space from scratch.
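To make the transfer-language-space idea concrete, the sketch below shows one common way to adapt a context-independent embedding (Word2vec, one of the model families named in the keywords) from a source modality to a target modality: load a model pretrained on source-domain (MR) reports, extend its vocabulary with target-domain (US) tokens, and continue training on the target corpus rather than retraining from scratch. This is a minimal illustration assuming gensim and hypothetical file names (mr_word2vec.model, us_reports.txt); it is not the authors' actual pipeline.

```python
# Minimal sketch of similar-domain adaptation for a context-independent
# language model (Word2vec). Paths and hyperparameters are hypothetical.
from gensim.models import Word2Vec
from gensim.utils import simple_preprocess

# Load a Word2vec model pretrained on the source modality (e.g., MR reports).
model = Word2Vec.load("mr_word2vec.model")  # hypothetical checkpoint

# Tokenize the target-modality corpus (e.g., US screening reports),
# one report sentence per line.
with open("us_reports.txt", encoding="utf-8") as f:  # hypothetical corpus
    us_sentences = [simple_preprocess(line) for line in f]

# Extend the source vocabulary with target-domain tokens; update=True keeps
# the pretrained vectors, so the shared language space is transferred
# instead of being re-initialized.
model.build_vocab(us_sentences, update=True)

# Continue training so the embeddings adapt to target-modality usage.
model.train(us_sentences, total_examples=len(us_sentences), epochs=5)

model.save("us_adapted_word2vec.model")
```

The adapted embeddings would then feed a downstream discriminative classifier for labeling exams, which, per the abstract's conclusion, is fine-tuned on the target task rather than trained from scratch.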

Original language: English (US)
Article number: 8
Journal: Journal of Biomedical Semantics
Volume: 13
Issue number: 1
DOIs
State: Published - Dec 2022

Keywords

  • BERT
  • Language model
  • Radiology report
  • Transfer learning
  • Word2vec

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Health Informatics
  • Computer Networks and Communications
