Attention Mechanism with BERT for Content Annotation and Categorization of Pregnancy-Related Questions on a Community Q&A Site

dc.contributor.author: Luo, Xiao
dc.contributor.author: Ding, Haoran
dc.contributor.author: Tang, Matthew
dc.contributor.author: Gandhi, Priyanka
dc.contributor.author: Zhang, Zhan
dc.contributor.author: He, Zhe
dc.contributor.department: Engineering Technology, School of Engineering and Technology
dc.date.accessioned: 2022-07-01T10:44:46Z
dc.date.available: 2022-07-01T10:44:46Z
dc.date.issued: 2020-12
dc.description.abstract: In recent years, the social web has been increasingly used for health information seeking, sharing, and subsequent health-related research. Women often use the Internet or social networking sites to seek information related to pregnancy in different stages. They may ask questions about birth control, trying to conceive, labor, or taking care of a newborn or baby. Classifying different types of questions about pregnancy information (e.g., before, during, and after pregnancy) can inform the design of social media and professional websites for pregnancy education and support. This research aims to investigate the attention mechanisms built into, or added on top of, the BERT model for classifying and annotating pregnancy-related questions posted on a community Q&A site. We evaluated two BERT-based models and compared them against traditional machine learning models for question classification. Most importantly, we investigated two attention mechanisms: the built-in self-attention mechanism of BERT and an additional attention layer on top of BERT for relevant term annotation. The classification performance showed that the BERT-based models worked better than the traditional models, and that BERT with an additional attention layer can achieve higher overall precision than the basic BERT model. The results also showed that the two attention mechanisms annotate relevant content differently, and that they could serve as feature selection methods for text mining in general.
dc.eprint.version: Author's manuscript
dc.identifier.citation: Luo X, Ding H, Tang M, Gandhi P, Zhang Z, He Z. Attention Mechanism with BERT for Content Annotation and Categorization of Pregnancy-Related Questions on a Community Q&A Site. Proceedings (IEEE Int Conf Bioinformatics Biomed). 2020;2020:1077-1081. doi:10.1109/bibm49941.2020.9313379
dc.identifier.uri: https://hdl.handle.net/1805/29465
dc.language.iso: en_US
dc.publisher: IEEE
dc.relation.isversionof: 10.1109/bibm49941.2020.9313379
dc.relation.journal: Proceedings (IEEE Int Conf Bioinformatics Biomed)
dc.rights: Publisher Policy
dc.source: PMC
dc.subject: AI Interpretation
dc.subject: Content Annotation
dc.subject: Consumer's Question Classification
dc.subject: NLP
dc.title: Attention Mechanism with BERT for Content Annotation and Categorization of Pregnancy-Related Questions on a Community Q&A Site
dc.type: Article
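The "additional attention layer on top of BERT" described in the abstract can be illustrated with a minimal, stdlib-only sketch. This is not the authors' implementation: real contextual embeddings would come from BERT, and the `query` vector stands in for learned attention parameters; here both are hypothetical toy values. The sketch shows the core idea, which is that softmax-normalized attention weights both pool token embeddings for classification and indicate which tokens (terms) the model considers relevant.

```python
import math

def softmax(xs):
    # numerically stable softmax: subtract the max before exponentiating
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_pool(token_vecs, query):
    # score each token embedding against a query vector (a stand-in for
    # learned attention parameters), normalize with softmax, and return
    # the weighted sum; the weights themselves can be read as per-token
    # relevance scores for annotation
    scores = [sum(t * q for t, q in zip(vec, query)) for vec in token_vecs]
    weights = softmax(scores)
    pooled = [sum(w * vec[d] for w, vec in zip(weights, token_vecs))
              for d in range(len(token_vecs[0]))]
    return pooled, weights

# toy 3-token sequence with 2-dimensional "contextual embeddings"
# (in the paper's setting these would be BERT outputs)
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
query = [0.5, 0.5]
pooled, weights = attention_pool(tokens, query)
```

In a full model, `pooled` would feed a classification head, while `weights` would be inspected to annotate the question terms the classifier attended to.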
Files

Original bundle
Name: nihms-1668723.pdf
Size: 1.43 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 1.99 KB
Description: Item-specific license agreed upon at submission