NCU Institutional Repository: Item 987654321/86529


    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/86529


    Title: Aspect-based Sentiment Analysis with Component Focusing Multi-head Coattention Networks
    Authors: Liao, Yuan-Yu (廖源昱)
    Contributors: Department of Information Management
    Keywords: deep learning; neural networks; sentiment analysis; BERT
    Date: 2021-06-29
    Issue Date: 2021-12-07 12:56:41 (UTC+8)
    Publisher: National Central University
    Abstract: Aspect-based Sentiment Analysis (ABSA) aims to predict the sentiment polarity of a specific target in a text. Most earlier work on this task encoded word embeddings with RNNs; more recently, researchers have used attention mechanisms to learn the relationship between the context and the target, but multi-word targets and the use of average pooling remain problematic in many of these studies. This thesis proposes the Component Focusing Multi-head Coattention Network (CF-MCAN), a model with three modules: extended context, component focusing, and multi-head coattention, to address these problems. The extended context lets BERT's capabilities be better exploited on the ABSA task. Component focusing raises the weights of adjectives and adverbs in the context, correcting the shortcoming of average pooling, which treats every word as equally important. The multi-head coattention network learns the important words within a multi-word target before learning the context representation, and allows one sequence to attend directly to another sequence. Experiments on three datasets, compared against prior work, demonstrate the effectiveness of the proposed model.
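
    Note on the model: the abstract describes the modules in prose only. Below is a minimal PyTorch sketch, under stated assumptions, of two of the ideas: multi-head coattention between the context and the target, and component-focused pooling that up-weights adjective/adverb tokens instead of averaging. The class names, the use of nn.MultiheadAttention in both directions, the focus_weight of 2.0, and the toy shapes are illustrative assumptions, not the thesis's actual implementation.

        import torch
        import torch.nn as nn

        class MultiHeadCoattention(nn.Module):
            """Cross-attention between a context sequence and a target sequence,
            so multi-word targets are summarized by learned attention rather
            than average pooling. (Illustrative sketch, not the thesis code.)"""
            def __init__(self, hidden_dim: int, num_heads: int = 8):
                super().__init__()
                self.target_to_context = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
                self.context_to_target = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)

            def forward(self, context, target):
                # context: (batch, ctx_len, dim); target: (batch, tgt_len, dim)
                # 1) Each target word gathers evidence from the context, so the
                #    important words of a multi-word target stand out first.
                target_aware, _ = self.target_to_context(query=target, key=context, value=context)
                # 2) Each context word then attends to the enriched target words,
                #    letting one sequence attend directly to the other sequence.
                context_aware, _ = self.context_to_target(query=context, key=target_aware, value=target_aware)
                return context_aware

        def component_focused_pool(hidden, component_mask, focus_weight=2.0):
            """Weighted pooling that up-weights adjective/adverb tokens
            (component_mask == 1) relative to all other tokens, instead of
            treating every token as equally important like average pooling."""
            # hidden: (batch, seq_len, dim); component_mask: float (batch, seq_len)
            weights = 1.0 + (focus_weight - 1.0) * component_mask
            weights = weights / weights.sum(dim=1, keepdim=True)   # normalize per sentence
            return (hidden * weights.unsqueeze(-1)).sum(dim=1)     # (batch, dim)

        # Toy usage with random tensors standing in for BERT hidden states:
        ctx = torch.randn(2, 20, 768)          # 20 sentence tokens
        tgt = torch.randn(2, 3, 768)           # a 3-token aspect target
        mask = torch.zeros(2, 20)
        mask[:, 5] = 1.0                       # pretend token 5 is an adjective
        fused = MultiHeadCoattention(768)(ctx, tgt)      # (2, 20, 768)
        pooled = component_focused_pool(fused, mask)     # (2, 768)

    In practice the component mask would come from a part-of-speech tagger; it is hand-set here purely for illustration.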
    Appears in Collections:[Graduate Institute of Information Management] Electronic Thesis & Dissertation

    Files in This Item:

        File: index.html    Size: 0Kb    Format: HTML


    All items in NCUIR are protected by copyright, with all rights reserved.

