NCU Institutional Repository (中大機構典藏): theses and dissertations, past exam papers, journal articles, and research projects. Item 987654321/84012


    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/84012


    Title: Enhanced Model Agnostic Meta Learning with Meta Gradient Memory
    Authors: 劉旻融;Liu, Ming-Rong
    Contributors: Department of Information Management
    Keywords: Deep Learning; Machine Learning; Meta-Learning; Continual Learning
    Date: 2020-07-16
    Issue Date: 2020-09-02 17:55:01 (UTC+8)
    Publisher: National Central University
    Abstract: Today's deep learning models often need thousands or even tens of thousands of training samples to reach a reasonable level of accuracy, and learning classes the model has never seen before usually requires retraining it. These practical demands have drawn growing attention to fields such as meta-learning and continual learning. Meta-learning is known for its flexible model adaptation, but the high instability of its training process makes its performance unreliable; continual learning, by contrast, is highly stable, yet that stability limits the number of tasks it can learn. This thesis therefore combines meta-learning and continual learning, two families of algorithms that excel at few-shot learning, using continual learning to improve the stability of meta-learning while using meta-learning to improve the learning flexibility of continual learning. Prior deep learning research has identified the so-called stability-plasticity dilemma, in which these two kinds of performance usually trade off against each other and cannot both be obtained; in the experiments of this study, however, the proposed model improves test accuracy and validation accuracy at the same time on datasets commonly used for few-shot learning. ;Recently, the importance of the few-shot learning field has increased markedly, and a variety of well-known learning methods, such as meta-learning and continual learning, have been proposed to address it. Their main purpose is to train a model with only a small amount of data while maintaining high generalization ability. MAML, an elegant and effective meta-learning method, demonstrates strong performance in Omniglot and Mini-ImageNet N-way K-shot classification experiments. However, recent research points out the instability of MAML's performance as well as architectural problems in related models. On the other hand, continual learning models usually face catastrophic forgetting when they must learn new tasks while retaining knowledge of previous tasks. Therefore, we propose En-MAML, a method based on the MAML framework, which combines the flexible adaptation of meta-learning with the stability of continual learning. We evaluate our model on the Omniglot and Mini-ImageNet datasets, following the N-way K-shot experiment protocol. Our experimental results show that the model achieves higher accuracy and stability on both Omniglot and Mini-ImageNet.
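
    The abstract describes MAML-style episodic training under the N-way K-shot protocol. Since the full text is not reproduced in this record, the sketch below is only a minimal first-order MAML (FOMAML) example in PyTorch for context; it is not the author's En-MAML or its meta-gradient memory, and the toy task generator, network, and hyperparameters are hypothetical stand-ins for the Omniglot/Mini-ImageNet setup.

    # Minimal first-order MAML (FOMAML) sketch for N-way K-shot episodes in PyTorch.
    # NOT the thesis's En-MAML; synthetic tasks stand in for Omniglot / Mini-ImageNet.
    import copy
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    N_WAY, K_SHOT, Q_QUERY = 5, 1, 15          # episode protocol named in the abstract
    INNER_LR, OUTER_LR, INNER_STEPS = 0.4, 1e-3, 1

    def sample_task(dim=64):
        """Hypothetical task generator: random class prototypes plus Gaussian noise."""
        protos = torch.randn(N_WAY, dim)
        def make(per_class):
            x = protos.repeat_interleave(per_class, 0) + 0.1 * torch.randn(N_WAY * per_class, dim)
            y = torch.arange(N_WAY).repeat_interleave(per_class)
            return x, y
        return make(K_SHOT), make(Q_QUERY)      # (support set, query set)

    model = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, N_WAY))
    meta_opt = torch.optim.Adam(model.parameters(), lr=OUTER_LR)

    for step in range(100):                      # outer (meta) loop
        meta_opt.zero_grad()
        for _ in range(4):                       # meta-batch of tasks
            (xs, ys), (xq, yq) = sample_task()
            learner = copy.deepcopy(model)       # task-specific fast weights
            for _ in range(INNER_STEPS):         # inner loop: adapt on the support set
                loss = F.cross_entropy(learner(xs), ys)
                grads = torch.autograd.grad(loss, list(learner.parameters()))
                with torch.no_grad():
                    for p, g in zip(learner.parameters(), grads):
                        p -= INNER_LR * g
            query_loss = F.cross_entropy(learner(xq), yq)
            # First-order approximation: apply the adapted learner's query-set
            # gradient directly to the meta-parameters (no second-order terms).
            grads = torch.autograd.grad(query_loss, list(learner.parameters()))
            for p, g in zip(model.parameters(), grads):
                p.grad = g if p.grad is None else p.grad + g
        meta_opt.step()
        if step % 20 == 0:
            print(f"step {step}: last query loss {query_loss.item():.3f}")

    Full MAML would differentiate through the inner-loop updates (a second-order term); the first-order variant above is a common approximation and is shown only to make the episodic support/query structure concrete.
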
    Appears in Collections:[Graduate Institute of Information Management] Electronic Thesis & Dissertation

    Files in This Item:

    File         Description    Size    Format    Views
    index.html                  0Kb     HTML      126


    All items in NCUIR are protected by copyright, with all rights reserved.

