NCU Institutional Repository: Item 987654321/91812


    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/91812


    Title: Design of Transformer Architecture based on different Position Encoding in Time series forecasting
    Authors: 游景翔;You, Ching-Siang
    Contributors: Graduate Institute of Industrial Management
    Keywords: Data mining; Deep learning; Time series; Transformer model
    Date: 2023-07-10
    Issue Date: 2024-09-19 14:14:35 (UTC+8)
    Publisher: National Central University
    Abstract: Time series analysis and forecasting are essential areas of data mining. Time series data are values collected at regular time intervals, such as yearly, monthly, weekly, or daily. By analyzing a time series, we can characterize how the data change and forecast future values. Time series forecasting has been a research focus over the past decade, spurring work across machine learning and artificial intelligence; with growing data availability and computing power, many deep learning-based models have emerged, and the diversity of time series problems across domains has produced many different model designs. Trend forecasting in particular remains an important topic, since its results provide a basis for applications in many fields, such as the control and optimization of production planning.
    The Transformer is a neural network architecture originally proposed for natural language processing. It uses a mechanism called attention, or self-attention, to capture how the elements of a sequence influence and depend on one another. In this study, we apply the Transformer to time series forecasting and investigate whether its parallel computation can overcome the limitations that long short-term memory (LSTM) models face when learning long sequences.
    In addition, we use different positional encoding mechanisms to supply the position of each observation in the sequence and examine how the choice of encoding affects the Transformer's forecasting results. In the experiments (Chapter 4), we evaluate the models on five real-world time series datasets covering different temporal trends.
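    The abstract above describes two mechanisms: self-attention, which lets every position in a sequence attend to every other position in parallel, and positional encoding, which re-injects the order information that attention alone discards. For context only, a minimal NumPy sketch of both follows; it is an illustration, not the author's code. The function names are hypothetical, the sinusoidal formula is the standard one from the original Transformer paper ("Attention Is All You Need"), d_model is assumed even, and the attention shown uses identity projections where a real Transformer would use learned query/key/value weights and multiple heads.

        import numpy as np

        def sinusoidal_positional_encoding(seq_len, d_model):
            # Standard fixed encoding: PE[pos, 2i]   = sin(pos / 10000**(2i/d_model))
            #                          PE[pos, 2i+1] = cos(pos / 10000**(2i/d_model))
            positions = np.arange(seq_len)[:, np.newaxis]               # (seq_len, 1)
            div_terms = np.power(10000.0, np.arange(0, d_model, 2) / d_model)
            angles = positions / div_terms                              # (seq_len, d_model/2)
            pe = np.zeros((seq_len, d_model))
            pe[:, 0::2] = np.sin(angles)   # even dimensions: sine
            pe[:, 1::2] = np.cos(angles)   # odd dimensions: cosine
            return pe

        def softmax(z, axis=-1):
            z = z - z.max(axis=axis, keepdims=True)   # subtract max for numerical stability
            e = np.exp(z)
            return e / e.sum(axis=axis, keepdims=True)

        def self_attention(x):
            # Scaled dot-product self-attention, single head, identity projections.
            # Every output row is a weighted mix of ALL positions, computed at once;
            # this parallelism is the property the abstract contrasts with the
            # LSTM's step-by-step recurrence over long sequences.
            d = x.shape[-1]
            scores = x @ x.T / np.sqrt(d)          # pairwise similarity of positions
            return softmax(scores, axis=-1) @ x    # attention-weighted combination

        # Usage on a toy series: 24 time steps embedded into 16 dimensions.
        T, d = 24, 16
        x = np.random.randn(T, d)                      # stand-in input embeddings
        x = x + sinusoidal_positional_encoding(T, d)   # inject order information
        y = self_attention(x)                          # (24, 16) output

    Because the sinusoidal codes are fixed rather than learned, they generalize to sequence lengths not seen during training; learned position embeddings are the usual alternative, and weighing such alternatives against each other is the kind of comparison the thesis describes.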
    Appears in Collections: [Graduate Institute of Industrial Management] Electronic Thesis & Dissertation

    Files in This Item:

    File          Description    Size    Format
    index.html                   0Kb     HTML


    All items in NCUIR are protected by copyright, with all rights reserved.
