Time series analysis and forecasting are essential components of data mining. Time series data are collections of values gathered at regular intervals, such as yearly, monthly, weekly, or daily. By analyzing time series data, we can characterize the changes occurring within a dataset and forecast future trends. Time series prediction has been a research hotspot over the past decade: with the increase in data availability and computing power, many deep learning-based models have emerged, and a variety of model designs have been proposed to address the diversity of time series problems across different domains. Trend forecasting remains an important topic, since the predicted results can provide a basis for applications in many fields, such as the control and optimization of production planning. The Transformer is a neural network architecture that was originally applied to natural language processing. It relies on a set of mechanisms called attention, or self-attention, to detect which elements in a sequence influence and depend on one another. In this study, we use the Transformer model to predict time series data and explore whether its parallel processing can overcome the sequence-length limitations of the long short-term memory (LSTM) model in sequence learning. In addition, we apply different positional encoding mechanisms to inject positional information into the time series, and we discuss how the way these encodings express the positional relationships between time points affects the Transformer's forecasting performance. In Chapter 4, we evaluate each model on five kinds of real-world time series data with different temporal trends.
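For reference, one common baseline among positional encoding mechanisms is the sinusoidal encoding introduced with the original Transformer; the formulation below is included only as an illustrative sketch, and the exact encoding variants compared in this study may differ:

\[
PE_{(pos,\,2i)} = \sin\!\left(\frac{pos}{10000^{\,2i/d_{\text{model}}}}\right),
\qquad
PE_{(pos,\,2i+1)} = \cos\!\left(\frac{pos}{10000^{\,2i/d_{\text{model}}}}\right),
\]

where \(pos\) is the position of a time point in the input sequence, \(i\) indexes the embedding dimension, and \(d_{\text{model}}\) is the model dimension.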