  • Evaluation of Transformer Architectures for Electrical Load Time-Series Forecasting

    Matthias Hertel, Simon Ott, Benjamin Schäfer, Ralf Mikut, Veit Hagenmeyer, Oliver Neumann

    Chapter/contribution from the book: Schulte, H et al. 2022. Proceedings – 32. Workshop Computational Intelligence: Berlin, 1. – 2. Dezember 2022.


    Accurate forecasts of the electrical load are needed to stabilize the electrical
    grid and to maximize the use of renewable energies. Many good forecasting
    methods exist, including neural networks, and we compare them to the recently
    developed Transformers, which are the state-of-the-art machine learning
    technique for many sequence-related tasks. We apply different types of
    Transformers, namely the Time-Series Transformer, the Convolutional
    Self-Attention Transformer and the Informer, to electrical load data from
    Baden-Württemberg. Our results show that the Transformers give up to 11% better
    forecasts than multi-layer perceptrons for long prediction horizons. Furthermore,
    we analyze the Transformers' attention scores to gain insights into the
    models.
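    The chapter's models are not reproduced here, but the attention scores it analyzes come from the scaled dot-product self-attention at the core of every Transformer variant mentioned above. As a minimal, illustrative sketch (a single attention head in NumPy, with a toy hourly load series and randomly initialized projections standing in for trained weights, all of which are assumptions, not the authors' setup):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def scaled_dot_product_attention(q, k, v):
        """Single-head attention: q, k of shape (T, d_k); v of shape (T, d_v)."""
        scores = q @ k.T / np.sqrt(q.shape[-1])
        # row-wise softmax over the key axis -> each row of `weights` sums to 1
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        return weights @ v, weights

    # toy hourly "load" series: a daily sine pattern plus noise (48 hours)
    t = np.arange(48)
    load = np.sin(2 * np.pi * t / 24) + 0.05 * rng.standard_normal(48)
    x = load.reshape(-1, 1)  # (T, 1) input features

    # random projections to a small model dimension (untrained, for illustration)
    d_model = 8
    w_q, w_k, w_v = (rng.standard_normal((1, d_model)) for _ in range(3))
    out, attn = scaled_dot_product_attention(x @ w_q, x @ w_k, x @ w_v)

    print(attn.shape)  # (48, 48): one attention distribution per time step
    ```

    In a trained model, inspecting rows of `attn` shows which past hours each time step attends to, which is the kind of interpretability analysis the abstract refers to.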


    Recommended citation for the chapter/contribution
    Hertel, M et al. 2022. Evaluation of Transformer Architectures for Electrical Load Time-Series Forecasting. In: Schulte, H et al (eds.), Proceedings – 32. Workshop Computational Intelligence: Berlin, 1. – 2. Dezember 2022. Karlsruhe: KIT Scientific Publishing. DOI: https://doi.org/10.58895/ksp/1000151141-6
    License

    This chapter is distributed under the terms of the Creative Commons Attribution-ShareAlike 4.0 license. Copyright is retained by the author(s).

    Peer review information

    This book is peer reviewed. More information on the scientific quality assurance of MAP publications is available from the publisher.

    Further information

    Published on 20 November 2022

    DOI
    https://doi.org/10.58895/ksp/1000151141-6