1 result for "Tripura, Sajib"
Charging stations demand forecasting using LSTM based hybrid transformer model
Accurate forecasting of energy demand for electric vehicles (EVs) is crucial for maintaining the stability and reliability of power systems. By predicting demand over various periods, charging station owners can ensure a continuous energy supply. Medium-term and long-term demand predictions, which extend from a few days to several weeks, help analyze charging demand across different periods based on historical trends. This study proposes a Transformer model that utilizes an LSTM-based encoder-decoder for forecasting the demand at electric vehicle charging stations (EVCS). The proposed model is compared with traditional deep-learning-based LSTM and Transformer models. The research employs open datasets from ACN, including charging data from Caltech and JPL; both datasets are used to train and test the models. Predictions are made for 30, 120, and 240 days ahead, with results compared to actual demand. Performance is evaluated using Mean Absolute Error (MAE) and Mean Squared Error (MSE). Compared to the baseline models, the proposed LSTM-Transformer model shows a significant improvement on Caltech data at the 30-day horizon, lowering MAE by up to 17.27% and MSE by 19.79%. The accuracy improvements are smaller but consistent at the longer horizons (120 and 240 days), with MAE and MSE improvements of up to 5.71% and 4.85%, respectively. On JPL data, the LSTM-Transformer model also shows better accuracy across all horizons, reducing MAE and MSE by up to 24.91% and 23.17% at 30 days, 5.00% and 5.17% at 120 days, and 3.90% and 4.86% at 240 days. The results indicate that the hybrid Transformer model outperforms the baseline models on both datasets, with lower error rates across the 30-, 120-, and 240-day horizons for both the Caltech and JPL charging data.
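The abstract does not specify how the LSTM encoder-decoder and the Transformer attention are wired together, so the sketch below is only one plausible reading, not the authors' implementation: a PyTorch model in which an LSTM encodes the historical demand window, Transformer-style self-attention re-weights the encoded sequence, and an LSTM decoder rolls the attended context out over the forecast horizon. The layer sizes, the 120-day lookback, the 30-day horizon, and the univariate daily-demand input are all illustrative assumptions.

```python
# Minimal sketch of an LSTM encoder-decoder with Transformer-style
# self-attention for EVCS demand forecasting. Hypothetical architecture;
# all hyperparameters are illustrative, not taken from the paper.
import torch
import torch.nn as nn

class LSTMTransformerForecaster(nn.Module):
    def __init__(self, d_model=64, n_heads=4, horizon=30):
        super().__init__()
        self.horizon = horizon
        # LSTM encoder: summarizes the historical demand window.
        self.encoder = nn.LSTM(input_size=1, hidden_size=d_model, batch_first=True)
        # Transformer-style self-attention over the encoded sequence.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # LSTM decoder: rolls the attended context out over the horizon.
        self.decoder = nn.LSTM(input_size=d_model, hidden_size=d_model, batch_first=True)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x):                      # x: (batch, lookback, 1)
        enc, _ = self.encoder(x)               # (batch, lookback, d_model)
        ctx, _ = self.attn(enc, enc, enc)      # self-attention over history
        # Seed each horizon step with the last attended context vector.
        dec_in = ctx[:, -1:, :].repeat(1, self.horizon, 1)
        dec, _ = self.decoder(dec_in)
        return self.head(dec)                  # (batch, horizon, 1)

model = LSTMTransformerForecaster(horizon=30)
x = torch.randn(8, 120, 1)                    # batch of 8, 120-day lookback
pred = model(x)                               # 30-day-ahead forecast

# MAE and MSE, the two metrics the abstract reports:
target = torch.randn(8, 30, 1)                # placeholder ground truth
mae = (pred - target).abs().mean()
mse = ((pred - target) ** 2).mean()
```

Comparing these MAE/MSE values against plain-LSTM and plain-Transformer baselines trained on the same windows would mirror the evaluation setup the abstract describes for the 30-, 120-, and 240-day horizons.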