Hi,
I tried to reproduce the results reported in Table 9 for PEMS08 with prediction lengths of 48 and 96, using your multivariate_forecasting/PEMS/iTransformer_08.sh script.
However, the MAE and MSE I obtained are significantly higher than the values given in the paper. Do you have any idea why this might be the case? Did you use different hyperparameters from those in the script to achieve the reported results?
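For reference, this is how I invoked the script (run from the repository root; adjust the path if the script lives under a scripts/ directory in your checkout):

```bash
# Invocation used for both runs reported below; the script itself was not modified.
bash multivariate_forecasting/PEMS/iTransformer_08.sh
```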
PEMS08, pred_len=48:
Args in experiment:
Namespace(is_training=1, model_id='PEMS08_96_48', model='iTransformer', data='PEMS', root_path='./dataset/PEMS/', data_path='PEMS08.npz', features='M', target='OT', freq='h', checkpoints='./checkpoints/', seq_len=96, label_len=48, pred_len=48, enc_in=170, dec_in=170, c_out=170, d_model=512, n_heads=8, e_layers=4, d_layers=1, d_ff=512, moving_avg=25, factor=1, distil=True, dropout=0.1, embed='timeF', activation='gelu', output_attention=False, do_predict=False, num_workers=10, itr=1, train_epochs=10, batch_size=16, patience=3, learning_rate=0.001, des='Exp', loss='MSE', lradj='type1', use_amp=False, use_gpu=True, gpu=0, use_multi_gpu=False, devices='0,1,2,3', exp_name='MTSF', channel_independence=False, inverse=False, class_strategy='projection', target_root_path='./data/electricity/', target_data_path='electricity.csv', efficient_training=False, use_norm=0, partial_start_index=0)
mse:0.2378038763999939, mae:0.2811351418495178
Diff: MSE: +0.051, MAE: +0.046
PEMS08, pred_len=96:
Args in experiment:
Namespace(is_training=1, model_id='PEMS08_96_96', model='iTransformer', data='PEMS', root_path='./dataset/PEMS/', data_path='PEMS08.npz', features='M', target='OT', freq='h', checkpoints='./checkpoints/', seq_len=96, label_len=48, pred_len=96, enc_in=170, dec_in=170, c_out=170, d_model=512, n_heads=8, e_layers=4, d_layers=1, d_ff=512, moving_avg=25, factor=1, distil=True, dropout=0.1, embed='timeF', activation='gelu', output_attention=False, do_predict=False, num_workers=10, itr=1, train_epochs=10, batch_size=16, patience=3, learning_rate=0.001, des='Exp', loss='MSE', lradj='type1', use_amp=False, use_gpu=True, gpu=0, use_multi_gpu=False, devices='0,1,2,3', exp_name='MTSF', channel_independence=False, inverse=False, class_strategy='projection', target_root_path='./data/electricity/', target_data_path='electricity.csv', efficient_training=False, use_norm=0, partial_start_index=0)
mse:0.3024812638759613, mae:0.3287292718887329
Diff: MSE: +0.081, MAE: +0.061
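For completeness, here is my reconstruction of the two training commands implied by the Namespace dumps above. This is only a sketch: I am assuming the repository's run.py entry point and that the argparse flag names match the Namespace fields one-to-one, so it may differ slightly from what iTransformer_08.sh actually executes.

```bash
# Sketch of the two runs, reconstructed from the printed arguments above.
# Assumes run.py is the entry point and that flag names mirror the Namespace
# fields; all values are taken verbatim from the dumps (note use_norm=0).
for pred_len in 48 96; do
  python -u run.py \
    --is_training 1 \
    --root_path ./dataset/PEMS/ \
    --data_path PEMS08.npz \
    --model_id PEMS08_96_${pred_len} \
    --model iTransformer \
    --data PEMS \
    --features M \
    --seq_len 96 \
    --pred_len ${pred_len} \
    --e_layers 4 \
    --enc_in 170 \
    --dec_in 170 \
    --c_out 170 \
    --d_model 512 \
    --d_ff 512 \
    --batch_size 16 \
    --learning_rate 0.001 \
    --use_norm 0 \
    --des 'Exp' \
    --itr 1
done
```

If the settings you used for the Table 9 runs differ from any of these, that would likely explain the gap.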