bandgap: iComformer training log on JARVIS dft_3d (target: mbj_bandgap)

/home/ubuntu/.conda/envs/comformer1/bin/python /home/ubuntu/sophia/code/ComFormer/comformer/scripts/train_jarvis.py 
INFO: Pandarallel will run on 16 workers.
INFO: Pandarallel will use Memory file system to transfer data between the main process and workers.
fatal: not a git repository (or any of the parent directories): .git
name='iComformer' conv_layers=4 edge_layers=1 atom_input_features=92 edge_features=256 triplet_input_features=256 node_features=256 fc_layers=1 fc_features=256 output_features=1 node_layer_head=1 edge_layer_head=1 nn_based=False link='identity' zero_inflated=False use_angle=False angle_lattice=False classification=False
{'dataset': 'dft_3d', 'target': 'mbj_bandgap', 'epochs': 500, 'batch_size': 64, 'weight_decay': 1e-05, 'learning_rate': 0.001, 'criterion': 'mse', 'optimizer': 'adamw', 'scheduler': 'onecycle', 'save_dataloader': False, 'pin_memory': False, 'write_predictions': True, 'num_workers': 4, 'classification_threshold': None, 'atom_features': 'cgcnn', 'model': {'name': 'iComformer', 'use_angle': False}, 'cutoff': 4.0, 'max_neighbors': 25, 'matrix_input': False, 'pyg_input': True, 'use_lattice': True, 'use_angle': False, 'neighbor_strategy': 'k-nearest', 'output_dir': 'yourpath-jarvis'}
config:
{'atom_features': 'cgcnn',
 'batch_size': 64,
 'classification_threshold': None,
 'criterion': 'mse',
 'cutoff': 4.0,
 'dataset': 'dft_3d',
 'distributed': False,
 'epochs': 500,
 'filename': 'sample',
 'id_tag': 'jid',
 'keep_data_order': False,
 'learning_rate': 0.001,
 'log_tensorboard': False,
 'matrix_input': False,
 'max_neighbors': 25,
 'model': {'angle_lattice': False,
           'atom_input_features': 92,
           'classification': False,
           'conv_layers': 4,
           'edge_features': 256,
           'edge_layer_head': 1,
           'edge_layers': 1,
           'fc_features': 256,
           'fc_layers': 1,
           'link': 'identity',
           'name': 'iComformer',
           'nn_based': False,
           'node_features': 256,
           'node_layer_head': 1,
           'output_features': 1,
           'triplet_input_features': 256,
           'use_angle': False,
           'zero_inflated': False},
 'n_early_stopping': None,
 'n_test': None,
 'n_train': None,
 'n_val': None,
 'neighbor_strategy': 'k-nearest',
 'num_workers': 4,
 'optimizer': 'adamw',
 'output_dir': 'yourpath-jarvis',
 'pin_memory': False,
 'progress': True,
 'pyg_input': True,
 'random_seed': 123,
 'save_dataloader': False,
 'scheduler': 'onecycle',
 'standard_scalar_and_pca': False,
 'store_outputs': True,
 'target': 'mbj_bandgap',
 'target_multiplication_factor': None,
 'test_ratio': 0.1,
 'train_ratio': 0.8,
 'use_angle': False,
 'use_canonize': True,
 'use_lattice': True,
 'val_ratio': 0.1,
 'version': 'NA',
 'warmup_steps': 2000,
 'weight_decay': 1e-05,
 'write_checkpoint': True,
 'write_predictions': True}
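For reference, the configuration printed above can be captured as a plain Python dict and written to disk so the run is easy to reproduce later. This is only an illustrative sketch (a subset of the fields is shown and the file name is arbitrary); the training script itself prints these values at startup.

import json

# Snapshot of the run configuration printed above (subset shown).
config = {
    "dataset": "dft_3d",
    "target": "mbj_bandgap",
    "epochs": 500,
    "batch_size": 64,
    "learning_rate": 0.001,
    "weight_decay": 1e-05,
    "optimizer": "adamw",
    "scheduler": "onecycle",
    "cutoff": 4.0,
    "max_neighbors": 25,
    "train_ratio": 0.8,
    "val_ratio": 0.1,
    "test_ratio": 0.1,
    "model": {"name": "iComformer", "use_angle": False},
    "output_dir": "yourpath-jarvis",
}

with open("mbj_bandgap_config.json", "w") as f:
    json.dump(config, f, indent=2)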
Obtaining 3D dataset 55k ...
Reference:https://www.nature.com/articles/s41524-020-00440-1
Loading the zipfile...
Loading completed.
data range 27.501 0.0
./saved_data/train_datatest_graph_angle.pkl
graphs not saved
mean 1.488970 std 2.316133
normalize using training mean 1.488970 and std 2.316053
warning: could not load CGCNN features for 103
Setting it to max atomic number available here, 103
warning: could not load CGCNN features for 101
Setting it to max atomic number available here, 103
warning: could not load CGCNN features for 102
Setting it to max atomic number available here, 103
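The warnings above come from elements (Z = 101-103) that have no precomputed CGCNN feature vector; the loader appears to fall back to the vector of the highest atomic number it does have. A hypothetical sketch of that fallback, where cgcnn_features stands for a dict keyed by atomic number (as in CGCNN's atom_init.json):

def atom_feature(z, cgcnn_features):
    # cgcnn_features: dict {atomic_number: feature_vector}; hypothetical name.
    if z in cgcnn_features:
        return cgcnn_features[z]
    z_fallback = max(cgcnn_features)  # highest atomic number with features
    print(f"warning: could not load CGCNN features for {z}")
    print(f"Setting it to max atomic number available here, {z_fallback}")
    return cgcnn_features[z_fallback]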
data range 23.901 0.0
./saved_data/val_datatest_graph_angle.pkl
graphs not saved
mean 1.575177 std 2.411932
normalize using training mean 1.488970 and std 2.316053
data range 14.498 0.0
./saved_data/test_datatest_graph_angle.pkl
graphs not saved
mean 1.488127 std 2.255386
normalize using training mean 1.488970 and std 2.316053
n_train: 14537
n_val: 1817
n_test: 1817
std train: 2.316053234484043
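The split sizes and the repeated "normalize using training mean 1.488970 and std 2.316053" lines follow the usual pattern: an 80/10/10 split of the 18,171 labelled samples, with validation and test targets standardized by the training statistics. A small sketch of that arithmetic (the exact rounding in the original split code may differ slightly):

import numpy as np

n_total = 14537 + 1817 + 1817          # 18171 entries with an mbj_bandgap value
n_train = int(round(0.8 * n_total))    # train_ratio = 0.8  -> 14537
n_val = int(round(0.1 * n_total))      # val_ratio  = 0.1  ->  1817

mean_train, std_train = 1.488970, 2.316053

def normalize(y):
    # Standardize targets with the *training* mean/std, as in the log.
    return (np.asarray(y) - mean_train) / std_train

def denormalize(y_norm):
    # Map model outputs back to eV.
    return np.asarray(y_norm) * std_train + mean_train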
I am using the correct version of matformer
I am using the correct version of matformer
I am using the correct version of matformer
I am using the correct version of matformer
I am using the invariant version of EPCNet
config:
{'angle_lattice': False,
 'atom_input_features': 92,
 'classification': False,
 'conv_layers': 4,
 'edge_features': 256,
 'edge_layer_head': 1,
 'edge_layers': 1,
 'fc_features': 256,
 'fc_layers': 1,
 'link': 'identity',
 'name': 'iComformer',
 'nn_based': False,
 'node_features': 256,
 'node_layer_head': 1,
 'output_features': 1,
 'triplet_input_features': 256,
 'use_angle': False,
 'zero_inflated': False}
using warmup_steps= 2000
using config.epochs= 500
using steps_per_epoch= 227
using pct_start= 0.01762114537444934
using warmup_steps= 2000
using config.epochs= 500
using steps_per_epoch= 227
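The "using pct_start=" line is the warmup fraction handed to PyTorch's OneCycleLR: pct_start = warmup_steps / (epochs * steps_per_epoch) = 2000 / (500 * 227) ≈ 0.01762. A minimal sketch of that scheduler setup, assuming this relation; the Linear layer only stands in for the actual iComformer model:

import torch

model = torch.nn.Linear(92, 1)  # placeholder; the real model is iComformer
optimizer = torch.optim.AdamW(model.parameters(), lr=0.001, weight_decay=1e-05)

warmup_steps, epochs, steps_per_epoch = 2000, 500, 227
pct_start = warmup_steps / (epochs * steps_per_epoch)  # 0.01762114537444934

scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer,
    max_lr=0.001,
    epochs=epochs,
    steps_per_epoch=steps_per_epoch,
    pct_start=pct_start,
)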
Total size: 18.967243194580078
Total trainable parameter number 4966497
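Those two numbers are the standard PyTorch parameter statistics; a sketch of how they can be reproduced for any instantiated model (4,966,497 float32 parameters is roughly 19 MB, consistent with the reported size up to buffers and rounding):

import torch

def summarize(model: torch.nn.Module) -> None:
    # Trainable parameter count and in-memory parameter size in MB.
    n_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
    size_mb = sum(p.numel() * p.element_size() for p in model.parameters()) / 1024 ** 2
    print(f"Total size: {size_mb}")
    print(f"Total trainable parameter number {n_params}")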
Val_MAE: 1.2240
Train_MAE: -1.0000
Val_MAE: 1.0492
Train_MAE: -1.0000
Val_MAE: 2.1772
Train_MAE: -1.0000
Val_MAE: 1.0166
Train_MAE: -1.0000
Val_MAE: 1.3406
Train_MAE: -1.0000
Val_MAE: 0.8557
Train_MAE: -1.0000
Val_MAE: 1.2122
Train_MAE: -1.0000
Val_MAE: 1.0970
Train_MAE: -1.0000
Val_MAE: 0.8284
Train_MAE: -1.0000
Val_MAE: 0.8024
Train_MAE: -1.0000
Val_MAE: 0.7196
Train_MAE: -1.0000
Val_MAE: 0.7849
Train_MAE: -1.0000
Val_MAE: 0.8244
Train_MAE: -1.0000
Val_MAE: 0.8461
Train_MAE: -1.0000
Val_MAE: 0.7600
Train_MAE: -1.0000
Val_MAE: 1.1451
Train_MAE: -1.0000
Val_MAE: 0.8315
Train_MAE: -1.0000
Val_MAE: 0.7019
Train_MAE: -1.0000
Val_MAE: 0.7286
Train_MAE: -1.0000
Val_MAE: 1.9904
Train_MAE: 2.0265
Val_MAE: 0.6003
Train_MAE: -1.0000
Val_MAE: 1.2508
Train_MAE: -1.0000
Val_MAE: 0.8776
Train_MAE: -1.0000
Val_MAE: 0.8328
Train_MAE: -1.0000
Val_MAE: 0.6651
Train_MAE: -1.0000
Val_MAE: 0.7533
Train_MAE: -1.0000
Val_MAE: 0.8580
Train_MAE: -1.0000
Val_MAE: 0.7522
Train_MAE: -1.0000
Val_MAE: 1.6394
Train_MAE: -1.0000
Val_MAE: 0.7082
Train_MAE: -1.0000
Val_MAE: 0.8715
Train_MAE: -1.0000
Val_MAE: 0.7707
Train_MAE: -1.0000
Val_MAE: 0.6321
Train_MAE: -1.0000
Val_MAE: 4.4257
Train_MAE: -1.0000
Val_MAE: 0.6674
Train_MAE: -1.0000
Val_MAE: 1.1278
Train_MAE: -1.0000
Val_MAE: 0.8835
Train_MAE: -1.0000
Val_MAE: 0.5918
Train_MAE: -1.0000
Val_MAE: 0.5902
Train_MAE: -1.0000
Val_MAE: 0.5047
Train_MAE: 0.4957
Val_MAE: 0.6685
Train_MAE: -1.0000
Val_MAE: 0.6429
Train_MAE: -1.0000
Val_MAE: 0.7329
Train_MAE: -1.0000
Val_MAE: 0.8386
Train_MAE: -1.0000
Val_MAE: 1.3391
Train_MAE: -1.0000
Val_MAE: 0.5376
Train_MAE: -1.0000
Val_MAE: 0.9498
Train_MAE: -1.0000
Val_MAE: 1.0388
Train_MAE: -1.0000
Val_MAE: 0.6830
Train_MAE: -1.0000
Val_MAE: 0.6249
Train_MAE: -1.0000
Val_MAE: 0.4966
Train_MAE: -1.0000
Val_MAE: 1.1729
Train_MAE: -1.0000
Val_MAE: 2.2782
Train_MAE: -1.0000
Val_MAE: 0.5182
Train_MAE: -1.0000
Val_MAE: 1.0050
Train_MAE: -1.0000
Val_MAE: 0.5378
Train_MAE: -1.0000
Val_MAE: 0.8314
Train_MAE: -1.0000
Val_MAE: 0.4942
Train_MAE: -1.0000
Val_MAE: 0.6140
Train_MAE: -1.0000
Val_MAE: 1.4123
Train_MAE: 1.4165
Val_MAE: 0.6597
Train_MAE: -1.0000
Val_MAE: 0.9247
Train_MAE: -1.0000
Val_MAE: 0.5674
Train_MAE: -1.0000
Val_MAE: 0.7040
Train_MAE: -1.0000
Val_MAE: 0.7444
Train_MAE: -1.0000
Val_MAE: 0.4622
Train_MAE: -1.0000
Val_MAE: 0.4839
Train_MAE: -1.0000
Val_MAE: 0.5652
Train_MAE: -1.0000
Val_MAE: 0.7197
Train_MAE: -1.0000
Val_MAE: 0.5672
Train_MAE: -1.0000
Val_MAE: 0.5957
Train_MAE: -1.0000
Val_MAE: 0.7538
Train_MAE: -1.0000
Val_MAE: 0.4880
Train_MAE: -1.0000
Val_MAE: 0.5459
Train_MAE: -1.0000
Val_MAE: 0.4612
Train_MAE: -1.0000
Val_MAE: 0.5337
Train_MAE: -1.0000
Val_MAE: 0.4502
Train_MAE: -1.0000
Val_MAE: 0.5817
Train_MAE: -1.0000
Val_MAE: 0.4450
Train_MAE: -1.0000
Val_MAE: 0.7260
Train_MAE: 0.6565
Val_MAE: 0.4891
Train_MAE: -1.0000
Val_MAE: 0.4943
Train_MAE: -1.0000
Val_MAE: 0.5229
Train_MAE: -1.0000
Val_MAE: 0.5270
Train_MAE: -1.0000
Val_MAE: 0.4447
Train_MAE: -1.0000
Val_MAE: 0.3924
Train_MAE: -1.0000
Val_MAE: 0.4388
Train_MAE: -1.0000
Val_MAE: 0.4873
Train_MAE: -1.0000
Val_MAE: 0.5768
Train_MAE: -1.0000
Val_MAE: 0.4631
Train_MAE: -1.0000
Val_MAE: 0.6652
Train_MAE: -1.0000
Val_MAE: 0.4087
Train_MAE: -1.0000
Val_MAE: 0.9597
Train_MAE: -1.0000
Val_MAE: 0.7891
Train_MAE: -1.0000
Val_MAE: 0.4720
Train_MAE: -1.0000
Val_MAE: 0.4385
Train_MAE: -1.0000
Val_MAE: 0.3996
Train_MAE: -1.0000
Val_MAE: 0.3968
Train_MAE: -1.0000
Val_MAE: 0.4604
Train_MAE: -1.0000
Val_MAE: 0.4429
Train_MAE: 0.3635
Val_MAE: 0.5964
Train_MAE: -1.0000
Val_MAE: 0.4565
Train_MAE: -1.0000
Val_MAE: 0.7969
Train_MAE: -1.0000
Val_MAE: 1.0990
Train_MAE: -1.0000
Val_MAE: 0.5251
Train_MAE: -1.0000
Val_MAE: 0.3853
Train_MAE: -1.0000
Val_MAE: 0.3769
Train_MAE: -1.0000
Val_MAE: 0.3722
Train_MAE: -1.0000
Val_MAE: 0.3742
Train_MAE: -1.0000
Val_MAE: 0.3923
Train_MAE: -1.0000
Val_MAE: 0.4851
Train_MAE: -1.0000
Val_MAE: 0.8387
Train_MAE: -1.0000
Val_MAE: 0.4214
Train_MAE: -1.0000
Val_MAE: 0.4814
Train_MAE: -1.0000
Val_MAE: 0.3914
Train_MAE: -1.0000
Val_MAE: 0.4295
Train_MAE: -1.0000
Val_MAE: 0.3946
Train_MAE: -1.0000
Val_MAE: 0.4277
Train_MAE: -1.0000
Val_MAE: 0.4255
Train_MAE: -1.0000
Val_MAE: 0.4733
Train_MAE: 0.3504
Val_MAE: 0.3847
Train_MAE: -1.0000
Val_MAE: 0.4788
Train_MAE: -1.0000
Val_MAE: 0.4057
Train_MAE: -1.0000
Val_MAE: 0.4198
Train_MAE: -1.0000
Val_MAE: 0.3888
Train_MAE: -1.0000
Val_MAE: 0.3820
Train_MAE: -1.0000
Val_MAE: 0.3800
Train_MAE: -1.0000
Val_MAE: 0.3877
Train_MAE: -1.0000
Val_MAE: 0.3526
Train_MAE: -1.0000
Val_MAE: 0.3970
Train_MAE: -1.0000
Val_MAE: 0.3941
Train_MAE: -1.0000
Val_MAE: 0.3717
Train_MAE: -1.0000
Val_MAE: 0.3911
Train_MAE: -1.0000
Val_MAE: 0.3725
Train_MAE: -1.0000
Val_MAE: 0.3825
Train_MAE: -1.0000
Val_MAE: 0.4669
Train_MAE: -1.0000
Val_MAE: 0.5892
Train_MAE: -1.0000
Val_MAE: 0.4829
Train_MAE: -1.0000
Val_MAE: 0.3510
Train_MAE: -1.0000
Val_MAE: 0.3572
Train_MAE: 0.2218
Val_MAE: 0.3352
Train_MAE: -1.0000
Val_MAE: 0.4494
Train_MAE: -1.0000
Val_MAE: 0.3501
Train_MAE: -1.0000
Val_MAE: 0.3285
Train_MAE: -1.0000
Val_MAE: 0.3731
Train_MAE: -1.0000
Val_MAE: 0.3527
Train_MAE: -1.0000
Val_MAE: 0.4230
Train_MAE: -1.0000
Val_MAE: 0.3721
Train_MAE: -1.0000
Val_MAE: 0.4066
Train_MAE: -1.0000
Val_MAE: 0.3846
Train_MAE: -1.0000
Val_MAE: 0.3784
Train_MAE: -1.0000
Val_MAE: 0.3303
Train_MAE: -1.0000
Val_MAE: 0.3497
Train_MAE: -1.0000
Val_MAE: 0.3362
Train_MAE: -1.0000
Val_MAE: 0.3196
Train_MAE: -1.0000
Val_MAE: 0.3265
Train_MAE: -1.0000
Val_MAE: 0.3811
Train_MAE: -1.0000
Val_MAE: 0.3575
Train_MAE: -1.0000
Val_MAE: 0.3358
Train_MAE: -1.0000
Val_MAE: 0.3200
Train_MAE: 0.1755
Val_MAE: 0.3274
Train_MAE: -1.0000
Val_MAE: 0.3173
Train_MAE: -1.0000
Val_MAE: 0.4021
Train_MAE: -1.0000
Val_MAE: 0.3652
Train_MAE: -1.0000
Val_MAE: 0.3735
Train_MAE: -1.0000
Val_MAE: 0.3764
Train_MAE: -1.0000
Val_MAE: 0.3768
Train_MAE: -1.0000
Val_MAE: 0.3212
Train_MAE: -1.0000
Val_MAE: 0.3190
Train_MAE: -1.0000
Val_MAE: 0.3144
Train_MAE: -1.0000
Val_MAE: 0.3794
Train_MAE: -1.0000
Val_MAE: 0.5356
Train_MAE: -1.0000
Val_MAE: 0.3381
Train_MAE: -1.0000
Val_MAE: 0.3227
Train_MAE: -1.0000
Val_MAE: 0.3168
Train_MAE: -1.0000
Val_MAE: 0.3025
Train_MAE: -1.0000
Val_MAE: 0.3470
Train_MAE: -1.0000
Val_MAE: 0.3320
Train_MAE: -1.0000
Val_MAE: 0.3495
Train_MAE: -1.0000
Val_MAE: 0.3348
Train_MAE: 0.1594
Val_MAE: 0.4242
Train_MAE: -1.0000
Val_MAE: 0.3333
Train_MAE: -1.0000
Val_MAE: 0.5741
Train_MAE: -1.0000
Val_MAE: 0.4809
Train_MAE: -1.0000
Val_MAE: 0.4468
Train_MAE: -1.0000
Val_MAE: 0.3847
Train_MAE: -1.0000
Val_MAE: 0.3329
Train_MAE: -1.0000
Val_MAE: 0.3027
Train_MAE: -1.0000
Val_MAE: 0.3169
Train_MAE: -1.0000
Val_MAE: 0.3237
Train_MAE: -1.0000
Val_MAE: 0.3081
Train_MAE: -1.0000
Val_MAE: 0.3196
Train_MAE: -1.0000
Val_MAE: 0.2993
Train_MAE: -1.0000
Val_MAE: 0.3231
Train_MAE: -1.0000
Val_MAE: 0.2991
Train_MAE: -1.0000
Val_MAE: 0.2968
Train_MAE: -1.0000
Val_MAE: 0.2951
Train_MAE: -1.0000
Val_MAE: 0.2978
Train_MAE: -1.0000
Val_MAE: 0.3065
Train_MAE: -1.0000
Val_MAE: 0.3158
Train_MAE: 0.1207
Val_MAE: 0.3261
Train_MAE: -1.0000
Val_MAE: 0.3066
Train_MAE: -1.0000
Val_MAE: 0.3438
Train_MAE: -1.0000
Val_MAE: 0.3168
Train_MAE: -1.0000
Val_MAE: 0.3151
Train_MAE: -1.0000
Val_MAE: 0.3180
Train_MAE: -1.0000
Val_MAE: 0.3219
Train_MAE: -1.0000
Val_MAE: 0.3444
Train_MAE: -1.0000
Val_MAE: 0.3876
Train_MAE: -1.0000
Val_MAE: 0.3334
Train_MAE: -1.0000
Val_MAE: 0.3102
Train_MAE: -1.0000
Val_MAE: 0.3134
Train_MAE: -1.0000
Val_MAE: 0.3239
Train_MAE: -1.0000
Val_MAE: 0.3263
Train_MAE: -1.0000
Val_MAE: 0.3100
Train_MAE: -1.0000
Val_MAE: 0.3332
Train_MAE: -1.0000
Val_MAE: 0.2978
Train_MAE: -1.0000
Val_MAE: 0.3292
Train_MAE: -1.0000
Val_MAE: 0.3374
Train_MAE: -1.0000
Val_MAE: 0.2929
Train_MAE: 0.0936
Val_MAE: 0.2978
Train_MAE: -1.0000
Val_MAE: 0.2938
Train_MAE: -1.0000
Val_MAE: 0.3001
Train_MAE: -1.0000
Val_MAE: 0.3043
Train_MAE: -1.0000
Val_MAE: 0.3103
Train_MAE: -1.0000
Val_MAE: 0.3165
Train_MAE: -1.0000
Val_MAE: 0.3574
Train_MAE: -1.0000
Val_MAE: 0.3165
Train_MAE: -1.0000
Val_MAE: 0.3417
Train_MAE: -1.0000
Val_MAE: 0.2968
Train_MAE: -1.0000
Val_MAE: 0.2811
Train_MAE: -1.0000
Val_MAE: 0.2932
Train_MAE: -1.0000
Val_MAE: 0.2842
Train_MAE: -1.0000
Val_MAE: 0.2887
Train_MAE: -1.0000
Val_MAE: 0.2954
Train_MAE: -1.0000
Val_MAE: 0.2862
Train_MAE: -1.0000
Val_MAE: 0.3174
Train_MAE: -1.0000
Val_MAE: 0.3406
Train_MAE: -1.0000
Val_MAE: 0.3200
Train_MAE: -1.0000
Val_MAE: 0.3187
Train_MAE: 0.1304
Val_MAE: 0.3332
Train_MAE: -1.0000
Val_MAE: 0.2963
Train_MAE: -1.0000
Val_MAE: 0.3167
Train_MAE: -1.0000
Val_MAE: 0.2943
Train_MAE: -1.0000
Val_MAE: 0.2825
Train_MAE: -1.0000
Val_MAE: 0.3073
Train_MAE: -1.0000
Val_MAE: 0.3185
Train_MAE: -1.0000
Val_MAE: 0.3363
Train_MAE: -1.0000
Val_MAE: 0.3036
Train_MAE: -1.0000
Val_MAE: 0.2893
Train_MAE: -1.0000
Val_MAE: 0.2825
Train_MAE: -1.0000
Val_MAE: 0.3013
Train_MAE: -1.0000
Val_MAE: 0.3053
Train_MAE: -1.0000
Val_MAE: 0.3192
Train_MAE: -1.0000
Val_MAE: 0.3396
Train_MAE: -1.0000
Val_MAE: 0.2819
Train_MAE: -1.0000
Val_MAE: 0.2761
Train_MAE: -1.0000
Val_MAE: 0.2777
Train_MAE: -1.0000
Val_MAE: 0.2776
Train_MAE: -1.0000
Val_MAE: 0.2726
Train_MAE: 0.0689
Val_MAE: 0.2764
Train_MAE: -1.0000
Val_MAE: 0.2941
Train_MAE: -1.0000
Val_MAE: 0.2730
Train_MAE: -1.0000
Val_MAE: 0.3306
Train_MAE: -1.0000
Val_MAE: 0.2968
Train_MAE: -1.0000
Val_MAE: 0.2890
Train_MAE: -1.0000
Val_MAE: 0.2779
Train_MAE: -1.0000
Val_MAE: 0.3059
Train_MAE: -1.0000
Val_MAE: 0.2845
Train_MAE: -1.0000
Val_MAE: 0.2836
Train_MAE: -1.0000
Val_MAE: 0.2891
Train_MAE: -1.0000
Val_MAE: 0.2962
Train_MAE: -1.0000
Val_MAE: 0.2811
Train_MAE: -1.0000
Val_MAE: 0.3049
Train_MAE: -1.0000
Val_MAE: 0.2950
Train_MAE: -1.0000
Val_MAE: 0.2894
Train_MAE: -1.0000
Val_MAE: 0.2915
Train_MAE: -1.0000
Val_MAE: 0.2825
Train_MAE: -1.0000
Val_MAE: 0.2782
Train_MAE: -1.0000
Val_MAE: 0.2821
Train_MAE: 0.0773
Val_MAE: 0.2807
Train_MAE: -1.0000
Val_MAE: 0.2791
Train_MAE: -1.0000
Val_MAE: 0.3063
Train_MAE: -1.0000
Val_MAE: 0.2832
Train_MAE: -1.0000
Val_MAE: 0.2888
Train_MAE: -1.0000
Val_MAE: 0.3006
Train_MAE: -1.0000
Val_MAE: 0.2856
Train_MAE: -1.0000
Val_MAE: 0.2985
Train_MAE: -1.0000
Val_MAE: 0.2939
Train_MAE: -1.0000
Val_MAE: 0.3038
Train_MAE: -1.0000
Val_MAE: 0.2974
Train_MAE: -1.0000
Val_MAE: 0.2806
Train_MAE: -1.0000
Val_MAE: 0.2760
Train_MAE: -1.0000
Val_MAE: 0.2681
Train_MAE: -1.0000
Val_MAE: 0.2792
Train_MAE: -1.0000
Val_MAE: 0.2845
Train_MAE: -1.0000
Val_MAE: 0.2937
Train_MAE: -1.0000
Val_MAE: 0.2900
Train_MAE: -1.0000
Val_MAE: 0.2769
Train_MAE: -1.0000
Val_MAE: 0.2706
Train_MAE: 0.0609
Val_MAE: 0.2778
Train_MAE: -1.0000
Val_MAE: 0.2804
Train_MAE: -1.0000
Val_MAE: 0.2708
Train_MAE: -1.0000
Val_MAE: 0.2730
Train_MAE: -1.0000
Val_MAE: 0.2789
Train_MAE: -1.0000
Val_MAE: 0.2798
Train_MAE: -1.0000
Val_MAE: 0.2735
Train_MAE: -1.0000
Val_MAE: 0.2672
Train_MAE: -1.0000
Val_MAE: 0.2864
Train_MAE: -1.0000
Val_MAE: 0.2663
Train_MAE: -1.0000
Val_MAE: 0.2715
Train_MAE: -1.0000
Val_MAE: 0.2900
Train_MAE: -1.0000
Val_MAE: 0.2761
Train_MAE: -1.0000
Val_MAE: 0.2896
Train_MAE: -1.0000
Val_MAE: 0.2709
Train_MAE: -1.0000
Val_MAE: 0.2665
Train_MAE: -1.0000
Val_MAE: 0.2766
Train_MAE: -1.0000
Val_MAE: 0.2678
Train_MAE: -1.0000
Val_MAE: 0.2722
Train_MAE: -1.0000
Val_MAE: 0.2696
Train_MAE: 0.0549
Val_MAE: 0.2713
Train_MAE: -1.0000
Val_MAE: 0.2695
Train_MAE: -1.0000
Val_MAE: 0.2768
Train_MAE: -1.0000
Val_MAE: 0.2840
Train_MAE: -1.0000
Val_MAE: 0.2956
Train_MAE: -1.0000
Val_MAE: 0.2839
Train_MAE: -1.0000
Val_MAE: 0.2694
Train_MAE: -1.0000
Val_MAE: 0.2755
Train_MAE: -1.0000
Val_MAE: 0.2790
Train_MAE: -1.0000
Val_MAE: 0.2705
Train_MAE: -1.0000
Val_MAE: 0.2659
Train_MAE: -1.0000
Val_MAE: 0.2692
Train_MAE: -1.0000
Val_MAE: 0.2695
Train_MAE: -1.0000
Val_MAE: 0.2785
Train_MAE: -1.0000
Val_MAE: 0.2777
Train_MAE: -1.0000
Val_MAE: 0.2746
Train_MAE: -1.0000
Val_MAE: 0.2671
Train_MAE: -1.0000
Val_MAE: 0.2731
Train_MAE: -1.0000
Val_MAE: 0.2767
Train_MAE: -1.0000
Val_MAE: 0.2693
Train_MAE: 0.0498
Val_MAE: 0.2684
Train_MAE: -1.0000
Val_MAE: 0.2803
Train_MAE: -1.0000
Val_MAE: 0.2816
Train_MAE: -1.0000
Val_MAE: 0.2611
Train_MAE: -1.0000
Val_MAE: 0.2633
Train_MAE: -1.0000
Val_MAE: 0.2633
Train_MAE: -1.0000
Val_MAE: 0.2630
Train_MAE: -1.0000
Val_MAE: 0.2779
Train_MAE: -1.0000
Val_MAE: 0.2818
Train_MAE: -1.0000
Val_MAE: 0.2661
Train_MAE: -1.0000
Val_MAE: 0.2640
Train_MAE: -1.0000
Val_MAE: 0.2696
Train_MAE: -1.0000
Val_MAE: 0.2631
Train_MAE: -1.0000
Val_MAE: 0.2619
Train_MAE: -1.0000
Val_MAE: 0.2666
Train_MAE: -1.0000
Val_MAE: 0.2640
Train_MAE: -1.0000
Val_MAE: 0.2620
Train_MAE: -1.0000
Val_MAE: 0.2679
Train_MAE: -1.0000
Val_MAE: 0.2680
Train_MAE: -1.0000
Val_MAE: 0.2662
Train_MAE: 0.0443
Val_MAE: 0.2650
Train_MAE: -1.0000
Val_MAE: 0.2797
Train_MAE: -1.0000
Val_MAE: 0.2671
Train_MAE: -1.0000
Val_MAE: 0.2698
Train_MAE: -1.0000
Val_MAE: 0.2618
Train_MAE: -1.0000
Val_MAE: 0.2600
Train_MAE: -1.0000
Val_MAE: 0.2660
Train_MAE: -1.0000
Val_MAE: 0.2602
Train_MAE: -1.0000
Val_MAE: 0.2661
Train_MAE: -1.0000
Val_MAE: 0.2668
Train_MAE: -1.0000
Val_MAE: 0.2652
Train_MAE: -1.0000
Val_MAE: 0.2624
Train_MAE: -1.0000
Val_MAE: 0.2687
Train_MAE: -1.0000
Val_MAE: 0.2658
Train_MAE: -1.0000
Val_MAE: 0.2607
Train_MAE: -1.0000
Val_MAE: 0.2619
Train_MAE: -1.0000
Val_MAE: 0.2641
Train_MAE: -1.0000
Val_MAE: 0.2706
Train_MAE: -1.0000
Val_MAE: 0.2649
Train_MAE: -1.0000
Val_MAE: 0.2667
Train_MAE: 0.0329
Val_MAE: 0.2670
Train_MAE: -1.0000
Val_MAE: 0.2740
Train_MAE: -1.0000
Val_MAE: 0.2708
Train_MAE: -1.0000
Val_MAE: 0.2649
Train_MAE: -1.0000
Val_MAE: 0.2639
Train_MAE: -1.0000
Val_MAE: 0.2689
Train_MAE: -1.0000
Val_MAE: 0.2643
Train_MAE: -1.0000
Val_MAE: 0.2672
Train_MAE: -1.0000
Val_MAE: 0.2667
Train_MAE: -1.0000
Val_MAE: 0.2614
Train_MAE: -1.0000
Val_MAE: 0.2636
Train_MAE: -1.0000
Val_MAE: 0.2612
Train_MAE: -1.0000
Val_MAE: 0.2637
Train_MAE: -1.0000
Val_MAE: 0.2619
Train_MAE: -1.0000
Val_MAE: 0.2648
Train_MAE: -1.0000
Val_MAE: 0.2653
Train_MAE: -1.0000
Val_MAE: 0.2603
Train_MAE: -1.0000
Val_MAE: 0.2603
Train_MAE: -1.0000
Val_MAE: 0.2604
Train_MAE: -1.0000
Val_MAE: 0.2665
Train_MAE: 0.0359
Val_MAE: 0.2654
Train_MAE: -1.0000
Val_MAE: 0.2628
Train_MAE: -1.0000
Val_MAE: 0.2667
Train_MAE: -1.0000
Val_MAE: 0.2673
Train_MAE: -1.0000
Val_MAE: 0.2620
Train_MAE: -1.0000
Val_MAE: 0.2612
Train_MAE: -1.0000
Val_MAE: 0.2606
Train_MAE: -1.0000
Val_MAE: 0.2604
Train_MAE: -1.0000
Val_MAE: 0.2599
Train_MAE: -1.0000
Val_MAE: 0.2646
Train_MAE: -1.0000
Val_MAE: 0.2653
Train_MAE: -1.0000
Val_MAE: 0.2595
Train_MAE: -1.0000
Val_MAE: 0.2587
Train_MAE: -1.0000
Val_MAE: 0.2640
Train_MAE: -1.0000
Val_MAE: 0.2642
Train_MAE: -1.0000
Val_MAE: 0.2643
Train_MAE: -1.0000
Val_MAE: 0.2621
Train_MAE: -1.0000
Val_MAE: 0.2634
Train_MAE: -1.0000
Val_MAE: 0.2655
Train_MAE: -1.0000
Val_MAE: 0.2609
Train_MAE: 0.0249
Val_MAE: 0.2597
Train_MAE: -1.0000
Val_MAE: 0.2809
Train_MAE: -1.0000
Val_MAE: 0.2654
Train_MAE: -1.0000
Val_MAE: 0.2605
Train_MAE: -1.0000
Val_MAE: 0.2615
Train_MAE: -1.0000
Val_MAE: 0.2639
Train_MAE: -1.0000
Val_MAE: 0.2602
Train_MAE: -1.0000
Val_MAE: 0.2600
Train_MAE: -1.0000
Val_MAE: 0.2631
Train_MAE: -1.0000
Val_MAE: 0.2592
Train_MAE: -1.0000
Val_MAE: 0.2627
Train_MAE: -1.0000
Val_MAE: 0.2623
Train_MAE: -1.0000
Val_MAE: 0.2646
Train_MAE: -1.0000
Val_MAE: 0.2660
Train_MAE: -1.0000
Val_MAE: 0.2638
Train_MAE: -1.0000
Val_MAE: 0.2635
Train_MAE: -1.0000
Val_MAE: 0.2613
Train_MAE: -1.0000
Val_MAE: 0.2634
Train_MAE: -1.0000
Val_MAE: 0.2597
Train_MAE: -1.0000
Val_MAE: 0.2667
Train_MAE: 0.0287
Val_MAE: 0.2669
Train_MAE: -1.0000
Val_MAE: 0.2612
Train_MAE: -1.0000
Val_MAE: 0.2584
Train_MAE: -1.0000
Val_MAE: 0.2614
Train_MAE: -1.0000
Val_MAE: 0.2610
Train_MAE: -1.0000
Val_MAE: 0.2626
Train_MAE: -1.0000
Val_MAE: 0.2608
Train_MAE: -1.0000
Val_MAE: 0.2611
Train_MAE: -1.0000
Val_MAE: 0.2628
Train_MAE: -1.0000
Val_MAE: 0.2673
Train_MAE: -1.0000
Val_MAE: 0.2632
Train_MAE: -1.0000
Val_MAE: 0.2642
Train_MAE: -1.0000
Val_MAE: 0.2624
Train_MAE: -1.0000
Val_MAE: 0.2602
Train_MAE: -1.0000
Val_MAE: 0.2593
Train_MAE: -1.0000
Val_MAE: 0.2601
Train_MAE: -1.0000
Val_MAE: 0.2614
Train_MAE: -1.0000
Val_MAE: 0.2602
Train_MAE: -1.0000
Val_MAE: 0.2606
Train_MAE: -1.0000
Val_MAE: 0.2595
Train_MAE: 0.0186
Val_MAE: 0.2595
Train_MAE: -1.0000
Val_MAE: 0.2598
Train_MAE: -1.0000
Val_MAE: 0.2600
Train_MAE: -1.0000
Val_MAE: 0.2609
Train_MAE: -1.0000
Val_MAE: 0.2610
Train_MAE: -1.0000
Val_MAE: 0.2600
Train_MAE: -1.0000
Val_MAE: 0.2640
Train_MAE: -1.0000
Val_MAE: 0.2595
Train_MAE: -1.0000
Val_MAE: 0.2638
Train_MAE: -1.0000
Val_MAE: 0.2584
Train_MAE: -1.0000
Val_MAE: 0.2628
Train_MAE: -1.0000
Val_MAE: 0.2607
Train_MAE: -1.0000
Val_MAE: 0.2605
Train_MAE: -1.0000
Val_MAE: 0.2622
Train_MAE: -1.0000
Val_MAE: 0.2585
Train_MAE: -1.0000
Val_MAE: 0.2602
Train_MAE: -1.0000
Val_MAE: 0.2609
Train_MAE: -1.0000
Val_MAE: 0.2609
Train_MAE: -1.0000
Val_MAE: 0.2578
Train_MAE: -1.0000
Val_MAE: 0.2596
Train_MAE: 0.0171
Val_MAE: 0.2634
Train_MAE: -1.0000
Val_MAE: 0.2603
Train_MAE: -1.0000
Val_MAE: 0.2616
Train_MAE: -1.0000
Val_MAE: 0.2604
Train_MAE: -1.0000
Val_MAE: 0.2596
Train_MAE: -1.0000
Val_MAE: 0.2573
Train_MAE: -1.0000
Val_MAE: 0.2582
Train_MAE: -1.0000
Val_MAE: 0.2584
Train_MAE: -1.0000
Val_MAE: 0.2612
Train_MAE: -1.0000
Val_MAE: 0.2605
Train_MAE: -1.0000
Val_MAE: 0.2588
Train_MAE: -1.0000
Val_MAE: 0.2597
Train_MAE: -1.0000
Val_MAE: 0.2577
Train_MAE: -1.0000
Val_MAE: 0.2586
Train_MAE: -1.0000
Val_MAE: 0.2603
Train_MAE: -1.0000
Val_MAE: 0.2601
Train_MAE: -1.0000
Val_MAE: 0.2583
Train_MAE: -1.0000
Val_MAE: 0.2629
Train_MAE: -1.0000
Val_MAE: 0.2597
Train_MAE: -1.0000
Val_MAE: 0.2609
Train_MAE: 0.0162
100%|██████████| 1817/1817 [00:09<00:00, 186.51it/s]
Test MAE: 0.2687768995987804
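After training, the model is evaluated on the 1,817 test structures. Assuming the predictions are mapped back to eV with the training statistics before scoring, the test MAE reduces to a single NumPy expression (a sketch; preds_norm and targets_ev are the stored predictions and ground-truth band gaps):

import numpy as np

def test_mae(preds_norm, targets_ev, mean_train=1.488970, std_train=2.316053):
    # Undo the target normalization, then compare against ground truth in eV.
    preds_ev = np.asarray(preds_norm) * std_train + mean_train
    return float(np.mean(np.abs(preds_ev - np.asarray(targets_ev))))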
train= {'loss': [1.4285094139334389, 0.14521408081054688, 0.6088336154752891, 0.2542968951658005, 0.08072120397626566, 0.060061072462980966, 0.034030683240176295, 0.02257103016723095, 0.015402810689111113, 0.011315699715971421, 0.00966341905131739, 0.011955621484092679, 0.008345972598911908, 0.006777030780977089, 0.006310827931643582, 0.006523036221575632, 0.0065924134023389105, 0.007235208797034713, 0.007497810057081315, 0.0059825208218612334, 0.0049126458062999576, 0.006045156113376702, 0.0058393299842195885, 0.006040796834466741, 0.005784488984666732], 'mae': [2.0264885102062453, 0.49572205997109203, 1.4165144123212214, 0.6565092368085526, 0.3635104445990124, 0.35035584556720906, 0.22175986407688505, 0.17545114742864068, 0.15935550110396038, 0.12073957612817877, 0.09360037063327323, 0.1303999063422412, 0.06889784148794798, 0.07731338921528838, 0.0609197499644487, 0.054871283170961335, 0.0498089792829846, 0.044303624313285835, 0.03294533623437053, 0.03588441558976684, 0.02494248859252604, 0.0286916138279645, 0.018557668687338914, 0.01706945792912994, 0.01623096574389679], 'neg_mae': [-2.0264885102062453, -0.49572205997109203, -1.4165144123212214, -0.6565092368085526, -0.3635104445990124, -0.35035584556720906, -0.22175986407688505, -0.17545114742864068, -0.15935550110396038, -0.12073957612817877, -0.09360037063327323, -0.1303999063422412, -0.06889784148794798, -0.07731338921528838, -0.0609197499644487, -0.054871283170961335, -0.0498089792829846, -0.044303624313285835, -0.03294533623437053, -0.03588441558976684, -0.02494248859252604, -0.0286916138279645, -0.018557668687338914, -0.01706945792912994, -0.01623096574389679]}
validation= {'loss': [2.067276818411691, 0.557591472353254, 1.8346504483904158, 0.9104178973606655, 1.1882227488926478, 0.391197885785784, 1.3060270036969865, 0.3902064732142857, 0.29276578766959055, 0.31003665924072266, 0.275498628616333, 0.3242753914424351, 0.33235086713518414, 0.3643597194126674, 0.26878275190080914, 0.6886297634669712, 0.36398121288844515, 0.24088307789393834, 0.2435779571533203, 1.402298927307129, 0.20008419241224015, 0.6384049824305943, 0.3227172579084124, 0.3424253463745117, 0.21318863119397843, 0.226467388016837, 0.28263923100062777, 0.2429624114717756, 0.9968112536839077, 0.23990021433149064, 0.36501857212611605, 0.30318781307765413, 0.174632157598223, 4.862494332449777, 0.2371753113610404, 0.6538443565368652, 0.4218228544507708, 0.1823667117527553, 0.17758112294333323, 0.14791149752480642, 0.2433262722832816, 0.23350839955466135, 0.28987189701625277, 0.24788228103092738, 0.5466805866786412, 0.1899177176611764, 0.3535502978733608, 0.44891507284981863, 0.21481689385005406, 0.2034437656402588, 0.1370278937476022, 0.4772672653198242, 1.4446304866245814, 0.13747801099504744, 0.5241289819989886, 0.17114686965942383, 0.40270849636622835, 0.15374347141810826, 0.28398743697575163, 0.610297611781529, 0.23986990111214773, 0.456740140914917, 0.15753093787602016, 0.2818373441696167, 0.24662254537854875, 0.14366890702928817, 0.1524510383605957, 0.17694621426718576, 0.3032619612557547, 0.1720064197267805, 0.22903776168823242, 0.278279219354902, 0.13118301119123185, 0.1651679618018014, 0.12719545194080897, 0.16142570972442627, 0.12265564714159284, 0.17649289539882115, 0.12694988080433436, 0.2918143612997873, 0.14609924384525844, 0.15239744526999338, 0.16083931922912598, 0.15328691686902726, 0.10632696322032384, 0.09966456890106201, 0.1271324838910784, 0.13618127788816178, 0.215655939919608, 0.13023556130273, 0.1963979857308524, 0.1037210396357945, 0.3343799114227295, 0.32685961042131695, 0.17315384319850377, 0.11509490864617485, 0.1132361888885498, 0.10115194320678711, 0.12849563360214233, 0.12124110119683403, 0.19909322261810303, 0.13871548857007707, 0.30011541502816336, 0.5243923664093018, 0.2206070934023176, 0.10633042028972081, 0.09835326671600342, 0.09031489065715245, 0.08614416633333478, 0.08799761533737183, 0.12598695925303868, 0.5674052579062325, 0.11816221475601196, 0.14415383338928223, 0.10992722000394549, 0.11335734810147967, 0.11676874331065587, 0.12057484047753471, 0.10887119599751063, 0.11696187938962664, 0.1016690560749599, 0.1312840666089739, 0.11035408292497907, 0.12111445835658483, 0.12179324456623622, 0.10269911800112043, 0.093775817326137, 0.10315957239695958, 0.09504810401371547, 0.10386058262416295, 0.11074124915259224, 0.09899725232805524, 0.10709255933761597, 0.10050688471112933, 0.0928691029548645, 0.139857360294887, 0.21390673092433385, 0.14616160733359201, 0.09059735706874303, 0.08510783740452357, 0.07973544938223702, 0.11967680283955165, 0.09155098029545375, 0.08146640232631139, 0.09884614603860038, 0.12254653658185687, 0.11963191202708653, 0.13830667734146118, 0.12413146666118077, 0.10114530154636928, 0.09456704344068255, 0.08539738825389318, 0.08814636298588344, 0.09043266092027936, 0.08271404675074986, 0.08752435445785522, 0.104033248765128, 0.09205174446105957, 0.08463658605303083, 0.08278357131140572, 0.08299700702939715, 0.08420040777751378, 0.11270793846675328, 0.09202330453055245, 0.10541943141392299, 0.095044663974217, 0.0913216301373073, 0.08188289403915405, 0.08347113643373762, 0.08202285426003593, 0.09704373564038958, 0.14447193486349924, 
0.09184983798435756, 0.09009092194693429, 0.09247472456523351, 0.08243014131273542, 0.09042318378176008, 0.08401368345533099, 0.09432177032743182, 0.08777856826782227, 0.12350505590438843, 0.09338558571679252, 0.21215135710580008, 0.16468649251120432, 0.12241262197494507, 0.1010955912726266, 0.09272198166166033, 0.077738276549748, 0.0911569425037929, 0.08611926862171718, 0.08180573156901769, 0.07679080963134766, 0.08453973702022008, 0.07895614419664655, 0.07888126373291016, 0.08286548512322563, 0.07814954008374896, 0.07743580000741142, 0.08227477754865374, 0.08074806417737689, 0.0883932454245431, 0.07687896490097046, 0.09076832021985735, 0.08640095165797643, 0.10138274090630668, 0.10707431180136544, 0.08458877461297172, 0.09912620271955218, 0.09631385973521642, 0.08794536760875157, 0.08612546750477382, 0.07951838629586357, 0.08693634612219674, 0.08854734897613525, 0.08580327885491508, 0.08206063508987427, 0.08040945870535714, 0.08164286613464355, 0.08975155012948173, 0.081814706325531, 0.07901820966175624, 0.08204196180616107, 0.08285374300820487, 0.08345351900373187, 0.08467987605503627, 0.08814675467354911, 0.10802526133401054, 0.08037394285202026, 0.09596713100160871, 0.08568517650876727, 0.07430980035236903, 0.07959629808153425, 0.07539792571749006, 0.0811540058680943, 0.08293164627892631, 0.07730448246002197, 0.08488220827920097, 0.08736424786703927, 0.08830597570964269, 0.0825171981539045, 0.08995591742651803, 0.07812789508274623, 0.0884659971509661, 0.0790629472051348, 0.07720016581671578, 0.08375779220036098, 0.08660367556980678, 0.09450349637440272, 0.07660489422934395, 0.07904553413391113, 0.0752100944519043, 0.0889263493674142, 0.07690072059631348, 0.08003631659916469, 0.07641726732254028, 0.07463042225156512, 0.07377380132675171, 0.07562846796853202, 0.072380644934518, 0.07296044485909599, 0.07267911945070539, 0.07508194446563721, 0.07416895457676478, 0.08455955130713326, 0.07952083008629936, 0.0840659396989005, 0.0739190833909171, 0.07799124717712402, 0.07403170210974556, 0.07585304124014718, 0.07554239886147636, 0.07745531627110072, 0.07527109554835729, 0.0880894660949707, 0.08215746709278651, 0.0799430183001927, 0.07516060556684222, 0.07520249911717006, 0.0781129172870091, 0.07455663170133318, 0.07698496750422887, 0.07830557652882167, 0.07474409682410103, 0.07713968413216728, 0.08280315569468907, 0.07977969305855888, 0.08334811244692121, 0.08524927071162633, 0.08283441407339913, 0.09242011819566999, 0.07940779413495745, 0.07361056123461042, 0.07312628201075963, 0.07507412774222237, 0.0722094007900783, 0.07621259348733085, 0.07564147881099156, 0.0759010740688869, 0.07514740739549909, 0.0748311196054731, 0.07625800371170044, 0.0724691663469587, 0.07281981195722308, 0.07296140704836164, 0.07654448917933873, 0.07288646697998047, 0.07245326042175293, 0.0724964908191136, 0.07301988771983556, 0.072097761290414, 0.07489350863865443, 0.07343881470816475, 0.07591623919350761, 0.07402987139565605, 0.0731977139200483, 0.07344743183680943, 0.07286011321204049, 0.07135009339877538, 0.07371106318065099, 0.0727234057017735, 0.07162713153021676, 0.07479878834315709, 0.07437604665756226, 0.07731743369783674, 0.07500626359667097, 0.0720672607421875, 0.07234432016100202, 0.07302065406526838, 0.07324068886893136, 0.07234923328672137, 0.0738914864403861, 0.07197573355266026, 0.07539895602634974, 0.0780630452292306, 0.07465075595038277, 0.07299426623753139, 0.07378622463771276, 0.0719436492238726, 0.07151121752602714, 0.07013446092605591, 0.07078470928328377, 0.0729021430015564, 
0.0712896031992776, 0.07111685190882001, 0.07117668219975062, 0.0700518446309226, 0.07052243181637355, 0.07383225645337786, 0.07198620694024223, 0.07047955904688154, 0.07185683931623187, 0.07235573870795113, 0.07050283891814095, 0.07299677814756121, 0.07253490175519671, 0.07203731366566249, 0.072884202003479, 0.0707864761352539, 0.07221429688589913, 0.07235630920955113, 0.07192379236221313, 0.07243200710841588, 0.07050618955067225, 0.06987923809460231, 0.07165145022528512, 0.0717001301901681, 0.07385935953685216, 0.07127196022442409, 0.07166145529065814, 0.07229514632906232, 0.07360934359686715, 0.07242270026888166, 0.07144801957266671, 0.0714821560042245, 0.07075471537453788, 0.07219143424715314, 0.07244498389107841, 0.07305829014096941, 0.07222973448889596, 0.07284952061516899, 0.07248044013977051, 0.0718233755656651, 0.07388826778956822, 0.07239483083997454, 0.07220155852181571, 0.07410906042371478, 0.07162884303501674, 0.07229870557785034, 0.07128210578645978, 0.07166150638035365, 0.07125331674303327, 0.07127136843545097, 0.07117428524153573, 0.07154245035988945, 0.07288366556167603, 0.07131771530423846, 0.07029476761817932, 0.07086967570441109, 0.06996197785649981, 0.07202968427113124, 0.0721642289842878, 0.07130019153867449, 0.07125769768442426, 0.07074511476925441, 0.07176056929997035, 0.07222977706364223, 0.07121369242668152, 0.07168157611574445, 0.07009985191481453, 0.07003045933587211, 0.07011682220867702, 0.07114074911390032, 0.07035331215177264, 0.07030744637761797, 0.07162731034415108, 0.07076914395604815, 0.07052211250577654, 0.07127729058265686, 0.07199514763695854, 0.0716208815574646, 0.0713920167514256, 0.07283255883625575, 0.0716611317225865, 0.07153233460017613, 0.07180334840502058, 0.07188015324728829, 0.07105176363672529, 0.07091867923736572, 0.07208681958062309, 0.07051091534750802, 0.07184915883200509, 0.07191139459609985, 0.07152161427906581, 0.07158832890646798, 0.07173797913960048, 0.07098958321980067, 0.07182358843939644, 0.07184630632400513, 0.07108718582562037, 0.0711108318396977, 0.07142796260969979, 0.07101313556943621, 0.07119344813483101, 0.07167336770466395, 0.07142745171274457, 0.07221761771610805, 0.07127259884561811, 0.0715024471282959, 0.07114594749041966, 0.07149462188993182, 0.07122066191264562, 0.07157452617372785, 0.07105220215661186, 0.07121006080082484, 0.07122673732893807, 0.07113836918558393, 0.07190265825816564, 0.07102933526039124, 0.07082349061965942, 0.07097721525600978, 0.07087872283799308, 0.07107329368591309, 0.07111346295901708, 0.0709940322807857, 0.07094256367002215, 0.07109014477048602, 0.07139210190091815, 0.07114472559520177, 0.07185758863176618, 0.07092523149081639, 0.07139153565679278, 0.07132715838296073, 0.07125080909047808, 0.07161757775715419, 0.07089653185435704, 0.07110229560307094, 0.07156085968017578, 0.0713220749582563, 0.07091151816504342, 0.07128556711333138, 0.07126608490943909, 0.07113716857773918, 0.07106336525508336, 0.07117130075182233, 0.07111707329750061, 0.0704015280519213, 0.07083484530448914, 0.07096669929368156, 0.0717185480254037, 0.07152114595685687, 0.07113847136497498, 0.07152611868722099, 0.07064825296401978, 0.07098262224878583, 0.0714278519153595, 0.0712020993232727, 0.0707411255155291, 0.0709504953452519, 0.07121391807283674, 0.07156156642096383], 'mae': [1.2240497812840936, 1.0491972805760905, 2.177223645181375, 1.0165889551122016, 1.3406404656419575, 0.8556971617035429, 1.2121574992947959, 1.096971665489795, 0.8284315123160368, 0.802401427767067, 0.719558394861918, 0.7849073347658958, 
0.8244395721606307, 0.8460709102034103, 0.7599813149222924, 1.1451407874559183, 0.8314646137679758, 0.7019170248669686, 0.7286219646284569, 1.9904003726025485, 0.6002509745494625, 1.2507536166653395, 0.877560498481567, 0.8328499010136844, 0.6651481834920743, 0.7533338135893031, 0.8580113116175633, 0.752174687640414, 1.6393800543176313, 0.7082194862584551, 0.8714797808659297, 0.7706629595128149, 0.6321018529355485, 4.425711685267495, 0.6674090076172386, 1.1278464295100477, 0.8834925980702745, 0.5918437221842274, 0.5901696789472359, 0.504724962843481, 0.6684728413280538, 0.6428842722232064, 0.7328587616605032, 0.8385542616195116, 1.3390572576799178, 0.5376255152079863, 0.9498036898184474, 1.0387576530272988, 0.682961836804866, 0.6249374227647295, 0.4965813023768585, 1.172860100470215, 2.278217482910515, 0.5182140208949412, 1.0050129991780996, 0.5378123526797397, 0.8313596187606278, 0.49421769200483123, 0.6140073715039298, 1.4123180565260316, 0.6596589009384892, 0.9246873221516053, 0.5673594938798069, 0.7040027258393812, 0.7443534667624728, 0.4621893633790538, 0.48387678816129576, 0.5652266989784787, 0.7196639420593873, 0.5672209730436664, 0.5956554511493357, 0.7537971011010648, 0.48800369935920557, 0.545889632005353, 0.4611682849604869, 0.533690529519376, 0.45018877324168033, 0.5816573921325013, 0.44503995534002166, 0.7260429212248238, 0.4890807067486288, 0.4943243435825431, 0.5229037321528994, 0.5269863892568031, 0.4447497399891326, 0.39243067260861136, 0.43881988860306176, 0.4873330250147483, 0.5767905067299124, 0.46314867538833937, 0.6652100287856587, 0.4087350325513134, 0.9597066674536491, 0.7891325358053952, 0.4719616691662675, 0.43848431477792393, 0.3995807458362769, 0.39676209136215584, 0.4604135988332905, 0.4429333894694546, 0.5963881285559834, 0.4565384074463856, 0.7968620931232023, 1.098952844760678, 0.5251244830521187, 0.38530296308085804, 0.37685417813007166, 0.37218848714238956, 0.37424302881510085, 0.39227562551098366, 0.4850543335949451, 0.838719129812868, 0.4213784508741706, 0.4813836497636747, 0.39143909691938195, 0.42947781780463545, 0.394562284245394, 0.42766568759517015, 0.4254531801056186, 0.4733356759566412, 0.38471795708950357, 0.4787528976993981, 0.40569195223503973, 0.4197582067318534, 0.3888251077695016, 0.38200323268463954, 0.3800390135389105, 0.38773492670147053, 0.35255970601661707, 0.39698304629497466, 0.3941400560133653, 0.3717046108275341, 0.3910843936507566, 0.37250560204061567, 0.38254540449944247, 0.46690565864713696, 0.5892280764639832, 0.48288083439324747, 0.35096060286751485, 0.3571833911129297, 0.33521107567025404, 0.44941547041896623, 0.35011232051476004, 0.3284903105408151, 0.37309510443724353, 0.3527432303476427, 0.42300859499651955, 0.372069056307585, 0.4066051958113492, 0.384591781646789, 0.378431548653686, 0.3302761328352161, 0.3497283511697064, 0.33624189630024653, 0.31955502637209715, 0.32650140060823424, 0.38108703094696605, 0.35750638289173237, 0.33582731984561287, 0.3200088872096454, 0.32736427655705175, 0.3172742247971876, 0.40213198252308785, 0.3652048597941412, 0.3734891709730864, 0.3763916794613404, 0.3767968292417606, 0.3211800232936726, 0.3189619150186052, 0.3144111584604543, 0.3793895590719713, 0.535622563869464, 0.3381289271554495, 0.322694030000869, 0.3168376396220054, 0.30250645219342753, 0.3469526491985913, 0.33202608249280907, 0.3495495598968632, 0.3347660104324949, 0.42417403168965206, 0.33332262489759623, 0.5740617014769553, 0.48088423324112006, 0.44676334070856544, 0.38469989258410714, 0.33289207437159674, 0.3027093426208498, 
0.316867457888555, 0.323726290269392, 0.3081107677819226, 0.3195894396492989, 0.29925194222392737, 0.32314181674708303, 0.29912631897133407, 0.2967574627931547, 0.29514613257708106, 0.2977544026149909, 0.30649141108801453, 0.31578926723458617, 0.3260855620047519, 0.3065623083553729, 0.34379088744839204, 0.31679711281131795, 0.31508770968548816, 0.3180144356547759, 0.32187937208228545, 0.3443872527793844, 0.3875785385706911, 0.3334405174884915, 0.31023411628796166, 0.31337199581541514, 0.32389281503311224, 0.3262766000656422, 0.3100043065921265, 0.3331707331720902, 0.29782902716566806, 0.32920717164470314, 0.3374052425593484, 0.29285178262312417, 0.297819561049303, 0.29380245680122674, 0.3001208528219276, 0.30430931349728546, 0.31031162011569974, 0.3164690724371206, 0.35744248660626887, 0.31650632554923186, 0.3416715817628838, 0.2967709717300505, 0.2810968382054214, 0.29317572101356343, 0.28415894935980374, 0.2886740304919042, 0.29536059927597474, 0.28615276984024896, 0.31737815486644455, 0.34064719020358925, 0.3200006832421291, 0.31871972048719255, 0.33324563381782796, 0.29629271592178674, 0.31666216148989, 0.29432770793302737, 0.2824766429920687, 0.30732405462769147, 0.31852448183716525, 0.33627408109588736, 0.3036492490915872, 0.2892865868260958, 0.28251590765390755, 0.30130865321497236, 0.3053269801697458, 0.31924869890231194, 0.33957223380604507, 0.2819229343479459, 0.2760695218515221, 0.27765265092925845, 0.27764641906931814, 0.2725802127540195, 0.2764443011752714, 0.29405581346151966, 0.27297859820545395, 0.33057098263890883, 0.2967663372772469, 0.28900246528761675, 0.2779469288217541, 0.305889050549991, 0.2844753345782267, 0.28362375880581986, 0.28906121437230675, 0.296219294356731, 0.2810793258901462, 0.3048929390133367, 0.29503246029639857, 0.2893607380709547, 0.2914890365670096, 0.2824754991696746, 0.27821604262157956, 0.2820630920333746, 0.28069265475771343, 0.27906083434392487, 0.30629656685950246, 0.28322904147447686, 0.28879377686392094, 0.30063123426260496, 0.28564130374040475, 0.29848126230428945, 0.293872703273085, 0.30384722897114513, 0.29742730885243013, 0.28063116444349273, 0.2760036534584826, 0.268144785046915, 0.2792372196455253, 0.28452489364161243, 0.29367764211273956, 0.29003064329345735, 0.27689327118603113, 0.27056373275752565, 0.27775283399412093, 0.2804001320409604, 0.2708099898305449, 0.2729622691547244, 0.278923733425239, 0.27983847580330623, 0.2734739719074777, 0.267188805899433, 0.2864075661390724, 0.2663225773677362, 0.27153515350732804, 0.2900163060713796, 0.27606443381397594, 0.2896018282221247, 0.27090068705796666, 0.2664924349932599, 0.2765507555422257, 0.26783425698799285, 0.27216123849949125, 0.26955144993874786, 0.27127550582386756, 0.2695071958447417, 0.27679707177847224, 0.28401887055867814, 0.2956258674660269, 0.2838587945867315, 0.2694250970063514, 0.2754897616663199, 0.2789627219920172, 0.27047599369146796, 0.265878419299462, 0.2692455563326294, 0.26947098794964575, 0.27848744406619325, 0.2777179276900251, 0.2746203185992742, 0.2670849547144791, 0.2731178290003219, 0.276738835441752, 0.2693300217001109, 0.26836394536183955, 0.2803384445159821, 0.281565154591502, 0.2611427184928307, 0.26333935051639523, 0.2633267487489843, 0.26299699264119586, 0.27790194504794474, 0.2818359841249186, 0.26613071102166397, 0.2640231393762332, 0.26963325296100177, 0.26311604877556094, 0.2618625969213101, 0.2666499275042819, 0.26396543550855844, 0.2619603345727783, 0.2678922961139555, 0.2679887913376508, 0.2661578275008344, 0.2650342980936926, 0.27973874632318624, 
0.2671025853562088, 0.2698437754447433, 0.26176168417664447, 0.25998037800859264, 0.2659895475613715, 0.2601579662958144, 0.26612822616611814, 0.26677535354611764, 0.26523383593823563, 0.2624045912464313, 0.26870441001376594, 0.26581130847864964, 0.2606733963318864, 0.2618618869625827, 0.2641469482898565, 0.270613982058563, 0.2648748531961703, 0.26671778772597327, 0.2670273297311074, 0.2739900527390498, 0.2707925169574213, 0.26491074555405414, 0.26390430017370137, 0.26890473670133935, 0.2643339632512917, 0.26723603787587896, 0.26665692848617684, 0.2614243748968398, 0.263599471505674, 0.26123674858205603, 0.2636684558286836, 0.26188156859619155, 0.26483282758372506, 0.2652819356420151, 0.26034853105088646, 0.2602511878209334, 0.26042033548772986, 0.26646892747095363, 0.26537466414024014, 0.26281013544836673, 0.2667074341611991, 0.26731020884181356, 0.26197735386115945, 0.2612451103181784, 0.2605812200237826, 0.26044845774176406, 0.25994900177705793, 0.2646332502970305, 0.2653066067077912, 0.25948616785003875, 0.25868754316604836, 0.2639686303228316, 0.26422417602253395, 0.26429183903346765, 0.26205560708977643, 0.26342066051175506, 0.2654780420193758, 0.2608831102515218, 0.2596510163223193, 0.2808528884982661, 0.2653683731170726, 0.2604637612965541, 0.2614597150646023, 0.2638662582185597, 0.2602040150077148, 0.2600238827017199, 0.2630624863337959, 0.25915917269285665, 0.2626736261619537, 0.26232466172637464, 0.2646312979105302, 0.266002918450737, 0.26376226898607547, 0.2634832552062183, 0.26128413832710806, 0.2634113521639962, 0.259745421111984, 0.2667462057961439, 0.2668539617540983, 0.26117186624280453, 0.2584354880967556, 0.26137513137066626, 0.2610001153940078, 0.26259351915221524, 0.26081483588723947, 0.2610540525362123, 0.2628217511619896, 0.2672605708941248, 0.26322423859718214, 0.2641650325163287, 0.26238510682358007, 0.2601622852114059, 0.25930195328136146, 0.26010274728368543, 0.261428555764901, 0.26020298951177523, 0.2606056741577254, 0.2595071116324962, 0.2595474215113503, 0.25984398704863415, 0.26003772689690363, 0.2608968952834782, 0.26097802778915624, 0.2599979297660193, 0.2640203784256267, 0.25949853296454045, 0.26384432838231425, 0.2584032441378874, 0.2627981647553802, 0.2606587632936722, 0.26051493748815213, 0.2622034362736759, 0.2585140568625847, 0.26019381921154666, 0.26089200445668964, 0.26085055075544145, 0.257768126893026, 0.2596137237680565, 0.2633732313245514, 0.26032242034657976, 0.2616452509453555, 0.2604453023696424, 0.25964786095019765, 0.2573049971498704, 0.25820678278116305, 0.25842959149510325, 0.2611572726467419, 0.2605262968277901, 0.25881953632611227, 0.2597232743439052, 0.25767457010961897, 0.258560933859667, 0.26031985660673096, 0.26009927637435165, 0.25827063962447494, 0.2628510764016452, 0.2597092723801153, 0.26086561765732236], 'neg_mae': [-1.2240497812840936, -1.0491972805760905, -2.177223645181375, -1.0165889551122016, -1.3406404656419575, -0.8556971617035429, -1.2121574992947959, -1.096971665489795, -0.8284315123160368, -0.802401427767067, -0.719558394861918, -0.7849073347658958, -0.8244395721606307, -0.8460709102034103, -0.7599813149222924, -1.1451407874559183, -0.8314646137679758, -0.7019170248669686, -0.7286219646284569, -1.9904003726025485, -0.6002509745494625, -1.2507536166653395, -0.877560498481567, -0.8328499010136844, -0.6651481834920743, -0.7533338135893031, -0.8580113116175633, -0.752174687640414, -1.6393800543176313, -0.7082194862584551, -0.8714797808659297, -0.7706629595128149, -0.6321018529355485, -4.425711685267495, 
-0.6674090076172386, -1.1278464295100477, -0.8834925980702745, -0.5918437221842274, -0.5901696789472359, -0.504724962843481, -0.6684728413280538, -0.6428842722232064, -0.7328587616605032, -0.8385542616195116, -1.3390572576799178, -0.5376255152079863, -0.9498036898184474, -1.0387576530272988, -0.682961836804866, -0.6249374227647295, -0.4965813023768585, -1.172860100470215, -2.278217482910515, -0.5182140208949412, -1.0050129991780996, -0.5378123526797397, -0.8313596187606278, -0.49421769200483123, -0.6140073715039298, -1.4123180565260316, -0.6596589009384892, -0.9246873221516053, -0.5673594938798069, -0.7040027258393812, -0.7443534667624728, -0.4621893633790538, -0.48387678816129576, -0.5652266989784787, -0.7196639420593873, -0.5672209730436664, -0.5956554511493357, -0.7537971011010648, -0.48800369935920557, -0.545889632005353, -0.4611682849604869, -0.533690529519376, -0.45018877324168033, -0.5816573921325013, -0.44503995534002166, -0.7260429212248238, -0.4890807067486288, -0.4943243435825431, -0.5229037321528994, -0.5269863892568031, -0.4447497399891326, -0.39243067260861136, -0.43881988860306176, -0.4873330250147483, -0.5767905067299124, -0.46314867538833937, -0.6652100287856587, -0.4087350325513134, -0.9597066674536491, -0.7891325358053952, -0.4719616691662675, -0.43848431477792393, -0.3995807458362769, -0.39676209136215584, -0.4604135988332905, -0.4429333894694546, -0.5963881285559834, -0.4565384074463856, -0.7968620931232023, -1.098952844760678, -0.5251244830521187, -0.38530296308085804, -0.37685417813007166, -0.37218848714238956, -0.37424302881510085, -0.39227562551098366, -0.4850543335949451, -0.838719129812868, -0.4213784508741706, -0.4813836497636747, -0.39143909691938195, -0.42947781780463545, -0.394562284245394, -0.42766568759517015, -0.4254531801056186, -0.4733356759566412, -0.38471795708950357, -0.4787528976993981, -0.40569195223503973, -0.4197582067318534, -0.3888251077695016, -0.38200323268463954, -0.3800390135389105, -0.38773492670147053, -0.35255970601661707, -0.39698304629497466, -0.3941400560133653, -0.3717046108275341, -0.3910843936507566, -0.37250560204061567, -0.38254540449944247, -0.46690565864713696, -0.5892280764639832, -0.48288083439324747, -0.35096060286751485, -0.3571833911129297, -0.33521107567025404, -0.44941547041896623, -0.35011232051476004, -0.3284903105408151, -0.37309510443724353, -0.3527432303476427, -0.42300859499651955, -0.372069056307585, -0.4066051958113492, -0.384591781646789, -0.378431548653686, -0.3302761328352161, -0.3497283511697064, -0.33624189630024653, -0.31955502637209715, -0.32650140060823424, -0.38108703094696605, -0.35750638289173237, -0.33582731984561287, -0.3200088872096454, -0.32736427655705175, -0.3172742247971876, -0.40213198252308785, -0.3652048597941412, -0.3734891709730864, -0.3763916794613404, -0.3767968292417606, -0.3211800232936726, -0.3189619150186052, -0.3144111584604543, -0.3793895590719713, -0.535622563869464, -0.3381289271554495, -0.322694030000869, -0.3168376396220054, -0.30250645219342753, -0.3469526491985913, -0.33202608249280907, -0.3495495598968632, -0.3347660104324949, -0.42417403168965206, -0.33332262489759623, -0.5740617014769553, -0.48088423324112006, -0.44676334070856544, -0.38469989258410714, -0.33289207437159674, -0.3027093426208498, -0.316867457888555, -0.323726290269392, -0.3081107677819226, -0.3195894396492989, -0.29925194222392737, -0.32314181674708303, -0.29912631897133407, -0.2967574627931547, -0.29514613257708106, -0.2977544026149909, -0.30649141108801453, -0.31578926723458617, -0.3260855620047519, 
-0.3065623083553729, -0.34379088744839204, -0.31679711281131795, -0.31508770968548816, -0.3180144356547759, -0.32187937208228545, -0.3443872527793844, -0.3875785385706911, -0.3334405174884915, -0.31023411628796166, -0.31337199581541514, -0.32389281503311224, -0.3262766000656422, -0.3100043065921265, -0.3331707331720902, -0.29782902716566806, -0.32920717164470314, -0.3374052425593484, -0.29285178262312417, -0.297819561049303, -0.29380245680122674, -0.3001208528219276, -0.30430931349728546, -0.31031162011569974, -0.3164690724371206, -0.35744248660626887, -0.31650632554923186, -0.3416715817628838, -0.2967709717300505, -0.2810968382054214, -0.29317572101356343, -0.28415894935980374, -0.2886740304919042, -0.29536059927597474, -0.28615276984024896, -0.31737815486644455, -0.34064719020358925, -0.3200006832421291, -0.31871972048719255, -0.33324563381782796, -0.29629271592178674, -0.31666216148989, -0.29432770793302737, -0.2824766429920687, -0.30732405462769147, -0.31852448183716525, -0.33627408109588736, -0.3036492490915872, -0.2892865868260958, -0.28251590765390755, -0.30130865321497236, -0.3053269801697458, -0.31924869890231194, -0.33957223380604507, -0.2819229343479459, -0.2760695218515221, -0.27765265092925845, -0.27764641906931814, -0.2725802127540195, -0.2764443011752714, -0.29405581346151966, -0.27297859820545395, -0.33057098263890883, -0.2967663372772469, -0.28900246528761675, -0.2779469288217541, -0.305889050549991, -0.2844753345782267, -0.28362375880581986, -0.28906121437230675, -0.296219294356731, -0.2810793258901462, -0.3048929390133367, -0.29503246029639857, -0.2893607380709547, -0.2914890365670096, -0.2824754991696746, -0.27821604262157956, -0.2820630920333746, -0.28069265475771343, -0.27906083434392487, -0.30629656685950246, -0.28322904147447686, -0.28879377686392094, -0.30063123426260496, -0.28564130374040475, -0.29848126230428945, -0.293872703273085, -0.30384722897114513, -0.29742730885243013, -0.28063116444349273, -0.2760036534584826, -0.268144785046915, -0.2792372196455253, -0.28452489364161243, -0.29367764211273956, -0.29003064329345735, -0.27689327118603113, -0.27056373275752565, -0.27775283399412093, -0.2804001320409604, -0.2708099898305449, -0.2729622691547244, -0.278923733425239, -0.27983847580330623, -0.2734739719074777, -0.267188805899433, -0.2864075661390724, -0.2663225773677362, -0.27153515350732804, -0.2900163060713796, -0.27606443381397594, -0.2896018282221247, -0.27090068705796666, -0.2664924349932599, -0.2765507555422257, -0.26783425698799285, -0.27216123849949125, -0.26955144993874786, -0.27127550582386756, -0.2695071958447417, -0.27679707177847224, -0.28401887055867814, -0.2956258674660269, -0.2838587945867315, -0.2694250970063514, -0.2754897616663199, -0.2789627219920172, -0.27047599369146796, -0.265878419299462, -0.2692455563326294, -0.26947098794964575, -0.27848744406619325, -0.2777179276900251, -0.2746203185992742, -0.2670849547144791, -0.2731178290003219, -0.276738835441752, -0.2693300217001109, -0.26836394536183955, -0.2803384445159821, -0.281565154591502, -0.2611427184928307, -0.26333935051639523, -0.2633267487489843, -0.26299699264119586, -0.27790194504794474, -0.2818359841249186, -0.26613071102166397, -0.2640231393762332, -0.26963325296100177, -0.26311604877556094, -0.2618625969213101, -0.2666499275042819, -0.26396543550855844, -0.2619603345727783, -0.2678922961139555, -0.2679887913376508, -0.2661578275008344, -0.2650342980936926, -0.27973874632318624, -0.2671025853562088, -0.2698437754447433, -0.26176168417664447, -0.25998037800859264, 
-0.2659895475613715, -0.2601579662958144, -0.26612822616611814, -0.26677535354611764, -0.26523383593823563, -0.2624045912464313, -0.26870441001376594, -0.26581130847864964, -0.2606733963318864, -0.2618618869625827, -0.2641469482898565, -0.270613982058563, -0.2648748531961703, -0.26671778772597327, -0.2670273297311074, -0.2739900527390498, -0.2707925169574213, -0.26491074555405414, -0.26390430017370137, -0.26890473670133935, -0.2643339632512917, -0.26723603787587896, -0.26665692848617684, -0.2614243748968398, -0.263599471505674, -0.26123674858205603, -0.2636684558286836, -0.26188156859619155, -0.26483282758372506, -0.2652819356420151, -0.26034853105088646, -0.2602511878209334, -0.26042033548772986, -0.26646892747095363, -0.26537466414024014, -0.26281013544836673, -0.2667074341611991, -0.26731020884181356, -0.26197735386115945, -0.2612451103181784, -0.2605812200237826, -0.26044845774176406, -0.25994900177705793, -0.2646332502970305, -0.2653066067077912, -0.25948616785003875, -0.25868754316604836, -0.2639686303228316, -0.26422417602253395, -0.26429183903346765, -0.26205560708977643, -0.26342066051175506, -0.2654780420193758, -0.2608831102515218, -0.2596510163223193, -0.2808528884982661, -0.2653683731170726, -0.2604637612965541, -0.2614597150646023, -0.2638662582185597, -0.2602040150077148, -0.2600238827017199, -0.2630624863337959, -0.25915917269285665, -0.2626736261619537, -0.26232466172637464, -0.2646312979105302, -0.266002918450737, -0.26376226898607547, -0.2634832552062183, -0.26128413832710806, -0.2634113521639962, -0.259745421111984, -0.2667462057961439, -0.2668539617540983, -0.26117186624280453, -0.2584354880967556, -0.26137513137066626, -0.2610001153940078, -0.26259351915221524, -0.26081483588723947, -0.2610540525362123, -0.2628217511619896, -0.2672605708941248, -0.26322423859718214, -0.2641650325163287, -0.26238510682358007, -0.2601622852114059, -0.25930195328136146, -0.26010274728368543, -0.261428555764901, -0.26020298951177523, -0.2606056741577254, -0.2595071116324962, -0.2595474215113503, -0.25984398704863415, -0.26003772689690363, -0.2608968952834782, -0.26097802778915624, -0.2599979297660193, -0.2640203784256267, -0.25949853296454045, -0.26384432838231425, -0.2584032441378874, -0.2627981647553802, -0.2606587632936722, -0.26051493748815213, -0.2622034362736759, -0.2585140568625847, -0.26019381921154666, -0.26089200445668964, -0.26085055075544145, -0.257768126893026, -0.2596137237680565, -0.2633732313245514, -0.26032242034657976, -0.2616452509453555, -0.2604453023696424, -0.25964786095019765, -0.2573049971498704, -0.25820678278116305, -0.25842959149510325, -0.2611572726467419, -0.2605262968277901, -0.25881953632611227, -0.2597232743439052, -0.25767457010961897, -0.258560933859667, -0.26031985660673096, -0.26009927637435165, -0.25827063962447494, -0.2628510764016452, -0.2597092723801153, -0.26086561765732236]}
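The train= and validation= dicts printed above hold the full per-epoch history (loss, mae, neg_mae). The training MAE is only evaluated every 20 epochs, which is why most Train_MAE lines read -1.0000 and why train['mae'] has just 25 entries for 500 epochs. A quick sketch for plotting learning curves from those dicts, assuming they have been pasted into Python as train and validation:

import matplotlib.pyplot as plt

val_mae = validation["mae"]    # 500 entries, one per epoch
train_mae = train["mae"]       # 25 entries, one every 20 epochs

plt.plot(range(1, len(val_mae) + 1), val_mae, label="Val MAE")
plt.plot(range(20, 20 * len(train_mae) + 1, 20), train_mae, label="Train MAE")
plt.xlabel("epoch")
plt.ylabel("MAE")
plt.legend()
plt.savefig("mbj_bandgap_learning_curves.png", dpi=150)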
Total time: 5210.497064113617 s




Process finished with exit code 0
