osvai / gridconv

The official project website of "3D Human Pose Lifting with Grid Convolution" (GridConv for short, oral in AAAI 2023)

License: Apache License 2.0

Python 100.00%
2d-to-3d 3d-human-pose-estimation aaai2023 graph-convolutional-network human36m

gridconv's People

Contributors

kyang-06, yaoanbang

Forkers

davidpengiupui

gridconv's Issues

For testing on Internet Videos

Hi, can you share the settings used for running GridConv on internet videos? Also, did you use plain Matplotlib to visualize the final output?

Error loading dataset

When I try to run the code, loading the dataset fails with the error below. I also have a question: the input arguments allow other 2D pose estimators, such as CPN and SH, to be selected, but the provided dataset only includes values for gt and HRNet. Would it be possible to provide data from the other 2D pose estimators as well?

UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb1 in position 0:

Maybe some slight errors in code

Thanks for your great work!
I found some errors when running the code; here are the errors and my fixes (maybe they only affect my environment?)

modify

cur_data_2d = torch.load(os.path.join(data_path, data_2d_file))
cur_data_3d = torch.load(os.path.join(data_path, data_3d_file))

to

cur_data_2d = torch.load(os.path.join(data_path, data_2d_file), encoding='Latin1')
cur_data_3d = torch.load(os.path.join(data_path, data_3d_file), encoding='Latin1')
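This fix works because `torch.load` forwards extra keyword arguments to Python's pickle machinery, and the dataset files appear to have been pickled under Python 2: the unpickler then tries to decode legacy `str` payloads as text (ASCII by default in stdlib `pickle`, UTF-8 via `torch.load`) and fails on byte `0xb1`, exactly as in the issue above, while `encoding='latin1'` maps every byte one-to-one. A stdlib-only sketch of the failure and the fallback (`load_legacy_pickle` and the hand-crafted pickle bytes are illustrative, not from the repo):

```python
import pickle

# Hand-crafted Python-2-style pickle: protocol 2, SHORT_BINSTRING ('U')
# carrying the single byte 0xb1 -- the byte named in the traceback above.
PY2_PICKLE = b'\x80\x02U\x01\xb1.'

def load_legacy_pickle(raw: bytes):
    """Illustrative helper: retry with latin1 when a Python 2 pickle
    contains bytes that are not valid text under the default decoding."""
    try:
        return pickle.loads(raw)                     # default decoding fails on 0xb1
    except UnicodeDecodeError:
        return pickle.loads(raw, encoding='latin1')  # byte-for-byte mapping, never fails

print(load_legacy_pickle(PY2_PICKLE))  # -> '±' (latin1 for 0xb1)
```

The same `encoding='latin1'` keyword is what the modified `torch.load` calls above pass through to the unpickler.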

and modify

self.donut_conv = nn.Conv2d(in_channels=in_channels, out_channels=out_channels, kernel_size=kernel_size,
padding=[kernel_size // 2 * 2, kernel_size // 2 * 2], padding_mode='circular', bias=bias)
self.tablet_conv = nn.Conv2d(in_channels=in_channels, out_channels=out_channels, kernel_size=kernel_size,
padding=[kernel_size // 2, kernel_size // 2], padding_mode='zero', bias=bias)

to

self.donut_conv = nn.Conv2d(in_channels=in_channels, out_channels=out_channels, kernel_size=kernel_size,
                            padding=[kernel_size // 2, kernel_size // 2], padding_mode='circular', bias=bias)
self.tablet_conv = nn.Conv2d(in_channels=in_channels, out_channels=out_channels, kernel_size=kernel_size,
                             padding=[kernel_size // 2, kernel_size // 2], padding_mode='zeros', bias=bias)
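For context on this second fix: PyTorch's `nn.Conv2d` only accepts the `padding_mode` values `'zeros'`, `'reflect'`, `'replicate'`, and `'circular'`, so `'zero'` raises an error, and with stride 1 a padding of `kernel_size // 2` (not `// 2 * 2`) preserves the spatial size. The two boundary rules the fix selects can be sketched in plain Python (the `pad1d_*` helpers are illustrative, not part of the GridConv code base):

```python
# Pure-Python sketch of the two boundary rules: circular wrap-around for the
# "donut" convolution versus zero fill for the "tablet" convolution.

def pad1d_circular(row, pad):
    # indices wrap around, so each padded end repeats the opposite edge
    n = len(row)
    return [row[(i - pad) % n] for i in range(n + 2 * pad)]

def pad1d_zeros(row, pad):
    # out-of-range positions are simply filled with zeros
    return [0] * pad + row + [0] * pad

row = [1, 2, 3, 4]
print(pad1d_circular(row, 1))  # -> [4, 1, 2, 3, 4, 1]
print(pad1d_zeros(row, 1))     # -> [0, 1, 2, 3, 4, 0]
```

The wrap-around variant is what makes the padded grid behave like a closed ("donut") surface, which matches the circular padding the donut convolution asks for.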

Difficulty reproducing the results

Thanks for your great work!
I'm puzzled by the results when trying to reproduce them from scratch following the README:
I don't know why the "error mean of 524892 samples" seems to plateau. Can you give me some advice?
Here's my training log (--device 3 is specific to my server; it just runs on cuda:3):

~/GridConv/src$ python main.py --exp hrnet_dgridconv-autogrids_5x5 \
>                --input hrnet --lifting_model dgridconv_autogrids \
>                --grid_shape 5 5 --num_block 2 --hidsize 256 \
>                --padding_mode c z --device 3

==================Options=================
{   'batch': 200,
    'ckpt': 'checkpoint',
    'data_rootdir': './data/',
    'device': 3,
    'dropout': 0.25,
    'epoch': 200,
    'eval': False,
    'exp': 'hrnet_dgridconv-autogrids_5x5',
    'grid_shape': [5, 5],
    'hidsize': 256,
    'input': 'hrnet',
    'lifting_model': 'dgridconv_autogrids',
    'load': None,
    'loss': 'l2',
    'lr': 0.001,
    'lr_decay': 1,
    'lr_gamma': 0.96,
    'max_temp': 30,
    'num_block': 2,
    'padding_mode': ['c', 'z'],
    'prepare_grid': False,
    'procrustes': False,
    'temp_epoch': 10,
    'test_batch': 1000}
==========================================

----------get dgridconv_autogrids model----------
>>> Loading dataset...
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 600/600 [00:14<00:00, 42.82it/s]
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 236/236 [00:04<00:00, 51.48it/s]
/home/yeke/miniconda3/envs/grid/lib/python3.11/site-packages/torch/optim/lr_scheduler.py:384: UserWarning: To get the last learning rate computed by the scheduler, please use `get_last_lr()`.
  warnings.warn("To get the last learning rate computed by the scheduler, "
==========================
>>> epoch: 1 | lr: 0.00100000
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:44 | ETA: 0:00:01 | loss: 0.035523
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:45 | ETA: 0:00:01 | loss: 0.034365
>>> error mean of 524892 samples: 204.549 <<<
>>> error by dim: x: 55.156,  y:145.706, z:84.967 <<<
==========================
>>> epoch: 2 | lr: 0.00092160
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:19 | ETA: 0:00:01 | loss: 0.030408
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033974
>>> error mean of 524892 samples: 196.909 <<<
>>> error by dim: x: 51.848,  y:142.857, z:79.948 <<<
==========================
>>> epoch: 3 | lr: 0.00088474
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:17 | ETA: 0:00:01 | loss: 0.030029
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033902
>>> error mean of 524892 samples: 195.202 <<<
>>> error by dim: x: 51.918,  y:142.067, z:78.403 <<<
==========================
>>> epoch: 4 | lr: 0.00084935
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:17 | ETA: 0:00:01 | loss: 0.029894
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033970
>>> error mean of 524892 samples: 196.226 <<<
>>> error by dim: x: 51.143,  y:144.104, z:78.772 <<<
==========================
>>> epoch: 5 | lr: 0.00081537
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 1e+02ms | Total: 0:12:18 | ETA: 0:00:01 | loss: 0.029822
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:49 | ETA: 0:00:01 | loss: 0.033919
>>> error mean of 524892 samples: 195.309 <<<
>>> error by dim: x: 51.484,  y:142.360, z:78.574 <<<
==========================
>>> epoch: 6 | lr: 0.00078276
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:18 | ETA: 0:00:01 | loss: 0.029778
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033885
>>> error mean of 524892 samples: 194.653 <<<
>>> error by dim: x: 51.286,  y:141.831, z:78.289 <<<
==========================
>>> epoch: 7 | lr: 0.00075145
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:17 | ETA: 0:00:01 | loss: 0.029747
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033885
>>> error mean of 524892 samples: 194.395 <<<
>>> error by dim: x: 51.217,  y:141.729, z:78.085 <<<
==========================
[... epochs 4-72 omitted: each epoch prints the same fields, the training loss creeps from 0.029894 down to 0.029542, and the evaluation "error mean of 524892 samples" stays between roughly 194 and 196 throughout ...]
==========================
>>> epoch: 73 | lr: 0.00005079
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:16 | ETA: 0:00:01 | loss: 0.029541
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:51 | ETA: 0:00:01 | loss: 0.033884
>>> error mean of 524892 samples: 194.295 <<<
>>> error by dim: x: 50.917,  y:142.148, z:77.854 <<<
==========================
>>> epoch: 74 | lr: 0.00004876
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:18 | ETA: 0:00:01 | loss: 0.029541
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033891
>>> error mean of 524892 samples: 194.479 <<<
>>> error by dim: x: 50.977,  y:142.358, z:77.893 <<<
==========================
>>> epoch: 75 | lr: 0.00004681
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:17 | ETA: 0:00:01 | loss: 0.029541
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:48 | ETA: 0:00:01 | loss: 0.033894
>>> error mean of 524892 samples: 194.590 <<<
>>> error by dim: x: 51.000,  y:142.722, z:77.765 <<<
==========================
>>> epoch: 76 | lr: 0.00004494
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 1e+02ms | Total: 0:12:18 | ETA: 0:00:01 | loss: 0.029540
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:49 | ETA: 0:00:01 | loss: 0.033890
>>> error mean of 524892 samples: 194.444 <<<
>>> error by dim: x: 50.932,  y:142.344, z:77.890 <<<
==========================
>>> epoch: 77 | lr: 0.00004314
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:17 | ETA: 0:00:01 | loss: 0.029540
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033892
>>> error mean of 524892 samples: 194.507 <<<
>>> error by dim: x: 51.014,  y:142.471, z:77.844 <<<
==========================
>>> epoch: 78 | lr: 0.00004141
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 1e+02ms | Total: 0:12:17 | ETA: 0:00:01 | loss: 0.029540
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033890
>>> error mean of 524892 samples: 194.504 <<<
>>> error by dim: x: 51.039,  y:142.487, z:77.802 <<<
==========================
>>> epoch: 79 | lr: 0.00003976
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 1e+02ms | Total: 0:12:17 | ETA: 0:00:01 | loss: 0.029540
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:47 | ETA: 0:00:01 | loss: 0.033892
>>> error mean of 524892 samples: 194.497 <<<
>>> error by dim: x: 50.939,  y:142.530, z:77.829 <<<
==========================
>>> epoch: 80 | lr: 0.00003817
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 1e+02ms | Total: 0:12:17 | ETA: 0:00:01 | loss: 0.029540
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:47 | ETA: 0:00:01 | loss: 0.033887
>>> error mean of 524892 samples: 194.412 <<<
>>> error by dim: x: 50.964,  y:142.462, z:77.756 <<<
==========================
>>> epoch: 81 | lr: 0.00003664
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 1e+02ms | Total: 0:12:18 | ETA: 0:00:01 | loss: 0.029539
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:50 | ETA: 0:00:01 | loss: 0.033894
>>> error mean of 524892 samples: 194.508 <<<
>>> error by dim: x: 50.954,  y:142.478, z:77.874 <<<
==========================
>>> epoch: 82 | lr: 0.00003518
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:18 | ETA: 0:00:01 | loss: 0.029539
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:46 | ETA: 0:00:01 | loss: 0.033890
>>> error mean of 524892 samples: 194.435 <<<
>>> error by dim: x: 50.958,  y:142.352, z:77.857 <<<
==========================
>>> epoch: 83 | lr: 0.00003377
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:17 | ETA: 0:00:01 | loss: 0.029539
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033888
>>> error mean of 524892 samples: 194.422 <<<
>>> error by dim: x: 51.017,  y:142.360, z:77.806 <<<
==========================
>>> epoch: 84 | lr: 0.00003242
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 1e+02ms | Total: 0:12:16 | ETA: 0:00:01 | loss: 0.029538
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033894
>>> error mean of 524892 samples: 194.578 <<<
>>> error by dim: x: 50.976,  y:142.566, z:77.874 <<<
==========================
>>> epoch: 85 | lr: 0.00003112
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:18 | ETA: 0:00:01 | loss: 0.029539
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:44 | ETA: 0:00:01 | loss: 0.033889
>>> error mean of 524892 samples: 194.556 <<<
>>> error by dim: x: 51.015,  y:142.569, z:77.824 <<<
==========================
>>> epoch: 86 | lr: 0.00002988
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 1e+02ms | Total: 0:12:18 | ETA: 0:00:01 | loss: 0.029538
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:45 | ETA: 0:00:01 | loss: 0.033891
>>> error mean of 524892 samples: 194.459 <<<
>>> error by dim: x: 50.943,  y:142.388, z:77.874 <<<
==========================
>>> epoch: 87 | lr: 0.00002868
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:18 | ETA: 0:00:01 | loss: 0.029538
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033897
>>> error mean of 524892 samples: 194.594 <<<
>>> error by dim: x: 50.974,  y:142.585, z:77.876 <<<
==========================
>>> epoch: 88 | lr: 0.00002753
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 1e+02ms | Total: 0:12:19 | ETA: 0:00:01 | loss: 0.029538
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:48 | ETA: 0:00:01 | loss: 0.033887
>>> error mean of 524892 samples: 194.400 <<<
>>> error by dim: x: 50.981,  y:142.323, z:77.825 <<<
==========================
>>> epoch: 89 | lr: 0.00002643
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 1e+02ms | Total: 0:12:18 | ETA: 0:00:01 | loss: 0.029538
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033890
>>> error mean of 524892 samples: 194.510 <<<
>>> error by dim: x: 50.967,  y:142.627, z:77.752 <<<
==========================
>>> epoch: 90 | lr: 0.00002538
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:18 | ETA: 0:00:01 | loss: 0.029538
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033894
>>> error mean of 524892 samples: 194.504 <<<
>>> error by dim: x: 50.983,  y:142.428, z:77.873 <<<
==========================
>>> epoch: 91 | lr: 0.00002436
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:18 | ETA: 0:00:01 | loss: 0.029537
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:50 | ETA: 0:00:01 | loss: 0.033889
>>> error mean of 524892 samples: 194.494 <<<
>>> error by dim: x: 50.975,  y:142.592, z:77.758 <<<
==========================
>>> epoch: 92 | lr: 0.00002339
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 1e+02ms | Total: 0:12:16 | ETA: 0:00:01 | loss: 0.029537
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:44 | ETA: 0:00:01 | loss: 0.033885
>>> error mean of 524892 samples: 194.365 <<<
>>> error by dim: x: 50.944,  y:142.372, z:77.775 <<<
==========================
>>> epoch: 93 | lr: 0.00002245
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:17 | ETA: 0:00:01 | loss: 0.029537
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033891
>>> error mean of 524892 samples: 194.460 <<<
>>> error by dim: x: 50.952,  y:142.454, z:77.823 <<<
==========================
>>> epoch: 94 | lr: 0.00002155
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 1e+02ms | Total: 0:12:18 | ETA: 0:00:01 | loss: 0.029537
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:42 | ETA: 0:00:01 | loss: 0.033897
>>> error mean of 524892 samples: 194.602 <<<
>>> error by dim: x: 50.991,  y:142.534, z:77.918 <<<
==========================
>>> epoch: 95 | lr: 0.00002069
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 1e+02ms | Total: 0:12:17 | ETA: 0:00:01 | loss: 0.029537
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:43 | ETA: 0:00:01 | loss: 0.033894
>>> error mean of 524892 samples: 194.489 <<<
>>> error by dim: x: 50.937,  y:142.373, z:77.920 <<<
==========================
>>> epoch: 96 | lr: 0.00001986
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (7799/7799) | batch: 9e+01ms | Total: 0:12:17 | ETA: 0:00:01 | loss: 0.029537
Inferring...
>>> |>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>| (525/525) | batch: 0e+00ms | Total: 0:01:43 | ETA: 0:00:01 | loss: 0.033892

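For reference, the `error mean` and `error by dim` lines in the log above report an MPJPE-style metric (mean per-joint position error over all samples, in millimeters) and the mean absolute error along each axis. A minimal sketch of how such numbers are typically computed (helper names are hypothetical, assuming `(N, J, 3)` arrays of predicted and ground-truth 3D joints in millimeters):

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean per-joint position error: Euclidean distance per joint,
    averaged over all joints and samples. pred, gt: (N, J, 3) in mm."""
    return np.linalg.norm(pred - gt, axis=-1).mean()

def error_by_dim(pred, gt):
    """Mean absolute error along each coordinate axis (x, y, z)."""
    return np.abs(pred - gt).mean(axis=(0, 1))

# Example: one sample, one joint, prediction at the origin
pred = np.zeros((1, 1, 3))
gt = np.array([[[3.0, 4.0, 0.0]]])
print(mpjpe(pred, gt))         # 5.0 (3-4-5 right triangle)
print(error_by_dim(pred, gt))  # [3. 4. 0.]
```

An error mean stuck near 194 mm across epochs, as in this log, usually indicates the model is not learning (e.g. corrupted input data), since well-trained lifters on Human3.6M report errors an order of magnitude lower.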