PixelShuffleUpBlock + ResSkip 512×512 Training log

Quick notes on this training log. Compared with 384×384, training at 512×512 takes roughly 2.28× longer; the first-epoch log numbers show the difference most clearly, and the gap is fairly large.
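
A note on the pieces named in the title: "PixelShuffleUpBlock" points to decoder upsampling done with PixelShuffle (sub-pixel convolution) rather than transposed convolution, and "ResSkip" points to residual-style skip connections from the encoder. The PyTorch sketch below is only my reconstruction of what one such stage could look like; the layer names, channel counts, normalization, and where the skip is fused are assumptions, not the actual implementation.

code (sketch):

import torch
import torch.nn as nn

class PixelShuffleUpBlock(nn.Module):
    """2x upsampling via sub-pixel convolution (assumed structure)."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        # The conv emits 4*out_ch channels; PixelShuffle(2) trades them for 2x spatial size.
        self.conv = nn.Conv2d(in_ch, out_ch * 4, kernel_size=3, padding=1)
        self.shuffle = nn.PixelShuffle(2)
        self.norm = nn.InstanceNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.norm(self.shuffle(self.conv(x))))

class ResSkipUpStage(nn.Module):
    """Upsample, then fuse the encoder skip feature back in residually (assumed)."""
    def __init__(self, in_ch, skip_ch, out_ch):
        super().__init__()
        self.up = PixelShuffleUpBlock(in_ch, out_ch)
        self.fuse = nn.Conv2d(out_ch + skip_ch, out_ch, kernel_size=3, padding=1)

    def forward(self, x, skip):
        up = self.up(x)
        # Residual skip: the fused (upsampled + encoder) features are added onto the upsampled path.
        return up + self.fuse(torch.cat([up, skip], dim=1))

For example, ResSkipUpStage(256, 128, 128) would take a 256-channel 64×64 feature map plus its 128-channel 128×128 encoder skip and produce a 128-channel 128×128 output.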

figure:

log:

Epoch: [  0], Batch: [   0/ 831] | Total Time: 3s
d_loss: 1.8555, g_loss: 48.5514, const_loss: 0.0003, l1_loss: 35.8118, fm_loss: 0.0314, perc_loss: 11.8152, edge: 0.1999
Checkpoint step 100 reached, but saving starts after step 200.
Epoch: [ 0], Batch: [ 100/ 831] | Total Time: 3m 42s
d_loss: 1.3870, g_loss: 41.4099, const_loss: 0.0001, l1_loss: 30.0919, fm_loss: 0.0299, perc_loss: 10.3960, edge: 0.1991
Epoch: [ 0], Batch: [ 200/ 831] | Total Time: 7m 29s
d_loss: 1.3868, g_loss: 36.1580, const_loss: 0.0001, l1_loss: 25.9894, fm_loss: 0.0238, perc_loss: 9.2647, edge: 0.1872
Epoch: [ 0], Batch: [ 300/ 831] | Total Time: 11m 14s
d_loss: 1.3878, g_loss: 39.6941, const_loss: 0.0001, l1_loss: 28.1615, fm_loss: 0.0257, perc_loss: 10.6019, edge: 0.2120
Epoch: [ 0], Batch: [ 400/ 831] | Total Time: 14m 55s
d_loss: 1.3868, g_loss: 38.5238, const_loss: 0.0001, l1_loss: 27.4452, fm_loss: 0.0246, perc_loss: 10.1615, edge: 0.1991
Epoch: [ 0], Batch: [ 500/ 831] | Total Time: 18m 36s
d_loss: 1.3868, g_loss: 38.4518, const_loss: 0.0002, l1_loss: 27.4393, fm_loss: 0.0242, perc_loss: 10.1010, edge: 0.1942
Epoch: [ 0], Batch: [ 600/ 831] | Total Time: 22m 17s
d_loss: 1.3872, g_loss: 37.0487, const_loss: 0.0001, l1_loss: 26.8958, fm_loss: 0.0240, perc_loss: 9.2513, edge: 0.1846
Epoch: [ 0], Batch: [ 700/ 831] | Total Time: 25m 58s
d_loss: 1.3867, g_loss: 38.7305, const_loss: 0.0001, l1_loss: 27.9832, fm_loss: 0.0224, perc_loss: 9.8449, edge: 0.1871
Epoch: [ 0], Batch: [ 800/ 831] | Total Time: 29m 39s
d_loss: 1.3868, g_loss: 37.9838, const_loss: 0.0004, l1_loss: 28.0558, fm_loss: 0.0239, perc_loss: 9.0325, edge: 0.1784
--- End of Epoch 0 --- Time: 1843.7s ---
LR Scheduler stepped. Current LR G: 0.000200, LR D: 0.000200
Epoch: [ 1], Batch: [ 0/ 831] | Total Time: 30m 45s
d_loss: 1.3882, g_loss: 36.0948, const_loss: 0.0002, l1_loss: 25.8642, fm_loss: 0.0187, perc_loss: 9.3354, edge: 0.1831
Epoch: [ 1], Batch: [ 100/ 831] | Total Time: 34m 30s
d_loss: 1.3868, g_loss: 36.8987, const_loss: 0.0002, l1_loss: 26.4981, fm_loss: 0.0205, perc_loss: 9.4850, edge: 0.2014
Epoch: [ 1], Batch: [ 200/ 831] | Total Time: 38m 10s
d_loss: 1.3867, g_loss: 36.2823, const_loss: 0.0001, l1_loss: 26.2852, fm_loss: 0.0202, perc_loss: 9.0958, edge: 0.1881
Epoch: [ 1], Batch: [ 300/ 831] | Total Time: 41m 52s
d_loss: 1.3868, g_loss: 39.1493, const_loss: 0.0001, l1_loss: 27.9802, fm_loss: 0.0206, perc_loss: 10.2437, edge: 0.2113
Epoch: [ 1], Batch: [ 400/ 831] | Total Time: 45m 33s
d_loss: 1.3869, g_loss: 35.6709, const_loss: 0.0001, l1_loss: 25.6619, fm_loss: 0.0182, perc_loss: 9.1091, edge: 0.1886
Epoch: [ 1], Batch: [ 500/ 831] | Total Time: 49m 19s
d_loss: 1.3872, g_loss: 33.1971, const_loss: 0.0002, l1_loss: 24.2608, fm_loss: 0.0166, perc_loss: 8.0567, edge: 0.1694
Epoch: [ 1], Batch: [ 600/ 831] | Total Time: 52m 59s
d_loss: 1.3867, g_loss: 37.4253, const_loss: 0.0002, l1_loss: 26.8524, fm_loss: 0.0191, perc_loss: 9.6647, edge: 0.1960
Epoch: [ 1], Batch: [ 700/ 831] | Total Time: 56m 40s
d_loss: 1.3870, g_loss: 37.0443, const_loss: 0.0001, l1_loss: 26.4690, fm_loss: 0.0181, perc_loss: 9.6677, edge: 0.1959
Epoch: [ 1], Batch: [ 800/ 831] | Total Time: 1h 24s
d_loss: 1.3868, g_loss: 36.6171, const_loss: 0.0001, l1_loss: 26.7109, fm_loss: 0.0185, perc_loss: 9.0182, edge: 0.1766
--- End of Epoch 1 --- Time: 1844.9s ---
LR Scheduler stepped. Current LR G: 0.000199, LR D: 0.000199
Epoch: [ 2], Batch: [ 0/ 831] | Total Time: 1h 1m 30s
d_loss: 1.3869, g_loss: 33.3594, const_loss: 0.0001, l1_loss: 23.8799, fm_loss: 0.0157, perc_loss: 8.5950, edge: 0.1758
Epoch: [ 2], Batch: [ 100/ 831] | Total Time: 1h 5m 15s
d_loss: 1.3867, g_loss: 32.8736, const_loss: 0.0001, l1_loss: 23.7862, fm_loss: 0.0139, perc_loss: 8.2090, edge: 0.1712
Epoch: [ 2], Batch: [ 200/ 831] | Total Time: 1h 8m 57s
d_loss: 1.3869, g_loss: 35.8764, const_loss: 0.0001, l1_loss: 25.9877, fm_loss: 0.0159, perc_loss: 8.9948, edge: 0.1847
Epoch: [ 2], Batch: [ 300/ 831] | Total Time: 1h 12m 37s
d_loss: 1.3873, g_loss: 27.9786, const_loss: 0.0003, l1_loss: 20.1063, fm_loss: 0.0107, perc_loss: 7.0191, edge: 0.1487
Epoch: [ 2], Batch: [ 400/ 831] | Total Time: 1h 16m 18s
d_loss: 1.3868, g_loss: 32.5896, const_loss: 0.0002, l1_loss: 22.9733, fm_loss: 0.0131, perc_loss: 8.7263, edge: 0.1835
Epoch: [ 2], Batch: [ 500/ 831] | Total Time: 1h 19m 59s
d_loss: 1.3876, g_loss: 31.8721, const_loss: 0.0001, l1_loss: 22.6266, fm_loss: 0.0123, perc_loss: 8.3598, edge: 0.1800
Epoch: [ 2], Batch: [ 600/ 831] | Total Time: 1h 23m 39s
d_loss: 1.3867, g_loss: 32.4699, const_loss: 0.0001, l1_loss: 23.2523, fm_loss: 0.0126, perc_loss: 8.3410, edge: 0.1706
Epoch: [ 2], Batch: [ 700/ 831] | Total Time: 1h 27m 25s
d_loss: 1.3867, g_loss: 31.5842, const_loss: 0.0001, l1_loss: 22.5872, fm_loss: 0.0118, perc_loss: 8.1213, edge: 0.1706
Epoch: [ 2], Batch: [ 800/ 831] | Total Time: 1h 31m 7s
d_loss: 1.3872, g_loss: 36.3781, const_loss: 0.0001, l1_loss: 25.8096, fm_loss: 0.0141, perc_loss: 9.6648, edge: 0.1961
--- End of Epoch 2 --- Time: 1843.3s ---
LR Scheduler stepped. Current LR G: 0.000197, LR D: 0.000197
Epoch: [ 3], Batch: [ 0/ 831] | Total Time: 1h 32m 14s
d_loss: 1.3867, g_loss: 32.6280, const_loss: 0.0001, l1_loss: 23.3588, fm_loss: 0.0116, perc_loss: 8.3858, edge: 0.1783
Epoch: [ 3], Batch: [ 100/ 831] | Total Time: 1h 35m 54s
d_loss: 1.3867, g_loss: 36.0779, const_loss: 0.0001, l1_loss: 26.0461, fm_loss: 0.0137, perc_loss: 9.1326, edge: 0.1921
Epoch: [ 3], Batch: [ 200/ 831] | Total Time: 1h 39m 34s
d_loss: 1.3869, g_loss: 32.3989, const_loss: 0.0002, l1_loss: 23.0922, fm_loss: 0.0122, perc_loss: 8.4207, edge: 0.1802
Epoch: [ 3], Batch: [ 300/ 831] | Total Time: 1h 43m 19s
d_loss: 1.3870, g_loss: 32.2563, const_loss: 0.0001, l1_loss: 23.1509, fm_loss: 0.0112, perc_loss: 8.2281, edge: 0.1727
Epoch: [ 3], Batch: [ 400/ 831] | Total Time: 1h 46m 59s
d_loss: 1.3875, g_loss: 32.0288, const_loss: 0.0001, l1_loss: 22.8382, fm_loss: 0.0105, perc_loss: 8.3199, edge: 0.1668
Epoch: [ 3], Batch: [ 500/ 831] | Total Time: 1h 50m 42s
d_loss: 1.3869, g_loss: 33.7018, const_loss: 0.0001, l1_loss: 24.1624, fm_loss: 0.0124, perc_loss: 8.6473, edge: 0.1863
Epoch: [ 3], Batch: [ 600/ 831] | Total Time: 1h 54m 23s
d_loss: 1.3869, g_loss: 35.0490, const_loss: 0.0002, l1_loss: 25.0036, fm_loss: 0.0114, perc_loss: 9.1460, edge: 0.1944
Epoch: [ 3], Batch: [ 700/ 831] | Total Time: 1h 58m 4s
d_loss: 1.3867, g_loss: 30.9307, const_loss: 0.0001, l1_loss: 22.2519, fm_loss: 0.0096, perc_loss: 7.8038, edge: 0.1720
Epoch: [ 3], Batch: [ 800/ 831] | Total Time: 2h 1m 45s
d_loss: 1.3874, g_loss: 31.3321, const_loss: 0.0001, l1_loss: 22.3879, fm_loss: 0.0106, perc_loss: 8.0673, edge: 0.1729
--- End of Epoch 3 --- Time: 1840.6s ---
LR Scheduler stepped. Current LR G: 0.000195, LR D: 0.000195
Epoch: [ 4], Batch: [ 0/ 831] | Total Time: 2h 2m 54s
d_loss: 1.3867, g_loss: 31.1261, const_loss: 0.0001, l1_loss: 22.3814, fm_loss: 0.0110, perc_loss: 7.8735, edge: 0.1668
Epoch: [ 4], Batch: [ 100/ 831] | Total Time: 2h 6m 35s
d_loss: 1.3867, g_loss: 34.3805, const_loss: 0.0001, l1_loss: 24.8031, fm_loss: 0.0119, perc_loss: 8.6855, edge: 0.1865
Epoch: [ 4], Batch: [ 200/ 831] | Total Time: 2h 10m 15s
d_loss: 1.3868, g_loss: 32.0789, const_loss: 0.0001, l1_loss: 22.9830, fm_loss: 0.0107, perc_loss: 8.2176, edge: 0.1741
Epoch: [ 4], Batch: [ 300/ 831] | Total Time: 2h 13m 56s
d_loss: 1.3873, g_loss: 34.2499, const_loss: 0.0001, l1_loss: 24.5459, fm_loss: 0.0112, perc_loss: 8.8121, edge: 0.1872
Epoch: [ 4], Batch: [ 400/ 831] | Total Time: 2h 17m 43s
d_loss: 1.3887, g_loss: 34.3537, const_loss: 0.0001, l1_loss: 24.6973, fm_loss: 0.0113, perc_loss: 8.7695, edge: 0.1828
Epoch: [ 4], Batch: [ 500/ 831] | Total Time: 2h 21m 28s
d_loss: 1.3867, g_loss: 30.5413, const_loss: 0.0001, l1_loss: 22.1016, fm_loss: 0.0096, perc_loss: 7.5840, edge: 0.1527
Epoch: [ 4], Batch: [ 600/ 831] | Total Time: 2h 25m 8s
d_loss: 1.3868, g_loss: 34.3100, const_loss: 0.0001, l1_loss: 24.8765, fm_loss: 0.0110, perc_loss: 8.5498, edge: 0.1793
Epoch: [ 4], Batch: [ 700/ 831] | Total Time: 2h 28m 49s
d_loss: 1.3870, g_loss: 37.7089, const_loss: 0.0000, l1_loss: 26.7227, fm_loss: 0.0137, perc_loss: 10.0693, edge: 0.2098
Epoch: [ 4], Batch: [ 800/ 831] | Total Time: 2h 32m 29s
d_loss: 1.3869, g_loss: 33.5865, const_loss: 0.0001, l1_loss: 24.3283, fm_loss: 0.0126, perc_loss: 8.3819, edge: 0.1707
--- End of Epoch 4 --- Time: 1841.9s ---
LR Scheduler stepped. Current LR G: 0.000192, LR D: 0.000192
Epoch: [ 5], Batch: [ 0/ 831] | Total Time: 2h 33m 36s
d_loss: 1.3873, g_loss: 31.7718, const_loss: 0.0001, l1_loss: 22.7510, fm_loss: 0.0100, perc_loss: 8.1445, edge: 0.1729
Epoch: [ 5], Batch: [ 100/ 831] | Total Time: 2h 37m 17s
d_loss: 1.3868, g_loss: 29.6126, const_loss: 0.0001, l1_loss: 21.1932, fm_loss: 0.0093, perc_loss: 7.5534, edge: 0.1632
Epoch: [ 5], Batch: [ 200/ 831] | Total Time: 2h 40m 58s
d_loss: 1.3867, g_loss: 28.3905, const_loss: 0.0000, l1_loss: 20.3083, fm_loss: 0.0085, perc_loss: 7.2206, edge: 0.1598
Epoch: [ 5], Batch: [ 300/ 831] | Total Time: 2h 44m 40s
d_loss: 1.3870, g_loss: 28.6722, const_loss: 0.0001, l1_loss: 20.5758, fm_loss: 0.0095, perc_loss: 7.2429, edge: 0.1506
Epoch: [ 5], Batch: [ 400/ 831] | Total Time: 2h 48m 21s
d_loss: 1.3872, g_loss: 30.3480, const_loss: 0.0001, l1_loss: 21.7726, fm_loss: 0.0092, perc_loss: 7.6991, edge: 0.1737
Epoch: [ 5], Batch: [ 500/ 831] | Total Time: 2h 52m 2s
d_loss: 1.3869, g_loss: 28.6966, const_loss: 0.0001, l1_loss: 20.8293, fm_loss: 0.0092, perc_loss: 7.0095, edge: 0.1552
Epoch: [ 5], Batch: [ 600/ 831] | Total Time: 2h 55m 49s
d_loss: 1.3867, g_loss: 30.3430, const_loss: 0.0001, l1_loss: 21.7186, fm_loss: 0.0090, perc_loss: 7.7619, edge: 0.1602
Epoch: [ 5], Batch: [ 700/ 831] | Total Time: 2h 59m 31s
d_loss: 1.3867, g_loss: 26.9955, const_loss: 0.0001, l1_loss: 19.2643, fm_loss: 0.0076, perc_loss: 6.8735, edge: 0.1568
Epoch: [ 5], Batch: [ 800/ 831] | Total Time: 3h 3m 12s
d_loss: 1.3870, g_loss: 30.2710, const_loss: 0.0001, l1_loss: 21.4690, fm_loss: 0.0085, perc_loss: 7.9313, edge: 0.1687
--- End of Epoch 5 --- Time: 1842.3s ---
LR Scheduler stepped. Current LR G: 0.000189, LR D: 0.000189
Epoch: [ 6], Batch: [ 0/ 831] | Total Time: 3h 4m 18s
d_loss: 1.3867, g_loss: 29.1717, const_loss: 0.0000, l1_loss: 20.8315, fm_loss: 0.0087, perc_loss: 7.4665, edge: 0.1716
Epoch: [ 6], Batch: [ 100/ 831] | Total Time: 3h 8m 0s
d_loss: 1.3868, g_loss: 28.7507, const_loss: 0.0000, l1_loss: 20.4751, fm_loss: 0.0085, perc_loss: 7.4092, edge: 0.1645
Epoch: [ 6], Batch: [ 200/ 831] | Total Time: 3h 11m 41s
d_loss: 1.3868, g_loss: 28.8108, const_loss: 0.0000, l1_loss: 20.3923, fm_loss: 0.0081, perc_loss: 7.5529, edge: 0.1643
Epoch: [ 6], Batch: [ 300/ 831] | Total Time: 3h 15m 22s
d_loss: 1.3868, g_loss: 26.9529, const_loss: 0.0000, l1_loss: 19.3393, fm_loss: 0.0073, perc_loss: 6.7591, edge: 0.1538
Epoch: [ 6], Batch: [ 400/ 831] | Total Time: 3h 19m 3s
d_loss: 1.3867, g_loss: 30.6014, const_loss: 0.0001, l1_loss: 21.8041, fm_loss: 0.0088, perc_loss: 7.9229, edge: 0.1723
Epoch: [ 6], Batch: [ 500/ 831] | Total Time: 3h 22m 43s
d_loss: 1.3868, g_loss: 30.8951, const_loss: 0.0001, l1_loss: 22.4039, fm_loss: 0.0092, perc_loss: 7.6097, edge: 0.1789
Epoch: [ 6], Batch: [ 600/ 831] | Total Time: 3h 26m 24s
d_loss: 1.3867, g_loss: 32.4369, const_loss: 0.0000, l1_loss: 23.1193, fm_loss: 0.0103, perc_loss: 8.4328, edge: 0.1816
Epoch: [ 6], Batch: [ 700/ 831] | Total Time: 3h 30m 5s
d_loss: 1.3867, g_loss: 29.6670, const_loss: 0.0001, l1_loss: 21.0724, fm_loss: 0.0085, perc_loss: 7.7221, edge: 0.1707
Epoch: [ 6], Batch: [ 800/ 831] | Total Time: 3h 33m 46s
d_loss: 1.3867, g_loss: 28.0461, const_loss: 0.0001, l1_loss: 20.1075, fm_loss: 0.0074, perc_loss: 7.0802, edge: 0.1576
--- End of Epoch 6 --- Time: 1836.4s ---
LR Scheduler stepped. Current LR G: 0.000185, LR D: 0.000185
Epoch: [ 7], Batch: [ 0/ 831] | Total Time: 3h 34m 55s
d_loss: 1.3867, g_loss: 27.3546, const_loss: 0.0001, l1_loss: 19.8167, fm_loss: 0.0079, perc_loss: 6.6833, edge: 0.1533
Epoch: [ 7], Batch: [ 100/ 831] | Total Time: 3h 38m 35s
d_loss: 1.3868, g_loss: 28.8592, const_loss: 0.0000, l1_loss: 20.6384, fm_loss: 0.0080, perc_loss: 7.3468, edge: 0.1726
Epoch: [ 7], Batch: [ 200/ 831] | Total Time: 3h 42m 21s
d_loss: 1.3868, g_loss: 28.2519, const_loss: 0.0000, l1_loss: 20.1436, fm_loss: 0.0087, perc_loss: 7.2524, edge: 0.1539
Epoch: [ 7], Batch: [ 300/ 831] | Total Time: 3h 46m 2s
d_loss: 1.3867, g_loss: 26.5044, const_loss: 0.0002, l1_loss: 19.0200, fm_loss: 0.0071, perc_loss: 6.6267, edge: 0.1570
Epoch: [ 7], Batch: [ 400/ 831] | Total Time: 3h 49m 48s
d_loss: 1.3868, g_loss: 30.1898, const_loss: 0.0001, l1_loss: 21.3035, fm_loss: 0.0088, perc_loss: 8.0104, edge: 0.1736
Epoch: [ 7], Batch: [ 500/ 831] | Total Time: 3h 53m 36s
d_loss: 1.3868, g_loss: 32.9112, const_loss: 0.0000, l1_loss: 23.4795, fm_loss: 0.0102, perc_loss: 8.5379, edge: 0.1902
Epoch: [ 7], Batch: [ 600/ 831] | Total Time: 3h 57m 22s
d_loss: 1.3868, g_loss: 32.9340, const_loss: 0.0001, l1_loss: 23.5750, fm_loss: 0.0115, perc_loss: 8.4768, edge: 0.1773
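
The "LR Scheduler stepped" lines shave the learning rate down a little each epoch (0.000200, 0.000199, 0.000197, 0.000195, ...). That shape is consistent with cosine annealing stepped once per epoch; for instance, CosineAnnealingLR starting at 2e-4 with T_max=40 reproduces the values printed in this block. The snippet below is a sketch under that guess only; the scheduler type, T_max, and the Adam betas are assumptions.

code (sketch):

import torch
import torch.nn as nn

# Stand-ins for the real generator/discriminator, just to keep the sketch runnable.
G = nn.Linear(8, 8)
D = nn.Linear(8, 8)

opt_G = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))

# Assumed schedule: cosine decay over 40 epochs, stepped once at the end of each epoch.
sched_G = torch.optim.lr_scheduler.CosineAnnealingLR(opt_G, T_max=40, eta_min=0.0)
sched_D = torch.optim.lr_scheduler.CosineAnnealingLR(opt_D, T_max=40, eta_min=0.0)

for epoch in range(8):
    # ... the full pass over all 831 (or 1009) batches would run here ...
    sched_G.step()
    sched_D.step()
    print(f"LR Scheduler stepped. Current LR G: {sched_G.get_last_lr()[0]:.6f}, "
          f"LR D: {sched_D.get_last_lr()[0]:.6f}")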

From 1154 to 1156: some training data was added this round, so each epoch takes longer (batches per epoch went from 831 to 1009).

figure:

log:

Epoch: [  0], Batch: [   0/1009] | Total Time: 2s
d_loss: 1.3868, g_loss: 30.8982, const_loss: 0.0000, l1_loss: 21.9767, fm_loss: 0.0096, perc_loss: 8.0531, edge: 0.1655
Checkpoint step 100 reached, but saving starts after step 200.
Epoch: [ 0], Batch: [ 100/1009] | Total Time: 3m 37s
d_loss: 1.3869, g_loss: 28.9803, const_loss: 0.0000, l1_loss: 20.4443, fm_loss: 0.0078, perc_loss: 7.6666, edge: 0.1682
Epoch: [ 0], Batch: [ 200/1009] | Total Time: 7m 21s
d_loss: 1.3867, g_loss: 30.1839, const_loss: 0.0000, l1_loss: 21.2023, fm_loss: 0.0082, perc_loss: 8.0982, edge: 0.1818
Epoch: [ 0], Batch: [ 300/1009] | Total Time: 11m 2s
d_loss: 1.3867, g_loss: 32.5938, const_loss: 0.0001, l1_loss: 23.2653, fm_loss: 0.0093, perc_loss: 8.4389, edge: 0.1869
Epoch: [ 0], Batch: [ 400/1009] | Total Time: 14m 49s
d_loss: 1.3868, g_loss: 29.0254, const_loss: 0.0000, l1_loss: 20.6883, fm_loss: 0.0080, perc_loss: 7.4740, edge: 0.1617
Epoch: [ 0], Batch: [ 500/1009] | Total Time: 18m 31s
d_loss: 1.3869, g_loss: 32.2797, const_loss: 0.0000, l1_loss: 23.0599, fm_loss: 0.0090, perc_loss: 8.3407, edge: 0.1768
Epoch: [ 0], Batch: [ 600/1009] | Total Time: 22m 12s
d_loss: 1.3867, g_loss: 28.9536, const_loss: 0.0001, l1_loss: 20.5581, fm_loss: 0.0077, perc_loss: 7.5244, edge: 0.1698
Epoch: [ 0], Batch: [ 700/1009] | Total Time: 25m 53s
d_loss: 1.3867, g_loss: 30.3983, const_loss: 0.0001, l1_loss: 21.6764, fm_loss: 0.0082, perc_loss: 7.8442, edge: 0.1761
Epoch: [ 0], Batch: [ 800/1009] | Total Time: 29m 34s
d_loss: 1.3869, g_loss: 30.7399, const_loss: 0.0000, l1_loss: 22.0017, fm_loss: 0.0097, perc_loss: 7.8638, edge: 0.1718
Epoch: [ 0], Batch: [ 900/1009] | Total Time: 33m 15s
d_loss: 1.3868, g_loss: 32.3250, const_loss: 0.0000, l1_loss: 22.9278, fm_loss: 0.0094, perc_loss: 8.5072, edge: 0.1871
Epoch: [ 0], Batch: [1000/1009] | Total Time: 36m 57s
d_loss: 1.3869, g_loss: 26.8201, const_loss: 0.0000, l1_loss: 19.0496, fm_loss: 0.0072, perc_loss: 6.9205, edge: 0.1495
--- End of Epoch 0 --- Time: 2234.1s ---
LR Scheduler stepped. Current LR G: 0.000190, LR D: 0.000190
Epoch: [ 1], Batch: [ 0/1009] | Total Time: 37m 16s
d_loss: 1.3868, g_loss: 28.3229, const_loss: 0.0000, l1_loss: 20.0727, fm_loss: 0.0067, perc_loss: 7.3894, edge: 0.1608
Epoch: [ 1], Batch: [ 100/1009] | Total Time: 41m 2s
d_loss: 1.3869, g_loss: 31.8044, const_loss: 0.0000, l1_loss: 22.6914, fm_loss: 0.0095, perc_loss: 8.2369, edge: 0.1732
Epoch: [ 1], Batch: [ 200/1009] | Total Time: 44m 44s
d_loss: 1.3867, g_loss: 29.9654, const_loss: 0.0000, l1_loss: 21.2983, fm_loss: 0.0078, perc_loss: 7.7923, edge: 0.1735
Epoch: [ 1], Batch: [ 300/1009] | Total Time: 48m 25s
d_loss: 1.3867, g_loss: 29.1098, const_loss: 0.0000, l1_loss: 20.8463, fm_loss: 0.0084, perc_loss: 7.3878, edge: 0.1740
Epoch: [ 1], Batch: [ 400/1009] | Total Time: 52m 8s
d_loss: 1.3882, g_loss: 26.3511, const_loss: 0.0000, l1_loss: 18.8556, fm_loss: 0.0071, perc_loss: 6.6387, edge: 0.1562
Epoch: [ 1], Batch: [ 500/1009] | Total Time: 55m 48s
d_loss: 1.3867, g_loss: 33.1218, const_loss: 0.0000, l1_loss: 23.4463, fm_loss: 0.0096, perc_loss: 8.7765, edge: 0.1965
Epoch: [ 1], Batch: [ 600/1009] | Total Time: 59m 29s
d_loss: 1.3869, g_loss: 27.1650, const_loss: 0.0000, l1_loss: 19.6032, fm_loss: 0.0068, perc_loss: 6.7061, edge: 0.1555
Epoch: [ 1], Batch: [ 700/1009] | Total Time: 1h 3m 16s
d_loss: 1.3867, g_loss: 34.4517, const_loss: 0.0000, l1_loss: 24.6461, fm_loss: 0.0096, perc_loss: 8.9072, edge: 0.1955
Epoch: [ 1], Batch: [ 800/1009] | Total Time: 1h 6m 57s
d_loss: 1.3867, g_loss: 25.9927, const_loss: 0.0000, l1_loss: 18.8615, fm_loss: 0.0065, perc_loss: 6.2929, edge: 0.1384
Epoch: [ 1], Batch: [ 900/1009] | Total Time: 1h 10m 38s
d_loss: 1.3867, g_loss: 30.7794, const_loss: 0.0000, l1_loss: 21.7066, fm_loss: 0.0082, perc_loss: 8.1862, edge: 0.1850
Epoch: [ 1], Batch: [1000/1009] | Total Time: 1h 14m 20s
d_loss: 1.3868, g_loss: 27.2384, const_loss: 0.0000, l1_loss: 19.2381, fm_loss: 0.0076, perc_loss: 7.1477, edge: 0.1516
--- End of Epoch 1 --- Time: 2243.0s ---
LR Scheduler stepped. Current LR G: 0.000189, LR D: 0.000189
Epoch: [ 2], Batch: [ 0/1009] | Total Time: 1h 14m 39s
d_loss: 1.3867, g_loss: 28.9539, const_loss: 0.0000, l1_loss: 20.7931, fm_loss: 0.0083, perc_loss: 7.2900, edge: 0.1690
Epoch: [ 2], Batch: [ 100/1009] | Total Time: 1h 18m 22s
d_loss: 1.3869, g_loss: 28.5101, const_loss: 0.0000, l1_loss: 20.3565, fm_loss: 0.0073, perc_loss: 7.2837, edge: 0.1692
Epoch: [ 2], Batch: [ 200/1009] | Total Time: 1h 22m 3s
d_loss: 1.3867, g_loss: 30.9734, const_loss: 0.0000, l1_loss: 22.0455, fm_loss: 0.0080, perc_loss: 8.0474, edge: 0.1792
Epoch: [ 2], Batch: [ 300/1009] | Total Time: 1h 25m 45s
d_loss: 1.3867, g_loss: 30.1032, const_loss: 0.0000, l1_loss: 21.5729, fm_loss: 0.0076, perc_loss: 7.6616, edge: 0.1676
Epoch: [ 2], Batch: [ 400/1009] | Total Time: 1h 29m 26s
d_loss: 1.3868, g_loss: 31.2766, const_loss: 0.0000, l1_loss: 22.1735, fm_loss: 0.0079, perc_loss: 8.2260, edge: 0.1759
Epoch: [ 2], Batch: [ 500/1009] | Total Time: 1h 33m 11s
d_loss: 1.3867, g_loss: 27.0743, const_loss: 0.0000, l1_loss: 19.2202, fm_loss: 0.0066, perc_loss: 6.9907, edge: 0.1634
Epoch: [ 2], Batch: [ 600/1009] | Total Time: 1h 36m 52s
d_loss: 1.3868, g_loss: 29.5774, const_loss: 0.0000, l1_loss: 21.1771, fm_loss: 0.0076, perc_loss: 7.5297, edge: 0.1697
Epoch: [ 2], Batch: [ 700/1009] | Total Time: 1h 40m 36s
d_loss: 1.3871, g_loss: 30.8395, const_loss: 0.0000, l1_loss: 22.0560, fm_loss: 0.0077, perc_loss: 7.9053, edge: 0.1776
Epoch: [ 2], Batch: [ 800/1009] | Total Time: 1h 44m 17s
d_loss: 1.3867, g_loss: 27.0825, const_loss: 0.0000, l1_loss: 19.2367, fm_loss: 0.0063, perc_loss: 6.9839, edge: 0.1623
Epoch: [ 2], Batch: [ 900/1009] | Total Time: 1h 47m 59s
d_loss: 1.3869, g_loss: 26.7615, const_loss: 0.0000, l1_loss: 19.2430, fm_loss: 0.0074, perc_loss: 6.6605, edge: 0.1572
Epoch: [ 2], Batch: [1000/1009] | Total Time: 1h 51m 40s
d_loss: 1.3867, g_loss: 29.6087, const_loss: 0.0000, l1_loss: 20.9553, fm_loss: 0.0074, perc_loss: 7.7765, edge: 0.1761
--- End of Epoch 2 --- Time: 2240.5s ---
LR Scheduler stepped. Current LR G: 0.000187, LR D: 0.000187
Epoch: [ 3], Batch: [ 0/1009] | Total Time: 1h 51m 59s
d_loss: 1.3867, g_loss: 29.7576, const_loss: 0.0000, l1_loss: 20.9788, fm_loss: 0.0074, perc_loss: 7.9003, edge: 0.1777
Epoch: [ 3], Batch: [ 100/1009] | Total Time: 1h 55m 40s
d_loss: 1.3868, g_loss: 27.0645, const_loss: 0.0000, l1_loss: 19.1410, fm_loss: 0.0077, perc_loss: 7.0605, edge: 0.1619
Epoch: [ 3], Batch: [ 200/1009] | Total Time: 1h 59m 25s
d_loss: 1.3868, g_loss: 27.4857, const_loss: 0.0000, l1_loss: 19.4952, fm_loss: 0.0071, perc_loss: 7.1304, edge: 0.1596
Epoch: [ 3], Batch: [ 300/1009] | Total Time: 2h 3m 6s
d_loss: 1.3869, g_loss: 30.4294, const_loss: 0.0000, l1_loss: 21.7812, fm_loss: 0.0087, perc_loss: 7.7735, edge: 0.1726
Epoch: [ 3], Batch: [ 400/1009] | Total Time: 2h 6m 47s
d_loss: 1.3867, g_loss: 30.4152, const_loss: 0.0000, l1_loss: 21.6118, fm_loss: 0.0085, perc_loss: 7.9246, edge: 0.1769
Epoch: [ 3], Batch: [ 500/1009] | Total Time: 2h 10m 28s
d_loss: 1.3868, g_loss: 24.7118, const_loss: 0.0000, l1_loss: 17.6776, fm_loss: 0.0064, perc_loss: 6.1923, edge: 0.1421
Epoch: [ 3], Batch: [ 600/1009] | Total Time: 2h 14m 10s
d_loss: 1.3870, g_loss: 27.6480, const_loss: 0.0000, l1_loss: 19.7870, fm_loss: 0.0069, perc_loss: 7.0050, edge: 0.1557
Epoch: [ 3], Batch: [ 700/1009] | Total Time: 2h 17m 50s
d_loss: 1.3872, g_loss: 29.1706, const_loss: 0.0000, l1_loss: 20.8029, fm_loss: 0.0063, perc_loss: 7.4896, edge: 0.1784
Epoch: [ 3], Batch: [ 800/1009] | Total Time: 2h 21m 36s
d_loss: 1.3867, g_loss: 27.0851, const_loss: 0.0000, l1_loss: 19.3615, fm_loss: 0.0058, perc_loss: 6.8649, edge: 0.1594
Epoch: [ 3], Batch: [ 900/1009] | Total Time: 2h 25m 17s
d_loss: 1.3867, g_loss: 29.4218, const_loss: 0.0000, l1_loss: 21.0527, fm_loss: 0.0071, perc_loss: 7.5092, edge: 0.1595
Epoch: [ 3], Batch: [1000/1009] | Total Time: 2h 29m 4s
d_loss: 1.3868, g_loss: 27.5423, const_loss: 0.0000, l1_loss: 19.8119, fm_loss: 0.0063, perc_loss: 6.8660, edge: 0.1648
--- End of Epoch 3 --- Time: 2243.7s ---
LR Scheduler stepped. Current LR G: 0.000185, LR D: 0.000185
Epoch: [ 4], Batch: [ 0/1009] | Total Time: 2h 29m 23s
d_loss: 1.3867, g_loss: 26.1920, const_loss: 0.0000, l1_loss: 18.7728, fm_loss: 0.0068, perc_loss: 6.5619, edge: 0.1571
Epoch: [ 4], Batch: [ 100/1009] | Total Time: 2h 33m 5s
d_loss: 1.3874, g_loss: 27.6825, const_loss: 0.0000, l1_loss: 19.5811, fm_loss: 0.0075, perc_loss: 7.2365, edge: 0.1641
Epoch: [ 4], Batch: [ 200/1009] | Total Time: 2h 36m 46s
d_loss: 1.3868, g_loss: 24.6662, const_loss: 0.0000, l1_loss: 17.5898, fm_loss: 0.0059, perc_loss: 6.2324, edge: 0.1447
Epoch: [ 4], Batch: [ 300/1009] | Total Time: 2h 40m 27s
d_loss: 1.3873, g_loss: 29.2333, const_loss: 0.0000, l1_loss: 20.7596, fm_loss: 0.0071, perc_loss: 7.5919, edge: 0.1814
Epoch: [ 4], Batch: [ 400/1009] | Total Time: 2h 44m 8s
d_loss: 1.3868, g_loss: 28.3472, const_loss: 0.0000, l1_loss: 20.1356, fm_loss: 0.0072, perc_loss: 7.3441, edge: 0.1669
Epoch: [ 4], Batch: [ 500/1009] | Total Time: 2h 47m 50s
d_loss: 1.3867, g_loss: 29.1811, const_loss: 0.0000, l1_loss: 20.5613, fm_loss: 0.0072, perc_loss: 7.7458, edge: 0.1735
Epoch: [ 4], Batch: [ 600/1009] | Total Time: 2h 51m 31s
d_loss: 1.3870, g_loss: 27.2661, const_loss: 0.0000, l1_loss: 19.4570, fm_loss: 0.0063, perc_loss: 6.9491, edge: 0.1603
Epoch: [ 4], Batch: [ 700/1009] | Total Time: 2h 55m 12s
d_loss: 1.3867, g_loss: 30.0584, const_loss: 0.0000, l1_loss: 21.2344, fm_loss: 0.0071, perc_loss: 7.9527, edge: 0.1709
Epoch: [ 4], Batch: [ 800/1009] | Total Time: 2h 58m 53s
d_loss: 1.3867, g_loss: 28.7596, const_loss: 0.0000, l1_loss: 20.2697, fm_loss: 0.0072, perc_loss: 7.6180, edge: 0.1718
Epoch: [ 4], Batch: [ 900/1009] | Total Time: 3h 2m 35s
d_loss: 1.3868, g_loss: 30.3219, const_loss: 0.0000, l1_loss: 21.4957, fm_loss: 0.0071, perc_loss: 7.9548, edge: 0.1709
Epoch: [ 4], Batch: [1000/1009] | Total Time: 3h 6m 16s
d_loss: 1.3867, g_loss: 26.5545, const_loss: 0.0000, l1_loss: 19.0008, fm_loss: 0.0061, perc_loss: 6.7011, edge: 0.1531
--- End of Epoch 4 --- Time: 2232.2s ---
LR Scheduler stepped. Current LR G: 0.000183, LR D: 0.000183
Epoch: [ 5], Batch: [ 0/1009] | Total Time: 3h 6m 35s
d_loss: 1.3869, g_loss: 25.9558, const_loss: 0.0000, l1_loss: 18.2952, fm_loss: 0.0059, perc_loss: 6.8037, edge: 0.1577
Epoch: [ 5], Batch: [ 100/1009] | Total Time: 3h 10m 18s
d_loss: 1.3869, g_loss: 27.9378, const_loss: 0.0000, l1_loss: 20.0886, fm_loss: 0.0065, perc_loss: 6.9790, edge: 0.1702
Epoch: [ 5], Batch: [ 200/1009] | Total Time: 3h 13m 59s
d_loss: 1.3867, g_loss: 25.6503, const_loss: 0.0000, l1_loss: 18.2971, fm_loss: 0.0062, perc_loss: 6.5019, edge: 0.1518
Epoch: [ 5], Batch: [ 300/1009] | Total Time: 3h 17m 40s
d_loss: 1.3868, g_loss: 25.8447, const_loss: 0.0000, l1_loss: 18.3820, fm_loss: 0.0058, perc_loss: 6.6138, edge: 0.1499
Epoch: [ 5], Batch: [ 400/1009] | Total Time: 3h 21m 21s
d_loss: 1.3869, g_loss: 25.8463, const_loss: 0.0000, l1_loss: 18.3924, fm_loss: 0.0065, perc_loss: 6.6053, edge: 0.1488
Epoch: [ 5], Batch: [ 500/1009] | Total Time: 3h 25m 3s
d_loss: 1.3869, g_loss: 27.0186, const_loss: 0.0000, l1_loss: 19.3287, fm_loss: 0.0066, perc_loss: 6.8275, edge: 0.1625
Epoch: [ 5], Batch: [ 600/1009] | Total Time: 3h 28m 44s
d_loss: 1.3868, g_loss: 24.8367, const_loss: 0.0000, l1_loss: 17.7645, fm_loss: 0.0050, perc_loss: 6.2275, edge: 0.1463
Epoch: [ 5], Batch: [ 700/1009] | Total Time: 3h 32m 25s
d_loss: 1.3869, g_loss: 26.6676, const_loss: 0.0000, l1_loss: 19.0766, fm_loss: 0.0059, perc_loss: 6.7345, edge: 0.1572
Epoch: [ 5], Batch: [ 800/1009] | Total Time: 3h 36m 6s
d_loss: 1.3867, g_loss: 25.8846, const_loss: 0.0000, l1_loss: 18.3836, fm_loss: 0.0063, perc_loss: 6.6498, edge: 0.1515
Epoch: [ 5], Batch: [ 900/1009] | Total Time: 3h 39m 49s
d_loss: 1.3867, g_loss: 28.1501, const_loss: 0.0000, l1_loss: 20.1098, fm_loss: 0.0070, perc_loss: 7.1771, edge: 0.1629
Epoch: [ 5], Batch: [1000/1009] | Total Time: 3h 43m 30s
d_loss: 1.3868, g_loss: 29.9697, const_loss: 0.0000, l1_loss: 21.3228, fm_loss: 0.0070, perc_loss: 7.7649, edge: 0.1817
--- End of Epoch 5 --- Time: 2234.3s ---
LR Scheduler stepped. Current LR G: 0.000180, LR D: 0.000180
Epoch: [ 6], Batch: [ 0/1009] | Total Time: 3h 43m 49s
d_loss: 1.3868, g_loss: 27.0016, const_loss: 0.0000, l1_loss: 19.3607, fm_loss: 0.0062, perc_loss: 6.7821, edge: 0.1593
Epoch: [ 6], Batch: [ 100/1009] | Total Time: 3h 47m 31s
d_loss: 1.3868, g_loss: 27.0326, const_loss: 0.0000, l1_loss: 19.3591, fm_loss: 0.0058, perc_loss: 6.8141, edge: 0.1603
Epoch: [ 6], Batch: [ 200/1009] | Total Time: 3h 51m 12s
d_loss: 1.3868, g_loss: 26.7903, const_loss: 0.0000, l1_loss: 19.2709, fm_loss: 0.0059, perc_loss: 6.6624, edge: 0.1576
Epoch: [ 6], Batch: [ 300/1009] | Total Time: 3h 54m 53s
d_loss: 1.3867, g_loss: 28.7907, const_loss: 0.0000, l1_loss: 20.7099, fm_loss: 0.0072, perc_loss: 7.2099, edge: 0.1703
Epoch: [ 6], Batch: [ 400/1009] | Total Time: 3h 58m 40s
d_loss: 1.3868, g_loss: 27.4810, const_loss: 0.0000, l1_loss: 19.6863, fm_loss: 0.0069, perc_loss: 6.9346, edge: 0.1598
Epoch: [ 6], Batch: [ 500/1009] | Total Time: 4h 2m 22s
d_loss: 1.3867, g_loss: 26.0709, const_loss: 0.0000, l1_loss: 18.5806, fm_loss: 0.0067, perc_loss: 6.6352, edge: 0.1551
Epoch: [ 6], Batch: [ 600/1009] | Total Time: 4h 6m 3s
d_loss: 1.3867, g_loss: 27.7621, const_loss: 0.0000, l1_loss: 19.8631, fm_loss: 0.0063, perc_loss: 7.0288, edge: 0.1706
Epoch: [ 6], Batch: [ 700/1009] | Total Time: 4h 9m 45s
d_loss: 1.3868, g_loss: 26.2892, const_loss: 0.0000, l1_loss: 18.5212, fm_loss: 0.0060, perc_loss: 6.9108, edge: 0.1578
Epoch: [ 6], Batch: [ 800/1009] | Total Time: 4h 13m 26s
d_loss: 1.3872, g_loss: 26.2446, const_loss: 0.0000, l1_loss: 18.6828, fm_loss: 0.0067, perc_loss: 6.7054, edge: 0.1563
Epoch: [ 6], Batch: [ 900/1009] | Total Time: 4h 17m 48s
d_loss: 1.3867, g_loss: 26.5457, const_loss: 0.0000, l1_loss: 18.9076, fm_loss: 0.0059, perc_loss: 6.7872, edge: 0.1517
Epoch: [ 6], Batch: [1000/1009] | Total Time: 4h 21m 29s
d_loss: 1.3868, g_loss: 26.4308, const_loss: 0.0000, l1_loss: 18.6193, fm_loss: 0.0056, perc_loss: 6.9545, edge: 0.1581
--- End of Epoch 6 --- Time: 2279.0s ---
LR Scheduler stepped. Current LR G: 0.000176, LR D: 0.000176
Epoch: [ 7], Batch: [ 0/1009] | Total Time: 4h 21m 49s
d_loss: 1.3867, g_loss: 26.4335, const_loss: 0.0000, l1_loss: 18.9572, fm_loss: 0.0055, perc_loss: 6.6257, edge: 0.1516
Epoch: [ 7], Batch: [ 100/1009] | Total Time: 4h 25m 33s
d_loss: 1.3867, g_loss: 26.3623, const_loss: 0.0000, l1_loss: 18.8974, fm_loss: 0.0054, perc_loss: 6.6119, edge: 0.1543
Epoch: [ 7], Batch: [ 200/1009] | Total Time: 4h 29m 15s
d_loss: 1.3868, g_loss: 25.3038, const_loss: 0.0000, l1_loss: 18.0112, fm_loss: 0.0052, perc_loss: 6.4438, edge: 0.1501
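
One consistent pattern across these blocks: g_loss equals the sum of the listed terms plus a near-constant 0.69, which matches a BCE adversarial term sitting at ln 2, and d_loss hovering around 1.387 ≈ 2 ln 2 likewise suggests the discriminator is stuck at an output of 0.5. The sketch below shows one common way such a composite generator loss is assembled; the loss weights, the Sobel edge operator, and the exact const/feature-matching/perceptual inputs are placeholders I assumed, not values taken from this code base.

code (sketch):

import torch
import torch.nn.functional as F

def sobel_edges(img):
    """Per-pixel gradient magnitude; one common basis for an edge loss (assumed)."""
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]], device=img.device)
    k = torch.stack([kx, kx.t()]).unsqueeze(1)  # shape (2, 1, 3, 3)
    g = F.conv2d(img.mean(dim=1, keepdim=True), k, padding=1)
    return g.pow(2).sum(dim=1, keepdim=True).add(1e-8).sqrt()

def generator_loss(fake, real, d_fake_logits, const_fake, const_real,
                   feats_fake, feats_real, perc_fake, perc_real,
                   w_const=15.0, w_l1=100.0, w_fm=10.0, w_perc=10.0, w_edge=10.0):
    # Adversarial term: about 0.693 (ln 2) whenever the discriminator outputs 0.5.
    adv = F.binary_cross_entropy_with_logits(d_fake_logits,
                                             torch.ones_like(d_fake_logits))
    const = w_const * F.mse_loss(const_fake, const_real)                        # const_loss
    l1 = w_l1 * F.l1_loss(fake, real)                                           # l1_loss
    fm = w_fm * sum(F.l1_loss(a, b) for a, b in zip(feats_fake, feats_real))    # fm_loss
    perc = w_perc * sum(F.l1_loss(a, b) for a, b in zip(perc_fake, perc_real))  # perc_loss
    edge = w_edge * F.l1_loss(sobel_edges(fake), sobel_edges(real))             # edge
    g_loss = adv + const + l1 + fm + perc + edge  # same sum structure as the logged g_loss
    return g_loss, {"const_loss": const, "l1_loss": l1, "fm_loss": fm,
                    "perc_loss": perc, "edge": edge}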

From 1156 to 1158: the numbers dropped further, which is gratifying.

log:

Epoch: [ 0], Batch: [ 0/1009] | Total Time: 4s
d_loss: 1.3867, g_loss: 27.4468, const_loss: 0.0000, l1_loss: 19.4603, fm_loss: 0.0061, perc_loss: 7.1275, edge: 0.1595
Checkpoint step 100 reached, but saving starts after step 200.
Epoch: [ 0], Batch: [ 100/1009] | Total Time: 3m 32s
d_loss: 1.3869, g_loss: 26.0118, const_loss: 0.0000, l1_loss: 18.4836, fm_loss: 0.0055, perc_loss: 6.6740, edge: 0.1554
Checkpoint saved at step 200
Epoch: [ 0], Batch: [ 200/1009] | Total Time: 7m 14s
d_loss: 1.3867, g_loss: 27.6323, const_loss: 0.0000, l1_loss: 19.5828, fm_loss: 0.0059, perc_loss: 7.1838, edge: 0.1664
Checkpoint saved at step 300
Epoch: [ 0], Batch: [ 300/1009] | Total Time: 10m 52s
d_loss: 1.3868, g_loss: 28.6344, const_loss: 0.0000, l1_loss: 20.5764, fm_loss: 0.0070, perc_loss: 7.1857, edge: 0.1721
Checkpoint saved at step 400
Epoch: [ 0], Batch: [ 400/1009] | Total Time: 14m 30s
d_loss: 1.3868, g_loss: 25.8924, const_loss: 0.0000, l1_loss: 18.4857, fm_loss: 0.0059, perc_loss: 6.5566, edge: 0.1510
Checkpoint saved at step 500
Epoch: [ 0], Batch: [ 500/1009] | Total Time: 18m 9s
d_loss: 1.3870, g_loss: 27.8126, const_loss: 0.0000, l1_loss: 19.8412, fm_loss: 0.0065, perc_loss: 7.1083, edge: 0.1633
Checkpoint saved at step 600
Epoch: [ 0], Batch: [ 600/1009] | Total Time: 21m 49s
d_loss: 1.3867, g_loss: 25.7959, const_loss: 0.0000, l1_loss: 18.4473, fm_loss: 0.0060, perc_loss: 6.4922, edge: 0.1569
Checkpoint saved at step 700
Epoch: [ 0], Batch: [ 700/1009] | Total Time: 25m 27s
d_loss: 1.3867, g_loss: 27.2869, const_loss: 0.0000, l1_loss: 19.5104, fm_loss: 0.0063, perc_loss: 6.9137, edge: 0.1631
Checkpoint saved at step 800
Epoch: [ 0], Batch: [ 800/1009] | Total Time: 29m 5s
d_loss: 1.3872, g_loss: 27.2351, const_loss: 0.0000, l1_loss: 19.5685, fm_loss: 0.0068, perc_loss: 6.8088, edge: 0.1577
Checkpoint saved at step 900
Epoch: [ 0], Batch: [ 900/1009] | Total Time: 32m 47s
d_loss: 1.3868, g_loss: 29.0892, const_loss: 0.0000, l1_loss: 20.7493, fm_loss: 0.0072, perc_loss: 7.4635, edge: 0.1759
Checkpoint saved at step 1000
Epoch: [ 0], Batch: [1000/1009] | Total Time: 36m 25s
d_loss: 1.3869, g_loss: 24.4382, const_loss: 0.0000, l1_loss: 17.4027, fm_loss: 0.0048, perc_loss: 6.1999, edge: 0.1375
--- End of Epoch 0 --- Time: 2202.0s ---
LR Scheduler stepped. Current LR G: 0.000190, LR D: 0.000190
Epoch: [ 1], Batch: [ 0/1009] | Total Time: 36m 44s
d_loss: 1.3867, g_loss: 25.4377, const_loss: 0.0000, l1_loss: 18.1201, fm_loss: 0.0052, perc_loss: 6.4641, edge: 0.1549
Checkpoint saved at step 1100
Epoch: [ 1], Batch: [ 100/1009] | Total Time: 40m 22s
d_loss: 1.3870, g_loss: 27.8938, const_loss: 0.0000, l1_loss: 19.7808, fm_loss: 0.0065, perc_loss: 7.2477, edge: 0.1655
Checkpoint saved at step 1200
Epoch: [ 1], Batch: [ 200/1009] | Total Time: 44m 0s
d_loss: 1.3867, g_loss: 28.4796, const_loss: 0.0000, l1_loss: 20.2706, fm_loss: 0.0059, perc_loss: 7.3381, edge: 0.1715
Checkpoint saved at step 1300
Epoch: [ 1], Batch: [ 300/1009] | Total Time: 47m 44s
d_loss: 1.3871, g_loss: 26.9728, const_loss: 0.0000, l1_loss: 19.4934, fm_loss: 0.0067, perc_loss: 6.6164, edge: 0.1629
Checkpoint saved at step 1400
Epoch: [ 1], Batch: [ 400/1009] | Total Time: 51m 23s
d_loss: 1.3879, g_loss: 25.1016, const_loss: 0.0000, l1_loss: 18.0415, fm_loss: 0.0055, perc_loss: 6.2116, edge: 0.1497
Checkpoint saved at step 1500
Epoch: [ 1], Batch: [ 500/1009] | Total Time: 55m 1s
d_loss: 1.3867, g_loss: 29.2177, const_loss: 0.0000, l1_loss: 20.8823, fm_loss: 0.0065, perc_loss: 7.4543, edge: 0.1812
Checkpoint saved at step 1600
Epoch: [ 1], Batch: [ 600/1009] | Total Time: 58m 39s
d_loss: 1.3868, g_loss: 26.1449, const_loss: 0.0000, l1_loss: 18.8758, fm_loss: 0.0058, perc_loss: 6.4136, edge: 0.1563
Checkpoint saved at step 1700
Epoch: [ 1], Batch: [ 700/1009] | Total Time: 1h 2m 19s
d_loss: 1.3867, g_loss: 29.8010, const_loss: 0.0000, l1_loss: 21.1752, fm_loss: 0.0069, perc_loss: 7.7447, edge: 0.1808
Checkpoint saved at step 1800
Epoch: [ 1], Batch: [ 800/1009] | Total Time: 1h 5m 57s
d_loss: 1.3867, g_loss: 23.6851, const_loss: 0.0000, l1_loss: 17.1872, fm_loss: 0.0052, perc_loss: 5.6700, edge: 0.1294
Checkpoint saved at step 1900
Epoch: [ 1], Batch: [ 900/1009] | Total Time: 1h 9m 35s
d_loss: 1.3867, g_loss: 28.4856, const_loss: 0.0000, l1_loss: 20.2635, fm_loss: 0.0065, perc_loss: 7.3490, edge: 0.1733
Checkpoint saved at step 2000
Epoch: [ 1], Batch: [1000/1009] | Total Time: 1h 13m 14s
d_loss: 1.3867, g_loss: 24.5244, const_loss: 0.0000, l1_loss: 17.3751, fm_loss: 0.0056, perc_loss: 6.3054, edge: 0.1449
--- End of Epoch 1 --- Time: 2208.9s ---
LR Scheduler stepped. Current LR G: 0.000189, LR D: 0.000189
Epoch: [ 2], Batch: [ 0/1009] | Total Time: 1h 13m 33s
d_loss: 1.3867, g_loss: 26.6472, const_loss: 0.0000, l1_loss: 19.2029, fm_loss: 0.0062, perc_loss: 6.5874, edge: 0.1573
Checkpoint saved at step 2100
Epoch: [ 2], Batch: [ 100/1009] | Total Time: 1h 17m 11s
d_loss: 1.3868, g_loss: 26.7355, const_loss: 0.0000, l1_loss: 19.1291, fm_loss: 0.0063, perc_loss: 6.7501, edge: 0.1566
Checkpoint saved at step 2200
Epoch: [ 2], Batch: [ 200/1009] | Total Time: 1h 20m 51s
d_loss: 1.3867, g_loss: 28.8581, const_loss: 0.0000, l1_loss: 20.6037, fm_loss: 0.0065, perc_loss: 7.3799, edge: 0.1747
Checkpoint saved at step 2300
Epoch: [ 2], Batch: [ 300/1009] | Total Time: 1h 24m 29s
d_loss: 1.3868, g_loss: 27.5763, const_loss: 0.0000, l1_loss: 19.8683, fm_loss: 0.0059, perc_loss: 6.8524, edge: 0.1563
Checkpoint saved at step 2400
Epoch: [ 2], Batch: [ 400/1009] | Total Time: 1h 28m 8s
d_loss: 1.3868, g_loss: 28.7304, const_loss: 0.0000, l1_loss: 20.3308, fm_loss: 0.0063, perc_loss: 7.5292, edge: 0.1707
Checkpoint saved at step 2500
Epoch: [ 2], Batch: [ 500/1009] | Total Time: 1h 31m 52s
d_loss: 1.3867, g_loss: 25.5138, const_loss: 0.0000, l1_loss: 18.2073, fm_loss: 0.0053, perc_loss: 6.4543, edge: 0.1535
Checkpoint saved at step 2600
Epoch: [ 2], Batch: [ 600/1009] | Total Time: 1h 35m 31s
d_loss: 1.3867, g_loss: 26.8008, const_loss: 0.0000, l1_loss: 19.4062, fm_loss: 0.0063, perc_loss: 6.5386, edge: 0.1563
Checkpoint saved at step 2700
Epoch: [ 2], Batch: [ 700/1009] | Total Time: 1h 39m 14s
d_loss: 1.3868, g_loss: 28.1078, const_loss: 0.0000, l1_loss: 20.1147, fm_loss: 0.0057, perc_loss: 7.1275, edge: 0.1666
Checkpoint saved at step 2800
Epoch: [ 2], Batch: [ 800/1009] | Total Time: 1h 42m 52s
d_loss: 1.3868, g_loss: 25.1592, const_loss: 0.0000, l1_loss: 17.9936, fm_loss: 0.0049, perc_loss: 6.3095, edge: 0.1579
Checkpoint saved at step 2900
Epoch: [ 2], Batch: [ 900/1009] | Total Time: 1h 46m 31s
d_loss: 1.3868, g_loss: 25.1547, const_loss: 0.0000, l1_loss: 18.0206, fm_loss: 0.0057, perc_loss: 6.2805, edge: 0.1551
Checkpoint saved at step 3000
Epoch: [ 2], Batch: [1000/1009] | Total Time: 1h 50m 9s
d_loss: 1.3867, g_loss: 27.5919, const_loss: 0.0000, l1_loss: 19.6670, fm_loss: 0.0058, perc_loss: 7.0602, edge: 0.1655
--- End of Epoch 2 --- Time: 2214.9s ---
LR Scheduler stepped. Current LR G: 0.000187, LR D: 0.000187
Epoch: [ 3], Batch: [ 0/1009] | Total Time: 1h 50m 27s
d_loss: 1.3868, g_loss: 27.2271, const_loss: 0.0000, l1_loss: 19.3584, fm_loss: 0.0056, perc_loss: 7.0062, edge: 0.1635
Checkpoint saved at step 3100
Epoch: [ 3], Batch: [ 100/1009] | Total Time: 1h 54m 10s
d_loss: 1.3868, g_loss: 25.5382, const_loss: 0.0000, l1_loss: 18.1504, fm_loss: 0.0058, perc_loss: 6.5283, edge: 0.1603
Checkpoint saved at step 3200
Epoch: [ 3], Batch: [ 200/1009] | Total Time: 1h 57m 49s
d_loss: 1.3868, g_loss: 24.9940, const_loss: 0.0000, l1_loss: 17.8663, fm_loss: 0.0049, perc_loss: 6.2795, edge: 0.1500
Checkpoint saved at step 3300
Epoch: [ 3], Batch: [ 300/1009] | Total Time: 2h 1m 27s
d_loss: 1.3868, g_loss: 27.6818, const_loss: 0.0000, l1_loss: 19.8678, fm_loss: 0.0061, perc_loss: 6.9518, edge: 0.1627
Checkpoint saved at step 3400
Epoch: [ 3], Batch: [ 400/1009] | Total Time: 2h 5m 5s
d_loss: 1.3868, g_loss: 27.5276, const_loss: 0.0000, l1_loss: 19.6712, fm_loss: 0.0057, perc_loss: 6.9935, edge: 0.1639
Checkpoint saved at step 3500
Epoch: [ 3], Batch: [ 500/1009] | Total Time: 2h 8m 43s
d_loss: 1.3867, g_loss: 23.2102, const_loss: 0.0000, l1_loss: 16.6920, fm_loss: 0.0049, perc_loss: 5.6766, edge: 0.1433
Checkpoint saved at step 3600
Epoch: [ 3], Batch: [ 600/1009] | Total Time: 2h 12m 21s
d_loss: 1.3869, g_loss: 25.9864, const_loss: 0.0000, l1_loss: 18.6269, fm_loss: 0.0051, perc_loss: 6.5096, edge: 0.1515
Checkpoint saved at step 3700
Epoch: [ 3], Batch: [ 700/1009] | Total Time: 2h 16m 1s
d_loss: 1.3873, g_loss: 27.4017, const_loss: 0.0000, l1_loss: 19.6077, fm_loss: 0.0055, perc_loss: 6.9278, edge: 0.1673
Checkpoint saved at step 3800
Epoch: [ 3], Batch: [ 800/1009] | Total Time: 2h 19m 39s
d_loss: 1.3867, g_loss: 25.0102, const_loss: 0.0000, l1_loss: 17.9769, fm_loss: 0.0048, perc_loss: 6.1759, edge: 0.1592
Checkpoint saved at step 3900
Epoch: [ 3], Batch: [ 900/1009] | Total Time: 2h 23m 21s
d_loss: 1.3868, g_loss: 26.4315, const_loss: 0.0000, l1_loss: 19.0407, fm_loss: 0.0057, perc_loss: 6.5392, edge: 0.1526
Checkpoint saved at step 4000
Epoch: [ 3], Batch: [1000/1009] | Total Time: 2h 27m 0s
d_loss: 1.3867, g_loss: 26.2646, const_loss: 0.0000, l1_loss: 18.9668, fm_loss: 0.0052, perc_loss: 6.4426, edge: 0.1566
--- End of Epoch 3 --- Time: 2211.2s ---
LR Scheduler stepped. Current LR G: 0.000185, LR D: 0.000185
Epoch: [ 4], Batch: [ 0/1009] | Total Time: 2h 27m 19s
d_loss: 1.3867, g_loss: 24.8697, const_loss: 0.0000, l1_loss: 17.9426, fm_loss: 0.0056, perc_loss: 6.0806, edge: 0.1475
Checkpoint saved at step 4100
Epoch: [ 4], Batch: [ 100/1009] | Total Time: 2h 31m 0s
d_loss: 1.3873, g_loss: 25.0928, const_loss: 0.0000, l1_loss: 17.9985, fm_loss: 0.0052, perc_loss: 6.2492, edge: 0.1465
Checkpoint saved at step 4200
Epoch: [ 4], Batch: [ 200/1009] | Total Time: 2h 34m 38s
d_loss: 1.3867, g_loss: 23.3064, const_loss: 0.0000, l1_loss: 16.7801, fm_loss: 0.0050, perc_loss: 5.6932, edge: 0.1347
Checkpoint saved at step 4300
Epoch: [ 4], Batch: [ 300/1009] | Total Time: 2h 38m 17s
d_loss: 1.3871, g_loss: 26.8814, const_loss: 0.0000, l1_loss: 19.2733, fm_loss: 0.0057, perc_loss: 6.7456, edge: 0.1634
Checkpoint saved at step 4400
Epoch: [ 4], Batch: [ 400/1009] | Total Time: 2h 42m 0s
d_loss: 1.3870, g_loss: 27.5083, const_loss: 0.0000, l1_loss: 19.6014, fm_loss: 0.0061, perc_loss: 7.0413, edge: 0.1662
Checkpoint saved at step 4500
Epoch: [ 4], Batch: [ 500/1009] | Total Time: 2h 45m 47s
d_loss: 1.3868, g_loss: 26.1615, const_loss: 0.0000, l1_loss: 18.6360, fm_loss: 0.0056, perc_loss: 6.6804, edge: 0.1462
Checkpoint saved at step 4600
Epoch: [ 4], Batch: [ 600/1009] | Total Time: 2h 49m 33s
d_loss: 1.3869, g_loss: 24.6412, const_loss: 0.0000, l1_loss: 17.7397, fm_loss: 0.0053, perc_loss: 6.0594, edge: 0.1435
Checkpoint saved at step 4700
Epoch: [ 4], Batch: [ 700/1009] | Total Time: 2h 53m 12s
d_loss: 1.3867, g_loss: 27.1844, const_loss: 0.0000, l1_loss: 19.2991, fm_loss: 0.0055, perc_loss: 7.0269, edge: 0.1595
Checkpoint saved at step 4800
Epoch: [ 4], Batch: [ 800/1009] | Total Time: 2h 56m 51s
d_loss: 1.3867, g_loss: 27.0156, const_loss: 0.0000, l1_loss: 19.1828, fm_loss: 0.0060, perc_loss: 6.9711, edge: 0.1628
Checkpoint saved at step 4900
Epoch: [ 4], Batch: [ 900/1009] | Total Time: 3h 30s
d_loss: 1.3868, g_loss: 27.8600, const_loss: 0.0000, l1_loss: 19.7286, fm_loss: 0.0054, perc_loss: 7.2718, edge: 0.1607
Checkpoint saved at step 5000
Epoch: [ 4], Batch: [1000/1009] | Total Time: 3h 4m 8s
d_loss: 1.3868, g_loss: 24.1605, const_loss: 0.0000, l1_loss: 17.3423, fm_loss: 0.0049, perc_loss: 5.9795, edge: 0.1404
--- End of Epoch 4 --- Time: 2228.7s ---
LR Scheduler stepped. Current LR G: 0.000183, LR D: 0.000183
Epoch: [ 5], Batch: [ 0/1009] | Total Time: 3h 4m 27s
d_loss: 1.3868, g_loss: 24.6256, const_loss: 0.0000, l1_loss: 17.4912, fm_loss: 0.0052, perc_loss: 6.2934, edge: 0.1424
Checkpoint saved at step 5100
Epoch: [ 5], Batch: [ 100/1009] | Total Time: 3h 8m 5s
d_loss: 1.3868, g_loss: 27.2796, const_loss: 0.0000, l1_loss: 19.7554, fm_loss: 0.0062, perc_loss: 6.6581, edge: 0.1666
Checkpoint saved at step 5200
Epoch: [ 5], Batch: [ 200/1009] | Total Time: 3h 11m 46s
d_loss: 1.3867, g_loss: 25.2086, const_loss: 0.0000, l1_loss: 18.0151, fm_loss: 0.0056, perc_loss: 6.3390, edge: 0.1555
Checkpoint saved at step 5300
Epoch: [ 5], Batch: [ 300/1009] | Total Time: 3h 15m 24s
d_loss: 1.3868, g_loss: 23.7966, const_loss: 0.0000, l1_loss: 17.0142, fm_loss: 0.0049, perc_loss: 5.9445, edge: 0.1397
Checkpoint saved at step 5400
Epoch: [ 5], Batch: [ 400/1009] | Total Time: 3h 19m 2s
d_loss: 1.3868, g_loss: 23.9481, const_loss: 0.0000, l1_loss: 17.1553, fm_loss: 0.0055, perc_loss: 5.9527, edge: 0.1412
Checkpoint saved at step 5500
Epoch: [ 5], Batch: [ 500/1009] | Total Time: 3h 22m 44s
d_loss: 1.3868, g_loss: 24.6401, const_loss: 0.0000, l1_loss: 17.8327, fm_loss: 0.0049, perc_loss: 5.9649, edge: 0.1442
Checkpoint saved at step 5600
Epoch: [ 5], Batch: [ 600/1009] | Total Time: 3h 26m 27s
d_loss: 1.3868, g_loss: 24.0142, const_loss: 0.0000, l1_loss: 17.2575, fm_loss: 0.0045, perc_loss: 5.9171, edge: 0.1418
Checkpoint saved at step 5700
Epoch: [ 5], Batch: [ 700/1009] | Total Time: 3h 30m 5s
d_loss: 1.3870, g_loss: 24.8988, const_loss: 0.0000, l1_loss: 17.9521, fm_loss: 0.0053, perc_loss: 6.1006, edge: 0.1475
Checkpoint saved at step 5800
Epoch: [ 5], Batch: [ 800/1009] | Total Time: 3h 33m 43s
d_loss: 1.3867, g_loss: 24.6327, const_loss: 0.0000, l1_loss: 17.5678, fm_loss: 0.0054, perc_loss: 6.2193, edge: 0.1468
Checkpoint saved at step 5900
Epoch: [ 5], Batch: [ 900/1009] | Total Time: 3h 37m 24s
d_loss: 1.3867, g_loss: 26.1928, const_loss: 0.0000, l1_loss: 18.7654, fm_loss: 0.0057, perc_loss: 6.5712, edge: 0.1572
Checkpoint saved at step 6000
Epoch: [ 5], Batch: [1000/1009] | Total Time: 3h 41m 7s
d_loss: 1.3868, g_loss: 26.9755, const_loss: 0.0000, l1_loss: 19.4256, fm_loss: 0.0053, perc_loss: 6.6837, edge: 0.1674
--- End of Epoch 5 --- Time: 2218.9s ---
LR Scheduler stepped. Current LR G: 0.000180, LR D: 0.000180
Epoch: [ 6], Batch: [ 0/1009] | Total Time: 3h 41m 26s
d_loss: 1.3867, g_loss: 25.8302, const_loss: 0.0000, l1_loss: 18.5985, fm_loss: 0.0049, perc_loss: 6.3772, edge: 0.1562
Checkpoint saved at step 6100
Epoch: [ 6], Batch: [ 100/1009] | Total Time: 3h 45m 4s
d_loss: 1.3867, g_loss: 24.5542, const_loss: 0.0000, l1_loss: 17.7908, fm_loss: 0.0048, perc_loss: 5.9185, edge: 0.1467
Checkpoint saved at step 6200
Epoch: [ 6], Batch: [ 200/1009] | Total Time: 3h 48m 42s
d_loss: 1.3868, g_loss: 25.1046, const_loss: 0.0000, l1_loss: 18.2424, fm_loss: 0.0050, perc_loss: 6.0237, edge: 0.1402
Checkpoint saved at step 6300
Epoch: [ 6], Batch: [ 300/1009] | Total Time: 3h 52m 25s
d_loss: 1.3867, g_loss: 26.1998, const_loss: 0.0000, l1_loss: 18.9886, fm_loss: 0.0054, perc_loss: 6.3564, edge: 0.1560
Checkpoint saved at step 6400
Epoch: [ 6], Batch: [ 400/1009] | Total Time: 3h 56m 3s
d_loss: 1.3867, g_loss: 25.4709, const_loss: 0.0000, l1_loss: 18.4774, fm_loss: 0.0052, perc_loss: 6.1472, edge: 0.1477
Checkpoint saved at step 6500
Epoch: [ 6], Batch: [ 500/1009] | Total Time: 3h 59m 45s
d_loss: 1.3867, g_loss: 24.1072, const_loss: 0.0000, l1_loss: 17.3882, fm_loss: 0.0048, perc_loss: 5.8741, edge: 0.1467
Checkpoint saved at step 6600
Epoch: [ 6], Batch: [ 600/1009] | Total Time: 4h 3m 26s
d_loss: 1.3867, g_loss: 25.9461, const_loss: 0.0000, l1_loss: 18.7469, fm_loss: 0.0050, perc_loss: 6.3451, edge: 0.1557
Checkpoint saved at step 6700
Epoch: [ 6], Batch: [ 700/1009] | Total Time: 4h 7m 9s
d_loss: 1.3868, g_loss: 25.5958, const_loss: 0.0000, l1_loss: 18.1268, fm_loss: 0.0052, perc_loss: 6.6175, edge: 0.1529
Checkpoint saved at step 6800
Epoch: [ 6], Batch: [ 800/1009] | Total Time: 4h 10m 52s
d_loss: 1.3870, g_loss: 24.7392, const_loss: 0.0000, l1_loss: 17.7183, fm_loss: 0.0053, perc_loss: 6.1796, edge: 0.1427
Checkpoint saved at step 6900
Epoch: [ 6], Batch: [ 900/1009] | Total Time: 4h 14m 36s
d_loss: 1.3868, g_loss: 24.2026, const_loss: 0.0000, l1_loss: 17.3675, fm_loss: 0.0046, perc_loss: 6.0009, edge: 0.1363
Checkpoint saved at step 7000
Epoch: [ 6], Batch: [1000/1009] | Total Time: 4h 18m 14s
d_loss: 1.3868, g_loss: 24.8848, const_loss: 0.0000, l1_loss: 17.6767, fm_loss: 0.0045, perc_loss: 6.3608, edge: 0.1494
--- End of Epoch 6 --- Time: 2226.2s ---
LR Scheduler stepped. Current LR G: 0.000176, LR D: 0.000176
Epoch: [ 7], Batch: [ 0/1009] | Total Time: 4h 18m 32s
d_loss: 1.3868, g_loss: 25.3357, const_loss: 0.0000, l1_loss: 18.2289, fm_loss: 0.0046, perc_loss: 6.2517, edge: 0.1571
Checkpoint saved at step 7100
Epoch: [ 7], Batch: [ 100/1009] | Total Time: 4h 22m 16s
d_loss: 1.3867, g_loss: 25.1645, const_loss: 0.0000, l1_loss: 18.1652, fm_loss: 0.0045, perc_loss: 6.1519, edge: 0.1496
Checkpoint saved at step 7200
Epoch: [ 7], Batch: [ 200/1009] | Total Time: 4h 25m 54s
d_loss: 1.3868, g_loss: 24.6297, const_loss: 0.0000, l1_loss: 17.5838, fm_loss: 0.0045, perc_loss: 6.2022, edge: 0.1458
Checkpoint saved at step 7300
Epoch: [ 7], Batch: [ 300/1009] | Total Time: 4h 29m 35s
d_loss: 1.3870, g_loss: 24.2667, const_loss: 0.0000, l1_loss: 17.3304, fm_loss: 0.0053, perc_loss: 6.0914, edge: 0.1462
Checkpoint saved at step 7400
Epoch: [ 7], Batch: [ 400/1009] | Total Time: 4h 33m 13s
d_loss: 1.3871, g_loss: 26.0116, const_loss: 0.0000, l1_loss: 18.6272, fm_loss: 0.0054, perc_loss: 6.5268, edge: 0.1588
Checkpoint saved at step 7500
Epoch: [ 7], Batch: [ 500/1009] | Total Time: 4h 36m 51s
d_loss: 1.3867, g_loss: 24.8220, const_loss: 0.0000, l1_loss: 17.9253, fm_loss: 0.0051, perc_loss: 6.0462, edge: 0.1521
Checkpoint saved at step 7600
Epoch: [ 7], Batch: [ 600/1009] | Total Time: 4h 40m 31s
d_loss: 1.3868, g_loss: 24.7958, const_loss: 0.0000, l1_loss: 17.7864, fm_loss: 0.0047, perc_loss: 6.1582, edge: 0.1537
Checkpoint saved at step 7700
Epoch: [ 7], Batch: [ 700/1009] | Total Time: 4h 44m 9s
d_loss: 1.3868, g_loss: 26.4558, const_loss: 0.0000, l1_loss: 18.9894, fm_loss: 0.0057, perc_loss: 6.6074, edge: 0.1600
Checkpoint saved at step 7800
Epoch: [ 7], Batch: [ 800/1009] | Total Time: 4h 47m 53s
d_loss: 1.3867, g_loss: 25.9501, const_loss: 0.0000, l1_loss: 18.4271, fm_loss: 0.0049, perc_loss: 6.6655, edge: 0.1592
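
The checkpoint messages in these logs ("Checkpoint step 100 reached, but saving starts after step 200." / "Checkpoint saved at step N") imply a simple gate: a checkpoint is considered every 100 global steps, but nothing is written before step 200. Below is a hedged reconstruction of that gate; the constant names and the save callback are placeholders, not the project's real code.

code (sketch):

CHECKPOINT_EVERY = 100   # consider a checkpoint every 100 global steps
CHECKPOINT_START = 200   # but only actually write one from step 200 onward

def maybe_save_checkpoint(global_step, save_fn):
    """Mirror the gating suggested by the log messages (assumed logic)."""
    if global_step == 0 or global_step % CHECKPOINT_EVERY != 0:
        return
    if global_step < CHECKPOINT_START:
        print(f"Checkpoint step {global_step} reached, "
              f"but saving starts after step {CHECKPOINT_START}.")
        return
    save_fn(global_step)
    print(f"Checkpoint saved at step {global_step}")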

From 1158 to 1160

log:

Epoch: [ 0], Batch: [ 0/1009] | Total Time: 4s
d_loss: 1.3867, g_loss: 24.7558, const_loss: 0.0000, l1_loss: 17.7667, fm_loss: 0.0048, perc_loss: 6.1467, edge: 0.1441
Checkpoint step 100 reached, but saving starts after step 200.
Epoch: [ 0], Batch: [ 100/1009] | Total Time: 3m 34s
d_loss: 1.3868, g_loss: 24.7391, const_loss: 0.0000, l1_loss: 17.6924, fm_loss: 0.0047, perc_loss: 6.1983, edge: 0.1504
Epoch: [ 0], Batch: [ 200/1009] | Total Time: 7m 18s
d_loss: 1.3868, g_loss: 27.1275, const_loss: 0.0000, l1_loss: 19.2378, fm_loss: 0.0054, perc_loss: 7.0147, edge: 0.1762
Epoch: [ 0], Batch: [ 300/1009] | Total Time: 10m 59s
d_loss: 1.3868, g_loss: 27.4094, const_loss: 0.0000, l1_loss: 19.7788, fm_loss: 0.0056, perc_loss: 6.7702, edge: 0.1614
Epoch: [ 0], Batch: [ 400/1009] | Total Time: 14m 40s
d_loss: 1.3868, g_loss: 24.2752, const_loss: 0.0000, l1_loss: 17.4419, fm_loss: 0.0049, perc_loss: 5.9890, edge: 0.1460
Epoch: [ 0], Batch: [ 500/1009] | Total Time: 18m 22s
d_loss: 1.3870, g_loss: 25.0448, const_loss: 0.0000, l1_loss: 18.0076, fm_loss: 0.0050, perc_loss: 6.1911, edge: 0.1477
Epoch: [ 0], Batch: [ 600/1009] | Total Time: 22m 8s
d_loss: 1.3867, g_loss: 24.4103, const_loss: 0.0000, l1_loss: 17.5830, fm_loss: 0.0051, perc_loss: 5.9827, edge: 0.1461
Epoch: [ 0], Batch: [ 700/1009] | Total Time: 25m 49s
d_loss: 1.3867, g_loss: 25.8083, const_loss: 0.0000, l1_loss: 18.5793, fm_loss: 0.0052, perc_loss: 6.3800, edge: 0.1505
Epoch: [ 0], Batch: [ 800/1009] | Total Time: 29m 30s
d_loss: 1.3869, g_loss: 25.4893, const_loss: 0.0000, l1_loss: 18.4792, fm_loss: 0.0057, perc_loss: 6.1680, edge: 0.1430
Epoch: [ 0], Batch: [ 900/1009] | Total Time: 33m 17s
d_loss: 1.3868, g_loss: 26.9654, const_loss: 0.0000, l1_loss: 19.3956, fm_loss: 0.0056, perc_loss: 6.7057, edge: 0.1651
Epoch: [ 0], Batch: [1000/1009] | Total Time: 36m 58s
d_loss: 1.3868, g_loss: 22.4203, const_loss: 0.0000, l1_loss: 16.0903, fm_loss: 0.0043, perc_loss: 5.5035, edge: 0.1288
--- End of Epoch 0 --- Time: 2235.3s ---
LR Scheduler stepped. Current LR G: 0.000181, LR D: 0.000181
Epoch: [ 1], Batch: [ 0/1009] | Total Time: 37m 17s
d_loss: 1.3867, g_loss: 23.7967, const_loss: 0.0000, l1_loss: 17.0454, fm_loss: 0.0041, perc_loss: 5.9146, edge: 0.1394
Epoch: [ 1], Batch: [ 100/1009] | Total Time: 41m 3s
d_loss: 1.3869, g_loss: 26.6760, const_loss: 0.0000, l1_loss: 18.9547, fm_loss: 0.0054, perc_loss: 6.8625, edge: 0.1600
Epoch: [ 1], Batch: [ 200/1009] | Total Time: 44m 44s
d_loss: 1.3867, g_loss: 26.2516, const_loss: 0.0000, l1_loss: 18.9546, fm_loss: 0.0048, perc_loss: 6.4399, edge: 0.1589
Epoch: [ 1], Batch: [ 300/1009] | Total Time: 48m 26s
d_loss: 1.3871, g_loss: 25.3624, const_loss: 0.0000, l1_loss: 18.4535, fm_loss: 0.0056, perc_loss: 6.0570, edge: 0.1530
Epoch: [ 1], Batch: [ 400/1009] | Total Time: 52m 7s
d_loss: 1.3872, g_loss: 24.2149, const_loss: 0.0000, l1_loss: 17.4126, fm_loss: 0.0047, perc_loss: 5.9602, edge: 0.1440
Epoch: [ 1], Batch: [ 500/1009] | Total Time: 55m 49s
d_loss: 1.3867, g_loss: 27.8639, const_loss: 0.0000, l1_loss: 20.0479, fm_loss: 0.0056, perc_loss: 6.9454, edge: 0.1717
Epoch: [ 1], Batch: [ 600/1009] | Total Time: 59m 34s
d_loss: 1.3869, g_loss: 23.8443, const_loss: 0.0000, l1_loss: 17.5190, fm_loss: 0.0047, perc_loss: 5.4886, edge: 0.1387
Epoch: [ 1], Batch: [ 700/1009] | Total Time: 1h 3m 15s
d_loss: 1.3867, g_loss: 28.0768, const_loss: 0.0000, l1_loss: 20.0954, fm_loss: 0.0061, perc_loss: 7.1141, edge: 0.1684
Epoch: [ 1], Batch: [ 800/1009] | Total Time: 1h 6m 58s
d_loss: 1.3867, g_loss: 21.4321, const_loss: 0.0000, l1_loss: 15.6936, fm_loss: 0.0044, perc_loss: 4.9261, edge: 0.1146
Epoch: [ 1], Batch: [ 900/1009] | Total Time: 1h 10m 43s
d_loss: 1.3867, g_loss: 26.3037, const_loss: 0.0000, l1_loss: 18.8999, fm_loss: 0.0059, perc_loss: 6.5425, edge: 0.1620
Epoch: [ 1], Batch: [1000/1009] | Total Time: 1h 14m 24s
d_loss: 1.3869, g_loss: 22.2951, const_loss: 0.0000, l1_loss: 15.9515, fm_loss: 0.0044, perc_loss: 5.5122, edge: 0.1336
--- End of Epoch 1 --- Time: 2245.9s ---
LR Scheduler stepped. Current LR G: 0.000180, LR D: 0.000180
Epoch: [ 2], Batch: [ 0/1009] | Total Time: 1h 14m 43s
d_loss: 1.3867, g_loss: 25.1559, const_loss: 0.0000, l1_loss: 18.2749, fm_loss: 0.0055, perc_loss: 6.0284, edge: 0.1536
Epoch: [ 2], Batch: [ 100/1009] | Total Time: 1h 18m 26s
d_loss: 1.3868, g_loss: 25.0787, const_loss: 0.0000, l1_loss: 18.1505, fm_loss: 0.0053, perc_loss: 6.0858, edge: 0.1437
Epoch: [ 2], Batch: [ 200/1009] | Total Time: 1h 22m 6s
d_loss: 1.3867, g_loss: 26.5349, const_loss: 0.0000, l1_loss: 19.1547, fm_loss: 0.0054, perc_loss: 6.5215, edge: 0.1598
Epoch: [ 2], Batch: [ 300/1009] | Total Time: 1h 25m 47s
d_loss: 1.3868, g_loss: 25.3208, const_loss: 0.0000, l1_loss: 18.4597, fm_loss: 0.0049, perc_loss: 6.0165, edge: 0.1463
Epoch: [ 2], Batch: [ 400/1009] | Total Time: 1h 29m 28s
d_loss: 1.3870, g_loss: 26.0720, const_loss: 0.0000, l1_loss: 18.6313, fm_loss: 0.0050, perc_loss: 6.5906, edge: 0.1517
Epoch: [ 2], Batch: [ 500/1009] | Total Time: 1h 33m 9s
d_loss: 1.3868, g_loss: 24.7633, const_loss: 0.0000, l1_loss: 17.7534, fm_loss: 0.0048, perc_loss: 6.1608, edge: 0.1510
Epoch: [ 2], Batch: [ 600/1009] | Total Time: 1h 36m 52s
d_loss: 1.3868, g_loss: 24.7543, const_loss: 0.0000, l1_loss: 18.1620, fm_loss: 0.0051, perc_loss: 5.7489, edge: 0.1449
Epoch: [ 2], Batch: [ 700/1009] | Total Time: 1h 40m 33s
d_loss: 1.3868, g_loss: 26.9685, const_loss: 0.0000, l1_loss: 19.4052, fm_loss: 0.0055, perc_loss: 6.7026, edge: 0.1618
Epoch: [ 2], Batch: [ 800/1009] | Total Time: 1h 44m 15s
d_loss: 1.3868, g_loss: 23.8598, const_loss: 0.0000, l1_loss: 17.2621, fm_loss: 0.0040, perc_loss: 5.7615, edge: 0.1388
Epoch: [ 2], Batch: [ 900/1009] | Total Time: 1h 47m 56s
d_loss: 1.3868, g_loss: 23.6864, const_loss: 0.0000, l1_loss: 17.1266, fm_loss: 0.0051, perc_loss: 5.7177, edge: 0.1436
Epoch: [ 2], Batch: [1000/1009] | Total Time: 1h 51m 37s
d_loss: 1.3867, g_loss: 25.6165, const_loss: 0.0000, l1_loss: 18.3797, fm_loss: 0.0050, perc_loss: 6.3849, edge: 0.1535
--- End of Epoch 2 --- Time: 2233.3s ---
LR Scheduler stepped. Current LR G: 0.000179, LR D: 0.000179
Epoch: [ 3], Batch: [ 0/1009] | Total Time: 1h 51m 56s
d_loss: 1.3867, g_loss: 25.1027, const_loss: 0.0000, l1_loss: 18.0438, fm_loss: 0.0048, perc_loss: 6.2178, edge: 0.1429
Epoch: [ 3], Batch: [ 100/1009] | Total Time: 1h 55m 39s
d_loss: 1.3869, g_loss: 24.2934, const_loss: 0.0000, l1_loss: 17.4046, fm_loss: 0.0051, perc_loss: 6.0406, edge: 0.1497
Epoch: [ 3], Batch: [ 200/1009] | Total Time: 1h 59m 21s
d_loss: 1.3868, g_loss: 23.9419, const_loss: 0.0000, l1_loss: 17.2449, fm_loss: 0.0047, perc_loss: 5.8589, edge: 0.1400
Epoch: [ 3], Batch: [ 300/1009] | Total Time: 2h 3m 2s
d_loss: 1.3868, g_loss: 25.9150, const_loss: 0.0000, l1_loss: 18.7444, fm_loss: 0.0053, perc_loss: 6.3226, edge: 0.1494
Epoch: [ 3], Batch: [ 400/1009] | Total Time: 2h 6m 43s
d_loss: 1.3868, g_loss: 25.9746, const_loss: 0.0000, l1_loss: 18.7037, fm_loss: 0.0052, perc_loss: 6.4146, edge: 0.1578
Epoch: [ 3], Batch: [ 500/1009] | Total Time: 2h 10m 24s
d_loss: 1.3867, g_loss: 22.6107, const_loss: 0.0000, l1_loss: 16.3498, fm_loss: 0.0047, perc_loss: 5.4357, edge: 0.1273
Epoch: [ 3], Batch: [ 600/1009] | Total Time: 2h 14m 4s
d_loss: 1.3868, g_loss: 25.0793, const_loss: 0.0000, l1_loss: 18.0654, fm_loss: 0.0051, perc_loss: 6.1632, edge: 0.1523
Epoch: [ 3], Batch: [ 700/1009] | Total Time: 2h 17m 46s
d_loss: 1.3870, g_loss: 26.5170, const_loss: 0.0000, l1_loss: 19.1081, fm_loss: 0.0058, perc_loss: 6.5433, edge: 0.1669
Epoch: [ 3], Batch: [ 800/1009] | Total Time: 2h 21m 27s
d_loss: 1.3867, g_loss: 23.5676, const_loss: 0.0000, l1_loss: 17.1551, fm_loss: 0.0042, perc_loss: 5.5722, edge: 0.1428
Epoch: [ 3], Batch: [ 900/1009] | Total Time: 2h 25m 8s
d_loss: 1.3867, g_loss: 24.1084, const_loss: 0.0000, l1_loss: 17.5736, fm_loss: 0.0047, perc_loss: 5.6985, edge: 0.1383
Epoch: [ 3], Batch: [1000/1009] | Total Time: 2h 28m 51s
d_loss: 1.3868, g_loss: 25.0465, const_loss: 0.0000, l1_loss: 18.2928, fm_loss: 0.0049, perc_loss: 5.9076, edge: 0.1479
--- End of Epoch 3 --- Time: 2233.9s ---
LR Scheduler stepped. Current LR G: 0.000177, LR D: 0.000177
Epoch: [ 4], Batch: [ 0/1009] | Total Time: 2h 29m 10s
d_loss: 1.3867, g_loss: 24.5002, const_loss: 0.0000, l1_loss: 17.6540, fm_loss: 0.0051, perc_loss: 5.9960, edge: 0.1518
Epoch: [ 4], Batch: [ 100/1009] | Total Time: 2h 32m 51s
d_loss: 1.3871, g_loss: 24.2080, const_loss: 0.0000, l1_loss: 17.4562, fm_loss: 0.0049, perc_loss: 5.9095, edge: 0.1441
Epoch: [ 4], Batch: [ 200/1009] | Total Time: 2h 36m 36s
d_loss: 1.3868, g_loss: 22.2481, const_loss: 0.0000, l1_loss: 16.1472, fm_loss: 0.0045, perc_loss: 5.2802, edge: 0.1228
Epoch: [ 4], Batch: [ 300/1009] | Total Time: 2h 40m 17s
d_loss: 1.3870, g_loss: 25.0682, const_loss: 0.0000, l1_loss: 18.2122, fm_loss: 0.0048, perc_loss: 6.0056, edge: 0.1523
Epoch: [ 4], Batch: [ 400/1009] | Total Time: 2h 43m 58s
d_loss: 1.3870, g_loss: 25.7276, const_loss: 0.0000, l1_loss: 18.5621, fm_loss: 0.0052, perc_loss: 6.3140, edge: 0.1530
Epoch: [ 4], Batch: [ 500/1009] | Total Time: 2h 47m 39s
d_loss: 1.3867, g_loss: 25.1335, const_loss: 0.0000, l1_loss: 18.0418, fm_loss: 0.0049, perc_loss: 6.2524, edge: 0.1410
Epoch: [ 4], Batch: [ 600/1009] | Total Time: 2h 51m 19s
d_loss: 1.3868, g_loss: 24.6278, const_loss: 0.0000, l1_loss: 17.7074, fm_loss: 0.0050, perc_loss: 6.0695, edge: 0.1530
Epoch: [ 4], Batch: [ 700/1009] | Total Time: 2h 55m 2s
d_loss: 1.3867, g_loss: 25.8059, const_loss: 0.0000, l1_loss: 18.4378, fm_loss: 0.0046, perc_loss: 6.5117, edge: 0.1584
Epoch: [ 4], Batch: [ 800/1009] | Total Time: 2h 58m 42s
d_loss: 1.3868, g_loss: 25.6562, const_loss: 0.0000, l1_loss: 18.4251, fm_loss: 0.0054, perc_loss: 6.3812, edge: 0.1512
Epoch: [ 4], Batch: [ 900/1009] | Total Time: 3h 2m 23s
d_loss: 1.3869, g_loss: 26.3204, const_loss: 0.0000, l1_loss: 18.7854, fm_loss: 0.0050, perc_loss: 6.6819, edge: 0.1547
Epoch: [ 4], Batch: [1000/1009] | Total Time: 3h 6m 3s
d_loss: 1.3868, g_loss: 23.3493, const_loss: 0.0000, l1_loss: 16.8301, fm_loss: 0.0044, perc_loss: 5.6818, edge: 0.1397
--- End of Epoch 4 --- Time: 2232.2s ---
LR Scheduler stepped. Current LR G: 0.000174, LR D: 0.000174
Epoch: [ 5], Batch: [ 0/1009] | Total Time: 3h 6m 22s
d_loss: 1.3868, g_loss: 23.4269, const_loss: 0.0000, l1_loss: 16.7323, fm_loss: 0.0046, perc_loss: 5.8573, edge: 0.1393
Epoch: [ 5], Batch: [ 100/1009] | Total Time: 3h 10m 4s
d_loss: 1.3869, g_loss: 25.2448, const_loss: 0.0000, l1_loss: 18.4365, fm_loss: 0.0050, perc_loss: 5.9572, edge: 0.1527
Epoch: [ 5], Batch: [ 200/1009] | Total Time: 3h 13m 44s
d_loss: 1.3867, g_loss: 23.7591, const_loss: 0.0000, l1_loss: 17.1357, fm_loss: 0.0048, perc_loss: 5.7857, edge: 0.1395
Epoch: [ 5], Batch: [ 300/1009] | Total Time: 3h 17m 25s
d_loss: 1.3867, g_loss: 21.8123, const_loss: 0.0000, l1_loss: 15.7882, fm_loss: 0.0039, perc_loss: 5.2006, edge: 0.1262
Epoch: [ 5], Batch: [ 400/1009] | Total Time: 3h 21m 8s
d_loss: 1.3868, g_loss: 22.3919, const_loss: 0.0000, l1_loss: 16.1566, fm_loss: 0.0045, perc_loss: 5.4044, edge: 0.1331
Epoch: [ 5], Batch: [ 500/1009] | Total Time: 3h 24m 49s
d_loss: 1.3869, g_loss: 23.2374, const_loss: 0.0000, l1_loss: 17.0338, fm_loss: 0.0042, perc_loss: 5.3715, edge: 0.1346
Epoch: [ 5], Batch: [ 600/1009] | Total Time: 3h 28m 31s
d_loss: 1.3868, g_loss: 22.3560, const_loss: 0.0000, l1_loss: 16.2686, fm_loss: 0.0040, perc_loss: 5.2530, edge: 0.1370
Epoch: [ 5], Batch: [ 700/1009] | Total Time: 3h 32m 12s
d_loss: 1.3870, g_loss: 23.0263, const_loss: 0.0000, l1_loss: 16.7302, fm_loss: 0.0041, perc_loss: 5.4665, edge: 0.1322
Epoch: [ 5], Batch: [ 800/1009] | Total Time: 3h 35m 52s
d_loss: 1.3867, g_loss: 23.1058, const_loss: 0.0000, l1_loss: 16.6536, fm_loss: 0.0046, perc_loss: 5.6191, edge: 0.1351
Epoch: [ 5], Batch: [ 900/1009] | Total Time: 3h 39m 33s
d_loss: 1.3867, g_loss: 25.3159, const_loss: 0.0000, l1_loss: 18.2485, fm_loss: 0.0054, perc_loss: 6.2177, edge: 0.1509
Epoch: [ 5], Batch: [1000/1009] | Total Time: 3h 43m 16s
d_loss: 1.3868, g_loss: 26.6028, const_loss: 0.0000, l1_loss: 19.1634, fm_loss: 0.0049, perc_loss: 6.5784, edge: 0.1627
--- End of Epoch 5 --- Time: 2232.4s ---
LR Scheduler stepped. Current LR G: 0.000171, LR D: 0.000171
Epoch: [ 6], Batch: [ 0/1009] | Total Time: 3h 43m 35s
d_loss: 1.3867, g_loss: 25.3590, const_loss: 0.0000, l1_loss: 18.3549, fm_loss: 0.0048, perc_loss: 6.1626, edge: 0.1434
Epoch: [ 6], Batch: [ 100/1009] | Total Time: 3h 47m 15s
d_loss: 1.3867, g_loss: 23.6972, const_loss: 0.0000, l1_loss: 17.2924, fm_loss: 0.0042, perc_loss: 5.5693, edge: 0.1379
Epoch: [ 6], Batch: [ 200/1009] | Total Time: 3h 50m 58s
d_loss: 1.3867, g_loss: 24.2893, const_loss: 0.0000, l1_loss: 17.7183, fm_loss: 0.0048, perc_loss: 5.7328, edge: 0.1401
Epoch: [ 6], Batch: [ 300/1009] | Total Time: 3h 54m 40s
d_loss: 1.3868, g_loss: 24.9951, const_loss: 0.0000, l1_loss: 18.2343, fm_loss: 0.0052, perc_loss: 5.9156, edge: 0.1467
Epoch: [ 6], Batch: [ 400/1009] | Total Time: 3h 58m 20s
d_loss: 1.3869, g_loss: 25.3706, const_loss: 0.0000, l1_loss: 18.3712, fm_loss: 0.0056, perc_loss: 6.1521, edge: 0.1484
Epoch: [ 6], Batch: [ 500/1009] | Total Time: 4h 2m 1s
d_loss: 1.3867, g_loss: 23.5549, const_loss: 0.0000, l1_loss: 17.0930, fm_loss: 0.0052, perc_loss: 5.6225, edge: 0.1409
Epoch: [ 6], Batch: [ 600/1009] | Total Time: 4h 5m 42s
d_loss: 1.3867, g_loss: 25.6265, const_loss: 0.0000, l1_loss: 18.5552, fm_loss: 0.0051, perc_loss: 6.2152, edge: 0.1576
Epoch: [ 6], Batch: [ 700/1009] | Total Time: 4h 9m 22s
d_loss: 1.3868, g_loss: 24.3633, const_loss: 0.0000, l1_loss: 17.3400, fm_loss: 0.0052, perc_loss: 6.1761, edge: 0.1486
Epoch: [ 6], Batch: [ 800/1009] | Total Time: 4h 13m 3s
d_loss: 1.3873, g_loss: 23.8975, const_loss: 0.0000, l1_loss: 17.2128, fm_loss: 0.0053, perc_loss: 5.8418, edge: 0.1442
Epoch: [ 6], Batch: [ 900/1009] | Total Time: 4h 16m 44s
d_loss: 1.3868, g_loss: 22.9879, const_loss: 0.0000, l1_loss: 16.6665, fm_loss: 0.0045, perc_loss: 5.4901, edge: 0.1335
Epoch: [ 6], Batch: [1000/1009] | Total Time: 4h 20m 34s
d_loss: 1.3867, g_loss: 23.7422, const_loss: 0.0000, l1_loss: 16.9704, fm_loss: 0.0045, perc_loss: 5.9321, edge: 0.1418
--- End of Epoch 6 --- Time: 2238.0s ---
LR Scheduler stepped. Current LR G: 0.000168, LR D: 0.000168
Epoch: [ 7], Batch: [ 0/1009] | Total Time: 4h 20m 53s
d_loss: 1.3867, g_loss: 24.0875, const_loss: 0.0000, l1_loss: 17.5073, fm_loss: 0.0042, perc_loss: 5.7361, edge: 0.1465
Epoch: [ 7], Batch: [ 100/1009] | Total Time: 4h 24m 34s
d_loss: 1.3867, g_loss: 23.8644, const_loss: 0.0000, l1_loss: 17.3827, fm_loss: 0.0042, perc_loss: 5.6385, edge: 0.1456


from 1162 to 1164: adjusted the training dataset.

While this run was still training, I ran inference on some of the training data that was only added later, and unexpectedly found a few incorrect training samples: at inference time the model actually produced the correct result. The problematic component is "大"; in cjktc its strokes are separate, while in cjkjp it is one continuous block. Previously I took results inferred from cjkjp, added them to the training data, and reused them directly for cjktc as well. Looking at it now, that approach has room for improvement. The SOP for bringing cjkjp inference results into cjktc should be:

  • First convert the cjkjp glyphs into paired images with font_image_combiner.py.
  • Manually confirm that the paired images are correct:
    • If an image has problems, fix it directly in an image editor.
    • If a component needs to reference cjkjp, it can be generated with generate_glyphs.py.
  • Then run the paired images through crop_images.py to get the glyphs back (a rough sketch of the combine/crop steps follows this list).
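
The combine and crop steps above are done by my own helper scripts, whose actual interfaces are not shown in this post, so the sketch below is only a hypothetical illustration: it assumes a simple side-by-side source|target layout at the 512×512 training resolution and uses Pillow, and function names like combine_pair / crop_pair are made up for the example.

# Hypothetical sketch of the paired-image combine / crop steps (not the real
# font_image_combiner.py / crop_images.py); layout and names are assumptions.
from PIL import Image

CANVAS = 512  # matches the 512x512 training resolution used in this run

def combine_pair(src_glyph: str, tgt_glyph: str, out_path: str) -> None:
    # Paste the source and target glyphs side by side into one paired image,
    # which can then be inspected and fixed in an image editor.
    src = Image.open(src_glyph).convert("L").resize((CANVAS, CANVAS))
    tgt = Image.open(tgt_glyph).convert("L").resize((CANVAS, CANVAS))
    pair = Image.new("L", (CANVAS * 2, CANVAS), color=255)  # white background
    pair.paste(src, (0, 0))
    pair.paste(tgt, (CANVAS, 0))
    pair.save(out_path)

def crop_pair(pair_path: str, src_out: str, tgt_out: str) -> None:
    # Reverse step: split a (manually corrected) paired image back into glyphs.
    pair = Image.open(pair_path).convert("L")
    w, h = pair.size
    pair.crop((0, 0, w // 2, h)).save(src_out)
    pair.crop((w // 2, 0, w, h)).save(tgt_out)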

log:

Epoch: [  0], Batch: [   0/ 754] | Total Time: 4s
d_loss: 1.3868, g_loss: 23.7721, const_loss: 0.0000, l1_loss: 17.3004, fm_loss: 0.0044, perc_loss: 5.6309, edge: 0.1431
Checkpoint step 100 reached, but saving starts after step 200.
Epoch: [ 0], Batch: [ 100/ 754] | Total Time: 3m 47s
d_loss: 1.3868, g_loss: 21.3491, const_loss: 0.0000, l1_loss: 15.4901, fm_loss: 0.0038, perc_loss: 5.0341, edge: 0.1276
Epoch: [ 0], Batch: [ 200/ 754] | Total Time: 7m 34s
d_loss: 1.3867, g_loss: 24.3483, const_loss: 0.0000, l1_loss: 17.7561, fm_loss: 0.0045, perc_loss: 5.7519, edge: 0.1425
Epoch: [ 0], Batch: [ 300/ 754] | Total Time: 11m 19s
d_loss: 1.3868, g_loss: 25.2301, const_loss: 0.0000, l1_loss: 18.1125, fm_loss: 0.0044, perc_loss: 6.2608, edge: 0.1589
Epoch: [ 0], Batch: [ 400/ 754] | Total Time: 15m 9s
d_loss: 1.3869, g_loss: 24.2669, const_loss: 0.0000, l1_loss: 17.7740, fm_loss: 0.0043, perc_loss: 5.6484, edge: 0.1469
Epoch: [ 0], Batch: [ 500/ 754] | Total Time: 18m 55s
d_loss: 1.3870, g_loss: 26.5581, const_loss: 0.0000, l1_loss: 19.4427, fm_loss: 0.0052, perc_loss: 6.2565, edge: 0.1604
Epoch: [ 0], Batch: [ 600/ 754] | Total Time: 22m 41s
d_loss: 1.3867, g_loss: 26.9234, const_loss: 0.0000, l1_loss: 19.3161, fm_loss: 0.0056, perc_loss: 6.7420, edge: 0.1663
Epoch: [ 0], Batch: [ 700/ 754] | Total Time: 26m 26s
d_loss: 1.3868, g_loss: 25.2209, const_loss: 0.0000, l1_loss: 18.4877, fm_loss: 0.0051, perc_loss: 5.8827, edge: 0.1521
--- End of Epoch 0 --- Time: 1704.6s ---
LR Scheduler stepped. Current LR G: 0.000169, LR D: 0.000169
Epoch: [ 1], Batch: [ 0/ 754] | Total Time: 28m 26s
d_loss: 1.3868, g_loss: 23.3075, const_loss: 0.0000, l1_loss: 16.9781, fm_loss: 0.0047, perc_loss: 5.4975, edge: 0.1339
Epoch: [ 1], Batch: [ 100/ 754] | Total Time: 32m 12s
d_loss: 1.3870, g_loss: 23.8906, const_loss: 0.0000, l1_loss: 17.5084, fm_loss: 0.0041, perc_loss: 5.5397, edge: 0.1451
Epoch: [ 1], Batch: [ 200/ 754] | Total Time: 35m 58s
d_loss: 1.3868, g_loss: 26.0178, const_loss: 0.0000, l1_loss: 18.6564, fm_loss: 0.0051, perc_loss: 6.5041, edge: 0.1587
Epoch: [ 1], Batch: [ 300/ 754] | Total Time: 39m 47s
d_loss: 1.3867, g_loss: 22.5149, const_loss: 0.0000, l1_loss: 16.3779, fm_loss: 0.0045, perc_loss: 5.3097, edge: 0.1294
Epoch: [ 1], Batch: [ 400/ 754] | Total Time: 43m 32s
d_loss: 1.3868, g_loss: 24.3155, const_loss: 0.0000, l1_loss: 17.5798, fm_loss: 0.0044, perc_loss: 5.8880, edge: 0.1499
Epoch: [ 1], Batch: [ 500/ 754] | Total Time: 47m 24s
d_loss: 1.3867, g_loss: 22.8659, const_loss: 0.0000, l1_loss: 16.6164, fm_loss: 0.0043, perc_loss: 5.4104, edge: 0.1415
Epoch: [ 1], Batch: [ 600/ 754] | Total Time: 51m 9s
d_loss: 1.3867, g_loss: 24.5221, const_loss: 0.0000, l1_loss: 17.8462, fm_loss: 0.0048, perc_loss: 5.8345, edge: 0.1433
Epoch: [ 1], Batch: [ 700/ 754] | Total Time: 54m 55s
d_loss: 1.3867, g_loss: 22.9477, const_loss: 0.0000, l1_loss: 16.8106, fm_loss: 0.0045, perc_loss: 5.3072, edge: 0.1321
--- End of Epoch 1 --- Time: 1712.7s ---
LR Scheduler stepped. Current LR G: 0.000168, LR D: 0.000168
Epoch: [ 2], Batch: [ 0/ 754] | Total Time: 56m 59s
d_loss: 1.3867, g_loss: 22.5367, const_loss: 0.0000, l1_loss: 16.5993, fm_loss: 0.0048, perc_loss: 5.1020, edge: 0.1373
Epoch: [ 2], Batch: [ 100/ 754] | Total Time: 1h 45s
d_loss: 1.3870, g_loss: 24.7297, const_loss: 0.0000, l1_loss: 17.7633, fm_loss: 0.0050, perc_loss: 6.1132, edge: 0.1549
Epoch: [ 2], Batch: [ 200/ 754] | Total Time: 1h 4m 32s
d_loss: 1.3867, g_loss: 26.0871, const_loss: 0.0000, l1_loss: 18.7976, fm_loss: 0.0051, perc_loss: 6.4264, edge: 0.1647
Epoch: [ 2], Batch: [ 300/ 754] | Total Time: 1h 8m 18s
d_loss: 1.3867, g_loss: 24.4473, const_loss: 0.0000, l1_loss: 17.7015, fm_loss: 0.0051, perc_loss: 5.8922, edge: 0.1552
Epoch: [ 2], Batch: [ 400/ 754] | Total Time: 1h 12m 3s
d_loss: 1.3868, g_loss: 22.5452, const_loss: 0.0000, l1_loss: 16.6010, fm_loss: 0.0043, perc_loss: 5.1101, edge: 0.1365
Epoch: [ 2], Batch: [ 500/ 754] | Total Time: 1h 15m 48s
d_loss: 1.3867, g_loss: 24.0918, const_loss: 0.0000, l1_loss: 17.5022, fm_loss: 0.0049, perc_loss: 5.7411, edge: 0.1502
Epoch: [ 2], Batch: [ 600/ 754] | Total Time: 1h 19m 34s
d_loss: 1.3870, g_loss: 23.2932, const_loss: 0.0000, l1_loss: 16.9766, fm_loss: 0.0045, perc_loss: 5.4867, edge: 0.1325
Epoch: [ 2], Batch: [ 700/ 754] | Total Time: 1h 23m 21s
d_loss: 1.3868, g_loss: 23.2227, const_loss: 0.0000, l1_loss: 17.0640, fm_loss: 0.0046, perc_loss: 5.3253, edge: 0.1355
--- End of Epoch 2 --- Time: 1701.2s ---
LR Scheduler stepped. Current LR G: 0.000167, LR D: 0.000167
Epoch: [ 3], Batch: [ 0/ 754] | Total Time: 1h 25m 20s
d_loss: 1.3869, g_loss: 23.3367, const_loss: 0.0000, l1_loss: 17.1186, fm_loss: 0.0046, perc_loss: 5.3822, edge: 0.1380
Epoch: [ 3], Batch: [ 100/ 754] | Total Time: 1h 29m 6s
d_loss: 1.3867, g_loss: 23.9186, const_loss: 0.0000, l1_loss: 17.3593, fm_loss: 0.0049, perc_loss: 5.7131, edge: 0.1480
Epoch: [ 3], Batch: [ 200/ 754] | Total Time: 1h 32m 51s
d_loss: 1.3868, g_loss: 23.9397, const_loss: 0.0000, l1_loss: 17.5782, fm_loss: 0.0050, perc_loss: 5.5210, edge: 0.1421
Epoch: [ 3], Batch: [ 300/ 754] | Total Time: 1h 36m 38s
d_loss: 1.3868, g_loss: 24.8069, const_loss: 0.0000, l1_loss: 18.2470, fm_loss: 0.0046, perc_loss: 5.7158, edge: 0.1461
Epoch: [ 3], Batch: [ 400/ 754] | Total Time: 1h 40m 24s
d_loss: 1.3867, g_loss: 23.4566, const_loss: 0.0000, l1_loss: 17.3722, fm_loss: 0.0046, perc_loss: 5.2581, edge: 0.1283
Epoch: [ 3], Batch: [ 500/ 754] | Total Time: 1h 44m 14s
d_loss: 1.3867, g_loss: 24.7179, const_loss: 0.0000, l1_loss: 17.9961, fm_loss: 0.0054, perc_loss: 5.8768, edge: 0.1462
Epoch: [ 3], Batch: [ 600/ 754] | Total Time: 1h 48m 1s
d_loss: 1.3867, g_loss: 22.5480, const_loss: 0.0000, l1_loss: 16.4079, fm_loss: 0.0053, perc_loss: 5.3016, edge: 0.1399
Epoch: [ 3], Batch: [ 700/ 754] | Total Time: 1h 51m 48s
d_loss: 1.3870, g_loss: 21.4451, const_loss: 0.0000, l1_loss: 15.5974, fm_loss: 0.0044, perc_loss: 5.0232, edge: 0.1269
--- End of Epoch 3 --- Time: 1713.9s ---
LR Scheduler stepped. Current LR G: 0.000165, LR D: 0.000165
Epoch: [ 4], Batch: [ 0/ 754] | Total Time: 1h 53m 54s
d_loss: 1.3868, g_loss: 20.6485, const_loss: 0.0000, l1_loss: 15.2096, fm_loss: 0.0042, perc_loss: 4.6214, edge: 0.1199
Epoch: [ 4], Batch: [ 100/ 754] | Total Time: 1h 57m 41s
d_loss: 1.3868, g_loss: 23.5247, const_loss: 0.0000, l1_loss: 17.4051, fm_loss: 0.0046, perc_loss: 5.2795, edge: 0.1422
Epoch: [ 4], Batch: [ 200/ 754] | Total Time: 2h 1m 32s
d_loss: 1.3867, g_loss: 23.1534, const_loss: 0.0000, l1_loss: 16.9008, fm_loss: 0.0050, perc_loss: 5.4206, edge: 0.1337
Epoch: [ 4], Batch: [ 300/ 754] | Total Time: 2h 5m 18s
d_loss: 1.3867, g_loss: 25.2854, const_loss: 0.0000, l1_loss: 18.3383, fm_loss: 0.0049, perc_loss: 6.0909, edge: 0.1580
Epoch: [ 4], Batch: [ 400/ 754] | Total Time: 2h 9m 4s
d_loss: 1.3868, g_loss: 22.0279, const_loss: 0.0000, l1_loss: 15.9894, fm_loss: 0.0040, perc_loss: 5.2090, edge: 0.1321
Epoch: [ 4], Batch: [ 500/ 754] | Total Time: 2h 12m 49s
d_loss: 1.3867, g_loss: 25.6002, const_loss: 0.0000, l1_loss: 18.5104, fm_loss: 0.0050, perc_loss: 6.2391, edge: 0.1524
Epoch: [ 4], Batch: [ 600/ 754] | Total Time: 2h 16m 40s
d_loss: 1.3868, g_loss: 22.0732, const_loss: 0.0000, l1_loss: 16.2510, fm_loss: 0.0045, perc_loss: 4.9972, edge: 0.1271
Epoch: [ 4], Batch: [ 700/ 754] | Total Time: 2h 20m 30s
d_loss: 1.3868, g_loss: 24.1903, const_loss: 0.0000, l1_loss: 17.7384, fm_loss: 0.0048, perc_loss: 5.6112, edge: 0.1426
--- End of Epoch 4 --- Time: 1715.6s ---
LR Scheduler stepped. Current LR G: 0.000163, LR D: 0.000163
Epoch: [ 5], Batch: [ 0/ 754] | Total Time: 2h 22m 30s
d_loss: 1.3867, g_loss: 24.3715, const_loss: 0.0000, l1_loss: 17.6256, fm_loss: 0.0049, perc_loss: 5.8971, edge: 0.1505
Epoch: [ 5], Batch: [ 100/ 754] | Total Time: 2h 26m 15s
d_loss: 1.3868, g_loss: 24.8325, const_loss: 0.0000, l1_loss: 18.4382, fm_loss: 0.0049, perc_loss: 5.5507, edge: 0.1453
Epoch: [ 5], Batch: [ 200/ 754] | Total Time: 2h 30m 0s
d_loss: 1.3868, g_loss: 21.2287, const_loss: 0.0000, l1_loss: 15.8653, fm_loss: 0.0040, perc_loss: 4.5480, edge: 0.1181
Epoch: [ 5], Batch: [ 300/ 754] | Total Time: 2h 33m 46s
d_loss: 1.3868, g_loss: 21.6806, const_loss: 0.0000, l1_loss: 15.7637, fm_loss: 0.0041, perc_loss: 5.0890, edge: 0.1305
Epoch: [ 5], Batch: [ 400/ 754] | Total Time: 2h 37m 31s
d_loss: 1.3868, g_loss: 23.3764, const_loss: 0.0000, l1_loss: 17.1179, fm_loss: 0.0042, perc_loss: 5.4186, edge: 0.1424
Epoch: [ 5], Batch: [ 500/ 754] | Total Time: 2h 41m 16s
d_loss: 1.3869, g_loss: 25.6885, const_loss: 0.0000, l1_loss: 18.5076, fm_loss: 0.0047, perc_loss: 6.3174, edge: 0.1654
Epoch: [ 5], Batch: [ 600/ 754] | Total Time: 2h 45m 2s
d_loss: 1.3867, g_loss: 23.2912, const_loss: 0.0000, l1_loss: 16.9680, fm_loss: 0.0042, perc_loss: 5.4919, edge: 0.1337
Epoch: [ 5], Batch: [ 700/ 754] | Total Time: 2h 48m 47s
d_loss: 1.3868, g_loss: 22.3667, const_loss: 0.0000, l1_loss: 16.4658, fm_loss: 0.0044, perc_loss: 5.0718, edge: 0.1313
--- End of Epoch 5 --- Time: 1701.5s ---
LR Scheduler stepped. Current LR G: 0.000160, LR D: 0.000160
Epoch: [ 6], Batch: [ 0/ 754] | Total Time: 2h 50m 51s
d_loss: 1.3872, g_loss: 19.1159, const_loss: 0.0000, l1_loss: 13.8743, fm_loss: 0.0040, perc_loss: 4.4299, edge: 0.1143
Epoch: [ 6], Batch: [ 100/ 754] | Total Time: 2h 54m 38s
d_loss: 1.3867, g_loss: 24.6305, const_loss: 0.0000, l1_loss: 18.0005, fm_loss: 0.0043, perc_loss: 5.7730, edge: 0.1593
Epoch: [ 6], Batch: [ 200/ 754] | Total Time: 2h 58m 23s
d_loss: 1.3868, g_loss: 24.3995, const_loss: 0.0000, l1_loss: 17.8404, fm_loss: 0.0047, perc_loss: 5.7147, edge: 0.1464
Epoch: [ 6], Batch: [ 300/ 754] | Total Time: 3h 2m 9s
d_loss: 1.3867, g_loss: 22.0963, const_loss: 0.0000, l1_loss: 16.2120, fm_loss: 0.0041, perc_loss: 5.0557, edge: 0.1312
Epoch: [ 6], Batch: [ 400/ 754] | Total Time: 3h 5m 54s
d_loss: 1.3867, g_loss: 25.0869, const_loss: 0.0000, l1_loss: 18.4738, fm_loss: 0.0051, perc_loss: 5.7681, edge: 0.1465
Epoch: [ 6], Batch: [ 500/ 754] | Total Time: 3h 9m 40s
d_loss: 1.3868, g_loss: 24.0227, const_loss: 0.0000, l1_loss: 17.6524, fm_loss: 0.0046, perc_loss: 5.5234, edge: 0.1490
Epoch: [ 6], Batch: [ 600/ 754] | Total Time: 3h 13m 26s
d_loss: 1.3878, g_loss: 22.6864, const_loss: 0.0000, l1_loss: 16.6233, fm_loss: 0.0036, perc_loss: 5.2294, edge: 0.1368
Epoch: [ 6], Batch: [ 700/ 754] | Total Time: 3h 17m 11s
d_loss: 1.3867, g_loss: 22.6604, const_loss: 0.0000, l1_loss: 16.5546, fm_loss: 0.0051, perc_loss: 5.2719, edge: 0.1354
--- End of Epoch 6 --- Time: 1699.4s ---
LR Scheduler stepped. Current LR G: 0.000157, LR D: 0.000157
Epoch: [ 7], Batch: [ 0/ 754] | Total Time: 3h 19m 11s
d_loss: 1.3867, g_loss: 24.1686, const_loss: 0.0000, l1_loss: 17.8711, fm_loss: 0.0043, perc_loss: 5.4563, edge: 0.1435
Epoch: [ 7], Batch: [ 100/ 754] | Total Time: 3h 22m 56s
d_loss: 1.3867, g_loss: 22.7477, const_loss: 0.0000, l1_loss: 16.7319, fm_loss: 0.0047, perc_loss: 5.1952, edge: 0.1225
Epoch: [ 7], Batch: [ 200/ 754] | Total Time: 3h 26m 41s
d_loss: 1.3867, g_loss: 24.4125, const_loss: 0.0000, l1_loss: 17.8651, fm_loss: 0.0045, perc_loss: 5.7035, edge: 0.1460

from 1172 to 1174: this time the training data was increased to 30,748 examples. One epoch now takes 8,220 seconds (= 137 minutes), which is really long, and the loss numbers also seem hard to push down any further.
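
A quick sanity check on those numbers (the batch size is inferred from the counts in the log, not stated anywhere, so treat it as an assumption):

# Rough arithmetic from the log below; the batch size is inferred, not logged.
examples = 30748          # "unpickled total 30748 examples"
batches_per_epoch = 3844  # "Batch: [ .../3844]"
epoch_seconds = 8220.4    # "--- End of Epoch 0 --- Time: 8220.4s ---"

batch_size = examples / batches_per_epoch         # ~8.0, so batch size is likely 8
sec_per_step = epoch_seconds / batches_per_epoch  # ~2.14 s per step
print(f"batch size ~ {batch_size:.1f}, {sec_per_step:.2f} s/step, "
      f"{epoch_seconds / 60:.0f} min/epoch")      # ~137 min per epoch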

log:

unpickled total 30748 examples
Starting training from epoch 0/39...
Epoch: [ 0], Batch: [ 0/3844] | Total Time: 4s
d_loss: 1.3867, g_loss: 22.0652, const_loss: 0.0000, l1_loss: 16.6972, fm_loss: 0.0033, perc_loss: 4.5536, edge: 0.1178
Epoch: [ 0], Batch: [ 100/3844] | Total Time: 3m 36s
d_loss: 1.3867, g_loss: 22.5999, const_loss: 0.0000, l1_loss: 16.8246, fm_loss: 0.0031, perc_loss: 4.9632, edge: 0.1157
Epoch: [ 0], Batch: [ 200/3844] | Total Time: 7m 9s
d_loss: 1.3868, g_loss: 21.4693, const_loss: 0.0000, l1_loss: 15.8191, fm_loss: 0.0033, perc_loss: 4.8274, edge: 0.1260
Epoch: [ 0], Batch: [ 300/3844] | Total Time: 10m 43s
d_loss: 1.3868, g_loss: 22.3933, const_loss: 0.0000, l1_loss: 16.4512, fm_loss: 0.0036, perc_loss: 5.1100, edge: 0.1351
Epoch: [ 0], Batch: [ 400/3844] | Total Time: 14m 16s
d_loss: 1.3867, g_loss: 22.1562, const_loss: 0.0000, l1_loss: 16.5636, fm_loss: 0.0033, perc_loss: 4.7763, edge: 0.1197
Epoch: [ 0], Batch: [ 500/3844] | Total Time: 17m 49s
d_loss: 1.3875, g_loss: 23.3556, const_loss: 0.0000, l1_loss: 16.9972, fm_loss: 0.0034, perc_loss: 5.5162, edge: 0.1455
Epoch: [ 0], Batch: [ 600/3844] | Total Time: 21m 23s
d_loss: 1.3876, g_loss: 21.5394, const_loss: 0.0000, l1_loss: 15.8542, fm_loss: 0.0033, perc_loss: 4.8624, edge: 0.1261
Epoch: [ 0], Batch: [ 700/3844] | Total Time: 24m 57s
d_loss: 1.3867, g_loss: 21.7208, const_loss: 0.0000, l1_loss: 16.0768, fm_loss: 0.0031, perc_loss: 4.8272, edge: 0.1204
Epoch: [ 0], Batch: [ 800/3844] | Total Time: 28m 35s
d_loss: 1.3868, g_loss: 22.9525, const_loss: 0.0000, l1_loss: 16.9001, fm_loss: 0.0037, perc_loss: 5.2240, edge: 0.1314
Epoch: [ 0], Batch: [ 900/3844] | Total Time: 32m 9s
d_loss: 1.3869, g_loss: 23.4761, const_loss: 0.0000, l1_loss: 17.0843, fm_loss: 0.0037, perc_loss: 5.5608, edge: 0.1338
Epoch: [ 0], Batch: [1000/3844] | Total Time: 35m 42s
d_loss: 1.3868, g_loss: 21.6477, const_loss: 0.0000, l1_loss: 15.9812, fm_loss: 0.0035, perc_loss: 4.8460, edge: 0.1237
Epoch: [ 0], Batch: [1100/3844] | Total Time: 39m 16s
d_loss: 1.3867, g_loss: 22.0287, const_loss: 0.0000, l1_loss: 16.0500, fm_loss: 0.0035, perc_loss: 5.1570, edge: 0.1248
Epoch: [ 0], Batch: [1200/3844] | Total Time: 42m 50s
d_loss: 1.3868, g_loss: 22.9119, const_loss: 0.0000, l1_loss: 16.6397, fm_loss: 0.0038, perc_loss: 5.4377, edge: 0.1374
Epoch: [ 0], Batch: [1300/3844] | Total Time: 46m 24s
d_loss: 1.3868, g_loss: 23.3113, const_loss: 0.0000, l1_loss: 17.2334, fm_loss: 0.0037, perc_loss: 5.2414, edge: 0.1394
Epoch: [ 0], Batch: [1400/3844] | Total Time: 49m 57s
d_loss: 1.3869, g_loss: 23.1812, const_loss: 0.0000, l1_loss: 17.0714, fm_loss: 0.0039, perc_loss: 5.2746, edge: 0.1378
Epoch: [ 0], Batch: [1500/3844] | Total Time: 53m 31s
d_loss: 1.3868, g_loss: 22.3590, const_loss: 0.0000, l1_loss: 16.4997, fm_loss: 0.0035, perc_loss: 5.0408, edge: 0.1218
Epoch: [ 0], Batch: [1600/3844] | Total Time: 57m 10s
d_loss: 1.3868, g_loss: 22.1923, const_loss: 0.0000, l1_loss: 16.2541, fm_loss: 0.0039, perc_loss: 5.1119, edge: 0.1290
Epoch: [ 0], Batch: [1700/3844] | Total Time: 1h 44s
d_loss: 1.3868, g_loss: 22.3506, const_loss: 0.0000, l1_loss: 16.3587, fm_loss: 0.0032, perc_loss: 5.1709, edge: 0.1244
Epoch: [ 0], Batch: [1800/3844] | Total Time: 1h 4m 18s
d_loss: 1.3868, g_loss: 22.1176, const_loss: 0.0000, l1_loss: 15.8362, fm_loss: 0.0036, perc_loss: 5.4511, edge: 0.1333
Epoch: [ 0], Batch: [1900/3844] | Total Time: 1h 7m 51s
d_loss: 1.3867, g_loss: 23.8044, const_loss: 0.0000, l1_loss: 17.5213, fm_loss: 0.0039, perc_loss: 5.4487, edge: 0.1372
Epoch: [ 0], Batch: [2000/3844] | Total Time: 1h 11m 25s
d_loss: 1.3870, g_loss: 23.5628, const_loss: 0.0000, l1_loss: 17.2402, fm_loss: 0.0036, perc_loss: 5.4876, edge: 0.1380
Epoch: [ 0], Batch: [2100/3844] | Total Time: 1h 14m 58s
d_loss: 1.3868, g_loss: 23.2823, const_loss: 0.0000, l1_loss: 16.9659, fm_loss: 0.0036, perc_loss: 5.4833, edge: 0.1361
Epoch: [ 0], Batch: [2200/3844] | Total Time: 1h 18m 32s
d_loss: 1.3874, g_loss: 21.1247, const_loss: 0.0000, l1_loss: 15.4932, fm_loss: 0.0031, perc_loss: 4.8181, edge: 0.1170
Epoch: [ 0], Batch: [2300/3844] | Total Time: 1h 22m 5s
d_loss: 1.3868, g_loss: 23.7013, const_loss: 0.0000, l1_loss: 17.3964, fm_loss: 0.0035, perc_loss: 5.4785, edge: 0.1296
Epoch: [ 0], Batch: [2400/3844] | Total Time: 1h 25m 46s
d_loss: 1.3868, g_loss: 24.8029, const_loss: 0.0000, l1_loss: 18.0349, fm_loss: 0.0036, perc_loss: 5.9192, edge: 0.1519
Epoch: [ 0], Batch: [2500/3844] | Total Time: 1h 29m 20s
d_loss: 1.3867, g_loss: 22.8857, const_loss: 0.0000, l1_loss: 16.9028, fm_loss: 0.0038, perc_loss: 5.1530, edge: 0.1328
Epoch: [ 0], Batch: [2600/3844] | Total Time: 1h 32m 53s
d_loss: 1.3872, g_loss: 23.2065, const_loss: 0.0000, l1_loss: 16.7542, fm_loss: 0.0039, perc_loss: 5.6139, edge: 0.1411
Epoch: [ 0], Batch: [2700/3844] | Total Time: 1h 36m 26s
d_loss: 1.3867, g_loss: 21.9103, const_loss: 0.0000, l1_loss: 16.1255, fm_loss: 0.0032, perc_loss: 4.9648, edge: 0.1235
Epoch: [ 0], Batch: [2800/3844] | Total Time: 1h 40m 0s
d_loss: 1.3868, g_loss: 24.9170, const_loss: 0.0000, l1_loss: 18.2111, fm_loss: 0.0040, perc_loss: 5.8501, edge: 0.1584
Epoch: [ 0], Batch: [2900/3844] | Total Time: 1h 43m 33s
d_loss: 1.3869, g_loss: 25.1383, const_loss: 0.0000, l1_loss: 18.1676, fm_loss: 0.0039, perc_loss: 6.1143, edge: 0.1591
Epoch: [ 0], Batch: [3000/3844] | Total Time: 1h 47m 6s
d_loss: 1.3867, g_loss: 24.3530, const_loss: 0.0000, l1_loss: 17.9599, fm_loss: 0.0039, perc_loss: 5.5527, edge: 0.1432
Epoch: [ 0], Batch: [3100/3844] | Total Time: 1h 50m 38s
d_loss: 1.3871, g_loss: 24.6068, const_loss: 0.0000, l1_loss: 18.1720, fm_loss: 0.0041, perc_loss: 5.5956, edge: 0.1417
Epoch: [ 0], Batch: [3200/3844] | Total Time: 1h 54m 15s
d_loss: 1.3868, g_loss: 25.2986, const_loss: 0.0000, l1_loss: 18.2189, fm_loss: 0.0043, perc_loss: 6.2260, edge: 0.1560
Epoch: [ 0], Batch: [3300/3844] | Total Time: 1h 57m 47s
d_loss: 1.3868, g_loss: 24.5850, const_loss: 0.0000, l1_loss: 18.0756, fm_loss: 0.0040, perc_loss: 5.6572, edge: 0.1549
Epoch: [ 0], Batch: [3400/3844] | Total Time: 2h 1m 19s
d_loss: 1.3867, g_loss: 24.1326, const_loss: 0.0000, l1_loss: 17.5008, fm_loss: 0.0038, perc_loss: 5.7886, edge: 0.1461
Epoch: [ 0], Batch: [3500/3844] | Total Time: 2h 4m 51s
d_loss: 1.3870, g_loss: 23.0209, const_loss: 0.0000, l1_loss: 16.5229, fm_loss: 0.0039, perc_loss: 5.6593, edge: 0.1415
Epoch: [ 0], Batch: [3600/3844] | Total Time: 2h 8m 23s
d_loss: 1.3868, g_loss: 24.4072, const_loss: 0.0000, l1_loss: 17.8018, fm_loss: 0.0040, perc_loss: 5.7646, edge: 0.1433
Epoch: [ 0], Batch: [3700/3844] | Total Time: 2h 11m 55s
d_loss: 1.3868, g_loss: 22.3950, const_loss: 0.0000, l1_loss: 16.2786, fm_loss: 0.0031, perc_loss: 5.2950, edge: 0.1250
Epoch: [ 0], Batch: [3800/3844] | Total Time: 2h 15m 29s
d_loss: 1.3868, g_loss: 22.6758, const_loss: 0.0000, l1_loss: 16.7203, fm_loss: 0.0034, perc_loss: 5.1238, edge: 0.1349
--- End of Epoch 0 --- Time: 8220.4s ---
LR Scheduler stepped. Current LR G: 0.000199, LR D: 0.000199
Epoch: [ 1], Batch: [ 0/3844] | Total Time: 2h 17m 2s
d_loss: 1.3878, g_loss: 26.9273, const_loss: 0.0000, l1_loss: 19.5350, fm_loss: 0.0037, perc_loss: 6.5356, edge: 0.1597
Epoch: [ 1], Batch: [ 100/3844] | Total Time: 2h 20m 36s
d_loss: 1.3868, g_loss: 23.7761, const_loss: 0.0000, l1_loss: 17.5872, fm_loss: 0.0036, perc_loss: 5.3508, edge: 0.1411
Epoch: [ 1], Batch: [ 200/3844] | Total Time: 2h 24m 12s
d_loss: 1.3867, g_loss: 23.3211, const_loss: 0.0000, l1_loss: 16.9601, fm_loss: 0.0036, perc_loss: 5.5267, edge: 0.1374
Epoch: [ 1], Batch: [ 300/3844] | Total Time: 2h 27m 46s
d_loss: 1.3867, g_loss: 24.4588, const_loss: 0.0000, l1_loss: 17.9076, fm_loss: 0.0036, perc_loss: 5.7104, edge: 0.1439
Epoch: [ 1], Batch: [ 400/3844] | Total Time: 2h 31m 19s
d_loss: 1.3867, g_loss: 21.6107, const_loss: 0.0000, l1_loss: 15.7492, fm_loss: 0.0035, perc_loss: 5.0429, edge: 0.1217
Epoch: [ 1], Batch: [ 500/3844] | Total Time: 2h 34m 53s
d_loss: 1.3867, g_loss: 21.9894, const_loss: 0.0000, l1_loss: 15.9008, fm_loss: 0.0033, perc_loss: 5.2616, edge: 0.1305
Epoch: [ 1], Batch: [ 600/3844] | Total Time: 2h 38m 26s
d_loss: 1.3867, g_loss: 22.2295, const_loss: 0.0000, l1_loss: 16.4070, fm_loss: 0.0033, perc_loss: 4.9961, edge: 0.1297
Epoch: [ 1], Batch: [ 700/3844] | Total Time: 2h 41m 59s
d_loss: 1.3867, g_loss: 24.2764, const_loss: 0.0000, l1_loss: 17.7808, fm_loss: 0.0036, perc_loss: 5.6566, edge: 0.1420
Epoch: [ 1], Batch: [ 800/3844] | Total Time: 2h 45m 32s
d_loss: 1.3867, g_loss: 22.1281, const_loss: 0.0000, l1_loss: 16.0946, fm_loss: 0.0034, perc_loss: 5.2100, edge: 0.1268
Epoch: [ 1], Batch: [ 900/3844] | Total Time: 2h 49m 5s
d_loss: 1.3867, g_loss: 25.2790, const_loss: 0.0000, l1_loss: 18.3000, fm_loss: 0.0040, perc_loss: 6.1368, edge: 0.1449
Epoch: [ 1], Batch: [1000/3844] | Total Time: 2h 52m 41s
d_loss: 1.3868, g_loss: 25.5344, const_loss: 0.0000, l1_loss: 18.5234, fm_loss: 0.0041, perc_loss: 6.1624, edge: 0.1512
Epoch: [ 1], Batch: [1100/3844] | Total Time: 2h 56m 15s
d_loss: 1.3869, g_loss: 23.8928, const_loss: 0.0000, l1_loss: 17.6494, fm_loss: 0.0037, perc_loss: 5.4115, edge: 0.1347
Epoch: [ 1], Batch: [1200/3844] | Total Time: 2h 59m 48s
d_loss: 1.3867, g_loss: 22.6636, const_loss: 0.0000, l1_loss: 16.6719, fm_loss: 0.0036, perc_loss: 5.1645, edge: 0.1302
Epoch: [ 1], Batch: [1300/3844] | Total Time: 3h 3m 22s
d_loss: 1.3867, g_loss: 23.2262, const_loss: 0.0000, l1_loss: 16.9673, fm_loss: 0.0037, perc_loss: 5.4233, edge: 0.1385
Epoch: [ 1], Batch: [1400/3844] | Total Time: 3h 6m 55s
d_loss: 1.3868, g_loss: 24.8132, const_loss: 0.0000, l1_loss: 17.9436, fm_loss: 0.0042, perc_loss: 6.0191, edge: 0.1530
Epoch: [ 1], Batch: [1500/3844] | Total Time: 3h 10m 29s
d_loss: 1.3868, g_loss: 23.1708, const_loss: 0.0000, l1_loss: 16.9291, fm_loss: 0.0037, perc_loss: 5.4122, edge: 0.1323
Epoch: [ 1], Batch: [1600/3844] | Total Time: 3h 14m 3s
d_loss: 1.3867, g_loss: 23.6657, const_loss: 0.0000, l1_loss: 17.3465, fm_loss: 0.0038, perc_loss: 5.4733, edge: 0.1487
Epoch: [ 1], Batch: [1700/3844] | Total Time: 3h 17m 37s
d_loss: 1.3868, g_loss: 22.4674, const_loss: 0.0000, l1_loss: 16.5844, fm_loss: 0.0034, perc_loss: 5.0608, edge: 0.1254
Epoch: [ 1], Batch: [1800/3844] | Total Time: 3h 21m 16s
d_loss: 1.3867, g_loss: 22.7599, const_loss: 0.0000, l1_loss: 16.7418, fm_loss: 0.0034, perc_loss: 5.1996, edge: 0.1217
Epoch: [ 1], Batch: [1900/3844] | Total Time: 3h 24m 50s
d_loss: 1.3867, g_loss: 24.3168, const_loss: 0.0000, l1_loss: 17.7692, fm_loss: 0.0046, perc_loss: 5.6998, edge: 0.1499
Epoch: [ 1], Batch: [2000/3844] | Total Time: 3h 28m 23s
d_loss: 1.3868, g_loss: 22.3701, const_loss: 0.0000, l1_loss: 16.3860, fm_loss: 0.0036, perc_loss: 5.1528, edge: 0.1344
Epoch: [ 1], Batch: [2100/3844] | Total Time: 3h 31m 57s
d_loss: 1.3868, g_loss: 25.8147, const_loss: 0.0000, l1_loss: 18.9331, fm_loss: 0.0040, perc_loss: 6.0279, edge: 0.1563
Epoch: [ 1], Batch: [2200/3844] | Total Time: 3h 35m 30s
d_loss: 1.3869, g_loss: 25.4411, const_loss: 0.0000, l1_loss: 18.5014, fm_loss: 0.0039, perc_loss: 6.0881, edge: 0.1544
Epoch: [ 1], Batch: [2300/3844] | Total Time: 3h 39m 4s
d_loss: 1.3868, g_loss: 25.8282, const_loss: 0.0000, l1_loss: 18.7006, fm_loss: 0.0042, perc_loss: 6.2713, edge: 0.1588
Epoch: [ 1], Batch: [2400/3844] | Total Time: 3h 42m 38s
d_loss: 1.3867, g_loss: 23.0147, const_loss: 0.0000, l1_loss: 16.9426, fm_loss: 0.0038, perc_loss: 5.2372, edge: 0.1378
Epoch: [ 1], Batch: [2500/3844] | Total Time: 3h 46m 11s
d_loss: 1.3867, g_loss: 23.5482, const_loss: 0.0000, l1_loss: 17.1813, fm_loss: 0.0037, perc_loss: 5.5385, edge: 0.1313
Epoch: [ 1], Batch: [2600/3844] | Total Time: 3h 49m 50s
d_loss: 1.3867, g_loss: 24.4259, const_loss: 0.0000, l1_loss: 17.8144, fm_loss: 0.0039, perc_loss: 5.7728, edge: 0.1415
Epoch: [ 1], Batch: [2700/3844] | Total Time: 3h 53m 24s
d_loss: 1.3870, g_loss: 23.6275, const_loss: 0.0000, l1_loss: 17.4462, fm_loss: 0.0036, perc_loss: 5.3544, edge: 0.1299
Epoch: [ 1], Batch: [2800/3844] | Total Time: 3h 56m 57s
d_loss: 1.3868, g_loss: 24.7124, const_loss: 0.0000, l1_loss: 18.0233, fm_loss: 0.0044, perc_loss: 5.8400, edge: 0.1512
Epoch: [ 1], Batch: [2900/3844] | Total Time: 4h 31s
d_loss: 1.3867, g_loss: 24.1717, const_loss: 0.0000, l1_loss: 17.4967, fm_loss: 0.0038, perc_loss: 5.8328, edge: 0.1450
Epoch: [ 1], Batch: [3000/3844] | Total Time: 4h 4m 5s
d_loss: 1.3869, g_loss: 22.0352, const_loss: 0.0000, l1_loss: 15.9942, fm_loss: 0.0038, perc_loss: 5.2063, edge: 0.1375
Epoch: [ 1], Batch: [3100/3844] | Total Time: 4h 7m 38s
d_loss: 1.3867, g_loss: 22.2059, const_loss: 0.0000, l1_loss: 16.1812, fm_loss: 0.0038, perc_loss: 5.1953, edge: 0.1323
Epoch: [ 1], Batch: [3200/3844] | Total Time: 4h 11m 11s
d_loss: 1.3867, g_loss: 25.2152, const_loss: 0.0000, l1_loss: 18.3992, fm_loss: 0.0043, perc_loss: 5.9663, edge: 0.1521
Epoch: [ 1], Batch: [3300/3844] | Total Time: 4h 14m 44s
d_loss: 1.3867, g_loss: 22.9282, const_loss: 0.0000, l1_loss: 16.7573, fm_loss: 0.0035, perc_loss: 5.3390, edge: 0.1350
Epoch: [ 1], Batch: [3400/3844] | Total Time: 4h 18m 25s
d_loss: 1.3868, g_loss: 25.2379, const_loss: 0.0000, l1_loss: 18.4865, fm_loss: 0.0046, perc_loss: 5.9036, edge: 0.1497
