The main cost is training time: 512×512 takes about three times as long as 256×256. Batch size also plays a role, so 512×512 shows somewhat larger deviation.
Besides adjusting the resolution, you can also try increasing the generator's final_channels; image quality improves as well.
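To make the final_channels knob concrete, here is a minimal sketch of a widened last decoder stage; the class name, layer layout, and the `final_channels` argument are assumptions for illustration, not the project's actual architecture.

```python
import torch
import torch.nn as nn

class FinalStage(nn.Module):
    """Hypothetical last decoder stage whose width is set by final_channels."""

    def __init__(self, final_channels=256):
        super().__init__()
        self.up = nn.Sequential(
            # upsample 2x while halving the channel count
            nn.ConvTranspose2d(final_channels, final_channels // 2,
                               kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            # project down to a 1-channel grayscale glyph
            nn.Conv2d(final_channels // 2, 1, kernel_size=3, padding=1),
            nn.Tanh(),
        )

    def forward(self, x):
        return self.up(x)

# final_channels = 512 doubles the filter count (and memory/compute) here
stage = FinalStage(final_channels=512)
out = stage(torch.randn(1, 512, 64, 64))
print(out.shape)  # torch.Size([1, 1, 128, 128])
```

Doubling the width at this stage trades memory and step time for more capacity to render fine stroke detail.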
Console log for 256×256 regular style:
Epoch: [ 0], [ 0/ 45] time: 26.14, d_loss: 1.40957, g_loss: 7.95551, category_loss: 0.00000, cheat_loss: 0.77449, const_loss: 0.00980, l1_loss: 7.17123
Epoch: [ 1], [ 0/ 45] time: 431.05, d_loss: 4.29400, g_loss: 8.10423, category_loss: 0.00001, cheat_loss: 1.28892, const_loss: 0.00732, l1_loss: 6.80797
Epoch: [ 2], [ 0/ 45] time: 843.68, d_loss: 1.88517, g_loss: 7.31769, category_loss: 0.00158, cheat_loss: 1.19841, const_loss: 0.00523, l1_loss: 6.11405
Epoch: [ 3], [ 0/ 45] time: 1255.93, d_loss: 2.35987, g_loss: 7.44909, category_loss: 0.00002, cheat_loss: 1.31800, const_loss: 0.00856, l1_loss: 6.12253
Epoch: [ 4], [ 0/ 45] time: 1668.46, d_loss: 2.54748, g_loss: 7.11578, category_loss: 0.00015, cheat_loss: 0.84949, const_loss: 0.00585, l1_loss: 6.26042
Epoch: [ 5], [ 0/ 45] time: 2080.87, d_loss: 2.13288, g_loss: 6.81926, category_loss: 0.00000, cheat_loss: 0.63917, const_loss: 0.00627, l1_loss: 6.17382
Epoch: [ 6], [ 0/ 45] time: 2493.44, d_loss: 1.47338, g_loss: 6.43606, category_loss: 0.00000, cheat_loss: 0.54430, const_loss: 0.00460, l1_loss: 5.88716
Epoch: [ 7], [ 0/ 45] time: 2906.28, d_loss: 1.56289, g_loss: 6.46395, category_loss: 0.00000, cheat_loss: 0.80159, const_loss: 0.00400, l1_loss: 5.65836
Epoch: [ 8], [ 0/ 45] time: 3318.75, d_loss: 1.54652, g_loss: 6.90791, category_loss: 0.00000, cheat_loss: 1.14259, const_loss: 0.00441, l1_loss: 5.76091
Epoch: [ 9], [ 0/ 45] time: 3732.71, d_loss: 1.72232, g_loss: 5.73225, category_loss: 0.00000, cheat_loss: 0.40109, const_loss: 0.00444, l1_loss: 5.32672
Epoch: [10], [ 0/ 45] time: 4145.09, d_loss: 1.63648, g_loss: 6.76048, category_loss: 0.00000, cheat_loss: 0.69323, const_loss: 0.00576, l1_loss: 6.06149
Epoch: [11], [ 0/ 45] time: 4557.65, d_loss: 1.39904, g_loss: 6.41119, category_loss: 0.00000, cheat_loss: 0.73149, const_loss: 0.00420, l1_loss: 5.67549
Epoch: [12], [ 0/ 45] time: 4970.41, d_loss: 1.35578, g_loss: 6.41434, category_loss: 0.00000, cheat_loss: 0.79072, const_loss: 0.00421, l1_loss: 5.61941
Epoch: [13], [ 0/ 45] time: 5383.24, d_loss: 1.46898, g_loss: 6.39344, category_loss: 0.00000, cheat_loss: 0.77842, const_loss: 0.00322, l1_loss: 5.61179
Epoch: [14], [ 0/ 45] time: 5801.82, d_loss: 1.42688, g_loss: 6.04748, category_loss: 0.00000, cheat_loss: 0.64107, const_loss: 0.00423, l1_loss: 5.40218
Epoch: [15], [ 0/ 45] time: 6214.77, d_loss: 1.52659, g_loss: 5.95127, category_loss: 0.00000, cheat_loss: 0.64333, const_loss: 0.00312, l1_loss: 5.30482
Epoch: [16], [ 0/ 45] time: 6627.24, d_loss: 1.40160, g_loss: 6.65553, category_loss: 0.00000, cheat_loss: 0.78474, const_loss: 0.00339, l1_loss: 5.86741
Epoch: [17], [ 0/ 45] time: 7039.85, d_loss: 1.37407, g_loss: 6.09667, category_loss: 0.00000, cheat_loss: 0.73831, const_loss: 0.00325, l1_loss: 5.35511
Epoch: [18], [ 0/ 45] time: 7453.68, d_loss: 1.43629, g_loss: 6.14127, category_loss: 0.00000, cheat_loss: 0.78559, const_loss: 0.00333, l1_loss: 5.35235
Epoch: [19], [ 0/ 45] time: 7865.81, d_loss: 1.46511, g_loss: 6.16847, category_loss: 0.00001, cheat_loss: 0.75095, const_loss: 0.00318, l1_loss: 5.41435
Epoch: [20], [ 0/ 45] time: 8278.32, d_loss: 1.40473, g_loss: 5.65997, category_loss: 0.00000, cheat_loss: 0.61385, const_loss: 0.00304, l1_loss: 5.04308
Epoch: [21], [ 0/ 45] time: 8690.71, d_loss: 1.34687, g_loss: 5.76431, category_loss: 0.00000, cheat_loss: 0.77681, const_loss: 0.00308, l1_loss: 4.98442
Epoch: [22], [ 0/ 45] time: 9103.41, d_loss: 1.33141, g_loss: 6.14065, category_loss: 0.00000, cheat_loss: 0.87175, const_loss: 0.00385, l1_loss: 5.26505
Epoch: [23], [ 0/ 45] time: 9517.54, d_loss: 1.42019, g_loss: 5.76008, category_loss: 0.00000, cheat_loss: 0.65872, const_loss: 0.00289, l1_loss: 5.09846
512×512 regular style:
Epoch: [ 0], [ 100/ 215] time: 606.98, d_loss: 0.31174, g_loss: 13.93359, category_loss: 0.00002, cheat_loss: 3.24874, const_loss: 0.23946, l1_loss: 10.44538
Epoch: [ 0], [ 200/ 215] time: 1212.25, d_loss: 0.45079, g_loss: 14.34433, category_loss: 0.00001, cheat_loss: 2.75728, const_loss: 0.25732, l1_loss: 11.32973
Epoch: [ 1], [ 0/ 215] time: 1305.91, d_loss: 0.18158, g_loss: 15.42831, category_loss: 0.00002, cheat_loss: 4.07555, const_loss: 0.25275, l1_loss: 11.10001
Epoch: [ 1], [ 100/ 215] time: 1912.50, d_loss: 0.25792, g_loss: 12.77551, category_loss: 0.00002, cheat_loss: 1.18476, const_loss: 0.26111, l1_loss: 11.32963
Epoch: [ 1], [ 200/ 215] time: 2519.25, d_loss: 0.08402, g_loss: 17.10276, category_loss: 0.00002, cheat_loss: 4.76643, const_loss: 0.26600, l1_loss: 12.07032
Epoch: [ 2], [ 0/ 215] time: 2606.57, d_loss: 0.16091, g_loss: 16.14471, category_loss: 0.00003, cheat_loss: 4.35494, const_loss: 0.28869, l1_loss: 11.50108
Epoch: [ 2], [ 100/ 215] time: 3216.68, d_loss: 0.17906, g_loss: 14.70552, category_loss: 0.00002, cheat_loss: 3.53643, const_loss: 0.28399, l1_loss: 10.88509
Epoch: [ 2], [ 200/ 215] time: 3829.41, d_loss: 0.93324, g_loss: 14.33587, category_loss: 0.00001, cheat_loss: 2.01551, const_loss: 0.27378, l1_loss: 12.04657
Epoch: [ 3], [ 0/ 215] time: 3917.06, d_loss: 1.41822, g_loss: 11.95745, category_loss: 0.00001, cheat_loss: 1.14869, const_loss: 0.25951, l1_loss: 10.54925
Epoch: [ 3], [ 100/ 215] time: 4528.22, d_loss: 0.16337, g_loss: 15.73500, category_loss: 0.00002, cheat_loss: 3.10425, const_loss: 0.30736, l1_loss: 12.32339
Epoch: [ 3], [ 200/ 215] time: 5133.97, d_loss: 0.24355, g_loss: 16.40785, category_loss: 0.00003, cheat_loss: 3.99776, const_loss: 0.27812, l1_loss: 12.13196
Epoch: [ 4], [ 0/ 215] time: 5220.26, d_loss: 0.12751, g_loss: 16.04850, category_loss: 0.00002, cheat_loss: 4.45992, const_loss: 0.29583, l1_loss: 11.29275
Epoch: [ 4], [ 100/ 215] time: 5820.05, d_loss: 0.23068, g_loss: 14.54076, category_loss: 0.00002, cheat_loss: 2.81461, const_loss: 0.31861, l1_loss: 11.40752
Epoch: [ 4], [ 200/ 215] time: 6424.60, d_loss: 0.33250, g_loss: 12.77896, category_loss: 0.00002, cheat_loss: 1.03109, const_loss: 0.27153, l1_loss: 11.47633
Epoch: [ 5], [ 0/ 215] time: 6511.78, d_loss: 0.35609, g_loss: 17.78728, category_loss: 0.00002, cheat_loss: 5.55148, const_loss: 0.26762, l1_loss: 11.96817
Epoch: [ 5], [ 100/ 215] time: 7119.03, d_loss: 0.20058, g_loss: 11.88620, category_loss: 0.00002, cheat_loss: 1.46443, const_loss: 0.28879, l1_loss: 10.13297
Epoch: [ 5], [ 200/ 215] time: 7732.42, d_loss: 0.02682, g_loss: 15.94793, category_loss: 0.00002, cheat_loss: 3.92855, const_loss: 0.25941, l1_loss: 11.75997
Epoch: [ 6], [ 0/ 215] time: 7820.20, d_loss: 0.25289, g_loss: 13.19630, category_loss: 0.00002, cheat_loss: 1.58104, const_loss: 0.25611, l1_loss: 11.35915
Epoch: [ 6], [ 100/ 215] time: 8428.00, d_loss: 0.07260, g_loss: 14.08982, category_loss: 0.00002, cheat_loss: 2.64803, const_loss: 0.25747, l1_loss: 11.18432
Epoch: [ 6], [ 200/ 215] time: 9039.00, d_loss: 0.19923, g_loss: 13.51740, category_loss: 0.00001, cheat_loss: 2.52061, const_loss: 0.25316, l1_loss: 10.74363
Epoch: [ 7], [ 0/ 215] time: 9129.58, d_loss: 0.10501, g_loss: 14.10990, category_loss: 0.00001, cheat_loss: 3.65700, const_loss: 0.31224, l1_loss: 10.14066
Epoch: [ 7], [ 100/ 215] time: 9740.36, d_loss: 0.37006, g_loss: 13.18886, category_loss: 0.00002, cheat_loss: 1.39241, const_loss: 0.26942, l1_loss: 11.52702
Epoch: [ 7], [ 200/ 215] time: 10352.41, d_loss: 0.03985, g_loss: 16.22905, category_loss: 0.00002, cheat_loss: 4.23743, const_loss: 0.26772, l1_loss: 11.72391
Epoch: [ 8], [ 0/ 215] time: 10440.27, d_loss: 0.03505, g_loss: 16.62378, category_loss: 0.00002, cheat_loss: 3.80674, const_loss: 0.26524, l1_loss: 12.55178
512×512 thin style:
Epoch: [ 0], [ 100/ 221] time: 606.40, d_loss: 0.06194, g_loss: 16.90748, category_loss: 0.00002, cheat_loss: 2.46379, const_loss: 0.28582, l1_loss: 14.15786
Epoch: [ 0], [ 200/ 221] time: 1184.78, d_loss: 1.69232, g_loss: 22.57175, category_loss: 0.00002, cheat_loss: 7.01814, const_loss: 0.37845, l1_loss: 15.17514
Epoch: [ 1], [ 0/ 221] time: 1310.44, d_loss: 0.10237, g_loss: 20.51014, category_loss: 0.00004, cheat_loss: 3.75793, const_loss: 0.38718, l1_loss: 16.36501
Epoch: [ 1], [ 100/ 221] time: 1893.25, d_loss: 1.28714, g_loss: 28.07741, category_loss: 0.00002, cheat_loss: 11.71363, const_loss: 0.34063, l1_loss: 16.02314
Epoch: [ 1], [ 200/ 221] time: 2479.76, d_loss: 0.21078, g_loss: 19.89846, category_loss: 0.00005, cheat_loss: 3.27212, const_loss: 0.40745, l1_loss: 16.21886
Epoch: [ 2], [ 0/ 221] time: 2597.86, d_loss: 0.03788, g_loss: 22.53658, category_loss: 0.00007, cheat_loss: 5.92009, const_loss: 0.39226, l1_loss: 16.22420
Epoch: [ 2], [ 100/ 221] time: 3180.87, d_loss: 0.28448, g_loss: 17.30076, category_loss: 0.00002, cheat_loss: 2.33146, const_loss: 0.47490, l1_loss: 14.49439
Epoch: [ 2], [ 200/ 221] time: 3762.27, d_loss: 0.03560, g_loss: 19.19828, category_loss: 0.00007, cheat_loss: 4.20593, const_loss: 0.36600, l1_loss: 14.62633
Epoch: [ 3], [ 0/ 221] time: 3880.43, d_loss: 0.02630, g_loss: 18.93944, category_loss: 0.00004, cheat_loss: 3.58121, const_loss: 0.41637, l1_loss: 14.94183
Epoch: [ 3], [ 100/ 221] time: 4464.79, d_loss: 0.57211, g_loss: 17.58680, category_loss: 0.00009, cheat_loss: 1.91106, const_loss: 0.37507, l1_loss: 15.30064
Epoch: [ 3], [ 200/ 221] time: 5047.76, d_loss: 2.81330, g_loss: 15.15771, category_loss: 0.00002, cheat_loss: 0.15222, const_loss: 0.39393, l1_loss: 14.61155
Epoch: [ 4], [ 0/ 221] time: 5165.35, d_loss: 0.56849, g_loss: 23.05492, category_loss: 0.00006, cheat_loss: 6.34440, const_loss: 0.37937, l1_loss: 16.33110
Epoch: [ 4], [ 100/ 221] time: 5744.85, d_loss: 0.17806, g_loss: 17.27935, category_loss: 0.00007, cheat_loss: 3.38671, const_loss: 0.34699, l1_loss: 13.54560
Epoch: [ 4], [ 200/ 221] time: 6322.63, d_loss: 0.18185, g_loss: 17.78687, category_loss: 0.00002, cheat_loss: 2.08401, const_loss: 0.43055, l1_loss: 15.27229
Epoch: [ 5], [ 0/ 221] time: 6442.41, d_loss: 2.13744, g_loss: 21.42826, category_loss: 0.00004, cheat_loss: 5.15124, const_loss: 0.37891, l1_loss: 15.89808
Epoch: [ 5], [ 100/ 221] time: 7021.45, d_loss: 0.08028, g_loss: 19.47682, category_loss: 0.00008, cheat_loss: 3.71971, const_loss: 0.36993, l1_loss: 15.38715
Epoch: [ 5], [ 200/ 221] time: 7601.96, d_loss: 0.02521, g_loss: 19.58899, category_loss: 0.00004, cheat_loss: 3.72400, const_loss: 0.37101, l1_loss: 15.49395
Epoch: [ 6], [ 0/ 221] time: 7719.85, d_loss: 0.09147, g_loss: 21.11709, category_loss: 0.00005, cheat_loss: 4.08192, const_loss: 0.35442, l1_loss: 16.68073
Epoch: [ 6], [ 100/ 221] time: 8303.10, d_loss: 0.02701, g_loss: 16.91349, category_loss: 0.00002, cheat_loss: 2.31061, const_loss: 0.36774, l1_loss: 14.23512
Epoch: [ 6], [ 200/ 221] time: 8882.61, d_loss: 0.15102, g_loss: 19.19582, category_loss: 0.00002, cheat_loss: 3.81039, const_loss: 0.35015, l1_loss: 15.03527
Epoch: [ 7], [ 0/ 221] time: 9000.20, d_loss: 0.02309, g_loss: 22.37927, category_loss: 0.00005, cheat_loss: 5.76501, const_loss: 0.32940, l1_loss: 16.28485
Epoch: [ 7], [ 100/ 221] time: 9581.13, d_loss: 0.02461, g_loss: 22.49588, category_loss: 0.00006, cheat_loss: 5.07542, const_loss: 0.35766, l1_loss: 17.06278
Epoch: [ 7], [ 200/ 221] time: 10160.70, d_loss: 0.55793, g_loss: 19.72526, category_loss: 0.00004, cheat_loss: 4.79006, const_loss: 0.33927, l1_loss: 14.59591
Epoch: [ 8], [ 0/ 221] time: 10278.53, d_loss: 0.02843, g_loss: 18.65423, category_loss: 0.00004, cheat_loss: 4.81907, const_loss: 0.31892, l1_loss: 13.51622
Epoch: [ 8], [ 100/ 221] time: 10858.93, d_loss: 0.03031, g_loss: 20.73404, category_loss: 0.00008, cheat_loss: 5.38640, const_loss: 0.32427, l1_loss: 15.02334
Epoch: [ 8], [ 200/ 221] time: 11441.91, d_loss: 0.04380, g_loss: 19.52982, category_loss: 0.00003, cheat_loss: 4.72400, const_loss: 0.33755, l1_loss: 14.46826
Epoch: [ 9], [ 0/ 221] time: 11559.79, d_loss: 0.11075, g_loss: 18.61131, category_loss: 0.00004, cheat_loss: 2.50399, const_loss: 0.35531, l1_loss: 15.75200
Epoch: [ 9], [ 100/ 221] time: 12142.98, d_loss: 0.04427, g_loss: 21.76070, category_loss: 0.00002, cheat_loss: 4.73563, const_loss: 0.31739, l1_loss: 16.70768
Epoch: [ 9], [ 200/ 221] time: 12726.48, d_loss: 0.01125, g_loss: 22.93324, category_loss: 0.00004, cheat_loss: 6.95069, const_loss: 0.32098, l1_loss: 15.66155
Epoch: [10], [ 0/ 221] time: 12853.06, d_loss: 0.15521, g_loss: 18.61334, category_loss: 0.00007, cheat_loss: 4.33218, const_loss: 0.31655, l1_loss: 13.96458
Epoch: [10], [ 100/ 221] time: 13436.05, d_loss: 0.07787, g_loss: 20.53145, category_loss: 0.00007, cheat_loss: 5.18724, const_loss: 0.33911, l1_loss: 15.00507
Epoch: [10], [ 200/ 221] time: 14016.40, d_loss: 0.08501, g_loss: 20.79124, category_loss: 0.00004, cheat_loss: 5.51355, const_loss: 0.31750, l1_loss: 14.96017
256×256 thin style, with final_channels = 256:
Epoch: [ 1], [ 0/ 47] time: 360.09, d_loss: 1.81648, g_loss: 8.05295, category_loss: 0.00008, cheat_loss: 0.49621, const_loss: 0.00571, l1_loss: 7.55099
Epoch: [ 2], [ 0/ 47] time: 715.98, d_loss: 1.51517, g_loss: 8.47806, category_loss: 0.00969, cheat_loss: 1.26263, const_loss: 0.00942, l1_loss: 7.20600
Epoch: [ 3], [ 0/ 47] time: 1076.88, d_loss: 1.39151, g_loss: 9.45060, category_loss: 0.00330, cheat_loss: 1.59625, const_loss: 0.00699, l1_loss: 7.84735
Epoch: [ 4], [ 0/ 47] time: 1432.70, d_loss: 1.77968, g_loss: 11.19139, category_loss: 0.00004, cheat_loss: 3.08767, const_loss: 0.01036, l1_loss: 8.09336
Epoch: [ 5], [ 0/ 47] time: 1791.91, d_loss: 1.28329, g_loss: 9.48213, category_loss: 0.00007, cheat_loss: 1.64778, const_loss: 0.01058, l1_loss: 7.82376
Epoch: [ 6], [ 0/ 47] time: 2148.23, d_loss: 1.65145, g_loss: 8.50315, category_loss: 0.00051, cheat_loss: 0.59454, const_loss: 0.00790, l1_loss: 7.89413
Epoch: [ 7], [ 0/ 47] time: 2505.78, d_loss: 1.28920, g_loss: 8.97056, category_loss: 0.00000, cheat_loss: 1.22283, const_loss: 0.00823, l1_loss: 7.73950
Epoch: [ 8], [ 0/ 47] time: 2861.63, d_loss: 1.56068, g_loss: 9.06203, category_loss: 0.00219, cheat_loss: 1.63755, const_loss: 0.00642, l1_loss: 7.41806
Epoch: [ 9], [ 0/ 47] time: 3221.33, d_loss: 1.53459, g_loss: 10.47424, category_loss: 0.00001, cheat_loss: 2.48992, const_loss: 0.00709, l1_loss: 7.97724
Decay net_D learning rate from 0.00050 to 0.00025.
Decay net_G learning rate from 0.00050 to 0.00025.
Epoch: [10], [ 0/ 47] time: 3577.16, d_loss: 2.60345, g_loss: 9.48926, category_loss: 0.00114, cheat_loss: 1.03732, const_loss: 0.01239, l1_loss: 8.43954
Epoch: [11], [ 0/ 47] time: 3940.63, d_loss: 1.34707, g_loss: 8.19656, category_loss: 0.00000, cheat_loss: 0.87049, const_loss: 0.00583, l1_loss: 7.32025
Epoch: [12], [ 0/ 47] time: 4296.30, d_loss: 0.87700, g_loss: 10.14889, category_loss: 0.00126, cheat_loss: 2.26760, const_loss: 0.00385, l1_loss: 7.87743
Epoch: [13], [ 0/ 47] time: 4655.38, d_loss: 1.65686, g_loss: 8.37238, category_loss: 0.00000, cheat_loss: 0.80900, const_loss: 0.00472, l1_loss: 7.55866
Epoch: [14], [ 0/ 47] time: 5011.44, d_loss: 0.78299, g_loss: 10.14690, category_loss: 0.00000, cheat_loss: 2.28178, const_loss: 0.00397, l1_loss: 7.86115
Epoch: [15], [ 0/ 47] time: 5369.26, d_loss: 0.77644, g_loss: 9.11055, category_loss: 0.00001, cheat_loss: 1.03779, const_loss: 0.00357, l1_loss: 8.06919
Epoch: [16], [ 0/ 47] time: 5725.36, d_loss: 1.03227, g_loss: 8.24855, category_loss: 0.00000, cheat_loss: 0.61025, const_loss: 0.00747, l1_loss: 7.63083
Epoch: [17], [ 0/ 47] time: 6081.56, d_loss: 0.84785, g_loss: 8.97112, category_loss: 0.00000, cheat_loss: 1.49794, const_loss: 0.00429, l1_loss: 7.46889
Epoch: [18], [ 0/ 47] time: 6443.13, d_loss: 0.82620, g_loss: 8.52151, category_loss: 0.00000, cheat_loss: 1.33499, const_loss: 0.00764, l1_loss: 7.17889
Epoch: [19], [ 0/ 47] time: 6799.21, d_loss: 0.70555, g_loss: 10.75724, category_loss: 0.00000, cheat_loss: 2.19755, const_loss: 0.00824, l1_loss: 8.55145
Decay net_D learning rate from 0.00025 to 0.00013.
Decay net_G learning rate from 0.00025 to 0.00013.
Epoch: [20], [ 0/ 47] time: 7156.61, d_loss: 2.74217, g_loss: 10.73975, category_loss: 0.00000, cheat_loss: 2.18354, const_loss: 0.01205, l1_loss: 8.54416
Epoch: [21], [ 0/ 47] time: 7512.87, d_loss: 0.90747, g_loss: 9.45579, category_loss: 0.00000, cheat_loss: 1.69721, const_loss: 0.00558, l1_loss: 7.75300
Epoch: [22], [ 0/ 47] time: 7870.22, d_loss: 0.58048, g_loss: 10.02434, category_loss: 0.00004, cheat_loss: 1.84721, const_loss: 0.00537, l1_loss: 8.17176
Epoch: [23], [ 0/ 47] time: 8226.16, d_loss: 0.85992, g_loss: 8.82779, category_loss: 0.00000, cheat_loss: 0.76111, const_loss: 0.00522, l1_loss: 8.06147
Epoch: [24], [ 0/ 47] time: 8586.65, d_loss: 0.48779, g_loss: 9.96469, category_loss: 0.00000, cheat_loss: 1.76908, const_loss: 0.00574, l1_loss: 8.18987
Epoch: [25], [ 0/ 47] time: 8942.70, d_loss: 0.76323, g_loss: 11.38750, category_loss: 0.00000, cheat_loss: 3.29595, const_loss: 0.00522, l1_loss: 8.08633
Epoch: [26], [ 0/ 47] time: 9300.20, d_loss: 0.62175, g_loss: 9.00046, category_loss: 0.00000, cheat_loss: 1.32260, const_loss: 0.00590, l1_loss: 7.67196
Epoch: [27], [ 0/ 47] time: 9656.29, d_loss: 0.41603, g_loss: 10.93949, category_loss: 0.00000, cheat_loss: 2.48615, const_loss: 0.00598, l1_loss: 8.44735
Quite interesting: it turns out each learning-rate decay makes d_loss suddenly grow again.
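The halving schedule implied by the decay messages can be sketched as a small helper; the 10-epoch interval is an assumption inferred from the decays at epochs 10 and 20.

```python
# Base learning rate 5e-4, halved every `decay_every` epochs
def lr_at_epoch(epoch, base_lr=5e-4, decay_every=10, factor=0.5):
    return base_lr * factor ** (epoch // decay_every)

print(lr_at_epoch(0))   # 0.0005
print(lr_at_epoch(10))  # 0.00025
print(lr_at_epoch(20))  # 0.000125  (logged as 0.00013 after rounding)
```

In PyTorch this corresponds to `torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)` applied to both net_G and net_D optimizers.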
256×256 thin style, with final_channels changed from 256 to 512:
Epoch: [ 1], [ 0/ 47] time: 361.85, d_loss: 8.39628, g_loss: 8.69999, category_loss: 0.00000, cheat_loss: 0.31410, const_loss: 0.01050, l1_loss: 8.37539
Epoch: [ 2], [ 0/ 47] time: 719.53, d_loss: 4.36914, g_loss: 11.11966, category_loss: 0.00000, cheat_loss: 3.60200, const_loss: 0.00824, l1_loss: 7.50941
Epoch: [ 3], [ 0/ 47] time: 1080.39, d_loss: 4.45263, g_loss: 8.06205, category_loss: 0.00000, cheat_loss: 0.47213, const_loss: 0.00634, l1_loss: 7.58358
Epoch: [ 4], [ 0/ 47] time: 1438.55, d_loss: 8.08992, g_loss: 13.24943, category_loss: 0.00008, cheat_loss: 5.26327, const_loss: 0.01430, l1_loss: 7.97185
Epoch: [ 5], [ 0/ 47] time: 1798.56, d_loss: 3.60792, g_loss: 9.88743, category_loss: 0.09016, cheat_loss: 2.15218, const_loss: 0.00835, l1_loss: 7.72689
Epoch: [ 6], [ 0/ 47] time: 2156.66, d_loss: 2.63320, g_loss: 8.65966, category_loss: 0.03130, cheat_loss: 0.41417, const_loss: 0.01390, l1_loss: 8.23007
Epoch: [ 7], [ 0/ 47] time: 2517.04, d_loss: 2.33885, g_loss: 8.99310, category_loss: 0.00000, cheat_loss: 0.96819, const_loss: 0.00884, l1_loss: 8.01606
Epoch: [ 8], [ 0/ 47] time: 2874.65, d_loss: 3.12392, g_loss: 9.07584, category_loss: 0.00826, cheat_loss: 1.57318, const_loss: 0.01214, l1_loss: 7.49052
Epoch: [ 9], [ 0/ 47] time: 3239.68, d_loss: 4.68467, g_loss: 8.65565, category_loss: 0.74174, cheat_loss: 0.56121, const_loss: 0.01337, l1_loss: 8.07804
Decay net_D learning rate from 0.00050 to 0.00025.
Decay net_G learning rate from 0.00050 to 0.00025.
Epoch: [10], [ 0/ 47] time: 3597.89, d_loss: 2.75090, g_loss: 9.95582, category_loss: 0.33848, cheat_loss: 1.28554, const_loss: 0.01345, l1_loss: 8.62938
Epoch: [11], [ 0/ 47] time: 3963.18, d_loss: 2.89727, g_loss: 9.85377, category_loss: 0.01938, cheat_loss: 1.51332, const_loss: 0.00609, l1_loss: 8.33435
Epoch: [12], [ 0/ 47] time: 4349.42, d_loss: 1.54349, g_loss: 8.00276, category_loss: 0.00802, cheat_loss: 0.84313, const_loss: 0.00565, l1_loss: 7.15398
Epoch: [13], [ 0/ 47] time: 4738.65, d_loss: 2.28097, g_loss: 9.60725, category_loss: 0.00000, cheat_loss: 1.86188, const_loss: 0.00477, l1_loss: 7.74061
After about 100 epochs, the light style's d_loss sits between 1.5 and 5.0, and the inference results look about the same: passable but not impressive. The good news is that the output has not collapsed. Previously, with final_channels = 1, continued training produced speckle noise, rippled lines, and loss of brush-stroke detail (round stroke ends turned square); by then d_loss was below 0.01 and kept dropping the longer training ran, yet the inference results only got worse!
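That failure mode (d_loss far below normal and still falling while outputs degrade) can be watched for with a simple heuristic; the threshold and patience values here are illustrative assumptions, not tuned numbers.

```python
def collapse_warning(d_losses, threshold=0.01, patience=50):
    """Return the first step index at which d_loss has stayed below
    `threshold` for `patience` consecutive steps, else None."""
    streak = 0
    for step, d in enumerate(d_losses):
        streak = streak + 1 if d < threshold else 0
        if streak >= patience:
            return step
    return None
```

When the warning fires it is usually worth checkpointing and inspecting samples rather than trusting the still-decreasing loss curve.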
256×256 thin style, with final_channels = 512, after a further 50 minutes of training (7 epochs): the quality of the inference results partially improved.
Epoch: [ 1], [ 0/ 49] time: 465.70, d_loss: 2.23387, g_loss: 8.52372, category_loss: 0.00467, cheat_loss: 1.42327, const_loss: 0.00785, l1_loss: 7.09260
Epoch: [ 2], [ 0/ 49] time: 924.72, d_loss: 2.96535, g_loss: 10.29011, category_loss: 0.00000, cheat_loss: 2.88320, const_loss: 0.00995, l1_loss: 7.39182
Epoch: [ 3], [ 0/ 49] time: 1381.85, d_loss: 2.98210, g_loss: 8.13968, category_loss: 0.00000, cheat_loss: 0.77989, const_loss: 0.00660, l1_loss: 7.35319
Epoch: [ 4], [ 0/ 49] time: 1828.72, d_loss: 1.38947, g_loss: 8.37554, category_loss: 0.00770, cheat_loss: 1.35332, const_loss: 0.00569, l1_loss: 7.01502
Epoch: [ 5], [ 0/ 49] time: 2282.06, d_loss: 1.52924, g_loss: 8.68131, category_loss: 0.00000, cheat_loss: 1.28559, const_loss: 0.00723, l1_loss: 7.38849
Epoch: [ 6], [ 0/ 49] time: 2728.68, d_loss: 1.57393, g_loss: 10.14317, category_loss: 0.00340, cheat_loss: 1.39852, const_loss: 0.04536, l1_loss: 8.69929
Epoch: [ 7], [ 0/ 49] time: 3181.39, d_loss: 1.27554, g_loss: 9.30466, category_loss: 0.00881, cheat_loss: 1.36761, const_loss: 0.02494, l1_loss: 7.91210
Test training at 384×384 regular style, console log:
Epoch: [ 0], [ 100/ 221] time: 634.66, d_loss: 1.70170, g_loss: 22.49987, cheat_loss: 0.42731, const_loss: 0.24854, l1_loss: 21.81741
Epoch: [ 0], [ 200/ 221] time: 1263.72, d_loss: 1.48091, g_loss: 22.52045, cheat_loss: 0.60753, const_loss: 0.13540, l1_loss: 21.77489
Epoch: [ 1], [ 0/ 221] time: 1391.64, d_loss: 1.59082, g_loss: 20.73450, cheat_loss: 0.65361, const_loss: 0.21593, l1_loss: 19.86307
Epoch: [ 1], [ 100/ 221] time: 2022.13, d_loss: 1.41247, g_loss: 19.61304, cheat_loss: 0.62105, const_loss: 0.18016, l1_loss: 18.81098
Epoch: [ 1], [ 200/ 221] time: 2649.22, d_loss: 1.41584, g_loss: 17.20602, cheat_loss: 0.64389, const_loss: 0.15569, l1_loss: 16.40570
Epoch: [ 2], [ 0/ 221] time: 2777.23, d_loss: 1.41360, g_loss: 18.28119, cheat_loss: 0.63881, const_loss: 0.18331, l1_loss: 17.45838
Epoch: [ 2], [ 100/ 221] time: 3402.93, d_loss: 1.42232, g_loss: 17.60743, cheat_loss: 0.65547, const_loss: 0.16658, l1_loss: 16.78504
Epoch: [ 2], [ 200/ 221] time: 4031.25, d_loss: 1.40368, g_loss: 16.59215, cheat_loss: 0.65257, const_loss: 0.14757, l1_loss: 15.79176
Epoch: [ 3], [ 0/ 221] time: 4158.74, d_loss: 1.42200, g_loss: 17.40098, cheat_loss: 0.65941, const_loss: 0.10877, l1_loss: 16.63258
Epoch: [ 3], [ 100/ 221] time: 4783.63, d_loss: 1.41223, g_loss: 17.48169, cheat_loss: 0.67534, const_loss: 0.12153, l1_loss: 16.68464
Epoch: [ 3], [ 200/ 221] time: 5408.28, d_loss: 1.39375, g_loss: 15.83211, cheat_loss: 0.67299, const_loss: 0.10264, l1_loss: 15.05636
Epoch: [ 4], [ 0/ 221] time: 5535.40, d_loss: 1.39302, g_loss: 16.69980, cheat_loss: 0.68224, const_loss: 0.09738, l1_loss: 15.92006
Epoch: [ 4], [ 100/ 221] time: 6161.13, d_loss: 1.39293, g_loss: 17.78801, cheat_loss: 0.67948, const_loss: 0.13956, l1_loss: 16.96887
Epoch: [ 4], [ 200/ 221] time: 6786.12, d_loss: 1.39770, g_loss: 15.28551, cheat_loss: 0.67972, const_loss: 0.13683, l1_loss: 14.46888
Epoch: [ 5], [ 0/ 221] time: 6913.42, d_loss: 1.40720, g_loss: 15.67122, cheat_loss: 0.67266, const_loss: 0.14205, l1_loss: 14.85644
Epoch: [ 5], [ 100/ 221] time: 7545.94, d_loss: 1.39295, g_loss: 14.36823, cheat_loss: 0.68092, const_loss: 0.12484, l1_loss: 13.56240
Epoch: [ 5], [ 200/ 221] time: 8171.66, d_loss: 1.39301, g_loss: 14.40486, cheat_loss: 0.68241, const_loss: 0.11845, l1_loss: 13.60395
Epoch: [ 6], [ 0/ 221] time: 8298.72, d_loss: 1.39161, g_loss: 14.13372, cheat_loss: 0.67828, const_loss: 0.11094, l1_loss: 13.34445
Epoch: [ 6], [ 100/ 221] time: 8924.61, d_loss: 1.39644, g_loss: 14.17825, cheat_loss: 0.68293, const_loss: 0.07967, l1_loss: 13.41558
Epoch: [ 6], [ 200/ 221] time: 9552.01, d_loss: 1.41293, g_loss: 13.47848, cheat_loss: 0.67802, const_loss: 0.09617, l1_loss: 12.70425
Epoch: [ 7], [ 0/ 221] time: 9679.24, d_loss: 1.39550, g_loss: 13.60544, cheat_loss: 0.68243, const_loss: 0.09686, l1_loss: 12.82611
Epoch: [ 7], [ 100/ 221] time: 10303.91, d_loss: 1.39225, g_loss: 12.80960, cheat_loss: 0.68257, const_loss: 0.10408, l1_loss: 12.02289
Epoch: [ 7], [ 200/ 221] time: 10928.28, d_loss: 1.46856, g_loss: 12.71749, cheat_loss: 0.68491, const_loss: 0.09512, l1_loss: 11.93743
Epoch: [ 8], [ 0/ 221] time: 11055.41, d_loss: 1.41648, g_loss: 13.58018, cheat_loss: 0.68493, const_loss: 0.11424, l1_loss: 12.78097
Epoch: [ 8], [ 100/ 221] time: 11683.00, d_loss: 1.39109, g_loss: 12.73312, cheat_loss: 0.68135, const_loss: 0.11578, l1_loss: 11.93596
Epoch: [ 8], [ 200/ 221] time: 12306.74, d_loss: 1.39255, g_loss: 12.82240, cheat_loss: 0.68034, const_loss: 0.09786, l1_loss: 12.04418
Epoch: [ 9], [ 0/ 221] time: 12433.23, d_loss: 1.43981, g_loss: 12.24602, cheat_loss: 0.68305, const_loss: 0.08212, l1_loss: 11.48081
Epoch: [ 9], [ 100/ 221] time: 13054.40, d_loss: 1.44596, g_loss: 11.31403, cheat_loss: 0.68690, const_loss: 0.06841, l1_loss: 10.55870
One epoch takes 1391 seconds on Colab, which is painfully slow…
resume:
Epoch: [ 0], [ 0/ 196] time: 6, d_loss: 1.41847, g_loss: 12.82164, cheat_loss: 0.68701, const_loss: 0.08576, l1_loss: 12.04887
Epoch: [ 0], [ 100/ 196] time: 545, d_loss: 1.40066, g_loss: 8.72282, cheat_loss: 0.68164, const_loss: 0.05754, l1_loss: 7.98364
Epoch: [ 1], [ 0/ 196] time: 1062, d_loss: 1.40546, g_loss: 7.96335, cheat_loss: 0.67383, const_loss: 0.04352, l1_loss: 7.24601
Epoch: [ 1], [ 100/ 196] time: 1604, d_loss: 1.41142, g_loss: 8.75384, cheat_loss: 0.67480, const_loss: 0.04330, l1_loss: 8.03574
Epoch: [ 2], [ 0/ 196] time: 2121, d_loss: 1.46303, g_loss: 7.63563, cheat_loss: 0.68262, const_loss: 0.03082, l1_loss: 6.92220
Epoch: [ 2], [ 100/ 196] time: 2661, d_loss: 1.43247, g_loss: 8.88612, cheat_loss: 0.68555, const_loss: 0.04210, l1_loss: 8.15847
Epoch: [ 3], [ 0/ 196] time: 3176, d_loss: 1.44787, g_loss: 7.80330, cheat_loss: 0.68262, const_loss: 0.03887, l1_loss: 7.08181
Epoch: [ 3], [ 100/ 196] time: 3715, d_loss: 1.46882, g_loss: 7.74398, cheat_loss: 0.68213, const_loss: 0.03757, l1_loss: 7.02428
Epoch: [ 4], [ 0/ 196] time: 4231, d_loss: 1.46882, g_loss: 7.77863, cheat_loss: 0.66943, const_loss: 0.04005, l1_loss: 7.06915
Epoch: [ 4], [ 100/ 196] time: 4769, d_loss: 1.42624, g_loss: 7.81067, cheat_loss: 0.66846, const_loss: 0.03795, l1_loss: 7.10426
Epoch: [ 5], [ 0/ 196] time: 5293, d_loss: 1.45096, g_loss: 7.69336, cheat_loss: 0.67822, const_loss: 0.03943, l1_loss: 6.97571
Epoch: [ 5], [ 100/ 196] time: 5830, d_loss: 1.43659, g_loss: 7.91153, cheat_loss: 0.67139, const_loss: 0.03534, l1_loss: 7.20480
Epoch: [ 6], [ 0/ 196] time: 6348, d_loss: 1.41970, g_loss: 6.80506, cheat_loss: 0.66406, const_loss: 0.03250, l1_loss: 6.10850
Epoch: [ 6], [ 100/ 196] time: 6885, d_loss: 1.45816, g_loss: 7.53015, cheat_loss: 0.66016, const_loss: 0.03092, l1_loss: 6.83907
Epoch: [ 7], [ 0/ 196] time: 7409, d_loss: 1.44013, g_loss: 6.55591, cheat_loss: 0.66943, const_loss: 0.03925, l1_loss: 5.84721
Epoch: [ 7], [ 100/ 196] time: 7947, d_loss: 1.41193, g_loss: 6.73148, cheat_loss: 0.66406, const_loss: 0.03507, l1_loss: 6.03233
Note: g_loss fell from 12.8 to 6.7.
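The resume runs above amount to a save/load cycle over both networks; a minimal sketch follows. The checkpoint keys ("net_G", "net_D", "epoch") are assumptions, not the project's actual on-disk format.

```python
import torch
import torch.nn as nn

def save_checkpoint(path, net_G, net_D, epoch):
    # store both networks' weights plus the epoch counter in one file
    torch.save({"net_G": net_G.state_dict(),
                "net_D": net_D.state_dict(),
                "epoch": epoch}, path)

def load_checkpoint(path, net_G, net_D):
    ckpt = torch.load(path, map_location="cpu")
    net_G.load_state_dict(ckpt["net_G"])
    net_D.load_state_dict(ckpt["net_D"])
    return ckpt["epoch"] + 1  # first epoch of the resumed run
```

Saving the optimizer and scheduler state as well would keep the learning-rate decay schedule continuous across resumes.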
resume:
Epoch: [ 1], [ 0/ 202] time: 1142, d_loss: 1.44004, g_loss: 5.80526, cheat_loss: 0.66113, const_loss: 0.02661, l1_loss: 5.07184, fm_loss: 0.04567
Epoch: [ 1], [ 100/ 202] time: 1706, d_loss: 1.40803, g_loss: 5.13457, cheat_loss: 0.67285, const_loss: 0.03044, l1_loss: 4.38962, fm_loss: 0.04166
Epoch: [ 1], [ 200/ 202] time: 2271, d_loss: 1.39724, g_loss: 6.47388, cheat_loss: 0.68262, const_loss: 0.02895, l1_loss: 5.72070, fm_loss: 0.04161
Epoch: [ 2], [ 0/ 202] time: 2280, d_loss: 1.39731, g_loss: 5.63391, cheat_loss: 0.66650, const_loss: 0.03386, l1_loss: 4.89516, fm_loss: 0.03838
Epoch: [ 2], [ 100/ 202] time: 2843, d_loss: 1.45686, g_loss: 5.68246, cheat_loss: 0.66992, const_loss: 0.02058, l1_loss: 4.95654, fm_loss: 0.03541
Epoch: [ 2], [ 200/ 202] time: 3410, d_loss: 1.40458, g_loss: 4.86240, cheat_loss: 0.67480, const_loss: 0.02480, l1_loss: 4.13255, fm_loss: 0.03024
Epoch: [ 3], [ 0/ 202] time: 3418, d_loss: 1.40617, g_loss: 5.00629, cheat_loss: 0.67432, const_loss: 0.02830, l1_loss: 4.27324, fm_loss: 0.03043
Epoch: [ 3], [ 100/ 202] time: 3982, d_loss: 1.39211, g_loss: 5.22306, cheat_loss: 0.67285, const_loss: 0.03338, l1_loss: 4.48781, fm_loss: 0.02902
Epoch: [ 3], [ 200/ 202] time: 4547, d_loss: 1.45827, g_loss: 5.57950, cheat_loss: 0.68213, const_loss: 0.02456, l1_loss: 4.84348, fm_loss: 0.02932
Epoch: [ 4], [ 0/ 202] time: 4555, d_loss: 1.47288, g_loss: 5.40314, cheat_loss: 0.68652, const_loss: 0.01939, l1_loss: 4.66541, fm_loss: 0.03182
Epoch: [ 4], [ 100/ 202] time: 5120, d_loss: 1.39476, g_loss: 5.72734, cheat_loss: 0.67822, const_loss: 0.02388, l1_loss: 4.99732, fm_loss: 0.02792
Epoch: [ 4], [ 200/ 202] time: 5692, d_loss: 1.39564, g_loss: 5.22532, cheat_loss: 0.68457, const_loss: 0.03895, l1_loss: 4.47553, fm_loss: 0.02627
Epoch: [ 5], [ 0/ 202] time: 5700, d_loss: 1.39160, g_loss: 6.46715, cheat_loss: 0.68018, const_loss: 0.02223, l1_loss: 5.73634, fm_loss: 0.02840
Epoch: [ 5], [ 100/ 202] time: 6264, d_loss: 1.39195, g_loss: 5.03146, cheat_loss: 0.68262, const_loss: 0.02123, l1_loss: 4.30367, fm_loss: 0.02394
Epoch: [ 5], [ 200/ 202] time: 6831, d_loss: 1.40944, g_loss: 5.17871, cheat_loss: 0.68115, const_loss: 0.01796, l1_loss: 4.45584, fm_loss: 0.02375
Epoch: [ 6], [ 0/ 202] time: 6839, d_loss: 1.41229, g_loss: 4.85709, cheat_loss: 0.68555, const_loss: 0.01708, l1_loss: 4.13129, fm_loss: 0.02317
Epoch: [ 6], [ 100/ 202] time: 7404, d_loss: 1.39649, g_loss: 5.24412, cheat_loss: 0.67773, const_loss: 0.02019, l1_loss: 4.52355, fm_loss: 0.02264
Epoch: [ 6], [ 200/ 202] time: 7973, d_loss: 1.39143, g_loss: 5.19182, cheat_loss: 0.68555, const_loss: 0.02055, l1_loss: 4.46391, fm_loss: 0.02181
Epoch: [ 7], [ 0/ 202] time: 7982, d_loss: 1.38972, g_loss: 4.63458, cheat_loss: 0.68262, const_loss: 0.01598, l1_loss: 3.91555, fm_loss: 0.02044
Epoch: [ 7], [ 100/ 202] time: 8545, d_loss: 1.39089, g_loss: 5.00209, cheat_loss: 0.68506, const_loss: 0.01831, l1_loss: 4.27795, fm_loss: 0.02077
Epoch: [ 7], [ 200/ 202] time: 9110, d_loss: 1.39031, g_loss: 4.92829, cheat_loss: 0.67920, const_loss: 0.02005, l1_loss: 4.20916, fm_loss: 0.01988
Note: g_loss hovers around 4.6 ~ 6.4 and is hard to push lower.
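The fm_loss column that appears in the log above is presumably a feature-matching term; here is a minimal sketch of such a loss. The exact formulation the project uses (which layers, L1 vs. L2, weighting) is an assumption.

```python
import torch
import torch.nn.functional as F

def feature_matching_loss(real_feats, fake_feats):
    """L1 distance between discriminator intermediate activations on real
    and generated images, averaged over the layers provided."""
    terms = [F.l1_loss(fake, real.detach())
             for real, fake in zip(real_feats, fake_feats)]
    return sum(terms) / len(terms)
```

Matching intermediate features gives the generator a denser training signal than the final real/fake score alone, which can help when g_loss plateaus.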
Adjusted the Generator code: replaced its BatchNorm2d layers with InstanceNorm2d, and replaced the Discriminator's InstanceNorm2d layers with BatchNorm2d.
Epoch: [ 0], [ 0/ 202] time: 8, d_loss: 1.40712, g_loss: 26.15663, cheat_loss: 0.68213, const_loss: 0.10550, l1_loss: 25.30757, fm_loss: 0.06144
Epoch: [ 0], [ 100/ 202] time: 590, d_loss: 1.48672, g_loss: 8.84747, cheat_loss: 0.60889, const_loss: 0.06266, l1_loss: 8.06202, fm_loss: 0.11390
Epoch: [ 0], [ 200/ 202] time: 1172, d_loss: 1.40296, g_loss: 9.34245, cheat_loss: 0.65771, const_loss: 0.07859, l1_loss: 8.55779, fm_loss: 0.04836
Epoch: [ 1], [ 0/ 202] time: 1182, d_loss: 1.39629, g_loss: 8.71565, cheat_loss: 0.66113, const_loss: 0.06928, l1_loss: 7.94140, fm_loss: 0.04383
Epoch: [ 1], [ 100/ 202] time: 1765, d_loss: 1.43670, g_loss: 7.73438, cheat_loss: 0.67480, const_loss: 0.05115, l1_loss: 6.97976, fm_loss: 0.02867
Epoch: [ 1], [ 200/ 202] time: 2348, d_loss: 1.40638, g_loss: 7.85832, cheat_loss: 0.68652, const_loss: 0.06765, l1_loss: 7.07678, fm_loss: 0.02737
Epoch: [ 2], [ 0/ 202] time: 2357, d_loss: 1.39239, g_loss: 7.65690, cheat_loss: 0.68164, const_loss: 0.04307, l1_loss: 6.90672, fm_loss: 0.02548
Epoch: [ 2], [ 100/ 202] time: 2938, d_loss: 1.38933, g_loss: 6.81601, cheat_loss: 0.68115, const_loss: 0.03963, l1_loss: 6.07037, fm_loss: 0.02486
Epoch: [ 2], [ 200/ 202] time: 3521, d_loss: 1.41135, g_loss: 7.13150, cheat_loss: 0.68164, const_loss: 0.04640, l1_loss: 6.37542, fm_loss: 0.02804
Epoch: [ 3], [ 0/ 202] time: 3530, d_loss: 1.43597, g_loss: 7.47637, cheat_loss: 0.69092, const_loss: 0.07728, l1_loss: 6.67873, fm_loss: 0.02944
Epoch: [ 3], [ 100/ 202] time: 4111, d_loss: 1.39918, g_loss: 7.17783, cheat_loss: 0.68311, const_loss: 0.05178, l1_loss: 6.41804, fm_loss: 0.02491
Epoch: [ 3], [ 200/ 202] time: 4694, d_loss: 1.42672, g_loss: 6.70351, cheat_loss: 0.68896, const_loss: 0.04235, l1_loss: 5.94963, fm_loss: 0.02256
Epoch: [ 4], [ 0/ 202] time: 4702, d_loss: 1.39787, g_loss: 6.80694, cheat_loss: 0.69092, const_loss: 0.03582, l1_loss: 6.05838, fm_loss: 0.02183
Epoch: [ 4], [ 100/ 202] time: 5284, d_loss: 1.38997, g_loss: 6.34044, cheat_loss: 0.68066, const_loss: 0.03963, l1_loss: 5.59820, fm_loss: 0.02194
Epoch: [ 4], [ 200/ 202] time: 5866, d_loss: 1.40147, g_loss: 6.45916, cheat_loss: 0.69141, const_loss: 0.03558, l1_loss: 5.70890, fm_loss: 0.02327
Epoch: [ 5], [ 0/ 202] time: 5874, d_loss: 1.40212, g_loss: 5.63403, cheat_loss: 0.68359, const_loss: 0.04937, l1_loss: 4.88081, fm_loss: 0.02026
Epoch: [ 5], [ 100/ 202] time: 6456, d_loss: 1.39155, g_loss: 5.78776, cheat_loss: 0.68359, const_loss: 0.03568, l1_loss: 5.04722, fm_loss: 0.02127
Epoch: [ 5], [ 200/ 202] time: 7039, d_loss: 1.39225, g_loss: 6.35629, cheat_loss: 0.68359, const_loss: 0.11321, l1_loss: 5.53837, fm_loss: 0.02111
Epoch: [ 6], [ 0/ 202] time: 7048, d_loss: 1.39133, g_loss: 6.38244, cheat_loss: 0.68994, const_loss: 0.05826, l1_loss: 5.61338, fm_loss: 0.02086
Epoch: [ 6], [ 100/ 202] time: 7628, d_loss: 1.38921, g_loss: 6.15718, cheat_loss: 0.68262, const_loss: 0.03424, l1_loss: 5.41628, fm_loss: 0.02404
Epoch: [ 6], [ 200/ 202] time: 8210, d_loss: 1.39549, g_loss: 5.85658, cheat_loss: 0.68457, const_loss: 0.04217, l1_loss: 5.11076, fm_loss: 0.01908
Epoch: [ 7], [ 0/ 202] time: 8219, d_loss: 1.39021, g_loss: 6.96465, cheat_loss: 0.68408, const_loss: 0.03439, l1_loss: 6.22502, fm_loss: 0.02116
Epoch: [ 7], [ 100/ 202] time: 8800, d_loss: 1.39144, g_loss: 5.57077, cheat_loss: 0.67822, const_loss: 0.03597, l1_loss: 4.83961, fm_loss: 0.01697
Epoch: [ 7], [ 200/ 202] time: 9384, d_loss: 1.39012, g_loss: 5.69039, cheat_loss: 0.68652, const_loss: 0.03513, l1_loss: 4.95026, fm_loss: 0.01848
Epoch: [ 8], [ 0/ 202] time: 9393, d_loss: 1.40387, g_loss: 5.68651, cheat_loss: 0.68555, const_loss: 0.03184, l1_loss: 4.94915, fm_loss: 0.01997
Epoch: [ 8], [ 100/ 202] time: 9975, d_loss: 1.38946, g_loss: 5.37525, cheat_loss: 0.68750, const_loss: 0.04059, l1_loss: 4.63120, fm_loss: 0.01596
Epoch: [ 8], [ 200/ 202] time: 10559, d_loss: 1.38896, g_loss: 5.19324, cheat_loss: 0.68750, const_loss: 0.02264, l1_loss: 4.46650, fm_loss: 0.01660
Epoch: [ 9], [ 0/ 202] time: 10567, d_loss: 1.38873, g_loss: 5.40119, cheat_loss: 0.68262, const_loss: 0.02946, l1_loss: 4.67303, fm_loss: 0.01609
Epoch: [ 9], [ 100/ 202] time: 11149, d_loss: 1.38959, g_loss: 5.42721, cheat_loss: 0.69238, const_loss: 0.03315, l1_loss: 4.68469, fm_loss: 0.01699
Epoch: [ 9], [ 200/ 202] time: 11738, d_loss: 1.38867, g_loss: 5.54938, cheat_loss: 0.68945, const_loss: 0.02868, l1_loss: 4.81458, fm_loss: 0.01666
Epoch: [10], [ 0/ 202] time: 11746, d_loss: 1.39234, g_loss: 5.18427, cheat_loss: 0.68848, const_loss: 0.02229, l1_loss: 4.45758, fm_loss: 0.01592
Epoch: [10], [ 100/ 202] time: 12328, d_loss: 1.39055, g_loss: 5.84795, cheat_loss: 0.68701, const_loss: 0.02416, l1_loss: 5.12025, fm_loss: 0.01653
Epoch: [10], [ 200/ 202] time: 12917, d_loss: 1.39030, g_loss: 5.44530, cheat_loss: 0.69092, const_loss: 0.02134, l1_loss: 4.71811, fm_loss: 0.01493
Epoch: [11], [ 0/ 202] time: 12926, d_loss: 1.38862, g_loss: 4.78757, cheat_loss: 0.68945, const_loss: 0.02272, l1_loss: 4.06196, fm_loss: 0.01343
Switched the Generator to Instance Normalization (IN). Because the pretrained checkpoint has no such layer, g_loss spikes at first, but after about 10 epochs of training it settles back to around 5.
Training after adding the VGG16 perceptual loss: g_loss rises from 5 to around 10, and vgg_loss declines only slightly, at a very slow rate:
Epoch: [ 0], [ 100/ 294] time: 526, d_loss: 1.38721, g_loss: 10.57981, cheat_loss: 0.69141, const_loss: 0.02269, l1_loss: 3.66918, fm_loss: 0.01239, vgg_loss: 0.61841
Epoch: [ 0], [ 200/ 294] time: 1046, d_loss: 1.39747, g_loss: 10.77263, cheat_loss: 0.69043, const_loss: 0.03536, l1_loss: 3.75207, fm_loss: 0.01292, vgg_loss: 0.62819
Epoch: [ 1], [ 0/ 294] time: 1536, d_loss: 1.38700, g_loss: 10.02796, cheat_loss: 0.69189, const_loss: 0.02411, l1_loss: 3.48752, fm_loss: 0.01150, vgg_loss: 0.58129
Epoch: [ 1], [ 100/ 294] time: 2055, d_loss: 1.38757, g_loss: 10.62183, cheat_loss: 0.68945, const_loss: 0.03176, l1_loss: 3.76850, fm_loss: 0.01323, vgg_loss: 0.61189
Epoch: [ 1], [ 200/ 294] time: 2577, d_loss: 1.38769, g_loss: 11.69133, cheat_loss: 0.69238, const_loss: 0.02257, l1_loss: 4.43377, fm_loss: 0.01731, vgg_loss: 0.65253
Epoch: [ 2], [ 0/ 294] time: 3065, d_loss: 1.38730, g_loss: 11.11434, cheat_loss: 0.69336, const_loss: 0.02291, l1_loss: 4.08125, fm_loss: 0.01357, vgg_loss: 0.63033
Epoch: [ 2], [ 100/ 294] time: 3586, d_loss: 1.38716, g_loss: 11.05450, cheat_loss: 0.69434, const_loss: 0.02658, l1_loss: 4.06144, fm_loss: 0.01349, vgg_loss: 0.62587
resume:
Epoch: [ 0], [ 100/ 294] time: 515, d_loss: 1.38774, g_loss: 9.71862, cheat_loss: 0.69238, const_loss: 0.02789, l1_loss: 3.29930, fm_loss: 0.01034, vgg_loss: 0.56887
Epoch: [ 0], [ 200/ 294] time: 1024, d_loss: 1.38738, g_loss: 10.06664, cheat_loss: 0.69287, const_loss: 0.03963, l1_loss: 3.47987, fm_loss: 0.01116, vgg_loss: 0.58431
Epoch: [ 1], [ 0/ 294] time: 1508, d_loss: 1.38711, g_loss: 10.33678, cheat_loss: 0.69141, const_loss: 0.01839, l1_loss: 3.71158, fm_loss: 0.01285, vgg_loss: 0.59026
Epoch: [ 1], [ 100/ 294] time: 2016, d_loss: 1.38815, g_loss: 9.68200, cheat_loss: 0.69092, const_loss: 0.02603, l1_loss: 3.32159, fm_loss: 0.01178, vgg_loss: 0.56317
Epoch: [ 1], [ 200/ 294] time: 2526, d_loss: 1.38750, g_loss: 11.04752, cheat_loss: 0.68994, const_loss: 0.02302, l1_loss: 4.08738, fm_loss: 0.01534, vgg_loss: 0.62318
Epoch: [ 2], [ 0/ 294] time: 3003, d_loss: 1.39048, g_loss: 9.92757, cheat_loss: 0.69434, const_loss: 0.02435, l1_loss: 3.48197, fm_loss: 0.01181, vgg_loss: 0.57151
Epoch: [ 2], [ 100/ 294] time: 3518, d_loss: 1.38703, g_loss: 10.41778, cheat_loss: 0.69385, const_loss: 0.02012, l1_loss: 3.72904, fm_loss: 0.01173, vgg_loss: 0.59630
Epoch: [ 2], [ 200/ 294] time: 4027, d_loss: 1.38797, g_loss: 10.05972, cheat_loss: 0.69189, const_loss: 0.02587, l1_loss: 3.46052, fm_loss: 0.01275, vgg_loss: 0.58687
Epoch: [ 3], [ 0/ 294] time: 4505, d_loss: 1.38821, g_loss: 10.92889, cheat_loss: 0.69238, const_loss: 0.01674, l1_loss: 3.91190, fm_loss: 0.01498, vgg_loss: 0.62929
Epoch: [ 3], [ 100/ 294] time: 5013, d_loss: 1.38751, g_loss: 10.62019, cheat_loss: 0.69141, const_loss: 0.02753, l1_loss: 3.80641, fm_loss: 0.01234, vgg_loss: 0.60825
Epoch: [ 3], [ 200/ 294] time: 5522, d_loss: 1.38836, g_loss: 10.27982, cheat_loss: 0.69189, const_loss: 0.01873, l1_loss: 3.62583, fm_loss: 0.01275, vgg_loss: 0.59306
Epoch: [ 4], [ 0/ 294] time: 5999, d_loss: 1.38970, g_loss: 11.01090, cheat_loss: 0.69238, const_loss: 0.03371, l1_loss: 3.98023, fm_loss: 0.01290, vgg_loss: 0.62917
Epoch: [ 4], [ 100/ 294] time: 6508, d_loss: 1.38711, g_loss: 10.47773, cheat_loss: 0.69238, const_loss: 0.02625, l1_loss: 3.69440, fm_loss: 0.01311, vgg_loss: 0.60516
Separate training run targeting hard-to-learn special structural rules and characters with 17 or more strokes:
Epoch: [ 1], [ 0/ 61] time: 317, d_loss: 1.39201, g_loss: 12.68400, cheat_loss: 0.69043, const_loss: 0.01959, l1_loss: 4.44061, fm_loss: 0.01375, vgg_loss: 0.75196
Epoch: [ 2], [ 0/ 61] time: 633, d_loss: 1.39053, g_loss: 13.22295, cheat_loss: 0.69189, const_loss: 0.01997, l1_loss: 4.87483, fm_loss: 0.01443, vgg_loss: 0.76218
Epoch: [ 3], [ 0/ 61] time: 950, d_loss: 1.38706, g_loss: 12.17336, cheat_loss: 0.69434, const_loss: 0.01893, l1_loss: 4.21018, fm_loss: 0.01704, vgg_loss: 0.72329
Epoch: [ 4], [ 0/ 61] time: 1268, d_loss: 1.38686, g_loss: 12.15381, cheat_loss: 0.69336, const_loss: 0.03296, l1_loss: 4.22026, fm_loss: 0.01302, vgg_loss: 0.71942
Epoch: [ 5], [ 0/ 61] time: 1584, d_loss: 1.38713, g_loss: 11.68402, cheat_loss: 0.69141, const_loss: 0.03301, l1_loss: 4.04975, fm_loss: 0.01299, vgg_loss: 0.68969
Epoch: [ 6], [ 0/ 61] time: 1900, d_loss: 1.38864, g_loss: 12.62610, cheat_loss: 0.69336, const_loss: 0.01831, l1_loss: 4.66389, fm_loss: 0.01227, vgg_loss: 0.72383
Epoch: [ 7], [ 0/ 61] time: 2218, d_loss: 1.38984, g_loss: 11.68615, cheat_loss: 0.68896, const_loss: 0.01738, l1_loss: 3.96655, fm_loss: 0.01655, vgg_loss: 0.69967
Epoch: [ 8], [ 0/ 61] time: 2534, d_loss: 1.39058, g_loss: 10.58258, cheat_loss: 0.69336, const_loss: 0.01762, l1_loss: 3.53934, fm_loss: 0.01174, vgg_loss: 0.63205
Epoch: [ 9], [ 0/ 61] time: 2850, d_loss: 1.39206, g_loss: 10.84399, cheat_loss: 0.69482, const_loss: 0.01864, l1_loss: 3.61558, fm_loss: 0.01134, vgg_loss: 0.65036
Epoch: [10], [ 0/ 61] time: 3167, d_loss: 1.39295, g_loss: 11.44258, cheat_loss: 0.69092, const_loss: 0.01392, l1_loss: 3.92835, fm_loss: 0.01318, vgg_loss: 0.67962
Epoch: [11], [ 0/ 61] time: 3484, d_loss: 1.38692, g_loss: 11.93794, cheat_loss: 0.69287, const_loss: 0.01482, l1_loss: 4.15529, fm_loss: 0.01240, vgg_loss: 0.70626
Epoch: [12], [ 0/ 61] time: 3800, d_loss: 1.38748, g_loss: 11.11722, cheat_loss: 0.69336, const_loss: 0.01971, l1_loss: 3.74768, fm_loss: 0.01165, vgg_loss: 0.66448
Epoch: [13], [ 0/ 61] time: 4116, d_loss: 1.38903, g_loss: 10.63667, cheat_loss: 0.69385, const_loss: 0.01631, l1_loss: 3.65030, fm_loss: 0.01001, vgg_loss: 0.62662
Epoch: [14], [ 0/ 61] time: 4439, d_loss: 1.38796, g_loss: 11.03755, cheat_loss: 0.69287, const_loss: 0.01179, l1_loss: 3.61145, fm_loss: 0.01143, vgg_loss: 0.67100
Epoch: [15], [ 0/ 61] time: 4756, d_loss: 1.40419, g_loss: 10.30985, cheat_loss: 0.69238, const_loss: 0.03512, l1_loss: 3.28025, fm_loss: 0.01101, vgg_loss: 0.62911
Epoch: [16], [ 0/ 61] time: 5072, d_loss: 1.38779, g_loss: 11.25133, cheat_loss: 0.68994, const_loss: 0.04163, l1_loss: 3.75905, fm_loss: 0.01578, vgg_loss: 0.67449
Note: g_loss jumps to 12.6, then slowly drops to 10.6.
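Selecting the subset for this run could look like the sketch below. The helper name, the `stroke_counts` lookup table, and the `extra` hand-picked list are all hypothetical — the original script's actual selection logic is not shown here:

```python
def select_hard_chars(charset, stroke_counts, min_strokes=17, extra=()):
    """Return characters with >= min_strokes strokes, plus hand-picked extras
    (e.g. characters with special structural rules). Order-preserving."""
    hard = [ch for ch in charset if stroke_counts.get(ch, 0) >= min_strokes]
    # Merge the manually chosen extras, dropping duplicates but keeping order.
    return list(dict.fromkeys(hard + list(extra)))
```

The resulting character list would then drive a smaller dataset (61 batches per epoch in the log above, versus 294 for the full set).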
resume, high stroke count:
Epoch: [ 0], [ 0/ 61] time: 7, d_loss: 1.38693, g_loss: 11.98865, cheat_loss: 0.69629, const_loss: 0.04298, l1_loss: 4.26667, fm_loss: 0.01550, vgg_loss: 0.69672
Epoch: [ 1], [ 0/ 61] time: 309, d_loss: 1.38841, g_loss: 10.88043, cheat_loss: 0.69189, const_loss: 0.01434, l1_loss: 3.55824, fm_loss: 0.01272, vgg_loss: 0.66032
Epoch: [ 2], [ 0/ 61] time: 616, d_loss: 1.39253, g_loss: 11.20051, cheat_loss: 0.69336, const_loss: 0.01365, l1_loss: 3.85531, fm_loss: 0.01259, vgg_loss: 0.66256
Epoch: [ 3], [ 0/ 61] time: 922, d_loss: 1.38790, g_loss: 12.15136, cheat_loss: 0.69238, const_loss: 0.01620, l1_loss: 4.26966, fm_loss: 0.01550, vgg_loss: 0.71576
Epoch: [ 4], [ 0/ 61] time: 1230, d_loss: 1.38735, g_loss: 11.25488, cheat_loss: 0.69189, const_loss: 0.01763, l1_loss: 3.82930, fm_loss: 0.01188, vgg_loss: 0.67042
Epoch: [ 5], [ 0/ 61] time: 1536, d_loss: 1.39028, g_loss: 10.21915, cheat_loss: 0.69141, const_loss: 0.03889, l1_loss: 3.34199, fm_loss: 0.01206, vgg_loss: 0.61348
Epoch: [ 6], [ 0/ 61] time: 1842, d_loss: 1.38694, g_loss: 11.47831, cheat_loss: 0.69531, const_loss: 0.02315, l1_loss: 4.03497, fm_loss: 0.01203, vgg_loss: 0.67129
Epoch: [ 7], [ 0/ 61] time: 2149, d_loss: 1.38727, g_loss: 10.47713, cheat_loss: 0.69141, const_loss: 0.02468, l1_loss: 3.37970, fm_loss: 0.01045, vgg_loss: 0.63709
Epoch: [ 8], [ 0/ 61] time: 2455, d_loss: 1.38812, g_loss: 9.43009, cheat_loss: 0.69287, const_loss: 0.00957, l1_loss: 2.99526, fm_loss: 0.00992, vgg_loss: 0.57225
Epoch: [ 9], [ 0/ 61] time: 2761, d_loss: 1.38762, g_loss: 10.33156, cheat_loss: 0.69287, const_loss: 0.01283, l1_loss: 3.39780, fm_loss: 0.01015, vgg_loss: 0.62179
Epoch: [10], [ 0/ 61] time: 3068, d_loss: 1.39063, g_loss: 11.17368, cheat_loss: 0.69336, const_loss: 0.02040, l1_loss: 3.80127, fm_loss: 0.01152, vgg_loss: 0.66471
Epoch: [11], [ 0/ 61] time: 3374, d_loss: 1.38706, g_loss: 10.54826, cheat_loss: 0.69336, const_loss: 0.01537, l1_loss: 3.47519, fm_loss: 0.00998, vgg_loss: 0.63544
Epoch: [12], [ 0/ 61] time: 3680, d_loss: 1.38863, g_loss: 9.64512, cheat_loss: 0.69287, const_loss: 0.01122, l1_loss: 3.01493, fm_loss: 0.00920, vgg_loss: 0.59169
Epoch: [13], [ 0/ 61] time: 3985, d_loss: 1.38942, g_loss: 9.44887, cheat_loss: 0.69336, const_loss: 0.00816, l1_loss: 3.08098, fm_loss: 0.00939, vgg_loss: 0.56570
Epoch: [14], [ 0/ 61] time: 4294, d_loss: 1.39236, g_loss: 10.59517, cheat_loss: 0.69092, const_loss: 0.01547, l1_loss: 3.41478, fm_loss: 0.01087, vgg_loss: 0.64631
Epoch: [15], [ 0/ 61] time: 4599, d_loss: 1.38711, g_loss: 9.66012, cheat_loss: 0.69092, const_loss: 0.01530, l1_loss: 3.02108, fm_loss: 0.01025, vgg_loss: 0.59226
Epoch: [16], [ 0/ 61] time: 4905, d_loss: 1.40042, g_loss: 10.26786, cheat_loss: 0.69385, const_loss: 0.01680, l1_loss: 3.31985, fm_loss: 0.01286, vgg_loss: 0.62245
Epoch: [17], [ 0/ 61] time: 5218, d_loss: 1.38760, g_loss: 10.34432, cheat_loss: 0.69336, const_loss: 0.02078, l1_loss: 3.54651, fm_loss: 0.01340, vgg_loss: 0.60703
Epoch: [18], [ 0/ 61] time: 5524, d_loss: 1.38830, g_loss: 9.66052, cheat_loss: 0.69434, const_loss: 0.04560, l1_loss: 2.99914, fm_loss: 0.01016, vgg_loss: 0.59113
Epoch: [19], [ 0/ 61] time: 5830, d_loss: 1.38847, g_loss: 10.11744, cheat_loss: 0.69580, const_loss: 0.02390, l1_loss: 3.16776, fm_loss: 0.01115, vgg_loss: 0.62188
Epoch: [20], [ 0/ 61] time: 6139, d_loss: 1.39405, g_loss: 9.99380, cheat_loss: 0.69385, const_loss: 0.02761, l1_loss: 3.17475, fm_loss: 0.01066, vgg_loss: 0.60869
Epoch: [21], [ 0/ 61] time: 6445, d_loss: 1.38740, g_loss: 10.00713, cheat_loss: 0.69531, const_loss: 0.01706, l1_loss: 3.21530, fm_loss: 0.01142, vgg_loss: 0.60680
Epoch: [22], [ 0/ 61] time: 6750, d_loss: 1.39201, g_loss: 10.73232, cheat_loss: 0.69189, const_loss: 0.02320, l1_loss: 3.51164, fm_loss: 0.01318, vgg_loss: 0.64924
Epoch: [23], [ 0/ 61] time: 7062, d_loss: 1.38908, g_loss: 9.14977, cheat_loss: 0.69434, const_loss: 0.02336, l1_loss: 2.79269, fm_loss: 0.00821, vgg_loss: 0.56312
Epoch: [24], [ 0/ 61] time: 7368, d_loss: 1.38696, g_loss: 8.84442, cheat_loss: 0.69336, const_loss: 0.01111, l1_loss: 2.65544, fm_loss: 0.00957, vgg_loss: 0.54749
Epoch: [25], [ 0/ 61] time: 7673, d_loss: 1.39187, g_loss: 10.29429, cheat_loss: 0.69287, const_loss: 0.00925, l1_loss: 3.22521, fm_loss: 0.01049, vgg_loss: 0.63565
Epoch: [26], [ 0/ 61] time: 7979, d_loss: 1.38804, g_loss: 9.94276, cheat_loss: 0.69043, const_loss: 0.02251, l1_loss: 3.11852, fm_loss: 0.01014, vgg_loss: 0.61012
Epoch: [27], [ 0/ 61] time: 8286, d_loss: 1.38724, g_loss: 8.70646, cheat_loss: 0.69336, const_loss: 0.01938, l1_loss: 2.56524, fm_loss: 0.00885, vgg_loss: 0.54196
Epoch: [28], [ 0/ 61] time: 8592, d_loss: 1.38946, g_loss: 9.90956, cheat_loss: 0.69336, const_loss: 0.01765, l1_loss: 3.17868, fm_loss: 0.00984, vgg_loss: 0.60100
Epoch: [29], [ 0/ 61] time: 8897, d_loss: 1.38789, g_loss: 9.22064, cheat_loss: 0.69531, const_loss: 0.01077, l1_loss: 2.85830, fm_loss: 0.00982, vgg_loss: 0.56464
Note: g_loss / l1_loss / vgg_loss all keep decreasing.
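The runs above are stopped and resumed repeatedly. A minimal resume helper, assuming a standard PyTorch setup with separate G/D models and Adam optimizers (all names here are hypothetical), might look like:

```python
import torch

def save_ckpt(path, epoch, G, D, opt_G, opt_D):
    """Save everything needed to resume: models, optimizers, epoch counter."""
    torch.save({"epoch": epoch,
                "G": G.state_dict(), "D": D.state_dict(),
                "opt_G": opt_G.state_dict(), "opt_D": opt_D.state_dict()}, path)

def load_ckpt(path, G, D, opt_G, opt_D):
    """Restore models and optimizer state; return the epoch to continue from."""
    ckpt = torch.load(path, map_location="cpu")
    G.load_state_dict(ckpt["G"])
    D.load_state_dict(ckpt["D"])
    opt_G.load_state_dict(ckpt["opt_G"])
    opt_D.load_state_dict(ckpt["opt_D"])
    return ckpt["epoch"] + 1
```

Restoring the optimizer state as well as the weights keeps Adam's moment estimates, so the loss curve continues smoothly instead of re-warming up after each resume.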
resume:
Epoch: [ 0], [ 100/ 336] time: 459, d_loss: 1.38722, g_loss: 9.52488, cheat_loss: 0.69434, const_loss: 0.01413, l1_loss: 3.26481, fm_loss: 0.00956, vgg_loss: 0.55420
Epoch: [ 0], [ 200/ 336] time: 916, d_loss: 1.39037, g_loss: 9.15060, cheat_loss: 0.69189, const_loss: 0.03216, l1_loss: 3.16173, fm_loss: 0.00972, vgg_loss: 0.52551
Epoch: [ 0], [ 300/ 336] time: 1376, d_loss: 1.38696, g_loss: 9.66540, cheat_loss: 0.69434, const_loss: 0.01577, l1_loss: 3.37383, fm_loss: 0.01002, vgg_loss: 0.55714
Epoch: [ 1], [ 0/ 336] time: 1539, d_loss: 1.38683, g_loss: 8.00189, cheat_loss: 0.69287, const_loss: 0.01422, l1_loss: 2.57903, fm_loss: 0.00916, vgg_loss: 0.47066
Epoch: [ 1], [ 100/ 336] time: 2002, d_loss: 1.38748, g_loss: 7.98345, cheat_loss: 0.69238, const_loss: 0.01920, l1_loss: 2.59782, fm_loss: 0.00928, vgg_loss: 0.46648
Epoch: [ 1], [ 200/ 336] time: 2458, d_loss: 1.38702, g_loss: 8.14308, cheat_loss: 0.69385, const_loss: 0.01781, l1_loss: 2.63234, fm_loss: 0.00901, vgg_loss: 0.47901
Epoch: [ 1], [ 300/ 336] time: 2920, d_loss: 1.38720, g_loss: 8.86034, cheat_loss: 0.69434, const_loss: 0.02190, l1_loss: 3.14172, fm_loss: 0.01022, vgg_loss: 0.49922
Epoch: [ 2], [ 0/ 336] time: 3084, d_loss: 1.39078, g_loss: 8.52869, cheat_loss: 0.69141, const_loss: 0.01660, l1_loss: 2.88205, fm_loss: 0.01040, vgg_loss: 0.49282
Epoch: [ 2], [ 100/ 336] time: 3540, d_loss: 1.38807, g_loss: 8.83213, cheat_loss: 0.69287, const_loss: 0.02941, l1_loss: 2.92657, fm_loss: 0.01012, vgg_loss: 0.51732
Epoch: [ 2], [ 200/ 336] time: 3998, d_loss: 1.38720, g_loss: 8.89849, cheat_loss: 0.69238, const_loss: 0.01585, l1_loss: 2.98620, fm_loss: 0.01032, vgg_loss: 0.51937
Epoch: [ 2], [ 300/ 336] time: 4455, d_loss: 1.38711, g_loss: 8.81964, cheat_loss: 0.69336, const_loss: 0.01772, l1_loss: 2.93479, fm_loss: 0.01050, vgg_loss: 0.51633
Epoch: [ 3], [ 0/ 336] time: 4619, d_loss: 1.38709, g_loss: 8.73795, cheat_loss: 0.69482, const_loss: 0.01306, l1_loss: 2.82796, fm_loss: 0.01113, vgg_loss: 0.51910
Epoch: [ 3], [ 100/ 336] time: 5076, d_loss: 1.38686, g_loss: 8.89925, cheat_loss: 0.69238, const_loss: 0.02025, l1_loss: 3.07577, fm_loss: 0.00992, vgg_loss: 0.51009
Epoch: [ 3], [ 200/ 336] time: 5537, d_loss: 1.38675, g_loss: 9.32194, cheat_loss: 0.69336, const_loss: 0.01133, l1_loss: 3.15665, fm_loss: 0.00945, vgg_loss: 0.54511
Epoch: [ 3], [ 300/ 336] time: 5990, d_loss: 1.38704, g_loss: 8.92947, cheat_loss: 0.69287, const_loss: 0.01527, l1_loss: 3.05227, fm_loss: 0.01053, vgg_loss: 0.51585
Epoch: [ 4], [ 0/ 336] time: 6153, d_loss: 1.38692, g_loss: 8.83502, cheat_loss: 0.69238, const_loss: 0.01051, l1_loss: 2.92395, fm_loss: 0.00968, vgg_loss: 0.51985
Test training at 384×384, thin style, console log:
Epoch: [ 0], [ 0/ 214] time: 22.11, d_loss: 1.50260, g_loss: 17.42937, cheat_loss: 0.68649, const_loss: 0.08664, l1_loss: 16.65610
Epoch: [ 0], [ 100/ 214] time: 667.14, d_loss: 1.44196, g_loss: 18.55598, cheat_loss: 0.68878, const_loss: 0.10555, l1_loss: 17.76165
Epoch: [ 0], [ 200/ 214] time: 1329.97, d_loss: 1.43561, g_loss: 17.26299, cheat_loss: 0.66817, const_loss: 0.10516, l1_loss: 16.48965
Epoch: [ 1], [ 0/ 214] time: 1420.70, d_loss: 1.43020, g_loss: 17.87161, cheat_loss: 0.67667, const_loss: 0.11139, l1_loss: 17.08354
Epoch: [ 1], [ 100/ 214] time: 2084.62, d_loss: 1.44064, g_loss: 16.45610, cheat_loss: 0.68905, const_loss: 0.10249, l1_loss: 15.66456
Epoch: [ 1], [ 200/ 214] time: 2746.87, d_loss: 1.42002, g_loss: 15.97286, cheat_loss: 0.68978, const_loss: 0.07785, l1_loss: 15.20523
Epoch: [ 2], [ 0/ 214] time: 2837.45, d_loss: 1.41011, g_loss: 16.61582, cheat_loss: 0.68616, const_loss: 0.08334, l1_loss: 15.84632
Epoch: [ 2], [ 100/ 214] time: 3500.71, d_loss: 1.39727, g_loss: 15.10842, cheat_loss: 0.68482, const_loss: 0.08003, l1_loss: 14.34357
Epoch: [ 2], [ 200/ 214] time: 4163.36, d_loss: 1.41601, g_loss: 16.45517, cheat_loss: 0.69226, const_loss: 0.08608, l1_loss: 15.67682
Epoch: [ 3], [ 0/ 214] time: 4253.76, d_loss: 1.40171, g_loss: 16.39460, cheat_loss: 0.68721, const_loss: 0.08257, l1_loss: 15.62481
Epoch: [ 3], [ 100/ 214] time: 4918.49, d_loss: 1.40068, g_loss: 16.01672, cheat_loss: 0.68323, const_loss: 0.07980, l1_loss: 15.25369
Epoch: [ 3], [ 200/ 214] time: 5580.16, d_loss: 1.40033, g_loss: 14.98626, cheat_loss: 0.69058, const_loss: 0.06640, l1_loss: 14.22928
Epoch: [ 4], [ 0/ 214] time: 5670.69, d_loss: 1.55030, g_loss: 13.95393, cheat_loss: 0.68146, const_loss: 0.09027, l1_loss: 13.18219
Epoch: [ 4], [ 100/ 214] time: 6334.08, d_loss: 1.49206, g_loss: 14.31353, cheat_loss: 0.68241, const_loss: 0.08358, l1_loss: 13.54754
Epoch: [ 4], [ 200/ 214] time: 6994.40, d_loss: 1.39501, g_loss: 14.21539, cheat_loss: 0.68888, const_loss: 0.08712, l1_loss: 13.43939
Epoch: [ 5], [ 0/ 214] time: 7082.03, d_loss: 1.39529, g_loss: 13.53912, cheat_loss: 0.68865, const_loss: 0.08560, l1_loss: 12.76487
Epoch: [ 5], [ 100/ 214] time: 7717.36, d_loss: 1.39902, g_loss: 13.47905, cheat_loss: 0.68888, const_loss: 0.09366, l1_loss: 12.69651
Epoch: [ 5], [ 200/ 214] time: 8351.21, d_loss: 1.41555, g_loss: 14.92830, cheat_loss: 0.68953, const_loss: 0.06613, l1_loss: 14.17264
Epoch: [ 6], [ 0/ 214] time: 8437.95, d_loss: 1.40271, g_loss: 14.37121, cheat_loss: 0.68537, const_loss: 0.05804, l1_loss: 13.62780
Epoch: [ 6], [ 100/ 214] time: 9072.80, d_loss: 1.39686, g_loss: 12.22557, cheat_loss: 0.68799, const_loss: 0.05301, l1_loss: 11.48457
Epoch: [ 6], [ 200/ 214] time: 9705.83, d_loss: 1.52239, g_loss: 13.53644, cheat_loss: 0.68777, const_loss: 0.06376, l1_loss: 12.78491
Epoch: [ 7], [ 0/ 214] time: 9792.34, d_loss: 1.41500, g_loss: 12.17272, cheat_loss: 0.68835, const_loss: 0.07369, l1_loss: 11.41068
Epoch: [ 7], [ 100/ 214] time: 10426.56, d_loss: 1.42671, g_loss: 11.97365, cheat_loss: 0.68732, const_loss: 0.06299, l1_loss: 11.22334
Epoch: [ 7], [ 200/ 214] time: 11060.31, d_loss: 1.42598, g_loss: 12.60396, cheat_loss: 0.68685, const_loss: 0.06667, l1_loss: 11.85044
Epoch: [ 8], [ 0/ 214] time: 11146.98, d_loss: 1.45362, g_loss: 12.52666, cheat_loss: 0.68652, const_loss: 0.05286, l1_loss: 11.78727
Epoch: [ 8], [ 100/ 214] time: 11782.57, d_loss: 1.40706, g_loss: 13.04044, cheat_loss: 0.68601, const_loss: 0.06922, l1_loss: 12.28521
Epoch: [ 8], [ 200/ 214] time: 12416.65, d_loss: 1.41468, g_loss: 11.58127, cheat_loss: 0.68721, const_loss: 0.05492, l1_loss: 10.83913
Epoch: [ 9], [ 0/ 214] time: 12503.29, d_loss: 1.39647, g_loss: 10.72044, cheat_loss: 0.69115, const_loss: 0.07348, l1_loss: 9.95581
Epoch: [ 9], [ 100/ 214] time: 13138.94, d_loss: 1.39666, g_loss: 11.57403, cheat_loss: 0.68638, const_loss: 0.03580, l1_loss: 10.85185
Epoch: [ 9], [ 200/ 214] time: 13772.44, d_loss: 1.39635, g_loss: 11.72092, cheat_loss: 0.69004, const_loss: 0.05689, l1_loss: 10.97399
Epoch: [10], [ 0/ 214] time: 13859.20, d_loss: 1.41020, g_loss: 11.45188, cheat_loss: 0.68992, const_loss: 0.05096, l1_loss: 10.71100
Epoch: [10], [ 100/ 214] time: 14493.83, d_loss: 1.39288, g_loss: 10.19286, cheat_loss: 0.69165, const_loss: 0.04151, l1_loss: 9.45970
Epoch: [10], [ 200/ 214] time: 15127.48, d_loss: 1.39106, g_loss: 10.00354, cheat_loss: 0.68668, const_loss: 0.05315, l1_loss: 9.26370
Note: g_loss drops from 17 to 10.
resume:
Epoch: [ 1], [ 0/ 202] time: 1102, d_loss: 1.40696, g_loss: 7.32515, cheat_loss: 0.68115, const_loss: 0.03185, l1_loss: 6.52524, fm_loss: 0.08690
Epoch: [ 1], [ 100/ 202] time: 1647, d_loss: 1.39216, g_loss: 7.87336, cheat_loss: 0.67822, const_loss: 0.03120, l1_loss: 7.07913, fm_loss: 0.08481
Epoch: [ 1], [ 200/ 202] time: 2193, d_loss: 1.39392, g_loss: 8.32639, cheat_loss: 0.68359, const_loss: 0.04773, l1_loss: 7.51891, fm_loss: 0.07614
Epoch: [ 2], [ 0/ 202] time: 2201, d_loss: 1.39379, g_loss: 8.03761, cheat_loss: 0.67969, const_loss: 0.04011, l1_loss: 7.24280, fm_loss: 0.07501
Epoch: [ 2], [ 100/ 202] time: 2745, d_loss: 1.41959, g_loss: 7.69352, cheat_loss: 0.67773, const_loss: 0.03686, l1_loss: 6.90624, fm_loss: 0.07269
Epoch: [ 2], [ 200/ 202] time: 3291, d_loss: 1.39333, g_loss: 8.45683, cheat_loss: 0.68115, const_loss: 0.04265, l1_loss: 7.65715, fm_loss: 0.07588
Epoch: [ 3], [ 0/ 202] time: 3299, d_loss: 1.39134, g_loss: 7.90670, cheat_loss: 0.69189, const_loss: 0.04193, l1_loss: 7.09598, fm_loss: 0.07690
Epoch: [ 3], [ 100/ 202] time: 3843, d_loss: 1.39969, g_loss: 7.25844, cheat_loss: 0.68018, const_loss: 0.02305, l1_loss: 6.48604, fm_loss: 0.06916
Epoch: [ 3], [ 200/ 202] time: 4390, d_loss: 1.46355, g_loss: 8.09026, cheat_loss: 0.69092, const_loss: 0.02943, l1_loss: 7.29903, fm_loss: 0.07089
Epoch: [ 4], [ 0/ 202] time: 4398, d_loss: 1.40241, g_loss: 7.57375, cheat_loss: 0.68262, const_loss: 0.03320, l1_loss: 6.79303, fm_loss: 0.06490
Epoch: [ 4], [ 100/ 202] time: 4942, d_loss: 1.39036, g_loss: 6.95541, cheat_loss: 0.68604, const_loss: 0.03715, l1_loss: 6.17361, fm_loss: 0.05861
Epoch: [ 4], [ 200/ 202] time: 5487, d_loss: 1.39375, g_loss: 7.40470, cheat_loss: 0.68799, const_loss: 0.02625, l1_loss: 6.63530, fm_loss: 0.05516
Epoch: [ 5], [ 0/ 202] time: 5495, d_loss: 1.40612, g_loss: 7.47566, cheat_loss: 0.68359, const_loss: 0.03151, l1_loss: 6.69912, fm_loss: 0.06144
Epoch: [ 5], [ 100/ 202] time: 6039, d_loss: 1.42376, g_loss: 7.13291, cheat_loss: 0.68701, const_loss: 0.02462, l1_loss: 6.36342, fm_loss: 0.05785
Epoch: [ 5], [ 200/ 202] time: 6585, d_loss: 1.41352, g_loss: 6.76752, cheat_loss: 0.68604, const_loss: 0.03009, l1_loss: 5.99911, fm_loss: 0.05228
Epoch: [ 6], [ 0/ 202] time: 6593, d_loss: 1.41918, g_loss: 7.67140, cheat_loss: 0.69580, const_loss: 0.02419, l1_loss: 6.89219, fm_loss: 0.05923
Epoch: [ 6], [ 100/ 202] time: 7136, d_loss: 1.39142, g_loss: 6.40480, cheat_loss: 0.68896, const_loss: 0.02490, l1_loss: 5.64295, fm_loss: 0.04799
Epoch: [ 6], [ 200/ 202] time: 7683, d_loss: 1.38880, g_loss: 6.48734, cheat_loss: 0.68799, const_loss: 0.02719, l1_loss: 5.72540, fm_loss: 0.04676
Epoch: [ 7], [ 0/ 202] time: 7691, d_loss: 1.38840, g_loss: 6.31403, cheat_loss: 0.68604, const_loss: 0.02463, l1_loss: 5.55764, fm_loss: 0.04571
Epoch: [ 7], [ 100/ 202] time: 8235, d_loss: 1.39095, g_loss: 7.05506, cheat_loss: 0.68994, const_loss: 0.02134, l1_loss: 6.29332, fm_loss: 0.05046
Epoch: [ 7], [ 200/ 202] time: 8782, d_loss: 1.38811, g_loss: 6.66191, cheat_loss: 0.68408, const_loss: 0.01809, l1_loss: 5.91415, fm_loss: 0.04559
Epoch: [ 8], [ 0/ 202] time: 8790, d_loss: 1.38775, g_loss: 7.36030, cheat_loss: 0.68506, const_loss: 0.02068, l1_loss: 6.60692, fm_loss: 0.04764
Epoch: [ 8], [ 100/ 202] time: 9333, d_loss: 1.38805, g_loss: 6.02525, cheat_loss: 0.68750, const_loss: 0.02073, l1_loss: 5.27679, fm_loss: 0.04022
Epoch: [ 8], [ 200/ 202] time: 9878, d_loss: 1.38772, g_loss: 6.22977, cheat_loss: 0.69092, const_loss: 0.02277, l1_loss: 5.47373, fm_loss: 0.04235
Epoch: [ 9], [ 0/ 202] time: 9886, d_loss: 1.38782, g_loss: 5.83983, cheat_loss: 0.69141, const_loss: 0.02069, l1_loss: 5.08847, fm_loss: 0.03927
Epoch: [ 9], [ 100/ 202] time: 10429, d_loss: 1.38945, g_loss: 5.69849, cheat_loss: 0.68994, const_loss: 0.01719, l1_loss: 4.95193, fm_loss: 0.03943
Note: g_loss drops from 7.8 to 5.6.
Adjusted the Generator code to replace its BatchNorm2d layers with InstanceNorm2d, and conversely replaced InstanceNorm2d in the Discriminator with BatchNorm2d. Loading the old checkpoint then fails with:
Missing key(s) in state_dict: "model.down.0.bias", "model.up.1.bias", "model.submodule.down.1.bias", "model.submodule.up.1.bias", "model.submodule.submodule.down.1.bias", "model.submodule.submodule.up.1.bias", "model.submodule.submodule.submodule.down.1.bias", "model.submodule.submodule.submodule.up.1.bias", "model.submodule.submodule.submodule.submodule.down.1.bias", "model.submodule.submodule.submodule.submodule.up.1.bias", "model.submodule.submodule.submodule.submodule.submodule.down.1.bias", "model.submodule.submodule.submodule.submodule.submodule.up.1.bias", "model.submodule.submodule.submodule.submodule.submodule.submodule.down.1.bias", "model.submodule.submodule.submodule.submodule.submodule.submodule.up.1.bias", "model.submodule.submodule.submodule.submodule.submodule.submodule.submodule.down.1.bias", "model.submodule.submodule.submodule.submodule.submodule.submodule.submodule.up.1.bias".
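The missing `.bias` keys are expected after the norm swap: in pix2pix-style U-Net blocks the convolutions are typically created with `bias=False` when followed by BatchNorm2d (whose affine shift absorbs the bias), so switching to InstanceNorm2d introduces brand-new conv bias parameters that the old checkpoint cannot supply. A common workaround — an assumption here, not necessarily what the original script does — is a non-strict load that keeps the new parameters at their fresh initialization:

```python
import torch

def load_partial(model, ckpt_path):
    """Load matching weights from an old checkpoint; leave new params untouched."""
    state = torch.load(ckpt_path, map_location="cpu")
    # strict=False returns the keys it could not match instead of raising.
    missing, unexpected = model.load_state_dict(state, strict=False)
    print(f"missing: {len(missing)}, unexpected: {len(unexpected)}")
    return missing, unexpected
```

This matches the log below: training restarts with a briefly elevated g_loss while the fresh biases (and the re-normalized activations) adapt, then recovers within a few epochs.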
Epoch: [ 0], [ 0/ 202] time: 8, d_loss: 1.39550, g_loss: 23.53697, cheat_loss: 0.69092, const_loss: 0.08078, l1_loss: 22.69748, fm_loss: 0.06779
Epoch: [ 0], [ 100/ 202] time: 576, d_loss: 1.40814, g_loss: 10.83954, cheat_loss: 0.68018, const_loss: 0.06541, l1_loss: 10.04633, fm_loss: 0.04762
Epoch: [ 0], [ 200/ 202] time: 1159, d_loss: 1.38839, g_loss: 10.72736, cheat_loss: 0.67822, const_loss: 0.05727, l1_loss: 9.93954, fm_loss: 0.05232
Epoch: [ 1], [ 0/ 202] time: 1170, d_loss: 1.38924, g_loss: 10.81795, cheat_loss: 0.67432, const_loss: 0.04104, l1_loss: 10.04874, fm_loss: 0.05384
Epoch: [ 1], [ 100/ 202] time: 1753, d_loss: 1.39422, g_loss: 9.70846, cheat_loss: 0.67627, const_loss: 0.04375, l1_loss: 8.90945, fm_loss: 0.07898
Epoch: [ 1], [ 200/ 202] time: 2337, d_loss: 1.39148, g_loss: 8.51060, cheat_loss: 0.67578, const_loss: 0.04043, l1_loss: 7.73058, fm_loss: 0.06380
Epoch: [ 2], [ 0/ 202] time: 2345, d_loss: 1.41463, g_loss: 10.01004, cheat_loss: 0.68066, const_loss: 0.04474, l1_loss: 9.21046, fm_loss: 0.07417
Epoch: [ 2], [ 100/ 202] time: 2928, d_loss: 1.39061, g_loss: 8.15248, cheat_loss: 0.67969, const_loss: 0.04478, l1_loss: 7.37546, fm_loss: 0.05256
Epoch: [ 2], [ 200/ 202] time: 3513, d_loss: 1.39369, g_loss: 8.95584, cheat_loss: 0.70068, const_loss: 0.03261, l1_loss: 8.16902, fm_loss: 0.05352
Epoch: [ 3], [ 0/ 202] time: 3522, d_loss: 1.39348, g_loss: 8.06863, cheat_loss: 0.67676, const_loss: 0.04734, l1_loss: 7.29511, fm_loss: 0.04943
Epoch: [ 3], [ 100/ 202] time: 4104, d_loss: 1.41632, g_loss: 7.93658, cheat_loss: 0.68555, const_loss: 0.03962, l1_loss: 7.16529, fm_loss: 0.04612
Epoch: [ 3], [ 200/ 202] time: 4686, d_loss: 1.41523, g_loss: 8.16894, cheat_loss: 0.68701, const_loss: 0.06002, l1_loss: 7.37172, fm_loss: 0.05018
Epoch: [ 4], [ 0/ 202] time: 4694, d_loss: 1.40249, g_loss: 8.47142, cheat_loss: 0.69434, const_loss: 0.03275, l1_loss: 7.69220, fm_loss: 0.05212
Epoch: [ 4], [ 100/ 202] time: 5277, d_loss: 1.38821, g_loss: 7.13986, cheat_loss: 0.68848, const_loss: 0.03490, l1_loss: 6.37700, fm_loss: 0.03949
Epoch: [ 4], [ 200/ 202] time: 5860, d_loss: 1.38753, g_loss: 6.85540, cheat_loss: 0.69043, const_loss: 0.02775, l1_loss: 6.09596, fm_loss: 0.04126
Epoch: [ 5], [ 0/ 202] time: 5869, d_loss: 1.38833, g_loss: 7.28732, cheat_loss: 0.68799, const_loss: 0.02977, l1_loss: 6.53080, fm_loss: 0.03876
Epoch: [ 5], [ 100/ 202] time: 6452, d_loss: 1.38838, g_loss: 6.23575, cheat_loss: 0.68506, const_loss: 0.02688, l1_loss: 5.48403, fm_loss: 0.03978
Epoch: [ 5], [ 200/ 202] time: 7039, d_loss: 1.39255, g_loss: 7.67591, cheat_loss: 0.67871, const_loss: 0.03450, l1_loss: 6.92128, fm_loss: 0.04142
Epoch: [ 6], [ 0/ 202] time: 7048, d_loss: 1.38766, g_loss: 7.11000, cheat_loss: 0.68018, const_loss: 0.03457, l1_loss: 6.35422, fm_loss: 0.04103
Epoch: [ 6], [ 100/ 202] time: 7632, d_loss: 1.40654, g_loss: 6.46989, cheat_loss: 0.68213, const_loss: 0.02366, l1_loss: 5.72589, fm_loss: 0.03820
Epoch: [ 6], [ 200/ 202] time: 8216, d_loss: 1.41043, g_loss: 6.24932, cheat_loss: 0.68945, const_loss: 0.03070, l1_loss: 5.49136, fm_loss: 0.03780
Epoch: [ 7], [ 0/ 202] time: 8224, d_loss: 1.40241, g_loss: 7.50038, cheat_loss: 0.68359, const_loss: 0.02728, l1_loss: 6.74720, fm_loss: 0.04231
Epoch: [ 7], [ 100/ 202] time: 8805, d_loss: 1.38866, g_loss: 7.16883, cheat_loss: 0.68945, const_loss: 0.02111, l1_loss: 6.42225, fm_loss: 0.03601
Epoch: [ 7], [ 200/ 202] time: 9387, d_loss: 1.39177, g_loss: 7.45523, cheat_loss: 0.68896, const_loss: 0.03073, l1_loss: 6.69838, fm_loss: 0.03715
Epoch: [ 8], [ 0/ 202] time: 9395, d_loss: 1.39477, g_loss: 6.94823, cheat_loss: 0.68994, const_loss: 0.02445, l1_loss: 6.19871, fm_loss: 0.03513
Epoch: [ 8], [ 100/ 202] time: 9975, d_loss: 1.38811, g_loss: 6.26323, cheat_loss: 0.68945, const_loss: 0.02203, l1_loss: 5.52122, fm_loss: 0.03052
Epoch: [ 8], [ 200/ 202] time: 10565, d_loss: 1.40873, g_loss: 5.86570, cheat_loss: 0.68994, const_loss: 0.02512, l1_loss: 5.12195, fm_loss: 0.02868
Epoch: [ 9], [ 0/ 202] time: 10573, d_loss: 1.39184, g_loss: 5.90532, cheat_loss: 0.69043, const_loss: 0.02387, l1_loss: 5.15872, fm_loss: 0.03230
Epoch: [ 9], [ 100/ 202] time: 11153, d_loss: 1.38808, g_loss: 5.93890, cheat_loss: 0.69189, const_loss: 0.01974, l1_loss: 5.19655, fm_loss: 0.03072
Epoch: [ 9], [ 200/ 202] time: 11737, d_loss: 1.38751, g_loss: 5.54044, cheat_loss: 0.68408, const_loss: 0.01551, l1_loss: 4.81165, fm_loss: 0.02920
Epoch: [10], [ 0/ 202] time: 11746, d_loss: 1.39437, g_loss: 5.68954, cheat_loss: 0.67529, const_loss: 0.01900, l1_loss: 4.96508, fm_loss: 0.03016
Epoch: [10], [ 100/ 202] time: 12326, d_loss: 1.38717, g_loss: 5.76477, cheat_loss: 0.68799, const_loss: 0.02108, l1_loss: 5.02441, fm_loss: 0.03130
Epoch: [10], [ 200/ 202] time: 12913, d_loss: 1.38830, g_loss: 5.85320, cheat_loss: 0.69189, const_loss: 0.01712, l1_loss: 5.11211, fm_loss: 0.03208
Epoch: [11], [ 0/ 202] time: 12921, d_loss: 1.39378, g_loss: 5.71432, cheat_loss: 0.68750, const_loss: 0.01279, l1_loss: 4.98049, fm_loss: 0.03354
Epoch: [11], [ 100/ 202] time: 13502, d_loss: 1.38763, g_loss: 5.08361, cheat_loss: 0.68750, const_loss: 0.02090, l1_loss: 4.34727, fm_loss: 0.02793
Epoch: [11], [ 200/ 202] time: 14086, d_loss: 1.38813, g_loss: 5.74436, cheat_loss: 0.68066, const_loss: 0.01508, l1_loss: 5.01748, fm_loss: 0.03113
Epoch: [12], [ 0/ 202] time: 14094, d_loss: 1.39321, g_loss: 5.11333, cheat_loss: 0.68750, const_loss: 0.03960, l1_loss: 4.35804, fm_loss: 0.02819
Switched the Generator to Instance Normalization (IN). Because the pretrained checkpoint has no such layer, g_loss spikes at first, but after about 10 epochs of training it settles back to around 5.
Training after adding the VGG16 perceptual loss: g_loss rises from 5 to around 10, and vgg_loss declines only slightly, at a very slow rate:
Epoch: [ 0], [ 100/ 294] time: 505, d_loss: 1.38725, g_loss: 11.70140, cheat_loss: 0.69189, const_loss: 0.01072, l1_loss: 4.21606, fm_loss: 0.02219, vgg_loss: 0.67605
Epoch: [ 0], [ 200/ 294] time: 1004, d_loss: 1.38702, g_loss: 12.01446, cheat_loss: 0.69482, const_loss: 0.02253, l1_loss: 4.50669, fm_loss: 0.02348, vgg_loss: 0.67669
Epoch: [ 1], [ 0/ 294] time: 1474, d_loss: 1.38925, g_loss: 11.36297, cheat_loss: 0.69238, const_loss: 0.01059, l1_loss: 4.07969, fm_loss: 0.02476, vgg_loss: 0.65556
Epoch: [ 1], [ 100/ 294] time: 1972, d_loss: 1.39081, g_loss: 10.79162, cheat_loss: 0.69141, const_loss: 0.00999, l1_loss: 3.96069, fm_loss: 0.02500, vgg_loss: 0.61045
Epoch: [ 1], [ 200/ 294] time: 2472, d_loss: 1.39123, g_loss: 11.28069, cheat_loss: 0.69043, const_loss: 0.01762, l1_loss: 4.18947, fm_loss: 0.02821, vgg_loss: 0.63550
Epoch: [ 2], [ 0/ 294] time: 2940, d_loss: 1.38823, g_loss: 11.70903, cheat_loss: 0.69189, const_loss: 0.01413, l1_loss: 4.23458, fm_loss: 0.02631, vgg_loss: 0.67421
Epoch: [ 2], [ 100/ 294] time: 3440, d_loss: 1.38700, g_loss: 11.50572, cheat_loss: 0.69287, const_loss: 0.00979, l1_loss: 4.20913, fm_loss: 0.02398, vgg_loss: 0.65699
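The vgg_loss column above is, in typical implementations, a distance between VGG16 feature maps of the generated and target glyphs rather than a pixel distance, which is part of why it moves so slowly. A numpy sketch of that distance, assuming the feature maps have already been extracted (running a real pretrained VGG16 is out of scope here):

```python
import numpy as np

def perceptual_distance(feats_fake, feats_real):
    """vgg_loss-style distance: mean absolute difference between feature
    maps from several VGG16 layers (e.g. relu1_2, relu2_2, ...), averaged.
    Inputs are lists of arrays, one per layer; extraction is not shown."""
    layer_dists = [np.abs(f - r).mean() for f, r in zip(feats_fake, feats_real)]
    return float(np.mean(layer_dists))

# Identical features give zero loss; the generator is pushed to match the
# target's texture and structure at the feature level, on top of l1_loss's pixels.
real = [np.random.randn(64, 32, 32), np.random.randn(128, 16, 16)]
fake = [f + 0.1 * np.random.randn(*f.shape) for f in real]
print(perceptual_distance(real, real), perceptual_distance(fake, real))
```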
resume:
Epoch: [ 0], [ 100/ 294] time: 512, d_loss: 1.38705, g_loss: 14.01570, cheat_loss: 0.69043, const_loss: 0.01765, l1_loss: 5.60690, fm_loss: 0.03049, vgg_loss: 0.76702
Epoch: [ 0], [ 200/ 294] time: 1019, d_loss: 1.39169, g_loss: 11.25946, cheat_loss: 0.69482, const_loss: 0.01922, l1_loss: 4.17656, fm_loss: 0.02397, vgg_loss: 0.63449
Epoch: [ 1], [ 0/ 294] time: 1495, d_loss: 1.38775, g_loss: 11.13579, cheat_loss: 0.69336, const_loss: 0.01243, l1_loss: 4.03383, fm_loss: 0.02446, vgg_loss: 0.63717
Epoch: [ 1], [ 100/ 294] time: 2002, d_loss: 1.38732, g_loss: 10.47839, cheat_loss: 0.69189, const_loss: 0.02053, l1_loss: 3.87001, fm_loss: 0.02651, vgg_loss: 0.58695
Epoch: [ 1], [ 200/ 294] time: 2510, d_loss: 1.39090, g_loss: 10.88486, cheat_loss: 0.69238, const_loss: 0.00980, l1_loss: 4.03458, fm_loss: 0.02956, vgg_loss: 0.61185
Epoch: [ 2], [ 0/ 294] time: 2986, d_loss: 1.38716, g_loss: 10.95058, cheat_loss: 0.69385, const_loss: 0.01141, l1_loss: 3.90378, fm_loss: 0.02637, vgg_loss: 0.63152
Epoch: [ 2], [ 100/ 294] time: 3498, d_loss: 1.38902, g_loss: 11.48704, cheat_loss: 0.69336, const_loss: 0.00943, l1_loss: 4.22502, fm_loss: 0.02559, vgg_loss: 0.65336
Epoch: [ 2], [ 200/ 294] time: 4005, d_loss: 1.38957, g_loss: 10.38879, cheat_loss: 0.69336, const_loss: 0.01899, l1_loss: 3.97298, fm_loss: 0.02338, vgg_loss: 0.56801
Epoch: [ 3], [ 0/ 294] time: 4482, d_loss: 1.38680, g_loss: 11.23085, cheat_loss: 0.69287, const_loss: 0.02099, l1_loss: 4.10953, fm_loss: 0.02479, vgg_loss: 0.63827
Epoch: [ 3], [ 100/ 294] time: 4988, d_loss: 1.38708, g_loss: 11.21273, cheat_loss: 0.69287, const_loss: 0.04690, l1_loss: 4.23967, fm_loss: 0.02459, vgg_loss: 0.62087
Epoch: [ 3], [ 200/ 294] time: 5496, d_loss: 1.38696, g_loss: 11.07762, cheat_loss: 0.69434, const_loss: 0.01154, l1_loss: 4.12490, fm_loss: 0.02558, vgg_loss: 0.62213
Epoch: [ 4], [ 0/ 294] time: 5971, d_loss: 1.38705, g_loss: 11.87828, cheat_loss: 0.69385, const_loss: 0.25309, l1_loss: 4.31549, fm_loss: 0.02250, vgg_loss: 0.65934
Epoch: [ 4], [ 100/ 294] time: 6481, d_loss: 1.38692, g_loss: 11.62946, cheat_loss: 0.69287, const_loss: 0.01637, l1_loss: 4.36285, fm_loss: 0.02374, vgg_loss: 0.65336
Separate training targeting the hard-to-learn special rules and characters with 17 or more strokes:
Epoch: [ 1], [ 0/ 61] time: 325, d_loss: 1.40394, g_loss: 13.80208, cheat_loss: 0.69434, const_loss: 0.01150, l1_loss: 5.12858, fm_loss: 0.02980, vgg_loss: 0.79379
Epoch: [ 2], [ 0/ 61] time: 644, d_loss: 1.40778, g_loss: 12.80063, cheat_loss: 0.69238, const_loss: 0.01515, l1_loss: 4.73252, fm_loss: 0.02941, vgg_loss: 0.73312
Epoch: [ 3], [ 0/ 61] time: 964, d_loss: 1.40497, g_loss: 13.02791, cheat_loss: 0.69287, const_loss: 0.01464, l1_loss: 4.73810, fm_loss: 0.02837, vgg_loss: 0.75539
Epoch: [ 4], [ 0/ 61] time: 1288, d_loss: 1.38694, g_loss: 13.19347, cheat_loss: 0.69141, const_loss: 0.01310, l1_loss: 4.86164, fm_loss: 0.02980, vgg_loss: 0.75975
Epoch: [ 5], [ 0/ 61] time: 1607, d_loss: 1.38674, g_loss: 11.88909, cheat_loss: 0.69531, const_loss: 0.01993, l1_loss: 4.19824, fm_loss: 0.02735, vgg_loss: 0.69483
Epoch: [ 6], [ 0/ 61] time: 1926, d_loss: 1.38746, g_loss: 13.51695, cheat_loss: 0.69434, const_loss: 0.03637, l1_loss: 5.12495, fm_loss: 0.02747, vgg_loss: 0.76338
Epoch: [ 7], [ 0/ 61] time: 2246, d_loss: 1.38724, g_loss: 12.53305, cheat_loss: 0.69287, const_loss: 0.01712, l1_loss: 4.45105, fm_loss: 0.02594, vgg_loss: 0.73461
Epoch: [ 8], [ 0/ 61] time: 2565, d_loss: 1.39391, g_loss: 10.93140, cheat_loss: 0.69287, const_loss: 0.01211, l1_loss: 3.78881, fm_loss: 0.02402, vgg_loss: 0.64136
Epoch: [ 9], [ 0/ 61] time: 2884, d_loss: 1.38680, g_loss: 12.18471, cheat_loss: 0.69287, const_loss: 0.01085, l1_loss: 4.39082, fm_loss: 0.03069, vgg_loss: 0.70595
Epoch: [10], [ 0/ 61] time: 3206, d_loss: 1.42593, g_loss: 12.12920, cheat_loss: 0.69434, const_loss: 0.01224, l1_loss: 4.36544, fm_loss: 0.02604, vgg_loss: 0.70311
Epoch: [11], [ 0/ 61] time: 3525, d_loss: 1.38936, g_loss: 11.03166, cheat_loss: 0.69238, const_loss: 0.01270, l1_loss: 3.73623, fm_loss: 0.02197, vgg_loss: 0.65684
Epoch: [12], [ 0/ 61] time: 3844, d_loss: 1.38781, g_loss: 11.00123, cheat_loss: 0.69385, const_loss: 0.00925, l1_loss: 3.75542, fm_loss: 0.02183, vgg_loss: 0.65209
Epoch: [13], [ 0/ 61] time: 4163, d_loss: 1.39182, g_loss: 11.17430, cheat_loss: 0.69141, const_loss: 0.01393, l1_loss: 4.03063, fm_loss: 0.02084, vgg_loss: 0.64175
Epoch: [14], [ 0/ 61] time: 4484, d_loss: 1.39705, g_loss: 12.00578, cheat_loss: 0.69385, const_loss: 0.01364, l1_loss: 4.18747, fm_loss: 0.02392, vgg_loss: 0.70869
Epoch: [15], [ 0/ 61] time: 4803, d_loss: 1.38705, g_loss: 10.87176, cheat_loss: 0.69287, const_loss: 0.03346, l1_loss: 3.67301, fm_loss: 0.02333, vgg_loss: 0.64491
Note: g_loss shot up to 13.8, then dropped back to 11.0.
Resume, high-stroke-count characters:
Epoch: [ 1], [ 0/ 69] time: 313, d_loss: 1.38680, g_loss: 10.76942, cheat_loss: 0.69336, const_loss: 0.00931, l1_loss: 3.55881, fm_loss: 0.02119, vgg_loss: 0.64868
Epoch: [ 2], [ 0/ 69] time: 623, d_loss: 1.38721, g_loss: 10.70565, cheat_loss: 0.69238, const_loss: 0.00691, l1_loss: 3.70029, fm_loss: 0.02358, vgg_loss: 0.62825
Epoch: [ 3], [ 0/ 69] time: 933, d_loss: 1.38793, g_loss: 10.75125, cheat_loss: 0.69238, const_loss: 0.01123, l1_loss: 3.62881, fm_loss: 0.02165, vgg_loss: 0.63972
Epoch: [ 4], [ 0/ 69] time: 1243, d_loss: 1.38721, g_loss: 10.63284, cheat_loss: 0.69287, const_loss: 0.00857, l1_loss: 3.50333, fm_loss: 0.02081, vgg_loss: 0.64073
Epoch: [ 5], [ 0/ 69] time: 1553, d_loss: 1.38871, g_loss: 10.21337, cheat_loss: 0.69336, const_loss: 0.00620, l1_loss: 3.39710, fm_loss: 0.02021, vgg_loss: 0.60965
Epoch: [ 6], [ 0/ 69] time: 1864, d_loss: 1.38778, g_loss: 11.16366, cheat_loss: 0.69287, const_loss: 0.01016, l1_loss: 3.77312, fm_loss: 0.02065, vgg_loss: 0.66669
Epoch: [ 7], [ 0/ 69] time: 2175, d_loss: 1.38795, g_loss: 10.51671, cheat_loss: 0.69141, const_loss: 0.00643, l1_loss: 3.46276, fm_loss: 0.01767, vgg_loss: 0.63384
Epoch: [ 8], [ 0/ 69] time: 2485, d_loss: 1.38753, g_loss: 9.50303, cheat_loss: 0.69385, const_loss: 0.08992, l1_loss: 3.07433, fm_loss: 0.01801, vgg_loss: 0.56269
Epoch: [ 9], [ 0/ 69] time: 2798, d_loss: 1.38777, g_loss: 10.67541, cheat_loss: 0.69238, const_loss: 0.02568, l1_loss: 3.65465, fm_loss: 0.02204, vgg_loss: 0.62807
Epoch: [10], [ 0/ 69] time: 3109, d_loss: 1.38674, g_loss: 10.91636, cheat_loss: 0.69531, const_loss: 0.01064, l1_loss: 3.76905, fm_loss: 0.02227, vgg_loss: 0.64191
Epoch: [11], [ 0/ 69] time: 3420, d_loss: 1.38684, g_loss: 10.80934, cheat_loss: 0.69238, const_loss: 0.00502, l1_loss: 3.62814, fm_loss: 0.02243, vgg_loss: 0.64614
Epoch: [12], [ 0/ 69] time: 3733, d_loss: 1.38689, g_loss: 10.13876, cheat_loss: 0.69385, const_loss: 0.00848, l1_loss: 3.36373, fm_loss: 0.02060, vgg_loss: 0.60521
Epoch: [13], [ 0/ 69] time: 4044, d_loss: 1.38689, g_loss: 9.56927, cheat_loss: 0.69238, const_loss: 0.02105, l1_loss: 3.22456, fm_loss: 0.01786, vgg_loss: 0.56134
Epoch: [14], [ 0/ 69] time: 4356, d_loss: 1.38719, g_loss: 10.90050, cheat_loss: 0.69385, const_loss: 0.00788, l1_loss: 3.63554, fm_loss: 0.01924, vgg_loss: 0.65440
Epoch: [15], [ 0/ 69] time: 4668, d_loss: 1.38677, g_loss: 10.02535, cheat_loss: 0.69287, const_loss: 0.03650, l1_loss: 3.27949, fm_loss: 0.01976, vgg_loss: 0.59967
Epoch: [16], [ 0/ 69] time: 4979, d_loss: 1.38687, g_loss: 10.28905, cheat_loss: 0.69238, const_loss: 0.00919, l1_loss: 3.42073, fm_loss: 0.01961, vgg_loss: 0.61471
Note: the loss seems to be getting hard to push down from here; even after long training it only drops a little.
Resume, training again on normal stroke-count characters:
Epoch: [ 0], [ 100/ 317] time: 452, d_loss: 1.38675, g_loss: 12.26475, cheat_loss: 0.69287, const_loss: 0.01116, l1_loss: 4.76501, fm_loss: 0.02201, vgg_loss: 0.67737
Epoch: [ 0], [ 200/ 317] time: 905, d_loss: 1.38683, g_loss: 11.58067, cheat_loss: 0.69336, const_loss: 0.03313, l1_loss: 4.45930, fm_loss: 0.02325, vgg_loss: 0.63716
Epoch: [ 0], [ 300/ 317] time: 1359, d_loss: 1.38674, g_loss: 10.92872, cheat_loss: 0.69287, const_loss: 0.01517, l1_loss: 4.12120, fm_loss: 0.01909, vgg_loss: 0.60804
Epoch: [ 1], [ 0/ 317] time: 1433, d_loss: 1.38673, g_loss: 11.49218, cheat_loss: 0.69385, const_loss: 0.01714, l1_loss: 4.35434, fm_loss: 0.02144, vgg_loss: 0.64054
Epoch: [ 1], [ 100/ 317] time: 1887, d_loss: 1.38791, g_loss: 12.11400, cheat_loss: 0.69287, const_loss: 0.01035, l1_loss: 4.61668, fm_loss: 0.02436, vgg_loss: 0.67697
Epoch: [ 1], [ 200/ 317] time: 2339, d_loss: 1.38877, g_loss: 10.97058, cheat_loss: 0.69336, const_loss: 0.02209, l1_loss: 4.08948, fm_loss: 0.01864, vgg_loss: 0.61470
Epoch: [ 1], [ 300/ 317] time: 2792, d_loss: 1.38914, g_loss: 11.59704, cheat_loss: 0.69189, const_loss: 0.02782, l1_loss: 4.40532, fm_loss: 0.02318, vgg_loss: 0.64488
Epoch: [ 2], [ 0/ 317] time: 2866, d_loss: 1.39021, g_loss: 9.34783, cheat_loss: 0.69287, const_loss: 0.01219, l1_loss: 3.35214, fm_loss: 0.01765, vgg_loss: 0.52730
Epoch: [ 2], [ 100/ 317] time: 3318, d_loss: 1.38674, g_loss: 10.36315, cheat_loss: 0.69189, const_loss: 0.01243, l1_loss: 3.70686, fm_loss: 0.01928, vgg_loss: 0.59327
Epoch: [ 2], [ 200/ 317] time: 3777, d_loss: 1.38685, g_loss: 11.14089, cheat_loss: 0.69385, const_loss: 0.01753, l1_loss: 4.08696, fm_loss: 0.01909, vgg_loss: 0.63235
Epoch: [ 2], [ 300/ 317] time: 4229, d_loss: 1.38717, g_loss: 10.10146, cheat_loss: 0.69336, const_loss: 0.00913, l1_loss: 3.52672, fm_loss: 0.01717, vgg_loss: 0.58551
Epoch: [ 3], [ 0/ 317] time: 4303, d_loss: 1.38712, g_loss: 10.31236, cheat_loss: 0.69287, const_loss: 0.00974, l1_loss: 3.71696, fm_loss: 0.01959, vgg_loss: 0.58732
Epoch: [ 3], [ 100/ 317] time: 4756, d_loss: 1.38700, g_loss: 10.16995, cheat_loss: 0.69189, const_loss: 0.01288, l1_loss: 3.65795, fm_loss: 0.01786, vgg_loss: 0.57894
Epoch: [ 3], [ 200/ 317] time: 5208, d_loss: 1.38677, g_loss: 11.39987, cheat_loss: 0.69336, const_loss: 0.00966, l1_loss: 4.30649, fm_loss: 0.01964, vgg_loss: 0.63707
Epoch: [ 3], [ 300/ 317] time: 5666, d_loss: 1.38678, g_loss: 10.34439, cheat_loss: 0.69238, const_loss: 0.01845, l1_loss: 3.74052, fm_loss: 0.01951, vgg_loss: 0.58735
Epoch: [ 4], [ 0/ 317] time: 5740, d_loss: 1.39050, g_loss: 10.48697, cheat_loss: 0.69287, const_loss: 0.01239, l1_loss: 3.79909, fm_loss: 0.02094, vgg_loss: 0.59617
Epoch: [ 4], [ 100/ 317] time: 6192, d_loss: 1.38700, g_loss: 10.28607, cheat_loss: 0.69336, const_loss: 0.01367, l1_loss: 3.71227, fm_loss: 0.01894, vgg_loss: 0.58478
Epoch: [ 4], [ 200/ 317] time: 6645, d_loss: 1.38681, g_loss: 9.86172, cheat_loss: 0.69385, const_loss: 0.01393, l1_loss: 3.56781, fm_loss: 0.01807, vgg_loss: 0.55681
Epoch: [ 4], [ 300/ 317] time: 7096, d_loss: 1.38752, g_loss: 11.10733, cheat_loss: 0.69434, const_loss: 0.00987, l1_loss: 4.01633, fm_loss: 0.01950, vgg_loss: 0.63673
Epoch: [ 5], [ 0/ 317] time: 7170, d_loss: 1.38825, g_loss: 11.27310, cheat_loss: 0.69287, const_loss: 0.05156, l1_loss: 4.20801, fm_loss: 0.01966, vgg_loss: 0.63010
Epoch: [ 5], [ 100/ 317] time: 7624, d_loss: 1.38679, g_loss: 10.03360, cheat_loss: 0.69336, const_loss: 0.01022, l1_loss: 3.57978, fm_loss: 0.01759, vgg_loss: 0.57326
Epoch: [ 5], [ 200/ 317] time: 8076, d_loss: 1.38990, g_loss: 9.48372, cheat_loss: 0.69189, const_loss: 0.01126, l1_loss: 3.38894, fm_loss: 0.01645, vgg_loss: 0.53752
Epoch: [ 5], [ 300/ 317] time: 8529, d_loss: 1.38675, g_loss: 9.29934, cheat_loss: 0.69287, const_loss: 0.01208, l1_loss: 3.27883, fm_loss: 0.01675, vgg_loss: 0.52988
Epoch: [ 6], [ 0/ 317] time: 8603, d_loss: 1.38782, g_loss: 10.26814, cheat_loss: 0.69287, const_loss: 0.01066, l1_loss: 3.87605, fm_loss: 0.01664, vgg_loss: 0.56719
Epoch: [ 6], [ 100/ 317] time: 9057, d_loss: 1.38820, g_loss: 9.70704, cheat_loss: 0.69385, const_loss: 0.00955, l1_loss: 3.50647, fm_loss: 0.01665, vgg_loss: 0.54805
Epoch: [ 6], [ 200/ 317] time: 9510, d_loss: 1.38692, g_loss: 10.23998, cheat_loss: 0.69238, const_loss: 0.00819, l1_loss: 3.60831, fm_loss: 0.01755, vgg_loss: 0.59135
Epoch: [ 6], [ 300/ 317] time: 9964, d_loss: 1.38681, g_loss: 9.36549, cheat_loss: 0.69336, const_loss: 0.02900, l1_loss: 3.23567, fm_loss: 0.01477, vgg_loss: 0.53927
Epoch: [ 7], [ 0/ 317] time: 10038, d_loss: 1.38676, g_loss: 10.35784, cheat_loss: 0.69141, const_loss: 0.01559, l1_loss: 3.66848, fm_loss: 0.01689, vgg_loss: 0.59655
Epoch: [ 7], [ 100/ 317] time: 10490, d_loss: 1.38734, g_loss: 11.12286, cheat_loss: 0.69385, const_loss: 0.01430, l1_loss: 3.97371, fm_loss: 0.01836, vgg_loss: 0.64226
Epoch: [ 7], [ 200/ 317] time: 10948, d_loss: 1.38693, g_loss: 8.86370, cheat_loss: 0.69385, const_loss: 0.01122, l1_loss: 3.07869, fm_loss: 0.01524, vgg_loss: 0.50647
Note: it actually dropped to 8.8.
Resume, training only on special characters + high stroke counts.
Epoch: [ 1], [ 0/ 69] time: 327, d_loss: 1.38802, g_loss: 9.93423, cheat_loss: 0.69385, const_loss: 0.00985, l1_loss: 3.24493, fm_loss: 0.01776, vgg_loss: 0.59678
Epoch: [ 2], [ 0/ 69] time: 651, d_loss: 1.39065, g_loss: 9.08825, cheat_loss: 0.69336, const_loss: 0.00538, l1_loss: 2.94353, fm_loss: 0.01541, vgg_loss: 0.54306
Epoch: [ 3], [ 0/ 69] time: 985, d_loss: 1.38692, g_loss: 9.53170, cheat_loss: 0.69336, const_loss: 0.00867, l1_loss: 3.03723, fm_loss: 0.01518, vgg_loss: 0.57773
Epoch: [ 4], [ 0/ 69] time: 1310, d_loss: 1.38826, g_loss: 9.33036, cheat_loss: 0.69385, const_loss: 0.00470, l1_loss: 2.94611, fm_loss: 0.01552, vgg_loss: 0.56702
Epoch: [ 5], [ 0/ 69] time: 1634, d_loss: 1.38677, g_loss: 8.81498, cheat_loss: 0.69336, const_loss: 0.00515, l1_loss: 2.73017, fm_loss: 0.01423, vgg_loss: 0.53721
Epoch: [ 6], [ 0/ 69] time: 1963, d_loss: 1.38700, g_loss: 9.76251, cheat_loss: 0.69238, const_loss: 0.00394, l1_loss: 3.14531, fm_loss: 0.01491, vgg_loss: 0.59060
Epoch: [ 7], [ 0/ 69] time: 2288, d_loss: 1.38702, g_loss: 9.13804, cheat_loss: 0.69238, const_loss: 0.00694, l1_loss: 2.83012, fm_loss: 0.01326, vgg_loss: 0.55953
Epoch: [ 8], [ 0/ 69] time: 2612, d_loss: 1.38673, g_loss: 8.76700, cheat_loss: 0.69336, const_loss: 0.00957, l1_loss: 2.79858, fm_loss: 0.01587, vgg_loss: 0.52496
Epoch: [ 9], [ 0/ 69] time: 2945, d_loss: 1.38678, g_loss: 8.74676, cheat_loss: 0.69385, const_loss: 0.01327, l1_loss: 2.72782, fm_loss: 0.01435, vgg_loss: 0.52975
Epoch: [10], [ 0/ 69] time: 3270, d_loss: 1.38709, g_loss: 8.89688, cheat_loss: 0.69385, const_loss: 0.01088, l1_loss: 2.81021, fm_loss: 0.01480, vgg_loss: 0.53671
Epoch: [11], [ 0/ 69] time: 3594, d_loss: 1.38711, g_loss: 9.61701, cheat_loss: 0.69336, const_loss: 0.00501, l1_loss: 3.09115, fm_loss: 0.01628, vgg_loss: 0.58112
Epoch: [12], [ 0/ 69] time: 3929, d_loss: 1.38684, g_loss: 8.61682, cheat_loss: 0.69336, const_loss: 0.00423, l1_loss: 2.66952, fm_loss: 0.01317, vgg_loss: 0.52365
Epoch: [13], [ 0/ 69] time: 4253, d_loss: 1.38674, g_loss: 8.22307, cheat_loss: 0.69336, const_loss: 0.02554, l1_loss: 2.57933, fm_loss: 0.01300, vgg_loss: 0.49118
Epoch: [14], [ 0/ 69] time: 4578, d_loss: 1.38674, g_loss: 9.19054, cheat_loss: 0.69336, const_loss: 0.00496, l1_loss: 2.86590, fm_loss: 0.01389, vgg_loss: 0.56124
Epoch: [15], [ 0/ 69] time: 4909, d_loss: 1.38695, g_loss: 8.82743, cheat_loss: 0.69238, const_loss: 0.00669, l1_loss: 2.75958, fm_loss: 0.01442, vgg_loss: 0.53544
Epoch: [16], [ 0/ 69] time: 5234, d_loss: 1.38676, g_loss: 9.24723, cheat_loss: 0.69336, const_loss: 0.00444, l1_loss: 2.95578, fm_loss: 0.01657, vgg_loss: 0.55771
I took the log to Gemini for analysis:
Based on the log you provided, here is an analysis of the training process:
1. Basic observations:
   - Epochs and batches:
     - Training ran for 15 epochs, each containing 61 batches.
     - The log only shows the first batch of each epoch ([0/61]), so we only see the loss at the start of each epoch, not how it evolves within one.
   - Time:
     - Per-epoch training time grows roughly linearly, indicating a stable training process.
   - Loss functions:
     - The log contains several losses: d_loss (discriminator loss), g_loss (generator loss), cheat_loss, const_loss, l1_loss, fm_loss, and vgg_loss.
     - l1_loss, fm_loss, and vgg_loss measure the difference between the generated image and the target image; the smaller they are, the closer the output is to the target.
2. Loss analysis:
   - d_loss (discriminator loss): fluctuates between 1.38 and 1.42, relatively stable. The discriminator keeps a fairly consistent ability to separate real from fake.
   - g_loss (generator loss): fluctuates widely, from 10.87 up to 13.80. This may mean the generator's output varies between epochs. There is no clear downward trend overall, which may mean the generator is still struggling to learn, or that the model has some problems.
   - Other losses:
     - l1_loss trends gradually downward, so the pixel difference between generated and target images is shrinking.
     - fm_loss and vgg_loss fluctuate little, so the feature-level difference is also relatively stable.
     - cheat_loss is relatively stable, meaning the discriminator cannot reliably tell real from fake.
     - const_loss fluctuates a lot, so the consistency of the generator's output could be improved.
3. Problems and suggestions:
   - g_loss fluctuation: may indicate unstable generator training. Try adjusting the learning rate, optimizer parameters, or model architecture to improve stability.
   - No within-epoch losses: seeing only each epoch's first batch hides what happens inside the epoch. Record every batch's losses for a fuller picture.
   - Possible mode collapse: if the generator only produces a few similar outputs, g_loss can fluctuate while output quality stays poor. Periodically inspect the generated images visually to detect mode collapse.
   - Learning-rate tuning: looking at g_loss versus d_loss, the discriminator can already separate real from fake stably while the generator still cannot fool it well; try raising the generator's learning rate to speed up its learning.
Summary:
- Training is relatively stable overall, but the generator's training may have some problems.
- Record more detailed training data and periodically inspect the visual quality of generated images to better analyze and improve the model.
- Adjusting the generator's learning rate may help.
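Gemini's suggestion to record every batch's losses (instead of only [0/61]) needs nothing more than a CSV file; a stdlib sketch (the class and column names are my own, not from the training script):

```python
import csv

class LossLogger:
    """Append one CSV row per batch, so full loss curves can be plotted
    later instead of only seeing each epoch's first batch in the console."""
    FIELDS = ["epoch", "batch", "d_loss", "g_loss", "l1_loss", "vgg_loss"]

    def __init__(self, path):
        self.path = path
        with open(path, "w", newline="") as f:
            csv.writer(f).writerow(self.FIELDS)

    def log(self, **losses):
        # missing keys are written as empty cells
        with open(self.path, "a", newline="") as f:
            csv.writer(f).writerow([losses.get(k, "") for k in self.FIELDS])

# called once per batch inside the training loop, e.g.:
logger = LossLogger("losses.csv")
logger.log(epoch=1, batch=0, d_loss=1.40394, g_loss=13.80208)
logger.log(epoch=2, batch=0, d_loss=1.40778, g_loss=12.80063)
```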
Given the situation above, do I need to increase the final_channels value in the Discriminator?
Based on the log analysis above and how GANs train, here are some thoughts on whether to increase the Discriminator's final_channels:
1. What final_channels does in the Discriminator:
   - In a GAN's Discriminator, final_channels usually refers to the number of output channels of the last convolutional layer.
   - It affects how fine-grained the features the Discriminator extracts are.
   - Increasing final_channels lets the Discriminator learn more complex features, improving its ability to tell real images from fake.
2. What the current log shows:
   - d_loss is relatively stable: it fluctuates between 1.38 and 1.42, which suggests the Discriminator already has reasonable discriminative ability.
   - g_loss fluctuates a lot: the Generator is not producing images stably.
   - cheat_loss is relatively stable: the discriminator cannot reliably tell real from fake.
3. Whether to increase final_channels:
   - If the Discriminator is too weak: if it cannot effectively separate real from fake images, with d_loss too low or too high, then increasing final_channels may help by letting it extract finer features.
   - Probably not needed right now: from the log, d_loss is stable, so the Discriminator already has some discriminative ability; the current problem is more likely unstable Generator training. Given the cheat_loss values, you can hold off on increasing final_channels.
   - Other adjustments first: before increasing final_channels, try:
     - Adjusting the Generator's learning rate to improve its generative ability.
     - Checking the quality of the training data and making sure it is diverse enough.
     - Trying different loss-function combinations to optimize training.
     - Monitoring the quality of generated images to see whether the Generator's architecture needs changes.
4. Conclusion:
   - In the current situation, increasing the Discriminator's final_channels is probably not the first thing to try.
   - First try adjusting the Generator's learning rate, checking the training data, and tuning the losses; if the problem persists, then consider increasing final_channels.
   - Increasing final_channels makes the Discriminator more complex, which increases training compute and time.
In short, whether to increase final_channels should be judged from the actual situation and experimental results.
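For scale, the extra cost Gemini mentions can be estimated from conv-layer parameter counts. A sketch with hypothetical layer sizes (the real Discriminator's channel widths are not shown in this log):

```python
def conv_params(in_ch, out_ch, k=3, bias=True):
    """Trainable parameters in one k x k convolution layer."""
    return in_ch * out_ch * k * k + (out_ch if bias else 0)

# Hypothetical tail of a patch Discriminator: 128 -> final_channels -> 1.
for final_channels in (256, 512):
    tail = conv_params(128, final_channels) + conv_params(final_channels, 1)
    print(f"final_channels={final_channels}: {tail} params in the last two layers")
```

Doubling final_channels roughly doubles the parameter count (and compute) of those layers, which is the training cost to weigh against any gain in discriminative power.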
Based on the log analysis results and GAN training principles, here are some suggestions on adjusting the Generator's learning rate:
1. Current situation:
   - d_loss is relatively stable: the discriminator already has some discriminative ability.
   - g_loss fluctuates a lot: the generator is unstable when producing images and needs to learn more.
   - cheat_loss is relatively stable: the discriminator cannot reliably tell real from fake.
2. Suggestions for adjusting the Generator's learning rate:
   - Raise the Generator's learning rate: since the discriminator is already capable while the generator's training is unstable, try raising the generator's learning rate. A larger parameter update per iteration lets the generator learn to produce realistic images faster, and improves its ability to fool the discriminator, which would push cheat_loss up.
   - Caveats:
     - A higher learning rate can destabilize training, so adjust gradually and monitor the losses and generated results.
     - If training starts to oscillate or diverge, lower the learning rate.
     - You can also try different scheduling strategies, such as learning-rate decay or adaptive learning rates.
3. Concrete steps:
   - Start by raising the Generator's learning rate by a small amount, e.g. 10% to 20%.
   - While adjusting, monitor the changes in g_loss, d_loss, cheat_loss, and the other losses.
   - Periodically check the quality of generated images to make sure detail and quality are improving.
Summary:
- In the current situation, raising the Generator's learning rate is likely to help its generative ability.
- But adjust carefully and watch the training process closely.
- GAN training is quite complex, and judging its state takes a lot of experience.
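Gemini's advice — raise the Generator's LR by 10–20%, back off if training starts oscillating — can be expressed as a small hand-rolled heuristic. This is a sketch of the policy only, not any framework's scheduler API, and the thresholds are my own choices:

```python
def adjust_g_lr(lr, g_loss_history, raise_pct=0.15, window=5):
    """Heuristic LR policy from the advice above: if g_loss has been flat
    over the last `window` epochs, raise the Generator LR by raise_pct;
    if it is swinging wildly (oscillation), halve the LR instead."""
    recent = g_loss_history[-window:]
    if len(recent) < window:
        return lr                      # not enough history yet
    swing = max(recent) - min(recent)
    if swing > 2.0:                    # oscillating -> cool down
        return lr * 0.5
    if recent[-1] > recent[0] - 0.1:   # flat or rising -> push harder
        return lr * (1 + raise_pct)
    return lr                          # still falling on its own

lr = 1e-4
# g_loss at epochs 1-5 of the special-rules run above: still falling slowly
history = [13.80, 12.80, 13.03, 13.19, 11.89]
lr = adjust_g_lr(lr, history)          # left unchanged: loss is dropping
```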
Does the current log show oscillation or divergence?
From the log data you provided, there is no extreme divergence, but there are some signs of oscillation, especially in g_loss. Some analysis:
1. g_loss fluctuation:
   - g_loss swings noticeably between epochs, e.g. 13.51 at epoch 6 versus 10.93 at epoch 8. Swings this large suggest the generator's training may be unstable.
   - This fluctuation may mean the quality of the generated images differs considerably between epochs.
2. Changes in the other losses:
   - Although d_loss is relatively stable, other losses (such as l1_loss and vgg_loss) also fluctuate somewhat.
   - These fluctuations likely track the changes in generated-image quality.
3. Possible causes of oscillation:
   - Learning rate too high: the generator's parameter updates per iteration may be too large, causing oscillation.
   - Imbalance between models: the generator and discriminator may be training out of balance, competing with each other in a way that produces oscillation.
   - Training-data problems: insufficient quality or diversity in the training data can keep the generator from learning the target distribution stably.
4. How to recognize divergence:
   - Exploding losses: if a loss grows to extreme values, such as infinity or NaN, training has diverged.
   - Deteriorating outputs: if generated-image quality keeps getting worse during training (blurring, distortion, or completely meaningless images), that can also indicate divergence.
5. Summary:
   - GAN training is quite complex, and judging its state takes a lot of experience.
   - The current log shows no obvious divergence, but the g_loss fluctuation indicates oscillation during training.
   - Monitor training carefully and try adjusting the learning rate, optimizer parameters, or model architecture to improve stability.
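Gemini's divergence criteria (a loss hitting infinity/NaN, or exploding) can be checked mechanically against console lines like the ones above; a sketch (the regex and the 10x threshold are my own choices):

```python
import math
import re

def parse_losses(log_lines, key="g_loss"):
    """Pull one loss series out of console lines like the ones above."""
    pat = re.compile(key + r": ([^,\s]+)")
    return [float(m.group(1)) for line in log_lines if (m := pat.search(line))]

def diverged(values, explode_factor=10.0):
    """Divergence per the criteria above: any NaN/inf loss, or a loss that
    blows up to many times its starting level. Oscillation alone does not
    count; the 10x threshold is an arbitrary choice."""
    if any(math.isnan(v) or math.isinf(v) for v in values):
        return True
    return bool(values) and max(values) > explode_factor * values[0]

lines = [
    "Epoch: [ 6], [  0/ 61] time: 1926, d_loss: 1.38746, g_loss: 13.51695",
    "Epoch: [ 8], [  0/ 61] time: 2565, d_loss: 1.39391, g_loss: 10.93140",
]
print(diverged(parse_losses(lines)))   # swinging, but not diverging
```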