TF DCGAN: A Full Record of Testing DCGAN on the MNIST Dataset with TensorFlow and the Generation Process


水蜜桃怕孤独 · 2022-09-19 16:28:41
Category: News


Contents

Test Results

Full Record of the Test Process


Test Results

Generated sample grids saved during training (images not reproduced here): train_00_0099, train_00_0799, train_00_0899, train_01_0506, train_01_0606, train_02_0213, train_02_0313, train_02_1013, train_03_0020, train_03_0720

Full Record of the Test Process

1140~1410

…… Test started
{'batch_size': <absl.flags._flag.Flag object at 0x000002A2FFDB1B38>,
'beta1': <absl.flags._flag.Flag object at 0x000002A2FE967DA0>,
'checkpoint_dir': <absl.flags._flag.Flag object at 0x000002A281135A20>,
'crop': <absl.flags._flag.BooleanFlag object at 0x000002A281135B70>,
'dataset': <absl.flags._flag.Flag object at 0x000002A281135908>,
'epoch': <absl.flags._flag.Flag object at 0x000002A2F7728048>,
'h': <tensorflow.python.platform.app._HelpFlag object at 0x000002A281135C50>,
'help': <tensorflow.python.platform.app._HelpFlag object at 0x000002A281135C50>,
'helpfull': <tensorflow.python.platform.app._HelpfullFlag object at 0x000002A281135CC0>,
'helpshort': <tensorflow.python.platform.app._HelpshortFlag object at 0x000002A281135D30>,
'input_fname_pattern': <absl.flags._flag.Flag object at 0x000002A281135978>,
'input_height': <absl.flags._flag.Flag object at 0x000002A2810ABCC0>,
'input_width': <absl.flags._flag.Flag object at 0x000002A281135780>,
'learning_rate': <absl.flags._flag.Flag object at 0x000002A2F92D7AC8>,
'output_height': <absl.flags._flag.Flag object at 0x000002A2811357F0>,
'output_width': <absl.flags._flag.Flag object at 0x000002A281135898>,
'sample_dir': <absl.flags._flag.Flag object at 0x000002A281135A90>,
'train': <absl.flags._flag.BooleanFlag object at 0x000002A281135AC8>,
'train_size': <absl.flags._flag.Flag object at 0x000002A2FE974400>,
'visualize': <absl.flags._flag.BooleanFlag object at 0x000002A281135BE0>}
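The dump above prints only Flag object addresses, not values (the original code uses tf.app.flags / absl.flags). As a minimal sketch of what these options look like as a command line, the same flag names can be declared with the standard-library argparse module; every default value below is an illustrative assumption, not something read from the log:

```python
import argparse

# Flag names taken from the dump above; all defaults here are assumed
# for illustration only (they are not shown in the original output).
parser = argparse.ArgumentParser(description="DCGAN on MNIST (sketch)")
parser.add_argument("--epoch", type=int, default=25)
parser.add_argument("--learning_rate", type=float, default=0.0002)
parser.add_argument("--beta1", type=float, default=0.5)  # Adam beta1
parser.add_argument("--batch_size", type=int, default=64)
parser.add_argument("--input_height", type=int, default=28)
parser.add_argument("--input_width", type=int, default=28)
parser.add_argument("--output_height", type=int, default=28)
parser.add_argument("--output_width", type=int, default=28)
parser.add_argument("--dataset", type=str, default="mnist")
parser.add_argument("--input_fname_pattern", type=str, default="*.jpg")
parser.add_argument("--checkpoint_dir", type=str, default="checkpoint")
parser.add_argument("--sample_dir", type=str, default="samples")
parser.add_argument("--train", action="store_true")
parser.add_argument("--crop", action="store_true")
parser.add_argument("--visualize", action="store_true")

# Equivalent of running: python main.py --dataset mnist --train
args = parser.parse_args(["--dataset", "mnist", "--train"])
print(args.dataset, args.batch_size, args.train)
```

Run with `--train` set, the parsed namespace would drive the training loop whose output fills the rest of this log.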
2018-10-06 11:32:10.690386: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2
data_MNIST\mnist
---------
Variables: name (type shape) [size]
---------
generator/g_h0_lin/Matrix:0 (float32_ref 110x1024) [112640, bytes: 450560]
generator/g_h0_lin/bias:0 (float32_ref 1024) [1024, bytes: 4096]
generator/g_bn0/beta:0 (float32_ref 1024) [1024, bytes: 4096]
generator/g_bn0/gamma:0 (float32_ref 1024) [1024, bytes: 4096]
generator/g_h1_lin/Matrix:0 (float32_ref 1034x6272) [6485248, bytes: 25940992]
generator/g_h1_lin/bias:0 (float32_ref 6272) [6272, bytes: 25088]
generator/g_bn1/beta:0 (float32_ref 6272) [6272, bytes: 25088]
generator/g_bn1/gamma:0 (float32_ref 6272) [6272, bytes: 25088]
generator/g_h2/w:0 (float32_ref 5x5x128x138) [441600, bytes: 1766400]
generator/g_h2/biases:0 (float32_ref 128) [128, bytes: 512]
generator/g_bn2/beta:0 (float32_ref 128) [128, bytes: 512]
generator/g_bn2/gamma:0 (float32_ref 128) [128, bytes: 512]
generator/g_h3/w:0 (float32_ref 5x5x1x138) [3450, bytes: 13800]
generator/g_h3/biases:0 (float32_ref 1) [1, bytes: 4]
discriminator/d_h0_conv/w:0 (float32_ref 5x5x11x11) [3025, bytes: 12100]
discriminator/d_h0_conv/biases:0 (float32_ref 11) [11, bytes: 44]
discriminator/d_h1_conv/w:0 (float32_ref 5x5x21x74) [38850, bytes: 155400]
discriminator/d_h1_conv/biases:0 (float32_ref 74) [74, bytes: 296]
discriminator/d_bn1/beta:0 (float32_ref 74) [74, bytes: 296]
discriminator/d_bn1/gamma:0 (float32_ref 74) [74, bytes: 296]
discriminator/d_h2_lin/Matrix:0 (float32_ref 3636x1024) [3723264, bytes: 14893056]
discriminator/d_h2_lin/bias:0 (float32_ref 1024) [1024, bytes: 4096]
discriminator/d_bn2/beta:0 (float32_ref 1024) [1024, bytes: 4096]
discriminator/d_bn2/gamma:0 (float32_ref 1024) [1024, bytes: 4096]
discriminator/d_h3_lin/Matrix:0 (float32_ref 1034x1) [1034, bytes: 4136]
discriminator/d_h3_lin/bias:0 (float32_ref 1) [1, bytes: 4]
Total size of variables: 10834690
Total bytes of variables: 43338760
[*] Reading checkpoints...
[*] Failed to find a checkpoint
[!] Load failed...
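The variable table above can be cross-checked by hand: each [size] is the product of the shape dimensions, and the totals are their sum (4 bytes per float32 element). A small sketch that recomputes the two printed totals from the listed shapes (the name prefixes are abbreviated; the shapes are copied verbatim):

```python
import math

# (name, shape) pairs copied from the variable table above.
shapes = [
    ("g_h0_lin/Matrix", (110, 1024)), ("g_h0_lin/bias", (1024,)),
    ("g_bn0/beta", (1024,)), ("g_bn0/gamma", (1024,)),
    ("g_h1_lin/Matrix", (1034, 6272)), ("g_h1_lin/bias", (6272,)),
    ("g_bn1/beta", (6272,)), ("g_bn1/gamma", (6272,)),
    ("g_h2/w", (5, 5, 128, 138)), ("g_h2/biases", (128,)),
    ("g_bn2/beta", (128,)), ("g_bn2/gamma", (128,)),
    ("g_h3/w", (5, 5, 1, 138)), ("g_h3/biases", (1,)),
    ("d_h0_conv/w", (5, 5, 11, 11)), ("d_h0_conv/biases", (11,)),
    ("d_h1_conv/w", (5, 5, 21, 74)), ("d_h1_conv/biases", (74,)),
    ("d_bn1/beta", (74,)), ("d_bn1/gamma", (74,)),
    ("d_h2_lin/Matrix", (3636, 1024)), ("d_h2_lin/bias", (1024,)),
    ("d_bn2/beta", (1024,)), ("d_bn2/gamma", (1024,)),
    ("d_h3_lin/Matrix", (1034, 1)), ("d_h3_lin/bias", (1,)),
]

# Element count of each variable is the product of its dimensions.
total_params = sum(math.prod(s) for _, s in shapes)
total_bytes = total_params * 4  # float32 = 4 bytes per element

print(total_params)  # matches "Total size of variables: 10834690"
print(total_bytes)   # matches "Total bytes of variables: 43338760"
```

Incidentally, the odd widths (110 = 100 + 10, 1034 = 1024 + 10, 138 = 128 + 10, 11 = 1 + 10) suggest this is likely a conditional DCGAN: a 100-dim noise vector and intermediate feature maps concatenated with a 10-dim MNIST label vector.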
Epoch: [ 0] [ 0/1093] time: 3.3617, d_loss: 1.79891801, g_loss: 0.73078763
Epoch: [ 0] [ 1/1093] time: 6.4123, d_loss: 1.46442509, g_loss: 0.61579478
Epoch: [ 0] [ 2/1093] time: 8.7562, d_loss: 1.49022853, g_loss: 0.67894053
Epoch: [ 0] [ 3/1093] time: 10.9214, d_loss: 1.40174472, g_loss: 0.66220653
Epoch: [ 0] [ 4/1093] time: 13.3050, d_loss: 1.40663481, g_loss: 0.69936526
Epoch: [ 0] [ 5/1093] time: 15.5709, d_loss: 1.38957083, g_loss: 0.68421012
Epoch: [ 0] [ 6/1093] time: 17.8600, d_loss: 1.39213061, g_loss: 0.68934584
Epoch: [ 0] [ 7/1093] time: 20.4708, d_loss: 1.39794362, g_loss: 0.69806755
Epoch: [ 0] [ 8/1093] time: 23.0654, d_loss: 1.43503237, g_loss: 0.70846951
Epoch: [ 0] [ 9/1093] time: 25.5358, d_loss: 1.39276147, g_loss: 0.70669782
Epoch: [ 0] [ 10/1093] time: 28.2617, d_loss: 1.42136300, g_loss: 0.70364445
Epoch: [ 0] [ 11/1093] time: 30.8038, d_loss: 1.40051103, g_loss: 0.70014894
Epoch: [ 0] [ 12/1093] time: 33.3130, d_loss: 1.37765169, g_loss: 0.70824486
Epoch: [ 0] [ 13/1093] time: 35.6096, d_loss: 1.38219857, g_loss: 0.69451976
Epoch: [ 0] [ 14/1093] time: 37.8537, d_loss: 1.36866033, g_loss: 0.70824432
Epoch: [ 0] [ 15/1093] time: 40.1426, d_loss: 1.36621869, g_loss: 0.69405836
Epoch: [ 0] [ 16/1093] time: 42.7074, d_loss: 1.37535453, g_loss: 0.69518888
Epoch: [ 0] [ 17/1093] time: 44.8565, d_loss: 1.36989605, g_loss: 0.69930756
Epoch: [ 0] [ 18/1093] time: 46.7869, d_loss: 1.36563087, g_loss: 0.69781649
Epoch: [ 0] [ 19/1093] time: 48.7288, d_loss: 1.36397326, g_loss: 0.70866680
Epoch: [ 0] [ 20/1093] time: 51.0654, d_loss: 1.38101411, g_loss: 0.69544500
Epoch: [ 0] [ 21/1093] time: 53.5399, d_loss: 1.46281934, g_loss: 0.70643008
Epoch: [ 0] [ 22/1093] time: 56.5684, d_loss: 1.43966162, g_loss: 0.71961737
Epoch: [ 0] [ 23/1093] time: 59.5954, d_loss: 1.42399430, g_loss: 0.72861439
Epoch: [ 0] [ 24/1093] time: 62.9032, d_loss: 1.41276562, g_loss: 0.70471978
Epoch: [ 0] [ 25/1093] time: 65.7187, d_loss: 1.48300290, g_loss: 0.71538234
Epoch: [ 0] [ 26/1093] time: 68.6204, d_loss: 1.39843416, g_loss: 0.68771482
Epoch: [ 0] [ 27/1093] time: 70.8153, d_loss: 1.42166626, g_loss: 0.69409549
Epoch: [ 0] [ 28/1093] time: 73.5776, d_loss: 1.39594829, g_loss: 0.68035471
Epoch: [ 0] [ 29/1093] time: 76.6749, d_loss: 1.39489424, g_loss: 0.69306409
Epoch: [ 0] [ 30/1093] time: 79.8282, d_loss: 1.41070235, g_loss: 0.68208236
Epoch: [ 0] [ 31/1093] time: 83.5562, d_loss: 1.39976072, g_loss: 0.69344074
Epoch: [ 0] [ 32/1093] time: 86.5431, d_loss: 1.39875138, g_loss: 0.69864786
Epoch: [ 0] [ 33/1093] time: 89.7386, d_loss: 1.39117682, g_loss: 0.68384939
Epoch: [ 0] [ 34/1093] time: 92.1129, d_loss: 1.39306462, g_loss: 0.68603516
Epoch: [ 0] [ 35/1093] time: 94.6717, d_loss: 1.39766645, g_loss: 0.67713618
Epoch: [ 0] [ 36/1093] time: 97.4150, d_loss: 1.39619994, g_loss: 0.68300879
Epoch: [ 0] [ 37/1093] time: 99.9408, d_loss: 1.39534819, g_loss: 0.69076747
Epoch: [ 0] [ 38/1093] time: 103.1213, d_loss: 1.39753985, g_loss: 0.68903100
Epoch: [ 0] [ 39/1093] time: 105.8520, d_loss: 1.41161013, g_loss: 0.69302136
Epoch: [ 0] [ 40/1093] time: 108.9503, d_loss: 1.38997078, g_loss: 0.68370312
Epoch: [ 0] [ 41/1093] time: 112.2070, d_loss: 1.39786303, g_loss: 0.69124269
Epoch: [ 0] [ 42/1093] time: 115.2431, d_loss: 1.38943410, g_loss: 0.69021893
Epoch: [ 0] [ 43/1093] time: 118.6511, d_loss: 1.38621378, g_loss: 0.68407494
Epoch: [ 0] [ 44/1093] time: 122.0462, d_loss: 1.39240563, g_loss: 0.69688046
Epoch: [ 0] [ 45/1093] time: 125.3139, d_loss: 1.39452100, g_loss: 0.69252259
Epoch: [ 0] [ 46/1093] time: 129.0117, d_loss: 1.39167857, g_loss: 0.68246353
Epoch: [ 0] [ 47/1093] time: 132.8489, d_loss: 1.39049268, g_loss: 0.69009811
Epoch: [ 0] [ 48/1093] time: 136.4826, d_loss: 1.39105415, g_loss: 0.69570535
Epoch: [ 0] [ 49/1093] time: 139.8832, d_loss: 1.38744533, g_loss: 0.68307704
Epoch: [ 0] [ 50/1093] time: 142.6343, d_loss: 1.39128542, g_loss: 0.68657452
Epoch: [ 0] [ 51/1093] time: 145.0365, d_loss: 1.39720774, g_loss: 0.68289292
Epoch: [ 0] [ 52/1093] time: 148.8226, d_loss: 1.40998244, g_loss: 0.69946194
Epoch: [ 0] [ 53/1093] time: 151.4981, d_loss: 1.42358077, g_loss: 0.69425476
Epoch: [ 0] [ 54/1093] time: 154.4366, d_loss: 1.40655017, g_loss: 0.69315112
Epoch: [ 0] [ 55/1093] time: 157.9840, d_loss: 1.39314961, g_loss: 0.67903620
Epoch: [ 0] [ 56/1093] time: 160.5293, d_loss: 1.39538550, g_loss: 0.68701828
Epoch: [ 0] [ 57/1093] time: 162.8455, d_loss: 1.40030372, g_loss: 0.68119174
Epoch: [ 0] [ 58/1093] time: 165.5109, d_loss: 1.39839721, g_loss: 0.68374062
Epoch: [ 0] [ 59/1093] time: 168.1250, d_loss: 1.40220833, g_loss: 0.67849696
Epoch: [ 0] [ 60/1093] time: 170.4443, d_loss: 1.40346980, g_loss: 0.68534362
Epoch: [ 0] [ 61/1093] time: 172.5757, d_loss: 1.40919614, g_loss: 0.68264174
Epoch: [ 0] [ 62/1093] time: 175.3375, d_loss: 1.41680074, g_loss: 0.69107366
Epoch: [ 0] [ 63/1093] time: 178.1931, d_loss: 1.42677331, g_loss: 0.68684256
Epoch: [ 0] [ 64/1093] time: 180.9363, d_loss: 1.41873085, g_loss: 0.68174267
Epoch: [ 0] [ 65/1093] time: 183.4142, d_loss: 1.41352820, g_loss: 0.69168335
Epoch: [ 0] [ 66/1093] time: 186.2004, d_loss: 1.40492952, g_loss: 0.68485790
Epoch: [ 0] [ 67/1093] time: 188.9013, d_loss: 1.41416049, g_loss: 0.69247150
Epoch: [ 0] [ 68/1093] time: 191.3907, d_loss: 1.44085050, g_loss: 0.70080090
Epoch: [ 0] [ 69/1093] time: 193.6596, d_loss: 1.42936659, g_loss: 0.70780182
Epoch: [ 0] [ 70/1093] time: 196.2392, d_loss: 1.39855242, g_loss: 0.68066621
Epoch: [ 0] [ 71/1093] time: 198.6732, d_loss: 1.39962685, g_loss: 0.68119228
Epoch: [ 0] [ 72/1093] time: 201.1359, d_loss: 1.39792156, g_loss: 0.68046838
Epoch: [ 0] [ 73/1093] time: 203.9913, d_loss: 1.40156364, g_loss: 0.68185544
Epoch: [ 0] [ 74/1093] time: 206.5057, d_loss: 1.40137339, g_loss: 0.68439347
Epoch: [ 0] [ 75/1093] time: 208.9730, d_loss: 1.39628625, g_loss: 0.68880224
Epoch: [ 0] [ 76/1093] time: 212.1802, d_loss: 1.39695120, g_loss: 0.69053137
Epoch: [ 0] [ 77/1093] time: 215.1069, d_loss: 1.39827728, g_loss: 0.67404974
Epoch: [ 0] [ 78/1093] time: 217.8231, d_loss: 1.39441288, g_loss: 0.68811285
Epoch: [ 0] [ 79/1093] time: 220.8017, d_loss: 1.39862061, g_loss: 0.68243313
Epoch: [ 0] [ 80/1093] time: 223.6711, d_loss: 1.39560962, g_loss: 0.68420863
Epoch: [ 0] [ 81/1093] time: 226.1243, d_loss: 1.39474165, g_loss: 0.68446684
Epoch: [ 0] [ 82/1093] time: 228.9125, d_loss: 1.39735079, g_loss: 0.68914992
Epoch: [ 0] [ 83/1093] time: 231.7087, d_loss: 1.40495729, g_loss: 0.67565703
Epoch: [ 0] [ 84/1093] time: 234.3499, d_loss: 1.40376186, g_loss: 0.68402076
Epoch: [ 0] [ 85/1093] time: 236.8927, d_loss: 1.39633703, g_loss: 0.67996454
Epoch: [ 0] [ 86/1093] time: 239.8556, d_loss: 1.40431571, g_loss: 0.68185967
Epoch: [ 0] [ 87/1093] time: 242.7527, d_loss: 1.40456629, g_loss: 0.68880403
Epoch: [ 0] [ 88/1093] time: 245.2765, d_loss: 1.39363539, g_loss: 0.68647277
Epoch: [ 0] [ 89/1093] time: 247.9097, d_loss: 1.39768720, g_loss: 0.68281728
Epoch: [ 0] [ 90/1093] time: 250.6797, d_loss: 1.40258384, g_loss: 0.69015211
Epoch: [ 0] [ 91/1093] time: 252.9605, d_loss: 1.41010988, g_loss: 0.69163489
Epoch: [ 0] [ 92/1093] time: 255.8331, d_loss: 1.39705300, g_loss: 0.67692769
Epoch: [ 0] [ 93/1093] time: 258.7976, d_loss: 1.41552734, g_loss: 0.69169050
Epoch: [ 0] [ 94/1093] time: 262.1104, d_loss: 1.39865696, g_loss: 0.68793559
Epoch: [ 0] [ 95/1093] time: 265.0370, d_loss: 1.40191650, g_loss: 0.68027002
Epoch: [ 0] [ 96/1093] time: 267.7568, d_loss: 1.40628874, g_loss: 0.67845261
Epoch: [ 0] [ 97/1093] time: 270.7154, d_loss: 1.40095508, g_loss: 0.68664324
Epoch: [ 0] [ 98/1093] time: 273.6299, d_loss: 1.41269326, g_loss: 0.68330830
Epoch: [ 0] [ 99/1093] time: 276.4041, d_loss: 1.41343331, g_loss: 0.69674391
[Sample] d_loss: 1.39404178, g_loss: 0.71861243
Epoch: [ 0] [ 100/1093] time: 279.9370, d_loss: 1.39926529, g_loss: 0.69326425
Epoch: [ 0] [ 101/1093] time: 282.8589, d_loss: 1.39894390, g_loss: 0.68361241
Epoch: [ 0] [ 102/1093] time: 285.4811, d_loss: 1.39818084, g_loss: 0.69090337
Epoch: [ 0] [ 103/1093] time: 287.6454, d_loss: 1.39627695, g_loss: 0.67909706
Epoch: [ 0] [ 104/1093] time: 290.3276, d_loss: 1.39514160, g_loss: 0.68727589
Epoch: [ 0] [ 105/1093] time: 293.5694, d_loss: 1.40148556, g_loss: 0.68616998
Epoch: [ 0] [ 106/1093] time: 296.7065, d_loss: 1.39823532, g_loss: 0.68184149
Epoch: [ 0] [ 107/1093] time: 299.5040, d_loss: 1.40077090, g_loss: 0.67544007
Epoch: [ 0] [ 108/1093] time: 302.5080, d_loss: 1.40159750, g_loss: 0.68739390
Epoch: [ 0] [ 109/1093] time: 305.3266, d_loss: 1.40064311, g_loss: 0.68674183
Epoch: [ 0] [ 110/1093] time: 308.2463, d_loss: 1.40190828, g_loss: 0.68489563
……
Epoch: [ 0] [ 190/1093] time: 535.9742, d_loss: 1.39696872, g_loss: 0.67972469
Epoch: [ 0] [ 191/1093] time: 538.4506, d_loss: 1.39499533, g_loss: 0.68089843
Epoch: [ 0] [ 192/1093] time: 541.1816, d_loss: 1.39483309, g_loss: 0.68199342
Epoch: [ 0] [ 193/1093] time: 544.6827, d_loss: 1.39154720, g_loss: 0.69034952
Epoch: [ 0] [ 194/1093] time: 548.6390, d_loss: 1.38941956, g_loss: 0.68652773
Epoch: [ 0] [ 195/1093] time: 551.9678, d_loss: 1.39027929, g_loss: 0.69264108
Epoch: [ 0] [ 196/1093] time: 555.3258, d_loss: 1.39162266, g_loss: 0.68833613
Epoch: [ 0] [ 197/1093] time: 558.5404, d_loss: 1.40050042, g_loss: 0.68856359
Epoch: [ 0] [ 198/1093] time: 561.3181, d_loss: 1.39854860, g_loss: 0.69332385
Epoch: [ 0] [ 199/1093] time: 563.8952, d_loss: 1.40790129, g_loss: 0.69219285
[Sample] d_loss: 1.39614487, g_loss: 0.70220172
Epoch: [ 0] [ 200/1093] time: 566.5791, d_loss: 1.39575028, g_loss: 0.68371403
Epoch: [ 0] [ 201/1093] time: 568.9093, d_loss: 1.39769495, g_loss: 0.68171024
Epoch: [ 0] [ 202/1093] time: 571.4728, d_loss: 1.40282321, g_loss: 0.67665672
Epoch: [ 0] [ 203/1093] time: 574.0684, d_loss: 1.40040171, g_loss: 0.68347836
Epoch: [ 0] [ 204/1093] time: 576.6086, d_loss: 1.40370631, g_loss: 0.67588425
Epoch: [ 0] [ 205/1093] time: 579.1860, d_loss: 1.40058494, g_loss: 0.67948377
Epoch: [ 0] [ 206/1093] time: 581.7698, d_loss: 1.40094650, g_loss: 0.68511415
Epoch: [ 0] [ 207/1093] time: 584.3541, d_loss: 1.39703560, g_loss: 0.68563807
Epoch: [ 0] [ 208/1093] time: 586.9515, d_loss: 1.39535570, g_loss: 0.69189703
Epoch: [ 0] [ 209/1093] time: 589.5623, d_loss: 1.39087117, g_loss: 0.68965638
Epoch: [ 0] [ 210/1093] time: 592.1490, d_loss: 1.39308906, g_loss: 0.68321383
……
Epoch: [ 0] [ 889/1093] time: 2314.8393, d_loss: 1.39859378, g_loss: 0.67322266
Epoch: [ 0] [ 890/1093] time: 2316.9278, d_loss: 1.39070845, g_loss: 0.68732977
Epoch: [ 0] [ 891/1093] time: 2319.3591, d_loss: 1.39387286, g_loss: 0.67873466
Epoch: [ 0] [ 892/1093] time: 2321.4178, d_loss: 1.39172828, g_loss: 0.68356216
Epoch: [ 0] [ 893/1093] time: 2323.4089, d_loss: 1.39842272, g_loss: 0.67815489
Epoch: [ 0] [ 894/1093] time: 2325.6301, d_loss: 1.39376366, g_loss: 0.68304271
Epoch: [ 0] [ 895/1093] time: 2328.0387, d_loss: 1.39139628, g_loss: 0.67735171
Epoch: [ 0] [ 896/1093] time: 2330.0398, d_loss: 1.39796066, g_loss: 0.67579186
Epoch: [ 0] [ 897/1093] time: 2332.2183, d_loss: 1.39888477, g_loss: 0.66883886
Epoch: [ 0] [ 898/1093] time: 2334.6396, d_loss: 1.39262605, g_loss: 0.67790604
Epoch: [ 0] [ 899/1093] time: 2336.6380, d_loss: 1.38774049, g_loss: 0.68282270
[Sample] d_loss: 1.38685536, g_loss: 0.70143592
Epoch: [ 0] [ 900/1093] time: 2339.1794, d_loss: 1.39559400, g_loss: 0.67823637
Epoch: [ 0] [ 901/1093] time: 2341.5979, d_loss: 1.39618373, g_loss: 0.67359304
Epoch: [ 0] [ 902/1093] time: 2343.6090, d_loss: 1.40060043, g_loss: 0.68315041
Epoch: [ 0] [ 903/1093] time: 2345.6101, d_loss: 1.38607645, g_loss: 0.68459594
Epoch: [ 0] [ 904/1093] time: 2347.6186, d_loss: 1.38612366, g_loss: 0.68465877
Epoch: [ 0] [ 905/1093] time: 2349.8598, d_loss: 1.38972747, g_loss: 0.68110597
Epoch: [ 0] [ 906/1093] time: 2352.2383, d_loss: 1.40021336, g_loss: 0.67477131
Epoch: [ 0] [ 907/1093] time: 2354.2594, d_loss: 1.38780701, g_loss: 0.68614316
Epoch: [ 0] [ 908/1093] time: 2356.4380, d_loss: 1.39729989, g_loss: 0.68168002
Epoch: [ 0] [ 909/1093] time: 2358.8492, d_loss: 1.39604807, g_loss: 0.68169260
Epoch: [ 0] [ 910/1093] time: 2360.8703, d_loss: 1.39347506, g_loss: 0.67698503
……
Epoch: [ 0] [ 990/1093] time: 2534.4882, d_loss: 1.38051999, g_loss: 0.68829250
Epoch: [ 0] [ 991/1093] time: 2536.8594, d_loss: 1.38707495, g_loss: 0.69181627
Epoch: [ 0] [ 992/1093] time: 2538.9105, d_loss: 1.39524150, g_loss: 0.68155080
Epoch: [ 0] [ 993/1093] time: 2540.9216, d_loss: 1.39088154, g_loss: 0.68005645
Epoch: [ 0] [ 994/1093] time: 2543.1603, d_loss: 1.38700223, g_loss: 0.68155348
Epoch: [ 0] [ 995/1093] time: 2545.5215, d_loss: 1.40298247, g_loss: 0.66744435
Epoch: [ 0] [ 996/1093] time: 2547.5300, d_loss: 1.40880179, g_loss: 0.66607797
Epoch: [ 0] [ 997/1093] time: 2549.5310, d_loss: 1.39295077, g_loss: 0.67571455
Epoch: [ 0] [ 998/1093] time: 2551.8797, d_loss: 1.39118791, g_loss: 0.68550998
Epoch: [ 0] [ 999/1093] time: 2554.1409, d_loss: 1.38995099, g_loss: 0.68077219
[Sample] d_loss: 1.39188242, g_loss: 0.69870007
Epoch: [ 0] [1000/1093] time: 2556.5095, d_loss: 1.38937902, g_loss: 0.68420708
Epoch: [ 0] [1001/1093] time: 2559.4411, d_loss: 1.38841224, g_loss: 0.67964196
Epoch: [ 0] [1002/1093] time: 2561.3995, d_loss: 1.39025033, g_loss: 0.68857718
Epoch: [ 0] [1003/1093] time: 2563.4106, d_loss: 1.38774192, g_loss: 0.68713319
Epoch: [ 0] [1004/1093] time: 2565.7818, d_loss: 1.38517952, g_loss: 0.69962525
Epoch: [ 0] [1005/1093] time: 2568.0208, d_loss: 1.39758313, g_loss: 0.68758988
Epoch: [ 0] [1006/1093] time: 2570.0219, d_loss: 1.39658952, g_loss: 0.69050717
Epoch: [ 0] [1007/1093] time: 2572.0104, d_loss: 1.39825773, g_loss: 0.67399806
Epoch: [ 0] [1008/1093] time: 2574.2516, d_loss: 1.39735007, g_loss: 0.68345094
Epoch: [ 0] [1009/1093] time: 2576.4203, d_loss: 1.39032114, g_loss: 0.67591566
Epoch: [ 0] [1010/1093] time: 2578.4213, d_loss: 1.39701056, g_loss: 0.67272741
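Notably, after the first few iterations d_loss settles near 1.39 and g_loss near 0.68 for the rest of the epoch. These values are consistent with the theoretical equilibrium of the standard sigmoid cross-entropy GAN losses when the discriminator outputs 0.5 everywhere: d_loss = -ln 0.5 - ln 0.5 = 2 ln 2 ≈ 1.386 and (for the non-saturating generator loss) g_loss = -ln 0.5 = ln 2 ≈ 0.693. A quick check:

```python
import math

# Discriminator sigmoid output at the GAN equilibrium: D(x) = D(G(z)) = 0.5
d = 0.5

# Standard GAN objectives (sigmoid cross-entropy):
#   d_loss = -log D(x) - log(1 - D(G(z)))
#   g_loss = -log D(G(z))        (non-saturating generator loss)
d_loss = -math.log(d) - math.log(1 - d)
g_loss = -math.log(d)

print(round(d_loss, 4))  # 1.3863, close to the logged d_loss ≈ 1.39
print(round(g_loss, 4))  # 0.6931, close to the logged g_loss ≈ 0.68
```

In other words, by the end of epoch 0 the discriminator is already near the point where it cannot distinguish real from generated digits on average, which is the regime DCGAN training aims for.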


Original article: https://www.xckfsq.com/news/show.html?id=3592