TF DCGAN: A Full Record of Training DCGAN on a Custom Dataset with TensorFlow and Generating Images



Table of Contents

Sample images from the training dataset

Output results

1. Results with default parameters

2. Results with different parameters: option=0 and option=1

Full record of the training process


Sample images from the training dataset

As an example, the training set consists of a large number of Japanese anime-style images collected from the web.
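
For context, the training script picks these images up from a dataset folder selected by the dataset and input_fname_pattern flags that appear in the log further down. Below is a minimal sketch of gathering such a folder and scaling pixels to the [-1, 1] range a DCGAN generator with a tanh output expects; the ./data/anime path, the *.jpg pattern, and the 96-pixel size are illustrative assumptions, not values recorded in the original run.

```python
import glob
import os

import numpy as np
from PIL import Image  # Pillow

# Assumed layout: ./data/<dataset>/<input_fname_pattern>, e.g. ./data/anime/*.jpg
dataset = "anime"                 # hypothetical dataset folder name
input_fname_pattern = "*.jpg"     # same role as the input_fname_pattern flag in the log
input_height = input_width = 96   # assumed source resolution

paths = sorted(glob.glob(os.path.join("./data", dataset, input_fname_pattern)))
print("found %d training images" % len(paths))

def load_image(path, height=input_height, width=input_width):
    """Read one image, resize it, and scale pixels from [0, 255] to [-1, 1]."""
    img = Image.open(path).convert("RGB").resize((width, height))
    return np.asarray(img, dtype=np.float32) / 127.5 - 1.0

if paths:
    batch = np.stack([load_image(p) for p in paths[:64]])  # one 64-image batch
    print(batch.shape, batch.min(), batch.max())
```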

Output results

1. Results with default parameters

(Sample grids saved during training: train_00_0099, train_00_0399, train_00_0599, train_00_0799, train_01_0099)
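
The file names above follow a train_<epoch>_<step> pattern and line up with the [Sample] lines that appear every 100 iterations in the log below: during training, a batch of generator outputs is periodically tiled into one grid image and written to the samples directory. Here is a self-contained sketch of that tiling step; the helper name and the exact naming scheme are assumptions inferred from the file names shown.

```python
import numpy as np
from PIL import Image  # Pillow

def save_sample_grid(samples, path, rows=8, cols=8):
    """Tile a (N, H, W, 3) batch of images in [-1, 1] into a rows x cols grid and save it."""
    n, h, w, c = samples.shape
    grid = np.zeros((rows * h, cols * w, c), dtype=np.float32)
    for idx in range(min(n, rows * cols)):
        r, col = divmod(idx, cols)
        grid[r * h:(r + 1) * h, col * w:(col + 1) * w] = samples[idx]
    grid = ((grid + 1.0) * 127.5).clip(0, 255).astype(np.uint8)
    Image.fromarray(grid).save(path)

# Hypothetical use inside the training loop, matching names like train_00_0099:
#   if step % 100 == 99:
#       samples = sess.run(sampler, feed_dict={z: sample_z})
#       save_sample_grid(samples, "./samples/train_%02d_%04d.png" % (epoch, step))
demo = np.random.uniform(-1, 1, size=(64, 48, 48, 3)).astype(np.float32)
save_sample_grid(demo, "demo_grid.png")
```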

2. Results with different parameters: option=0 and option=1

Images produced with the option=0 visualization method

Images produced with the option=1 visualization method
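
What option=0 and option=1 do exactly is decided by the visualize utility of the DCGAN implementation being used, so the following is only a rough, hedged sketch of the difference: option=0 saves a single grid from one random batch of latent vectors, while option=1 produces a series of grids in which one latent coordinate at a time is swept across a range of values. The generate callable below is a stand-in for the trained sampler, not the repository's code.

```python
import numpy as np

def visualize_sketch(generate, option, batch_size=64, z_dim=100):
    """Rough sketch of the two visualization modes (not the repo's exact implementation).

    `generate` maps a (batch_size, z_dim) latent batch to images; in the TF version this
    would wrap something like sess.run(dcgan.sampler, feed_dict={dcgan.z: z}).
    """
    if option == 0:
        # One grid from a single random batch of latent vectors.
        z = np.random.uniform(-0.5, 0.5, size=(batch_size, z_dim))
        return [generate(z)]
    # option == 1: sweep each latent dimension in turn, keeping the others fixed,
    # which shows what that single coordinate controls in the generated images.
    frames = []
    values = np.linspace(-1.0, 1.0, batch_size)
    for dim in range(z_dim):
        z = np.zeros((batch_size, z_dim))
        z[:, dim] = values
        frames.append(generate(z))
    return frames

# Stand-in generator so the sketch runs on its own.
rng = np.random.default_rng(0)
W = rng.standard_normal((100, 48 * 48 * 3))
generate = lambda z: np.tanh(z @ W).reshape(-1, 48, 48, 3)
frames = visualize_sketch(generate, option=1)
print(len(frames), frames[0].shape)  # 100 frames of shape (64, 48, 48, 3)
```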

Interpolation visualization in the latent space of the GAN model
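
The interpolation itself is simple: pick two latent vectors, walk along the straight line between them, and decode every intermediate point with the generator; smooth transitions between the decoded images suggest the model has learned a reasonably continuous latent space. A minimal sketch, again with a stand-in for the trained sampler:

```python
import numpy as np

def interpolate_latent(generate, z_start, z_end, steps=8):
    """Decode evenly spaced points on the segment between two latent vectors."""
    alphas = np.linspace(0.0, 1.0, steps)[:, None]       # (steps, 1)
    z_path = (1.0 - alphas) * z_start + alphas * z_end   # (steps, z_dim)
    return generate(z_path)                              # (steps, H, W, 3)

rng = np.random.default_rng(1)
z_dim = 100
z_a, z_b = rng.uniform(-1.0, 1.0, size=(2, z_dim))
W = rng.standard_normal((z_dim, 48 * 48 * 3))
generate = lambda z: np.tanh(z @ W).reshape(-1, 48, 48, 3)  # stand-in for the DCGAN sampler
frames = interpolate_latent(generate, z_a, z_b, steps=8)
print(frames.shape)  # (8, 48, 48, 3)
```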

Full record of the training process

Training ran from roughly 15:18 to 19:10 (the starting timestamp appears in the TensorFlow log below). The network architecture implied by the variable list and the meaning of the d_loss/g_loss columns are sketched after the log.

Start training...
{'batch_size': <absl.flags._flag.Flag object at 0x000002C943CD16A0>,
'beta1': <absl.flags._flag.Flag object at 0x000002C9463D5F60>,
'checkpoint_dir': <absl.flags._flag.Flag object at 0x000002C946422CC0>,
'crop': <absl.flags._flag.BooleanFlag object at 0x000002C946422E10>,
'dataset': <absl.flags._flag.Flag object at 0x000002C946422BA8>,
'epoch': <absl.flags._flag.Flag object at 0x000002C93CA90320>,
'h': <tensorflow.python.platform.app._HelpFlag object at 0x000002C946422EF0>,
'help': <tensorflow.python.platform.app._HelpFlag object at 0x000002C946422EF0>,
'helpfull': <tensorflow.python.platform.app._HelpfullFlag object at 0x000002C946422F60>,
'helpshort': <tensorflow.python.platform.app._HelpshortFlag object at 0x000002C946422FD0>,
'input_fname_pattern': <absl.flags._flag.Flag object at 0x000002C946422C18>,
'input_height': <absl.flags._flag.Flag object at 0x000002C943CD1B38>,
'input_width': <absl.flags._flag.Flag object at 0x000002C946422A20>,
'learning_rate': <absl.flags._flag.Flag object at 0x000002C93E5E7DA0>,
'output_height': <absl.flags._flag.Flag object at 0x000002C946422A90>,
'output_width': <absl.flags._flag.Flag object at 0x000002C946422B38>,
'sample_dir': <absl.flags._flag.Flag object at 0x000002C946422D30>,
'train': <absl.flags._flag.BooleanFlag object at 0x000002C946422D68>,
'train_size': <absl.flags._flag.Flag object at 0x000002C943CD10F0>,
'visualize': <absl.flags._flag.BooleanFlag object at 0x000002C946422E80>}
2018-10-06 15:18:41.635062: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2
---------
Variables: name (type shape) [size]
---------
generator/g_h0_lin/Matrix:0 (float32_ref 100x4608) [460800, bytes: 1843200]
generator/g_h0_lin/bias:0 (float32_ref 4608) [4608, bytes: 18432]
generator/g_bn0/beta:0 (float32_ref 512) [512, bytes: 2048]
generator/g_bn0/gamma:0 (float32_ref 512) [512, bytes: 2048]
generator/g_h1/w:0 (float32_ref 5x5x256x512) [3276800, bytes: 13107200]
generator/g_h1/biases:0 (float32_ref 256) [256, bytes: 1024]
generator/g_bn1/beta:0 (float32_ref 256) [256, bytes: 1024]
generator/g_bn1/gamma:0 (float32_ref 256) [256, bytes: 1024]
generator/g_h2/w:0 (float32_ref 5x5x128x256) [819200, bytes: 3276800]
generator/g_h2/biases:0 (float32_ref 128) [128, bytes: 512]
generator/g_bn2/beta:0 (float32_ref 128) [128, bytes: 512]
generator/g_bn2/gamma:0 (float32_ref 128) [128, bytes: 512]
generator/g_h3/w:0 (float32_ref 5x5x64x128) [204800, bytes: 819200]
generator/g_h3/biases:0 (float32_ref 64) [64, bytes: 256]
generator/g_bn3/beta:0 (float32_ref 64) [64, bytes: 256]
generator/g_bn3/gamma:0 (float32_ref 64) [64, bytes: 256]
generator/g_h4/w:0 (float32_ref 5x5x3x64) [4800, bytes: 19200]
generator/g_h4/biases:0 (float32_ref 3) [3, bytes: 12]
discriminator/d_h0_conv/w:0 (float32_ref 5x5x3x64) [4800, bytes: 19200]
discriminator/d_h0_conv/biases:0 (float32_ref 64) [64, bytes: 256]
discriminator/d_h1_conv/w:0 (float32_ref 5x5x64x128) [204800, bytes: 819200]
discriminator/d_h1_conv/biases:0 (float32_ref 128) [128, bytes: 512]
discriminator/d_bn1/beta:0 (float32_ref 128) [128, bytes: 512]
discriminator/d_bn1/gamma:0 (float32_ref 128) [128, bytes: 512]
discriminator/d_h2_conv/w:0 (float32_ref 5x5x128x256) [819200, bytes: 3276800]
discriminator/d_h2_conv/biases:0 (float32_ref 256) [256, bytes: 1024]
discriminator/d_bn2/beta:0 (float32_ref 256) [256, bytes: 1024]
discriminator/d_bn2/gamma:0 (float32_ref 256) [256, bytes: 1024]
discriminator/d_h3_conv/w:0 (float32_ref 5x5x256x512) [3276800, bytes: 13107200]
discriminator/d_h3_conv/biases:0 (float32_ref 512) [512, bytes: 2048]
discriminator/d_bn3/beta:0 (float32_ref 512) [512, bytes: 2048]
discriminator/d_bn3/gamma:0 (float32_ref 512) [512, bytes: 2048]
discriminator/d_h4_lin/Matrix:0 (float32_ref 4608x1) [4608, bytes: 18432]
discriminator/d_h4_lin/bias:0 (float32_ref 1) [1, bytes: 4]
Total size of variables: 9086340
Total bytes of variables: 36345360
[*] Reading checkpoints...
[*] Failed to find a checkpoint
[!] Load failed...
Epoch: [ 0] [ 0/ 800] time: 14.9779, d_loss: 5.05348301, g_loss: 0.00766894
Epoch: [ 0] [ 1/ 800] time: 28.0542, d_loss: 4.82881641, g_loss: 0.01297333
Epoch: [ 0] [ 2/ 800] time: 40.2559, d_loss: 3.48951864, g_loss: 0.07677600
Epoch: [ 0] [ 3/ 800] time: 53.2987, d_loss: 4.46177912, g_loss: 0.01912572
Epoch: [ 0] [ 4/ 800] time: 66.6449, d_loss: 3.76898527, g_loss: 0.06732680
Epoch: [ 0] [ 5/ 800] time: 80.2566, d_loss: 3.12670279, g_loss: 0.12792118
Epoch: [ 0] [ 6/ 800] time: 94.6307, d_loss: 3.61706448, g_loss: 0.05859204
Epoch: [ 0] [ 7/ 800] time: 108.9309, d_loss: 2.67836666, g_loss: 0.26883626
Epoch: [ 0] [ 8/ 800] time: 122.1341, d_loss: 3.90734839, g_loss: 0.05641707
Epoch: [ 0] [ 9/ 800] time: 135.7154, d_loss: 1.87382483, g_loss: 1.13096261
Epoch: [ 0] [ 10/ 800] time: 148.9689, d_loss: 6.14149714, g_loss: 0.00330601
……
Epoch: [ 0] [ 80/ 800] time: 1174.5982, d_loss: 2.07529640, g_loss: 0.39124209
Epoch: [ 0] [ 81/ 800] time: 1192.4455, d_loss: 2.01820517, g_loss: 0.43641573
Epoch: [ 0] [ 82/ 800] time: 1210.1161, d_loss: 2.14325690, g_loss: 0.41077107
Epoch: [ 0] [ 83/ 800] time: 1226.0585, d_loss: 2.06479096, g_loss: 0.49251628
Epoch: [ 0] [ 84/ 800] time: 1242.0143, d_loss: 2.23370504, g_loss: 0.43198395
Epoch: [ 0] [ 85/ 800] time: 1257.2267, d_loss: 2.12133884, g_loss: 0.49163312
Epoch: [ 0] [ 86/ 800] time: 1272.5151, d_loss: 2.12812853, g_loss: 0.45083773
Epoch: [ 0] [ 87/ 800] time: 1289.7231, d_loss: 1.85827374, g_loss: 0.54915452
Epoch: [ 0] [ 88/ 800] time: 1305.7893, d_loss: 1.75407577, g_loss: 0.59886670
Epoch: [ 0] [ 89/ 800] time: 1324.8202, d_loss: 1.92280674, g_loss: 0.43640304
Epoch: [ 0] [ 90/ 800] time: 1342.7920, d_loss: 1.90137959, g_loss: 0.45802355
Epoch: [ 0] [ 91/ 800] time: 1361.9827, d_loss: 1.85933983, g_loss: 0.47512102
Epoch: [ 0] [ 92/ 800] time: 1376.7853, d_loss: 1.83109379, g_loss: 0.53952801
Epoch: [ 0] [ 93/ 800] time: 1391.9553, d_loss: 1.89624429, g_loss: 0.48314875
Epoch: [ 0] [ 94/ 800] time: 1405.7957, d_loss: 1.95725751, g_loss: 0.50201762
Epoch: [ 0] [ 95/ 800] time: 1419.8575, d_loss: 2.04467034, g_loss: 0.47200602
Epoch: [ 0] [ 96/ 800] time: 1432.6235, d_loss: 1.86375761, g_loss: 0.63056684
Epoch: [ 0] [ 97/ 800] time: 1446.1109, d_loss: 1.75833380, g_loss: 0.68587345
Epoch: [ 0] [ 98/ 800] time: 1459.7021, d_loss: 1.61311054, g_loss: 0.56521410
Epoch: [ 0] [ 99/ 800] time: 1473.4438, d_loss: 1.63083386, g_loss: 0.55198652
[Sample] d_loss: 1.56934571, g_loss: 0.58893394
Epoch: [ 0] [ 100/ 800] time: 1490.8011, d_loss: 2.02212882, g_loss: 0.38942879
Epoch: [ 0] [ 101/ 800] time: 1504.8573, d_loss: 2.08615398, g_loss: 0.41869015
Epoch: [ 0] [ 102/ 800] time: 1520.3561, d_loss: 1.94494843, g_loss: 0.52331185
Epoch: [ 0] [ 103/ 800] time: 1534.8911, d_loss: 1.68799090, g_loss: 0.57893807
Epoch: [ 0] [ 104/ 800] time: 1550.2059, d_loss: 1.73278153, g_loss: 0.55513334
Epoch: [ 0] [ 105/ 800] time: 1564.4857, d_loss: 1.66107357, g_loss: 0.58009803
Epoch: [ 0] [ 106/ 800] time: 1577.7365, d_loss: 1.62651777, g_loss: 0.68608046
Epoch: [ 0] [ 107/ 800] time: 1591.2906, d_loss: 1.68899119, g_loss: 0.64795619
Epoch: [ 0] [ 108/ 800] time: 1604.4354, d_loss: 1.64453030, g_loss: 0.66518682
Epoch: [ 0] [ 109/ 800] time: 1618.1593, d_loss: 1.56328249, g_loss: 0.66451979
Epoch: [ 0] [ 110/ 800] time: 1633.1294, d_loss: 1.51543558, g_loss: 0.77611113
……
Epoch: [ 0] [ 160/ 800] time: 2385.2872, d_loss: 1.92123890, g_loss: 0.45402479
Epoch: [ 0] [ 161/ 800] time: 2400.4567, d_loss: 1.78833413, g_loss: 0.53086638
Epoch: [ 0] [ 162/ 800] time: 2415.2647, d_loss: 1.57849348, g_loss: 0.71513641
Epoch: [ 0] [ 163/ 800] time: 2429.8398, d_loss: 1.67605543, g_loss: 0.65658081
Epoch: [ 0] [ 164/ 800] time: 2447.2616, d_loss: 1.41697562, g_loss: 0.69170052
Epoch: [ 0] [ 165/ 800] time: 2462.9209, d_loss: 1.37472379, g_loss: 0.81910974
Epoch: [ 0] [ 166/ 800] time: 2479.5134, d_loss: 1.52106404, g_loss: 0.65593958
Epoch: [ 0] [ 167/ 800] time: 2499.4337, d_loss: 1.48481750, g_loss: 0.56352514
Epoch: [ 0] [ 168/ 800] time: 2515.0022, d_loss: 1.51672626, g_loss: 0.61658454
Epoch: [ 0] [ 169/ 800] time: 2529.4996, d_loss: 1.60589409, g_loss: 0.63836646
Epoch: [ 0] [ 170/ 800] time: 2543.3981, d_loss: 1.44772625, g_loss: 0.65181255
……
Epoch: [ 0] [ 190/ 800] time: 2825.9758, d_loss: 1.47412062, g_loss: 0.54513580
Epoch: [ 0] [ 191/ 800] time: 2838.9723, d_loss: 1.55055904, g_loss: 0.58368361
Epoch: [ 0] [ 192/ 800] time: 2852.2630, d_loss: 1.59510207, g_loss: 0.66829801
Epoch: [ 0] [ 193/ 800] time: 2866.4205, d_loss: 1.46519923, g_loss: 0.61558247
Epoch: [ 0] [ 194/ 800] time: 2879.9993, d_loss: 1.32191777, g_loss: 0.80541551
Epoch: [ 0] [ 195/ 800] time: 2893.4340, d_loss: 1.01147175, g_loss: 1.06913197
Epoch: [ 0] [ 196/ 800] time: 2906.5733, d_loss: 0.93962598, g_loss: 0.83171976
Epoch: [ 0] [ 197/ 800] time: 2920.1912, d_loss: 1.17017913, g_loss: 0.67285419
Epoch: [ 0] [ 198/ 800] time: 2933.5356, d_loss: 1.59560084, g_loss: 0.56722575
Epoch: [ 0] [ 199/ 800] time: 2947.0078, d_loss: 1.79016471, g_loss: 0.63441348
[Sample] d_loss: 1.81597352, g_loss: 0.72201991
Epoch: [ 0] [ 200/ 800] time: 2962.8138, d_loss: 1.84360504, g_loss: 0.68355072
Epoch: [ 0] [ 201/ 800] time: 2976.0156, d_loss: 1.79623175, g_loss: 0.82725859
Epoch: [ 0] [ 202/ 800] time: 2990.1701, d_loss: 1.84564495, g_loss: 0.36759761
Epoch: [ 0] [ 203/ 800] time: 3003.2376, d_loss: 1.33034515, g_loss: 1.12043190
Epoch: [ 0] [ 204/ 800] time: 3016.9012, d_loss: 1.43244946, g_loss: 0.60710204
Epoch: [ 0] [ 205/ 800] time: 3031.2064, d_loss: 1.77543664, g_loss: 0.37925830
Epoch: [ 0] [ 206/ 800] time: 3044.6623, d_loss: 1.38716245, g_loss: 0.79690325
Epoch: [ 0] [ 207/ 800] time: 3058.6295, d_loss: 1.41732562, g_loss: 0.71504021
Epoch: [ 0] [ 208/ 800] time: 3075.1982, d_loss: 1.48065066, g_loss: 0.58098531
Epoch: [ 0] [ 209/ 800] time: 3092.2044, d_loss: 1.39409590, g_loss: 0.85311776
Epoch: [ 0] [ 210/ 800] time: 3106.7110, d_loss: 1.55829871, g_loss: 0.71159673
……
Epoch: [ 0] [ 250/ 800] time: 3706.7694, d_loss: 1.48207712, g_loss: 0.62254345
Epoch: [ 0] [ 251/ 800] time: 3722.4864, d_loss: 1.43726230, g_loss: 0.59676802
Epoch: [ 0] [ 252/ 800] time: 3739.1110, d_loss: 1.39565313, g_loss: 0.61483824
Epoch: [ 0] [ 253/ 800] time: 3753.9008, d_loss: 1.64175820, g_loss: 0.55743980
Epoch: [ 0] [ 254/ 800] time: 3768.4591, d_loss: 2.25337219, g_loss: 0.39440048
Epoch: [ 0] [ 255/ 800] time: 3784.3170, d_loss: 2.21880293, g_loss: 0.43557072
Epoch: [ 0] [ 256/ 800] time: 3799.8508, d_loss: 1.92927480, g_loss: 0.60396165
Epoch: [ 0] [ 257/ 800] time: 3819.0884, d_loss: 1.54789436, g_loss: 0.62363708
Epoch: [ 0] [ 258/ 800] time: 3835.9283, d_loss: 1.45292878, g_loss: 0.78123999
Epoch: [ 0] [ 259/ 800] time: 3851.7583, d_loss: 1.38242722, g_loss: 0.71697128
Epoch: [ 0] [ 260/ 800] time: 3867.8912, d_loss: 1.42830288, g_loss: 0.72657067
……
Epoch: [ 0] [ 290/ 800] time: 4347.6360, d_loss: 1.51859045, g_loss: 0.63133144
Epoch: [ 0] [ 291/ 800] time: 4362.6835, d_loss: 1.51562345, g_loss: 0.63072002
Epoch: [ 0] [ 292/ 800] time: 4376.7609, d_loss: 1.51966012, g_loss: 0.68376446
Epoch: [ 0] [ 293/ 800] time: 4391.5809, d_loss: 1.46159744, g_loss: 0.77321720
Epoch: [ 0] [ 294/ 800] time: 4405.9471, d_loss: 1.51635325, g_loss: 0.64838612
Epoch: [ 0] [ 295/ 800] time: 4421.1065, d_loss: 1.63491082, g_loss: 0.59127223
Epoch: [ 0] [ 296/ 800] time: 4436.1505, d_loss: 1.56633282, g_loss: 0.63173258
Epoch: [ 0] [ 297/ 800] time: 4451.4322, d_loss: 1.73018694, g_loss: 0.64139992
Epoch: [ 0] [ 298/ 800] time: 4466.8813, d_loss: 1.60332918, g_loss: 0.64779305
Epoch: [ 0] [ 299/ 800] time: 4482.4206, d_loss: 1.30365634, g_loss: 0.69317293
[Sample] d_loss: 1.52858722, g_loss: 0.66097701
Epoch: [ 0] [ 300/ 800] time: 4501.5354, d_loss: 1.54065537, g_loss: 0.61486077
Epoch: [ 0] [ 301/ 800] time: 4517.9595, d_loss: 1.40912437, g_loss: 0.62744296
Epoch: [ 0] [ 302/ 800] time: 4532.8548, d_loss: 1.83548975, g_loss: 0.48546115
Epoch: [ 0] [ 303/ 800] time: 4548.5219, d_loss: 1.78749907, g_loss: 0.54208493
Epoch: [ 0] [ 304/ 800] time: 4565.8423, d_loss: 1.59532309, g_loss: 0.70925272
Epoch: [ 0] [ 305/ 800] time: 4582.6995, d_loss: 1.55741489, g_loss: 0.69813800
Epoch: [ 0] [ 306/ 800] time: 4598.1985, d_loss: 1.46890306, g_loss: 0.65037167
Epoch: [ 0] [ 307/ 800] time: 4613.3077, d_loss: 1.47391725, g_loss: 0.66135353
Epoch: [ 0] [ 308/ 800] time: 4628.6944, d_loss: 1.47143006, g_loss: 0.68910688
Epoch: [ 0] [ 309/ 800] time: 4643.7088, d_loss: 1.49028301, g_loss: 0.67232418
Epoch: [ 0] [ 310/ 800] time: 4659.5347, d_loss: 1.59941697, g_loss: 0.67055005
……
Epoch: [ 0] [ 350/ 800] time: 5263.0389, d_loss: 1.52133381, g_loss: 0.66190934
Epoch: [ 0] [ 351/ 800] time: 5277.8903, d_loss: 1.50694644, g_loss: 0.57145911
Epoch: [ 0] [ 352/ 800] time: 5292.1188, d_loss: 1.70610642, g_loss: 0.49781984
Epoch: [ 0] [ 353/ 800] time: 5306.9107, d_loss: 1.77215934, g_loss: 0.58978939
Epoch: [ 0] [ 354/ 800] time: 5321.5938, d_loss: 1.74831009, g_loss: 0.67320079
Epoch: [ 0] [ 355/ 800] time: 5336.4302, d_loss: 1.59669852, g_loss: 0.68336225
Epoch: [ 0] [ 356/ 800] time: 5351.4221, d_loss: 1.46689534, g_loss: 0.84482712
Epoch: [ 0] [ 357/ 800] time: 5367.1353, d_loss: 1.38674009, g_loss: 0.78510588
Epoch: [ 0] [ 358/ 800] time: 5384.3114, d_loss: 1.30605173, g_loss: 0.85381281
Epoch: [ 0] [ 359/ 800] time: 5398.4569, d_loss: 1.29629779, g_loss: 0.81868672
Epoch: [ 0] [ 360/ 800] time: 5413.3162, d_loss: 1.21817279, g_loss: 0.80424130
Epoch: [ 0] [ 361/ 800] time: 5427.5560, d_loss: 1.35527205, g_loss: 0.67310977
Epoch: [ 0] [ 362/ 800] time: 5441.6695, d_loss: 1.40627885, g_loss: 0.67996454
Epoch: [ 0] [ 363/ 800] time: 5459.0163, d_loss: 1.33116567, g_loss: 0.73797810
Epoch: [ 0] [ 364/ 800] time: 5478.6128, d_loss: 1.29250467, g_loss: 0.82915306
Epoch: [ 0] [ 365/ 800] time: 5495.0862, d_loss: 1.37827444, g_loss: 0.73634720
Epoch: [ 0] [ 366/ 800] time: 5514.7329, d_loss: 1.35434794, g_loss: 0.60365015
Epoch: [ 0] [ 367/ 800] time: 5529.6542, d_loss: 1.53991985, g_loss: 0.62364745
Epoch: [ 0] [ 368/ 800] time: 5543.7427, d_loss: 1.72570002, g_loss: 0.62098628
Epoch: [ 0] [ 369/ 800] time: 5561.1792, d_loss: 1.73738861, g_loss: 0.55012739
Epoch: [ 0] [ 370/ 800] time: 5575.9147, d_loss: 1.58512247, g_loss: 0.55001098
Epoch: [ 0] [ 371/ 800] time: 5592.3616, d_loss: 1.59266281, g_loss: 0.69175625
……
Epoch: [ 0] [ 399/ 800]
……
Epoch: [ 0] [ 499/ 800]
……
Epoch: [ 0] [ 599/ 800]
……
Epoch: [ 0] [ 699/ 800]
……
Epoch: [ 0] [ 799/ 800]
……
Epoch: [ 1] [ 99/ 800]
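
The variable list near the top of the log pins down the network that was trained: the generator projects the 100-dimensional z to 4608 units, reshapes them to 3x3x512, and then applies four 5x5 transposed convolutions that halve the channel count and double the spatial size at each step (3 to 6 to 12 to 24 to 48), ending in a 48x48x3 image; the discriminator mirrors this with strided 5x5 convolutions and a final 4608-to-1 linear layer. Below is a TF1-style sketch of a generator with those shapes; the scope and layer names are illustrative, not the exact graph the repository builds.

```python
import tensorflow as tf  # TF1-era API, as in the log; under TF2 use tf.compat.v1

def generator_sketch(z, training=True):
    """Generator matching the shapes in the log: 100 -> 3x3x512 -> ... -> 48x48x3."""
    with tf.variable_scope("generator"):
        h = tf.layers.dense(z, 3 * 3 * 512)                                   # g_h0_lin: 100x4608
        h = tf.reshape(h, [-1, 3, 3, 512])
        h = tf.nn.relu(tf.layers.batch_normalization(h, training=training))   # g_bn0
        for filters in (256, 128, 64):                                        # g_h1..g_h3: 5x5 deconvs
            h = tf.layers.conv2d_transpose(h, filters, 5, strides=2, padding="same")
            h = tf.nn.relu(tf.layers.batch_normalization(h, training=training))
        h = tf.layers.conv2d_transpose(h, 3, 5, strides=2, padding="same")    # g_h4: 48x48x3 output
        return tf.nn.tanh(h)

z = tf.placeholder(tf.float32, [None, 100])
fake_images = generator_sketch(z)
print(fake_images.shape)  # (?, 48, 48, 3)
```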
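The d_loss and g_loss columns are the two standard DCGAN objectives: the discriminator is pushed to output 1 on real images and 0 on generated ones, while the generator is pushed to make the discriminator output 1 on its samples (the non-saturating form). Here is a sketch of those two terms with sigmoid cross-entropy; the learning_rate and beta1 flags from the dump above would then feed an Adam optimizer for each network.

```python
import tensorflow as tf  # TF1-era API, as in the log

def gan_losses(d_logits_real, d_logits_fake):
    """The d_loss / g_loss pair reported on every line of the training log."""
    bce = tf.nn.sigmoid_cross_entropy_with_logits
    d_loss_real = tf.reduce_mean(bce(logits=d_logits_real, labels=tf.ones_like(d_logits_real)))
    d_loss_fake = tf.reduce_mean(bce(logits=d_logits_fake, labels=tf.zeros_like(d_logits_fake)))
    d_loss = d_loss_real + d_loss_fake   # discriminator: real -> 1, fake -> 0
    g_loss = tf.reduce_mean(bce(logits=d_logits_fake, labels=tf.ones_like(d_logits_fake)))
    return d_loss, g_loss                # generator: make the discriminator say "real"

# Hypothetical optimizer wiring, one Adam per network restricted to its own variables:
#   d_optim = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(d_loss, var_list=d_vars)
#   g_optim = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(g_loss, var_list=g_vars)
```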
