DL | DenseNet: An introduction to the DenseNet algorithm (paper overview), architecture details, and example applications — a detailed illustrated guide


优秀就百褶裙 · 2022-09-19 14:14:22
Category: News


Contents

Introduction to the DenseNet algorithm (paper overview)

Detailed architecture of the DenseNet algorithm

3. DenseNet architectures for ImageNet

4. Experimental results

Example applications of the DenseNet algorithm



Introduction to the DenseNet algorithm (paper overview)

        The DenseNet algorithm, short for Densely Connected Convolutional Networks, borrows to some extent from ResNet; the paper received the CVPR 2017 Best Paper Award.

Abstract  
      Recent work has shown that convolutional networks can be substantially deeper, more accurate, and efficient to train if they contain shorter connections between layers close to the input and those close to the output. In this paper, we embrace this observation and introduce the Dense Convolutional Network (DenseNet), which connects each layer to every other layer in a feed-forward fashion. Whereas traditional convolutional networks with L layers have L connections—one between each layer and its subsequent layer—our network has L(L+1)/2 direct connections. For each layer, the feature-maps of all preceding layers are used as inputs, and its own feature-maps are used as inputs into all subsequent layers. DenseNets have several compelling advantages: they alleviate the vanishing-gradient problem, strengthen feature propagation, encourage feature reuse, and substantially reduce the number of parameters. We evaluate our proposed architecture on four highly competitive object recognition benchmark tasks (CIFAR-10, CIFAR-100, SVHN, and ImageNet). DenseNets obtain significant improvements over the state-of-the-art on most of them, whilst requiring less computation to achieve high performance. Code and pre-trained models are available at https://github.com/liuzhuang13/DenseNet.

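The abstract's count of L(L+1)/2 direct connections follows from summing 1 + 2 + ... + L: layer l receives one connection from each of its l predecessors (including the block input). A tiny plain-Python sketch of that arithmetic, with illustrative function names of my own:

```python
# A traditional chain of L layers has L connections; a dense block wires
# every earlier output (plus the block input) into every later layer,
# giving 1 + 2 + ... + L = L(L+1)/2 direct connections.

def chain_connections(num_layers: int) -> int:
    # Plain feed-forward chain: each layer connects only to the next.
    return num_layers

def dense_connections(num_layers: int) -> int:
    # Dense connectivity: layer l consumes all l preceding feature maps.
    return num_layers * (num_layers + 1) // 2

for L in (1, 5, 12):
    print(L, chain_connections(L), dense_connections(L))
# 1 1 1
# 5 5 15
# 12 12 78
```

The gap widens quadratically, which is why dense connectivity is applied within blocks rather than across the whole network.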

Conclusion  
      We proposed a new convolutional network architecture, which we refer to as Dense Convolutional Network (DenseNet). It introduces direct connections between any two layers with the same feature-map size. We showed that DenseNets scale naturally to hundreds of layers, while exhibiting no optimization difficulties. In our experiments, DenseNets tend to yield consistent improvement in accuracy with growing number of parameters, without any signs of performance degradation or overfitting. Under multiple settings, it achieved state-of-the-art results across several highly competitive datasets. Moreover, DenseNets require substantially fewer parameters and less computation to achieve state-of-the-art performances. Because we adopted hyperparameter settings optimized for residual networks in our study, we believe that further gains in accuracy of DenseNets may be obtained by more detailed tuning of hyperparameters and learning rate schedules.
       Whilst following a simple connectivity rule, DenseNets naturally integrate the properties of identity mappings, deep supervision, and diversified depth. They allow feature reuse throughout the networks and can consequently learn more compact and, according to our experiments, more accurate models. Because of their compact internal representations and reduced feature redundancy, DenseNets may be good feature extractors for various computer vision tasks that build on convolutional features, e.g., [4, 5]. We plan to study such feature transfer with DenseNets in future work.

Paper
Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger.
Densely Connected Convolutional Networks. CVPR 2017 (Best Paper Award).
https://arxiv.org/pdf/1608.06993.pdf

GitHub
https://github.com/liuzhuang13/DenseNet
       DenseNet is a network architecture where each layer is directly connected to every other layer in a feed-forward fashion (within each dense block). For each layer, the feature maps of all preceding layers are treated as separate inputs whereas its own feature maps are passed on as inputs to all subsequent layers. This connectivity pattern yields state-of-the-art accuracies on CIFAR10/100 (with or without data augmentation) and SVHN. On the large scale ILSVRC 2012 (ImageNet) dataset, DenseNet achieves a similar accuracy as ResNet, but using less than half the amount of parameters and roughly half the number of FLOPs.
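Because each layer consumes the concatenation of all earlier feature maps, the channel width inside a dense block grows by the growth rate k with every layer. A minimal plain-Python sketch of that bookkeeping (illustrative names, not the reference implementation):

```python
# Channel counts inside one dense block: every layer emits k ("growth
# rate") new feature maps and consumes the concatenation of the block
# input and all earlier layer outputs.

def dense_block_channels(k0: int, growth_rate: int, num_layers: int):
    """Return the input width seen by each layer and the block's final
    output width after concatenating everything produced so far."""
    inputs_seen = []
    channels = k0
    for _ in range(num_layers):
        inputs_seen.append(channels)   # this layer reads all features so far
        channels += growth_rate        # ...and appends k new feature maps
    return inputs_seen, channels

seen, out = dense_block_channels(k0=64, growth_rate=32, num_layers=6)
print(seen)  # [64, 96, 128, 160, 192, 224]
print(out)   # 256
```

Note how a small k (here 32) still yields a wide block output; this is the "collective knowledge" the layers share, and it is why each individual layer can stay narrow.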

Detailed architecture of the DenseNet algorithm


3. DenseNet architectures for ImageNet

The growth rate for all the networks is k = 32. Note that each “conv” layer shown in the table corresponds to the sequence BN-ReLU-Conv.
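The channel widths in the ImageNet table can be reproduced by simple arithmetic. As a hedged sketch for the DenseNet-121 column: blocks of (6, 12, 24, 16) layers with k = 32, an initial convolution producing 2k = 64 channels, and a transition layer (1x1 conv + pooling) after each of the first three blocks that halves the channel count (compression factor 0.5):

```python
# Walk the channel count through a DenseNet: each dense block adds
# layers * k channels; each transition layer compresses by theta.

def densenet_channels(block_sizes, growth_rate=32, init_channels=64, theta=0.5):
    channels = init_channels
    per_block = []
    for i, layers in enumerate(block_sizes):
        channels += layers * growth_rate      # dense block: +k per layer
        per_block.append(channels)
        if i < len(block_sizes) - 1:          # transition halves the width
            channels = int(channels * theta)
    return per_block, channels

per_block, final = densenet_channels((6, 12, 24, 16))  # DenseNet-121
print(per_block)  # [256, 512, 1024, 1024]
print(final)      # 1024 channels entering global pooling / the classifier
```

The same function with block sizes (6, 12, 32, 32) reproduces DenseNet-169's final width of 1664, which is a quick sanity check against the table.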

4. Experimental results

1. Results on CIFAR-10

2. Results on ImageNet

The top-1 and top-5 error rates on the ImageNet validation set, with single-crop / 10-crop testing.
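"Single-crop / 10-crop" refers to how the validation image is presented to the network: either one center crop, or ten crops (four corners plus center, each also horizontally mirrored) whose predicted probabilities are averaged before taking the arg-max. A small stand-in sketch of the averaging step, with toy probability vectors in place of real model outputs:

```python
# 10-crop evaluation: average the per-crop class probabilities, then
# predict the class with the highest mean probability.

def ten_crop_predict(per_crop_probs):
    """per_crop_probs: list of probability vectors, one per crop."""
    num_classes = len(per_crop_probs[0])
    avg = [sum(p[c] for p in per_crop_probs) / len(per_crop_probs)
           for c in range(num_classes)]
    return max(range(num_classes), key=lambda c: avg[c])

# Two crops disagree with the other eight; averaging still picks class 1.
probs = [[0.2, 0.7, 0.1]] * 8 + [[0.6, 0.3, 0.1]] * 2
print(ten_crop_predict(probs))  # 1
```

Averaging over crops smooths out crop-dependent mistakes, which is why 10-crop error rates in such tables are consistently lower than single-crop ones.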

Results on ImageNet: a DenseNet-based classifier needs only half the parameters of ResNet to reach the same classification accuracy on ImageNet.

 

Example applications of the DenseNet algorithm

To be updated……
