Network Compression Papers: An Organized Reading List (network compression)
1. Parameter pruning and sharing
1.1 Quantization and Binarization
Compressing deep convolutional networks using vector quantization
Binaryconnect: Training deep neural networks with binary weights during propagations
Binarynet: Training deep neural networks with weights and activations constrained to +1 or -1
Xnor-net: Imagenet classification using binary convolutional neural networks
Deep neural networks are robust to weight binarization and other non-linear distortions
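As a rough illustration of what this binarization line of work does, the sketch below binarizes a weight tensor to its sign with a per-output-channel scale, roughly in the spirit of BinaryConnect/XNOR-Net; the function name and the inference-only usage are illustrative assumptions, not code from any of the papers above.

```python
import torch

def binarize_weights(w: torch.Tensor) -> torch.Tensor:
    # W ≈ alpha * sign(W): keep only the sign of each weight plus one
    # scaling factor per output channel (XNOR-Net-style scale).
    alpha = w.abs().mean(dim=tuple(range(1, w.dim())), keepdim=True)
    return alpha * torch.sign(w)

# Illustrative use on a conv layer's weights at inference time:
# conv.weight.data.copy_(binarize_weights(conv.weight.data))
```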
1.2 Pruning and Sharing
Comparing biases for minimal network construction with back-propagation
Second order derivatives for network pruning: Optimal brain surgeon
Learning both weights and connections for efficient neural networks
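The pruning-and-sharing papers above share the basic prune-then-retrain idea; a minimal magnitude-pruning sketch (the function name and per-tensor sparsity target are illustrative assumptions) looks like this:

```python
import torch

def magnitude_prune(w: torch.Tensor, sparsity: float = 0.9):
    # Zero out the smallest-magnitude entries of one weight tensor and return
    # the binary mask, so pruned connections can be held at zero during retraining.
    k = int(sparsity * w.numel())
    if k == 0:
        return w, torch.ones_like(w, dtype=torch.bool)
    threshold = w.abs().flatten().kthvalue(k).values
    mask = w.abs() > threshold
    return w * mask, mask
```

In the Han et al. pipeline the pruning is iterative, with retraining between rounds, rather than a single pass.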
1.3 Designing Structural Matrix
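No papers are listed under this heading; a representative idea in this family is the circulant projection, where a dense d×d layer is stored as a single length-d vector and applied with FFTs. The NumPy sketch below only illustrates that matrix-vector product (names are illustrative).

```python
import numpy as np

def circulant_matvec(c: np.ndarray, x: np.ndarray) -> np.ndarray:
    # C @ x for the circulant matrix C whose first column is c:
    # a circular convolution via FFT, so storage is d instead of d*d parameters.
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

# Sanity check against the explicit circulant matrix.
d = 8
c, x = np.random.randn(d), np.random.randn(d)
C = np.stack([np.roll(c, j) for j in range(d)], axis=1)  # column j = c shifted by j
assert np.allclose(C @ x, circulant_matvec(c, x))
```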
2. Low rank factorization and sparsity
Exploiting linear structure within convolutional networks for efficient evaluation
Speeding up convolutional neural networks with low rank expansions
Speeding-up convolutional neural networks using fine-tuned CP-decomposition
Low-rank matrix factorization for deep neural network training with high-dimensional output targets
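The common thread in these low-rank papers is to replace one big linear map with a product of thinner ones; a minimal truncated-SVD sketch for a fully connected layer (the rank and shapes are arbitrary illustrative choices) is:

```python
import numpy as np

def low_rank_factorize(W: np.ndarray, rank: int):
    # Approximate W (m x n) as A @ B with A (m x r) and B (r x n), so applying
    # the layer costs r*(m + n) multiplies instead of m*n.
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]   # fold singular values into the left factor
    B = Vt[:rank, :]
    return A, B

W = np.random.randn(512, 1024)
A, B = low_rank_factorize(W, rank=64)
print(W.size, A.size + B.size)   # 524288 dense weights vs. 98304 after factorization
```

The CP-decomposition paper plays the same game on 4D convolution kernels rather than 2D matrices, followed by fine-tuning to recover accuracy.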
3. Transferred/compact convolution filters
Understanding and improving convolutional neural networks via concatenated rectified linear units
Inception-v4, inception-resnet and the impact of residual connections on learning
SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
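Several of these compact-architecture papers rely on factorized convolutions; the block below is a minimal PyTorch sketch of the MobileNets depthwise-separable unit (BatchNorm/ReLU placement follows the common pattern and is an assumption, not copied from the paper's code).

```python
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    # One 3x3 filter per input channel (depthwise) followed by a 1x1 conv that
    # mixes channels (pointwise), replacing a single dense 3x3 convolution.
    def __init__(self, in_ch: int, out_ch: int, stride: int = 1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, stride=stride,
                                   padding=1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn1, self.bn2 = nn.BatchNorm2d(in_ch), nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.relu(self.bn1(self.depthwise(x)))
        return self.relu(self.bn2(self.pointwise(x)))
```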
4. Knowledge distillation
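No individual papers are listed here, but the canonical formulation is Hinton-style distillation: train a small student against the teacher's temperature-softened outputs alongside the usual hard-label loss. A minimal sketch (temperature T and mixing weight alpha are illustrative hyperparameters):

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2, blended with ordinary cross-entropy on the hard labels.
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction='batchmean') * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```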
5. Other
Outrageously large neural networks: The sparsely-gated mixture-of-experts layer
Deep dynamic neural networks for multimodal gesture segmentation and recognition
Deep pyramidal residual networks with separated stochastic depth
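Of these "Other" entries, the sparsely-gated mixture-of-experts layer is the most self-contained to sketch: a gate scores all experts, keeps only the top-k per input, and renormalizes, so most experts are never evaluated. A minimal version that omits the paper's noise term and load-balancing loss:

```python
import torch
import torch.nn.functional as F

def top_k_gating(x: torch.Tensor, w_gate: torch.Tensor, k: int = 2) -> torch.Tensor:
    # x: [batch, d_model], w_gate: [d_model, n_experts]
    logits = x @ w_gate
    topk_vals, topk_idx = logits.topk(k, dim=-1)
    masked = torch.full_like(logits, float('-inf')).scatter(-1, topk_idx, topk_vals)
    return F.softmax(masked, dim=-1)   # exactly zero weight outside the top-k experts
```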
6. Survey