VARIABLE RATE IMAGE COMPRESSION WITH RECURRENT NEURAL NETWORKS: Code Reproduction Notes

First, the GitHub repository link; also see my earlier notes on understanding RNNs.
As usual, this post records the problems I ran into during the reproduction.

1. Set up the PyTorch environment

I'll skip the details of this step; see my earlier article on installing a PyTorch environment. If you are installing on a server, remember to choose the Linux build.
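Once the environment is set up, a quick sanity check confirms that PyTorch imports and whether the server's GPU is visible to it (a minimal sketch; it only reports, it does not install anything):

```python
def describe_torch():
    """Report the installed PyTorch version and CUDA visibility."""
    try:
        import torch
        return f"torch {torch.__version__}, cuda={torch.cuda.is_available()}"
    except ImportError:
        return "PyTorch is not installed in this environment"

print(describe_torch())
```

On a CPU-only machine `cuda=False` is expected; on the training server it should print `cuda=True` once the Linux CUDA build is installed correctly.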

2. Install the missing packages

pip install pyyaml
pip install tb-nightly
pip install colored
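Note that the pip package names above differ from the module names you import (`pyyaml` is imported as `yaml`, `tb-nightly` as `tensorboard`). A small check, assuming only that mapping, verifies all three are importable before training:

```python
import importlib.util

def module_available(module_name):
    """Return True if the module can be imported, without actually importing it."""
    return importlib.util.find_spec(module_name) is not None

# pip package name -> importable module name
packages = {"pyyaml": "yaml", "tb-nightly": "tensorboard", "colored": "colored"}

for pip_name, module in packages.items():
    status = "ok" if module_available(module) else "MISSING"
    print(f"{pip_name}: {status}")
```

Any line printing `MISSING` means the corresponding `pip install` still needs to be run in this environment.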

3. Use absolute paths for training

The training command given in the GitHub README is:
[Image: training command from the README]
In practice, I hit a path-not-found error while debugging, so I changed the paths to absolute ones.
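Instead of hard-coding an absolute path into the command, you can also resolve whatever path is passed on the command line into an absolute one inside the script, so training works from any working directory. A minimal sketch (the `resolve` helper is my own illustration, not part of the repo):

```python
import os

def resolve(path):
    """Expand ~ and turn a possibly-relative path into an absolute one."""
    return os.path.abspath(os.path.expanduser(path))

# e.g. turn the README's relative dataset path into an absolute one
print(resolve("./data/train"))
```

Applying `resolve()` to the dataset argument right after argument parsing avoids the path-not-found error without editing the command for every machine.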

4. Download the dataset

The dataset consists of 720p images taken from the YouTube-8M dataset. CSDN does not allow uploads larger than 1000 MB, so please download it yourself.
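Since the training set is just a folder of 720p frames, each frame is typically cropped into small fixed-size patches for training. A hedged sketch of such a random crop (the 32x32 patch size is the one used in the original paper; check the repo's data loader for its actual value):

```python
import numpy as np

def random_crop(image, patch=32, rng=None):
    """Randomly crop an (H, W, C) image array to a patch x patch square."""
    rng = rng if rng is not None else np.random.default_rng(0)
    h, w = image.shape[:2]
    top = int(rng.integers(0, h - patch + 1))
    left = int(rng.integers(0, w - patch + 1))
    return image[top:top + patch, left:left + patch]

# A placeholder 720p frame: 720 x 1280 x 3
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
print(random_crop(frame).shape)  # (32, 32, 3)
```

This also explains why the exact source frames matter less than their resolution: any folder of 720p images yields the same patch distribution shape.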

