Contrastive Learning, Note 4.
The algorithm uses a two-stage training strategy:
- Stage 1: self-supervised (contrastive) pretraining
- Stage 2: supervised classification
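The two-stage pattern can be sketched on toy data. This is a minimal numpy illustration, not the paper's method: PCA stands in for the self-supervised encoder, and a small logistic-regression head is trained on top of the frozen features. All data and shapes here are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for labeled pixels: two well-separated 8-D blobs.
X = np.vstack([rng.normal(-1, 0.3, (50, 8)), rng.normal(1, 0.3, (50, 8))])
y = np.array([0] * 50 + [1] * 50)

# Stage 1 (stand-in for contrastive pretraining): learn an "encoder"
# without using any labels -- here, the top-2 PCA directions.
Xc = X - X.mean(0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
encoder = Vt[:2].T                      # (8, 2) frozen projection

# Stage 2: freeze the encoder, fit a classifier head on its features.
Z = Xc @ encoder
w, b = np.zeros(2), 0.0
for _ in range(200):                    # plain logistic regression via GD
    p = 1 / (1 + np.exp(-(Z @ w + b)))
    g = p - y
    w -= 0.1 * Z.T @ g / len(y)
    b -= 0.1 * g.mean()
acc = ((Z @ w + b > 0) == (y == 1)).mean()
```

The point is the division of labor: the encoder is fit without labels, and only the small head consumes the labeled 10%.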
I. INTRODUCTION (existing methods: survey plus pros and cons; proposed method: its pros and cons)
Four families of methods for the small-sample problem in hyperspectral image classification: semi-supervised learning (SSL), unsupervised learning, transfer learning, and few-shot learning.
Unsupervised learning can be viewed as a form of clustering:
Traditional clustering methods: K-means, fuzzy C-means.
Deep-learning clustering methods: stacked autoencoders (SAE).
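As a reminder of the simplest of these, here is a minimal K-means on synthetic 2-D "pixels" (the helper name `kmeans` and the data are illustrative assumptions, unrelated to the paper's experiments):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic pixels: three well-separated clusters in 2-D.
X = np.vstack([rng.normal(c, 0.2, (40, 2)) for c in (0.0, 2.0, 4.0)])

def kmeans(X, k, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        d = ((X[:, None] - centers[None]) ** 2).sum(-1)
        labels = d.argmin(1)
        # Move each center to the mean of its assigned points
        # (keep the old center if a cluster went empty).
        centers = np.array([X[labels == j].mean(0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels, centers

labels, centers = kmeans(X, 3)
```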
II. RELATED WORK
A. Self-Supervised Learning
Clarifies the concepts of self-supervised vs. unsupervised learning.
First: self-supervised learning is a subset of unsupervised learning.
Second: unsupervised learning mainly discovers "patterns": anomaly detection, clustering, dimensionality reduction.
Self-supervised learning: requires labels manufactured by a pretext task.
It follows naturally that the choice of pretext task is critical.
B. Contrastive Learning
Gives a fairly detailed explanation of how contrastive learning works.
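The core idea: pull two augmented views of the same sample together and push all other samples apart. A common concrete form is the NT-Xent (normalized temperature-scaled cross-entropy) loss used by SimCLR-style methods; whether the paper uses exactly this variant is an assumption, and the sketch below is a plain numpy illustration:

```python
import numpy as np

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent loss. z1, z2: (N, d) embeddings of two augmented views;
    row i of z1 and row i of z2 form a positive pair."""
    z = np.vstack([z1, z2])
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalize
    sim = z @ z.T / tau                                # scaled cosine sims
    np.fill_diagonal(sim, -np.inf)                     # drop self-similarity
    n = len(z1)
    # Row i's positive is row i+n (and vice versa).
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # Cross-entropy: -log softmax(sim)[i, pos[i]], averaged over 2N rows.
    logsumexp = np.log(np.exp(sim).sum(1))
    return (logsumexp - sim[np.arange(2 * n), pos]).mean()
```

Sanity check: embeddings whose two views nearly coincide should score a much lower loss than unrelated views.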
III. PROPOSED SSCL MODEL
A. Data Preprocessing
principal component analysis (PCA) reduces the spectral bands to four dimensions
extended morphological profiles (EMPs) are then built on those components
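The PCA step is a standard band reduction before EMP construction. A minimal sketch on a synthetic cube (the 20 × 20 × 103 shape is an assumption for illustration; the EMP openings/closings themselves are omitted here):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical hyperspectral cube: 20 x 20 pixels, 103 spectral bands.
cube = rng.normal(size=(20, 20, 103))

pixels = cube.reshape(-1, 103)
pixels = pixels - pixels.mean(0)                   # center each band
_, _, Vt = np.linalg.svd(pixels, full_matrices=False)
reduced = (pixels @ Vt[:4].T).reshape(20, 20, 4)   # keep top 4 PCs
```

Each of the four resulting bands would then be processed with morphological openings and closings at several scales to form the EMP stack.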
B. Data Augmentation
random Gaussian noise (added to create augmented views)
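The augmentation itself is trivially small. A sketch, with an assumed patch shape and noise level (the helper name `augment` and `sigma=0.05` are illustrative, not the paper's values):

```python
import numpy as np

def augment(patch, sigma=0.05, seed=None):
    """Add i.i.d. Gaussian noise: one cheap way to produce two
    'views' of the same patch for contrastive pretraining."""
    rng = np.random.default_rng(seed)
    return patch + rng.normal(0.0, sigma, patch.shape)

patch = np.ones((23, 23, 4))          # one EMP patch (hypothetical shape)
v1, v2 = augment(patch, seed=1), augment(patch, seed=2)
```

Different seeds give two distinct views of the same underlying patch, which is exactly what the contrastive loss needs as a positive pair.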
C. Structure of the SSCL
Fairly conventional.
IV. EXPERIMENTAL RESULTS
For the Salinas and Pavia University datasets, the input image block size is set to 23 × 23; for the Botswana dataset it is 5 × 5.
Training and test split:
- 30% for CL pretraining
- 10% for supervised training
Impressively, the baselines even include ResNet-50.
Open question: are the pretrained weights frozen during this subsequent training?
Experiments conducted:
- Classification Accuracy
- Parameters Analysis
- Effect of Temperature Parameter
- Effect of the Neighborhood Size
- Performance of SSCL Under Different Training Ratios
- Effect of Batch Size (significant impact)
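On the temperature study: τ divides the similarities before the softmax inside the contrastive loss, so a smaller τ sharpens the distribution and weights hard negatives more heavily. A toy demonstration with made-up similarity values:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy cosine similarities: one positive pair, two negatives.
sims = np.array([0.9, 0.5, 0.1])
for tau in (1.0, 0.1):
    p = softmax(sims / tau)
    print(tau, p.round(3))
# Smaller tau concentrates probability mass on the most similar pair.
```

This is why the loss becomes more sensitive to near-duplicates among the negatives as τ shrinks, which plausibly explains the sensitivity the paper measures.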
V. CONCLUSION
Takeaway: worth studying how this paper writes its Introduction and Related Work sections.