Notes on "Anchored Neighborhood Regression for Fast Example-Based Super-Resolution"


1.        Abstract

a)        Propose fast super-resolution methods while making no compromise on quality

   i.             Support the use of sparse learned dictionaries in combination with neighbor embedding methods.

1).      Nearest neighbors computed with respect to the dictionary atoms rather than plain Euclidean distance

  ii.             Use global collaborative coding

  iii.             Propose the anchored neighborhood regression (ANR)

1).        Anchor the neighborhood embedding

2).        Precompute the corresponding embedding matrix

2.        Introduction

a)        Definition of super-resolution

b)        Three subclasses

 i.             Interpolation methods

ii.             Multi-frame methods

iii.             Learning-based methods

1).        Gradient Profile Prior

2).        Dictionary- or example- learning methods

a)        Subdivided into patches

b)        Form a Markov Random Field (MRF)

c)        Search for nearest neighbors

d)        HR is retrieved

e)        MRF can be solved

3).        Downside

a)        High computational complexity

b)        Overcome:

  i.             Neighbor embedding

 ii.             Sparse encoding approaches

4).        Proposed example-based super-resolution

a)        Low computational time

b)        Qualitative performance

c)        Organization

                         i.             Section 2: neighbor embedding & sparse coding

                       ii.             Section 3: proposed methods

                      iii.             Section 4: experimental results

                      iv.             Section 5: conclusions

3.        Dictionary-based Super-Resolution

a)        Neighbor embedding approaches

i.             Low-dimensional nonlinear manifolds

 ii.             Locally linear embedding (LLE)

1.        Search for a set of K nearest neighbors

2.        Compute K appropriate weights

3.        Create HR patches

4.        Create the resulting HR image
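The four LLE steps above can be sketched in NumPy. This is a minimal toy sketch, not the paper's implementation: the array names, sizes, and the small regularization constant are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy LR/HR training patch pairs (assumed shapes and names).
n, d_l, d_h, K = 200, 25, 100, 5
X_l = rng.standard_normal((n, d_l))   # LR training patch features
X_h = rng.standard_normal((n, d_h))   # corresponding HR patches
y = rng.standard_normal(d_l)          # input LR patch to super-resolve

# 1) K nearest neighbors in the LR space (Euclidean distance).
dist = np.linalg.norm(X_l - y, axis=1)
idx = np.argsort(dist)[:K]

# 2) LLE weights: minimize ||y - sum_k w_k x_k||^2 s.t. sum_k w_k = 1.
Z = X_l[idx] - y                      # shift neighbors to the query
G = Z @ Z.T + 1e-6 * np.eye(K)        # local Gram matrix (regularized)
w = np.linalg.solve(G, np.ones(K))
w /= w.sum()                          # enforce the sum-to-one constraint

# 3) Transfer the same weights to the HR neighbors.
patch_h = w @ X_h[idx]                # estimated HR patch
print(patch_h.shape)                  # (100,)
```

Step 4 of the outline (assembling the full HR image) would average such patches over overlapping positions, which is omitted here.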

                     iii.             Nonnegative neighbor embedding approaches

b)        Sparse coding approaches

i.             Effect: a learned compact dictionary

  ii.             Sparse dictionaries

  iii.             Several modifications:

1).        Different training approaches

2).        Pseudoinverse

3).        PCA

4).        Orthogonal matching pursuit
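Orthogonal matching pursuit, the last modification listed, greedily picks the atom most correlated with the current residual and then re-fits all chosen atoms by least squares. A minimal sketch with a toy random dictionary (sizes and sparsity level are assumptions):

```python
import numpy as np

def omp(D, y, n_nonzero):
    """Greedy orthogonal matching pursuit: pick the atom most
    correlated with the residual, then re-fit by least squares."""
    residual = y.copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        x[:] = 0.0
        x[support] = coef
        residual = y - D @ x
    return x

rng = np.random.default_rng(2)
D = rng.standard_normal((30, 80))
D /= np.linalg.norm(D, axis=0)            # unit-norm atoms (columns)
x_true = np.zeros(80)
x_true[[3, 17, 42]] = [1.0, -2.0, 0.5]    # a 3-sparse ground truth
y = D @ x_true
x_hat = omp(D, y, n_nonzero=3)            # at most 3 nonzero coefficients
```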

4.        Proposed Methods

a)        Global regression: a special case of ANR

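Global regression treats the whole dictionary as one neighborhood, so a single ridge-regression projection can be precomputed offline and applied to every patch with one matrix-vector product. A minimal sketch; the shapes and regularization value are assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy coupled dictionaries (assumed shapes/names): columns are atoms.
n_atoms, d_l, d_h, lam = 128, 36, 81, 0.1
D_l = rng.standard_normal((d_l, n_atoms))   # LR dictionary
D_h = rng.standard_normal((d_h, n_atoms))   # HR dictionary

# Offline: one global ridge-regression projection for the whole
# dictionary, P_G = D_h (D_l^T D_l + lam*I)^{-1} D_l^T.
P_G = D_h @ np.linalg.solve(D_l.T @ D_l + lam * np.eye(n_atoms), D_l.T)

# Online: every LR feature uses the same matrix, a single multiply.
y = rng.standard_normal(d_l)
x_h = P_G @ y                               # estimated HR patch features
print(x_h.shape)                            # (81,)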

b)        Anchored neighborhood regression
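The anchored variant replaces the single global neighborhood with one neighborhood per dictionary atom, each with its own precomputed projection. The following toy sketch illustrates the idea; all shapes, the regularization value, and the correlation-based anchor search are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy coupled dictionaries (assumed shapes/names): columns are atoms.
n_atoms, d_l, d_h, K, lam = 64, 36, 81, 8, 0.1
D_l = rng.standard_normal((d_l, n_atoms))
D_l /= np.linalg.norm(D_l, axis=0)          # unit-norm LR atoms
D_h = rng.standard_normal((d_h, n_atoms))

# Offline: for each anchor atom, take its K most correlated atoms as a
# neighborhood and precompute P_j = N_h (N_l^T N_l + lam*I)^{-1} N_l^T.
projections = []
for j in range(n_atoms):
    corr = D_l.T @ D_l[:, j]
    idx = np.argsort(-corr)[:K]             # neighborhood (includes j)
    N_l, N_h = D_l[:, idx], D_h[:, idx]
    P = N_h @ np.linalg.solve(N_l.T @ N_l + lam * np.eye(K), N_l.T)
    projections.append(P)

# Online: find the anchor atom for an input LR feature y; the HR patch
# is then just one matrix-vector product, no per-patch optimization.
y = rng.standard_normal(d_l)
anchor = int(np.argmax(np.abs(D_l.T @ y)))
x_h = projections[anchor] @ y
print(x_h.shape)                            # (81,)
```

Setting K to the full dictionary size recovers the global-regression special case described above.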

5.        Experiments

a)        Conditions

 i.             Features

1).        Luminance component

2).        Basic feature: the patch

3).        First- and second-order derivatives

ii.             Embeddings

 iii.             Dictionaries

1).        The larger the dictionary, the better the performance

2).        "Internal" dictionary vs. "external" dictionary

3).        Randomly sampled dictionaries vs. learned dictionaries

 iv.             Neighborhoods
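The feature items above (luminance patches plus first- and second-order derivatives) can be illustrated with a small NumPy sketch. The exact filter taps used here are an assumption borrowed from common sparse-coding SR practice, not quoted from the paper:

```python
import numpy as np

def filter_rows(img, f):
    """Correlate each row of img with 1-D filter f (same-size output,
    zero padding). Transpose the image to filter columns instead."""
    r = len(f) // 2
    padded = np.pad(img, ((0, 0), (r, r)))
    out = np.zeros_like(img, dtype=float)
    for k, fk in enumerate(f):
        out += fk * padded[:, k:k + img.shape[1]]
    return out

# Assumed derivative filters for the LR feature representation.
f1 = np.array([-1.0, 0.0, 1.0])             # first-order derivative
f2 = np.array([1.0, 0.0, -2.0, 0.0, 1.0])   # second-order derivative

img = np.arange(36, dtype=float).reshape(6, 6)  # toy luminance patch
features = np.stack([
    filter_rows(img, f1),        # horizontal gradient
    filter_rows(img.T, f1).T,    # vertical gradient
    filter_rows(img, f2),        # horizontal 2nd derivative
    filter_rows(img.T, f2).T,    # vertical 2nd derivative
])
print(features.shape)            # (4, 6, 6)
```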

b)        Performance

 i.             Quality

  ii.             Running times

6.        Conclusions              

a)        Propose a new example-based method for super-resolution called Anchored Neighborhood Regression (ANR)

b)        Propose an extreme variant called Global Regression (GR)

c)        Most of these methods can reach similar top performance given an appropriate neighborhood size and dictionary

 
