AutoDL: NAS Literature

This post is part of an AI literature-tracking column. Column page: https://blog.csdn.net/u014157632/category_9760481.html; full table of contents: https://blog.csdn.net/u014157632/article/details/104578738. Updated from time to time.

1st Workshop on NAS at ICLR 2020

ICLR 2020 hosted the first-ever workshop dedicated to NAS (original site here); the accepted papers can be downloaded as a bundle here.

2020

  • Yu J, Jin P, Liu H, et al. BigNAS: Scaling Up Neural Architecture Search with Big Single-Stage Models[J]. arXiv preprint arXiv:2003.11142, 2020.
  • Luo R, Tan X, Wang R, et al. Semi-Supervised Neural Architecture Search[J]. arXiv preprint arXiv:2002.10389, 2020.
  • Liu C, Dollár P, He K, et al. Are Labels Necessary for Neural Architecture Search?[J]. arXiv preprint arXiv:2003.12056, 2020.
  • Neural Architecture Transfer
  • Zhou D, Zhou X, Zhang W, et al. EcoNAS: Finding Proxies for Economical Neural Architecture Search[J]. arXiv preprint arXiv:2001.01233, 2020.
  • You S, Huang T, Yang M, et al. GreedyNAS: Towards Fast One-Shot NAS with Greedy Supernet[J]. arXiv preprint arXiv:2003.11236, 2020.
  • Real E, Liang C, So D R, et al. AutoML-Zero: Evolving Machine Learning Algorithms From Scratch[J]. arXiv preprint arXiv:2003.03384, 2020.
  • Dong X, Yang Y. NAS-Bench-102: Extending the Scope of Reproducible Neural Architecture Search[J]. arXiv preprint arXiv:2001.00326, 2020.
  • Zela A, Siems J, Hutter F. Nas-bench-1shot1: Benchmarking and dissecting one-shot neural architecture search[J]. arXiv preprint arXiv:2001.10422, 2020.
  • Rethinking Performance Estimation in Neural Architecture Search
  • Guo R, Lin C, Li C, et al. Powering One-shot Topological NAS with Stabilized Share-parameter Proxy[J]. arXiv preprint arXiv:2005.10511, 2020.
  • FBNetV3: Joint Architecture-Recipe Search using Neural Acquisition Function
  • Chen X, Hsieh C J. Stabilizing Differentiable Architecture Search via Perturbation-based Regularization[J]. arXiv preprint arXiv:2002.05283, 2020.
  • Jiang C, et al. SP-NAS: Serial-to-Parallel Backbone Search for Object Detection[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2020.
  • DA-NAS: Data Adapted Pruning for Efficient Neural Architecture Search
  • BATS: Binary ArchitecTure Search
  • Angle-based Search Space Shrinking for Neural Architecture Search
  • Kaplan S, Giryes R. Self-supervised Neural Architecture Search[J]. 2020.

2019

  • Zheng X, Ji R, Tang L, et al. Multinomial Distribution Learning for Effective Neural Architecture Search[C]//Proceedings of the IEEE International Conference on Computer Vision. 2019: 1304-1313.
  • Dong X, Yang Y. Searching for a robust neural architecture in four gpu hours[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2019: 1761-1770.
  • Li G, Qian G, Delgadillo I C, et al. SGAS: Sequential Greedy Architecture Search[J]. arXiv preprint arXiv:1912.00195, 2019.
  • Cai H, Zhu L, Han S. Proxylessnas: Direct neural architecture search on target task and hardware[J]. arXiv preprint arXiv:1812.00332, 2018.
  • Li X, Lin C, Li C, et al. Improving One-shot NAS by Suppressing the Posterior Fading[J]. arXiv preprint arXiv:1910.02543, 2019.
  • Yang A, Esperança P M, Carlucci F M. NAS evaluation is frustratingly hard[J]. arXiv preprint arXiv:1912.12522, 2019.
  • Shu Y, Wang W, Cai S. Understanding Architectures Learnt by Cell-based Neural Architecture Search[J]. arXiv preprint arXiv:1909.09569, 2019.
  • Sciuto C, Yu K, Jaggi M, et al. Evaluating the search phase of neural architecture search[J]. arXiv preprint arXiv:1902.08142, 2019.
  • Mei J, Li Y, Lian X, et al. AtomNAS: Fine-Grained End-to-End Neural Architecture Search[J]. arXiv preprint arXiv:1912.09640, 2019.
  • Cai H, Gan C, Han S. Once for all: Train one network and specialize it for efficient deployment[J]. arXiv preprint arXiv:1908.09791, 2019.
  • Yao L, Xu H, Zhang W, et al. SM-NAS: Structural-to-Modular Neural Architecture Search for Object Detection[J]. 2019.
  • Wen W, Liu H, Li H, et al. Neural Predictor for Neural Architecture Search[J]. 2019.
  • AM-LFS: AutoML for Loss Function Search

Based on Reinforcement Learning and Evolutionary Algorithms

  • Zoph B, Le Q V. Neural architecture search with reinforcement learning[J]. arXiv preprint arXiv:1611.01578, 2016.
  • Baker B, Gupta O, Naik N, et al. Designing neural network architectures using reinforcement learning[J]. arXiv preprint arXiv:1611.02167, 2016.
  • Zoph B, Vasudevan V, Shlens J, et al. Learning transferable architectures for scalable image recognition[C]//Proceedings of the IEEE conference on computer vision and pattern recognition. 2018: 8697-8710.
  • Tan M, Chen B, Pang R, et al. Mnasnet: Platform-aware neural architecture search for mobile[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2019: 2820-2828.
  • Pham H, Guan M Y, Zoph B, et al. Efficient neural architecture search via parameter sharing[J]. arXiv preprint arXiv:1802.03268, 2018.
  • Real E, Moore S, Selle A, et al. Large-scale evolution of image classifiers[C]//Proceedings of the 34th International Conference on Machine Learning-Volume 70. JMLR. org, 2017: 2902-2911.
  • Real E, Aggarwal A, Huang Y, et al. Regularized evolution for image classifier architecture search[C]//Proceedings of the aaai conference on artificial intelligence. 2019, 33: 4780-4789.
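
The papers in this group share a common outer loop: a controller (an RNN policy in Zoph & Le's work, or a population in the evolutionary variants) proposes discrete architectures, each proposal is trained and scored, and the score is fed back to improve the proposer. Below is a deliberately tiny, self-contained sketch of that loop using REINFORCE; the toy operation set, the dummy_evaluate reward, and all hyperparameters are illustrative assumptions, not values taken from any of the cited papers.

```python
# Minimal sketch of an RL-style NAS outer loop (in the spirit of Zoph & Le, 2016).
# Everything here (op set, reward, hyperparameters) is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)
OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]   # toy per-layer choices
NUM_LAYERS = 4

# Controller: independent softmax logits per layer (a stand-in for the RNN controller).
logits = np.zeros((NUM_LAYERS, len(OPS)))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def dummy_evaluate(arch):
    # Placeholder reward; in the real methods this is the validation accuracy
    # of a child network trained with the sampled architecture.
    return sum(len(OPS) - a for a in arch) / (NUM_LAYERS * len(OPS))

baseline, lr = 0.0, 0.1
for step in range(200):
    probs = [softmax(l) for l in logits]
    arch = [rng.choice(len(OPS), p=p) for p in probs]   # sample an architecture
    reward = dummy_evaluate(arch)
    baseline = 0.9 * baseline + 0.1 * reward             # moving-average baseline
    advantage = reward - baseline
    for i, a in enumerate(arch):                          # REINFORCE update
        grad = -probs[i]
        grad[a] += 1.0
        logits[i] += lr * advantage * grad

print("most likely ops:", [OPS[np.argmax(l)] for l in logits])
```

Because each reward requires training a child network, these methods are computationally expensive, which is what motivated the weight-sharing approaches listed in the sections below.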

The DARTS Line of Work

  • Liu H, Simonyan K, Yang Y. Darts: Differentiable architecture search[J]. arXiv preprint arXiv:1806.09055, 2018.
  • Chu X, Zhou T, Zhang B, et al. Fair DARTS: Eliminating Unfair Advantages in Differentiable Architecture Search[J]. arXiv preprint arXiv:1911.12126, 2019.
  • Zela A, Elsken T, Saikia T, et al. Understanding and robustifying differentiable architecture search[J]. arXiv preprint arXiv:1909.09656, 2019.
  • Liang H, Zhang S, Sun J, et al. Darts+: Improved differentiable architecture search with early stopping[J]. arXiv preprint arXiv:1909.06035, 2019.
  • Chen X, Xie L, Wu J, et al. Progressive differentiable architecture search: Bridging the depth gap between search and evaluation[C]//Proceedings of the IEEE International Conference on Computer Vision. 2019: 1294-1303.
  • Xu Y, Xie L, Zhang X, et al. Pc-darts: Partial channel connections for memory-efficient differentiable architecture search[J]. arXiv preprint arXiv:1907.05737, 2019.
  • Li G, Zhang X, Wang Z, et al. StacNAS: Towards stable and consistent optimization for differentiable Neural Architecture Search[J]. arXiv preprint arXiv:1909.11926, 2019.
  • Wu B, Dai X, Zhang P, et al. Fbnet: Hardware-aware efficient convnet design via differentiable neural architecture search[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2019: 10734-10742.
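
The common core of this line is DARTS's continuous relaxation: each edge in the searched cell computes a softmax-weighted mixture of candidate operations, so the architecture parameters alpha can be optimized by gradient descent alongside (in the paper, alternating with) the network weights. Below is a minimal PyTorch sketch of one such mixed edge; the candidate op set and tensor sizes are illustrative assumptions, not the paper's actual search space.

```python
# Minimal sketch of the DARTS continuous relaxation (Liu et al., 2018).
# The op set and sizes are illustrative assumptions only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    def __init__(self, channels):
        super().__init__()
        # Candidate operations on one edge (a reduced, toy op set).
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.MaxPool2d(3, stride=1, padding=1),
            nn.Identity(),
        ])
        # Architecture parameters (alpha): one logit per candidate op.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)   # relax the discrete choice
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# Usage: alpha and the conv weights both receive gradients from the same loss.
edge = MixedOp(channels=8)
x = torch.randn(2, 8, 16, 16)
loss = edge(x).mean()
loss.backward()
print(edge.alpha.grad)   # after search, the op with the largest alpha is kept
```

Several of the papers above (P-DARTS, PC-DARTS, Fair DARTS, DARTS+) modify exactly this mixture or how alpha is optimized, which is why a single MixedOp captures the shared skeleton of the whole line.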

The One-Shot Line of Work

  • Brock A, Lim T, Ritchie J M, et al. Smash: one-shot model architecture search through hypernetworks[J]. arXiv preprint arXiv:1708.05344, 2017.
  • Bender G, Kindermans P J, Zoph B, et al. Understanding and simplifying one-shot architecture search[C]//International Conference on Machine Learning. 2018: 550-559.
  • Guo Z, Zhang X, Mu H, et al. Single path one-shot neural architecture search with uniform sampling[J]. arXiv preprint arXiv:1904.00420, 2019.
  • Stamoulis D, Ding R, Wang D, et al. Single-path nas: Designing hardware-efficient convnets in less than 4 hours[J]. arXiv preprint arXiv:1904.02877, 2019.
  • Zhou Y, Sun X, Luo C, et al. One-Shot Neural Architecture Search Through A Posteriori Distribution Guided Sampling[J]. arXiv preprint arXiv:1906.09557, 2019.
  • Cho M, Soltani M, Hegde C. One-Shot Neural Architecture Search via Compressive Sensing[J]. arXiv preprint arXiv:1906.02869, 2019.
  • Yu J, Huang T. AutoSlim: Towards One-Shot Architecture Search for Channel Numbers[J]. arXiv preprint arXiv:1903.11728, 2019.
  • Chu X, Zhang B, Xu R, et al. Fairnas: Rethinking evaluation fairness of weight sharing neural architecture search[J]. arXiv preprint arXiv:1907.01845, 2019.
  • Chu X, Zhang B, Li J, et al. Scarletnas: Bridging the gap between scalability and fairness in neural architecture search[J]. arXiv preprint arXiv:1908.06022, 2019.
  • Chu X, Li X, Lu Y, et al. MixPath: A Unified Approach for One-shot Neural Architecture Search[J]. arXiv preprint arXiv:2001.05887, 2020.
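
These works train one over-parameterized supernet whose weights are shared by all candidate architectures; candidates are then ranked with the inherited weights instead of being trained from scratch. The sketch below illustrates the single-path, uniform-sampling variant in the spirit of Guo et al. (2019); the toy layers, data, and evaluation are illustrative assumptions only.

```python
# Minimal sketch of single-path one-shot supernet training with uniform sampling
# (in the spirit of Guo et al., 2019). Toy model, data, and candidates are assumptions.
import random
import torch
import torch.nn as nn

class SuperLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        # Two candidate ops per layer: a linear transform or a skip connection.
        self.choices = nn.ModuleList([nn.Linear(dim, dim), nn.Identity()])

    def forward(self, x, idx):
        return torch.relu(self.choices[idx](x))

layers = nn.ModuleList([SuperLayer(16) for _ in range(3)])
opt = torch.optim.SGD(layers.parameters(), lr=0.05)

def forward(x, arch):
    for layer, idx in zip(layers, arch):
        x = layer(x, idx)
    return x

# Supernet training: sample one random single path per step (uniform sampling),
# so only that path's shared weights are updated.
for step in range(100):
    arch = [random.randrange(2) for _ in layers]
    x, y = torch.randn(8, 16), torch.randn(8, 16)
    loss = ((forward(x, arch) - y) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Search phase: rank candidate paths using the shared (inherited) weights.
with torch.no_grad():
    x, y = torch.randn(64, 16), torch.randn(64, 16)
    scores = {a: ((forward(x, list(a)) - y) ** 2).mean().item()
              for a in [(0, 0, 0), (0, 1, 0), (1, 0, 1), (1, 1, 1)]}
print("best candidate:", min(scores, key=scores.get))
```

Much of the one-shot literature above is concerned with how paths are sampled during supernet training and how well the shared-weight ranking correlates with stand-alone performance.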