Computing Nonvacuous Generalization Bounds for Deep (Stochastic) Neural Networks with Many More Parameters than Training Data |PDF|
Stronger Generalization Bounds for Deep Nets via a Compression Approach |PDF|
Non-Vacuous Generalization Bounds at the ImageNet Scale: A PAC-Bayesian Compression Approach |PDF|
Opening the Black Box of Deep Neural Networks via Information |PDF|
Towards Understanding the Role of Over-Parameterization in Generalization of Neural Networks |PDF|
Implicit Regularization via Neural Feature Alignment |PDF|
On the Generalization Mystery in Deep Learning |PDF|
Stiffness: A New Perspective on Generalization in Neural Networks |PDF|
Why Neural Networks Find Simple Solutions: The Many Regularizers of Geometric Complexity |PDF|
Generalization in Deep Learning |PDF|