[Original] Notes on "BRECQ: PUSHING THE LIMIT OF POST-TRAINING QUANTIZATION BY BLOCK RECONSTRUCTION"
Explanation of the formula: such a block-diagonal Hessian ignores inter-block dependency but keeps intra-block dependency, which produces less generalization error. The intermediate outputs can then be reconstructed block by block.
2024-09-15 15:33:48
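The block-by-block reconstruction idea can be sketched as follows: quantize one block's weights, then calibrate the quantizer so the quantized block's output matches the full-precision output on calibration data. This is a minimal illustrative sketch only; the scale grid search and all variable names are assumptions, not BRECQ's actual method (BRECQ learns adaptive rounding by gradient descent).

```python
import numpy as np

# Sketch: reconstruct one block's intermediate output after quantization.
# We search a weight-quantization scale that minimizes the MSE between the
# quantized block's output and the full-precision output. Hypothetical
# setup, not the paper's optimizer.

rng = np.random.default_rng(0)
X = rng.normal(size=(128, 64))   # calibration inputs to the block
W = rng.normal(size=(64, 32))    # full-precision block weights
Y_fp = X @ W                     # full-precision block output (the target)

def quantize(w, scale, bits=4):
    """Uniform symmetric fake-quantization of weights."""
    qmax = 2 ** (bits - 1) - 1
    q = np.clip(np.round(w / scale), -qmax - 1, qmax)
    return q * scale

best_scale, best_err = None, np.inf
for scale in np.linspace(0.01, 0.5, 50):   # crude grid over scales
    err = np.mean((X @ quantize(W, scale) - Y_fp) ** 2)
    if err < best_err:
        best_scale, best_err = scale, err

print(best_scale, best_err)
```

Because the objective is the block's output error rather than the weight error, intra-block interactions between layers are accounted for, which is exactly the property the block-diagonal Hessian view justifies.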
[Original] Notes on "Improving Post Training Neural Quantization: Layer-wise Calibration and Integer Programming"
With integer programming, one can find the best integer solution that satisfies all constraints. QAT, by contrast, has three main limitations: (a) it requires a large training set to avoid over-fitting, (b) it must approximate back-propagation gradients through a discrete function (the quantizer), and (c) it has high computational and memory footprints.
2024-09-15 15:32:32
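The integer-programming view can be illustrated with a toy bit-allocation problem: pick a bit-width per layer to minimize a quantization-error proxy subject to a model-size budget. This sketch brute-forces the tiny search space for clarity; the layer sizes, error numbers, and budget are all made up, and the paper uses a real integer-program solver rather than enumeration.

```python
import itertools

# Toy integer program: choose bits-per-layer minimizing total error proxy
# subject to a size budget. All numbers are hypothetical.

layer_params = [1000, 4000, 2000]            # parameters per layer
err = {2: 1.0, 4: 0.3, 8: 0.05}              # per-parameter error proxy by bit-width
budget_bits = 8 * sum(layer_params) // 2     # allow half the 8-bit model size

best = None
for bits in itertools.product([2, 4, 8], repeat=len(layer_params)):
    size = sum(b * p for b, p in zip(bits, layer_params))
    if size > budget_bits:
        continue                              # constraint violated
    total_err = sum(err[b] * p for b, p in zip(bits, layer_params))
    if best is None or total_err < best[0]:
        best = (total_err, bits)

print(best)
```

Every candidate that fits the budget is a feasible integer solution; the loop returns the best one, which is what an ILP solver finds without exhaustive search.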
[Original] A fast multi-target, multi-angle template matching algorithm (NCC-based, results extremely close to Halcon's ........)
2022-04-22 00:18:10
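The NCC score at the core of such matching can be sketched in a few lines: slide the template over the image and score each position with the zero-mean normalized correlation. This brute-force version is an assumption for illustration only; the post's algorithm would add image pyramids and rotated templates to handle multiple angles at Halcon-like speed.

```python
import numpy as np

# Sketch: brute-force normalized cross-correlation (NCC) template matching.
# Returns the (y, x) position with the highest NCC score in [-1, 1].

def ncc_match(image, template):
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -1.0, None
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            win = image[y:y + th, x:x + tw]
            w = win - win.mean()
            denom = np.sqrt((w ** 2).sum()) * t_norm
            if denom == 0:
                continue                       # flat window, score undefined
            score = float((w * t).sum() / denom)
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score

rng = np.random.default_rng(1)
img = rng.random((40, 40))
tpl = img[10:18, 22:30].copy()   # template cut from the image itself
print(ncc_match(img, tpl))
```

Since NCC is invariant to brightness offset and contrast scaling of the window, a perfect match scores exactly 1, which is why the template cut from the image is found at its source position.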