DS Wannabe之5-AM Project: DS 30day int prep day14

This article outlines several important deep learning models, such as AlexNet, VGGNet (in particular VGG16), and ResNet (including its breakthrough at ILSVRC 2015), and how they improved performance by introducing novel architectural elements such as skip connections and batch normalization. It also discusses the application of Haar cascades to object detection, the concept of transfer learning, and other key models such as Faster R-CNN and LeNet-5.

Q1. What is Alexnet?

Q2. What is VGGNet?

Q3. What is VGG16?

Q4. What is ResNet?

At ILSVRC 2015, the Residual Neural Network (ResNet) by Kaiming He et al. introduced a novel architecture with "skip connections" and heavy use of batch normalisation. These skip connections bear a strong similarity to the gating mechanisms, such as gated recurrent units, that have been applied successfully in RNNs. Thanks to this technique, the authors were able to train a network with 152 layers while still keeping lower complexity than VGGNet. It achieved a top-5 error rate of 3.57%, beating human-level performance on this dataset.
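The core idea of a skip connection can be sketched in a few lines of numpy. This is a simplified fully-connected residual block (a real ResNet block uses convolutions and batch normalisation); the function and variable names here are illustrative, not from any library:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    """Minimal residual block: y = relu(F(x) + x).

    F(x) is two linear layers with a ReLU in between; the skip
    connection adds the input x back before the final activation.
    """
    out = relu(x @ w1)    # first transform
    out = out @ w2        # second transform (no activation yet)
    return relu(out + x)  # skip connection: add the input, then activate

# With zero weights F(x) = 0, so the block reduces to the identity
# for non-negative inputs -- the layer only has to learn a *residual*
# on top of the identity, which is what makes very deep nets trainable.
x = np.array([1.0, 2.0, 3.0])
print(residual_block(x, np.zeros((3, 3)), np.zeros((3, 3))))  # -> [1. 2. 3.]
```

Because the identity path is always available, gradients can flow directly through the addition, which is why 152-layer networks remain optimisable.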

Q5. What is HAAR CASCADE? 

Haar Cascade: a machine learning object detection algorithm used to identify objects in an image or video. It is based on the concept of Haar-like features proposed by Paul Viola and Michael Jones in their 2001 paper "Rapid Object Detection using a Boosted Cascade of Simple Features".

It is a machine learning-based approach in which a cascade function is trained on a large set of positive and negative images, and is then used to detect objects in other images.

The algorithm has four stages:

1. Haar feature selection
2. Integral image creation
3. AdaBoost training
4. Cascading classifiers

Q6. What is Transfer Learning?

Q7. What is Faster R-CNN?

Q8. What is R-CNN?

Q9. What is GoogLeNet/Inception?

Q10. What is LeNet-5?
