dnn is a deep neural network module in OpenCV, which can be seen as a lightweight version of Caffe. dnn only provides forward computation, with no backward pass, since it is meant purely for deploying models. It can also import TensorFlow and Torch models. Cool, huh?
This blog post presents two main classes in dnn: Layer and LayerFactory.
The best way to understand a class is to imagine it in your mind and link it to a concrete object.
— Duino Du
Here we go.
1. Layer
Layer describes a common computing pattern: given an input, an algorithm with some parameters computes an output.
This is what Layer does. If you want to implement a custom layer, two things need to be done (a sketch follows the list):
1. Inherit the Layer class and override allocate() and forward().
2. Register the new layer using LayerFactory.
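As a rough sketch, a custom layer might look like the code below. This targets the older opencv_contrib dnn API described in this post; the exact allocate()/forward() signatures and the Blob type differ between OpenCV versions, and DoubleLayer is a made-up name for illustration, not a real dnn layer.
#include <opencv2/dnn.hpp>

using namespace cv;
using namespace cv::dnn;

class DoubleLayer : public Layer
{
public:
    DoubleLayer(LayerParams &params) { /* read layer parameters here if needed */ }

    // Constructor function with the signature LayerFactory expects (see section 2).
    static Ptr<Layer> create(LayerParams &params)
    {
        return Ptr<Layer>(new DoubleLayer(params));
    }

    // Shape the output blobs, typically one output per input with the same shape.
    void allocate(const std::vector<Blob*> &inputs, std::vector<Blob> &outputs)
    {
        outputs.resize(inputs.size());
    }

    // Forward pass: compute the outputs from the inputs. This is the only
    // computation the layer ever runs, since dnn has no backward pass.
    void forward(std::vector<Blob*> &inputs, std::vector<Blob> &outputs)
    {
        // e.g. fill each output with twice the corresponding input
    }
};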
Let’s move on to LayerFactory.
2. LayerFactory
LayerFactory is an interesting class. As its name implies, it uses the factory design pattern. Here is the LayerFactory interface:
(You can find it in the OpenCV dnn headers.)
class CV_EXPORTS LayerFactory
{
public:
    //! Each Layer class must provide this function to the factory
    typedef Ptr<Layer>(*Constuctor)(LayerParams &params);

    //! Registers the layer class with typename @p type and specified @p constructor.
    static void registerLayer(const String &type, Constuctor constructor);

    //! Unregisters registered layer with specified type name.
    static void unregisterLayer(const String &type);

    /** @brief Creates instance of registered layer.
     *  @param type type name of creating layer.
     *  @param params parameters which will be used for layer initialization.
     */
    static Ptr<Layer> createLayerInstance(const String &type, LayerParams& params);

private:
    LayerFactory();

    struct Impl;
    static Ptr<Impl> impl();
};
Impl is essentially a std::map from layer type names to their constructors; it stores what the factory knows how to produce. So for a factory-like class, there are three essentials:
1. Information about what to produce, e.g., the std::map inside Impl
2. Ways to edit that information, e.g., registerLayer and unregisterLayer
3. A produce function, e.g., createLayerInstance
LayerFactory has all three. Putting it together with the custom layer from section 1, usage looks like the sketch below.
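This sketch follows the interface quoted above; "Double" and DoubleLayer are the made-up names from the earlier example.
// Register the constructor under a type name of our choosing.
LayerFactory::registerLayer("Double", DoubleLayer::create);

// The factory can now produce instances from the type name alone.
LayerParams params;
Ptr<Layer> layer = LayerFactory::createLayerInstance("Double", params);

// Remove the entry again if the layer should no longer be creatable.
LayerFactory::unregisterLayer("Double");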
Layer and LayerFactory are used by Net, another class in dnn, which is responsible for managing all the layers. For Net users, there are only four essential operations (a deployment sketch follows below):
1. Net::readNetFromCaffe/Torch/Tensorflow
2. Net::setInputBlob
3. Net::forward
4. Net::getOutputBlob
Yes, these are the only behaviors we need from the Net class.
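As a sketch of that flow, here is what deployment looks like with the function names from OpenCV 3.3 and later (readNetFromCaffe, blobFromImage, setInput, forward); older versions use slightly different names, as in the list above, and the file names are placeholders.
#include <opencv2/dnn.hpp>
#include <opencv2/imgcodecs.hpp>

int main()
{
    // 1. Read the model.
    cv::dnn::Net net = cv::dnn::readNetFromCaffe("model.prototxt", "model.caffemodel");

    // 2. Set the input blob.
    cv::Mat img = cv::imread("input.jpg");
    net.setInput(cv::dnn::blobFromImage(img));

    // 3 and 4. Run the forward pass and fetch the output blob.
    cv::Mat out = net.forward();
    return 0;
}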