I1221 15:49:07.614038 28846 net.cpp:228] relu1_1 does not need backward computation.
I1221 15:49:07.614042 28846 net.cpp:228] conv1_1 does not need backward computation.
I1221 15:49:07.614045 28846 net.cpp:270] This network produces output bbox_pred
I1221 15:49:07.614049 28846 net.cpp:270] This network produces output cls_prob
I1221 15:49:07.614070 28846 net.cpp:283] Network initialization done.
I1221 15:49:07.725576 28846 net.cpp:816] Ignoring source layer data
I1221 15:49:07.725599 28846 net.cpp:816] Ignoring source layer conv1
I1221 15:49:07.725601 28846 net.cpp:816] Ignoring source layer relu1
I1221 15:49:07.725605 28846 net.cpp:816] Ignoring source layer norm1
I1221 15:49:07.725606 28846 net.cpp:816] Ignoring source layer conv2
I1221 15:49:07.725608 28846 net.cpp:816] Ignoring source layer relu2
I1221 15:49:07.725611 28846 net.cpp:816] Ignoring source layer norm2
I1221 15:49:07.725613 28846 net.cpp:816] Ignoring source layer conv3
I1221 15:49:07.725615 28846 net.cpp:816] Ignoring source layer relu3
I1221 15:49:07.725617 28846 net.cpp:816] Ignoring source layer conv4
I1221 15:49:07.725620 28846 net.cpp:816] Ignoring source layer relu4
I1221 15:49:07.725621 28846 net.cpp:816] Ignoring source layer conv5
I1221 15:49:07.725623 28846 net.cpp:816] Ignoring source layer relu5
I1221 15:49:07.725626 28846 net.cpp:816] Ignoring source layer pool5
F1221 15:49:07.775279 28846 net.cpp:829] Cannot copy param 0 weights from layer 'fc6'; shape mismatch. Source param shape is 4096 9216 (37748736); target param shape is 4096 25088 (102760448). To learn this layer's parameters from scratch rather than copying from a saved net, rename the layer.
*** Check failure stack trace: ***
Aborted (core dumped)
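The fatal check points at the fix itself: the snapshot's fc6 weights have shape 4096×9216 (9216 = 6×6×256, an AlexNet/ZF-style pool5 output), while the target net expects 4096×25088 (25088 = 7×7×512, a VGG16-style pool5 output), so the pretrained weights cannot be copied. If you intend to keep the target architecture and let fc6 train from scratch, one option the error message suggests is renaming the layer in the train prototxt so Caffe skips copying it. A minimal sketch, assuming a standard InnerProduct fc6 definition; the name `fc6_new` is an arbitrary choice:

```
layer {
  # Renamed from "fc6" so weights are re-initialized instead of
  # being copied from the incompatible snapshot.
  name: "fc6_new"
  type: "InnerProduct"
  bottom: "pool5"
  top: "fc6_new"
  inner_product_param {
    num_output: 4096
  }
}
```

Any layers that consume `fc6` (e.g. the following ReLU/dropout) must have their `bottom` updated to `fc6_new` as well. Alternatively, load a pretrained model whose architecture actually matches the target prototxt (here, a VGG16 snapshot rather than an AlexNet/ZF one).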