How to change AlexNet into an FCN?

An FCN is a network that contains only convolution layers, with no fc layers at all. Its structure can be seen in the following figures:

This image is from the paper "Fully Convolutional Networks for Semantic Segmentation", CVPR 2015.

[Figure: FCN structure, from the paper above]

  An FCN can localize the target object quite well, as shown in the images above, and it does not require resizing input images to a fixed resolution, which is the biggest difference from traditional CNNs. First, let's review the relevant network parameters of AlexNet; the structure is shown below:

[Figure: AlexNet structure and layer parameters]

  As we can see from the figure above, input images must be resized to a fixed resolution, such as 227*227 for the Caffe AlexNet used here, because of the fc layers. The detailed pipeline can be found in this blog post: http://blog.csdn.net/sunbaigui/article/details/39938097
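
  A quick arithmetic check makes the point. This is just a sketch in Python; the 6*6*256 pool5 shape and the 4096-d fc6 are the standard Caffe AlexNet numbers, and the 6*6 "conv6" kernel here is how the FCN paper converts fc6 (my own conv6 below uses a smaller kernel instead):

# Why fc layers lock the input resolution: fc6's weight matrix is sized
# against the *flattened* pool5 blob, so any other input size breaks it.
pool5_c, pool5_h, pool5_w = 256, 6, 6   # pool5 shape for the standard input
fc6_out = 4096

fc6_weights = fc6_out * pool5_c * pool5_h * pool5_w
print("fc6 weights:", fc6_weights)      # 37748736, valid ONLY for 6*6*256 input

# A 6x6 convolution with 4096 outputs has exactly the same weight count,
# but it slides over any input of at least 6*6, so a larger image simply
# produces a larger output map instead of a shape error.
conv6_weights = 4096 * (256 * 6 * 6)
print("conv6 weights:", conv6_weights)  # identical count, no fixed input size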

  The output of pool5 is 6*6*256, and we want the final result to be 1*1*1000 (taking 1000 classes as an example). How can we bridge these two shapes with intermediate conv6, conv7, conv8 layers? Do we need to add pooling layers? How should the parameters of each layer be set? Does it really work?

  Let's do it now. We just add three convolution layers as an example. The formula that determines the output width*height*channel (actually only the width, since width == height here, and the channel count is simply each layer's num_output) is:

(W- F + 2P)/S + 1

where W is the width of the bottom layer's output, F is the size of the convolution kernel, P is the padding (mainly used to keep the input and output at the same resolution), and S is the stride.

  Thus, the following layers need to be added to the prototxt file:

    from 6*6*256 ---> 3*3*4096 ---> 1*1*4096 ---> 1*1*43 (taking my experiment, which has 43 outputs, as an example).
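
  As a sanity check, the formula is easy to script. This is a small sketch; the kernel_size/stride pairs are one consistent choice (my assumption, matching the prototxt below), and other pairs, e.g. kernel 4 with stride 1 for conv6, give the same 6 -> 3 step. Note that Caffe floors the division for convolution:

def conv_out(w, f, p=0, s=1):
    # output width of a convolution: (W - F + 2P) / S + 1, floored
    return (w - f + 2 * p) // s + 1

w = 6                       # pool5 output width (6*6*256)
w = conv_out(w, f=2, s=2)   # conv6: 6 -> 3    (3*3*4096)
w = conv_out(w, f=3, s=2)   # conv7: 3 -> 1    (1*1*4096)
w = conv_out(w, f=1, s=1)   # conv8: 1 -> 1    (1*1*43)
print(w)                    # 1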

####################################################################
## the output of pool5 is 6*6*256
## (num_output / kernel_size / stride below are example values I chose
##  to satisfy the size arithmetic above; other choices, e.g.
##  kernel_size: 4 with stride: 1 for conv6, also map 6 -> 3)
####################################################################
layer {
  name: "conv6"
  type: "Convolution"
  bottom: "pool5"   # must be pool5, NOT conv5 -- see the pitfall below
  top: "conv6"
  param {
    lr_mult: 1      # weights: learn at base_lr, with weight decay
    decay_mult: 1
  }
  param {
    lr_mult: 2      # biases: the usual 2x learning rate, no decay
    decay_mult: 0
  }
  convolution_param {
    num_output: 4096
    kernel_size: 2  # (6 - 2 + 0)/2 + 1 = 3  -->  3*3*4096
    stride: 2
    # group: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "conv6"
  top: "conv6"
}
layer {
  name: "conv7"
  type: "Convolution"
  bottom: "conv6"
  top: "conv7"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 4096
    kernel_size: 3  # (3 - 3 + 0)/2 + 1 = 1  -->  1*1*4096
    stride: 2
    # group: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "conv7"
  top: "conv7"
}
layer {
  name: "conv8"
  type: "Convolution"
  bottom: "conv7"
  top: "conv8"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 43  # one output per attribute in my experiment
    kernel_size: 1  # (1 - 1 + 0)/1 + 1 = 1  -->  1*1*43
    stride: 1
    # group: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu8"
  type: "ReLU"
  bottom: "conv8"
  top: "conv8"
}
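
  Before training, it can save time to check every blob shape programmatically instead of by eye. A minimal pycaffe sketch (assuming the net is saved as train_val.prototxt and the data sources it points to are reachable):

import caffe

caffe.set_mode_cpu()
net = caffe.Net('train_val.prototxt', caffe.TRAIN)
# conv8 should come out as (batch, 43, 1, 1) if the arithmetic is right
for name, blob in net.blobs.items():
    print(name, blob.data.shape)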

  Then save the prototxt, launch Caffe, and wait for something amazing to happen...

  Actually, at first I kept getting a wrong result, i.e. 2*2*43 ... It really confused me. Was the formula wrong? That made no sense, because it worked fine at the beginning of the network. Finally, I found I had made a stupid mistake: I had connected conv6 to conv5 instead of pool5. So it is really important to be careful, and then more careful.
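
  With the conv_out helper above, the wrong number is easy to reproduce: conv5's output is 13*13 (not 6*6), and starting the chain there gives exactly 2*2 under the same assumed kernels:

w = 13                      # conv5 output width -- the wrong bottom layer
w = conv_out(w, f=2, s=2)   # conv6: 13 -> 6
w = conv_out(w, f=3, s=2)   # conv7:  6 -> 2
w = conv_out(w, f=1, s=1)   # conv8:  2 -> 2   => the puzzling 2*2*43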

  OK, the whole pipeline is done. But since my Acer laptop only has a GTX 960M, it warned me that it was out of memory. The terminal output is below:

I0423 ::24.421512   caffe.cpp:] Using GPUs
I0423 ::24.431041 caffe.cpp:] GPU : GeForce GTX 960M
I0423 ::24.565281 solver.cpp:] Initializing solver from parameters:
test_iter:
test_interval:
base_lr: 0.001
display:
max_iter:
lr_policy: "step"
gamma: 0.1
momentum: 0.9
weight_decay: 0.0005
stepsize:
snapshot:
snapshot_prefix: "/media/wangxiao/Acer/caffe_models_/"
solver_mode: GPU
device_id:
net: "/home/wangxiao/Downloads/fcn-caffe-master/wangxiao/train_val.prototxt"
test_initialization: false
I0423 ::24.621829 solver.cpp:] Creating training net from net file: /home/wangxiao/Downloads/fcn-caffe-master/wangxiao/train_val.prototxt
I0423 ::24.622601 net.cpp:] The NetState phase () differed from the phase () specified by a rule in layer data
I0423 ::24.622632 net.cpp:] The NetState phase () differed from the phase () specified by a rule in layer accuracy
I0423 ::24.622828 net.cpp:] Initializing net from parameters:
name: "AlexNet"
state {
phase: TRAIN
}
layer {
name: "data"
type: "ImageData"
top: "data"
top: "label"
include {
phase: TRAIN
}
transform_param {
mirror: false
}
image_data_param {
source: "/media/wangxiao/247317a3-e6b5-45d4-81d1-956930526746/---------------/new_born_data/train_data/newAdd_attribute_label.txt"
batch_size:
root_folder: "/media/wangxiao/247317a3-e6b5-45d4-81d1-956930526746/---------------/new_born_data/train_data/227_227_images/"
}
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data"
top: "conv1"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
kernel_size:
stride:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value:
}
}
}
layer {
name: "relu1"
type: "ReLU"
bottom: "conv1"
top: "conv1"
}
layer {
name: "norm1"
type: "LRN"
bottom: "conv1"
top: "norm1"
lrn_param {
local_size:
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "pool1"
type: "Pooling"
bottom: "norm1"
top: "pool1"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "pool1"
top: "conv2"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
group:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu2"
type: "ReLU"
bottom: "conv2"
top: "conv2"
}
layer {
name: "norm2"
type: "LRN"
bottom: "conv2"
top: "norm2"
lrn_param {
local_size:
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "pool2"
type: "Pooling"
bottom: "norm2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "conv3"
type: "Convolution"
bottom: "pool2"
top: "conv3"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value:
}
}
}
layer {
name: "relu3"
type: "ReLU"
bottom: "conv3"
top: "conv3"
}
layer {
name: "conv4"
type: "Convolution"
bottom: "conv3"
top: "conv4"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
group:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu4"
type: "ReLU"
bottom: "conv4"
top: "conv4"
}
layer {
name: "conv5"
type: "Convolution"
bottom: "conv4"
top: "conv5"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
group:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu5"
type: "ReLU"
bottom: "conv5"
top: "conv5"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv5"
top: "pool5"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "conv6"
type: "Convolution"
bottom: "pool5"
top: "conv6"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
kernel_size:
stride:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "conv6"
top: "conv6"
}
layer {
name: "conv7"
type: "Convolution"
bottom: "conv6"
top: "conv7"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
kernel_size:
stride:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "conv7"
top: "conv7"
}
layer {
name: "conv8"
type: "Convolution"
bottom: "conv7"
top: "conv8"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
kernel_size:
stride:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu8"
type: "ReLU"
bottom: "conv8"
top: "conv8"
}
layer {
name: "sigmoid"
type: "Sigmoid"
bottom: "conv8"
top: "conv8"
}
layer {
name: "loss"
type: "EuclideanLoss"
bottom: "conv8"
bottom: "label"
top: "loss"
}
I0423 ::24.622962 layer_factory.hpp:] Creating layer data
I0423 ::24.623002 net.cpp:] Creating Layer data
I0423 ::24.623009 net.cpp:] data -> data
I0423 ::24.623034 net.cpp:] data -> label
I0423 ::24.623051 image_data_layer.cpp:] Opening file /media/wangxiao/247317a3-e6b5-45d4-81d1-/---------------/new_born_data/train_data/newAdd_attribute_label.txt
I0423 ::25.535037 image_data_layer.cpp:] A total of images.
I0423 ::25.543112 image_data_layer.cpp:] output data size: ,,,
I0423 ::25.554397 net.cpp:] Setting up data
I0423 ::25.554425 net.cpp:] Top shape: ()
I0423 ::25.554431 net.cpp:] Top shape: ()
I0423 ::25.554435 net.cpp:] Memory required for data:
I0423 ::25.554451 layer_factory.hpp:] Creating layer conv1
I0423 ::25.554476 net.cpp:] Creating Layer conv1
I0423 ::25.554481 net.cpp:] conv1 <- data
I0423 ::25.554492 net.cpp:] conv1 -> conv1
I0423 ::25.556519 net.cpp:] Setting up conv1
I0423 ::25.556534 net.cpp:] Top shape: ()
I0423 ::25.556537 net.cpp:] Memory required for data:
I0423 ::25.556556 layer_factory.hpp:] Creating layer relu1
I0423 ::25.556565 net.cpp:] Creating Layer relu1
I0423 ::25.556568 net.cpp:] relu1 <- conv1
I0423 ::25.556573 net.cpp:] relu1 -> conv1 (in-place)
I0423 ::25.556583 net.cpp:] Setting up relu1
I0423 ::25.556587 net.cpp:] Top shape: ()
I0423 ::25.556591 net.cpp:] Memory required for data:
I0423 ::25.556594 layer_factory.hpp:] Creating layer norm1
I0423 ::25.556602 net.cpp:] Creating Layer norm1
I0423 ::25.556604 net.cpp:] norm1 <- conv1
I0423 ::25.556609 net.cpp:] norm1 -> norm1
I0423 ::25.556646 net.cpp:] Setting up norm1
I0423 ::25.556653 net.cpp:] Top shape: ()
I0423 ::25.556689 net.cpp:] Memory required for data:
I0423 ::25.556692 layer_factory.hpp:] Creating layer pool1
I0423 ::25.556700 net.cpp:] Creating Layer pool1
I0423 ::25.556704 net.cpp:] pool1 <- norm1
I0423 ::25.556710 net.cpp:] pool1 -> pool1
I0423 ::25.556749 net.cpp:] Setting up pool1
I0423 ::25.556766 net.cpp:] Top shape: ()
I0423 ::25.556769 net.cpp:] Memory required for data:
I0423 ::25.556772 layer_factory.hpp:] Creating layer conv2
I0423 ::25.556792 net.cpp:] Creating Layer conv2
I0423 ::25.556795 net.cpp:] conv2 <- pool1
I0423 ::25.556802 net.cpp:] conv2 -> conv2
I0423 ::25.565610 net.cpp:] Setting up conv2
I0423 ::25.565634 net.cpp:] Top shape: ()
I0423 ::25.565637 net.cpp:] Memory required for data:
I0423 ::25.565651 layer_factory.hpp:] Creating layer relu2
I0423 ::25.565660 net.cpp:] Creating Layer relu2
I0423 ::25.565665 net.cpp:] relu2 <- conv2
I0423 ::25.565672 net.cpp:] relu2 -> conv2 (in-place)
I0423 ::25.565681 net.cpp:] Setting up relu2
I0423 ::25.565686 net.cpp:] Top shape: ()
I0423 ::25.565690 net.cpp:] Memory required for data:
I0423 ::25.565692 layer_factory.hpp:] Creating layer norm2
I0423 ::25.565699 net.cpp:] Creating Layer norm2
I0423 ::25.565702 net.cpp:] norm2 <- conv2
I0423 ::25.565708 net.cpp:] norm2 -> norm2
I0423 ::25.565742 net.cpp:] Setting up norm2
I0423 ::25.565747 net.cpp:] Top shape: ()
I0423 ::25.565750 net.cpp:] Memory required for data:
I0423 ::25.565753 layer_factory.hpp:] Creating layer pool2
I0423 ::25.565762 net.cpp:] Creating Layer pool2
I0423 ::25.565764 net.cpp:] pool2 <- norm2
I0423 ::25.565769 net.cpp:] pool2 -> pool2
I0423 ::25.565798 net.cpp:] Setting up pool2
I0423 ::25.565804 net.cpp:] Top shape: ()
I0423 ::25.565809 net.cpp:] Memory required for data:
I0423 ::25.565811 layer_factory.hpp:] Creating layer conv3
I0423 ::25.565821 net.cpp:] Creating Layer conv3
I0423 ::25.565824 net.cpp:] conv3 <- pool2
I0423 ::25.565831 net.cpp:] conv3 -> conv3
I0423 ::25.590066 net.cpp:] Setting up conv3
I0423 ::25.590090 net.cpp:] Top shape: ()
I0423 ::25.590092 net.cpp:] Memory required for data:
I0423 ::25.590116 layer_factory.hpp:] Creating layer relu3
I0423 ::25.590126 net.cpp:] Creating Layer relu3
I0423 ::25.590131 net.cpp:] relu3 <- conv3
I0423 ::25.590137 net.cpp:] relu3 -> conv3 (in-place)
I0423 ::25.590145 net.cpp:] Setting up relu3
I0423 ::25.590149 net.cpp:] Top shape: ()
I0423 ::25.590152 net.cpp:] Memory required for data:
I0423 ::25.590155 layer_factory.hpp:] Creating layer conv4
I0423 ::25.590167 net.cpp:] Creating Layer conv4
I0423 ::25.590169 net.cpp:] conv4 <- conv3
I0423 ::25.590176 net.cpp:] conv4 -> conv4
I0423 ::25.608953 net.cpp:] Setting up conv4
I0423 ::25.608975 net.cpp:] Top shape: ()
I0423 ::25.608979 net.cpp:] Memory required for data:
I0423 ::25.608989 layer_factory.hpp:] Creating layer relu4
I0423 ::25.609007 net.cpp:] Creating Layer relu4
I0423 ::25.609011 net.cpp:] relu4 <- conv4
I0423 ::25.609019 net.cpp:] relu4 -> conv4 (in-place)
I0423 ::25.609027 net.cpp:] Setting up relu4
I0423 ::25.609031 net.cpp:] Top shape: ()
I0423 ::25.609047 net.cpp:] Memory required for data:
I0423 ::25.609050 layer_factory.hpp:] Creating layer conv5
I0423 ::25.609061 net.cpp:] Creating Layer conv5
I0423 ::25.609066 net.cpp:] conv5 <- conv4
I0423 ::25.609071 net.cpp:] conv5 -> conv5
I0423 ::25.621208 net.cpp:] Setting up conv5
I0423 ::25.621229 net.cpp:] Top shape: ()
I0423 ::25.621233 net.cpp:] Memory required for data:
I0423 ::25.621258 layer_factory.hpp:] Creating layer relu5
I0423 ::25.621268 net.cpp:] Creating Layer relu5
I0423 ::25.621273 net.cpp:] relu5 <- conv5
I0423 ::25.621279 net.cpp:] relu5 -> conv5 (in-place)
I0423 ::25.621286 net.cpp:] Setting up relu5
I0423 ::25.621290 net.cpp:] Top shape: ()
I0423 ::25.621294 net.cpp:] Memory required for data:
I0423 ::25.621297 layer_factory.hpp:] Creating layer pool5
I0423 ::25.621304 net.cpp:] Creating Layer pool5
I0423 ::25.621306 net.cpp:] pool5 <- conv5
I0423 ::25.621314 net.cpp:] pool5 -> pool5
I0423 ::25.621347 net.cpp:] Setting up pool5
I0423 ::25.621354 net.cpp:] Top shape: ()
I0423 ::25.621357 net.cpp:] Memory required for data:
I0423 ::25.621361 layer_factory.hpp:] Creating layer conv6
I0423 ::25.621373 net.cpp:] Creating Layer conv6
I0423 ::25.621377 net.cpp:] conv6 <- pool5
I0423 ::25.621384 net.cpp:] conv6 -> conv6
I0423 ::25.731640 net.cpp:] Setting up conv6
I0423 ::25.731675 net.cpp:] Top shape: ()
I0423 ::25.731679 net.cpp:] Memory required for data:
I0423 ::25.731688 layer_factory.hpp:] Creating layer relu6
I0423 ::25.731709 net.cpp:] Creating Layer relu6
I0423 ::25.731714 net.cpp:] relu6 <- conv6
I0423 ::25.731721 net.cpp:] relu6 -> conv6 (in-place)
I0423 ::25.731731 net.cpp:] Setting up relu6
I0423 ::25.731735 net.cpp:] Top shape: ()
I0423 ::25.731739 net.cpp:] Memory required for data:
I0423 ::25.731741 layer_factory.hpp:] Creating layer conv7
I0423 ::25.731752 net.cpp:] Creating Layer conv7
I0423 ::25.731757 net.cpp:] conv7 <- conv6
I0423 ::25.731765 net.cpp:] conv7 -> conv7
I0423 ::29.661667 net.cpp:] Setting up conv7
I0423 ::29.661705 net.cpp:] Top shape: ()
I0423 ::29.661710 net.cpp:] Memory required for data:
I0423 ::29.661720 layer_factory.hpp:] Creating layer relu7
I0423 ::29.661741 net.cpp:] Creating Layer relu7
I0423 ::29.661746 net.cpp:] relu7 <- conv7
I0423 ::29.661752 net.cpp:] relu7 -> conv7 (in-place)
I0423 ::29.661761 net.cpp:] Setting up relu7
I0423 ::29.661767 net.cpp:] Top shape: ()
I0423 ::29.661769 net.cpp:] Memory required for data:
I0423 ::29.661772 layer_factory.hpp:] Creating layer conv8
I0423 ::29.661783 net.cpp:] Creating Layer conv8
I0423 ::29.661788 net.cpp:] conv8 <- conv7
I0423 ::29.661795 net.cpp:] conv8 -> conv8
I0423 ::29.666793 net.cpp:] Setting up conv8
I0423 ::29.666815 net.cpp:] Top shape: ()
I0423 ::29.666818 net.cpp:] Memory required for data:
I0423 ::29.666826 layer_factory.hpp:] Creating layer relu8
I0423 ::29.666841 net.cpp:] Creating Layer relu8
I0423 ::29.666844 net.cpp:] relu8 <- conv8
I0423 ::29.666849 net.cpp:] relu8 -> conv8 (in-place)
I0423 ::29.666856 net.cpp:] Setting up relu8
I0423 ::29.666860 net.cpp:] Top shape: ()
I0423 ::29.666877 net.cpp:] Memory required for data:
I0423 ::29.666882 layer_factory.hpp:] Creating layer sigmoid
I0423 ::29.666888 net.cpp:] Creating Layer sigmoid
I0423 ::29.666892 net.cpp:] sigmoid <- conv8
I0423 ::29.666895 net.cpp:] sigmoid -> conv8 (in-place)
I0423 ::29.666901 net.cpp:] Setting up sigmoid
I0423 ::29.666905 net.cpp:] Top shape: ()
I0423 ::29.666908 net.cpp:] Memory required for data:
I0423 ::29.666911 layer_factory.hpp:] Creating layer loss
I0423 ::29.666918 net.cpp:] Creating Layer loss
I0423 ::29.666920 net.cpp:] loss <- conv8
I0423 ::29.666924 net.cpp:] loss <- label
I0423 ::29.666931 net.cpp:] loss -> loss
I0423 ::29.666975 net.cpp:] Setting up loss
I0423 ::29.666990 net.cpp:] Top shape: ()
I0423 ::29.666992 net.cpp:] with loss weight
I0423 ::29.667017 net.cpp:] Memory required for data:
I0423 ::29.667031 net.cpp:] loss needs backward computation.
I0423 ::29.667034 net.cpp:] sigmoid needs backward computation.
I0423 ::29.667038 net.cpp:] relu8 needs backward computation.
I0423 ::29.667040 net.cpp:] conv8 needs backward computation.
I0423 ::29.667043 net.cpp:] relu7 needs backward computation.
I0423 ::29.667047 net.cpp:] conv7 needs backward computation.
I0423 ::29.667050 net.cpp:] relu6 needs backward computation.
I0423 ::29.667053 net.cpp:] conv6 needs backward computation.
I0423 ::29.667057 net.cpp:] pool5 needs backward computation.
I0423 ::29.667060 net.cpp:] relu5 needs backward computation.
I0423 ::29.667063 net.cpp:] conv5 needs backward computation.
I0423 ::29.667068 net.cpp:] relu4 needs backward computation.
I0423 ::29.667070 net.cpp:] conv4 needs backward computation.
I0423 ::29.667073 net.cpp:] relu3 needs backward computation.
I0423 ::29.667076 net.cpp:] conv3 needs backward computation.
I0423 ::29.667080 net.cpp:] pool2 needs backward computation.
I0423 ::29.667084 net.cpp:] norm2 needs backward computation.
I0423 ::29.667088 net.cpp:] relu2 needs backward computation.
I0423 ::29.667091 net.cpp:] conv2 needs backward computation.
I0423 ::29.667094 net.cpp:] pool1 needs backward computation.
I0423 ::29.667098 net.cpp:] norm1 needs backward computation.
I0423 ::29.667101 net.cpp:] relu1 needs backward computation.
I0423 ::29.667104 net.cpp:] conv1 needs backward computation.
I0423 ::29.667109 net.cpp:] data does not need backward computation.
I0423 ::29.667111 net.cpp:] This network produces output loss
I0423 ::29.667127 net.cpp:] Network initialization done.
I0423 ::29.667804 solver.cpp:] Creating test net (#) specified by net file: /home/wangxiao/Downloads/fcn-caffe-master/wangxiao/train_val.prototxt
I0423 ::29.667937 net.cpp:] The NetState phase () differed from the phase () specified by a rule in layer data
I0423 ::29.668148 net.cpp:] Initializing net from parameters:
name: "AlexNet"
state {
phase: TEST
}
layer {
name: "data"
type: "ImageData"
top: "data"
top: "label"
include {
phase: TEST
}
transform_param {
mirror: false
}
image_data_param {
source: "/media/wangxiao/247317a3-e6b5-45d4-81d1-956930526746/---------------/new_born_data/test_data/newAdd_attribute_label_test.txt"
batch_size:
root_folder: "/media/wangxiao/247317a3-e6b5-45d4-81d1-956930526746/---------------/new_born_data/test_data/227_227_test_images/"
}
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data"
top: "conv1"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
kernel_size:
stride:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value:
}
}
}
layer {
name: "relu1"
type: "ReLU"
bottom: "conv1"
top: "conv1"
}
layer {
name: "norm1"
type: "LRN"
bottom: "conv1"
top: "norm1"
lrn_param {
local_size:
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "pool1"
type: "Pooling"
bottom: "norm1"
top: "pool1"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "pool1"
top: "conv2"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
group:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu2"
type: "ReLU"
bottom: "conv2"
top: "conv2"
}
layer {
name: "norm2"
type: "LRN"
bottom: "conv2"
top: "norm2"
lrn_param {
local_size:
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "pool2"
type: "Pooling"
bottom: "norm2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "conv3"
type: "Convolution"
bottom: "pool2"
top: "conv3"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value:
}
}
}
layer {
name: "relu3"
type: "ReLU"
bottom: "conv3"
top: "conv3"
}
layer {
name: "conv4"
type: "Convolution"
bottom: "conv3"
top: "conv4"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
group:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu4"
type: "ReLU"
bottom: "conv4"
top: "conv4"
}
layer {
name: "conv5"
type: "Convolution"
bottom: "conv4"
top: "conv5"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
pad:
kernel_size:
group:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu5"
type: "ReLU"
bottom: "conv5"
top: "conv5"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv5"
top: "pool5"
pooling_param {
pool: MAX
kernel_size:
stride:
}
}
layer {
name: "conv6"
type: "Convolution"
bottom: "pool5"
top: "conv6"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
kernel_size:
stride:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "conv6"
top: "conv6"
}
layer {
name: "conv7"
type: "Convolution"
bottom: "conv6"
top: "conv7"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
kernel_size:
stride:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "conv7"
top: "conv7"
}
layer {
name: "conv8"
type: "Convolution"
bottom: "conv7"
top: "conv8"
param {
lr_mult:
decay_mult:
}
param {
lr_mult:
decay_mult:
}
convolution_param {
num_output:
kernel_size:
stride:
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu8"
type: "ReLU"
bottom: "conv8"
top: "conv8"
}
layer {
name: "sigmoid"
type: "Sigmoid"
bottom: "conv8"
top: "conv8"
}
layer {
name: "accuracy"
type: "Accuracy"
bottom: "conv8"
bottom: "label"
top: "accuracy"
include {
phase: TEST
}
}
layer {
name: "loss"
type: "EuclideanLoss"
bottom: "conv8"
bottom: "label"
top: "loss"
}
I0423 ::29.668323 layer_factory.hpp:] Creating layer data
I0423 ::29.668349 net.cpp:] Creating Layer data
I0423 ::29.668355 net.cpp:] data -> data
I0423 ::29.668373 net.cpp:] data -> label
I0423 ::29.668382 image_data_layer.cpp:] Opening file /media/wangxiao/247317a3-e6b5-45d4-81d1-/---------------/new_born_data/test_data/newAdd_attribute_label_test.txt
I0423 ::29.696005 image_data_layer.cpp:] A total of images.
I0423 ::29.697830 image_data_layer.cpp:] output data size: ,,,
I0423 ::29.699980 net.cpp:] Setting up data
I0423 ::29.700013 net.cpp:] Top shape: ()
I0423 ::29.700019 net.cpp:] Top shape: ()
I0423 ::29.700022 net.cpp:] Memory required for data:
I0423 ::29.700028 layer_factory.hpp:] Creating layer label_data_1_split
I0423 ::29.700040 net.cpp:] Creating Layer label_data_1_split
I0423 ::29.700048 net.cpp:] label_data_1_split <- label
I0423 ::29.700060 net.cpp:] label_data_1_split -> label_data_1_split_0
I0423 ::29.700075 net.cpp:] label_data_1_split -> label_data_1_split_1
I0423 ::29.700141 net.cpp:] Setting up label_data_1_split
I0423 ::29.700151 net.cpp:] Top shape: ()
I0423 ::29.700160 net.cpp:] Top shape: ()
I0423 ::29.700176 net.cpp:] Memory required for data:
I0423 ::29.700181 layer_factory.hpp:] Creating layer conv1
I0423 ::29.700196 net.cpp:] Creating Layer conv1
I0423 ::29.700199 net.cpp:] conv1 <- data
I0423 ::29.700206 net.cpp:] conv1 -> conv1
I0423 ::29.701347 net.cpp:] Setting up conv1
I0423 ::29.701369 net.cpp:] Top shape: ()
I0423 ::29.701372 net.cpp:] Memory required for data:
I0423 ::29.701383 layer_factory.hpp:] Creating layer relu1
I0423 ::29.701390 net.cpp:] Creating Layer relu1
I0423 ::29.701395 net.cpp:] relu1 <- conv1
I0423 ::29.701400 net.cpp:] relu1 -> conv1 (in-place)
I0423 ::29.701406 net.cpp:] Setting up relu1
I0423 ::29.701412 net.cpp:] Top shape: ()
I0423 ::29.701416 net.cpp:] Memory required for data:
I0423 ::29.701418 layer_factory.hpp:] Creating layer norm1
I0423 ::29.701426 net.cpp:] Creating Layer norm1
I0423 ::29.701429 net.cpp:] norm1 <- conv1
I0423 ::29.701434 net.cpp:] norm1 -> norm1
I0423 ::29.701464 net.cpp:] Setting up norm1
I0423 ::29.701479 net.cpp:] Top shape: ()
I0423 ::29.701483 net.cpp:] Memory required for data:
I0423 ::29.701486 layer_factory.hpp:] Creating layer pool1
I0423 ::29.701503 net.cpp:] Creating Layer pool1
I0423 ::29.701505 net.cpp:] pool1 <- norm1
I0423 ::29.701510 net.cpp:] pool1 -> pool1
I0423 ::29.701537 net.cpp:] Setting up pool1
I0423 ::29.701544 net.cpp:] Top shape: ()
I0423 ::29.701545 net.cpp:] Memory required for data:
I0423 ::29.701550 layer_factory.hpp:] Creating layer conv2
I0423 ::29.701557 net.cpp:] Creating Layer conv2
I0423 ::29.701561 net.cpp:] conv2 <- pool1
I0423 ::29.701566 net.cpp:] conv2 -> conv2
I0423 ::29.709951 net.cpp:] Setting up conv2
I0423 ::29.709987 net.cpp:] Top shape: ()
I0423 ::29.709992 net.cpp:] Memory required for data:
I0423 ::29.710005 layer_factory.hpp:] Creating layer relu2
I0423 ::29.710014 net.cpp:] Creating Layer relu2
I0423 ::29.710018 net.cpp:] relu2 <- conv2
I0423 ::29.710026 net.cpp:] relu2 -> conv2 (in-place)
I0423 ::29.710033 net.cpp:] Setting up relu2
I0423 ::29.710039 net.cpp:] Top shape: ()
I0423 ::29.710042 net.cpp:] Memory required for data:
I0423 ::29.710046 layer_factory.hpp:] Creating layer norm2
I0423 ::29.710057 net.cpp:] Creating Layer norm2
I0423 ::29.710060 net.cpp:] norm2 <- conv2
I0423 ::29.710067 net.cpp:] norm2 -> norm2
I0423 ::29.710100 net.cpp:] Setting up norm2
I0423 ::29.710108 net.cpp:] Top shape: ()
I0423 ::29.710110 net.cpp:] Memory required for data:
I0423 ::29.710114 layer_factory.hpp:] Creating layer pool2
I0423 ::29.710120 net.cpp:] Creating Layer pool2
I0423 ::29.710124 net.cpp:] pool2 <- norm2
I0423 ::29.710129 net.cpp:] pool2 -> pool2
I0423 ::29.710155 net.cpp:] Setting up pool2
I0423 ::29.710171 net.cpp:] Top shape: ()
I0423 ::29.710175 net.cpp:] Memory required for data:
I0423 ::29.710187 layer_factory.hpp:] Creating layer conv3
I0423 ::29.710197 net.cpp:] Creating Layer conv3
I0423 ::29.710201 net.cpp:] conv3 <- pool2
I0423 ::29.710207 net.cpp:] conv3 -> conv3
I0423 ::29.733366 net.cpp:] Setting up conv3
I0423 ::29.733403 net.cpp:] Top shape: ()
I0423 ::29.733407 net.cpp:] Memory required for data:
I0423 ::29.733420 layer_factory.hpp:] Creating layer relu3
I0423 ::29.733439 net.cpp:] Creating Layer relu3
I0423 ::29.733444 net.cpp:] relu3 <- conv3
I0423 ::29.733453 net.cpp:] relu3 -> conv3 (in-place)
I0423 ::29.733461 net.cpp:] Setting up relu3
I0423 ::29.733466 net.cpp:] Top shape: ()
I0423 ::29.733469 net.cpp:] Memory required for data:
I0423 ::29.733472 layer_factory.hpp:] Creating layer conv4
I0423 ::29.733484 net.cpp:] Creating Layer conv4
I0423 ::29.733489 net.cpp:] conv4 <- conv3
I0423 ::29.733494 net.cpp:] conv4 -> conv4
I0423 ::29.750310 net.cpp:] Setting up conv4
I0423 ::29.750344 net.cpp:] Top shape: ()
I0423 ::29.750349 net.cpp:] Memory required for data:
I0423 ::29.750357 layer_factory.hpp:] Creating layer relu4
I0423 ::29.750366 net.cpp:] Creating Layer relu4
I0423 ::29.750370 net.cpp:] relu4 <- conv4
I0423 ::29.750376 net.cpp:] relu4 -> conv4 (in-place)
I0423 ::29.750393 net.cpp:] Setting up relu4
I0423 ::29.750397 net.cpp:] Top shape: ()
I0423 ::29.750401 net.cpp:] Memory required for data:
I0423 ::29.750403 layer_factory.hpp:] Creating layer conv5
I0423 ::29.750414 net.cpp:] Creating Layer conv5
I0423 ::29.750418 net.cpp:] conv5 <- conv4
I0423 ::29.750423 net.cpp:] conv5 -> conv5
I0423 ::29.762544 net.cpp:] Setting up conv5
I0423 ::29.762580 net.cpp:] Top shape: ()
I0423 ::29.762584 net.cpp:] Memory required for data:
I0423 ::29.762598 layer_factory.hpp:] Creating layer relu5
I0423 ::29.762609 net.cpp:] Creating Layer relu5
I0423 ::29.762614 net.cpp:] relu5 <- conv5
I0423 ::29.762619 net.cpp:] relu5 -> conv5 (in-place)
I0423 ::29.762629 net.cpp:] Setting up relu5
I0423 ::29.762646 net.cpp:] Top shape: ()
I0423 ::29.762650 net.cpp:] Memory required for data:
I0423 ::29.762653 layer_factory.hpp:] Creating layer pool5
I0423 ::29.762662 net.cpp:] Creating Layer pool5
I0423 ::29.762665 net.cpp:] pool5 <- conv5
I0423 ::29.762671 net.cpp:] pool5 -> pool5
I0423 ::29.762707 net.cpp:] Setting up pool5
I0423 ::29.762724 net.cpp:] Top shape: ()
I0423 ::29.762727 net.cpp:] Memory required for data:
I0423 ::29.762740 layer_factory.hpp:] Creating layer conv6
I0423 ::29.762753 net.cpp:] Creating Layer conv6
I0423 ::29.762755 net.cpp:] conv6 <- pool5
I0423 ::29.762761 net.cpp:] conv6 -> conv6
I0423 ::29.868270 net.cpp:] Setting up conv6
I0423 ::29.868306 net.cpp:] Top shape: ()
I0423 ::29.868311 net.cpp:] Memory required for data:
I0423 ::29.868320 layer_factory.hpp:] Creating layer relu6
I0423 ::29.868330 net.cpp:] Creating Layer relu6
I0423 ::29.868335 net.cpp:] relu6 <- conv6
I0423 ::29.868342 net.cpp:] relu6 -> conv6 (in-place)
I0423 ::29.868350 net.cpp:] Setting up relu6
I0423 ::29.868355 net.cpp:] Top shape: ()
I0423 ::29.868358 net.cpp:] Memory required for data:
I0423 ::29.868361 layer_factory.hpp:] Creating layer conv7
I0423 ::29.868372 net.cpp:] Creating Layer conv7
I0423 ::29.868376 net.cpp:] conv7 <- conv6
I0423 ::29.868381 net.cpp:] conv7 -> conv7
I0423 ::33.773138 net.cpp:] Setting up conv7
I0423 ::33.773177 net.cpp:] Top shape: ()
I0423 ::33.773182 net.cpp:] Memory required for data:
I0423 ::33.773192 layer_factory.hpp:] Creating layer relu7
I0423 ::33.773203 net.cpp:] Creating Layer relu7
I0423 ::33.773219 net.cpp:] relu7 <- conv7
I0423 ::33.773232 net.cpp:] relu7 -> conv7 (in-place)
I0423 ::33.773247 net.cpp:] Setting up relu7
I0423 ::33.773257 net.cpp:] Top shape: ()
I0423 ::33.773265 net.cpp:] Memory required for data:
I0423 ::33.773269 layer_factory.hpp:] Creating layer conv8
I0423 ::33.773283 net.cpp:] Creating Layer conv8
I0423 ::33.773286 net.cpp:] conv8 <- conv7
I0423 ::33.773293 net.cpp:] conv8 -> conv8
I0423 ::33.778169 net.cpp:] Setting up conv8
I0423 ::33.778193 net.cpp:] Top shape: ()
I0423 ::33.778198 net.cpp:] Memory required for data:
I0423 ::33.778203 layer_factory.hpp:] Creating layer relu8
I0423 ::33.778221 net.cpp:] Creating Layer relu8
I0423 ::33.778226 net.cpp:] relu8 <- conv8
I0423 ::33.778233 net.cpp:] relu8 -> conv8 (in-place)
I0423 ::33.778239 net.cpp:] Setting up relu8
I0423 ::33.778244 net.cpp:] Top shape: ()
I0423 ::33.778246 net.cpp:] Memory required for data:
I0423 ::33.778249 layer_factory.hpp:] Creating layer sigmoid
I0423 ::33.778255 net.cpp:] Creating Layer sigmoid
I0423 ::33.778260 net.cpp:] sigmoid <- conv8
I0423 ::33.778265 net.cpp:] sigmoid -> conv8 (in-place)
I0423 ::33.778270 net.cpp:] Setting up sigmoid
I0423 ::33.778275 net.cpp:] Top shape: ()
I0423 ::33.778277 net.cpp:] Memory required for data:
I0423 ::33.778295 layer_factory.hpp:] Creating layer conv8_sigmoid_0_split
I0423 ::33.778301 net.cpp:] Creating Layer conv8_sigmoid_0_split
I0423 ::33.778303 net.cpp:] conv8_sigmoid_0_split <- conv8
I0423 ::33.778318 net.cpp:] conv8_sigmoid_0_split -> conv8_sigmoid_0_split_0
I0423 ::33.778339 net.cpp:] conv8_sigmoid_0_split -> conv8_sigmoid_0_split_1
I0423 ::33.778373 net.cpp:] Setting up conv8_sigmoid_0_split
I0423 ::33.778389 net.cpp:] Top shape: ()
I0423 ::33.778393 net.cpp:] Top shape: ()
I0423 ::33.778408 net.cpp:] Memory required for data:
I0423 ::33.778411 layer_factory.hpp:] Creating layer accuracy
I0423 ::33.778419 net.cpp:] Creating Layer accuracy
I0423 ::33.778422 net.cpp:] accuracy <- conv8_sigmoid_0_split_0
I0423 ::33.778426 net.cpp:] accuracy <- label_data_1_split_0
I0423 ::33.778432 net.cpp:] accuracy -> accuracy
I0423 ::33.778439 net.cpp:] Setting up accuracy
I0423 ::33.778446 net.cpp:] Top shape: ()
I0423 ::33.778452 net.cpp:] Memory required for data:
I0423 ::33.778457 layer_factory.hpp:] Creating layer loss
I0423 ::33.778477 net.cpp:] Creating Layer loss
I0423 ::33.778496 net.cpp:] loss <- conv8_sigmoid_0_split_1
I0423 ::33.778503 net.cpp:] loss <- label_data_1_split_1
I0423 ::33.778513 net.cpp:] loss -> loss
I0423 ::33.778563 net.cpp:] Setting up loss
I0423 ::33.778573 net.cpp:] Top shape: ()
I0423 ::33.778578 net.cpp:] with loss weight
I0423 ::33.778602 net.cpp:] Memory required for data:
I0423 ::33.778609 net.cpp:] loss needs backward computation.
I0423 ::33.778616 net.cpp:] accuracy does not need backward computation.
I0423 ::33.778621 net.cpp:] conv8_sigmoid_0_split needs backward computation.
I0423 ::33.778625 net.cpp:] sigmoid needs backward computation.
I0423 ::33.778627 net.cpp:] relu8 needs backward computation.
I0423 ::33.778630 net.cpp:] conv8 needs backward computation.
I0423 ::33.778633 net.cpp:] relu7 needs backward computation.
I0423 ::33.778636 net.cpp:] conv7 needs backward computation.
I0423 ::33.778640 net.cpp:] relu6 needs backward computation.
I0423 ::33.778642 net.cpp:] conv6 needs backward computation.
I0423 ::33.778646 net.cpp:] pool5 needs backward computation.
I0423 ::33.778651 net.cpp:] relu5 needs backward computation.
I0423 ::33.778655 net.cpp:] conv5 needs backward computation.
I0423 ::33.778657 net.cpp:] relu4 needs backward computation.
I0423 ::33.778661 net.cpp:] conv4 needs backward computation.
I0423 ::33.778664 net.cpp:] relu3 needs backward computation.
I0423 ::33.778666 net.cpp:] conv3 needs backward computation.
I0423 ::33.778671 net.cpp:] pool2 needs backward computation.
I0423 ::33.778673 net.cpp:] norm2 needs backward computation.
I0423 ::33.778677 net.cpp:] relu2 needs backward computation.
I0423 ::33.778681 net.cpp:] conv2 needs backward computation.
I0423 ::33.778684 net.cpp:] pool1 needs backward computation.
I0423 ::33.778687 net.cpp:] norm1 needs backward computation.
I0423 ::33.778692 net.cpp:] relu1 needs backward computation.
I0423 ::33.778694 net.cpp:] conv1 needs backward computation.
I0423 ::33.778698 net.cpp:] label_data_1_split does not need backward computation.
I0423 ::33.778702 net.cpp:] data does not need backward computation.
I0423 ::33.778705 net.cpp:] This network produces output accuracy
I0423 ::33.778709 net.cpp:] This network produces output loss
I0423 ::33.778728 net.cpp:] Network initialization done.
I0423 ::33.778976 solver.cpp:] Solver scaffolding done.
I0423 ::33.779458 caffe.cpp:] Finetuning from /home/wangxiao/Downloads/fcn-caffe-master/wangxiao/bvlc_alexnet.caffemodel
I0423 ::34.067591 upgrade_proto.cpp:] Attempting to upgrade input file specified using deprecated transformation parameters: /home/wangxiao/Downloads/fcn-caffe-master/wangxiao/bvlc_alexnet.caffemodel
I0423 ::34.067654 upgrade_proto.cpp:] Successfully upgraded file specified using deprecated data transformation parameters.
W0423 ::34.067659 upgrade_proto.cpp:] Note that future Caffe releases will only support transform_param messages for transformation fields.
I0423 ::34.067752 upgrade_proto.cpp:] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/wangxiao/Downloads/fcn-caffe-master/wangxiao/bvlc_alexnet.caffemodel
I0423 ::34.193063 upgrade_proto.cpp:] Successfully upgraded file specified using deprecated V1LayerParameter
I0423 ::34.196166 net.cpp:] Ignoring source layer fc6
I0423 ::34.196195 net.cpp:] Ignoring source layer drop6
I0423 ::34.196199 net.cpp:] Ignoring source layer fc7
I0423 ::34.196203 net.cpp:] Ignoring source layer drop7
I0423 ::34.196207 net.cpp:] Ignoring source layer fc8
I0423 ::34.491250 upgrade_proto.cpp:] Attempting to upgrade input file specified using deprecated transformation parameters: /home/wangxiao/Downloads/fcn-caffe-master/wangxiao/bvlc_alexnet.caffemodel
I0423 ::34.491279 upgrade_proto.cpp:] Successfully upgraded file specified using deprecated data transformation parameters.
W0423 ::34.491284 upgrade_proto.cpp:] Note that future Caffe releases will only support transform_param messages for transformation fields.
I0423 ::34.491298 upgrade_proto.cpp:] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/wangxiao/Downloads/fcn-caffe-master/wangxiao/bvlc_alexnet.caffemodel
I0423 ::34.615309 upgrade_proto.cpp:] Successfully upgraded file specified using deprecated V1LayerParameter
I0423 ::34.617781 net.cpp:] Ignoring source layer fc6
I0423 ::34.617805 net.cpp:] Ignoring source layer drop6
I0423 ::34.617808 net.cpp:] Ignoring source layer fc7
I0423 ::34.617812 net.cpp:] Ignoring source layer drop7
I0423 ::34.617815 net.cpp:] Ignoring source layer fc8
I0423 ::34.619755 caffe.cpp:] Starting Optimization
I0423 ::34.619771 solver.cpp:] Solving AlexNet
I0423 ::34.619776 solver.cpp:] Learning Rate Policy: step
I0423 ::35.070583 solver.cpp:] Iteration , loss = 7.51117
I0423 ::35.070628 sgd_solver.cpp:] Iteration , lr = 0.001
F0423 ::35.071538 syncedmem.cpp:] Check failed: error == cudaSuccess ( vs. ) out of memory
*** Check failure stack trace: ***
@ 0x7f3d97747daa (unknown)
@ 0x7f3d97747ce4 (unknown)
@ 0x7f3d977476e6 (unknown)
@ 0x7f3d9774a687 (unknown)
@ 0x7f3d97e0fbd1 caffe::SyncedMemory::to_gpu()
@ 0x7f3d97e0ef39 caffe::SyncedMemory::mutable_gpu_data()
@ 0x7f3d97e76c02 caffe::Blob<>::mutable_gpu_data()
@ 0x7f3d97e8857c caffe::SGDSolver<>::ComputeUpdateValue()
@ 0x7f3d97e88f73 caffe::SGDSolver<>::ApplyUpdate()
@ 0x7f3d97e2827c caffe::Solver<>::Step()
@ 0x7f3d97e288c9 caffe::Solver<>::Solve()
@ 0x408abe train()
@ 0x405f8c main
@ 0x7f3d96a55ec5 (unknown)
@ 0x4066c1 (unknown)
@ (nil) (unknown)
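
  One more thing worth noticing in the log above: the "Ignoring source layer fc6 / fc7 / fc8" lines mean the pretrained fc weights are simply thrown away, so conv6-conv8 start from the random gaussian filler. The FCN paper instead transplants the fc weights into the new conv layers ("net surgery"). Here is a minimal sketch of that idea; the deploy prototxt names are my assumptions, and it only applies directly if conv6/conv7 are shaped to match fc6/fc7 (6*6 and 1*1 kernels with 4096 outputs), not the smaller kernels I chose above:

import caffe

# net with fc layers, and the fully convolutional net to fill in
fc_net  = caffe.Net('alexnet_deploy.prototxt', 'bvlc_alexnet.caffemodel', caffe.TEST)
fcn_net = caffe.Net('alexnet_fcn_deploy.prototxt', 'bvlc_alexnet.caffemodel', caffe.TEST)

for fc, conv in [('fc6', 'conv6'), ('fc7', 'conv7')]:
    w, b = fc_net.params[fc]
    # pour the flat fc weight matrix into the 4-D conv filter bank
    fcn_net.params[conv][0].data[...] = w.data.reshape(fcn_net.params[conv][0].data.shape)
    fcn_net.params[conv][1].data[...] = b.data

fcn_net.save('alexnet_fcn.caffemodel')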

 

  Later, we will focus on how to localize the target object and visualize the features from each convolution layer.

  Waiting and Continuing ...

  All right, the terminal showed me this. Oh my god ... Wrong! Wrong! Wrong!!!

  The loss = nan. How is that even possible???

[Screenshot: training log showing loss = nan]

The culprit was base_lr = 0.001; after changing it to base_lr = 0.000001, the loss became normal.
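
  A quick way to catch this kind of divergence early is to drive the solver from Python and watch the loss (a sketch; 'solver.prototxt' is assumed to be the solver file shown in the log above):

import numpy as np
import caffe

solver = caffe.SGDSolver('solver.prototxt')
solver.net.copy_from('bvlc_alexnet.caffemodel')

for it in range(200):
    solver.step(1)
    loss = float(solver.net.blobs['loss'].data)
    if not np.isfinite(loss):   # nan or inf: lower base_lr and retry
        print('diverged at iteration', it)
        break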

[Screenshot: training log with the loss back to normal]
