I designed a network identical to FCN. The input data is 1*224*224 and the input label is 1*224*224, but I ran into the following error:
F0502 07:57:30.032742 18127 softmax_loss_layer.cpp:47] Check failed: outer_num_ * inner_num_ == bottom[1]->count() (50176 vs. 1) Number of labels must match number of predictions; e.g., if softmax axis == 1 and prediction shape is (N, C, H, W), label count (number of labels) must be N*H*W, with integer values in {0, 1, ..., C-1}.
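For context, the two numbers in the check come from the prediction shape versus the label blob; a quick sanity check (a sketch, using the shapes from the question):

    # Why the check fails: SoftmaxWithLoss expects one integer label per
    # spatial position of the prediction, i.e. N * H * W labels.
    N, H, W = 1, 224, 224
    print(N * H * W)  # 50176 -- the expected label count (outer_num_ * inner_num_)
    # The ImageData layer supplied a single scalar label per image instead,
    # so bottom[1]->count() is 1 -- hence "(50176 vs. 1)".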
This is the input layer:
layer {
  name: "data"
  type: "ImageData"
  top: "data"
  top: "label"
  image_data_param {
    source: "/home/zhaimo/fcn-master/mo/train.txt"
    batch_size: 1
    shuffle: true
  }
}
This is the softmax loss layer:
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "upscore1"
  bottom: "label"
  top: "loss"
  loss_param {
    ignore_label: 255
    normalize: false
  }
}
This is the content of the train.txt file:
/home/zhaimo/fcn-master/data/vessel/train/original/01.png /home/zhaimo/SegNet/data/vessel/train/label/01.png
/home/zhaimo/fcn-master/data/vessel/train/original/02.png /home/zhaimo/SegNet/data/vessel/train/label/02.png
/home/zhaimo/fcn-master/data/vessel/train/original/03.png /home/zhaimo/SegNet/data/vessel/train/label/03.png
/home/zhaimo/fcn-master/data/vessel/train/original/04.png /home/zhaimo/SegNet/data/vessel/train/label/04.png
The first filename on each line is the input data and the second is its label.
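Note that, as far as I know, the ImageData layer does not read the second column as an image path: it splits each line on the last space and converts the trailing token to an integer class label. A rough sketch of that parsing (the atoi behavior is an assumption based on Caffe's image_data_layer.cpp):

    # Rough sketch of how an ImageData source line is parsed: the layer keeps
    # everything before the last space as the image path and converts the rest
    # to an integer label (roughly C's atoi, which silently maps a path to 0).
    line = ("/home/zhaimo/fcn-master/data/vessel/train/original/01.png "
            "/home/zhaimo/SegNet/data/vessel/train/label/01.png")
    path, label_token = line.rsplit(" ", 1)
    # atoi("/home/...") == 0, so every image gets one scalar label, never a
    # 1*224*224 label map -- which is exactly the (50176 vs. 1) mismatch above.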
=========================== UPDATE =======================================
I tried using two ImageData layers as input:
layer {
  name: "data"
  type: "ImageData"
  top: "data"
  image_data_param {
    source: "/home/zhaimo/fcn-master/mo/train_o.txt"
    batch_size: 1
    shuffle: false
  }
}
layer {
  name: "label"
  type: "ImageData"
  top: "label"
  image_data_param {
    source: "/home/zhaimo/fcn-master/mo/train_l.txt"
    batch_size: 1
    shuffle: false
  }
}
But I ran into another error:
I0502 08:34:46.429774 19100 layer_factory.hpp:77] Creating layer data
I0502 08:34:46.429808 19100 net.cpp:100] Creating Layer data
I0502 08:34:46.429816 19100 net.cpp:408] data -> data
F0502 08:34:46.429834 19100 layer.hpp:389] Check failed: ExactNumTopBlobs() == top.size() (2 vs. 1) ImageData Layer produces 2 top blob(s) as output.
*** Check failure stack trace: ***
Aborted (core dumped)
train_o.txt:
/home/zhaimo/fcn-master/data/vessel/train/original/01.png
/home/zhaimo/fcn-master/data/vessel/train/original/02.png
/home/zhaimo/fcn-master/data/vessel/train/original/03.png
/home/zhaimo/fcn-master/data/vessel/train/original/04.png
/home/zhaimo/fcn-master/data/vessel/train/original/05.png
train_l.txt:
/home/zhaimo/SegNet/data/vessel/train/label/01.png
/home/zhaimo/SegNet/data/vessel/train/label/02.png
/home/zhaimo/SegNet/data/vessel/train/label/03.png
/home/zhaimo/SegNet/data/vessel/train/label/04.png
/home/zhaimo/SegNet/data/vessel/train/label/05.png
=============================== UPDATE 2 ===================================
If I use two ImageData layers, how should I modify deploy.prototxt? This is the file I wrote:
layer {
  name: "data"
  type: "ImageData"
  top: "data"
  top: "tmp0"
  input_param {
    shape: { dim: 1 dim: 1 dim: 224 dim: 224 }
  }
}
This is the forward.py file:
import numpy as np
from PIL import Image
import matplotlib.pyplot as plt

caffe_root = '/home/zhaimo/'
import sys
sys.path.insert(0, caffe_root + 'caffe-master/python')
import caffe

# load image, switch channels to BGR, subtract mean, and transpose to C x H x W for Caffe
im = Image.open('/home/zhaimo/fcn-master/data/vessel/test/13.png')
in_ = np.array(im, dtype=np.float32)
#in_ = in_[:,:,::-1]
#in_ -= np.array((104.00698793,116.66876762,122.67891434))
#in_ = in_.transpose((2,0,1))

# load the net
net = caffe.Net('/home/zhaimo/fcn-master/mo/deploy.prototxt',
                '/home/zhaimo/fcn-master/mo/snapshot/train/_iter_200000.caffemodel',
                caffe.TEST)

# shape the input (the data blob is N x C x H x W)
net.blobs['data'].reshape(1, *in_.shape)
net.blobs['data'].data[...] = in_

# run the net and take the argmax of the prediction
net.forward()
out = net.blobs['score'].data[0].argmax(axis=0)

plt.imshow(out)  # render the prediction so savefig writes it out
plt.axis('off')
plt.savefig('/home/zhaimo/fcn-master/mo/result/13.png')
But I got the following error:
F0504 08:16:46.423981 3383 layer.hpp:389] Check failed: ExactNumTopBlobs() == top.size() (2 vs. 1) ImageData Layer produces 2 top blob(s) as output.
How should I modify the forward.py file?
Answer:
Your problem is the number of top blobs on the data layers: an ImageData layer always produces exactly two tops (the image and its label), so each layer must declare two. For the two ImageData layers, use the following configuration:
layer {
  name: "data"
  type: "ImageData"
  top: "data"
  top: "tmp"
  image_data_param {
    source: "/home/zhaimo/fcn-master/mo/train_o.txt"
    batch_size: 1
    shuffle: false
  }
}
layer {
  name: "label"
  type: "ImageData"
  top: "label"
  top: "tmp1"
  image_data_param {
    # you may also need
    #is_color: false
    source: "/home/zhaimo/fcn-master/mo/train_l.txt"
    batch_size: 1
    shuffle: false
  }
}
In the text files, simply set all the labels to 0, i.e. append " 0" after every image path. You never use tmp/tmp1, so their values don't matter.
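For example, a small script along these lines (a sketch; the list-file paths are taken from the question) would append the dummy labels:

    # Append a dummy " 0" class label to every line of the two source files,
    # since the ImageData layer expects "path label" on each line.
    for list_file in ("/home/zhaimo/fcn-master/mo/train_o.txt",
                      "/home/zhaimo/fcn-master/mo/train_l.txt"):
        with open(list_file) as f:
            paths = [line.strip() for line in f if line.strip()]
        with open(list_file, "w") as f:
            for path in paths:
                f.write(path + " 0\n")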