Ubuntu 16.04: Docker resource roundup, installing Caffe with Docker, and using the Classifier (ROS Kinetic + usb_cam + Caffe)

2023-05-16

Docker is an open-source application container engine. For a quick introduction, see the Baidu Baike entry on Docker. Note that it appears to support 64-bit systems only.

Docker website: https://www.docker.com/

Docker — From Beginner to Practice: https://yeasy.gitbooks.io/docker_practice/content/

PDF download: http://download.csdn.net/detail/zhangrelay/9743400

Caffe installation guide: http://caffe.berkeleyvision.org/installation.html

Caffe Docker images: https://github.com/BVLC/caffe/tree/master/docker

With those in hand, this blog post covers the installation steps: http://blog.csdn.net/sysushui/article/details/54585788


[Figure: classifier output shown on the right correctly identifies a magnetic compass (confidence > 0.8)]

For example, search Docker Hub for Caffe images:

$ docker search caffe
NAME                                  DESCRIPTION                                     STARS     OFFICIAL   AUTOMATED
kaixhin/caffe                         Ubuntu Core 14.04 + Caffe.                      33                   [OK]
kaixhin/cuda-caffe                    Ubuntu Core 14.04 + CUDA + Caffe.               30                   [OK]
neowaylabs/caffe-cpu                  Caffe CPU based on:  https://hub.docker.co...   4                    [OK]
kaixhin/caffe-deps                    `kaixhin/caffe` dependencies.                   1                    [OK]
mbartoli/caffe                        Caffe, CPU-only                                 1                    [OK]
drunkar/cuda-caffe-anaconda-chainer   cuda-caffe-anaconda-chainer                     1                    [OK]
kaixhin/cuda-caffe-deps               `kaixhin/cuda-caffe` dependencies.              0                    [OK]
mtngld/caffe-gpu                      Ubuntu + caffe (gpu ready)                      0                    [OK]
nitnelave/caffe                       Master branch of BVLC/caffe, on CentOS7 wi...   0                    [OK]
bvlc/caffe                            Official Caffe images                           0                    [OK]
ruimashita/caffe-gpu                  ubuntu 14.04 cuda 7 (NVIDIA driver version...   0                    [OK]
ruimashita/caffe-cpu-with-models      ubuntu 14.04 caffe  bvlc_reference_caffene...   0                    [OK]
elezar/caffe                          Caffe Docker Images                             0                    [OK]
ruimashita/caffe-gpu-with-models      ubuntu 14.04 cuda 7.0 caffe  bvlc_referenc...   0                    [OK]
floydhub/caffe                        Caffe docker image                              0                    [OK]
namikister/caffe                      Caffe with CUDA 8.0                             0                    [OK]
tingtinglu/caffe                      caffe                                           0                    [OK]
djpetti/caffe                         A simple container with Caffe, CUDA, and C...   0                    [OK]
flyingmouse/caffe                     Caffe is a deep learning framework made wi...   0                    [OK]
ruimashita/caffe-cpu                  ubuntu 14.04 caffe                              0                    [OK]
suyongsun/caffe-gpu                   Caffe image with gpu mode.                      0                    [OK]
haoyangz/caffe-cnn                    caffe-cnn                                       0                    [OK]
2breakfast/caffe-sshd                 installed sshd server on nvidia/caffe           0                    [OK]
chakkritte/docker-caffe               Docker caffe                                    0                    [OK]
ederrm/caffe                          Caffe http://caffe.berkeleyvision.org setup!    0                    [OK]
relaybot@relaybot-desktop:~$ 

Pick one of these images and install it; the CPU-only build of Caffe is fairly easy to get running.

Once installed, run a test. The session below was done on ROS Kinetic; the steps are the same on ROS Indigo.

For details, see:

ROS + Caffe: notes on combining the Robot Operating System framework and the deep learning framework (robot control and artificial intelligence)

http://blog.csdn.net/zhangrelay/article/details/54669922


$ roscore
... logging to /home/relaybot/.ros/log/f214a97a-e0b1-11e6-833d-70f1a1ca7552/roslaunch-relaybot-desktop-32381.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://relaybot-desktop:44408/
ros_comm version 1.12.6


SUMMARY
========

PARAMETERS
 * /rosdistro: kinetic
 * /rosversion: 1.12.6

NODES

auto-starting new master
process[master]: started with pid [32411]
ROS_MASTER_URI=http://relaybot-desktop:11311/

setting /run_id to f214a97a-e0b1-11e6-833d-70f1a1ca7552
process[rosout-1]: started with pid [32424]
started core service [/rosout]


$ rosrun uvc_camera uvc_camera_node
[ INFO] [1485096579.984543774]: using default calibration URL
[ INFO] [1485096579.984671839]: camera calibration URL: file:///home/relaybot/.ros/camera_info/camera.yaml
[ INFO] [1485096579.984939036]: Unable to open camera calibration file [/home/relaybot/.ros/camera_info/camera.yaml]
[ WARN] [1485096579.984987494]: Camera calibration file /home/relaybot/.ros/camera_info/camera.yaml not found.
opening /dev/video0
pixfmt 0 = 'YUYV' desc = 'YUYV 4:2:2'
  discrete: 640x480:   1/30 1/15
  discrete: 352x288:   1/30 1/15
  discrete: 320x240:   1/30 1/15
  discrete: 176x144:   1/30 1/15
  discrete: 160x120:   1/30 1/15
  discrete: 1280x800:   2/15
  discrete: 1280x1024:   2/15
  int (Brightness, 0, id = 980900): -64 to 64 (1)
  int (Contrast, 0, id = 980901): 0 to 64 (1)
  int (Saturation, 0, id = 980902): 0 to 128 (1)
  int (Hue, 0, id = 980903): -40 to 40 (1)
  bool (White Balance Temperature, Auto, 0, id = 98090c): 0 to 1 (1)
  int (Gamma, 0, id = 980910): 72 to 500 (1)
  menu (Power Line Frequency, 0, id = 980918): 0 to 2 (1)
    0: Disabled
    1: 50 Hz
    2: 60 Hz
  int (Sharpness, 0, id = 98091b): 0 to 6 (1)
  int (Backlight Compensation, 0, id = 98091c): 0 to 2 (1)
select timeout in grab
^Crelaybot@relaybot-desktop:~$ rosrun uvc_camera uvc_camera_node topic:=/camera/b/image_raw
[ INFO] [1485096761.665718381]: using default calibration URL
[ INFO] [1485096761.665859706]: camera calibration URL: file:///home/relaybot/.ros/camera_info/camera.yaml
[ INFO] [1485096761.665944994]: Unable to open camera calibration file [/home/relaybot/.ros/camera_info/camera.yaml]
[ WARN] [1485096761.665980436]: Camera calibration file /home/relaybot/.ros/camera_info/camera.yaml not found.
opening /dev/video0
pixfmt 0 = 'YUYV' desc = 'YUYV 4:2:2'
  discrete: 640x480:   1/30 1/15
  discrete: 352x288:   1/30 1/15
  discrete: 320x240:   1/30 1/15
  discrete: 176x144:   1/30 1/15
  discrete: 160x120:   1/30 1/15
  discrete: 1280x800:   2/15
  discrete: 1280x1024:   2/15
  int (Brightness, 0, id = 980900): -64 to 64 (1)
  int (Contrast, 0, id = 980901): 0 to 64 (1)
  int (Saturation, 0, id = 980902): 0 to 128 (1)
  int (Hue, 0, id = 980903): -40 to 40 (1)
  bool (White Balance Temperature, Auto, 0, id = 98090c): 0 to 1 (1)
  int (Gamma, 0, id = 980910): 72 to 500 (1)
  menu (Power Line Frequency, 0, id = 980918): 0 to 2 (1)
    0: Disabled
    1: 50 Hz
    2: 60 Hz
  int (Sharpness, 0, id = 98091b): 0 to 6 (1)
  int (Backlight Compensation, 0, id = 98091c): 0 to 2 (1)
select timeout in grab

Relay the camera's raw image topic to the topic that ros_caffe subscribes to (topic_tools transform republishes each message; the expression 'm' passes it through unchanged):

rosrun topic_tools transform /image_raw /camera/rgb/image_raw sensor_msgs/Image 'm'

$ rosrun ros_caffe ros_caffe_test

WARNING: Logging before InitGoogleLogging() is written to STDERR
I0122 23:02:21.915738  2968 upgrade_proto.cpp:67] Attempting to upgrade input file specified using deprecated input fields: /home/relaybot/Rob_Soft/caffe/src/ros_caffe/data/deploy.prototxt
I0122 23:02:21.915875  2968 upgrade_proto.cpp:70] Successfully upgraded file specified using deprecated input fields.
W0122 23:02:21.915894  2968 upgrade_proto.cpp:72] Note that future Caffe releases will only support input layers and not input fields.
I0122 23:02:21.916246  2968 net.cpp:53] Initializing net from parameters:
name: "CaffeNet"
state {
  phase: TEST
  level: 0
}
layer {
  name: "input"
  type: "Input"
  top: "data"
  input_param {
    shape {
      dim: 10
      dim: 3
      dim: 227
      dim: 227
    }
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "pool1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "norm1"
  top: "conv2"
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "pool2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "norm2"
  top: "conv3"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "pool5"
  top: "fc6"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc8"
  type: "InnerProduct"
  bottom: "fc7"
  top: "fc8"
  inner_product_param {
    num_output: 1000
  }
}
layer {
  name: "prob"
  type: "Softmax"
  bottom: "fc8"
  top: "prob"
}
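The blob shapes printed in the setup log that follows can be reproduced from these layer parameters with the standard convolution/pooling size formula. A minimal sketch (plain Python, no Caffe required; note Caffe's pooling actually rounds up, but for these sizes floor and ceil agree):

```python
# Output size of a conv/pool layer: floor((in + 2*pad - kernel) / stride) + 1
def out_size(in_size, kernel, stride=1, pad=0):
    return (in_size + 2 * pad - kernel) // stride + 1

s = 227                   # input is 10 x 3 x 227 x 227
s = out_size(s, 11, 4)    # conv1: kernel 11, stride 4 -> 55
print("conv1:", s)
s = out_size(s, 3, 2)     # pool1: kernel 3, stride 2 -> 27
s = out_size(s, 5, 1, 2)  # conv2: kernel 5, pad 2 -> 27
s = out_size(s, 3, 2)     # pool2 -> 13
s = out_size(s, 3, 1, 1)  # conv3/conv4/conv5: kernel 3, pad 1 -> 13
s = out_size(s, 3, 2)     # pool5 -> 6
print("pool5:", s)
```

Running this traces 227 -> 55 -> 27 -> 27 -> 13 -> 13 -> 6, matching the "Top shape" lines in the log.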

I0122 23:02:21.916574  2968 layer_factory.hpp:77] Creating layer input
I0122 23:02:21.916613  2968 net.cpp:86] Creating Layer input
I0122 23:02:21.916638  2968 net.cpp:382] input -> data
I0122 23:02:21.931437  2968 net.cpp:124] Setting up input
I0122 23:02:21.939075  2968 net.cpp:131] Top shape: 10 3 227 227 (1545870)
I0122 23:02:21.939122  2968 net.cpp:139] Memory required for data: 6183480
I0122 23:02:21.939157  2968 layer_factory.hpp:77] Creating layer conv1
I0122 23:02:21.939210  2968 net.cpp:86] Creating Layer conv1
I0122 23:02:21.939235  2968 net.cpp:408] conv1 <- data
I0122 23:02:21.939278  2968 net.cpp:382] conv1 -> conv1
I0122 23:02:21.939563  2968 net.cpp:124] Setting up conv1
I0122 23:02:21.939604  2968 net.cpp:131] Top shape: 10 96 55 55 (2904000)
I0122 23:02:21.939618  2968 net.cpp:139] Memory required for data: 17799480
I0122 23:02:21.939685  2968 layer_factory.hpp:77] Creating layer relu1
I0122 23:02:21.939714  2968 net.cpp:86] Creating Layer relu1
I0122 23:02:21.939730  2968 net.cpp:408] relu1 <- conv1
I0122 23:02:21.939752  2968 net.cpp:369] relu1 -> conv1 (in-place)
I0122 23:02:21.939781  2968 net.cpp:124] Setting up relu1
I0122 23:02:21.939802  2968 net.cpp:131] Top shape: 10 96 55 55 (2904000)
I0122 23:02:21.939817  2968 net.cpp:139] Memory required for data: 29415480
I0122 23:02:21.939832  2968 layer_factory.hpp:77] Creating layer pool1
I0122 23:02:21.939857  2968 net.cpp:86] Creating Layer pool1
I0122 23:02:21.939868  2968 net.cpp:408] pool1 <- conv1
I0122 23:02:21.939887  2968 net.cpp:382] pool1 -> pool1
I0122 23:02:21.939947  2968 net.cpp:124] Setting up pool1
I0122 23:02:21.939967  2968 net.cpp:131] Top shape: 10 96 27 27 (699840)
I0122 23:02:21.939980  2968 net.cpp:139] Memory required for data: 32214840
I0122 23:02:21.939992  2968 layer_factory.hpp:77] Creating layer norm1
I0122 23:02:21.940014  2968 net.cpp:86] Creating Layer norm1
I0122 23:02:21.940027  2968 net.cpp:408] norm1 <- pool1
I0122 23:02:21.940045  2968 net.cpp:382] norm1 -> norm1
I0122 23:02:21.940075  2968 net.cpp:124] Setting up norm1
I0122 23:02:21.940093  2968 net.cpp:131] Top shape: 10 96 27 27 (699840)
I0122 23:02:21.940104  2968 net.cpp:139] Memory required for data: 35014200
I0122 23:02:21.940116  2968 layer_factory.hpp:77] Creating layer conv2
I0122 23:02:21.940137  2968 net.cpp:86] Creating Layer conv2
I0122 23:02:21.940152  2968 net.cpp:408] conv2 <- norm1
I0122 23:02:21.940171  2968 net.cpp:382] conv2 -> conv2
I0122 23:02:21.940996  2968 net.cpp:124] Setting up conv2
I0122 23:02:21.941033  2968 net.cpp:131] Top shape: 10 256 27 27 (1866240)
I0122 23:02:21.941045  2968 net.cpp:139] Memory required for data: 42479160
I0122 23:02:21.941121  2968 layer_factory.hpp:77] Creating layer relu2
I0122 23:02:21.941144  2968 net.cpp:86] Creating Layer relu2
I0122 23:02:21.941157  2968 net.cpp:408] relu2 <- conv2
I0122 23:02:21.941174  2968 net.cpp:369] relu2 -> conv2 (in-place)
I0122 23:02:21.941193  2968 net.cpp:124] Setting up relu2
I0122 23:02:21.941208  2968 net.cpp:131] Top shape: 10 256 27 27 (1866240)
I0122 23:02:21.941220  2968 net.cpp:139] Memory required for data: 49944120
I0122 23:02:21.941232  2968 layer_factory.hpp:77] Creating layer pool2
I0122 23:02:21.941248  2968 net.cpp:86] Creating Layer pool2
I0122 23:02:21.941259  2968 net.cpp:408] pool2 <- conv2
I0122 23:02:21.941275  2968 net.cpp:382] pool2 -> pool2
I0122 23:02:21.941301  2968 net.cpp:124] Setting up pool2
I0122 23:02:21.941316  2968 net.cpp:131] Top shape: 10 256 13 13 (432640)
I0122 23:02:21.941328  2968 net.cpp:139] Memory required for data: 51674680
I0122 23:02:21.941339  2968 layer_factory.hpp:77] Creating layer norm2
I0122 23:02:21.941360  2968 net.cpp:86] Creating Layer norm2
I0122 23:02:21.941372  2968 net.cpp:408] norm2 <- pool2
I0122 23:02:21.941390  2968 net.cpp:382] norm2 -> norm2
I0122 23:02:21.941411  2968 net.cpp:124] Setting up norm2
I0122 23:02:21.941426  2968 net.cpp:131] Top shape: 10 256 13 13 (432640)
I0122 23:02:21.941437  2968 net.cpp:139] Memory required for data: 53405240
I0122 23:02:21.941448  2968 layer_factory.hpp:77] Creating layer conv3
I0122 23:02:21.941468  2968 net.cpp:86] Creating Layer conv3
I0122 23:02:21.941478  2968 net.cpp:408] conv3 <- norm2
I0122 23:02:21.941495  2968 net.cpp:382] conv3 -> conv3
I0122 23:02:21.943603  2968 net.cpp:124] Setting up conv3
I0122 23:02:21.943662  2968 net.cpp:131] Top shape: 10 384 13 13 (648960)
I0122 23:02:21.943675  2968 net.cpp:139] Memory required for data: 56001080
I0122 23:02:21.943711  2968 layer_factory.hpp:77] Creating layer relu3
I0122 23:02:21.943733  2968 net.cpp:86] Creating Layer relu3
I0122 23:02:21.943747  2968 net.cpp:408] relu3 <- conv3
I0122 23:02:21.943765  2968 net.cpp:369] relu3 -> conv3 (in-place)
I0122 23:02:21.943786  2968 net.cpp:124] Setting up relu3
I0122 23:02:21.943801  2968 net.cpp:131] Top shape: 10 384 13 13 (648960)
I0122 23:02:21.943812  2968 net.cpp:139] Memory required for data: 58596920
I0122 23:02:21.943822  2968 layer_factory.hpp:77] Creating layer conv4
I0122 23:02:21.943848  2968 net.cpp:86] Creating Layer conv4
I0122 23:02:21.943861  2968 net.cpp:408] conv4 <- conv3
I0122 23:02:21.943881  2968 net.cpp:382] conv4 -> conv4
I0122 23:02:21.944964  2968 net.cpp:124] Setting up conv4
I0122 23:02:21.945030  2968 net.cpp:131] Top shape: 10 384 13 13 (648960)
I0122 23:02:21.945047  2968 net.cpp:139] Memory required for data: 61192760
I0122 23:02:21.945148  2968 layer_factory.hpp:77] Creating layer relu4
I0122 23:02:21.945188  2968 net.cpp:86] Creating Layer relu4
I0122 23:02:21.945206  2968 net.cpp:408] relu4 <- conv4
I0122 23:02:21.945230  2968 net.cpp:369] relu4 -> conv4 (in-place)
I0122 23:02:21.945258  2968 net.cpp:124] Setting up relu4
I0122 23:02:21.945277  2968 net.cpp:131] Top shape: 10 384 13 13 (648960)
I0122 23:02:21.945291  2968 net.cpp:139] Memory required for data: 63788600
I0122 23:02:21.945303  2968 layer_factory.hpp:77] Creating layer conv5
I0122 23:02:21.945334  2968 net.cpp:86] Creating Layer conv5
I0122 23:02:21.945353  2968 net.cpp:408] conv5 <- conv4
I0122 23:02:21.945376  2968 net.cpp:382] conv5 -> conv5
I0122 23:02:21.946549  2968 net.cpp:124] Setting up conv5
I0122 23:02:21.946606  2968 net.cpp:131] Top shape: 10 256 13 13 (432640)
I0122 23:02:21.946622  2968 net.cpp:139] Memory required for data: 65519160
I0122 23:02:21.946672  2968 layer_factory.hpp:77] Creating layer relu5
I0122 23:02:21.946698  2968 net.cpp:86] Creating Layer relu5
I0122 23:02:21.946717  2968 net.cpp:408] relu5 <- conv5
I0122 23:02:21.946743  2968 net.cpp:369] relu5 -> conv5 (in-place)
I0122 23:02:21.946771  2968 net.cpp:124] Setting up relu5
I0122 23:02:21.946792  2968 net.cpp:131] Top shape: 10 256 13 13 (432640)
I0122 23:02:21.946812  2968 net.cpp:139] Memory required for data: 67249720
I0122 23:02:21.946826  2968 layer_factory.hpp:77] Creating layer pool5
I0122 23:02:21.946848  2968 net.cpp:86] Creating Layer pool5
I0122 23:02:21.946864  2968 net.cpp:408] pool5 <- conv5
I0122 23:02:21.946885  2968 net.cpp:382] pool5 -> pool5
I0122 23:02:21.946935  2968 net.cpp:124] Setting up pool5
I0122 23:02:21.946971  2968 net.cpp:131] Top shape: 10 256 6 6 (92160)
I0122 23:02:21.946986  2968 net.cpp:139] Memory required for data: 67618360
I0122 23:02:21.947003  2968 layer_factory.hpp:77] Creating layer fc6
I0122 23:02:21.947028  2968 net.cpp:86] Creating Layer fc6
I0122 23:02:21.947044  2968 net.cpp:408] fc6 <- pool5
I0122 23:02:21.947065  2968 net.cpp:382] fc6 -> fc6
I0122 23:02:21.989847  2968 net.cpp:124] Setting up fc6
I0122 23:02:21.989913  2968 net.cpp:131] Top shape: 10 4096 (40960)
I0122 23:02:21.989919  2968 net.cpp:139] Memory required for data: 67782200
I0122 23:02:21.989943  2968 layer_factory.hpp:77] Creating layer relu6
I0122 23:02:21.989967  2968 net.cpp:86] Creating Layer relu6
I0122 23:02:21.989975  2968 net.cpp:408] relu6 <- fc6
I0122 23:02:21.989989  2968 net.cpp:369] relu6 -> fc6 (in-place)
I0122 23:02:21.990003  2968 net.cpp:124] Setting up relu6
I0122 23:02:21.990010  2968 net.cpp:131] Top shape: 10 4096 (40960)
I0122 23:02:21.990015  2968 net.cpp:139] Memory required for data: 67946040
I0122 23:02:21.990020  2968 layer_factory.hpp:77] Creating layer drop6
I0122 23:02:21.990031  2968 net.cpp:86] Creating Layer drop6
I0122 23:02:21.990036  2968 net.cpp:408] drop6 <- fc6
I0122 23:02:21.990043  2968 net.cpp:369] drop6 -> fc6 (in-place)
I0122 23:02:21.990067  2968 net.cpp:124] Setting up drop6
I0122 23:02:21.990074  2968 net.cpp:131] Top shape: 10 4096 (40960)
I0122 23:02:21.990079  2968 net.cpp:139] Memory required for data: 68109880
I0122 23:02:21.990084  2968 layer_factory.hpp:77] Creating layer fc7
I0122 23:02:21.990094  2968 net.cpp:86] Creating Layer fc7
I0122 23:02:21.990099  2968 net.cpp:408] fc7 <- fc6
I0122 23:02:21.990111  2968 net.cpp:382] fc7 -> fc7
I0122 23:02:22.008998  2968 net.cpp:124] Setting up fc7
I0122 23:02:22.009058  2968 net.cpp:131] Top shape: 10 4096 (40960)
I0122 23:02:22.009106  2968 net.cpp:139] Memory required for data: 68273720
I0122 23:02:22.009145  2968 layer_factory.hpp:77] Creating layer relu7
I0122 23:02:22.009173  2968 net.cpp:86] Creating Layer relu7
I0122 23:02:22.009187  2968 net.cpp:408] relu7 <- fc7
I0122 23:02:22.009209  2968 net.cpp:369] relu7 -> fc7 (in-place)
I0122 23:02:22.009232  2968 net.cpp:124] Setting up relu7
I0122 23:02:22.009248  2968 net.cpp:131] Top shape: 10 4096 (40960)
I0122 23:02:22.009259  2968 net.cpp:139] Memory required for data: 68437560
I0122 23:02:22.009269  2968 layer_factory.hpp:77] Creating layer drop7
I0122 23:02:22.009286  2968 net.cpp:86] Creating Layer drop7
I0122 23:02:22.009299  2968 net.cpp:408] drop7 <- fc7
I0122 23:02:22.009322  2968 net.cpp:369] drop7 -> fc7 (in-place)
I0122 23:02:22.009346  2968 net.cpp:124] Setting up drop7
I0122 23:02:22.009362  2968 net.cpp:131] Top shape: 10 4096 (40960)
I0122 23:02:22.009371  2968 net.cpp:139] Memory required for data: 68601400
I0122 23:02:22.009382  2968 layer_factory.hpp:77] Creating layer fc8
I0122 23:02:22.009399  2968 net.cpp:86] Creating Layer fc8
I0122 23:02:22.009410  2968 net.cpp:408] fc8 <- fc7
I0122 23:02:22.009428  2968 net.cpp:382] fc8 -> fc8
I0122 23:02:22.017177  2968 net.cpp:124] Setting up fc8
I0122 23:02:22.017282  2968 net.cpp:131] Top shape: 10 1000 (10000)
I0122 23:02:22.017313  2968 net.cpp:139] Memory required for data: 68641400
I0122 23:02:22.017356  2968 layer_factory.hpp:77] Creating layer prob
I0122 23:02:22.017395  2968 net.cpp:86] Creating Layer prob
I0122 23:02:22.017411  2968 net.cpp:408] prob <- fc8
I0122 23:02:22.017433  2968 net.cpp:382] prob -> prob
I0122 23:02:22.017469  2968 net.cpp:124] Setting up prob
I0122 23:02:22.017491  2968 net.cpp:131] Top shape: 10 1000 (10000)
I0122 23:02:22.017504  2968 net.cpp:139] Memory required for data: 68681400
I0122 23:02:22.017516  2968 net.cpp:202] prob does not need backward computation.
I0122 23:02:22.017554  2968 net.cpp:202] fc8 does not need backward computation.
I0122 23:02:22.017566  2968 net.cpp:202] drop7 does not need backward computation.
I0122 23:02:22.017577  2968 net.cpp:202] relu7 does not need backward computation.
I0122 23:02:22.017588  2968 net.cpp:202] fc7 does not need backward computation.
I0122 23:02:22.017598  2968 net.cpp:202] drop6 does not need backward computation.
I0122 23:02:22.017609  2968 net.cpp:202] relu6 does not need backward computation.
I0122 23:02:22.017619  2968 net.cpp:202] fc6 does not need backward computation.
I0122 23:02:22.017630  2968 net.cpp:202] pool5 does not need backward computation.
I0122 23:02:22.017642  2968 net.cpp:202] relu5 does not need backward computation.
I0122 23:02:22.017652  2968 net.cpp:202] conv5 does not need backward computation.
I0122 23:02:22.017663  2968 net.cpp:202] relu4 does not need backward computation.
I0122 23:02:22.017674  2968 net.cpp:202] conv4 does not need backward computation.
I0122 23:02:22.017685  2968 net.cpp:202] relu3 does not need backward computation.
I0122 23:02:22.017696  2968 net.cpp:202] conv3 does not need backward computation.
I0122 23:02:22.017707  2968 net.cpp:202] norm2 does not need backward computation.
I0122 23:02:22.017720  2968 net.cpp:202] pool2 does not need backward computation.
I0122 23:02:22.017734  2968 net.cpp:202] relu2 does not need backward computation.
I0122 23:02:22.017746  2968 net.cpp:202] conv2 does not need backward computation.
I0122 23:02:22.017757  2968 net.cpp:202] norm1 does not need backward computation.
I0122 23:02:22.017770  2968 net.cpp:202] pool1 does not need backward computation.
I0122 23:02:22.017783  2968 net.cpp:202] relu1 does not need backward computation.
I0122 23:02:22.017796  2968 net.cpp:202] conv1 does not need backward computation.
I0122 23:02:22.017809  2968 net.cpp:202] input does not need backward computation.
I0122 23:02:22.017819  2968 net.cpp:244] This network produces output prob
I0122 23:02:22.017868  2968 net.cpp:257] Network initialization done.
I0122 23:02:22.196004  2968 upgrade_proto.cpp:44] Attempting to upgrade input file specified using deprecated transformation parameters: /home/relaybot/Rob_Soft/caffe/src/ros_caffe/data/bvlc_reference_caffenet.caffemodel
I0122 23:02:22.196061  2968 upgrade_proto.cpp:47] Successfully upgraded file specified using deprecated data transformation parameters.
W0122 23:02:22.196069  2968 upgrade_proto.cpp:49] Note that future Caffe releases will only support transform_param messages for transformation fields.
I0122 23:02:22.196074  2968 upgrade_proto.cpp:53] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/relaybot/Rob_Soft/caffe/src/ros_caffe/data/bvlc_reference_caffenet.caffemodel
I0122 23:02:22.506147  2968 upgrade_proto.cpp:61] Successfully upgraded file specified using deprecated V1LayerParameter
I0122 23:02:22.507925  2968 net.cpp:746] Ignoring source layer data
I0122 23:02:22.597734  2968 net.cpp:746] Ignoring source layer loss
W0122 23:02:22.716584  2968 net.hpp:41] DEPRECATED: ForwardPrefilled() will be removed in a future version. Use Forward().
Test default image under /data/cat.jpg


0.3134 - "n02123045 tabby, tabby cat"
0.2380 - "n02123159 tiger cat"
0.1235 - "n02124075 Egyptian cat"
0.1003 - "n02119022 red fox, Vulpes vulpes"
0.0715 - "n02127052 lynx, catamount"
W0122 23:07:35.308277  2968 net.hpp:41] DEPRECATED: ForwardPrefilled() will be removed in a future version. Use Forward().
W0122 23:12:52.805382  2968 net.hpp:41] DEPRECATED: ForwardPrefilled() will be removed in a future version. Use Forward().
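The "Memory required for data" figures in the log above are just a running total of top-blob elements times 4 bytes (float32). The first two entries can be checked by hand:

```python
# Caffe's "Memory required for data" accumulates each top blob's
# element count * 4 bytes (float32).
def blob_bytes(*shape):
    n = 1
    for d in shape:
        n *= d
    return n * 4

data = blob_bytes(10, 3, 227, 227)   # input top blob
print(data)                          # 6183480, matching the log
conv1 = blob_bytes(10, 96, 55, 55)   # conv1 top blob
print(data + conv1)                  # 17799480, matching the log
```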


$ rostopic list

/camera/rgb/image_raw
/camera_info
/image_raw
/image_raw/compressed
/image_raw/compressed/parameter_descriptions
/image_raw/compressed/parameter_updates
/image_raw/compressedDepth
/image_raw/compressedDepth/parameter_descriptions
/image_raw/compressedDepth/parameter_updates
/image_raw/theora
/image_raw/theora/parameter_descriptions
/image_raw/theora/parameter_updates
/rosout
/rosout_agg


$ rostopic echo /caffe_ret

---
data: [0.557911 - n04286575 spotlight, spot]
[0.115966 - n03729826 matchstick]
[0.0737537 - n02948072 candle, taper, wax light]
[0.040883 - n09472597 volcano]
[0.028961 - n03666591 lighter, light, igniter, ignitor]

---
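The message above is simply the five highest-probability classes formatted as "score - label". A minimal sketch of that post-processing (the probabilities and synset labels below are hypothetical stand-ins, not values from this run):

```python
# Format the top-N entries of a probability vector as "score - label"
# lines, in the style of ros_caffe's published prediction string.
def top_predictions(probs, labels, n=5):
    ranked = sorted(zip(probs, labels), reverse=True)  # highest score first
    return ['%.4f - "%s"' % (p, l) for p, l in ranked[:n]]

# Hypothetical example values:
probs = [0.1, 0.6, 0.05, 0.2, 0.05]
labels = ["n01 a", "n02 b", "n03 c", "n04 d", "n05 e"]
for line in top_predictions(probs, labels, 3):
    print(line)   # e.g. 0.6000 - "n02 b"
```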



$ rosrun rqt_graph rqt_graph




-End-


