Caffe Basics 13: Testing the AlexNet model bvlc_reference_caffenet

1. Preparation

2. Model test code


  • For the earlier grayscale image classification test we used Caffe's bundled classify.py with some minor modifications. This time we create a testModel.py file to run the classification; its code is as follows:
import numpy as np
import sys

caffe_root = '/home/terrence/caffe/'
sys.path.insert(0, caffe_root + 'python')  # make pycaffe importable before importing it
import caffe

caffe.set_device(0)
caffe.set_mode_gpu()

modelDef = '/home/terrence/caffe/models/bvlc_reference_caffenet/deploy.prototxt'
modelWeights = '/home/terrence/caffe_case/bvlc_reference_caffenet.caffemodel'

net = caffe.Net(modelDef,      # model architecture
                modelWeights,  # trained weights
                caffe.TEST)    # test mode (no dropout)
# per-channel (BGR) mean of the ImageNet training set
mu = np.load(caffe_root + 'python/caffe/imagenet/ilsvrc_2012_mean.npy')
mu = mu.mean(1).mean(1)  # average over pixels -> one mean value per channel

print 'mean subtracted values:' , zip('BGR', mu)

transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_transpose('data', (2,0,1)) # H*W*C -> C*H*W
transformer.set_mean('data', mu)
transformer.set_raw_scale('data', 255)
transformer.set_channel_swap('data', (2,1,0)) #RGB -> BGR

net.blobs['data'].reshape(1,        # batch size
                          3,        # 3 channels (BGR)
                          227, 227) # image size

image = caffe.io.load_image(caffe_root + 'examples/images/cat.jpg')
transformedImage = transformer.preprocess('data', image)

net.blobs['data'].data[...] = transformedImage

output = net.forward()
outputProb = output['prob'][0]

print 'predicted class is: ', outputProb.argmax()
  • Before running the code, make sure all the files referenced above actually exist at those paths; if they do, the script will run without errors.
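The four Transformer calls above can be sketched in plain NumPy. This is an illustrative re-implementation, not the actual caffe.io.Transformer code; the toy 2x2 image and the rounded mean values are made up for the example:

```python
import numpy as np

def preprocess(image, mu):
    """Mimic the Transformer pipeline: input is an H*W*3 RGB float
    image in [0, 1], as caffe.io.load_image returns."""
    x = image * 255.0            # set_raw_scale('data', 255): back to [0, 255]
    x = x[:, :, ::-1]            # set_channel_swap: RGB -> BGR
    x = x.transpose(2, 0, 1)     # set_transpose: H*W*C -> C*H*W
    x = x - mu.reshape(3, 1, 1)  # set_mean: subtract per-channel mean
    return x

# toy 2x2 "image" and the (rounded) BGR means the script prints
img = np.ones((2, 2, 3)) * 0.5
mu = np.array([104.0, 116.7, 122.7])
out = preprocess(img, mu)
print(out.shape)  # (3, 2, 2)
```

Each uniform 0.5 pixel becomes 127.5, so the blue channel of the result is 127.5 - 104.0 = 23.5 everywhere.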

3. Running the classification


  • Run the command python testModel.py; the terminal output is as follows:
WARNING: Logging before InitGoogleLogging() is written to STDERR
W0921 15:31:58.679814  3654 _caffe.cpp:139] DEPRECATION WARNING - deprecated use of Python interface
W0921 15:31:58.679846  3654 _caffe.cpp:140] Use this instead (with the named "weights" parameter):
W0921 15:31:58.679864  3654 _caffe.cpp:142] Net('/home/terrence/caffe/models/bvlc_reference_caffenet/deploy.prototxt', 1, weights='/home/terrence/caffe_case/bvlc_reference_caffenet.caffemodel')
I0921 15:31:58.681190  3654 net.cpp:51] Initializing net from parameters: 
name: "CaffeNet"
state {
  phase: TEST
  level: 0
}
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param {
    shape {
      dim: 10
      dim: 3
      dim: 227
      dim: 227
    }
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "pool1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "norm1"
  top: "conv2"
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "pool2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "norm2"
  top: "conv3"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "pool5"
  top: "fc6"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc8"
  type: "InnerProduct"
  bottom: "fc7"
  top: "fc8"
  inner_product_param {
    num_output: 1000
  }
}
layer {
  name: "prob"
  type: "Softmax"
  bottom: "fc8"
  top: "prob"
}
I0921 15:31:58.681818  3654 layer_factory.hpp:77] Creating layer data
I0921 15:31:58.681843  3654 net.cpp:84] Creating Layer data
I0921 15:31:58.681849  3654 net.cpp:380] data -> data
I0921 15:31:58.688185  3654 net.cpp:122] Setting up data
I0921 15:31:58.688215  3654 net.cpp:129] Top shape: 10 3 227 227 (1545870)
I0921 15:31:58.688235  3654 net.cpp:137] Memory required for data: 6183480
I0921 15:31:58.688242  3654 layer_factory.hpp:77] Creating layer conv1
I0921 15:31:58.688259  3654 net.cpp:84] Creating Layer conv1
I0921 15:31:58.688266  3654 net.cpp:406] conv1 <- data
I0921 15:31:58.688285  3654 net.cpp:380] conv1 -> conv1
I0921 15:31:58.688968  3654 net.cpp:122] Setting up conv1
I0921 15:31:58.688983  3654 net.cpp:129] Top shape: 10 96 55 55 (2904000)
I0921 15:31:58.688988  3654 net.cpp:137] Memory required for data: 17799480
I0921 15:31:58.689014  3654 layer_factory.hpp:77] Creating layer relu1
I0921 15:31:58.689023  3654 net.cpp:84] Creating Layer relu1
I0921 15:31:58.689028  3654 net.cpp:406] relu1 <- conv1
I0921 15:31:58.689035  3654 net.cpp:367] relu1 -> conv1 (in-place)
I0921 15:31:58.689047  3654 net.cpp:122] Setting up relu1
I0921 15:31:58.689066  3654 net.cpp:129] Top shape: 10 96 55 55 (2904000)
I0921 15:31:58.689071  3654 net.cpp:137] Memory required for data: 29415480
I0921 15:31:58.689077  3654 layer_factory.hpp:77] Creating layer pool1
I0921 15:31:58.689085  3654 net.cpp:84] Creating Layer pool1
I0921 15:31:58.689090  3654 net.cpp:406] pool1 <- conv1
I0921 15:31:58.689095  3654 net.cpp:380] pool1 -> pool1
I0921 15:31:58.689131  3654 net.cpp:122] Setting up pool1
I0921 15:31:58.689139  3654 net.cpp:129] Top shape: 10 96 27 27 (699840)
I0921 15:31:58.689144  3654 net.cpp:137] Memory required for data: 32214840
I0921 15:31:58.689149  3654 layer_factory.hpp:77] Creating layer norm1
I0921 15:31:58.689157  3654 net.cpp:84] Creating Layer norm1
I0921 15:31:58.689162  3654 net.cpp:406] norm1 <- pool1
I0921 15:31:58.689182  3654 net.cpp:380] norm1 -> norm1
I0921 15:31:58.689230  3654 net.cpp:122] Setting up norm1
I0921 15:31:58.689237  3654 net.cpp:129] Top shape: 10 96 27 27 (699840)
I0921 15:31:58.689242  3654 net.cpp:137] Memory required for data: 35014200
I0921 15:31:58.689246  3654 layer_factory.hpp:77] Creating layer conv2
I0921 15:31:58.689275  3654 net.cpp:84] Creating Layer conv2
I0921 15:31:58.689280  3654 net.cpp:406] conv2 <- norm1
I0921 15:31:58.689285  3654 net.cpp:380] conv2 -> conv2
I0921 15:31:58.690135  3654 net.cpp:122] Setting up conv2
I0921 15:31:58.690148  3654 net.cpp:129] Top shape: 10 256 27 27 (1866240)
I0921 15:31:58.690153  3654 net.cpp:137] Memory required for data: 42479160
I0921 15:31:58.690176  3654 layer_factory.hpp:77] Creating layer relu2
I0921 15:31:58.690183  3654 net.cpp:84] Creating Layer relu2
I0921 15:31:58.690201  3654 net.cpp:406] relu2 <- conv2
I0921 15:31:58.690207  3654 net.cpp:367] relu2 -> conv2 (in-place)
I0921 15:31:58.690215  3654 net.cpp:122] Setting up relu2
I0921 15:31:58.690222  3654 net.cpp:129] Top shape: 10 256 27 27 (1866240)
I0921 15:31:58.690227  3654 net.cpp:137] Memory required for data: 49944120
I0921 15:31:58.690232  3654 layer_factory.hpp:77] Creating layer pool2
I0921 15:31:58.690238  3654 net.cpp:84] Creating Layer pool2
I0921 15:31:58.690244  3654 net.cpp:406] pool2 <- conv2
I0921 15:31:58.690249  3654 net.cpp:380] pool2 -> pool2
I0921 15:31:58.690279  3654 net.cpp:122] Setting up pool2
I0921 15:31:58.690294  3654 net.cpp:129] Top shape: 10 256 13 13 (432640)
I0921 15:31:58.690299  3654 net.cpp:137] Memory required for data: 51674680
I0921 15:31:58.690315  3654 layer_factory.hpp:77] Creating layer norm2
I0921 15:31:58.690337  3654 net.cpp:84] Creating Layer norm2
I0921 15:31:58.690342  3654 net.cpp:406] norm2 <- pool2
I0921 15:31:58.690361  3654 net.cpp:380] norm2 -> norm2
I0921 15:31:58.690397  3654 net.cpp:122] Setting up norm2
I0921 15:31:58.690407  3654 net.cpp:129] Top shape: 10 256 13 13 (432640)
I0921 15:31:58.690423  3654 net.cpp:137] Memory required for data: 53405240
I0921 15:31:58.690428  3654 layer_factory.hpp:77] Creating layer conv3
I0921 15:31:58.690449  3654 net.cpp:84] Creating Layer conv3
I0921 15:31:58.690454  3654 net.cpp:406] conv3 <- norm2
I0921 15:31:58.690477  3654 net.cpp:380] conv3 -> conv3
I0921 15:31:58.691694  3654 net.cpp:122] Setting up conv3
I0921 15:31:58.691715  3654 net.cpp:129] Top shape: 10 384 13 13 (648960)
I0921 15:31:58.691720  3654 net.cpp:137] Memory required for data: 56001080
I0921 15:31:58.691745  3654 layer_factory.hpp:77] Creating layer relu3
I0921 15:31:58.691766  3654 net.cpp:84] Creating Layer relu3
I0921 15:31:58.691772  3654 net.cpp:406] relu3 <- conv3
I0921 15:31:58.691778  3654 net.cpp:367] relu3 -> conv3 (in-place)
I0921 15:31:58.691787  3654 net.cpp:122] Setting up relu3
I0921 15:31:58.691793  3654 net.cpp:129] Top shape: 10 384 13 13 (648960)
I0921 15:31:58.691799  3654 net.cpp:137] Memory required for data: 58596920
I0921 15:31:58.691803  3654 layer_factory.hpp:77] Creating layer conv4
I0921 15:31:58.691812  3654 net.cpp:84] Creating Layer conv4
I0921 15:31:58.691817  3654 net.cpp:406] conv4 <- conv3
I0921 15:31:58.691823  3654 net.cpp:380] conv4 -> conv4
I0921 15:31:58.692813  3654 net.cpp:122] Setting up conv4
I0921 15:31:58.692829  3654 net.cpp:129] Top shape: 10 384 13 13 (648960)
I0921 15:31:58.692834  3654 net.cpp:137] Memory required for data: 61192760
I0921 15:31:58.692857  3654 layer_factory.hpp:77] Creating layer relu4
I0921 15:31:58.692875  3654 net.cpp:84] Creating Layer relu4
I0921 15:31:58.692880  3654 net.cpp:406] relu4 <- conv4
I0921 15:31:58.692886  3654 net.cpp:367] relu4 -> conv4 (in-place)
I0921 15:31:58.692893  3654 net.cpp:122] Setting up relu4
I0921 15:31:58.692899  3654 net.cpp:129] Top shape: 10 384 13 13 (648960)
I0921 15:31:58.692905  3654 net.cpp:137] Memory required for data: 63788600
I0921 15:31:58.692910  3654 layer_factory.hpp:77] Creating layer conv5
I0921 15:31:58.692919  3654 net.cpp:84] Creating Layer conv5
I0921 15:31:58.692924  3654 net.cpp:406] conv5 <- conv4
I0921 15:31:58.692930  3654 net.cpp:380] conv5 -> conv5
I0921 15:31:58.693665  3654 net.cpp:122] Setting up conv5
I0921 15:31:58.693677  3654 net.cpp:129] Top shape: 10 256 13 13 (432640)
I0921 15:31:58.693682  3654 net.cpp:137] Memory required for data: 65519160
I0921 15:31:58.693706  3654 layer_factory.hpp:77] Creating layer relu5
I0921 15:31:58.693714  3654 net.cpp:84] Creating Layer relu5
I0921 15:31:58.693719  3654 net.cpp:406] relu5 <- conv5
I0921 15:31:58.693724  3654 net.cpp:367] relu5 -> conv5 (in-place)
I0921 15:31:58.693744  3654 net.cpp:122] Setting up relu5
I0921 15:31:58.693750  3654 net.cpp:129] Top shape: 10 256 13 13 (432640)
I0921 15:31:58.693755  3654 net.cpp:137] Memory required for data: 67249720
I0921 15:31:58.693760  3654 layer_factory.hpp:77] Creating layer pool5
I0921 15:31:58.693766  3654 net.cpp:84] Creating Layer pool5
I0921 15:31:58.693773  3654 net.cpp:406] pool5 <- conv5
I0921 15:31:58.693780  3654 net.cpp:380] pool5 -> pool5
I0921 15:31:58.693814  3654 net.cpp:122] Setting up pool5
I0921 15:31:58.693822  3654 net.cpp:129] Top shape: 10 256 6 6 (92160)
I0921 15:31:58.693827  3654 net.cpp:137] Memory required for data: 67618360
I0921 15:31:58.693831  3654 layer_factory.hpp:77] Creating layer fc6
I0921 15:31:58.693845  3654 net.cpp:84] Creating Layer fc6
I0921 15:31:58.693866  3654 net.cpp:406] fc6 <- pool5
I0921 15:31:58.693873  3654 net.cpp:380] fc6 -> fc6
I0921 15:31:58.753211  3654 net.cpp:122] Setting up fc6
I0921 15:31:58.753250  3654 net.cpp:129] Top shape: 10 4096 (40960)
I0921 15:31:58.753270  3654 net.cpp:137] Memory required for data: 67782200
I0921 15:31:58.753294  3654 layer_factory.hpp:77] Creating layer relu6
I0921 15:31:58.753309  3654 net.cpp:84] Creating Layer relu6
I0921 15:31:58.753315  3654 net.cpp:406] relu6 <- fc6
I0921 15:31:58.753322  3654 net.cpp:367] relu6 -> fc6 (in-place)
I0921 15:31:58.753334  3654 net.cpp:122] Setting up relu6
I0921 15:31:58.753338  3654 net.cpp:129] Top shape: 10 4096 (40960)
I0921 15:31:58.753343  3654 net.cpp:137] Memory required for data: 67946040
I0921 15:31:58.753348  3654 layer_factory.hpp:77] Creating layer drop6
I0921 15:31:58.753361  3654 net.cpp:84] Creating Layer drop6
I0921 15:31:58.753368  3654 net.cpp:406] drop6 <- fc6
I0921 15:31:58.753373  3654 net.cpp:367] drop6 -> fc6 (in-place)
I0921 15:31:58.753394  3654 net.cpp:122] Setting up drop6
I0921 15:31:58.753402  3654 net.cpp:129] Top shape: 10 4096 (40960)
I0921 15:31:58.753407  3654 net.cpp:137] Memory required for data: 68109880
I0921 15:31:58.753412  3654 layer_factory.hpp:77] Creating layer fc7
I0921 15:31:58.753418  3654 net.cpp:84] Creating Layer fc7
I0921 15:31:58.753434  3654 net.cpp:406] fc7 <- fc6
I0921 15:31:58.753440  3654 net.cpp:380] fc7 -> fc7
I0921 15:31:58.779583  3654 net.cpp:122] Setting up fc7
I0921 15:31:58.779621  3654 net.cpp:129] Top shape: 10 4096 (40960)
I0921 15:31:58.779641  3654 net.cpp:137] Memory required for data: 68273720
I0921 15:31:58.779666  3654 layer_factory.hpp:77] Creating layer relu7
I0921 15:31:58.779678  3654 net.cpp:84] Creating Layer relu7
I0921 15:31:58.779685  3654 net.cpp:406] relu7 <- fc7
I0921 15:31:58.779691  3654 net.cpp:367] relu7 -> fc7 (in-place)
I0921 15:31:58.779702  3654 net.cpp:122] Setting up relu7
I0921 15:31:58.779707  3654 net.cpp:129] Top shape: 10 4096 (40960)
I0921 15:31:58.779712  3654 net.cpp:137] Memory required for data: 68437560
I0921 15:31:58.779716  3654 layer_factory.hpp:77] Creating layer drop7
I0921 15:31:58.779723  3654 net.cpp:84] Creating Layer drop7
I0921 15:31:58.779733  3654 net.cpp:406] drop7 <- fc7
I0921 15:31:58.779741  3654 net.cpp:367] drop7 -> fc7 (in-place)
I0921 15:31:58.779762  3654 net.cpp:122] Setting up drop7
I0921 15:31:58.779770  3654 net.cpp:129] Top shape: 10 4096 (40960)
I0921 15:31:58.779775  3654 net.cpp:137] Memory required for data: 68601400
I0921 15:31:58.779780  3654 layer_factory.hpp:77] Creating layer fc8
I0921 15:31:58.779796  3654 net.cpp:84] Creating Layer fc8
I0921 15:31:58.779803  3654 net.cpp:406] fc8 <- fc7
I0921 15:31:58.779808  3654 net.cpp:380] fc8 -> fc8
I0921 15:31:58.785863  3654 net.cpp:122] Setting up fc8
I0921 15:31:58.785899  3654 net.cpp:129] Top shape: 10 1000 (10000)
I0921 15:31:58.785918  3654 net.cpp:137] Memory required for data: 68641400
I0921 15:31:58.785930  3654 layer_factory.hpp:77] Creating layer prob
I0921 15:31:58.785957  3654 net.cpp:84] Creating Layer prob
I0921 15:31:58.785964  3654 net.cpp:406] prob <- fc8
I0921 15:31:58.785971  3654 net.cpp:380] prob -> prob
I0921 15:31:58.786042  3654 net.cpp:122] Setting up prob
I0921 15:31:58.786051  3654 net.cpp:129] Top shape: 10 1000 (10000)
I0921 15:31:58.786056  3654 net.cpp:137] Memory required for data: 68681400
I0921 15:31:58.786098  3654 net.cpp:200] prob does not need backward computation.
I0921 15:31:58.786115  3654 net.cpp:200] fc8 does not need backward computation.
I0921 15:31:58.786120  3654 net.cpp:200] drop7 does not need backward computation.
I0921 15:31:58.786124  3654 net.cpp:200] relu7 does not need backward computation.
I0921 15:31:58.786129  3654 net.cpp:200] fc7 does not need backward computation.
I0921 15:31:58.786134  3654 net.cpp:200] drop6 does not need backward computation.
I0921 15:31:58.786139  3654 net.cpp:200] relu6 does not need backward computation.
I0921 15:31:58.786144  3654 net.cpp:200] fc6 does not need backward computation.
I0921 15:31:58.786149  3654 net.cpp:200] pool5 does not need backward computation.
I0921 15:31:58.786154  3654 net.cpp:200] relu5 does not need backward computation.
I0921 15:31:58.786159  3654 net.cpp:200] conv5 does not need backward computation.
I0921 15:31:58.786164  3654 net.cpp:200] relu4 does not need backward computation.
I0921 15:31:58.786172  3654 net.cpp:200] conv4 does not need backward computation.
I0921 15:31:58.786183  3654 net.cpp:200] relu3 does not need backward computation.
I0921 15:31:58.786188  3654 net.cpp:200] conv3 does not need backward computation.
I0921 15:31:58.786193  3654 net.cpp:200] norm2 does not need backward computation.
I0921 15:31:58.786198  3654 net.cpp:200] pool2 does not need backward computation.
I0921 15:31:58.786203  3654 net.cpp:200] relu2 does not need backward computation.
I0921 15:31:58.786208  3654 net.cpp:200] conv2 does not need backward computation.
I0921 15:31:58.786216  3654 net.cpp:200] norm1 does not need backward computation.
I0921 15:31:58.786222  3654 net.cpp:200] pool1 does not need backward computation.
I0921 15:31:58.786226  3654 net.cpp:200] relu1 does not need backward computation.
I0921 15:31:58.786232  3654 net.cpp:200] conv1 does not need backward computation.
I0921 15:31:58.786237  3654 net.cpp:200] data does not need backward computation.
I0921 15:31:58.786242  3654 net.cpp:242] This network produces output prob
I0921 15:31:58.786253  3654 net.cpp:255] Network initialization done.
I0921 15:31:59.048005  3654 upgrade_proto.cpp:44] Attempting to upgrade input file specified using deprecated transformation parameters: /home/terrence/caffe_case/bvlc_reference_caffenet.caffemodel
I0921 15:31:59.048066  3654 upgrade_proto.cpp:47] Successfully upgraded file specified using deprecated data transformation parameters.
W0921 15:31:59.048084  3654 upgrade_proto.cpp:49] Note that future Caffe releases will only support transform_param messages for transformation fields.
I0921 15:31:59.048104  3654 upgrade_proto.cpp:53] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/terrence/caffe_case/bvlc_reference_caffenet.caffemodel
I0921 15:31:59.177397  3654 upgrade_proto.cpp:61] Successfully upgraded file specified using deprecated V1LayerParameter
I0921 15:31:59.226982  3654 net.cpp:744] Ignoring source layer loss
mean subtracted values: [('B', 104.0069879317889), ('G', 116.66876761696767), ('R', 122.6789143406786)]
predicted class is:  281
  • The system prints the setup information for the entire forward pass and the final classification result, class index 281 (which corresponds to "tabby, tabby cat" in the ILSVRC2012 label list).
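A bare class index is hard to interpret, so a common follow-up is to sort the softmax vector and look at the top-5 classes. A minimal sketch, using a faked probability vector in place of the real outputProb from the script (the synset_words.txt path is the standard Caffe location, assumed here):

```python
import numpy as np

# outputProb is the 1000-way softmax vector from net.forward();
# we fake one here for illustration
outputProb = np.zeros(1000)
outputProb[281] = 0.8
outputProb[285] = 0.1

# indices of the five highest-probability classes, best first
top5 = outputProb.argsort()[::-1][:5]
print(top5[0])  # 281

# For human-readable names, load the ImageNet synset list shipped
# with Caffe, e.g.:
#   labels = np.loadtxt(caffe_root + 'data/ilsvrc12/synset_words.txt',
#                       str, delimiter='\t')
#   print labels[top5]
```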