TensorFlow 2.0 Beginner Tutorial 6: A Basic Logistic Regression Example

General steps for building a model

  • 1. Data
  • 2. Model
  • 3. Objective function (loss)
  • 4. Optimization algorithm
  • 5. Evaluation metric (e.g. accuracy)

Data

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
import pickle
%matplotlib inline
with open('./logistic_regression.pkl', 'rb') as f:
    data = pickle.load(f)
plt.plot(data[:1024, 0], data[:1024, 1], 'o')
plt.plot(data[1024:, 0], data[1024:, 1], 'x')
plt.show()

[Figure: scatter plot of the two clusters, class 0 plotted as circles and class 1 as crosses]

data
array([[ 0.35990067, -0.13019803,  0.        ],
       [-1.74807039, -0.63542499,  0.        ],
       [ 0.32448663,  1.07917892,  0.        ],
       ...,
       [ 1.26301196,  2.52857687,  1.        ],
       [ 3.10944483,  3.38927944,  1.        ],
       [ 1.02804941,  1.47698313,  1.        ]])

Task Description

Use logistic regression to separate the two clusters of points in the plane, i.e. perform binary classification.

The sigmoid activation function:

sigmoid(x) = 1 / (1 + e^(-x))

[Figure: the S-shaped sigmoid curve, mapping any real input into (0, 1)]

Binary classification with logistic regression:

y_hat = sigmoid(w1*x1 + w2*x2 + b), predicting class 1 when y_hat >= 0.5 and class 0 otherwise
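As a plain-NumPy sketch (the helper names `sigmoid` and `predict` are my own, not from the tutorial), the activation and the resulting classifier look like this:

```python
import numpy as np

def sigmoid(x):
    # Squash any real input into the interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def predict(x, w, b, threshold=0.5):
    # Logistic regression: linear score, sigmoid probability, then threshold
    prob = sigmoid(np.dot(x, w) + b)
    return (prob >= threshold).astype(int)

# Two sample points, one on each side of the boundary x1 + x2 = 0
print(predict(np.array([[3.0, 3.0], [-3.0, -3.0]]), np.array([1.0, 1.0]), 0.0))  # → [1 0]
```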

Shuffle the dataset and split it into training and test sets

np.random.shuffle(data)
train_data = data[0:-128]
test_data = data[-128:]
train_data.shape
(1920, 3)

Build the Model

model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(1, input_shape=(2,), activation=tf.nn.sigmoid))
model.summary()
Model: "sequential_3"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_6 (Dense)              (None, 1)                 3         
=================================================================
Total params: 3
Trainable params: 3
Non-trainable params: 0
_________________________________________________________________
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

or, equivalently, with explicit objects:

model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
        loss=tf.keras.losses.binary_crossentropy,
        metrics=[tf.keras.metrics.binary_accuracy]
    )
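For reference, the binary cross-entropy loss compiled above reduces to a one-liner. This NumPy sketch (my own helper, not part of Keras) mirrors what the loss computes over a batch:

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Clip predictions away from 0 and 1 so the logs stay finite
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1.0 - y_true) * np.log(1.0 - y_pred))

# A confident correct prediction costs little; an uncertain one costs -log(0.5) ≈ 0.693
print(binary_crossentropy(np.array([1.0]), np.array([0.99])))  # ≈ 0.01
print(binary_crossentropy(np.array([1.0]), np.array([0.5])))   # ≈ 0.693
```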

Train the Model

history = model.fit(train_data[:, :-1], train_data[:, -1], epochs=100, validation_split=0.2)
Train on 1536 samples, validate on 384 samples
Epoch 1/100
1536/1536 [==============================] - 0s 243us/sample - loss: 0.4278 - binary_accuracy: 0.7754 - val_loss: 0.5142 - val_binary_accuracy: 0.7552
Epoch 2/100
1536/1536 [==============================] - 0s 44us/sample - loss: 0.4148 - binary_accuracy: 0.7839 - val_loss: 0.4982 - val_binary_accuracy: 0.7604
Epoch 3/100
1536/1536 [==============================] - 0s 43us/sample - loss: 0.4029 - binary_accuracy: 0.7897 - val_loss: 0.4835 - val_binary_accuracy: 0.7630
...
Epoch 100/100
1536/1536 [==============================] - 0s 41us/sample - loss: 0.1751 - binary_accuracy: 0.9316 - val_loss: 0.2482 - val_binary_accuracy: 0.8984
history.history.keys()
dict_keys(['loss', 'binary_accuracy', 'val_loss', 'val_binary_accuracy'])

Validation loss over training

plt.plot(history.epoch, history.history["val_loss"])
[<matplotlib.lines.Line2D at 0x2b769a3d5c0>]

[Figure: validation loss curve, steadily decreasing over the 100 epochs]

Validation accuracy over training

plt.plot(history.epoch, history.history["val_binary_accuracy"])
[<matplotlib.lines.Line2D at 0x2b7699fedd8>]

[Figure: validation accuracy curve, rising to roughly 0.90]

Evaluate on the Test Set

model.evaluate(test_data[:, :-1], test_data[:, -1], verbose=2)
128/1 - 0s - loss: 0.2164 - binary_accuracy: 0.8906

[0.22714035958051682, 0.890625]

Extract the weights W and bias b

variables = model.layers[0].variables
variables
[<tf.Variable 'dense_6/kernel:0' shape=(2, 1) dtype=float32, numpy=
 array([[0.86180943],
        [1.5808575 ]], dtype=float32)>,
 <tf.Variable 'dense_6/bias:0' shape=(1,) dtype=float32, numpy=array([-2.670146], dtype=float32)>]
W = variables[0].numpy()
W
array([[0.86180943],
       [1.5808575 ]], dtype=float32)
b = variables[1].numpy()
b
array([-2.670146], dtype=float32)
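With W and b in hand, the model's forward pass can be reproduced by hand: the Dense(1, sigmoid) layer computes sigmoid(X @ W + b). A NumPy sketch using the weights printed above (the two sample points are my own):

```python
import numpy as np

# Weights and bias taken from the trained model's output above
W = np.array([[0.86180943], [1.5808575]], dtype=np.float32)
b = np.array([-2.670146], dtype=np.float32)

def forward(X):
    # Same computation as the Dense(1, activation=sigmoid) layer
    z = X @ W + b
    return 1.0 / (1.0 + np.exp(-z))

# A point near the origin lands in class 0; one deep in the upper-right in class 1
probs = forward(np.array([[0.0, 0.0], [3.0, 3.0]], dtype=np.float32))
print((probs >= 0.5).astype(int).ravel())  # → [0 1]
```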

Visualize the Result

def func(x, w1, w2, b):
    # Decision boundary: solve w1*x + w2*y + b = 0 for y
    return (-b - w1 * x) / w2
with open('./logistic_regression.pkl', 'rb') as f:
    data = pickle.load(f)
line_space = np.linspace(-5, 6, 2048)
plt.plot(data[:1024, 0], data[:1024, 1], 'o')
plt.plot(data[1024:, 0], data[1024:, 1], 'x')
plt.plot(line_space, func(line_space, W[0, 0], W[1, 0], b[0]))
plt.show()

[Figure: the two clusters with the learned decision boundary drawn between them]
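The plotted line is exactly the set of points where the layer's pre-activation w1*x + w2*y + b is zero, i.e. where the sigmoid outputs 0.5. A quick check with the weights above:

```python
import math

# Weights from the trained model above
w1, w2, b = 0.86180943, 1.5808575, -2.670146

def boundary_y(x):
    # Solve w1*x + w2*y + b = 0 for y, as func() does in the plot
    return (-b - w1 * x) / w2

x = 1.0
z = w1 * x + w2 * boundary_y(x) + b   # pre-activation on the boundary: 0
p = 1.0 / (1.0 + math.exp(-z))        # sigmoid on the boundary: 0.5
print(round(p, 6))  # → 0.5
```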
