Python: inspecting the variables defined in the current file

The problem:
When working in a single, long project file, or when trying to automate part of a workflow, we sometimes want to inspect the values of all the variables defined in that file. But unlike with an external import, a module cannot simply import itself by its own file name to reach its functions and variables. So what can we do?

Take a simple project file as an example (a CNN written with torch; suppose we want to programmatically retrieve the model and the input data we defined):

import torch
import torch.nn as nn
import torch.nn.functional as F

# Example model class
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 32, 3, 1)
        self.conv2 = nn.Conv2d(32, 64, 3, 1)
        self.dropout1 = nn.Dropout2d(0.25)
        self.dropout2 = nn.Dropout2d(0.5)
        self.fc1 = nn.Linear(9216, 128)
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        x = self.conv1(x)
        x = F.relu(x)
        x = self.conv2(x)
        x = F.relu(x)
        x = F.max_pool2d(x, 2)
        x = self.dropout1(x)
        x = torch.flatten(x, 1)
        x = self.fc1(x)
        x = F.relu(x)
        x = self.dropout2(x)
        x = self.fc2(x)
        output = F.log_softmax(x, dim=1)
        return output

if __name__ == '__main__':
    inputs = torch.rand(100,1,28,28)
    model = Net()
    outputs = model(inputs)

The solution is actually very simple: add one line at the end of the entry-point block (`if __name__ == '__main__':`):

print(vars())

Called with no argument, vars() returns the dictionary of the current namespace (at module level, the same dictionary as globals()), mapping every name defined so far to its value, so you can look up any variable by its name.

After adding the line above, you get every name defined in the file together with its value. (Note: because the call sits inside the entry-point block, the runtime variables you see are the ones defined up to that point; placed elsewhere, the snapshot may contain more or fewer names. The output below was produced from a fuller version of the project file, which is why it contains extra names such as Data_Scan and GlobalVar; it is truncated here for readability.)

{'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <_frozen_importlib_external.SourceFileLoader object at 0x0000020C2187B1D0>, '__spec__': None, '__annotations__': {}, '__builtins__': <module 'builtins' (built-in)>, '__file__': 'C:/Users/Ming/Desktop/Data_Scan/dataset/user_project/CNN.py', '__cached__': None, 'Data_Scan': <module 'Data_Scan.Data_Scan' from 'C:\\Users\\Ming\\Desktop\\Data_Scan\\dataset\\user_project\\Data_Scan\\Data_Scan.py'>, 'os': <module 'os' from 'D:\\Anaconda3\\lib\\os.py'>, 'torch': <module 'torch' from 'D:\\Anaconda3\\lib\\site-packages\\torch\\__init__.py'>, 'torchvision': <module 'torchvision' from 'D:\\Anaconda3\\lib\\site-packages\\torchvision\\__init__.py'>, 'np': <module 'numpy' from 'D:\\Anaconda3\\lib\\site-packages\\numpy\\__init__.py'>, 'nn': <module 'torch.nn' from 'D:\\Anaconda3\\lib\\site-packages\\torch\\nn\\__init__.py'>, 'F': <module 'torch.nn.functional' from 'D:\\Anaconda3\\lib\\site-packages\\torch\\nn\\functional.py'>, 'optim': <module 'torch.optim' from 'D:\\Anaconda3\\lib\\site-packages\\torch\\optim\\__init__.py'>, 'DataLoader': torch.utils.data.dataloader.DataLoader, 'random_split': <function random_split at 0x0000020C7FCF4C80>, 'datasets': <module 'torchvision.datasets' from 'D:\\Anaconda3\\lib\\site-packages\\torchvision\\datasets\\__init__.py'>, 'transforms': <module 'torchvision.transforms' from 'D:\\Anaconda3\\lib\\site-packages\\torchvision\\transforms\\__init__.py'>, 'Parameter': <class 'torch.nn.parameter.Parameter'>, 'GlobalVar': <class '__main__.GlobalVar'>, 'set_demo_value': <function set_demo_value at 0x0000020C21771EA0>, 'get_demo_value': <function get_demo_value at 0x0000020C0937B2F0>, 'Other_class': <class '__main__.Other_class'>, 'Net': <class '__main__.Net'>, 'inputs': tensor([[[[0.7344, 0.7221, 0.3775,  ..., 0.1456, 0.4173, 0.1059],
          [0.5824, 0.5221, 0.2762,  ..., 0.8132, 0.2255, 0.2311],
          ...,
          [0.7818, 0.9370, 0.9068,  ..., 0.9229, 0.3036, 0.1981]]],
        ...]), 'model': Net(
  (conv1): Conv2d(1, 32, kernel_size=(3, 3), stride=(1, 1))
  (conv2): Conv2d(32, 64, kernel_size=(3, 3), stride=(1, 1))
  (dropout1): Dropout2d(p=0.25, inplace=False)
  (dropout2): Dropout2d(p=0.5, inplace=False)
  (fc1): Linear(in_features=9216, out_features=128, bias=True)
  (fc2): Linear(in_features=128, out_features=10, bias=True)
), 'outputs': tensor([[-2.3425, -2.2923, -2.3561, -2.1765, -2.3585, -2.1959, -2.3089, -2.3954,
         -2.2897, -2.3328],
        [-2.4050, -2.2156, -2.2192, -2.2503, -2.4401, -2.2342, -2.3892, -2.3510,
         -2.2354, -2.3186],
        ...,
        [-2.3153, -2.2931, -2.2474, -2.3236, -2.3550, -2.1197, -2.3288, -2.4406,
         -2.3075, -2.3255]], grad_fn=<LogSoftmaxBackward>)}
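In practice the raw vars() dictionary is dominated by dunder entries, imported modules, and classes. A small filtering sketch (pure Python; the names `threshold` and `labels` are hypothetical) that keeps only plain data variables:

```python
import types

threshold = 0.5                 # hypothetical user-defined variables
labels = ["cat", "dog"]

user_vars = {
    name: value
    for name, value in vars().items()
    if not name.startswith("__")                   # drop __name__, __file__, ...
    and not isinstance(value, types.ModuleType)    # drop imported modules
    and not callable(value)                        # drop functions and classes
}
print(user_vars)  # e.g. {'threshold': 0.5, 'labels': ['cat', 'dog']}
```

Applied to the CNN example, a filter like this would leave only entries such as `inputs` and `outputs` (and `model`, if you also allow `nn.Module` instances through).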