Demo: reading real-time Kafka topic data with Python, including installing the kafka module

 

1. Installing the kafka module

Installing kafka-python (reposted from: https://blog.csdn.net/see_you_see_me/article/details/78468421)

1. Preparation

The most commonly used library for working with Kafka from Python is kafka-python. Installing it depends on the setuptools and six libraries, so we first download each of these packages.
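If the host has pip and internet access, a quicker alternative to the manual download-and-install steps below is simply:

pip install kafka-python

The steps below follow the offline, source-package route.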

1. Download setuptools

Open the setuptools download page; a download dialog pops up. Choose to save the file and click OK to download setuptools-0.6c11-py2.6.egg.

2. Download kafka-python

Open http://pypi.python.org, type kafka-python in the search box, and click Search to bring up the package page. The page lists the required Python versions, but in testing this package also runs fine under Python 2.6.6.

 

Click Download, then select kafka-python-1.3.4.tar.gz (md5) to start the download.

3. Download six

Open http://pypi.python.org, type six in the search box, and click Search.

Open the six 1.11.0 page and click the download link to get six-1.11.0.tar.gz.

2. Install the Python libraries

In the previous step we downloaded the required packages; now we install them. First create the directory /opt/package/python_lib and upload the package files there.

1. Install setuptools

Run sh setuptools-0.6c11-py2.6.egg

The output shows that setuptools was installed successfully.

2. Install six

1) Extract

Run tar -zxvf six-1.11.0.tar.gz

Extracting it produces a six-1.11.0 directory.

2) Install

cd six-1.11.0

ll

Then run python setup.py install

3. Install kafka-python

Run tar -zxvf kafka-python-1.3.4.tar.gz to extract the package. This creates a kafka-python-1.3.4 directory; change into it.

Run python setup.py install. The output looks like this:

[root@node2 kafka-python-1.3.4]# python setup.py install
running install
running bdist_egg
running egg_info
creating kafka_python.egg-info
writing kafka_python.egg-info/PKG-INFO
writing top-level names to kafka_python.egg-info/top_level.txt
writing dependency_links to kafka_python.egg-info/dependency_links.txt
writing manifest file 'kafka_python.egg-info/SOURCES.txt'
reading manifest file 'kafka_python.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'kafka_python.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
creating build
creating build/lib
creating build/lib/kafka
copying kafka/future.py -> build/lib/kafka
copying kafka/client_async.py -> build/lib/kafka
copying kafka/errors.py -> build/lib/kafka
copying kafka/__init__.py -> build/lib/kafka
copying kafka/structs.py -> build/lib/kafka
copying kafka/context.py -> build/lib/kafka
copying kafka/cluster.py -> build/lib/kafka
copying kafka/conn.py -> build/lib/kafka
copying kafka/version.py -> build/lib/kafka
copying kafka/client.py -> build/lib/kafka
copying kafka/codec.py -> build/lib/kafka
copying kafka/util.py -> build/lib/kafka
copying kafka/common.py -> build/lib/kafka
creating build/lib/kafka/serializer
copying kafka/serializer/__init__.py -> build/lib/kafka/serializer
copying kafka/serializer/abstract.py -> build/lib/kafka/serializer
creating build/lib/kafka/partitioner
copying kafka/partitioner/hashed.py -> build/lib/kafka/partitioner
copying kafka/partitioner/roundrobin.py -> build/lib/kafka/partitioner
copying kafka/partitioner/__init__.py -> build/lib/kafka/partitioner
copying kafka/partitioner/base.py -> build/lib/kafka/partitioner
copying kafka/partitioner/default.py -> build/lib/kafka/partitioner
creating build/lib/kafka/consumer
copying kafka/consumer/__init__.py -> build/lib/kafka/consumer
copying kafka/consumer/base.py -> build/lib/kafka/consumer
copying kafka/consumer/group.py -> build/lib/kafka/consumer
copying kafka/consumer/simple.py -> build/lib/kafka/consumer
copying kafka/consumer/subscription_state.py -> build/lib/kafka/consumer
copying kafka/consumer/fetcher.py -> build/lib/kafka/consumer
copying kafka/consumer/multiprocess.py -> build/lib/kafka/consumer
creating build/lib/kafka/producer
copying kafka/producer/future.py -> build/lib/kafka/producer
copying kafka/producer/__init__.py -> build/lib/kafka/producer
copying kafka/producer/buffer.py -> build/lib/kafka/producer
copying kafka/producer/base.py -> build/lib/kafka/producer
copying kafka/producer/record_accumulator.py -> build/lib/kafka/producer
copying kafka/producer/simple.py -> build/lib/kafka/producer
copying kafka/producer/kafka.py -> build/lib/kafka/producer
copying kafka/producer/sender.py -> build/lib/kafka/producer
copying kafka/producer/keyed.py -> build/lib/kafka/producer
creating build/lib/kafka/vendor
copying kafka/vendor/socketpair.py -> build/lib/kafka/vendor
copying kafka/vendor/__init__.py -> build/lib/kafka/vendor
copying kafka/vendor/six.py -> build/lib/kafka/vendor
copying kafka/vendor/selectors34.py -> build/lib/kafka/vendor
creating build/lib/kafka/protocol
copying kafka/protocol/legacy.py -> build/lib/kafka/protocol
copying kafka/protocol/pickle.py -> build/lib/kafka/protocol
copying kafka/protocol/admin.py -> build/lib/kafka/protocol
copying kafka/protocol/struct.py -> build/lib/kafka/protocol
copying kafka/protocol/message.py -> build/lib/kafka/protocol
copying kafka/protocol/__init__.py -> build/lib/kafka/protocol
copying kafka/protocol/offset.py -> build/lib/kafka/protocol
copying kafka/protocol/metadata.py -> build/lib/kafka/protocol
copying kafka/protocol/fetch.py -> build/lib/kafka/protocol
copying kafka/protocol/commit.py -> build/lib/kafka/protocol
copying kafka/protocol/group.py -> build/lib/kafka/protocol
copying kafka/protocol/abstract.py -> build/lib/kafka/protocol
copying kafka/protocol/produce.py -> build/lib/kafka/protocol
copying kafka/protocol/api.py -> build/lib/kafka/protocol
copying kafka/protocol/types.py -> build/lib/kafka/protocol
creating build/lib/kafka/metrics
copying kafka/metrics/quota.py -> build/lib/kafka/metrics
copying kafka/metrics/kafka_metric.py -> build/lib/kafka/metrics
copying kafka/metrics/measurable.py -> build/lib/kafka/metrics
copying kafka/metrics/__init__.py -> build/lib/kafka/metrics
copying kafka/metrics/metric_name.py -> build/lib/kafka/metrics
copying kafka/metrics/measurable_stat.py -> build/lib/kafka/metrics
copying kafka/metrics/dict_reporter.py -> build/lib/kafka/metrics
copying kafka/metrics/stat.py -> build/lib/kafka/metrics
copying kafka/metrics/compound_stat.py -> build/lib/kafka/metrics
copying kafka/metrics/metrics.py -> build/lib/kafka/metrics
copying kafka/metrics/metric_config.py -> build/lib/kafka/metrics
copying kafka/metrics/metrics_reporter.py -> build/lib/kafka/metrics
creating build/lib/kafka/coordinator
copying kafka/coordinator/__init__.py -> build/lib/kafka/coordinator
copying kafka/coordinator/base.py -> build/lib/kafka/coordinator
copying kafka/coordinator/protocol.py -> build/lib/kafka/coordinator
copying kafka/coordinator/heartbeat.py -> build/lib/kafka/coordinator
copying kafka/coordinator/consumer.py -> build/lib/kafka/coordinator
creating build/lib/kafka/metrics/stats
copying kafka/metrics/stats/rate.py -> build/lib/kafka/metrics/stats
copying kafka/metrics/stats/percentile.py -> build/lib/kafka/metrics/stats
copying kafka/metrics/stats/min_stat.py -> build/lib/kafka/metrics/stats
copying kafka/metrics/stats/sampled_stat.py -> build/lib/kafka/metrics/stats
copying kafka/metrics/stats/__init__.py -> build/lib/kafka/metrics/stats
copying kafka/metrics/stats/count.py -> build/lib/kafka/metrics/stats
copying kafka/metrics/stats/histogram.py -> build/lib/kafka/metrics/stats
copying kafka/metrics/stats/max_stat.py -> build/lib/kafka/metrics/stats
copying kafka/metrics/stats/sensor.py -> build/lib/kafka/metrics/stats
copying kafka/metrics/stats/total.py -> build/lib/kafka/metrics/stats
copying kafka/metrics/stats/percentiles.py -> build/lib/kafka/metrics/stats
copying kafka/metrics/stats/avg.py -> build/lib/kafka/metrics/stats
creating build/lib/kafka/coordinator/assignors
copying kafka/coordinator/assignors/roundrobin.py -> build/lib/kafka/coordinator/assignors
copying kafka/coordinator/assignors/__init__.py -> build/lib/kafka/coordinator/assignors
copying kafka/coordinator/assignors/abstract.py -> build/lib/kafka/coordinator/assignors
copying kafka/coordinator/assignors/range.py -> build/lib/kafka/coordinator/assignors
creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/egg
creating build/bdist.linux-x86_64/egg/kafka
creating build/bdist.linux-x86_64/egg/kafka/serializer
copying build/lib/kafka/serializer/__init__.py -> build/bdist.linux-x86_64/egg/kafka/serializer
copying build/lib/kafka/serializer/abstract.py -> build/bdist.linux-x86_64/egg/kafka/serializer
creating build/bdist.linux-x86_64/egg/kafka/partitioner
copying build/lib/kafka/partitioner/hashed.py -> build/bdist.linux-x86_64/egg/kafka/partitioner
copying build/lib/kafka/partitioner/roundrobin.py -> build/bdist.linux-x86_64/egg/kafka/partitioner
copying build/lib/kafka/partitioner/__init__.py -> build/bdist.linux-x86_64/egg/kafka/partitioner
copying build/lib/kafka/partitioner/base.py -> build/bdist.linux-x86_64/egg/kafka/partitioner
copying build/lib/kafka/partitioner/default.py -> build/bdist.linux-x86_64/egg/kafka/partitioner
copying build/lib/kafka/future.py -> build/bdist.linux-x86_64/egg/kafka
creating build/bdist.linux-x86_64/egg/kafka/consumer
copying build/lib/kafka/consumer/__init__.py -> build/bdist.linux-x86_64/egg/kafka/consumer
copying build/lib/kafka/consumer/base.py -> build/bdist.linux-x86_64/egg/kafka/consumer
copying build/lib/kafka/consumer/group.py -> build/bdist.linux-x86_64/egg/kafka/consumer
copying build/lib/kafka/consumer/simple.py -> build/bdist.linux-x86_64/egg/kafka/consumer
copying build/lib/kafka/consumer/subscription_state.py -> build/bdist.linux-x86_64/egg/kafka/consumer
copying build/lib/kafka/consumer/fetcher.py -> build/bdist.linux-x86_64/egg/kafka/consumer
copying build/lib/kafka/consumer/multiprocess.py -> build/bdist.linux-x86_64/egg/kafka/consumer
creating build/bdist.linux-x86_64/egg/kafka/producer
copying build/lib/kafka/producer/future.py -> build/bdist.linux-x86_64/egg/kafka/producer
copying build/lib/kafka/producer/__init__.py -> build/bdist.linux-x86_64/egg/kafka/producer
copying build/lib/kafka/producer/buffer.py -> build/bdist.linux-x86_64/egg/kafka/producer
copying build/lib/kafka/producer/base.py -> build/bdist.linux-x86_64/egg/kafka/producer
copying build/lib/kafka/producer/record_accumulator.py -> build/bdist.linux-x86_64/egg/kafka/producer
copying build/lib/kafka/producer/simple.py -> build/bdist.linux-x86_64/egg/kafka/producer
copying build/lib/kafka/producer/kafka.py -> build/bdist.linux-x86_64/egg/kafka/producer
copying build/lib/kafka/producer/sender.py -> build/bdist.linux-x86_64/egg/kafka/producer
copying build/lib/kafka/producer/keyed.py -> build/bdist.linux-x86_64/egg/kafka/producer
copying build/lib/kafka/client_async.py -> build/bdist.linux-x86_64/egg/kafka
copying build/lib/kafka/errors.py -> build/bdist.linux-x86_64/egg/kafka
copying build/lib/kafka/__init__.py -> build/bdist.linux-x86_64/egg/kafka
creating build/bdist.linux-x86_64/egg/kafka/vendor
copying build/lib/kafka/vendor/socketpair.py -> build/bdist.linux-x86_64/egg/kafka/vendor
copying build/lib/kafka/vendor/__init__.py -> build/bdist.linux-x86_64/egg/kafka/vendor
copying build/lib/kafka/vendor/six.py -> build/bdist.linux-x86_64/egg/kafka/vendor
copying build/lib/kafka/vendor/selectors34.py -> build/bdist.linux-x86_64/egg/kafka/vendor
copying build/lib/kafka/structs.py -> build/bdist.linux-x86_64/egg/kafka
creating build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/legacy.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/pickle.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/admin.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/struct.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/message.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/__init__.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/offset.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/metadata.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/fetch.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/commit.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/group.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/abstract.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/produce.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/api.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/protocol/types.py -> build/bdist.linux-x86_64/egg/kafka/protocol
copying build/lib/kafka/context.py -> build/bdist.linux-x86_64/egg/kafka
copying build/lib/kafka/cluster.py -> build/bdist.linux-x86_64/egg/kafka
copying build/lib/kafka/conn.py -> build/bdist.linux-x86_64/egg/kafka
creating build/bdist.linux-x86_64/egg/kafka/metrics
copying build/lib/kafka/metrics/quota.py -> build/bdist.linux-x86_64/egg/kafka/metrics
copying build/lib/kafka/metrics/kafka_metric.py -> build/bdist.linux-x86_64/egg/kafka/metrics
copying build/lib/kafka/metrics/measurable.py -> build/bdist.linux-x86_64/egg/kafka/metrics
copying build/lib/kafka/metrics/__init__.py -> build/bdist.linux-x86_64/egg/kafka/metrics
copying build/lib/kafka/metrics/metric_name.py -> build/bdist.linux-x86_64/egg/kafka/metrics
creating build/bdist.linux-x86_64/egg/kafka/metrics/stats
copying build/lib/kafka/metrics/stats/rate.py -> build/bdist.linux-x86_64/egg/kafka/metrics/stats
copying build/lib/kafka/metrics/stats/percentile.py -> build/bdist.linux-x86_64/egg/kafka/metrics/stats
copying build/lib/kafka/metrics/stats/min_stat.py -> build/bdist.linux-x86_64/egg/kafka/metrics/stats
copying build/lib/kafka/metrics/stats/sampled_stat.py -> build/bdist.linux-x86_64/egg/kafka/metrics/stats
copying build/lib/kafka/metrics/stats/__init__.py -> build/bdist.linux-x86_64/egg/kafka/metrics/stats
copying build/lib/kafka/metrics/stats/count.py -> build/bdist.linux-x86_64/egg/kafka/metrics/stats
copying build/lib/kafka/metrics/stats/histogram.py -> build/bdist.linux-x86_64/egg/kafka/metrics/stats
copying build/lib/kafka/metrics/stats/max_stat.py -> build/bdist.linux-x86_64/egg/kafka/metrics/stats
copying build/lib/kafka/metrics/stats/sensor.py -> build/bdist.linux-x86_64/egg/kafka/metrics/stats
copying build/lib/kafka/metrics/stats/total.py -> build/bdist.linux-x86_64/egg/kafka/metrics/stats
copying build/lib/kafka/metrics/stats/percentiles.py -> build/bdist.linux-x86_64/egg/kafka/metrics/stats
copying build/lib/kafka/metrics/stats/avg.py -> build/bdist.linux-x86_64/egg/kafka/metrics/stats
copying build/lib/kafka/metrics/measurable_stat.py -> build/bdist.linux-x86_64/egg/kafka/metrics
copying build/lib/kafka/metrics/dict_reporter.py -> build/bdist.linux-x86_64/egg/kafka/metrics
copying build/lib/kafka/metrics/stat.py -> build/bdist.linux-x86_64/egg/kafka/metrics
copying build/lib/kafka/metrics/compound_stat.py -> build/bdist.linux-x86_64/egg/kafka/metrics
copying build/lib/kafka/metrics/metrics.py -> build/bdist.linux-x86_64/egg/kafka/metrics
copying build/lib/kafka/metrics/metric_config.py -> build/bdist.linux-x86_64/egg/kafka/metrics
copying build/lib/kafka/metrics/metrics_reporter.py -> build/bdist.linux-x86_64/egg/kafka/metrics
copying build/lib/kafka/version.py -> build/bdist.linux-x86_64/egg/kafka
copying build/lib/kafka/client.py -> build/bdist.linux-x86_64/egg/kafka
copying build/lib/kafka/codec.py -> build/bdist.linux-x86_64/egg/kafka
copying build/lib/kafka/util.py -> build/bdist.linux-x86_64/egg/kafka
creating build/bdist.linux-x86_64/egg/kafka/coordinator
copying build/lib/kafka/coordinator/__init__.py -> build/bdist.linux-x86_64/egg/kafka/coordinator
copying build/lib/kafka/coordinator/base.py -> build/bdist.linux-x86_64/egg/kafka/coordinator
copying build/lib/kafka/coordinator/protocol.py -> build/bdist.linux-x86_64/egg/kafka/coordinator
creating build/bdist.linux-x86_64/egg/kafka/coordinator/assignors
copying build/lib/kafka/coordinator/assignors/roundrobin.py -> build/bdist.linux-x86_64/egg/kafka/coordinator/assignors
copying build/lib/kafka/coordinator/assignors/__init__.py -> build/bdist.linux-x86_64/egg/kafka/coordinator/assignors
copying build/lib/kafka/coordinator/assignors/abstract.py -> build/bdist.linux-x86_64/egg/kafka/coordinator/assignors
copying build/lib/kafka/coordinator/assignors/range.py -> build/bdist.linux-x86_64/egg/kafka/coordinator/assignors
copying build/lib/kafka/coordinator/heartbeat.py -> build/bdist.linux-x86_64/egg/kafka/coordinator
copying build/lib/kafka/coordinator/consumer.py -> build/bdist.linux-x86_64/egg/kafka/coordinator
copying build/lib/kafka/common.py -> build/bdist.linux-x86_64/egg/kafka
byte-compiling build/bdist.linux-x86_64/egg/kafka/serializer/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/serializer/abstract.py to abstract.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/partitioner/hashed.py to hashed.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/partitioner/roundrobin.py to roundrobin.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/partitioner/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/partitioner/base.py to base.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/partitioner/default.py to default.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/future.py to future.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/consumer/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/consumer/base.py to base.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/consumer/group.py to group.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/consumer/simple.py to simple.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/consumer/subscription_state.py to subscription_state.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/consumer/fetcher.py to fetcher.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/consumer/multiprocess.py to multiprocess.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/producer/future.py to future.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/producer/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/producer/buffer.py to buffer.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/producer/base.py to base.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/producer/record_accumulator.py to record_accumulator.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/producer/simple.py to simple.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/producer/kafka.py to kafka.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/producer/sender.py to sender.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/producer/keyed.py to keyed.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/client_async.py to client_async.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/errors.py to errors.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/vendor/socketpair.py to socketpair.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/vendor/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/vendor/six.py to six.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/vendor/selectors34.py to selectors34.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/structs.py to structs.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/legacy.py to legacy.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/pickle.py to pickle.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/admin.py to admin.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/struct.py to struct.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/message.py to message.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/offset.py to offset.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/metadata.py to metadata.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/fetch.py to fetch.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/commit.py to commit.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/group.py to group.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/abstract.py to abstract.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/produce.py to produce.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/api.py to api.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/protocol/types.py to types.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/context.py to context.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/cluster.py to cluster.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/conn.py to conn.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/quota.py to quota.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/kafka_metric.py to kafka_metric.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/measurable.py to measurable.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/metric_name.py to metric_name.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/stats/rate.py to rate.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/stats/percentile.py to percentile.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/stats/min_stat.py to min_stat.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/stats/sampled_stat.py to sampled_stat.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/stats/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/stats/count.py to count.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/stats/histogram.py to histogram.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/stats/max_stat.py to max_stat.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/stats/sensor.py to sensor.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/stats/total.py to total.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/stats/percentiles.py to percentiles.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/stats/avg.py to avg.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/measurable_stat.py to measurable_stat.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/dict_reporter.py to dict_reporter.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/stat.py to stat.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/compound_stat.py to compound_stat.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/metrics.py to metrics.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/metric_config.py to metric_config.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/metrics/metrics_reporter.py to metrics_reporter.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/version.py to version.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/client.py to client.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/codec.py to codec.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/util.py to util.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/coordinator/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/coordinator/base.py to base.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/coordinator/protocol.py to protocol.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/coordinator/assignors/roundrobin.py to roundrobin.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/coordinator/assignors/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/coordinator/assignors/abstract.py to abstract.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/coordinator/assignors/range.py to range.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/coordinator/heartbeat.py to heartbeat.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/coordinator/consumer.py to consumer.pyc
byte-compiling build/bdist.linux-x86_64/egg/kafka/common.py to common.pyc
creating build/bdist.linux-x86_64/egg/EGG-INFO
copying kafka_python.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying kafka_python.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying kafka_python.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying kafka_python.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
zip_safe flag not set; analyzing archive contents...
kafka.vendor.six: module references __path__
creating dist
creating 'dist/kafka_python-1.3.4-py2.6.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)
Processing kafka_python-1.3.4-py2.6.egg
creating /usr/lib/python2.6/site-packages/kafka_python-1.3.4-py2.6.egg
Extracting kafka_python-1.3.4-py2.6.egg to /usr/lib/python2.6/site-packages
Adding kafka-python 1.3.4 to easy-install.pth file

Installed /usr/lib/python2.6/site-packages/kafka_python-1.3.4-py2.6.egg
Processing dependencies for kafka-python==1.3.4
Finished processing dependencies for kafka-python==1.3.4
[root@node2 kafka-python-1.3.4]# 

 

Next, test it: start the python interpreter and import KafkaProducer. If no "module not found" error appears, kafka-python was installed successfully.
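A quick sanity check in the interpreter (the version string reflects whatever was installed):

>>> import kafka
>>> kafka.__version__
'1.3.4'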

3. Write test code

Whichever language you use to work with Kafka, everything essentially revolves around two roles: the Producer and the Consumer.

A topic has already been created in the Kafka broker.
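If the topic does not exist yet, the standard Kafka CLI can create it; a sketch, assuming a Kafka installation of that era with ZooKeeper reachable at 192.168.120.11:2181 and the 5-partition layout used later in this post:

bin/kafka-topics.sh --create --zookeeper 192.168.120.11:2181 --replication-factor 1 --partitions 5 --topic world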

1. Create a Producer

1) Interactive mode: basic send

 

 
[root@node1 python_app]# python
Python 2.6.6 (r266:84292, Nov 22 2013, 12:16:22)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from kafka import KafkaProducer
>>> producer = KafkaProducer(bootstrap_servers='192.168.120.11:9092')
>>> for _ in range(100):
...     producer.send('world',b'some_message_bytes')
...
<kafka.producer.future.FutureRecordMetadata object at 0xddb5d0>
<kafka.producer.future.FutureRecordMetadata object at 0xddb750>
<kafka.producer.future.FutureRecordMetadata object at 0xddb790>

 

The lines above do the following:

Import KafkaProducer.

Create a Producer connected to the broker at 192.168.120.11:9092.

Loop 100 times, sending a message with the payload some_message_bytes to the world topic each time. This style of send does not specify a partition, so Kafka spreads the messages roughly evenly across the topic's 5 partitions. The <kafka.producer.future.FutureRecordMetadata ...> lines in the output are the futures returned by each send() call; the sketch below shows how to wait on them.


For more details, see https://kafka-python.readthedocs.io/en/master/index.html
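Because send() is asynchronous and returns a FutureRecordMetadata, you can block on the future to confirm delivery. A minimal sketch (same assumed broker address as above):

from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers='192.168.120.11:9092')
# send() does not block; it returns a future
future = producer.send('world', b'some_message_bytes')
# wait up to 10 seconds for the broker to acknowledge the record
record_metadata = future.get(timeout=10)
print("%s:%d:%d" % (record_metadata.topic, record_metadata.partition, record_metadata.offset))
# flush() blocks until all buffered records have been transmitted
producer.flush()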

 

2) Interactive mode: sending a JSON string

JSON is a powerful and very widely used text format, and kafka-python supports sending JSON-formatted messages.

However, if you follow the KafkaProducer JSON example at https://kafka-python.readthedocs.io/en/master/index.html verbatim, it fails. This looks like a bug in that documentation snippet: it omits bootstrap_servers, so the client cannot find a broker:

 

 
>>> producer = KafkaProducer(value_serializer=lambda v: json.dumps(v).encode('utf-8'))
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.6/site-packages/kafka_python-1.3.4-py2.6.egg/kafka/producer/kafka.py", line 347, in __init__
    **self.config)
  File "/usr/lib/python2.6/site-packages/kafka_python-1.3.4-py2.6.egg/kafka/client_async.py", line 220, in __init__
    self.config['api_version'] = self.check_version(timeout=check_timeout)
  File "/usr/lib/python2.6/site-packages/kafka_python-1.3.4-py2.6.egg/kafka/client_async.py", line 841, in check_version
    raise Errors.NoBrokersAvailable()
kafka.errors.NoBrokersAvailable: NoBrokersAvailable

After testing, this is the form that actually works:

 

 

 
>>> producer = KafkaProducer(bootstrap_servers='192.168.120.11:9092',value_serializer=lambda v: json.dumps(v).encode('utf-8'))
>>> producer.send('world', {'key1': 'value1'})
<kafka.producer.future.FutureRecordMetadata object at 0x2a9ebd0>
>>>
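Note that json has to be imported first. A self-contained version of the JSON producer (same assumed broker address):

import json
from kafka import KafkaProducer

# serialize every value to a UTF-8 encoded JSON string before it is sent
producer = KafkaProducer(
    bootstrap_servers='192.168.120.11:9092',
    value_serializer=lambda v: json.dumps(v).encode('utf-8'))
producer.send('world', {'key1': 'value1'})
producer.flush()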

 

 

3) Interactive mode: sending a plain string

 
>>> producer.send('world', key=b'foo', value=b'bar')
<kafka.producer.future.FutureRecordMetadata object at 0x29dcd90>
>>>
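When a key is supplied, kafka-python's default partitioner hashes it to choose the partition, so records with the same key land in the same partition. A short sketch (the keys here are illustrative, using the same producer as above):

# both records go to the same partition of 'world' because the key matches
producer.send('world', key=b'user-42', value=b'login')
producer.send('world', key=b'user-42', value=b'logout')
producer.flush()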

 

4) Interactive mode: sending compressed messages

 

 
>>> producer = KafkaProducer(bootstrap_servers='192.168.120.11:9092',compression_type='gzip')
>>> producer.send('world', b'msg 1')


In testing, messages sent this way still show up on the consumer as plain strings. That is expected: kafka-python handles gzip compression and decompression transparently, so the consumer sees the original payload (python-lz4 is only needed for lz4-compressed messages). The kafka-python documentation notes:

 

kafka-python supports gzip compression/decompression natively. To produce or consume lz4 compressed messages, you should install python-lz4 (pip install lz4). To enable snappy, install python-snappy (also requires snappy library). See Installation for more information.
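For lz4, the python-lz4 package mentioned above has to be installed (pip install lz4). After that, an lz4 producer looks like this sketch (same assumed broker address; the compression remains transparent to consumers):

from kafka import KafkaProducer

# requires the python-lz4 package to be installed
producer = KafkaProducer(bootstrap_servers='192.168.120.11:9092',
                         compression_type='lz4')
producer.send('world', b'msg compressed with lz4')
producer.flush()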

 

The examples above exercised individual calls. Next, let's write a complete script that sends the names of the files in a given directory to the world topic.

The file_monitor.py script:

 

 
#-*- coding: utf-8 -*-

from kafka import KafkaProducer
import json
import os
import time
from sys import argv

producer = KafkaProducer(bootstrap_servers='192.168.120.11:9092')

def log(str):
    t = time.strftime(r"%Y-%m-%d_%H-%M-%S",time.localtime())
    print("[%s]%s"%(t,str))

def list_file(path):
    # send every file name under path to the 'world' topic
    dir_list = os.listdir(path)
    for f in dir_list:
        producer.send('world',f)
        # flush after every send so each message is delivered immediately;
        # simple, though flushing once after the loop would be faster
        producer.flush()
        log('send: %s' % (f))

list_file(argv[1])
producer.close()
log('done')

 

 

Suppose we want to monitor the files under /opt/jdk1.8.0_91/lib/missioncontrol/features; run it like this:

python file_monitor.py  /opt/jdk1.8.0_91/lib/missioncontrol/features

The output looks like this:

 

 
[root@node2 python_app]# python file_monitor.py /opt/jdk1.8.0_91/lib/missioncontrol/features
[2017-11-07_17-41-04]send: org.eclipse.ecf.filetransfer.ssl.feature_1.0.0.v20140827-1444
[2017-11-07_17-41-04]send: org.eclipse.emf.common_2.10.1.v20140901-1043
[2017-11-07_17-41-04]send: com.jrockit.mc.feature.rcp.ja_5.5.0.165303
[2017-11-07_17-41-04]send: com.jrockit.mc.feature.console_5.5.0.165303
[2017-11-07_17-41-04]send: org.eclipse.ecf.core.feature_1.1.0.v20140827-1444
[2017-11-07_17-41-04]send: org.eclipse.equinox.p2.core.feature_1.3.0.v20140523-0116
[2017-11-07_17-41-04]send: org.eclipse.ecf.filetransfer.httpclient4.ssl.feature_1.0.0.v20140827-1444
[2017-11-07_17-41-04]send: com.jrockit.mc.feature.rcp_5.5.0.165303
[2017-11-07_17-41-04]send: org.eclipse.babel.nls_eclipse_zh_4.4.0.v20140623020002
[2017-11-07_17-41-04]send: com.jrockit.mc.rcp.product_5.5.0.165303
[2017-11-07_17-41-04]send: org.eclipse.help_2.0.102.v20141007-2301
[2017-11-07_17-41-04]send: org.eclipse.ecf.core.ssl.feature_1.0.0.v20140827-1444
[2017-11-07_17-41-04]send: org.eclipse.ecf.filetransfer.httpclient4.feature_3.9.1.v20140827-1444
[2017-11-07_17-41-04]send: org.eclipse.e4.rcp_1.3.100.v20141007-2033
[2017-11-07_17-41-04]send: org.eclipse.babel.nls_eclipse_ja_4.4.0.v20140623020002
[2017-11-07_17-41-04]send: com.jrockit.mc.feature.flightrecorder_5.5.0.165303
[2017-11-07_17-41-04]send: org.eclipse.emf.ecore_2.10.1.v20140901-1043
[2017-11-07_17-41-04]send: org.eclipse.equinox.p2.rcp.feature_1.2.0.v20140523-0116
[2017-11-07_17-41-04]send: org.eclipse.ecf.filetransfer.feature_3.9.0.v20140827-1444
[2017-11-07_17-41-04]send: com.jrockit.mc.feature.core_5.5.0.165303
[2017-11-07_17-41-04]send: org.eclipse.rcp_4.4.0.v20141007-2301
[2017-11-07_17-41-04]send: com.jrockit.mc.feature.rcp.zh_CN_5.5.0.165303
[2017-11-07_17-41-04]done

 

 

On the consumer side, the output looks like this:

 

 
world:4:93: key=None value=org.eclipse.ecf.filetransfer.ssl.feature_1.0.0.v20140827-1444
world:1:112: key=None value=org.eclipse.emf.common_2.10.1.v20140901-1043
world:3:119: key=None value=com.jrockit.mc.feature.console_5.5.0.165303
world:1:113: key=None value=com.jrockit.mc.feature.rcp.ja_5.5.0.165303
world:0:86: key=None value=org.eclipse.ecf.core.feature_1.1.0.v20140827-1444
world:1:114: key=None value=org.eclipse.equinox.p2.core.feature_1.3.0.v20140523-0116
world:4:94: key=None value=org.eclipse.ecf.filetransfer.httpclient4.ssl.feature_1.0.0.v20140827-1444
world:0:87: key=None value=com.jrockit.mc.feature.rcp_5.5.0.165303
world:4:95: key=None value=org.eclipse.babel.nls_eclipse_zh_4.4.0.v20140623020002
world:2:66: key=None value=com.jrockit.mc.rcp.product_5.5.0.165303
world:4:96: key=None value=org.eclipse.ecf.core.ssl.feature_1.0.0.v20140827-1444
world:2:67: key=None value=org.eclipse.help_2.0.102.v20141007-2301
world:1:115: key=None value=org.eclipse.ecf.filetransfer.httpclient4.feature_3.9.1.v20140827-1444
world:4:97: key=None value=org.eclipse.e4.rcp_1.3.100.v20141007-2033
world:0:88: key=None value=org.eclipse.babel.nls_eclipse_ja_4.4.0.v20140623020002
world:4:98: key=None value=com.jrockit.mc.feature.flightrecorder_5.5.0.165303
world:3:120: key=None value=org.eclipse.emf.ecore_2.10.1.v20140901-1043
world:1:116: key=None value=org.eclipse.equinox.p2.rcp.feature_1.2.0.v20140523-0116
world:4:99: key=None value=org.eclipse.ecf.filetransfer.feature_3.9.0.v20140827-1444
world:2:68: key=None value=com.jrockit.mc.feature.core_5.5.0.165303
world:4:100: key=None value=com.jrockit.mc.feature.rcp.zh_CN_5.5.0.165303
world:2:69: key=None value=org.eclipse.rcp_4.4.0.v20141007-2301

 

 

2. Create a Consumer

In normal Kafka usage you create different topics, each with multiple partitions, so a consumer typically connects to a specific broker and a specific topic to consume messages.

The complete Python script, consumer.py:

 

 
#-*- coding: utf-8 -*-

from kafka import KafkaConsumer
import time

def log(str):
    t = time.strftime(r"%Y-%m-%d_%H-%M-%S",time.localtime())
    print("[%s]%s"%(t,str))

log('start consumer')
# consume the 'world' topic on 192.168.120.11:9092, using consumer group consumer-20171017
consumer = KafkaConsumer('world', group_id='consumer-20171017', bootstrap_servers=['192.168.120.11:9092'])
for msg in consumer:
    recv = "%s:%d:%d: key=%s value=%s" % (msg.topic, msg.partition, msg.offset, msg.key, msg.value)
    log(recv)

 

 

Restart the file_monitor.py script:

 

 
[root@node2 python_app]# python file_monitor.py /opt/jdk1.8.0_91/lib/missioncontrol/features
[2017-11-07_18-00-31]send: org.eclipse.ecf.filetransfer.ssl.feature_1.0.0.v20140827-1444
[2017-11-07_18-00-31]send: org.eclipse.emf.common_2.10.1.v20140901-1043
[2017-11-07_18-00-31]send: com.jrockit.mc.feature.rcp.ja_5.5.0.165303
[2017-11-07_18-00-31]send: com.jrockit.mc.feature.console_5.5.0.165303
[2017-11-07_18-00-31]send: org.eclipse.ecf.core.feature_1.1.0.v20140827-1444
[2017-11-07_18-00-31]send: org.eclipse.equinox.p2.core.feature_1.3.0.v20140523-0116
[2017-11-07_18-00-31]send: org.eclipse.ecf.filetransfer.httpclient4.ssl.feature_1.0.0.v20140827-1444
[2017-11-07_18-00-31]send: com.jrockit.mc.feature.rcp_5.5.0.165303
[2017-11-07_18-00-31]send: org.eclipse.babel.nls_eclipse_zh_4.4.0.v20140623020002
[2017-11-07_18-00-31]send: com.jrockit.mc.rcp.product_5.5.0.165303
[2017-11-07_18-00-31]send: org.eclipse.help_2.0.102.v20141007-2301
[2017-11-07_18-00-31]send: org.eclipse.ecf.core.ssl.feature_1.0.0.v20140827-1444
[2017-11-07_18-00-31]send: org.eclipse.ecf.filetransfer.httpclient4.feature_3.9.1.v20140827-1444
[2017-11-07_18-00-31]send: org.eclipse.e4.rcp_1.3.100.v20141007-2033
[2017-11-07_18-00-31]send: org.eclipse.babel.nls_eclipse_ja_4.4.0.v20140623020002
[2017-11-07_18-00-31]send: com.jrockit.mc.feature.flightrecorder_5.5.0.165303
[2017-11-07_18-00-31]send: org.eclipse.emf.ecore_2.10.1.v20140901-1043
[2017-11-07_18-00-31]send: org.eclipse.equinox.p2.rcp.feature_1.2.0.v20140523-0116
[2017-11-07_18-00-31]send: org.eclipse.ecf.filetransfer.feature_3.9.0.v20140827-1444
[2017-11-07_18-00-31]send: com.jrockit.mc.feature.core_5.5.0.165303
[2017-11-07_18-00-31]send: org.eclipse.rcp_4.4.0.v20141007-2301
[2017-11-07_18-00-31]send: com.jrockit.mc.feature.rcp.zh_CN_5.5.0.165303
[2017-11-07_18-00-31]done


Then start the consumer.py script:

 

 

 
[root@node1 python_app]# python consumer.py
[2017-09-23_11-34-00]start consumer
[2017-09-23_11-34-10]world:3:121: key=None value=org.eclipse.ecf.filetransfer.ssl.feature_1.0.0.v20140827-1444
[2017-09-23_11-34-10]world:2:70: key=None value=org.eclipse.emf.common_2.10.1.v20140901-1043
[2017-09-23_11-34-10]world:3:122: key=None value=com.jrockit.mc.feature.rcp.ja_5.5.0.165303
[2017-09-23_11-34-10]world:2:71: key=None value=com.jrockit.mc.feature.console_5.5.0.165303
[2017-09-23_11-34-10]world:0:89: key=None value=org.eclipse.ecf.core.feature_1.1.0.v20140827-1444
[2017-09-23_11-34-10]world:4:101: key=None value=org.eclipse.equinox.p2.core.feature_1.3.0.v20140523-0116
[2017-09-23_11-34-10]world:1:117: key=None value=org.eclipse.ecf.filetransfer.httpclient4.ssl.feature_1.0.0.v20140827-1444
[2017-09-23_11-34-10]world:2:72: key=None value=com.jrockit.mc.feature.rcp_5.5.0.165303
[2017-09-23_11-34-10]world:4:102: key=None value=org.eclipse.babel.nls_eclipse_zh_4.4.0.v20140623020002
[2017-09-23_11-34-10]world:2:73: key=None value=com.jrockit.mc.rcp.product_5.5.0.165303
[2017-09-23_11-34-10]world:3:123: key=None value=org.eclipse.help_2.0.102.v20141007-2301
[2017-09-23_11-34-10]world:3:124: key=None value=org.eclipse.ecf.core.ssl.feature_1.0.0.v20140827-1444
[2017-09-23_11-34-10]world:0:90: key=None value=com.jrockit.mc.feature.flightrecorder_5.5.0.165303
[2017-09-23_11-34-10]world:3:125: key=None value=org.eclipse.ecf.filetransfer.httpclient4.feature_3.9.1.v20140827-1444
[2017-09-23_11-34-10]world:3:126: key=None value=org.eclipse.e4.rcp_1.3.100.v20141007-2033
[2017-09-23_11-34-10]world:3:127: key=None value=org.eclipse.babel.nls_eclipse_ja_4.4.0.v20140623020002
[2017-09-23_11-34-10]world:2:74: key=None value=org.eclipse.emf.ecore_2.10.1.v20140901-1043
[2017-09-23_11-34-10]world:3:128: key=None value=org.eclipse.equinox.p2.rcp.feature_1.2.0.v20140523-0116
[2017-09-23_11-34-10]world:0:91: key=None value=com.jrockit.mc.feature.core_5.5.0.165303
[2017-09-23_11-34-10]world:3:129: key=None value=org.eclipse.ecf.filetransfer.feature_3.9.0.v20140827-1444
[2017-09-23_11-34-11]world:3:130: key=None value=org.eclipse.rcp_4.4.0.v20141007-2301
[2017-09-23_11-34-11]world:3:131: key=None value=com.jrockit.mc.feature.rcp.zh_CN_5.5.0.165303

You can see that file_monitor.py sent a batch of file names to the world topic, and that consumer.py received them.
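By default a new consumer group only sees messages produced after it starts. To also read everything already in the topic and then exit once the topic goes quiet, a variation like this can be used (the group id and timeout values are illustrative):

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    'world',
    group_id='consumer-backfill',              # hypothetical group id
    bootstrap_servers=['192.168.120.11:9092'],
    auto_offset_reset='earliest',              # start from the oldest offset if no commit exists
    consumer_timeout_ms=10000)                 # stop iterating after 10s without new messages
for msg in consumer:
    print("%s:%d:%d: key=%s value=%s" % (msg.topic, msg.partition, msg.offset, msg.key, msg.value))
consumer.close()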

2. Reference code

from kafka import KafkaConsumer
from kafka.client import KafkaClient
import time

class KafkaPython:
    consumer = None
    TOPIC = 'test_topic'
    BROKER_LIST = '10.4.146.15:9092,10.4.146.15:9092'
    server = topic = None

    def __init__(self):
        print("begin kafka-python")
        self.server = self.BROKER_LIST
        self.topic = self.TOPIC

    def __del__(self):
        print("end")

    def getConnect(self):
        # connect a consumer to the configured brokers and topic
        self.consumer = KafkaConsumer(self.topic, bootstrap_servers = self.server)

    def beginConsumer(self):
        # block and print every message as it arrives
        for oneLog in self.consumer:
            print(oneLog)

    def disConnect(self):
        # close the consumer connection
        self.consumer.close()

if __name__ == '__main__':
    kp = KafkaPython()
    kp.getConnect()
    kp.beginConsumer()

 
