1. Overview of the jieba library
jieba is an excellent third-party library for Chinese word segmentation
- Chinese text must be segmented to obtain the individual words it contains
- jieba is a third-party library, so it must be installed separately
- jieba provides three segmentation modes; the simplest usage requires mastering only one function
2. Installing the jieba library
(at the cmd command line) pip install jieba, or the older easy_install jieba
C:\Users\lenovo>easy_install jieba
Searching for jieba
Reading https://pypi.python.org/simple/jieba/
Downloading https://files.pythonhosted.org/packages/71/46/c6f9179f73b818d5827202ad1c4a94e371a29473b7f043b736b4dab6b8cd/jieba-0.39.zip#sha256=de385e48582a4862e55a9167334d0fbe91d479026e5dac40e59e22c08b8e883e
Best match: jieba 0.39
Processing jieba-0.39.zip
Writing C:\Users\lenovo\AppData\Local\Temp\easy_install-o02rlo5j\jieba-0.39\setup.cfg
Running jieba-0.39\setup.py -q bdist_egg --dist-dir C:\Users\lenovo\AppData\Local\Temp\easy_install-o02rlo5j\jieba-0.39\egg-dist-tmp-9zp6cf8i
zip_safe flag not set; analyzing archive contents...
jieba.__pycache__._compat.cpython-37: module references __file__
jieba.analyse.__pycache__.tfidf.cpython-37: module references __file__
creating d:\python37\lib\site-packages\jieba-0.39-py3.7.egg
Extracting jieba-0.39-py3.7.egg to d:\python37\lib\site-packages
Adding jieba 0.39 to easy-install.pth file
Installed d:\python37\lib\site-packages\jieba-0.39-py3.7.egg
Processing dependencies for jieba
Finished processing dependencies for jieba
3. How jieba segmentation works
(1) jieba segmentation relies on a Chinese word dictionary
- It uses a Chinese dictionary to determine the association probability between characters
- Character sequences with high probability are grouped into words, producing the segmentation result