Python - Word Tokenization

Word tokenization is the process of splitting a large sample of text into words. This is a requirement in natural language processing tasks where each word needs to be captured and subjected to further analysis, such as classifying and counting it for a particular sentiment. The Natural Language Toolkit (NLTK) is a library used to achieve this. Install NLTK before proceeding with the Python program for word tokenization.



conda install -c anaconda nltk
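
Besides installing the package, NLTK's tokenizers rely on pre-trained tokenizer models that must be downloaded once. A minimal sketch, assuming the standard 'punkt' resource is the one needed by word_tokenize and sent_tokenize:


import nltk

# Download the tokenizer models once; this is a no-op
# if the data is already present on the machine.
nltk.download('punkt')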

Next, we use the word_tokenize method to split the sample text into individual words.



import nltk

word_data = "It originated from the idea that there are readers who prefer learning new skills from the comforts of their drawing rooms"
# Split the text into individual word tokens
nltk_tokens = nltk.word_tokenize(word_data)
print(nltk_tokens)

When we execute the above code, it produces the following result.



['It', 'originated', 'from', 'the', 'idea', 'that', 'there', 'are', 'readers', 
'who', 'prefer', 'learning', 'new', 'skills', 'from', 'the',
'comforts', 'of', 'their', 'drawing', 'rooms']
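
Note that word_tokenize also treats punctuation marks as separate tokens, which keeps them available for downstream analysis. A small illustrative sketch (the sample string here is our own, not from the original tutorial):


import nltk

text = "Hello, world! Punctuation becomes separate tokens."
print(nltk.word_tokenize(text))
# Expected output:
# ['Hello', ',', 'world', '!', 'Punctuation', 'becomes', 'separate', 'tokens', '.']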

Tokenizing Sentences

We can also tokenize the sentences in a paragraph in the same way that we tokenized the words. We use the sent_tokenize method to achieve this. Below is an example.



import nltk

sentence_data = "Sun rises in the east. Sun sets in the west."
# Split the text into individual sentences
nltk_tokens = nltk.sent_tokenize(sentence_data)
print(nltk_tokens)

When we execute the above code, it produces the following result.



['Sun rises in the east.', 'Sun sets in the west.']
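
The two methods combine naturally: first split a paragraph into sentences with sent_tokenize, then split each sentence into words with word_tokenize. A brief sketch reusing the sentence_data string from above:


import nltk

sentence_data = "Sun rises in the east. Sun sets in the west."
# Tokenize the paragraph into sentences, then each sentence into words
for sentence in nltk.sent_tokenize(sentence_data):
    print(nltk.word_tokenize(sentence))
# ['Sun', 'rises', 'in', 'the', 'east', '.']
# ['Sun', 'sets', 'in', 'the', 'west', '.']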

Translated from: https://www.tutorialspoint.com/python_data_science/python_word_tokenization.htm
