Using Four Chinese Word Segmentation Tools with Spark
HanLP
ansj
jieba
FudanNLP
I recommend ansj: it is fast and the segmentation quality is good.
jieba and HanLP also give decent results. A minimal ansj usage sketch follows the links below.
For details, see:
ansj:https://github.com/NLPchina/ansj_seg
HanLP:https://github.com/hankcs/HanLP
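
Before the full program, here is a minimal sketch of the core idea: ansj segmentation combined with a Spark RDD word count. The ansj calls (DicAnalysis.parse, DicLibrary.insert, StopRecognition) are the same ones used in my code further down; the object name, sample sentences, custom dictionary word, and stop word are invented purely for illustration, and in the real program the text comes from MySQL instead.

import org.ansj.library.DicLibrary
import org.ansj.recognition.impl.StopRecognition
import org.ansj.splitWord.analysis.DicAnalysis
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

import scala.collection.JavaConverters._

object AnsjWordCountSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .config(new SparkConf().setAppName("AnsjWordCountSketch").setMaster("local[2]"))
      .getOrCreate()
    val sc = spark.sparkContext

    // Hard-coded sample sentences; the real program reads its text from MySQL.
    val lines = sc.parallelize(Seq("我爱自然语言处理", "Spark下的中文分词很简单"))

    val counts = lines
      .mapPartitions { iter =>
        // Register the custom dictionary word and build the stop-word filter inside
        // the partition, so nothing non-serializable is captured by the closure and
        // ansj's static dictionary is initialized on the worker side as well.
        DicLibrary.insert(DicLibrary.DEFAULT, "自然语言处理")
        val stop = new StopRecognition()
        stop.insertStopNatures("w") // "w" is the punctuation nature tag in ansj
        stop.insertStopWords("的")
        iter.flatMap { line =>
          // DicAnalysis honours the user dictionary; recognition(stop) drops stop words.
          DicAnalysis.parse(line).recognition(stop).getTerms.asScala.map(_.getName)
        }
      }
      .map(word => (word, 1)) // classic RDD word count
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    spark.stop()
  }
}

Setting up the dictionary and the stop-word filter inside mapPartitions means each worker JVM builds its own ansj state once per partition, which also avoids closure-serialization problems when the job is not running in local mode.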
My code is below. Besides the segmentation itself, it includes Scala code for querying and inserting into a MySQL database, loading a custom dictionary, loading a stop-word dictionary, and a word count implemented with Spark RDDs.
package WordCloud

import java.sql.{Connection, DriverManager}
import java.util
import Mysql.ConvertToJson
import domain.tb_analyze_professional_skill
import org.ansj.library.DicLibrary
import scala.io.Source
import org.ansj.recognition.impl.StopRecognition
import org.ansj.splitWord.analysis.{DicAnalysis, ToAnalysis}
import org.apache.spark.sql.SparkSession
import org.apache.spark.SparkConf

/**
  * Created by ljq on 19-2-23.
  */
object WordCloud {
  def main(args: Array[String]): Unit = {
    // Run Spark locally with 4 threads.
    val conf = new SparkConf().setAppName("Wordcloud").setMaster("local[4]")
    val spark = SparkSession.builder().config(conf).getOrCreate()
    // Read the source data from MySQL via JDBC.
    val jdbcDF = spark.read.format