Project page for the Java port of jieba ("结巴") Chinese word segmentation: https://github.com/huaban/jieba-analysis
1. Add the dependency
When building with Maven, add the following to your pom.xml:
<dependency>
    <groupId>com.huaban</groupId>
    <artifactId>jieba-analysis</artifactId>
    <version>1.0.2</version>
</dependency>
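Since the segmentation code later in this post is Scala, the same artifact can also be pulled in with sbt; this is simply the sbt form of the Maven coordinates above:

```scala
// sbt equivalent of the Maven dependency above
libraryDependencies += "com.huaban" % "jieba-analysis" % "1.0.2"
```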
2. Load a user-defined dictionary
import java.nio.file.Paths
import com.huaban.analysis.jieba.WordDictionary

// The dictionary lives on the classpath at dicts/jieba.dict (e.g. under src/main/resources)
val path = Paths.get(getClass.getClassLoader.getResource("dicts/jieba.dict").toURI)
WordDictionary.getInstance().loadUserDict(path)
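For reference, a user dictionary is a plain-text file with one entry per line: the word, optionally followed by a weight. The entries below are made-up examples, not from the original post; check the jieba-analysis README for the exact format `loadUserDict` accepts:

```
孙悟空 5
伸手不见五指 3
云计算
```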
3. Segment text
import scala.collection.JavaConverters._
import com.huaban.analysis.jieba.{JiebaSegmenter, SegToken, WordDictionary}
import com.huaban.analysis.jieba.JiebaSegmenter.SegMode
import scala.collection.mutable

val segmenter = new JiebaSegmenter()
val line = "这是一个伸手不见五指的黑夜。我叫孙悟空,我爱北京,我爱Python和C++。"
// process returns a java.util.List[SegToken]; asScala exposes it as a Scala Buffer
val list: mutable.Buffer[SegToken] = segmenter.process(line, SegMode.SEARCH).asScala
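The `.asScala` call is what bridges the Java API to Scala: `process` returns a `java.util.List[SegToken]`, and `JavaConverters` wraps it as a `mutable.Buffer`. A minimal sketch of that conversion, using a plain `ArrayList` of strings so it runs without the jieba jar:

```scala
import scala.collection.JavaConverters._
import scala.collection.mutable

// Stand-in for segmenter.process(...): the real call returns a
// java.util.List[SegToken] and needs the jieba jar on the classpath;
// a java.util.ArrayList[String] demonstrates the same conversion.
val javaList = new java.util.ArrayList[String]()
javaList.add("我")
javaList.add("爱")
javaList.add("北京")

// .asScala wraps the Java list as a scala.collection.mutable.Buffer
// without copying, so ordinary Scala collection methods work on it.
val tokens: mutable.Buffer[String] = javaList.asScala
println(tokens.mkString("/"))
```

Because the wrapper shares the underlying Java list, mutations through either side are visible to the other.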