Installing the HanLP Chinese word-segmentation plugin for Elasticsearch

The advantage of HanLP is that its data dictionaries are fairly comprehensive.

There is a plugin on GitHub, written by a Chinese developer, that adds HanLP support to ES:

https://github.com/pengcong90/elasticsearch-analysis-hanlp
Download its installation release package.

After downloading and unpacking it and following its installation instructions, the plugin could never find the hanlp.properties file.

Cloning the source with git revealed that the path was wrong.

package org.elasticsearch.index.analysis;

import com.hankcs.hanlp.HanLP;
import com.hankcs.hanlp.utility.Predefine;
import com.hankcs.lucene4.HanLPIndexAnalyzer;
import org.elasticsearch.common.inject.Inject;
import org.elasticsearch.common.inject.assistedinject.Assisted;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.env.Environment;
import org.elasticsearch.index.IndexSettings;

/**
*/
public class HanLPAnalyzerProvider extends AbstractIndexAnalyzerProvider {

    private final HanLPIndexAnalyzer analyzer;
    private static String sysPath = String.valueOf(System.getProperties().get("user.dir"));

    @Inject
    public HanLPAnalyzerProvider(IndexSettings indexSettings, Environment env, @Assisted String name, @Assisted Settings settings) {
        super(indexSettings, name, settings);
        // original (broken) path:
        //Predefine.HANLP_PROPERTIES_PATH = sysPath.substring(0, sysPath.length()-4) + "/plugins/analysis-hanlp/hanlp.properties";
        // corrected path:
        Predefine.HANLP_PROPERTIES_PATH = sysPath + "/plugins/analysis-hanlp/hanlp.properties";
        analyzer = new HanLPIndexAnalyzer(true);
    }

    public static HanLPAnalyzerProvider getIndexAnalyzerProvider(IndexSettings indexSettings, Environment env, String name, Settings settings) {
        return new HanLPAnalyzerProvider(indexSettings, env, name, settings);
    }

    public static HanLPAnalyzerProvider getSmartAnalyzerProvider(IndexSettings indexSettings, Environment env, String name, Settings settings) {
        return new HanLPAnalyzerProvider(indexSettings, env, name, settings);
    }

    @Override
    public HanLPIndexAnalyzer get() {
        return this.analyzer;
    }

}
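For context, here is a minimal sketch of how a provider like this is typically registered with Elasticsearch under the 5.x AnalysisPlugin API. The package, class, and analyzer names below (AnalysisHanLPPlugin, hanlp, hanlp-index) are assumptions based on the plugin's README; check the repository source for the actual wiring.

package org.elasticsearch.plugin.analysis;

import java.util.HashMap;
import java.util.Map;

import org.apache.lucene.analysis.Analyzer;
import org.elasticsearch.index.analysis.AnalyzerProvider;
import org.elasticsearch.index.analysis.HanLPAnalyzerProvider;
import org.elasticsearch.indices.analysis.AnalysisModule.AnalysisProvider;
import org.elasticsearch.plugins.AnalysisPlugin;
import org.elasticsearch.plugins.Plugin;

public class AnalysisHanLPPlugin extends Plugin implements AnalysisPlugin {

    @Override
    public Map<String, AnalysisProvider<AnalyzerProvider<? extends Analyzer>>> getAnalyzers() {
        Map<String, AnalysisProvider<AnalyzerProvider<? extends Analyzer>>> analyzers = new HashMap<>();
        // the static factory methods on HanLPAnalyzerProvider match the
        // AnalysisProvider#get(IndexSettings, Environment, String, Settings) signature
        analyzers.put("hanlp", HanLPAnalyzerProvider::getSmartAnalyzerProvider);
        analyzers.put("hanlp-index", HanLPAnalyzerProvider::getIndexAnalyzerProvider);
        return analyzers;
    }
}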

The plugin's bundled HanLP version is 1.2.8, while the latest is 1.5.4, so modify pom.xml to:

<dependency>
    <groupId>com.hankcs</groupId>
    <artifactId>hanlp</artifactId>
    <version>portable-1.5.4</version>
    <!--<systemPath>${pom.basedir}/lib/hanlp-1.2.8.jar</systemPath>-->
    <!--<scope>system</scope>-->
</dependency>

Package and compile.
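With a standard Maven layout this is just the usual package goal (a sketch; the exact artifact name and whether an extra assembly/zip step is needed depend on the plugin's pom.xml):

mvn clean package -DskipTests
# the built jar ends up under target/; copy it (together with hanlp.properties and
# the descriptor/policy files) into the analysis-hanlp directory created in the next step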

Create an analysis-hanlp directory under $ES_HOME/plugins (the original post shows the resulting layout in a screenshot).
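Roughly, and assuming default file names (the jar name depends on the build above, and data/ is added later from the HanLP release), the directory ends up containing:

analysis-hanlp/
    elasticsearch-analysis-hanlp-x.x.x.jar    (the jar built above; plus the hanlp jar if it is not shaded in)
    hanlp.properties
    data/                                     (HanLP dictionaries, downloaded separately, see below)
    plugin-descriptor.properties
    plugin-security.policy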

hanlp.properties (you can take it straight from the release of https://github.com/hankcs/HanLP and just change the root path):

# Root directory for every path in this file; root + relative path = full path
# (relative paths are supported, see https://github.com/hankcs/HanLP/pull/254).
# Windows users: always use / as the path separator.
root=/opt/elasticsearch-5.5.1/plugins/analysis-hanlp

# Core dictionary path
CoreDictionaryPath=data/dictionary/CoreNatureDictionary.txt
# Bigram dictionary path
BiGramDictionaryPath=data/dictionary/CoreNatureDictionary.ngram.txt
# Stop-word dictionary path
CoreStopWordDictionaryPath=data/dictionary/stopwords.txt
# Synonym dictionary path
CoreSynonymDictionaryDictionaryPath=data/dictionary/synonym/CoreSynonym.txt
# Person-name dictionary path
PersonDictionaryPath=data/dictionary/person/nr.txt
# Person-name transition matrix path
PersonDictionaryTrPath=data/dictionary/person/nr.tr.txt
# Traditional/Simplified Chinese dictionary root
tcDictionaryRoot=data/dictionary/tc
# Custom dictionary paths, separated by ';'. An entry starting with a space lives in the same
# directory as the previous one; the form "filename part-of-speech" makes that part of speech
# the dictionary's default. Priority decreases from left to right.
# data/dictionary/custom/CustomDictionary.txt is a high-quality dictionary, please do not delete it.
# All dictionaries must be UTF-8 encoded.
CustomDictionaryPath=data/dictionary/custom/CustomDictionary.txt; 现代汉语补充词库.txt; 全国地名大全.txt ns; 人名词典.txt; 机构名词典.txt; 上海地名.txt ns;data/dictionary/person/nrf.txt nrf;
# CRF segmentation model path
CRFSegmentModelPath=data/model/segment/CRFSegmentModel.txt
# HMM segmentation model path
HMMSegmentModelPath=data/model/segment/HMMSegmentModel.bin
# Whether to show part-of-speech tags in the segmentation result
ShowTermNature=true
# IO adapter: implement com.hankcs.hanlp.corpus.io.IIOAdapter to run HanLP on other
# platforms (Hadoop, Redis, etc.). The default adapter below uses the plain file system.
IOAdapter=com.hankcs.hanlp.corpus.io.FileIOAdapter
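To check the dictionary paths independently of Elasticsearch, a small standalone smoke test can help. This is a hypothetical helper, not part of the plugin; run it with the hanlp jar on the classpath and hanlp.properties reachable (for example on the classpath, or by setting Predefine.HANLP_PROPERTIES_PATH as in the provider above).

import com.hankcs.hanlp.HanLP;
import com.hankcs.hanlp.seg.common.Term;

import java.util.List;

public class HanLPSmokeTest {
    public static void main(String[] args) {
        // prints the segmented terms, e.g. [商品/n, 和/cc, 服务/vn];
        // problems loading the dictionaries configured above show up in HanLP's log output
        List<Term> terms = HanLP.segment("商品和服务");
        System.out.println(terms);
    }
}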

Fill in plugin-descriptor.properties and plugin-security.policy following the corresponding files in the elasticsearch-analysis-hanlp release package.

Adjust the Elasticsearch JVM options, then start ES:

vim /opt/elasticsearch-5.5.1/config/jvm.options

Add the line:

-Djava.security.policy=/opt/elasticsearch-5.5.1/plugins/analysis-hanlp/plugin-security.policy
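For reference, plugin-security.policy is a standard Java policy file. Its exact contents should be taken from the release package as noted above; the permissions below are only illustrative of what such a file looks like:

grant {
    // HanLP reads its dictionaries and writes the .bin caches it generates next to them
    permission java.io.FilePermission "<<ALL FILES>>", "read,write,delete";
    permission java.lang.RuntimePermission "createClassLoader";
};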

To test whether the installation succeeded:

GET /_analyze?analyzer=hanlp-index&pretty=true
{
  "text": "*部:各地校车将享最高路权"
}
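If the plugin is loaded correctly, the response is the usual _analyze token list. The tokens below are placeholders; the actual terms and part-of-speech types depend on HanLP's segmentation:

{
  "tokens" : [
    { "token" : "…", "start_offset" : 0, "end_offset" : 2, "type" : "n", "position" : 0 },
    { "token" : "…", "start_offset" : 3, "end_offset" : 5, "type" : "n", "position" : 1 }
  ]
}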

The data dictionaries can be downloaded from https://github.com/hankcs/HanLP/releases; just unzip them into the plugin directory configured as root above.
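For example (the data archive name on the releases page varies by version, so treat it as a placeholder):

cd /opt/elasticsearch-5.5.1/plugins/analysis-hanlp
# download the data-for-x.x.x.zip asset from the releases page, then:
unzip data-for-x.x.x.zip    # produces the data/ directory referenced by hanlp.properties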

This article originally comes from 小白鸽's blog.
