Elasticsearch

Installation (pitfalls)
$ wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-5.5.1.zip
$ unzip elasticsearch-5.5.1.zip
$ cd elasticsearch-5.5.1/
$ ./bin/elasticsearch

If this reports the error "max virtual memory areas vm.max_map_count [65530] is too low", run the following command:
$ sudo sysctl -w vm.max_map_count=262144

If everything is fine, Elastic runs on the default port 9200. Open another terminal window and request that port to get the server's info response:
$ curl localhost:9200

{
  "name" : "atntrTf",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "tf9250XhQ6ee4h7YI11anA",
  "version" : {
    "number" : "5.5.1",
    "build_hash" : "19c13d0",
    "build_date" : "2017-07-18T20:44:24.823Z",
    "build_snapshot" : false,
    "lucene_version" : "6.6.0"
  },
  "tagline" : "You Know, for Search"
}
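For scripting a startup check, individual fields can be pulled out of this info response without installing a JSON tool. A minimal sketch using plain sed, which relies on the one-line `"key" : "value"` layout of the response shown above:

```shell
#!/bin/sh
# json_field FIELD TEXT -- print the string value of FIELD from a JSON
# response laid out one "key" : "value" pair per line, as above.
json_field() {
  printf '%s\n' "$2" | sed -n "s/.*\"$1\"[^\"]*\"\([^\"]*\)\".*/\1/p"
}
```

For example, `json_field tagline "$(curl -s localhost:9200)"` would print `You Know, for Search` against the response above.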

1-1. Problem: ERROR: bootstrap checks failed
max file descriptors [4096] for elasticsearch process likely too low, increase to at least [65536]
max number of threads [1024] for user [lishang] likely too low, increase to at least [2048]
Fix: switch to the root user and add entries like the following to limits.conf:
vi /etc/security/limits.conf
Add these lines (the new limits take effect on the user's next login):
* soft nofile 65536
* hard nofile 131072
* soft nproc 2048
* hard nproc 4096

1-2. Problem:
max virtual memory areas vm.max_map_count [65530] likely too low, increase to at least [262144]
Fix: switch to the root user and edit sysctl.conf:
vi /etc/sysctl.conf
Add this setting (655360 comfortably exceeds the required minimum of 262144):
vm.max_map_count=655360
Then apply it:
sysctl -p
Restart Elasticsearch and it should now start successfully.
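Both bootstrap checks above are plain numeric comparisons against fixed thresholds, so you can see which ones would fail before starting Elasticsearch. A minimal sketch; the thresholds are the ones quoted in the error messages, and the sample inputs below are the default values from those messages rather than readings from a live system:

```shell
#!/bin/sh
# Compare a current value against the minimum a bootstrap check demands.
check_limit() {
  # $1 = check name, $2 = current value, $3 = required minimum
  if [ "$2" -lt "$3" ]; then
    echo "$1: $2 is too low, increase to at least $3"
  else
    echo "$1: $2 ok"
  fi
}

# Fed the defaults quoted in the error messages above, each line
# prints a "... is too low, increase to at least ..." diagnosis:
check_limit "max file descriptors"  4096  65536
check_limit "max number of threads" 1024  2048
check_limit "vm.max_map_count"      65530 262144
```

To check the live values instead, feed in `ulimit -n`, `ulimit -u`, and `sysctl -n vm.max_map_count` respectively.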

The ik Chinese word-segmentation plugin
$ ./bin/elasticsearch-plugin install https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v5.5.1/elasticsearch-analysis-ik-5.5.1.zip
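The plugin release has to match the Elasticsearch version exactly (5.5.1 on both sides here); `elasticsearch-plugin install` rejects a mismatched build. The release URL follows a fixed pattern, so a small helper can build it for whichever 5.x version you run (the GitHub path is the one from the command above):

```shell
#!/bin/sh
# Build the ik release URL matching a given Elasticsearch version.
ik_url() {
  echo "https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v$1/elasticsearch-analysis-ik-$1.zip"
}

ik_url 5.5.1
```

`ik_url 5.5.1` prints exactly the URL used in the install command above.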

Basic operations

List all indices on the current node
$ curl -X GET 'http://localhost:9200/_cat/indices?v'

List the types of every index
$ curl 'localhost:9200/_mapping?pretty=true'

Command template
$ curl -X [METHOD] [url:port]/[index-name]
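The template above can be wrapped in a tiny shell function so the host only has to be typed once. A sketch; ES_HOST is an assumed variable, defaulting to the localhost:9200 used throughout this page:

```shell
#!/bin/sh
# es METHOD PATH [extra curl args...] -- sketch of the command template above.
es() {
  method=$1
  path=$2
  shift 2
  curl -s -X "$method" "${ES_HOST:-localhost:9200}/$path" "$@"
}
```

With it, the index listing above becomes `es GET '_cat/indices?v'`, and request bodies can still be passed through, e.g. `es PUT accounts/person/1 -d @doc.json`.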

Create an index named weather
$ curl -X PUT 'localhost:9200/weather'

{
  "acknowledged":true,
  "shards_acknowledged":true
}

Delete the index
$ curl -X DELETE 'localhost:9200/weather'

Create an index named accounts, with Chinese word segmentation configured on its fields
$ curl -X PUT 'localhost:9200/accounts' -d '
{
  "mappings": {
    "person": {
      "properties": {
        "user": {
          "type": "text",
          "analyzer": "ik_max_word",
          "search_analyzer": "ik_max_word"
        },
        "title": {
          "type": "text",
          "analyzer": "ik_max_word",
          "search_analyzer": "ik_max_word"
        },
        "desc": {
          "type": "text",
          "analyzer": "ik_max_word",
          "search_analyzer": "ik_max_word"
        }
      }
    }
  }
}'
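A pitfall when copying requests like this from a web page: typographic quotes (“ ”) in the body make Elasticsearch reject it with a parse error. Syntax-checking the JSON locally before sending catches that. A sketch, assuming python3 is available (any JSON checker works); the body is an abbreviated version of the mapping above:

```shell
#!/bin/sh
# Write the mapping body to a file, then syntax-check it before sending.
cat > accounts-mapping.json <<'EOF'
{
  "mappings": {
    "person": {
      "properties": {
        "user": { "type": "text", "analyzer": "ik_max_word", "search_analyzer": "ik_max_word" }
      }
    }
  }
}
EOF

# json.tool exits non-zero on malformed input, so this guards the curl call.
python3 -m json.tool accounts-mapping.json > /dev/null && echo "valid JSON"
```

Once it validates, send the file instead of an inline body: `curl -X PUT 'localhost:9200/accounts' -d @accounts-mapping.json`.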

analyzer is the analyzer applied to the field's text at index time; search_analyzer is the one applied to the search terms at query time. The ik_max_word analyzer comes from the ik plugin and splits the text into the largest possible number of terms, i.e. the most fine-grained segmentation.

Template
$ curl -X [METHOD] '[url:port]/[index]/[type]/[id]' -d '
[JSON body]'

$ curl -X PUT 'localhost:9200/accounts/person/1' -d '
{
  "user": "张三",
  "title": "工程师",
  "desc": "数据库管理"
}'

{
  "_index":"accounts",
  "_type":"person",
  "_id":"1",
  "_version":1,
  "result":"created",
  "_shards":{"total":2,"successful":1,"failed":0},
  "created":true
}
