1. Preface
I have uploaded the project code to https://github.com/xiazi123/Test
2. Importing the Project into MyEclipse
Method 1:
Extract the downloaded archive (extract the es_hbase6 folder, not Test-master) into your MyEclipse Workspaces directory, then right-click in MyEclipse and choose Import to import the project.
Method 2:
Extract the downloaded archive to your Windows desktop, then import the project via right-click > Import in MyEclipse (I am using MyEclipse 10.7.1; if your version differs, the screens and options will look slightly different).
Import succeeded:
3. Preparation
3.1 Hadoop and HBase clusters are required to run this project
Install ZooKeeper as well; I prefer a standalone ZooKeeper over the one bundled with HBase. I installed the cdh5.5.2 release of each component. I won't repeat the installation steps here: skip this section if everything is already installed, otherwise see my other two articles, hadoop-2.6.0-cdh5.5.2 installation and zookeeper-3.4.5-cdh5.5.2 and hbase-1.0.0-cdh5.5.2 installation.
3.2 Installing the Elasticsearch cluster
My Linux distribution is CentOS 7.2.
(1) Download elasticsearch-2.2.0.tar.gz (download link: elasticsearch-2.2.0.tar.gz) and extract it:
[hadoop@h153 ~]$ tar -zxvf elasticsearch-2.2.0.tar.gz
(2) Sync it to the other two nodes:
[hadoop@h153 ~]$ scp -r elasticsearch-2.2.0/ hadoop@h154:/home/hadoop/
[hadoop@h153 ~]$ scp -r elasticsearch-2.2.0/ hadoop@h155:/home/hadoop/
(3) Edit the config file config/elasticsearch.yml on each node:
[hadoop@h153 elasticsearch-2.2.0]$ vi config/elasticsearch.yml
# Add:
cluster.name: my-application
node.name: node-1
network.host: 192.168.205.153
# Add split-brain prevention settings:
discovery.zen.ping.multicast.enabled: false
discovery.zen.ping_timeout: 120s
client.transport.ping_timeout: 60s
discovery.zen.ping.unicast.hosts: ["192.168.205.153","192.168.205.154","192.168.205.155"]
[hadoop@h154 elasticsearch-2.2.0]$ vi config/elasticsearch.yml
# Add:
cluster.name: my-application
node.name: node-2
network.host: 192.168.205.154
# Add split-brain prevention settings:
discovery.zen.ping.multicast.enabled: false
discovery.zen.ping_timeout: 120s
client.transport.ping_timeout: 60s
discovery.zen.ping.unicast.hosts: ["192.168.205.153","192.168.205.154","192.168.205.155"]
[hadoop@h155 elasticsearch-2.2.0]$ vi config/elasticsearch.yml
# Add:
cluster.name: my-application
node.name: node-3
network.host: 192.168.205.155
# Add split-brain prevention settings:
discovery.zen.ping.multicast.enabled: false
discovery.zen.ping_timeout: 120s
client.transport.ping_timeout: 60s
discovery.zen.ping.unicast.hosts: ["192.168.205.153","192.168.205.154","192.168.205.155"]
Note: to form a cluster, the nodes' elasticsearch configs must share the same cluster.name; once started, they will automatically discover each other and form a cluster. node.name can be anything, but it must be unique within the cluster.
(4) Install the ES monitoring plugin (I installed it on all three VMs, though in hindsight one node is probably enough; I still need to verify that):
[hadoop@h153 ~]$ cd elasticsearch-2.2.0/bin/
[hadoop@h153 bin]$ ./plugin install mobz/elasticsearch-head
-> Installing mobz/elasticsearch-head...
Trying https://github.com/mobz/elasticsearch-head/archive/master.zip ...
Downloading ..................................................................................DONE
Verifying https://github.com/mobz/elasticsearch-head/archive/master.zip checksums if available ...
NOTE: Unable to verify checksum for downloaded plugin (unable to find .sha1 or .md5 file to verify)
Installed head into /home/hadoop/elasticsearch-2.2.0/plugins/head
(5) Start the ES cluster, after the Hadoop, HBase, and ZooKeeper clusters are already running:
[hadoop@h153 ~]$ ./elasticsearch-2.2.0/bin/elasticsearch
[hadoop@h154 ~]$ ./elasticsearch-2.2.0/bin/elasticsearch
[hadoop@h155 ~]$ ./elasticsearch-2.2.0/bin/elasticsearch
3.3 Importing test data into HBase
Put the data file in a directory of your choice, e.g. C:\Users\huiqiang\Desktop\es\doc1.txt (fields separated by tabs):
1a hbase介绍及安装 阿里巴巴 hbase的服务器体系结构遵从简单的主从服务架 在很多图片上传以及文件下载操作的时候在很多图片上传以及文件上传下载操作的时候
2b docker的实战经验分享 百度 paas从2008年万众瞩目到2012年遭受质疑 最近十天在做一个博客系统,因为域名服务器都闲置已久
3c 实时推荐系统的方式 腾讯 推荐系统介绍,自从1992年施乐的科学家为了解决信息 这篇文章最要分享的是使用Apache的poi来实现数据导出到execl的功能,这里提供三种解决方案
4d hive的优化总结 华为 优化可以分为几个方面着手 在商品详情页处理这里的时候,因为我爱你
5e hive分区 启明星辰 1、在hive select查询中一般会扫描整个表内容 我们在使用kafka消费信息的过程中
6f hdfs原理分析 七牛 存储超大文件 在${KAFKA_HOME}/bin下,有很多的脚本,其中有一个kafka-run-class.sh
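Each line of doc1.txt is one record whose tab-separated fields supply the rowkey and the cf1 columns. A minimal sketch of that parsing (the field order rowkey, title, author, describe, content is an assumption based on the sample above; the class name is illustrative):

```java
public class DocLineParser {
    // Split one tab-separated line of doc1.txt into its five fields:
    // rowkey, title, author, describe, content (order assumed from the sample data).
    public static String[] parse(String line) {
        String[] fields = line.split("\t", -1);
        if (fields.length != 5) {
            throw new IllegalArgumentException("expected 5 tab-separated fields, got " + fields.length);
        }
        return fields;
    }
}
```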
The corresponding code is in Index.java.
Create the matching table in HBase:
hbase(main):010:0> create 'doc','cf1'
The corresponding code is in HbaseUtils.java.
4. Running the Project
4.1 Running in MyEclipse
Right-click EsController.java and run the project.
Open http://desktop-egkibnh:8080/es_hbase/create.jsp
Click "create index" to insert the data into HBase and build the ES index (open http://192.168.205.153:9200/_plugin/head/ in Chrome to inspect it):
Then enter a keyword at http://desktop-egkibnh:8080/es_hbase/ and search:
4.2 Running in IDEA
Note: this project requires JDK 1.7; it fails at runtime under 1.8. A local Tomcat installation is also required. After startup, a browser window opens automatically at http://localhost:8080/.
How it works: the project indexes title, describe, author, and id (the HBase rowkey) in ES, while HBase stores the complete record including the content field. A search returns ES results, which exclude content; clicking through to the detail page uses the returned id (the rowkey) to fetch the full record from HBase. Without this split, ES would effectively be a backup of the HBase data, every query and page could be served from ES alone, and HBase would seem redundant.
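The split described above can be sketched with plain maps standing in for the ES index and the HBase table (all names here are illustrative stand-ins, not the project's actual API):

```java
import java.util.HashMap;
import java.util.Map;

public class SearchThenFetch {
    // Stand-in for the ES index: rowkey -> indexed fields only (no content).
    static Map<String, Map<String, String>> esIndex = new HashMap<>();
    // Stand-in for the HBase table: rowkey -> full record including content.
    static Map<String, Map<String, String>> hbaseTable = new HashMap<>();

    static void ingest(String id, String title, String author, String describe, String content) {
        Map<String, String> full = new HashMap<>();
        full.put("title", title);
        full.put("author", author);
        full.put("describe", describe);
        full.put("content", content);
        hbaseTable.put(id, full);

        Map<String, String> indexed = new HashMap<>(full);
        indexed.remove("content"); // content lives only in HBase
        esIndex.put(id, indexed);
    }

    // Detail view: go back to HBase with the id taken from the search hit.
    static String detail(String id) {
        return hbaseTable.get(id).get("content");
    }
}
```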
4.3 Upgrading the Project
4.3.1 Converting to a Spring Boot project
The code boxed in the screenshot is identical to the es_hbase6 project and can be reused as-is; it runs fine on Java 1.7 or 1.8. The pom.xml, however, changes substantially:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.guoxin</groupId>
<artifactId>SpringbootTest</artifactId>
<version>1.0-SNAPSHOT</version>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.4.0.RELEASE</version>
<relativePath/>
</parent>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- elasticsearch -->
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
<version>2.2.0</version>
</dependency>
<!-- hbase -->
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>1.1.3</version>
<exclusions>
<exclusion>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.tomcat.embed</groupId>
<artifactId>tomcat-embed-jasper</artifactId>
</dependency>
<!-- The following two dependencies are required when using servlets -->
<!-- servlet API jar -->
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
</dependency>
<dependency>
<groupId>javax.servlet.jsp</groupId>
<artifactId>javax.servlet.jsp-api</artifactId>
<version>2.3.1</version>
</dependency>
<!-- Required when using JSTL -->
<!-- jstl tag library jar -->
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>jstl</artifactId>
</dependency>
</dependencies>
<build>
<resources>
<resource>
<!-- source location -->
<directory>src/main/webapp</directory>
<!-- compile into META-INF/resources; this target path must not be changed -->
<targetPath>META-INF/resources</targetPath>
<!-- files to include: ** matches the webapp directory and subdirectories, *.* matches all files -->
<includes>
<include>**/*.*</include>
</includes>
</resource>
<resource>
<directory>src/main/resources</directory>
<includes>
<include>**/*.*</include>
</includes>
</resource>
</resources>
<finalName>SpringBootTest</finalName>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
JSP is discouraged inside Spring Boot projects in favor of the Thymeleaf template engine, but JSP is still a technique worth knowing. For details, see the article on using JSP with Spring Boot.
Running the program in IDEA may produce the error below. It is caused by the lack of a local Hadoop installation and the corresponding environment variables; it can be ignored and does not affect normal operation. If you instead upload the SpringBootTest.jar from the target directory to the VM running HBase and ES and run it there, the error does not appear:
Note: unlike before, the browser no longer opens automatically after startup. You must manually open http://localhost:8080/index.jsp to reach the search page; http://localhost:8080/ does not work and returns an error:
Note: Stopwatch.java is also mandatory; without it, visiting localhost:8080/create.do triggers the following error:
2022-05-06 09:13:32.237 ERROR 15012 --- [nio-8080-exec-2] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Handler dispatch failed; nested exception is java.lang.IllegalAccessError: tried to access method com.google.common.base.Stopwatch.<init>()V from class org.apache.hadoop.hbase.zookeeper.MetaTableLocator] with root cause
java.lang.IllegalAccessError: tried to access method com.google.common.base.Stopwatch.<init>()V from class org.apache.hadoop.hbase.zookeeper.MetaTableLocator
at org.apache.hadoop.hbase.zookeeper.MetaTableLocator.blockUntilAvailable(MetaTableLocator.java:596) ~[hbase-client-1.1.3.jar:1.1.3]
at org.apache.hadoop.hbase.zookeeper.MetaTableLocator.blockUntilAvailable(MetaTableLocator.java:580) ~[hbase-client-1.1.3.jar:1.1.3]
at org.apache.hadoop.hbase.zookeeper.MetaTableLocator.blockUntilAvailable(MetaTableLocator.java:559) ~[hbase-client-1.1.3.jar:1.1.3]
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getMetaRegionLocation(ZooKeeperRegistry.java:61) ~[hbase-client-1.1.3.jar:1.1.3]
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateMeta(ConnectionManager.java:1186) ~[hbase-client-1.1.3.jar:1.1.3]
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1153) ~[hbase-client-1.1.3.jar:1.1.3]
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.relocateRegion(ConnectionManager.java:1127) ~[hbase-client-1.1.3.jar:1.1.3]
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1332) ~[hbase-client-1.1.3.jar:1.1.3]
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1156) ~[hbase-client-1.1.3.jar:1.1.3]
at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:370) ~[hbase-client-1.1.3.jar:1.1.3]
at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:321) ~[hbase-client-1.1.3.jar:1.1.3]
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:206) ~[hbase-client-1.1.3.jar:1.1.3]
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:183) ~[hbase-client-1.1.3.jar:1.1.3]
at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1449) ~[hbase-client-1.1.3.jar:1.1.3]
at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1040) ~[hbase-client-1.1.3.jar:1.1.3]
at com.hui.demo.HbaseUtils.put(HbaseUtils.java:85) ~[classes/:na]
at com.hui.demo.Index.createIndex(Index.java:37) ~[classes/:na]
at com.hui.demo.EsController.createIndex(EsController.java:30) ~[classes/:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_25]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_25]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_25]
at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_25]
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:221) ~[spring-web-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:136) ~[spring-web-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:114) ~[spring-webmvc-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:827) ~[spring-webmvc-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:738) ~[spring-webmvc-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85) ~[spring-webmvc-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:963) ~[spring-webmvc-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:897) ~[spring-webmvc-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970) ~[spring-webmvc-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861) ~[spring-webmvc-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:622) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846) ~[spring-webmvc-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:230) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52) ~[tomcat-embed-websocket-8.5.4.jar:8.5.4]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99) ~[spring-web-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.springframework.web.filter.HttpPutFormContentFilter.doFilterInternal(HttpPutFormContentFilter.java:87) ~[spring-web-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:77) ~[spring-web-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:197) ~[spring-web-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-4.3.2.RELEASE.jar:4.3.2.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:108) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:522) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:349) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:1110) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:785) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1425) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) ~[na:1.7.0_25]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) ~[na:1.7.0_25]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) ~[tomcat-embed-core-8.5.4.jar:8.5.4]
at java.lang.Thread.run(Thread.java:724) ~[na:1.7.0_25]
4.3.2 Upgrading the HBase and ES versions
The HBase and ES versions used above are quite old, so let's upgrade to Elasticsearch 7.14.1 and HBase 2.0.6, assuming the matching versions of both components are already installed on the Hadoop cluster in the VMs. Some of the Java files then need corresponding changes: Stopwatch.java is no longer needed, and the guava exclusion can be dropped from the hbase-client dependency; everything else stays the same.
pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.guoxin</groupId>
<artifactId>SpringbootTest</artifactId>
<version>1.0-SNAPSHOT</version>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.3.2.RELEASE</version>
<relativePath/>
</parent>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- elasticsearch -->
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
<version>7.14.1</version>
</dependency>
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>transport</artifactId>
<version>7.14.1</version>
</dependency>
<dependency>
<groupId>org.elasticsearch.plugin</groupId>
<artifactId>transport-netty4-client</artifactId>
<version>7.14.1</version>
</dependency>
<!-- hbase -->
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>2.0.6</version>
</dependency>
<dependency>
<groupId>org.apache.tomcat.embed</groupId>
<artifactId>tomcat-embed-jasper</artifactId>
</dependency>
<!-- The following two dependencies are required when using servlets -->
<!-- servlet API jar -->
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
</dependency>
<dependency>
<groupId>javax.servlet.jsp</groupId>
<artifactId>javax.servlet.jsp-api</artifactId>
<version>2.3.1</version>
</dependency>
<!-- Required when using JSTL -->
<!-- jstl tag library jar -->
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>jstl</artifactId>
</dependency>
</dependencies>
<build>
<resources>
<resource>
<!-- source location -->
<directory>src/main/webapp</directory>
<!-- compile into META-INF/resources; this target path must not be changed -->
<targetPath>META-INF/resources</targetPath>
<!-- files to include: ** matches the webapp directory and subdirectories, *.* matches all files -->
<includes>
<include>**/*.*</include>
</includes>
</resource>
<resource>
<directory>src/main/resources</directory>
<includes>
<include>**/*.*</include>
</includes>
</resource>
</resources>
<finalName>SpringBootTest</finalName>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
Esutil.java becomes:
package com.hui.demo;
import org.apache.commons.lang.StringUtils;
import org.elasticsearch.action.index.IndexResponse;
import org.elasticsearch.action.search.SearchRequestBuilder;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.action.search.SearchType;
import org.elasticsearch.client.Client;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.text.Text;
import org.elasticsearch.common.transport.TransportAddress;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.SearchHit;
import org.elasticsearch.search.SearchHits;
import org.elasticsearch.search.fetch.subphase.highlight.HighlightBuilder;
import org.elasticsearch.search.fetch.subphase.highlight.HighlightField;
import org.elasticsearch.transport.client.PreBuiltTransportClient;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
public class Esutil {
public static Client client = null;
/**
* Get (or lazily create) the shared transport client.
* @return the Client instance
*/
public static Client getClient() {
if(client!=null){
return client;
}
Settings settings = Settings.builder().put("cluster.name", "my-application").build();
try {
client = new PreBuiltTransportClient(settings)
.addTransportAddress(new TransportAddress(InetAddress.getByName("192.168.91.143"), 9300))
.addTransportAddress(new TransportAddress(InetAddress.getByName("192.168.91.144"), 9300))
.addTransportAddress(new TransportAddress(InetAddress.getByName("192.168.91.145"), 9300));
} catch (UnknownHostException e) {
e.printStackTrace();
}
return client;
}
public static String addIndex(String index,String type,Doc Doc){
HashMap<String, Object> hashMap = new HashMap<String, Object>();
hashMap.put("id", Doc.getId());
hashMap.put("title", Doc.getTitle());
hashMap.put("describe", Doc.getDescribe());
hashMap.put("author", Doc.getAuthor());
hashMap.put("content", Doc.getContent());
IndexResponse response = getClient().prepareIndex(index, type).setSource(hashMap).execute().actionGet();
return response.getId();
}
public static Map<String, Object> search(String key,String index,String type,int start,int row){
SearchRequestBuilder builder = getClient().prepareSearch(index);
builder.setTypes(type);
builder.setFrom(start);
builder.setSize(row);
HighlightBuilder hiBuilder=new HighlightBuilder();
hiBuilder.field("title");
hiBuilder.field("describe");
hiBuilder.field("author");
hiBuilder.field("content");
hiBuilder.preTags("<font color='red' >");
hiBuilder.postTags("</font>");
// register the highlight fields on the request
builder.highlighter(hiBuilder);
builder.setSearchType(SearchType.DFS_QUERY_THEN_FETCH);
if(StringUtils.isNotBlank(key)){
builder.setQuery(QueryBuilders.multiMatchQuery(key,"title","describe","author","content"));
}
builder.setExplain(true);
SearchResponse searchResponse = builder.get();
SearchHits hits = searchResponse.getHits();
long total = hits.getTotalHits().value;
Map<String, Object> map = new HashMap<String,Object>();
SearchHit[] hits2 = hits.getHits();
map.put("count", total);
List<Map<String, Object>> list = new ArrayList<Map<String, Object>>();
for (SearchHit searchHit : hits2) {
Map<String, HighlightField> highlightFields = searchHit.getHighlightFields();
HighlightField highlightField = highlightFields.get("title");
Map<String, Object> source = searchHit.getSourceAsMap();
if(highlightField!=null){
Text[] fragments = highlightField.fragments();
String name = "";
for (Text text : fragments) {
name+=text;
}
source.put("title", name);
}
HighlightField highlightField2 = highlightFields.get("describe");
if(highlightField2!=null){
Text[] fragments = highlightField2.fragments();
String describe = "";
for (Text text : fragments) {
describe+=text;
}
source.put("describe", describe);
}
HighlightField highlightField3 = highlightFields.get("author");
if(highlightField3!=null){
Text[] fragments = highlightField3.fragments();
String author = "";
for (Text text : fragments) {
author+=text;
}
source.put("author", author);
}
HighlightField highlightField4 = highlightFields.get("content");
if(highlightField4!=null){
Text[] fragments = highlightField4.fragments();
String content = "";
for (Text text : fragments) {
content+=text;
}
source.put("content", content);
}
list.add(source);
}
map.put("dataList", list);
return map;
}
}
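The four fragment-merging loops inside Esutil.search all follow the same pattern: concatenate the highlighted fragments of a field and overwrite the plain value in the source map. That pattern in isolation, with plain strings standing in for ES Text fragments (the class name is illustrative):

```java
public class HighlightMerge {
    // Concatenate highlight fragments into one string, mirroring the
    // name += text loops in Esutil.search.
    public static String merge(String[] fragments) {
        StringBuilder sb = new StringBuilder();
        for (String fragment : fragments) {
            sb.append(fragment);
        }
        return sb.toString();
    }
}
```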
HbaseUtils.java becomes:
package com.hui.demo;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.*;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;
import java.io.IOException;
public class HbaseUtils {
/**
* HBase table name
*/
public final String TABLE_NAME = "doc";
/**
* Column family 1: article info
*/
public final String COLUMNFAMILY_1 = "cf1";
/**
* Columns in column family 1
*/
public final String COLUMNFAMILY_1_TITLE = "title";
public final String COLUMNFAMILY_1_AUTHOR = "author";
public final String COLUMNFAMILY_1_CONTENT = "content";
public final String COLUMNFAMILY_1_DESCRIBE = "describe";
public static Admin admin = null;
public static Configuration conf = null;
public static Connection conn = null;
/**
* The constructor loads the configuration
*/
public HbaseUtils() {
Configuration conf = HBaseConfiguration.create();
conf.set("hbase.zookeeper.quorum", "192.168.91.143:2181");
conf.set("hbase.rootdir", "hdfs://192.168.91.143:9000/hbase");
try {
conn = ConnectionFactory.createConnection(conf);
admin = conn.getAdmin();
} catch (IOException e) {
e.printStackTrace();
}
}
// Read one record
@SuppressWarnings({ "deprecation", "resource" })
public Doc get(String tableName, String row) throws IOException {
Table table = conn.getTable(TableName.valueOf(tableName));
Get get = new Get(row.getBytes());
Doc Doc = null;
try {
Result result = table.get(get);
// KeyValue[] raw = result.raw();
Cell[] cells = result.rawCells();
Doc = new Doc();
Doc.setId(row);
for (Cell cell: cells){
// get the column qualifier name
String columnName = Bytes.toString(cell.getQualifierArray(), cell.getQualifierOffset(), cell.getQualifierLength());
if (columnName.equals("title")) {
Doc.setTitle(Bytes.toString(cell.getValueArray(),cell.getValueOffset(),cell.getValueLength()));
} else if (columnName.equals("author")) {
Doc.setAuthor(Bytes.toString(cell.getValueArray(),cell.getValueOffset(),cell.getValueLength()));
} else if (columnName.equals("content")) {
Doc.setContent(Bytes.toString(cell.getValueArray(),cell.getValueOffset(),cell.getValueLength()));
} else if (columnName.equals("describe")) {
Doc.setDescribe(Bytes.toString(cell.getValueArray(),cell.getValueOffset(),cell.getValueLength()));
}
}
} catch (IOException e) {
e.printStackTrace();
}
return Doc;
}
// Put one record
public void put(String tableName, String row, String columnFamily,
String column, String data) throws IOException {
Table table = conn.getTable(TableName.valueOf(tableName));
Put p1 = new Put(Bytes.toBytes(row));
p1.addColumn(columnFamily.getBytes(),column.getBytes(),data.getBytes());
table.put(p1);
System.out.println("put '" + row + "','" + columnFamily + ":" + column + "','" + data + "'");
}
}
4.3.3 Replacing JSP with Thymeleaf
ThymeleafController:
package com.hui.demo;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.ui.ModelMap;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import java.io.IOException;
import java.io.UnsupportedEncodingException;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
@Controller
@RequestMapping("/")
public class ThymeleafController {
static final Logger logger = LoggerFactory.getLogger(ThymeleafController.class);
HbaseUtils hbaseUtils = new HbaseUtils();
@Autowired
private Index index;
@RequestMapping("/create.do")
public String createIndex() throws Exception {
index.createIndex();
return "thymeleaf/create";
}
@RequestMapping("/search.do")
public String serachArticle(Model model,
@RequestParam(value="keyWords",required = false) String keyWords,
@RequestParam(value = "pageNum", defaultValue = "1") Integer pageNum,
@RequestParam(value = "pageSize", defaultValue = "3") Integer pageSize){
try {
keyWords = new String(keyWords.getBytes("ISO-8859-1"),"UTF-8");
} catch (UnsupportedEncodingException e) {
e.printStackTrace();
}
Map<String,Object> map = new HashMap<String, Object>();
int count = 0;
try {
map = Esutil.search(keyWords,"hui","qiang",(pageNum-1)*pageSize, pageSize);
count = Integer.parseInt(((Long) map.get("count")).toString());
} catch (Exception e) {
logger.error("Failed to query the index!", e);
e.printStackTrace();
}
PageUtil<Map<String, Object>> page = new PageUtil<Map<String, Object>>(String.valueOf(pageNum),String.valueOf(pageSize),count);
List<Map<String, Object>> articleList = (List<Map<String, Object>>)map.get("dataList");
page.setList(articleList);
model.addAttribute("total",count);
model.addAttribute("pageNum",pageNum);
model.addAttribute("page",page);
model.addAttribute("kw",keyWords);
return "thymeleaf/index";
}
/**
* View an article's detail
* @return the detail view name
*/
@RequestMapping("/detailDocById/{id}.do")
public String detailArticleById(@PathVariable(value="id") String id, Model modelMap) throws IOException {
Doc Doc = hbaseUtils.get(hbaseUtils.TABLE_NAME, id);
modelMap.addAttribute("Doc",Doc);
return "thymeleaf/detail";
}
@RequestMapping("/index")
public String index(ModelMap map){
return "thymeleaf/index";
}
}
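serachArticle converts the 1-based page number into an ES from/size offset with (pageNum-1)*pageSize. That arithmetic in isolation (PageUtil itself is the project's own class and is not reproduced here; the class and method names below are illustrative):

```java
public class PageOffset {
    // ES-style pagination: 'from' is the zero-based offset of the first hit on the page.
    public static int from(int pageNum, int pageSize) {
        if (pageNum < 1 || pageSize < 1) {
            throw new IllegalArgumentException("pageNum and pageSize must be >= 1");
        }
        return (pageNum - 1) * pageSize;
    }

    // Total number of pages needed to show 'count' hits.
    public static int pageCount(int count, int pageSize) {
        return (count + pageSize - 1) / pageSize;
    }
}
```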
application.properties:
##############################################
#
# Thymeleaf template configuration
#
##############################################
# default template location
spring.thymeleaf.prefix=classpath:/templates/
# file suffix
spring.thymeleaf.suffix=.html
# template mode
spring.thymeleaf.mode=HTML5
spring.thymeleaf.encoding=UTF-8
spring.thymeleaf.content-type=text/html
spring.thymeleaf.cache=false
create.html:
<!DOCTYPE html>
<html lang="en" xmlns:th="http://www.thymeleaf.org">
<head>
<link rel="stylesheet" th:href="@{/style.css}">
<title>My JSP 'index.jsp' starting page</title>
<meta http-equiv="pragma" content="no-cache">
<meta http-equiv="cache-control" content="no-cache">
<meta http-equiv="expires" content="0">
<meta http-equiv="keywords" content="keyword1,keyword2,keyword3">
<meta http-equiv="description" content="This is my page">
<!--
<link rel="stylesheet" type="text/css" href="styles.css">
-->
</head>
<body>
<a th:href="@{'/create.do'}">创建索引</a>
</body>
</html>
detail.html:
<!DOCTYPE html>
<html lang="en" xmlns:th="http://www.thymeleaf.org">
<head>
<link rel="stylesheet" th:href="@{/style.css}">
<title>My JSP 'index.jsp' starting page</title>
<meta http-equiv="pragma" content="no-cache">
<meta http-equiv="cache-control" content="no-cache">
<meta http-equiv="expires" content="0">
<meta http-equiv="keywords" content="keyword1,keyword2,keyword3">
<meta http-equiv="description" content="This is my page">
<!--
<link rel="stylesheet" type="text/css" href="styles.css">
-->
</head>
<body>
<h1 th:utext="${Doc.title}"></h1>
<div th:utext="${Doc.author}"></div>
<div th:utext="${Doc.content}"></div>
</body>
</html>
index.html:
<!DOCTYPE html>
<html lang="en" xmlns:th="http://www.thymeleaf.org">
<head>
<link rel="stylesheet" th:href="@{/style.css}">
<title>starting page</title>
<meta charset="UTF-8">
<meta http-equiv="pragma" content="no-cache">
<meta http-equiv="cache-control" content="no-cache">
<meta http-equiv="expires" content="0">
<meta http-equiv="keywords" content="keyword1,keyword2,keyword3">
<meta http-equiv="description" content="This is my page">
</head>
<body>
<form th:action="search.do" method="get">
<input type="text" name="keyWords" />
<input type="submit" value="千度一下">
<input type="hidden" value="1" name="pageNum">
</form>
<div th:if="${!#lists.isEmpty(page)}">
<h3 th:text="'千度为您找到相关结果约' + ${total} + '个'"></h3>
<div th:each="bean:${page.list}">
<a th:href="@{'/detailDocById/' + ${bean.id} + '.do'}">
<span th:utext="${bean.title}"></span>
</a>
<br/>
<br/>
<td th:utext="${bean.describe}"></td>
<br/>
<br/>
</div>
<span th:if="${page.hasPrevious}">
<a th:href="@{/search.do(pageNum=${page.previousPageNum},keyWords=${kw})}">上一页</a>
</span>
<span th:each="num:${#numbers.sequence(page.everyPageStart,page.everyPageEnd)}">
<a th:href="@{search.do(pageNum=${num},keyWords=${kw})}">
<span th:utext="${num}"></span>
</a>
</span>
<span th:if="${page.hasNext}">
<a th:href="@{/search.do(pageNum=${page.nextPageNum},keyWords=${kw})}">下一页</a>
</span>
</div>
</body>
</html>
pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.guoxin</groupId>
<artifactId>SpringbootTest</artifactId>
<version>1.0-SNAPSHOT</version>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.3.2.RELEASE</version>
<relativePath/>
</parent>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-thymeleaf</artifactId>
</dependency>
<!-- elasticsearch -->
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
<version>7.14.1</version>
</dependency>
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>transport</artifactId>
<version>7.14.1</version>
</dependency>
<dependency>
<groupId>org.elasticsearch.plugin</groupId>
<artifactId>transport-netty4-client</artifactId>
<version>7.14.1</version>
</dependency>
<!-- hbase -->
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>2.0.6</version>
</dependency>
</dependencies>
<build>
<finalName>SpringBootTest</finalName>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
Note: after launching the page, searching for Chinese text returned no results; it turned out the keyWords parameter arrived from the front end garbled.
Fix: in ThymeleafController, change keyWords = new String(keyWords.getBytes("ISO-8859-1"),"UTF-8"); to keyWords = new String(keyWords.getBytes("UTF-8"),"UTF-8"); (the replacement is effectively a no-op, so the real fix is removing the ISO-8859-1 re-decode: Spring Boot already delivers request parameters decoded as UTF-8).
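The garbling can be reproduced in plain Java. This is a minimal sketch, independent of the project code, showing why applying the ISO-8859-1 "repair" round-trip to a string that was already decoded correctly destroys CJK text:

```java
import java.nio.charset.StandardCharsets;

public class EncodingDemo {
    // The classic ISO-8859-1 -> UTF-8 "repair" round-trip.
    public static String repair(String s) {
        return new String(s.getBytes(StandardCharsets.ISO_8859_1),
                          StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String ok = "千度"; // already correctly decoded, as Spring Boot does by default
        // ISO-8859-1 cannot represent CJK characters, so getBytes() substitutes '?'
        // for each of them, and the "repair" yields garbage:
        System.out.println(repair(ok)); // prints "??"
    }
}
```

The repair is only correct when the container actually mis-decoded UTF-8 bytes as ISO-8859-1 in the first place; under Spring Boot's UTF-8 default it corrupts the input, which is why dropping it fixed the search.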
4.3.3.1 Rendering the detail page as a table
Change detail.html to:
<!DOCTYPE html>
<html lang="en" xmlns:th="http://www.w3.org/1999/xhtml">
<head>
<link rel="stylesheet" th:href="@{/style.css}">
<title>detail page</title>
<meta charset="UTF-8">
<meta http-equiv="pragma" content="no-cache">
<meta http-equiv="cache-control" content="no-cache">
<meta http-equiv="expires" content="0">
</head>
<body>
<table border='1' cellspacing='0'>
<thead>
<tr>
<td>序号</td>
<td>字段名称</td>
<td>内容</td>
</tr>
</thead>
<tbody>
<tr>
<td th:utext="1"></td>
<td th:utext="title"></td>
<td th:utext="${Doc.title}"></td>
</tr>
<tr>
<td th:utext="2"></td>
<td th:utext="author"></td>
<td th:utext="${Doc.author}"></td>
</tr>
<tr>
<td th:utext="3"></td>
<td th:utext="content"></td>
<td th:utext="${Doc.content}"></td>
</tr>
</tbody>
</table>
</body>
</html>
The rendered result:
4.3.3.2 UI polish
Change index.html to:
<!DOCTYPE html>
<html lang="en" xmlns:th="http://www.w3.org/1999/xhtml">
<head>
<link rel="stylesheet" th:href="@{/style.css}">
<title>starting page</title>
<meta charset="UTF-8">
<meta http-equiv="pragma" content="no-cache">
<meta http-equiv="cache-control" content="no-cache">
<meta http-equiv="expires" content="0">
<style>
.body_div {
background-image: url('background.png');
/*background: #123663f5;*/
}
.p_text {
margin-left: 70px;
margin-top: 50px;
color: #28ca8e;
}
.div_left {
border-radius: 3px;
background: #123663f5;
width: 85%;
height: 100px;
border: 1px solid #446fb7;
float: left;
margin-left: 70px;
}
.inpt_text {
outline: none;
width: 70%;
height: 30px;
margin-left: 30px;
margin-right: 20px;
margin-top: 30px;
border: 3px solid #4293e6;
}
.input_class {
cursor: pointer;
background: #28ca8e;
color: #ffffff;
height: 35px;
border: 1px solid #28ca8e;
width: 176px;
font-size: 16px;
line-height: 15px;
}
.span_text {
color: #00ffff;
margin-left: 10px;
font-size: 14px;
text-decoration: none;
}
.div_td {
margin-left: 10px;
margin-right: 10px;
font-size: 13px;
}
.div_content {
border: 1px solid #b39d4f;
width: 82%;
padding: 10px;
height: auto;
position: absolute;
margin-top: 150px;
color: #ffffff;
margin-left: 70px;
}
.div_page {
float: right;
margin-right: 20px;
}
.span_num {
color: #ffffff;
}
.span_num:hover {
color: #28ca8e;
}
.a_text {
color: #28ca8e;
}
.a_text:hover {
color: #10b578;
}
</style>
</head>
<body class="body_div">
<div>
<div>
<h2 class="p_text">条件查询:</h2>
</div>
<div class="div_left">
<form th:action="search.do" method="get">
<input class="inpt_text" type="text" name="keyWords"/>
<input class="input_class" type="submit" value="千度一下">
<input type="hidden" value="1" name="pageNum">
</form>
</div>
</div>
<!--<a th:href="@{'/create.do'}">上传文件并创建索引</a>-->
<div th:if="${!#lists.isEmpty(page)}" class="div_content">
<h3 th:text="'千度为您找到相关结果约' + ${total} + '个'"
style="color: #28ca8e; margin-left: 10px;
margin-top: 10px"></h3>
<div th:each="bean:${page.list}">
<a class="span_text" th:href="@{'/detailDocById/' + ${bean.id} + '.do'}">
<span th:utext="${bean.title}"></span>
</a>
<br/>
<br/>
<div class="div_td">
<span th:utext="${bean.describe}"></span>
</div>
<br/>
<br/>
</div>
<div class="div_page">
<span th:if="${page.hasPrevious}">
<a class="a_text" th:href="@{/search.do(pageNum=${page.previousPageNum},keyWords=${kw})}">上一页</a>
</span>
<span th:each="num:${#numbers.sequence(page.everyPageStart,page.everyPageEnd)}">
<a class="span_num" th:href="@{search.do(pageNum=${num},keyWords=${kw})}">
<span th:utext="${num}"></span>
</a>
</span>
<span th:if="${page.hasNext}">
<a class="a_text" th:href="@{/search.do(pageNum=${page.nextPageNum},keyWords=${kw})}">下一页</a>
</span>
</div>
</div>
</body>
</html>
Change detail.html to:
<!DOCTYPE html>
<html lang="en" xmlns:th="http://www.w3.org/1999/xhtml">
<head>
<link rel="stylesheet" th:href="@{/style.css}">
<title>detail page</title>
<meta charset="UTF-8">
<meta http-equiv="pragma" content="no-cache">
<meta http-equiv="cache-control" content="no-cache">
<meta http-equiv="expires" content="0">
<style>
.table_css {
background: #123663d4;
width: 99%;
color: #ffffff;
border: 1px solid #123663d4;
}
td {
vertical-align: bottom;
height: 17px;
padding: 15px;
text-align: center;
}
.td_css {
text-align: center;
background: #123663f5;
border: 1px solid #123663f5;
color: #ffffff;
}
</style>
</head>
<body>
<!-- <h1 th:utext="${Doc.title}"></h1>-->
<!-- <div th:utext="${Doc.author}"></div>-->
<!-- <div th:utext="${Doc.content}"></div>-->
<table border='1' class="table_css">
<thead>
<tr>
<td class="td_css" style="width: 10%">序号</td>
<td class="td_css">字段名称</td>
<td class="td_css">内容</td>
</tr>
</thead>
<tbody>
<tr>
<td th:utext="1"></td>
<td th:utext="title"></td>
<td th:utext="${Doc.title}"></td>
</tr>
<tr>
<td th:utext="2"></td>
<td th:utext="author"></td>
<td th:utext="${Doc.author}"></td>
</tr>
<tr>
<td th:utext="3"></td>
<td th:utext="content"></td>
<td th:utext="${Doc.content}"></td>
</tr>
</tbody>
</table>
</body>
</html>
Add one line of configuration to application.properties: spring.resources.static-locations=file:D:/huiq/, and upload the background.png background image to that directory.
Note: sometimes the image is still served even after you delete it from that directory; this is most likely browser caching, so clearing the browser's cached data fixes it.
The rendered result:
Attachment: background.png
5. Reflections
Later I wanted to index the HBase rowkey in ES as well, but it kept failing. I ran into two main problems:
- With highlighting enabled on the rowkey, clicking a rowkey search hit returned no content.
- A rowkey could only be matched in full: if the rowkey is abcd, only typing abcd finds it; typing ab does not. And this is not specific to rowkeys: every English word (e.g. hive) and number (e.g. 2008) can only be matched whole, never partially.
In hindsight, indexing the rowkey is probably unnecessary: put the searchable information in columns and generate the rowkey with a UUID to guarantee each row is unique; a UUID is hardly something anyone would search for. Still, I would like to make rowkey search work too; if anyone has managed it, please let me know so we can compare notes.
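The UUID-rowkey idea can be sketched in a few lines; this is a minimal illustration (the commented HBase Put call shows where it would be used in the client API):

```java
import java.util.UUID;

public class RowkeyDemo {
    // Generates a random, effectively collision-free rowkey:
    // a 36-character hyphenated hex string.
    public static String newRowkey() {
        return UUID.randomUUID().toString();
    }

    public static void main(String[] args) {
        System.out.println(newRowkey());
        // With the HBase client this would be used as:
        //   Put put = new Put(Bytes.toBytes(newRowkey()));
    }
}
```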
5.1 Solving problem 2
This can be solved with partial matching (see Elasticsearch's partial-matching features for search). I have worked out three variants for use in different scenarios.
Note: before running the project, first create the index manually from the file below.
Type 1
[hadoop@h153 elasticsearch-2.2.0]$ vi hehe.json
{
"settings": {
"number_of_shards": 1,
"analysis": {
"filter": {
"autocomplete_filter": {
"type": "edge_ngram",
"min_gram": 1,
"max_gram": 20
}
},
"analyzer": {
"autocomplete": {
"type": "custom",
"tokenizer": "standard",
"filter": [
"lowercase",
"autocomplete_filter"
]
}
}
}
},
"mappings":{
"qiang":{
"dynamic":"strict",
"properties":{
"id":{"type":"string","store":"yes","index":"analyzed","analyzer": "autocomplete","search_analyzer": "standard"},
"title":{"type":"string","store":"yes","index":"analyzed","analyzer": "autocomplete","search_analyzer": "standard"},
"describe":{"type":"string","store":"yes","index":"analyzed","analyzer": "autocomplete","search_analyzer": "standard"},
"author":{"type":"string","store":"yes","index":"analyzed","analyzer": "autocomplete","search_analyzer": "standard"},
"content":{"type":"string","store":"yes","index":"analyzed","analyzer": "autocomplete","search_analyzer": "standard"}
}
}
}
}
Note: the min_gram and max_gram settings deserve some thought. With the values above, for any single alphanumeric term of length 20 or less, typing any prefix of 1 to 20 characters at search time will match that term. But the larger max_gram is, the more grams each term produces, and the more index resources are consumed.
[hadoop@h153 elasticsearch-2.2.0]$ curl -XPOST '192.168.205.153:9200/hui' -d @hehe.json
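What the edge_ngram filter does to a token can be sketched in plain Java — a simulation for intuition, not the ES implementation:

```java
import java.util.ArrayList;
import java.util.List;

public class EdgeNgramDemo {
    // Emits the prefixes of a token from length min to length max,
    // mimicking an edge_ngram filter with min_gram/max_gram.
    public static List<String> edgeNgrams(String token, int min, int max) {
        List<String> grams = new ArrayList<>();
        for (int len = min; len <= Math.min(max, token.length()); len++) {
            grams.add(token.substring(0, len));
        }
        return grams;
    }

    public static void main(String[] args) {
        // With min_gram=1, max_gram=20 the token "hive" is indexed as
        // ["h", "hi", "hiv", "hive"], so any prefix typed at search time
        // (analyzed with "standard") matches the term.
        System.out.println(edgeNgrams("hive", 1, 20));
    }
}
```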
Final search result:
Limitations:
- An English word can only be matched from its beginning, not from an arbitrary position: for hive, typing hiv matches, but ive does not.
- Only the whole English word is highlighted, never just the part you searched for: hive is always shown fully highlighted rather than with only the matched prefix highlighted.
- It cannot handle special characters such as _, }, /.
Type 2
[hadoop@h153 elasticsearch-2.2.0]$ vi hehe.json
{
"settings": {
"analysis": {
"filter": {
"trigrams_filter": {
"type": "ngram",
"min_gram": 1,
"max_gram": 5
}
},
"analyzer": {
"trigrams": {
"type": "custom",
"tokenizer": "standard",
"filter": [
"lowercase",
"trigrams_filter"
]
}
}
}
},
"mappings":{
"qiang":{
"dynamic":"strict",
"properties":{
"id":{"type":"string","store":"yes","index":"analyzed","analyzer": "trigrams","search_analyzer": "standard"},
"title":{"type":"string","store":"yes","index":"analyzed","analyzer": "trigrams","search_analyzer": "standard"},
"describe":{"type":"string","store":"yes","index":"analyzed","analyzer": "trigrams","search_analyzer": "standard"},
"author":{"type":"string","store":"yes","index":"analyzed","analyzer": "trigrams","search_analyzer": "standard"},
"content":{"type":"string","store":"yes","index":"analyzed","analyzer": "trigrams","search_analyzer": "standard"}
}
}
}
}
[hadoop@h153 elasticsearch-2.2.0]$ curl -XPOST '192.168.205.153:9200/hui' -d @hehe.json
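The difference from Type 1 is that a plain ngram filter emits substrings starting at every position, not just prefixes. A plain-Java sketch (the emission order here is position-then-length, which may differ from ES's internal order):

```java
import java.util.ArrayList;
import java.util.List;

public class NgramDemo {
    // Emits every substring of length min..max, mimicking an ngram filter.
    public static List<String> ngrams(String token, int min, int max) {
        List<String> grams = new ArrayList<>();
        for (int start = 0; start < token.length(); start++) {
            for (int len = min; len <= max && start + len <= token.length(); len++) {
                grams.add(token.substring(start, start + len));
            }
        }
        return grams;
    }

    public static void main(String[] args) {
        // "hive" with min_gram=1, max_gram=5 is indexed as
        // [h, hi, hiv, hive, i, iv, ive, v, ve, e], which is why
        // typing "ive" now matches where Type 1 could not.
        System.out.println(ngrams("hive", 1, 5));
    }
}
```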
Final search result:
Limitations:
- This one is less a limitation than a problem I could not solve: for a term like 0123223003_0e72262cc4264b27b0ffc0f8cb137d12, searching the part before the _ matches and highlights it, while searching the part after the _ also matches but is not highlighted. I first blamed the special character _, but after changing the term to 012_cc4 it behaved correctly, which leaves me baffled.
- Only the whole English word is highlighted, never just the part you searched for: hive is always shown fully highlighted rather than with only the matched substring highlighted.
- It cannot handle special characters such as _, }, /.
Type 3
[hadoop@h153 elasticsearch-2.2.0]$ vi hehe.json
{
"settings": {
"analysis": {
"analyzer": {
"charSplit": {
"type": "custom",
"tokenizer": "ngram_tokenizer"
}
},
"tokenizer": {
"ngram_tokenizer": {
"type": "nGram",
"min_gram": "1",
"max_gram": "1",
"token_chars": [
"letter",
"digit",
"punctuation"
]
}
}
}
},
"mappings":{
"qiang":{
"dynamic":"strict",
"properties":{
"id":{"type":"string","store":"yes","index":"analyzed","analyzer": "charSplit","search_analyzer": "charSplit"},
"title":{"type":"string","store":"yes","index":"analyzed","analyzer": "charSplit","search_analyzer": "charSplit"},
"describe":{"type":"string","store":"yes","index":"analyzed","analyzer": "charSplit","search_analyzer": "charSplit"},
"author":{"type":"string","store":"yes","index":"analyzed","analyzer": "charSplit","search_analyzer": "charSplit"},
"content":{"type":"string","store":"yes","index":"analyzed","analyzer": "charSplit","search_analyzer": "charSplit"}
}
}
}
}
[hadoop@h153 elasticsearch-2.2.0]$ curl -XPOST '192.168.205.153:9200/hui' -d @hehe.json
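Type 3's tokenizer indexes every single character (min_gram = max_gram = 1) of class letter, digit, or punctuation. A plain-Java simulation (whitespace filtering stands in for ES's token_chars classes, so it is an approximation):

```java
import java.util.ArrayList;
import java.util.List;

public class CharSplitDemo {
    // Splits text into single-character tokens, dropping whitespace --
    // roughly what the 1-gram ngram_tokenizer above produces.
    public static List<String> tokenize(String text) {
        List<String> tokens = new ArrayList<>();
        for (char c : text.toCharArray()) {
            if (!Character.isWhitespace(c)) {
                tokens.add(String.valueOf(c));
            }
        }
        return tokens;
    }

    public static void main(String[] args) {
        // Chinese has no word separators, so every character becomes its own
        // searchable token; this is what makes single-character matching work.
        System.out.println(tokenize("es 提供三种"));
    }
}
```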
Final search result:
Limitations:
- Although it can highlight just the matched part of hive, it also returns other, unwanted hits, and I have not found a good fix yet. (I tried adding wildcard matching in the search API, turning a keyword key into *key*, but could not get it working; even if it worked for English, it would not help with Chinese.)
- A strange phenomenon: with max_gram set above 1, searching for the character 提 finds it but does not highlight it, and searching for two characters separated by one highlights all three, e.g. searching 提供种 highlights 提供三种 in the result; likewise the special characters } and / can be found but are not highlighted.