- Download the client package from the big-data platform and copy the jars under `FusionInsight_Cluster_1_HBase_ClientConfig\HBase\FusionInsight-HBase-1.3.1.tar.gz\hbase\lib` into the project. Some of these jars may end up 0 KB due to extraction problems; if that happens, contact the big-data platform team.
- Copy `core-site.xml`, `hbase-site.xml`, and `hdfs-site.xml` from `FusionInsight_Cluster_1_HBase_ClientConfig\HBase\config` into the `resources` directory. If you are running test cases, copy them into the `resources` directory under `test` instead.
- Download the authentication credentials from the big-data platform and place them in the project. If the password has been changed, the credentials must be downloaded again.
- Configure the hosts file:

```
10.191.178.2 CYPC-HD-MN01 CYPC-HD-MN01.
10.191.178.3 CYPC-HD-MN02 CYPC-HD-MN02.
10.191.178.8 CYPC-HD-CN02 CYPC-HD-CN02.
10.191.178.26 CYPC-HD-KA01 CYPC-HD-KA01.
10.191.178.28 CYPC-HD-KA03 CYPC-HD-KA03.
10.191.178.25 CYPC-HD-DN16 CYPC-HD-DN16.
10.191.178.32 CYPC-HD-ES01 CYPC-HD-ES01.
10.191.178.10 CYPC-HD-DN01 CYPC-HD-DN01.
10.191.178.14 CYPC-HD-DN05 CYPC-HD-DN05.
10.191.178.18 CYPC-HD-DN09 CYPC-HD-DN09.
10.191.178.21 CYPC-HD-DN12 CYPC-HD-DN12.
10.191.178.13 CYPC-HD-DN04 CYPC-HD-DN04.
10.191.178.29 CYPC-HD-RDS01 CYPC-HD-RDS01.
10.191.178.9 CYPC-HD-CN03 CYPC-HD-CN03.
10.191.178.24 CYPC-HD-DN15 CYPC-HD-DN15.
10.191.178.17 CYPC-HD-DN08 CYPC-HD-DN08.
10.191.178.12 CYPC-HD-DN03 CYPC-HD-DN03.
10.191.178.20 CYPC-HD-DN11 CYPC-HD-DN11.
10.191.178.31 CYPC-HD-RDS03 CYPC-HD-RDS03.
10.191.178.7 CYPC-HD-CN01 CYPC-HD-CN01.
10.191.178.16 CYPC-HD-DN07 CYPC-HD-DN07.
10.191.178.34 CYPC-HD-ES03 CYPC-HD-ES03.
10.191.178.15 CYPC-HD-DN06 CYPC-HD-DN06.
10.191.178.11 CYPC-HD-DN02 CYPC-HD-DN02.
10.191.178.23 CYPC-HD-DN14 CYPC-HD-DN14.
10.191.178.19 CYPC-HD-DN10 CYPC-HD-DN10.
10.191.178.33 CYPC-HD-ES02 CYPC-HD-ES02.
10.191.178.30 CYPC-HD-RDS02 CYPC-HD-RDS02.
10.191.178.22 CYPC-HD-DN13 CYPC-HD-DN13.
10.191.178.27 CYPC-HD-KA02 CYPC-HD-KA02.
1.1.1.1 hadoop.cypc.bd
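```

Since 0-byte jars and stale credentials are the common failure modes in the steps above, it can save time to verify the files before attempting the Kerberos login. A minimal preflight sketch — the `conf/user.keytab` and `krb5.conf` names match the test code further down, while `CredentialPreflight` and `missingFiles` are illustrative helpers, not part of the platform client:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class CredentialPreflight {

    // Returns the names of required files that are absent or truncated (0-byte).
    static List<String> missingFiles(String confDir, String... names) {
        List<String> missing = new ArrayList<>();
        for (String name : names) {
            File f = new File(confDir, name);
            if (!f.isFile() || f.length() == 0) {
                missing.add(name);
            }
        }
        return missing;
    }

    public static void main(String[] args) {
        String confDir = System.getProperty("user.dir") + File.separator + "conf";
        System.out.println("Missing: " + missingFiles(confDir, "user.keytab", "krb5.conf"));
    }
}
```

The same check can be pointed at the copied `lib` directory to catch jars that unpacked to 0 KB.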
- Write test code to verify the connection. Confirm the actual `table_name` before running:
```java
package fun.gudu;

import fun.gudu.modules.util.LoginUtil;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;
import org.junit.Test;

import java.io.File;
import java.io.IOException;
import java.util.Iterator;

public class HBaseTest {

    @Test
    public void HBaseTest01() {
        String userdir = System.getProperty("user.dir") + File.separator + "conf" + File.separator;
        String principal = "cypc_yx";
        String keytab = userdir + "user.keytab";
        String krb5file = userdir + "krb5.conf";
        Configuration conf = HBaseConfiguration.create();
        // Kerberos login with the downloaded credentials
        try {
            LoginUtil.setJaasFile(principal, keytab);
            LoginUtil.login(principal, keytab, krb5file, conf);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        // Create the connection and open the table
        try {
            Connection conn = ConnectionFactory.createConnection(conf);
            Table rtd = conn.getTable(TableName.valueOf("cypc_rtd"));

            // scan: print the first 10 rows
            System.out.println("---------------- hbase scan operator-------------------------");
            Scan scan = new Scan();
            ResultScanner scanner = rtd.getScanner(scan);
            Iterator<Result> iterator = scanner.iterator();
            int i = 0;
            while (iterator.hasNext() && i < 10) {
                Result next = iterator.next();
                for (Cell cell : next.rawCells()) {
                    System.out.println(
                            Bytes.toString(CellUtil.cloneRow(cell)) + "\t" +
                            Bytes.toString(CellUtil.cloneFamily(cell)) + "\t" +
                            Bytes.toString(CellUtil.cloneQualifier(cell)) + "\t" +
                            Bytes.toString(CellUtil.cloneValue(cell)));
                }
                i++;
            }
            // release the scanner's server-side resources
            scanner.close();

            // get: fetch a single row by its row key
            System.out.println("---------------- hbase get operator-------------------------");
            Get get = new Get(Bytes.toBytes("02YY2CJA21ES856------ADIC1RUN"));
            Result result = rtd.get(get);
            for (Cell cell : result.rawCells()) {
                System.out.println(
                        Bytes.toString(CellUtil.cloneRow(cell)) + "\t" +
                        Bytes.toString(CellUtil.cloneFamily(cell)) + "\t" +
                        Bytes.toString(CellUtil.cloneQualifier(cell)) + "\t" +
                        Bytes.toString(CellUtil.cloneValue(cell)));
            }

            rtd.close();
            conn.close();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}
```
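Each line the test prints has the shape `row \t family \t qualifier \t value`, so captured output can be split back into fields for further processing. A small standalone sketch — `CellLineParser` is a made-up name, and the sample line is taken from the result section of this note:

```java
public class CellLineParser {

    // Splits one printed cell line "row\tfamily\tqualifier\tvalue" into its four fields.
    static String[] parse(String line) {
        String[] parts = line.split("\t", 4); // limit 4: tabs inside the value stay intact
        if (parts.length != 4) {
            throw new IllegalArgumentException("expected 4 tab-separated fields: " + line);
        }
        return parts;
    }

    public static void main(String[] args) {
        String[] f = parse("02YY2CJA21ES856------ADIC1RUN\tcf\tav\t0");
        System.out.println("row=" + f[0] + ", qualifier=" + f[2] + ", value=" + f[3]);
    }
}
```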
- Result notes

  A fetched data point looks roughly like this:

  ```
  02YY2CJA21ES856------ADIC1RUN   cf   av   0
  02YY2CJA21ES856------ADIC1RUN   cf   tm   1681183357.449
  ```

  The row key is the measurement point's KKS code. The value under the `tm` qualifier is a timestamp, and the value under `av` is the measured value. Read as structured data, the table schema is `kks`, `tm`, `av`. The exact meaning of `av` needs to be double-checked: it differs from one measurement point to another.
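The `tm` value in the sample looks like a Unix timestamp in seconds with a millisecond fraction (1681183357.449 falls in April 2023). A decoding sketch under that assumption — `TmDecoder` is illustrative, the fraction is assumed to be exactly three digits of milliseconds as in the sample, and this interpretation should be confirmed with the platform team:

```java
import java.time.Instant;

public class TmDecoder {

    // Converts a tm value like "1681183357.449" (epoch seconds + 3-digit millis) to an Instant.
    // Assumption: the fractional part is milliseconds; confirm with the platform team.
    static Instant decode(String tm) {
        String[] parts = tm.split("\\.");
        long seconds = Long.parseLong(parts[0]);
        long millis = parts.length > 1 ? Long.parseLong(parts[1]) : 0;
        return Instant.ofEpochSecond(seconds, millis * 1_000_000L);
    }

    public static void main(String[] args) {
        System.out.println(decode("1681183357.449")); // → 2023-04-11T03:22:37.449Z
    }
}
```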
# To Be Continued!😎