Reading a Hive database with Dask using Kerberos authentication

1. First, the host needs a Kerberos client; check that the kinit command is available.
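If you prefer to run this check from Python rather than the shell, a minimal sketch (purely optional):

import shutil

# Prints the full path to kinit if it is on PATH, otherwise None.
print(shutil.which("kinit"))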

2. Run:

kinit -kt xxx.keytab xxx/zzzz@EXAMPLE.COM

Before running this, make sure the xxx.keytab file is correct and that /etc/krb5.conf is configured correctly (a small verification sketch follows the config below).

Below is krb5.conf; the Example values in this configuration and in the code need to be replaced with your own settings (in particular, the commented-out default_realm, [realms] and [domain_realm] entries usually have to be uncommented and filled in with your realm and KDC host).

kerberos.example.com is the mapped hostname of the machine where Hive runs; the mapping can be configured in /etc/hosts (e.g. an entry like 192.168.1.10  kerberos.example.com, with your own IP address).

# Configuration snippets may be placed in this directory as well
includedir /etc/krb5.conf.d/

[logging]
 default = FILE:/var/log/krb5libs.log
 kdc = FILE:/var/log/krb5kdc.log
 admin_server = FILE:/var/log/kadmind.log

[libdefaults]
 dns_lookup_realm = false
 ticket_lifetime = 24h
 renew_lifetime = 7d
 forwardable = true
 rdns = false
 pkinit_anchors = FILE:/etc/pki/tls/certs/ca-bundle.crt
# default_realm = EXAMPLE.COM
# default_ccache_name = KEYRING:persistent:%{uid}

[realms]
# EXAMPLE.COM = {
#  kdc = kerberos.example.com
#  admin_server = kerberos.example.com
# }

[domain_realm]
# .example.com = EXAMPLE.COM
# example.com = EXAMPLE.COM
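With the keytab and krb5.conf in place, the kinit check can also be driven from Python; a minimal sketch using subprocess, reusing the placeholder keytab and principal from above:

import subprocess

# Obtain a ticket from the keytab; raises CalledProcessError if kinit fails.
subprocess.run(["kinit", "-kt", "xxx.keytab", "xxx/zzzz@EXAMPLE.COM"], check=True)

# List the ticket cache to confirm the ticket was obtained.
subprocess.run(["klist"], check=True)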

3. Build the connection string:

con = "hive://ipaddr:10000/database?auth=KERBEROS&kerberos_service_name=hive" 

ipaddr is the address of the remote Hive host and database is the name of the database to connect to.

Setting auth=KERBEROS makes the connection to Hive use Kerberos, so no password is needed, but the keytab authentication file is required; kerberos_service_name=hive must also be set (the service part of the HiveServer2 principal).
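As a quick connectivity check, here is a minimal sketch, assuming the PyHive SQLAlchemy dialect plus thrift-sasl/sasl are installed and a Kerberos ticket has already been obtained with kinit; the 10.0.0.1 address and mydb database are hypothetical placeholders:

from sqlalchemy import create_engine, text

ipaddr = "10.0.0.1"   # remote HiveServer2 host (placeholder)
database = "mydb"     # Hive database to connect to (placeholder)
con = "hive://{}:10000/{}?auth=KERBEROS&kerberos_service_name=hive".format(ipaddr, database)

engine = create_engine(con)
with engine.connect() as conn:
    # List tables in the database as a simple sanity check.
    print(conn.execute(text("SHOW TABLES")).fetchall())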

4. Build the SQL query with SQLAlchemy

from sqlalchemy import Column, MetaData, Table, text
from sqlalchemy.sql import select

metadata = MetaData()
# Only these columns will be selected from the Hive table.
columns = "col1,col2,col3,col4"
li = [Column(col) for col in columns.split(",")]
print(*li)
# Describe the target table; schema is the Hive database name.
t = Table('tableName', metadata,
          *li,
          schema='databaseName')
# Build the query; chaining .where() keeps it compatible with older SQLAlchemy.
s = select([t]).where(text("col2>4")).where(text("col3>5")).limit(2)

Without a query expression like this, Dask's read_sql_table can only read the whole table; passing the SQLAlchemy select s restricts the columns and rows that are fetched.
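In older Dask versions read_sql_table accepts either a table name or an SQLAlchemy selectable (newer versions move the latter into read_sql_query). A sketch of the difference, reusing the placeholder names from step 4; both calls still need a valid Kerberos ticket, as shown in step 5:

import dask.dataframe as dd

# Whole-table read: pass the table name and Dask reflects every column.
df_all = dd.read_sql_table("tableName", con, index_col="col1", schema="databaseName")

# Restricted read: pass the SQLAlchemy select s built above so only the
# chosen columns/rows are queried.
df_sel = dd.read_sql_table(s, con, index_col="col1")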

5. Authenticate with the krbcontext module and read the data

import dask.dataframe as dd
from krbcontext import krbcontext

principal = "xxx/zzzz@EXAMPLE.COM"
keytab_path = "./xxx.keytab"

# krbcontext obtains a ticket from the keytab for the duration of the block.
with krbcontext(using_keytab=True, principal=principal, keytab_file=keytab_path):
    df = dd.read_sql_table(s, con, index_col="col1")
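Note that Dask reads lazily: read_sql_table only pulls a small sample up front to infer the schema, and the bulk of the data is fetched when compute() is called. It is therefore safest to trigger the computation while the Kerberos ticket from krbcontext is still active, for example:

with krbcontext(using_keytab=True, principal=principal, keytab_file=keytab_path):
    df = dd.read_sql_table(s, con, index_col="col1")
    # Trigger the actual Hive read while the ticket from the keytab is valid.
    pdf = df.compute()

print(pdf.head())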

One of the dependencies is the sasl package. On Linux it installs cleanly with conda (just run conda install sasl); on Windows it may fail to install because of missing build dependencies.