TiSpark with Authentication Enabled: How to Pass TiDB Credentials When Connecting Remotely via Spark Thrift Server? Currently Encountering: Caused by: java.util.NoSuchElementException: spark.sql.tidb.user

Note:
This topic has been translated from a Chinese forum by GPT and might contain errors.

Original topic: tispark开启鉴权,使用Spark Thrift Server服务进行远程连接如何传入连接TiDB的身份验证信息,目前有报错:Caused by: java.util.NoSuchElementException: spark.sql.tidb.user

| username: Jackie492391142

TiSpark has authentication against the TiDB database enabled.
The Spark Thrift Server service starts normally:
root@master:/opt/spark/spark-3.1.3/sbin# ./start-thriftserver.sh
root@master:/opt/spark/spark-3.1.3/bin# jps -l
79219 org.apache.hadoop.yarn.server.resourcemanager.ResourceManager
79701 org.apache.spark.deploy.SparkSubmit
78998 org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode
78699 org.apache.hadoop.hdfs.server.namenode.NameNode
79564 org.apache.spark.deploy.master.Master
80286 sun.tools.jps.Jps

Start beeline and enter !connect jdbc:hive2://192.168.1.110:10000 to connect.
The connection fails with: Error: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.1.110:10000: Failed to open new session: java.lang.reflect.InvocationTargetException (state=08S01,code=0)
root@slave2:/opt/spark/spark-3.1.3/bin# ./beeline
Beeline version 2.3.7 by Apache Hive
beeline> !connect jdbc:hive2://192.168.1.110:10000
Connecting to jdbc:hive2://192.168.1.110:10000
Enter username for jdbc:hive2://192.168.1.110:10000: root
Enter password for jdbc:hive2://192.168.1.110:10000: *********
22/09/09 14:35:27 INFO Utils: Supplied authorities: 192.168.1.110:10000
22/09/09 14:35:27 INFO Utils: Resolved authority: 192.168.1.110:10000
22/09/09 14:35:39 WARN HiveConnection: Failed to connect to 192.168.1.110:10000
Error: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.1.110:10000: Failed to open new session: java.lang.reflect.InvocationTargetException (state=08S01,code=0)

The Thrift Server log shows:
22/09/09 14:49:54 INFO ThriftCLIService: Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V10
22/09/09 14:49:54 INFO deprecation: No unit for dfs.client.datanode-restart.timeout(30) assuming SECONDS
22/09/09 14:49:54 INFO SessionState: Created HDFS directory: /tmp/hive/test_spark
22/09/09 14:49:54 INFO SessionState: Created HDFS directory: /tmp/hive/test_spark/5984aa7c-8d6a-46cc-8f6b-7654dac0d9a0
22/09/09 14:49:54 INFO SessionState: Created local directory: /tmp/root/5984aa7c-8d6a-46cc-8f6b-7654dac0d9a0
22/09/09 14:49:54 INFO SessionState: Created HDFS directory: /tmp/hive/test_spark/5984aa7c-8d6a-46cc-8f6b-7654dac0d9a0/_tmp_space.db
22/09/09 14:49:54 INFO HiveSessionImpl: Operation log session directory is created: /tmp/root/operation_logs/5984aa7c-8d6a-46cc-8f6b-7654dac0d9a0
22/09/09 14:49:54 INFO TiAuthRuleFactory: TiSpark running in auth mode
22/09/09 14:49:54 INFO SessionState: Deleted directory: /tmp/hive/test_spark/5984aa7c-8d6a-46cc-8f6b-7654dac0d9a0 on fs with scheme hdfs
22/09/09 14:49:54 INFO SessionState: Deleted directory: /tmp/root/5984aa7c-8d6a-46cc-8f6b-7654dac0d9a0 on fs with scheme file
22/09/09 14:49:54 WARN ThriftCLIService: Error opening session:
org.apache.hive.service.cli.HiveSQLException: Failed to open new session: java.lang.reflect.InvocationTargetException
at org.apache.spark.sql.hive.thriftserver.SparkSQLSessionManager.openSession(SparkSQLSessionManager.scala:85)
at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:204)
at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:371)
at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:243)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1497)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1482)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:38)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:53)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:310)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.pingcap.tispark.utils.ReflectionUtil$.newTiAuthRule(ReflectionUtil.scala:140)
at org.apache.spark.sql.extensions.TiAuthRuleFactory.apply(rules.scala:36)
at org.apache.spark.sql.extensions.TiAuthRuleFactory.apply(rules.scala:25)
at org.apache.spark.sql.SparkSessionExtensions.$anonfun$buildResolutionRules$1(SparkSessionExtensions.scala:141)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at scala.collection.TraversableLike.map(TraversableLike.scala:238)
at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
at scala.collection.AbstractTraversable.map(Traversable.scala:108)
at org.apache.spark.sql.SparkSessionExtensions.buildResolutionRules(SparkSessionExtensions.scala:141)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.customResolutionRules(BaseSessionStateBuilder.scala:200)
at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:82)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:73)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.$anonfun$build$2(BaseSessionStateBuilder.scala:342)
at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:84)
at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:84)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:73)
at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:143)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:143)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:73)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:71)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:63)
at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:98)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:650)
at org.apache.spark.sql.hive.thriftserver.SparkSQLSessionManager.openSession(SparkSQLSessionManager.scala:73)
… 12 more
Caused by: java.util.NoSuchElementException: spark.sql.tidb.user
at org.apache.spark.sql.internal.SQLConf.$anonfun$getConfString$3(SQLConf.scala:3732)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.internal.SQLConf.getConfString(SQLConf.scala:3732)
at com.pingcap.tispark.auth.TiAuthorization$.tiAuthorization(TiAuthorization.scala:244)
at org.apache.spark.sql.TiContext.tiAuthorization$lzycompute(TiContext.scala:42)
at org.apache.spark.sql.TiContext.tiAuthorization(TiContext.scala:42)
at org.apache.spark.sql.TiContext.<init>(TiContext.scala:47)
at org.apache.spark.sql.TiExtensions.getOrCreateTiContext(TiExtensions.scala:49)
at org.apache.spark.sql.TiExtensions.$anonfun$apply$2(TiExtensions.scala:36)
at org.apache.spark.sql.extensions.TiAuthorizationRule.<init>(TiAuthorizationRule.scala:34)
… 50 more
The log contains the message Caused by: java.util.NoSuchElementException: spark.sql.tidb.user, meaning the configuration key spark.sql.tidb.user cannot be found.
The current issue: TiSpark has TiDB authentication enabled, and because the production database is important, the user and password cannot be written explicitly in the configuration file. How can beeline be used to pass in the user, password, and other authentication information so that the connection to the Spark Thrift Server succeeds?
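For context, in auth mode TiSpark reads its TiDB credentials from the spark.sql.tidb.* properties, which would normally be set in spark-defaults.conf — exactly what the asker wants to avoid. A sketch of that configuration (values are placeholders; the address and port are assumptions based on the host in this thread):

```
# spark-defaults.conf sketch (placeholder values) — this is the
# configuration TiSpark's auth mode expects, which the asker does
# not want to store in a file on disk.
spark.sql.auth.enable      true
spark.sql.tidb.addr        192.168.1.110
spark.sql.tidb.port        4000
spark.sql.tidb.user        <tidb_user>
spark.sql.tidb.password    <tidb_password>
```

Because spark.sql.tidb.user is absent here, TiAuthorization fails during session creation, which is the NoSuchElementException seen in the stack trace above.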

| username: 数据小黑 | Original post link

In Beeline, you can add configuration with the set command, setting properties in the format set property=value.
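A sketch of that suggestion (untested in this thread — the values are placeholders, and note that the error above is thrown while the session is being opened, so whether per-session set statements arrive early enough would need to be verified):

```sql
-- After connecting in beeline, supply the TiSpark auth keys per session
-- instead of storing them in spark-defaults.conf (placeholder values):
set spark.sql.tidb.addr=192.168.1.110;
set spark.sql.tidb.port=4000;
set spark.sql.tidb.user=<tidb_user>;
set spark.sql.tidb.password=<tidb_password>;
```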

| username: Jackie492391142 | Original post link

Okay! I’ll give it a try.

| username: jansu-dev | Original post link

Hi, has this issue been resolved? If so, don’t forget to mark it as resolved, thanks!!!
Beeline usage reference: beeline 使用 (CSDN blog)

| username: system | Original post link

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.