What happened?
I created an external table using Spark SQL and then upgraded the Hive table to a Mixed-Hive table. After the upgrade, the path stored in the configuration item "base.hive.location-root" was not the external table's actual storage path. Instead, it was composed of the database's storage path plus the table name, whereas the expected value is the external table's actual storage location.
Create table command:
create table test_db.test_table(a string,b string) partitioned by(dt string) stored as parquet location '/user/hadoop_user/data'
At this point, the configuration item "base.hive.location-root" displayed in the WebUI is "hdfs:///user/hive/test_db.db/test_table/hive".
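The mismatch can be illustrated with a minimal sketch (Python, purely illustrative; the helper functions below are hypothetical and only model the behavior described in this report, not Amoro's actual code): the expected result reuses the table's declared LOCATION, while the observed result concatenates the database path with the table name.

```python
# Hypothetical model of the location resolution described above.

def expected_base_location(table_location: str) -> str:
    """Expected: reuse the external table's declared storage path."""
    return table_location

def observed_base_location(db_location: str, table_name: str) -> str:
    """Observed: database storage path + table name, ignoring LOCATION."""
    return f"{db_location}/{table_name}/hive"

# Values taken from the report above.
table_location = "/user/hadoop_user/data"      # from CREATE TABLE ... LOCATION
db_location = "hdfs:///user/hive/test_db.db"   # database storage path

print(expected_base_location(table_location))
print(observed_base_location(db_location, "test_table"))
```

Running this prints the path the reporter expected ("/user/hadoop_user/data") versus the path actually shown in the WebUI ("hdfs:///user/hive/test_db.db/test_table/hive").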
Affects Versions
master/0.7.1
What table formats are you seeing the problem on?
Mixed-Hive
What engines are you seeing the problem on?
Flink, Spark
How to reproduce
1. Create a table using Spark SQL and specify an external location.
2. Upgrade the Hive table to a Mixed-Hive table via the WebUI.
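Step 1 can be sketched as follows (a hedged example: the helper below simply builds the same kind of DDL as the reported command; on a live cluster it would be passed to `spark.sql(...)`, which is not shown here):

```python
# Build the CREATE TABLE statement that triggers the issue (step 1).
# On a real cluster: spark.sql(external_table_ddl(...)) against a SparkSession.

def external_table_ddl(db: str, table: str, location: str) -> str:
    """Hive-style partitioned Parquet table with an explicit LOCATION clause."""
    return (
        f"CREATE TABLE {db}.{table} (a STRING, b STRING) "
        f"PARTITIONED BY (dt STRING) "
        f"STORED AS PARQUET "
        f"LOCATION '{location}'"
    )

print(external_table_ddl("test_db", "test_table", "/user/hadoop_user/data"))
```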
Relevant log output
No response
Anything else
No response
Are you willing to submit a PR?
Code of Conduct