When using Spark on Hive, the reported time is wrong: it is 8 hours ahead of the correct value.

Here is my configuration information:
"browser-info": {
"language": "zh-CN",
"platform": "MacIntel",
"userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36 Edg/98.0.1108.62",
"vendor": "Google Inc."
"system-info": {
"file.encoding": "UTF-8",
"java.runtime.name": "OpenJDK Runtime Environment",
"java.runtime.version": "1.8.0_312-8u312-b07-0ubuntu1~20.04-b07",
"java.vendor": "Private Build",
"java.vendor.url": "http://java.oracle.com/",
"java.version": "1.8.0_312",
"java.vm.name": "OpenJDK 64-Bit Server VM",
"java.vm.version": "25.312-b07",
"os.name": "Linux",
"os.version": "5.4.0-77-generic",
"user.language": "en",
"user.timezone": "Asia/Shanghai"
"metabase-info": {
"databases": [
"hosting-env": "unknown",
"application-database": "mysql",
"application-database-details": {
"database": {
"name": "MySQL",
"version": "5.7.18-txsql-log"
"jdbc-driver": {
"name": "MariaDB Connector/J",
"version": "2.6.2"
"run-mode": "prod",
"version": {
"tag": "v0.41.6",
"date": "2022-01-10",
"branch": "release-x.41.x",
"hash": "296635f"
"settings": {
"report-timezone": null

I tried changing the report timezone to Asia/Shanghai, and the system and JVM time zones are already set to Asia/Shanghai, but it has no effect.
My system time zone (timedatectl output):

           Local time: Mon 2022-03-07 10:59:33 CST
       Universal time: Mon 2022-03-07 02:59:33 UTC
             RTC time: Mon 2022-03-07 02:59:34    
            Time zone: Asia/Shanghai (CST, +0800) 

System clock synchronized: yes
NTP service: n/a
RTC in local TZ: no

My JVM time zone:

export TZ='Asia/Shanghai'
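Both the OS and the JVM report Asia/Shanghai, and the 8-hour error exactly matches that zone's UTC+08:00 offset, which suggests a UTC instant is being rendered with the offset applied twice somewhere in the query path. As a minimal, self-contained illustration (not Metabase code) of where the 8 hours come from, here is one instant rendered in UTC versus Asia/Shanghai:

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class TzDemo {
    public static void main(String[] args) {
        // One fixed instant (taken from the timedatectl output above),
        // rendered in two zones: the wall-clock difference is 8 hours.
        Instant instant = Instant.parse("2022-03-07T02:59:33Z");
        ZonedDateTime utc = instant.atZone(ZoneId.of("UTC"));
        ZonedDateTime shanghai = instant.atZone(ZoneId.of("Asia/Shanghai"));
        System.out.println(utc);      // 2022-03-07T02:59:33Z[UTC]
        System.out.println(shanghai); // 2022-03-07T10:59:33+08:00[Asia/Shanghai]
    }
}
```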

Hi @m824673408
Metabase does not currently support Report Timezone for Spark.
I would suggest that you use Java 11.
The Spark driver has been upgraded in the upcoming 0.43.
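Until you can move to the upgraded driver, one thing worth trying (a suggestion on my part, not a confirmed fix) is passing the timezone to the JVM explicitly when launching Metabase; `-Duser.timezone` is the standard JVM property:

```shell
# Launch Metabase with an explicit JVM timezone.
# The jar name/path is illustrative; adjust to your install.
java -Duser.timezone=Asia/Shanghai -jar metabase.jar
```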

Thank you for your reply. I'll try it.