Spark SQL Native SQL Date variables not working

Hi, after upgrading from 0.36.6 to 0.43 Community Edition, native Spark SQL queries with date variables stopped working!

sample query:

SELECT DATE_FORMAT(sessionstarttime, 'dd MMM yyyy HH:mm') as "Session Start Time",
DATE_FORMAT(sessionendtime, 'dd MMM yyyy HH:mm') as "Session End Time",
practitionername as "Doctor Name", practitionerrolename as "Doctor Role",
servicecategoryname as "Category",
appointments.paymentstatus as "Status"
FROM appointments
WHERE {{date}}

This produces the following WHERE clause for the {{date}} variable when selecting the previous 30 days.

WHERE CAST(from_unixtime(unix_timestamp(date_format(CAST(appointments.sessionstarttime AS timestamp), 'yyyy-MM-dd'), 'yyyy-MM-dd')) AS timestamp) BETWEEN timestamp '2022-03-08 00:00:00.000' AND timestamp '2022-04-06 00:00:00.000'

Error is:

org.apache.spark.sql.ParseException: Invalid input ''', expected '.', '(', '|', arithmeticOperator, '[', plusOrMinus or AND (line 16, column 169): WHERE CAST(from_unixtime(unix_timestamp(date_format(CAST(appointments.sessionstarttime AS timestamp), 'yyyy-MM-dd'), 'yyyy-MM-dd')) AS timestamp) BETWEEN timestamp '2022-03-08 00:00:00.000' AND timestamp '2022-04-06 00:00:00.000' ^;

If I remove the 'timestamp' keyword before the BETWEEN dates, then it works, like this:

WHERE CAST(from_unixtime(unix_timestamp(date_format(CAST(appointments.sessionstarttime AS timestamp), 'yyyy-MM-dd'), 'yyyy-MM-dd')) AS timestamp) BETWEEN '2022-03-08 00:00:00.000' AND '2022-04-06 00:00:00.000'
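The only difference between the failing and working clauses is the `timestamp` keyword in front of each literal. As a rough illustration (not something Metabase does), a post-processing workaround that rewrites the typed literals into plain string literals could be sketched like this; `strip_typed_timestamp_literals` is a hypothetical helper name:

```python
import re

def strip_typed_timestamp_literals(sql: str) -> str:
    # Rewrite `timestamp '...'` typed literals into plain quoted strings,
    # which the older Spark SQL parser (e.g. 2.1.1) accepts.
    # `CAST(... AS timestamp)` is untouched because the keyword there is
    # not followed by a quoted literal.
    return re.sub(r"\btimestamp\s+('[^']*')", r"\1", sql, flags=re.IGNORECASE)

clause = "BETWEEN timestamp '2022-03-08 00:00:00.000' AND timestamp '2022-04-06 00:00:00.000'"
print(strip_typed_timestamp_literals(clause))
# → BETWEEN '2022-03-08 00:00:00.000' AND '2022-04-06 00:00:00.000'
```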

Is it related to a Spark SQL driver upgrade or something? If so, can I downgrade the Spark SQL driver?

Thanks for helping!

Hi @sathi
There's no version 0.43 - yet. It sounds like you are using the development master, but then you need to tell us which commit you are using.
It is not possible to downgrade the Spark driver - the versions are incompatible.
Sounds similar to this, but each driver can handle things differently: https://github.com/metabase/metabase/issues/21133

Hi, sorry, I meant v0.42.3. I downloaded the JAR from the release page.

@sathi Then I don't understand what you're talking about with a driver update - the Spark driver has only been upgraded in the upcoming v43.
You're welcome to open an issue for this: https://github.com/metabase/metabase/issues/new/choose

Hi @flamber, you are right. The issue is here; I compared 0.36.6 with 0.42.3.

in 0.42.3
(defmethod unprepare/unprepare-value [:hive-like LocalDate]
[driver t]
(unprepare/unprepare-value driver (t/local-date-time t (t/local-time 0))))

in 0.36.6
(defmethod unprepare/unprepare-value [:hive-like LocalDate]
[_ t]
(format "'%s'" (t/format "yyyy-MM-dd" t)))
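For illustration only, the 0.36.6 behavior can be sketched in Python (the real code is Clojure, and `unprepare_local_date` is a hypothetical name): a LocalDate is rendered as a plain quoted 'yyyy-MM-dd' string with no `timestamp` keyword, which older Spark SQL parsers accept.

```python
from datetime import date

def unprepare_local_date(d: date) -> str:
    # Mirror the 0.36.6 hive-like formatting:
    # (format "'%s'" (t/format "yyyy-MM-dd" t))
    return "'{}'".format(d.strftime("%Y-%m-%d"))

print(unprepare_local_date(date(2022, 3, 8)))
# → '2022-03-08'
```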

This change is the cause. When I revert it, the date variable works.

Thanks for helping. I will report that on GitHub.

@sathi It sounds like you have been using a custom driver. There was no such code in 0.36.6:
https://github.com/metabase/metabase/blob/v0.36.6/modules/drivers/sparksql/src/metabase/driver/hive_like.clj#L162-L165

Hi @flamber, yes, we were using a custom one. Thank you for noticing. Our Spark distribution is 2.1.1, and it gives an error when using (unprepare/unprepare-value driver (t/local-date-time t (t/local-time 0))), so we made that change.

So the root cause of all this is that Metabase uses Spark SQL 1.2.2?

Thank you.

Closed the GitHub issue: https://github.com/metabase/metabase/issues/21524

@sathi The Spark driver has been upgraded in the upcoming v43:
https://github.com/metabase/metabase/issues/11023