Hi, after upgrading from 0.36.6 to 0.43 (community edition), native Spark SQL queries with date variables stopped working.
Sample query:
SELECT DATE_FORMAT(sessionstarttime, 'dd MMM yyyy HH:mm') as "Session Start Time",
DATE_FORMAT(sessionendtime, 'dd MMM yyyy HH:mm') as "Session End Time",
practitionername as "Doctor Name", practitionerrolename as "Doctor Role",
servicecategoryname as "Category",
appointments.paymentstatus as "Status"
FROM appointments
WHERE {{date}}
This produces the following WHERE clause for the variable {{date}} when selecting the previous 30 days:
WHERE CAST(from_unixtime(unix_timestamp(date_format(CAST(appointments
.sessionstarttime
AS timestamp), 'yyyy-MM-dd'), 'yyyy-MM-dd')) AS timestamp) BETWEEN timestamp '2022-03-08 00:00:00.000' AND timestamp '2022-04-06 00:00:00.000'
The error is:
org.apache.spark.sql.ParseException: Invalid input ''', expected '.', '(', '|', arithmeticOperator, '[', plusOrMinus or AND (line 16, column 169): WHERE CAST(from_unixtime(unix_timestamp(date_format(CAST(appointments
.sessionstarttime
AS timestamp), 'yyyy-MM-dd'), 'yyyy-MM-dd')) AS timestamp) BETWEEN timestamp '2022-03-08 00:00:00.000' AND timestamp '2022-04-06 00:00:00.000' ^;
If I remove the timestamp keyword before the BETWEEN dates, then it works, like this:
WHERE CAST(from_unixtime(unix_timestamp(date_format(CAST(appointments
.sessionstarttime
AS timestamp), 'yyyy-MM-dd'), 'yyyy-MM-dd')) AS timestamp) BETWEEN '2022-03-08 00:00:00.000' AND '2022-04-06 00:00:00.000'
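In case it helps anyone hitting the same thing, here is a sketch of a manual workaround I'm considering while this is broken. It replaces the {{date}} field filter with two plain Date variables (the names {{start_date}} and {{end_date}} are my own assumptions, not part of the original question), so that Metabase interpolates bare literals like the ones the Spark parser accepted in the working query above:

```sql
-- Hypothetical workaround (sketch only): use two plain Date variables
-- instead of a field filter, and cast them explicitly, so the generated
-- SQL never contains the "timestamp '...'" literal syntax that fails.
SELECT DATE_FORMAT(sessionstarttime, 'dd MMM yyyy HH:mm') AS "Session Start Time"
FROM appointments
WHERE CAST(sessionstarttime AS timestamp)
      BETWEEN CAST({{start_date}} AS timestamp)
          AND CAST({{end_date}} AS timestamp)
```

The downside is that you lose the relative-date options ("previous 30 days") that the field filter widget provides, so this is only a stopgap.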
Is this related to a Spark SQL driver upgrade or something? If so, can I downgrade the Spark SQL driver?
Thanks for helping!