Combining 2 databases from the same data warehouse: one query works, the other doesn't. Why?

Hi, I'm wondering why these two queries behave differently even though both pull from the same BigQuery data warehouse.

Query 1: Does not work. It returns this error:
No matching signature for operator = for argument types: TIMESTAMP, DATETIME. Supported signature: ANY = ANY at [29:24]

WITH 
`smashes` AS
        (
        SELECT DATE_TRUNC(`manual_input.cancellations_and_problems`.`date`, DAY) AS `date`, 
       COUNT(DISTINCT `manual_input.cancellations_and_problems`.`order_number`) AS `parcels_smashed`
  FROM `manual_input.cancellations_and_problems`
 WHERE `manual_input.cancellations_and_problems`.`problem_category` = 'Damaged Shipping Box'

GROUP BY `date`
ORDER BY `date` ASC
        ),
`parcels` AS
        (
        SELECT TIMESTAMP_TRUNC(DATETIME(`shopify.order`.`created_at`,"Asia/Bangkok"), DAY) AS `date`, 
       COUNT(distinct `shopify.order`.`id`) AS `total_parcels`
  FROM `shopify.order`

LEFT JOIN `shopify.order_line` ON `shopify.order`.`id` = `shopify.order_line`.`order_id`
 WHERE `shopify.order`.`fulfillment_status` = 'fulfilled'

GROUP BY `date`
ORDER BY `date` ASC
        )
        
SELECT SUM(`smashes`.`parcels_smashed`) / SUM(`parcels`.`total_parcels`)
  FROM `smashes`

FULL JOIN `parcels` ON `smashes`.`date` = `parcels`.`date`

Query 2: Works, even though it also combines different databases:

WITH 
-- FACEBOOK AD SPEND BY PRODUCT TYPE
    `spend` AS
            (
            SELECT DATE_TRUNC(`facebook.facebookads`.`date`, DAY) AS `date`, 
                   SUM(`facebook.facebookads`.`spend`) AS `ad_spend`
            FROM `facebook.facebookads`
            WHERE (lower(`facebook.facebookads`.`campaign_name`) like '%hrc%')

            GROUP BY `date`
            ORDER BY `date` ASC
            ),
            
-- ORDERS BY FIRST TIME CUSTOMERS BY PRODUCT TYPE
    `orders` AS
            (
            SELECT TIMESTAMP_TRUNC(DATETIME(`shopify.VIEW_sales_by_customer_type`.`created_at`,"Asia/Bangkok"), DAY) AS `date`, 
                   COUNT(DISTINCT `shopify.VIEW_sales_by_customer_type`.`order_id`) AS `first_time_customers`
              FROM `shopify.VIEW_sales_by_customer_type`
  
            LEFT JOIN `shopify.order_line` `Order Line` ON `shopify.VIEW_sales_by_customer_type`.`order_id` = `Order Line`.`order_id`
             WHERE ((lower(`Order Line`.`sku`) like '%hrc%')
               AND `shopify.VIEW_sales_by_customer_type`.`Order_Count` = 1)
   
            GROUP BY `date`
            ORDER BY `date` ASC
            )
            
-- CAC BY PRODUCT TYPE
SELECT IFNULL(`spend`.`date`,`orders`.`date`) AS `date`,
       SUM(`spend`.`ad_spend`) / SUM(`orders`.`first_time_customers`) AS `hrc_facebook_cac`
  FROM `spend`

FULL JOIN `orders` on `spend`.`date` = `orders`.`date`

GROUP BY `date`
ORDER BY `date`

Hi @natti
Try running the query in the BigQuery interface - you might get better inline help there.
It looks like it's having problems comparing two different types - TIMESTAMP and DATETIME - so try casting both sides to the same type.
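For example, here is a minimal sketch of that cast, assuming `manual_input.cancellations_and_problems`.`date` and `shopify.order`.`created_at` are both TIMESTAMP columns (adjust the casts if your schema differs). Truncating both sides to DATE keeps the daily grouping and gives the join keys the same type:

WITH 
`smashes` AS
        (
        -- DATE(timestamp) truncates the TIMESTAMP to a DATE
        SELECT DATE(`manual_input.cancellations_and_problems`.`date`) AS `date`,
               COUNT(DISTINCT `manual_input.cancellations_and_problems`.`order_number`) AS `parcels_smashed`
          FROM `manual_input.cancellations_and_problems`
         WHERE `manual_input.cancellations_and_problems`.`problem_category` = 'Damaged Shipping Box'
         GROUP BY `date`
        ),
`parcels` AS
        (
        -- DATE(timestamp, time_zone) converts the TIMESTAMP to a DATE in Asia/Bangkok
        SELECT DATE(`shopify.order`.`created_at`, "Asia/Bangkok") AS `date`,
               COUNT(DISTINCT `shopify.order`.`id`) AS `total_parcels`
          FROM `shopify.order`
          LEFT JOIN `shopify.order_line` ON `shopify.order`.`id` = `shopify.order_line`.`order_id`
         WHERE `shopify.order`.`fulfillment_status` = 'fulfilled'
         GROUP BY `date`
        )

SELECT SUM(`smashes`.`parcels_smashed`) / SUM(`parcels`.`total_parcels`)
  FROM `smashes`
  -- both join keys are now DATE, so the comparison has a matching signature
  FULL JOIN `parcels` ON `smashes`.`date` = `parcels`.`date`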

Hi flamber, so it's actually possible to query across databases from the same data warehouse, right? This is a data type error rather than an incompatibility? @flamber

@natti As long as it's inside the same database connection (Admin > Databases), then yes.
As you say, the second query works, and it also joins across multiple datasets.

@flamber What if it's using the same database connection but with different service account .json files? Would this cause a problem? @flamber

@natti Then it's not the same database. A database is a single item in Admin > Databases.
Each database connection pool in Metabase has no idea what the other connection pools are doing, and they cannot interact.
That's a very difficult thing to implement - one of the most requested features:
https://github.com/metabase/metabase/issues/3953 - upvote by clicking :+1: on the first post

@flamber I see, thanks for the help and tips. But I wonder: how does the second query work, since it also combines different databases?

@natti Like I said, it looks like a type problem.
Try running the query with the same credentials on https://console.cloud.google.com/bigquery
Or download DBeaver.io and try using that
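For instance, a quick isolated check you could paste into the BigQuery console (just an illustration, not using your data):

-- fails with "No matching signature for operator = for argument types: TIMESTAMP, DATETIME"
SELECT CURRENT_TIMESTAMP() = CURRENT_DATETIME();

-- works, because both sides are cast to DATE before comparing
SELECT DATE(CURRENT_TIMESTAMP()) = DATE(CURRENT_DATETIME());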