Hi!
When uploading a CSV to Redshift via Metabase, values longer than 256 bytes are rejected. I could not find the code responsible for typing CSV imports, so I don't know whether this limit is specific to Redshift. I chose "Bug Report" since it doesn't seem to be expected behavior, and I didn't find it mentioned anywhere in the documentation. Should I open an issue on GitHub?
Error message:

```
Batch entry 0 INSERT INTO "metabase_csv_import"."upload_test_import_varchar_256_20240704091658" ("id", "name", "comment") VALUES (('xxxxxxxx'), ('Client Name'), ('.................................................................................................................................................................................................................................................................')) was aborted: ERROR: Value too long for character type
Detail:
-----------------------------------------------
error: Value too long for character type
code: 8001
context: Value too long for type character varying(256)
query: 47898039
location: string.cpp:219
process: padbmaster [pid=1073856709]
-----------------------------------------------
Call getNextException to see other errors in the batch.
```
Our use case is to upload qualitative comments about our clients so we can show them directly on our analytics dashboard. If you have any ideas for a better way to do this, please let me know too!
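As a stopgap, since the error (`character varying(256)`) suggests the column is created as `VARCHAR(256)` counted in bytes, one workaround is to preprocess the CSV and truncate free-text fields to 256 bytes before uploading. This is just a sketch under that assumption; `truncate_utf8` is a hypothetical helper, not part of Metabase:

```python
def truncate_utf8(value: str, max_bytes: int = 256) -> str:
    """Truncate a string so its UTF-8 encoding fits in max_bytes.

    Redshift VARCHAR lengths are measured in bytes, not characters,
    so we cut on the encoded form and drop any partial multi-byte
    character left at the cut point.
    """
    encoded = value.encode("utf-8")
    if len(encoded) <= max_bytes:
        return value
    # errors="ignore" discards a trailing incomplete character
    return encoded[:max_bytes].decode("utf-8", errors="ignore")
```

Running this over the `comment` column of the CSV (for example with the standard `csv` module) before uploading keeps every row under the limit, at the cost of clipping the longest comments.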
Thanks!