When inserting from a decimal(38,2) column (PostgreSQL, read via Spark 2.4) into a decimal(18,4) column through JDBC, an error is thrown when the value is null:
```
java.sql.SQLException: Cannot handle data type: 2005
```
This sounds a bit like https://github.com/exasol/spark-exasol-connector/issues/46, albeit with different data types involved. Maybe a null is considered as wide as the input field, or becomes a CLOB?
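For what it's worth, 2005 is the value of the standard JDBC type constant `java.sql.Types.CLOB`, which is consistent with the CLOB guess:

```scala
// 2005 in the error message matches the JDBC CLOB type constant.
println(java.sql.Types.CLOB) // prints 2005
```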
It works fine when the field contains only non-null values, or when its size is shrunk to match the target field's size. A minimal sketch to reproduce is below.
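A minimal reproduction sketch, assuming an Exasol target over plain JDBC (as the linked issue suggests); the table name and connection details are hypothetical:

```scala
import java.math.BigDecimal

import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.types._

val spark = SparkSession.builder().appName("null-decimal-repro").getOrCreate()

// Nullable decimal(38,2) source column, as it arrives from PostgreSQL.
val schema = StructType(Seq(StructField("amount", DecimalType(38, 2), nullable = true)))
val rows = Seq(
  Row(new BigDecimal("12.34")),
  Row(null) // this row triggers the SQLException; without it the write succeeds
)
val df = spark.createDataFrame(spark.sparkContext.parallelize(rows), schema)

// Writing to a DECIMAL(18,4) target column fails with
// "java.sql.SQLException: Cannot handle data type: 2005" once a null is present.
df.write
  .format("jdbc")
  .option("url", "jdbc:exa:exasol-host:8563")  // hypothetical connection details
  .option("dbtable", "MY_SCHEMA.TARGET_TABLE") // target column is DECIMAL(18,4)
  .option("user", "sys")
  .option("password", "***")
  .mode("append")
  .save()

// Workaround per the observation above: shrink the decimal to the target size first.
// df.withColumn("amount", col("amount").cast(DecimalType(18, 4)))
```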
Thank you for your reply.
I use it directly.
I've experienced a similar issue (with exactly the same error message) with strings whose declared type was potentially too long for the target (the actual values never were), and today also with nulls inserted into varchar columns that should accept nulls.
Coalescing null strings into empty strings serves as a workaround, but shouldn't be required.
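For reference, a sketch of that workaround in Spark (the column name is hypothetical):

```scala
import org.apache.spark.sql.functions.{coalesce, col, lit}

// Replace nulls with empty strings before the JDBC write. "comment" is a
// hypothetical nullable string column. This sidesteps the 2005 error but
// stores '' instead of NULL in the target, which is why it shouldn't be needed.
val patched = df.withColumn("comment", coalesce(col("comment"), lit("")))
```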
I don't think the precision limit will be a problem in itself; it's just something to keep in mind when loading from systems with a higher limit.
I think it would be better to attempt the insert and fail only if value or precision truncation actually occurs, rather than throwing the "2005" error upfront.