Inserting from a decimal(38,2) into a decimal(18,4) through JDBC throws an error when the value is null

Padawan

When inserting from a decimal(38,2) (PostgreSQL via Spark 2.4) into a decimal(18,4) through JDBC, an error is thrown when the value is null:

java.sql.SQLException: Cannot handle data type: 2005

This sounds a bit like https://github.com/exasol/spark-exasol-connector/issues/46, albeit with different data types in question. Maybe a null is considered as wide as the input field, or becomes a CLOB? (Type code 2005 is java.sql.Types.CLOB.)

It works fine when the field contains no null values, or when its size is shrunk to match the target field.
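To illustrate why the null case is surprising: a SQL NULL carries no precision at all, so only non-null values can actually overflow the narrower target. This is a minimal pure-Python sketch of that fitting check using the standard `decimal` module (the function name and the check itself are illustrative, not part of any driver API):

```python
from decimal import Decimal, ROUND_HALF_UP

def fits_decimal_18_4(value):
    """Check whether a value read from a DECIMAL(38,2) column would
    fit into a DECIMAL(18,4) target without overflow.
    A NULL (None) trivially fits: it has no width or precision."""
    if value is None:
        return True
    # Rescale to 4 fractional digits, as the target column would.
    quantized = Decimal(value).quantize(Decimal("0.0001"),
                                        rounding=ROUND_HALF_UP)
    # DECIMAL(18,4) allows at most 18 - 4 = 14 digits before the point.
    return abs(quantized) < Decimal(10) ** 14

print(fits_decimal_18_4(None))                  # True: NULL always fits
print(fits_decimal_18_4("12345.67"))            # True
print(fits_decimal_18_4("111111111111111.00"))  # False: 15 integer digits
```

Under this reading, a driver that rejects the NULL itself (rather than only values that genuinely overflow) is being stricter than the data requires.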


Team Exasol

Hello @lensbjerg ,

Thanks for the feedback!

Exasol data types are sometimes implicitly converted to other types. Are you using the decimal column in any string or substring functions?

Yes, unfortunately the maximum precision of the Exasol DECIMAL type is 36.

Padawan

Hi @exa-Muhammet 

Thank you for your reply.

I use it directly.

I've experienced a similar issue (with exactly the same error message) with strings that were too long (potentially too long by data type, never actually too long), and today also with nulls inserted into varchar columns that should accept nulls.

Coalescing null strings into empty strings serves as a workaround, but shouldn't be required.
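In Spark the workaround above would typically be expressed as something like `coalesce(col("c"), lit(""))` before the JDBC write. A minimal pure-Python sketch of the same substitution (the helper is hypothetical, named only for illustration):

```python
def coalesce(value, fallback=""):
    """Mimic SQL COALESCE: substitute a fallback for NULL (None).
    Note this changes semantics -- '' and NULL are distinct values
    in SQL, which is why this should not be a required workaround."""
    return value if value is not None else fallback

rows = [("a", None), ("b", "x")]
patched = [(key, coalesce(val)) for key, val in rows]
print(patched)  # [('a', ''), ('b', 'x')]
```

The drawback is visible in the comment: every consumer downstream now sees an empty string where the source had a genuine NULL.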

I don't think the precision limit will be a problem - just something to keep in mind when loading from systems with a higher limit.

I think it would be better to attempt the insert and fail only if value or precision truncation actually occurs, rather than throwing the "2005" error up front.