19-06-2020 03:08 PM - last edited on 30-06-2020 03:09 PM by exa-Vandana
When inserting from a decimal(38,2) column (PostgreSQL via Spark 2.4) into a decimal(18,4) column through JDBC, an error is thrown when the value is null:
java.sql.SQLException: Cannot handle data type: 2005
This sounds a bit like https://github.com/exasol/spark-exasol-connector/issues/46, albeit with different data types involved. Maybe a null is considered as wide as the input field, or becomes a CLOB (JDBC type code 2005 is CLOB)?
It works fine when the column contains only non-null values, or when its precision is shrunk to match the target column.
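Roughly what the failing write looks like (Scala, Spark 2.4). The table names, column names, and connection details below are placeholders I've made up; the only thing that matters is a decimal(38,2) source column containing NULLs being written into a decimal(18,4) target:

import org.apache.spark.sql.SparkSession

object NullDecimalRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("null-decimal-repro").getOrCreate()

    // Source: a PostgreSQL table whose "amount" column is numeric(38,2) and
    // contains NULLs; Spark reads it as decimal(38,2).
    val source = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://pg-host:5432/sourcedb") // placeholder
      .option("dbtable", "public.payments")                     // placeholder
      .option("user", "pg_user")
      .option("password", "pg_password")
      .load()

    // Target: an Exasol table whose AMOUNT column is DECIMAL(18,4).
    // Writing the unmodified decimal(38,2) column is where
    // "java.sql.SQLException: Cannot handle data type: 2005" shows up
    // as soon as a NULL value is encountered.
    source.write
      .format("jdbc")
      .option("url", "jdbc:exa:exasol-host:8563")               // placeholder
      .option("driver", "com.exasol.jdbc.EXADriver")
      .option("dbtable", "TARGET_SCHEMA.PAYMENTS")              // placeholder
      .option("user", "exa_user")
      .option("password", "exa_password")
      .mode("append")
      .save()
  }
}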
22-06-2020 10:41 AM
Hello @lensbjerg ,
Thanks for the feedback!
Exasol data types are sometimes implicitly converted to other types. Are you using the decimal column in any string or substring commands?
Yes, unfortunately the maximum precision of the Exasol DECIMAL type is 36.
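If the source precision itself is the obstacle, one option is to cap the column at Exasol's maximum before the write. A minimal sketch, assuming a column named "amount" and that a scale of 2 is acceptable:

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

// Cap a decimal(38,2) column at Exasol's maximum DECIMAL precision of 36
// before handing the DataFrame to the JDBC writer. The column name and the
// kept scale are assumptions for illustration.
def capForExasol(df: DataFrame): DataFrame =
  df.withColumn("amount", col("amount").cast("decimal(36,2)"))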
24-06-2020 04:02 PM
Thank you for your reply.
I use the decimal column directly; it isn't wrapped in any string or substring functions.
I've experienced a similar issue (with exactly the same error message) with strings that were potentially too long by declared data type (never actually too long in the data), and today also with nulls inserted into VARCHAR columns that should accept nulls.
Coalescing null strings into empty strings serves as a workaround, but shouldn't be required.
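Roughly what that workaround looks like on my side; the column names and the decimal(18,4) target below are just examples, not my real schema:

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.{coalesce, col, lit}

// Workarounds applied before the JDBC write: replace NULL strings with empty
// strings, and shrink the decimal to the target column's precision/scale.
def applyWorkarounds(df: DataFrame): DataFrame =
  df
    .withColumn("comment", coalesce(col("comment"), lit("")))
    .withColumn("amount", col("amount").cast("decimal(18,4)"))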
I don't think the precision limit will be a problem - just something to keep in mind when loading from systems with a higher limit.
I think it would be better to attempt the insert and fail only if value/precision truncation actually occurs, rather than throwing the "2005" error up front.
16-07-2020 10:31 PM
17-07-2020 12:01 PM
Hi Muhammet
Only by performing the workarounds I mentioned.
BR
Christian