Compatibility with ksqlDB - path /apis/ccompat/v6/schemas/ids/0 not found #2151
Hi @barnyrelph, I encountered this very issue just this past week. After scouring online with nothing positive, I accidentally traced the issue to a missing option in the Debezium source connector configuration. Add this option to your configuration to rectify it: "value.converter.apicurio.registry.as-confluent": true. While introducing the above option ensures a lookup with the global ID assigned to the schema, the lookup still returns a 404. The reason for this appears to be that Apicurio expects the value of the … Yet to figure this out.
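For context, a rough sketch of where that option sits in a Debezium connector registration might look like the following; the connector name, class, URLs and connection details are placeholders, not taken from this issue.

```python
import requests  # the Kafka Connect REST endpoint below is a placeholder

# Hypothetical Debezium source connector registration; only the converter
# lines at the bottom are the point being made above.
connector = {
    "name": "example-postgres-connector",  # placeholder name
    "config": {
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
        # ... other Debezium connection settings omitted ...
        "value.converter": "io.apicurio.registry.utils.converter.AvroConverter",
        "value.converter.apicurio.registry.url": "http://apicurio:8080/apis/registry/v2",  # placeholder
        # The option discussed above: encode a Confluent-style 4-byte schema ID
        # in each message instead of Apicurio's 8-byte global ID.
        "value.converter.apicurio.registry.as-confluent": "true",
    },
}

requests.post("http://connect:8083/connectors", json=connector).raise_for_status()
```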
So there are two compatibility issues you might run into when using both Confluent and Apicurio tools. The first is a client setting in Apicurio tooling that instructs our tools to use integers instead of longs for the IDs we encode in the messages: Confluent uses 4-byte integers while we use 8-byte longs. That is the option you discovered, and it is one piece of the puzzle.

The other piece is that early versions of Apicurio used global identifiers rather than content identifiers as the unique lookup key in our tools. Global IDs are unique for every version of every artifact, even if the same content is uploaded multiple times. Content IDs, on the other hand, are shared: if you upload the same content multiple times you will get back the same ID. Confluent's server has always used content IDs (although I think they call them global IDs) in its API. As of version 2.0 of Apicurio Registry, our API uses content IDs as well, so it should be easier to interoperate between our tools and theirs. However, if you are still using any legacy Apicurio tools, or the v1 edition of our REST API, there may still be an incompatibility.

Does that context help at all? It's possible we still have a mismatch somewhere between the Confluent compatibility API and our core API.
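To make the ID-size difference concrete, a rough sketch of the two header layouts (these should be verified against the serializer versions actually in use) also suggests why a Confluent-format reader can end up asking for schema ID 0:

```python
import struct

schema_id = 9  # e.g. one of the IDs mentioned later in this thread

# Confluent-style header: magic byte 0x00 followed by a 4-byte big-endian ID.
confluent_header = struct.pack(">bI", 0, schema_id)

# Legacy Apicurio-style header (roughly): magic byte followed by an 8-byte big-endian global ID.
apicurio_header = struct.pack(">bq", 0, schema_id)

# A Confluent-format reader only consumes 4 ID bytes; against the 8-byte field it
# reads the high-order bytes of a small long, i.e. 0, which could explain the
# lookups against .../schemas/ids/0 reported below.
misread_id = struct.unpack(">I", apicurio_header[1:5])[0]
print(confluent_header.hex(), apicurio_header.hex(), misread_id)
```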
This is an issue we have as well. Setting up a data infrastructure at a Parisian hospital. I think we'll have to switch to the Confluent schema registry because of this :/ We tried all the solutions mentioned in this thread. Context: we use Debezium and ksqldb.
Sorry for being so late to the party. In addition to what Eric described, you can still force the Confluent compatibility API to use the global ID across the entire API by setting the environment variable ENABLE_CCOMPAT_LEGACY_ID_MODE to true in your Apicurio Registry instance. This is the important point: it is a configuration for the server, not the converter. Alternatively (and probably easier), you can instruct the converter to use the content ID instead of the global ID by setting apicurio.registry.use-id to contentId.
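A compact sketch of those two options, using the property and variable names quoted in this thread (everything else omitted):

```python
# Option 1: server-side. Set this environment variable on the Apicurio Registry
# container (not on the connector) so the ccompat API resolves by global ID.
registry_environment = {
    "ENABLE_CCOMPAT_LEGACY_ID_MODE": "true",
}

# Option 2: converter-side. Tell the Apicurio converter to use content IDs,
# which is what the ccompat API resolves by default in Registry 2.x.
converter_overrides = {
    "value.converter.apicurio.registry.use-id": "contentId",
}
```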
--UPDATE-- Resolved this issue some months back. I have data pipelines running smoothly now with Kafka, Debezium connectors, Apicurio Registry and ksqlDB.
Repeat the same settings if you are serializing/deserializing your message key data with a schema registry (Apicurio Registry in my case).
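The working configuration itself isn't quoted above, but combining the options discussed in this thread, the converter section of such a connector might look roughly like this ("repeat the same for the key" becomes the key.converter.* lines; the registry URL is a placeholder):

```python
# Hypothetical converter section of a Debezium connector config; the value-side
# settings are mirrored on the key side, as the update above suggests.
converter_config = {
    "key.converter": "io.apicurio.registry.utils.converter.AvroConverter",
    "key.converter.apicurio.registry.url": "http://apicurio:8080/apis/registry/v2",    # placeholder
    "key.converter.apicurio.registry.as-confluent": "true",
    "key.converter.apicurio.registry.use-id": "contentId",
    "value.converter": "io.apicurio.registry.utils.converter.AvroConverter",
    "value.converter.apicurio.registry.url": "http://apicurio:8080/apis/registry/v2",  # placeholder
    "value.converter.apicurio.registry.as-confluent": "true",
    "value.converter.apicurio.registry.use-id": "contentId",
}
```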
Hi, yes, that is the expected configuration; you can't use as-confluent without also setting use-id. @tgy can you please confirm whether setting this configuration fixes the problem for you? Thanks!
Hi guys, thanks a lot for your answers. We're looking into the solutions you shared and will let you know if they work!
Hi @carlesarnal and @ofelix03,
It does correctly identify the fields in the schema:
However, when I want to query the data, I obtain no result:
When I run this query, I see the following error repeating in the apicurio log:
This error message repeats many times (maybe as many times as there are messages in the topic). At the same time, the ksqldb-server logs show the following error:
It looks like ksqldb queries ID 0, which is strange since my IDs start at 1. I have schema IDs 1 to 17. The relevant schema IDs for the topic in question are 8 for dwcavro.Philips.PatientData.Export.Numeric-key and 9 for dwcavro.Philips.PatientData.Export.Numeric-value. If I request the correct IDs from Apicurio, it actually returns a result. I don't understand why ksqldb requests ID 0 over and over again.
Best regards,
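The manual lookups mentioned above can be reproduced directly against the compatibility API; a short sketch, assuming the registry is reachable at a placeholder address:

```python
import requests

REGISTRY = "http://localhost:8080"  # placeholder; point this at the Apicurio Registry instance

# Reproduce the behaviour described above: IDs 8 and 9 resolve, while ID 0
# (the one ksqlDB keeps asking for) returns a 404.
for schema_id in (8, 9, 0):
    resp = requests.get(f"{REGISTRY}/apis/ccompat/v6/schemas/ids/{schema_id}")
    print(schema_id, resp.status_code)
```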
To better understand what's going on, please share some details.
@ofelix03 Thanks for taking the time to look at our setup. Our Apicurio Registry is stored in a PostgreSQL database (in …). We're not setting any …
@ofelix03 @tgy
@ofelix03 @tgy
However, ksqldb is able to parse the topic with no error. It doesn't even try to query ids/0. Something must be wrong in the Apicurio emulation of the Confluent API, prompting ksqldb to request schema ID 0 instead of the correct schema ID.
Best regards,
If you can share your full setup (you mentioned you're running docker-compose), and not only some snippets, that would be very helpful; obviously omit any confidential information. I think you might be missing some essential configuration on the Apicurio Registry side of things, so I would like to take a look at your configuration. Keep in mind that although we provide a compatibility API, both the tooling and the server need to be configured properly to make things work. Thanks in advance.
Hi Carl. Thanks for taking the time. Here's our …
and this:
Hi, after some testing, I think you're missing some key configuration points in your connector. Here is an example I just used to successfully run the whole stack together (kafka-ui, apicurio-registry, debezium, ksqldb).
Notice especially the difference in …
Closing as stale. If someone finds this in the future, please re-open the issue.
Hi All, I'm using Debezium to capture Postgres events, stream them into Kafka and hopefully process them in ksqlDB, then stream them out to a separate Postgres DB for analysis.
I have a Connector set up watching a couple of tables. On inserting into the tables, I can see schemas created in the registry.
I have ksqlDB set up using:
I then create a stream in ksqlDB:
Upon insert, the schema registry is populated and I can see ksqlDB go looking for the schema, but in the logs for the apicurio registry, I can see this:
That last request seems to be where things are going awry. Is this a problem with my configuration (most likely), or is some aspect of the Confluent compatibility API missing?
Connector setup snippet for reference:
Thanks in advance for any guidance.
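The ksqlDB configuration and CREATE STREAM statement referenced above aren't reproduced in the thread. For readers rebuilding a similar pipeline, a minimal sketch of issuing such a statement through ksqlDB's REST API might look like this (the server address, stream name and topic name are placeholders, not the reporter's originals):

```python
import requests

KSQLDB = "http://localhost:8088"  # placeholder ksqlDB server address

# Hypothetical stream over a Debezium topic; VALUE_FORMAT='AVRO' makes ksqlDB
# fetch the value schema from the registry configured via ksql.schema.registry.url.
statement = """
CREATE STREAM patient_numeric
  WITH (KAFKA_TOPIC='dbserver.public.numeric', VALUE_FORMAT='AVRO');
"""

resp = requests.post(
    f"{KSQLDB}/ksql",
    headers={"Content-Type": "application/vnd.ksql.v1+json; charset=utf-8"},
    json={"ksql": statement, "streamsProperties": {}},
)
resp.raise_for_status()
print(resp.json())
```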