Header serialization exception in aggregator processor with Redis #500
Any explanation how an aggregator processor receives from the binder something that is not JSON? Could you share with us a stack trace from that serialization error, so we would have some clue what is going on? Thanks
Sorry, I forgot to mention that I don't use the latest version of the aggregator; I use an older one.
In fact, I think the problem happens before the aggregation, when the app is caching individual messages to Redis.
Here is the stack trace:
Thanks for your help!
OK. According to the stack trace, there is nothing wrong with the payload, and the real exception is like this:
So, your original request is wrong since we definitely don't talk about "JSON or not" for a payload: we are just failing to serialize message headers.
Therefore, we need to investigate what headers are populated by the binders and filter them out before they reach the aggregator handler. However, to trigger that process, I need you to rework this ticket with the proper problem description. Thank you for understanding!
Thanks, I didn't see that line in the stack trace! Here are the headers of one individual Kafka message from my example, if that helps:
Well, those headers are OK, and they are probably part of the Kafka record before it is pulled by the consumer. @sobychacko, any chance you can quickly enlighten us on what headers are populated by the Kafka binder on the inbound side? I wonder what that `kafka_consumer` header is. Thanks
@CEDDM, we spent some time debugging this issue with @artembilan. The problem occurs because of the Micrometer listeners we are adding in the binder, and hence Spring Kafka creates a proxy object for the actual `KafkaConsumer`. Add the following bean in your custom version.
When the binder detects this bean, it does not add any Micrometer listeners, thus avoiding the proxy issue above. Once you add it, you need to rebuild the app for the Kafka binder using the standard procedures for building a custom app. You will need to rebuild and re-generate the binder-based aggregator app until we develop a proper fix at the framework level.
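For illustration, a minimal, standalone sketch of the proxying described above, assuming a spring-kafka version from the time of this thread (before the fix mentioned below); the config values are placeholders:

```java
import java.lang.reflect.Proxy;
import java.util.Map;

import io.micrometer.core.instrument.simple.SimpleMeterRegistry;
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.MicrometerConsumerListener;

public class ProxiedConsumerDemo {

    public static void main(String[] args) {
        DefaultKafkaConsumerFactory<String, String> factory =
                new DefaultKafkaConsumerFactory<>(Map.of(
                        ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092",
                        ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class,
                        ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class));

        // Registering a listener (what the binder does for Micrometer metrics)
        // makes the factory hand out a JDK proxy instead of the raw KafkaConsumer.
        factory.addListener(new MicrometerConsumerListener<>(new SimpleMeterRegistry()));

        try (Consumer<String, String> consumer = factory.createConsumer()) {
            // true on the spring-kafka versions discussed in this thread
            System.out.println(Proxy.isProxyClass(consumer.getClass()));
        }
    }
}
```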
Here is the fix for Spring for Apache Kafka: spring-projects/spring-kafka#2822. Since there is nothing we can fix in this project to mitigate the problem, I'm going to close this. Well, the report is valid, but it does not trigger anything to be changed in this project, other than the version upgrade when it is available. Thank you for the report and understanding!
Thanks for the fix in Spring Kafka.
The PR for Spring for Apache Kafka has just been opened, so the fix won't make it into a release until October. Sorry for the inconvenience. Well, I can come up with a workaround for the version of Stream Applications whose release is due today, but that is already for a fully fresh generation of this project.
Correction: the release is postponed to next Wednesday.
Thanks a lot @artembilan! That's great news!
Fixes spring-cloud#500

When `listeners` are provided for `DefaultKafkaConsumerFactory`, the target `KafkaConsumer` instance is proxied. The `java.lang.reflect.Proxy` is `Serializable`, but the value it wraps is not. When the `MessageHeaders` is serialized (e.g. into a persistent `MessageStore`), only the top-level object of each header is checked for the `Serializable` type. The `Proxy` therefore passes the check, but we eventually fail with a `NotSerializableException`, since the proxied object is not serializable.

* Remove `kafka_consumer` from a message before it reaches the aggregator and its logic to serialize messages into the store

This is a workaround until Spring for Apache Kafka is released with the fix: spring-projects/spring-kafka#2822
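To make the failure mode concrete, here is a minimal, plain-JDK sketch of the mechanism the commit message describes: the proxy itself is `Serializable`, so the top-level header check passes, but writing it out fails on the non-serializable object it wraps.

```java
import java.io.ByteArrayOutputStream;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public class ProxySerializationDemo {

    public static void main(String[] args) throws Exception {
        // A non-serializable handler, standing in for the wrapped KafkaConsumer.
        InvocationHandler handler = (proxy, method, methodArgs) -> null;

        Runnable proxied = (Runnable) Proxy.newProxyInstance(
                ProxySerializationDemo.class.getClassLoader(),
                new Class<?>[] { Runnable.class },
                handler);

        // The top-level check passes: every JDK proxy extends java.lang.reflect.Proxy,
        // which implements Serializable.
        System.out.println(proxied instanceof Serializable); // true

        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(proxied); // but actually writing it fails
        }
        catch (NotSerializableException ex) {
            System.out.println("Caught: " + ex); // what the persistent message store runs into
        }
    }
}
```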
See related PR.
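As a rough illustration of what that workaround does (not the actual project code), the same header removal can be expressed with Spring Integration's `HeaderFilter`. Only the `kafka_consumer` header name comes from the fix above; the rest is a standalone sketch:

```java
import org.springframework.integration.transformer.HeaderFilter;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;

public class KafkaConsumerHeaderFilterDemo {

    public static void main(String[] args) {
        // Strips the binder-populated header whose value is the proxied (and
        // ultimately non-serializable) KafkaConsumer.
        HeaderFilter filter = new HeaderFilter("kafka_consumer");

        Message<?> incoming = MessageBuilder.withPayload("{\"id\":1}")
                .setHeader("kafka_consumer", new Object()) // stand-in for the proxy
                .build();

        Message<?> cleaned = filter.transform(incoming);
        System.out.println(cleaned.getHeaders().containsKey("kafka_consumer")); // false
    }
}
```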
When I configure my aggregator with Redis to cache JSON messages, I get a serialization exception because the object does not implement `Serializable`. This is a known problem with the Redis default serializer (`JdkSerializationRedisSerializer`). The problem is we don't have a real Java object when the aggregator is used in SCDF (the messages come from Kafka or RabbitMQ). The recommended workaround I found is to use `GenericJackson2JsonRedisSerializer`, but there is no way to do this without forking the aggregator (see the answer of @artembilan here: https://stackoverflow.com/questions/77088203/redis-serializer-properties-in-scdf). It would be great to add a property to set the Redis serializer in the app configuration.
EDIT: According to @artembilan's analysis, the problem comes from the headers, not the payload, so the description above is wrong. Some headers must be filtered out if they are not serializable.