Support output of reasoning_content for deepseek-r1 #2283
Comments
+1, why is chain-of-thought still not supported? It's really frustrating.
Related documentation: https://api-docs.deepseek.com/guides/reasoning_model
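For reference, the linked guide returns the chain of thought in a dedicated field. A hand-written sketch of a deepseek-reasoner chat completion response (field values are illustrative, and fields irrelevant to this issue are omitted):

```json
{
  "choices": [
    {
      "message": {
        "role": "assistant",
        "reasoning_content": "First, break the question down into ...",
        "content": "The final answer is ..."
      }
    }
  ]
}
```

The point of this issue is that `reasoning_content` has no counterpart in the library's Message output, so it is currently dropped or merged into `content`.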
Currently, the chain of thought (reasoning_content) is merged with the actual content, which also causes the structured output converters to fail. I created a rudimentary custom converter as a temporary workaround, but this functionality should be natively supported by the library.
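A minimal sketch of such a workaround, assuming the deployment emits the chain of thought wrapped in <think>...</think> tags inside the merged content (the class and method names here are hypothetical, not part of Spring AI):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical helper: splits a merged completion into reasoning and answer.
// Assumes the model wraps its chain of thought in <think>...</think> tags,
// which is how some deepseek-r1 deployments emit reasoning when no separate
// reasoning_content field is exposed.
public class ReasoningSplitter {

    private static final Pattern THINK =
            Pattern.compile("(?s)<think>(.*?)</think>\\s*(.*)");

    public record Split(String reasoning, String content) {}

    public static Split split(String merged) {
        Matcher m = THINK.matcher(merged);
        if (m.matches()) {
            return new Split(m.group(1).trim(), m.group(2).trim());
        }
        // No think block found: treat everything as the answer.
        return new Split("", merged.trim());
    }

    public static void main(String[] args) {
        Split s = ReasoningSplitter.split(
                "<think>step 1: consider the question</think>The answer is 42.");
        System.out.println(s.reasoning());
        System.out.println(s.content());
    }
}
```

The stripped `content` can then be handed to the structured output converters, which otherwise choke on the reasoning text.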
This issue is being resolved by PR #2192. Example of a deepseek-reasoner call and response:
request:
response:
Please do a quick search on GitHub issues first, the feature you are about to request might have already been requested.
Expected Behavior
ChatModel.stream(prompt)
    .doOnNext(response -> {
        System.out.println(response.getResult().getOutput().getContent());
        // Output reasoning_content
        System.out.println(response.getResult().getOutput().getReasoningContent());
    })
    .subscribe();
Current Behavior
ChatModel.stream(prompt)
    .doOnNext(response -> {
        // reasoning_content is not available; only content can be read
        System.out.println(response.getResult().getOutput().getContent());
    })
    .subscribe();
Context
When running LLM Q&A through ChatModel, deepseek-r1 emits its thinking in reasoning_content. Currently the Message output only has a content field, so the model's reasoning cannot be received. Please add the related field.
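A sketch of the requested output shape (the names below are hypothetical illustrations of the proposal, not the actual Spring AI API): the assistant output would carry reasoning_content alongside the regular content, with reasoning absent for non-reasoning models.

```java
// Hypothetical sketch of the requested message shape.
public record AssistantOutput(String content, String reasoningContent) {

    // reasoning_content may be null when the model emits no reasoning;
    // normalize to an empty string so callers can print it unconditionally.
    public static AssistantOutput of(String content, String reasoningContent) {
        return new AssistantOutput(
                content, reasoningContent == null ? "" : reasoningContent);
    }
}
```

With a field like this, the streaming code in "Expected Behavior" above could read both the answer and the model's thinking from each chunk.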