Output of reasoning_content that supports deepseek-r1 is required #2283

Open
stillmoon opened this issue Feb 20, 2025 · 4 comments · May be fixed by #2192
Comments

@stillmoon


Expected Behavior

chatModel.stream(prompt)
    .doOnNext(response -> {
        System.out.println(response.getResult().getOutput().getContent());
        // also output reasoning_content (the accessor requested here)
        System.out.println(response.getResult().getOutput().getReasoningContent());
    })
    .subscribe();

Current Behavior

chatModel.stream(prompt)
    .doOnNext(response -> {
        // only content is available; reasoning_content is not exposed
        System.out.println(response.getResult().getOutput().getContent());
    })
    .subscribe();

Context

When running LLM Q&A through ChatModel, deepseek-r1 returns its chain of thought in the reasoning_content field. Currently the Message output only exposes a content field, so the model's reasoning cannot be received. Please add a field for it.
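
A minimal sketch of what such a field could look like, assuming a DeepSeek-specific assistant message type (the class name and accessor below are illustrative, not the API adopted by the project):

import org.springframework.ai.chat.messages.AssistantMessage;

// Illustrative only: names are assumptions, not the project's actual API.
public class ReasoningAssistantMessage extends AssistantMessage {

    private final String reasoningContent;

    public ReasoningAssistantMessage(String content, String reasoningContent) {
        super(content);
        this.reasoningContent = reasoningContent;
    }

    // Exposes the model's chain of thought alongside the regular content.
    public String getReasoningContent() {
        return this.reasoningContent;
    }
}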

@Ltyro

Ltyro commented Feb 20, 2025

+1, why is chain-of-thought still not supported? It's really frustrating.

@dev-jonghoonpark
Contributor

Related document: https://api-docs.deepseek.com/guides/reasoning_model

@hardikSinghBehl

hardikSinghBehl commented Feb 21, 2025

Currently, the chain of thought (reasoning_content) is merged with the actual content, which also causes the structured output converters to fail. I created the rudimentary custom converter below as a temporary workaround, but yes, this functionality should be natively supported by the library.

DeepSeekModelOutputConverter.java
ChatbotService.java
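
The attached converter is not reproduced here; below is a minimal sketch of the same idea, assuming the reasoning block arrives wrapped in <think>...</think> tags inside the merged content (this varies by deployment), with the cleaned text delegated to Spring AI's BeanOutputConverter. The class name is illustrative.

import org.springframework.ai.converter.BeanOutputConverter;

// Sketch of a temporary workaround: strip the reasoning block before
// handing the text to the structured output converter.
public class ThinkTagStrippingConverter<T> {

    private final BeanOutputConverter<T> delegate;

    public ThinkTagStrippingConverter(Class<T> targetType) {
        this.delegate = new BeanOutputConverter<>(targetType);
    }

    public T convert(String rawOutput) {
        // Drop everything up to and including the closing </think> tag, if present.
        int end = rawOutput.lastIndexOf("</think>");
        String cleaned = (end >= 0)
                ? rawOutput.substring(end + "</think>".length()).trim()
                : rawOutput;
        return this.delegate.convert(cleaned);
    }

    public String getFormat() {
        return this.delegate.getFormat();
    }
}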

@apappascs
Contributor

This issue is being addressed by PR #2192.

Example of deepseek-reasoner call and response:

request:

curl -v -X POST "https://api.deepseek.com/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-your-api-key" \
  -d '{
    "model": "deepseek-reasoner",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "What is the answer to the Great Question of Life, the Universe, and Everything?"}
    ],
    "stream": false
  }'

response:

{
  "id":"6090f86a-12aa-4xxxx-89af-85xxxxxx",
  "object":"chat.completion",
  "created":1740134353,
  "model":"deepseek-reasoner",
  "choices":[
    {
      "index":0,
      "message":{
        "role":"assistant",
        "content":"The answer to the Great Question of Life, the Universe, and Everything, as famously depicted in Douglas Adams' *The Hitchhiker's Guide to the Galaxy*, is **42**. \n\nHowever, the story humorously reveals that while the supercomputer Deep Thought calculated this answer over millions of years, the actual *question* corresponding to it remains ambiguous—highlighting the absurdity of seeking absolute meaning in a vast, chaotic universe. 😊",
        "reasoning_content":"Okay, let's see. The user is asking about the answer to the Great Question of Life, the Universe, and Everything. Hmm, I remember that this is a reference to \"The Hitchhiker's Guide to the Galaxy\" by Douglas Adams. In the book, a supercomputer named Deep Thought was built to calculate the answer to this ultimate question. After a lot of time and processing, the computer comes up with the number 42. But then the characters realize they didn't actually know what the question was. So the answer is 42, but the joke is that the question isn't really known.\n\nWait, but maybe the user is looking for a more philosophical answer? Like, not just the fictional reference. But given the way the question is phrased, \"the Great Question of Life, the Universe, and Everything\" is almost certainly pointing to the Hitchhiker's Guide joke. The answer is famously 42. I should confirm that I'm not missing any other context here. Maybe check if there's another interpretation, but I don't think so. This is a well-known pop culture reference. So the answer is 42, and maybe a brief explanation about the book reference to be helpful."
      },
      "logprobs":null,
      "finish_reason":"stop"
    }
  ],
  "usage":{
    "prompt_tokens":28,
    "completion_tokens":338,
    "total_tokens":366,
    "prompt_tokens_details":{
      "cached_tokens":0
    },
    "completion_tokens_details":{
      "reasoning_tokens":247
    },
    "prompt_cache_hit_tokens":0,
    "prompt_cache_miss_tokens":28
  },
  "system_fingerprint":"fp_5417b77867_prod"
}
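
Until the library exposes reasoning_content directly, both fields can be read from the raw response body shown above; a sketch with plain Jackson follows (class and method names are illustrative):

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ReasoningResponseReader {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Prints the content and reasoning_content of the first choice.
    public static void printMessageFields(String responseJson) throws Exception {
        JsonNode message = MAPPER.readTree(responseJson)
                .path("choices").path(0).path("message");
        System.out.println("content: " + message.path("content").asText());
        System.out.println("reasoning_content: " + message.path("reasoning_content").asText());
    }
}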
