Getting "UnicodeDecodeError: 'charmap' codec can't decode byte 0x81 in position 1980: character maps to <undefined>" when streaming #20

Open
Pandibrudi opened this issue Apr 28, 2025 · 2 comments


@Pandibrudi

Pandibrudi commented Apr 28, 2025

Python version: 3.13.3
OS: Windows 10 Home, 64-bit (language: German)
litellm version: 1.67.4

Test 1:

```python
>>> import sys
>>> sys.getdefaultencoding()
'utf-8'
```

Test 2:

```python
>>> import litellm
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
    import litellm
  File "C:\Users\Fabia\Documents\Arbeit\Coding\Django-Projekte\project-robopsy\.venv\Lib\site-packages\litellm\__init__.py", line 762, in <module>
    from .cost_calculator import completion_cost
  File "C:\Users\Fabia\Documents\Arbeit\Coding\Django-Projekte\project-robopsy\.venv\Lib\site-packages\litellm\cost_calculator.py", line 19, in <module>
    from litellm.litellm_core_utils.llm_cost_calc.utils import (
    ...<2 lines>...
    )
  File "C:\Users\Fabia\Documents\Arbeit\Coding\Django-Projekte\project-robopsy\.venv\Lib\site-packages\litellm\litellm_core_utils\llm_cost_calc\utils.py", line 9, in <module>
    from litellm.utils import get_model_info
  File "C:\Users\Fabia\Documents\Arbeit\Coding\Django-Projekte\project-robopsy\.venv\Lib\site-packages\litellm\utils.py", line 188, in <module>
    json_data = json.load(f)
  File "C:\Users\Fabia\AppData\Local\Programs\Python\Python313\Lib\json\__init__.py", line 293, in load
    return loads(fp.read(),
           ~~~~~~~^^
  File "C:\Users\Fabia\AppData\Local\Programs\Python\Python313\Lib\encodings\cp1252.py", line 23, in decode
    return codecs.charmap_decode(input,self.errors,decoding_table)[0]
           ~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
UnicodeDecodeError: 'charmap' codec can't decode byte 0x81 in position 1980: character maps to <undefined>
```
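For context, this failure mode can be reproduced independently of litellm: on Windows, `open()` without an `encoding` argument falls back to the locale code page (cp1252 on a German Windows install), so reading a UTF-8 JSON file that contains bytes undefined in that code page raises exactly this error. A minimal sketch (the file name is illustrative, not litellm's actual file):

```python
import json
import os
import tempfile

# Write a UTF-8 JSON file containing "ā" (U+0101); its UTF-8 encoding is
# b'\xc4\x81', and byte 0x81 is undefined in cp1252.
path = os.path.join(tempfile.mkdtemp(), "model_prices.json")
with open(path, "w", encoding="utf-8") as f:
    json.dump({"model": "ā"}, f, ensure_ascii=False)

# Decoding with the Windows locale codec fails, as in the traceback above:
try:
    with open(path, encoding="cp1252") as f:
        json.load(f)
except UnicodeDecodeError as e:
    print(e)  # 'charmap' codec can't decode byte 0x81 ...

# Reading with an explicit UTF-8 encoding succeeds:
with open(path, encoding="utf-8") as f:
    print(json.load(f)["model"])
```

On Linux and macOS the locale default is usually UTF-8 already, which is why the bug surfaces only on Windows unless the codec is forced as above.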

Code:

llm_service.py

```python
import os, sys
from dotenv import load_dotenv  # for secret key
from typing import Optional, List, Dict
sys.path.append(os.path.join(".."))
from llms_wrapper.llms import LLMS, toolnames2funcs, get_func_by_name
from llms_wrapper.config import update_llm_config

def complete_new(messages):
    config = dict(
        llms=[
            # OpenAI
            # https://platform.openai.com/docs/models
            dict(llm="openai/gpt-4o"),
            dict(llm="openai/gpt-4o-mini"),
        ],
        providers=dict(
            openai=dict(api_key_env=os.getenv("OPENAI_API_KEY")),
        ),
    )
    config = update_llm_config(config)
    llms = LLMS(config)
    llms.list_aliases()

    msgs = LLMS.make_messages(messages)

    ret = llms.query("openai/gpt-4o", msgs, temperature=0.5, max_tokens=1000, stream=True)

    if ret["ok"]:
        for chunk in ret["chunks"]:
            if chunk["error"]:
                print()
                print("Error:", chunk["error"])
                break
            print(chunk["answer"], end="")
            yield chunk.choices[0].delta.content
        print()
    else:
        print("Error:", ret["error"])
```

Terminal:

```
Traceback (most recent call last):
  File "c:\Users\Fabia\Documents\Arbeit\Coding\Django-Projekte\project-robopsy\RobopsyApp\conversation\llm_service.py", line 6, in <module>
    from llms_wrapper.llms import LLMS, toolnames2funcs, get_func_by_name
  File "C:\Users\Fabia\Documents\Arbeit\Coding\Django-Projekte\project-robopsy\.venv\Lib\site-packages\llms_wrapper\llms.py", line 9, in <module>
    import litellm
  File "C:\Users\Fabia\Documents\Arbeit\Coding\Django-Projekte\project-robopsy\.venv\Lib\site-packages\litellm\__init__.py", line 762, in <module>
    from .cost_calculator import completion_cost
  File "C:\Users\Fabia\Documents\Arbeit\Coding\Django-Projekte\project-robopsy\.venv\Lib\site-packages\litellm\cost_calculator.py", line 19, in <module>
    from litellm.litellm_core_utils.llm_cost_calc.utils import (
    ...<2 lines>...
    )
  File "C:\Users\Fabia\Documents\Arbeit\Coding\Django-Projekte\project-robopsy\.venv\Lib\site-packages\litellm\litellm_core_utils\llm_cost_calc\utils.py", line 9, in <module>
    from litellm.utils import get_model_info
  File "C:\Users\Fabia\Documents\Arbeit\Coding\Django-Projekte\project-robopsy\.venv\Lib\site-packages\litellm\utils.py", line 188, in <module>
    json_data = json.load(f)
  File "C:\Users\Fabia\AppData\Local\Programs\Python\Python313\Lib\json\__init__.py", line 293, in load
    return loads(fp.read(),
           ~~~~~~~^^
  File "C:\Users\Fabia\AppData\Local\Programs\Python\Python313\Lib\encodings\cp1252.py", line 23, in decode
    return codecs.charmap_decode(input,self.errors,decoding_table)[0]
           ~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
UnicodeDecodeError: 'charmap' codec can't decode byte 0x81 in position 1980: character maps to <undefined>
```

@johann-petrak
Member

Try installing the modified litellm into your environment and test again:

First, install git, then:

  • clone the following repository: https://github.com/johann-petrak/litellm
  • check out the issue10272 branch
  • with your environment activated in a terminal, change into the root directory of that repo with the branch checked out and run `pip install -e .`

After this, try running `import litellm` inside a Python interpreter again.
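Separately from testing the patched branch, a common interim workaround on Windows for this class of error is Python's UTF-8 mode (PEP 540), which makes locale-dependent `open()` calls default to UTF-8 instead of cp1252. This is a general workaround, not the proposed litellm fix, and whether it masks this specific issue is untested here; a sketch:

```python
# UTF-8 mode can be enabled via `set PYTHONUTF8=1` (cmd),
# `$env:PYTHONUTF8 = "1"` (PowerShell), or `python -X utf8 ...`.
import os
import subprocess
import sys

# Launch a child interpreter with UTF-8 mode enabled and show that the
# locale-dependent default encoding used by open() becomes UTF-8.
env = dict(os.environ, PYTHONUTF8="1")
out = subprocess.run(
    [sys.executable, "-c",
     "import locale; print(locale.getpreferredencoding(False))"],
    env=env, capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # prints "utf-8" when UTF-8 mode is active
```

With UTF-8 mode active, litellm's `json.load(f)` call in `utils.py` would read its bundled JSON as UTF-8 even though no explicit encoding is passed.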

@johann-petrak
Member

Fix proposed in PR BerriAI/litellm#10380.
