
Using convert.py with a fine tuned phi-2 #5009

Closed
@FiveTechSoft

Description


We load phi-2 from HF with load_in_8bit=True and torch_dtype=torch.float16, then we fine-tune it and finally save it locally.
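Roughly what we do, as a sketch (the fine-tuning loop itself is omitted; the local path is a placeholder, and device_map / trust_remote_code are just what we typically pass, adjust as needed):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load phi-2 in 8-bit, as described above
model = AutoModelForCausalLM.from_pretrained(
    "microsoft/phi-2",
    load_in_8bit=True,            # bitsandbytes int8 quantization
    torch_dtype=torch.float16,
    device_map="auto",
    trust_remote_code=True,       # may or may not be needed, depending on the transformers version
)
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2", trust_remote_code=True)

# ... fine-tuning happens here (details omitted) ...

# Save the result locally; with load_in_8bit=True the safetensors
# shards contain int8 weight tensors
model.save_pretrained("./phi-2")
tokenizer.save_pretrained("./phi-2")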

When running convert.py ./phi-2 we get this error:
File "/content/convert.py", line 764, in convert
data_type = SAFETENSORS_DATA_TYPES[info['dtype']]
KeyError: 'I8'
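'I8' is the safetensors dtype tag for int8 tensors: saving a model that was loaded with load_in_8bit=True writes the bitsandbytes int8 weights to disk, and convert.py's SAFETENSORS_DATA_TYPES table has no entry for that tag, hence the KeyError. A quick way to confirm what the checkpoint contains is to read the safetensors header directly (the shard filename below is a guess, adjust it to the files actually present in ./phi-2):

import json
import struct
from collections import Counter

path = "./phi-2/model.safetensors"   # hypothetical shard name

with open(path, "rb") as f:
    header_len = struct.unpack("<Q", f.read(8))[0]   # first 8 bytes: header size
    header = json.loads(f.read(header_len))          # JSON header with per-tensor info

dtypes = Counter(info["dtype"] for name, info in header.items() if name != "__metadata__")
print(dtypes)   # an 8-bit checkpoint shows 'I8' entries alongside the float tensors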

If we try the same with load_in_8bit=False, we get:
File "/content/convert.py", line 257, in loadHFTransformerJson
f_norm_eps = config["rms_norm_eps"],
KeyError: 'rms_norm_eps'
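With 8-bit loading disabled, the dtype problem goes away, but convert.py then fails on the config: it targets LLaMA-family models and expects rms_norm_eps in config.json, while phi-2 uses regular LayerNorm and stores its epsilon under a different key. A quick look at the saved config shows this (the exact key names below are our guess, check your own config.json):

import json

with open("./phi-2/config.json") as f:
    config = json.load(f)

print(config.get("architectures"))                 # e.g. ['PhiForCausalLM'] for phi-2
print("rms_norm_eps" in config)                    # False: phi-2 has no RMSNorm epsilon
print(config.get("layer_norm_eps", config.get("layer_norm_epsilon")))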

How can we generate a GGUF from a fine-tuned phi-2? Many thanks.
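If it helps, here is a sketch of one path we could try, assuming the fine-tune is a PEFT/LoRA adapter and that convert-hf-to-gguf.py (rather than convert.py) handles phi-2: merge the adapter into a plain fp16 copy of the base model, save that, and convert the fp16 checkpoint (adapter path below is a placeholder):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Reload the base model in plain fp16, without 8-bit quantization
base = AutoModelForCausalLM.from_pretrained(
    "microsoft/phi-2",
    torch_dtype=torch.float16,
    trust_remote_code=True,
)

# Apply the fine-tuned adapter and merge it into the fp16 weights
model = PeftModel.from_pretrained(base, "./phi-2-adapter")   # hypothetical adapter path
model = model.merge_and_unload()

# Save a float16 checkpoint for conversion, e.g. with convert-hf-to-gguf.py
model.save_pretrained("./phi-2-fp16")
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2", trust_remote_code=True)
tokenizer.save_pretrained("./phi-2-fp16")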
