
Releases: likelovewant/ollama-for-amd

v0.6.8

07 May 02:53
5d967d5
  • Detailed Installation Guide: please refer to the wiki guide.
  • Demo Release Version: this release's OllamaSetup.exe is built with ROCm 6.2 (HIP SDK 6.2.4); make sure to replace the libs with the v0.6.2.4 ROCm libraries.

Tip

ROCm 5.7 version (limited availability)

ollama-windows-amd64-rocm5.7z for ROCm 5.7
Supports gfx803 gfx900:xnack- gfx902 gfx1103

ollama-windows-amd64.7z for ROCm 6.2.4
ROCm libs for 6.2 are available at ROCmlibs for 6.2.4

Supports gfx906:xnack- gfx1010:xnack- gfx1012:xnack- gfx1030 gfx1031 gfx1032 gfx1034 gfx1035 gfx1036 gfx1100 gfx1101 gfx1103 gfx1150 gfx1151 gfx1201

Note

Let's get Ollama set up!

  1. Install or Extract: You can either run the OllamaSetup.exe file, or download the
    ollama-windows-amd64.7z file and unzip it.

  2. Update ROCm Libraries:

    • Find the rocblas.dll file and the rocblas/library folder within your Ollama installation folder (usually
      located at C:\Users\usrname\AppData\Local\Programs\Ollama\lib\ollama\rocm).
    • Delete the existing rocblas/library folder.
    • Replace it with the correct ROCm libraries. You’ll need the right version for your GPU. You can find
      them here: ROCmLibs for 6.2.4 or ROCmLibs for 5.7
  3. Start Ollama: Once the ROCm libraries are updated, you can start using Ollama. Run ollama run <model name> or ./ollama serve, depending on how you installed it. (A scripted version of step 2 is sketched below.)

Or use the one-click installer Ollama-For-AMD-Installer by ByronLeeeee.
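
If you prefer to script step 2, here is a minimal PowerShell sketch. It assumes the default install path mentioned above and that you have already extracted the matching ROCmLibs archive locally; the download folder used here (C:\Downloads\rocmlibs) is a placeholder, and the archive layout may differ slightly between versions.

```powershell
# Minimal sketch of step 2 (library replacement). Paths are placeholders; adjust to your setup
# and to the layout of the ROCmLibs archive you downloaded.
$ollamaRocm = "$env:LOCALAPPDATA\Programs\Ollama\lib\ollama\rocm"   # default install location
$newLibs    = "C:\Downloads\rocmlibs"                               # where you extracted the ROCmLibs archive (placeholder)

# Keep a backup of the stock rocblas.dll, then delete the old library folder outright
# (do not merge new files into it).
Copy-Item "$ollamaRocm\rocblas.dll" "$ollamaRocm\rocblas.dll.bak" -Force
Remove-Item "$ollamaRocm\rocblas\library" -Recurse -Force

# Copy in the replacement rocblas.dll and the library folder that matches your GPU architecture.
Copy-Item "$newLibs\rocblas.dll" "$ollamaRocm\rocblas.dll" -Force
Copy-Item "$newLibs\library" "$ollamaRocm\rocblas\library" -Recurse
```

Once the libraries are in place, start Ollama the way you installed it:

```powershell
# Installer build: the ollama command is on PATH ("llama3.2" is only an example model name).
ollama run llama3.2

# Standalone .7z package: start the server from the extracted folder instead.
.\ollama.exe serve
```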

If the error log shows amdgpu is not supported (supported types: [gfx1030 gfx1100 gfx1101 gfx1102 gfx906]), it means you have missed a step. Please check the steps above, or replace the ROCm libs in your HIP SDK if needed.
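
If the steps above look correct but the error persists, a rough sanity check (assuming the default install path; exact file names vary by ROCm version) is to confirm that the replaced library folder actually contains files for your GPU's gfx target:

```powershell
# List the replaced library folder; file names should include your GPU's gfx target (e.g. gfx1035).
# If your target is missing, the wrong ROCmLibs bundle was copied in.
Get-ChildItem "$env:LOCALAPPDATA\Programs\Ollama\lib\ollama\rocm\rocblas\library" -Name |
    Select-String "gfx"
```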

Full Changelog: v0.6.6...v0.6.8

v0.6.6

24 Apr 12:06
e82cdb5
  • Detailed Installation Guide: please refer to the wiki guide.
  • Demo Release Version: this release's OllamaSetup.exe is built with ROCm 6.2 (HIP SDK 6.2.4); make sure to replace the libs with the v0.6.2.4 ROCm libraries.

Tip

ROCm 5.7 version (limited availability)

ollama-windows-amd64-rocm5.7z for ROCm 5.7
Supports gfx803 gfx900:xnack- gfx902 gfx1103

ollama-windows-amd64.7z for ROCm 6.2.4
ROCm libs for 6.2 are available at ROCmlibs for 6.2.4

Supports gfx906:xnack- gfx1010:xnack- gfx1011 gfx1012:xnack- gfx1030 gfx1031 gfx1032 gfx1034 gfx1035 gfx1036 gfx1100 gfx1101 gfx1103 gfx1150 gfx1151 gfx1201

Note

Let's get Ollama set up!

  1. Install or Extract: You can either run the OllamaSetup.exe file, or download the
    ollama-windows-amd64.7z file and unzip it.

  2. Update ROCm Libraries:

    • Find the rocblas.dll file and the rocblas/library folder within your Ollama installation folder (usually
      located at C:\Users\usrname\AppData\Local\Programs\Ollama\lib\ollama\rocm).
    • Delete the existing rocblas/library folder.
    • Replace it with the correct ROCm libraries. You’ll need the right version for your GPU. You can find
      them here: ROCmLibs for 6.2.4 or ROCmLibs for 5.7
  3. Start Ollama: Once the ROCm libraries are updated, you can start using Ollama. Run ollama run <model name> or ./ollama serve, depending on how you installed it.

Or use the one-click installer Ollama-For-AMD-Installer by ByronLeeeee.

If the error log shows amdgpu is not supported (supported types: [gfx1030 gfx1100 gfx1101 gfx1102 gfx906]), it means you have missed a step. Please check the steps above, or replace the ROCm libs in your HIP SDK if needed.

Full Changelog: v0.6.3...v0.6.6

v0.6.3

23 Mar 06:32
17bb5ea
  • Detailed Installation Guide: please refer to the wiki guide.
  • Demo Release Version: this release's OllamaSetup.exe is built with ROCm 6.2 (HIP SDK 6.2.4); make sure to replace the libs with the v0.6.2.4 ROCm libraries.

Tip

ROCm 5.7 version: not available for this release; no need to update.

ollama-windows-amd64.7z for ROCm 6.2.4
ROCm libs for 6.2 are available at ROCmlibs for 6.2.4

Supports gfx906:xnack- gfx1010:xnack- gfx1011 gfx1012:xnack- gfx1030 gfx1031 gfx1032 gfx1034 gfx1035 gfx1036 gfx1100 gfx1101 gfx1103 gfx1150 gfx1201 (experimental)

Note

Let's get Ollama set up!

  1. Install or Extract: You can either run the OllamaSetup.exe file, or download the
    ollama-windows-amd64.7z file and unzip it.

  2. Update ROCm Libraries:

    • Find the rocblas.dll file and the rocblas/library folder within your Ollama installation folder (usually
      located at C:\Users\usrname\AppData\Local\Programs\Ollama\lib\ollama\rocm).
    • Delete the existing rocblas/library folder.
    • Replace it with the correct ROCm libraries. You’ll need the right version for your GPU. You can find
      them here: ROCmLibs for 6.2.4 or ROCmLibs for 5.7
  3. Start Ollama: Once the ROCm libraries are updated, you can start using Ollama. Run ollama run <model name> or ./ollama serve, depending on how you installed it.

Or use the one-click installer Ollama-For-AMD-Installer by ByronLeeeee.

If the error log shows amdgpu is not supported (supported types: [gfx1030 gfx1100 gfx1101 gfx1102 gfx906]), it means you have missed a step. Please check the steps above, or replace the ROCm libs in your HIP SDK if needed.

Full Changelog: v0.6.1...v0.6.3

v0.6.1

17 Mar 07:08
4575767
  • Detailed Installation Guide: please refer to the wiki guide.
  • Demo Release Version: this release's OllamaSetup.exe is built with ROCm 6.2 (HIP SDK 6.2.4); make sure to replace the libs with the v0.6.2.4 ROCm libraries.

Tip

ROCm 5.7 version: available for a limited set of architectures.

ollama-windows-amd64-rocm5.7z for ROCm 5.7
Supports gfx803 gfx900:xnack- gfx902 gfx1103

ROCm libs for 5.7 are available at ROCmlibs for 5.7

ollama-windows-amd64.7z for ROCm 6.2.4
ROCm libs for 6.2 are available at ROCmlibs for 6.2.4

Supports gfx906:xnack- gfx1010:xnack- gfx1011 gfx1012:xnack- gfx1030 gfx1031 gfx1032 gfx1034 gfx1035 gfx1036 gfx1100 gfx1101 gfx1103 gfx1150

Note

Let's get Ollama set up!

  1. Install or Extract: You can either run the OllamaSetup.exe file, or download the
    ollama-windows-amd64.7z file and unzip it.

  2. Update ROCm Libraries:

    • Find the rocblas.dll file and the rocblas/library folder within your Ollama installation folder (usually
      located at C:\Users\usrname\AppData\Local\Programs\Ollama\lib\ollama\rocm).
    • Delete the existing rocblas/library folder.
    • Replace it with the correct ROCm libraries. You’ll need the right version for your GPU. You can find
      them here: ROCmLibs for 6.2.4 or ROCmLibs for 5.7
  3. Start Ollama: Once the ROCm libraries are updated, you can start using Ollama. Run ollama run <model name> or ./ollama serve, depending on how you installed it.

Or use the one-click installer Ollama-For-AMD-Installer by ByronLeeeee.

If the error log shows amdgpu is not supported (supported types: [gfx1030 gfx1100 gfx1101 gfx1102 gfx906]), it means you have missed a step. Please check the steps above, or replace the ROCm libs in your HIP SDK if needed.

For a complete list of changes and bug fixes, please check the Ollama changelog:
ollama/releases
Full Changelog: v0.6.0...v0.6.1

v0.6.0

12 Mar 06:35
88ab587
  • Detailed Installation Guide: please refer to the wiki guide.
  • Demo Release Version: this release's OllamaSetup.exe is built with ROCm 6.2 (HIP SDK 6.2.4); make sure to replace the libs with the v0.6.2.4 ROCm libraries.

Tip

ROCm 5.7 version: available for a limited set of architectures.

ollama-windows-amd64-rocm5.7z for ROCm 5.7
Supports gfx803 gfx900:xnack- gfx902 gfx1103

ROCm libs for 5.7 are available at ROCmlibs for 5.7

ollama-windows-amd64.7z for ROCm 6.2.4
ROCm libs for 6.2 are available at ROCmlibs for 6.2.4

Supports gfx906:xnack- gfx1010:xnack- gfx1011 gfx1012:xnack- gfx1030 gfx1031 gfx1032 gfx1034 gfx1035 gfx1036 gfx1100 gfx1101 gfx1103 gfx1150

Note

  1. Install OllamaSetup.exe, or download and unzip ollama-windows-amd64.7z.
  2. Replace the ROCm libs in C:\Users\usrname\AppData\Local\Programs\Ollama\lib\ollama\rocm (or in your extracted ollama-windows-amd64 folder): swap in the rocblas.dll and delete the rocblas/library folder, then add the library that matches your GPU architecture (don't overwrite the library folder; simply delete it and put the new one in its place), using the correct ROCmlibs for 6.2.4, or ROCmlibs for 5.7 for ollama-windows-amd64-rocm5.7z.
  3. Start Ollama: with OllamaSetup.exe, run ollama run <model name>; with the ollama-windows-amd64.7z package, run ./ollama serve.

If the error log shows amdgpu is not supported (supported types: [gfx1030 gfx1100 gfx1101 gfx1102 gfx906]), it means you have missed a step. Please check the steps above, or replace the ROCm libs in your HIP SDK if needed.

For a complete list of changes and bug fixes, please check the Ollama changelog:
ollama/releases
Full Changelog: v0.5.13...v0.6.0

v0.5.13

28 Feb 11:53
8cc0064
  • Detailed Installation Guide: please refer to the wiki guide.
  • Demo Release Version: this release's OllamaSetup.exe is built with ROCm 6.1.2 (HIP SDK 6.1.2); make sure to replace the libs with the v0.6.1.2 ROCm libraries.

Tip

ROCm 5.7 version: available for a limited set of architectures.

ollama-windows-amd64-rocm5.7z (built for gfx803 gfx900:xnack- gfx902 gfx1103, test only)

ROCm libs for 5.7 are available at ROCmlibs for 5.7

ollama-windows-amd64.7z for ROCm 6
ROCm libs for 6.1.2 are available at ROCmlibs for 6.1.2

windows-amd64-rocm6.2.7z for ROCm 6.2
ROCm libs for 6.2 are available at ROCmlibs for 6.2.4

Supports gfx906:xnack- gfx1010:xnack- gfx1011 gfx1012:xnack- gfx1030 gfx1031 gfx1032 gfx1034 gfx1035 gfx1036 gfx1100 gfx1101 gfx1103 gfx1150

Note

  1. Install OllamaSetup.exe, or download and unzip ollama-windows-amd64.7z.
  2. Replace the ROCm libs in C:\Users\usrname\AppData\Local\Programs\Ollama\lib\ollama\rocm (or in your extracted ollama-windows-amd64 folder): swap in the rocblas.dll and delete the rocblas/library folder, then add the library that matches your GPU architecture (don't overwrite the library folder; simply delete it and put the new one in its place), using the correct ROCmlibs for 6.1.2, or ROCmlibs for 5.7 for ollama-windows-amd64-rocm5.7z.
  3. Start Ollama: with OllamaSetup.exe, run ollama run <model name>; with the ollama-windows-amd64.7z package, run ./ollama serve.

If the error log shows amdgpu is not supported (supported types: [gfx1030 gfx1100 gfx1101 gfx1102 gfx906]), it means you have missed a step. Please check the steps above, or replace the ROCm libs in your HIP SDK if needed.

For a complete list of changes and bug fixes, please check the Ollama changelog:
ollama/releases

Full Changelog: v0.5.9...v0.5.13

v0.5.9

12 Feb 06:48
0e97670
  • Detailed Installation Guide: please refer to the wiki guide.
  • Demo Release Version: this release's OllamaSetup.exe is built with ROCm 6.1.2 (HIP SDK 6.1.2); make sure to replace the libs with the v0.6.1.2 ROCm libraries.

Tip

ROCm 5.7 version: upstream llama.cpp broke the ROCm 5.7 build; the build process needs HIP SDK 5.7 updated from clang 17 to clang 19. More details in the wiki.

ollama-windows-amd64-rocm5.7z (built for gfx803 gfx900:xnack- gfx1103, test only)

ROCm libs for 5.7 are available at ROCmlibs for 5.7

ROCm libs for 6.1.2 are available at ROCmlibs for 6.1.2

Supports gfx1010:xnack- gfx1011 gfx1012:xnack- gfx1030 gfx1031 gfx1032 gfx1034 gfx1035 gfx1100 gfx1101 gfx1103 gfx1150

Note

  1. Install OllamaSetup.exe first in this release.
  2. Unzip ollama-windows-amd64.7z and use it to replace the libs in C:\Users\usrname\AppData\Local\Programs\Ollama\lib\ollama.
  3. Replace the files in your Ollama program ROCm folder with the rocblas.dll and rocblas/library folder that match your GPU architecture, using the correct ROCmlibs for 6.1.2, or ROCmlibs for 5.7 for ollama-windows-amd64-rocm5.7z (i.e. the files in C:\Users\usrname\AppData\Local\Programs\Ollama\lib\ollama).

Alternatively, you may skip step 1: remove any Ollama clients from your machine, use the 7z package with steps 2 and 3, and start it by running ./ollama serve.

If the error log shows amdgpu is not supported (supported types: [gfx1030 gfx1100 gfx1101 gfx1102 gfx906]), it means you have missed a step. Please check the steps above, or replace the ROCm libs in your HIP SDK if needed.

Full Changelog: v0.5.8...v0.5.9
For a complete list of changes and bug fixes, please check the Ollama changelog:
ollama/releases

v0.5.8

10 Feb 07:49
Pre-release

Full Changelog: v0.5.4...v0.5.8

v0.5.4

18 Dec 10:57
  • Detailed Installation Guide: please refer to the wiki guide.
  • Add experimental support for gfx900:xnack-.
  • Demo Release Version: this release's OllamaSetup.exe is built with ROCm 6.1.2 (HIP SDK 6.1.2); make sure to replace the libs with the v0.6.1.2 ROCm libraries.

Tip

ROCm 5.7 version: no need to update for this release.

ROCm libs for 6.1.2 are available at ROCmlibs for 6.1.2

Supports gfx803 (may not be detected) gfx900:xnack- gfx902 gfx90c:xnack- gfx906:xnack- gfx1010:xnack- gfx1011 gfx1012:xnack- gfx1030 gfx1031 gfx1032 gfx1034 gfx1035 gfx1036 gfx1100 gfx1101 gfx1103 gfx1150

ollama-windows-amd64-rocm5.7-v2.7z (rebuilt for gfx803 gfx900:xnack- gfx902 gfx906:xnack- gfx90c:xnack- gfx1010:xnack-)

Note

OllamaSetup.exe is the same as ollama-windows-amd64.7z; you don't need them both. Refer to the Standalone CLI guide.

For a complete list of changes and bug fixes, please check the Ollama changelog:
ollama/releases

v0.5.1

07 Dec 06:07
a0caaa2
  • Detailed Installation Guide: please refer to the wiki guide.
  • Add experimental support for gfx1150 (890M, 880M): please refer to the wiki guide.
  • Demo Release Version: this release's OllamaSetup.exe is built with ROCm 6.1.2 (HIP SDK 6.1.2); make sure to replace the libs with the v0.6.1.2 ROCm libraries.

Tip

ROCm 5.7 version: ollama-windows-amd64-rocm5.7.7z.

ROCm libs for 6.1.2 are available at ROCmlibs for 6.1.2
ROCm libs for 5.7 are available at ROCmlibs for 5.7

Supports gfx803 gfx900 gfx902 gfx90c:xnack- gfx906:xnack- gfx90a:xnack- gfx1010:xnack- gfx1012:xnack- gfx1030 gfx1031 gfx1032 gfx1034 gfx1035 gfx1036 gfx1100 gfx1101 gfx1102 gfx1103 gfx1150

OllamaSetup.exe is the same as ollama-windows-amd64.7z; you don't need them both. Refer to the Standalone CLI guide.

For a complete list of changes and bug fixes, please check the Ollama changelog:
ollama/releases