Notes from a rather tough, and ultimately futile, attempt at getting GPT4All to run locally on an old Ubuntu 20.04 box with Python 3.8.
* Pip install: First up, the gpt4all package installed via pip (version 2.8.2) has changes incompatible with current/recent model files and GGUF formats (Llama 3.2, etc.), causing type, value, keyword, and attribute errors at different stages of installation and execution.
Building from source instead led to issues with Ubuntu 20.04's libraries being outdated or missing, and with the hardware itself being dated:
- GLIBC_2.32 not found
- GLIBCXX_3.4.29 not found
- CMake 3.23 or higher is required. You are running version 3.16.3
- Vulkan not found, or version incompatible with the GPU, etc.
- CUDA Toolkit not found.
- CPU does not support AVX
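A small script can surface these prerequisites up front instead of mid-build. The thresholds (glibc 2.32, CMake 3.23, AVX) come straight from the error messages above; the function name and report layout are just an illustrative sketch, not part of any gpt4all API:

```python
import platform
import re
import shutil
import subprocess

def check_build_prereqs():
    """Collect the host facts that blocked the build (illustrative sketch)."""
    report = {}

    # Ubuntu 20.04 ships glibc 2.31, but the backend wanted GLIBC_2.32+.
    report["glibc"] = platform.libc_ver()[1]

    # The backend's CMakeLists required CMake >= 3.23; 20.04 packages 3.16.3.
    cmake = shutil.which("cmake")
    if cmake:
        out = subprocess.run([cmake, "--version"],
                             capture_output=True, text=True).stdout
        m = re.search(r"(\d+\.\d+(\.\d+)?)", out)
        report["cmake"] = m.group(1) if m else None
    else:
        report["cmake"] = None

    # The default llama.cpp kernels assume AVX; check the CPU flag list.
    try:
        with open("/proc/cpuinfo") as f:
            report["avx"] = bool(re.search(r"\bavx\b", f.read()))
    except OSError:  # non-Linux or unreadable /proc
        report["avx"] = None

    return report

print(check_build_prereqs())
```

Running this before attempting an install would have flagged all three blockers in one pass.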
Anyway, after a lot of false steps, the build did succeed with the following flags set:
cmake -B build -DCMAKE_BUILD_TYPE=Release -DLLMODEL_CUDA=OFF -DKOMPUTE_OPT_DISABLE_VULKAN_VERSION_CHECK=ON
Build files have been written to: .../gpt4all/gpt4all-backend/build
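For context, the configure step above sits inside a longer sequence. The sketch below assumes the upstream nomic-ai/gpt4all repository layout (submodules, gpt4all-backend directory) and is not guaranteed against the current HEAD:

```shell
# Fetch the repo with its llama.cpp submodule (layout assumed, may have changed)
git clone --recurse-submodules https://github.com/nomic-ai/gpt4all
cd gpt4all/gpt4all-backend

# Configure: CUDA off (no usable toolkit), skip the Vulkan version check (old GPU)
cmake -B build \
      -DCMAKE_BUILD_TYPE=Release \
      -DLLMODEL_CUDA=OFF \
      -DKOMPUTE_OPT_DISABLE_VULKAN_VERSION_CHECK=ON

# Compile the backend library
cmake --build build -j"$(nproc)"
```

Note that this still requires a CMake newer than the 3.16.3 that Ubuntu 20.04 packages, so CMake itself has to be upgraded first.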
Even after all that, issues kept popping up when trying to run LLMs from libraries like LangChain, pygpt4all, and so on, clearly indicating that it was time to bid adieu to Ubuntu 20.04 and upgrade to a more recent, better-supported release.
References
- https://python.langchain.com/docs/how_to/local_llms/
- https://askubuntu.com/questions/1393285/how-to-install-glibcxx-3-4-29-on-ubuntu-20-04
- https://stackoverflow.com/questions/71940179/error-lib-x86-64-linux-gnu-libc-so-6-version-glibc-2-34-not-found