ALT Linux Bugzilla – Attachment 19290 Details for Bug 55500
The python3-module-llama-cpp-python package fails to load a GGUF model: no backend is initialized
Error logs
logs.txt (text/plain), 1.02 KB, created by Валуев Никита Сергеевич on 2025-08-05 16:44:59 MSK
llama_model_load_from_file_impl: no backends are loaded. hint: use ggml_backend_load() or ggml_backend_load_all() to load a backend before calling this function
Traceback (most recent call last):
  File "/root/test_llama.py", line 3, in <module>
    llm = Llama(
          ^^^^^^
  File "/usr/lib64/python3/site-packages/llama_cpp/llama.py", line 374, in __init__
    internals.LlamaModel(
  File "/usr/lib64/python3/site-packages/llama_cpp/_internals.py", line 58, in __init__
    raise ValueError(f"Failed to load model from file: {path_model}")
ValueError: Failed to load model from file: /root/Wan2.2-I2V-A14B-LowNoise-Q2_K.gguf
Exception ignored in: <function LlamaModel.__del__ at 0x7fabca9200e0>
Traceback (most recent call last):
  File "/usr/lib64/python3/site-packages/llama_cpp/_internals.py", line 86, in __del__
    self.close()
  File "/usr/lib64/python3/site-packages/llama_cpp/_internals.py", line 78, in close
    if self.sampler is not None:
       ^^^^^^^^^^^^
AttributeError: 'LlamaModel' object has no attribute 'sampler'