ALT Linux Bugzilla
– Attachment 19289 Details for
Bug 55500
The python3-module-llama-cpp-python package does not load a GGUF model: backend is not initialized
Execution error logs and script
test_llama.py (text/x-python), 443 bytes, created by
Валуев Никита Сергеевич
on 2025-08-05 16:41:16 MSK
from llama_cpp import Llama

# Load the GGUF model from the local path used in the bug report
llm = Llama(
    model_path="/root/Llama-3.2-3B-Instruct-f16.gguf",
    chat_format="chatml"
)

# Run a simple chat completion to reproduce the failure
output = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the capital of France?"}
    ],
    max_tokens=64,
    temperature=0.7
)

print("Result:")
print(output["choices"][0]["message"]["content"])