| File            | Last commit message                                      | Last commit date          |
|-----------------|----------------------------------------------------------|---------------------------|
| bench-all-gg.txt | whisper : use flash attention (#2152)                   | 2024-05-15 09:38:19 +03:00 |
| bench-all.sh    | scripts : bench v3-turbo                                 | 2024-10-05 16:22:53 +03:00 |
| bench-wts.sh    | files : rename ./extra to ./scripts                      | 2024-04-09 20:13:41 +03:00 |
| bench.py        | whisper : add large-v3-turbo (#2440)                     | 2024-10-01 15:57:06 +03:00 |
| build-info.sh   | whisper : disable CUDA mel + fix FFMPEG                  | 2024-06-26 20:11:38 +03:00 |
| convert-all.sh  | whisper : add large-v3-turbo (#2440)                     | 2024-10-01 15:57:06 +03:00 |
| deploy-wasm.sh  | files : rename ./extra to ./scripts                      | 2024-04-09 20:13:41 +03:00 |
| gen-authors.sh  | license : update copyright notice + add AUTHORS          | 2024-04-09 20:27:44 +03:00 |
| get-flags.mk    | whisper : reorganize source code + improve CMake (#2256) | 2024-06-26 19:34:09 +03:00 |
| quantize-all.sh | files : rename ./extra to ./scripts                      | 2024-04-09 20:13:41 +03:00 |
| sha-all.sh      | files : rename ./extra to ./scripts                      | 2024-04-09 20:13:41 +03:00 |
| sync-ggml-am.sh | scripts : sync ggml-backend.cpp                          | 2024-10-05 15:23:51 +03:00 |
| sync-ggml.last  | whisper : adapt to latest ggml (skip) (#0)               | 2024-10-05 15:23:51 +03:00 |
| sync-ggml.sh    | scripts : sync ggml-backend.cpp                          | 2024-10-05 15:23:51 +03:00 |
| sync-llama.sh   | talk-llama : sync llama.cpp                              | 2024-08-08 22:48:46 +03:00 |