diff --git a/README.md b/README.md
index 83f30cbc..0298818c 100644
--- a/README.md
+++ b/README.md
@@ -386,7 +386,7 @@ Run the inference examples as usual, for example:
 ## Moore Threads GPU support
 
 With Moore Threads cards the processing of the models is done efficiently on the GPU via muBLAS and custom MUSA kernels.
-First, make sure you have installed `MUSA SDK rc4.0.1`: https://developer.mthreads.com/sdk/download/musa?equipment=&os=&driverVersion=&version=rc4.0.1
+First, make sure you have installed `MUSA SDK rc4.0.1`: https://developer.mthreads.com/sdk/download/musa?equipment=&os=&driverVersion=&version=4.0.1
 
 Now build `whisper.cpp` with MUSA support:
 
diff --git a/tests/librispeech/README.md b/tests/librispeech/README.md
index 85478a0f..670f39b4 100644
--- a/tests/librispeech/README.md
+++ b/tests/librispeech/README.md
@@ -48,7 +48,7 @@ performance of whisper.cpp on LibriSpeech corpus.
 
 ## How-to guides
 
-### How to change the inferece parameters
+### How to change the inference parameters
 
 Create `eval.conf` and override variables.