From 587ea01f55d3947e31b2f4eef3d7b42f2868c1ac Mon Sep 17 00:00:00 2001
From: Tomer Schlesinger
Date: Tue, 13 May 2025 06:03:50 +0200
Subject: [PATCH] docs : update README.md for whisper.objc app (#2569)

---
 examples/whisper.objc/README.md | 19 +++++++++++++------
 1 file changed, 13 insertions(+), 6 deletions(-)

diff --git a/examples/whisper.objc/README.md b/examples/whisper.objc/README.md
index 7e790dbc..0cea0077 100644
--- a/examples/whisper.objc/README.md
+++ b/examples/whisper.objc/README.md
@@ -26,10 +26,17 @@ If you don't want to convert a Core ML model, you can skip this step by creating
 mkdir models/ggml-base.en-encoder.mlmodelc
 ```
 
-## Core ML
+### Core ML support
+1. Follow all the steps in the `Usage` section, including adding the ggml model file.
+The ggml model file is required as the Core ML model is only used for the encoder. The
+decoder which is in the ggml model is still required.
+2. Follow the [`Core ML support` section of readme](../../README.md#core-ml-support) to convert the
+model.
+3. Add the Core ML model (`models/ggml-base.en-encoder.mlmodelc/`) to `whisper.swiftui.demo/Resources/models` **via Xcode**.
 
-Follow the [`Core ML support` section of readme](../../README.md#core-ml-support) to convert the model.
-That is all the needs to be done to use the Core ML model in the app. The converted model is a
-resource in the project and will be used if it is available. Note that the Core ML model is only
-used for the encoder, the decoder which is in the ggml model is still required so both need to
-be available.
+When the example starts running you should now see that it is using the Core ML model:
+```console
+whisper_init_state: loading Core ML model from '/Library/Developer/CoreSimulator/Devices/25E8C27D-0253-4281-AF17-C3F2A4D1D8F4/data/Containers/Bundle/Application/3ADA7D59-7B9C-43B4-A7E1-A87183FC546A/whisper.swiftui.app/models/ggml-base.en-encoder.mlmodelc'
+whisper_init_state: first run on a device may take a while ...
+whisper_init_state: Core ML model loaded
+```
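
For reference, the `Core ML support` section of the top-level readme that step 2 of the new instructions points at amounts to roughly the following commands, run from the whisper.cpp repository root. This is a sketch: the script and package names are taken from that readme and may change between releases.

```console
# Python dependencies for the Core ML conversion (per the top-level readme)
$ pip install ane_transformers openai-whisper coremltools

# Download the ggml model -- still required, as it contains the decoder
$ ./models/download-ggml-model.sh base.en

# Generate the Core ML encoder; produces models/ggml-base.en-encoder.mlmodelc
$ ./models/generate-coreml-model.sh base.en
```

Both the ggml model file and the generated `.mlmodelc` directory then need to be added to the Xcode project as described in the patched instructions above.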