
Allow building of whispercpp with COREML support #31

Merged
merged 1 commit into absadiki:main
Jan 11, 2024

Conversation

tangm
Contributor

@tangm tangm commented Jan 10, 2024

Addresses #12 and maybe #20

Rather than copying much of whisper.cpp's CMakeLists.txt, this just links main.cpp against the built libraries.

To enable CoreML support on Macs per https://github.com/ggerganov/whisper.cpp?tab=readme-ov-file#core-ml-support, we will still need to first download and convert the appropriate model using the original whisper.cpp repository, producing a <model>.mlmodelc directory.

To build and install,

  • export CMAKE_ARGS="-DWHISPER_COREML=1"
  • python -m build --wheel in this repository to build the wheel (this assumes you have installed build via pip install build)
  • pip install dist/<generated>.whl

To invoke,

  • Use something like model = Model('<model_path>/ggml-base.en.bin', n_threads=6), assuming the converted mlmodelc directory is also in <model_path>
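As a hedged illustration (not part of this PR), whisper.cpp derives the Core ML encoder directory from the ggml model name, as the log output below shows (ggml-base.en.bin → ggml-base.en-encoder.mlmodelc). The helper below is hypothetical and only mirrors that naming convention, which can be handy for checking that the converted model is where it needs to be:

```python
from pathlib import Path

def coreml_encoder_path(ggml_model: str) -> Path:
    """Hypothetical helper: derive the Core ML encoder directory that
    whisper.cpp looks for next to a ggml model, e.g.
    ggml-base.en.bin -> ggml-base.en-encoder.mlmodelc."""
    p = Path(ggml_model)
    return p.with_name(p.name.removesuffix(".bin") + "-encoder.mlmodelc")

# Example: warn before loading if the converted model is missing.
# if not coreml_encoder_path("<model_path>/ggml-base.en.bin").exists():
#     print("Core ML model not found; convert it with whisper.cpp first.")
```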

If successful, you will see something like:

whisper_init_state: loading Core ML model from '<model_path>/ggml-base.en-encoder.mlmodelc'
whisper_init_state: first run on a device may take a while ...
whisper_init_state: Core ML model loaded

You can also verify that whisper.cpp was built with Core ML support by calling print(Model.system_info()); the output should include COREML = 1.
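That check can also be scripted. The sketch below is a hypothetical helper (not part of this PR) that assumes system_info() returns the usual whisper.cpp flag string of pipe-separated "NAME = 0/1" entries:

```python
def coreml_enabled(system_info: str) -> bool:
    """Hypothetical check: return True if the whisper.cpp system_info
    string (e.g. "AVX = 1 | NEON = 0 | COREML = 1") reports COREML = 1."""
    for entry in system_info.split("|"):
        name, _, value = entry.partition("=")
        if name.strip() == "COREML":
            return value.strip() == "1"
    return False

# Usage sketch: coreml_enabled(Model.system_info())
```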

@absadiki
Owner

Thanks @tangm very much for the PR.
Yes, I agree, your idea is perfect.

I don't have access to a Mac to test, but the macOS workflow finished successfully.

Could you please try adding "-DWHISPER_COREML=1" to the cmake args in the setup.py file and test whether you can build the project with pip install -e .? That way we can generate pre-built wheels for PyPI.
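As a hedged sketch of what that could look like (the helper name and behavior here are assumptions, not the actual setup.py), setup.py could start from the user-supplied CMAKE_ARGS environment variable and optionally append the Core ML flag:

```python
import os

def build_cmake_args(environ=None, force_coreml=False):
    """Hypothetical helper: collect CMake arguments for the build.
    Starts from the user's CMAKE_ARGS environment variable and
    optionally forces -DWHISPER_COREML=1 on."""
    environ = os.environ if environ is None else environ
    args = environ.get("CMAKE_ARGS", "").split()
    if force_coreml and "-DWHISPER_COREML=1" not in args:
        args.append("-DWHISPER_COREML=1")
    return args
```

Whether to pass force_coreml=True by default is exactly the trade-off discussed below: it would bake Core ML support into every macOS wheel.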

@tangm
Contributor Author

tangm commented Jan 10, 2024

> Thanks @tangm very much for the PR. Yes, I agree, your idea is perfect.
>
> I don't have access to a Mac to test, but the macOS workflow finished successfully.
>
> Could you please try adding "-DWHISPER_COREML=1" to the cmake args in the setup.py file and test whether you can build the project with pip install -e .? That way we can generate pre-built wheels for PyPI.

Hmm, I can add it to the build args in setup.py, but I'm not sure you want to turn on CoreML support by default, since it requires some intervention by the user to convert models. Someone on a Mac who has been using this library and then upgrades would now see a crash, because whisper.cpp built with CoreML support expects converted models to be present. Maybe just put the instructions in the README, like whisper.cpp did?

There may be some way of releasing variants of a library that would be more suitable, but I'm not very familiar with the PyPI ecosystem...

absadiki added a commit that referenced this pull request Jan 11, 2024
@absadiki absadiki merged commit 074f067 into absadiki:main Jan 11, 2024
11 checks passed
@absadiki
Owner

Yeah... and I also forgot that the GitHub VMs don't have Core ML to build the binary against (or at least AFAIK), so instructions in the README file should be enough.

Thank you very much again, @tangm, for the awesome contribution.

@tangm
Contributor Author

tangm commented Jan 11, 2024

Glad to help! 👍
