colab

Non-large models

We provide a colab notebook, Sherpa-onnx offline recognition with whisper python api colab notebook, so that you can try non-large Whisper models with sherpa-onnx step by step.

(Screenshot: using Whisper with sherpa-onnx in colab)
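If you prefer to run the same steps outside colab, the following sketch shows how the sherpa-onnx Python API can be used for offline recognition with a Whisper model. The model file names and the test wave path are assumptions based on the exported tiny.en model; adjust them to match the files you downloaded.

    # A minimal sketch of offline recognition with a non-large Whisper model.
    # The file paths below are assumptions (exported tiny.en model); change them
    # to match your own downloaded/exported files.
    import wave

    import numpy as np
    import sherpa_onnx


    def read_wave(filename: str):
        """Read a 16-bit mono wave file; return (samples_float32, sample_rate)."""
        with wave.open(filename) as f:
            assert f.getnchannels() == 1, "expect a single-channel wave file"
            assert f.getsampwidth() == 2, "expect 16-bit samples"
            sample_rate = f.getframerate()
            samples = np.frombuffer(f.readframes(f.getnframes()), dtype=np.int16)
            return samples.astype(np.float32) / 32768, sample_rate


    recognizer = sherpa_onnx.OfflineRecognizer.from_whisper(
        encoder="./sherpa-onnx-whisper-tiny.en/tiny.en-encoder.onnx",
        decoder="./sherpa-onnx-whisper-tiny.en/tiny.en-decoder.onnx",
        tokens="./sherpa-onnx-whisper-tiny.en/tiny.en-tokens.txt",
        num_threads=2,
    )

    samples, sample_rate = read_wave("./sherpa-onnx-whisper-tiny.en/test_wavs/0.wav")

    # Decode a single wave file with the offline (non-streaming) recognizer.
    stream = recognizer.create_stream()
    stream.accept_waveform(sample_rate, samples)
    recognizer.decode_stream(stream)
    print(stream.result.text)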

Large models

For large Whisper models, please see the following colab notebook: sherpa-onnx with whisper large-v3 colab notebook. It walks you step by step through trying the exported large-v3 onnx model with sherpa-onnx on CPU as well as on GPU.

You will find that the RTF on a GPU (Tesla T4) is less than 1.
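RTF (real-time factor) is simply the decoding time divided by the audio duration, so a value below 1 means decoding is faster than real time. The sketch below times the decoding of one wave file; the large-v3 file names and the provider argument are assumptions and depend on the sherpa-onnx version and the files you exported.

    # A sketch of measuring RTF = decoding time / audio duration for large-v3.
    # File names and provider="cuda" are assumptions; use "cpu" if your
    # sherpa-onnx build or version has no CUDA support.
    import time
    import wave

    import numpy as np
    import sherpa_onnx


    def read_wave(filename: str):
        """Read a 16-bit mono wave file; return (samples_float32, sample_rate)."""
        with wave.open(filename) as f:
            sample_rate = f.getframerate()
            samples = np.frombuffer(f.readframes(f.getnframes()), dtype=np.int16)
            return samples.astype(np.float32) / 32768, sample_rate


    recognizer = sherpa_onnx.OfflineRecognizer.from_whisper(
        encoder="./large-v3-encoder.onnx",
        decoder="./large-v3-decoder.onnx",
        tokens="./large-v3-tokens.txt",
        num_threads=2,
        provider="cuda",  # assumption: requires a CUDA-enabled build
    )

    samples, sample_rate = read_wave("./test_wavs/0.wav")
    audio_duration = len(samples) / sample_rate

    # Time only the decoding work, not model loading or file reading.
    start = time.time()
    stream = recognizer.create_stream()
    stream.accept_waveform(sample_rate, samples)
    recognizer.decode_stream(stream)
    elapsed = time.time() - start

    print(stream.result.text)
    print(f"RTF: {elapsed / audio_duration:.3f}")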