Model export
In this section, we describe various ways to export models trained with icefall: saving only the parameters with model.state_dict(), exporting TorchScript models with torch.jit.trace() or torch.jit.script(), exporting to ONNX, and exporting to ncnn.
Export model.state_dict()
    When to use it
    How to export
    How to use the exported model
        Use the exported model to run decode.py
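At its core, this export method saves only the model parameters, not the model code. The following is a minimal, generic sketch of the idea; the tiny nn.Linear stand-in, the file name, and the {"model": ...} checkpoint layout are illustrative assumptions, not the exact layout of every recipe:

    import torch
    import torch.nn as nn

    # A tiny stand-in model; a real icefall recipe builds its model with the
    # recipe's own helper functions.
    model = nn.Linear(80, 500)

    # Save only the parameters, not the model code.  The {"model": ...}
    # layout is just for illustration here.
    torch.save({"model": model.state_dict()}, "pretrained.pt")

    # To use the exported file, re-create the model with the same code and
    # load the saved parameters into it.
    model2 = nn.Linear(80, 500)
    checkpoint = torch.load("pretrained.pt", map_location="cpu")
    model2.load_state_dict(checkpoint["model"])
    model2.eval()

Because only the parameters are stored, the exported file can be used only together with code that re-creates the same model, which is essentially what running decode.py with the exported checkpoint relies on.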
Export model with torch.jit.trace()
    When to use it
    How to export
    How to use the exported models
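For orientation, here is a minimal sketch of what a trace-based export boils down to, using a stand-in module; the real encoder, example inputs, and file names come from the individual recipes' export scripts:

    import torch
    import torch.nn as nn

    # A stand-in module; the real model and its example inputs come from the
    # corresponding icefall recipe.
    model = nn.Linear(80, 512).eval()

    # trace() records the operations run on the example input, so it suits
    # models whose forward() has no data-dependent Python control flow.
    example_input = torch.rand(1, 100, 80)  # hypothetical (batch, time, feature)
    traced = torch.jit.trace(model, example_input)
    traced.save("jit_trace.pt")

    # The traced model can be loaded and run without the original Python class.
    loaded = torch.jit.load("jit_trace.pt")
    y = loaded(torch.rand(2, 50, 80))
    print(y.shape)  # torch.Size([2, 50, 512])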
Export model with torch.jit.script()
    When to use it
    How to export
    How to use the exported model
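In contrast to tracing, torch.jit.script() compiles the model's Python code, so data-dependent control flow is preserved. A minimal sketch with a toy module; all names, shapes, and file names here are hypothetical:

    import torch
    import torch.nn as nn

    class ToyModel(nn.Module):
        """A stand-in module with data-dependent control flow."""

        def __init__(self) -> None:
            super().__init__()
            self.proj = nn.Linear(512, 500)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Branches like this are preserved by torch.jit.script(),
            # which compiles the Python code instead of tracing one run.
            if x.dim() == 2:
                x = x.unsqueeze(0)
            return self.proj(x)

    model = ToyModel().eval()
    scripted = torch.jit.script(model)
    scripted.save("cpu_jit.pt")  # illustrative file name

    loaded = torch.jit.load("cpu_jit.pt")
    y = loaded(torch.rand(10, 512))
    print(y.shape)  # torch.Size([1, 10, 500])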
Export to ONNX
    sherpa-onnx
    Example
        Download the pre-trained model
        Export the model to ONNX
        Decode sound files with exported ONNX models
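The ONNX export in the recipes ultimately relies on torch.onnx.export(), and the resulting files are meant to be consumed by sherpa-onnx. The sketch below is only a generic illustration with a stand-in model and hypothetical file and tensor names; it assumes onnxruntime is installed for the quick check at the end:

    import onnxruntime as ort
    import torch
    import torch.nn as nn

    # A stand-in model; the real export scripts typically export the encoder,
    # decoder, and joiner of a transducer model as separate ONNX files.
    model = nn.Linear(80, 500).eval()

    x = torch.rand(1, 100, 80)  # hypothetical (batch, time, feature)
    torch.onnx.export(
        model,
        (x,),
        "model.onnx",
        input_names=["x"],
        output_names=["y"],
        dynamic_axes={"x": {0: "N", 1: "T"}, "y": {0: "N", 1: "T"}},
        opset_version=13,
    )

    # Sanity-check the exported file with onnxruntime.
    session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
    (y,) = session.run(None, {"x": torch.rand(2, 50, 80).numpy()})
    print(y.shape)  # (2, 50, 500)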
Export to ncnn
    Export streaming Zipformer transducer models to ncnn
        1. Download the pre-trained model
        2. Install ncnn and pnnx
        3. Export the model via torch.jit.trace()
        4. Export torchscript model via pnnx
        5. Test the exported models in icefall
        6. Modify the exported encoder for sherpa-ncnn
    Export ConvEmformer transducer models to ncnn
        1. Download the pre-trained model
        2. Install ncnn and pnnx
        3. Export the model via torch.jit.trace()
        4. Export torchscript model via pnnx
        5. Test the exported models in icefall
        6. Modify the exported encoder for sherpa-ncnn
        7. (Optional) int8 quantization with sherpa-ncnn
    Export LSTM transducer models to ncnn
        1. Download the pre-trained model
        2. Install ncnn and pnnx
        3. Export the model via torch.jit.trace()
        4. Export torchscript model via pnnx
        5. Test the exported models in icefall
        6. Modify the exported encoder for sherpa-ncnn
        7. (Optional) int8 quantization with sherpa-ncnn
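All three ncnn exports follow the same overall flow: trace the model with torch.jit.trace() (step 3), then convert the resulting torchscript file with the pnnx command-line tool (step 4). The sketch below illustrates only the tracing step, with a made-up streaming encoder and hypothetical shapes and file names; the exact models, commands, and pnnx options are given in the subsections above:

    import torch
    import torch.nn as nn

    class StreamingEncoder(nn.Module):
        """A stand-in streaming encoder with explicit states, used only to
        show the shape of the trace step; it is not one of the real icefall
        encoders (Zipformer, ConvEmformer, LSTM)."""

        def __init__(self) -> None:
            super().__init__()
            self.lstm = nn.LSTM(
                input_size=80, hidden_size=512, num_layers=2, batch_first=True
            )

        def forward(self, x: torch.Tensor, h: torch.Tensor, c: torch.Tensor):
            y, (h, c) = self.lstm(x, (h, c))
            return y, h, c

    encoder = StreamingEncoder().eval()
    x = torch.rand(1, 32, 80)    # (batch, time, feature), hypothetical shapes
    h = torch.zeros(2, 1, 512)   # (num_layers, batch, hidden)
    c = torch.zeros(2, 1, 512)

    # Step 3 of each recipe: export a torchscript model via torch.jit.trace().
    traced = torch.jit.trace(encoder, (x, h, c))
    traced.save("encoder_jit_trace-pnnx.pt")  # illustrative file name

    # Step 4 of each recipe then converts the saved .pt file with the pnnx
    # command-line tool into ncnn model files that sherpa-ncnn can load; the
    # exact commands are given in the subsections above.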