k2
Hint
Note the lowercase k: the name is k2, not K2.
k2 seamlessly integrates Finite State Automaton (FSA) and Finite State Transducer (FST) algorithms into autograd-based machine learning toolkits such as PyTorch [1]. It runs on both CPU and CUDA, and it can process a batch of FSTs at the same time.
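To make the terminology concrete, the sketch below is a minimal finite-state acceptor in plain Python. It is illustrative only and does not use k2's API: the class name `SimpleFsa` and its methods are invented for this example. It shows the core idea that k2 builds on, namely states connected by labeled arcs, with a string accepted if its symbols trace a path from the start state to a final state.

```python
# Illustrative sketch of an FSA, NOT k2's API. k2 represents FSAs on
# CPU/GPU tensors and supports autograd; this toy version only shows
# the underlying concept: states, labeled arcs, and acceptance.

class SimpleFsa:
    def __init__(self, start, finals, arcs):
        # arcs maps (state, symbol) -> next_state (deterministic FSA)
        self.start = start
        self.finals = set(finals)
        self.arcs = arcs

    def accepts(self, symbols):
        """Return True if the symbol sequence leads from the start
        state to a final state."""
        state = self.start
        for sym in symbols:
            key = (state, sym)
            if key not in self.arcs:
                return False  # no matching arc: reject
            state = self.arcs[key]
        return state in self.finals

# An FSA over {a, b, c} that accepts exactly "ab" and "ac":
#   state 0 --a--> state 1 --b--> state 2 (final)
#                           --c--> state 2 (final)
fsa = SimpleFsa(start=0, finals=[2],
                arcs={(0, "a"): 1, (1, "b"): 2, (1, "c"): 2})
```

An FST extends this picture by attaching an output label (and typically a weight) to each arc, so that accepting an input string also produces an output string and a score; k2 makes such weights differentiable so FSA/FST operations can sit inside a PyTorch training loop.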
We have used k2 to compute the CTC and LF-MMI losses and to perform decoding, including lattice rescoring, in the field of automatic speech recognition (ASR) [2].
We hope k2 will have many other applications as well.
You can find ASR recipes at https://github.com/k2-fsa/icefall