Port of the SSL (self-supervised learning) version of Meta's Omnilingual ASR W2V2 release to transformers. This is the 3B-parameter checkpoint. More details on the official repo.

Usage is almost the same as indicated here.
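A minimal usage sketch, assuming the ported checkpoint loads through the standard transformers wav2vec2 classes (`AutoFeatureExtractor`, `Wav2Vec2Model`) and uses the standard wav2vec2 feature-encoder strides; both are assumptions, so check the official repo for the exact API:

```python
import torch
from transformers import AutoFeatureExtractor, Wav2Vec2Model

# Assumption: the ported SSL checkpoint is loadable via the standard
# wav2vec2 classes; the exact class may differ, see the official repo.
MODEL_ID = "ylacombe/omniASR_W2V_3B_SSL"


def extract_features(waveform: torch.Tensor, sampling_rate: int = 16000) -> torch.Tensor:
    """Return frame-level SSL representations for a mono waveform."""
    feature_extractor = AutoFeatureExtractor.from_pretrained(MODEL_ID)
    model = Wav2Vec2Model.from_pretrained(MODEL_ID)
    inputs = feature_extractor(waveform, sampling_rate=sampling_rate, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state  # shape: (batch, frames, hidden_size)


def num_output_frames(num_samples: int) -> int:
    """Frames produced by the conv feature encoder, assuming the standard
    wav2vec2 kernel/stride configuration (an assumption for this checkpoint)."""
    for kernel, stride in [(10, 5), (3, 2), (3, 2), (3, 2), (3, 2), (2, 2), (2, 2)]:
        num_samples = (num_samples - kernel) // stride + 1
    return num_samples
```

Under the standard configuration this gives roughly 49 frames per second of 16 kHz audio, i.e. `num_output_frames(16000)` returns 49, which is the expected sequence length of `last_hidden_state` for a one-second input.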
