IWSLT

    Pretrained models

    The following pre-trained language models are considered part of the training data and are freely usable to build the SLT systems:

    • Wav2vec 2.0
    • Hubert
    • WavLM
    • SpeechLM
    • data2vec
    • MBART
    • MBART50
    • M2M100
    • Delta LM
    • T5
    • BLOOM (only the small 560M-parameter version)
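As a minimal sketch of how the allowed models might be tracked in a system build, the mapping below pairs each model family with a checkpoint identifier of the kind commonly used on the Hugging Face Hub. The specific checkpoint IDs are assumptions for illustration, not specified by the task page, and some models (e.g. SpeechLM, Delta LM) are distributed through other channels:

```python
# Hypothetical mapping from the allowed pretrained model families to
# checkpoint identifiers of the kind used on the Hugging Face Hub.
# The exact IDs below are assumptions, not specified by the task page.
ALLOWED_PRETRAINED = {
    "Wav2vec 2.0": "facebook/wav2vec2-large-960h",
    "Hubert": "facebook/hubert-large-ls960-ft",
    "WavLM": "microsoft/wavlm-large",
    "data2vec": "facebook/data2vec-audio-large-960h",
    "MBART": "facebook/mbart-large-cc25",
    "MBART50": "facebook/mbart-large-50",
    "M2M100": "facebook/m2m100_418M",
    "T5": "t5-base",
    "BLOOM": "bigscience/bloom-560m",  # only the small 560M version is allowed
    # SpeechLM and Delta LM are omitted: they are released outside the Hub.
}

def is_allowed(model_family: str) -> bool:
    """Check whether a pretrained model family is on the allowed list."""
    return model_family in ALLOWED_PRETRAINED
```

With the `transformers` library installed, a checkpoint from this list could then be loaded with, e.g., `AutoModel.from_pretrained(ALLOWED_PRETRAINED["WavLM"])`.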
      Partners:
    • ACL
    • ISCA
    • ELRA
    © 2025 IWSLT.