OpenSeq2Seq: extensible toolkit for distributed and mixed precision training of sequence-to-sequence models
Proceedings of Workshop for NLP Open Source Software (NLP-OSS), 2018 • aclanthology.org
Abstract

We present OpenSeq2Seq, an open-source toolkit for training sequence-to-sequence models. The main goal of our toolkit is to let researchers explore different sequence-to-sequence architectures as effectively as possible. Efficiency is achieved through full support for distributed and mixed-precision training. OpenSeq2Seq provides building blocks for training encoder-decoder models for neural machine translation and automatic speech recognition. We plan to extend it to other modalities in the future.
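Mixed-precision training, one of the efficiency mechanisms the abstract mentions, typically keeps an FP32 master copy of the weights and applies loss scaling so that small gradient values do not underflow in the reduced-precision format. Below is a minimal sketch of that loss-scaling update step; plain Python floats stand in for FP16 tensors, and all names (`sgd_mixed_precision_step`, `toy_grad_fn`) are illustrative, not part of the OpenSeq2Seq API.

```python
# Sketch of the loss-scaling pattern used in mixed-precision training.
# Illustrative only: plain Python floats stand in for FP16 tensors,
# and grad_fn is a stub for an autodiff backward pass.

LOSS_SCALE = 1024.0  # multiply the loss so tiny gradients survive FP16

def sgd_mixed_precision_step(master_weights, grad_fn, lr=0.1):
    """One SGD step with FP32 master weights and loss scaling."""
    # 1. "Cast" the master weights to the low-precision copy
    #    used for the forward/backward pass (a real toolkit casts to float16).
    fp16_weights = list(master_weights)
    # 2. The backward pass on the scaled loss yields scaled gradients.
    scaled_grads = grad_fn(fp16_weights, LOSS_SCALE)
    # 3. Unscale the gradients and update the FP32 master copy.
    return [w - lr * (g / LOSS_SCALE)
            for w, g in zip(master_weights, scaled_grads)]

# Toy quadratic loss L(w) = 0.5 * sum(w_i^2), so dL/dw_i = w_i;
# the backward pass of (scale * L) therefore returns scale * w_i.
def toy_grad_fn(weights, scale):
    return [scale * w for w in weights]

weights = [1.0, -2.0]
weights = sgd_mixed_precision_step(weights, toy_grad_fn, lr=0.1)
print(weights)  # each weight shrinks by 10%
```

Because the gradients are divided by the same constant that scaled the loss, the update applied to the master weights is numerically identical to an unscaled FP32 step; the scaling only protects the intermediate low-precision values.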