

Fast Neural Machine Translation in C++

Marian is an efficient Neural Machine Translation framework written in pure C++ with minimal dependencies. It has mainly been developed at the Adam Mickiewicz University in Poznań (AMU) and at the University of Edinburgh.

It is currently being deployed in multiple European projects and is the main translation and training engine behind the neural MT launch at the World Intellectual Property Organization.

Main features:

  • Fast multi-GPU training and translation
  • Efficient pure C++ implementation
  • Multiple model types: deep RNNs, Transformers, and language models
  • Compatible with Nematus
  • Permissive open source license (MIT)
  • and much more...

Quick Start

Minimal examples for compilation, training, translation
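As a rough sketch of what such a workflow looks like (file names like `corpus.src` and `vocab.src.yml` are placeholders, and the exact flags may differ between versions; the Quick Start page is authoritative):

```shell
# Clone and compile (requires CMake, a modern C++ compiler, and CUDA for GPU builds)
git clone https://github.com/marian-nmt/marian
mkdir -p marian/build && cd marian/build
cmake .. && make -j4

# Train a model on a parallel corpus (placeholder file names)
./marian \
  --type transformer \
  --train-sets corpus.src corpus.trg \
  --vocabs vocab.src.yml vocab.trg.yml \
  --model model.npz \
  --devices 0 1          # train on GPUs 0 and 1

# Translate new input with the trained model
./marian-decoder -m model.npz -v vocab.src.yml vocab.trg.yml < input.src > output.trg
```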

Features & Benchmarks

Feature list, translation and training benchmarks


Documentation

Extended description of command-line options for training and translation


Examples

Short examples of how to train and translate with WMT-grade NMT models


Support

Look for answers or post your own question


Publications

Citation information and list of publications


The development of Marian received funding from the European Union's Horizon 2020 Research and Innovation Programme under grant agreements 688139 (SUMMA; 2016-2019), 645487 (Modern MT; 2015-2017), 644333 (TraMOOC; 2015-2017), and 644402 (HimL; 2015-2017), from the Amazon Academic Research Awards program, and from the World Intellectual Property Organization. It is also based upon work supported in part by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA), via contract #FA8650-17-C-9117.