This post is in two parts; they are:

• Understanding the Encoder-Decoder Architecture
• Evaluating the Result of Summarization using ROUGE

DistilBart is a "distilled" version of the BART model, a powerful sequence-to-sequence model for natural language generation, translation, and comprehension.