Automatic text summarization captures the most important and relevant information from a large document. This paper presents two hybrid models for automatic text summarization. The adopted strategy is extractive summarization followed by abstractive summarization, which produces an informative and concise summary. The LexRank algorithm is used for extractive summarization, while BART (Bidirectional and Auto-Regressive Transformers) and T5 (Text-To-Text Transfer Transformer) are used for abstractive summarization. BART and T5 are advanced Transformer-based pre-trained models, which have had a major impact on deep learning. The first hybrid model combines LexRank with BART (LRB) and the second combines LexRank with T5 (LRT). In this approach, LexRank first generates an extractive summary, and its output is then fed as input to BART and T5. The efficiency of the two hybrid models is analyzed using qualitative and quantitative methods: a human-generated summary is used to evaluate the quality of the models, while the ROUGE score provides a quantitative assessment of their performance. The results lead to the conclusion that the LRT hybrid model is more effective than the LRB hybrid model.
Shini George, V. Srividhya
Avinashilingam Institute for Home Science and Higher Education for Women, India
BART, T5, LexRank, ROUGE Score, Extractive Summarization, Abstractive Summarization
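The following is a minimal sketch of the two-stage pipeline described in the abstract: LexRank produces an extractive summary, which is then passed to BART (the LRB model) or T5 (the LRT model) for abstractive rewriting, and the outputs are scored with ROUGE against a human-written reference. The sumy, transformers, and rouge_score packages, the model checkpoints facebook/bart-large-cnn and t5-base, the file names, and all parameter values are illustrative assumptions, not the paper's exact implementation.

```python
# Sketch of the LRB (LexRank + BART) and LRT (LexRank + T5) hybrid pipelines.
# Assumptions (not from the paper): sumy, transformers, and rouge_score packages,
# the model checkpoints facebook/bart-large-cnn and t5-base, and all parameter values.
from sumy.parsers.plaintext import PlaintextParser
from sumy.nlp.tokenizers import Tokenizer
from sumy.summarizers.lex_rank import LexRankSummarizer
from transformers import pipeline
from rouge_score import rouge_scorer

def lexrank_extract(text, sentence_count=10):
    # Stage 1: extractive summary - rank sentences with LexRank and keep the top ones.
    parser = PlaintextParser.from_string(text, Tokenizer("english"))
    return " ".join(str(s) for s in LexRankSummarizer()(parser.document, sentence_count))

def abstractive_rewrite(text, model_name):
    # Stage 2: abstractive summary - paraphrase the extractive output with a pre-trained Transformer.
    summarizer = pipeline("summarization", model=model_name)
    return summarizer(text, max_length=150, min_length=40, do_sample=False)[0]["summary_text"]

document = open("sample_document.txt").read()   # hypothetical input document
reference = open("human_summary.txt").read()    # hypothetical human-written reference summary

extract = lexrank_extract(document)
lrb_summary = abstractive_rewrite(extract, "facebook/bart-large-cnn")  # LRB model
lrt_summary = abstractive_rewrite(extract, "t5-base")                  # LRT model

# Quantitative evaluation with ROUGE against the human-generated summary.
scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)
print("LRB:", scorer.score(reference, lrb_summary))
print("LRT:", scorer.score(reference, lrt_summary))
```

In this sketch, only the shorter LexRank output reaches BART or T5, which is one common motivation for pairing an extractive stage with a Transformer-based abstractive summarizer.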
Published By: ICTACT
Published In: ICTACT Journal on Soft Computing (Volume: 12, Issue: 4, Pages: 2690-2696)
Date of Publication: July 2022
Page Views: 620
Full Text Views: 3