  Published Paper Details:

  Paper Title

Automatic Text Summarization

  Authors

  B. Dinesh Kumar,  CH. Shonaakshay,  B. Thrillokh Goud,  B. Raviteja

  Keywords

Text summarization, extractive text summarization, abstractive text summarization, deep learning, natural language processing, machine learning, Hugging Face, TensorFlow, chunking, web scraping

  Abstract


Nowadays, a vast amount of information is available from both online and offline sources, and hundreds of documents may exist for a single topic. Manually extracting the useful information from them is a difficult task; automatic text summarization (ATS) systems were developed to solve this problem. Text summarization is the process of extracting the important information from a large document and condensing it into a summary that preserves all the key points, so that users can get an initial overview of a topic as quickly as possible. This has applications such as summarizing news articles, research papers, and reports. There are two common approaches: extractive methods, which identify and extract key sentences from the original text, and abstractive methods, which paraphrase and generate new sentences. This paper presents an approach based on the pre-trained BART model, a large language model widely used for text generation tasks. The proposed system takes as input either a URL, plain text, or a file of any format, and returns a summary. It uses a pretrained BART model fine-tuned on the CNN/Daily Mail dataset. BART (Bidirectional and Auto-Regressive Transformers) combines the BERT and GPT architectures: it is a sequence-to-sequence model trained as a denoising autoencoder that maps an input text sequence to an output sequence. Key advantages of BART include improved summary coherence and factual consistency. During training, the model learns to generate a summary for a given input document. Current models can produce high-quality summaries, but issues remain around coherence, repetition, and accuracy; key challenges include improving faithfulness to the original text, avoiding repetition, and controlling summary length and style.
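The abstract notes that the system accepts a URL as input; turning a fetched page into plain text is the web-scraping step listed in the keywords. A minimal sketch of that step using only Python's standard-library HTML parser (the class and function names here are illustrative, not from the paper, which does not specify its scraping implementation):

```python
from html.parser import HTMLParser


class ParagraphExtractor(HTMLParser):
    """Collect the text of <p> elements, as a stand-in for the scraping
    step that converts a fetched web page into summarizable plain text."""

    def __init__(self):
        super().__init__()
        self.in_p = False
        self.paragraphs = []

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self.in_p = True
            self.paragraphs.append("")

    def handle_endtag(self, tag):
        if tag == "p":
            self.in_p = False

    def handle_data(self, data):
        if self.in_p:
            # Accumulate text that appears inside the current paragraph.
            self.paragraphs[-1] += data


def extract_article_text(html):
    """Return the concatenated paragraph text of an HTML document,
    skipping navigation and other non-paragraph content."""
    parser = ParagraphExtractor()
    parser.feed(html)
    return " ".join(p.strip() for p in parser.paragraphs if p.strip())
```

In a real pipeline the HTML would come from `urllib.request.urlopen(url)`; a production system would more likely use a library such as BeautifulSoup, but the idea (keep paragraph text, drop boilerplate) is the same.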
Future directions involve multi-document summarization, controlled generation, better evaluation metrics, and performance on domain-specific datasets. Overall, text summarization continues to be an important application of NLP that helps condense information and enables quick access to key facts and context. The proposed BART-based system aims to leverage the model's strengths in text generation and compression to improve summarization capabilities.
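BART-style encoders accept only a bounded number of input tokens (1,024 for the CNN/Daily Mail checkpoint), which is why the keywords list chunking: a long document is split into pieces that are summarized individually. A minimal sketch in plain Python; the word-count budget is an illustrative stand-in for a real tokenizer limit, and the regex-based sentence splitter is deliberately naive:

```python
import re


def chunk_text(text, max_words=200):
    """Split text into chunks of whole sentences, each at most max_words long.

    Each chunk can then be fed to the summarizer separately; max_words is a
    simplified proxy for the model's real token budget.
    """
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    chunks, current, count = [], [], 0
    for sentence in sentences:
        words = len(sentence.split())
        if current and count + words > max_words:
            # Current chunk is full: emit it and start a new one.
            chunks.append(" ".join(current))
            current, count = [], 0
        current.append(sentence)
        count += words
    if current:
        chunks.append(" ".join(current))
    return chunks
```

Each chunk would then be passed to the fine-tuned BART model (e.g. via the Hugging Face `summarization` pipeline), and the per-chunk summaries concatenated or summarized again.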

  IJCRT's Publication Details

  Unique Identification Number - IJCRT2402277

  Paper ID - 248227

  Page Number(s) - c411-c421

  Published in - Volume 12 | Issue 2 | February 2024

  DOI (Digital Object Identifier) -   

  Publisher Name - IJCRT | www.ijcrt.org | ISSN : 2320-2882

  E-ISSN Number - 2320-2882

  Cite this article

  B. Dinesh Kumar, CH. Shonaakshay, B. Thrillokh Goud, B. Raviteja, "Automatic Text Summarization", International Journal of Creative Research Thoughts (IJCRT), ISSN: 2320-2882, Volume 12, Issue 2, pp. c411-c421, February 2024, Available at: http://www.ijcrt.org/papers/IJCRT2402277.pdf

ISSN and Impact Factor Details

ISSN: 2320-2882
Impact Factor: 7.97 (ISSN approved)
Journal Starting Year (ESTD): 2013