A Review Paper on Natural Language Processing

Abstract: This study examines natural language processing (NLP) from the late 1940s to the present, with the goal of identifying trends that reflect shifting concerns about various difficulties and the development of successive approaches to solving these problems and to constructing systems as a whole. It also introduces NLP as the practice of teaching machines to understand and interpret conversational input from humans; NLP based on machine learning can be used to establish communication channels between humans and machines.


I. INTRODUCTION
At the 1987 ACL Conference, Don Walker and Jane Robinson discussed how they got started in NLP research. Fred Thomson told them he started in 1954, and others, including Martin Kay, began in the 1950s as well. Work in the subject has tended to focus on one problem at a time, sometimes because solving problem X depends on solving problem Y, and sometimes simply because problem Y appears more tractable than problem X. It is comforting to suppose that NLP research, like scientific research in general, is consolidating, and while this may be more faith than substance, we can surely perform NLP now that we could not in the 1950s. While advances in computing technology have accompanied intellectual advances in understanding how NLP is performed, improvements in technology have also eliminated some of the problems that previously plagued us. More importantly, good technology means that when one returns to a long-standing problem, it is not always as difficult as it used to be. For those who, like Don, have been around for a long time, old ideas can be seen re-emerging in new guises, such as lexicalist approaches to NLP and to MT in particular. But the new costumes are better made, of better materials, and more pleasing: the research does not go so much in circles as on a spiral, albeit a fairly shallow one.

Looking back at the history of NLP, I see four phases, each with its own concerns and styles. Don has moved with the times, in one way or another, like all of us to some degree. It is noteworthy that in his push for linguistic resources, such as corpus collections, he not only significantly promoted what I have called the data-bashing decade that is now with us, but also returned to what was a major concern in the first period of NLP research: building the powerful and comprehensive dictionaries that serious NLP applications, like MT, need. I define the first phase of work in NLP as lasting from the late 1940s to the late 1960s, the second from the late 1960s to the late 1970s, and the third to the late 1980s, while we are in a clear fourth phase now.

II. HISTORICAL REVIEW (LATE 1940S TO LATE 1960S)
The first phase of work focused on machine translation (MT). Following the investigations of Booth and Richens, and Weaver's influential translation memo of 1949 (Locke and Booth, 1955), NLP research officially began in the 1950s. The 1954 IBM-Georgetown demonstration showed automatic translation from Russian to English, in a very rudimentary form and as a limited experiment. MT (Mechanical Translation), the predecessor of Computational Linguistics, also started publishing in 1954. The first international conference on machine translation was held in 1952, the second in 1956 (the year of the first conference on artificial intelligence). At the important Washington International Conference on Scientific Information in 1958, language processing was linked to information retrieval, for example in the use of thesauri; Minsky drew attention to artificial intelligence; and Luhn provided automatic abstracts (actually excerpts) for conference papers. The Teddington International Conference on Machine Translation and Applied Language Analysis in 1961 was probably the high point of this first phase, with papers ranging from morphology, syntax, semantics, interpretation and generation, and formal theory to hardware.

Text Categorization
The practice of categorizing text into ordered groupings is known as text classification, sometimes called text tagging or text categorization. Text classifiers use natural language processing (NLP) to automatically analyse text and assign predefined tags or categories depending on its content. The process is usually organized in a step-by-step fashion to make it more tractable.
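The step-by-step procedure above can be sketched as a minimal Naive Bayes classifier. This is a toy illustration, not a production method: the training examples are invented, tokenization is plain whitespace splitting, and real systems would typically use an established library instead.

```python
import math
from collections import Counter, defaultdict

class NaiveBayesClassifier:
    """Toy Naive Bayes text classifier with add-one (Laplace) smoothing."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)  # label -> word frequencies
        self.label_counts = Counter()            # label -> number of documents
        self.vocab = set()

    def train(self, documents):
        # documents: list of (text, label) pairs
        for text, label in documents:
            words = text.lower().split()
            self.label_counts[label] += 1
            self.word_counts[label].update(words)
            self.vocab.update(words)

    def predict(self, text):
        words = text.lower().split()
        total_docs = sum(self.label_counts.values())
        best_label, best_score = None, float("-inf")
        for label in self.label_counts:
            # log prior + smoothed log likelihood of each word
            score = math.log(self.label_counts[label] / total_docs)
            total_words = sum(self.word_counts[label].values())
            for word in words:
                count = self.word_counts[label][word] + 1
                score += math.log(count / (total_words + len(self.vocab)))
            if score > best_score:
                best_label, best_score = label, score
        return best_label

# Hypothetical toy training data for illustration only.
clf = NaiveBayesClassifier()
clf.train([
    ("great product loved it", "positive"),
    ("terrible waste of money", "negative"),
    ("really loved the quality", "positive"),
    ("awful terrible experience", "negative"),
])
print(clf.predict("loved this great purchase"))  # -> positive
```

The same skeleton (count, smooth, score, pick the best label) underlies many practical text-tagging pipelines, with richer features and tokenization.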

Text Summarization
Automatic text summarization is a technique in which computer software shortens lengthy texts and produces summaries that convey the desired content. It is a prevalent challenge in machine learning and natural language processing (NLP). The output is a shorter text that preserves the key points.

IV. CONCLUSION
The practice of teaching machines to read and interpret human conversational inputs is known as natural language processing. NLP based on machine learning can be used to construct human-machine communication channels. Although NLP is still evolving, it has already proven effective in a variety of disciplines. Various NLP implementations can help organizations and individuals save time, increase efficiency, and improve customer satisfaction.
3.1 Top NLP Algorithms
Common NLP techniques include lemmatization and stemming, keyword extraction, topic modelling, knowledge graphs, named entity recognition, word clouds, machine translation, and sentiment analysis. Sentiment analysis is the most common method used by NLP algorithms.

3.2 Advantages of NLP
When properly applied, NLP is less expensive and less time-consuming than hiring a person. NLP can also help companies provide quicker customer-care response times: customers get rapid replies to their questions at any time of day, on any day of the week. Developers can use pre-trained machine learning models for a variety of NLP applications, making them simple to build.

3.3 Disadvantages of NLP
Training can take a long time; if a new model is constructed without the help of a pre-trained model, it can take weeks to reach a high level of performance. Another drawback of NLP is that machine learning is not always accurate: there is always a chance of inaccuracies in predictions and outcomes, which must be taken into account.
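Sentiment analysis, named above as the most common NLP method, can be illustrated in its simplest lexicon-based form: count positive and negative words and compare. The word lists below are invented toy data, not a real sentiment lexicon, and whitespace splitting stands in for proper tokenization.

```python
# Illustrative toy lexicons; real systems use curated sentiment lexicons
# or trained models rather than hand-picked word sets.
POSITIVE = {"good", "great", "excellent", "happy", "love", "fast"}
NEGATIVE = {"bad", "poor", "terrible", "sad", "hate", "slow"}

def sentiment(text):
    """Label text by comparing counts of positive and negative words."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("great support and fast replies"))  # -> positive
print(sentiment("terrible slow service"))           # -> negative
```

This lexicon-counting baseline is easy to build but misses negation and context ("not great"), which is one reason machine-learning approaches dominate in practice.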