Natural Language Processing (NLP) enables machines to understand and act on the language that humans speak. It forms a bridge between computers and human language, covering the understanding, interpretation, manipulation and generation of natural language. In short, NLP is a way of translating between computers and human languages.
Building NLP systems is challenging because computers conventionally require humans to communicate with them in a coded language that is precise, unambiguous and highly structured. Human speech, however, is rarely so tidy: it is often ambiguous, and its structure and meaning can depend on many variables, including slang, regional dialects and social context. NLP automates the translation process between computers and humans despite this messiness.
Natural language analysis applies to both audible speech and written text. An NLP system extracts meaning from an input of words (sentences, paragraphs, pages, etc.) and produces formal, structured output. NLP is a fundamental part of Artificial Intelligence, and it is far more than just speech interpretation. There are several approaches to processing human language, which include:
The symbolic approach to natural language processing is based on human-developed rules and lexicons. In other words, this approach is grounded in the generally accepted rules of a specific language, codified and recorded by linguistic experts.
The statistical approach to natural language processing is based on observable and recurring examples of linguistic phenomena. Statistical models identify recurring patterns through mathematical analysis of large bodies of text. By recognising trends in huge samples of text, the system can derive its own linguistic rules, which it then uses to interpret future input and to generate language output.
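As a toy illustration of this statistical idea, the sketch below (plain Python, with a made-up miniature corpus) counts adjacent word pairs and then predicts the most frequently observed follower of a given word. Real statistical NLP models learn from far larger corpora, but the principle of deriving rules from observed frequencies is the same.

```python
from collections import Counter

# A made-up miniature corpus; real systems use huge text collections.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count adjacent word pairs (bigrams) observed in the sample text.
bigrams = Counter(zip(corpus, corpus[1:]))

def most_likely_next(word):
    """Return the most frequently observed word following `word`."""
    candidates = {pair: n for pair, n in bigrams.items() if pair[0] == word}
    if not candidates:
        return None
    return max(candidates, key=candidates.get)[1]

print(most_likely_next("sat"))  # -> 'on' ("sat on" occurs twice in the corpus)
```

The model has no hand-written grammar at all: its "rule" that "sat" is followed by "on" emerges purely from the frequencies in the sample text.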
The connectionist approach to natural language processing combines the symbolic and statistical approaches. It starts with generally accepted rules of language and adapts them to specific applications using input gathered from statistical inference.
Morphemes are the smallest units of meaning within words, and this level of analysis deals with morphemes in their role as the parts that make up a word.
This level of analysis examines how the parts of words (morphemes) combine to form words, and how slight differences can dramatically change the meaning of the resulting word.
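A deliberately naive sketch of this morpheme-level analysis is below; the prefix and suffix tables are illustrative assumptions, not a real morphological lexicon.

```python
# Hypothetical, tiny affix tables for illustration only.
PREFIXES = {"un": "negation", "re": "repetition"}
SUFFIXES = {"ness": "noun-forming", "ful": "adjective-forming", "less": "adjective-forming"}

def split_morphemes(word):
    """Split a word into (morpheme, role) pairs using naive affix stripping."""
    parts = []
    for pre, role in PREFIXES.items():
        if word.startswith(pre):
            parts.append((pre, role))
            word = word[len(pre):]
            break
    suffix = None
    for suf, role in SUFFIXES.items():
        if word.endswith(suf):
            suffix = (suf, role)
            word = word[:-len(suf)]
            break
    parts.append((word, "root"))
    if suffix:
        parts.append(suffix)
    return parts

print(split_morphemes("unhappiness"))
# -> [('un', 'negation'), ('happi', 'root'), ('ness', 'noun-forming')]
```

Note that the recovered root is "happi", not "happy": real morphological analyzers also handle spelling alternations and ambiguity, which this toy version deliberately ignores.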
This level targets text at the sentence level. Syntax revolves around the idea that in most languages the meaning of a sentence depends on word order and the dependencies between words.
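The role of word order can be illustrated with a toy subject-verb-object reading. The fixed three-word positional assumption below is hypothetical; real syntactic analysis uses full grammars and parsers.

```python
def naive_svo(sentence):
    """Read a three-word sentence under a naive subject-verb-object assumption."""
    subject, verb, obj = sentence.split()
    return {"subject": subject, "verb": verb, "object": obj}

print(naive_svo("dog bites man"))  # subject: 'dog', object: 'man'
print(naive_svo("man bites dog"))  # same words, reversed roles
```

The two sentences contain exactly the same words, yet word order alone flips who is doing the biting, which is precisely the point syntax-level analysis must capture.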
Semantics focuses on how the context of words within a sentence helps determine the meaning of each individual word.
Discourse concerns how sentences relate to one another: sentence order and arrangement can affect the meaning of a passage.
Pragmatics bases the meaning of words or sentences on situational awareness and world knowledge: essentially, which interpretation is most likely and makes the most sense.
The ultimate aim of natural language processing is for computers to achieve human-like comprehension of text and language. When this is attained, computer systems will be able to interpret, summarise, translate and generate accurate, natural human text and language.
Image courtesy: Expert Systems