
NLP in 2020; Modern Applications


NLP has gone from rule-based systems to generative systems with almost human-level accuracy on multiple rubrics within 40 years. This is incredible considering how far we were from talking naturally to a computer system even just ten years ago; now I can tell Google Home to turn off my sitting-room lights.

In this Stanford lecture, Chris Manning introduces a computer science class to what NLP is, its complexity, and specific tooling such as word2vec, which enables learning systems to learn from natural language. Professor Manning is the Director of the Stanford Artificial Intelligence Laboratory and a leader in applying Deep Learning (DL) to NLP.

The goal of NLP is to allow computers to ‘understand’ natural language in order to perform tasks and support the human user in making decisions. For a logic system, understanding and representing the meaning of language is a “difficult goal”. The goal is so compelling that all major technology firms have invested heavily in the field. The lecture focuses on these areas of the NLP challenge.

Some applications in which you might encounter NLP systems are spell checking, search, recommendations, speech recognition, dialogue agents, sentiment analysis and translation services. One key point Chris Manning explains is that human language (whether text, speech or movement) is unique in that it is produced to communicate something; some ‘meaning’ is embedded in the action. This is not often the case with anything else that generates data. It is data with intent, and extracting and understanding that intent is part of the NLP challenge. Chris Manning also lists reasons “why NLP is hard” which I think we take for granted.
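For a concrete taste of one of these applications, here is a minimal sentiment analysis sketch using NLTK's VADER analyzer (my choice of library; the lecture does not prescribe a tool):

```python
# Minimal sentiment analysis sketch with NLTK's VADER analyzer.
# Assumes: pip install nltk (the lexicon is downloaded on first run).
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

analyzer = SentimentIntensityAnalyzer()
for sentence in ["I love how natural this assistant feels.",
                 "The dialogue agent misunderstood every request."]:
    scores = analyzer.polarity_scores(sentence)
    # 'compound' is a normalized score in [-1, 1]; its sign gives the polarity
    print(f"{scores['compound']:+.3f}  {sentence}")
```

Lexicon-based scoring like this predates the DL approaches the lecture covers, but it shows the task in a few lines.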

Language interpretation depends on ‘common sense’ and contextual knowledge; language is ambiguous (computers prefer direct, formal statements!); and language contains a complex mix of situational, visual and linguistic knowledge from various timelines. The learning systems we have now do not have a lifetime of learned weights and biases, so they can currently be applied only in narrow-AI use cases.

The Stanford lecture also dives into DL and how it differs from a human exploring and designing features or signals to feed into learning systems. The lecture discusses the first spark of DL in speech recognition, from work done by George Dahl, and how the DL approach achieved a 33% improvement in performance compared to traditional feature modelling. Professor Manning also talks about how NLP and DL have added capabilities in three segments, namely what he calls levels (speech, words, syntax and semantics), tools (parts-of-speech, entities and parsing) and applications (machine translation, sentiment analysis, dialogue agents and question answering), stating that NLP + DL have created a ‘few key tools’ which have wide applications.
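To make the tools tier concrete, here is a small sketch of part-of-speech tagging, dependency parsing and entity recognition using spaCy (my choice of illustration, not a tool named in the lecture), assuming the `en_core_web_sm` model is installed:

```python
# Part-of-speech tags, dependency parse and named entities with spaCy.
# Assumes: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Stanford researchers applied deep learning to machine translation.")

for token in doc:
    # token.pos_ is the part-of-speech tag; token.dep_ the dependency relation
    print(f"{token.text:12} {token.pos_:6} {token.dep_:10} head={token.head.text}")

for ent in doc.ents:
    # named entities recognized in the sentence, e.g. Stanford -> ORG
    print(ent.text, ent.label_)
```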

Words as vectors — https://youtu.be/8rXD5-xhemo?t=2346

Towards the end of the lecture, we explore how words are represented as numbers in vector spaces and how this applies to NLP and DL. Word meaning vectors can then be used to represent meaning in words, sentences and beyond.
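As a toy illustration of the idea, the sketch below trains word2vec on a tiny corpus with gensim (an assumed library choice; the lecture covers the concept, not this code) and compares words by cosine similarity. Real word vectors are trained on billions of tokens:

```python
# Toy word2vec: each word becomes a dense vector, and similar contexts
# yield similar vectors. Assumes: pip install gensim (4.x API).
from gensim.models import Word2Vec

corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["a", "cat", "chased", "a", "dog"],
]

model = Word2Vec(sentences=corpus, vector_size=50, window=2,
                 min_count=1, epochs=200, seed=42)

vec = model.wv["cat"]                      # a 50-dimensional vector for "cat"
print(vec.shape)                           # (50,)
print(model.wv.similarity("cat", "dog"))   # cosine similarity of two words
```

On a corpus this small the similarities are noisy; the point is only that ‘meaning’ ends up encoded as geometry in the vector space.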

Source: https://chatbotslife.com/nlp-in-2020-modern-applications-c8dab0ffdead?source=rss—-a49517e4c30b—4
