
BATTLESPACE Updates



NEW TECHNOLOGIES

June 3, 2016

Web Page sponsor Oxley Developments

www.oxleygroup.com
————————————————————————
31 May 16. Google Implements Neural Network Computing For Machine Learning. Google has released SyntaxNet, an open-source neural network framework implemented in TensorFlow that provides a foundation for Natural Language Understanding (NLU) systems. The release includes all the code needed to train new SyntaxNet models on your own data, as well as Parsey McParseface, an English parser that Google has pre-trained and that can be used to analyze English text.
Parsey McParseface is built on machine learning algorithms that learn to analyze the linguistic structure of language and that can explain the functional role of each word in a given sentence. Google describes Parsey McParseface as the most accurate such model in the world and hopes that it will be useful to developers and researchers interested in automatic information extraction, translation, and other core applications of NLU.
How does SyntaxNet work?
SyntaxNet is a framework for what is known in academic circles as a syntactic parser, a key first component in many NLU systems. Given a sentence as input, it tags each word with a part-of-speech (POS) tag that describes the word's syntactic function, and it determines the syntactic relationships between words in the sentence, represented in the dependency parse tree. These syntactic relationships are directly related to the underlying meaning of the sentence in question. To take a very simple example, consider the dependency tree for the sentence Alice saw Bob.
This structure encodes that Alice and Bob are nouns and saw is a verb. The main verb saw is the root of the sentence and Alice is the subject (nsubj) of saw, while Bob is its direct object (dobj).
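The structure just described can be written down directly. The following sketch is a hand-built, hypothetical encoding (dependent word mapped to its head and relation), not SyntaxNet's actual output format, but it captures the same information:

```python
# POS tags and dependency arcs for "Alice saw Bob", built by hand to mirror
# the tree described above. "ROOT" marks the root of the sentence.
pos_tags = {"Alice": "NOUN", "saw": "VERB", "Bob": "NOUN"}

dependencies = {
    "saw": ("ROOT", "root"),
    "Alice": ("saw", "nsubj"),   # Alice is the subject of saw
    "Bob": ("saw", "dobj"),      # Bob is the direct object of saw
}

# Recover the main verb, its subject, and its direct object from the arcs.
root = next(w for w, (h, r) in dependencies.items() if r == "root")
subject = next(w for w, (h, r) in dependencies.items() if h == root and r == "nsubj")
obj = next(w for w, (h, r) in dependencies.items() if h == root and r == "dobj")

print(root, subject, obj)  # saw Alice Bob
```

Representing a parse as head/relation pairs like this is all that is needed for the question-answering uses discussed next.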
A richer tree arises for a longer sentence such as Alice, who had been reading about SyntaxNet, saw Bob yesterday. This structure again encodes the fact that Alice and Bob are the subject and object respectively of saw, in addition to the fact that Alice is modified by a relative clause with the verb reading and that saw is modified by the temporal modifier yesterday. The grammatical relationships encoded in dependency structures allow us to easily recover the answers to various questions, for example whom did Alice see?, who saw Bob?, what had Alice been reading about?, or when did Alice see Bob?
Why is Parsing So Hard for Computers to Get Right?
One of the main problems that makes parsing so challenging is that human languages show remarkable levels of ambiguity. It is not uncommon for moderate-length sentences – say 20 or 30 words – to have hundreds, thousands, or even tens of thousands of possible syntactic structures. A natural language parser must somehow search through all of these alternatives and find the most plausible structure given the context. As a very simple example, the sentence Alice drove down the street in her car has at least two possible dependency parses.
The first corresponds to the (correct) interpretation where Alice is driving in her car; the second corresponds to the (absurd, but possible) interpretation where the street is located in her car. The ambiguity arises because the preposition in can either modify drove or street; this example is an instance of what is called prepositional phrase attachment ambiguity.
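The two readings differ in exactly one arc: the head of the preposition in. A hand-built sketch (hypothetical relation names, not parser output) makes the attachment ambiguity concrete:

```python
# Dependencies shared by both readings of "Alice drove down the street in her car"
# (dependent -> (head, relation)), built by hand for illustration.
common = {
    "Alice": ("drove", "nsubj"),
    "down": ("drove", "prep"),
    "street": ("down", "pobj"),
    "the": ("street", "det"),
    "car": ("in", "pobj"),
    "her": ("car", "poss"),
}

# Reading 1 (correct): "in her car" modifies the verb "drove".
parse_a = {**common, "in": ("drove", "prep")}
# Reading 2 (absurd but grammatical): "in her car" modifies the noun "street".
parse_b = {**common, "in": ("street", "prep")}

print(parse_a["in"][0], parse_b["in"][0])  # drove street
```

Everything else about the two parses is identical, which is why a parser cannot resolve the choice locally and must weigh the plausibility of whole structures.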
Humans do a remarkable job of dealing with ambiguity, almost to the point where the problem is unnoticeable; the challenge is for computers to do the same. Multiple ambiguities such as these in longer sentences conspire to give a combinatorial explosion in the number of possible structures for a sentence. Usually the vast majority of these structures are wildly implausible, but are nevertheless possible and must be somehow discarded by a parser.
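The exact number of structures depends on the grammar, but counting binary bracketings gives a feel for the explosion: the number of distinct binary trees over a k-word sentence is the Catalan number C(k-1), which grows exponentially. A quick sketch:

```python
from math import comb

def catalan(n):
    """n-th Catalan number: C(n) = binom(2n, n) // (n + 1)."""
    return comb(2 * n, n) // (n + 1)

# Binary bracketings of a k-word sentence = catalan(k - 1).
for k in (5, 10, 20, 30):
    print(k, catalan(k - 1))
```

Even at 20 words there are over a billion bracketings, which is why a parser must search rather than enumerate.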
SyntaxNet applies neural networks to the ambiguity problem. An input sentence is processed from left to right, with dependencies between words being incrementally added as each word in the sentence is considered. At each point in processing, many decisions may be possible because of ambiguity in the earlier part of the sentence, and the neural network scores the competing decisions by their plausibility.
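The left-to-right, incremental scheme described above can be sketched as a transition-based parser in the arc-standard style. In a real system such as SyntaxNet, a neural network chooses each transition; here the transition sequence is hard-coded for "Alice saw Bob" purely to show how arcs accumulate one decision at a time:

```python
# Minimal arc-standard transition parser sketch. A stack holds partially
# processed words, a buffer holds upcoming words, and each transition either
# shifts a word or adds a dependency arc (head, relation, dependent).
def parse(words, transitions):
    stack, buffer, arcs = ["ROOT"], list(words), []
    for action, label in transitions:
        if action == "SHIFT":            # consume the next input word
            stack.append(buffer.pop(0))
        elif action == "LEFT-ARC":       # top of stack heads the item below it
            dep = stack.pop(-2)
            arcs.append((stack[-1], label, dep))
        elif action == "RIGHT-ARC":      # item below heads the top of stack
            dep = stack.pop()
            arcs.append((stack[-1], label, dep))
    return arcs

# Hand-written "gold" transition sequence for "Alice saw Bob".
gold = [("SHIFT", None), ("SHIFT", None), ("LEFT-ARC", "nsubj"),
        ("SHIFT", None), ("RIGHT-ARC", "dobj"), ("RIGHT-ARC", "root")]

print(parse("Alice saw Bob".split(), gold))
# [('saw', 'nsubj', 'Alice'), ('saw', 'dobj', 'Bob'), ('ROOT', 'root', 'saw')]
```

Because earlier transitions constrain later ones, a single wrong decision can make the correct parse unreachable, which is why scoring and searching over competing transition sequences matters.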




Copyright BATTLESPACE Publications © 2002–2023.
