Semantic Fingerprint Sentence Generation [DEMO #5]

Video in TIB AV-Portal: Semantic Fingerprint Sentence Generation [DEMO #5]

Formal Metadata

Title: Semantic Fingerprint Sentence Generation [DEMO #5]
Title of Series
Number of Parts
License: CC Attribution 3.0 Unported. You are free to use, adapt, copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose, as long as the work is attributed to the author in the manner specified by the author or licensor.
Release Date

Content Metadata

Abstract

In this hack, we will attempt to generate semantic fingerprints from WordNet semantic relationships and train the HTM to recognize sequences of meaning from training texts. The trained HTM will be used to generate English sentences by using the predicted sequence of SDRs from the HTM to select words from the training set to fill in blanks in the sentences generated according to a limited English grammar.
Our project was to try to figure out whether we could generate a natural-language request and response loop: you would feed words into the system, and it would come back with words out. The big goal was to make the words that come out form grammatical sentences of some kind. That didn't work out so well, but we had some successes along the way.

The first piece is a preprocessing program we created that takes the words in the training text and matches them up against the words in the WordNet vocabulary. The text we used turned out to have about 20,000 unique words that were also in the WordNet dictionary. For every word in the text that is found in our vocabulary, we build a semantic fingerprint. Cortical.io-style fingerprints are the basis of a lot of background work that achieves really high levels of natural-language-processing intelligence, and the principle is very simple: a word's fingerprint encodes the relationships it has in the network. Ours are not as sophisticated as we would like, but they follow the same principle. Initially I had no idea how to make the fingerprints, so I just took all the related concepts of a word and set a bit for every one of them; it turns out you get fingerprints of about 20,000 bits that way.
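The preprocessing step described above can be sketched as follows. A tiny hard-coded set stands in for the real WordNet dictionary (against which the hack matched roughly 20,000 of the text's unique words); the names here are illustrative only.

```python
import re

# Toy stand-in for the WordNet dictionary used in the hack.
VOCABULARY = {"cat", "dog", "sat", "mat", "run"}

def words_in_vocabulary(text, vocabulary):
    """Return the unique words of `text` that also appear in `vocabulary`."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return sorted(set(tokens) & vocabulary)

print(words_in_vocabulary("The cat sat on the mat.", VOCABULARY))
# prints ['cat', 'mat', 'sat']
```

Only the words that survive this filter get a semantic fingerprint built for them.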
That is barely manageable, so we decided to use an idea from the discussion on Friday afternoon, where the geospatial encoder came up: it assigns a random pattern to each input. We take each of our input concepts and just give it a random sparse pattern over roughly 4,000 bits; that pattern stands for that concept. We then build a word's fingerprint as the union of the patterns of all its related concepts, following the relations out to certain limits. Because the patterns are so sparse relative to the number of bits, there is very little overlap between them, and you get nice properties from that. To get a true SDR, we normalize every fingerprint to roughly the same number of set bits across all of our concepts.

We then fed the fingerprints through a spatial pooler. At first we tried to train the spatial pooler to learn the fingerprints, but that turned out to be completely unnecessary, and this was a great revelation for us: a spatial pooler that was just randomly initialized, with no training whatsoever, gave us a very good encoding of our fingerprints. I really did not expect that, but we ended up using the untrained spatial pooler to encode them.
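The fingerprint construction described above can be sketched like this. The 4,000-bit width comes from the talk; the per-pattern and per-fingerprint bit counts (20 and 40) and the prefix-based normalization are illustrative assumptions, not the hack's actual parameters.

```python
import random

N_BITS = 4000        # fingerprint width mentioned in the talk
PATTERN_ON = 20      # active bits per random concept pattern (assumed)
FINGERPRINT_ON = 40  # bits each fingerprint is normalized to (assumed)

rng = random.Random(42)   # seeded so patterns are reproducible
_patterns = {}

def concept_pattern(concept):
    """Random sparse pattern standing in for a concept (memoized)."""
    if concept not in _patterns:
        _patterns[concept] = frozenset(rng.sample(range(N_BITS), PATTERN_ON))
    return _patterns[concept]

def fingerprint(word, related_concepts):
    """Union the patterns of a word's related concepts, then normalize
    to a fixed number of set bits so all fingerprints are comparable SDRs."""
    bits = set()
    for concept in related_concepts:
        bits |= concept_pattern(concept)
    # Crude normalization: keep a deterministic subset of the bits.
    if len(bits) > FINGERPRINT_ON:
        return frozenset(sorted(bits)[:FINGERPRINT_ON])
    return frozenset(bits)

fp = fingerprint("dog", ["canine", "animal", "pet"])
print(len(fp))  # at most FINGERPRINT_ON bits
```

Because the patterns are sparse (20 bits out of 4,000), two unrelated concepts almost never share more than a bit or two, which is the low-overlap property the talk relies on.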
As a check, we took the roughly 62,000 input concepts we got from WordNet and found about 6 percent overlap: pairs of concepts that produced identical fingerprints, and therefore identical SDRs out of the spatial pooler. When we looked at the duplicates, they were nearly identical concepts, words that are essentially the same thing grouped in the same class, where the distinguishing lexical information just wasn't in the data. So losing that 6 percent didn't really cost us anything, which was good news.

Then we took the sentences from the training text and fed them into the temporal pooler: for each word we build its fingerprint, encode it, and train the system on the sequence of fingerprints that the text produces.
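The duplicate check mentioned above can be sketched as a pairwise overlap scan. The threshold and the toy fingerprints are made up for illustration; the hack compared the real ~62,000 WordNet-derived fingerprints.

```python
def overlap(fp_a, fp_b):
    """Fraction of shared active bits between two fingerprints."""
    if not fp_a or not fp_b:
        return 0.0
    return len(fp_a & fp_b) / min(len(fp_a), len(fp_b))

def find_near_duplicates(fingerprints, threshold=0.95):
    """Return pairs of concept names whose fingerprints nearly coincide.
    (The hack saw ~6% such duplicates, and they turned out to be
    essentially the same concept.)"""
    names = sorted(fingerprints)
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if overlap(fingerprints[a], fingerprints[b]) >= threshold:
                pairs.append((a, b))
    return pairs

fps = {"car": frozenset({1, 2, 3, 4}),
       "automobile": frozenset({1, 2, 3, 4}),
       "banana": frozenset({9, 10, 11, 12})}
print(find_near_duplicates(fps))
# prints [('automobile', 'car')]
```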
To generate, we take the predicted output of the temporal pooler and use it to decide which word to print next. We wanted to link that up to a grammar framework; we were only partly able to get that working, so right now we are basically printing out related words. Whatever output word we select gets fed back into the spatial pooler as the next input. The real idea for generating sentences is an iterative narrowing process: once I commit to a word in my sentence, that restricts the future pathways that can follow it, so I look at the predictions that match the pattern, see the possible directions, pick one of them, and that choice restricts the future directions further. We did get real results with this, though there is a lot left to dig through to figure out what is going on and which concepts are really coming through.

One problem is that the output of the temporal pooler doesn't necessarily match any known word exactly. The way we deal with that is to take the predictions from the temporal pooler, which correspond to a union of the possible outputs, match them against all of our training concepts, and pull out the top ten matches. We intersect those top ten matches with the set of concepts expected at the next position in the sentence, and from that selection we narrow down to a single concept that gets committed to the sentence. So a broad meaning comes out of the temporal pooler, and we narrow it through a filtering process to drive the word selection. That was the really insightful part for us: it is not some powerful learning algorithm at work; it is the power of the large-number-of-bits representation of the data items that gives you these results, almost without any learning at all.

Thank you. [The remainder of the recording is inaudible.]
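The narrowing step described above (union-like prediction, top-ten overlap matches, intersection with the words the grammar expects next) might look like the sketch below; all names and toy data are hypothetical.

```python
def select_next_word(predicted_bits, training_fps, grammar_allowed, top_n=10):
    """Narrow the temporal pooler's union-like prediction to one word:
    score every training concept by overlap with the prediction, keep
    the top_n matches, intersect with the words the grammar allows at
    this slot, and commit to the best survivor (None if none survive)."""
    ranked = sorted(training_fps,
                    key=lambda w: len(training_fps[w] & predicted_bits),
                    reverse=True)
    top = ranked[:top_n]
    survivors = [w for w in top if w in grammar_allowed]
    return survivors[0] if survivors else None

fps = {"dog": frozenset({1, 2, 3}),
       "cat": frozenset({2, 3, 4}),
       "ran": frozenset({7, 8, 9})}
prediction = frozenset({2, 3, 4, 8})  # union of possible predicted outputs
print(select_next_word(prediction, fps, grammar_allowed={"cat", "ran"}, top_n=2))
# prints cat
```

In the hack, the chosen word's fingerprint would then be fed back through the spatial pooler as the next input, restricting the following predictions, which is the iterative narrowing the speaker describes.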



