
Market Patterns [DEMO #4]

Speech Transcript
This is the stage where we did Market Patterns, which is not a very sexy name for what we're doing. But first of all, thank you for having us here; it's been a pleasure to be among so many wonderful people and to hear some of the most ingenious ideas over this weekend. I also haven't slept a lot, so if this goes downhill, that's probably why. The slide resolution may also be a bit too low, so I'll maneuver around that as best I can.
I started working on this with Daniel, who isn't here anymore; he had to leave early, which is important to mention, because he's the one who proposed the idea. We talked about trading, with stock markets, but in a very specific way. It's always ambitious to try to predict where a stock market is going, and it's a really hard task in the real world, not just because of the volatility of the market, but because there are a lot of underlying factors, such as machine learning that is already taking place, with a lot of algorithms controlling high-volume trading. But if you have a specific trading strategy in mind, then it's easier to recognize what the patterns are, and to select the data you want to train something like NuPIC on, so that you get what you want out of the data.

So what we did was take stock data from the S&P 500, looking 15 years into the past, and pick one stock; in this case we chose Google. Then we started selecting for the data we wanted to see. We knew we were not trading for large gains but for small gains, maybe one- to five-dollar gains, over a trading window of at least three days and at most 14 days. The second thing we did was look at where exactly those events happened. Once we had those points, which we called t = 0, a physics way of thinking about it, we knew where the zeros occurred over the past 15 years. Then we asked: if t = 0 happened here, what happened preceding that event? Is there a pattern preceding an event that helps us detect whether it's going to happen again? We couldn't yet predict whether the price would go up or down specifically, but we believed the patterns had something to say about what happens next.

So we took the data, put it through this selective process, almost a Darwinian thing, and kept just the data we wanted. We looked back about 28 days from each event to collect all the past points. Then, rather than describing that data as raw prices, which misses the point of predictive analysis, we described what was happening: we looked at the candlestick patterns in the market and tried to describe them semantically, in a way that lets NuPIC see the similarities, concurrences, and differences.
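The event-selection step described here can be sketched as follows. This is a minimal illustration, not the demo's actual code; the function name and thresholds (a $1–$5 gain over a 3- to 14-day forward window) are taken from the talk, everything else is an assumption.

```python
# Hypothetical sketch of the "t = 0" selection step: scan daily closing
# prices and mark an event wherever the price gains between $1 and $5
# over some window of 3 to 14 trading days.

def find_t0_events(closes, min_gain=1.0, max_gain=5.0,
                   min_days=3, max_days=14):
    """Return indices of days whose forward window shows a small gain."""
    events = []
    for t in range(len(closes)):
        for horizon in range(min_days, max_days + 1):
            if t + horizon >= len(closes):
                break
            gain = closes[t + horizon] - closes[t]
            if min_gain <= gain <= max_gain:
                events.append(t)
                break  # one qualifying horizon is enough for this day
    return events

# Toy price series with a small rally in the middle.
closes = [100.0, 100.2, 100.1, 100.4, 101.0, 102.5, 103.8, 104.0]
print(find_t0_events(closes))  # → [0, 1, 2, 3, 4]
```

From each index returned here, the demo then looks back about 28 days to gather the preceding candles for training.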
So we had the overlaps between the highs and the lows; we looked at the upper shadows and lower shadows; we looked at whether one of the two candlesticks was engulfing the other, and whether the midpoint was higher or lower. There are a lot of different ways of coding this; we basically built our own encoder for candlestick data. We also looked at the stochastic oscillator to analyze whether we were trading at a really low point, whether the stock was overbought or oversold. So the big idea is that we had the high, the low, and the open and close prices for daily records over the past 15 years, and we could use all of that to describe the data, rather than just feeding in the prices. With that being said, here is what we got out of it for Google's stock.
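The "describe, don't just price" idea could look something like this. The feature names and thresholds below are illustrative assumptions, not the demo's actual encoder; it only shows how consecutive OHLC candles can be turned into categorical tokens (shadows, engulfing, midpoint direction) of the kind the talk mentions.

```python
# Illustrative semantic candlestick encoder: turn each pair of
# consecutive daily candles into categorical tokens instead of prices.
# Token names and the body-relative shadow threshold are assumptions.

def encode_pair(prev, cur):
    """prev/cur are (open, high, low, close) tuples for consecutive days."""
    o1, h1, l1, c1 = prev
    o2, h2, l2, c2 = cur
    body_top, body_bot = max(o2, c2), min(o2, c2)
    body = max(body_top - body_bot, 1e-9)
    tokens = []
    # Upper/lower shadows: wick length relative to the candle body.
    tokens.append("long_upper_shadow" if (h2 - body_top) > body
                  else "short_upper_shadow")
    tokens.append("long_lower_shadow" if (body_bot - l2) > body
                  else "short_lower_shadow")
    # Engulfing: today's body contains yesterday's body.
    if body_bot <= min(o1, c1) and body_top >= max(o1, c1):
        tokens.append("engulfing")
    # Midpoint direction: compare candle midpoints across the two days.
    tokens.append("mid_up" if (h2 + l2) / 2 > (h1 + l1) / 2 else "mid_down")
    return tokens

prev = (100.0, 101.0, 99.5, 100.5)   # modest up day
cur = (99.8, 102.0, 99.0, 101.5)     # wider body engulfing the previous one
print(encode_pair(prev, cur))
# → ['short_upper_shadow', 'short_lower_shadow', 'engulfing', 'mid_up']
```

A sequence of such token lists, rather than a sequence of prices, is what would then be fed to the encoder/model.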
As you might expect, we were able to build this thing called a trainer, which uses our encoding method to encode the data and train NuPIC, using only the temporal pooler; we took out the spatial part, because we thought we were doing a pretty good job with the encoding ahead of time, and the temporal part is where the patterns really emerge. We broke the data we had into about 70% training data and 30% testing data, and we never showed the test data to NuPIC during training; we only showed the training data.

Here is one of the ways it works. I'll go ahead and load the training file. The loading step, part of which I wrote, reads the Google CSV file into a table, looks for the exact market ups and downs, finds the exact points we want for training, whatever pattern it is we're looking for, then goes back from those points and feeds them into NuPIC, setting the test data aside separately. If I run that, it goes through the whole process: we get about 250 data points, and going backwards in time from those we get about 2,000 days' worth of records to train NuPIC on. Then we tested it using data it had never seen before, to find out how good it was at recognizing the same patterns again, and we looked at the anomalies. Hopefully this makes Daniel proud; it looks like this, and it goes through the training process, about 130 iterations. I'm going to stop it here, because I've already run it three or four times. So then we analyzed the test file we produced, which was never shown to NuPIC, and we turned the learning process off at this point, because we wanted to see whether it could recognize the same patterns happening again, that is, whether the anomaly numbers were low. This produces a graph that shows the anomaly scores over time.
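The shape of this train-then-test loop can be sketched with a toy stand-in. The real demo used NuPIC's temporal pooler; the `ToyTemporalMemory` class below is only a hypothetical stub (it memorizes observed transitions and reports anomaly 1.0 for unseen ones) used to mirror the 70/30 split and the learning-off testing phase described in the talk.

```python
# Toy stand-in for NuPIC's temporal memory, illustrating the workflow:
# train on 70% of an encoded sequence, then score the held-out 30%
# with learning switched off, as in the demo.

class ToyTemporalMemory:
    def __init__(self):
        self.transitions = set()  # memorized (previous, current) pairs
        self.prev = None
        self.learning = True

    def compute(self, token):
        """Return 0.0 if this transition was seen before, else 1.0."""
        anomaly = 0.0 if (self.prev, token) in self.transitions else 1.0
        if self.learning and self.prev is not None:
            self.transitions.add((self.prev, token))
        self.prev = token
        return anomaly

# A repetitive symbolic sequence standing in for the encoded candles.
sequence = ["up", "up", "down", "flat"] * 50
split = int(len(sequence) * 0.7)           # 70% train / 30% test
train, test = sequence[:split], sequence[split:]

tm = ToyTemporalMemory()
for tok in train:
    tm.compute(tok)

tm.learning = False                         # never learn from test data
scores = [tm.compute(tok) for tok in test]
print(sum(scores) / len(scores))            # → 0.0: patterns recognized
```

With a perfectly repetitive test sequence the mean anomaly is 0; on real market data the interesting signal is stretches where the score dips toward 0.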
On the graph, the closer a value is to 0, the more NuPIC is recognizing the patterns we wanted it to record. Hopefully it will pop up in about ten seconds: 10, 9, 8, 7, 6, 5, 4, 3, 2, 1... no? Ah, there it is. You can see it starts out okay, but then it starts to recognize more and more of the patterns, the ones we selected beforehand, which are obviously also happening in the test file. As it sees the pattern set it was trained on, the score gets closer and closer to 0, like it's saying, yes, this is the pattern that I know.

The next thing was real-time data. I went home and found that there's a plugin for Python for streaming data from Yahoo Finance, so I thought, why not go ahead and stream data, starting maybe half a year back and moving about 30 days ahead, and see whether there are anomalies low enough for us to say the pattern is happening now. So this is basically going to be streaming from Yahoo Finance, hopefully it works; I haven't had a chance to make it all GUI and fancy. As it streams you see anomaly values like 0.18 in one time period, 0.17 in another, and when you hit a time period where the anomalies are significantly closer to 0, you know the pattern we trained for is happening.

Obviously we could make this a lot better, in the sense that we could not only train it to recognize when the patterns we want are happening; we could also look at the opposite side and teach it the patterns we don't want, then compare and correlate the two to get the best probability of investing in a stock. And I think the application for this extends beyond just finance; human behavior would be a great fit. But I think this is the demo for something like this, for temporal patterns, and that's basically the presentation. Sorry about the resolution. I wanted to do more, but the more I did, the more broke at the last minute, so I didn't get to show it all.

[Answering a question] So within this anomaly view, it's taking a pretty significantly big window, and it's averaging all the anomalies in that window. The idea would be to take this anomaly stream and see where the lowest anomaly happened, because that's where you start seeing the same patterns again that happened before. If you recognize that these candlesticks are happening again before a significant event, whether it be upward or downward, I don't know which yet, that's where the idea fundamentally lies: you can figure out at what point you want to investigate. And the idea is to compare something like this against the algorithms people already have, because this involves temporal learning and building synapse connections, which, unlike a fixed algorithm, isn't fixed. I think something like this has a lot more potential to do a lot more in the long run than something with fixed parameters, such as regression. I know I'm terrible at explaining that sometimes; we did suffer from some invariance issues.
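The windowed-anomaly idea from the Q&A can be sketched like this. The function name, window size, and sample scores are illustrative (the 0.18/0.17 values echo the ones read off during the demo); it simply averages anomaly scores over a sliding window and reports the window with the lowest mean, where the trained-for pattern is most likely recurring.

```python
# Sketch of the windowed-anomaly idea: average the raw anomaly scores
# over a sliding window and flag the window with the lowest mean.

def lowest_anomaly_window(scores, window=5):
    """Return (start_index, mean) of the window with the lowest mean anomaly."""
    best_start, best_mean = 0, float("inf")
    for start in range(len(scores) - window + 1):
        mean = sum(scores[start:start + window]) / window
        if mean < best_mean:
            best_start, best_mean = start, mean
    return best_start, best_mean

# Anomaly stream with a low-anomaly stretch in the middle.
scores = [0.9, 0.8, 0.7, 0.18, 0.17, 0.15, 0.2, 0.6, 0.9, 0.95]
start, mean = lowest_anomaly_window(scores, window=3)
print(start, round(mean, 3))  # → 3 0.167
```

The index returned is where one would "start investigating", in the talk's terms: the stretch where the model keeps seeing transitions it was trained on.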
[Answering a question] Yes, this was daily data, daily again. For the live stream I just started a couple of hundred days in the past, then each time stepped one day forward and compared the anomaly scores between those two sections. The anomaly scores have to be taken with a grain of salt: they tell you when NuPIC is seeing something it has seen before, and because of the way we picked our patterns, that tells us NuPIC recognizes that the patterns are happening again, since it was trained on those very same patterns. If those patterns hold true, if history does repeat itself, then I think to a certain extent there is some predictability in this. I know there are a lot of things that happen in the world that change markets, but I think markets are in a sense a meta-analysis of those kinds of events. Yes, anomalies exist; that's what we're looking at here. I can't answer for future data, but yes, I think there is some predictability in something like this, with a causality twist over time. OK, how much time do we have left? Well, I'll just say thank you. Daniel, who is not here, put a lot of effort into this and he's amazing; he was sold on the idea, and the things that are working are thanks to him as much as me.

Metadata

Formal Metadata

Title Market Patterns [DEMO #4]
Series Title 2015 Spring NuPIC Hackathon
Number of Parts 19
Author Arora, Saj
McDonald, Daniel
License CC Attribution 3.0 Unported:
You may use, change, and reproduce, distribute, and make the work or content publicly available in unchanged or changed form for any legal purpose, provided you credit the author/rights holder in the manner specified by them.
DOI 10.5446/18054
Publisher Numenta Platform for Intelligent Computing (NuPIC)
Publication Year 2015
Language English

Content Metadata

Subject Area Computer Science
Abstract An attempt to use NuPIC to understand market pattern trends.
