
San Francisco Crimes [DEMO #1]

Speech transcript
[Setup chatter, partly inaudible.]

OK, so what I did was: I took a hack that Austin Marshall did at the last hackathon, where he plotted historical earthquake data and showed anomaly scores based on the geographical coordinates and the strength of each earthquake, and I did something somewhat similar with crime data from San Francisco. So let's see if this works.

[Demo loads.] I'll skip that — here are the crimes. I've got historical data for every crime that has occurred in San Francisco since 2003, and they're popping up one after the other over time. Right now we're still on January 1st, 2003, and we'll reach the end of the data in another 60 seconds or so. That line is the anomaly score. The thing about Austin's demo is that it was a little bit better, because he had arranged for the coordinate encoder's radius to be determined by the magnitude of the earthquake, and there's also a bit more of a temporal aspect there, because aftershocks are associated with previous earthquakes. There's really no such correlation between the crimes happening here — well, allegedly there are local correlations of that kind — but at least it's an example of taking a large dataset, over 1.7 million crimes from the past twelve years or so in San Francisco, and encoding the latitude and longitude plus the time: time of day, day of week, weekend versus weekday.
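The encoding he describes — latitude, longitude, and several time-derived fields — can be sketched as a plain-Python feature builder. The field names below are illustrative; the demo's actual NuPIC encoder configuration isn't shown in the talk:

```python
from datetime import datetime

def encode_crime(lat, lon, when):
    """Turn one crime record into the feature dict a model would consume:
    raw coordinates plus the time-of-day / day-of-week fields mentioned above."""
    return {
        "latitude": lat,
        "longitude": lon,
        "timeOfDay": when.hour + when.minute / 60.0,  # fractional hour, 0.0-23.98
        "dayOfWeek": when.weekday(),                   # 0 = Monday ... 6 = Sunday
        "isWeekend": when.weekday() >= 5,
    }

# The demo's playback starts on 2003-01-01; coordinates here are just
# downtown San Francisco, not a real record from the dataset.
record = encode_crime(37.7749, -122.4194, datetime(2003, 1, 1, 0, 17))
```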
But it's not that impressive. I don't think the anomaly scores mean a whole lot in this case, just because there's not really a decent association. I thought about using the seriousness of the crime, because the dataset had quite a bit of information: it had the area of town, the police precinct that was dispatched to deal with the crime, a categorization of the crime, and a short description. So I could have separated out petty larceny versus murder versus drug offenses, that sort of thing, or segregated things so that each area of the city had its own model — one model each — but I didn't have a lot of time, so this is basically it. There's some nasty stuff in this dataset, by the way. There's also a resolution field, and the very first crime that happened on January 1st was a child sexual-abuse case whose resolution was "DA would not prosecute," which was really disappointing. So it's an interesting dataset to do something with.

This is about as far as I got. After a while the output stops being interesting, because with the geospatial coordinates — and we're not doing any sequence resets or anything — the model eventually just gets comfortable with the entire area of the city. But you might have been able to improve it with a few simple changes. One was to make the learning rate much faster: you've got a million records, but maybe only let it track the last 2,000, and also only plot the last 2,000, so you could actually see something. Right now you can't see much, but if you plotted the last 100 or 200 you'd see the current pattern, and you'd see it if a new anomaly showed up. So there are a couple of really simple things: changing the learning rate might get you past the fact that after a while the model has seen everything — you really don't want history from ten years ago to dominate; what matters is what's been going on in the last few months. It might even have a law-enforcement use if you just ran it over the recent past, displaying continuously while it keeps learning, so you could put a dashboard up. I think this would be a really fun dataset to play with in the future. One thing that would be really cool is not an anomaly model but a prediction model, done per neighborhood: maybe a field that is the number of crimes in this region per hour or per day — you have a lot of data —
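The "only track / only plot the last 2,000" suggestion is just a fixed-size rolling window over the anomaly scores. A minimal sketch, with the window size taken from the talk and a simulated score stream standing in for the real model output:

```python
from collections import deque

WINDOW = 2000  # only keep / plot the most recent N anomaly scores

scores = deque(maxlen=WINDOW)

def record_score(score):
    """Append a new anomaly score; scores older than WINDOW fall off automatically."""
    scores.append(score)

# Feed in a million simulated scores; only the last 2,000 survive,
# so a plot of `recent` always shows the current pattern.
for i in range(1_000_000):
    record_score(i % 100 / 100.0)

recent = list(scores)
```

A `deque` with `maxlen` does the eviction for free, which is why it fits the "plot only the tail" idea so directly.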
— and see if you can predict when the next crime is going to happen in that area, like Minority Report-style policing.

[Audience:] That's a great idea. Of course you wouldn't expect to predict exactly where the next crime will happen, but you could establish the average expectation for a region and then ask whether you can do better than that. Say the Marina district has on average 30% of crimes of a certain kind: what you'd want is a model that gives you a likelihood score — the chance of a crime in that area — that you could score over time, to see how much you beat the plain baseline by.

[Matthew:] Right, that's a good question — that's sort of what we were seeing this morning.
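The baseline the questioner describes — the average crime share per region, which a likelihood model would have to beat — could be computed like this (region names and counts are made up for illustration; the real dataset's fields differ):

```python
from collections import Counter

def baseline_rates(events):
    """events: iterable of (region, hour) crime records.
    Returns each region's fraction of all crimes — the 'dumb' baseline
    a per-neighborhood prediction model would need to outperform."""
    counts = Counter(region for region, _hour in events)
    total = sum(counts.values())
    return {region: n / total for region, n in counts.items()}

# Toy data: 10 crimes across three invented regions.
events = [("Marina", 1), ("Marina", 2), ("Marina", 23),
          ("Tenderloin", 3), ("Tenderloin", 4), ("Tenderloin", 5), ("Tenderloin", 6),
          ("Mission", 12), ("Mission", 13), ("Mission", 14)]
rates = baseline_rates(events)
```

A model's per-region likelihood scores could then be compared against these static fractions over time to quantify any real predictive lift.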
[Audience:] Did the data include anything like police travel patterns or calls for service? The reason I ask is that there might be a correlation between those and the crimes.

[Matthew:] Sorry, no — we didn't have anything like that at all. It was just the neighborhood, the street area, the latitude and longitude — which is what I used — and the categories. There's more information you could do interesting stuff with, but it did not have that. OK, any more questions?

[MC:] The next one is about city services [introduction of the next demo, partly inaudible].

Metadata

Formal metadata

Title: San Francisco Crimes [DEMO #1]
Series title: 2015 Spring NuPIC Hackathon
Number of parts: 19
Author: Taylor, Matthew
License: CC Attribution 3.0 Unported:
You may use, adapt, and reproduce, distribute, and make the work or its content publicly available, in unchanged or adapted form, for any legal purpose, provided you credit the author/rights holder in the manner specified by them.
DOI: 10.5446/18062
Publisher: Numenta Platform for Intelligent Computing (NuPIC)
Publication year: 2015
Language: English

Content metadata

Subject area: Computer Science
Abstract: Analyzing the locations of crimes in San Francisco using temporal crime data since 2003.
