Analysing Big Time-series Data in the Cloud

Formal Metadata

Title
Analysing Big Time-series Data in the Cloud
Title of Series
Number of Parts
96
Author
License
CC Attribution - NonCommercial - ShareAlike 3.0 Unported:
You are free to use, adapt, copy, distribute and transmit the work or content in adapted or unchanged form for any legal and non-commercial purpose, as long as the work is attributed to the author in the manner specified by the author or licensor, and the work or content, if shared in adapted form, is shared only under the conditions of this license.
Identifiers
Publisher
Release Date
Language

Content Metadata

Subject Area
Genre
Abstract
Working with small time-series data is fun. You can easily load daily Microsoft stock prices into memory and find the most successful year in its history. Or you can download average daily temperatures in your city over the last 10 years and try to spot a trend in a chart. But what if you have prices at millisecond frequency for thousands of stocks, or high-resolution temperatures for the entire globe? With the right tools, working with massive time-series data can feel the same as crunching through hundreds of observations in memory. In this talk, I will show what's available if you are using R, the .NET platform and Azure. We'll use Deedle, a scalable .NET data analytics library; the R type provider, which makes thousands of R packages available to .NET developers; and MBrace, a cloud computing framework that can easily scale your data analytics over an Azure compute cluster.
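The "find the most successful year" example from the abstract is done in the talk with Deedle on .NET; as an analogous sketch in pandas (a swapped-in library, not the talk's code), the same idea is to group daily closing prices by year and compare first-to-last returns. The price series here is synthetic, purely for illustration.

```python
import numpy as np
import pandas as pd

# Hypothetical daily closing prices (synthetic random walk, not real MSFT data).
dates = pd.date_range("2010-01-01", "2014-12-31", freq="B")
rng = np.random.default_rng(42)
prices = pd.Series(
    100 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, len(dates)))),
    index=dates,
)

# Yearly return: last close of each year relative to its first close.
yearly = prices.groupby(prices.index.year).agg(lambda s: s.iloc[-1] / s.iloc[0] - 1)

# The "most successful" year is the one with the highest return.
best_year = yearly.idxmax()
print(best_year, yearly.max())
```

Deedle's `Series` supports the same pattern via grouping and aggregation over a date-keyed series; the point of the talk is that the identical high-level code scales from this in-memory case to cluster-sized data via MBrace.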