Simply put, it is about collecting data from various devices (of different types, protocols, complexity, etc.) in real time, then aggregating and processing this incoming stream of data to provide useful information. This can enormously improve efficiency and make a business stand out. At present, businesses may be implementing custom solutions to address such data handling and analysis scenarios.
This is going to become more streamlined, easier, and more cost-effective with the recently introduced Azure Stream Analytics. Azure Stream Analytics has all the well-known advantages of the Azure cloud: it is fully managed, performs real-time stream computation, is highly resilient, and is easy to implement and get started with. It can help not only larger corporations that want to move away from custom solutions, but also small businesses that cannot afford one.
All it takes is a few clicks to author a streaming job using a simplified SQL-like language and to monitor the outcome. According to Microsoft, stream rates from KB/sec up to GB/sec are accommodated. There is no custom code to write, as the query language is mostly declarative.
An associated product, Azure Event Hubs, is what Stream Analytics connects to in order to feed in all the data collected from the varied devices. The streaming nature of the data is compromised neither during this 'ingesting' process nor during the computational phase of the analytics.
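To give a flavour of that SQL-like language, here is a minimal sketch of a query a Stream Analytics job might run. The input alias SensorHub (an Event Hubs input configured on the job), the output alias AvgOutput, and all field names are assumptions made for illustration:

```sql
-- Hypothetical aliases: 'SensorHub' is an Event Hubs input configured on
-- the job, 'AvgOutput' is its output sink; field names are assumed.
-- Average each device's temperature over 10-second tumbling windows.
SELECT
    DeviceId,
    System.Timestamp AS WindowEnd,
    AVG(Temperature) AS AvgTemperature
INTO AvgOutput
FROM SensorHub TIMESTAMP BY EventTime
GROUP BY DeviceId, TumblingWindow(second, 10)
```

The TIMESTAMP BY clause tells the job to window on the event's own timestamp rather than on its arrival time, which matters when events arrive slightly out of order.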
The advertised key capabilities are the following:
- Ease of use: a declarative query model that insulates the customer from computational complexity
- Scalability: handles millions of events per second
- Reliable, repeatable, and quick recovery: guarantees zero data loss
- Low latency: optimized for sub-second latency with an adaptive pull-based model
- Reference data: treated very much like the incoming stream (see the join sketch after this list)
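As a sketch of how reference data might appear in a query (all input names and fields here are assumptions): a slowly changing device catalogue, configured as a reference-data input, can simply be joined to the live stream:

```sql
-- Hypothetical names: 'SensorHub' is the streaming input, 'DeviceCatalog'
-- a reference-data input (e.g., device metadata kept in blob storage).
-- Enrich each live event with the device's model from the catalogue.
SELECT
    s.DeviceId,
    d.DeviceModel,
    s.Temperature
FROM SensorHub s
JOIN DeviceCatalog d
    ON s.DeviceId = d.DeviceId
```

Unlike a join between two streams, a join against reference data needs no time-bounding condition, which is what "treated very much like the incoming stream" amounts to in practice.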
Typical scenarios mentioned include:
- Financial services: personalized stock-trading analysis and alerts
- Real-time fraud detection (see the sketch after this list)
- Real-time identity protection
- Web clickstream analytics
- Telemetry log analysis
- Event archival for future reference
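As one illustration of the fraud-detection scenario, Microsoft's own examples use a self-join over a stream of call records to spot the same SIM apparently placing calls from two different switches within seconds of each other. The input name 'CallStream' and its fields below are assumptions along those lines:

```sql
-- Hypothetical 'CallStream' input of call records. Flag a SIM (IMSI)
-- that originates calls from two different switches within 5 seconds,
-- which a single physical SIM card could not do.
SELECT
    CS1.CallingIMSI,
    CS1.SwitchNum AS Switch1,
    CS2.SwitchNum AS Switch2
FROM CallStream CS1 TIMESTAMP BY CallRecTime
JOIN CallStream CS2 TIMESTAMP BY CallRecTime
    ON CS1.CallingIMSI = CS2.CallingIMSI
    AND DATEDIFF(ss, CS1, CS2) BETWEEN 1 AND 5
WHERE CS1.SwitchNum != CS2.SwitchNum
```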
Read more here (as this post is a condensed version):
http://azure.microsoft.com/en-us/documentation/articles/stream-analytics-introduction/