Using Azure as a Universal Data Store

This architectural pattern is deliberately generic: it ingests data into the Azure cloud, where it can be processed, stored and analysed in a cost-effective, highly available and flexible way.

  1. The entry Logic App can extract data held on-premises, or can receive data directly through its own API.
  2. The payload or body of the data is immediately encapsulated in a file with a unique filename generated from a GUID, and stored in the highly efficient, highly available Azure Blob storage.
  3. The unique name is also submitted to an Azure Service Bus queue. The queue can be configured to retain the name for weeks if necessary.
  4. A second Logic App subscribes to the queue and receives each unique name as it appears.
  5. This Logic App then retrieves the file held in the Blob store, using the unique name as an identifier.
  6. The Logic App then processes the body data through a workflow, saving the result back to the Blob store.
  7. The unique name is then stored in either Azure Table storage or SQL Azure.
  8. A Web App can provide a dashboard to display the results, loading from the Blob store or from Table storage/SQL Azure.
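The steps above can be sketched in a few lines. This is a minimal in-memory simulation, not real SDK code: a dict stands in for Blob storage, a deque for the Service Bus queue, and another dict for Table storage, and the `.upper()` call is a placeholder for whatever workflow the second Logic App would actually run.

```python
import uuid
from collections import deque

blob_store = {}              # stands in for Azure Blob storage
service_bus_queue = deque()  # stands in for an Azure Service Bus queue
table_store = {}             # stands in for Azure Table storage


def ingest(payload: bytes) -> str:
    """Entry Logic App (steps 1-3): wrap the payload in a uniquely
    named blob and submit the name to the queue."""
    unique_name = str(uuid.uuid4())
    blob_store[unique_name] = payload
    service_bus_queue.append(unique_name)
    return unique_name


def process_next():
    """Second Logic App (steps 4-7): receive a name from the queue,
    fetch the blob, process it, save the result back, and record the
    name so a dashboard (step 8) can find it."""
    if not service_bus_queue:
        return None
    unique_name = service_bus_queue.popleft()
    body = blob_store[unique_name]
    processed = body.upper()          # placeholder for the real workflow
    blob_store[unique_name] = processed
    table_store[unique_name] = "processed"
    return unique_name
```

Decoupling the two halves through the queue is what lets the consumer run at its own pace: if the second Logic App is down, the names simply wait in the queue until it returns.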

Here is an example of this pattern. Imagine that a UK airport needs a way of monitoring which cars are entering and leaving. Cameras can be fitted with software that detects the number plate and any other details. As each vehicle arrives, the system receives its image and its data into the Blob store and Service Bus respectively. When the system later detects the vehicle departing, it finds the matching arrival record and can then apply rules to determine whether any action should be taken.
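The matching step might look like the following sketch. The record store, the `max_stay` threshold and the "overstay" rule are all hypothetical illustrations; a real deployment would hold the arrival records in Table storage or SQL and apply whatever rules the airport requires.

```python
from datetime import datetime, timedelta

arrivals = {}  # plate -> arrival record; stands in for Table storage


def record_arrival(plate: str, when: datetime, image_blob: str) -> None:
    """Store the arrival, keyed on the number plate read by the camera."""
    arrivals[plate] = {"arrived": when, "image_blob": image_blob}


def record_departure(plate: str, when: datetime,
                     max_stay: timedelta = timedelta(hours=24)) -> str:
    """Find the matching arrival and decide whether action is needed."""
    arrival = arrivals.pop(plate, None)
    if arrival is None:
        return "no-matching-arrival"   # departure with no recorded entry
    if when - arrival["arrived"] > max_stay:
        return "overstay"              # illustrative rule: flag long stays
    return "ok"
```

For example, a car that arrives at 08:00 and leaves at 17:00 the same day would return "ok", while one that leaves two days later would be flagged as "overstay".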
