This architectural pattern is deliberately generic: it ingests data into the Azure cloud, where the data can be processed, stored and analysed in a cost-effective, highly available and flexible way.
- The entry Logic App can extract data that exists on-premises, or can receive data directly through its own API.
- The payload, or body, of the data is immediately wrapped in a file with a unique filename generated from a GUID, and stored in the highly efficient, highly available Azure Blob storage.
- The unique name is also submitted to an Azure Service Bus queue. The queue can be configured to retain the unique name for weeks if necessary.
- A second Logic App subscribes to the queue and receives each unique name as it appears.
- The Logic App then retrieves the file held in the Blob store, using the unique name as an identifier.
- The Logic App then processes the body data through a workflow, saving the result back to the Blob store.
- The unique name is then stored in either Azure Table storage or SQL Azure.
- A Web App can provide a dashboard to display the results, loading data from the Blob store or Table storage/SQL Azure.
Here is an example of this pattern. Imagine that a UK airport needs a way of monitoring which cars are entering and leaving. Cameras can be fitted with software that detects the number plate and any other available detail. As each vehicle arrives, the system receives its image and data into the Blob store and Service Bus respectively. When the system later detects the vehicle's departure, it finds the matching arrival record and applies rules that determine whether any action should be taken.
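The arrival/departure matching step can be sketched in isolation. This hedged Python sketch assumes a simple dict keyed by number plate and one illustrative rule (flagging long stays); the plate values, time thresholds, and rule logic are invented for illustration, and the real system would load arrival records from the Table store or SQL Azure rather than from memory.

```python
from datetime import datetime, timedelta

# Hypothetical arrival records as the camera system might emit them;
# in the real pattern these would be looked up via the unique names
# stored in Table storage or SQL Azure.
arrivals = {}   # number plate -> arrival time

def record_arrival(plate: str, when: datetime) -> None:
    arrivals[plate] = when

def record_departure(plate: str, when: datetime) -> list[str]:
    """Match a departure to its arrival record and apply rules that
    decide whether any action should be taken."""
    actions = []
    arrived = arrivals.pop(plate, None)
    if arrived is None:
        # No matching arrival: the vehicle was never seen entering.
        actions.append(f"{plate}: no matching arrival, flag for review")
        return actions
    stay = when - arrived
    if stay > timedelta(hours=24):
        # Illustrative rule: long stays trigger a payment check.
        actions.append(f"{plate}: long stay of {stay}, check payment")
    return actions

record_arrival("AB12 CDE", datetime(2024, 5, 1, 8, 0))
alerts = record_departure("AB12 CDE", datetime(2024, 5, 2, 9, 30))
```

Because the rules run only when a departure is matched, new rules can be added without touching the ingestion side of the pattern at all.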