# How Event Hubs Capture integrates with Event Grid and Azure Functions

One of the key scenarios for modern cloud-scale apps is seamless integration and notification among apps and services. In the announcement blog post, we introduced [Azure EventGrid](https://azure.microsoft.com/blog/introducing-azure-event-grid-an-event-service-for-modern-applications/) (in public preview), a service designed just for that!
Today, we will go over a realistic scenario of capturing Azure EventHub data into a SQL Data Warehouse and demonstrate the power and simplicity of using [Azure EventGrid](https://docs.microsoft.com/azure/event-grid/overview) to achieve this.

So, fasten your seat belts…
### What is covered in this tutorial:

![Overview diagram](./media/EventGridIntegrationOverview.PNG)

* First, we turn on Capture on the Azure Event Hub with Azure Blob storage as the destination. This generates Azure Storage blobs containing the EventHub data in Avro format.
* Next, we create an Azure EventGrid subscription with the Azure EventHub namespace as the source and an Azure Function as the destination.
* Whenever a new Avro blob is generated by Azure EventHub Capture, Azure EventGrid notifies, or shoulder-taps, the Azure Function with information about the blob (the blob path, among other details). The Function then does the required processing to load the data into a SQL Data Warehouse (a minimal sketch of such a function appears below).

That is it! There are no worker services involved in polling for these Avro blobs. This means no management overhead and significantly lower COGS, especially in a cloud-scale production environment!
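To make the shoulder-tap concrete, here is a minimal sketch of what an EventGrid-triggered function handling the Capture notification could look like. It assumes the Azure Functions Event Grid trigger binding (Microsoft.Azure.WebJobs.Extensions.EventGrid) and Json.NET, and it only logs the blob URL; it illustrates the pattern and is not the actual FunctionDWDumper implementation.

```csharp
using Microsoft.Azure.EventGrid.Models;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json.Linq;

public static class CaptureBlobNotificationHandler
{
    // Runs once for every event that EventGrid delivers to this function.
    [FunctionName("CaptureBlobNotificationHandler")]
    public static void Run([EventGridTrigger] EventGridEvent eventGridEvent, ILogger log)
    {
        // For EventHub Capture, the event data carries the URL of the newly written Avro blob.
        var data = eventGridEvent.Data as JObject;
        var avroBlobUrl = (string)data?["fileUrl"];

        log.LogInformation($"Capture blob created: {avroBlobUrl}");

        // The real processing would download the Avro blob, deserialize the
        // wind turbine events, and bulk-load them into the SQL Data Warehouse table.
    }
}
```

Even in this stub the key property of the design is visible: the function never polls storage; it only runs when EventGrid delivers a notification.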

The sample code for this scenario consists of a solution with projects that do the following:
1. WindTurbineDataGenerator – a simple publisher that sends sample wind turbine data to your EventHub, which has Capture enabled on it.
2. FunctionDWDumper – an Azure Functions project that receives the EventGrid notification for each Capture Avro blob created, gets the blob's URI, reads its contents, and pushes the data to a SQL Data Warehouse.
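For orientation, the publisher side can be as small as the sketch below. It assumes the Microsoft.Azure.EventHubs client and Json.NET; the connection string and the payload shape (TurbineId, WindSpeedMph) are placeholders rather than the actual schema used by WindTurbineDataGenerator.

```csharp
using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;
using Newtonsoft.Json;

class Program
{
    // Placeholder: connection string for the Capture-enabled event hub (EntityPath included).
    private const string EventHubConnectionString =
        "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<key-name>;SharedAccessKey=<key>;EntityPath=<event-hub>";

    static async Task Main()
    {
        var client = EventHubClient.CreateFromConnectionString(EventHubConnectionString);
        var random = new Random();

        // Send a batch of simulated wind turbine readings as JSON events.
        for (var i = 0; i < 100; i++)
        {
            var reading = new
            {
                TurbineId = $"Turbine-{i % 10}",
                WindSpeedMph = random.Next(5, 50),
                Timestamp = DateTime.UtcNow
            };

            var payload = JsonConvert.SerializeObject(reading);
            await client.SendAsync(new EventData(Encoding.UTF8.GetBytes(payload)));
        }

        await client.CloseAsync();
    }
}
```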

# Prerequisites
* [Visual Studio 2017 Version 15.3.2 or greater](https://www.visualstudio.com/vs/)
![Visual Studio](./media/EventCaptureGridDemo1.png)

# Detailed steps
### Overview:
1. Deploy the infrastructure for this solution
2. Create a table in SQL Data Warehouse
3. Publish code to the Functions App
6. Build the solution, then run the WindTurbineGenerator.exe application.

## 6. Observe the Captured data that has been migrated to your SQL Data Warehouse table by the Azure Function
After a couple of minutes, query the table in your data warehouse. You will see that the data generated by the WindTurbineDataGenerator has been streamed to your Event Hub, captured into an Azure Storage container, and then migrated into the SQL Data Warehouse table by the Azure Function.
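To spot-check the migration, you can run a quick row count against the table, either from the query editor or from a small console app like the sketch below. The connection string and the dbo.WindTurbineData table name are placeholders; substitute the table you created earlier.

```csharp
using System;
using System.Data.SqlClient;

class VerifyMigratedData
{
    static void Main()
    {
        // Placeholders: point this at your SQL Data Warehouse and the table you created.
        const string connectionString =
            "Server=tcp:<server>.database.windows.net,1433;Database=<data-warehouse>;User ID=<user>;Password=<password>;Encrypt=True;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT COUNT(*) FROM dbo.WindTurbineData", connection))
        {
            connection.Open();
            var rowCount = (int)command.ExecuteScalar();
            Console.WriteLine($"Rows migrated so far: {rowCount}");
        }
    }
}
```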

## Next steps
You can use powerful data visualization tools with your data warehouse to gain actionable insights.
