Adding more edits
xurui203 committed Aug 7, 2018
1 parent 57e7846 commit 10fd001
Showing 1 changed file with 10 additions and 10 deletions: samples/e2e/EventHubsCaptureEventGridDemo/Readme.md
One of the key scenarios for modern cloud scale apps is seamless integration and …

![Event Grid integration overview](./media/EventGridIntegrationOverview.PNG)

* First, we turn on Capture on the Azure Event Hub, with Azure Blob storage as the destination. Data generated by WindTurbineDataGenerator is streamed into the Event Hub and automatically captured into Azure Storage as Avro files.
* Next, we create an Azure Event Grid subscription with the Event Hubs namespace as its source and the Azure Function endpoint as its destination.
* Whenever a new Avro file is written by Event Hubs Capture, Azure Event Grid notifies the Azure Function with the blob URI. The Function then does the required processing to migrate the data from the Storage blob to a SQL Data Warehouse.
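For illustration, the notification the Function receives can be parsed as in the Python sketch below (the actual sample is a C# Azure Function; the `Microsoft.EventHub.CaptureFileCreated` event type and `data.fileUrl` field follow the published Event Grid schema, but treat the exact payload here as an assumption):

```python
import json

# Hedged sketch: extract the Capture blob URL from an Event Grid
# notification. The payload below is a made-up example, not real data.
sample_notification = json.dumps([{
    "topic": "/subscriptions/.../namespaces/demo-ns",
    "eventType": "Microsoft.EventHub.CaptureFileCreated",
    "data": {
        "fileUrl": "https://demostorage.blob.core.windows.net/"
                   "windturbinecapture/demo-ns/hub/0/2018/08/07/12/00/00.avro",
        "partitionId": "0",
        "sizeInBytes": 8192,
        "eventCount": 100,
    },
}])

def capture_blob_urls(body: str) -> list:
    """Return the Avro blob URL from each CaptureFileCreated event."""
    return [event["data"]["fileUrl"]
            for event in json.loads(body)
            if event["eventType"] == "Microsoft.EventHub.CaptureFileCreated"]

urls = capture_blob_urls(sample_notification)
print(urls)
```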

There are no worker services polling for these Avro files, which eliminates management overhead and significantly lowers COGS, especially in a cloud-scale production environment!

This sample solution contains projects that do the following:
1. *WindTurbineDataGenerator*: a simple publisher that sends wind turbine data to a Capture-enabled Event Hub.
1. *FunctionDWDumper*: an Azure Function that receives an Event Grid notification when an Avro file is captured to an Azure Storage blob. It receives the blob's URI, reads its contents, and pushes the data to a SQL Data Warehouse.
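As a rough illustration of the publisher's role (the actual generator is a C# project), the kind of per-turbine payload it might send can be sketched in Python; every field name and unit below is an assumption, not the sample's exact schema:

```python
import json
import random
import time

def make_turbine_event(turbine_id: str) -> dict:
    # Illustrative payload only: field names and units are assumptions,
    # not the sample's exact schema.
    return {
        "DeviceId": turbine_id,
        "MeasureTime": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "GeneratedPower": round(random.uniform(1.5, 3.0), 2),  # MW
        "WindSpeed": round(random.uniform(4.0, 25.0), 1),      # m/s
    }

# A real publisher would send these bodies to the Event Hub;
# here we just serialize a small batch.
batch = [json.dumps(make_turbine_event(f"Turbine-{i}")) for i in range(3)]
print(batch[0])
```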

# Prerequisites
* [Visual Studio 2017 Version 15.3.2 or greater](https://www.visualstudio.com/vs/)
5. Run WindTurbineDataGenerator.exe to generate data streams to the Event Hub.
6. Observe the Captured data migrate to your SQL Data Warehouse table.
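The end-to-end flow those steps describe can be sketched as a small in-process simulation with no Azure dependencies; every name here is an illustrative stand-in, not an Azure API or the sample's C# code:

```python
# In-process simulation of the demo's flow: generate events, "capture"
# them to a blob, notify a function, and load a warehouse table.
captured_blobs = {}    # stands in for the Capture Storage container
warehouse_rows = []    # stands in for the SQL Data Warehouse table

def capture(blob_path: str, events: list) -> dict:
    """Event Hubs Capture stand-in: persist a batch, emit a notification."""
    captured_blobs[blob_path] = events
    return {"eventType": "Microsoft.EventHub.CaptureFileCreated",
            "data": {"fileUrl": blob_path}}

def function_dw_dumper(notification: dict) -> None:
    """FunctionDWDumper stand-in: read the blob, load rows into the table."""
    warehouse_rows.extend(captured_blobs[notification["data"]["fileUrl"]])

events = [{"turbine_id": i, "power_mw": 2.0} for i in range(5)]
notification = capture("demo-ns/hub/0/2018/08/07/12/00/00.avro", events)
function_dw_dumper(notification)
print(len(warehouse_rows))
```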

## 1. Deploy the infrastructure
Deploy the infrastructure needed for this tutorial by using this [Azure Resource Manager template](https://raw.githubusercontent.com/Azure/azure-docs-json-samples/master/event-grid/EventHubsDataMigration.json). This creates the following resources:
- Event Hub with Capture enabled
- Storage account for the files from Capture
You have now set up your Event Hub, SQL data warehouse, Azure Function App, and …

6. Build the solution, then run the WindTurbineDataGenerator.exe application.

## 6. Observe the Captured data migrate to your SQL Data Warehouse table
After a couple of minutes, query the table in your data warehouse. You will see that the data generated by WindTurbineDataGenerator has been streamed to your Event Hub, captured into an Azure Storage container, and then migrated into the SQL Data Warehouse table by the Azure Function.

## Next steps
You can use powerful data visualization tools with your data warehouse to derive actionable insights.