docs: fixes links to code examples #220

Merged
20 changes: 10 additions & 10 deletions docs/BUILDING-BLOCKS.md
@@ -9,7 +9,7 @@ We strongly suggest that you build your sources out of existing **building blocks**:
 * [Connect the transformers to the resources](https://dlthub.com/docs/general-usage/resource#feeding-data-from-one-resource-into-another) to load additional data or enrich it
 * [Create your resources dynamically from data](https://dlthub.com/docs/general-usage/source#create-resources-dynamically)
 * [Append, replace and merge your tables](https://dlthub.com/docs/general-usage/incremental-loading)
-* [Transform your data before loading](https://dlthub.com/docs/general-usage/resource#customize-resources) and see some [examples of customizations like column renames and anonymization](https://dlthub.com/docs/customizations/customizing-pipelines/renaming_columns)
+* [Transform your data before loading](https://dlthub.com/docs/general-usage/resource#customize-resources) and see some [examples of customizations like column renames and anonymization](https://dlthub.com/docs/general-usage/customising-pipelines/renaming_columns)
 * [Set up "last value" incremental loading](https://dlthub.com/docs/general-usage/incremental-loading#incremental-loading-with-last-value)
 * [Dispatch data to several tables from a single resource](https://dlthub.com/docs/general-usage/resource#dispatch-data-to-many-tables)
 * [Set primary and merge keys, define the columns nullability and data types](https://dlthub.com/docs/general-usage/resource#define-schema)
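The "feed data from one resource into another" pattern in the list above can be sketched without dlt itself. This is an illustrative stand-in using plain generators: in real dlt code the pieces are `@dlt.resource`, `@dlt.transformer`, and the `|` operator, and the names below (`deals`, `deal_participants`, `pipe`) are hypothetical.

```python
def deals():
    # parent resource: yields raw records (a stand-in for an API call)
    yield {"id": 1, "title": "Deal A"}
    yield {"id": 2, "title": "Deal B"}

def deal_participants(deal):
    # transformer: called once per parent item, yields enriched child records
    yield {"deal_id": deal["id"], "participant": f"user-{deal['id']}"}

def pipe(resource, transformer):
    # minimal stand-in for dlt's `resource | transformer` piping
    for item in resource:
        yield from transformer(item)

rows = list(pipe(deals(), deal_participants))
```

Because the transformer runs per parent item, the child endpoint is only called for deals the parent resource actually yielded.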
@@ -22,14 +22,14 @@ Concepts to grasp
 * [How we distribute sources to our users](DISTRIBUTION.md)
 
 Building blocks used right:
-* [Create dynamic resources for tables by reflecting a whole database](https://github.com/dlt-hub/verified-sources/blob/master/sources/sql_database/sql_database.py#L50)
-* [Incrementally dispatch github events to separate tables](https://github.com/dlt-hub/verified-sources/blob/master/sources/github/__init__.py#L70)
-* [Read the participants for each deal using transformers and pipe operator](https://github.com/dlt-hub/verified-sources/blob/master/sources/pipedrive/__init__.py#L67)
+* [Create dynamic resources for tables by reflecting a whole database](https://github.com/dlt-hub/verified-sources/blob/master/sources/sql_database/__init__.py#L56)
+* [Incrementally dispatch github events to separate tables](https://github.com/dlt-hub/verified-sources/blob/master/sources/github/__init__.py#L91-L95)
+* [Read the participants for each deal using transformers and pipe operator](https://github.com/dlt-hub/verified-sources/blob/master/sources/pipedrive/__init__.py#L85-L88)
 * [Read the events for each ticket by attaching transformer to resource explicitly](https://github.com/dlt-hub/verified-sources/blob/master/sources/hubspot/__init__.py#L125)
-* [Set `tags` column data type to complex to load them as JSON/struct](https://github.com/dlt-hub/verified-sources/blob/master/sources/zendesk/__init__.py#L108)
-* Typical use of `merge` with incremental load for endpoints returning a list of updates to entities in [Shopify source](https://github.com/dlt-hub/verified-sources/blob/master/sources/shopify_dlt/__init__.py#L36).
-* A `dlt` mega-combo in `pipedrive` source, where the deals from `deal` endpoint are [fed into](https://github.com/dlt-hub/verified-sources/blob/master/sources/pipedrive/__init__.py#L113) `deals_flow` resource to obtain events for a particular deal. [Both resources use `merge` write disposition and incremental load to get just the newest updates](https://github.com/dlt-hub/verified-sources/blob/master/sources/pipedrive/__init__.py#L103). [The `deals_flow` is dispatching different event types to separate tables with `dlt.mark.with_table_name`](https://github.com/dlt-hub/verified-sources/blob/master/sources/pipedrive/__init__.py#L135).
-* An example of using JSONPath expression to get cursor value for incremental loading. In pipedrive some objects have `timestamp` property and others `update_time`. [The dlt.sources.incremental('update_time|modified') expression lets you bind the incremental to either](https://github.com/dlt-hub/verified-sources/blob/master/sources/pipedrive/recents.py#L39).
-* If your source/resource needs google credentials, just use `dlt` built-in credentials as we do in [google sheets](https://github.com/dlt-hub/verified-sources/blob/master/sources/google_sheets/__init__.py#L26) and [google analytics](https://github.com/dlt-hub/verified-sources/blob/master/sources/google_analytics/__init__.py#L32). Also note how `credentials.to_native_credentials()` is used to initialize google api client.
+* [Set `tags` column data type to complex to load them as JSON/struct](https://github.com/dlt-hub/verified-sources/blob/master/sources/zendesk/__init__.py#L254-L257)
+* Typical use of `merge` with incremental loading for endpoints returning a list of updates to entities in [Shopify source](https://github.com/dlt-hub/verified-sources/blob/master/sources/shopify_dlt/__init__.py#L41).
+* A `dlt` mega-combo in `pipedrive` source, where the deals from `deal` endpoint are [fed into](https://github.com/dlt-hub/verified-sources/blob/master/sources/pipedrive/__init__.py#L90-L92) `deals_flow` resource to obtain events for a particular deal. [Both resources use `merge` write disposition and incremental load to get just the newest updates](https://github.com/dlt-hub/verified-sources/blob/master/sources/pipedrive/__init__.py#L77-L80). [The `deals_flow` is dispatching different event types to separate tables with `dlt.mark.with_table_name`](https://github.com/dlt-hub/verified-sources/blob/master/sources/pipedrive/__init__.py#L105).
+* An example of using JSONPath expression to get cursor value for incremental loading. In pipedrive some objects have `timestamp` property and others `update_time`. [The dlt.sources.incremental('update_time|modified') expression lets you bind the incremental to either](https://github.com/dlt-hub/verified-sources/blob/master/sources/pipedrive/helpers/pages.py#L46-L48).
+* If your source/resource needs google credentials, just use `dlt` built-in credentials as we do in [google sheets](https://github.com/dlt-hub/verified-sources/blob/master/sources/google_sheets/__init__.py#L27) and [google analytics](https://github.com/dlt-hub/verified-sources/blob/master/sources/google_analytics/__init__.py#L28). Also note how `credentials.to_native_credentials()` is used to initialize google api client.
 * If your source/resource accepts several different credential types look how [we deal with 3 different types of Zendesk credentials](https://github.com/dlt-hub/verified-sources/blob/master/sources/zendesk/helpers/credentials.py#L10)
-* See database connection string credentials [applied to sql_database source](https://github.com/dlt-hub/verified-sources/blob/master/sources/sql_database/sql_database.py#L22)
+* See database connection string credentials [applied to sql_database source](https://github.com/dlt-hub/verified-sources/blob/master/sources/sql_database/__init__.py#L22)
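Several of the patterns linked above ("last value" incremental loading, the `update_time|modified` alternative-cursor expression, and `merge` write disposition) can be sketched as a hedged, dlt-free stand-in. In real dlt code these are `dlt.sources.incremental(...)` and `write_disposition="merge"` with a `primary_key`; the helper names below are illustrative only, not dlt API.

```python
def cursor_value(row, expr):
    # "a|b" means: use field a if the row has it, otherwise fall back to b
    for name in expr.split("|"):
        if name in row:
            return row[name]
    raise KeyError(expr)

def incremental(rows, cursor_expr, last_value):
    # "last value" filter: keep only rows newer than the stored cursor
    return [r for r in rows if cursor_value(r, cursor_expr) > last_value]

def merge(table, rows, primary_key):
    # merge write disposition as an upsert: an incoming row replaces
    # the stored row that shares its primary key
    by_key = {r[primary_key]: r for r in table}
    for r in rows:
        by_key[r[primary_key]] = r
    return list(by_key.values())

stored = [{"id": 1, "update_time": "2023-01-01", "status": "open"}]
updates = [
    {"id": 1, "update_time": "2023-03-01", "status": "won"},
    {"id": 2, "modified": "2023-02-01", "status": "open"},
]
fresh = incremental(updates, "update_time|modified", "2023-01-15")
table = merge(stored, fresh, "id")
```

Run end to end, both updates pass the cursor filter (one via `update_time`, one via `modified`), and the merge step deduplicates on `id`, so deal 1 ends up in its newest `won` state.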