diff --git a/guide/blueprints/workflow/common.md b/guide/blueprints/workflow/common.md
index c790833b..bfc4fba8 100644
--- a/guide/blueprints/workflow/common.md
+++ b/guide/blueprints/workflow/common.md
@@ -50,6 +50,8 @@ steps:
     not: { equals: true }
 ```
 
+Care should be taken when using `:` in a step written in shorthand: if the step is not quoted, YAML will parse it as a map. At runtime, however, if a step looks like it came from an accidental colon creating a map, it is reverted to a string with the colon re-introduced, so you can still write shorthand steps such as `- log Your name is: ${name}`.
+
 All steps support a number of common properties, described below.
 
 ### Explicit IDs and Name
diff --git a/guide/blueprints/workflow/examples/aws-cfn-stacks/aws-cfn-type.bom b/guide/blueprints/workflow/examples/aws-cfn-stacks/aws-cfn-type.bom
new file mode 100644
index 00000000..5db98302
--- /dev/null
+++ b/guide/blueprints/workflow/examples/aws-cfn-stacks/aws-cfn-type.bom
@@ -0,0 +1,15 @@
+brooklyn.catalog:
+  bundle: aws-cfn-discovery-sample
+  version: 1.0.0-SNAPSHOT
+  items:
+    - id: aws-cfn-discovered-stack-sample
+      item:
+        type: org.apache.brooklyn.entity.stock.BasicEntity
+        brooklyn.initializers:
+          - type: workflow-effector
+            name: on_update
+            steps:
+              - set-entity-name ${item.StackName}
+              - set-sensor data = ${item}
+              - set-sensor stack_status = ${item.StackStatus}
+              # above is just a start, you can check drift, explore resources, etc
diff --git a/guide/blueprints/workflow/examples/aws-cfn-stacks/aws-discoverer.yaml b/guide/blueprints/workflow/examples/aws-cfn-stacks/aws-discoverer.yaml
new file mode 100644
index 00000000..a04c0fb7
--- /dev/null
+++ b/guide/blueprints/workflow/examples/aws-cfn-stacks/aws-discoverer.yaml
@@ -0,0 +1,16 @@
+name: AWS CloudFormation Discoverer
+
+services:
+  - type: workflow-software-process
+    location: localhost
+    name: Stacks
+
+    brooklyn.policies:
+      - type: workflow-policy
+        brooklyn.config:
+          name: periodically update children
+          period: 1m
+          steps:
+            - ssh aws cloudformation describe-stacks
+            - transform stdout | json | set describe_stacks_output_json
+            - update-children type aws-cfn-discovered-stack-sample id ${item.StackId} from ${describe_stacks_output_json.Stacks}
diff --git a/guide/blueprints/workflow/examples/aws-cfn-stacks/index.md b/guide/blueprints/workflow/examples/aws-cfn-stacks/index.md
new file mode 100644
index 00000000..9c5a9465
--- /dev/null
+++ b/guide/blueprints/workflow/examples/aws-cfn-stacks/index.md
@@ -0,0 +1,35 @@
+---
+title: AWS CloudFormation Stack Discovery
+title_in_menu: AWS CFN Stacks
+layout: website-normal
+---
+
+The `update-children` step makes it straightforward to keep an Apache Brooklyn model
+in sync with external resources, whether from a cloud, GitHub or Jira tickets, or any other data source you choose.
+The Brooklyn blueprint can then be used to attach management logic, including for example
+automatically deploying branched resources into ephemeral test environments.
+
+This example shows how CloudFormation stacks in AWS can be synchronized.
+
+Firstly, we define our type to represent a discovered stack, with an `on_update` workflow to refresh it:
+
+{% highlight yaml %}
+{% readj aws-cfn-type.bom %}
+{% endhighlight %}
+
+This should be added to the catalog.
+
+We can then deploy our Brooklyn application to discover and monitor stacks:
+
+{% highlight yaml %}
+{% readj aws-discoverer.yaml %}
+{% endhighlight %}
+
+Create and delete stacks, and see them update in Brooklyn.
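+
+As a flavor of what the `on_update` workflow can do, the drift summary that `describe-stacks` already returns
+could be surfaced as an extra sensor on each child; this is a sketch only, and the `drift_status` sensor name
+is purely illustrative:
+
+{% highlight yaml %}
+steps:
+  - set-entity-name ${item.StackName}
+  - set-sensor data = ${item}
+  - set-sensor stack_status = ${item.StackStatus}
+  # DriftInformation.StackDriftStatus is part of the `describe-stacks` response, e.g. NOT_CHECKED, IN_SYNC or DRIFTED
+  - set-sensor drift_status = ${item.DriftInformation.StackDriftStatus}
+{% endhighlight %}
+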
+Then consider:
+
+* Modify the `ssh aws` step in the "discoverer" to filter based on your preferred tags.
+* Use the `transform ... | merge` operator to combine lists from different regions.
+* Add other policies to check for drift on stacks and show failures in AMP if there is drift.
+* Create a similar workflow to monitor pull requests using the `gh` CLI;
+  then create, update, delete, and track ephemeral test deployments based on those.
diff --git a/guide/blueprints/workflow/examples/index.md b/guide/blueprints/workflow/examples/index.md
index 3ea6c5ad..bc1ef878 100644
--- a/guide/blueprints/workflow/examples/index.md
+++ b/guide/blueprints/workflow/examples/index.md
@@ -6,6 +6,7 @@ children:
 - ansible-bash/
 - git-latest/
 - oauth/
+- aws-cfn-stacks/
 ---
 
 The following examples are available:
diff --git a/guide/blueprints/workflow/nested-workflow.md b/guide/blueprints/workflow/nested-workflow.md
index ee64804c..aa08992c 100644
--- a/guide/blueprints/workflow/nested-workflow.md
+++ b/guide/blueprints/workflow/nested-workflow.md
@@ -56,7 +56,7 @@ as follows:
   inclusive (so the string `1..4` is equivalent to the list `[1,2,3,4]`)
 
 The scratch variables `target` and `target_index` are available referring to the specific target
-and its 0-indexed position.
+and its 0-indexed position. These names can be overridden with the `target_var_name` and `target_index_var_name` keys.
 
 Where a list is supplied, the result of the step is the list collecting the output of each sub-workflow.
@@ -64,6 +64,45 @@ If a `condition` is supplied when a list is being used, the `workflow` step will
 and the `condition` will be applied to entries in the list.
 An example of this is included below.
 
+The `foreach` type is a simplified variant of `workflow` for recursing over a list,
+taking the same inputs.
+
+#### Example
+
+```
+- step: foreach x in 1..3
+  steps:
+    - return ${x}
+```
+
+The above loop will return `[1,2,3]`.
+
+
+### Reducing
+
+Each nested workflow runs in its own scope and does not share workflow variables with the parent,
+apart from values specified as `input`, or with other iterations of a loop.
+Where it is desired to share variables across iterations, the key `reducing` can be supplied,
+giving a map of variable names to be shared and their initial values.
+
+When `reducing`, the output of the workflow is this set of variables with their final values.
+
+
+#### Example
+
+```
+- step: foreach x in 1..3
+  reducing:
+    sum: 0
+  steps:
+    - let sum = ${sum} + ${x}
+```
+
+The above loop will return `6`.
+
+
+### Concurrency
+
 By default nested workflows with list targets run sequentially over the entries,
 but this can be varied by setting `concurrency`. The following values are allowed:
@@ -80,6 +119,8 @@ and always allowing 1. This might be used for example to upgrade a cluster in
 situ, leaving the larger of 10 instances or half the cluster alone, if possible.
 If the concurrency expression evaluates to 0, or to a negative number whose absolute value is larger than the number of values,
 the step will fail before executing, to ensure that if e.g. "-10" is specified when there are fewer than 10 items in the target list,
 the workflow does not run. (Use "max(1, -10)" to allow it to run 1 at a time if there are 10 or fewer.)
 
+Note: Concurrency cannot be specified when `reducing`.
+
 #### Example
 
 This example invokes an effector on all children which are `service.isUp`,
@@ -115,6 +156,11 @@ It also accepts the standard step keys such as `input`, `timeout` on `on-error`.
 A user of the defined step type can also supply `output` which, as per other steps,
 is evaluated in the context of the outer workflow, with visibility of the output from the current step.
 
+When supplying a workflow in contexts where a `workflow` is already expected,
+such as in a config key that takes a `workflow` (a Java `CustomWorkflowStep`),
+it is not necessary to specify the `type: workflow`, and additionally, if the only thing being set is `steps`, those steps can be provided as a list without the `steps` keyword.
+Internally a _list_ will coerce to a `workflow` by interpreting the list as the steps.
+
 
 #### Shorthand Template Syntax
 
@@ -182,7 +228,7 @@ boolean isDefaultIdempotent();
 ```
 
 The first of these does the work of the step, resolving inputs and accessing context as needed via `context`.
-The second handles providing a cusotm shorthand, as described above;
+The second handles providing a custom shorthand, as described above;
 it can call to a superclass method `populateFromShorthandTemplate(TEMPLATE, value)`
 with the `TEMPLATE` for the class, if shorthand is to be supported.
 Finally, the third returns whether the step is idempotent, that is if the custom step is interrupted,
diff --git a/guide/blueprints/workflow/settings.md b/guide/blueprints/workflow/settings.md
index 3fef5709..6bc480a8 100644
--- a/guide/blueprints/workflow/settings.md
+++ b/guide/blueprints/workflow/settings.md
@@ -10,7 +10,7 @@ Some of the common properties permitted on [steps](common.md) also apply to work
 including `condition`, `timeout`, and `on-error`.
 
 The rest of this section describes the remaining properties for more advanced use cases
-including mutex locking and resilient workflows with replay points.
+including mutex locking and resilient workflows with replay points, and some tips on optimizing.
 
 ## Locks and Mutual Exclusion Behavior
 
@@ -477,3 +477,22 @@ on-error:
   - workflow retention parent
 ```
 
+## Optimizing for Workflows
+
+Workflows can generate a huge amount of data which can impact memory usage, persistence, and the UI.
+The REST API and UI do some filtering (e.g. in the body of the `internal` sensors used by workflows),
+but when working with large `ssh` `output` and `http` `content` payloads, and with `update-children`,
+performance can be dramatically improved by following these tips:
+
+* Optimize external calls to return the minimal amount of information needed
+  * Use `jq` to filter when using `ssh` or `container` steps
+  * Pass filter arguments to `http` endpoints that accept them
+  * Use small page sizes with `retry from` steps
+
+* Optimize the data which is stored
+  * Override the `output` on `ssh` and `http` steps to remove unnecessary objects;
+    for example `http` returns several `content*` fields, and often just one is needed.
+    Simply setting `output: { content: ${content} }` will achieve this.
+  * Set `retention: 1` or `retention: 0` on workflows that use a large amount of information
+    and can simply be replayed from the start
+
diff --git a/guide/blueprints/workflow/steps/steps.yaml b/guide/blueprints/workflow/steps/steps.yaml
index 516d9bac..a543ce1b 100644
--- a/guide/blueprints/workflow/steps/steps.yaml
+++ b/guide/blueprints/workflow/steps/steps.yaml
@@ -22,11 +22,11 @@ steps:
 
     - name: let
-      summary: An alias for `set-workflow-variable`.
+      summary: An alias for `set-workflow-variable`
       shorthand: '`let [ TYPE ] VARIABLE_NAME [ = VALUE ]`'
 
     - name: set-workflow-variable
-      summary: Sets the value of a workflow internal variable. The step `let` is an alias for this.
+      summary: Sets the value of a workflow internal variable. The step `let` is an alias for this
       shorthand: '`set-workflow-variable [TYPE] VARIABLE_NAME [ = VALUE ]`'
       input: |
         * `variable`: either a string, being the workflow variable name, or a map, containing the `name` and optionally the `type`;
@@ -42,14 +42,20 @@
       output: the output from the previous step, or null if this is the first step
 
     - name: transform
-      summary: Applies a transformation to a variable or expression.
-      shorthand: '`transform [TYPE] VARIABLE_NAME [ [ = VALUE ] | TRANSFORM ]`'
+      summary: Applies a transformation to a variable or expression
+      shorthand: '`transform [TYPE] [ "value" ] VARIABLE_NAME [ = VALUE ] | TRANSFORM`'
       input: |
         * `variable`: either a string, being the workflow variable name, or a map, containing the `name` and optionally the `type`;
-          the value will be coerced to the given type, e.g. to force conversion to an integer or to a bean registered type;
-          the special types `yaml` and `json` can be specified here to force conversion to a valid YAML or JSON string;
-          the `name` here can be of the form `x.key` where `x` is an existing map variable, to set a specific `key` within it
-        * `value`: the value to set, with some limited evaluation as described [here](../variables.html)
+          if `value` is supplied, this variable will be set to the result of all transforms and then coercion to any indicated type,
+          e.g. to force conversion to an integer or to a bean registered type;
+          if `value` is not supplied, the name is treated as a variable name to resolve and coerce to any indicated type
+          (i.e. wrapping the name in `${...}`; this is suppressed if the type is the special keyword `value`, and `name` is resolved as a normal expression)
+          and then used for the transform with the result being returned;
+          `name` here can be of the form `x.key` where `x` is an existing map variable, to use or set a specific `key` within it
+        * `value_is_initial`: a boolean, implied in shorthand by the word "value", means to treat the variable name as a value rather than a variable name,
+          evaluating it as a normal expression rather than wrapping it in `${...}`; this cannot be used if a `value` is supplied
+        * `value`: the value to set, with some limited evaluation prior to transforms as described [here](../variables.html);
+          if not supplied, the `variable` is evaluated per the above and used as the input for the transform
         * `transform`: a string indicating a transform, or multiple transforms separated by `|`, where a transform can be
           * `trim` to remove leading and trailing whitespace on strings, null values from lists or sets, and any entry with a null key or null value in a map
           * `replace MODE PATTERN REPLACEMENT` to replace a pattern in a string with a replacement, supporting mode `regex`, `glob`, or `literal`
@@ -71,11 +77,15 @@
           * `bash [json|yaml] [string|encode]`: equivalent to the corresponding `json` or `yaml` transform, with `json` being the default,
             followed by bash-escaping and wrapping in double quotes; ideally suited for passing to scripts;
             `string` is the default and suitable for most purposes, but `encode` is needed where passing to something that expects JSON such as `jq`
           * `first`, `last`, `min`, `max`, `sum`, `average` and `size` are accepted for collections
+          * `to_string`, `to_upper_case`, and `to_lower_case` are accepted for strings
+          * `resolve_expression` performs resolution of any interpolated expressions; this can be used to do nested interpolation
+          * `set VAR` to set the workflow variable `VAR` to the value at that point in the transformation
+          * `return` to return the result of the transformation (not compatible with supplying a `value` to set in a variable)
           * any other word is looked up as a registered type of type `org.apache.brooklyn.core.workflow.steps.transform.WorkflowTransform`
             (or `WorkflowTransformWithContext` to get the context or arguments)
-      output: the output from the previous step, or null if this is the first step
+      output: if no `value` is supplied, the result of all the transforms; if a `value` is supplied and set in `variable`, then the output from the previous step, or null if this is the first step
 
     - name: clear-workflow-variable
-      summary: Clears the value of a workflow internal variable.
+      summary: Clears the value of a workflow internal variable
       shorthand: '`clear-workflow-variable [TYPE] VARIABLE_NAME`'
       input: |
         * `variable`: a string being the workflow variable name or a map containing the `name`
@@ -190,14 +200,34 @@
         * `steps`: a list of steps to run, run in a separate context
         * `target`: an optional target specifier, an entity or input to the steps for the sub-workflow,
          or if a list, a list of entities or inputs to pass to multiple sub-workflows
+        * `target_var_name`: an optional variable name to set in nested workflows to refer to the element in the target, defaulting to `target`
+        * `target_index_var_name`: an optional variable name to set in nested workflows to refer to the index of the element in the target, if the target is a list, defaulting to `target_index`
         * `concurrency`: a specification for how many of the sub-workflows can be run in parallel, if given a list target;
           defaulting to one at a time, supporting a DSL as described [here](../nested-workflow.html)
+        * `reducing`: a map of variables to pass sequentially to each nested workflow instance,
+          with the values of those variables in the output or scratch for each nested workflow passed to the next nested workflow,
+          and the final set of those values being returned from this step for use in the calling workflow;
+          not permitted if `concurrency` is not static at `1`
         * `condition`: the usual condition settable on steps as described [here](../common.html) can be used,
           with one difference that if a target is specified, the condition is applied to it or to each entry in the list,
           to conditionally allow sub-workflows, and the workflow step itself will always run
           (i.e. the condition does not apply to the step itself if it has a target)
         * `replayable`: instructions to add or modify replay points, as described [here](../settings.html),
           for example `workflow replayable from here`
         * `retention`: instructions to modify workflow retention, as described [here](../settings.html)
-      output: the output from the last step in the nested workflow, or a list of such outputs if supplied a `target` list
+      output: if `reducing` is specified, the final value of those variables,
+        otherwise the output from the last step in the nested workflow or a list of such outputs if supplied a `target` list
+
+    - name: foreach
+      summary: Runs a nested workflow over a list using the specified variable name, equivalent to `workflow` when looping over a list
+      shorthand: '`foreach TARGET_VAR_NAME [ in TARGET ]`'
+      input: |
+        * `steps`: a list of steps to run, run in a separate context
+        * `target_var_name`: the name of the variable that should be set in nested workflows to refer to the element in the target list,
+          with the additional behavior that if this is of the "spread map syntax" form `{KEY,KEY2}`, each element in the target list will be assumed to be a map
+          and the keys "KEY" and "KEY2" will be extracted and each one set as a variable in the nested workflow
+        * `target`: the list that should be looped over
+        * other input as per `workflow`
+      output: as per `workflow`, if `reducing` is specified, the final value of those variables,
+        otherwise the list of outputs from the last step of each nested workflow corresponding to an element in the `target` list
 
 - section_name: External Actions
@@ -230,7 +260,7 @@
 
     - name: http
-      summary: Sends an HTTPS (or HTTP) request and returns the response content and code.
+      summary: Sends an HTTPS (or HTTP) request and returns the response content and code
       shorthand: '`http ENDPOINT`'
       input: |
         * `endpoint`: the URL to connect to; protocol can be omitted to use the default `https://`;
@@ -259,7 +289,7 @@
         * `duration`: how long the request took
 
     - name: ssh
-      summary: Runs a command over ssh.
+      summary: Runs a command over ssh
       shorthand: '`ssh COMMAND`'
       input: |
         * `command`: the command to run
@@ -277,7 +307,7 @@
         * `exit_code`
 
     - name: winrm
-      summary: Runs a command over winrm.
+      summary: Runs a command over winrm
       shorthand: '`winrm COMMAND`'
       input: |
         * `command`: the command to run
@@ -329,7 +359,7 @@ steps:
 
     - name: invoke-effector
-      summary: Invokes an effector.
+      summary: Invokes an effector
       shorthand: '`invoke-effector EFFECTOR`'
       input: |
         * `effector`: the name of the effector to invoke
@@ -338,7 +368,7 @@
       output: the returned object from the invoked effector
 
    - name: set-config
-      summary: Sets the value of a config key on an entity.
+      summary: Sets the value of a config key on an entity
       shorthand: '`set-config [TYPE] CONFIG_KEY_NAME = VALUE`'
       input: |
         * `config`: either a string, being the config key name, or a map, containing the `name` and
@@ -349,7 +379,7 @@
       output: the output from the previous step, or null if this is the first step
 
    - name: set-sensor
-      summary: Sets the value of a sensor on an entity.
+      summary: Sets the value of a sensor on an entity
       shorthand: '`set-sensor [TYPE] SENSOR_NAME = VALUE`'
       input: |
         * `sensor`: either a string, being the sensor name, or a map, containing the `name` and
@@ -363,7 +393,7 @@
       output: the output from the previous step, or null if this is the first step
 
    - name: clear-config
-      summary: Clears the value of a config key on an entity.
+      summary: Clears the value of a config key on an entity
       shorthand: '`clear-config [TYPE] CONFIG_KEY_NAME`'
       input: |
         * `config`: a string being the config key name or a map containing the `name` and
@@ -372,7 +402,7 @@
         the output from the previous step, or null if this is the first step
 
    - name: clear-sensor
-      summary: Clears the value of a sensor on an entity.
+      summary: Clears the value of a sensor on an entity
       shorthand: '`clear-sensor [TYPE] SENSOR_NAME`'
       input: |
         * `sensor`: a string being the sensor name or a map containing the `name` and
@@ -458,15 +488,79 @@
         * `entity`: the entity or entity ID where the adjunct should be removed, defaulting to the current entity
       output: the output from the previous step, or null if this is the first step
 
+    - name: update-children
+      summary: Updates children of an entity to be in 1:1 correspondence with items in a given list
+      shorthand: '`update-children [of PARENT] type BLUEPRINT id IDENTIFIER_EXPRESSION from ITEMS`'
+      input: |
+        * `blueprint`: a blueprint or name of a registered entity type to use to create the children;
+          this is required unless `on_create` is specified; where supplied as a blueprint (not a string)
+          the variable `item` can be referenced to provide initial values, but note this is not updated
+        * `identifier_expression`: an expression in terms of a local variable `item` to use to identify the same child;
+          e.g. if the `items` is of the form `[{ field_id: 1, name: "Ticket 1" },...]` then
+          `identifier_expression: ticket_${item.field_id}` will create/update/delete a child whose ID is `ticket_1`;
+          used only by the default `match_check` so not required if that is overridden not to use it
+        * `items`: the list of items to be used to create/update/delete the children
+        * `parent`: the entity or entity ID whose children are to be updated, defaulting to the current entity;
+          any children which do not match something in `items` may be removed
+
+        * `on_create`: an optionally supplied workflow to run at any newly created child, where no pre-existing child was found
+          corresponding to an item in `items` and `update-children` created it from `blueprint`,
+          passed the `item` (and all inputs to the `update-children` step)
+          and typically doing initial setup as required on the child;
+          the default behavior is to invoke an `on_create` effector at the child (if there is such an effector, otherwise do nothing), passing `item`;
+          this is invoked prior to `on_update` so if there is nothing special needed for creation this can be omitted
+
+        * `on_update`: a workflow to run on each child which has a corresponding item,
+          passed the `item` (and all inputs to the `update-children` step),
+          and typically updating config and sensors as appropriate on the child;
+          the default behavior is to invoke an `on_update` effector at the child (if there is such an effector, otherwise do nothing), passing `item`;
+          if the name or any policies may need to change on update, that should be considered in this workflow;
+          if the `update-children` is performed frequently, it might be efficient in this method to check whether the `item` has changed
+
+        * `on_delete`: a workflow to run on each child which no longer has a corresponding item prior to its being deleted;
+          the default behavior is to invoke an `on_delete` effector at the child (if there is such an effector, otherwise do nothing);
+          if this workflow returns `false` the framework will not delete it;
+          this workflow may reparent or delete the entity, although if deletion is desired there is no need as that will be done after this workflow
+
+        * `match_check`:
+          this optionally supplied workflow allows the matching process to be customized, filtering and determining the intended child or its id;
+          it will be invoked for each item in `items` to find a matching child/entity if one is already present;
+          the workflow is passed input variable `item` (and other inputs to the `update-children` step)
+          and should return either the entity that corresponds to it which should be updated (`on_update`) or the identifier for the child that should be created,
+          or `null` if the item should be omitted;
+          the default implementation is to evaluate the expression in `identifier`, i.e. `let id = ${${identifier}}`,
+          then to return any child matching that if there is one or the resolved identifier, i.e. `${parent.child[${id}]} ?? ${id}`;
+          this workflow may create or reparent an entity and return it, and it will not have `on_create` invoked
+
+        * `creation_check`:
+          this optionally supplied workflow allows filtering and custom creation;
+          it will be invoked for each item in `items` for which the `match_check` returned a string value indicating to create a child,
+          the workflow is passed the resulting `match` and the `item` (and other inputs to the `update-children` step),
+          and should return the entity created or `null` if the item should be omitted;
+          the result of this, if not null, will have the `on_create` handler invoked on it;
+          the default implementation is to create the entity (applying the `resolve_expression` transform on `blueprint`),
+          set the ID, and return the newly created entity
+
+        * `deletion_check`:
+          this optionally supplied workflow allows customizing pre-deletion activities and/or the deletion itself;
+          it will be invoked for each child of `parent` which was not returned by the `match_check` or `creation_check`,
+          with each such entity passed as an input variable `child` (along with other inputs to the `update-children` step);
+          it can then return `true` or `false` to specify whether the child should be deleted
+          (with `on_delete` called prior to deletion if `true` is returned);
+          this workflow may reparent the entity and return `false` if it is intended to keep the entity but
+          disconnect it from this synchronization process,
+          or may even `delete-entity ${child}` (although that is not usually necessary)
+
+      output: the output from the previous step, or null if this is the first step
 
 - section_name: General Purpose
   section_intro: |
-    A few other miscellaneous step types don't fit into the other categories.
+    Miscellaneous step types that don't fit into the other categories.
   steps:
 
    - name: log
-      summary: Logs a message.
+      summary: Logs a message
       shorthand: '`log MESSAGE`'
       input: |
         * `message`: the message to be logged
@@ -483,7 +577,7 @@
       output: the output from the previous step, or null if this is the first step
 
    - name: sleep
-      summary: Causes execution to pause for a specified duration.
+      summary: Causes execution to pause for a specified duration
       shorthand: '`sleep DURATION`'
       input: |
         * `duration`: how long to sleep for, e.g. `5s` for 5 seconds
diff --git a/guide/blueprints/workflow/variables.md b/guide/blueprints/workflow/variables.md
index 868bca50..a2b421bb 100644
--- a/guide/blueprints/workflow/variables.md
+++ b/guide/blueprints/workflow/variables.md
@@ -81,6 +81,7 @@ workflow is running, where `` can be:
 * `config.` - returns the value of the config key ``
 * `sensor.` - returns the value of the sensor key ``
 * `attributeWhenReady.` - returns the value of the sensor key `` once it is ready (available and truthy), for use with the `wait` step
+* `effector.` - returns the definition of the effector `` (useful in conditions to invoke effectors only if they are defined on an entity)
 * `children` - returns the list of children; these can be further identified either by index or by ID using square-bracket notation
 * `members` - returns the list of members (for a group); these can be further identified either by index or by ID using square-bracket notation
 * `parent.` - returns the value of `` per any of the above in the context of the application
@@ -202,10 +203,22 @@ and the modulo operator `%` for integers giving the remainder.
 These are evaluated in usual mathematical order. Parentheses are not supported.
 
-The `transform` step can be used for more complicated transformations, such as whether to `wait` on values that are not yet ready,
-conversion using `json` and `yaml`, and whether to `trim` strings or yaml documents.
-This supports two types of trimming: if a `type` is specified, the value is scanned for `---` on a line by itself
-and that token is used as a "document separator", and only the last document is considered;
+The `transform` step can be used for more complicated transformations, such as
+to `wait` on values that are not yet ready,
+to convert `json` and `yaml`, to `trim` strings, merge lists and maps, and much more.
+For example:
+
+```
+- transform x = " [ a, b ] " | trim        # sets x to the string '[ a, b ]'
+- transform y = ${x} | json                # sets y to the list of strings '[ "a", "b" ]'
+- step: transform | merge | set z          # sets z to the list of strings '[ "a", "b", "c" ]'
+  value:
+    - ${x}
+    - [ c ]
+```
+
+The `yaml` transform will treat a `---` on a line by itself
+as a "document separator", and only the last document is considered;
 if no `type` is specified, the value has leading and trailing whitespace removed.
 The former is primarily intended for YAML processing from a script which might include unwanted output prior to the
 outputting the YAML intended to set in the variable: the script can do `echo ---` before the