[Alerting] creating alert with slack action with bad action variable results in 400 response on execution #60042

Closed
pmuellr opened this issue Mar 12, 2020 · 2 comments · Fixed by #60468
Labels: bug, Feature:Alerting, Team:ResponseOps, v7.7.0

Comments


pmuellr commented Mar 12, 2020

Create a Slack action pointing to a real Slack webhook. Create an index threshold alert that will actually fire, and add the Slack action to it. For the "message" parameter, enter {{{message}}}.

Once the Slack action fires, you should see a 400 response from Slack in the Kibana log:

server    log   [13:30:05.158] [error][plugins][taskManager][taskManager] Task actions:.slack "1b1a4350-6487-11ea-866c-815ea27e2986" failed: Error: unexpected http response from slack: 400 Bad Request
server    log   [13:30:05.159] [info][eventLog][plugins] event logged: {"event":{"provider":"actions","action":"execute","start":"2020-03-12T17:30:04.865Z","end":"2020-03-12T17:30:05.157Z","duration":292000000},"kibana":{"namespace":"default","saved_objects":[{"type":"action","id":"3ccbd3f1-e93e-4673-957b-875b4ba3b82b"}],"server_uuid":"5b2de169-2785-441b-ae8c-186a1936b17d"},"message":"action execution failure: .slack:3ccbd3f1-e93e-4673-957b-875b4ba3b82b: slack #kibana-alerting-slack-tests manual","error":{"message":"unexpected http response from slack: 400 Bad Request"},"@timestamp":"2020-03-12T17:30:05.157Z","ecs":{"version":"1.3.1"}}

Change the "message" parameter value to {{{context.message}}} and save, and the messages will start working.

Not sure what's going on there, but {{{message}}} is clearly incorrect, as we don't have a variable with that name. Does it end up rendering as an empty string, which Slack then rejects with the 400?

If that's it, then we should fix this for 7.7, because I'm sure people will end up hitting this.

It seems like we may want some check within the actions that required parameter values are non-empty. Maybe that is where the current error is coming from in this case, but if so, the message isn't very clear about the problem.

We could perhaps default "empty" required values to a value indicating the param was empty - in this case, something like "slack message was an empty string in action parameters".
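
A minimal sketch of that idea, using a hypothetical renderParams helper (not the actual Kibana implementation), which renders the mustache templates and substitutes a visible marker for any required param that renders to an empty string:

const mustache = require('mustache');

// Hypothetical helper, not the actual Kibana code: render each templated
// param, and replace required params that render to an empty string with a
// marker so the failure is obvious in the Slack message and in the logs.
function renderParams(templates, variables, requiredParams) {
  const params = {};
  const emptyParams = [];
  for (const [key, template] of Object.entries(templates)) {
    const rendered = mustache.render(template, variables);
    if (requiredParams.includes(key) && rendered.trim() === '') {
      emptyParams.push(key);
      params[key] = `[${key} was an empty string in action parameters]`;
    } else {
      params[key] = rendered;
    }
  }
  return { params, emptyParams };
}

// The bad template from this issue: {{{message}}} with no "message" variable
const { params, emptyParams } = renderParams(
  { message: '{{{message}}}' },
  { context: { message: 'threshold exceeded' } },
  ['message']
);
console.log(params.message); // "[message was an empty string in action parameters]"
console.log(emptyParams);    // [ 'message' ]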

@pmuellr added the Feature:Alerting and Team:ResponseOps labels on Mar 12, 2020
elasticmachine (Contributor) commented:

Pinging @elastic/kibana-alerting-services (Team:Alerting Services)


pmuellr commented Mar 12, 2020

ya, looks like it must render to an empty string:

$ node
> mustache = require('mustache')
{ name: 'mustache.js',
  ...
  render: [Function: render],
  ... }
> mustache.render('hi')
'hi'
> mustache.render('{{{foo}}}', {})
''
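
For contrast (assuming the same mustache module), rendering the corrected template against a context variable produces the expected text:

> mustache.render('{{{context.message}}}', { context: { message: 'threshold exceeded' } })
'threshold exceeded'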

@mikecote added the bug label on Mar 16, 2020
@mikecote self-assigned this on Mar 18, 2020
@kobelb added the needs-team label on Jan 31, 2022
@botelastic bot removed the needs-team label on Jan 31, 2022