
fix(asynchooks-scope): fix context loss using .with() #1101 #1099

Merged
merged 5 commits into open-telemetry:master from refactor-asynchooks-context on May 28, 2020

Conversation

vmarchaud
Member

@vmarchaud vmarchaud commented May 23, 2020

This one was pretty hard to debug and fix, but it finally works!
I had to remove several test cases that used to pass:
- cases where we expected .bind of a function to propagate the context across async operations, which is essentially a context leak because .bind is only synchronous (see the sketch below);
- cases where we wanted to use withAsync inside a with, which doesn't work because, as said above, with is only synchronous.
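
To make that distinction concrete, here is a rough sketch (my own illustration, not a test from this PR; the bind(target, ctx) signature and the plain-object context are simplifying assumptions):

```ts
import { AsyncHooksContextManager } from '@opentelemetry/context-async-hooks';

const contextManager = new AsyncHooksContextManager().enable();
const ctx = { traceId: 'abc' } as any; // stand-in for a real Context object

// bind() guarantees the context only for the synchronous execution of the callback.
const bound = contextManager.bind(() => {
  contextManager.active(); // ctx is active here, in the synchronous part
  setTimeout(() => {
    // The removed tests expected ctx to still be active here, i.e. that
    // bind() alone would carry the context across the async boundary;
    // this PR treats that expectation as a context leak.
  }, 0);
}, ctx);
bound();
```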

I started this refactor for #1013, but given the bug in #1101 (which this refactor fixes), as discussed at the SIG, we should merge this first and then try to fix withAsync.

Lots of edits for this one, but it's worth it: I found a way to fix the initial issue #752 without using await, so I removed withAsync, which should fix #1013, and it works with #1101. Finally!

closes #1101 #752 #1013

@vmarchaud vmarchaud self-assigned this May 23, 2020
@vmarchaud vmarchaud added the size/L Denotes a PR that changes 100-499 lines, ignoring generated files. label May 23, 2020
@vmarchaud vmarchaud force-pushed the refactor-asynchooks-context branch from 669a33a to 77b67fe on May 23, 2020 13:50
@vmarchaud vmarchaud marked this pull request as draft May 23, 2020 16:50
@vmarchaud vmarchaud changed the title from "feat(asynchooks-scope): refactor to fix context leak #1013" to "refactor(asynchooks-scope): refactor to fix context leak #1013" on May 27, 2020
@vmarchaud vmarchaud force-pushed the refactor-asynchooks-context branch 2 times, most recently from e9fb5fc to e0f1bc5, on May 28, 2020 09:27
@vmarchaud vmarchaud changed the title from "refactor(asynchooks-scope): refactor to fix context leak #1013" to "fix(asynchooks-scope): fix context loss using .with() #1101" on May 28, 2020
@vmarchaud vmarchaud force-pushed the refactor-asynchooks-context branch from e0f1bc5 to eceac37 on May 28, 2020 09:30
@codecov

codecov bot commented May 28, 2020

Codecov Report

Merging #1099 into master will decrease coverage by 0.04%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master    #1099      +/-   ##
==========================================
- Coverage   92.36%   92.32%   -0.05%     
==========================================
  Files         115      115              
  Lines        3407     3389      -18     
  Branches      686      683       -3     
==========================================
- Hits         3147     3129      -18     
  Misses        260      260              
Impacted Files Coverage Δ
...ontext-async-hooks/src/AsyncHooksContextManager.ts 96.87% <100.00%> (-0.50%) ⬇️

@vmarchaud vmarchaud force-pushed the refactor-asynchooks-context branch from eceac37 to 006a5ac on May 28, 2020 11:53
@vmarchaud
Member Author

Good news: I found a way to make with() work when using async/await (check out this test), and the code is way simpler than before!
Ready for review @open-telemetry/javascript-approvers :)
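
Roughly along these lines (a hedged sketch of the behavior, not the actual test linked above; the plain-object context is a stand-in):

```ts
import { AsyncHooksContextManager } from '@opentelemetry/context-async-hooks';

const contextManager = new AsyncHooksContextManager().enable();
const ctx = { traceId: 'abc' } as any; // stand-in for a real Context object

async function main() {
  await contextManager.with(ctx, async () => {
    contextManager.active(); // ctx is active before the await
    await new Promise(resolve => setTimeout(resolve, 10));
    contextManager.active(); // still ctx: the before/after hooks restore it
  });
}

main();
```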

@vmarchaud vmarchaud marked this pull request as ready for review May 28, 2020 12:01
@vmarchaud vmarchaud force-pushed the refactor-asynchooks-context branch from 006a5ac to f0b4c0e on May 28, 2020 12:03
@vmarchaud vmarchaud requested a review from dyladan May 28, 2020 12:43
@dyladan dyladan linked an issue May 28, 2020 that may be closed by this pull request
Member

@dyladan dyladan left a comment


This is great work, thanks so much for this!

@dyladan
Member

dyladan commented May 28, 2020

@open-telemetry/javascript-approvers this is very high priority. I want to get this merged and released as a patch release, which means not merging any breaking changes until this is in. @mayurkale22 wdyt?

@dyladan dyladan added the bug Something isn't working label May 28, 2020
Member

@obecny obecny left a comment


Looks great!

Member

@mayurkale22 mayurkale22 left a comment


Nice!

@mayurkale22
Member

@open-telemetry/javascript-approvers this is very high priority. I want to get this merged and released as a patch release, which means not merging any breaking changes until this is in. @mayurkale22 wdyt?

I totally agree. In the next release, we can try to combine all the breaking changes (mostly related to metrics work).

Contributor

@lykkin lykkin left a comment


LGTM! Out of curiosity, what's the perf difference with the added before/after hooks?

@dyladan
Member

dyladan commented May 28, 2020

LGTM! Out of curiosity, what's the perf difference with the added before/after hooks?

We don't really have a great way to compare the performance here, but it should be relatively minimal as they are only called when async contexts are created/destroyed. Sync function calls should have no performance impact from this. Moreover, even if there is a performance impact, correctness is more important than speed.
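
For readers who haven't used async_hooks, this is the general shape of the mechanism being discussed (a minimal sketch of the technique, not the code merged in this PR):

```ts
import * as asyncHooks from 'async_hooks';

type Context = unknown;
const contexts = new Map<number, Context>(); // asyncId -> context captured at creation
const stack: (Context | undefined)[] = [];   // contexts saved while callbacks run
let activeContext: Context | undefined;

asyncHooks.createHook({
  init(asyncId) {
    // A new async resource inherits whatever context is active when it is created.
    if (activeContext !== undefined) contexts.set(asyncId, activeContext);
  },
  before(asyncId) {
    // Entering the resource's callback: restore the context captured at init time.
    stack.push(activeContext);
    activeContext = contexts.get(asyncId);
  },
  after() {
    // Leaving the callback: put back whatever was active before.
    activeContext = stack.pop();
  },
  destroy(asyncId) {
    contexts.delete(asyncId);
  },
}).enable();
```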

@dyladan dyladan merged commit 12e6246 into open-telemetry:master May 28, 2020
@lykkin
Contributor

lykkin commented May 28, 2020

Sync function calls should have no performance impact from this.

My concern was more about the work done on entry/exit of the async context - with the introduction of the before and after hooks there is going to be some extra work for all network/fs/promise/async resource/etc work done by the runtime. These changes have perf implications for a user beyond how they interact with the OTel API. In my experience, the async hook overhead is smaller than the alternatives and the work in the hook is minimal so I'm not terribly concerned because I totally agree with you here.

Moreover, even if there is a performance impact, correctness is more important than speed.

It would be nice to have benchmarks for these kinds of things in the future, but considering it covers a gap in functionality maybe it's just gravy at this point.

@vmarchaud
Member Author

My concern was more about the work done on entry/exit of the async context - with the introduction of the before and after hooks there is going to be some extra work for all network/fs/promise/async resource/etc work done by the runtime. These changes have perf implications for a user beyond how they interact with the OTel API. In my experience, the async hook overhead is smaller than the alternatives and the work in the hook is minimal so I'm not terribly concerned because I totally agree with you here.

IMO, if people want performance over UX (by UX I mean getting all traces automatically), they should not use the async hooks scope manager and should instead manually trace the operations they care about.

It would be nice to have benchmarks for these kinds of things in the future, but considering it covers a gap in functionality maybe it's just gravy at this point.

I agree that we should have some benchmarks in place, and they are not really hard to implement. I've opened #1123.
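
A starting point could be as small as the following (a hypothetical micro-benchmark sketch, not the contents of #1123):

```ts
import { AsyncHooksContextManager } from '@opentelemetry/context-async-hooks';

// Time a batch of awaited promises, first without and then with the
// async_hooks-based context manager enabled.
async function run(label: string): Promise<void> {
  const start = process.hrtime.bigint();
  for (let i = 0; i < 100_000; i++) {
    await Promise.resolve(i);
  }
  const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${label}: ${elapsedMs.toFixed(1)} ms`);
}

async function main() {
  await run('baseline');
  const contextManager = new AsyncHooksContextManager().enable();
  await run('async_hooks context manager enabled');
  contextManager.disable();
}

main();
```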

pichlermarc pushed a commit to dynatrace-oss-contrib/opentelemetry-js that referenced this pull request Dec 15, 2023
Co-authored-by: Amir Blum <amir@aspecto.io>
Co-authored-by: Valentin Marchaud <contact@vmarchaud.fr>