
Add tests for eth_subscribe #146

Merged

Conversation

tgmichel
Contributor

No description provided.

@tgmichel
Contributor Author

tgmichel commented Oct 2, 2020

@sorpaas I am not able to test the logs subscription. The reason is that EVM receipt logs are empty under manual-seal, and I am not sure why.

Please review and merge if it lgty. We will come back to it once we figure out the receipt logs thing.
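
For reference, the kind of logs-subscription test this blocks would look roughly like the sketch below. It assumes a web3 1.x instance connected over WebSocket; web3 and contractAddress are illustrative names, not code from this PR.

import Web3 from "web3";

// Sketch only: subscribe to logs emitted by a single contract address.
// While manual-seal produces empty receipt logs, the "data" handler never
// fires, which is what blocks the test.
function subscribeToContractLogs(web3: Web3, contractAddress: string) {
	const subscription = web3.eth.subscribe("logs", { address: contractAddress });
	subscription.on("data", (log) => {
		console.log("new log:", log);
	});
	subscription.on("error", (error) => console.error(error));
	return subscription;
}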

@danforbes
Contributor

@jimmychu0807, @seunlanlege - @tgmichel is having some issues with manual seal. Do you think you might be able to help them?

I noticed that manual seal is quite inconsistent when you have multiple mocha-steps in a single test case that call engine_createBlock. Locally, increasing the step timeout seemed to fix those inconsistencies, but the tests still fail arbitrarily in GitHub Actions.
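
Roughly what I mean by increasing the step timeout (sketch only; createAndFinalizeBlock, the RPC endpoint, and the 20000 ms value are illustrative placeholders rather than the actual helpers in this repo, and the built-in fetch assumes a recent Node):

import { step } from "mocha-steps";

// Hypothetical helper: asks the manual-seal node to author and finalize a
// block via engine_createBlock, mirroring the curl reproduction further down.
async function createAndFinalizeBlock(rpc = "http://localhost:9933"): Promise<void> {
	await fetch(rpc, {
		method: "POST",
		headers: { "Content-Type": "application/json" },
		body: JSON.stringify({
			jsonrpc: "2.0",
			id: 1,
			method: "engine_createBlock",
			params: [true, true, null],
		}),
	});
}

describe("manual-seal block authoring", function () {
	// Raise the timeout for every step in this suite instead of Mocha's
	// default 2000 ms; a single step can also call this.timeout() itself.
	this.timeout(20000);

	step("creates a block", async function () {
		await createAndFinalizeBlock();
	});
});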

@tgmichel
Contributor Author

@danforbes @jimmychu0807 @seunlanlege I cannot move forward with this without knowing more about the problem.

@crystalin mentioned he is also having problems with it: set_timestamp is called twice when calling engine_createBlock, which makes the node crash:

   4: <pallet_timestamp::Module<T> as frame_support::traits::OnFinalize<<T as frame_system::Trait>::BlockNumber>>::on_finalize
             at /rustc/d3fb005a39e62501b8b0b356166e515ae24e2e54/src/libstd/macros.rs:13
   5: <(TupleElement0,TupleElement1) as frame_support::traits::OnFinalize<BlockNumber>>::on_finalize
             at /home/alan/.cargo/git/checkouts/substrate-7e08433d4c370a21/83544d4/frame/support/src/traits.rs:1418
   6: <(TupleElement0,TupleElement1) as frame_support::traits::OnFinalize<BlockNumber>>::on_finalize
             at /home/alan/.cargo/git/checkouts/substrate-7e08433d4c370a21/83544d4/frame/support/src/traits.rs:1418
   7: <(TupleElement0,TupleElement1) as frame_support::traits::OnFinalize<BlockNumber>>::on_finalize
             at /home/alan/.cargo/git/checkouts/substrate-7e08433d4c370a21/83544d4/frame/support/src/traits.rs:1418
   8: <(TupleElement0,TupleElement1) as frame_support::traits::OnFinalize<BlockNumber>>::on_finalize
             at /home/alan/.cargo/git/checkouts/substrate-7e08433d4c370a21/83544d4/frame/support/src/traits.rs:1418
   9: <(TupleElement0,TupleElement1) as frame_support::traits::OnFinalize<BlockNumber>>::on_finalize
             at /home/alan/.cargo/git/checkouts/substrate-7e08433d4c370a21/83544d4/frame/support/src/traits.rs:1418
  10: <(TupleElement0,TupleElement1) as frame_support::traits::OnFinalize<BlockNumber>>::on_finalize
             at /home/alan/.cargo/git/checkouts/substrate-7e08433d4c370a21/83544d4/frame/support/src/traits.rs:1418

Do you know anything about this?

@seunlanlege

can you post the full logs?

@tgmichel
Contributor Author

can you post the full logs?

@crystalin

@JoshOrndorff
Contributor

JoshOrndorff commented Oct 14, 2020

I've noticed intermittent test failures related to manual-seal as well: paritytech/substrate#6951 (comment)

@crystalin
Contributor

crystalin commented Oct 15, 2020

@seunlanlege One way to reproduce the error differently is to send two engine_createBlock requests (with finalize) in a row.

Start Frontier with manual-seal:
cargo run -- --manual-seal --rpc-port 33300 --tmp -l'debug,rpc=trace' --validator --no-grandpa --force-authoring

Then send the same block-creation request twice:
curl localhost:33300 -H 'Content-Type: application/json' -d '{"jsonrpc":"2.0","id":1,"method":"engine_createBlock","params":[true,true,null]}'

The second request will fail and the server will log:

Oct 15 01:28:57.537 TRACE Request: {"jsonrpc":"2.0","id":1,"method":"engine_createBlock","params":[true,true,null]}.    
Oct 15 01:28:57.538  INFO 🙌 Starting consensus session on top of parent 0xf7df07137f7fd008f3187d7556da7389ab41589fa552769c9b39de4e51d7bbbe    
Oct 15 01:28:57.595 ERROR panicked at 'Only one block may be authored per slot.', /home/alan/.cargo/git/checkouts/substrate-7e08433d4c370a21/d81e3ab/frame/aura/src/lib.rs:215:9    
Oct 15 01:28:57.596  WARN ❗️ Inherent extrinsic returned unexpected error: Execution: Trap: Trap { kind: Unreachable }. Dropping.    
Oct 15 01:28:57.596 DEBUG Attempting to push transactions from the pool.    
Oct 15 01:28:57.596 DEBUG Pool status: PoolStatus { ready: 0, ready_bytes: 0, future: 0, future_bytes: 0 }    
Oct 15 01:28:57.690 ERROR panicked at 'Timestamp must be updated once in the block', /home/alan/.cargo/git/checkouts/substrate-7e08433d4c370a21/d81e3ab/frame/timestamp/src/lib.rs:195:13

I think it is the same issue that happens in the tests: depending on latency, two createBlock calls can land too close together.
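
If that is indeed the cause, a possible workaround on the test side would be to space consecutive createBlock calls by at least one slot. A minimal sketch (the 6000 ms constant stands in for whatever the runtime's SLOT_DURATION actually is, and createBlock is any helper that sends engine_createBlock):

// Sketch only: make sure two engine_createBlock calls never fall into the
// same Aura slot. SLOT_DURATION_MS must match the runtime's SLOT_DURATION.
const SLOT_DURATION_MS = 6000;

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function createBlocksSequentially(createBlock: () => Promise<void>, count: number) {
	for (let i = 0; i < count; i++) {
		await createBlock();
		if (i < count - 1) {
			await sleep(SLOT_DURATION_MS);
		}
	}
}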

@JoshOrndorff
Contributor

JoshOrndorff commented Oct 15, 2020

So the log says "Only one block may be authored per slot." Why are slots relevant if this is manual seal? Is there still an Aura pallet in the runtime when using manual seal? If so, are we somehow forging the Aura pre-runtime digest?

I believe paritytech/substrate#7010 introduced the ability to do that with manual seal.

@JoshOrndorff
Contributor

JoshOrndorff commented Oct 15, 2020

Here's a hypothesis:

We use manual seal to author two blocks in short succession. When authoring each block, we create a timestamp inherent. The second block is rejected because, "Only one block may be authored per slot". The timestamp extrinsic from that rejected block somehow makes it back into the pool and ends up in the next (third overall) block.

What happens if we shorten the slot duration significantly? Can we still reproduce the issue as easily?

@crystalin
Contributor

Changing the block time (pub const MILLISECS_PER_BLOCK: u64 = 300;) definitely prevents the issue from happening.
Also, this is probably the relevant part of the runtime implementation:

parameter_types! {
	pub const MinimumPeriod: u64 = SLOT_DURATION / 2;
}

impl pallet_timestamp::Trait for Runtime {
	/// A timestamp: milliseconds since the unix epoch.
	type Moment = u64;
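	// Note: with OnTimestampSet = Aura, setting the timestamp inherent is what
	// triggers Aura's "Only one block may be authored per slot" check seen in
	// the log above.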
	type OnTimestampSet = Aura;
	type MinimumPeriod = MinimumPeriod;
	type WeightInfo = ();
}

@tgmichel tgmichel marked this pull request as ready for review November 6, 2020 21:26
@tgmichel
Contributor Author

tgmichel commented Nov 6, 2020

#170 mostly fixed this.

However, I'm still seeing odd behaviour when running the tests locally vs. on GitHub. I had to extend the timeout to let the Promises resolve properly here, while locally the tests work with the default 2000 ms timeout.
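
For context, the pattern that needs the longer timeout is roughly the sketch below: author a block, then wait for the first newHeads notification. It assumes web3 1.x connected over WebSocket; waitForNewHead is an illustrative name, not the actual helper in util.ts.

import Web3 from "web3";

// Sketch only: resolve with the first block header pushed by the node after
// subscribing. On GitHub Actions this can take longer than Mocha's default
// 2000 ms, hence the extended step timeout.
function waitForNewHead(web3: Web3): Promise<unknown> {
	return new Promise((resolve, reject) => {
		const subscription = web3.eth.subscribe("newBlockHeaders", (error) => {
			if (error) reject(error);
		});
		subscription.on("data", (header) => {
			subscription.unsubscribe();
			resolve(header);
		});
	});
}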

Member

@sorpaas sorpaas left a comment


Mostly LGTM. Some nits.

Review comments (since resolved) on ts-tests/tests/test-subscription.ts and ts-tests/tests/util.ts.
@sorpaas sorpaas merged commit 6abd240 into polkadot-evm:master Nov 6, 2020
@tgmichel tgmichel deleted the tgmichel-subscribe-tests branch April 1, 2022 09:56
abhijeetbhagat pushed a commit to web3labs/frontier that referenced this pull request Jan 11, 2023
* Add connect, subscribe and newHeads tests.

* Add newPendingTransactions test

* Increase timeout on steps that create blocks

* Add new bytecode and update existing

* Add all logs test

* Add logs test by address

* Add logs test by topic

* Add past events subscription

* Fix checker

* Increase step timeout for promise resolving

* Add some fields to just identify a block type

* Add topic wildcards and parameters tests

* Bug fix for topic filter

* Cleanup