
New Product proposal: IPFS Implementation feature parity checker #101

Open
SgtPooki opened this issue Oct 14, 2022 · 4 comments
Assignees
Labels
dif/medium (Prior experience is likely helpful) · effort/days (Estimated to take multiple days, but less than a week) · need/analysis (Needs further analysis before proceeding) · need/community-input (Needs input from the wider community) · P2 (Medium: Good to have, but can wait until someone steps up)

Comments

@SgtPooki
Member

SgtPooki commented Oct 14, 2022

Introduction

This is a work-in-progress proposal for creating a new product/tool that would benefit the IPFS ecosystem.

The idea is to create a tool that acts as a mashup of ipfs/interop & ipfs-shipyard/pinning-service-compliance, where IPFS implementations (js-ipfs, iroh, kubo) and the features they support can be displayed. Its intent wouldn't be to ensure interoperability, but to display interoperability gaps.

Goal

The goal of this new product/feature is multi-purpose:

  1. A single location where all IPFS implementations and their functionality can be seen
  2. Encourage/foster the creation, and performance improvements, of the different implementations
  3. Raise users' awareness of the different implementations and simplify the decision of which implementation may be right for them
  4. Encourage the adoption of certain features/functionality by an implementation via pressure from the community

Things that we shouldn't cover

  • interoperability tests - ipfs/interop is a fairly large beast, and one thing I don't want this product to become is another huge maintenance burden. Interoperability testing and assurance should remain in ipfs/interop, where they belong.
  • immature implementations - Implementations that do not pass a minimum bar (to be set) should not be included. We should define a minimum feature set that implementations must support to be included in the list. Without this barrier to entry, the tool could too easily become a list of incomplete and unmaintained implementations.
  • features that are too low-level - Displaying support for different block encodings or other minutiae could easily bog down the usefulness of this feature. The idea is not to cover every single thing an implementation can do, but instead to display which implementations support the most up-to-date and recommended things.
    • i.e. don't show the nuances of dag-* support; show that dag-cbor (or whatever is currently recommended) blocks can be added, removed, etc.

Proposal (TBD)

A lot of the work for this tool already exists elsewhere, so I don't want to duplicate it. Instead, we should rely on existing tools to perform the functionality testing and focus this tool on rendering the resulting pass/fail status for each displayed feature.

How we could consolidate the existing tests without duplicating work still needs to be fleshed out, but I wanted to ensure I wrote down my thoughts on this before investing too heavily.
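To make the "render pass/fail from existing tests" idea concrete, here is a minimal sketch of how results collected from existing suites could be aggregated into a parity matrix. The type and function names (`FeatureResult`, `buildMatrix`) are hypothetical, not an existing API:

```typescript
// Illustrative sketch only: aggregate pass/fail results from existing test
// suites into a feature-parity matrix keyed by feature, then implementation.
type FeatureResult = {
  implementation: string; // e.g. "kubo", "js-ipfs", "iroh"
  feature: string;        // e.g. "CAR export", "DNSLink"
  passed: boolean;
};

// Rows are features, columns are implementations, cells are "pass" or "fail".
function buildMatrix(
  results: FeatureResult[]
): Record<string, Record<string, string>> {
  const matrix: Record<string, Record<string, string>> = {};
  for (const r of results) {
    matrix[r.feature] ??= {};
    matrix[r.feature][r.implementation] = r.passed ? "pass" : "fail";
  }
  return matrix;
}

const matrix = buildMatrix([
  { implementation: "kubo", feature: "CAR export", passed: true },
  { implementation: "js-ipfs", feature: "CAR export", passed: false },
]);
console.log(matrix["CAR export"]["kubo"]); // "pass"
```

The point of the shape is that the display layer never runs tests itself; it only renders whatever records the upstream suites emit.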

Additional thoughts

  • We could display benchmarking results to facilitate community pressure on certain implementations to speed up features & functionality that are slower than other implementations, but this could easily be its own tool.
  • We may want to expand this tool to support minutiae (dag-* and other microcosms of functionality) in the future, but I don't think we should start there.
  • We could systematize ipfs/interop to be the foundation for this product, and make it easier to add implementations and feature tests.
@SgtPooki SgtPooki added the need/triage (Needs initial labeling and prioritization) label Oct 14, 2022
@SgtPooki SgtPooki changed the title from "New Product proposal: IPFS Implementation feature compliance checker" to "New Product proposal: IPFS Implementation feature parity checker" Oct 14, 2022
@SgtPooki
Member Author

@lidel it would be useful to get your thoughts on how this could overlap with some of the specs work you're doing.

@lidel
Member

lidel commented Oct 14, 2022

Hm.. need to decide on the resolution we want to operate at.

Something like https://libp2p.io/implementations/ with discrete features such as

  • "Block", "CAR", "libp2p", "DCUtR", "Public DHT Client/Server aka kad 1.0.0", "mDNS", "TCP", "WebTransport", "QUIC", "Bitswap 1.x.x", "UnixFS", "UnixFS 1.5 Metadata", "UnixFS HAMT reading|auto-sharding", "DAG-JSON", "DAG-CBOR", "DNSLink", "IPNS V1", "IPNS V2 (Extensible Data Records)", "Gateway Path|Subdomain|DNSLink", "Delegated Routing with Reframe", etc.

per implementation?
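One possible answer to the resolution question is to model each implementation as a flat record of the discrete features lidel lists, mapped to a support level. This is a sketch only; the types are hypothetical and the `"untested"` values are placeholders, not real results:

```typescript
// Sketch of one possible "resolution": discrete, spec-level features
// (names taken from the comment above) per implementation. Support values
// here are placeholders, not real test results.
type Support = "yes" | "no" | "partial" | "untested";

interface ImplementationEntry {
  name: string;
  features: Record<string, Support>;
}

const exampleEntry: ImplementationEntry = {
  name: "example-impl", // hypothetical implementation
  features: {
    "Block": "untested",
    "CAR": "untested",
    "DCUtR": "untested",
    "WebTransport": "untested",
    "IPNS V2 (Extensible Data Records)": "untested",
  },
};

// A renderer would iterate entries and emit one column per implementation.
console.log(Object.keys(exampleEntry.features).length); // 5
```

Keeping the feature keys as plain strings (rather than a closed enum) would let new features be added by data PRs without code changes.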

@SgtPooki
Member Author

In GUI triage, Lidel said that Robin Berjon will be in Lisbon; since he worked on web-platform-tests, we could pick his brain during IPFS Camp.

@SgtPooki SgtPooki added the dif/medium (Prior experience is likely helpful), P2 (Medium: Good to have, but can wait until someone steps up), need/analysis (Needs further analysis before proceeding), need/community-input (Needs input from the wider community), and effort/days (Estimated to take multiple days, but less than a week) labels, and removed the need/triage (Needs initial labeling and prioritization) label Oct 17, 2022
@SgtPooki SgtPooki self-assigned this Oct 17, 2022
@darobin

darobin commented Oct 31, 2022

I agree with @lidel that a good first step is something static that implementations can PR (we probably want them to PR some data rather than an HTML page directly, but that's a detail).

Building something like WPT certainly takes time, even if we will probably never need to reach the 1.8M tests that they have. All the initial version did was bring together a few thousand tests in a central place with a unified test runner that crashed browsers more often than not. It took a fair bit of time before that led to results being collected and reported on. I suspect we'll have to start with a similar MVP!
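The static "implementations PR some data, the site renders a page" flow darobin describes could be sketched roughly as follows. Everything here (the `Entry` shape, `renderTable`) is an illustrative assumption, not a proposed final design:

```typescript
// Hedged sketch: each implementation contributes a small data record via PR,
// and a static-site step renders an HTML table from the collected records.
type Entry = { name: string; features: Record<string, boolean> };

function renderTable(entries: Entry[], featureList: string[]): string {
  // One header column per implementation.
  const header =
    `<tr><th>Feature</th>${entries.map((e) => `<th>${e.name}</th>`).join("")}</tr>`;
  // One row per feature; missing keys render as "no".
  const rows = featureList.map(
    (f) =>
      `<tr><td>${f}</td>${entries
        .map((e) => `<td>${e.features[f] ? "yes" : "no"}</td>`)
        .join("")}</tr>`
  );
  return `<table>${header}${rows.join("")}</table>`;
}

const html = renderTable(
  [{ name: "kubo", features: { DNSLink: true } }],
  ["DNSLink"]
);
console.log(html);
// "<table><tr><th>Feature</th><th>kubo</th></tr><tr><td>DNSLink</td><td>yes</td></tr></table>"
```

Because the input is plain data, the same records could later feed an automated pipeline (e.g. CI-published test results) without changing the rendering step.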
