diff --git a/.env.example b/.env.example
index 5caf38b..08f9bad 100644
--- a/.env.example
+++ b/.env.example
@@ -1,9 +1,12 @@
 PRIVATE_KEY="abc123abc123abc123abc123abc123abc123abc123abc123abc123abc123abc1"
 # Acquired via `curl --location --request GET 'https://auth.estuary.tech/register-new-token'`
 EDGE_API_KEY="EST5c1d961b-5916-4c23-a17d-889ebffa84c6ARY"
-LIGHTHOUSE_API_KEY="ac89924f.928bfddf2e214309ba669397d6b66664"
+LIGHTHOUSE_API_KEY="673212f3.6c46a171d620450999e78b9d1e3b77f7"
 MINER="t017840"
 
+# 'calibration' or 'mainnet'
+NETWORK="calibration"
+
 # API ENDPOINTS: EDGE
 EDGE_DEAL_INFOS_ENDPOINT="https://hackfs-coeus.estuary.tech/edge/open/status/content/"
 EDGE_DEAL_UPLOAD_ENDPOINT="https://hackfs-coeus.estuary.tech/edge/api/v1/content/add"
@@ -11,7 +14,7 @@ EDGE_DEAL_DOWNLOAD_ENDPOINT="https://hackfs-coeus.estuary.tech/edge/gw/"
 
 # API ENDPOINTS: LIGHTHOUSE
 LIGHTHOUSE_DEAL_DOWNLOAD_ENDPOINT="https://gateway.lighthouse.storage/ipfs/"
-LIGHTHOUSE_DEAL_INFOS_ENDPOINT="https://api.lighthouse.storage/api/lighthouse/get_proof\?cid\="
+LIGHTHOUSE_DEAL_INFOS_ENDPOINT="https://api.lighthouse.storage/api/lighthouse/get_proof"
 
 # API ENDPOINTS: LOTUS
 LOTUS_RPC="https://api.node.glif.io"
\ No newline at end of file
diff --git a/README.md b/README.md
index d7a4846..f388b57 100644
--- a/README.md
+++ b/README.md
@@ -29,15 +29,15 @@ This will clone the hardhat kit onto your computer, switch directories into the
 
 You can get a private key from a wallet provider [such as Metamask](https://metamask.zendesk.com/hc/en-us/articles/360015289632-How-to-export-an-account-s-private-key).
 
-## Add your Private Key as an Environment Variable
+## Setting Environment Variables
 
-Add your private key as an environment variable by running this command:
+Add your private key as an environment variable inside the `.env` file:
 
 ```bash
-export PRIVATE_KEY='abcdef'
+PRIVATE_KEY='abcdef'
 ```
 
-If you use a .env file, don't commit and push any changes to .env files that may contain sensitive information, such as a private key! If this information reaches a public GitHub repository, someone can use it to check if you have any Mainnet funds in that wallet address, and steal them!
+Don't commit and push any changes to .env files that may contain sensitive information, such as a private key! If this information reaches a public GitHub repository, someone can use it to check if you have any Mainnet funds in that wallet address, and steal them!
 
 ## Get the Deployer Address
 
@@ -66,7 +66,7 @@ yarn hardhat deploy
 
 This will compile the DealStatus contract and deploy it to the Calibrationnet test network automatically! Keep note of the deployed contract address - the service node will need it to interact with the contract.
 
-Update the `contractInstance` variable in `api/service.js` with the deployed contract address.
+**Update the `contractInstance` variable in `api/service.js` with the deployed contract address.**
 
 There's a contract interface in the `contracts/interfaces` directory that `DealStatus` inherits from. If you would like to create your own contract different from `DealStatus`, be sure to inherit from and override the methods in the interface.
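For orientation, this is all the wiring the service node needs: a funded key in `.env` and the deployed address. A minimal sketch of how the node attaches to the contract (the address literal below is a placeholder; `getContractAt` is the Hardhat-injected ethers helper that `api/service.js` already uses):

```js
// Sketch: attach to the deployed DealStatus contract (address is a placeholder).
require('dotenv').config();
const { ethers } = require("hardhat"); // Hardhat injects a network-aware ethers instance

async function getDealStatus() {
    // Use the address printed by `yarn hardhat deploy`
    const contractInstance = "0x0000000000000000000000000000000000000000";
    return await ethers.getContractAt("DealStatus", contractInstance);
}
```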
@@ -85,7 +85,9 @@ yarn start # This starts up the frontend
 
 You can access a frontend of the app at [localhost:1337](http://localhost:1337/).
 
-Several test cases regarding the service's functionality are located in `api/tests`. To run them, run the following command:
+**Note: some processes that the service performs (such as uploading deals to lighthouse) may take up to 24 hours. Once you submit the deal, you do not need to keep the node running.** The node will attempt to finish incomplete jobs on startup by reading from the state-persisting files it creates in the `cache` directory whenever jobs are registered.
+
+Several test cases for the service's functionality are located in `api/tests`. To run them, use the following commands:
 
 ```bash
 # Tests the interaction for API calls into service
@@ -95,23 +97,28 @@ yarn test-edge
 yarn test-lighthouse
 ```
 
-**Note: some processes that the service performs (such as uploading deals to lighthouse) may take up to 24 hours. Once you submit the deal, you do not need to keep the node running. Incomplete jobs will be maintained by the node. The node service has local state persistence in the `cache` directory in case of shutdown.**
+### How RaaS Works
+
+To build new use cases on top of RaaS, it helps to understand how the app fits together. The RaaS application has two components: the API frontend and the smart contract backend.
 
-The service performs the following:
+The backend stores the CID of the file and the information used to complete the storage deal (e.g. the proof that the file is included on chain). It can also return the active deals made with a particular CID, as well as deals that are about to expire.
+
+The API frontend performs the following:
 - **Allows users to register various jobs to be performed by the service (performed by default every 12 hours)**.
-  - **Replication**: When building a storage solution with FVM on Filecoin, storage deals need to be replicated across geo location, policy sizes and reputation. Teams building data solutions will pay FIL in order to make sure their data can be replicated N times across a number of selected storage providers, either one-off or continuously responding to SP faults. The job should get all the active deals of the cid, if the number of active deals is smaller than replication_target, the worker retrieves the data (see the retrieval section below), create a new deal using aggregators (see the create a new deal section below), and send the cid to the aggregator smart contract.
-  - **Renewal**: When building storage solutions with FVM on Filecoin, storage deals need to be live for a long time. This service should be able to take an existing deal and renew it with the same or a different SP. Teams building data solutions will pay FIL in order to make sure their data can be renewed when it comes close to the end of its lifetime, or renew on another SP whenever they need to do so. For ‘renew’ job, the job gets all the active deals of the cid, if any deal is expiring, perform a retrieval and submit the retrieved data to aggregators to create a new deal and send the cid to the aggregator smart contract.
-  - **Repair**: When building storage solutions with FVM on Filecoin, storage deals need to be stable. This service should be able to take an existing deal and repair it with the same or a different SP. Teams building data solutions will pay FIL in order to make sure their data can be repaired when it comes close to the end of its lifetime, or repair on another SP whenever they need to do so. The node checks that the deal has been verified previously, and if the deal has been inactive for more than `repair_threshold` epochs. If so, the worker resubmits the deal to the smart contract and creates a new deal.
-- **Monitors smart contract for new deal submissions and creates a new deal with the aggregator node**.
-  - The node listens to the `SubmitAggregatorRequest` event in aggregators’ smart contract, and trigger the following workflow whenever it sees a new SubmitAggregatorRequest event. The flow similarly assumes that the data of the cid is discoverable for aggregators. If not, upload it first. The steps below are not implemented and are left empty for developers to implement.
+  - **Replication**: When building a storage solution with FVM on Filecoin, storage deals need to be replicated across geographic location, policy size, and reputation. Replication deals ensure that data can be replicated N times across a number of storage providers.
+  - **Renewal**: When building storage solutions with FVM on Filecoin, storage deals need to stay live for a long time. This service should be able to take an existing deal and renew it with the same or a different storage provider.
+  - **Repair**: When building storage solutions with FVM on Filecoin, storage deals need to be stable. Repair jobs ensure that data can be maintained when it comes close to the end of its lifetime, or if the data somehow becomes inactive and needs to be repaired via another storage provider.
+  - **Monitors Smart Contract**: The node listens to the `SubmitAggregatorRequest` event in the aggregators’ smart contract, and triggers the following workflow whenever it sees a new `SubmitAggregatorRequest` event (a minimal sketch of this listener loop follows this section).
   - 1. A new `SubmitAggregatorRequest` event comes in; the node saves the `txId` and `cid`, and goes to the next step
-  - 2. Create a new deal with aggregators ([see this section](https://www.notion.so/Renew-Replication-Starter-Kit-f57af3ebd221462b8b8ef2714178865a?pvs=21)) by retrieving and uploading the data
+  - 2. Create a new deal with aggregators by retrieving and uploading the data
   - The response contains an ID, which is the `content_id`
   - 3. [Use the content_id to check the upload’s status](https://github.com/application-research/edge-ur/blob/car-gen/docs/aggregation.md#checking-the-status-by-content-id)
   - 4. Periodically poll the API above, and once `deal_id` becomes non-zero, proceed to the next step
   - 5. Post the `deal_id`, `inclusion_proof`, and `verifier_data` back to [the aggregators’ smart contract](https://github.com/application-research/fevm-data-segment/blob/main/contracts/aggregator-oracle/edge.sol#L52) by calling the `complete` method, along with the `txId` and `cid`
 
-### Usage
+For a more detailed guide, check out the [documentation](https://www.notion.so/Renew-Replication-Starter-Kit-f57af3ebd221462b8b8ef2714178865a).
+
+## API Usage
 
 Once you start up the server, the POST endpoint will be available at the designated port.
 
@@ -138,7 +145,7 @@ curl --location 'http://localhost:1337/api/register_job' \
 --header 'User-Agent: SMB Redirect/1.0.0' \
 --header 'Content-Type: application/x-www-form-urlencoded' \
 --header 'Authorization: Basic ZDU5MWYyYzQtMzk0MS00ZWM4LTkyNTQtYjgzZDg1NmI2YmU5Om1xZkU5eklsVFFOdGVIUnY2WDEwQXVmYkNlN0pIUXVC' \
---data-urlencode 'cid=QmbY5ZWR4RjxG82eUeWCmsVD1MrHNZhBQz5J4yynKLvgfZ' \
+--data-urlencode 'cid=QmYSNU2i62v4EFvLehikb4njRiBrcWqH6STpMwduDcNmK6' \
 --data-urlencode 'endDate=2023-07-15' \
 --data-urlencode 'jobType=replication' \
 --data-urlencode 'replicationTarget=1' \
@@ -146,7 +153,6 @@ curl --location 'http://localhost:1337/api/register_job' \
 --data-urlencode 'epochs=1000'
 ```
 
-Note: The `aggregator` field can be one of the following: `edge`, or `lighthouse`. This changes the type of aggregator node that the service will use to interact with the Filecoin network.
 The `jobType` field can be one of the following: `renew`, `replication`, or `repair`. This changes the type of job that the service will perform.
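The monitoring workflow above can be sketched with ethers v5, which the kit already uses. This is an illustrative outline, not the service's exact code; the contract address argument is a placeholder and the deal-creation and polling steps are elided:

```js
// Sketch of the SubmitAggregatorRequest -> complete workflow described above.
const { ethers } = require("hardhat");

async function watchAggregatorRequests(dealStatusAddress) {
    const dealStatus = await ethers.getContractAt("DealStatus", dealStatusAddress);

    dealStatus.on("SubmitAggregatorRequest", async (txId, cid) => {
        // Step 1: save the txId and cid
        const cidString = ethers.utils.toUtf8String(cid);
        console.log(`New request ${txId}: ${cidString}`);
        // Steps 2-4: create a deal with an aggregator and poll until deal_id is non-zero
        // Step 5: post the result back on-chain:
        // await dealStatus.complete(txId, dealId, minerId, inclusionProof, verifierData);
    });
}
```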
diff --git a/api/lighthouseAggregator.js b/api/lighthouseAggregator.js
index 6a42ab5..ba4df8c 100644
--- a/api/lighthouseAggregator.js
+++ b/api/lighthouseAggregator.js
@@ -4,13 +4,14 @@ const path = require('path');
 const { ethers } = require("hardhat");
 const EventEmitter = require('events');
 const sleep = require('util').promisify(setTimeout);
-const { spawn } = require('child_process');
 const lighthouse = require('@lighthouse-web3/sdk');
 
 // Location of fetched data for each CID from edge
 const dataDownloadDir = path.join(__dirname, 'download');
 const lighthouseDealDownloadEndpoint = process.env.LIGHTHOUSE_DEAL_DOWNLOAD_ENDPOINT;
 const lighthouseDealInfosEndpoint = process.env.LIGHTHOUSE_DEAL_INFOS_ENDPOINT;
+const lighthousePinEndpoint = process.env.LIGHTHOUSE_PIN_ENDPOINT;
+
 if (!lighthouseDealDownloadEndpoint) {
     throw new Error("Missing environment variables: data endpoints");
 }
@@ -32,9 +33,7 @@
         // For any files that do, poll the deal status
         this.aggregatorJobs.forEach(async job => {
             if (!job.lighthouse_cid) {
-                console.log("Redownloading file with CID: ", job.cid);
-                await this.downloadFile(job.cid);
-                const lighthouse_cid = await this.uploadFileAndMakeDeal(path.join(dataDownloadDir, job.cid));
+                const lighthouse_cid = await this.pinCIDAndMakeDeal(job.cid);
                 job.lighthouse_cid = lighthouse_cid;
                 this.saveState();
             }
@@ -44,39 +43,24 @@
     }
 
     async processFile(cid, txID) {
-        let downloaded_file_path;
-        let lighthouse_cid;
-
-        // Try to download the file only if the cid is new
+        // Queue jobs only if the cid is new
         if (!this.aggregatorJobs.some(job => job.cid == cid)) {
-            try {
-                downloaded_file_path = await this.downloadFile(cid);
-                this.enqueueJob(cid, txID);
-                this.saveState();
-            } catch (err) {
-                // If an error occurred, log it
-                console.error(`Failed to download file: ${err}`);
-                return;
-            }
+            this.enqueueJob(cid, txID);
+            this.saveState();
         } else {
-            // If the file has already been downloaded, use the existing file
-            downloaded_file_path = path.join(dataDownloadDir, cid);
             // Update the txID for the job
             this.aggregatorJobs.find(job => job.cid == cid).txID = txID;
         }
 
-        // Wait for the file to be downloaded
-        await sleep(2500);
-
-        // Upload the file (either the downloaded one or the error file)
-        lighthouse_cid = await this.uploadFileAndMakeDeal(downloaded_file_path);
+        // Pin the CID to Lighthouse
+        const lighthouse_cid = await this.pinCIDAndMakeDeal(cid);
 
         // Find the job with the matching CID and update the lighthouse_cid
         // lighthouse_cid depends on whether or not content was uploaded to edge or lighthouse.
         this.aggregatorJobs.find(job => job.cid == cid).lighthouse_cid = lighthouse_cid;
         this.saveState();
 
-        return lighthouse_cid;
+        return cid;
     }
 
     async processDealInfos(maxRetries, initialDelay, lighthouse_cid) {
@@ -86,7 +70,8 @@
             try {
                 let response = await axios.get(lighthouseDealInfosEndpoint, {
                     params: {
-                        cid: lighthouse_cid
+                        cid: lighthouse_cid,
+                        network: "testnet" // Change the network to mainnet when ready
                     }
                 })
                 if (!response.data) {
@@ -101,14 +86,24 @@
                     this.aggregatorJobs = this.aggregatorJobs.filter(job => job.contentID != contentID);
                     return;
                 }
+                let dealIds = [];
+                let miners = [];
+                response.data.dealInfo.forEach(item => {
+                    dealIds.push(item.dealId);
+                    // Each deal's storage provider is returned with a t0 prefix; strip it to get the bare actor ID
+                    miners.push(item.storageProvider.replace("t0", ""));
+                });
                 let dealInfos = {
                     txID: job.txID,
-                    dealID: response.data.dealInfo[0].dealId,
+                    dealID: dealIds,
                     inclusion_proof: response.data.proof.fileProof.inclusionProof,
                     verifier_data: response.data.proof.fileProof.verifierData,
-                    miner: response.data.dealInfo[0].storageProvider.replace("f0", ""),
+                    miner: miners,
                 }
-                if (dealInfos.dealID != 0) {
+                // Emit the DealReceived event once at least one deal ID has come back
+                if (dealInfos.dealID[0] != null) {
+                    console.log("Lighthouse deal infos processed after receiving deal IDs: ", dealInfos);
                     this.eventEmitter.emit('DealReceived', dealInfos);
                     // Remove the job from the list
                     this.aggregatorJobs = this.aggregatorJobs.filter(job => job.lighthouse_cid != lighthouse_cid);
@@ -120,17 +115,18 @@
                 }
             }
         } catch (e) {
-            console.log("Error polling lighthouse for lighthouse_cid: ", lighthouse_cid);
+            console.log("Error polling lighthouse for lighthouse_cid: ", lighthouse_cid, e);
         }
         await sleep(delay);
         delay *= 2;
     }
     this.eventEmitter.emit('error', new Error('All retries failed, totaling: ' + maxRetries));
     }
 
     async uploadFileAndMakeDeal(filePath) {
         try {
-            const response = await lighthouse.upload(filePath, process.env.LIGHTHOUSE_API_KEY);
+            const dealParams = { miner: [process.env.MINER], repair_threshold: null, renew_threshold: null, network: process.env.NETWORK };
+            const response = await lighthouse.upload(filePath, process.env.LIGHTHOUSE_API_KEY, false, dealParams);
             const lighthouse_cid = response.data.Hash;
             console.log("Uploaded file, lighthouse_cid: ", lighthouse_cid);
             return lighthouse_cid;
@@ -139,31 +135,26 @@
         }
     }
 
-    async downloadFile(lighthouse_cid, downloadPath = path.join(dataDownloadDir, lighthouse_cid)) {
-        console.log("Downloading file with CID: ", lighthouse_cid);
-        let response;
-
-        // Ensure 'download' directory exists
-        fs.mkdir(dataDownloadDir, {
-            recursive: true
-        }, (err) => {
-            if (err) {
-                console.error(err);
-            }
-        });
-
-        response = await axios({
-            method: 'GET',
-            url: `${lighthouseDealDownloadEndpoint}${lighthouse_cid}`,
-            responseType: 'stream',
-        });
-
+    async pinCIDAndMakeDeal(cidString) {
         try {
-            const filePath = await this.saveResponseToFile(response, downloadPath);
-            console.log(`File saved at ${filePath}`);
-            return filePath
-        } catch (err) {
-            console.error(`Error saving file: ${err}`);
+            // Ask Lighthouse to pin the CID and open a deal (the network is hardcoded to calibration here; see NETWORK in .env)
+            const data = {
+                "cid": cidString,
+                "raas": {
+                    "network": "calibration",
+                }
+            }
+            // The response is not used directly; deal status is polled later through processDealInfos
+            const pinResponse = await axios.post(
+                lighthousePinEndpoint,
+                data,
+                {
+                    headers: {
+                        'Authorization': `Bearer ${process.env.LIGHTHOUSE_API_KEY}`
+                    }
+                }
+            )
+            return cidString;
+        } catch (error) {
+            console.error('An error occurred:', error);
         }
     }
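The `processDealInfos` loop above is a standard exponential-backoff poller. Distilled into a standalone helper for clarity (the names here are illustrative, not part of the codebase):

```js
// Generic exponential-backoff poller, mirroring processDealInfos above.
const sleep = require('util').promisify(setTimeout);

async function pollWithBackoff(poll, maxRetries, initialDelay) {
    let delay = initialDelay;
    for (let attempt = 0; attempt < maxRetries; attempt++) {
        try {
            const result = await poll();
            if (result) return result; // stop as soon as the poll yields a value
        } catch (e) {
            console.log("Poll attempt failed: ", e);
        }
        await sleep(delay);
        delay *= 2; // double the wait between attempts
    }
    throw new Error('All retries failed, totaling: ' + maxRetries);
}
```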
diff --git a/api/service.js b/api/service.js
index 3ffcebc..58364a6 100644
--- a/api/service.js
+++ b/api/service.js
@@ -1,5 +1,6 @@
 const express = require('express')
 const { ethers } = require("hardhat");
+const cors = require('cors');
 
 const { networkConfig } = require("../helper-hardhat-config")
 
@@ -8,10 +9,11 @@ const app = express();
 const fs = require('fs');
 const path = require('path');
 const multer = require('multer');
+const sleep = require('util').promisify(setTimeout);
 
 const port = 1337;
 const contractName = "DealStatus";
-const contractInstance = "0x65A1dC7FE50fe836145bd403746b73E89980E7ca";
+const contractInstance = "0x6ec8722e6543fB5976a547434c8644b51e24785b"; // Replace with your deployed contract address
 const EdgeAggregator = require('./edgeAggregator.js');
 const LighthouseAggregator = require('./lighthouseAggregator.js');
 const upload = multer({ dest: 'temp/' }); // Temporary directory for uploads
@@ -22,6 +24,7 @@ let edgeAggregatorInstance;
 let lighthouseAggregatorInstance;
 let isDealCreationListenerActive = false;
 
+app.use(cors());
 app.listen(port, () => {
     if (!isDealCreationListenerActive) {
         isDealCreationListenerActive = true;
@@ -38,7 +41,7 @@ app.listen(port, () => {
     setInterval(async () => {
         console.log("Executing jobs");
         await executeJobs();
-    }, 20000); // 43200000 = 12 hours
+    }, 50000); // 50 seconds for testing; production default is 43200000 (12 hours)
 });
 
 app.use(
@@ -60,7 +63,7 @@ app.post('/api/register_job', upload.none(), async (req, res) => {
         jobType: req.body.jobType || "all",
         replicationTarget: req.body.replicationTarget || 2,
         aggregator: req.body.aggregator || "lighthouse",
-        epochs: req.body.epochs || 1000
+        epochs: req.body.epochs || 4,
     };
 
     if (newJob.cid != null && newJob.cid != "") {
@@ -132,7 +135,7 @@ app.get('/api/deal_status', async (req, res) => {
     }
     let activeDeals;
     try {
-        activeDeals = await dealStatus.getActiveDeals(ethers.utils.toUtf8Bytes(job.cid));
+        activeDeals = await dealStatus.callStatic.getActiveDeals(ethers.utils.toUtf8Bytes(job.cid));
     } catch (err) {
         console.log("An error has occurred when retrieving deal status: ", err);
 
@@ -174,10 +177,14 @@ async function registerJob(newJob) {
     // 3. Check if newJob.endDate is a valid date
     // 4. Check if newJob.jobType is either 'renew' or 'replication'
    // 5. Check if newJob.replicationTarget is a number
 
-    console.log("Executing deal creation job from API request with CID: ", newJob.cid);
+    console.log("Executing deal creation job with CID: ", newJob.cid);
 
     const dealStatus = await ethers.getContractAt(contractName, contractInstance);
-    await dealStatus.submit(ethers.utils.toUtf8Bytes(newJob.cid));
+    try {
+        await dealStatus.submit(ethers.utils.toUtf8Bytes(newJob.cid));
+    } catch (error) {
+        console.log("Error submitting deal creation job: ", error);
+    }
 
     if (!storedNodeJobs.some(job => job.cid == newJob.cid)) {
         storedNodeJobs.push(newJob);
@@ -197,15 +204,23 @@ async function executeReplicationJob(job) {
     } catch {
         console.log("Error: CID must be a hexadecimal string or bytes");
     }
-    const activeDeals = await dealStatus.getActiveDeals(ethers.utils.toUtf8Bytes(job.cid));
-    console.log(`Deal ${job.cid} at ${activeDeals.length}`);
+    const activeDeals = await dealStatus.callStatic.getActiveDeals(ethers.utils.toUtf8Bytes(job.cid));
+    console.log(`Deal ${job.cid} at ${activeDeals.length} replications`);
     if (activeDeals.length < job.replicationTarget) {
-        try {
-            await dealStatus.submit(cid);
-        } catch (error) {
-            console.log("Error: ", error);
+        // Submit once for each replication still missing (replication target minus active deals)
+        console.log(`Replicating deal ${job.cid} to ${job.replicationTarget} replications. Currently at ${activeDeals.length} replications.`);
+        for (let i = 0; i < job.replicationTarget - activeDeals.length; i++) {
+            // Space out submissions (2000000 ms is roughly 33 minutes)
+            await sleep(2000000);
+            try {
+                console.log(`Submitting replication deal`);
+                await dealStatus.submit(ethers.utils.toUtf8Bytes(job.cid));
+            } catch (error) {
+                console.log("Error replicating: ", error);
+            }
         }
     }
+    console.log("Replication job complete");
 }
 
 // Execute the renewal job
@@ -215,14 +230,17 @@ async function executeRenewalJob(job) {
     const dealStatus = await ethers.getContractAt(contractName, contractInstance);
     // Get all expiring deals for the job's CID within a certain epoch window
-    const expiringDeals = await dealStatus.getExpiringDeals(ethers.utils.toUtf8Bytes(job.cid), job.epochs ? job.epochs : 1000);
-    expiringDeals.forEach(async () => {
+    const expiringDeals = await dealStatus.callStatic.getExpiringDeals(ethers.utils.toUtf8Bytes(job.cid), job.epochs ? job.epochs : 4);
+    console.log(`Deal ${job.cid} has ${expiringDeals.length} expiring deals: renewing (if any).`);
+    for (let i = 0; i < expiringDeals.length; i++) {
         try {
-            await dealStatus.submit(job.cid);
+            await dealStatus.submit(ethers.utils.toUtf8Bytes(job.cid));
+            await sleep(20000);
         } catch (error) {
-            console.log("Error: ", error);
+            console.log("Error renewing: ", error);
         }
-    });
+    }
+    console.log("Renewal job complete");
 }
 
 // Execute the repair job
@@ -233,10 +251,11 @@ async function executeRepairJob(job) {
     const dealStatus = await ethers.getContractAt(contractName, contractInstance);
     const method = "Filecoin.StateMarketStorageDeal";
     // Get all (deal_id, miner) pairs containing the data’s cid
-    const allDeals = await dealStatus.getAllDeals(ethers.utils.toUtf8Bytes(job.cid));
+    const allDeals = await dealStatus.callStatic.getAllDeals(ethers.utils.toUtf8Bytes(job.cid));
+    console.log(`Deal ${job.cid} has ${allDeals.length} deals: repairing if any are broken.`);
     allDeals.forEach(async deal => {
         // Takes integer format (an f0 prefix must be added back for the API call).
-        const dealId = deal.dealId;
+        const dealId = deal.dealId.toNumber();
         const params = [dealId, null];
 
         const body = {
@@ -254,15 +273,16 @@
 
         const currentBlockHeight = await getBlockNumber();
 
-        if (response.result.State.SectorStartEpoch > -1 && currentBlockHeight - deal.result.State.SlashEpoch > job.epochs)
+        // Repair deals that became active on chain, were later slashed, and have stayed slashed for more than job.epochs epochs
+        if ((response.data.result.State.SectorStartEpoch > -1 && response.data.result.State.SlashEpoch != -1) && currentBlockHeight - response.data.result.State.SlashEpoch > job.epochs)
         {
             try {
-                await dealStatus.submit(job.cid);
+                await dealStatus.submit(ethers.utils.toUtf8Bytes(job.cid));
             } catch (error) {
-                console.log("Error: ", error);
+                console.log("Error repairing: ", error);
             }
         }
     });
+    console.log("Repair job complete");
 }
 
 // Initialize the listener for the Deal Creation event
@@ -296,17 +316,19 @@ async function initializeDealCreationListener() {
             // To process the dealInfos before completion of deal is handled at dataRetrievalListener
             // Max retries: 18 (48 hours)
             // Initial delay: 1000 ms
-            if (!job.aggregator) {
+            if (job === undefined) {
+                // The CID lookup didn't yield a registered job
                 console.log("Error: Aggregator type not specified for job with CID: ", cidString);
-                // Remove the job if the aggregator type is not specified
-                storedNodeJobs.splice(storedNodeJobs.indexOf(job), 1);
-                saveJobsToState();
             } else {
                 if (job.aggregator === 'edge') {
                     const contentID = await edgeAggregatorInstance.processFile(cidString, transactionId);
                     edgeAggregatorInstance.processDealInfos(18, 1000, contentID);
                 } else if (job.aggregator === 'lighthouse') {
+                    // Reattach the event listener first: the Lighthouse branch returns early below,
+                    // so the reattachment at the end of this handler would never run
+                    if (dealStatus.listenerCount("SubmitAggregatorRequest") === 0) {
+                        dealStatus.once("SubmitAggregatorRequest", handleEvent);
+                    }
                     try {
                         const result = await lighthouseProcessWithRetry(cidString, transactionId);
                         return result;
@@ -323,8 +345,6 @@
                     saveJobsToState();
                 }
             }
-
-            // After processing this event, reattach the event listener
             if (dealStatus.listenerCount("SubmitAggregatorRequest") === 0) {
                 dealStatus.once("SubmitAggregatorRequest", handleEvent);
             }
 
@@ -396,8 +416,8 @@ async function initializeDataRetrievalListener() {
     lighthouseAggregatorInstance.eventEmitter.on('DealReceived', async dealInfos => {
         // Process the dealInfos
         let txID = dealInfos.txID.toString();
-        let dealID = dealInfos.dealID;
-        let miner = dealInfos.miner;
+        let dealIDs = dealInfos.dealID;
+        let miners = dealInfos.miner;
         let inclusionProof = {
             proofIndex: {
                 index: '0x' + dealInfos.inclusion_proof.proofIndex.index,
@@ -412,17 +432,21 @@
         verifierData.commPc = '0x' + verifierData.commPc;
         // The size piece is originally in hex. Convert it to a number.
         verifierData.sizePc = parseInt(verifierData.sizePc, 16);
-        console.log(verifierData);
+        // Add the dealInfos to the matching job stored inside storedNodeJobs.
+        storedNodeJobs.forEach(job => {
+            if (job.txID === dealInfos.txID) {
+                job.dealInfos = dealInfos;
+            }
+        });
+        saveJobsToState();
+        console.log("Deal received with dealInfos: ", dealInfos);
         try {
-            await dealStatus.complete(txID, dealID, miner, inclusionProof, verifierData);
-            // Add on the dealInfos to the existing job stored inside the storedNodeJobs.
-            storedNodeJobs.forEach(job => {
-                if (job.txID === dealInfos.txID) {
-                    job.dealInfos = dealInfos;
-                }
-            });
-            console.log(storedNodeJobs);
-            console.log("Deal completed for deal ID: ", txID.toString());
+            // For each dealID, complete the deal on-chain
+            for (let i = 0; i < dealIDs.length; i++) {
+                console.log("Completing deal with deal ID: ", dealIDs[i]);
+                await dealStatus.complete(txID, dealIDs[i], miners[i], inclusionProof, verifierData);
+                console.log("Deal completed for deal ID: ", dealIDs[i]);
+            }
         } catch (err) {
             console.log("Error submitting file for completion: ", err);
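One recurring change above is `dealStatus.callStatic.getActiveDeals(...)` (and likewise for `getExpiringDeals` and `getAllDeals`). These contract functions are declared `external returns (...)` rather than `view`, so a normal call produces a transaction instead of the return value; in ethers v5, `callStatic` simulates the call locally and surfaces the result. A minimal sketch (the address is a placeholder):

```js
// Sketch: reading the return value of a non-view contract function with callStatic.
const { ethers } = require("hardhat");

async function countActiveDeals(dealStatusAddress, cid) {
    const dealStatus = await ethers.getContractAt("DealStatus", dealStatusAddress);
    // Simulates the transaction and returns the Deal[] result instead of a tx receipt
    const activeDeals = await dealStatus.callStatic.getActiveDeals(ethers.utils.toUtf8Bytes(cid));
    return activeDeals.length;
}
```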
diff --git a/contracts/DealStatus.sol b/contracts/DealStatus.sol
index 920c22e..db369e6 100644
--- a/contracts/DealStatus.sol
+++ b/contracts/DealStatus.sol
@@ -7,9 +7,6 @@ pragma solidity ^0.8.17;
 import "./interfaces/IAggregatorOracle.sol";
 import "./data-segment/Proof.sol";
 
-import {MarketAPI} from "@zondax/filecoin-solidity/contracts/v0.8/MarketAPI.sol";
-import {MarketTypes} from "@zondax/filecoin-solidity/contracts/v0.8/types/MarketTypes.sol";
-
 // Delta that implements the AggregatorOracle interface
 contract DealStatus is IAggregatorOracle, Proof {
     uint256 private transactionId;
@@ -32,6 +29,23 @@
         return transactionId;
     }
 
+    function submitRaaS(
+        bytes memory _cid,
+        uint256 _replication_target,
+        uint256 _repair_threshold,
+        uint256 _renew_threshold
+    ) external returns (uint256) {
+        // Increment the transaction ID
+        transactionId++;
+
+        // Save _cid
+        txIdToCid[transactionId] = _cid;
+
+        // Emit the event
+        emit SubmitAggregatorRequestWithRaaS(transactionId, _cid, _replication_target, _repair_threshold, _renew_threshold);
+        return transactionId;
+    }
+
     function complete(
         uint256 _id,
         uint64 _dealId,
@@ -64,6 +78,14 @@
         return cidToDeals[_cid];
     }
 
+    function getAllCIDs() external view returns (bytes[] memory) {
+        bytes[] memory cids = new bytes[](transactionId);
+        for (uint256 i = 0; i < transactionId; i++) {
+            cids[i] = txIdToCid[i + 1];
+        }
+        return cids;
+    }
+
     // getActiveDeals should return all the _cid's active dealIds
     function getActiveDeals(bytes memory _cid) external returns (Deal[] memory) {
         // get all the deal ids for the cid
@@ -95,7 +117,7 @@
             // get the deal's expiration epoch
             MarketTypes.GetDealTermReturn memory dealTerm = MarketAPI.getDealTerm(dealId);
 
-            if (block.timestamp < uint64(dealTerm.end) - epochs) {
+            // Keep only deals whose end epoch falls within `epochs` of the current epoch (and has not already passed)
+            if (block.number < uint64(dealTerm.end) - epochs || block.number > uint64(dealTerm.end)) {
                 delete expiringDealIds[i];
             }
         }
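With `submitRaaS`, a client can register a request and its RaaS parameters in a single on-chain call instead of going through the HTTP endpoint. A sketch with ethers v5 (the address is a placeholder, and the example threshold values are illustrative; the thresholds are interpreted off-chain by the worker):

```js
// Sketch: registering a RaaS request via the new submitRaaS entry point.
const { ethers } = require("hardhat");

async function submitRaaSRequest(dealStatusAddress, cid) {
    const dealStatus = await ethers.getContractAt("DealStatus", dealStatusAddress);
    const tx = await dealStatus.submitRaaS(
        ethers.utils.toUtf8Bytes(cid),
        2, // _replication_target
        4, // _repair_threshold (epochs)
        4  // _renew_threshold (epochs)
    );
    await tx.wait(); // SubmitAggregatorRequestWithRaaS is emitted on inclusion
}
```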
diff --git a/contracts/interfaces/IAggregatorOracle.sol b/contracts/interfaces/IAggregatorOracle.sol
index 87902af..3ac57f1 100644
--- a/contracts/interfaces/IAggregatorOracle.sol
+++ b/contracts/interfaces/IAggregatorOracle.sol
@@ -1,7 +1,14 @@
 // SPDX-License-Identifier: MIT
 pragma solidity ^0.8.17;
 
-import "../data-segment/Proof.sol";
+import {
+    ProofData,
+    InclusionProof,
+    InclusionVerifierData,
+    InclusionAuxData,
+    SegmentDesc,
+    Fr32
+} from "../data-segment/ProofTypes.sol";
 
 // Behavioral Interface for an aggregator oracle
 interface IAggregatorOracle {
@@ -15,12 +22,25 @@
     // Emitted when a new request is submitted with an ID and content identifier (CID).
     event SubmitAggregatorRequest(uint256 indexed id, bytes cid);
 
+    // Emitted when a new request is submitted with an ID, content identifier (CID), and RaaS parameters
+    event SubmitAggregatorRequestWithRaaS(uint256 indexed id, bytes cid,
+        uint256 _replication_target, uint256 _repair_threshold,
+        uint256 _renew_threshold);
+
     // Emitted when a request is completed, providing the request ID and deal ID.
     event CompleteAggregatorRequest(uint256 indexed id, uint64 indexed dealId);
 
     // Function that submits a new request to the oracle
     function submit(bytes memory _cid) external returns (uint256);
 
+    // Function to submit a new file to the aggregator, specifying the RaaS parameters
+    function submitRaaS(
+        bytes memory _cid,
+        uint256 _replication_target,
+        uint256 _repair_threshold,
+        uint256 _renew_threshold
+    ) external returns (uint256);
+
     // Callback function that is called by the aggregator
     function complete(
         uint256 _id,
@@ -30,6 +50,8 @@
         InclusionVerifierData memory _verifierData
     ) external returns (InclusionAuxData memory);
 
+    function getAllCIDs() external view returns (bytes[] memory);
+
     // Get all deal IDs for a specified cid
     function getAllDeals(bytes memory _cid) external view returns (Deal[] memory);
 
diff --git a/contracts/mocks/DealStatusMock.sol b/contracts/mocks/DealStatusMock.sol
index 7f65cc1..854de95 100644
--- a/contracts/mocks/DealStatusMock.sol
+++ b/contracts/mocks/DealStatusMock.sol
@@ -32,6 +32,23 @@
         return transactionId;
     }
 
+    function submitRaaS(
+        bytes memory _cid,
+        uint256 _replication_target,
+        uint256 _repair_threshold,
+        uint256 _renew_threshold
+    ) external returns (uint256) {
+        // Increment the transaction ID
+        transactionId++;
+
+        // Save _cid
+        txIdToCid[transactionId] = _cid;
+
+        // Emit the event
+        emit SubmitAggregatorRequestWithRaaS(transactionId, _cid, _replication_target, _repair_threshold, _renew_threshold);
+        return transactionId;
+    }
+
     // TODO: use _miner integer
     function complete(
         uint256 _id,
@@ -85,7 +102,7 @@
     }
 
     // getExpiringDeals should return all the deals' dealIds if they are expiring within `epochs`
-    function getExpiringDeals(bytes memory _cid, uint64 epochs) external returns (Deal[] memory) {
+    function getExpiringDeals(bytes memory _cid, uint64 epochs) external view returns (Deal[] memory) {
         // the logic is similar to the above, but use this api call:
         // https://github.com/Zondax/filecoin-solidity/blob/master/contracts/v0.8/MarketAPI.sol#LL110C9-L110C9
         Deal[] memory expiringDealIds;
@@ -96,12 +113,19 @@
 
             // get the deal's expiration epoch
             MarketTypes.GetDealTermReturn memory dealTerm = MarketAPI.getDealTerm(dealId);
 
-            if (block.timestamp < uint64(dealTerm.end) - epochs) {
+            if (block.number < uint64(dealTerm.end) - epochs || block.number > uint64(dealTerm.end)) {
                 delete expiringDealIds[i];
             }
         }
 
         return expiringDealIds;
     }
-}
+
+    function getAllCIDs() external view returns (bytes[] memory) {
+        bytes[] memory cids = new bytes[](transactionId);
+        for (uint256 i = 0; i < transactionId; i++) {
+            cids[i] = txIdToCid[i + 1];
+        }
+        return cids;
+    }
+}
diff --git a/contracts/mocks/IAggregatorOracleMock.sol b/contracts/mocks/IAggregatorOracleMock.sol
index decf76d..6a9ced7 100644
--- a/contracts/mocks/IAggregatorOracleMock.sol
+++ b/contracts/mocks/IAggregatorOracleMock.sol
@@ -13,11 +13,24 @@
     // Event emitted when a new request is
submitted event SubmitAggregatorRequest(uint256 indexed id, bytes cid); + // Emitted when a new request is submitted with an ID, content identifier (CID), and RaaS parameters + event SubmitAggregatorRequestWithRaaS(uint256 indexed id, bytes cid, + uint256 _replication_target, uint256 _repair_threshold, + uint256 _renew_threshold); + event CompleteAggregatorRequest(uint256 indexed id, uint64 indexed dealId); // Function that submits a new request to the oracle function submit(bytes memory _cid) external returns (uint256); + // Function to submit a new file to the aggregator, specifing the raas parameters + function submitRaaS( + bytes memory _cid, + uint256 _replication_target, + uint256 _repair_threshold, + uint256 _renew_threshold + ) external returns (uint256); + // Callback function that is called by the aggregator function complete( uint256 _id, diff --git a/hardhat.config.js b/hardhat.config.js index dcc5270..90c0f47 100644 --- a/hardhat.config.js +++ b/hardhat.config.js @@ -18,6 +18,9 @@ module.exports = { }, }, defaultNetwork: "calibrationnet", + mocha: { + timeout: 100000000 + }, networks: { localnet: { chainId: 31415926, diff --git a/package-lock.json b/package-lock.json index 55149e4..d9a4912 100644 --- a/package-lock.json +++ b/package-lock.json @@ -10,7 +10,7 @@ "license": "MIT", "dependencies": { "@glif/filecoin-address": "^2.0.18", - "@lighthouse-web3/sdk": "^0.2.5", + "@lighthouse-web3/sdk": "^0.2.7", "@nomicfoundation/hardhat-chai-matchers": "^1.0.0", "@nomicfoundation/hardhat-network-helpers": "^1.0.0", "@nomicfoundation/hardhat-toolbox": "^2.0.0", @@ -21,6 +21,7 @@ "babel-eslint": "^10.1.0", "chai-http": "^4.4.0", "cid-tool": "^3.0.0", + "cors": "^2.8.5", "dotenv": "^10.0.0", "express": "^4.18.2", "hardhat-deploy-ethers": "^0.3.0-beta.13", @@ -42,7 +43,7 @@ "cids": "^1.1.9", "ethereum-waffle": "^3.4.0", "ethers": "^5.5.1", - "hardhat": "^2.11.2", + "hardhat": "^2.17.3", "hardhat-contract-sizer": "^2.4.0", "hardhat-deploy": "^0.9.29", "hardhat-gas-reporter": "^1.0.7", @@ -1830,9 +1831,9 @@ } }, "node_modules/@lighthouse-web3/sdk": { - "version": "0.2.5", - "resolved": "https://registry.npmjs.org/@lighthouse-web3/sdk/-/sdk-0.2.5.tgz", - "integrity": "sha512-01+jFuSMWvA2IoTFJnu8nnPSK21TlNQWW2PVRoIxq3GBMRhH5U912y6cNI79yqH16fHXe6VwwDQqQv8SrEwKGQ==", + "version": "0.2.7", + "resolved": "https://registry.npmjs.org/@lighthouse-web3/sdk/-/sdk-0.2.7.tgz", + "integrity": "sha512-IHc9SnP+dUGfkj6M65EeR9UGaKRALOcNUlbmRwfVqq5VSCf6n/e8zEtmBB+LSWneB+UWSC9wCC7ZUziMZBZDGw==", "dependencies": { "@lighthouse-web3/kavach": "^0.1.2", "@peculiar/webcrypto": "^1.4.0", @@ -1957,15 +1958,15 @@ } }, "node_modules/@nomicfoundation/ethereumjs-block": { - "version": "5.0.1", - "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-block/-/ethereumjs-block-5.0.1.tgz", - "integrity": "sha512-u1Yioemi6Ckj3xspygu/SfFvm8vZEO8/Yx5a1QLzi6nVU0jz3Pg2OmHKJ5w+D9Ogk1vhwRiqEBAqcb0GVhCyHw==", - "dependencies": { - "@nomicfoundation/ethereumjs-common": "4.0.1", - "@nomicfoundation/ethereumjs-rlp": "5.0.1", - "@nomicfoundation/ethereumjs-trie": "6.0.1", - "@nomicfoundation/ethereumjs-tx": "5.0.1", - "@nomicfoundation/ethereumjs-util": "9.0.1", + "version": "5.0.2", + "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-block/-/ethereumjs-block-5.0.2.tgz", + "integrity": "sha512-hSe6CuHI4SsSiWWjHDIzWhSiAVpzMUcDRpWYzN0T9l8/Rz7xNn3elwVOJ/tAyS0LqL6vitUD78Uk7lQDXZun7Q==", + "dependencies": { + "@nomicfoundation/ethereumjs-common": "4.0.2", + "@nomicfoundation/ethereumjs-rlp": "5.0.2", + 
"@nomicfoundation/ethereumjs-trie": "6.0.2", + "@nomicfoundation/ethereumjs-tx": "5.0.2", + "@nomicfoundation/ethereumjs-util": "9.0.2", "ethereum-cryptography": "0.1.3", "ethers": "^5.7.1" }, @@ -1974,17 +1975,17 @@ } }, "node_modules/@nomicfoundation/ethereumjs-blockchain": { - "version": "7.0.1", - "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-blockchain/-/ethereumjs-blockchain-7.0.1.tgz", - "integrity": "sha512-NhzndlGg829XXbqJEYrF1VeZhAwSPgsK/OB7TVrdzft3y918hW5KNd7gIZ85sn6peDZOdjBsAXIpXZ38oBYE5A==", - "dependencies": { - "@nomicfoundation/ethereumjs-block": "5.0.1", - "@nomicfoundation/ethereumjs-common": "4.0.1", - "@nomicfoundation/ethereumjs-ethash": "3.0.1", - "@nomicfoundation/ethereumjs-rlp": "5.0.1", - "@nomicfoundation/ethereumjs-trie": "6.0.1", - "@nomicfoundation/ethereumjs-tx": "5.0.1", - "@nomicfoundation/ethereumjs-util": "9.0.1", + "version": "7.0.2", + "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-blockchain/-/ethereumjs-blockchain-7.0.2.tgz", + "integrity": "sha512-8UUsSXJs+MFfIIAKdh3cG16iNmWzWC/91P40sazNvrqhhdR/RtGDlFk2iFTGbBAZPs2+klZVzhRX8m2wvuvz3w==", + "dependencies": { + "@nomicfoundation/ethereumjs-block": "5.0.2", + "@nomicfoundation/ethereumjs-common": "4.0.2", + "@nomicfoundation/ethereumjs-ethash": "3.0.2", + "@nomicfoundation/ethereumjs-rlp": "5.0.2", + "@nomicfoundation/ethereumjs-trie": "6.0.2", + "@nomicfoundation/ethereumjs-tx": "5.0.2", + "@nomicfoundation/ethereumjs-util": "9.0.2", "abstract-level": "^1.0.3", "debug": "^4.3.3", "ethereum-cryptography": "0.1.3", @@ -1997,22 +1998,22 @@ } }, "node_modules/@nomicfoundation/ethereumjs-common": { - "version": "4.0.1", - "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-common/-/ethereumjs-common-4.0.1.tgz", - "integrity": "sha512-OBErlkfp54GpeiE06brBW/TTbtbuBJV5YI5Nz/aB2evTDo+KawyEzPjBlSr84z/8MFfj8wS2wxzQX1o32cev5g==", + "version": "4.0.2", + "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-common/-/ethereumjs-common-4.0.2.tgz", + "integrity": "sha512-I2WGP3HMGsOoycSdOTSqIaES0ughQTueOsddJ36aYVpI3SN8YSusgRFLwzDJwRFVIYDKx/iJz0sQ5kBHVgdDwg==", "dependencies": { - "@nomicfoundation/ethereumjs-util": "9.0.1", + "@nomicfoundation/ethereumjs-util": "9.0.2", "crc-32": "^1.2.0" } }, "node_modules/@nomicfoundation/ethereumjs-ethash": { - "version": "3.0.1", - "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-ethash/-/ethereumjs-ethash-3.0.1.tgz", - "integrity": "sha512-KDjGIB5igzWOp8Ik5I6QiRH5DH+XgILlplsHR7TEuWANZA759G6krQ6o8bvj+tRUz08YygMQu/sGd9mJ1DYT8w==", + "version": "3.0.2", + "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-ethash/-/ethereumjs-ethash-3.0.2.tgz", + "integrity": "sha512-8PfoOQCcIcO9Pylq0Buijuq/O73tmMVURK0OqdjhwqcGHYC2PwhbajDh7GZ55ekB0Px197ajK3PQhpKoiI/UPg==", "dependencies": { - "@nomicfoundation/ethereumjs-block": "5.0.1", - "@nomicfoundation/ethereumjs-rlp": "5.0.1", - "@nomicfoundation/ethereumjs-util": "9.0.1", + "@nomicfoundation/ethereumjs-block": "5.0.2", + "@nomicfoundation/ethereumjs-rlp": "5.0.2", + "@nomicfoundation/ethereumjs-util": "9.0.2", "abstract-level": "^1.0.3", "bigint-crypto-utils": "^3.0.23", "ethereum-cryptography": "0.1.3" @@ -2022,14 +2023,14 @@ } }, "node_modules/@nomicfoundation/ethereumjs-evm": { - "version": "2.0.1", - "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-evm/-/ethereumjs-evm-2.0.1.tgz", - "integrity": "sha512-oL8vJcnk0Bx/onl+TgQOQ1t/534GKFaEG17fZmwtPFeH8S5soiBYPCLUrvANOl4sCp9elYxIMzIiTtMtNNN8EQ==", + "version": "2.0.2", 
+ "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-evm/-/ethereumjs-evm-2.0.2.tgz", + "integrity": "sha512-rBLcUaUfANJxyOx9HIdMX6uXGin6lANCulIm/pjMgRqfiCRMZie3WKYxTSd8ZE/d+qT+zTedBF4+VHTdTSePmQ==", "dependencies": { "@ethersproject/providers": "^5.7.1", - "@nomicfoundation/ethereumjs-common": "4.0.1", - "@nomicfoundation/ethereumjs-tx": "5.0.1", - "@nomicfoundation/ethereumjs-util": "9.0.1", + "@nomicfoundation/ethereumjs-common": "4.0.2", + "@nomicfoundation/ethereumjs-tx": "5.0.2", + "@nomicfoundation/ethereumjs-util": "9.0.2", "debug": "^4.3.3", "ethereum-cryptography": "0.1.3", "mcl-wasm": "^0.7.1", @@ -2040,9 +2041,9 @@ } }, "node_modules/@nomicfoundation/ethereumjs-rlp": { - "version": "5.0.1", - "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-rlp/-/ethereumjs-rlp-5.0.1.tgz", - "integrity": "sha512-xtxrMGa8kP4zF5ApBQBtjlSbN5E2HI8m8FYgVSYAnO6ssUoY5pVPGy2H8+xdf/bmMa22Ce8nWMH3aEW8CcqMeQ==", + "version": "5.0.2", + "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-rlp/-/ethereumjs-rlp-5.0.2.tgz", + "integrity": "sha512-QwmemBc+MMsHJ1P1QvPl8R8p2aPvvVcKBbvHnQOKBpBztEo0omN0eaob6FeZS/e3y9NSe+mfu3nNFBHszqkjTA==", "bin": { "rlp": "bin/rlp" }, @@ -2051,12 +2052,12 @@ } }, "node_modules/@nomicfoundation/ethereumjs-statemanager": { - "version": "2.0.1", - "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-statemanager/-/ethereumjs-statemanager-2.0.1.tgz", - "integrity": "sha512-B5ApMOnlruVOR7gisBaYwFX+L/AP7i/2oAahatssjPIBVDF6wTX1K7Qpa39E/nzsH8iYuL3krkYeUFIdO3EMUQ==", + "version": "2.0.2", + "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-statemanager/-/ethereumjs-statemanager-2.0.2.tgz", + "integrity": "sha512-dlKy5dIXLuDubx8Z74sipciZnJTRSV/uHG48RSijhgm1V7eXYFC567xgKtsKiVZB1ViTP9iFL4B6Je0xD6X2OA==", "dependencies": { - "@nomicfoundation/ethereumjs-common": "4.0.1", - "@nomicfoundation/ethereumjs-rlp": "5.0.1", + "@nomicfoundation/ethereumjs-common": "4.0.2", + "@nomicfoundation/ethereumjs-rlp": "5.0.2", "debug": "^4.3.3", "ethereum-cryptography": "0.1.3", "ethers": "^5.7.1", @@ -2064,12 +2065,12 @@ } }, "node_modules/@nomicfoundation/ethereumjs-trie": { - "version": "6.0.1", - "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-trie/-/ethereumjs-trie-6.0.1.tgz", - "integrity": "sha512-A64It/IMpDVODzCgxDgAAla8jNjNtsoQZIzZUfIV5AY6Coi4nvn7+VReBn5itlxMiL2yaTlQr9TRWp3CSI6VoA==", + "version": "6.0.2", + "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-trie/-/ethereumjs-trie-6.0.2.tgz", + "integrity": "sha512-yw8vg9hBeLYk4YNg5MrSJ5H55TLOv2FSWUTROtDtTMMmDGROsAu+0tBjiNGTnKRi400M6cEzoFfa89Fc5k8NTQ==", "dependencies": { - "@nomicfoundation/ethereumjs-rlp": "5.0.1", - "@nomicfoundation/ethereumjs-util": "9.0.1", + "@nomicfoundation/ethereumjs-rlp": "5.0.2", + "@nomicfoundation/ethereumjs-util": "9.0.2", "@types/readable-stream": "^2.3.13", "ethereum-cryptography": "0.1.3", "readable-stream": "^3.6.0" @@ -2079,15 +2080,15 @@ } }, "node_modules/@nomicfoundation/ethereumjs-tx": { - "version": "5.0.1", - "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-tx/-/ethereumjs-tx-5.0.1.tgz", - "integrity": "sha512-0HwxUF2u2hrsIM1fsasjXvlbDOq1ZHFV2dd1yGq8CA+MEYhaxZr8OTScpVkkxqMwBcc5y83FyPl0J9MZn3kY0w==", + "version": "5.0.2", + "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-tx/-/ethereumjs-tx-5.0.2.tgz", + "integrity": "sha512-T+l4/MmTp7VhJeNloMkM+lPU3YMUaXdcXgTGCf8+ZFvV9NYZTRLFekRwlG6/JMmVfIfbrW+dRRJ9A6H5Q/Z64g==", "dependencies": { "@chainsafe/ssz": 
"^0.9.2", "@ethersproject/providers": "^5.7.2", - "@nomicfoundation/ethereumjs-common": "4.0.1", - "@nomicfoundation/ethereumjs-rlp": "5.0.1", - "@nomicfoundation/ethereumjs-util": "9.0.1", + "@nomicfoundation/ethereumjs-common": "4.0.2", + "@nomicfoundation/ethereumjs-rlp": "5.0.2", + "@nomicfoundation/ethereumjs-util": "9.0.2", "ethereum-cryptography": "0.1.3" }, "engines": { @@ -2095,12 +2096,12 @@ } }, "node_modules/@nomicfoundation/ethereumjs-util": { - "version": "9.0.1", - "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-util/-/ethereumjs-util-9.0.1.tgz", - "integrity": "sha512-TwbhOWQ8QoSCFhV/DDfSmyfFIHjPjFBj957219+V3jTZYZ2rf9PmDtNOeZWAE3p3vlp8xb02XGpd0v6nTUPbsA==", + "version": "9.0.2", + "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-util/-/ethereumjs-util-9.0.2.tgz", + "integrity": "sha512-4Wu9D3LykbSBWZo8nJCnzVIYGvGCuyiYLIJa9XXNVt1q1jUzHdB+sJvx95VGCpPkCT+IbLecW6yfzy3E1bQrwQ==", "dependencies": { "@chainsafe/ssz": "^0.10.0", - "@nomicfoundation/ethereumjs-rlp": "5.0.1", + "@nomicfoundation/ethereumjs-rlp": "5.0.2", "ethereum-cryptography": "0.1.3" }, "engines": { @@ -2125,19 +2126,19 @@ } }, "node_modules/@nomicfoundation/ethereumjs-vm": { - "version": "7.0.1", - "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-vm/-/ethereumjs-vm-7.0.1.tgz", - "integrity": "sha512-rArhyn0jPsS/D+ApFsz3yVJMQ29+pVzNZ0VJgkzAZ+7FqXSRtThl1C1prhmlVr3YNUlfpZ69Ak+RUT4g7VoOuQ==", - "dependencies": { - "@nomicfoundation/ethereumjs-block": "5.0.1", - "@nomicfoundation/ethereumjs-blockchain": "7.0.1", - "@nomicfoundation/ethereumjs-common": "4.0.1", - "@nomicfoundation/ethereumjs-evm": "2.0.1", - "@nomicfoundation/ethereumjs-rlp": "5.0.1", - "@nomicfoundation/ethereumjs-statemanager": "2.0.1", - "@nomicfoundation/ethereumjs-trie": "6.0.1", - "@nomicfoundation/ethereumjs-tx": "5.0.1", - "@nomicfoundation/ethereumjs-util": "9.0.1", + "version": "7.0.2", + "resolved": "https://registry.npmjs.org/@nomicfoundation/ethereumjs-vm/-/ethereumjs-vm-7.0.2.tgz", + "integrity": "sha512-Bj3KZT64j54Tcwr7Qm/0jkeZXJMfdcAtRBedou+Hx0dPOSIgqaIr0vvLwP65TpHbak2DmAq+KJbW2KNtIoFwvA==", + "dependencies": { + "@nomicfoundation/ethereumjs-block": "5.0.2", + "@nomicfoundation/ethereumjs-blockchain": "7.0.2", + "@nomicfoundation/ethereumjs-common": "4.0.2", + "@nomicfoundation/ethereumjs-evm": "2.0.2", + "@nomicfoundation/ethereumjs-rlp": "5.0.2", + "@nomicfoundation/ethereumjs-statemanager": "2.0.2", + "@nomicfoundation/ethereumjs-trie": "6.0.2", + "@nomicfoundation/ethereumjs-tx": "5.0.2", + "@nomicfoundation/ethereumjs-util": "9.0.2", "debug": "^4.3.3", "ethereum-cryptography": "0.1.3", "mcl-wasm": "^0.7.1", @@ -3448,17 +3449,6 @@ "resolved": "https://registry.npmjs.org/abbrev/-/abbrev-1.0.9.tgz", "integrity": "sha512-LEyx4aLEC3x6T0UguF6YILf+ntvmOaWsVfENmIW0E9H09vKlLDGelMjjSm0jkDHALj8A8quZ/HapKNigzwge+Q==" }, - "node_modules/abort-controller": { - "version": "3.0.0", - "resolved": "https://registry.npmjs.org/abort-controller/-/abort-controller-3.0.0.tgz", - "integrity": "sha512-h8lQ8tacZYnR3vNQTgibj+tODHI5/+l06Au2Pcriv/Gmet0eaj4TwWH41sO9wnHDiQsEj19q0drzdWdeAHtweg==", - "dependencies": { - "event-target-shim": "^5.0.0" - }, - "engines": { - "node": ">=6.5" - } - }, "node_modules/abstract-level": { "version": "1.0.3", "resolved": "https://registry.npmjs.org/abstract-level/-/abstract-level-1.0.3.tgz", @@ -5011,6 +5001,18 @@ "resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.2.tgz", "integrity": 
"sha512-3lqz5YjWTYnW6dlDa5TLaTCcShfar1e40rmcJVwCBJC6mWlFuj0eCHIElmG1g5kyuJ/GD+8Wn4FFCcz4gJPfaQ==" }, + "node_modules/cors": { + "version": "2.8.5", + "resolved": "https://registry.npmjs.org/cors/-/cors-2.8.5.tgz", + "integrity": "sha512-KIHbLJqu73RGr/hnbrO9uBeixNGuvSQjul/jdFvS/KFSIH1hWVd1ng7zOHx+YrEfInLG7q4n6GHQ9cDtxv/P6g==", + "dependencies": { + "object-assign": "^4", + "vary": "^1" + }, + "engines": { + "node": ">= 0.10" + } + }, "node_modules/cosmiconfig": { "version": "8.2.0", "resolved": "https://registry.npmjs.org/cosmiconfig/-/cosmiconfig-8.2.0.tgz", @@ -6969,14 +6971,6 @@ "npm": ">=3" } }, - "node_modules/event-target-shim": { - "version": "5.0.1", - "resolved": "https://registry.npmjs.org/event-target-shim/-/event-target-shim-5.0.1.tgz", - "integrity": "sha512-i/2XbnSz/uxRCU6+NdVJgKWDTM427+MqYbkQzD321DuCQJUqOuJKIA0IM2+W2xtYHdKOmZ4dR6fExsd4SXL+WQ==", - "engines": { - "node": ">=6" - } - }, "node_modules/eventemitter3": { "version": "4.0.7", "resolved": "https://registry.npmjs.org/eventemitter3/-/eventemitter3-4.0.7.tgz", @@ -12997,8 +12991,6 @@ }, "node_modules/ganache-core/node_modules/keccak": { "version": "3.0.1", - "resolved": "https://registry.npmjs.org/keccak/-/keccak-3.0.1.tgz", - "integrity": "sha512-epq90L9jlFWCW7+pQa6JOnKn2Xgl2mtI664seYR6MHskvI9agt7AnDqmAlp9TqU4/caMYbA08Hi5DMZAl5zdkA==", "dev": true, "hasInstallScript": true, "inBundle": true, @@ -13572,8 +13564,6 @@ }, "node_modules/ganache-core/node_modules/node-addon-api": { "version": "2.0.2", - "resolved": "https://registry.npmjs.org/node-addon-api/-/node-addon-api-2.0.2.tgz", - "integrity": "sha512-Ntyt4AIXyaLIuMHF6IOoTakB3K+RWxwtsHNRxllEoA6vPwP9o4866g6YWDLUdnucilZhmkxiHwHr11gAENw+QA==", "dev": true, "inBundle": true, "license": "MIT" @@ -17120,27 +17110,26 @@ "integrity": "sha512-xbbCH5dCYU5T8LcEhhuh7HJ88HXuW3qsI3Y0zOZFKfZEHcpWiHU/Jxzk629Brsab/mMiHQti9wMP+845RPe3Vg==" }, "node_modules/hardhat": { - "version": "2.17.0", - "resolved": "https://registry.npmjs.org/hardhat/-/hardhat-2.17.0.tgz", - "integrity": "sha512-CaEGa13tkJNe2/rdaBiive4pmdNShwxvdWVhr1zfb6aVpRhQt9VNO0l/UIBt/zzajz38ZFjvhfM2bj8LDXo9gw==", + "version": "2.17.3", + "resolved": "https://registry.npmjs.org/hardhat/-/hardhat-2.17.3.tgz", + "integrity": "sha512-SFZoYVXW1bWJZrIIKXOA+IgcctfuKXDwENywiYNT2dM3YQc4fXNaTbuk/vpPzHIF50upByx4zW5EqczKYQubsA==", "dependencies": { "@ethersproject/abi": "^5.1.2", "@metamask/eth-sig-util": "^4.0.0", - "@nomicfoundation/ethereumjs-block": "5.0.1", - "@nomicfoundation/ethereumjs-blockchain": "7.0.1", - "@nomicfoundation/ethereumjs-common": "4.0.1", - "@nomicfoundation/ethereumjs-evm": "2.0.1", - "@nomicfoundation/ethereumjs-rlp": "5.0.1", - "@nomicfoundation/ethereumjs-statemanager": "2.0.1", - "@nomicfoundation/ethereumjs-trie": "6.0.1", - "@nomicfoundation/ethereumjs-tx": "5.0.1", - "@nomicfoundation/ethereumjs-util": "9.0.1", - "@nomicfoundation/ethereumjs-vm": "7.0.1", + "@nomicfoundation/ethereumjs-block": "5.0.2", + "@nomicfoundation/ethereumjs-blockchain": "7.0.2", + "@nomicfoundation/ethereumjs-common": "4.0.2", + "@nomicfoundation/ethereumjs-evm": "2.0.2", + "@nomicfoundation/ethereumjs-rlp": "5.0.2", + "@nomicfoundation/ethereumjs-statemanager": "2.0.2", + "@nomicfoundation/ethereumjs-trie": "6.0.2", + "@nomicfoundation/ethereumjs-tx": "5.0.2", + "@nomicfoundation/ethereumjs-util": "9.0.2", + "@nomicfoundation/ethereumjs-vm": "7.0.2", "@nomicfoundation/solidity-analyzer": "^0.1.0", "@sentry/node": "^5.18.1", "@types/bn.js": "^5.1.0", "@types/lru-cache": "^5.1.0", - "abort-controller": "^3.0.0", "adm-zip": 
"^0.4.16", "aggregate-error": "^3.0.0", "ansi-escapes": "^4.3.0", @@ -17177,9 +17166,6 @@ "bin": { "hardhat": "internal/cli/bootstrap.js" }, - "engines": { - "node": ">=16.0.0" - }, "peerDependencies": { "ts-node": "*", "typescript": "*" diff --git a/package.json b/package.json index 58d6136..9d5d4d7 100644 --- a/package.json +++ b/package.json @@ -29,7 +29,7 @@ "cids": "^1.1.9", "ethereum-waffle": "^3.4.0", "ethers": "^5.5.1", - "hardhat": "^2.11.2", + "hardhat": "^2.17.3", "hardhat-contract-sizer": "^2.4.0", "hardhat-deploy": "^0.9.29", "hardhat-gas-reporter": "^1.0.7", @@ -40,7 +40,7 @@ }, "dependencies": { "@glif/filecoin-address": "^2.0.18", - "@lighthouse-web3/sdk": "^0.2.5", + "@lighthouse-web3/sdk": "^0.2.7", "@nomicfoundation/hardhat-chai-matchers": "^1.0.0", "@nomicfoundation/hardhat-network-helpers": "^1.0.0", "@nomicfoundation/hardhat-toolbox": "^2.0.0", @@ -51,6 +51,7 @@ "babel-eslint": "^10.1.0", "chai-http": "^4.4.0", "cid-tool": "^3.0.0", + "cors": "^2.8.5", "dotenv": "^10.0.0", "express": "^4.18.2", "hardhat-deploy-ethers": "^0.3.0-beta.13", diff --git a/public/index.html b/public/index.html index b309f8f..ba64a91 100644 --- a/public/index.html +++ b/public/index.html @@ -117,8 +117,8 @@
 Deploy a Repair, Renewal and Replication Worker
 
 Workers can be registered individually! Job workers don't have to be bundled, and each job can have 0, 1, or 2 workers. Simply change the default input in `service.js` for your dApp, or use the sample frontend code in `public/index.html` to allow for user input.
 
 The CID is passed to the POST endpoint of the service node, which submits a data deal on-chain. A deal is also created with the aggregator.
 
 This whole process may take up to 24 hours, during which you do not have to keep the node running.
 
-If you're unsure what to do, try with the placeholder CID! It uploads a 1KB json file as a deal on-chain and to lighthouse.
+If you're unsure what to do, try with the placeholder CID! It uploads a 1MB doc file as a deal on-chain and to lighthouse.
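Since `api/service.js` now enables CORS, the frontend above can register jobs straight from the browser. A minimal sketch of that call (the field values are the same placeholders as in the curl example earlier):

```js
// Sketch: registering a replication job from the browser; CORS is enabled in api/service.js.
async function registerJob(cid) {
    const body = new URLSearchParams({
        cid,                       // CID to replicate
        endDate: "2023-07-15",
        jobType: "replication",
        replicationTarget: "1",
        aggregator: "lighthouse",
        epochs: "1000",
    });
    const res = await fetch("http://localhost:1337/api/register_job", {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body,
    });
    console.log(await res.text());
}
```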