
m7 contracts deployment#2465

Merged
1 commit merged into master from feat/m7-contracts
Sep 12, 2025

Conversation


@ghost ghost commented Sep 12, 2025

Description

Also cleaned up unused address generation.

Fixes # (issue)

Type of change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist before requesting a review

  • I have performed a self-review of my code.
  • If it is a core feature, I have added thorough tests.

Deployment

  • Should publish npm package
  • Should publish Docker image

@ghost ghost self-assigned this Sep 12, 2025
@ghost ghost self-requested a review as a code owner September 12, 2025 06:58
coderabbitai bot commented Sep 12, 2025

📝 Walkthrough


Removes tag-based data feed processing from the v0.1 address-generation script, simplifying outputs to a single other-addresses.json. In the PoR app, removes early checks for zero decimals and denominator, proceeding with computations based on adapter.Decimals without explicit zero-guard conditions.

Changes

Cohort / File(s): Summary of Changes

Address generation script simplification
contracts/v0.1/scripts/generate-contract-addresses.js
Removed tag fetching and dataFeeds logic; deleted fetchTags and convertContractType; updated the readDeployments signature to async (folderPath), now returning only others; skipped Aggregator_* and AggregatorProxy_* files; normalized non-aggregator contract names; main() now writes only other-addresses.json.

PoR reporting logic adjustments
node/pkg/por/app.go
New(ctx) no longer errors when Decimals == 0; shouldReport removed the zero-denominator guard and always computes using math.Pow10(e.adapter.Decimals).
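Removing the zero-denominator guard is benign for the zero-decimals case, since 10^0 = 1 and no division by zero can occur. A minimal sketch of the scaling logic, written in JavaScript for illustration (the actual code is Go, and the function name here is hypothetical):

```javascript
// Hypothetical sketch mirroring the shouldReport scaling: denominator = 10^decimals.
// With decimals === 0 the denominator is 1, so the division is always well-defined.
function scaledDeviation(oldVal, newVal, decimals) {
  const denominator = 10 ** decimals // 10 ** 0 === 1
  const oldScaled = oldVal / denominator
  const newScaled = newVal / denominator
  return Math.abs(newScaled - oldScaled) / oldScaled
}

console.log(scaledDeviation(100, 105, 0)) // 0.05
console.log(scaledDeviation(100_000_000, 105_000_000, 8)) // 0.05
```

Note the deviation is scale-invariant here, which is why dropping the guard changes behavior only for entries that previously errored out at construction time.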

Sequence Diagram(s)

sequenceDiagram
  autonumber
  participant U as CLI/User
  participant S as generate-contract-addresses.js
  participant FS as File System

  U->>S: Run script with deployments folder
  S->>FS: Read deployments per network
  note over S,FS: Aggregator_* and AggregatorProxy_* skipped
  S->>S: Normalize contract names (trim trailing _ segments)
  S->>FS: Write other-addresses.json
  note right of S: dataFeeds output removed
sequenceDiagram
  autonumber
  participant R as Reporter
  participant A as PoR App (New, shouldReport)
  participant M as Math

  R->>A: New(ctx)
  A->>A: Build entry without Decimals==0 check
  R->>A: shouldReport(e, lastInfo, newVal, fetchedTime)
  A->>M: denominator = 10^Decimals (no zero guard)
  M-->>A: scaled old/new values
  A-->>R: decision to report
  note right of A: Previous zero-denominator check removed

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Pre-merge checks (1 passed, 1 warning, 1 inconclusive)

❌ Failed checks (1 warning, 1 inconclusive)
Check name: Description Check
Status: ⚠️ Warning
Explanation: The current PR description contains only a one-line note ("also cleaned up unused addr generation"), leaves the "Fixes #" placeholder empty, and omits motivation, context, dependency information, and a summary of key file-level changes required by the repository template. The template sections (Type of change, Checklist, Deployment) are present but not completed, and the description does not explain behavioral impact or testing coverage. Because required template fields and essential context are missing, the description does not meet the repository's template requirements.
Resolution: Expand the Description to include a clear summary of what changed and why; fill in or remove the "Fixes #" issue reference with the actual issue number if applicable; list any dependencies; indicate the appropriate "Type of change" by checking the box; complete the pre-review checklist (self-review and tests); note whether npm/Docker publication is required; and add a brief changelog of the main file-level changes (e.g., removal of dataFeeds and Aggregator handling, readDeployments signature change, app validation changes) to help reviewers evaluate the PR.

Check name: Title Check
Status: ❓ Inconclusive
Explanation: The title "m7 contracts deployment" is short but ambiguous: the token "m7" is unexplained and "contracts deployment" is generic. The PR actually performs a cleanup of unused contract address generation and removes dataFeeds/Aggregator handling, which the current title does not clearly convey. For a reviewer scanning history the title is too vague to identify the primary change.
Resolution: Update the title to a concise, descriptive sentence that highlights the main change, for example "Remove dataFeeds logic and cleanup contract address generation" or "Cleanup unused contract address generation and remove Aggregator/dataFeed handling"; ensure the chosen title clearly reflects the primary intent of the changeset so reviewers can understand the change at a glance.
✅ Passed checks (1 passed)
Check name: Docstring Coverage
Status: ✅ Passed
Explanation: No functions found in the changes. Docstring coverage check skipped.

Poem

I hop through commits, light on my feet,
Trimmed the tags, oh what a treat!
One JSON moon, not two in the night,
Denoms unchecked, math takes flight.
In fields of code, I nibble and grin—
Less fluff, more run—let changes begin! 🐇✨



@ghost ghost merged commit caf014f into master Sep 12, 2025
2 checks passed
@ghost ghost deleted the feat/m7-contracts branch September 12, 2025 07:02

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (3)
contracts/v0.1/scripts/generate-contract-addresses.js (3)

31-33: Use path.relative + path.sep for cross-platform network extraction.

Apply:

-          const network = filePath.replace(deploymentsPath, '').split('/')[0]
+          const relative = path.relative(deploymentsPath, filePath)
+          const network = relative.split(path.sep)[0]

51-51: Validate addresses (format and non-zero) before writing.

Add helper:

// place near top-level helpers
const isValidAddress = (addr) =>
  typeof addr === 'string' &&
  /^0x[a-fA-F0-9]{40}$/.test(addr) &&
  addr.toLowerCase() !== '0x0000000000000000000000000000000000000000'

And guard assignment:

-          others[network][contractName] = address
+          if (!isValidAddress(address)) {
+            console.warn(`Skipping invalid address in ${filePath}: ${address}`)
+            return
+          }
+          others[network][contractName] = address
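A quick sanity check of the suggested validator, copied standalone (inputs are illustrative):

```javascript
// Standalone copy of the suggested helper for a quick sanity check.
const isValidAddress = (addr) =>
  typeof addr === 'string' &&
  /^0x[a-fA-F0-9]{40}$/.test(addr) &&
  addr.toLowerCase() !== '0x0000000000000000000000000000000000000000'

console.log(isValidAddress('0x' + 'ab'.repeat(20))) // true: well-formed, non-zero
console.log(isValidAddress('0x' + '0'.repeat(40))) // false: zero address
console.log(isValidAddress('not-an-address')) // false: malformed
```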

43-51: No collisions found; still recommend defensive normalization + overwrite warning

Verified: ran the duplicate-check against contracts/v0.1/deployments (baobab, cypress) — no duplicate base names found. Update the normalization to remove all whitespace and warn on overwrites.

File: contracts/v0.1/scripts/generate-contract-addresses.js Lines: 43-51

-          const splitted = contractName.replace(' ', '').split('_')
-          if (splitted.length > 1) {
-            splitted.pop()
-            contractName = splitted.join('_')
-          }
-          others[network][contractName] = address
+          const nameNoSpace = contractName.replace(/\s+/g, '')
+          const parts = nameNoSpace.split('_')
+          const baseName = parts.length > 1 ? parts.slice(0, -1).join('_') : nameNoSpace
+          if (others[network][baseName] && others[network][baseName] !== address) {
+            console.warn(`Overwriting ${network}/${baseName}: ${others[network][baseName]} -> ${address}`)
+          }
+          others[network][baseName] = address
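As a quick check, the proposed base-name normalization can be exercised in isolation (the deployment file names are taken from this PR; the function name is illustrative):

```javascript
// Standalone copy of the suggested normalization: strip all whitespace,
// then drop the trailing underscore-delimited segment (typically a timestamp).
function baseName(contractName) {
  const nameNoSpace = contractName.replace(/\s+/g, '')
  const parts = nameNoSpace.split('_')
  return parts.length > 1 ? parts.slice(0, -1).join('_') : nameNoSpace
}

console.log(baseName('PORProxy_AAPL_20250912150912')) // 'PORProxy_AAPL'
console.log(baseName('Registry')) // 'Registry' (no underscore, unchanged)
```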
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f2aa1fc and 6e80359.

⛔ Files ignored due to path filters (45)
  • contracts/v0.1/deployments/baobab/PORProxy_AAPL_20250912150912.json is excluded by !**/*.json
  • contracts/v0.1/deployments/baobab/PORProxy_AMZN_20250912150912.json is excluded by !**/*.json
  • contracts/v0.1/deployments/baobab/PORProxy_GOOGL_20250912150912.json is excluded by !**/*.json
  • contracts/v0.1/deployments/baobab/PORProxy_META_20250912150912.json is excluded by !**/*.json
  • contracts/v0.1/deployments/baobab/PORProxy_MSFT_20250912150912.json is excluded by !**/*.json
  • contracts/v0.1/deployments/baobab/PORProxy_NVDA_20250912150912.json is excluded by !**/*.json
  • contracts/v0.1/deployments/baobab/PORProxy_TSLA_20250912150912.json is excluded by !**/*.json
  • contracts/v0.1/deployments/baobab/POR_AAPL_20250912150912.json is excluded by !**/*.json
  • contracts/v0.1/deployments/baobab/POR_AMZN_20250912150912.json is excluded by !**/*.json
  • contracts/v0.1/deployments/baobab/POR_GOOGL_20250912150912.json is excluded by !**/*.json
  • contracts/v0.1/deployments/baobab/POR_META_20250912150912.json is excluded by !**/*.json
  • contracts/v0.1/deployments/baobab/POR_MSFT_20250912150912.json is excluded by !**/*.json
  • contracts/v0.1/deployments/baobab/POR_NVDA_20250912150912.json is excluded by !**/*.json
  • contracts/v0.1/deployments/baobab/POR_TSLA_20250912150912.json is excluded by !**/*.json
  • contracts/v0.1/deployments/cypress/PORProxy_AAPL_20250912150954.json is excluded by !**/*.json
  • contracts/v0.1/deployments/cypress/PORProxy_AMZN_20250912150954.json is excluded by !**/*.json
  • contracts/v0.1/deployments/cypress/PORProxy_GOOGL_20250912150954.json is excluded by !**/*.json
  • contracts/v0.1/deployments/cypress/PORProxy_META_20250912150954.json is excluded by !**/*.json
  • contracts/v0.1/deployments/cypress/PORProxy_MSFT_20250912150954.json is excluded by !**/*.json
  • contracts/v0.1/deployments/cypress/PORProxy_NVDA_20250912150954.json is excluded by !**/*.json
  • contracts/v0.1/deployments/cypress/PORProxy_TSLA_20250912150954.json is excluded by !**/*.json
  • contracts/v0.1/deployments/cypress/POR_AAPL_20250912150954.json is excluded by !**/*.json
  • contracts/v0.1/deployments/cypress/POR_AMZN_20250912150954.json is excluded by !**/*.json
  • contracts/v0.1/deployments/cypress/POR_GOOGL_20250912150954.json is excluded by !**/*.json
  • contracts/v0.1/deployments/cypress/POR_META_20250912150954.json is excluded by !**/*.json
  • contracts/v0.1/deployments/cypress/POR_MSFT_20250912150954.json is excluded by !**/*.json
  • contracts/v0.1/deployments/cypress/POR_NVDA_20250912150954.json is excluded by !**/*.json
  • contracts/v0.1/deployments/cypress/POR_TSLA_20250912150954.json is excluded by !**/*.json
  • contracts/v0.1/deployments/other-addresses.json is excluded by !**/*.json
  • contracts/v0.1/migration/baobab/POR/20250912151144_AAPL.json is excluded by !**/*.json
  • contracts/v0.1/migration/baobab/POR/20250912151159_MSFT.json is excluded by !**/*.json
  • contracts/v0.1/migration/baobab/POR/20250912151206_AMZN.json is excluded by !**/*.json
  • contracts/v0.1/migration/baobab/POR/20250912151213_GOOGL.json is excluded by !**/*.json
  • contracts/v0.1/migration/baobab/POR/20250912151235_META.json is excluded by !**/*.json
  • contracts/v0.1/migration/baobab/POR/20250912151246_NVDA.json is excluded by !**/*.json
  • contracts/v0.1/migration/baobab/POR/20250912151253_TSLA.json is excluded by !**/*.json
  • contracts/v0.1/migration/baobab/POR/migration.lock is excluded by !**/*.lock, !**/*.lock
  • contracts/v0.1/migration/cypress/POR/20250912151313_AAPL.json is excluded by !**/*.json
  • contracts/v0.1/migration/cypress/POR/20250912151323_MSFT.json is excluded by !**/*.json
  • contracts/v0.1/migration/cypress/POR/20250912151332_AMZN.json is excluded by !**/*.json
  • contracts/v0.1/migration/cypress/POR/20250912151339_GOOGL.json is excluded by !**/*.json
  • contracts/v0.1/migration/cypress/POR/20250912151350_META.json is excluded by !**/*.json
  • contracts/v0.1/migration/cypress/POR/20250912151402_NVDA.json is excluded by !**/*.json
  • contracts/v0.1/migration/cypress/POR/20250912151410_TSLA.json is excluded by !**/*.json
  • contracts/v0.1/migration/cypress/POR/migration.lock is excluded by !**/*.lock, !**/*.lock
📒 Files selected for processing (2)
  • contracts/v0.1/scripts/generate-contract-addresses.js (5 hunks)
  • node/pkg/por/app.go (0 hunks)
💤 Files with no reviewable changes (1)
  • node/pkg/por/app.go
🧰 Additional context used
🧠 Learnings (2)
📓 Common learnings
Learnt from: martinkersner
PR: Bisonai/orakl#0
File: :0-0
Timestamp: 2024-07-27T01:26:16.468Z
Learning: The `fetchTags` function in `contracts/v0.1/scripts/generate-contract-addresses.js` was refactored for better error handling and data processing, as per the changes committed in https://github.com/Bisonai/orakl/pull/1364/commits/13f8132c6c41acf08e63c60b0f279f903d944bdc.
Learnt from: martinkersner
PR: Bisonai/orakl#1244
File: contracts/v0.2/src/Feed.sol:0-0
Timestamp: 2024-07-27T01:26:16.468Z
Learning: The `changeOracles` function in the `Feed.sol` contract includes input validation to skip adding zero addresses, as per the user's fix.
📚 Learning: 2024-07-27T01:26:16.468Z
Learnt from: martinkersner
PR: Bisonai/orakl#0
File: :0-0
Timestamp: 2024-07-27T01:26:16.468Z
Learning: The `fetchTags` function in `contracts/v0.1/scripts/generate-contract-addresses.js` was refactored for better error handling and data processing, as per the changes committed in https://github.com/Bisonai/orakl/pull/1364/commits/13f8132c6c41acf08e63c60b0f279f903d944bdc.

Applied to files:

  • contracts/v0.1/scripts/generate-contract-addresses.js
🧬 Code graph analysis (1)
contracts/v0.1/scripts/generate-contract-addresses.js (1)
contracts/v0.2/script/collect-addresses.cjs (1)
  • network (49-49)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: core-build
  • GitHub Check: build
🔇 Additional comments (5)
contracts/v0.1/scripts/generate-contract-addresses.js (5)

63-64: Return shape LGTM.


77-78: Main flow LGTM (tags removed).


3-3: Remove unused import axios (verify repo‑wide before removal).
contracts/v0.1/scripts/generate-contract-addresses.js — remove:

-const axios = require('axios')

Sandbox search failed (ripgrep skipped files / script error). Confirm no other usage locally:

git grep -n axios || rg -n --hidden -S '\baxios\b' --glob '!node_modules/**'

14-16: Signature simplified — safe: no external v0.1 callers depend on removed tags arg.

Only two-arg usage found is in contracts/v0.2/script/collect-addresses.cjs (it defines/uses readDeployments(folderPath, tags)); contracts/v0.1/scripts/generate-contract-addresses.js and contracts/v0.1/utils/index.ts use the single-arg signature. No action required.


79-82: Keep JSON.stringify when calling storeJson; unify the filename to avoid cross-version breakage.

  • storeJson writes the provided data verbatim (see contracts/v0.1/scripts/utils.cjs — it calls writeFileSync with the passed data); callers in this repo already pass JSON.stringify, so removing the stringify would be incorrect and break existing callers (e.g., contracts/v0.1/scripts/generate-contract-addresses.js, contracts/v0.1/scripts/admin-aggregator/generate-aggregator-deployments.cjs).
  • Filename mismatch: v0.1 script writes "other-addresses.json" (contracts/v0.1/scripts/generate-contract-addresses.js) while v0.2 expects "others-addresses.json" (contracts/v0.2/script/collect-addresses.cjs and contracts/v0.2/README.md) — pick one convention and update the scripts/README accordingly.

Likely an incorrect or invalid review comment.

Comment on lines +28 to +30
if (file.startsWith('Aggregator_') || file.startsWith('AggregatorProxy_')) {
return
}

💡 Verification agent

🧩 Analysis chain

Skipping Aggregator_* — confirm no consumer still expects datafeeds outputs.

This removes datafeeds entirely. Ensure nothing reads datafeeds-addresses.json or Aggregator-derived mappings.


🏁 Script executed:

#!/bin/bash
rg -n 'datafeeds-addresses\.json|AggregatorProxy_|Aggregator_' -g '!**/node_modules/**'

Length of output: 148


🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "PWD: $(pwd)"
echo "--- Top-level files ---"
ls -la | sed -n '1,200p' || true

echo "--- .git present? ---"
if [ -d .git ]; then echo "FOUND .git"; else echo "NO .git"; fi

echo "--- Git branch (if any) ---"
git rev-parse --abbrev-ref HEAD 2>/dev/null || true

echo "--- Tool availability ---"
command -v rg >/dev/null 2>&1 && rg --version || echo "rg: not available"
command -v git >/dev/null 2>&1 && git --version || echo "git: not available"
command -v ast-grep >/dev/null 2>&1 && ast-grep --version || echo "ast-grep: not available"
command -v fd >/dev/null 2>&1 && fd --version || echo "fd: not available"

TARGET="contracts/v0.1/scripts/generate-contract-addresses.js"
echo "--- Check target file ---"
if [ -f "$TARGET" ]; then
  echo "FOUND: $TARGET"
  echo "------ $TARGET (first 300 lines with line numbers) ------"
  nl -ba "$TARGET" | sed -n '1,300p'
else
  echo "NOT FOUND: $TARGET"
fi

echo "--- Search repo with rg (no ignore) ---"
if command -v rg >/dev/null 2>&1; then
  rg -n --no-ignore -S 'datafeeds-addresses\.json|datafeeds-addresses|AggregatorProxy_|Aggregator_' || true
else
  echo "Skipping rg search (rg not installed)"
fi

echo "--- Search repo with git grep ---"
if command -v git >/dev/null 2>&1; then
  git grep -n -- 'datafeeds-addresses' || true
  git grep -n -- 'AggregatorProxy_' || true
  git grep -n -- 'Aggregator_' || true
else
  echo "Skipping git grep (git not available)"
fi

echo "--- Fallback grep ---"
grep -RIn --exclude-dir=node_modules --exclude-dir=.git -e 'datafeeds-addresses' -e 'AggregatorProxy_' -e 'Aggregator_' . || true

echo "--- Search for literal filename datafeeds-addresses.json ---"
if command -v rg >/dev/null 2>&1; then
  rg -n --no-ignore -S 'datafeeds-addresses\.json' || true
else
  grep -RIn --exclude-dir=node_modules --exclude-dir=.git -e 'datafeeds-addresses.json' . || true
fi

Length of output: 1917


🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "PWD: $(pwd)"
echo "---- Show target file (contracts/v0.1/scripts/generate-contract-addresses.js) ----"
sed -n '1,400p' contracts/v0.1/scripts/generate-contract-addresses.js || true

echo
echo "---- ripgrep (search for datafeeds / Aggregator references) ----"
if command -v rg >/dev/null 2>&1; then
  rg -n --hidden --no-ignore -S -e 'datafeeds-addresses\.json' -e 'datafeeds-addresses' -e 'datafeeds' -e 'AggregatorProxy_' -e 'Aggregator_' || true
else
  echo "rg: not available"
fi

echo
echo "---- git grep fallback ----"
if command -v git >/dev/null 2>&1; then
  git grep -n -- 'datafeeds-addresses' || true
  git grep -n -- 'AggregatorProxy_' || true
  git grep -n -- 'Aggregator_' || true
else
  echo "git: not available"
fi

echo
echo "---- find literal file ----"
find . -type f -iname 'datafeeds-addresses.json' -print || true

echo
echo "---- grep fallback ----"
grep -RIn --exclude-dir=node_modules --exclude-dir=.git -e 'datafeeds-addresses' -e 'datafeeds' -e 'AggregatorProxy_' -e 'Aggregator_' . || true

Length of output: 8926


Restore or separately emit Aggregator/datafeeds addresses — consumers still expect them.

          if (file.startsWith('Aggregator_') || file.startsWith('AggregatorProxy_')) {
            return
          }
  • Evidence: core/src/worker/data-feed.utils.ts, core/src/worker/data-feed-L2.ts, core/src/listener/data-feed-L2.ts (import Aggregator__factory); contracts/v0.2/script/collect-addresses.cjs (references "datafeeds-addresses.json"); contracts/v0.2/script/prepare-submission-proxy-migration.py; contracts/v0.2/README.md; contracts/v0.1/deploy/Aggregator/Aggregator.cjs (creates Aggregator_* / AggregatorProxy_* names).
  • Action: either stop skipping Aggregator_* here or add a dedicated step that emits datafeeds-addresses.json (and update/remove consumers accordingly).
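The second option above (keep the skip, but emit a separate datafeeds file) could be sketched roughly as follows; every name here is illustrative, not the repository's actual code:

```javascript
// Hypothetical sketch: route Aggregator_* / AggregatorProxy_* entries into a
// separate dataFeeds bucket instead of dropping them, so a datafeeds file
// could be written alongside other-addresses.json.
function collect(entries) {
  const others = {}
  const dataFeeds = {}
  for (const { name, address } of entries) {
    const bucket =
      name.startsWith('Aggregator_') || name.startsWith('AggregatorProxy_')
        ? dataFeeds
        : others
    bucket[name] = address
  }
  return { others, dataFeeds }
}

const { others, dataFeeds } = collect([
  { name: 'Aggregator_BTC-USDT', address: '0xaaa' },
  { name: 'Registry', address: '0xbbb' },
])
console.log(Object.keys(dataFeeds)) // [ 'Aggregator_BTC-USDT' ]
console.log(Object.keys(others)) // [ 'Registry' ]
```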
🤖 Prompt for AI Agents
contracts/v0.1/scripts/generate-contract-addresses.js lines 28-30: the current
early return skips files named Aggregator_* and AggregatorProxy_ which prevents
emitting datafeed addresses consumers expect; either remove this conditional so
Aggregator_* and AggregatorProxy_* entries are included in the main output, or
keep the skip but add a new step that collects those Aggregator entries into a
separate datafeeds-addresses.json (same format as other outputs) and write that
file to the same output directory; update any downstream references if you
choose the new file approach.
