fix(deps): use linkedin_scraper fork with rate limit fix #139
Conversation
Point dependency at stickerdaniel/linkedin_scraper fork (fix/rate-limit-false-positive) to fix detect_rate_limit() false-firing on React RSC payloads. Also update docs with detailed release workflow notes and bump opencode agent models to gpt-5.3-codex. See also: joeyism/linkedin_scraper#278
Cursor Bugbot has reviewed your changes and found 1 potential issue.
```diff
     "fastmcp>=2.14.0",
     "inquirer>=3.4.0",
-    "linkedin-scraper>=3.1.1",
+    "linkedin-scraper",
```
Removed minimum version constraint on published dependency
High Severity
The linkedin-scraper dependency in [project.dependencies] dropped its >=3.1.1 minimum version constraint and now has no version specifier at all. Since [tool.uv.sources] is a uv-specific override that is not included in built package metadata, anyone installing this package from PyPI (via pip, uvx, or non-uv tools) will resolve linkedin-scraper from PyPI with no version floor — potentially pulling an old, incompatible version. The fork's rate-limit fix also won't reach end users this way. The >=3.1.1 specifier needs to be preserved alongside the [tool.uv.sources] override.
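A minimal sketch of the fix the review describes — keeping the version floor in `[project.dependencies]` (which ships in the wheel metadata) while the uv-only source override still resolves the fork during development (branch name taken from this PR):

```toml
[project]
dependencies = [
    # The floor survives into published metadata, so pip/uvx installs
    # from PyPI still get a compatible release.
    "linkedin-scraper>=3.1.1",
]

# uv-specific override: used by uv-based workflows, ignored by pip and
# absent from the built package metadata.
[tool.uv.sources]
linkedin-scraper = { git = "https://github.com/stickerdaniel/linkedin_scraper.git", rev = "fix/rate-limit-false-positive" }
```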
Pull request overview
This PR updates the project’s dependency resolution to use a fork of linkedin-scraper that fixes false rate-limit detection, bumps the package version, and adjusts internal agent configuration/docs to reflect the updated release workflow and model selections.
Changes:
- Point `linkedin-scraper` resolution at `stickerdaniel/linkedin_scraper` (`fix/rate-limit-false-positive`) via `uv` sources + lockfile.
- Bump `linkedin-scraper-mcp` version from `2.3.5` to `2.3.6`.
- Update AGENTS release notes and bump `.opencode` agent model configs to `gpt-5.3-codex`.
Reviewed changes
Copilot reviewed 8 out of 9 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| `uv.lock` | Locks `linkedin-scraper` to the fork/commit and bumps the local package version entry. |
| `pyproject.toml` | Bumps project version and adds `[tool.uv.sources]` for the forked `linkedin-scraper`. |
| `AGENTS.md` | Expands release workflow notes to reflect automated steps handled by CI. |
| `.opencode/agents/type-design-analyzer.md` | Updates agent model/variant configuration. |
| `.opencode/agents/silent-failure-hunter.md` | Updates agent model/variant configuration. |
| `.opencode/agents/pr-test-analyzer.md` | Updates agent model/variant configuration. |
| `.opencode/agents/comment-analyzer.md` | Updates agent model/variant configuration. |
| `.opencode/agents/code-simplifier.md` | Updates agent model/variant configuration. |
| `.opencode/agents/code-reviewer.md` | Updates agent model/variant configuration. |
Comments suppressed due to low confidence (1)
pyproject.toml:41
`project.dependencies` now lists `linkedin-scraper` without a version or direct URL, while the fork is only specified under `[tool.uv.sources]`. That uv-specific section will not be part of the published wheel metadata, so PyPI/uvx/pip installs will resolve `linkedin-scraper` from PyPI (likely reintroducing the rate-limit false positive) instead of the fork. To actually ship the fork fix to consumers, encode the VCS dependency in `project.dependencies` (PEP 508 direct URL) or depend on a separately published fork package on PyPI.
```toml
dependencies = [
    "fastmcp>=2.14.0",
    "inquirer>=3.4.0",
    "linkedin-scraper",
    "playwright>=1.40.0",
    "pyperclip>=1.9.0",
    "python-dotenv>=1.1.1",
]
```
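A PEP 508 direct reference along the lines Copilot suggests might look like the sketch below. One caveat worth noting: PyPI rejects uploaded releases whose metadata contains direct-URL dependencies, so this form only works for git- or path-based installs, not for a package published to PyPI.

```toml
[project]
dependencies = [
    "fastmcp>=2.14.0",
    "inquirer>=3.4.0",
    # PEP 508 direct URL: every installer (pip, uvx, uv) resolves the
    # fork branch, but PyPI will refuse a release carrying this metadata.
    "linkedin-scraper @ git+https://github.com/stickerdaniel/linkedin_scraper.git@fix/rate-limit-false-positive",
    "playwright>=1.40.0",
    "pyperclip>=1.9.0",
    "python-dotenv>=1.1.1",
]
```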
```diff
 linkedin_mcp_server = ["py.typed"]
+
+[tool.uv.sources]
+linkedin-scraper = { git = "https://github.com/stickerdaniel/linkedin_scraper.git", rev = "fix/rate-limit-false-positive" }
```
[tool.uv.sources] pins linkedin-scraper to a branch name (rev = "fix/rate-limit-false-positive"). For reproducible builds and to reduce supply-chain risk, prefer pinning to an immutable ref (a tag or commit SHA) and update it intentionally when needed.
```diff
-linkedin-scraper = { git = "https://github.com/stickerdaniel/linkedin_scraper.git", rev = "fix/rate-limit-false-positive" }
+linkedin-scraper = { git = "https://github.com/stickerdaniel/linkedin_scraper.git", rev = "d34db33fd34db33fd34db33fd34db33fd34db33f" }
```



Point dependency at stickerdaniel/linkedin_scraper fork
(fix/rate-limit-false-positive) to fix detect_rate_limit()
false-firing on React RSC payloads.
Also update docs with detailed release workflow notes and
bump opencode agent models to gpt-5.3-codex.
See also: joeyism/linkedin_scraper#278
Note
Medium Risk
Medium risk because it changes the source of a core scraping dependency from a pinned PyPI version to a specific git revision, which can alter runtime behavior and reproducibility. Other changes are configuration/docs-only and low impact.
Overview
Switches the
linkedin-scraperdependency to a git-sourced fork/branch viauv([tool.uv.sources]anduv.lock), replacing the previous>=3.1.1PyPI constraint.Bumps the package version to
2.3.6and updates release documentation inAGENTS.mdto clarify that the GitHub Actions workflow performs most release steps.Updates
.opencodeagent configs to useopenai/gpt-5.3-codexwith a lowervariantsetting (fromxhightohigh).Written by Cursor Bugbot for commit d5de8ef. This will update automatically on new commits. Configure here.