
fix(deps): use linkedin_scraper fork with rate limit fix#139

Merged
stickerdaniel merged 1 commit into main from 02-12-fix_deps_use_linkedin_scraper_fork_with_rate_limit_fix on Feb 12, 2026

Conversation

@stickerdaniel (Owner) commented Feb 12, 2026

Point dependency at stickerdaniel/linkedin_scraper fork
(fix/rate-limit-false-positive) to fix detect_rate_limit()
false-firing on React RSC payloads.

Also update docs with detailed release workflow notes and
bump opencode agent models to gpt-5.3-codex.

See also: joeyism/linkedin_scraper#278


Note

Medium Risk
This change swaps the source of a core scraping dependency from a pinned PyPI version to a specific git revision, which can alter runtime behavior and reproducibility. The other changes are configuration/docs-only and low impact.

Overview
Switches the linkedin-scraper dependency to a git-sourced fork/branch via uv ([tool.uv.sources] and uv.lock), replacing the previous >=3.1.1 PyPI constraint.

Bumps the package version to 2.3.6 and updates release documentation in AGENTS.md to clarify that the GitHub Actions workflow performs most release steps.

Updates .opencode agent configs to use openai/gpt-5.3-codex with a lower variant setting (from xhigh to high).

Written by Cursor Bugbot for commit d5de8ef.


This stack of pull requests is managed by Graphite.

@stickerdaniel stickerdaniel marked this pull request as ready for review February 12, 2026 07:49
Copilot AI review requested due to automatic review settings February 12, 2026 07:49
@greptile-apps bot left a comment

@stickerdaniel stickerdaniel merged commit 6b6315e into main Feb 12, 2026
5 checks passed
@stickerdaniel stickerdaniel deleted the 02-12-fix_deps_use_linkedin_scraper_fork_with_rate_limit_fix branch February 12, 2026 07:49
@cursor bot left a comment

Cursor Bugbot has reviewed your changes and found 1 potential issue.

"fastmcp>=2.14.0",
"inquirer>=3.4.0",
"linkedin-scraper>=3.1.1",
"linkedin-scraper",

Removed minimum version constraint on published dependency

High Severity

The linkedin-scraper dependency in [project.dependencies] dropped its >=3.1.1 minimum version constraint and now has no version specifier at all. Since [tool.uv.sources] is a uv-specific override that is not included in built package metadata, anyone installing this package from PyPI (via pip, uvx, or non-uv tools) will resolve linkedin-scraper from PyPI with no version floor — potentially pulling an old, incompatible version. The fork's rate-limit fix also won't reach end users this way. The >=3.1.1 specifier needs to be preserved alongside the [tool.uv.sources] override.
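
A minimal sketch of the fix suggested above, assuming the branch name used in this PR: keep the PEP 440 floor in [project.dependencies], which is what ships in the published package metadata, and let the uv-specific [tool.uv.sources] table override it for uv-driven installs.

```toml
# pyproject.toml (sketch, not the repository's full dependency list)
[project]
dependencies = [
    "fastmcp>=2.14.0",
    "inquirer>=3.4.0",
    "linkedin-scraper>=3.1.1",  # floor preserved for pip/uvx installs from PyPI
]

# uv-only override: uv resolves linkedin-scraper from the fork instead of PyPI
[tool.uv.sources]
linkedin-scraper = { git = "https://github.com/stickerdaniel/linkedin_scraper.git", rev = "fix/rate-limit-false-positive" }
```

With this layout, uv users get the fork while plain-pip users still get at least 3.1.1 from PyPI, rather than an unbounded resolution.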

Additional Locations (1)


Copilot AI left a comment

Pull request overview

This PR updates the project’s dependency resolution to use a fork of linkedin-scraper that fixes false rate-limit detection, bumps the package version, and adjusts internal agent configuration/docs to reflect the updated release workflow and model selections.

Changes:

  • Point linkedin-scraper resolution at stickerdaniel/linkedin_scraper (fix/rate-limit-false-positive) via uv sources + lockfile.
  • Bump linkedin-scraper-mcp version from 2.3.5 to 2.3.6.
  • Update AGENTS release notes and bump .opencode agent model configs to gpt-5.3-codex.

Reviewed changes

Copilot reviewed 8 out of 9 changed files in this pull request and generated 1 comment.

Show a summary per file
File: Description
uv.lock: Locks linkedin-scraper to the fork/commit and bumps the local package version entry.
pyproject.toml: Bumps project version and adds [tool.uv.sources] for forked linkedin-scraper.
AGENTS.md: Expands release workflow notes to reflect automated steps handled by CI.
.opencode/agents/type-design-analyzer.md: Updates agent model/variant configuration.
.opencode/agents/silent-failure-hunter.md: Updates agent model/variant configuration.
.opencode/agents/pr-test-analyzer.md: Updates agent model/variant configuration.
.opencode/agents/comment-analyzer.md: Updates agent model/variant configuration.
.opencode/agents/code-simplifier.md: Updates agent model/variant configuration.
.opencode/agents/code-reviewer.md: Updates agent model/variant configuration.
Comments suppressed due to low confidence (1)

pyproject.toml:41

  • project.dependencies now lists linkedin-scraper without a version or direct URL, while the fork is only specified under [tool.uv.sources]. That uv-specific section will not be part of the published wheel metadata, so PyPI/uvx/pip installs will resolve linkedin-scraper from PyPI (likely reintroducing the rate-limit false positive) instead of the fork. To actually ship the fork fix to consumers, encode the VCS dependency in project.dependencies (PEP 508 direct URL) or depend on a separately published fork package on PyPI.
dependencies = [
    "fastmcp>=2.14.0",
    "inquirer>=3.4.0",
    "linkedin-scraper",
    "playwright>=1.40.0",
    "pyperclip>=1.9.0",
    "python-dotenv>=1.1.1",
]
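
The PEP 508 direct-reference form mentioned above would look roughly like this (a sketch using the branch name from this PR). One caveat worth noting: PyPI rejects uploads whose dependencies use direct URLs, so this variant only suits packages distributed outside PyPI, e.g. installed straight from git or a private index.

```toml
# pyproject.toml (sketch): git dependency encoded in standard metadata,
# visible to pip and uvx as well as uv. PyPI will not accept a release
# carrying a direct-URL dependency, so this fits non-PyPI distribution only.
[project]
dependencies = [
    "linkedin-scraper @ git+https://github.com/stickerdaniel/linkedin_scraper.git@fix/rate-limit-false-positive",
]
```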


linkedin_mcp_server = ["py.typed"]

[tool.uv.sources]
linkedin-scraper = { git = "https://github.com/stickerdaniel/linkedin_scraper.git", rev = "fix/rate-limit-false-positive" }
Copilot AI commented Feb 12, 2026

[tool.uv.sources] pins linkedin-scraper to a branch name (rev = "fix/rate-limit-false-positive"). For reproducible builds and to reduce supply-chain risk, prefer pinning to an immutable ref (a tag or commit SHA) and update it intentionally when needed.

Suggested change
linkedin-scraper = { git = "https://github.com/stickerdaniel/linkedin_scraper.git", rev = "fix/rate-limit-false-positive" }
linkedin-scraper = { git = "https://github.com/stickerdaniel/linkedin_scraper.git", rev = "d34db33fd34db33fd34db33fd34db33fd34db33f" }

