| Affected_by_vulnerabilities |
| 0 |
| url |
VCID-1k4b-pr5k-s7e5 |
| vulnerability_id |
VCID-1k4b-pr5k-s7e5 |
| summary |
Scrapy: Arbitrary Module Import via Referrer-Policy Header in RefererMiddleware
### Impact
Since version 1.4.0, Scrapy respects the `Referrer-Policy` response header to decide whether and how to set a `Referer` header on follow-up requests.
If the header value looked like a valid Python import path, Scrapy would import the referenced object, assume it was a referrer policy class (for example, `scrapy.spidermiddlewares.referer.DefaultReferrerPolicy`), and call it to instantiate a policy for handling the `Referer` header.
A malicious site could exploit this by setting `Referrer-Policy` to a path such as `sys.exit`, causing Scrapy to import and execute it and potentially terminate the process.
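The dangerous pattern can be sketched with a simplified dotted-path loader (illustrative only, not Scrapy's actual implementation):
```python
from importlib import import_module

def load_object(path):
    # Simplified sketch of a dotted-path object loader: split off the
    # final attribute name, import the module, and fetch the attribute.
    module_path, _, name = path.rpartition(".")
    module = import_module(module_path)
    return getattr(module, name)

# A header value such as "sys.exit" resolves to a real, live callable,
# so "instantiating the policy class" would actually invoke sys.exit:
obj = load_object("sys.exit")
print(callable(obj))  # → True
```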
### Patches
Upgrade to Scrapy 2.14.2 (or later).
### Workarounds
If you cannot upgrade to Scrapy 2.14.2, consider the following mitigations.
- **Disable the middleware:** If you don't need the `Referer` header on follow-up requests, set [`REFERER_ENABLED`](https://docs.scrapy.org/en/latest/topics/spider-middleware.html#referer-enabled) to `False`.
- **Set headers manually:** If you do need a `Referer`, disable the middleware and set the header explicitly on the requests that require it.
- **Set `referrer_policy` in request metadata:** If disabling the middleware is not viable, set the [`referrer_policy`](https://docs.scrapy.org/en/latest/topics/spider-middleware.html#referrer-policy) request meta key on all requests to prevent evaluating preceding responses' `Referrer-Policy`. For example:
```python
Request(
    url,
    meta={
        "referrer_policy": "scrapy.spidermiddlewares.referer.DefaultReferrerPolicy",
    },
)
```
Instead of editing requests individually, you can:
- implement a custom [spider middleware](https://docs.scrapy.org/en/latest/topics/spider-middleware.html) that runs before the built-in referrer policy middleware and sets the `referrer_policy` meta key; or
- set the meta key in start requests and use the [scrapy-sticky-meta-params](https://github.com/heylouiz/scrapy-sticky-meta-params) plugin to propagate it to follow-up requests.
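A custom spider middleware along those lines might look like the following sketch (the class name and chosen policy value are examples, not part of Scrapy):
```python
class ForceReferrerPolicyMiddleware:
    """Hypothetical spider middleware: set the referrer_policy meta key on
    every request a spider yields, so the Referrer-Policy header of the
    preceding response is never consulted."""

    POLICY = "scrapy.spidermiddlewares.referer.DefaultReferrerPolicy"

    def process_spider_output(self, response, result, spider):
        for obj in result:
            meta = getattr(obj, "meta", None)  # items have no meta attribute
            if meta is not None:
                meta.setdefault("referrer_policy", self.POLICY)
            yield obj
```
Whether it actually runs before the built-in `RefererMiddleware` depends on the order you give it in `SPIDER_MIDDLEWARES`; verify the ordering against your Scrapy version.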
If you want to continue respecting legitimate `Referrer-Policy` headers while protecting against malicious ones, disable the built-in referrer policy middleware by setting it to `None` in [`SPIDER_MIDDLEWARES`](https://docs.scrapy.org/en/latest/topics/settings.html#std-setting-SPIDER_MIDDLEWARES) and replace it with the fixed implementation from Scrapy 2.14.2.
If the Scrapy 2.14.2 implementation is incompatible with your project (for example, because your Scrapy version is older), copy the corresponding middleware from your Scrapy version, apply the same patch, and use that as a replacement. |
| references |
|
| fixed_packages |
|
| aliases |
GHSA-cwxj-rr6w-m6w7
|
| risk_score |
4.0 |
| exploitability |
0.5 |
| weighted_severity |
8.0 |
| resource_url |
http://public2.vulnerablecode.io/vulnerabilities/VCID-1k4b-pr5k-s7e5 |
|
| 1 |
| url |
VCID-385b-344t-23es |
| vulnerability_id |
VCID-385b-344t-23es |
| summary |
Scrapy decompression bomb vulnerability
### Impact
Scrapy limits allowed response sizes by default through the [`DOWNLOAD_MAXSIZE`](https://docs.scrapy.org/en/latest/topics/settings.html#download-maxsize) and [`DOWNLOAD_WARNSIZE`](https://docs.scrapy.org/en/latest/topics/settings.html#download-warnsize) settings.
However, those limits were only being enforced during the download of the raw, usually-compressed response bodies, and not during decompression, making Scrapy vulnerable to [decompression bombs](https://cwe.mitre.org/data/definitions/409.html).
A malicious website being scraped could send a small response that, on decompression, could exhaust the memory available to the Scrapy process, potentially affecting any other process sharing that memory, and could also increase disk usage when responses are cached uncompressed.
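The gap can be illustrated with stdlib `zlib` (the actual fix is more involved and covers several content encodings); the key idea is to cap the decompressed size, not only the downloaded size:
```python
import zlib

def decompress_capped(data: bytes, max_size: int) -> bytes:
    # Sketch of the kind of check needed: stop as soon as the *decompressed*
    # output would exceed the limit, instead of only limiting the size of
    # the compressed download.
    d = zlib.decompressobj()
    out = d.decompress(data, max_size)
    if d.unconsumed_tail:  # input left over once the output cap was hit
        raise ValueError("decompressed response exceeds max_size")
    return out
```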
### Patches
Upgrade to Scrapy 2.11.1.
If you are using Scrapy 1.8 or a lower version, and upgrading to Scrapy 2.11.1 is not an option, you may upgrade to Scrapy 1.8.4 instead.
### Workarounds
There is no easy workaround.
Disabling HTTP decompression altogether is impractical, as HTTP compression is a rather common practice.
However, it is technically possible to manually backport the 2.11.1 or 1.8.4 fix, replacing the corresponding components of an unpatched version of Scrapy with patched versions copied into your own code.
### Acknowledgements
This security issue was reported by @dmandefy [through huntr.com](https://huntr.com/bounties/c4a0fac9-0c5a-4718-9ee4-2d06d58adabb/). |
| references |
| 0 |
| reference_url |
https://api.first.org/data/v1/epss?cve=CVE-2024-3572 |
| reference_id |
|
| reference_type |
|
| scores |
| 0 |
| value |
0.00157 |
| scoring_system |
epss |
| scoring_elements |
0.36675 |
| published_at |
2026-04-04T12:55:00Z |
|
| 1 |
| value |
0.00157 |
| scoring_system |
epss |
| scoring_elements |
0.36503 |
| published_at |
2026-04-21T12:55:00Z |
|
| 2 |
| value |
0.00157 |
| scoring_system |
epss |
| scoring_elements |
0.36559 |
| published_at |
2026-04-18T12:55:00Z |
|
| 3 |
| value |
0.00157 |
| scoring_system |
epss |
| scoring_elements |
0.36576 |
| published_at |
2026-04-16T12:55:00Z |
|
| 4 |
| value |
0.00157 |
| scoring_system |
epss |
| scoring_elements |
0.36532 |
| published_at |
2026-04-13T12:55:00Z |
|
| 5 |
| value |
0.00157 |
| scoring_system |
epss |
| scoring_elements |
0.36556 |
| published_at |
2026-04-12T12:55:00Z |
|
| 6 |
| value |
0.00157 |
| scoring_system |
epss |
| scoring_elements |
0.3659 |
| published_at |
2026-04-11T12:55:00Z |
|
| 7 |
| value |
0.00157 |
| scoring_system |
epss |
| scoring_elements |
0.36584 |
| published_at |
2026-04-09T12:55:00Z |
|
| 8 |
| value |
0.00157 |
| scoring_system |
epss |
| scoring_elements |
0.36565 |
| published_at |
2026-04-08T12:55:00Z |
|
| 9 |
| value |
0.00157 |
| scoring_system |
epss |
| scoring_elements |
0.36513 |
| published_at |
2026-04-07T12:55:00Z |
|
| 10 |
| value |
0.00157 |
| scoring_system |
epss |
| scoring_elements |
0.36643 |
| published_at |
2026-04-02T12:55:00Z |
|
|
| url |
https://api.first.org/data/v1/epss?cve=CVE-2024-3572 |
|
| 1 |
|
| 2 |
|
| 3 |
|
| 4 |
|
| 5 |
|
| 6 |
|
| 7 |
|
| 8 |
|
| 9 |
|
| 10 |
|
|
| fixed_packages |
|
| aliases |
CVE-2024-3572, GHSA-7j7m-v7m3-jqm7, GMS-2024-327
|
| risk_score |
4.0 |
| exploitability |
0.5 |
| weighted_severity |
8.0 |
| resource_url |
http://public2.vulnerablecode.io/vulnerabilities/VCID-385b-344t-23es |
|
| 2 |
| url |
VCID-64nx-aruy-q7gy |
| vulnerability_id |
VCID-64nx-aruy-q7gy |
| summary |
A Regular Expression Denial of Service (ReDoS) vulnerability exists in the XMLFeedSpider class of the scrapy/scrapy project, specifically in the parsing of XML content. By crafting malicious XML content that exploits inefficient regular expression complexity used in the parsing process, an attacker can cause a denial-of-service (DoS) condition. This vulnerability allows for the system to hang and consume significant resources, potentially rendering services that utilize Scrapy for XML processing unresponsive. |
| references |
| 0 |
| reference_url |
https://api.first.org/data/v1/epss?cve=CVE-2024-1892 |
| reference_id |
|
| reference_type |
|
| scores |
| 0 |
| value |
0.00058 |
| scoring_system |
epss |
| scoring_elements |
0.18213 |
| published_at |
2026-04-12T12:55:00Z |
|
| 1 |
| value |
0.00058 |
| scoring_system |
epss |
| scoring_elements |
0.1815 |
| published_at |
2026-04-21T12:55:00Z |
|
| 2 |
| value |
0.00058 |
| scoring_system |
epss |
| scoring_elements |
0.18106 |
| published_at |
2026-04-16T12:55:00Z |
|
| 3 |
| value |
0.00058 |
| scoring_system |
epss |
| scoring_elements |
0.18161 |
| published_at |
2026-04-13T12:55:00Z |
|
| 4 |
| value |
0.00058 |
| scoring_system |
epss |
| scoring_elements |
0.1836 |
| published_at |
2026-04-02T12:55:00Z |
|
| 5 |
| value |
0.00058 |
| scoring_system |
epss |
| scoring_elements |
0.18415 |
| published_at |
2026-04-04T12:55:00Z |
|
| 6 |
| value |
0.00058 |
| scoring_system |
epss |
| scoring_elements |
0.18118 |
| published_at |
2026-04-18T12:55:00Z |
|
| 7 |
| value |
0.00058 |
| scoring_system |
epss |
| scoring_elements |
0.18203 |
| published_at |
2026-04-08T12:55:00Z |
|
| 8 |
| value |
0.00058 |
| scoring_system |
epss |
| scoring_elements |
0.18257 |
| published_at |
2026-04-09T12:55:00Z |
|
| 9 |
| value |
0.00058 |
| scoring_system |
epss |
| scoring_elements |
0.18259 |
| published_at |
2026-04-11T12:55:00Z |
|
|
| url |
https://api.first.org/data/v1/epss?cve=CVE-2024-1892 |
|
| 1 |
|
| 2 |
|
| 3 |
|
| 4 |
|
| 5 |
|
| 6 |
|
| 7 |
|
| 8 |
|
| 9 |
| reference_url |
https://huntr.com/bounties/271f94f2-1e05-4616-ac43-41752389e26b |
| reference_id |
|
| reference_type |
|
| scores |
| 0 |
| value |
7.5 |
| scoring_system |
cvssv3 |
| scoring_elements |
CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H |
|
| 1 |
| value |
6.5 |
| scoring_system |
cvssv3.1 |
| scoring_elements |
CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:U/C:N/I:N/A:H |
|
| 2 |
| value |
7.5 |
| scoring_system |
cvssv3.1 |
| scoring_elements |
CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H |
|
| 3 |
| value |
HIGH |
| scoring_system |
generic_textual |
| scoring_elements |
|
|
| 4 |
| value |
Track |
| scoring_system |
ssvc |
| scoring_elements |
SSVCv2/E:N/A:Y/T:P/P:M/B:A/M:M/D:T/2024-03-05T16:44:39Z/ |
|
|
| url |
https://huntr.com/bounties/271f94f2-1e05-4616-ac43-41752389e26b |
|
| 10 |
|
| 11 |
|
| 12 |
|
|
| fixed_packages |
|
| aliases |
CVE-2024-1892, GHSA-cc65-xxvf-f7r9, GMS-2024-287, PYSEC-2024-162
|
| risk_score |
4.0 |
| exploitability |
0.5 |
| weighted_severity |
8.0 |
| resource_url |
http://public2.vulnerablecode.io/vulnerabilities/VCID-64nx-aruy-q7gy |
|
| 3 |
| url |
VCID-dc1m-rt7j-w3af |
| vulnerability_id |
VCID-dc1m-rt7j-w3af |
| summary |
Scrapy is vulnerable to a denial of service (DoS) attack due to flaws in brotli decompression implementation
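The core of the issue is how well zero-filled data compresses; brotli is not in the Python standard library, but `zlib` (which achieves far lower ratios than brotli) already demonstrates the amplification described below:
```python
import zlib

# 10 MiB of zeros shrinks to a few KiB under zlib; brotli compresses
# zero-filled data far more aggressively still, which is what makes a
# small response expand into tens of gigabytes on the client.
raw_size = 10 * 1024 * 1024
bomb = zlib.compress(b"\x00" * raw_size, level=9)
print(f"{len(bomb)} compressed bytes -> {raw_size} decompressed "
      f"(ratio ~{raw_size // len(bomb)}x)")
```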
Scrapy versions up to 2.13.3 are vulnerable to a denial-of-service (DoS) attack due to a flaw in their brotli decompression handling. The protection mechanism against decompression bombs fails to cover the brotli variant, allowing remote servers to crash clients that have less than 80 GB of available memory. This is possible because brotli achieves extremely high compression ratios for zero-filled data, leading to excessive memory consumption during decompression. Mitigating this vulnerability requires the security enhancement added in brotli v1.2.0. |
| references |
| 0 |
|
| 1 |
| reference_url |
https://api.first.org/data/v1/epss?cve=CVE-2025-6176 |
| reference_id |
|
| reference_type |
|
| scores |
| 0 |
| value |
0.00028 |
| scoring_system |
epss |
| scoring_elements |
0.08047 |
| published_at |
2026-04-04T12:55:00Z |
|
| 1 |
| value |
0.00028 |
| scoring_system |
epss |
| scoring_elements |
0.08092 |
| published_at |
2026-04-09T12:55:00Z |
|
| 2 |
| value |
0.00028 |
| scoring_system |
epss |
| scoring_elements |
0.08068 |
| published_at |
2026-04-08T12:55:00Z |
|
| 3 |
| value |
0.00028 |
| scoring_system |
epss |
| scoring_elements |
0.08 |
| published_at |
2026-04-02T12:55:00Z |
|
| 4 |
| value |
0.00028 |
| scoring_system |
epss |
| scoring_elements |
0.08008 |
| published_at |
2026-04-07T12:55:00Z |
|
| 5 |
| value |
0.00033 |
| scoring_system |
epss |
| scoring_elements |
0.09747 |
| published_at |
2026-04-13T12:55:00Z |
|
| 6 |
| value |
0.00033 |
| scoring_system |
epss |
| scoring_elements |
0.09605 |
| published_at |
2026-04-18T12:55:00Z |
|
| 7 |
| value |
0.00033 |
| scoring_system |
epss |
| scoring_elements |
0.09633 |
| published_at |
2026-04-16T12:55:00Z |
|
| 8 |
| value |
0.00033 |
| scoring_system |
epss |
| scoring_elements |
0.09795 |
| published_at |
2026-04-11T12:55:00Z |
|
| 9 |
| value |
0.00033 |
| scoring_system |
epss |
| scoring_elements |
0.09763 |
| published_at |
2026-04-12T12:55:00Z |
|
| 10 |
| value |
0.00037 |
| scoring_system |
epss |
| scoring_elements |
0.11087 |
| published_at |
2026-04-21T12:55:00Z |
|
|
| url |
https://api.first.org/data/v1/epss?cve=CVE-2025-6176 |
|
| 2 |
|
| 3 |
|
| 4 |
|
| 5 |
|
| 6 |
|
| 7 |
|
| 8 |
|
| 9 |
|
| 10 |
|
| 11 |
|
| 12 |
|
| 13 |
|
| 14 |
|
| 15 |
|
| 16 |
|
| 17 |
|
| 18 |
|
| 19 |
|
| 20 |
|
| 21 |
|
| 22 |
|
| 23 |
|
| 24 |
|
| 25 |
|
| 26 |
|
| 27 |
|
| 28 |
|
| 29 |
|
| 30 |
|
| 31 |
|
| 32 |
|
| 33 |
|
| 34 |
|
| 35 |
|
| 36 |
|
| 37 |
|
| 38 |
|
|
| fixed_packages |
|
| aliases |
CVE-2025-6176, GHSA-2qfp-q593-8484
|
| risk_score |
4.0 |
| exploitability |
0.5 |
| weighted_severity |
8.0 |
| resource_url |
http://public2.vulnerablecode.io/vulnerabilities/VCID-dc1m-rt7j-w3af |
|
| 4 |
| url |
VCID-kgf5-wu3r-pqc6 |
| vulnerability_id |
VCID-kgf5-wu3r-pqc6 |
| summary |
Scrapy authorization header leakage on cross-domain redirect
### Impact
When you send a request with the `Authorization` header to one domain, and the response asks to redirect to a different domain, Scrapy’s built-in redirect middleware creates a follow-up redirect request that keeps the original `Authorization` header, leaking its content to that second domain.
The [right behavior](https://fetch.spec.whatwg.org/#ref-for-cors-non-wildcard-request-header-name) would be to drop the `Authorization` header instead, in this scenario.
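That behavior can be sketched as follows (not Scrapy's actual code; the origin comparison is simplified to scheme, host, and port):
```python
from urllib.parse import urlsplit

def drop_auth_if_cross_origin(headers: dict, src: str, dst: str) -> dict:
    # Per the Fetch standard, remove Authorization when the scheme, host,
    # or port changes between the original request and the redirect target.
    a, b = urlsplit(src), urlsplit(dst)
    if (a.scheme, a.hostname, a.port) != (b.scheme, b.hostname, b.port):
        return {k: v for k, v in headers.items() if k.lower() != "authorization"}
    return headers
```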
### Patches
Upgrade to Scrapy 2.11.1.
If you are using Scrapy 1.8 or a lower version, and upgrading to Scrapy 2.11.1 is not an option, you may upgrade to Scrapy 1.8.4 instead.
### Workarounds
If you cannot upgrade, make sure that you are not using the `Authorization` header, either directly or through some third-party plugin.
If you need to use that header in some requests, add `"dont_redirect": True` to the `request.meta` dictionary of those requests to disable following redirects for them.
If you need to keep same-domain redirect support on those requests, make sure you trust the target website not to redirect them to a different domain.
### Acknowledgements
This security issue was reported by @ranjit-git [through huntr.com](https://huntr.com/bounties/49974321-2718-43e3-a152-62b16eed72a9/). |
| references |
| 0 |
| reference_url |
https://api.first.org/data/v1/epss?cve=CVE-2024-3574 |
| reference_id |
|
| reference_type |
|
| scores |
| 0 |
| value |
0.00121 |
| scoring_system |
epss |
| scoring_elements |
0.31206 |
| published_at |
2026-04-16T12:55:00Z |
|
| 1 |
| value |
0.00121 |
| scoring_system |
epss |
| scoring_elements |
0.31157 |
| published_at |
2026-04-21T12:55:00Z |
|
| 2 |
| value |
0.00121 |
| scoring_system |
epss |
| scoring_elements |
0.31187 |
| published_at |
2026-04-18T12:55:00Z |
|
| 3 |
| value |
0.00121 |
| scoring_system |
epss |
| scoring_elements |
0.31311 |
| published_at |
2026-04-02T12:55:00Z |
|
| 4 |
| value |
0.00121 |
| scoring_system |
epss |
| scoring_elements |
0.31352 |
| published_at |
2026-04-04T12:55:00Z |
|
| 5 |
| value |
0.00121 |
| scoring_system |
epss |
| scoring_elements |
0.31172 |
| published_at |
2026-04-13T12:55:00Z |
|
| 6 |
| value |
0.00121 |
| scoring_system |
epss |
| scoring_elements |
0.31225 |
| published_at |
2026-04-08T12:55:00Z |
|
| 7 |
| value |
0.00121 |
| scoring_system |
epss |
| scoring_elements |
0.31255 |
| published_at |
2026-04-09T12:55:00Z |
|
| 8 |
| value |
0.00121 |
| scoring_system |
epss |
| scoring_elements |
0.31259 |
| published_at |
2026-04-11T12:55:00Z |
|
| 9 |
| value |
0.00121 |
| scoring_system |
epss |
| scoring_elements |
0.31216 |
| published_at |
2026-04-12T12:55:00Z |
|
|
| url |
https://api.first.org/data/v1/epss?cve=CVE-2024-3574 |
|
| 1 |
|
| 2 |
|
| 3 |
|
| 4 |
|
| 5 |
|
| 6 |
|
| 7 |
|
|
| fixed_packages |
|
| aliases |
CVE-2024-3574, GHSA-cw9j-q3vf-hrrv, GMS-2024-288
|
| risk_score |
4.0 |
| exploitability |
0.5 |
| weighted_severity |
8.0 |
| resource_url |
http://public2.vulnerablecode.io/vulnerabilities/VCID-kgf5-wu3r-pqc6 |
|
| 5 |
| url |
VCID-nekz-z7zw-mfgz |
| vulnerability_id |
VCID-nekz-z7zw-mfgz |
| summary |
Scrapy allows redirect following in protocols other than HTTP
### Impact
Scrapy was following redirects regardless of the URL protocol, so redirects were working for `data://`, `file://`, `ftp://`, `s3://`, and any other scheme defined in the `DOWNLOAD_HANDLERS` setting.
However, HTTP redirects should only work between URLs that use the `http://` or `https://` schemes.
A malicious actor, given write access to the start requests (e.g. ability to define `start_urls`) of a spider and read access to the spider output, could exploit this vulnerability to:
- Redirect to any local file using the `file://` scheme to read its contents.
- Redirect to an `ftp://` URL of a malicious FTP server to obtain the FTP username and password configured in the spider or project.
- Redirect to any `s3://` URL to read its content using the S3 credentials configured in the spider or project.
For `file://` and `s3://`, how the spider parses input data into an output item determines what data would be exposed. A spider that always outputs the entire contents of a response would be completely vulnerable, while a spider that extracts only fragments from the response could significantly limit the exposed data.
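The essence of the fix can be sketched as a scheme allow-list check on redirect targets (names here are illustrative, not Scrapy's internals):
```python
from urllib.parse import urlsplit

# HTTP redirects may only lead to http:// or https:// URLs; anything
# else (file://, ftp://, s3://, data://, ...) is refused.
ALLOWED_REDIRECT_SCHEMES = {"http", "https"}

def is_safe_redirect_target(url: str) -> bool:
    return urlsplit(url).scheme in ALLOWED_REDIRECT_SCHEMES
```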
### Patches
Upgrade to Scrapy 2.11.2.
### Workarounds
Replace the built-in redirect middlewares (`RedirectMiddleware` and `MetaRefreshMiddleware`) with custom ones that implement the fix from Scrapy 2.11.2, and verify that they work as intended.
### References
This security issue was reported by @mvsantos at https://github.com/scrapy/scrapy/issues/457. |
| references |
|
| fixed_packages |
|
| aliases |
GHSA-23j4-mw76-5v7h
|
| risk_score |
3.1 |
| exploitability |
0.5 |
| weighted_severity |
6.2 |
| resource_url |
http://public2.vulnerablecode.io/vulnerabilities/VCID-nekz-z7zw-mfgz |
|
| 6 |
| url |
VCID-t5cn-a543-nyag |
| vulnerability_id |
VCID-t5cn-a543-nyag |
| summary |
Duplicate Advisory: Scrapy leaks the authorization header on same-domain but cross-origin redirects
## Duplicate Advisory
This advisory has been withdrawn because it is a duplicate of GHSA-4qqq-9vqf-3h3f. This link is maintained to preserve external references.
## Original Description
In scrapy/scrapy, an issue was identified where the Authorization header is not removed during redirects that only change the scheme (e.g., HTTPS to HTTP) but remain within the same domain. This behavior contravenes the Fetch standard, which mandates the removal of Authorization headers in cross-origin requests when the scheme, host, or port changes. Consequently, when a redirect downgrades from HTTPS to HTTP, the Authorization header may be inadvertently exposed in plaintext, leading to potential sensitive information disclosure to unauthorized actors. The flaw is located in the _build_redirect_request function of the redirect middleware. |
| references |
|
| fixed_packages |
|
| aliases |
GHSA-cg34-w3fm-82h3
|
| risk_score |
4.0 |
| exploitability |
0.5 |
| weighted_severity |
8.0 |
| resource_url |
http://public2.vulnerablecode.io/vulnerabilities/VCID-t5cn-a543-nyag |
|
| 7 |
| url |
VCID-urb1-hv1z-duga |
| vulnerability_id |
VCID-urb1-hv1z-duga |
| summary |
In scrapy/scrapy, an issue was identified where the Authorization header is not removed during redirects that only change the scheme (e.g., HTTPS to HTTP) but remain within the same domain. This behavior contravenes the Fetch standard, which mandates the removal of Authorization headers in cross-origin requests when the scheme, host, or port changes. Consequently, when a redirect downgrades from HTTPS to HTTP, the Authorization header may be inadvertently exposed in plaintext, leading to potential sensitive information disclosure to unauthorized actors. The flaw is located in the _build_redirect_request function of the redirect middleware. |
| references |
| 0 |
| reference_url |
https://api.first.org/data/v1/epss?cve=CVE-2024-1968 |
| reference_id |
|
| reference_type |
|
| scores |
| 0 |
| value |
0.0019 |
| scoring_system |
epss |
| scoring_elements |
0.40917 |
| published_at |
2026-04-08T12:55:00Z |
|
| 1 |
| value |
0.0019 |
| scoring_system |
epss |
| scoring_elements |
0.40818 |
| published_at |
2026-04-21T12:55:00Z |
|
| 2 |
| value |
0.0019 |
| scoring_system |
epss |
| scoring_elements |
0.40886 |
| published_at |
2026-04-13T12:55:00Z |
|
| 3 |
| value |
0.0019 |
| scoring_system |
epss |
| scoring_elements |
0.40905 |
| published_at |
2026-04-12T12:55:00Z |
|
| 4 |
| value |
0.0019 |
| scoring_system |
epss |
| scoring_elements |
0.40941 |
| published_at |
2026-04-11T12:55:00Z |
|
| 5 |
| value |
0.0019 |
| scoring_system |
epss |
| scoring_elements |
0.40923 |
| published_at |
2026-04-09T12:55:00Z |
|
| 6 |
| value |
0.0019 |
| scoring_system |
epss |
| scoring_elements |
0.40912 |
| published_at |
2026-04-02T12:55:00Z |
|
| 7 |
| value |
0.0019 |
| scoring_system |
epss |
| scoring_elements |
0.40939 |
| published_at |
2026-04-04T12:55:00Z |
|
| 8 |
| value |
0.0019 |
| scoring_system |
epss |
| scoring_elements |
0.40868 |
| published_at |
2026-04-07T12:55:00Z |
|
| 9 |
| value |
0.0019 |
| scoring_system |
epss |
| scoring_elements |
0.40898 |
| published_at |
2026-04-18T12:55:00Z |
|
| 10 |
| value |
0.0019 |
| scoring_system |
epss |
| scoring_elements |
0.40928 |
| published_at |
2026-04-16T12:55:00Z |
|
|
| url |
https://api.first.org/data/v1/epss?cve=CVE-2024-1968 |
|
| 1 |
|
| 2 |
|
| 3 |
|
| 4 |
|
| 5 |
|
| 6 |
|
| 7 |
|
| 8 |
|
|
| fixed_packages |
|
| aliases |
CVE-2024-1968, GHSA-4qqq-9vqf-3h3f, PYSEC-2024-258
|
| risk_score |
3.4 |
| exploitability |
0.5 |
| weighted_severity |
6.8 |
| resource_url |
http://public2.vulnerablecode.io/vulnerabilities/VCID-urb1-hv1z-duga |
|
| 8 |
| url |
VCID-veaw-n6vt-zfgu |
| vulnerability_id |
VCID-veaw-n6vt-zfgu |
| summary |
Scrapy's redirects ignoring scheme-specific proxy settings
### Impact
When using system proxy settings, which are scheme-specific (i.e. specific to `http://` or `https://` URLs), Scrapy was not accounting for scheme changes during redirects.
For example, a request for an HTTP URL would use the proxy configured for HTTP and, when redirected to an HTTPS URL, the follow-up HTTPS request would still use the HTTP proxy instead of switching to the proxy configured for HTTPS, and vice versa.
If you have different proxy configurations for HTTP and HTTPS in your system for security reasons (e.g., maybe you don’t want one of your proxy providers to be aware of the URLs that you visit with the other one), this would be a security issue.
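The corrected behavior amounts to re-resolving the proxy from the redirected URL's scheme rather than reusing the proxy chosen for the original request; a sketch with a hypothetical proxy map:
```python
from urllib.parse import urlsplit

# Hypothetical per-scheme proxy map, as would be derived from the
# http_proxy / https_proxy system settings.
PROXIES = {
    "http": "http://proxy-a.internal:8080",
    "https": "http://proxy-b.internal:8080",
}

def proxy_for(url):
    # Look the proxy up from the scheme of the URL actually being
    # requested, so a redirect to another scheme switches proxies.
    return PROXIES.get(urlsplit(url).scheme)
```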
### Patches
Upgrade to Scrapy 2.11.2.
### Workarounds
Replace the built-in redirect middlewares (`RedirectMiddleware` and `MetaRefreshMiddleware`) and the `HttpProxyMiddleware` middleware with custom ones that implement the fix from Scrapy 2.11.2, and verify that they work as intended.
### References
This security issue was reported by @redapple at https://github.com/scrapy/scrapy/issues/767. |
| references |
|
| fixed_packages |
|
| aliases |
GHSA-jm3v-qxmh-hxwv
|
| risk_score |
3.1 |
| exploitability |
0.5 |
| weighted_severity |
6.2 |
| resource_url |
http://public2.vulnerablecode.io/vulnerabilities/VCID-veaw-n6vt-zfgu |
|
|