
yifeikong / curl_cffi


Python binding for curl-impersonate via cffi. An HTTP client that can impersonate browser TLS/JA3 and HTTP/2 fingerprints.

Home Page: https://curl-cffi.readthedocs.io/

License: MIT License

Python 97.67% C 1.55% Makefile 0.66% Shell 0.13%
curl http-client curl-impersonate http https ja3 ja3-fingerprint tls-fingerprint fingerprinting web-scraping

curl_cffi's Introduction

curl_cffi


Documentation | 中文 README

Python binding for curl-impersonate via cffi.

Unlike other pure-Python HTTP clients like httpx or requests, curl_cffi can impersonate browsers' TLS/JA3 and HTTP/2 fingerprints. If you are blocked by some website for no obvious reason, give curl_cffi a try.


Scrapfly.io

Scrapfly is an enterprise-grade Web Scraping API that aims to simplify scraping by managing everything: real browser rendering, rotating proxies, and fingerprints (TLS, HTTP, browser) to bypass all major anti-bots. Scrapfly also provides observability through an analytical dashboard that measures success and block rates in detail.

Scrapfly is a good option if you are looking for a cloud-managed solution for curl_cffi. If you are managing TLS/HTTP fingerprints yourself with curl_cffi, they also maintain a tool to convert a curl command into Python curl_cffi code!


Features

  • Supports JA3/TLS and HTTP/2 fingerprint impersonation.
  • Much faster than requests/httpx, on par with aiohttp/pycurl; see benchmarks.
  • Mimics the requests API, so there is no need to learn another one.
  • Pre-compiled, so you don't have to compile on your machine.
  • Supports asyncio, with proxy rotation on each request.
  • Supports HTTP/2, which requests does not.
  • Supports WebSockets.
|              | requests | aiohttp | httpx | pycurl | curl_cffi |
|--------------|----------|---------|-------|--------|-----------|
| http2        | ❌       | ❌      | ✅    | ✅     | ✅        |
| sync         | ✅       | ❌      | ✅    | ✅     | ✅        |
| async        | ❌       | ✅      | ✅    | ❌     | ✅        |
| websocket    | ❌       | ✅      | ❌    | ❌     | ✅        |
| fingerprints | ❌       | ❌      | ❌    | ❌     | ✅        |
| speed        | 🐇       | 🐇🐇    | 🐇    | 🐇🐇   | 🐇🐇      |

Install

pip install curl_cffi --upgrade

This should work on Linux, macOS and Windows out of the box. If it does not work on your platform, you may need to compile and install curl-impersonate first and set some environment variables, such as LD_LIBRARY_PATH.

To install beta releases:

pip install curl_cffi --upgrade --pre

To install the unstable version from GitHub:

git clone https://github.com/yifeikong/curl_cffi/
cd curl_cffi
make preprocess
pip install .

Usage

curl_cffi comes with a low-level curl API and a high-level requests-like API.

Use the latest impersonate versions; do NOT copy chrome110 from the examples below without checking whether a newer version is available.

requests-like

from curl_cffi import requests

# Notice the impersonate parameter
r = requests.get("https://tools.scrapfly.io/api/fp/ja3", impersonate="chrome110")

print(r.json())
# output: {..., "ja3n_hash": "aa56c057ad164ec4fdcb7a5a283be9fc", ...}
# the ja3n fingerprint should be the same as the target browser's

# To keep using the latest browser version as `curl_cffi` updates,
# simply set impersonate="chrome" without specifying a version.
# Other similar values are: "safari" and "safari_ios"
r = requests.get("https://tools.scrapfly.io/api/fp/ja3", impersonate="chrome")

# http/socks proxies are supported
proxies = {"https": "http://localhost:3128"}
r = requests.get("https://tools.scrapfly.io/api/fp/ja3", impersonate="chrome110", proxies=proxies)

proxies = {"https": "socks://localhost:3128"}
r = requests.get("https://tools.scrapfly.io/api/fp/ja3", impersonate="chrome110", proxies=proxies)

Sessions

s = requests.Session()

# httpbin is an HTTP test website; this endpoint makes the server set cookies
s.get("https://httpbin.org/cookies/set/foo/bar")
print(s.cookies)
# <Cookies[<Cookie foo=bar for httpbin.org />]>

# retrieve cookies again to verify
r = s.get("https://httpbin.org/cookies")
print(r.json())
# {'cookies': {'foo': 'bar'}}

Impersonate versions supported by my fork of curl-impersonate:

However, only Chrome-like browsers are supported. Firefox support is tracked in #59.

Browser versions will be added only when their fingerprints change. If a version, e.g. chrome122, was skipped, you can simply impersonate it by using the previous version with your own headers.

  • chrome99
  • chrome100
  • chrome101
  • chrome104
  • chrome107
  • chrome110
  • chrome116 [1]
  • chrome119 [1]
  • chrome120 [1]
  • chrome123 [3]
  • chrome124 [3]
  • chrome99_android
  • edge99
  • edge101
  • safari15_3 [2]
  • safari15_5 [2]
  • safari17_0 [1]
  • safari17_2_ios [1]

Notes:

  1. Added in version 0.6.0.
  2. Fixed in version 0.6.0, previous http2 fingerprints were not correct.
  3. Added in version 0.7.0.
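A minimal sketch of the "skipped version" approach mentioned above, assuming chrome122 was skipped and chrome120 is the closest available fingerprint; the header values below are illustrative, not authoritative:

```python
# Hypothetical sketch: approximate a skipped chrome122 by impersonating the
# previous available version (chrome120) and overriding the headers that
# carry the browser version. These header values are illustrative only.
chrome122_headers = {
    "sec-ch-ua": '"Chromium";v="122", "Not(A:Brand";v="24", "Google Chrome";v="122"',
    "user-agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
        "(KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36"
    ),
}

# Assumed usage (not executed here):
# from curl_cffi import requests
# r = requests.get(url, impersonate="chrome120", headers=chrome122_headers)
print(chrome122_headers["sec-ch-ua"])
```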

asyncio

import asyncio
from curl_cffi.requests import AsyncSession

async def main():
    async with AsyncSession() as s:
        r = await s.get("https://example.com")
        print(r.status_code)

asyncio.run(main())

More concurrency:

import asyncio
from curl_cffi.requests import AsyncSession

urls = [
    "https://google.com/",
    "https://facebook.com/",
    "https://twitter.com/",
]

async def main():
    async with AsyncSession() as s:
        tasks = []
        for url in urls:
            task = s.get(url)
            tasks.append(task)
        results = await asyncio.gather(*tasks)

asyncio.run(main())

WebSockets

from curl_cffi.requests import Session, WebSocket

def on_message(ws: WebSocket, message):
    print(message)

with Session() as s:
    ws = s.ws_connect(
        "wss://api.gemini.com/v1/marketdata/BTCUSD",
        on_message=on_message,
    )
    ws.run_forever()

For low-level APIs, Scrapy integration and other advanced topics, see the docs for more details.

Acknowledgement

  • Originally forked from multippt/python_curl_cffi, which is under the MIT license.
  • Headers/Cookies files are copied from httpx, which is under the BSD license.
  • Asyncio support is inspired by Tornado's curl http client.
  • The WebSocket API is inspired by websocket_client.

[Sponsor] Bypass Cloudflare with API

Yes Captcha!

Yescaptcha is a proxy service that bypasses Cloudflare, providing an API to obtain verified cookies (e.g. cf_clearance). Click here to register: https://yescaptcha.com/i/stfnIO

[Sponsor] ScrapeNinja

Scrape Ninja

ScrapeNinja is a web scraping API with two engines: a fast one with high performance and TLS fingerprinting, and a slower one with a real browser under the hood.

ScrapeNinja handles headless browsers, proxies, timeouts, retries, and helps with data extraction, so you can just get the data in JSON. Rotating proxies are available out of the box on all subscription plans.

Sponsor

Buy Me A Coffee

curl_cffi's People

Contributors

0-2-1, bjia56, bratao, caxaro4ik, deedy5, dhirajbaheti, dolfies, douo, flxme, gallamine, hlohaus, lebr0nli, multippt, raguggg, rlaphoenix, sml2h3, snusmumr1000, t-256, tadasgedgaudas, velocidensity, yifeikong


curl_cffi's Issues

Feature Request: Add the "stream" parameter which can be found in the normal requests package.

First of all, thank you for fixing #22. I don't think it is available on PyPI yet. I even tried to install it manually with pip install git+https://github.com/yifeikong/curl_cffi, but it gives the error below on a Linux (Ubuntu) platform:

$ pip3 install git+https://github.com/yifeikong/curl_cffi
Collecting git+https://github.com/yifeikong/curl_cffi
  Cloning https://github.com/yifeikong/curl_cffi to /tmp/pip-req-build-gisjnq_5
  Running command git clone -q https://github.com/yifeikong/curl_cffi /tmp/pip-req-build-gisjnq_5
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
    Preparing wheel metadata ... done
Requirement already satisfied: cffi>=1.0.0 in ./.local/lib/python3.8/site-packages (from curl-cffi==0.3.8) (1.15.1)
Requirement already satisfied: pycparser in ./.local/lib/python3.8/site-packages (from cffi>=1.0.0->curl-cffi==0.3.8) (2.21)
Building wheels for collected packages: curl-cffi
  Building wheel for curl-cffi (PEP 517) ... error
  ERROR: Command errored out with exit status 1:
   command: /usr/bin/python3 /tmp/tmpmigkk7kk build_wheel /tmp/tmp_eqeh8pr
       cwd: /tmp/pip-req-build-gisjnq_5
  Complete output (110 lines):
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib.linux-x86_64-cpython-38
  creating build/lib.linux-x86_64-cpython-38/curl_cffi
  copying curl_cffi/build.py -> build/lib.linux-x86_64-cpython-38/curl_cffi
  copying curl_cffi/__init__.py -> build/lib.linux-x86_64-cpython-38/curl_cffi
  copying curl_cffi/_const.py -> build/lib.linux-x86_64-cpython-38/curl_cffi
  creating build/lib.linux-x86_64-cpython-38/curl_cffi/requests
  copying curl_cffi/requests/headers.py -> build/lib.linux-x86_64-cpython-38/curl_cffi/requests
  copying curl_cffi/requests/cookies.py -> build/lib.linux-x86_64-cpython-38/curl_cffi/requests
  copying curl_cffi/requests/session.py -> build/lib.linux-x86_64-cpython-38/curl_cffi/requests
  copying curl_cffi/requests/__init__.py -> build/lib.linux-x86_64-cpython-38/curl_cffi/requests
  copying curl_cffi/requests/errors.py -> build/lib.linux-x86_64-cpython-38/curl_cffi/requests
  running egg_info
  writing curl_cffi.egg-info/PKG-INFO
  writing dependency_links to curl_cffi.egg-info/dependency_links.txt
  writing requirements to curl_cffi.egg-info/requires.txt
  writing top-level names to curl_cffi.egg-info/top_level.txt
  reading manifest file 'curl_cffi.egg-info/SOURCES.txt'
  reading manifest template 'MANIFEST.in'
  /tmp/pip-build-env-bx5qgl50/overlay/lib/python3.8/site-packages/setuptools/config/pyprojecttoml.py:108: _BetaConfiguration: Support for `[tool.setuptools]` in `pyproject.toml` is still *beta*.
    warnings.warn(msg, _BetaConfiguration)
  warning: no files found matching 'curl_cffi/cacert.pem'
  warning: no files found matching 'curl_cffi/*.dll'
  adding license file 'LICENSE'
  writing manifest file 'curl_cffi.egg-info/SOURCES.txt'
  copying curl_cffi/cdef.c -> build/lib.linux-x86_64-cpython-38/curl_cffi
  copying curl_cffi/shim.c -> build/lib.linux-x86_64-cpython-38/curl_cffi
  creating build/lib.linux-x86_64-cpython-38/curl_cffi/include
  copying curl_cffi/include/shim.h -> build/lib.linux-x86_64-cpython-38/curl_cffi/include
  creating build/lib.linux-x86_64-cpython-38/curl_cffi/include/curl
  copying curl_cffi/include/curl/Makefile.am -> build/lib.linux-x86_64-cpython-38/curl_cffi/include/curl
  copying curl_cffi/include/curl/curl.h -> build/lib.linux-x86_64-cpython-38/curl_cffi/include/curl
  copying curl_cffi/include/curl/curlver.h -> build/lib.linux-x86_64-cpython-38/curl_cffi/include/curl
  copying curl_cffi/include/curl/easy.h -> build/lib.linux-x86_64-cpython-38/curl_cffi/include/curl
  copying curl_cffi/include/curl/header.h -> build/lib.linux-x86_64-cpython-38/curl_cffi/include/curl
  copying curl_cffi/include/curl/mprintf.h -> build/lib.linux-x86_64-cpython-38/curl_cffi/include/curl
  copying curl_cffi/include/curl/multi.h -> build/lib.linux-x86_64-cpython-38/curl_cffi/include/curl
  copying curl_cffi/include/curl/options.h -> build/lib.linux-x86_64-cpython-38/curl_cffi/include/curl
  copying curl_cffi/include/curl/stdcheaders.h -> build/lib.linux-x86_64-cpython-38/curl_cffi/include/curl
  copying curl_cffi/include/curl/system.h -> build/lib.linux-x86_64-cpython-38/curl_cffi/include/curl
  copying curl_cffi/include/curl/typecheck-gcc.h -> build/lib.linux-x86_64-cpython-38/curl_cffi/include/curl
  copying curl_cffi/include/curl/urlapi.h -> build/lib.linux-x86_64-cpython-38/curl_cffi/include/curl
  running build_ext
  generating cffi module 'build/temp.linux-x86_64-cpython-38/curl_cffi._wrapper.c'
  creating build/temp.linux-x86_64-cpython-38
  building 'curl_cffi._wrapper' extension
  creating build/temp.linux-x86_64-cpython-38/build
  creating build/temp.linux-x86_64-cpython-38/build/temp.linux-x86_64-cpython-38
  creating build/temp.linux-x86_64-cpython-38/curl_cffi
  x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -Icurl_cffi/include -I/usr/include/python3.8 -c build/temp.linux-x86_64-cpython-38/curl_cffi._wrapper.c -o build/temp.linux-x86_64-cpython-38/build/temp.linux-x86_64-cpython-38/curl_cffi._wrapper.o
  x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -Icurl_cffi/include -I/usr/include/python3.8 -c curl_cffi/shim.c -o build/temp.linux-x86_64-cpython-38/curl_cffi/shim.o
  x86_64-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 build/temp.linux-x86_64-cpython-38/build/temp.linux-x86_64-cpython-38/curl_cffi._wrapper.o build/temp.linux-x86_64-cpython-38/curl_cffi/shim.o -L/usr/local/lib -L/usr/lib -lcurl-impersonate-chrome -o build/lib.linux-x86_64-cpython-38/curl_cffi/_wrapper.abi3.so
  /usr/bin/ld: cannot find -lcurl-impersonate-chrome
  collect2: error: ld returned 1 exit status
  /tmp/pip-build-env-bx5qgl50/overlay/lib/python3.8/site-packages/setuptools/command/build_py.py:202: SetuptoolsDeprecationWarning:     Installing 'curl_cffi.include' as data is deprecated, please list it in `packages`.
      !!


      ############################
      # Package would be ignored #
      ############################
      Python recognizes 'curl_cffi.include' as an importable package,
      but it is not listed in the `packages` configuration of setuptools.

      'curl_cffi.include' has been automatically added to the distribution only
      because it may contain data files, but this behavior is likely to change
      in future versions of setuptools (and therefore is considered deprecated).

      Please make sure that 'curl_cffi.include' is included as a package by using
      the `packages` configuration field or the proper discovery methods
      (for example by using `find_namespace_packages(...)`/`find_namespace:`
      instead of `find_packages(...)`/`find:`).

      You can read more about "package discovery" and "data files" on setuptools
      documentation page.


  !!

    check.warn(importable)
  /tmp/pip-build-env-bx5qgl50/overlay/lib/python3.8/site-packages/setuptools/command/build_py.py:202: SetuptoolsDeprecationWarning:     Installing 'curl_cffi.include.curl' as data is deprecated, please list it in `packages`.
      !!


      ############################
      # Package would be ignored #
      ############################
      Python recognizes 'curl_cffi.include.curl' as an importable package,
      but it is not listed in the `packages` configuration of setuptools.

      'curl_cffi.include.curl' has been automatically added to the distribution only
      because it may contain data files, but this behavior is likely to change
      in future versions of setuptools (and therefore is considered deprecated).

      Please make sure that 'curl_cffi.include.curl' is included as a package by using
      the `packages` configuration field or the proper discovery methods
      (for example by using `find_namespace_packages(...)`/`find_namespace:`
      instead of `find_packages(...)`/`find:`).

      You can read more about "package discovery" and "data files" on setuptools
      documentation page.


  !!

    check.warn(importable)
  error: command '/usr/bin/x86_64-linux-gnu-gcc' failed with exit code 1
  ----------------------------------------
  ERROR: Failed building wheel for curl-cffi
Failed to build curl-cffi
ERROR: Could not build wheels for curl-cffi which use PEP 517 and cannot be installed directly

However, I hope it will be fixed when the next version is pushed to PyPI!

This issue is about the stream=True/False parameter found in the normal requests package. If possible, could you please add it, so I can perform operations like the one below:

import requests

url="https://videos-fastly-studiodrm.jwpsrv.com/6411bcd6_0x344cee0abd6d9a757e2ea355371aba081c4aa3d3/site/Y21kWwti/media/UFCQHrhu/version/dozELTrB/manifest-34126610,34126611,34126612,34126615,34126616,34126617.ism/dash/manifest-34126610,34126611,34126612,34126615,34126616,34126617-audio_eng=112019.dash"
headers = {
    'authority': 'videos-fastly-studiodrm.jwpsrv.com',
    'accept': '*/*',
    'accept-language': 'en-GB,en-US;q=0.9,en;q=0.8',
    'cache-control': 'no-cache',
    'origin': 'https://juanflix.com.ph',
    'pragma': 'no-cache',
    'referer': 'https://juanflix.com.ph/',
    'sec-ch-ua': '"Google Chrome";v="111", "Not(A:Brand";v="8", "Chromium";v="111"',
    'sec-ch-ua-mobile': '?0',
    'sec-ch-ua-platform': '"Windows"',
    'sec-fetch-dest': 'empty',
    'sec-fetch-mode': 'cors',
    'sec-fetch-site': 'cross-site',
    'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36',
}
proxies={}
raw = requests.get(url=url, headers=headers, proxies=proxies, stream=True, verify=False)
if raw.status_code == 200 or raw.status_code == 206:
	for chunk in raw.iter_content(chunk_size=4096):
		if chunk:
			print("chunk found")
			# Do something

Thanks again and I am really sorry for disturbing you again and again.

[BUG] Posting/patching seems to break AsyncSession sessions in a weird way

EDIT: Changed the code a bit and found a potential fix (not really sure if it's proper though; see the bottom for this).

Noticing some very strange behavior, I'll try my best to describe it.

Essentially, it seems that posting or patching data to a site will make certain following requests fail with a "400 malformed request" response from whatever URL you try to get. It's strange because it only breaks certain sites and sometimes only certain paths for a site. For example, "discord.com" will return 200 but "https://discord.com/api/v9/invites/python" will return 400.

I think this is best described by example, so here's some reproducible code that shows discord.com/api/v9/invites/python first returning a 200 status code; then, after the POST to httpbin, the same request to discord.com/api/v9/invites/python breaks and starts returning 400.

from curl_cffi import requests
import json, asyncio

async def test():
	async def do_req(sess):
		r = await sess.get("https://discord.com/api/v9/invites/python", impersonate="chrome110")
		print("1st https://discord.com/api/v9/invites/python status: " + str(r.status_code))
		r = await sess.post("https://httpbin.org/post", json={"testData": True},impersonate="chrome110")
		print("https://httpbin.org/post status: " + str(r.status_code))
		r = await sess.get("https://discord.com/", impersonate="chrome110")
		print("https://discord.com status: " + str(r.status_code))
		r = await sess.get("https://discord.com/api/v9/invites/python", impersonate="chrome110")
		print("2nd https://discord.com/api/v9/invites/python status: " + str(r.status_code))
		return r.status_code
	
	tasks = []
	async with requests.AsyncSession() as s:
		for i in range(1):
			tasks.append(do_req(s))
		results = await asyncio.gather(*tasks)

policy = asyncio.WindowsSelectorEventLoopPolicy() # needed to run on windows
asyncio.set_event_loop_policy(policy) # needed to run on windows
asyncio.get_event_loop().run_until_complete(test())

Output (the second invites request should also return 200, but returns 400):

1st https://discord.com/api/v9/invites/python status: 200
https://httpbin.org/post status: 200
https://discord.com status: 200
2nd https://discord.com/api/v9/invites/python status: 400

After running the code, if you print the response text it'll look something like (this was for google):

<!DOCTYPE html>
<html lang=en>
  <meta charset=utf-8>
  <meta name=viewport content="initial-scale=1, minimum-scale=1, width=device-width">
  <title>Error 400 (Bad Request)!!1</title>
  <style>
    *{margin:0;padding:0}html,code{font:15px/22px arial,sans-serif}html{background:#fff;color:#222;padding:15px}body{margin:7% auto 0;max-width:390px;min-height:180px;padding:30px 0 15px}* > body{background:url(//www.google.com/images/errors/robot.png) 100% 5px no-repeat;padding-right:205px}p{margin:11px 0 22px;overflow:hidden}ins{color:#777;text-decoration:none}a img{border:0}@media screen and (max-width:772px){body{background:none;margin-top:0;max-width:none;padding-right:0}}#logo{background:url(//www.google.com/images/branding/googlelogo/1x/googlelogo_color_150x54dp.png) no-repeat;margin-left:-5px}@media only screen and (min-resolution:192dpi){#logo{background:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) no-repeat 0% 0%/100% 100%;-moz-border-image:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) 0}}@media only screen and (-webkit-min-device-pixel-ratio:2){#logo{background:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) no-repeat;-webkit-background-size:100% 100%}}#logo{display:inline-block;height:54px;width:150px}
  </style>
  <a href=//www.google.com/><span id=logo aria-label=Google></span></a>
  <p><b>400.</b> <ins>That’s an error.</ins>
  <p>Your client has issued a malformed or illegal request.  <ins>That’s all we know.</ins>

I tried debugging the code to see where something is getting changed behind-the-scenes when doing post/patch requests, but I couldn't figure it out. Anyone know what might be happening here?

Python version: 3.9.6
curl_cffi version: 0.5.2
cffi version: 1.15.1

Let me know if you need any other information! Thanks in advance.

EDIT: Found a potential fix: it seems that calling reset() on the session object after the POST request fixes it. Is it intended that this function must be called after every POST/PATCH request?

Are retries supported?

Just wondering if I'm missing something: is there a retry and/or backoff factor included in the requests made by this package?

Is it possible to mount those retries onto a session? With the normal requests package it would look like:

session.mount('https://', HTTPAdapter(max_retries=requests.packages.urllib3.util.retry.Retry(**kwargs)))
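curl_cffi does not expose requests' adapter/mount mechanism, so there is nothing to hang an HTTPAdapter on. As a stopgap, a small wrapper can provide retries with exponential backoff; this is a generic sketch of my own (get_with_retries is not part of the library):

```python
import time

def get_with_retries(get, url, max_retries=3, backoff_factor=0.5,
                     retry_statuses=(500, 502, 503, 504), **kwargs):
    """Call get(url, **kwargs), retrying on exceptions or retryable status
    codes, sleeping backoff_factor * 2**attempt between attempts."""
    for attempt in range(max_retries + 1):
        try:
            r = get(url, **kwargs)
            if r.status_code not in retry_statuses:
                return r
        except Exception:
            if attempt == max_retries:
                raise
        if attempt < max_retries:
            time.sleep(backoff_factor * 2 ** attempt)
    return r

# Assumed usage with a curl_cffi session (not executed here):
# r = get_with_retries(session.get, "https://example.com", impersonate="chrome110")
```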

CurlOpt.HTTPHEADER causes an error.

I have tried various inputs ("all strings converted to bytes"), but I get "Process finished with exit code 135 (interrupted by signal 7: SIGEMT)" each time I try to use CurlOpt.HTTPHEADER.

eg. c.setopt(CurlOpt.HTTPHEADER, b'x-access-token: 123')

I believe regular pycurl takes an array as an input for HTTPHEADER. I don't know if that makes any difference.
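If HTTPHEADER does follow the pycurl convention, it expects a list of "Name: value" byte strings (a curl slist) rather than a single bytes object, which could explain the crash. A hedged sketch of building such a list; the setopt line is commented out because it is assumed, not verified against curl_cffi's API:

```python
# Build headers as a list of b"Name: value" entries (the pycurl-style slist
# convention) instead of passing one bytes object.
headers = {"x-access-token": "123", "accept": "application/json"}
header_list = [f"{name}: {value}".encode() for name, value in headers.items()]
# c.setopt(CurlOpt.HTTPHEADER, header_list)  # assumed usage, mirrors pycurl
print(header_list)
```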

Add Support for Chrome Version 112

It has come to my attention that Chrome Version 112 has recently been released. In order to ensure compatibility and provide the best user experience, please add support for this new version. Thank you!

curl_cffi treats GET as POST request

I noticed that if you do a POST request with specific POST data and then do a GET request right after it, the GET will carry the same POST data as the POST request. Below is an example.

This is a POST request I intercepted with mitmproxy; notice the POST data:

[screenshot]
[screenshot]

Code:
x = self.session.post("https://www.patreon.com/api/login?include=campaign%2Cuser_location&json-api-version=1.0", data=json.dumps(payload), headers=post_header)

And this is a GET request that is sent right after the POST:

[screenshot]

Notice the POST data is the same as above:

[screenshot]

This is the code I used for this request:
r = self.session.get("https://www.patreon.com/login?ru=%2Fpledges%3Fty%3Dp", headers=self.header)

Note that self.header does not have a content-type header. I'm guessing this is a problem with the values passed to curl, since it is possible to do a GET request with POST data.

Bug: data must be dict, BytesIO or bytes

There are some sites which send raw strings as data.

Here is an example:

Request copied in curl (bash) format from Chrome:

curl 'https://secure-gen-hapi.canal-plus.com/conso/view/70e7c3f0-bdb9-11ed;70e7c3f0-bdb9-11ed-8f2f-11937df79a3f;-8f2f-11937df79a3f/licence?drmConfig=mkpl::true' \
  -H 'xx-request-id: 1658229840503-198a71ed5b1e-1678283976400-482624' \
  -H 'sec-ch-ua: "Chromium";v="110", "Not A(Brand";v="24", "Google Chrome";v="110"' \
  -H 'xx-spyro-version: 3.0' \
  -H 'accept-language: fr,fr-FR' \
  -H 'authorization: PASS Token="11101wFMydY17lNcvY28FkzF2F9eyKbZst68ae6C2gJpPjclYizPY6rQ1ReBzCwcDEiE148GU6GTtvnguTIDn3gMitxTIyNKwP4TWpjJ33KBcgqrXripnV1cLlpMYnm-FG7OvUFI6RNXZCc7RpaIAYPKrPQvCC18GW-_aK8QbZ2t5Zn3QV4sB3bsbqvmWH62b0h7FBTB7OexazmPe9wGSb0oCiDFdMWBkNXe5ytv1sQDa_I0wN-muOiGoFq074OyO8XvbYnsDOW8mLF59NhHfMv_uW4YImELSb0kbn_1xksFOltQbW_8W0BvBJWB-qnsw10hsV2m00tLLoQ8efu6uczX2MMKYjg1_P_-JtD8fLUVK06lLvUmbYKrkAgmlMuzj7ykFR25XYBBUH3D1HWX5GIqaKSDOUROxp_02fijNXOpzGWG_nHJnG5MJAgxYp63l8OCQsA9Kfw3MhKstnDTo7G0qCVOhc1s48L8msM1v9eXtJSstaFUdaFXoHGEk1YQlh55c-cMkt_IzVG73CM-u5b2l0SotiEvI1ByJfnOifgqxaE6N6sR6dL3wggDAc8FC4GQwfRt8UoAUpIikaNz4zMSTiXNdDSNHzgWozqgOc6tmYny67uFtssdRygK32nxd2P99JmG2qJJ_Sau8AUBJfzuKxeOUYBfcd-N9tAYA434oxjr7luca3TJbzBdi9HT20xSBNyXjFBYJG9Qh7qZAb6UmTg.."' \
  -H 'xx-profile-id: 0' \
  -H 'xx-operator: pc' \
  -H 'xx-api-version: 3.0' \
  -H 'xx-device: pc 1658229840503-198a71ed5b1e' \
  -H 'sec-ch-ua-platform: "Windows"' \
  -H 'xx-service: mycanal' \
  -H 'xx-oz: cpfra' \
  -H 'sec-ch-ua-mobile: ?0' \
  -H 'user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36' \
  -H 'content-type: text/plain' \
  -H 'xx-ol: fr' \
  -H 'accept: application/json, text/plain, */*' \
  -H 'xx-distmodes: catchup,live,svod,tvod,posttvod' \
  -H 'xx-domain: cpfra' \
  -H 'origin: https://www.canalplus.com' \
  -H 'sec-fetch-site: cross-site' \
  -H 'sec-fetch-mode: cors' \
  -H 'sec-fetch-dest: empty' \
  -H 'referer: https://www.canalplus.com/' \
  -H 'accept-encoding: gzip, deflate, br' \
  --data-raw 'CAES3R8SSApGCjAIARIQjPgHSs34RIuk96KXbK//whoIV2lkZXZpbmUiCjEyMjA3NDkwOTNI49yVmwYQARoQlnrOk7t2NQE5vdtGGTw0BxgBIMipoqAGMBU4pru70Q5CgB8KFGxpY2Vuc2Uud2lkZXZpbmUuY29tEhAXBbkXzBIEhosGMzovdyqMGsAcrDigFhnsm+BZUH0PYtZ/fqWVP4VcgpukzQa367b2ftP/Yrc4CCF8VeOTR5i4IdbTS+Kbx5if4EkQuCpLPV3mupJPrPxk8spPu54B69AUAv8vKUi7Bv/OWRM195qhl1Yvaa2rMezQ3UOqp8g50sI9Z/gh7RA9Dv/o6eruwuWOcD4vvDElAVcROV7WJp1Ch52XuJ/2BsnuhkoT+FclB7vfLPNdyNkLvD4R4ksLeVFI8rP1cdmCwmYz9M4Ly+z3mFg7b+84PyvV+2HqdNVBaDcNozuQdUOcNusrd/rzsv3csWhSvfnGEzQ57CqlPduPAUR2akGxn0RUu6bIlxy3mPN3vuqO37XrhGHXaQdntEVl6PgvDnKGQk0NCiqvcu+myKdCc9uBK9RkDrSmQfXglV1/I7UHf3vM+Uf8N2OZO2DMJJn563dvllGx2o9gnQtECfkTHYkG5hxYckqo5SfJqH3OUezL9Ir6OJd9pH7Wt0ixxtpI6WZCdrYQjDziMzpysXYuI3qmGFpqWt8R2oWjzRGY8RpwDG7P4WMIfUNgcqFntummLOCsI4YEXBB5KujPSkMEL25ofx1dQoqqC6NGrwdeMTJp3hY5wIeY/4efqzdJHoGL0bm7j3g82AP7rB2kJ6hSgAcX8GadqaydBUmKMpKs/Ulw0rYeLypc132Hwoc2hNWuBZvYaMPGOoctP8R4ViI6vnfHucwQd6hjtxuMJEJ5b3OCKDsghYkb0G4JZTur1LKo9kg1V/1MVrLaczE6IdSSP4qLLnAtcpVALSS9XpUR1z5WwgYbCNDgDUeaUSXSfz1odvfgcLI6TpUPKP/cXIAWMXVc3Ew6VcDC86OgHpVlbxTRSZG4QVM6z789AN0CWBbiOI031tHlchHKBXM4MNKcdfE2KRE9aSHS/g5mCewjHW/OOxNE5iO4iqWiqweb0d3T822AChb+a86w63OxuATGaNlD838jKDVBA5amF/YNNySI88vgUSTrOiRhwgmG3IU7YgqOPdIdEqkBfCqTNwBhcGMbIacoqsvMUIpIDs/RZ3eAnYjptYa6FiX/ZXLsxVqX6qQDA96LsuzhOjB9qwAc3misaZZDYc8bsle6WeYKTRmcz5BL4CS33Zkk4W3S0wv1qZVBhwW8HimO3lsA0bZ54DF6fbDXQzdrfexAmjMhodPqy8oOsRHGmu2pxoWAjrIQRR3F3gGZ3g/vEO866aTqZCiuyD0etbR/xAS1qd77YF1GJgRbWKxolvy9N2wQR33eGI4AUYT2MCVltdADg++FmvsFwLj130Tfuw6dygCyoBG/Gm5ktw2hHvLYPSUzNwJfvyegfGBGtURA/m9P0cT1RQWgqyi8Ri2XukenBOzGn+QOYhhLsMdvza2Zs/xIUD9q5tcUhbTFCumVvrlcZA4bhOEUiH8lX+OUYy048cwgln8VyjrJeaDqC0orZORYHxGcLfDAyl7heH+OBzDt4oTwritlllf43iJHXfs8pWuiC5Kv/dIbfMj4fEokRTfYV2z+/m/fm+3a95e6WugNmtvEyetQFuYruYGW1UszTViqkTy50EUDfdQRbqBjUkm6w3ozKJybvVZelzmTRaTMxCZAg1QTXvD3u8Schu38VICTh0+QbEfoD9435XdNs/VNlBA/pi3CHA+QbGF+tBM3VnSfSl8aO4WNRemB1Ry/RbesYbizuK5uFleiQSNGMKp3CVC9D603ZbOWccNxcNuF2cGV5/76PIzQvtog+P7GXpvIpPI41r/XXHyyMHDifnDG/4cl32ZQHklume8u1/5s1j/MGVQaGK
HupiD3QlGyIUe6SffDWgQqpz1oP+hdrd+oAL3VWFukIHDDk+5fn0jIAkpbMt8YjmY9lTSnHZybNPT3ItNhQxOETilDHLHS0rgpmX4XfzrvPVJq1qEW1CcyyfJJATHHAlFEjyU3mEug0HBIQEDtGB5aSuU0+F7fH1PV63BsFHUZOk2CxPxTB/89hiblMX1HerD7zVwPvQV4wilbMp9ziNba24l7+CuAo4+F6/8YOYtINl4UEsomfRzW7J0kkNhrjp5iPHELBCwopbuXM5zoQlQOVDsDlrKWADaa9Ux/pyJYOXJEGgcwhTxcPgHAhN299SoO5imVmmpTSPmK1bgx2HNcuhmyKFbzVXNwEPe0LwAx8uKeG4P85kEEfnRlosnSdRulE9inJwe4gd4Y5gjvEIq9xMQHy6ZU46NUS9OyVPpO8yfm+PFLATEHWX9RdBdMuLug9ozMgxhVAqat4RQo6wbgpOVJOvYX7+McISGyzyQrmdu1zAIf7ICbbdu/JrRBDKCCyrpmqabx3FRZ+3WApN6RrnloxUtEB4iwVQYBMR02R4dM+YgyjYv0Vsla9PQXVldmqIdjQ+XufewSjpZq+icEffGXjGXXfB1AZicTpkZE1yOfvB6aYuYQF1cqO/lOOQkgX7X1oWZKcPGzLkuZuahCHYZOTgYuVeFI8DXO6WXYAApozTAL1egNs8N0ixmCoRBP1iT7RxHYcC3MteDErVUWtchcRzVF97Vq6Z6afhKknwQP8zo99Z5bulKIZV/9XRXhAD7pLl2N8bvSR2GhLG+/0jQKcXt6nliClOY3/34BXAfogYva+gs+0rnU5IZBGCfuYCXVaGrmcxQ5Q6WhVgGf5Wfoi8Cjz2nWqyylNW3Nb7npGhmwawY+0os8/99l5csFpjIVSdDb1rVGYX8J1p6XjfepuH0ZRUfgMeVu6vgTljOtcmzVnA0IF/qVp/dDUsWRvb7wB+QnCO3z1f5BmWn8DiJXDnFvUOIxqIpN+G6ISViBUrFvRZIVXi7q9JQAwaGyAgU3huIhK/yfNbD9nchhq/Av3YZkIZ2AbpSuHwpyqV5llJv7xz+HME1MCR1CRuej4sKLTAqTPdDZKnMngjnCn7ScGHXI0NjEgE+oWOKpShI41wpzrvMryd42CsbVzwBu6drFI2RyEdN48csEaDF7wX4wXXStk4qiSmirtuzdRuS0Rxk9qkb3sW2caMHE2RvQy76FvE0bvrRe9baMl6I01ohkq3mHKXEaFdvzivpFpLLq6glyX5dLHDcJE0PS8xOSpwdUOOiLqoiHaRjKN7hsdCUGBvE6v+xUoEG6HCM3ZVGx3Pw2cKJguRJvyOZ762ofYl8YDCpnHth7b6WuE62tSNu71w5wvIzSfnm7zEv6z83Q2z9W1ppHiIuAoYJmkvAeONcMMcY4Yj9vKrOXCea0d31RiWRiU2gbyKiPqVrTNrURobFj60X5qUz5mnduKxxM3xLq2H5wgk87ttu1DOFupzW0UKK319vugKRTA3cViWTfl+hdsaYAFNkVbXQ/53AmHLK5tWblD8//KZTkSFOxeHm4JxfaEs0t40ht350FaQF6QErtBZZ8a5XbCMjshv2hlhCz1GlSmt5H/ZDdoFI+JOotc8/cxwBEgXGVHwJPXvrhJetEiT6Ttm9/8yUU1h7KSUIpydiZ+KwJeixehIbn1sMNwXBO2vz0vP5mHC7OVNTc/jBWbw2b63qwAsW96GzSwDMfAYYzhJMPvck8jt8bP8yqVPuUPyoKqJQdV+ui4MvglyvFlhAqVfj5o5HFk0UzOk1R9gI6aDinwP4ZPMgytsnVwvPl3CE8hIn9mwU+xAhavKh+BqmPkdmMgLcCGh+qkJbsW8uQWPlRHtcWAk2rgmexBEU7rDln6ccbW4ojGUnCi+wKy/tmQj/0BZDWwXbLMUVAJDoIKxprcZqfCHdKds5dIpx0haH694j4/OgrW4co6Ss3Wc0GmCMXzz/h3w2uiLLrxYxnMprN/K
otPmjs1mSHH24FbNTnjofMrV8FjI2rRagiD/Xh9AYL+ZADik2NTq07BvEHzNqctPd2Q47MO1Cf0zgx+ZrO0L1o9cznBpJ8xHEN9kf6m8GgH62tS3dqJ/2+5egwvvdE+tC8cWvfX2R4+8tNwuYmKwsEqxLiU4uxc0WQ+LgmgoIib3v6p7z5eBapsAKtjtQ5qipcxZmVnuOHxmii9lne2n0ioG4Qo/C7C8BbKpBFZCP/X44poWQR+TUNhLc0b6mgJciGqaNNoZqyOl01ijn+Je5NDzOl6nsNJiEm+Jt5lLQsuIh+Gms3Xn8lxPloXSIJmzj560FNB+vAMXVqJLzN4q3csOfWEhujLG+CMk5d3cc0RunqV42j2kXG7t2pPPzJBwKotjq2TdYawVxq5kouOoN+5TdLYlZSiPfwmUikgke3F90xDxdcoGRmoJRjJd0bHmPIXGZrKjXI09RDCtXKh2FUsO5YIpA4cizuUcjW7k39jtp1Lgu8X417f6299GSG6Ihinn13addnGFhmegaFGag1nKMNj9dfwvW6KHpi7wyHWUJGZ6JbJFsweagNL2Dx7BB4B5YDhBQK6njN4KjnAyHTslxbaIGLcLl0Uw/iGoycf0DpSyhWGyiIDNTp7kRvPcoXil58q9CA9eWc71Lw6q/ZnzepSxfD2mfvxtrWNEzSCmHYHZbJBN+1xw12pf8BhvidYTelpA10jI/RszOWWsyyORarFF/kwjePmnYYH8ImC4T/b94wpBoI1vxjp6qLv5na4THbogWrAfETczI908IGoKQnO4oqahjo5U2jm83osXE8SRSIfNNx6E+wgTnvE18xaZR5oKBInU5xvb7+ygb05+Bg6GYdKgRgl5GwO3PIjF5natyltVC0mHA1BgDcJigPVrcglsI5QhWbfNK0HIyY7C/K48RNrS4l/NmeAxEsXH1fFyG9nk9w+sa2Ad27/9qloVqFKMWe99FoB5iglIUnxQxUcTqTErtdCsuAzUEGZvbVIy/3NdASenrVVUbKylI6LSaNpaYQ9zTGguLzPsQIprRtbDFYmwP19CxvCzFkIhBq5x19hYxbB78wfbVGBLD9KoACXJ4tnEIhAqrQuV9sq0F9Pnvq1tNiznYhJjZlYOHTpwyeVwpzM3zLDJkKo9NC9zK+4lZCFy7+e9liyTOj96wMyqiVXdHPCLGUKyi5eDCUyYpY7Tm7Yu1XgzsZpQpGVKHnV8hKa+39qO1FjsEiWrN6ZiACMHbRiFY/S7q69twh5wwd7Am0/yV0mYyLoP9mK875MrcLeoz1Rusnd33IavkgbBN6jQ69OrOI4VpUE/dKwPYDieX+74nZv17yAHRxdOaTS2gq9EYQmis7VZvfk6hEzdgZgsfjsw9YXH1Cd0y6pqnhkhQ20aVIFQVGM6lQ/pqbbScEVTSavXntNXDmAVm7xhqAARRY3o0zhOXqD4k2DJAHkqJNdFDcWxmw0Uw0B5BU+0ZfqKZeeBlRlhlE8kC1cawvPqE69NDiWuQB5dBw0r2t507tnRrBR9rvqgy2XATvTsYVCAmPWK/QmLtjWyoFIEOdLZrTdM6D4TjJFUTpGgxkRB5Rsc4PPPx0yKUCOdsjsEqzShQAAAABAAAAFAAFABDqLt2mwibWTQ==' \
  --compressed

As you can see, there is only a string in the data field. If we convert the above curl command to requests code via https://curlconverter.com/ we get:

import requests

headers = {
    'xx-request-id': '1658229840503-198a71ed5b1e-1678283976400-482624',
    'sec-ch-ua': '"Chromium";v="110", "Not A(Brand";v="24", "Google Chrome";v="110"',
    'xx-spyro-version': '3.0',
    'accept-language': 'fr,fr-FR',
    'authorization': 'PASS Token="11101wFMydY17lNcvY28FkzF2F9eyKbZst68ae6C2gJpPjclYizPY6rQ1ReBzCwcDEiE148GU6GTtvnguTIDn3gMitxTIyNKwP4TWpjJ33KBcgqrXripnV1cLlpMYnm-FG7OvUFI6RNXZCc7RpaIAYPKrPQvCC18GW-_aK8QbZ2t5Zn3QV4sB3bsbqvmWH62b0h7FBTB7OexazmPe9wGSb0oCiDFdMWBkNXe5ytv1sQDa_I0wN-muOiGoFq074OyO8XvbYnsDOW8mLF59NhHfMv_uW4YImELSb0kbn_1xksFOltQbW_8W0BvBJWB-qnsw10hsV2m00tLLoQ8efu6uczX2MMKYjg1_P_-JtD8fLUVK06lLvUmbYKrkAgmlMuzj7ykFR25XYBBUH3D1HWX5GIqaKSDOUROxp_02fijNXOpzGWG_nHJnG5MJAgxYp63l8OCQsA9Kfw3MhKstnDTo7G0qCVOhc1s48L8msM1v9eXtJSstaFUdaFXoHGEk1YQlh55c-cMkt_IzVG73CM-u5b2l0SotiEvI1ByJfnOifgqxaE6N6sR6dL3wggDAc8FC4GQwfRt8UoAUpIikaNz4zMSTiXNdDSNHzgWozqgOc6tmYny67uFtssdRygK32nxd2P99JmG2qJJ_Sau8AUBJfzuKxeOUYBfcd-N9tAYA434oxjr7luca3TJbzBdi9HT20xSBNyXjFBYJG9Qh7qZAb6UmTg.."',
    'xx-profile-id': '0',
    'xx-operator': 'pc',
    'xx-api-version': '3.0',
    'xx-device': 'pc 1658229840503-198a71ed5b1e',
    'sec-ch-ua-platform': '"Windows"',
    'xx-service': 'mycanal',
    'xx-oz': 'cpfra',
    'sec-ch-ua-mobile': '?0',
    'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36',
    'content-type': 'text/plain',
    'xx-ol': 'fr',
    'accept': 'application/json, text/plain, */*',
    'xx-distmodes': 'catchup,live,svod,tvod,posttvod',
    'xx-domain': 'cpfra',
    'origin': 'https://www.canalplus.com',
    'sec-fetch-site': 'cross-site',
    'sec-fetch-mode': 'cors',
    'sec-fetch-dest': 'empty',
    'referer': 'https://www.canalplus.com/',
}

data = 'CAES3R8SSApGCjAIARIQjPgHSs34RIuk96KXbK//whoIV2lkZXZpbmUiCjEyMjA3NDkwOTNI49yVmwYQARoQlnrOk7t2NQE5vdtGGTw0BxgBIMipoqAGMBU4pru70Q5CgB8KFGxpY2Vuc2Uud2lkZXZpbmUuY29tEhAXBbkXzBIEhosGMzovdyqMGsAcrDigFhnsm+BZUH0PYtZ/fqWVP4VcgpukzQa367b2ftP/Yrc4CCF8VeOTR5i4IdbTS+Kbx5if4EkQuCpLPV3mupJPrPxk8spPu54B69AUAv8vKUi7Bv/OWRM195qhl1Yvaa2rMezQ3UOqp8g50sI9Z/gh7RA9Dv/o6eruwuWOcD4vvDElAVcROV7WJp1Ch52XuJ/2BsnuhkoT+FclB7vfLPNdyNkLvD4R4ksLeVFI8rP1cdmCwmYz9M4Ly+z3mFg7b+84PyvV+2HqdNVBaDcNozuQdUOcNusrd/rzsv3csWhSvfnGEzQ57CqlPduPAUR2akGxn0RUu6bIlxy3mPN3vuqO37XrhGHXaQdntEVl6PgvDnKGQk0NCiqvcu+myKdCc9uBK9RkDrSmQfXglV1/I7UHf3vM+Uf8N2OZO2DMJJn563dvllGx2o9gnQtECfkTHYkG5hxYckqo5SfJqH3OUezL9Ir6OJd9pH7Wt0ixxtpI6WZCdrYQjDziMzpysXYuI3qmGFpqWt8R2oWjzRGY8RpwDG7P4WMIfUNgcqFntummLOCsI4YEXBB5KujPSkMEL25ofx1dQoqqC6NGrwdeMTJp3hY5wIeY/4efqzdJHoGL0bm7j3g82AP7rB2kJ6hSgAcX8GadqaydBUmKMpKs/Ulw0rYeLypc132Hwoc2hNWuBZvYaMPGOoctP8R4ViI6vnfHucwQd6hjtxuMJEJ5b3OCKDsghYkb0G4JZTur1LKo9kg1V/1MVrLaczE6IdSSP4qLLnAtcpVALSS9XpUR1z5WwgYbCNDgDUeaUSXSfz1odvfgcLI6TpUPKP/cXIAWMXVc3Ew6VcDC86OgHpVlbxTRSZG4QVM6z789AN0CWBbiOI031tHlchHKBXM4MNKcdfE2KRE9aSHS/g5mCewjHW/OOxNE5iO4iqWiqweb0d3T822AChb+a86w63OxuATGaNlD838jKDVBA5amF/YNNySI88vgUSTrOiRhwgmG3IU7YgqOPdIdEqkBfCqTNwBhcGMbIacoqsvMUIpIDs/RZ3eAnYjptYa6FiX/ZXLsxVqX6qQDA96LsuzhOjB9qwAc3misaZZDYc8bsle6WeYKTRmcz5BL4CS33Zkk4W3S0wv1qZVBhwW8HimO3lsA0bZ54DF6fbDXQzdrfexAmjMhodPqy8oOsRHGmu2pxoWAjrIQRR3F3gGZ3g/vEO866aTqZCiuyD0etbR/xAS1qd77YF1GJgRbWKxolvy9N2wQR33eGI4AUYT2MCVltdADg++FmvsFwLj130Tfuw6dygCyoBG/Gm5ktw2hHvLYPSUzNwJfvyegfGBGtURA/m9P0cT1RQWgqyi8Ri2XukenBOzGn+QOYhhLsMdvza2Zs/xIUD9q5tcUhbTFCumVvrlcZA4bhOEUiH8lX+OUYy048cwgln8VyjrJeaDqC0orZORYHxGcLfDAyl7heH+OBzDt4oTwritlllf43iJHXfs8pWuiC5Kv/dIbfMj4fEokRTfYV2z+/m/fm+3a95e6WugNmtvEyetQFuYruYGW1UszTViqkTy50EUDfdQRbqBjUkm6w3ozKJybvVZelzmTRaTMxCZAg1QTXvD3u8Schu38VICTh0+QbEfoD9435XdNs/VNlBA/pi3CHA+QbGF+tBM3VnSfSl8aO4WNRemB1Ry/RbesYbizuK5uFleiQSNGMKp3CVC9D603ZbOWccNxcNuF2cGV5/76PIzQvtog+P7GXpvIpPI41r/XXHyyMHDifnDG/4cl32ZQHklume8u1/5s1j/MGVQaGKHupiD3
QlGyIUe6SffDWgQqpz1oP+hdrd+oAL3VWFukIHDDk+5fn0jIAkpbMt8YjmY9lTSnHZybNPT3ItNhQxOETilDHLHS0rgpmX4XfzrvPVJq1qEW1CcyyfJJATHHAlFEjyU3mEug0HBIQEDtGB5aSuU0+F7fH1PV63BsFHUZOk2CxPxTB/89hiblMX1HerD7zVwPvQV4wilbMp9ziNba24l7+CuAo4+F6/8YOYtINl4UEsomfRzW7J0kkNhrjp5iPHELBCwopbuXM5zoQlQOVDsDlrKWADaa9Ux/pyJYOXJEGgcwhTxcPgHAhN299SoO5imVmmpTSPmK1bgx2HNcuhmyKFbzVXNwEPe0LwAx8uKeG4P85kEEfnRlosnSdRulE9inJwe4gd4Y5gjvEIq9xMQHy6ZU46NUS9OyVPpO8yfm+PFLATEHWX9RdBdMuLug9ozMgxhVAqat4RQo6wbgpOVJOvYX7+McISGyzyQrmdu1zAIf7ICbbdu/JrRBDKCCyrpmqabx3FRZ+3WApN6RrnloxUtEB4iwVQYBMR02R4dM+YgyjYv0Vsla9PQXVldmqIdjQ+XufewSjpZq+icEffGXjGXXfB1AZicTpkZE1yOfvB6aYuYQF1cqO/lOOQkgX7X1oWZKcPGzLkuZuahCHYZOTgYuVeFI8DXO6WXYAApozTAL1egNs8N0ixmCoRBP1iT7RxHYcC3MteDErVUWtchcRzVF97Vq6Z6afhKknwQP8zo99Z5bulKIZV/9XRXhAD7pLl2N8bvSR2GhLG+/0jQKcXt6nliClOY3/34BXAfogYva+gs+0rnU5IZBGCfuYCXVaGrmcxQ5Q6WhVgGf5Wfoi8Cjz2nWqyylNW3Nb7npGhmwawY+0os8/99l5csFpjIVSdDb1rVGYX8J1p6XjfepuH0ZRUfgMeVu6vgTljOtcmzVnA0IF/qVp/dDUsWRvb7wB+QnCO3z1f5BmWn8DiJXDnFvUOIxqIpN+G6ISViBUrFvRZIVXi7q9JQAwaGyAgU3huIhK/yfNbD9nchhq/Av3YZkIZ2AbpSuHwpyqV5llJv7xz+HME1MCR1CRuej4sKLTAqTPdDZKnMngjnCn7ScGHXI0NjEgE+oWOKpShI41wpzrvMryd42CsbVzwBu6drFI2RyEdN48csEaDF7wX4wXXStk4qiSmirtuzdRuS0Rxk9qkb3sW2caMHE2RvQy76FvE0bvrRe9baMl6I01ohkq3mHKXEaFdvzivpFpLLq6glyX5dLHDcJE0PS8xOSpwdUOOiLqoiHaRjKN7hsdCUGBvE6v+xUoEG6HCM3ZVGx3Pw2cKJguRJvyOZ762ofYl8YDCpnHth7b6WuE62tSNu71w5wvIzSfnm7zEv6z83Q2z9W1ppHiIuAoYJmkvAeONcMMcY4Yj9vKrOXCea0d31RiWRiU2gbyKiPqVrTNrURobFj60X5qUz5mnduKxxM3xLq2H5wgk87ttu1DOFupzW0UKK319vugKRTA3cViWTfl+hdsaYAFNkVbXQ/53AmHLK5tWblD8//KZTkSFOxeHm4JxfaEs0t40ht350FaQF6QErtBZZ8a5XbCMjshv2hlhCz1GlSmt5H/ZDdoFI+JOotc8/cxwBEgXGVHwJPXvrhJetEiT6Ttm9/8yUU1h7KSUIpydiZ+KwJeixehIbn1sMNwXBO2vz0vP5mHC7OVNTc/jBWbw2b63qwAsW96GzSwDMfAYYzhJMPvck8jt8bP8yqVPuUPyoKqJQdV+ui4MvglyvFlhAqVfj5o5HFk0UzOk1R9gI6aDinwP4ZPMgytsnVwvPl3CE8hIn9mwU+xAhavKh+BqmPkdmMgLcCGh+qkJbsW8uQWPlRHtcWAk2rgmexBEU7rDln6ccbW4ojGUnCi+wKy/tmQj/0BZDWwXbLMUVAJDoIKxprcZqfCHdKds5dIpx0haH694j4/OgrW4co6Ss3Wc0GmCMXzz/h3w2uiLLrxYxnMprN/KotPmjs
1mSHH24FbNTnjofMrV8FjI2rRagiD/Xh9AYL+ZADik2NTq07BvEHzNqctPd2Q47MO1Cf0zgx+ZrO0L1o9cznBpJ8xHEN9kf6m8GgH62tS3dqJ/2+5egwvvdE+tC8cWvfX2R4+8tNwuYmKwsEqxLiU4uxc0WQ+LgmgoIib3v6p7z5eBapsAKtjtQ5qipcxZmVnuOHxmii9lne2n0ioG4Qo/C7C8BbKpBFZCP/X44poWQR+TUNhLc0b6mgJciGqaNNoZqyOl01ijn+Je5NDzOl6nsNJiEm+Jt5lLQsuIh+Gms3Xn8lxPloXSIJmzj560FNB+vAMXVqJLzN4q3csOfWEhujLG+CMk5d3cc0RunqV42j2kXG7t2pPPzJBwKotjq2TdYawVxq5kouOoN+5TdLYlZSiPfwmUikgke3F90xDxdcoGRmoJRjJd0bHmPIXGZrKjXI09RDCtXKh2FUsO5YIpA4cizuUcjW7k39jtp1Lgu8X417f6299GSG6Ihinn13addnGFhmegaFGag1nKMNj9dfwvW6KHpi7wyHWUJGZ6JbJFsweagNL2Dx7BB4B5YDhBQK6njN4KjnAyHTslxbaIGLcLl0Uw/iGoycf0DpSyhWGyiIDNTp7kRvPcoXil58q9CA9eWc71Lw6q/ZnzepSxfD2mfvxtrWNEzSCmHYHZbJBN+1xw12pf8BhvidYTelpA10jI/RszOWWsyyORarFF/kwjePmnYYH8ImC4T/b94wpBoI1vxjp6qLv5na4THbogWrAfETczI908IGoKQnO4oqahjo5U2jm83osXE8SRSIfNNx6E+wgTnvE18xaZR5oKBInU5xvb7+ygb05+Bg6GYdKgRgl5GwO3PIjF5natyltVC0mHA1BgDcJigPVrcglsI5QhWbfNK0HIyY7C/K48RNrS4l/NmeAxEsXH1fFyG9nk9w+sa2Ad27/9qloVqFKMWe99FoB5iglIUnxQxUcTqTErtdCsuAzUEGZvbVIy/3NdASenrVVUbKylI6LSaNpaYQ9zTGguLzPsQIprRtbDFYmwP19CxvCzFkIhBq5x19hYxbB78wfbVGBLD9KoACXJ4tnEIhAqrQuV9sq0F9Pnvq1tNiznYhJjZlYOHTpwyeVwpzM3zLDJkKo9NC9zK+4lZCFy7+e9liyTOj96wMyqiVXdHPCLGUKyi5eDCUyYpY7Tm7Yu1XgzsZpQpGVKHnV8hKa+39qO1FjsEiWrN6ZiACMHbRiFY/S7q69twh5wwd7Am0/yV0mYyLoP9mK875MrcLeoz1Rusnd33IavkgbBN6jQ69OrOI4VpUE/dKwPYDieX+74nZv17yAHRxdOaTS2gq9EYQmis7VZvfk6hEzdgZgsfjsw9YXH1Cd0y6pqnhkhQ20aVIFQVGM6lQ/pqbbScEVTSavXntNXDmAVm7xhqAARRY3o0zhOXqD4k2DJAHkqJNdFDcWxmw0Uw0B5BU+0ZfqKZeeBlRlhlE8kC1cawvPqE69NDiWuQB5dBw0r2t507tnRrBR9rvqgy2XATvTsYVCAmPWK/QmLtjWyoFIEOdLZrTdM6D4TjJFUTpGgxkRB5Rsc4PPPx0yKUCOdsjsEqzShQAAAABAAAAFAAFABDqLt2mwibWTQ=='

response = requests.post(
    'https://secure-gen-hapi.canal-plus.com/conso/view/70e7c3f0-bdb9-11ed;70e7c3f0-bdb9-11ed-8f2f-11937df79a3f;-8f2f-11937df79a3f/licence?drmConfig=mkpl::true',
    headers=headers,
    data=data,
).text

print(response)

If we run this as-is with normal requests, we get the following response from the server:

<licenseresponse deviceid="MTY1ODIyOTg0MDUwMy0xOThhNzFlZDViMWVaV0V4TVdZMU5HSXhaVEJqTmpaaU0yWXpNR016WlRjMFlUazVaRFUyTkRZPQ==" xmlns="http://www.canal-plus.com/DRM/V1"><clientresponse statuscode="200"><license contentid="1220749093">CAISogMKNwoQlnrOk7t2NQE5vdtGGTw0BxIQ+a6v4hMRH0WxLFEiEniSSxoAIAEoADgAQICangFIyKmioAYSGQgBEAEYACCAmp4BKICangEwgJqeAUgAUAAaZhIQiOusCGJ0kXjuAmtQBEGFfxpQv21kbpgPS7pRgveNZkkNz44FK0ne/TwxHHJS04rRA/hK9+kuUn8/iUoZv+w0Vc/1AO2fSOAAdeuvvg2qxvNPKcrv0/iylpBMlnLD6Qk6/gsgARpqChA2/y5kD6NOlKwER2slbImZEhApHuYphDus85SV3gK2JlDHGiD6wbYtUzKOurH/VMgjASkL4Yq/YQb/BBcHIdCmIXBB2CACKAEyBggAGAAgAEISChBrYzE2ACeNAOou3aaAAAAIYgJTRBpqChCM+AdKzfhEi6T3opdsr//CEhCpaa0cFIGKcSmIcuEOnGAZGiDQJevHkRujbvPFh60ghHQ9Zkwg0//9qGs2KhjieHpPaCACKAIyBggBGAAgAEISChBrYzE2ACeNAOou3abEAAIMYgJIRCDIqaKgBjjj3JWbBlAFGiANFoIOLF5t550dYJxn8j7sJnQCcfxl/jtuBNFNgE95RCKAAXkJ/U6b/PKB2/IYpA/Cd0iTninH3TxQsDbl4WP/j7i9qOKVqK5AbJo4M6ar5dzXX87381z4Qd8gaYpCQjmJJtUAlRS8RQMvaRLIlemT1rqQ6R94ZQWGmX+oXbgV5WKetcNT+lZIkMixXS/Nj2lnMvRDh2pU64HXi9a7ngfmb6nLOjMKMTE3LjQuMCBCdWlsdCBvbiBKYW4gMTggMjAyMyAxNzozMTozMiAoMTY3NDA5MTg1MClAAUrYAQAAAAIAAADYAAUAEOou3abCJtZNAAAAWAAAABAAAABqAAAAUAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAAAAEAAAAAAAAAAAAAAAAAAAAAACeNAAAAAAAAJ40AAAAAAAAAAAAAAAACAAAAwAAAABAAAADSAAAAEAAAAOQAAAAQAAAAAAAAAAAAAAEUAAAAEAAAASwAAAAQAAABPgAAABAAAAFQAAAAEAAAAAAAAAAAAAABgAAAABA8nSDt6CBZGcu0m9TZbSm7TNY+P8dv4mEivGtp74+gJVgB</license></clientresponse></licenseresponse>

But if we do the same with curl_cffi, we get:

TypeError: data must be dict, BytesIO or bytes

So this should be a bug, right? Can you add support for this type of request?
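A workaround sketch (not from the maintainer): at the time of this issue, curl_cffi accepted bytes but not str for `data`, so encoding the payload yourself sidesteps the TypeError. The call at the bottom is hypothetical and shown only as a comment.

```python
# Workaround sketch: encode the str payload to bytes before passing it,
# since curl_cffi accepts bytes for `data` even where str raises TypeError.
payload = "CAES3R8SSApGCjAIARIQ..."  # stands in for the license blob above

body = payload.encode("utf-8")  # bytes are accepted where str is not

# Hypothetical call, mirroring the requests code above:
# from curl_cffi import requests
# r = requests.post(url, headers=headers, data=body, impersonate="chrome110")
```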

Session cookies are not inherited during redirection

problem: Session cookies are not inherited during redirection
version: 0.5.4
describe: When a session with cookies requests website A, which 302-redirects to B, the follow-up request to B does not carry any cookies.

Header order cannot be controlled when sending requests

Some websites, such as Cloudflare, detect the default ordering of headers and use it to identify crawlers. curl_cffi does not send request headers in the dict's insertion order. Is this a limitation of curl?
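For what it's worth, libcurl transmits headers given via CURLOPT_HTTPHEADER in list order, and Python dicts preserve insertion order, so an ordered header list can be built as below (a sketch; the low-level setopt usage is shown only as a comment):

```python
# Sketch: flatten an insertion-ordered dict into the ordered list that
# libcurl's CURLOPT_HTTPHEADER transmits verbatim, top to bottom.
headers = {
    "user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
    "accept": "*/*",
    "accept-language": "zh-CN,zh;q=0.9",
}
header_list = [f"{k}: {v}".encode() for k, v in headers.items()]

# Hypothetical low-level usage:
# from curl_cffi import Curl, CurlOpt
# c = Curl()
# c.setopt(CurlOpt.HTTPHEADER, header_list)
```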

Pyinstaller Packing problem

Thanks for this project!
I found a packaging problem with PyInstaller. Here is my example code:

from curl_cffi import requests
r = requests.get("https://www.zoopla.co.uk/to-rent/branch/johns-and-co-nine-elms-london-68981", impersonate="chrome101")
print(r.text)

It works fine.
When I package it with PyInstaller:

pyinstaller -F example.py

This error occurred at runtime:

Traceback (most recent call last):
  File "ProjectTb\cffi_test\Test.py", line 1, in <module>
  File "PyInstaller\loader\pyimod02_importers.py", line 493, in exec_module
  File "curl_cffi\__init__.py", line 9, in <module>
ModuleNotFoundError: No module named '_cffi_backend'
[13448] Failed to execute script 'Test' due to unhandled exception!

Sometimes a different error is reported instead:

curl_cffi.CurlError: Failed to perform, ErrCode: 77, Reason: error setting certificate verify locations:CAfile: C:\Us
ers\25605\AppData\Local\Temp\_MEI30202\curl_cffi cacert.pem CApath: none
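A possible fix, assuming the standard PyInstaller options (not verified against this exact setup): declare the cffi backend as a hidden import and collect curl_cffi's bundled files, including its cacert.pem data file.

```shell
# Sketch: bundle the cffi backend and curl_cffi's data files explicitly.
pyinstaller -F example.py \
    --hidden-import=_cffi_backend \
    --collect-all curl_cffi
```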

POST form data parameter problem

The data parameter of a POST request only accepts a dict; a string is rejected.

import json

from curl_cffi import requests as req

headers = {
    "authority": "trendinsight.oceanengine.com",
    "accept": "application/json, text/plain, */*",
    "accept-language": "zh-CN,zh;q=0.9",
    "cache-control": "no-cache",
    "content-type": "application/json",
    "origin": "https://trendinsight.oceanengine.com",
    "pragma": "no-cache",
    "referer": "https://trendinsight.oceanengine.com/arithmetic-index/analysis?keyword=%E6%83%85%E4%BA%BA&appName=aweme",
    "sec-ch-ua": "\"Chromium\";v=\"110\", \"Not A(Brand\";v=\"24\", \"Google Chrome\";v=\"110\"",
    "sec-ch-ua-mobile": "?0",
    "sec-ch-ua-platform": "\"Windows\"",
    "sec-fetch-dest": "empty",
    "sec-fetch-mode": "cors",
    "sec-fetch-site": "same-origin",
    "user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36"
}
url = "https://trendinsight.oceanengine.com/api/open/index/get_multi_keyword_hot_trend"
params = {
    "msToken": mst,
    "X-Bogus": xb,
    "_signature": "_02B4Z6wo000016U3zkAAAIDBR4K-1yhJkyelN8rAAIqrY7j8042mQ9bMxDwIz7wfa6R6FdlZtGVkyaM95f4gdLW4kokHTxVoj6kxGhM7Jo7SzzG5nOUuJmGZeSE4p4QOk2OQYs7K9qNRSx09e0"
}
data = {"keyword_list":["情人"],"start_date":"20221120","end_date":"20221127","app_name":"aweme",'region':[]}
data = json.dumps(data)
response = req.post(url,headers=headers, params=params, data=data)

print(response.text)

This example is a data API from the TrendInsight (oceanengine.com) site; the curl command copied from the browser was converted to Python code. Native requests can send the request (though it returns no data), while this requests library raises an error on the data parameter and cannot even send the request. I hope the author can look into this.

Ability to use local proxy 127.0.0.1 for Fiddler

Issue: session.py 393 | curl_cffi.requests.errors.RequestsError: Failed to perform, ErrCode: 60, Reason: 'SSL certificate problem: unable to get local issuer certificate'

Just set the request proxy to 127.0.0.1:8888 and open Fiddler. The normal requests library lets Fiddler capture the request, but curl_cffi does not.
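One likely-working setup, sketched below: Fiddler re-signs TLS traffic with its own root certificate, so verification has to be relaxed (or Fiddler's CA trusted). The request call itself is hypothetical and left as a comment.

```python
# Sketch: route traffic through Fiddler and skip certificate verification,
# since Fiddler presents its own root certificate for HTTPS traffic.
proxies = {
    "http": "http://127.0.0.1:8888",
    "https": "http://127.0.0.1:8888",
}

# Hypothetical call:
# from curl_cffi import requests
# r = requests.get("https://example.com", proxies=proxies, verify=False)
```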

Any help would be much appreciated :)

Best,
OE

Problems about the proxies

Thanks for this project!

When I set the proxies parameter, I get this error:

(CurlError("Failed to perform, ErrCode: 35, Reason: 'error:100000f7:SSL routines:OPENSSL_internal:WRONG_VERSION_NUMBER'"),)

The proxy has a username and password, and I set the parameter like this:

auth = ["username", "pwd"],
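A guess at the cause, sketched below: ErrCode 35 with WRONG_VERSION_NUMBER usually means an `https://` scheme was used for a plain-HTTP proxy, and proxy credentials normally go inside the proxy URL rather than a separate auth parameter. The host and port here are made up.

```python
# Sketch: embed proxy credentials in the proxy URL; use the http:// scheme
# for both keys unless the proxy itself really speaks TLS.
proxies = {
    "http": "http://username:pwd@proxy.example.com:8080",
    "https": "http://username:pwd@proxy.example.com:8080",
}

# Hypothetical call:
# r = requests.get(url, proxies=proxies)
```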

asyncio.exceptions.InvalidStateError: invalid state

Exception in callback <bound method AsyncCurl.process_data of <curl_cffi.aio.AsyncCurl object at 0x7f7cca747bd0>>
handle:
Traceback (most recent call last):
  File "uvloop/cbhandles.pyx", line 63, in uvloop.loop.Handle._run
  File "/usr/local/lib/python3.11/site-packages/curl_cffi/aio.py", line 144, in process_data
    self.set_result(curl)
  File "/usr/local/lib/python3.11/site-packages/curl_cffi/aio.py", line 162, in set_result
    future.set_result(None)
asyncio.exceptions.InvalidStateError: invalid state

This error sometimes shows up. What could it be related to?

[BUG] "ImportError: DLL load failed while importing _wrapper: The specified module could not be found." - PyInstaller issue

Hey again! So I was just trying to obfuscate & pack my app using PyArmor & PyInstaller and can't seem to get past this error on the final .exe:

┌─────────────────────────────── Traceback (most recent call last) ────────────────────────────────┐
│ in <module>:3                                                                                    │
│ in <module>:37                                                                                   │
│                                                                                                  │
│ in exec_module:495                                                                               │
│ in <module>:1                                                                                    │
│ in <module>:18                                                                                   │
│                                                                                                  │
│ in exec_module:495                                                                               │
│                                                                                                  │
│ in <module>:33                                                                                   │
└──────────────────────────────────────────────────────────────────────────────────────────────────┘
ImportError: DLL load failed while importing _wrapper: The specified module could not be found.
[13380] Failed to execute script 'main' due to unhandled exception!

I've tried following the instructions in the readme about using the hidden-import/collect-all options but can't seem to get it to work - any idea what could be happening?

Also, using the "collect all" option introduces another warning message:
WARNING: file already exists but should not: C:\Users\ADMINI~1\AppData\Local\Temp\2\_MEI146482\curl_cffi\_wrapper.pyd

The machine I'm trying to compile on runs Windows Server 2016.
Python version: 3.9.6
curl_cffi version: 0.5.5 (seems like you fixed most of the previous issues I had, so thanks!)
cffi version: 1.15.1
My PyInstaller command is super long with a bunch of hidden imports due to my app being large, but I'm using one-file mode, if that makes any difference. I could try to make a sample app if needed, just let me know!

Thanks in advance as usual

Is the stream parameter not supported?

Running the code raises: request() got an unexpected keyword argument 'stream'
Is the stream parameter still unsupported? Streaming comes up a lot in async use in practice, so please support it soon.
Also, thanks to the author for sharing!

[BUG] Too many headers

Version: curl-cffi==0.5.1
The bug can be reproduced with the code below; please test it when you have time.

import asyncio

import httpx
from curl_cffi import requests

url = "http://127.0.0.1:9000"  # start any FastAPI app; FastAPI makes duplicate headers easy to inspect, while Flask just returns 431 (headers too large)
headers = {
    "user-agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko)"
    " Chrome/110.0.0.0 Safari/537.36"
}


async def test():
    session = requests.AsyncSession(verify=False)
    for i in range(1000):
        # every reuse of the session appends the headers again
        response = await session.get(url, headers=headers)
        print(response.status_code)
    # this pattern also triggers the bug
    # session = requests.AsyncSession(verify=False, headers=headers)
    # for i in range(1000):
    #     response = await session.get(url)
    #     print(response.status_code)


async def test_httpx():
    client = httpx.AsyncClient(verify=False, headers=headers)
    for i in range(100):
        response = await client.get(url, headers=headers)
        print(response.status_code)


if __name__ == "__main__":
    asyncio.run(test())
    # asyncio.run(test_httpx())  # httpx handles this correctly
    """
    These are the headers the server received:
    ['host', 'accept', 'accept-encoding', 'user-agent']
    ['host', 'accept', 'accept-encoding', 'user-agent', 'user-agent']
    ['host', 'accept', 'accept-encoding', 'user-agent', 'user-agent', 'user-agent']
    ['host', 'accept', 'accept-encoding', 'user-agent', 'user-agent', 'user-agent', 'user-agent']
    ['host', 'accept', 'accept-encoding', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent']
    ['host', 'accept', 'accept-encoding', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent']
    ['host', 'accept', 'accept-encoding', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent']
    ['host', 'accept', 'accept-encoding', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent']
    ['host', 'accept', 'accept-encoding', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent']
    ['host', 'accept', 'accept-encoding', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent']
    ['host', 'accept', 'accept-encoding', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent', 'user-agent']
    """

pip install error on macOS (Apple M2)

ERROR: Command errored out with exit status 1:
command: /Library/Developer/CommandLineTools/usr/bin/python3 -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/j6/4jxhrj8161xg60h5v_ssfk000000gn/T/pip-install-qc26puko/curl-cffi_964d697b07704e03b434331373194977/setup.py'"'"'; file='"'"'/private/var/folders/j6/4jxhrj8161xg60h5v_ssfk000000gn/T/pip-install-qc26puko/curl-cffi_964d697b07704e03b434331373194977/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(file) if os.path.exists(file) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' bdist_wheel -d /private/var/folders/j6/4jxhrj8161xg60h5v_ssfk000000gn/T/pip-wheel-fbgk2w9a
cwd: /private/var/folders/j6/4jxhrj8161xg60h5v_ssfk000000gn/T/pip-install-qc26puko/curl-cffi_964d697b07704e03b434331373194977/
Complete output (24 lines):
running bdist_wheel
running build
running build_py
creating build
creating build/lib.macosx-10.9-universal2-cpython-39
creating build/lib.macosx-10.9-universal2-cpython-39/curl_cffi
copying ./test_curl.py -> build/lib.macosx-10.9-universal2-cpython-39/curl_cffi
copying ./cffi_build.py -> build/lib.macosx-10.9-universal2-cpython-39/curl_cffi
copying ./__init__.py -> build/lib.macosx-10.9-universal2-cpython-39/curl_cffi
copying ./_const.py -> build/lib.macosx-10.9-universal2-cpython-39/curl_cffi
copying ./example.py -> build/lib.macosx-10.9-universal2-cpython-39/curl_cffi
running build_ext
generating cffi module 'build/temp.macosx-10.9-universal2-cpython-39/curl_cffi._curl_cffi.c'
creating build/temp.macosx-10.9-universal2-cpython-39
building 'curl_cffi._curl_cffi' extension
creating build/temp.macosx-10.9-universal2-cpython-39/build
creating build/temp.macosx-10.9-universal2-cpython-39/build/temp.macosx-10.9-universal2-cpython-39
creating build/temp.macosx-10.9-universal2-cpython-39/include
clang -Wno-unused-result -Wsign-compare -Wunreachable-code -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -iwithsysroot/System/Library/Frameworks/System.framework/PrivateHeaders -iwithsysroot/Applications/Xcode.app/Contents/Developer/Library/Frameworks/Python3.framework/Versions/3.9/Headers -arch arm64 -arch x86_64 -Werror=implicit-function-declaration -Iinclude -I/Applications/Xcode.app/Contents/Developer/Library/Frameworks/Python3.framework/Versions/3.9/Headers -c build/temp.macosx-10.9-universal2-cpython-39/curl_cffi._curl_cffi.c -o build/temp.macosx-10.9-universal2-cpython-39/build/temp.macosx-10.9-universal2-cpython-39/curl_cffi._curl_cffi.o
build/temp.macosx-10.9-universal2-cpython-39/curl_cffi._curl_cffi.c:50:14: fatal error: 'pyconfig.h' file not found
# include <pyconfig.h>
             ^~~~~~~~~~~~
1 error generated.
error: command '/usr/bin/clang' failed with exit code 1

ERROR: Failed building wheel for curl-cffi
Running setup.py clean for curl-cffi
Failed to build curl-cffi
Installing collected packages: curl-cffi
Running setup.py install for curl-cffi ... error
ERROR: Command errored out with exit status 1:
command: /Library/Developer/CommandLineTools/usr/bin/python3 -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/j6/4jxhrj8161xg60h5v_ssfk000000gn/T/pip-install-qc26puko/curl-cffi_964d697b07704e03b434331373194977/setup.py'"'"'; file='"'"'/private/var/folders/j6/4jxhrj8161xg60h5v_ssfk000000gn/T/pip-install-qc26puko/curl-cffi_964d697b07704e03b434331373194977/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(file) if os.path.exists(file) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' install --record /private/var/folders/j6/4jxhrj8161xg60h5v_ssfk000000gn/T/pip-record-82gum57w/install-record.txt --single-version-externally-managed --user --prefix= --compile --install-headers /Users/shixiaolong/Library/Python/3.9/include/python3.9/curl-cffi
cwd: /private/var/folders/j6/4jxhrj8161xg60h5v_ssfk000000gn/T/pip-install-qc26puko/curl-cffi_964d697b07704e03b434331373194977/
Complete output (26 lines):
running install
/Users/shixiaolong/Library/Python/3.9/lib/python/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
running build
running build_py
creating build
creating build/lib.macosx-10.9-universal2-cpython-39
creating build/lib.macosx-10.9-universal2-cpython-39/curl_cffi
copying ./test_curl.py -> build/lib.macosx-10.9-universal2-cpython-39/curl_cffi
copying ./cffi_build.py -> build/lib.macosx-10.9-universal2-cpython-39/curl_cffi
copying ./__init__.py -> build/lib.macosx-10.9-universal2-cpython-39/curl_cffi
copying ./_const.py -> build/lib.macosx-10.9-universal2-cpython-39/curl_cffi
copying ./example.py -> build/lib.macosx-10.9-universal2-cpython-39/curl_cffi
running build_ext
generating cffi module 'build/temp.macosx-10.9-universal2-cpython-39/curl_cffi._curl_cffi.c'
creating build/temp.macosx-10.9-universal2-cpython-39
building 'curl_cffi._curl_cffi' extension
creating build/temp.macosx-10.9-universal2-cpython-39/build
creating build/temp.macosx-10.9-universal2-cpython-39/build/temp.macosx-10.9-universal2-cpython-39
creating build/temp.macosx-10.9-universal2-cpython-39/include
clang -Wno-unused-result -Wsign-compare -Wunreachable-code -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -iwithsysroot/System/Library/Frameworks/System.framework/PrivateHeaders -iwithsysroot/Applications/Xcode.app/Contents/Developer/Library/Frameworks/Python3.framework/Versions/3.9/Headers -arch arm64 -arch x86_64 -Werror=implicit-function-declaration -Iinclude -I/Applications/Xcode.app/Contents/Developer/Library/Frameworks/Python3.framework/Versions/3.9/Headers -c build/temp.macosx-10.9-universal2-cpython-39/curl_cffi._curl_cffi.c -o build/temp.macosx-10.9-universal2-cpython-39/build/temp.macosx-10.9-universal2-cpython-39/curl_cffi._curl_cffi.o
build/temp.macosx-10.9-universal2-cpython-39/curl_cffi._curl_cffi.c:50:14: fatal error: 'pyconfig.h' file not found
# include <pyconfig.h>
^~~~~~~~~~~~
1 error generated.
error: command '/usr/bin/clang' failed with exit code 1
----------------------------------------
ERROR: Command errored out with exit status 1: /Library/Developer/CommandLineTools/usr/bin/python3 -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/j6/4jxhrj8161xg60h5v_ssfk000000gn/T/pip-install-qc26puko/curl-cffi_964d697b07704e03b434331373194977/setup.py'"'"'; file='"'"'/private/var/folders/j6/4jxhrj8161xg60h5v_ssfk000000gn/T/pip-install-qc26puko/curl-cffi_964d697b07704e03b434331373194977/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(file) if os.path.exists(file) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' install --record /private/var/folders/j6/4jxhrj8161xg60h5v_ssfk000000gn/T/pip-record-82gum57w/install-record.txt --single-version-externally-managed --user --prefix= --compile --install-headers /Users/shixiaolong/Library/Python/3.9/include/python3.9/curl-cffi Check the logs for full command output.

Failed to perform, ErrCode: 27, Reason: ''

I'm trying to do the same thing I did before with pycurl on Linux, but here I'm stuck with a weird problem:
the script dies on the second perform with "Failed to perform, ErrCode: 27, Reason: ''", and I can't seem to send a second request inside one session.
In other configs I tried, it crashed without even throwing an error and still couldn't execute the second perform.
From what I have found, error 27 (CURLE_OUT_OF_MEMORY) indicates an allocation failure, but my example has pretty low data usage and there should be no overflow here. Can you point out where I might be wrong?

from io import BytesIO
from curl_cffi import Curl, CurlOpt, CurlInfo

def test():
    ua = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36"
    custom_headers = [ 
        "referer:https://google.com/",
        f"user-agent:{ua}",
        ]

    ch = Curl()
    buffer = BytesIO()
    ch.impersonate("chrome99")

    ch.setopt(CurlOpt.USERAGENT, bytes(ua, 'utf-8'))
    ch.setopt(CurlOpt.URL, 'https://ipv4.icanhazip.com/')
    ch.setopt(CurlOpt.WRITEDATA, buffer) 
    ch.setopt(CurlOpt.HTTPHEADER, [bytes(a, 'utf-8') for a in custom_headers])        
    ch.perform()    
    body = buffer.getvalue()
    print(body)
    print("================================")

    ch.setopt(CurlOpt.URL, 'https://ipv4.icanhazip.com/')
    ch.setopt(CurlOpt.WRITEDATA, buffer)  
    ch.setopt(CurlOpt.HTTPHEADER, [bytes(b, 'utf-8') for b in custom_headers])        
    ch.perform()
    body = buffer.getvalue()

    print(body)

    print('ENDS HERE')
    ch.close()

test() 

How can I use socks proxy in Session?

When I am trying to update the session proxies to the following:

proxies = {
    'http': 'socks5://127.0.0.1:9150',
    'https': 'socks5://127.0.0.1:9150',
}
session.proxies.update(proxies=proxies)

Following is the error I am getting:

Traceback (most recent call last):
  File "/Users/anshul/heystack/scraper/api_scraper/dev/api_scraper.py", line 623, in <module>
    response = scraper.browser.get('https://ipinfo.io/json')
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/anshul/miniconda3/envs/main/lib/python3.11/site-packages/curl_cffi/requests/session.py", line 369, in request
    req, buffer, header_buffer = self._set_curl_options(
                                 ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/anshul/miniconda3/envs/main/lib/python3.11/site-packages/curl_cffi/requests/session.py", line 247, in _set_curl_options
    if proxies["https"] is not None:
       ~~~~~~~^^^^^^^^^
KeyError: 'https'
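A probable explanation (my reading of the traceback, not confirmed by the maintainer): `dict.update(proxies=proxies)` stores the whole mapping under the literal key `"proxies"`, so the later `proxies["https"]` lookup fails. Passing the dict positionally keeps the expected keys:

```python
# Reproduction of the mistake vs. the fix, using plain dicts.
proxies = {
    "http": "socks5://127.0.0.1:9150",
    "https": "socks5://127.0.0.1:9150",
}

wrong = {}
wrong.update(proxies=proxies)   # stores everything under the key "proxies"

right = {}
right.update(proxies)           # keys "http" and "https", as expected
```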

Can't install with pip in alpine

The latest version (0.4.0) can't be installed.

My environment:
OS: arm64 docker alpine
Python: 3.8.10
Pip: 23.0.1

Here is the error log:

  Using cached curl_cffi-0.4.0.tar.gz (75 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [37 lines of output]
      Traceback (most recent call last):
        File "/usr/lib/python3.8/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/usr/lib/python3.8/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/usr/lib/python3.8/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
        File "/tmp/pip-build-env-5aj63q6z/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 338, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
        File "/tmp/pip-build-env-5aj63q6z/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 320, in _get_build_requires
          self.run_setup()
        File "/tmp/pip-build-env-5aj63q6z/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 335, in run_setup
          exec(code, locals())
        File "<string>", line 16, in <module>
        File "/tmp/pip-build-env-5aj63q6z/overlay/lib/python3.8/site-packages/setuptools/__init__.py", line 108, in setup
          return distutils.core.setup(**attrs)
        File "/tmp/pip-build-env-5aj63q6z/overlay/lib/python3.8/site-packages/setuptools/_distutils/core.py", line 147, in setup
          _setup_distribution = dist = klass(attrs)
        File "/tmp/pip-build-env-5aj63q6z/overlay/lib/python3.8/site-packages/setuptools/dist.py", line 488, in __init__
          _Distribution.__init__(
        File "/tmp/pip-build-env-5aj63q6z/overlay/lib/python3.8/site-packages/setuptools/_distutils/dist.py", line 283, in __init__
          self.finalize_options()
        File "/tmp/pip-build-env-5aj63q6z/overlay/lib/python3.8/site-packages/setuptools/dist.py", line 912, in finalize_options
          ep(self)
        File "/tmp/pip-build-env-5aj63q6z/overlay/lib/python3.8/site-packages/setuptools/dist.py", line 932, in _finalize_setup_keywords
          ep.load()(self, ep.name, value)
        File "/tmp/pip-build-env-5aj63q6z/overlay/lib/python3.8/site-packages/cffi/setuptools_ext.py", line 219, in cffi_modules
          add_cffi_module(dist, cffi_module)
        File "/tmp/pip-build-env-5aj63q6z/overlay/lib/python3.8/site-packages/cffi/setuptools_ext.py", line 49, in add_cffi_module
          execfile(build_file_name, mod_vars)
        File "/tmp/pip-build-env-5aj63q6z/overlay/lib/python3.8/site-packages/cffi/setuptools_ext.py", line 25, in execfile
          exec(code, glob, glob)
        File "curl_cffi/build.py", line 6, in <module>
          ffibuilder = FFI()
        File "/tmp/pip-build-env-5aj63q6z/overlay/lib/python3.8/site-packages/cffi/api.py", line 48, in __init__
          import _cffi_backend as backend
      ImportError: Error loading shared library /tmp/pip-build-env-5aj63q6z/overlay/lib/python3.8/site-packages/_cffi_backend.cpython-38-aarch64-linux-gnu.so: Operation not permitted
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

Can it be used with eventlet/gevent?

I'm using Celery and replaced the default requests library with curl_cffi.

But after starting 100 threads, requests seem to execute one after another (as if only one thread is active).

How can it be patched to work in a multithreaded setup?
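One likely explanation: gevent/eventlet monkey-patching only intercepts Python-level sockets, while curl does its I/O inside C, so greenlets cannot interleave curl transfers. Real OS threads should run concurrently, since the cffi call into curl releases the GIL during the transfer. A minimal sketch of the threaded pattern, with a placeholder `fetch` (the `time.sleep` stands in for the network wait; the commented-out curl_cffi call is what it would be in real code):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    # In real code this would be a curl_cffi call, e.g.:
    #   from curl_cffi import requests
    #   return requests.get(url, impersonate="chrome110").status_code
    # time.sleep stands in for the network wait here.
    time.sleep(0.2)
    return url

urls = [f"https://example.com/{i}" for i in range(10)]
start = time.time()
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(fetch, urls))
elapsed = time.time() - start  # waits overlap, so this is ~0.2s rather than ~2s
```

If the waits did not overlap, ten 0.2s calls would take about two seconds; the elapsed time shows they run in parallel.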

TypeError: cannot pickle 'LockType' object

Error encountered when deepcopy-ing session.cookies: TypeError: cannot pickle 'LockType' object

backup_cookies = copy.deepcopy(session.cookies)

File "/usr/lib64/python3.8/copy.py", line 172, in deepcopy
y = _reconstruct(x, memo, *rv)
File "/usr/lib64/python3.8/copy.py", line 270, in _reconstruct
state = deepcopy(state, memo)
File "/usr/lib64/python3.8/copy.py", line 146, in deepcopy
y = copier(x, memo)
File "/usr/lib64/python3.8/copy.py", line 230, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "/usr/lib64/python3.8/copy.py", line 172, in deepcopy
y = _reconstruct(x, memo, *rv)
File "/usr/lib64/python3.8/copy.py", line 270, in _reconstruct
state = deepcopy(state, memo)
File "/usr/lib64/python3.8/copy.py", line 146, in deepcopy
y = copier(x, memo)
File "/usr/lib64/python3.8/copy.py", line 230, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "/usr/lib64/python3.8/copy.py", line 172, in deepcopy
y = _reconstruct(x, memo, *rv)
File "/usr/lib64/python3.8/copy.py", line 270, in _reconstruct
state = deepcopy(state, memo)
File "/usr/lib64/python3.8/copy.py", line 146, in deepcopy
y = copier(x, memo)
File "/usr/lib64/python3.8/copy.py", line 230, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "/usr/lib64/python3.8/copy.py", line 161, in deepcopy
rv = reductor(4)
TypeError: cannot pickle 'LockType' object
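The cookie container holds a threading lock, and deepcopy falls back to pickling, which locks do not support. A workaround (a sketch, assuming the cookies object exposes name/value pairs via `items()`, as in the later issue below) is to copy the plain cookie data instead of the whole object. A stdlib stand-in demonstrating both the failure and the workaround:

```python
import copy
import threading
from http.cookiejar import CookieJar

class Cookies:
    """Minimal stand-in for curl_cffi's Cookies: a jar guarded by a lock."""
    def __init__(self):
        self.jar = CookieJar()
        self._lock = threading.Lock()  # this is what deepcopy chokes on

    def items(self):
        return [(c.name, c.value) for c in self.jar]

cookies = Cookies()

try:
    copy.deepcopy(cookies)  # raises TypeError: cannot pickle '_thread.lock' object
    deepcopy_ok = True
except TypeError:
    deepcopy_ok = False

# Copy the plain name/value data instead of the whole object:
backup_cookies = dict(cookies.items())
```

Restoring from such a backup means re-setting name/value pairs, which loses domain/path metadata; whether that matters depends on the use case.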

Am I using the async API the wrong way?

import asyncio
from curl_cffi.requests.session import AsyncSession

async def async_main():
    async with AsyncSession() as s:
        r = await s.get("https://httpbin.org/headers")
        print(r.text)

        r = await s.get("https://httpbin.org/headers", stream=True)
        async for content in r.iter_content():
            print(content)

if __name__ == '__main__':
    asyncio.run(async_main())
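One thing to check, hedged on the installed version: in recent curl_cffi releases the async streaming iterator is `aiter_content()` (often used via `async with s.stream(...)`), not the sync `iter_content()`. The iteration pattern, shown with a stub response so it runs standalone:

```python
import asyncio

class FakeResponse:
    """Stub standing in for curl_cffi's streaming response; aiter_content is
    the async counterpart of iter_content in recent versions."""
    async def aiter_content(self):
        for chunk in (b"hello ", b"world"):
            yield chunk

async def main():
    r = FakeResponse()
    chunks = []
    async for chunk in r.aiter_content():  # note: aiter_content, not iter_content
        chunks.append(chunk)
    return b"".join(chunks)

body = asyncio.run(main())
```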

Error output: (screenshots attached in the original issue)

Pyinstaller Error

Pyinstaller Error in version 0.4.0:
Traceback (most recent call last):
File "lt1.py", line 8, in <module>
File "PyInstaller\loader\pyimod03_importers.py", line 540, in exec_module
File "curl_cffi\__init__.py", line 39, in <module>
ImportError: DLL load failed while importing _wrapper:

Errors when used with Sanic

Exception in callback <bound method AsyncCurl.process_data of <curl_cffi.aio.AsyncCurl object at 0x7f1738a64310>>
handle: <TimerHandle AsyncCurl.process_data created at /home/alex/workspace/parse_server/venv/lib/python3.8/site-packages/curl_cffi/aio.py:39>
source_traceback: Object created at (most recent call last):
  File "/home/alex/pycharm-2022.3.2/plugins/python/helpers/pydev/pydevd.py", line 1496, in _exec
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "/home/alex/pycharm-2022.3.2/plugins/python/helpers/pydev/_pydev_imps/_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "/home/alex/workspace/parse_server/main.py", line 54, in <module>
    app.run(host="0.0.0.0", port=15368, debug=True, auto_reload=True, workers=1, access_log=False)
  File "/home/alex/workspace/parse_server/venv/lib/python3.8/site-packages/sanic/mixins/runner.py", line 145, in run
    self.__class__.serve(primary=self)  # type: ignore
  File "/home/alex/workspace/parse_server/venv/lib/python3.8/site-packages/sanic/mixins/runner.py", line 578, in serve
    serve_single(primary_server_info.settings)
  File "/home/alex/workspace/parse_server/venv/lib/python3.8/site-packages/sanic/server/runners.py", line 206, in serve_single
    serve(**server_settings)
  File "/home/alex/workspace/parse_server/venv/lib/python3.8/site-packages/sanic/server/runners.py", line 155, in serve
    loop.run_forever()
  File "/home/alex/workspace/parse_server/venv/lib/python3.8/site-packages/curl_cffi/aio.py", line 117, in process_data
    self.socket_action(sockfd, ev_bitmask)
  File "/home/alex/workspace/parse_server/venv/lib/python3.8/site-packages/curl_cffi/aio.py", line 113, in socket_action
    lib.curl_multi_socket_action(self._curlm, sockfd, ev_bitmask, running_handle)
  File "/home/alex/workspace/parse_server/venv/lib/python3.8/site-packages/curl_cffi/aio.py", line 39, in timer_function
    async_curl.timer = async_curl.loop.call_later(
Traceback (most recent call last):
  File "uvloop/cbhandles.pyx", line 251, in uvloop.loop.TimerHandle._run
  File "/home/alex/workspace/parse_server/venv/lib/python3.8/site-packages/curl_cffi/aio.py", line 117, in process_data
    self.socket_action(sockfd, ev_bitmask)
  File "/home/alex/workspace/parse_server/venv/lib/python3.8/site-packages/curl_cffi/aio.py", line 113, in socket_action
    lib.curl_multi_socket_action(self._curlm, sockfd, ev_bitmask, running_handle)
TypeError: initializer for ctype 'void *' must be a cdata pointer, not NoneType
(the same callback exception repeats several more times, once per AsyncCurl instance)

Memory leak Session

I ran into a memory leak when using Session.

Using tracemalloc, I traced it to line 123 in __init__.py:

func(ffi.buffer(binary_string.content, binary_string.size)[:])

I also found I could work around the issue by limiting the size of self._write_callbacks, around line 56.
I added this as a temporary measure:

if len(self._write_callbacks) > 4:
    del self._write_callbacks[0]
self._write_callbacks.append((target_func, c_value))

not sure if there is more to it though.
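The manual cap above can be expressed with `collections.deque(maxlen=...)`, which drops the oldest entry automatically on append. Note this is still a workaround, not a root-cause fix: whether discarding pending write callbacks is safe for in-flight curl transfers is a separate question.

```python
from collections import deque

# A bounded buffer: appending to a full deque silently drops the oldest item.
write_callbacks = deque(maxlen=4)
for i in range(10):
    write_callbacks.append(i)

print(list(write_callbacks))  # only the last four appends survive
```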

Add support for curl_info options, like CurlInfo.COOKIELIST

Hi!
Thanks for this project!

I found that I can't get cookies from curl.
When I call it like this:

curl.getinfo(CurlInfo.COOKIELIST)

I get an error:

c_value = ffi.new(ret_option[option & 0xF00000])
                      ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^
KeyError: 4194304

I found that this is due to the fact that the type for the value 0x400000 is not defined in ret_option:

ret_option = {
    0x100000: "char**",
    0x200000: "long*",
    0x300000: "double*",
}

Can you please check how to fix this?
Thanks.
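For context: in libcurl, CURLINFO_COOKIELIST belongs to the CURLINFO_SLIST type class (0x400000, i.e. the 4194304 in the KeyError) and yields a struct curl_slist list. A sketch of what the missing entry might look like (the exact cffi type string used by the library is an assumption here):

```python
# libcurl groups CURLINFO_* values by a type class stored in the high bits.
CURLINFO_STRING = 0x100000
CURLINFO_LONG   = 0x200000
CURLINFO_DOUBLE = 0x300000
CURLINFO_SLIST  = 0x400000

CURLINFO_COOKIELIST = CURLINFO_SLIST + 28  # from curl/curl.h

ret_option = {
    CURLINFO_STRING: "char**",
    CURLINFO_LONG: "long*",
    CURLINFO_DOUBLE: "double*",
    CURLINFO_SLIST: "struct curl_slist **",  # the missing entry (assumed spelling)
}

type_class = CURLINFO_COOKIELIST & 0xF00000
print(type_class, ret_option[type_class])
```

Masking COOKIELIST with 0xF00000 gives exactly the 4194304 seen in the KeyError above, which is why the lookup fails without the fourth entry.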

name 'CookieConflict' is not defined

{name: value for name, value in session.cookies.items()}

raise NameError: name 'CookieConflict' is not defined

def get(  # type: ignore
    self,
    name: str,
    default: typing.Optional[str] = None,
    domain: typing.Optional[str] = None,
    path: typing.Optional[str] = None,
) -> typing.Optional[str]:
    """
    Get a cookie by name. May optionally include domain and path
    in order to specify exactly which cookie to retrieve.
    """
    value = None
    for cookie in self.jar:
        if cookie.name == name:
            if domain is None or cookie.domain == domain:
                if path is None or cookie.path == path:
                    if value is not None:
                        message = f"Multiple cookies exist with name={name}"
                        raise CookieConflict(message)
                    value = cookie.value

    if value is None:
        return default
    return value
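The `get()` above raises `CookieConflict`, but the name is apparently never imported or defined in that module, hence the NameError. A minimal sketch of the missing piece (the base class choice and the simplified lookup are assumptions for illustration):

```python
class CookieConflict(Exception):
    """More than one cookie matched a get() by name (e.g. differing domains)."""

def get_cookie(cookies, name):
    """Simplified version of the get() above, over (name, value, domain) tuples."""
    value = None
    for cname, cvalue, _domain in cookies:
        if cname == name:
            if value is not None:
                raise CookieConflict(f"Multiple cookies exist with name={name}")
            value = cvalue
    return value

jar = [("sid", "abc", "a.example.com"), ("sid", "xyz", "b.example.com")]
try:
    get_cookie(jar, "sid")
    conflicted = False
except CookieConflict:
    conflicted = True
```

With two same-named cookies from different domains, a plain dict comprehension over `items()` would silently keep one of them, which is exactly the ambiguity the exception guards against.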

Headers order

Hi there i want to suggest one fix to build better impersonate.
Normal Chrome sends headers in a particular order: host; connection; user-agent; accept; accept-encoding; accept-language (that may be outdated info).
In the current build of curl_cffi, the header order looks quite different from real Chrome's, which in my opinion is a big hole in the impersonation we are trying to establish.

I'm not good enough at C to fix this myself, but if you have some free time and can look into adding that to the lib, it would be really neat of you.

Thanks in advance.

[BUG] Proxies being mismatched on AsyncSession requests

Hey again, found another weird bug. You'll need a list of proxies to reproduce (@yifeikong - I can send you some on like tg or w/e if needed), but basically it seems that after the first request on each instance of the AsyncSession, they all start to use the first instance's proxy. It's probably best explained by example, so here's some code and what it output for me:

from curl_cffi import requests
import json, asyncio
from collections import OrderedDict
from itertools import cycle

with open('proxy_list.txt', 'r') as temp_file:
	proxies = list(OrderedDict.fromkeys([line.rstrip('\n') for line in temp_file]))

formatted_proxies = []
for prox in proxies:
	splitted = prox.split(':')
	formatted_proxies.append(f"http://{splitted[2]}:{splitted[3]}@{splitted[0]}:{splitted[1]}")
proxy_pool = cycle(formatted_proxies)

async def test():
	async def do_req(sess, proxy):
		session_proxies = {"https":proxy,"http":proxy}
		r = await sess.get("https://httpbin.org/ip", proxies=session_proxies, impersonate="chrome110")
		proxy_1 = r.json()['origin']
		r = await sess.get("https://httpbin.org/ip", proxies=session_proxies, impersonate="chrome110")
		proxy_2 = r.json()['origin']
		return proxy_1, proxy_2
	
	tasks = []
	async with requests.AsyncSession() as s:
		for i in range(5):
			p = next(proxy_pool)
			tasks.append(do_req(s, p))
		results = await asyncio.gather(*tasks)
	print("RESULTS:")
	i = 0
	for res in results:
		i += 1
		print(f"#{str(i)}: 1st Request: {str(res[0])} | 2nd Request: {str(res[1])}")

policy = asyncio.WindowsSelectorEventLoopPolicy()
asyncio.set_event_loop_policy(policy)
asyncio.get_event_loop().run_until_complete(test())

Running this code produces output that looks something like:

#1: 1st Request: XXX.XXX.82.225 | 2nd Request: XXX.XXX.82.225
#2: 1st Request: XXX.XXX.92.220 | 2nd Request: XXX.XXX.82.225
#3: 1st Request: XXX.XXX.95.128 | 2nd Request: XXX.XXX.82.225
#4: 1st Request: XXX.XXX.83.56 | 2nd Request: XXX.XXX.82.225
#5: 1st Request: XXX.XXX.87.59 | 2nd Request: XXX.XXX.82.225

As you can see, the first request on each instance uses the correct proxy, but the second request always uses the first instance's proxy. Any idea what might be happening?

Python version: 3.9.6
curl_cffi version: 0.5.2 (getting the error code 60 issue that we're already discussing in another thread on 0.5.3 so I downgraded for now)
cffi version: 1.15.1

Note after some more testing:
If you create a new AsyncSession for every iteration rather than use it as a context manager, the problem doesn't occur and the second request will use the same proxy as the first (see code below)

for i in range(5):
	p = next(proxy_pool)
	tasks.append(do_req(requests.AsyncSession(), p))
results = await asyncio.gather(*tasks)

Installation of curl_cffi

Hi, I am a junior Python programmer with no git experience.
I am getting this error:
  LINK : fatal error LNK1181: cannot open input file 'curl-impersonate-chrome.lib'
  error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio\\2022\\BuildTools\\VC\\Tools\\MSVC\\14.35.32215\\bin\\HostX86\\x64\\link.exe' failed with exit code 1181

when I run pip install --upgrade curl_cffi.
Could you kindly provide a step-by-step guide on how to install curl_cffi, e.g. whether curl-impersonate needs to be installed first?

my environment is x64 windows 10, with MSVC v143 installed, running python 3.10
many thanks!

Bug: request header is 'application/x-www-form-urlencoded' but JSON is used as request body

When requests.post is given both the data and json parameters, the Content-Type header is 'application/x-www-form-urlencoded' but the JSON is used as the request body.

Here is the code:

requests.post("https://httpbin.org/post", data={"data": 1}, json={"json": 1}).json()

Here is the output:

{'args': {},
 'data': '',
 'files': {},
 'form': {'{"json": 1}': ''},
 'headers': {'Accept': '*/*',
  'Accept-Encoding': 'gzip, deflate, br',
  'Content-Length': '11',
  'Content-Type': 'application/x-www-form-urlencoded',
  'Host': 'httpbin.org',
  'X-Amzn-Trace-Id': 'Root=1-6401f15c-60b4e9a57970941f002fe1af'},
 'json': None,
 'origin': '103.116.72.5',
 'url': 'https://httpbin.org/post'}
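`data=` and `json=` imply different bodies and different Content-Type headers, so passing both is inherently ambiguous; a client should either raise or let one win consistently. What each parameter encodes, shown with the stdlib (independent of any HTTP client):

```python
import json
from urllib.parse import urlencode

form_body = urlencode({"data": 1})   # pairs with Content-Type: application/x-www-form-urlencoded
json_body = json.dumps({"json": 1})  # pairs with Content-Type: application/json

print(form_body)  # data=1
print(json_body)  # {"json": 1}
```

The bug report shows the mismatch: the form Content-Type was sent, but the JSON-serialized string ended up as the body, so httpbin parsed `{"json": 1}` as a form field name.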

Request returns an error

This is my proxy:

proxies = {'https': 'http://::15818/'}

headers = {
                        'accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7',
                        'accept-language': 'zh-CN,zh;q=0.9',
                        'cache-control': 'no-cache',
                        'pragma': 'no-cache',
                        'sec-ch-ua-mobile': '?0',
                        'sec-fetch-dest': 'document',
                        'sec-fetch-mode': 'navigate',
                        'sec-fetch-site': 'none',
                        'sec-fetch-user': '?1',
                        'upgrade-insecure-requests': '1',
                        'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36'
                    }
resp = requests.get(url,impersonate="chrome110", headers=headers, timeout=25, proxies=proxies,verify=False)
print(resp.text)  

raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
What is this problem? Are the parameters I'm sending with the request incorrect?

Can't install with poetry and pip

The latest version (0.4.0) can't be installed.

My environment:
OS: MacOS 13.2.1 (22D68)
Python: 3.9.12
Pip: 22.3.1
poetry: 1.3.1

here is the error log

pip install curl_cffi
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting curl_cffi
  Downloading https://pypi.tuna.tsinghua.edu.cn/packages/33/9f/ef07f1c1348a7e2dd76be39fb534095014684a98cc64cb696c74cfcf5344/curl_cffi-0.4.0.tar.gz (75 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 75.2/75.2 kB 523.8 kB/s eta 0:00:00
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: cffi>=1.12.0 in ./.venv/lib/python3.9/site-packages (from curl_cffi) (1.15.1)
Requirement already satisfied: pycparser in ./.venv/lib/python3.9/site-packages (from cffi>=1.12.0->curl_cffi) (2.21)
Building wheels for collected packages: curl_cffi
  Building wheel for curl_cffi (pyproject.toml) ... error
  error: subprocess-exited-with-error
  
  × Building wheel for curl_cffi (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [114 lines of output]
      /private/var/folders/67/bsl5phh57_vg4dl70j75gtjr0000gn/T/pip-build-env-umak2o_7/overlay/lib/python3.9/site-packages/setuptools/config/pyprojecttoml.py:108: _BetaConfiguration: Support for `[tool.setuptools]` in `pyproject.toml` is still *beta*.
        warnings.warn(msg, _BetaConfiguration)
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build/lib.macosx-13.0-arm64-cpython-39
      creating build/lib.macosx-13.0-arm64-cpython-39/curl_cffi
      copying curl_cffi/build.py -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi
      copying curl_cffi/__init__.py -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi
      copying curl_cffi/_const.py -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi
      creating build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/requests
      copying curl_cffi/requests/cookies.py -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/requests
      copying curl_cffi/requests/session.py -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/requests
      copying curl_cffi/requests/__init__.py -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/requests
      copying curl_cffi/requests/errors.py -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/requests
      copying curl_cffi/requests/headers.py -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/requests
      running egg_info
      writing curl_cffi.egg-info/PKG-INFO
      writing dependency_links to curl_cffi.egg-info/dependency_links.txt
      writing requirements to curl_cffi.egg-info/requires.txt
      writing top-level names to curl_cffi.egg-info/top_level.txt
      reading manifest file 'curl_cffi.egg-info/SOURCES.txt'
      reading manifest template 'MANIFEST.in'
      warning: no files found matching 'curl_cffi/cacert.pem'
      warning: no files found matching 'curl_cffi/*.dll'
      adding license file 'LICENSE'
      writing manifest file 'curl_cffi.egg-info/SOURCES.txt'
      /private/var/folders/67/bsl5phh57_vg4dl70j75gtjr0000gn/T/pip-build-env-umak2o_7/overlay/lib/python3.9/site-packages/setuptools/command/build_py.py:202: SetuptoolsDeprecationWarning:     Installing 'curl_cffi.include' as data is deprecated, please list it in `packages`.
          !!
      
      
          ############################
          # Package would be ignored #
          ############################
          Python recognizes 'curl_cffi.include' as an importable package,
          but it is not listed in the `packages` configuration of setuptools.
      
          'curl_cffi.include' has been automatically added to the distribution only
          because it may contain data files, but this behavior is likely to change
          in future versions of setuptools (and therefore is considered deprecated).
      
          Please make sure that 'curl_cffi.include' is included as a package by using
          the `packages` configuration field or the proper discovery methods
          (for example by using `find_namespace_packages(...)`/`find_namespace:`
          instead of `find_packages(...)`/`find:`).
      
          You can read more about "package discovery" and "data files" on setuptools
          documentation page.
      
      
      !!
      
        check.warn(importable)
      /private/var/folders/67/bsl5phh57_vg4dl70j75gtjr0000gn/T/pip-build-env-umak2o_7/overlay/lib/python3.9/site-packages/setuptools/command/build_py.py:202: SetuptoolsDeprecationWarning:     Installing 'curl_cffi.include.curl' as data is deprecated, please list it in `packages`.
          !!
      
      
          ############################
          # Package would be ignored #
          ############################
          Python recognizes 'curl_cffi.include.curl' as an importable package,
          but it is not listed in the `packages` configuration of setuptools.
      
          'curl_cffi.include.curl' has been automatically added to the distribution only
          because it may contain data files, but this behavior is likely to change
          in future versions of setuptools (and therefore is considered deprecated).
      
          Please make sure that 'curl_cffi.include.curl' is included as a package by using
          the `packages` configuration field or the proper discovery methods
          (for example by using `find_namespace_packages(...)`/`find_namespace:`
          instead of `find_packages(...)`/`find:`).
      
          You can read more about "package discovery" and "data files" on setuptools
          documentation page.
      
      
      !!
      
        check.warn(importable)
      copying curl_cffi/cdef.c -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi
      copying curl_cffi/shim.c -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi
      creating build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include
      copying curl_cffi/include/shim.h -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include
      creating build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include/curl
      copying curl_cffi/include/curl/Makefile.am -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include/curl
      copying curl_cffi/include/curl/curl.h -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include/curl
      copying curl_cffi/include/curl/curlver.h -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include/curl
      copying curl_cffi/include/curl/easy.h -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include/curl
      copying curl_cffi/include/curl/header.h -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include/curl
      copying curl_cffi/include/curl/mprintf.h -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include/curl
      copying curl_cffi/include/curl/multi.h -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include/curl
      copying curl_cffi/include/curl/options.h -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include/curl
      copying curl_cffi/include/curl/stdcheaders.h -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include/curl
      copying curl_cffi/include/curl/system.h -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include/curl
      copying curl_cffi/include/curl/typecheck-gcc.h -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include/curl
      copying curl_cffi/include/curl/urlapi.h -> build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/include/curl
      running build_ext
      generating cffi module 'build/temp.macosx-13.0-arm64-cpython-39/curl_cffi._wrapper.c'
      creating build/temp.macosx-13.0-arm64-cpython-39
      building 'curl_cffi._wrapper' extension
      creating build/temp.macosx-13.0-arm64-cpython-39/build
      creating build/temp.macosx-13.0-arm64-cpython-39/build/temp.macosx-13.0-arm64-cpython-39
      creating build/temp.macosx-13.0-arm64-cpython-39/curl_cffi
      clang -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -I/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include -I/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include -Icurl_cffi/include -I/Users/maoxiandaozhenhaowan/Desktop/Code/google-trends-service/.venv/include -I/Users/maoxiandaozhenhaowan/.pyenv/versions/3.9.12/include/python3.9 -c build/temp.macosx-13.0-arm64-cpython-39/curl_cffi._wrapper.c -o build/temp.macosx-13.0-arm64-cpython-39/build/temp.macosx-13.0-arm64-cpython-39/curl_cffi._wrapper.o
      clang -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -I/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include -I/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include -Icurl_cffi/include -I/Users/maoxiandaozhenhaowan/Desktop/Code/google-trends-service/.venv/include -I/Users/maoxiandaozhenhaowan/.pyenv/versions/3.9.12/include/python3.9 -c curl_cffi/shim.c -o build/temp.macosx-13.0-arm64-cpython-39/curl_cffi/shim.o
      curl_cffi/shim.c:9:16: warning: unused variable 'opt_value' [-Wunused-variable]
          CURLoption opt_value = (CURLoption) option;
                     ^
      1 warning generated.
      clang -bundle -undefined dynamic_lookup -L/opt/homebrew/opt/readline/lib -L/opt/homebrew/opt/readline/lib -L/Users/maoxiandaozhenhaowan/.pyenv/versions/3.9.12/lib -L/opt/homebrew/lib -Wl,-rpath,/opt/homebrew/lib -L/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/lib -L/opt/homebrew/opt/readline/lib -L/opt/homebrew/opt/readline/lib -L/Users/maoxiandaozhenhaowan/.pyenv/versions/3.9.12/lib -L/opt/homebrew/lib -Wl,-rpath,/opt/homebrew/lib -L/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/lib build/temp.macosx-13.0-arm64-cpython-39/build/temp.macosx-13.0-arm64-cpython-39/curl_cffi._wrapper.o build/temp.macosx-13.0-arm64-cpython-39/curl_cffi/shim.o -L/usr/local/lib -lcurl-impersonate-chrome -o build/lib.macosx-13.0-arm64-cpython-39/curl_cffi/_wrapper.abi3.so
      ld: library not found for -lcurl-impersonate-chrome
      clang: error: linker command failed with exit code 1 (use -v to see invocation)
      error: command '/usr/bin/clang' failed with exit code 1
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for curl_cffi
Failed to build curl_cffi
ERROR: Could not build wheels for curl_cffi, which is required to install pyproject.toml-based projects

[notice] A new release of pip available: 22.3.1 -> 23.0.1
[notice] To update, run: pip install --upgrade pip
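A possible resolution for the build failure above (a sketch, not verified on this machine): the linker cannot find libcurl-impersonate-chrome, which only matters when pip falls back to building curl_cffi from source. Since curl_cffi ships pre-compiled wheels, upgrading pip so it can select a matching wheel often avoids the compile step entirely:

```shell
# Upgrade pip first so it recognizes newer wheel tags, then reinstall
# curl_cffi; a pre-built wheel needs no local curl-impersonate library.
pip install --upgrade pip
pip install --upgrade curl_cffi
```

If a source build is genuinely required, the curl-impersonate library itself must be installed first so the `-lcurl-impersonate-chrome` link step can succeed.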

Request

Hello there. First of all, thank you for your amazing work! Nowadays many websites trigger captchas for normal Python requests, but this package bypasses all of that beautifully. I was searching for exactly this kind of tool. Thank you again.

What I wanted to ask about were the following things.

  1. As you mentioned in the to-do list, could you add an equivalent of requests.Session()? It would be a huge improvement and would make it easy to drop into existing projects.
  2. Could you add support for SOCKS proxies as well? I don't know whether they are already supported, but since both requests and curl support SOCKS proxies, I think it should be possible.
  3. Does this currently behave like httpx, i.e. does it send HTTP/2 requests? If not, could you please add an option to enable HTTP/2, such as http2=True, like in the httpx package?

If these things could be added, it would be very easy to adopt this package in existing projects without much work. I also don't see why anyone would keep using plain requests anymore, since it is easily detectable. This is really amazing. Thanks again for your work!

Changing sslversion does not take effect

from curl_cffi import requests, CurlOpt

_client = requests.Session(
    impersonate="chrome110",
)
_client.curl.setopt(CurlOpt.SSLVERSION, 6)

Setting the SSL version to TLS 1.2 this way does not take effect.
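For reference, the magic number 6 in the snippet above maps to libcurl's CURL_SSLVERSION_TLSv1_2 constant (values as defined in curl/curl.h). One plausible explanation for the setting having no effect, though this is an assumption worth verifying, is that the impersonation profile re-applies its own TLS options on each request, overwriting a manually set SSLVERSION. A small sketch of the constant mapping:

```python
# CURL_SSLVERSION_* values as defined in curl/curl.h
CURL_SSLVERSION = {
    0: "DEFAULT",   # library picks the version
    1: "TLSv1",     # TLS 1.x, exact version chosen by the library
    2: "SSLv2",
    3: "SSLv3",
    4: "TLSv1_0",
    5: "TLSv1_1",
    6: "TLSv1_2",
    7: "TLSv1_3",
}

print(CURL_SSLVERSION[6])  # -> TLSv1_2
```

So `setopt(CurlOpt.SSLVERSION, 6)` is indeed requesting TLS 1.2; the question is whether the option survives until the handshake.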
