The browser can access the site just fine, but the same request sent with the requests module fails.
How does the backend detect this? Why does the browser get a result while the requests module does not?
1. TLS Fingerprint
Today almost all platforms communicate over HTTPS. Whatever tool you use to send an HTTPS request, the two ends must first establish a secure channel over TLS/SSL (the handshake) before any data is transferred.
During the TLS handshake, the client sends a Client Hello packet to the server. The JA3 fingerprint is derived from this packet: a string built from the client's TLS version, cipher suites, extensions and related parameters, which is then hashed (see the sketch after the two captures below). This value does not change when you modify request headers or route the request through a proxy, so a site's backend can read the client's JA3 fingerprint and deny access when it does not look like a normal browser. For example:
A request sent with requests==2.31.0 and urllib3==2.0.7 always produces:
[JA3 Fullstring: 771,4866-4867-4865-49196-49200-159-52393-52392-52394-49195-49199-158-49188-49192-107-49187-49191-103-49162-49172-57-49161-49171-51-157-156-61-60-53-47-255,0-11-10-16-22-23-49-13-43-45-51-21,29-23-30-25-24,0-1-2]
[JA3: bc29aa426fc99c0be1b9be941869f88a]
Because this value is fixed, the backend API can simply block the fingerprint.
A request sent by a browser:
[JA3 Fullstring: 771,4865-4866-4867-49195-49199-49196-49200-52393-52392-49171-49172-156-157-47-53,11-18-27-35-16-51-10-23-5-43-65281-65037-0-17513-13-45,29-23-24,0]
[JA3: 5ae2fe79293ec63d585f3f987cf69d01]
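To make the relationship between the two values concrete: the JA3 hash is simply the MD5 of the JA3 Fullstring (TLS version, cipher suites, extensions, elliptic curves and point formats, joined by commas). A small sketch, using the requests capture shown above:

import hashlib

# JA3 Fullstring captured above for requests==2.31.0 / urllib3==2.0.7
ja3_fullstring = "771,4866-4867-4865-49196-49200-159-52393-52392-52394-49195-49199-158-49188-49192-107-49187-49191-103-49162-49172-57-49161-49171-51-157-156-61-60-53-47-255,0-11-10-16-22-23-49-13-43-45-51-21,29-23-30-25-24,0-1-2"

# The JA3 hash is the MD5 of the Fullstring; if the capture above is verbatim,
# this prints bc29aa426fc99c0be1b9be941869f88a.
print(hashlib.md5(ja3_fullstring.encode()).hexdigest())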
Note: some sites maintain dedicated JA3 blacklists, for example:
https://sslbl.abuse.ch/ja3-fingerprints/
https://sslbl.abuse.ch/blacklist/sslblacklist.csv
https://github.com/salesforce/ja3/blob/master/lists/osx-nix-ja3.csv
https://ja3er.com/getAllUasJson
https://ja3er.com/getAllHashesJson
1.1 Viewing your fingerprint
To quickly check your own TLS fingerprint, request https://tls.browserleaks.com/json:
import requests
res = requests.get('https://tls.browserleaks.com/json')
print(res.text)
Other sites that report your fingerprint:
https://tls.browserleaks.com/json
https://tls.peet.ws/
https://tls.peet.ws/api/all
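If you only need the hash, you can read it straight out of the JSON response. A small sketch; the field names ja3_hash and ja3_text are assumptions based on the current browserleaks payload, so inspect the raw JSON if the keys differ:

import requests

res = requests.get("https://tls.browserleaks.com/json")
data = res.json()

# Assumed field names; print res.text to see the full payload.
print(data.get("ja3_hash"))
print(data.get("ja3_text"))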
2. Bypassing the fingerprint check
2.1 Custom ciphers with requests/urllib3
When requests sends a request, it relies on urllib3 under the hood.
pip install urllib3==1.26.15
pip install urllib3==1.26.16
pip install urllib3==2.0.7
(Note: urllib3 2.x no longer reads urllib3.util.ssl_.DEFAULT_CIPHERS, so the module-level override below only takes effect on a 1.26.x install; the HTTPAdapter variant further down works either way.)
Customize the ciphers to generate a non-default JA3:
import requests
import urllib3

# Overriding urllib3's default cipher string changes the cipher suites offered
# in the Client Hello, and therefore the resulting JA3.
urllib3.util.ssl_.DEFAULT_CIPHERS = ":".join([
    # "ECDHE+AESGCM",
    # "ECDHE+CHACHA20",
    # "DHE+AESGCM",
    # "DHE+CHACHA20",
    # "ECDH+AESGCM",
    # "DH+AESGCM",
    # "ECDH+AES",
    "DH+AES",
    "RSA+AESGCM",
    "RSA+AES",
    "!aNULL",
    "!eNULL",
    "!MD5",
    "!DSS",
])

res = requests.get(
    url="https://tls.browserleaks.com/json",
    headers={
        'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.93 Safari/537.36',
    }
)
res.encoding = 'utf-8'
print(res.text)
A second approach is to mount a custom HTTPAdapter, so that only the session it is mounted on uses the modified cipher list:

import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.util.ssl_ import create_urllib3_context


class MineAdapter(HTTPAdapter):
    # Cipher string used to build the SSL context for this adapter.
    CIPHERS = ":".join(
        [
            "ECDHE+AESGCM",
            "ECDHE+CHACHA20",
            "DHE+AESGCM",
            "DHE+CHACHA20",
            "ECDH+AESGCM",
            "DH+AESGCM",
            "ECDH+AES",
            "DH+AES",
            "RSA+AESGCM",
            "RSA+AES",
            "!aNULL",
            "!eNULL",
            "!MD5",
            "!DSS",
        ]
    )

    def init_poolmanager(self, *args, **kwargs):
        context = create_urllib3_context(ciphers=self.CIPHERS)
        kwargs['ssl_context'] = context
        return super().init_poolmanager(*args, **kwargs)

    def proxy_manager_for(self, *args, **kwargs):
        context = create_urllib3_context(ciphers=self.CIPHERS)
        kwargs['ssl_context'] = context
        return super().proxy_manager_for(*args, **kwargs)


session = requests.Session()
session.headers.update({
    'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.93 Safari/537.36',
})
session.mount("https://", MineAdapter())

res = session.get("https://tls.browserleaks.com/json")
res.encoding = 'utf-8'
print(res.text)
Verification: send the same modified-cipher request to a protected endpoint:
import requests
import urllib3

urllib3.util.ssl_.DEFAULT_CIPHERS = ":".join([
    # "ECDHE+AESGCM",
    # "ECDHE+CHACHA20",
    # "DHE+AESGCM",
    # "DHE+CHACHA20",
    # "ECDH+AESGCM",
    # "DH+AESGCM",
    # "ECDH+AES",
    "DH+AES",
    "RSA+AESGCM",
    "RSA+AES",
    "!aNULL",
    "!eNULL",
    "!MD5",
    "!DSS",
])

res = requests.get(
    url="https://match.yuanrenxue.cn/api/match/19?page=1",
    headers={
        'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.93 Safari/537.36',
    }
)
res.encoding = 'utf-8'
print(res.text)
3. curl_cffi
curl is a command-line tool for sending network requests.
curl-impersonate is a tool built on top of curl that can faithfully mimic the TLS fingerprints of mainstream browsers.
curl_cffi wraps curl-impersonate so that it can be used conveniently from Python (the package is published on PyPI as curl-cffi).
pip install curl-cffi
from curl_cffi import requests

res = requests.get(
    # url="https://ascii2d.net/",
    # url="https://cn.investing.com/equities/amazon-com-inc-historical-data",
    url="https://match.yuanrenxue.cn/api/match/19?page=1",
    headers={
        'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.93 Safari/537.36',
    },
    impersonate="chrome101",
)
print(res.text)
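For repeated requests it is usually cleaner to reuse a session object. A minimal sketch, assuming curl_cffi's Session accepts impersonate on each request the same way the top-level get() above does:

from curl_cffi import requests

# Reuse one session across requests; impersonate is passed per request here,
# mirroring the top-level get() call above.
session = requests.Session()
res = session.get("https://tls.browserleaks.com/json", impersonate="chrome101")
# The reported JA3 should now look like a Chrome fingerprint rather than a Python one.
print(res.text)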