kuimivm's People

Contributors

d3-3109 · kiliokuara

kuimivm's Issues

How do I upgrade the supported protocol version?

I have been using 8.9.58 for a long time and quite a few accounts still work on it, but new accounts are almost all unusable. If I wanted to add support for a newer version, what would I need to change?

Problems integrating with fix-protocol-version

I'm not sure whether the integration actually succeeded, nor whether I'm doing it the right way.

Versions:
net.mamoe:mirai-core:2.15.0
fix-protocol-version-1.9.6

I run the Docker container in one screen session; after running the command it printed a long string of letters and digits (the container ID), so I assume the deployment succeeded.

My bot runs in another screen session and is written in Kotlin. The login code looks like this:

    FixProtocolVersion.fetch(BotConfiguration.MiraiProtocol.ANDROID_PAD, "8.9.58")
    bot = BotFactory.newBot(Config.qq, Config.password) {
        protocol = BotConfiguration.MiraiProtocol.ANDROID_PAD
        fileBasedDeviceInfo()
    }
    bot.login()

KFCFactory.json:

{
    "8.9.58": {
        "base_url": "http://127.0.0.1:8888",
        "type": "kiliokuara/magic-signer-guide",
        "serverIdentityKey": "vivo50",
        "authorizationKey": "kfc"
    }
}
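
Given this configuration, one sanity check worth doing before starting the bot is to confirm that the signer behind base_url is actually reachable from the host the bot runs on. A minimal diagnostic sketch (standard tools only; the container-id placeholder is mine, and no specific HTTP paths of the signer are assumed, so even an HTTP error response here still proves the port accepts connections):

curl -v http://127.0.0.1:8888/          # "connection refused" means nothing is listening on 8888
docker ps                               # is the signer container still up?
docker logs --tail 50 <container-id>    # check for crashes or restart loops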

The port and the two keys match what I passed to the Docker command. After starting the bot it hangs for about a minute and then prints the following error:

2023-07-17 16:51:23 W/Net 3368816838: Exception in resumeConnection.
NettyChannelException(message=Failed to connect msfwifi.3g.qq.com/<unresolved>:8080, cause=java.net.UnknownHostException: msfwifi.3g.qq.com: Temporary failure in name resolution)
        at net.mamoe.mirai.internal.network.impl.netty.NettyNetworkHandler.createConnection$suspendImpl(NettyNetworkHandler.kt:116)
        at net.mamoe.mirai.internal.network.impl.netty.NettyNetworkHandler$createConnection$1.invokeSuspend(NettyNetworkHandler.kt)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
        at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:104)
        at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:570)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:677)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:664)
Caused by: java.net.UnknownHostException: msfwifi.3g.qq.com: Temporary failure in name resolution
        at java.base/java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
        at java.base/java.net.InetAddress$PlatformNameService.lookupAllHostAddr(InetAddress.java:932)
        at java.base/java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1517)
        at java.base/java.net.InetAddress$NameServiceAddresses.get(InetAddress.java:851)
        at java.base/java.net.InetAddress.getAllByName0(InetAddress.java:1507)
        at java.base/java.net.InetAddress.getAllByName(InetAddress.java:1366)
        at java.base/java.net.InetAddress.getAllByName(InetAddress.java:1300)
        at java.base/java.net.InetAddress.getByName(InetAddress.java:1250)
        at io.netty.util.internal.SocketUtils$8.run(SocketUtils.java:156)
        at io.netty.util.internal.SocketUtils$8.run(SocketUtils.java:153)
        at java.base/java.security.AccessController.doPrivileged(AccessController.java:554)
        at io.netty.util.internal.SocketUtils.addressByName(SocketUtils.java:153)
        at io.netty.resolver.DefaultNameResolver.doResolve(DefaultNameResolver.java:41)
        at io.netty.resolver.SimpleNameResolver.resolve(SimpleNameResolver.java:61)
        at io.netty.resolver.SimpleNameResolver.resolve(SimpleNameResolver.java:53)
        at io.netty.resolver.InetSocketAddressResolver.doResolve(InetSocketAddressResolver.java:55)
        at io.netty.resolver.InetSocketAddressResolver.doResolve(InetSocketAddressResolver.java:31)
        at io.netty.resolver.AbstractAddressResolver.resolve(AbstractAddressResolver.java:106)
        at io.netty.bootstrap.Bootstrap.doResolveAndConnect0(Bootstrap.java:206)
        at io.netty.bootstrap.Bootstrap.access$000(Bootstrap.java:46)
        at io.netty.bootstrap.Bootstrap$1.operationComplete(Bootstrap.java:180)
        at io.netty.bootstrap.Bootstrap$1.operationComplete(Bootstrap.java:166)
        at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:590)
        at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:557)
        at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:492)
        at io.netty.util.concurrent.DefaultPromise.setValue0(DefaultPromise.java:636)
        at io.netty.util.concurrent.DefaultPromise.setSuccess0(DefaultPromise.java:625)
        at io.netty.util.concurrent.DefaultPromise.trySuccess(DefaultPromise.java:105)
        at io.netty.channel.DefaultChannelPromise.trySuccess(DefaultChannelPromise.java:84)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.safeSetSuccess(AbstractChannel.java:990)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.register0(AbstractChannel.java:516)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.access$200(AbstractChannel.java:429)
        at io.netty.channel.AbstractChannel$AbstractUnsafe$1.run(AbstractChannel.java:486)
        at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174)
        at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:569)
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
        at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base/java.lang.Thread.run(Thread.java:831)

2023-07-17 16:51:23 W/Net 3368816838: Network selector received exception, closing bot. (NettyChannelException(message=Failed to connect msfwifi.3g.qq.com/<unresolved>:8080, cause=java.net.UnknownHostException: msfwifi.3g.qq.com: Temporary failure in name resolution))
Exception in thread "main" java.net.UnknownHostException: msfwifi.3g.qq.com: Temporary failure in name resolution
        at java.base/java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
        at java.base/java.net.InetAddress$PlatformNameService.lookupAllHostAddr(InetAddress.java:932)
        at java.base/java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1517)
        at java.base/java.net.InetAddress$NameServiceAddresses.get(InetAddress.java:851)
        at java.base/java.net.InetAddress.getAllByName0(InetAddress.java:1507)
        at java.base/java.net.InetAddress.getAllByName(InetAddress.java:1366)
        at java.base/java.net.InetAddress.getAllByName(InetAddress.java:1300)
        at java.base/java.net.InetAddress.getByName(InetAddress.java:1250)
        at io.netty.util.internal.SocketUtils$8.run(SocketUtils.java:156)
        at io.netty.util.internal.SocketUtils$8.run(SocketUtils.java:153)
        at java.base/java.security.AccessController.doPrivileged(AccessController.java:554)
        at io.netty.util.internal.SocketUtils.addressByName(SocketUtils.java:153)
        at io.netty.resolver.DefaultNameResolver.doResolve(DefaultNameResolver.java:41)
        at io.netty.resolver.SimpleNameResolver.resolve(SimpleNameResolver.java:61)
        at io.netty.resolver.SimpleNameResolver.resolve(SimpleNameResolver.java:53)
        at io.netty.resolver.InetSocketAddressResolver.doResolve(InetSocketAddressResolver.java:55)
        at io.netty.resolver.InetSocketAddressResolver.doResolve(InetSocketAddressResolver.java:31)
        at io.netty.resolver.AbstractAddressResolver.resolve(AbstractAddressResolver.java:106)
        at io.netty.bootstrap.Bootstrap.doResolveAndConnect0(Bootstrap.java:206)
        at io.netty.bootstrap.Bootstrap.access$000(Bootstrap.java:46)
        at io.netty.bootstrap.Bootstrap$1.operationComplete(Bootstrap.java:180)
        at io.netty.bootstrap.Bootstrap$1.operationComplete(Bootstrap.java:166)
        at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:590)
        at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:557)
        at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:492)
        at io.netty.util.concurrent.DefaultPromise.setValue0(DefaultPromise.java:636)
        at io.netty.util.concurrent.DefaultPromise.setSuccess0(DefaultPromise.java:625)
        at io.netty.util.concurrent.DefaultPromise.trySuccess(DefaultPromise.java:105)
        at io.netty.channel.DefaultChannelPromise.trySuccess(DefaultChannelPromise.java:84)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.safeSetSuccess(AbstractChannel.java:990)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.register0(AbstractChannel.java:516)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.access$200(AbstractChannel.java:429)
        at io.netty.channel.AbstractChannel$AbstractUnsafe$1.run(AbstractChannel.java:486)
        at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174)
        at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:569)
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
        at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base/java.lang.Thread.run(Thread.java:831)
        Suppressed: NettyChannelException(message=Failed to connect msfwifi.3g.qq.com/<unresolved>:8080, cause=java.net.UnknownHostException: msfwifi.3g.qq.com: Temporary failure in name resolution)
                at net.mamoe.mirai.internal.network.impl.netty.NettyNetworkHandler.createConnection$suspendImpl(NettyNetworkHandler.kt:116)
                at net.mamoe.mirai.internal.network.impl.netty.NettyNetworkHandler$createConnection$1.invokeSuspend(NettyNetworkHandler.kt)
                at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
                at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:104)
                at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:570)
                at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
                at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:677)
                at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:664)
        Caused by: [CIRCULAR REFERENCE: java.net.UnknownHostException: msfwifi.3g.qq.com: Temporary failure in name resolution]

On later runs it no longer hangs; it immediately fails with code=45, exactly as if the plugin were not being used at all.
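
The failure in the trace above is a DNS error on the bot's host (UnknownHostException for msfwifi.3g.qq.com), which occurs before the signer is involved at all. A quick way to check whether the host can resolve that name (a diagnostic sketch; the hostname is copied straight from the log):

getent hosts msfwifi.3g.qq.com    # should print one or more IP addresses
nslookup msfwifi.3g.qq.com        # alternative if getent is unavailable
cat /etc/resolv.conf              # which DNS servers is the host actually using?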

"The QQ version you are currently using is too old. Please go to the official QQ website im.qq.com, download the latest QQ, and try again."

mirai log:

2023-09-13 16:08:19 I/KFCFactory: ANDROID_PHONE(8.9.58) server type: kiliokuara/magic-signer-guide, file:///home/mira/KFCFactory.json
2023-09-13 16:08:19 I/KFCFactory: magic-signer-guide by http://127.0.0.1:8888 about 
{"main_page":"https://github.com/kiliokuara/magic-signer-guide/issues","server_version":"70107d8e1b4acb747026c95a853523b575e7d98f","server_build_time":1690883369515,"supported_protocol_versions":["8.9.58"]}
2023-09-13 16:08:22 I/ViVo50: Bot(483392198) initialize by http://127.0.0.1:8888
2023-09-13 16:08:22 I/ViVo50: Session(bot=483392198) opened
2023-09-13 16:09:22 W/ViVo50: Session(bot=483392198) rpc.initialize timeout 60000ms
java.util.concurrent.TimeoutException
        at java.base/java.util.concurrent.CompletableFuture.timedGet(Unknown Source)
        at java.base/java.util.concurrent.CompletableFuture.get(Unknown Source)
        at fix-protocol-version-1.9.11.mirai2.jar//xyz.cssxsh.mirai.tool.ViVo50$Session.sendCommand(ViVo50.kt:427)
        at fix-protocol-version-1.9.11.mirai2.jar//xyz.cssxsh.mirai.tool.ViVo50.initialize(ViVo50.kt:106)
        at net.mamoe.mirai.internal.network.components.EcdhInitialPublicKeyUpdaterImpl.initializeSsoSecureEcdh(EcdhInitialPublicKeyUpdater.kt:123)
        at net.mamoe.mirai.internal.network.components.SsoProcessorImpl.login(SsoProcessor.kt:224)
        at net.mamoe.mirai.internal.network.components.SsoProcessorImpl$login$1.invokeSuspend(SsoProcessor.kt)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.resumeRootWith(SuspendFunctionGun.kt:138)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:112)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.access$loop(SuspendFunctionGun.kt:14)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun$continuation$1.resumeWith(SuspendFunctionGun.kt:62)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.resumeRootWith(SuspendFunctionGun.kt:138)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:112)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.access$loop(SuspendFunctionGun.kt:14)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun$continuation$1.resumeWith(SuspendFunctionGun.kt:62)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.resumeRootWith(SuspendFunctionGun.kt:138)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:112)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.access$loop(SuspendFunctionGun.kt:14)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun$continuation$1.resumeWith(SuspendFunctionGun.kt:62)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.resumeRootWith(SuspendFunctionGun.kt:138)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:112)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.access$loop(SuspendFunctionGun.kt:14)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun$continuation$1.resumeWith(SuspendFunctionGun.kt:62)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.resumeRootWith(SuspendFunctionGun.kt:138)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:112)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.access$loop(SuspendFunctionGun.kt:14)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun$continuation$1.resumeWith(SuspendFunctionGun.kt:62)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
        at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
        at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:570)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:677)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:664)

2023-09-13 16:09:22 I/ViVo50: Bot(483392198) initialize complete
Login failed: BotAuthorization(BotAuthorization.byPassword(<ERASED>)) threw an exception during authorization process. See cause below.
2023-09-13 16:10:03 E/console: net.mamoe.mirai.network.BotAuthorizationException: BotAuthorization(BotAuthorization.byPassword(<ERASED>)) threw an exception during authorization process. See cause below.
net.mamoe.mirai.network.BotAuthorizationException: BotAuthorization(BotAuthorization.byPassword(<ERASED>)) threw an exception during authorization process. See cause below.
        at net.mamoe.mirai.internal.network.components.SsoProcessorImpl.login(SsoProcessor.kt:263)
        at net.mamoe.mirai.internal.network.handler.CommonNetworkHandler$StateConnecting$startState$2.invokeSuspend(CommonNetworkHandler.kt:247)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
        at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
        at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:570)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:677)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:664)
        Suppressed: net.mamoe.mirai.network.WrongPasswordException: Error(bot=Bot(483392198), code=45, title=禁止登录, message=你当前使用的QQ版本过低,请前往QQ官网im.qq.com下载最新版QQ后重试。, errorInfo=)
                at net.mamoe.mirai.internal.network.components.SsoProcessorImpl$SlowLoginImpl.doLogin(SsoProcessor.kt:490)
                at net.mamoe.mirai.internal.network.components.SsoProcessorImpl$SlowLoginImpl$doLogin$1.invokeSuspend(SsoProcessor.kt)
                at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
                at kotlinx.coroutines.internal.ScopeCoroutine.afterResume(Scopes.kt:33)
                at kotlinx.coroutines.AbstractCoroutine.resumeWith(AbstractCoroutine.kt:102)
                at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
                ... 5 more
Caused by: [CIRCULAR REFERENCE: net.mamoe.mirai.network.WrongPasswordException: Error(bot=Bot(483392198), code=45, title=禁止登录, message=你当前使用的QQ版本过低,请前往QQ官网im.qq.com下载最新版QQ后重试。, errorInfo=)]

2023-09-13 16:10:03 I/Bot.483392198: Bot cancelled: Bot closed

Docker log:

[Native IO       ] read: /dev/__properties__, oflags: 557056
[Native IO       ] read: /proc/stat, oflags: 524288
fekit base: 0x7cf00000
OOL: JNI_OnLoad
RSP: 65542
======================= [init fekit encrypt service] ========================
======================= [init fekit encrypt service end] ========================
2023-09-13 08:08:24 [INFO ] [Vivo45#1] a - starting vm service of bot 483392198, local debug = false
======================= [spi initialize] ========================
![DTC: MMKV CALL] mmKVValue call: o3_switch_Xwid
[FEKitLog INFO   ] [FEKit_] 1 device_token_entry.h:86 initUin 0
![DTC: MMKV CALL] mmKVValue call: kO3WhiteCmdListKey
[Native IO       ] read: /dev/urandom, oflags: 524288
[Native IO       ] read: /dev/urandom, oflags: 524288
[Native IO       ] read: /dev/urandom, oflags: 0
[Native IO       ] read: /data/app/com.tencent.mobileqq/base.apk, oflags: 0
[Native IO       ] read: /data/app/com.tencent.mobileqq/base.apk, oflags: 0
[Native IO       ] read: /dev/urandom, oflags: 0
[FEKitLog ERROR  ] [FEKit_] 1 o3_channel_encrypt.h:275 gen new channel
[FEKitLog ERROR  ] [FEKit_] 1 o3_channel_encrypt.h:491 est check: 154c619900aa3a7f5b837e5fa0f9b8ab989994ba23f8ae5efaacefd0d83b8a4080102a4e70548cd46f563faf402c94ee
[FEKitLog ERROR  ] [FEKit_] 1 ChannelManager.cpp:72 o3cm@S: GetSecConf, trpc.o3.ecdh_access.EcdhAccess.SsoEstablishShareKey
[!!! ChannelProxy] sendMessage: {trpc.o3.ecdh_access.EcdhAccess.SsoEstablishShareKey}[0] 
![DTC: MMKV CALL] mmKVSaveValue call: key=O3_1bad5c33edb3fed0, value=0
[Native IO       ] read: /data/app/com.tencent.mobileqq/base.apk!/lib/arm64-v8a/libfekit.so, oflags: 0
2023-09-13 08:09:22 [INFO ] [vert.x-eventloop-thread-0] RpcServerBootstrap - [ROUTER] client request to check session state 0bdbd07c-6ac6-4a61-89ce-a7ed53be94ed.lP5vCJOqZOHwOZi6q65oLaQtsV5p/TbKdDv3YUZZFSNXAWJpYFDGsXBtGPJbxghJx8cS8BVY8B5f1v+9NzCk/rS5EAYoIJw1RE6wuedhjH59Wf5QKAiDnY9Mzk9B1d9jl0LoO0znEIRECXszbGT9MF50lNWGjqB7E7KMuXMqebL/2kV6W++QJA+VvBWiWOB290VQ2Sf5MZnQSxLhiFbGKNjO5qdM+bjZSVADnvn1Bg+0c41SKP4022lS1x82aD1zLWVNAMCzLIDj+Xa52RC95jp8hX80IbLGFP1NaAt24D9KnbYjmD70JwmSX4D1KTNTVo014pPA/OiYO2Zc+4lS4VCQImF4/D7THJLBvQrPv0vZeLOniHHRuyanhnlP/QaLDjcjRdDnk5VW8ztUtMj/8AI3+rKXjipDjUNMpOguEX0CZ8vUK0fx2nXP5OqnY5JYHmK1AwqxyOxtVrNEr6T9Oq23QSenPJKkjM41sujXh1ORc/x8PGLis0UApB/m2Nj9QW63af2eH6h4kK5KYPR9Pk2n7QP7R4+hYQ2auNZodfIH0tlqe1gpv1g5kUugXhOrdMUNHr6ubUl/V+PWf8euofHnTaD/vF38wi3KNc8AADMSygGoJ0ArK7yLwkpMnsyjf7vobw0qE0ncsS3s0M5xaDrAV3GX5ZoLH0o8O9ow8Lc=
2023-09-13 08:09:22 [INFO ] [vert.x-eventloop-thread-0] a - [WEBSOCKET] receiving packet from 172.17.0.1:34984: {"packetId":"42e523ff-2d3c-484c-aa6f-de8f1cdd7895","packetType":"rpc.get_cmd_white_list"}
2023-09-13 08:09:22 [INFO ] [Vivo45#5] a - [WEBSOCKET] respond packet to 172.17.0.1:34984: {"packetId":"42e523ff-2d3c-484c-aa6f-de8f1cdd7895","packetType":"rpc.get_cmd_white_list","response":["OidbSvcTrpcTcp.0x55f_0","OidbSvcTrpcTcp.0x1100_1","qidianservice.269","OidbSvc.0x4ff_9_IMCore","MsgProxy.SendMsg","SQQzoneSvc.shuoshuo","OidbSvc.0x758_1","QChannelSvr.trpc.qchannel.commwriter.ComWriter.DoReply","trpc.login.ecdh.EcdhService.SsoNTLoginPasswordLoginUnusualDevice","wtlogin.device_lock","OidbSvc.0x758_0","wtlogin_device.tran_sim_emp","OidbSvc.0x4ff_9","trpc.springfestival.redpacket.LuckyBag.SsoSubmitGrade","FeedCloudSvr.trpc.feedcloud.commwriter.ComWriter.DoReply","trpc.o3.report.Report.SsoReport","SQQzoneSvc.addReply","OidbSvc.0x8a1_7","QChannelSvr.trpc.qchannel.commwriter.ComWriter.DoComment","OidbSvcTrpcTcp.0xf67_1","friendlist.ModifyGroupInfoReq","OidbSvcTrpcTcp.0xf65_1","OidbSvcTrpcTcp.0xf65_10 ","OidbSvcTrpcTcp.0xf67_5","OidbSvc.0x56c_6","OidbSvc.0x8ba","SQQzoneSvc.like","OidbSvcTrpcTcp.0xf88_1","OidbSvc.0x8a1_0","wtlogin.name2uin","SQQzoneSvc.addComment","wtlogin.login","trpc.o3.ecdh_access.EcdhAccess.SsoSecureA2Access","OidbSvcTrpcTcp.0x101e_2","qidianservice.135","FeedCloudSvr.trpc.feedcloud.commwriter.ComWriter.DoComment","FeedCloudSvr.trpc.feedcloud.commwriter.ComWriter.DoBarrage","-1","OidbSvcTrpcTcp.0x101e_1","OidbSvc.0x89a_0","friendlist.addFriend","ProfileService.GroupMngReq","OidbSvc.oidb_0x758","MessageSvc.PbSendMsg","FeedCloudSvr.trpc.feedcloud.commwriter.ComWriter.DoLike","OidbSvc.0x758","trpc.o3.ecdh_access.EcdhAccess.SsoSecureA2Establish","FeedCloudSvr.trpc.feedcloud.commwriter.ComWriter.DoPush","qidianservice.290","trpc.qlive.relationchain_svr.RelationchainSvr.Follow","trpc.o3.ecdh_access.EcdhAccess.SsoSecureAccess","FeedCloudSvr.trpc.feedcloud.commwriter.ComWriter.DoFollow","SQQzoneSvc.forward","ConnAuthSvr.sdk_auth_api","wtlogin.qrlogin","wtlogin.register","OidbSvcTrpcTcp.0x6d9_4","trpc.passwd.manager.PasswdManager.SetPasswd","friendlist.AddFriendReq","qidianservice.207","ProfileService.getGroupInfoReq","OidbSvcTrpcTcp.0x1107_1","OidbSvcTrpcTcp.0x1105_1","SQQzoneSvc.publishmood","wtlogin.exchange_emp","OidbSvc.0x88d_0","wtlogin_device.login","OidbSvcTrpcTcp.0xfa5_1","trpc.qqhb.qqhb_proxy.Handler.sso_handle","OidbSvcTrpcTcp.0xf89_1","OidbSvc.0x9fa","FeedCloudSvr.trpc.feedcloud.commwriter.ComWriter.PublishFeed","QChannelSvr.trpc.qchannel.commwriter.ComWriter.PublishFeed","OidbSvcTrpcTcp.0xf57_106","ConnAuthSvr.sdk_auth_api_emp","OidbSvcTrpcTcp.0xf6e_1","trpc.qlive.word_svr.WordSvr.NewPublicChat","trpc.passwd.manager.PasswdManager.VerifyPasswd","trpc.group_pro.msgproxy.sendmsg","OidbSvc.0x89b_1","OidbSvcTrpcTcp.0xf57_9","FeedCloudSvr.trpc.videocircle.circleprofile.CircleProfile.SetProfile","OidbSvc.0x6d9_4","OidbSvcTrpcTcp.0xf55_1","ConnAuthSvr.fast_qq_login","OidbSvcTrpcTcp.0xf57_1","trpc.o3.ecdh_access.EcdhAccess.SsoEstablishShareKey","wtlogin.trans_emp","StatSvc.register"]}
2023-09-13 08:09:22 [INFO ] [vert.x-eventloop-thread-0] RpcServerBootstrap - [ROUTER] client request to check session state 0bdbd07c-6ac6-4a61-89ce-a7ed53be94ed.lP5vCJOqZOHwOZi6q65oLaQtsV5p/TbKdDv3YUZZFSNXAWJpYFDGsXBtGPJbxghJx8cS8BVY8B5f1v+9NzCk/rS5EAYoIJw1RE6wuedhjH59Wf5QKAiDnY9Mzk9B1d9jl0LoO0znEIRECXszbGT9MF50lNWGjqB7E7KMuXMqebL/2kV6W++QJA+VvBWiWOB290VQ2Sf5MZnQSxLhiFbGKNjO5qdM+bjZSVADnvn1Bg+0c41SKP4022lS1x82aD1zLWVNAMCzLIDj+Xa52RC95jp8hX80IbLGFP1NaAt24D9KnbYjmD70JwmSX4D1KTNTVo014pPA/OiYO2Zc+4lS4VCQImF4/D7THJLBvQrPv0vZeLOniHHRuyanhnlP/QaLDjcjRdDnk5VW8ztUtMj/8AI3+rKXjipDjUNMpOguEX0CZ8vUK0fx2nXP5OqnY5JYHmK1AwqxyOxtVrNEr6T9Oq23QSenPJKkjM41sujXh1ORc/x8PGLis0UApB/m2Nj9QW63af2eH6h4kK5KYPR9Pk2n7QP7R4+hYQ2auNZodfIH0tlqe1gpv1g5kUugXhOrdMUNHr6ubUl/V+PWf8euofHnTaD/vF38wi3KNc8AADMSygGoJ0ArK7yLwkpMnsyjf7vobw0qE0ncsS3s0M5xaDrAV3GX5ZoLH0o8O9ow8Lc=
2023-09-13 08:09:22 [INFO ] [vert.x-eventloop-thread-0] a - [WEBSOCKET] receiving packet from 172.17.0.1:34984: {"packetId":"99e75e53-7467-44e9-a7c6-6b9965efc41b","packetType":"rpc.tlv","tlvType":1348,"extArgs":{"KEY_COMMAND_STR":"810_9"},"content":"000000000010F2B23A72C52B086752B310CF3CFBD29E000A362E302E302E323534350000000900000000"}
[!!! ChannelProxy] sendMessage: {trpc.o3.report.Report.SsoReport}[-1] 
[!!! ChannelProxy] sendMessage: {trpc.o3.report.Report.SsoReport}[-1] 
======================= [spi initialize end] ========================
2023-09-13 08:10:02 [INFO ] [pool-2-thread-1] a - [WEBSOCKET] sending command trpc.o3.ecdh_access.EcdhAccess.SsoEstablishShareKey with seq server-1d6a564f-1f13-4690-a251-e4a7abb7f823-53a56fc4-b132-4383-b671-caea95e3f18c of bot 0 to 172.17.0.1:34984 by channel proxy: 0a0a476574536563436f6e661221028ef981f8a510375083901ffdb115369c4a3b55e05dd3874a86fd7c80068ec8bd22423833633536323337366539323963626161353837383730666262373264386337663363666539373731646631343830386562333061613931373932336664643563652a49bff528c38731611ceb96b24803b2418701afe74363139e08b31f34d1250dd9b975d65dddfc4c5cc3a29efb2b339402a39301dfec642fbed7b2c5549067f318ad77ec40cb6da0bccdbf32205ff06b9541c950229957006899137dc707a2e9bc46ff61d256230314031eb2873a30154c619900aa3a7f5b837e5fa0f9b8ab989994ba23f8ae5efaacefd0d83b8a4080102a4e70548cd46f563faf402c94ee
2023-09-13 08:10:02 [INFO ] [Vivo45#7] a - [WEBSOCKET] respond packet to 172.17.0.1:34984: {"packetId":"99e75e53-7467-44e9-a7c6-6b9965efc41b","packetType":"rpc.tlv","response":"0c0711fc57ba9215762c5f416b392c2c085e46e57aacaadea5a18c000000004042454600000000"}
2023-09-13 08:10:02 [INFO ] [pool-2-thread-1] a - [WEBSOCKET] sending command trpc.o3.report.Report.SsoReport with seq server-c99a1e11-c28b-4d59-81f6-08b8ab619172-9f39f83d-e2fd-45a4-976f-a7ec509bcce1 of bot 0 to 172.17.0.1:34984 by channel proxy: 0a0b30646630303037313634361284010a1b56315f414e445f53515f382e392e35385f343130365f5959425f440a07362e322e3232310a0b7369676e5f7265706f72740a04686f73740a01310a2066326232336137326335326230383637353262333130636633636662643239650a24326334373966333365613066653066613435336231643361313030303131363137333062
2023-09-13 08:10:02 [INFO ] [pool-2-thread-1] a - [WEBSOCKET] sending command trpc.o3.report.Report.SsoReport with seq server-6f02adc2-3d50-4df6-bd40-da90903b017e-f3ecdf71-81d8-40c8-b3a5-10e9c9d18f96 of bot 0 to 172.17.0.1:34984 by channel proxy: 0a0b306466303030373136343612c6020a1b56315f414e445f53515f382e392e35385f343130365f5959425f440a07362e322e3232310a0b7665726966795f66696c650a01310a422f646174612f6170702f636f6d2e74656e63656e742e6d6f62696c6571712f626173652e61706b212f6c69622f61726d36342d7638612f6c696266656b69742e736f0a40356239343431633764386339396166613136363730353265326132316632653030306237383464613830623461643431653361353033666433306361643730380a40356239343431633764386339396166613136363730353265326132316632653030306237383464613830623461643431653361353033666433306361643730380a2066326232336137326335326230383637353262333130636633636662643239650a24326334373966333365613066653066613435336231643361313030303131363137333062
2023-09-13 08:10:02 [INFO ] [vert.x-eventloop-thread-0] RpcServerBootstrap - [ROUTER] client request to check session state 0bdbd07c-6ac6-4a61-89ce-a7ed53be94ed.lP5vCJOqZOHwOZi6q65oLaQtsV5p/TbKdDv3YUZZFSNXAWJpYFDGsXBtGPJbxghJx8cS8BVY8B5f1v+9NzCk/rS5EAYoIJw1RE6wuedhjH59Wf5QKAiDnY9Mzk9B1d9jl0LoO0znEIRECXszbGT9MF50lNWGjqB7E7KMuXMqebL/2kV6W++QJA+VvBWiWOB290VQ2Sf5MZnQSxLhiFbGKNjO5qdM+bjZSVADnvn1Bg+0c41SKP4022lS1x82aD1zLWVNAMCzLIDj+Xa52RC95jp8hX80IbLGFP1NaAt24D9KnbYjmD70JwmSX4D1KTNTVo014pPA/OiYO2Zc+4lS4VCQImF4/D7THJLBvQrPv0vZeLOniHHRuyanhnlP/QaLDjcjRdDnk5VW8ztUtMj/8AI3+rKXjipDjUNMpOguEX0CZ8vUK0fx2nXP5OqnY5JYHmK1AwqxyOxtVrNEr6T9Oq23QSenPJKkjM41sujXh1ORc/x8PGLis0UApB/m2Nj9QW63af2eH6h4kK5KYPR9Pk2n7QP7R4+hYQ2auNZodfIH0tlqe1gpv1g5kUugXhOrdMUNHr6ubUl/V+PWf8euofHnTaD/vF38wi3KNc8AADMSygGoJ0ArK7yLwkpMnsyjf7vobw0qE0ncsS3s0M5xaDrAV3GX5ZoLH0o8O9ow8Lc=
2023-09-13 08:10:02 [INFO ] [vert.x-eventloop-thread-0] a - [WEBSOCKET] receiving packet from 172.17.0.1:34984: {"packetId":"76a1c1d0-584c-425a-9ab2-b3bced25aee7","packetType":"rpc.sign","seqId":41837,"command":"wtlogin.login","extArgs":{},"content":"02050E1F41081000011CCFFAC6030700000000020000000000000000020123628F3AECAB5B36AF0968743FF6108A01310002004104745A00BDF96C55C9D0A98C3C001A2151A602D96F2F097D398D43B6C7CBAE702D6F6762CB159D92DF4AE24A93CADC4F366BA5A935AB79EA701095B2D53AE183CDE9AFA11742EF2DC657F261C67C7116F00B3CE65AA3B95BC0E889B6F6B9F75DCFC7136C994D6EB61FE30BDD47DDCD4DEEC613395CC81593DEC6A72F1EDA42B04B56192DCBBC6FC26137BDD5C76D0DC9533007C5776DA8F3F2DBDDFBB4BEA5E6F47CEDBBCD8D049C35E4FABEDEE999BEBFD695B5602BEF3D3B373EB23658BE8D387106C1BB90E7D289F4FC858E6514C81D636CFBF9E6500BAC27A58176F822ADBBA3C1072FA87B00F49359FDE9C6B6236D9F4DF223ADDB9B1EFF0F0382BB993B458160164A9B234C9E99A22AAE0394FF1B75EC0DDFA512D1F2219F9A4BABCBECEE791F57B1DBD917CD80D0B5B8F2F172E021D661B6130FF8AFCA5E864E91BEFC4E21BB2C14A3E9385750087A19AC005A1CBFA2A333077A75F67430735515CC924F3CA1F9C225A0F700D3C3403CE3FFB244D50B4A49FDD2F987F9BEBBDB16E8DD48147198799EA41BBFC1F45AC52B65F1B3CD3FA94700B7C0013226FE08991EA55541DC62B1C0DB8BF3F657690B526FB82B5EDE2AE722CCE55F89EB290B957418441D7E64872DCEA40DB356FE0BDFC92F210DD220059DF7D159DCFAB8B761D82509719CE3DB9C9451487EED3FDA01AC44635CA247C1411FD6FC058971F4B01E3852CF326BDB67B4756960E8B73A63625067C2975BD6C120099DDE6153B9DAD39413EC9076AAF0E2677E5F3E031E55758576621A328D11DA1C6D78B42375F0BC3CB311ADF479420812FCE32BB0344C79B945B2126169A503244F455E557AF84765717F1DF9201EEFBA3EE086494641B3D0AB3C7F4E30A4144E887E6A6BBC3D1774F5C79B25B0F0041CF010F11B39CB534F1C08C2AED38516E861DC3AB464AF8CC02D782A8C8DD97090FA0EDD121B3C24D302FAAA6E54B317E2CC089B897AEBD8F88CABABE44A33C22F72E3B05DDC929DEDD19CE1F27BE59DC1FE00E1843EF3D284453B38DA9CC7F5EB1B5763BAAA093C1C21FA5BB0CF141EC741A3488668F199350D9D8E9B913B7699FF76FED96E4629E4DA99E2BB3DF94804031B11A9E8843D1DCA6FEBB01D5E62580E3D1CFC866FAD9C920A0C80CD8FA50160A26096700A638083F781BA23E93D70D6D4B3025A1B7B2E824C05028B1C49803F6FE27B06AF990BC4BBB3F8EEDD784C329B6670EE31C9093C900BF45D7DDD47356D32F8D7FBC384E5B1712B14267207FFAEB9E000054A26529AA3C6AF56D8EF7D52BBFDB5BBD45844C2AEC05A35F5216B6E072458254707C774F560A25FF85FCCA1E6AC9BEFA1BBDABEB24BC7C653CF340206A7D0E9D198BA810BCA93CFB402B3A047DB1599EF5157809D60BBD678164C22D461441013026E8078B0AE19413A68C66C3860B1BB2F435E0FCA54C0F346BB0A09F385368D7F26F558EB87E2A714947199F3F90C3B791867C3168786DA51F8E0B5F702A3259A75485998701EA65CCD478ACE3FEAA8067B2C25DB691E3FFC969AECEB10ADB8E8C5DDD91B79C50C605F9962AC21583FE7D1ECE74F95E0F4076EE7D934CB64FBFE6D8EAD0C37FEC473165345EF895CFD36ED7FF6302D224A6AC3061429CB4153C29F9607E4371C650E7E4AC1167FE636CB3478A40C41175EB700409581499E9E90EAC4D892A611174E44E96379C0599D8799A3AF89A39F2F1A88402AD25A1815B371C82BBAFF8FF84DF003"}
[FEKitLog INFO   ] [FEKit_] 1 device_token.h:323 getXwId but switch is close
[!!! ChannelProxy] sendMessage: {trpc.o3.report.Report.SsoReport}[-1] 
[FEKitLog INFO   ] [FEKit_] 1 qq_sign.h:132 [GetSign] cmd:wtlogin.login
SignResult[
  extra = byte[29] 12 1B 56 31 5F 41 4E 44 5F 53 51 5F 38 2E 39 2E 35 38 5F 34 31 30 36 5F 59 59 42 5F 44
  sign  = byte[39] 0C 07 49 B6 83 F2 36 7E BD 7E B5 AA 4C 47 0E F3 72 26 46 E0 C2 63 57 F7 58 A1 92 00 00 00 00 70 6A 71 72 00 00 00 00
  token = byte[0] 
]
2023-09-13 08:10:03 [INFO ] [pool-2-thread-1] a - [WEBSOCKET] sending command trpc.o3.report.Report.SsoReport with seq server-e0e08770-35a3-4fa2-a520-97e1604e7b16-ab5c31d7-3361-403e-9db6-91a81e6bfbeb of bot 0 to 172.17.0.1:34984 by channel proxy: 0a0b3064663030303731363436127a0a1b56315f414e445f53515f382e392e35385f343130365f5959425f440a07362e322e3232310a0a656d707479546f6b656e0a2066326232336137326335326230383637353262333130636633636662643239650a24326334373966333365613066653066613435336231643361313030303131363137333062
2023-09-13 08:10:03 [INFO ] [Vivo45#2] a - [WEBSOCKET] respond packet to 172.17.0.1:34984: {"packetId":"76a1c1d0-584c-425a-9ab2-b3bced25aee7","packetType":"rpc.sign","response":{"sign":"0c0749b683f2367ebd7eb5aa4c470ef3722646e0c26357f758a19200000000706a717200000000","extra":"121b56315f414e445f53515f382e392e35385f343130365f5959425f44","token":""}}
2023-09-13 08:10:03 [INFO ] [vert.x-eventloop-thread-0] RpcServerBootstrap - [ROUTER] client request to invalidate session 0bdbd07c-6ac6-4a61-89ce-a7ed53be94ed.lP5vCJOqZOHwOZi6q65oLaQtsV5p/TbKdDv3YUZZFSNXAWJpYFDGsXBtGPJbxghJx8cS8BVY8B5f1v+9NzCk/rS5EAYoIJw1RE6wuedhjH59Wf5QKAiDnY9Mzk9B1d9jl0LoO0znEIRECXszbGT9MF50lNWGjqB7E7KMuXMqebL/2kV6W++QJA+VvBWiWOB290VQ2Sf5MZnQSxLhiFbGKNjO5qdM+bjZSVADnvn1Bg+0c41SKP4022lS1x82aD1zLWVNAMCzLIDj+Xa52RC95jp8hX80IbLGFP1NaAt24D9KnbYjmD70JwmSX4D1KTNTVo014pPA/OiYO2Zc+4lS4VCQImF4/D7THJLBvQrPv0vZeLOniHHRuyanhnlP/QaLDjcjRdDnk5VW8ztUtMj/8AI3+rKXjipDjUNMpOguEX0CZ8vUK0fx2nXP5OqnY5JYHmK1AwqxyOxtVrNEr6T9Oq23QSenPJKkjM41sujXh1ORc/x8PGLis0UApB/m2Nj9QW63af2eH6h4kK5KYPR9Pk2n7QP7R4+hYQ2auNZodfIH0tlqe1gpv1g5kUugXhOrdMUNHr6ubUl/V+PWf8euofHnTaD/vF38wi3KNc8AADMSygGoJ0ArK7yLwkpMnsyjf7vobw0qE0ncsS3s0M5xaDrAV3GX5ZoLH0o8O9ow8Lc=
2023-09-13 08:10:03 [INFO ] [Vivo45#3] a - session of bot 483392198 is invalidated.
2023-09-13 08:10:03 [WARN ] [vert.x-eventloop-thread-2] a - [WEBSOCKET] error receiving command result with seq server-1d6a564f-1f13-4690-a251-e4a7abb7f823-53a56fc4-b132-4383-b671-caea95e3f18c.
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
2023-09-13 08:10:03 [WARN ] [vert.x-eventloop-thread-2] a - [WEBSOCKET] error receiving command result with seq server-6f02adc2-3d50-4df6-bd40-da90903b017e-f3ecdf71-81d8-40c8-b3a5-10e9c9d18f96.
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
2023-09-13 08:10:03 [WARN ] [vert.x-eventloop-thread-2] a - [WEBSOCKET] error receiving command result with seq server-c99a1e11-c28b-4d59-81f6-08b8ab619172-9f39f83d-e2fd-45a4-976f-a7ec509bcce1.
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
2023-09-13 08:10:03 [WARN ] [vert.x-eventloop-thread-2] a - [WEBSOCKET] error receiving command result with seq server-e0e08770-35a3-4fa2-a520-97e1604e7b16-ab5c31d7-3361-403e-9db6-91a81e6bfbeb.
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
2023-09-13 08:13:35 [INFO ] [vert.x-eventloop-thread-0] RpcServerBootstrap - [ROUTER] receiving get about page

The protocol in use is 8.9.58 ANDROID_PHONE.
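
In the mirai log above, rpc.initialize times out after 60000 ms, while the Docker log shows the VM service still working through its spi initialize phase from 08:08:24 until roughly 08:10:02, well past that window. One thing worth checking is whether the container is starved for CPU or memory during that phase (a diagnostic sketch; the container-id placeholder is mine):

docker stats --no-stream                 # CPU and memory usage of the running containers
docker logs --since 10m <container-id>   # correlate the signer's progress with the bot's 60 s timeout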

After running for a while, the bot can only receive messages and can no longer send any

Cannot find exception handler from coroutineContext.
Please extend SimpleListenerHost.handleException or provide a CoroutineExceptionHandler to the constructor of SimpleListenerHost
        at net.mamoe.mirai.event.SimpleListenerHost.handleException(JvmMethodListeners.kt:192)
        at net.mamoe.mirai.event.SimpleListenerHost$special$$inlined$CoroutineExceptionHandler$1.handleException(CoroutineExceptionHandler.kt:111)
        at net.mamoe.mirai.internal.event.SafeListener.onEvent(SafeListener.kt:75)
        at net.mamoe.mirai.internal.event.SafeListener$onEvent$1.invokeSuspend(SafeListener.kt)
        ... 9 more
Caused by: net.mamoe.mirai.event.ExceptionInEventHandlerException: Exception in EventHandler
        at net.mamoe.mirai.internal.event.JvmMethodListenersInternalKt.registerEventHandler$callMethod$invokeWithErrorReport(JvmMethodListenersInternal.kt:147)
        at net.mamoe.mirai.internal.event.JvmMethodListenersInternalKt.access$registerEventHandler$callMethod$invokeWithErrorReport(JvmMethodListenersInternal.kt:1)
        at net.mamoe.mirai.internal.event.JvmMethodListenersInternalKt$registerEventHandler$callMethod$2.invokeSuspend(JvmMethodListenersInternal.kt:154)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
        at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
        at kotlinx.coroutines.internal.LimitedDispatcher.run(LimitedDispatcher.kt:42)
        at kotlinx.coroutines.scheduling.TaskImpl.run(Tasks.kt:95)
        ... 4 more
Caused by: java.lang.reflect.InvocationTargetException
        at jdk.internal.reflect.GeneratedMethodAccessor34.invoke(Unknown Source)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at net.mamoe.mirai.internal.event.JvmMethodListenersInternalKt.registerEventHandler$callMethod$invokeWithErrorReport(JvmMethodListenersInternal.kt:140)
        ... 10 more
Caused by: java.util.concurrent.ExecutionException: java.net.ConnectException: Connection refused: localhost/0:0:0:0:0:0:0:1:9999
        at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
        at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1999)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//org.asynchttpclient.netty.NettyResponseFuture.get(NettyResponseFuture.java:201)
        at fix-protocol-version-1.9.5.mirai2.jar//xyz.cssxsh.mirai.tool.ViVo50$Session.check(ViVo50.kt:326)
        at fix-protocol-version-1.9.5.mirai2.jar//xyz.cssxsh.mirai.tool.ViVo50$Session.websocket(ViVo50.kt:368)
        at fix-protocol-version-1.9.5.mirai2.jar//xyz.cssxsh.mirai.tool.ViVo50$Session.sendPacket(ViVo50.kt:380)
        at fix-protocol-version-1.9.5.mirai2.jar//xyz.cssxsh.mirai.tool.ViVo50$Session.sendCommand(ViVo50.kt:393)
        at fix-protocol-version-1.9.5.mirai2.jar//xyz.cssxsh.mirai.tool.ViVo50.qSecurityGetSign(ViVo50.kt:207)
        at net.mamoe.mirai.internal.network.protocol.packet.OutgoingPacketKt.buildRawUniPacket(OutgoingPacket.kt:139)
        at net.mamoe.mirai.internal.network.protocol.packet.chat.receive.MessageSvcPbSendMsg.createToGroupImpl$mirai_core(MessageSvc.PbSendMsg.kt:744)
        at net.mamoe.mirai.internal.network.protocol.packet.chat.receive.MessageSvc_PbSendMsgKt.createToGroup(MessageSvc.PbSendMsg.kt:585)
        at net.mamoe.mirai.internal.message.protocol.outgoing.GroupMessageProtocolStrategy.createPacketsForGeneralMessage$suspendImpl(MessageProtocolStrategy.kt:150)
        at net.mamoe.mirai.internal.message.protocol.outgoing.GroupMessageProtocolStrategy.createPacketsForGeneralMessage(MessageProtocolStrategy.kt)
        at net.mamoe.mirai.internal.message.protocol.outgoing.GroupMessageProtocolStrategy.createPacketsForGeneralMessage(MessageProtocolStrategy.kt:139)
        at net.mamoe.mirai.internal.message.protocol.impl.GeneralMessageSenderProtocol$GeneralMessageSender.process(GeneralMessageSenderProtocol.kt:66)
        at net.mamoe.mirai.internal.message.protocol.outgoing.OutgoingMessageProcessorAdapter.process(OutgoingMessagePipelineProcessor.kt:26)
        at net.mamoe.mirai.internal.message.protocol.outgoing.OutgoingMessageProcessorAdapter.process(OutgoingMessagePipelineProcessor.kt:20)
        at net.mamoe.mirai.internal.pipeline.AbstractProcessorPipeline.process$suspendImpl(ProcessorPipeline.kt:287)
        at net.mamoe.mirai.internal.pipeline.AbstractProcessorPipeline.process(ProcessorPipeline.kt)
        at net.mamoe.mirai.internal.message.protocol.MessageProtocolFacadeImpl.preprocessAndSendOutgoingImpl(MessageProtocolFacade.kt:361)
        at net.mamoe.mirai.internal.message.protocol.MessageProtocolFacadeImpl.preprocessAndSendOutgoing(MessageProtocolFacade.kt:345)
        at net.mamoe.mirai.internal.message.protocol.MessageProtocolFacade$INSTANCE.preprocessAndSendOutgoing(MessageProtocolFacade.kt)
        at net.mamoe.mirai.internal.contact.AbstractUserKt.sendMessageImpl(AbstractUser.kt:263)
        at net.mamoe.mirai.internal.contact.CommonGroupImpl.sendMessage$suspendImpl(GroupImpl.kt:221)
        at net.mamoe.mirai.internal.contact.CommonGroupImpl.sendMessage(GroupImpl.kt)
        at net.mamoe.mirai.contact.Group.sendMessage$suspendImpl(Group.kt:208)
        at net.mamoe.mirai.contact.Group.sendMessage(Group.kt)
        at net.mamoe.mirai.contact.Group$sendMessage$3.invoke(Group.kt)
        at net.mamoe.mirai.contact.Group$sendMessage$3.invoke(Group.kt)
        at kotlin.coroutines.intrinsics.IntrinsicsKt__IntrinsicsJvmKt$createCoroutineUnintercepted$$inlined$createCoroutineFromSuspendFunction$IntrinsicsKt__IntrinsicsJvmKt$1.invokeSuspend(IntrinsicsJvm.kt:205)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
        at kotlin.coroutines.ContinuationKt.startCoroutine(Continuation.kt:115)
        at me.him188.kotlin.jvm.blocking.bridge.internal.RunSuspendKt.$runSuspend$(RunSuspend.kt:18)
        at net.mamoe.mirai.contact.Group.sendMessage(Group.kt)
        at shitboy-0.1.10-test6.mirai2.jar//net.lawaxi.ListenerYLG.sendXenonRecallMessage(ListenerYLG.java:112)
        at shitboy-0.1.10-test6.mirai2.jar//net.lawaxi.ListenerYLG.onGroupRecall(ListenerYLG.java:90)
        ... 14 more
Caused by: java.net.ConnectException: Connection refused: localhost/0:0:0:0:0:0:0:1:9999
        at fix-protocol-version-1.9.5.mirai2.jar[private]//org.asynchttpclient.netty.channel.NettyConnectListener.onFailure(NettyConnectListener.java:179)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//org.asynchttpclient.netty.channel.NettyChannelConnector$1.onFailure(NettyChannelConnector.java:108)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//org.asynchttpclient.netty.SimpleChannelFutureListener.operationComplete(SimpleChannelFutureListener.java:28)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//org.asynchttpclient.netty.SimpleChannelFutureListener.operationComplete(SimpleChannelFutureListener.java:20)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:578)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:571)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:550)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:491)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.DefaultPromise.setValue0(DefaultPromise.java:616)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.DefaultPromise.setFailure0(DefaultPromise.java:609)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:117)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.fulfillConnectPromise(AbstractNioChannel.java:321)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:337)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:707)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:655)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:581)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: localhost/0:0:0:0:0:0:0:1:9999
Caused by: java.net.ConnectException: Connection refused
        at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:777)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:330)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:334)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:707)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:655)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:581)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base/java.lang.Thread.run(Thread.java:829)
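
The root cause at the bottom of this trace is a refused connection to localhost:9999: the fix-protocol-version 1.9.5 plugin is trying to reach a signing service on port 9999 and nothing is accepting connections there anymore. A quick check of what is (or is not) listening, and of the signer container itself (a diagnostic sketch; the container name vivo50 is taken from the deployment commands in the other issues):

ss -ltn | grep 9999                 # is anything listening on port 9999 on this host?
docker ps --filter name=vivo50      # is the signer container still running?
docker logs --tail 100 vivo50       # did it crash or get restarted?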

new protocol support

Requesting support for a newer protocol; some accounts can no longer use the 8.9.58 protocol.

Does this project not support i686?

Running Docker with the following parameters:
docker run -d --restart=always -e SERVER_IDENTITY_KEY=vivo50 -e AUTH_KEY=kfc -e PORT=8888 -p 8888:8888 --log-opt mode=non-blocking --log-opt max-buffer-size=4m -v /home/vivo50/serverData:/app/serverData -v /home/vivo50/testbot:/app/testbot kiliokuara/vivo50
Compared with the command given in the README, --rm is missing because of: #1
(attached image: 心情复杂.jpg)
lscpu output: (screenshot attached)
What confuses me is that, in principle, this CPU should be able to run it, right?
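
One way to check whether the published image provides a build for this machine's architecture is to compare the host architecture with the image's platform (a diagnostic sketch using standard Docker commands; the image name is the one from the run command above):

uname -m    # i686 means 32-bit x86
docker image inspect --format '{{.Os}}/{{.Architecture}}' kiliokuara/vivo50:latest

For what it's worth, the Docker log in an earlier issue shows the service loading an arm64-v8a libfekit.so inside its VM, so a 64-bit host is probably assumed, but that is an inference rather than a documented requirement.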

Abnormal memory usage with the Docker deployment

Environment: Debian 11.1

Log:

2023-07-31 15:36:57 [INFO ] [main] RpcServerBootstrap - unpacking resources.
2023-07-31 15:36:57 [INFO ] [main] RpcServerBootstrap - downloading mobile qq apk
2023-07-31 15:36:57 [INFO ] [main] n9e30d450041548ce8a3c88502172b430 - checking sha1 of apk serverData/android-8.9.58.apk
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
	at java.base/java.nio.file.Files.read(Files.java:3239)
	at java.base/java.nio.file.Files.readAllBytes(Files.java:3296)
	at kfc.n55600a3e0bd040beb5650998c3f54f46.n9e30d450041548ce8a3c88502172b430.c(Unknown Source)
	at kfc.n55600a3e0bd040beb5650998c3f54f46.n9e30d450041548ce8a3c88502172b430.b(Unknown Source)
	at tencentlibfekit.vmservice.rpc.RpcServerBootstrap.main(Unknown Source)
(the same block repeats verbatim as the container is restarted)

Removing the --memory 200M line lets it run normally, but then memory usage is much higher and my small server can't handle it.
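
The OOM happens while the bootstrap reads the whole android-8.9.58.apk into the heap for the SHA-1 check (Files.readAllBytes in the trace), so a 200 MB limit simply leaves the JVM too little room. A middle ground between no limit at all and 200M is to keep the limit but raise it; the sketch below reuses the original command and only changes the cap (512M is a guess to tune, not a documented minimum):

docker run -d --restart=always \
  -e SERVER_IDENTITY_KEY=vivo50 \
  -e AUTH_KEY=kfc \
  -e PORT=8888 \
  -p 8888:8888 \
  --log-opt mode=non-blocking --log-opt max-buffer-size=4m \
  -v /home/vivo50/serverData:/app/serverData \
  -v /home/vivo50/testbot:/app/testbot \
  --name vivo50 \
  --memory 512M \
  kiliokuara/vivo50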

Docker container size problem

After the Docker container has been running for a long time, it becomes abnormally large.
Entering the container with

docker exec -it vivo50 /bin/bash

shows a number of files named core.<number>. Can these files be deleted? Is it safe to remove the older ones while the container is running, to free up space?
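
The core.<number> files are core dumps written when a process inside the container crashes; they are only post-mortem debugging artifacts, so removing old ones should be safe, although the maintainers can confirm. A sketch of two options, assuming the dumps sit under the container's /app working directory (adjust the path to wherever you actually see them):

# remove core dumps older than three days inside the running container
docker exec vivo50 find /app -maxdepth 2 -name 'core.*' -mtime +3 -delete

# or stop new core dumps from being written at all by adding this flag
# to the original docker run command:
#   --ulimit core=0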

The Docker deployment section of the README seems to have a problem

Original content:

docker run --rm -d --restart=always \
  -e SERVER_IDENTITY_KEY=vivo50 \
  -e AUTH_KEY=kfc \
  -e PORT=8888 \
  -p 8888:8888 \
  --log-opt mode=non-blocking --log-opt max-buffer-size=4m \
  -v /home/vivo50/serverData:/app/serverData \
  -v /home/vivo50/testbot:/app/testbot \
  kiliokuara/vivo50

Running it produces:
docker: Conflicting options: --restart and --rm. See 'docker run --help'.
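
Docker rejects this combination because --rm deletes the container as soon as it exits, while --restart=always asks the daemon to bring it back, which is contradictory. One way to resolve it, matching what the i686 issue above did, is to keep --restart and drop --rm (a sketch of the README command with only that flag removed):

docker run -d --restart=always \
  -e SERVER_IDENTITY_KEY=vivo50 \
  -e AUTH_KEY=kfc \
  -e PORT=8888 \
  -p 8888:8888 \
  --log-opt mode=non-blocking --log-opt max-buffer-size=4m \
  -v /home/vivo50/serverData:/app/serverData \
  -v /home/vivo50/testbot:/app/testbot \
  kiliokuara/vivo50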

Startup error after deploying with Docker

Steps to reproduce:
On CentOS 7, run the following commands:
$ docker pull kiliokuara/vivo50:latest
$ docker run -d --restart=always \
    -e SERVER_IDENTITY_KEY=vivo50 \
    -e AUTH_KEY=kfc \
    -e PORT=8888 \
    -p 8888:8888 \
    --log-opt mode=non-blocking --log-opt max-buffer-size=4m \
    -v /home/vivo50/serverData:/app/serverData \
    -v /home/vivo50/testbot:/app/testbot \
    --name vivo50 \
    --memory 200M \
    kiliokuara/vivo50

Then, when checking the container state with docker ps, vivo50 stays in the restarting state and loops forever.
Below is the output of docker logs vivo50, which contains the error:
2023-07-30 03:38:22 [INFO ] [main] RpcServerBootstrap - unpacking resources.
2023-07-30 03:38:22 [INFO ] [main] n1cbc826620954c6491c6d608dc024736 - extracting linuxfile/ls
Exception in thread "main" java.lang.RuntimeException: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
	at kfc.ne6e75658fda8432484ce61cae30527b9.n1cbc826620954c6491c6d608dc024736.a(Unknown Source)
	at tencentlibfekit.vmservice.rpc.RpcServerBootstrap.main(Unknown Source)
Caused by: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
	at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:94)
	at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
	at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
	at java.base/sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:218)
	at java.base/java.nio.file.spi.FileSystemProvider.newOutputStream(FileSystemProvider.java:484)
	at java.base/java.nio.file.Files.newOutputStream(Files.java:228)
	at java.base/java.nio.file.Files.copy(Files.java:3161)
	... 2 more
(the same INFO lines and exception repeat at 03:38:23, 03:38:24, 03:38:26, 03:38:27, 03:38:30, 03:38:34 and 03:38:42 as Docker keeps restarting the container)
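
The bootstrap dies while copying linuxfile/ls into the mounted serverData volume because a previous, probably interrupted, run already left that file behind, and the copy evidently does not overwrite existing files; every automatic restart then hits the same FileAlreadyExistsException. A possible workaround sketch, assuming the stale extraction is the only problem (the log shows resources being unpacked on every start, so they should be recreated, but move rather than delete so you can restore them):

docker stop vivo50
mv /home/vivo50/serverData/resources /home/vivo50/serverData/resources.bak   # host-side path of the -v mount
docker start vivo50
docker logs -f vivo50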

Exception: java.lang.ArrayIndexOutOfBoundsException

Could someone take a look at this when you get a chance?
Environment: fix-protocol-version 1.9.11, client JDK 8, magic-signer-guide: docker:latest, QQ version: 8.9.58
Symptom: when the client starts using the signing service (with the fix-protocol-version plugin), the signer's log reports an error, and afterwards the client shows the following (screenshot attached).

Server log:
(attached screenshot: gui-error-log)
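
Since both pieces of evidence here are screenshots, attaching the plain-text logs would probably make this easier to diagnose. A small sketch for capturing them (the container name vivo50 is assumed from the other issues; adjust to your own):

docker logs --since 1h vivo50 > signer.log 2>&1   # signer-side log as text
# plus the mirai console's own log file from the bot's working directory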

Error saying the Docker repository does not exist

~/mirai-dice-release-noextra# docker login
Authenticating with existing credentials...
WARNING! Your password will be stored unencrypted in /root/.docker/config.json.
Configure a credential helper to remove this warning. See
https://docs.docker.com/engine/reference/commandline/login/#credentials-store

Login Succeeded
~/mirai-dice-release-noextra# docker pull kiliokuara/vivo50:latest
Error response from daemon: pull access denied for kiliokuara/vivo50, repository does not exist or may require 'docker login': denied: requested access to the resource is denied
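
Other issues in this thread pull kiliokuara/vivo50:latest successfully, so the repository does exist; Docker returns this same "pull access denied ... may require 'docker login'" message when the stored credentials are stale or the logged-in account is rejected. A troubleshooting sketch, not a confirmed fix:

docker logout                         # drop the stored Docker Hub credentials
docker pull kiliokuara/vivo50:latest  # retry the pull anonymously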
