Compare commits

...

60 Commits
1.9.0 ... main

Author SHA1 Message Date
6f3f6bda1d 🔖 1.12.1 2026-03-04 18:58:30 +08:00
pre-commit-ci[bot]
367ead2ff5 ⬆️ auto update by pre-commit hooks (#600)
Some checks failed
Code Coverage / Test (macos-latest, 3.10) (push) Has been cancelled
Code Coverage / Test (macos-latest, 3.11) (push) Has been cancelled
Code Coverage / Test (macos-latest, 3.12) (push) Has been cancelled
Code Coverage / Test (macos-latest, 3.13) (push) Has been cancelled
Code Coverage / Test (macos-latest, 3.14) (push) Has been cancelled
Code Coverage / Test (ubuntu-latest, 3.10) (push) Has been cancelled
Code Coverage / Test (ubuntu-latest, 3.11) (push) Has been cancelled
Code Coverage / Test (ubuntu-latest, 3.12) (push) Has been cancelled
Code Coverage / Test (ubuntu-latest, 3.13) (push) Has been cancelled
Code Coverage / Test (ubuntu-latest, 3.14) (push) Has been cancelled
Code Coverage / Test (windows-latest, 3.10) (push) Has been cancelled
Code Coverage / Test (windows-latest, 3.11) (push) Has been cancelled
Code Coverage / Test (windows-latest, 3.12) (push) Has been cancelled
Code Coverage / Test (windows-latest, 3.13) (push) Has been cancelled
Code Coverage / Test (windows-latest, 3.14) (push) Has been cancelled
TypeCheck / TypeCheck (push) Has been cancelled
CodeQL / Analyze (python) (push) Has been cancelled
Code Coverage / check (push) Has been cancelled
updates:
- [github.com/astral-sh/ruff-pre-commit: v0.15.2 → v0.15.4](https://github.com/astral-sh/ruff-pre-commit/compare/v0.15.2...v0.15.4)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2026-03-04 03:46:39 +08:00
renovate[bot]
7cfe6593f2 ⬆️ Lock file maintenance (#599)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-04 03:46:14 +08:00
67058a1492 🐛 Fix type error in the bind template 2026-03-04 03:18:15 +08:00
c4ab6badbc Add oldusernames field to TETR.IO UserInfo 2026-03-04 02:58:57 +08:00
pre-commit-ci[bot]
52bfd30ec5 ⬆️ auto update by pre-commit hooks (#598)
updates:
- [github.com/astral-sh/ruff-pre-commit: v0.15.1 → v0.15.2](https://github.com/astral-sh/ruff-pre-commit/compare/v0.15.1...v0.15.2)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2026-02-24 23:26:39 +08:00
renovate[bot]
7eaf551dd0 ⬆️ Lock file maintenance (#595)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-02-24 01:31:34 +08:00
99e475b75b 🔖 1.12.0 2026-02-23 01:06:59 +08:00
呵呵です
ba0d1677cf Support Trending (#539)
* Support Trending from the v1 tetrio API

* 🗃️ Add compare_delta config option

* 🗃️ Add TETRIOLeagueUserMap index table

* Add comparison-time config option

* Add compare_delta parsing function

* Add compare method to the Trending class

* 🗃️ Remove incorrect composite index

* Update the index when the scheduled task pulls TL data

* Support trending

* 🐛 Fix find_entry index-return logic when no uid is given

* 📝 Correct the parent-migration comment of the compare_delta migration

* 🗃️ Add foreign-key constraints to non-PostgreSQL backfill migrations

* 🔒 Use parameter binding to set PG memory parameters in migrations

* Correct vs to adpm in Trends

* 🐛 Correct the range used when fetching player IDs
2026-02-23 01:04:01 +08:00
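The compare_delta option added in #539 implies a duration parser. A minimal sketch of what such a parser could look like — the function name, accepted syntax, and unit set are assumptions for illustration, not the plugin's actual code:

```python
import re
from datetime import timedelta

# Hypothetical parser for compare_delta values such as '1d', '12h' or '30m'.
# The plugin's real parser may accept a different syntax entirely.
_UNITS = {'d': 'days', 'h': 'hours', 'm': 'minutes'}


def parse_compare_delta(value: str) -> timedelta:
    """Parse a compact duration string into a timedelta."""
    match = re.fullmatch(r'(\d+)([dhm])', value.strip().lower())
    if match is None:
        raise ValueError(f'invalid compare_delta: {value!r}')
    amount, unit = match.groups()
    return timedelta(**{_UNITS[unit]: int(amount)})
```

A config value like `'7d'` would then resolve to a seven-day comparison window.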
pre-commit-ci[bot]
14f3e6960e ⬆️ auto update by pre-commit hooks (#594)
updates:
- [github.com/astral-sh/ruff-pre-commit: v0.15.0 → v0.15.1](https://github.com/astral-sh/ruff-pre-commit/compare/v0.15.0...v0.15.1)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2026-02-17 11:12:41 +08:00
renovate[bot]
61c798de1a ⬆️ Lock file maintenance (#593)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-02-16 19:05:03 +08:00
呵呵です
d63976b881 🐛 support achievement rt=7 rank type (#592) 2026-02-15 11:20:23 +00:00
renovate[bot]
72042220d8 ⬆️ Lock file maintenance (#585)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-02-12 04:10:36 +08:00
pre-commit-ci[bot]
15d3a33cb8 ⬆️ auto update by pre-commit hooks (#587)
* ⬆️ auto update by pre-commit hooks

updates:
- [github.com/astral-sh/ruff-pre-commit: v0.14.14 → v0.15.0](https://github.com/astral-sh/ruff-pre-commit/compare/v0.14.14...v0.15.0)

* Fix ASYNC240: Use async file I/O in check_hash (#591)

* Initial plan

* Fix ASYNC240: Replace synchronous Path.read_text() with async aiofiles operation

Co-authored-by: shoucandanghehe <51957264+shoucandanghehe@users.noreply.github.com>

* Add explicit UTF-8 encoding to ensure consistent behavior across platforms

Co-authored-by: shoucandanghehe <51957264+shoucandanghehe@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: shoucandanghehe <51957264+shoucandanghehe@users.noreply.github.com>

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Copilot <198982749+Copilot@users.noreply.github.com>
Co-authored-by: shoucandanghehe <51957264+shoucandanghehe@users.noreply.github.com>
2026-02-12 04:06:46 +08:00
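The ASYNC240 fix in #591 replaced a synchronous Path.read_text() inside a coroutine with aiofiles. The same idea using only the standard library (asyncio.to_thread instead of aiofiles) is sketched below; check_hash's real signature is not shown in the log, so this one is an assumption:

```python
import asyncio
import hashlib
from pathlib import Path


async def check_hash(path: Path, expected: str) -> bool:
    # ASYNC240 flags blocking file I/O inside async code; offload the read to
    # a worker thread (the PR itself uses aiofiles for the same effect).
    # Explicit UTF-8 keeps behavior consistent across platforms, as the PR notes.
    content = await asyncio.to_thread(path.read_text, encoding='utf-8')
    return hashlib.sha256(content.encode('utf-8')).hexdigest() == expected
```

Either approach keeps the event loop free while the file is read.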
renovate[bot]
f8fde51009 ⬆️ Upgrade re-actors/alls-green digest to a638d64 (#586)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-02-12 00:23:52 +08:00
renovate[bot]
6256053a82 ⬆️ Upgrade dependency pillow to v12.1.1 [SECURITY] (#590)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-02-12 00:17:37 +08:00
dependabot[bot]
689a2c72b8 ⬆️ Bump cryptography from 46.0.3 to 46.0.5 (#588)
Bumps [cryptography](https://github.com/pyca/cryptography) from 46.0.3 to 46.0.5.
- [Changelog](https://github.com/pyca/cryptography/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pyca/cryptography/compare/46.0.3...46.0.5)

---
updated-dependencies:
- dependency-name: cryptography
  dependency-version: 46.0.5
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-12 00:08:00 +08:00
呵呵です
cdea262335 🐛 enforce object JSON storage (#584)
Migrate Pydantic JSON columns to objects and tighten serialization to avoid string-encoded JSON.
2026-02-01 01:21:58 +08:00
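The string-encoded JSON that #584 eliminates can be reproduced with the standard json module alone. The plugin's actual stack is Pydantic models in SQLAlchemy JSON columns; the plain dict below is an illustrative stand-in:

```python
import json

payload = {'pps': 2.5, 'apm': 60.0}

# Correct: serialize exactly once, so the JSON column holds an object.
stored_ok = json.dumps(payload)

# The bug: feeding an already-serialized string into a layer that serializes
# again stores a JSON *string* whose content merely looks like JSON text.
stored_bad = json.dumps(json.dumps(payload))

decoded_ok = json.loads(stored_ok)    # a dict, queryable as an object
decoded_bad = json.loads(stored_bad)  # a str; needs a second json.loads
```

The migration converts existing string-encoded rows back to objects so queries against the column behave uniformly.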
呵呵です
28a02aec0f migrate user-facing text to i18n (#581) 2026-01-27 05:20:41 +08:00
呵呵です
95aa00e2cd ⬆️ update nonebot dependency config (#580) 2026-01-27 05:18:42 +08:00
pre-commit-ci[bot]
73bdf93d88 ⬆️ auto update by pre-commit hooks (#577)
updates:
- [github.com/astral-sh/ruff-pre-commit: v0.14.10 → v0.14.14](https://github.com/astral-sh/ruff-pre-commit/compare/v0.14.10...v0.14.14)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2026-01-26 21:18:13 +00:00
renovate[bot]
397047162e ⬆️ Lock file maintenance (#576)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-01-27 05:14:34 +08:00
renovate[bot]
653bccf48a ⬆️ Upgrade dependency prettier to v3.8.1 (#579)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-01-26 21:10:53 +00:00
呵呵です
bce73489dd 🐛 fix TOS unbind label (#582) 2026-01-27 05:06:47 +08:00
renovate[bot]
f4e1e8b1a2 ⬆️ Upgrade dependency prettier to v3.8.0 (#578)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-01-16 01:30:43 +08:00
renovate[bot]
b3bb425c44 ⬆️ Lock file maintenance (#575)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-01-05 01:16:03 +08:00
呵呵です
59a1c80ce5 🐛 Fix a bug where io rank comparison or display order could be wrong (#574)
2025-12-25 23:58:51 +08:00
呵呵です
fcecf5a01f 🔧 Adjust basedpyright configuration (#573)
* 🔧 Adjust basedpyright configuration

* 🚨 auto fix by pre-commit hooks

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-12-25 22:32:25 +08:00
pre-commit-ci[bot]
a4ffb6bbc6 ⬆️ auto update by pre-commit hooks (#569)
updates:
- [github.com/astral-sh/ruff-pre-commit: v0.14.9 → v0.14.10](https://github.com/astral-sh/ruff-pre-commit/compare/v0.14.9...v0.14.10)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-12-25 05:18:41 +08:00
renovate[bot]
81150a36be ⬆️ Lock file maintenance (#572)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-12-25 05:18:20 +08:00
呵呵です
2ddf1dc336 🔧 Have renovate update the lock file on a schedule (#571) 2025-12-25 05:15:15 +08:00
呵呵です
3615a87926 👷 Add more python versions to the test matrix (#568)
* 👷 Add more python versions to the test matrix

* ⬆️ Update locked package versions

* 🚨 Fix mypy errors
2025-12-25 05:02:18 +08:00
呵呵です
0228e1a480 📝 Update Wakatime badge link (#570) 2025-12-25 03:09:42 +08:00
呵呵です
d920c90936 🐛 Fix a bug where the browser context was created repeatedly (#567)
2025-12-21 18:25:20 +00:00
pre-commit-ci[bot]
024418f032 ⬆️ auto update by pre-commit hooks (#565)
updates:
- [github.com/astral-sh/ruff-pre-commit: v0.14.7 → v0.14.9](https://github.com/astral-sh/ruff-pre-commit/compare/v0.14.7...v0.14.9)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-12-22 02:03:47 +08:00
呵呵です
3945da6655 🐛 Correct the pyproject schema (#566) 2025-12-21 18:00:14 +00:00
pre-commit-ci[bot]
bc59e287d8 ⬆️ auto update by pre-commit hooks (#558)
updates:
- [github.com/astral-sh/ruff-pre-commit: v0.12.10 → v0.14.7](https://github.com/astral-sh/ruff-pre-commit/compare/v0.12.10...v0.14.7)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-12-03 17:50:03 +00:00
renovate[bot]
f8382150c7 ⬆️ Upgrade astral-sh/setup-uv action to v7 (#562)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-12-03 17:48:35 +00:00
renovate[bot]
4fd17204c4 ⬆️ Upgrade github/codeql-action action to v4 (#560)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-12-03 17:46:11 +00:00
renovate[bot]
9d4333812d ⬆️ Upgrade actions/setup-python action to v6 (#559)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-12-03 17:43:40 +00:00
renovate[bot]
2c07a1e337 ⬆️ Upgrade dependency prettier to v3.7.4 (#564)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-12-03 17:41:18 +00:00
renovate[bot]
2f95650282 ⬆️ Upgrade actions/checkout action to v6 (#563)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-12-04 01:38:25 +08:00
pre-commit-ci[bot]
0b4d49ee6a ⬆️ auto update by pre-commit hooks (#556)
updates:
- [github.com/astral-sh/ruff-pre-commit: v0.12.4 → v0.12.10](https://github.com/astral-sh/ruff-pre-commit/compare/v0.12.4...v0.12.10)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-08-30 15:57:34 +08:00
renovate[bot]
9392fefdf7 ⬆️ Upgrade actions/checkout action to v5 (#557)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-08-26 00:28:00 +08:00
a57811b0d3 🔖 1.11.0
2025-07-28 01:41:59 +08:00
呵呵です
7a5170936b 🧑‍💻 Add more useful development config options (#555) 2025-07-28 01:31:55 +08:00
呵呵です
068c508f57 Add a re-verify command for IO (#554)
* 🐛 Fix a bug where the verification status was not updated correctly when updating a bind

* Add a re-verify command for IO

* Apply suggestion from @Copilot

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-07-28 01:11:41 +08:00
renovate[bot]
0648ca021b ⬆️ Upgrade re-actors/alls-green digest to 2765efe (#551)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-27 15:00:04 +00:00
呵呵です
65e7fed32b ♻️ Refactor template screenshotting to fix errors caused by navigation (#553)
* ♻️ Move path into the data model

* ♻️ Use a generic function to render template images

* 🎨 Sync the template project structure

* 🐛 Correct import paths
2025-07-27 22:58:41 +08:00
呵呵です
fdbb2f3f6e Verify ownership via the discord account bound to the io account #64 (#552)
* 🗃️ Add a verify field to the Bind model and support it when creating or updating a bind

* Verify ownership via the discord account bound to the io account
2025-07-27 05:01:33 +08:00
144c223fe9 🔖 1.10.2
2025-07-19 22:40:47 +08:00
呵呵です
52a6d95434 🐛 Remove julianday usage for broader database compatibility (#550) 2025-07-19 22:40:04 +08:00
d8255756ca 🔖 1.10.1 2025-07-19 19:55:54 +08:00
呵呵です
13c6d53b6a 🐛 Change the length of the user unique-identifier field (#549) 2025-07-19 19:54:40 +08:00
6493aba7e0 🔖 1.10.0
2025-07-18 05:53:41 +08:00
pre-commit-ci[bot]
b82053be11 ⬆️ auto update by pre-commit hooks (#548)
* ⬆️ auto update by pre-commit hooks

updates:
- [github.com/astral-sh/ruff-pre-commit: v0.11.13 → v0.12.3](https://github.com/astral-sh/ruff-pre-commit/compare/v0.11.13...v0.12.3)

* 🚨 auto fix by pre-commit hooks

* ⬆️ Upgrade dependency ruff to v0.12.4

* 🚨 Fix lint warnings

* 🚨 Add a noqa(

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: 呵呵です <51957264+shoucandanghehe@users.noreply.github.com>
Co-authored-by: shoucandanghehe <wallfjjd@gmail.com>
2025-07-17 21:49:16 +00:00
renovate[bot]
11bc486420 ⬆️ Upgrade dependency prettier to v3.6.2 (#547)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-07-17 21:35:20 +00:00
呵呵です
9916902c10 🐛 Fix errors from postgresql identifiers longer than 63 characters (#545)
* 🗃️ Customize table names

* Add dev dependency nonebot-plugin-orm[postgresql]

* 🗃️ Skip all old migration scripts on postgresql

* 🗃️ Correct the dialect

* 🗃️ Add migration scripts

* 🚨 auto fix by pre-commit hooks

* 🚨 Add a noqa(

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-07-18 05:32:50 +08:00
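The 63-character limit that #545 works around comes from PostgreSQL truncating identifiers to NAMEDATALEN - 1 bytes, so long auto-generated index names can silently collide. The helper below only illustrates the truncation rule; the long index name is made up, not one of the plugin's real identifiers:

```python
NAMEDATALEN = 64  # PostgreSQL's compile-time default


def pg_truncate(identifier: str) -> str:
    # PostgreSQL keeps only the first NAMEDATALEN - 1 bytes of an identifier;
    # two long names sharing a 63-byte prefix collide after truncation.
    return identifier.encode('utf-8')[: NAMEDATALEN - 1].decode('utf-8', 'ignore')


long_index = 'ix_nonebot_plugin_tetris_stats_historicaldata_user_unique_identifier'
truncated = pg_truncate(long_index)
```

Shortening the table names keeps every derived index and constraint name under the limit.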
pre-commit-ci[bot]
e347b41ba6 ⬆️ auto update by pre-commit hooks (#546)
updates:
- [github.com/astral-sh/ruff-pre-commit: v0.11.12 → v0.11.13](https://github.com/astral-sh/ruff-pre-commit/compare/v0.11.12...v0.11.13)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-06-10 20:28:52 +08:00
pre-commit-ci[bot]
40d0bf06bb ⬆️ auto update by pre-commit hooks (#544)
updates:
- [github.com/astral-sh/ruff-pre-commit: v0.11.11 → v0.11.12](https://github.com/astral-sh/ruff-pre-commit/compare/v0.11.11...v0.11.12)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-06-07 19:32:27 +08:00
115 changed files with 7357 additions and 3070 deletions

View File

@@ -12,15 +12,15 @@ jobs:
       id-token: write
       contents: write
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v6
-      - uses: astral-sh/setup-uv@v6
+      - uses: astral-sh/setup-uv@v7
         name: Setup UV
         with:
           enable-cache: true
       - name: 'Set up Python'
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@v6
         with:
           python-version-file: '.python-version'

View File

@@ -3,7 +3,7 @@ name: Code Coverage
 on:
   push:
     branches:
-      - 'main'
+      - "main"
   pull_request:
 concurrency:
@@ -16,8 +16,7 @@ jobs:
     runs-on: ${{ matrix.os }}
     strategy:
       matrix:
-        # python-version: ['3.10', '3.11', '3.12', '3.13']
-        python-version: ['3.10', '3.11', '3.12']
+        python-version: ["3.10", "3.11", "3.12", "3.13", "3.14"]
         os: [ubuntu-latest, windows-latest, macos-latest]
       fail-fast: false
     env:
@@ -25,10 +24,10 @@ jobs:
       PYTHON_VERSION: ${{ matrix.python-version }}
     steps:
       - name: Checkout
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
       - name: Setup uv
-        uses: astral-sh/setup-uv@v6
+        uses: astral-sh/setup-uv@v7
         with:
           enable-cache: true
           cache-suffix: ${{ env.PYTHON_VERSION }}_${{ env.OS }}
@@ -53,6 +52,6 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Decide whether the needed jobs succeeded or failed
-        uses: re-actors/alls-green@223e4bb7a751b91f43eda76992bcfbf23b8b0302
+        uses: re-actors/alls-green@a638d6464689bbb24c325bb3fe9404d63a913030
         with:
           jobs: ${{ toJSON(needs) }}

View File

@@ -7,15 +7,15 @@ jobs:
   TypeCheck:
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v6
-      - uses: astral-sh/setup-uv@v6
+      - uses: astral-sh/setup-uv@v7
         name: Setup UV
         with:
           enable-cache: true
       - name: 'Set up Python'
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@v6
         with:
           python-version-file: '.python-version'

View File

@@ -38,11 +38,11 @@ jobs:
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
       # Initializes the CodeQL tools for scanning.
       - name: Initialize CodeQL
-        uses: github/codeql-action/init@v3
+        uses: github/codeql-action/init@v4
         with:
           languages: ${{ matrix.language }}
         # If you wish to specify custom queries, you can do so here or in a config file.
@@ -55,7 +55,7 @@ jobs:
       # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
       # If this step fails, then you should remove it and run the build manually (see below)
       - name: Autobuild
-        uses: github/codeql-action/autobuild@v3
+        uses: github/codeql-action/autobuild@v4
       # Command-line programs to run using the OS shell.
       # 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
@@ -68,4 +68,4 @@ jobs:
       #   ./location_of_script_within_repo/buildscript.sh
       - name: Perform CodeQL Analysis
-        uses: github/codeql-action/analyze@v3
+        uses: github/codeql-action/analyze@v4

View File

@@ -7,7 +7,7 @@ ci:
   autoupdate_commit_msg: ':arrow_up: auto update by pre-commit hooks'
 repos:
   - repo: https://github.com/astral-sh/ruff-pre-commit
-    rev: v0.11.11
+    rev: v0.15.4
     hooks:
       - id: ruff
         args: [--fix, --exit-non-zero-on-fix]

View File

@@ -41,10 +41,10 @@
         alt="Gitmoji"
       />
     </a>
-    <a href="https://wakatime.com/badge/user/138b2226-8e02-42be-b99d-35c05198836f/project/65f5bdf7-45ec-479a-8dd2-18c498c910ca">
-      <img
-        src="https://wakatime.com/badge/user/138b2226-8e02-42be-b99d-35c05198836f/project/65f5bdf7-45ec-479a-8dd2-18c498c910ca.svg"
-        alt="wakatime"
+    <a href="https://wakatime.com/badge/user/138b2226-8e02-42be-b99d-35c05198836f/project/e26c7985-a236-4e76-90a6-a72f71d305ef">
+      <img
+        src="https://wakatime.com/badge/user/138b2226-8e02-42be-b99d-35c05198836f/project/e26c7985-a236-4e76-90a6-a72f71d305ef.svg"
+        alt="wakatime"
       />
     </a>
   </p>

View File

@@ -1,3 +1,5 @@
+from pathlib import Path
 from nonebot import get_driver, get_plugin_config
 from nonebot_plugin_localstore import get_plugin_cache_dir, get_plugin_data_dir
 from pydantic import BaseModel, Field
@@ -14,11 +16,17 @@ class Proxy(BaseModel):
     top: str | None = None
+class Dev(BaseModel):
+    enabled: bool = False
+    template_path: Path | None = None
+    enable_template_check: bool = True
 class ScopedConfig(BaseModel):
     request_timeout: float = 30.0
     screenshot_quality: float = 2
     proxy: Proxy = Field(default_factory=Proxy)
-    development: bool = False
+    dev: Dev = Field(default_factory=Dev)
 class Config(BaseModel):

View File

@@ -25,6 +25,8 @@ depends_on: str | Sequence[str] | None = None
 def upgrade(name: str = '') -> None:
     if name:
         return
+    if op.get_bind().dialect.name == 'postgresql':
+        return
     # ### commands auto generated by Alembic - please adjust! ###
     with op.batch_alter_table('nonebot_plugin_tetris_stats_iorank', schema=None) as batch_op:
         batch_op.alter_column('create_time', new_column_name='update_time', existing_type=sa.DateTime())
@@ -41,6 +43,8 @@ def upgrade(name: str = '') -> None:
 def downgrade(name: str = '') -> None:
     if name:
         return
+    if op.get_bind().dialect.name == 'postgresql':
+        return
     # ### commands auto generated by Alembic - please adjust! ###
     with op.batch_alter_table('nonebot_plugin_tetris_stats_iorank', schema=None) as batch_op:
         batch_op.alter_column('update_time', new_column_name='create_time')

View File

@@ -25,6 +25,8 @@ depends_on: str | Sequence[str] | None = None
 def upgrade(name: str = '') -> None:
     if name:
         return
+    if op.get_bind().dialect.name == 'postgresql':
+        return
     # ### commands auto generated by Alembic - please adjust! ###
     with op.batch_alter_table('nonebot_plugin_tetris_stats_iorank', schema=None) as batch_op:
         batch_op.add_column(sa.Column('file_hash', sa.String(length=128), nullable=True))
@@ -38,6 +40,8 @@ def upgrade(name: str = '') -> None:
 def downgrade(name: str = '') -> None:
     if name:
         return
+    if op.get_bind().dialect.name == 'postgresql':
+        return
     # ### commands auto generated by Alembic - please adjust! ###
     with op.batch_alter_table('nonebot_plugin_tetris_stats_iorank', schema=None) as batch_op:
         batch_op.drop_index(batch_op.f('ix_nonebot_plugin_tetris_stats_iorank_file_hash'))


@@ -0,0 +1,160 @@
"""fix json storage
Migration ID: 1c5346b657d4
Parent migration: 2ff388a8c486
Created: 2026-01-30 03:35:00
"""
from __future__ import annotations
from typing import TYPE_CHECKING
import msgspec
import sqlalchemy as sa
from alembic import op
from nonebot.log import logger
from sqlalchemy.dialects.postgresql import ARRAY, JSONB
if TYPE_CHECKING:
from collections.abc import Sequence
from sqlalchemy.engine import Connection
_LOG_INTERVAL = 10000
_BATCH_SIZE = 1000
_PG_CHUNK_SIZE = 50000
_SQLITE_FETCH_SIZE = 500
revision: str = '1c5346b657d4'
down_revision: str | Sequence[str] | None = '2ff388a8c486'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
tables: dict[str, list[str]] = {
'nb_t_io_tl_stats_field': [
'low_pps',
'low_apm',
'low_vs',
'high_pps',
'high_apm',
'high_vs',
],
'nb_t_io_hist_data': ['data'],
'nb_t_top_hist_data': ['data'],
'nb_t_tos_hist_data': ['data'],
'nb_t_io_tl_hist': ['data'],
}
def _pg_convert_column(conn: Connection, table: str, column: str) -> None:
tbl = sa.table(table, sa.column('id'), sa.column(column))
col = getattr(tbl.c, column)
path = sa.cast(sa.literal('{}'), ARRAY(sa.Text))
payload = sa.cast(sa.cast(col, JSONB).op('#>>')(path), sa.JSON)
base = sa.func.json_typeof(col) == 'string'
min_max_stmt = sa.select(sa.func.min(tbl.c.id), sa.func.max(tbl.c.id)).where(base)
result = conn.execute(min_max_stmt).one()
if result[0] is None or result[1] is None:
return
start_id, end_id = result
total = end_id - start_id + 1
processed = 0
context = op.get_context()
with context.autocommit_block():
for chunk_start in range(start_id, end_id + 1, _PG_CHUNK_SIZE):
chunk_end = min(chunk_start + _PG_CHUNK_SIZE - 1, end_id)
stmt = (
sa.update(tbl).values({column: payload}).where(base).where(sa.between(tbl.c.id, chunk_start, chunk_end))
)
conn.execute(stmt)
processed += chunk_end - chunk_start + 1
logger.warning(
f'tetris_stats: converting {table}.{column} chunk {chunk_start}-{chunk_end} '
f'processed={processed}/{total}'
)
remaining_stmt = sa.select(sa.func.count()).select_from(tbl).where(base)
remaining = conn.execute(remaining_stmt).scalar()
if remaining:
msg = f'json storage fix failed: {table}.{column} still has string rows'
raise ValueError(msg)
def _pg_convert(conn: Connection) -> None:
for table, columns in tables.items():
for column in columns:
logger.warning(f'tetris_stats: converting {table}.{column} from json string to object')
_pg_convert_column(conn, table, column)
def _convert_table_python(conn: Connection, table_name: str, columns: list[str]) -> None: # noqa: C901
meta = sa.MetaData()
table = sa.Table(table_name, meta, autoload_with=conn)
update_stmt = (
table.update().where(table.c.id == sa.bindparam('b_id')).values(**{col: sa.bindparam(col) for col in columns})
)
batch: list[dict[str, object]] = []
last_id = 0
processed = 0
while True:
rows = (
conn.execute(
sa.select(table.c.id, *[table.c[col] for col in columns])
.where(table.c.id > last_id)
.order_by(table.c.id)
.limit(_SQLITE_FETCH_SIZE)
)
.mappings()
.all()
)
if not rows:
break
for row in rows:
last_id = row['id']
processed += 1
update_values: dict[str, object] = {'b_id': row['id']}
changed = False
for column in columns:
value = row[column]
if isinstance(value, str | bytes):
parsed = msgspec.json.decode(value)
if not isinstance(parsed, dict | list):
msg = f'json storage fix failed: {table_name}.{column} value is not object'
raise TypeError(msg)
update_values[column] = parsed
changed = True
elif isinstance(value, dict | list):
update_values[column] = value
else:
msg = f'json storage fix failed: {table_name}.{column} invalid type {type(value)}'
raise TypeError(msg)
if changed:
batch.append(update_values)
if processed % _LOG_INTERVAL == 0:
logger.warning(f'tetris_stats: converting {table_name} processed={processed}')
if len(batch) >= _BATCH_SIZE:
conn.execute(update_stmt, batch)
batch.clear()
if batch:
conn.execute(update_stmt, batch)
def _generic_convert(conn: Connection) -> None:
for table, columns in tables.items():
logger.warning(f'tetris_stats: converting {table} via python')
_convert_table_python(conn, table, columns)
def upgrade(name: str = '') -> None:
if name:
return
conn = op.get_bind()
if conn.dialect.name == 'postgresql':
_pg_convert(conn)
else:
_generic_convert(conn)
def downgrade(name: str = '') -> None:
if name:
return
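Stripped of the SQLAlchemy machinery, the generic (non-PostgreSQL) path above pages through rows by id, decodes any payload still stored as a JSON-encoded string, and rewrites it as a real object in batches. A minimal self-contained sketch of that pattern, using stdlib `json`/`sqlite3` and a hypothetical `hist(id, data)` table rather than the plugin's real tables:

```python
import json
import sqlite3


def fix_double_encoded_json(conn: sqlite3.Connection, table: str, column: str,
                            fetch_size: int = 500, batch_size: int = 1000) -> int:
    """Rewrite rows whose JSON column holds a JSON-encoded *string* (double
    encoding) so it holds the decoded object instead.

    Keyset-paginates on id and flushes UPDATEs in batches, mirroring the
    migration's generic path. Returns the number of rows fixed.
    """
    fixed, last_id, batch = 0, 0, []

    def flush() -> None:
        conn.executemany(f'UPDATE {table} SET {column} = ? WHERE id = ?', batch)
        batch.clear()

    while True:
        rows = conn.execute(
            f'SELECT id, {column} FROM {table} WHERE id > ? ORDER BY id LIMIT ?',
            (last_id, fetch_size),
        ).fetchall()
        if not rows:
            break
        for row_id, raw in rows:
            last_id = row_id
            value = json.loads(raw)
            if isinstance(value, str):  # double-encoded payload
                inner = json.loads(value)
                if not isinstance(inner, (dict, list)):
                    raise TypeError(f'{table}.{column} row {row_id} is not an object')
                batch.append((json.dumps(inner), row_id))
                fixed += 1
            if len(batch) >= batch_size:
                flush()
    if batch:
        flush()
    conn.commit()
    return fixed


conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE hist (id INTEGER PRIMARY KEY, data TEXT)')
conn.execute('INSERT INTO hist (data) VALUES (?)', (json.dumps(json.dumps({'pps': 1.2})),))  # broken row
conn.execute('INSERT INTO hist (data) VALUES (?)', (json.dumps({'apm': 50.0}),))  # already fine
fixed = fix_double_encoded_json(conn, 'hist', 'data')
```

The keyset pagination (`WHERE id > ? ORDER BY id`) keeps memory bounded on large tables, which is why both the PG and generic paths chunk by id rather than fetching everything.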


@@ -0,0 +1,42 @@
"""add verify field
Migration ID: 2ff388a8c486
Parent migration: 3588702dd3a4
Created: 2025-07-22 18:09:09.734164
"""
from __future__ import annotations
from typing import TYPE_CHECKING
import sqlalchemy as sa
from alembic import op
if TYPE_CHECKING:
from collections.abc import Sequence
revision: str = '2ff388a8c486'
down_revision: str | Sequence[str] | None = '3588702dd3a4'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
if name:
return
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('nb_t_bind', schema=None) as batch_op:
batch_op.add_column(sa.Column('verify', sa.Boolean(), nullable=False, server_default='false'))
# ### end Alembic commands ###
def downgrade(name: str = '') -> None:
if name:
return
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('nb_t_bind', schema=None) as batch_op:
batch_op.drop_column('verify')
# ### end Alembic commands ###
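The `server_default='false'` matters here: adding a NOT NULL column to a table that already contains rows only succeeds if the database can backfill those rows with a default in the same DDL statement. A minimal illustration against SQLite (table name borrowed from the migration; `0` is SQLite's boolean false):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE nb_t_bind (id INTEGER PRIMARY KEY, uid TEXT)')
conn.execute("INSERT INTO nb_t_bind (uid) VALUES ('someone')")

# Without DEFAULT this ALTER would fail: the existing row would have no
# value for the new NOT NULL column. The server-side default backfills it.
conn.execute('ALTER TABLE nb_t_bind ADD COLUMN verify BOOLEAN NOT NULL DEFAULT 0')
row = conn.execute('SELECT uid, verify FROM nb_t_bind').fetchone()
```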


@@ -0,0 +1,52 @@
"""modify field length
Migration ID: 3588702dd3a4
Parent migration: bc6abd57928f
Created: 2025-07-19 17:21:17.927162
"""
from __future__ import annotations
from typing import TYPE_CHECKING
import sqlalchemy as sa
from alembic import op
if TYPE_CHECKING:
from collections.abc import Sequence
revision: str = '3588702dd3a4'
down_revision: str | Sequence[str] | None = 'bc6abd57928f'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
if name:
return
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('nb_t_tos_hist_data', schema=None) as batch_op:
batch_op.alter_column(
'user_unique_identifier',
existing_type=sa.VARCHAR(length=24),
type_=sa.String(length=256),
existing_nullable=False,
)
# ### end Alembic commands ###
def downgrade(name: str = '') -> None:
if name:
return
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('nb_t_tos_hist_data', schema=None) as batch_op:
batch_op.alter_column(
'user_unique_identifier',
existing_type=sa.String(length=256),
type_=sa.VARCHAR(length=24),
existing_nullable=False,
)
# ### end Alembic commands ###


@@ -0,0 +1,353 @@
"""add io tl map
Migration ID: 3a294ff14610
Parent migration: 6ecf383d646a
Created: 2026-01-28 03:25:40.714853
"""
from __future__ import annotations
import os
import re
import time
from typing import TYPE_CHECKING
import sqlalchemy as sa
from alembic import op
from nonebot.log import logger
from rich.progress import (
BarColumn,
MofNCompleteColumn,
Progress,
ProgressColumn,
Task,
TaskProgressColumn,
TextColumn,
TimeRemainingColumn,
filesize,
)
from rich.text import Text
from sqlalchemy import Connection, text
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
from typing_extensions import override
if TYPE_CHECKING:
from collections.abc import Sequence
revision: str = '3a294ff14610'
down_revision: str | Sequence[str] | None = '6ecf383d646a'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
class RateColumn(ProgressColumn):
"""Renders human readable processing rate."""
@override
def render(self, task: Task) -> Text:
"""Render the speed in iterations per second."""
def calculate_speed() -> float | None:
now = time.monotonic()
if task.start_time is not None:
elapsed = (task.finished_time or now) - task.start_time
if elapsed > 0:
return task.completed / elapsed
return None
speed = task.finished_speed or task.speed or calculate_speed()
if speed is None:
return Text('', style='progress.percentage')
unit, suffix = filesize.pick_unit_and_suffix(
int(speed),
['', '×10³', '×10⁶', '×10⁹', '×10¹²'], # noqa: RUF001
1000,
)
data_speed = speed / unit
return Text(f'{data_speed:.1f}{suffix} it/s', style='progress.percentage')
def _backfill_postgresql(conn: Connection, chunk_size: int = 20000) -> None:
result = conn.execute(text('SELECT min(id), max(id) FROM nb_t_io_tl_hist')).one()
if result[0] is None or result[1] is None:
return
min_id, max_id = result
total = max_id - min_id + 1
logger.warning('PG backfill: Disabling foreign key constraints...')
work_mem = os.getenv('TETRIS_STATS_MIGRATION_WORK_MEM', '256MB')
if not re.fullmatch(r'\d+(kB|MB|GB)', work_mem):
work_mem = '256MB'
conn.execute(
text("SELECT set_config('work_mem', :work_mem, true)"),
{'work_mem': work_mem},
)
temp_buffers = os.getenv('TETRIS_STATS_MIGRATION_TEMP_BUFFERS', '128MB')
if not re.fullmatch(r'\d+(kB|MB|GB)', temp_buffers):
temp_buffers = '128MB'
conn.execute(
text("SELECT set_config('temp_buffers', :temp_buffers, true)"),
{'temp_buffers': temp_buffers},
)
conn.execute(text('SET LOCAL synchronous_commit = off'))
logger.warning('tetris_stats: PG backfill synchronous_commit=off')
logger.warning(f'tetris_stats: PG backfill work_mem={work_mem}')
logger.warning(f'tetris_stats: PG backfill temp_buffers={temp_buffers}')
conn.execute(text('SET LOCAL max_parallel_workers_per_gather = 8'))
conn.execute(text('SET LOCAL parallel_setup_cost = 10'))
conn.execute(text('SET LOCAL parallel_tuple_cost = 0.01'))
logger.warning('tetris_stats: PG backfill max_parallel_workers_per_gather=8')
logger.warning('tetris_stats: PG backfill parallel_setup_cost=10')
logger.warning('tetris_stats: PG backfill parallel_tuple_cost=0.01')
with Progress(
TextColumn('[progress.description]{task.description}'),
BarColumn(),
MofNCompleteColumn(),
TaskProgressColumn(),
RateColumn(),
TimeRemainingColumn(),
) as progress:
task = progress.add_task('Building index...', total=total)
for start_id in range(min_id, max_id + 1, chunk_size):
end_id = min(start_id + chunk_size - 1, max_id)
conn.execute(
text(
"""
WITH entries AS (
SELECT
h.stats_id,
h.id AS hist_id,
e.ordinality - 1 AS entry_index,
COALESCE(e.entry->>'_id', e.entry->>'id') AS uid_str
FROM nb_t_io_tl_hist h
CROSS JOIN LATERAL jsonb_array_elements(h.data::jsonb->'data'->'entries')
WITH ORDINALITY AS e(entry, ordinality)
WHERE h.id BETWEEN :start_id AND :end_id
AND COALESCE(e.entry->>'_id', e.entry->>'id') IS NOT NULL
),
upserted_uids AS (
INSERT INTO nb_t_io_uid (user_unique_identifier)
SELECT DISTINCT uid_str FROM entries
ON CONFLICT (user_unique_identifier)
DO UPDATE SET user_unique_identifier = EXCLUDED.user_unique_identifier
RETURNING id, user_unique_identifier
)
INSERT INTO nb_t_io_tl_map (stats_id, uid_id, hist_id, entry_index)
SELECT e.stats_id, u.id, e.hist_id, e.entry_index
FROM entries e
JOIN upserted_uids u ON u.user_unique_identifier = e.uid_str
"""
),
{'start_id': start_id, 'end_id': end_id},
)
progress.update(task, advance=end_id - start_id + 1)
def _add_foreign_keys_postgresql(conn: Connection) -> None:
logger.warning('PG backfill: Re-adding foreign key constraints (validating)...')
conn.execute(
text("""
ALTER TABLE nb_t_io_tl_map
ADD CONSTRAINT fk_nb_t_io_tl_map_hist_id_nb_t_io_tl_hist
FOREIGN KEY (hist_id) REFERENCES nb_t_io_tl_hist(id)
NOT VALID
""")
)
conn.execute(
text("""
ALTER TABLE nb_t_io_tl_map
VALIDATE CONSTRAINT fk_nb_t_io_tl_map_hist_id_nb_t_io_tl_hist
""")
)
conn.execute(
text("""
ALTER TABLE nb_t_io_tl_map
ADD CONSTRAINT fk_nb_t_io_tl_map_stats_id_nb_t_io_tl_stats
FOREIGN KEY (stats_id) REFERENCES nb_t_io_tl_stats(id)
NOT VALID
""")
)
conn.execute(
text("""
ALTER TABLE nb_t_io_tl_map
VALIDATE CONSTRAINT fk_nb_t_io_tl_map_stats_id_nb_t_io_tl_stats
""")
)
conn.execute(
text("""
ALTER TABLE nb_t_io_tl_map
ADD CONSTRAINT fk_nb_t_io_tl_map_uid_id_nb_t_io_uid
FOREIGN KEY (uid_id) REFERENCES nb_t_io_uid(id)
NOT VALID
""")
)
conn.execute(
text("""
ALTER TABLE nb_t_io_tl_map
VALIDATE CONSTRAINT fk_nb_t_io_tl_map_uid_id_nb_t_io_uid
""")
)
logger.success('PG backfill: Foreign keys validated successfully')
def _backfill_generic(conn: Connection) -> None:
Base = automap_base() # noqa: N806
Base.prepare(autoload_with=conn)
Hist = Base.classes.nb_t_io_tl_hist # noqa: N806
Uid = Base.classes.nb_t_io_uid # noqa: N806
Map = Base.classes.nb_t_io_tl_map # noqa: N806
with Session(conn) as session:
count = session.query(Hist).count()
if count == 0:
return
logger.warning('tetris_stats: building the TETR.IO player pagination index, please do not close the program...')
uid_map: dict[str, int] = {}
def refresh_uid_map() -> None:
uids = session.query(Uid).all()
uid_map.clear()
uid_map.update({uid.user_unique_identifier: uid.id for uid in uids})
with Progress(
TextColumn('[progress.description]{task.description}'),
BarColumn(),
MofNCompleteColumn(),
TaskProgressColumn(),
RateColumn(),
TimeRemainingColumn(),
) as progress:
total = progress.add_task('Building index...', total=count)
for hist in session.query(Hist).yield_per(1):
data = hist.data
if isinstance(data, str | bytes):
msg = 'io tl map migration requires json object data'
raise TypeError(msg)
entries = data.get('data', {}).get('entries', []) if isinstance(data, dict) else []
entry_info: list[tuple[str, int]] = []
for index, entry in enumerate(entries):
if isinstance(entry, dict):
uid = entry.get('_id')
if isinstance(uid, str):
entry_info.append((uid, index))
if not entry_info:
progress.update(total, advance=1)
continue
session.add_all([Uid(user_unique_identifier=uid) for uid, _ in entry_info if uid not in uid_map])
session.flush()
refresh_uid_map()
session.add_all(
[
Map(
stats_id=hist.stats_id,
uid_id=uid_map[uid],
hist_id=hist.id,
entry_index=index,
)
for uid, index in entry_info
]
)
session.flush()
progress.update(total, advance=1)
def backfill_mapping(conn: Connection) -> None:
if conn.dialect.name == 'postgresql':
logger.warning('tetris_stats: PostgreSQL detected, using fast index backfill...')
_backfill_postgresql(conn)
_add_foreign_keys_postgresql(conn)
return
_backfill_generic(conn)
def upgrade(name: str = '') -> None:
if name:
return
conn = op.get_bind()
op.create_table(
'nb_t_io_uid',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_unique_identifier', sa.String(length=24), nullable=False),
sa.PrimaryKeyConstraint('id', name=op.f('pk_nb_t_io_uid')),
info={'bind_key': 'nonebot_plugin_tetris_stats'},
)
with op.batch_alter_table('nb_t_io_uid', schema=None) as batch_op:
batch_op.create_index(
batch_op.f('ix_nb_t_io_uid_user_unique_identifier'),
['user_unique_identifier'],
unique=True,
)
if conn.dialect.name == 'postgresql':
op.create_table(
'nb_t_io_tl_map',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('stats_id', sa.Integer(), nullable=False),
sa.Column('uid_id', sa.Integer(), nullable=False),
sa.Column('hist_id', sa.Integer(), nullable=False),
sa.Column('entry_index', sa.Integer(), nullable=False),
sa.PrimaryKeyConstraint('id', name=op.f('pk_nb_t_io_tl_map')),
sa.UniqueConstraint('uid_id', 'hist_id', name='uq_nb_t_io_tl_map_uid_hist'),
info={'bind_key': 'nonebot_plugin_tetris_stats'},
)
else:
op.create_table(
'nb_t_io_tl_map',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('stats_id', sa.Integer(), nullable=False),
sa.Column('uid_id', sa.Integer(), nullable=False),
sa.Column('hist_id', sa.Integer(), nullable=False),
sa.Column('entry_index', sa.Integer(), nullable=False),
sa.ForeignKeyConstraint(
['stats_id'],
['nb_t_io_tl_stats.id'],
name=op.f('fk_nb_t_io_tl_map_stats_id_nb_t_io_tl_stats'),
),
sa.ForeignKeyConstraint(
['uid_id'],
['nb_t_io_uid.id'],
name=op.f('fk_nb_t_io_tl_map_uid_id_nb_t_io_uid'),
),
sa.ForeignKeyConstraint(
['hist_id'],
['nb_t_io_tl_hist.id'],
name=op.f('fk_nb_t_io_tl_map_hist_id_nb_t_io_tl_hist'),
),
sa.PrimaryKeyConstraint('id', name=op.f('pk_nb_t_io_tl_map')),
sa.UniqueConstraint('uid_id', 'hist_id', name='uq_nb_t_io_tl_map_uid_hist'),
info={'bind_key': 'nonebot_plugin_tetris_stats'},
)
backfill_mapping(conn)
with op.batch_alter_table('nb_t_io_tl_map', schema=None) as batch_op:
batch_op.create_index(batch_op.f('ix_nb_t_io_tl_map_stats_id'), ['stats_id'], unique=False)
batch_op.create_index(batch_op.f('ix_nb_t_io_tl_map_uid_id'), ['uid_id'], unique=False)
def downgrade(name: str = '') -> None:
if name:
return
with op.batch_alter_table('nb_t_io_tl_map', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nb_t_io_tl_map_uid_id'))
batch_op.drop_index(batch_op.f('ix_nb_t_io_tl_map_stats_id'))
op.drop_table('nb_t_io_tl_map')
with op.batch_alter_table('nb_t_io_uid', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nb_t_io_uid_user_unique_identifier'))
op.drop_table('nb_t_io_uid')
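Stripped of ORM details, the generic backfill above does two things per history row: intern each player uid into a lookup table and emit one `(stats_id, uid_id, hist_id, entry_index)` mapping row per leaderboard entry. A pure-Python sketch of that core (the function name and in-memory id assignment are illustrative, not the migration's API):

```python
def build_tl_map(hists):
    """Intern TETR.IO uids and emit mapping rows for each leaderboard entry.

    `hists` is an iterable of (hist_id, stats_id, data) tuples where
    data['data']['entries'] is a page of leaderboard entries. Uid ids are
    assigned 1..N in first-seen order, standing in for the autoincrement
    primary key of nb_t_io_uid.
    """
    uid_map: dict[str, int] = {}
    mapping: list[tuple[int, int, int, int]] = []
    for hist_id, stats_id, data in hists:
        entries = data.get('data', {}).get('entries', []) if isinstance(data, dict) else []
        for index, entry in enumerate(entries):
            uid = entry.get('_id') if isinstance(entry, dict) else None
            if not isinstance(uid, str):
                continue  # entries without a usable uid are skipped, as in the migration
            uid_id = uid_map.setdefault(uid, len(uid_map) + 1)
            mapping.append((stats_id, uid_id, hist_id, index))
    return uid_map, mapping


uid_map, mapping = build_tl_map([
    (10, 1, {'data': {'entries': [{'_id': 'aaa'}, {'_id': 'bbb'}]}}),
    (11, 2, {'data': {'entries': [{'_id': 'bbb'}, {'name': 'no uid'}]}}),
])
```

The PostgreSQL path computes the same mapping in-database with a single `INSERT ... ON CONFLICT ... RETURNING` per chunk, which is why it can defer and later validate the foreign keys.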


@@ -16,7 +16,6 @@ from alembic import op
from nonebot.log import logger
from rich.progress import BarColumn, MofNCompleteColumn, Progress, TaskProgressColumn, TextColumn, TimeRemainingColumn
from sqlalchemy import desc, select
from sqlalchemy.dialects import sqlite
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
@@ -30,7 +29,7 @@ depends_on: str | Sequence[str] | None = None
def migrate_old_data() -> None: # noqa: C901
from json import dumps, loads
from json import dumps, loads # noqa: PLC0415
Base = automap_base() # noqa: N806
Base.prepare(autoload_with=op.get_bind())
@@ -109,6 +108,8 @@ def migrate_old_data() -> None: # noqa: C901
def upgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
op.create_table(
'nonebot_plugin_tetris_stats_tetriohistoricaldata',
@@ -219,23 +220,25 @@ def upgrade(name: str = '') -> None:
def downgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
op.create_table(
'nonebot_plugin_tetris_stats_historicaldata',
sa.Column('id', sa.INTEGER(), nullable=False),
sa.Column('trigger_time', sa.DATETIME(), nullable=False),
sa.Column('bot_platform', sa.VARCHAR(length=32), nullable=True),
sa.Column('bot_account', sa.VARCHAR(), nullable=True),
sa.Column('source_type', sa.VARCHAR(length=32), nullable=True),
sa.Column('source_account', sa.VARCHAR(), nullable=True),
sa.Column('message', sa.BLOB(), nullable=True),
sa.Column('game_platform', sa.VARCHAR(length=32), nullable=False),
sa.Column('command_type', sa.VARCHAR(length=16), nullable=False),
sa.Column('command_args', sqlite.JSON(), nullable=False),
sa.Column('game_user', sqlite.JSON(), nullable=False),
sa.Column('processed_data', sqlite.JSON(), nullable=False),
sa.Column('finish_time', sa.DATETIME(), nullable=False),
sa.Column('user_unique_identifier', sa.VARCHAR(length=32), nullable=False),
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('trigger_time', sa.DateTime(), nullable=False),
sa.Column('bot_platform', sa.String(length=32), nullable=True),
sa.Column('bot_account', sa.String(), nullable=True),
sa.Column('source_type', sa.String(length=32), nullable=True),
sa.Column('source_account', sa.String(), nullable=True),
sa.Column('message', sa.PickleType(), nullable=True),
sa.Column('game_platform', sa.String(length=32), nullable=False),
sa.Column('command_type', sa.String(length=16), nullable=False),
sa.Column('command_args', sa.JSON(), nullable=False),
sa.Column('game_user', sa.JSON(), nullable=False),
sa.Column('processed_data', sa.JSON(), nullable=False),
sa.Column('finish_time', sa.DateTime(), nullable=False),
sa.Column('user_unique_identifier', sa.String(length=32), nullable=False),
sa.PrimaryKeyConstraint('id', name='pk_nonebot_plugin_tetris_stats_historicaldata'),
)
with op.batch_alter_table('nonebot_plugin_tetris_stats_historicaldata', schema=None) as batch_op:


@@ -0,0 +1,82 @@
"""migrate nonebot_plugin_tetris_stats_tetrioleaguestats
Migration ID: 3d900bb0e8d4
Parent migration: 405c6936a164
Created: 2025-07-18 02:22:03.771903
"""
from __future__ import annotations
from typing import TYPE_CHECKING
from alembic import op
from nonebot.log import logger
from rich.progress import BarColumn, Progress, SpinnerColumn, TaskProgressColumn, TextColumn
from sqlalchemy import inspect
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
if TYPE_CHECKING:
from collections.abc import Sequence
revision: str = '3d900bb0e8d4'
down_revision: str | Sequence[str] | None = '405c6936a164'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def data_migrate() -> None:
conn = op.get_bind()
insp = inspect(conn)
table_names = insp.get_table_names()
if 'nonebot_plugin_tetris_stats_tetrioleaguestats' not in table_names:
return
Base = automap_base() # noqa: N806
Base.prepare(autoload_with=conn)
Old = Base.classes.nonebot_plugin_tetris_stats_tetrioleaguestats # noqa: N806
New = Base.classes.nb_t_io_tl_stats # noqa: N806
with Session(conn) as db_session:
count = db_session.query(Old).count()
if count == 0:
return
logger.warning('tetris_stats: migrating data, please do not close the program...')
with Progress(
SpinnerColumn(),
TextColumn('[progress.description]{task.description}'),
BarColumn(),
TaskProgressColumn(),
) as progress:
task = progress.add_task('Migrating data...', total=count)
for i in db_session.query(Old).yield_per(1):
db_session.add(
New(
id=i.id,
update_time=i.update_time,
)
)
progress.update(task, advance=1)
if progress.tasks[task].completed % 100 == 0:
db_session.commit()
db_session.commit()
logger.success('tetris_stats: data migration complete!')
def upgrade(name: str = '') -> None:
if name:
return
data_migrate()
def downgrade(name: str = '') -> None:
if name:
return


@@ -0,0 +1,85 @@
"""migrate nonebot_plugin_tetris_stats_tetrioleaguehistorical
Migration ID: 405c6936a164
Parent migration: bbbdfd94e6fa
Created: 2025-07-18 01:55:27.406032
"""
from __future__ import annotations
from typing import TYPE_CHECKING
from alembic import op
from nonebot.log import logger
from rich.progress import BarColumn, Progress, SpinnerColumn, TaskProgressColumn, TextColumn
from sqlalchemy import inspect
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
if TYPE_CHECKING:
from collections.abc import Sequence
revision: str = '405c6936a164'
down_revision: str | Sequence[str] | None = 'bbbdfd94e6fa'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def data_migrate() -> None:
conn = op.get_bind()
insp = inspect(conn)
table_names = insp.get_table_names()
if 'nonebot_plugin_tetris_stats_tetrioleaguehistorical' not in table_names:
return
Base = automap_base() # noqa: N806
Base.prepare(autoload_with=conn)
Old = Base.classes.nonebot_plugin_tetris_stats_tetrioleaguehistorical # noqa: N806
New = Base.classes.nb_t_io_tl_hist # noqa: N806
with Session(conn) as db_session:
count = db_session.query(Old).count()
if count == 0:
return
logger.warning('tetris_stats: migrating data, please do not close the program...')
with Progress(
SpinnerColumn(),
TextColumn('[progress.description]{task.description}'),
BarColumn(),
TaskProgressColumn(),
) as progress:
task = progress.add_task('Migrating data...', total=count)
for i in db_session.query(Old).yield_per(1):
db_session.add(
New(
id=i.id,
request_id=i.request_id,
data=i.data,
update_time=i.update_time,
stats_id=i.stats_id,
)
)
progress.update(task, advance=1)
if progress.tasks[task].completed % 100 == 0:
db_session.commit()
db_session.commit()
logger.success('tetris_stats: data migration complete!')
def upgrade(name: str = '') -> None:
if name:
return
data_migrate()
def downgrade(name: str = '') -> None:
if name:
return
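The `data_migrate` helpers in these migrations all share one shape: stream rows from the old table, insert each into the new one, and commit every 100 rows so an interrupted migration loses at most one batch. The skeleton, with the ORM session replaced by injected callables (names are illustrative):

```python
from collections.abc import Callable, Iterable


def copy_with_periodic_commit(
    rows: Iterable[object],
    insert: Callable[[object], None],
    commit: Callable[[], None],
    every: int = 100,
) -> int:
    """Copy rows one at a time, committing every `every` rows and once at the end."""
    copied = 0
    for row in rows:
        insert(row)
        copied += 1
        if copied % every == 0:
            commit()
    commit()  # flush the final partial batch
    return copied


commits: list[int] = []
copied = copy_with_periodic_commit(range(250), insert=lambda _: None, commit=lambda: commits.append(1))
```

Committing periodically trades a little throughput for bounded rework on failure, which suits one-shot migrations that users may interrupt.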


@@ -25,6 +25,8 @@ depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
op.create_table(
'nonebot_plugin_tetris_stats_tetrioleaguestats',
@@ -102,6 +104,8 @@ def upgrade(name: str = '') -> None:
def downgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('nonebot_plugin_tetris_stats_tetrioleaguestatsfield', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nonebot_plugin_tetris_stats_tetrioleaguestatsfield_rank'))


@@ -25,6 +25,8 @@ depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
op.create_table(
'nonebot_plugin_tetris_stats_triggerhistoricaldatav2',
sa.Column('id', sa.Integer(), nullable=False),
@@ -53,6 +55,8 @@ def upgrade(name: str = '') -> None:
def downgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
with op.batch_alter_table('nonebot_plugin_tetris_stats_triggerhistoricaldatav2', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nonebot_plugin_tetris_stats_triggerhistoricaldatav2_game_platform'))
batch_op.drop_index(batch_op.f('ix_nonebot_plugin_tetris_stats_triggerhistoricaldatav2_command_type'))


@@ -26,7 +26,9 @@ depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
if name:
return
from json import dumps, loads
if op.get_bind().dialect.name == 'postgresql':
return
from json import dumps, loads # noqa: PLC0415
Base = automap_base() # noqa: N806
connection = op.get_bind()
@@ -50,7 +52,9 @@ def upgrade(name: str = '') -> None:
def downgrade(name: str = '') -> None:
if name:
return
from json import dumps, loads
if op.get_bind().dialect.name == 'postgresql':
return
from json import dumps, loads # noqa: PLC0415
Base = automap_base() # noqa: N806
connection = op.get_bind()


@@ -0,0 +1,53 @@
"""add compare delta config
Migration ID: 6ecf383d646a
Parent migration: 1c5346b657d4
Created: 2026-01-27 06:05:04.481654
"""
from __future__ import annotations
from typing import TYPE_CHECKING
import sqlalchemy as sa
from alembic import op
if TYPE_CHECKING:
from collections.abc import Sequence
revision: str = '6ecf383d646a'
down_revision: str | Sequence[str] | None = '1c5346b657d4'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
if name:
return
op.create_table(
'nb_t_top_u_cfg',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('compare_delta', sa.Interval(), nullable=True),
sa.PrimaryKeyConstraint('id', name=op.f('pk_nb_t_top_u_cfg')),
info={'bind_key': 'nonebot_plugin_tetris_stats'},
)
op.create_table(
'nb_t_tos_u_cfg',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('compare_delta', sa.Interval(), nullable=True),
sa.PrimaryKeyConstraint('id', name=op.f('pk_nb_t_tos_u_cfg')),
info={'bind_key': 'nonebot_plugin_tetris_stats'},
)
with op.batch_alter_table('nb_t_io_u_cfg', schema=None) as batch_op:
batch_op.add_column(sa.Column('compare_delta', sa.Interval(), nullable=True))
def downgrade(name: str = '') -> None:
if name:
return
with op.batch_alter_table('nb_t_io_u_cfg', schema=None) as batch_op:
batch_op.drop_column('compare_delta')
op.drop_table('nb_t_tos_u_cfg')
op.drop_table('nb_t_top_u_cfg')


@@ -45,7 +45,10 @@ def data_migrate() -> None:
return
try:
from nonebot_session_to_uninfo import check_tables, get_id_map # type: ignore[import-untyped]
from nonebot_session_to_uninfo import ( # type: ignore[import-untyped] # noqa: PLC0415
check_tables,
get_id_map,
)
except ImportError as err:
msg = 'Please install `nonebot-session-to-uninfo` to migrate the data'
raise ValueError(msg) from err
@@ -105,9 +108,13 @@ def data_migrate() -> None:
def upgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
data_migrate()
def downgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return


@@ -0,0 +1,94 @@
"""migrate nonebot_plugin_tetris_stats_tetrioleaguestatsfield
迁移 ID: 8459b2a4b7a3
父迁移: 3d900bb0e8d4
创建时间: 2025-07-18 02:24:59.560252
"""
from __future__ import annotations
from typing import TYPE_CHECKING
from alembic import op
from nonebot.log import logger
from rich.progress import BarColumn, Progress, SpinnerColumn, TaskProgressColumn, TextColumn
from sqlalchemy import inspect
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
if TYPE_CHECKING:
from collections.abc import Sequence
revision: str = '8459b2a4b7a3'
down_revision: str | Sequence[str] | None = '3d900bb0e8d4'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def data_migrate() -> None:
conn = op.get_bind()
insp = inspect(conn)
table_names = insp.get_table_names()
if 'nonebot_plugin_tetris_stats_tetrioleaguestatsfield' not in table_names:
return
Base = automap_base() # noqa: N806
Base.prepare(autoload_with=conn)
Old = Base.classes.nonebot_plugin_tetris_stats_tetrioleaguestatsfield # noqa: N806
New = Base.classes.nb_t_io_tl_stats_field # noqa: N806
with Session(conn) as db_session:
count = db_session.query(Old).count()
if count == 0:
return
logger.warning('tetris_stats: migrating data, please do not close the program...')
with Progress(
SpinnerColumn(),
TextColumn('[progress.description]{task.description}'),
BarColumn(),
TaskProgressColumn(),
) as progress:
task = progress.add_task('Migrating data...', total=count)
for i in db_session.query(Old).yield_per(1):
db_session.add(
New(
id=i.id,
rank=i.rank,
tr_line=i.tr_line,
player_count=i.player_count,
low_pps=i.low_pps,
low_apm=i.low_apm,
low_vs=i.low_vs,
avg_pps=i.avg_pps,
avg_apm=i.avg_apm,
avg_vs=i.avg_vs,
high_pps=i.high_pps,
high_apm=i.high_apm,
high_vs=i.high_vs,
stats_id=i.stats_id,
)
)
progress.update(task, advance=1)
if progress.tasks[task].completed % 100 == 0:
db_session.commit()
db_session.commit()
logger.success('tetris_stats: data migration complete!')
def upgrade(name: str = '') -> None:
if name:
return
data_migrate()
def downgrade(name: str = '') -> None:
if name:
return


@@ -28,10 +28,12 @@ depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None: # noqa: C901
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
from nonebot.compat import PYDANTIC_V2, type_validate_json
from pydantic import BaseModel, ValidationError
from rich.progress import (
from nonebot.compat import PYDANTIC_V2, type_validate_json # noqa: PLC0415
from pydantic import BaseModel, ValidationError # noqa: PLC0415
from rich.progress import ( # noqa: PLC0415
BarColumn,
MofNCompleteColumn,
Progress,
@@ -58,14 +60,14 @@ def upgrade(name: str = '') -> None: # noqa: C901
logger.info('Empty table, skipping')
return
from nonebot_plugin_tetris_stats.version import __version__
from nonebot_plugin_tetris_stats.version import __version__ # noqa: PLC0415
if __version__ != '1.0.3':
msg = 'This migration requires version 1.0.3; pin the version to 1.0.3 before running it'
logger.critical(msg)
raise RuntimeError(msg)
from nonebot_plugin_tetris_stats.game_data_processor.schemas import ( # type: ignore[import-untyped]
from nonebot_plugin_tetris_stats.game_data_processor.schemas import ( # type: ignore[import-untyped] # pyright: ignore[reportMissingImports] # noqa: PLC0415
BaseProcessedData,
)
@@ -101,3 +103,5 @@ def upgrade(name: str = '') -> None: # noqa: C901
def downgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return


@@ -25,6 +25,8 @@ depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
op.create_table(
'nonebot_plugin_tetris_stats_bind',
@@ -122,6 +124,8 @@ def upgrade(name: str = '') -> None:
def downgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('nonebot_plugin_tetris_stats_iorank', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nonebot_plugin_tetris_stats_iorank_rank'))


@@ -63,6 +63,8 @@ def migrate_old_data(connection: Connection) -> None:
def upgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
try:
db_path = Path(config.db_url)
except AttributeError:
@@ -91,3 +93,5 @@ def upgrade(name: str = '') -> None:
def downgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return


@@ -12,7 +12,6 @@ from typing import TYPE_CHECKING
import sqlalchemy as sa
from alembic import op
from sqlalchemy.dialects import sqlite
if TYPE_CHECKING:
from collections.abc import Sequence
@@ -26,6 +25,8 @@ depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('nonebot_plugin_tetris_stats_historicaldata', schema=None) as batch_op:
batch_op.drop_index('ix_nonebot_plugin_tetris_stats_historicaldata_command_type')
@@ -71,6 +72,8 @@ def upgrade(name: str = '') -> None:
def downgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('nonebot_plugin_tetris_stats_historicaldata', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nonebot_plugin_tetris_stats_historicaldata_source_type'))
@@ -82,19 +85,19 @@ def downgrade(name: str = '') -> None:
op.create_table(
'nonebot_plugin_tetris_stats_historicaldata',
sa.Column('id', sa.INTEGER(), nullable=False),
sa.Column('trigger_time', sa.DATETIME(), nullable=False),
sa.Column('bot_platform', sa.VARCHAR(length=32), nullable=True),
sa.Column('bot_account', sa.VARCHAR(), nullable=True),
sa.Column('source_type', sa.VARCHAR(length=32), nullable=True),
sa.Column('source_account', sa.VARCHAR(), nullable=True),
sa.Column('message', sa.BLOB(), nullable=True),
sa.Column('game_platform', sa.VARCHAR(length=32), nullable=False),
sa.Column('command_type', sa.VARCHAR(length=16), nullable=False),
sa.Column('command_args', sqlite.JSON(), nullable=False),
sa.Column('game_user', sa.BLOB(), nullable=False),
sa.Column('processed_data', sa.BLOB(), nullable=False),
sa.Column('finish_time', sa.DATETIME(), nullable=False),
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('trigger_time', sa.DateTime(), nullable=False),
sa.Column('bot_platform', sa.String(length=32), nullable=True),
sa.Column('bot_account', sa.String(), nullable=True),
sa.Column('source_type', sa.String(length=32), nullable=True),
sa.Column('source_account', sa.String(), nullable=True),
sa.Column('message', sa.PickleType(), nullable=True),
sa.Column('game_platform', sa.String(length=32), nullable=False),
sa.Column('command_type', sa.String(length=16), nullable=False),
sa.Column('command_args', sa.JSON(), nullable=False),
sa.Column('game_user', sa.PickleType(), nullable=False),
sa.Column('processed_data', sa.PickleType(), nullable=False),
sa.Column('finish_time', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('id', name='pk_nonebot_plugin_tetris_stats_historicaldata'),
)
with op.batch_alter_table('nonebot_plugin_tetris_stats_historicaldata', schema=None) as batch_op:


@@ -25,6 +25,8 @@ depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
op.create_table(
'nonebot_plugin_tetris_stats_tetriouserconfig',
@@ -39,6 +41,8 @@ def upgrade(name: str = '') -> None:
def downgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
op.drop_table('nonebot_plugin_tetris_stats_tetriouserconfig')
# ### end Alembic commands ###


@@ -25,6 +25,8 @@ depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('nonebot_plugin_tetris_stats_bind', schema=None) as batch_op:
batch_op.drop_index('ix_nonebot_plugin_tetris_stats_bind_chat_account')
@@ -49,6 +51,8 @@ def upgrade(name: str = '') -> None:
def downgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('nonebot_plugin_tetris_stats_bind', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nonebot_plugin_tetris_stats_bind_user_id'))
@@ -57,11 +61,11 @@ def downgrade(name: str = '') -> None:
op.create_table(
'nonebot_plugin_tetris_stats_bind',
sa.Column('id', sa.INTEGER(), nullable=False),
sa.Column('chat_platform', sa.VARCHAR(length=32), nullable=False),
sa.Column('chat_account', sa.VARCHAR(), nullable=False),
sa.Column('game_platform', sa.VARCHAR(length=32), nullable=False),
sa.Column('game_account', sa.VARCHAR(), nullable=False),
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('chat_platform', sa.String(length=32), nullable=False),
sa.Column('chat_account', sa.String(), nullable=False),
sa.Column('game_platform', sa.String(length=32), nullable=False),
sa.Column('game_account', sa.String(), nullable=False),
sa.PrimaryKeyConstraint('id', name='pk_nonebot_plugin_tetris_stats_bind'),
)
with op.batch_alter_table('nonebot_plugin_tetris_stats_bind', schema=None) as batch_op:


@@ -0,0 +1,215 @@
"""create new tables
Migration ID: b2075a5ce371
Parent migration: 766cc7e75a62
Created: 2025-07-17 22:57:32.245327
"""
from __future__ import annotations
from typing import TYPE_CHECKING
import sqlalchemy as sa
from alembic import op
if TYPE_CHECKING:
from collections.abc import Sequence
revision: str = 'b2075a5ce371'
down_revision: str | Sequence[str] | None = '766cc7e75a62'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
if name:
return
# ### commands auto generated by Alembic - please adjust! ###
op.create_table(
'nb_t_bind',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('game_platform', sa.String(length=32), nullable=False),
sa.Column('game_account', sa.String(), nullable=False),
sa.PrimaryKeyConstraint('id', name=op.f('pk_nb_t_bind')),
info={'bind_key': 'nonebot_plugin_tetris_stats'},
)
with op.batch_alter_table('nb_t_bind', schema=None) as batch_op:
batch_op.create_index(batch_op.f('ix_nb_t_bind_user_id'), ['user_id'], unique=False)
op.create_table(
'nb_t_io_hist_data',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_unique_identifier', sa.String(length=24), nullable=False),
sa.Column('api_type', sa.String(length=32), nullable=False),
sa.Column('data', sa.JSON(), nullable=False),
sa.Column('update_time', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('id', name=op.f('pk_nb_t_io_hist_data')),
info={'bind_key': 'nonebot_plugin_tetris_stats'},
)
with op.batch_alter_table('nb_t_io_hist_data', schema=None) as batch_op:
batch_op.create_index(batch_op.f('ix_nb_t_io_hist_data_api_type'), ['api_type'], unique=False)
batch_op.create_index(batch_op.f('ix_nb_t_io_hist_data_update_time'), ['update_time'], unique=False)
batch_op.create_index(
batch_op.f('ix_nb_t_io_hist_data_user_unique_identifier'), ['user_unique_identifier'], unique=False
)
op.create_table(
'nb_t_io_tl_stats',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('update_time', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('id', name=op.f('pk_nb_t_io_tl_stats')),
info={'bind_key': 'nonebot_plugin_tetris_stats'},
)
with op.batch_alter_table('nb_t_io_tl_stats', schema=None) as batch_op:
batch_op.create_index(batch_op.f('ix_nb_t_io_tl_stats_update_time'), ['update_time'], unique=False)
op.create_table(
'nb_t_io_u_cfg',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('query_template', sa.String(length=2), nullable=False),
sa.PrimaryKeyConstraint('id', name=op.f('pk_nb_t_io_u_cfg')),
info={'bind_key': 'nonebot_plugin_tetris_stats'},
)
op.create_table(
'nb_t_top_hist_data',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_unique_identifier', sa.String(length=24), nullable=False),
sa.Column('api_type', sa.String(length=16), nullable=False),
sa.Column('data', sa.JSON(), nullable=False),
sa.Column('update_time', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('id', name=op.f('pk_nb_t_top_hist_data')),
info={'bind_key': 'nonebot_plugin_tetris_stats'},
)
with op.batch_alter_table('nb_t_top_hist_data', schema=None) as batch_op:
batch_op.create_index(batch_op.f('ix_nb_t_top_hist_data_api_type'), ['api_type'], unique=False)
batch_op.create_index(batch_op.f('ix_nb_t_top_hist_data_update_time'), ['update_time'], unique=False)
batch_op.create_index(
batch_op.f('ix_nb_t_top_hist_data_user_unique_identifier'), ['user_unique_identifier'], unique=False
)
op.create_table(
'nb_t_tos_hist_data',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_unique_identifier', sa.String(length=24), nullable=False),
sa.Column('api_type', sa.String(length=16), nullable=False),
sa.Column('data', sa.JSON(), nullable=False),
sa.Column('update_time', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('id', name=op.f('pk_nb_t_tos_hist_data')),
info={'bind_key': 'nonebot_plugin_tetris_stats'},
)
with op.batch_alter_table('nb_t_tos_hist_data', schema=None) as batch_op:
batch_op.create_index(batch_op.f('ix_nb_t_tos_hist_data_api_type'), ['api_type'], unique=False)
batch_op.create_index(batch_op.f('ix_nb_t_tos_hist_data_update_time'), ['update_time'], unique=False)
batch_op.create_index(
batch_op.f('ix_nb_t_tos_hist_data_user_unique_identifier'), ['user_unique_identifier'], unique=False
)
op.create_table(
'nb_t_trigger_hist_v2',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('trigger_time', sa.DateTime(), nullable=False),
sa.Column('session_persist_id', sa.Integer(), nullable=False),
sa.Column('game_platform', sa.String(length=32), nullable=False),
sa.Column('command_type', sa.String(length=16), nullable=False),
sa.Column('command_args', sa.JSON(), nullable=False),
sa.Column('finish_time', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('id', name=op.f('pk_nb_t_trigger_hist_v2')),
info={'bind_key': 'nonebot_plugin_tetris_stats'},
)
with op.batch_alter_table('nb_t_trigger_hist_v2', schema=None) as batch_op:
batch_op.create_index(batch_op.f('ix_nb_t_trigger_hist_v2_command_type'), ['command_type'], unique=False)
batch_op.create_index(batch_op.f('ix_nb_t_trigger_hist_v2_game_platform'), ['game_platform'], unique=False)
op.create_table(
'nb_t_io_tl_hist',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('request_id', sa.Uuid(), nullable=False),
sa.Column('data', sa.JSON(), nullable=False),
sa.Column('update_time', sa.DateTime(), nullable=False),
sa.Column('stats_id', sa.Integer(), nullable=False),
sa.ForeignKeyConstraint(
['stats_id'], ['nb_t_io_tl_stats.id'], name=op.f('fk_nb_t_io_tl_hist_stats_id_nb_t_io_tl_stats')
),
sa.PrimaryKeyConstraint('id', name=op.f('pk_nb_t_io_tl_hist')),
info={'bind_key': 'nonebot_plugin_tetris_stats'},
)
with op.batch_alter_table('nb_t_io_tl_hist', schema=None) as batch_op:
batch_op.create_index(batch_op.f('ix_nb_t_io_tl_hist_request_id'), ['request_id'], unique=False)
batch_op.create_index(batch_op.f('ix_nb_t_io_tl_hist_update_time'), ['update_time'], unique=False)
op.create_table(
'nb_t_io_tl_stats_field',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('rank', sa.String(length=2), nullable=False),
sa.Column('tr_line', sa.Float(), nullable=False),
sa.Column('player_count', sa.Integer(), nullable=False),
sa.Column('low_pps', sa.JSON(), nullable=False),
sa.Column('low_apm', sa.JSON(), nullable=False),
sa.Column('low_vs', sa.JSON(), nullable=False),
sa.Column('avg_pps', sa.Float(), nullable=False),
sa.Column('avg_apm', sa.Float(), nullable=False),
sa.Column('avg_vs', sa.Float(), nullable=False),
sa.Column('high_pps', sa.JSON(), nullable=False),
sa.Column('high_apm', sa.JSON(), nullable=False),
sa.Column('high_vs', sa.JSON(), nullable=False),
sa.Column('stats_id', sa.Integer(), nullable=False),
sa.ForeignKeyConstraint(
['stats_id'], ['nb_t_io_tl_stats.id'], name=op.f('fk_nb_t_io_tl_stats_field_stats_id_nb_t_io_tl_stats')
),
sa.PrimaryKeyConstraint('id', name=op.f('pk_nb_t_io_tl_stats_field')),
info={'bind_key': 'nonebot_plugin_tetris_stats'},
)
with op.batch_alter_table('nb_t_io_tl_stats_field', schema=None) as batch_op:
batch_op.create_index(batch_op.f('ix_nb_t_io_tl_stats_field_rank'), ['rank'], unique=False)
# ### end Alembic commands ###
def downgrade(name: str = '') -> None:
if name:
return
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('nb_t_io_tl_stats_field', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nb_t_io_tl_stats_field_rank'))
op.drop_table('nb_t_io_tl_stats_field')
with op.batch_alter_table('nb_t_io_tl_hist', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nb_t_io_tl_hist_update_time'))
batch_op.drop_index(batch_op.f('ix_nb_t_io_tl_hist_request_id'))
op.drop_table('nb_t_io_tl_hist')
with op.batch_alter_table('nb_t_trigger_hist_v2', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nb_t_trigger_hist_v2_game_platform'))
batch_op.drop_index(batch_op.f('ix_nb_t_trigger_hist_v2_command_type'))
op.drop_table('nb_t_trigger_hist_v2')
with op.batch_alter_table('nb_t_tos_hist_data', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nb_t_tos_hist_data_user_unique_identifier'))
batch_op.drop_index(batch_op.f('ix_nb_t_tos_hist_data_update_time'))
batch_op.drop_index(batch_op.f('ix_nb_t_tos_hist_data_api_type'))
op.drop_table('nb_t_tos_hist_data')
with op.batch_alter_table('nb_t_top_hist_data', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nb_t_top_hist_data_user_unique_identifier'))
batch_op.drop_index(batch_op.f('ix_nb_t_top_hist_data_update_time'))
batch_op.drop_index(batch_op.f('ix_nb_t_top_hist_data_api_type'))
op.drop_table('nb_t_top_hist_data')
op.drop_table('nb_t_io_u_cfg')
with op.batch_alter_table('nb_t_io_tl_stats', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nb_t_io_tl_stats_update_time'))
op.drop_table('nb_t_io_tl_stats')
with op.batch_alter_table('nb_t_io_hist_data', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nb_t_io_hist_data_user_unique_identifier'))
batch_op.drop_index(batch_op.f('ix_nb_t_io_hist_data_update_time'))
batch_op.drop_index(batch_op.f('ix_nb_t_io_hist_data_api_type'))
op.drop_table('nb_t_io_hist_data')
with op.batch_alter_table('nb_t_bind', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nb_t_bind_user_id'))
op.drop_table('nb_t_bind')
# ### end Alembic commands ###
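The `op.f('pk_…')` / `op.f('ix_…')` calls above mark constraint and index names as already final, so Alembic does not re-apply its naming convention to them. A hedged sketch of how such names are derived, assuming a SQLAlchemy naming convention like the one below (the plugin's actual convention may differ, though the `pk_`/`ix_` prefixes match the names in this migration):

```python
import sqlalchemy as sa

# Assumed convention for this demo; it reproduces the pk_/ix_ names above.
convention = {
    'ix': 'ix_%(column_0_label)s',
    'pk': 'pk_%(table_name)s',
}
metadata = sa.MetaData(naming_convention=convention)

bind = sa.Table(
    'nb_t_bind',
    metadata,
    sa.Column('id', sa.Integer, primary_key=True),
    sa.Column('user_id', sa.Integer, index=True),
)

print(bind.primary_key.name)          # pk_nb_t_bind
print(next(iter(bind.indexes)).name)  # ix_nb_t_bind_user_id
```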


@@ -23,13 +23,15 @@ branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
def upgrade(name: str = '') -> None: # noqa: C901
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
from nonebot.compat import type_validate_json
from pydantic import ValidationError
from rich.progress import (
from nonebot.compat import type_validate_json # noqa: PLC0415
from pydantic import ValidationError # noqa: PLC0415
from rich.progress import ( # noqa: PLC0415
BarColumn,
MofNCompleteColumn,
Progress,
@@ -37,9 +39,9 @@ def upgrade(name: str = '') -> None:
TextColumn,
TimeRemainingColumn,
)
from sqlalchemy import select
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
from sqlalchemy import select # noqa: PLC0415
from sqlalchemy.ext.automap import automap_base # noqa: PLC0415
from sqlalchemy.orm import Session # noqa: PLC0415
with op.batch_alter_table('nonebot_plugin_tetris_stats_historicaldata', schema=None) as batch_op:
batch_op.add_column(sa.Column('user_unique_identifier', sa.String(length=32), nullable=True))
@@ -58,13 +60,15 @@ def upgrade(name: str = '') -> None:
if count == 0:
logger.info('Empty table, skipping')
else:
from nonebot_plugin_tetris_stats.version import __version__
from nonebot_plugin_tetris_stats.version import __version__ # noqa: PLC0415
if __version__ != '1.0.4':
msg = 'This migration requires version 1.0.4, please pin to version 1.0.4 before running this migration'
logger.critical(msg)
raise RuntimeError(msg)
from nonebot_plugin_tetris_stats.game_data_processor.schemas import BaseUser # type: ignore[import-untyped]
from nonebot_plugin_tetris_stats.game_data_processor.schemas import ( # type: ignore[import-untyped] # pyright: ignore[reportMissingImports] # noqa: PLC0415
BaseUser,
)
models: list[type[BaseUser]] = BaseUser.__subclasses__()
@@ -103,6 +107,8 @@ def upgrade(name: str = '') -> None:
def downgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('nonebot_plugin_tetris_stats_historicaldata', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_nonebot_plugin_tetris_stats_historicaldata_user_unique_identifier'))


@@ -0,0 +1,82 @@
"""migrate nonebot_plugin_tetris_stats_tetriouserconfig
Migration ID: b96c8c18b79a
Parent migration: 8459b2a4b7a3
Created: 2025-07-18 04:25:44.190319
"""
from __future__ import annotations
from typing import TYPE_CHECKING
from alembic import op
from nonebot.log import logger
from rich.progress import BarColumn, Progress, SpinnerColumn, TaskProgressColumn, TextColumn
from sqlalchemy import inspect
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
if TYPE_CHECKING:
from collections.abc import Sequence
revision: str = 'b96c8c18b79a'
down_revision: str | Sequence[str] | None = '8459b2a4b7a3'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def data_migrate() -> None:
conn = op.get_bind()
insp = inspect(conn)
table_names = insp.get_table_names()
if 'nonebot_plugin_tetris_stats_tetriouserconfig' not in table_names:
return
Base = automap_base() # noqa: N806
Base.prepare(autoload_with=conn)
Old = Base.classes.nonebot_plugin_tetris_stats_tetriouserconfig # noqa: N806
New = Base.classes.nb_t_io_u_cfg # noqa: N806
with Session(conn) as db_session:
count = db_session.query(Old).count()
if count == 0:
return
logger.warning('tetris_stats: migrating data, please do not close the program...')
with Progress(
SpinnerColumn(),
TextColumn('[progress.description]{task.description}'),
BarColumn(),
TaskProgressColumn(),
) as progress:
task = progress.add_task('Migrating data...', total=count)
for i in db_session.query(Old).yield_per(1):
db_session.add(
New(
id=i.id,
query_template=i.query_template,
)
)
progress.update(task, advance=1)
if progress.tasks[task].completed % 100 == 0:
db_session.commit()
db_session.commit()
logger.success('tetris_stats: data migration complete!')
def upgrade(name: str = '') -> None:
if name:
return
data_migrate()
def downgrade(name: str = '') -> None:
if name:
return
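The `data_migrate` helpers in these new revision files all follow one shape: reflect both the old and the new table with automap, stream rows out of the old table, and commit in batches of 100. A condensed, self-contained sketch of that loop against a throwaway SQLite database (the `old_cfg`/`new_cfg` table names are hypothetical stand-ins, and rows are loaded eagerly here for simplicity; the real migrations stream with `yield_per` and drive a rich progress bar):

```python
import sqlalchemy as sa
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session

engine = sa.create_engine('sqlite://')
meta = sa.MetaData()
for table in ('old_cfg', 'new_cfg'):  # hypothetical stand-ins for the plugin tables
    sa.Table(
        table,
        meta,
        sa.Column('id', sa.Integer, primary_key=True),
        sa.Column('query_template', sa.String(2)),
    )
meta.create_all(engine)
with engine.begin() as conn:
    conn.execute(
        sa.insert(meta.tables['old_cfg']),
        [{'id': n, 'query_template': 'v1'} for n in range(1, 251)],
    )

# Reflect both tables as mapped classes, exactly like the migrations do.
Base = automap_base()
Base.prepare(autoload_with=engine)
Old, New = Base.classes.old_cfg, Base.classes.new_cfg

with Session(engine) as db_session:
    for done, row in enumerate(db_session.query(Old).all(), start=1):
        db_session.add(New(id=row.id, query_template=row.query_template))
        if done % 100 == 0:  # periodic commit, mirroring the migrations
            db_session.commit()
    db_session.commit()  # flush the final partial batch
```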


@@ -26,6 +26,8 @@ depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
Base = automap_base() # noqa: N806
connection = op.get_bind()
@@ -40,3 +42,5 @@ def upgrade(name: str = '') -> None:
def downgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return


@@ -0,0 +1,85 @@
"""migrate nonebot_plugin_tetris_stats_tetriohistoricaldata
Migration ID: bbbdfd94e6fa
Parent migration: d61e6ae36586
Created: 2025-07-18 00:42:33.730885
"""
from __future__ import annotations
from typing import TYPE_CHECKING
from alembic import op
from nonebot.log import logger
from rich.progress import BarColumn, Progress, SpinnerColumn, TaskProgressColumn, TextColumn
from sqlalchemy import inspect
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
if TYPE_CHECKING:
from collections.abc import Sequence
revision: str = 'bbbdfd94e6fa'
down_revision: str | Sequence[str] | None = 'd61e6ae36586'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def data_migrate() -> None:
conn = op.get_bind()
insp = inspect(conn)
table_names = insp.get_table_names()
if 'nonebot_plugin_tetris_stats_tetriohistoricaldata' not in table_names:
return
Base = automap_base() # noqa: N806
Base.prepare(autoload_with=conn)
Old = Base.classes.nonebot_plugin_tetris_stats_tetriohistoricaldata # noqa: N806
New = Base.classes.nb_t_io_hist_data # noqa: N806
with Session(conn) as db_session:
count = db_session.query(Old).count()
if count == 0:
return
logger.warning('tetris_stats: migrating data, please do not close the program...')
with Progress(
SpinnerColumn(),
TextColumn('[progress.description]{task.description}'),
BarColumn(),
TaskProgressColumn(),
) as progress:
task = progress.add_task('Migrating data...', total=count)
for i in db_session.query(Old).yield_per(1):
db_session.add(
New(
id=i.id,
user_unique_identifier=i.user_unique_identifier,
api_type=i.api_type,
data=i.data,
update_time=i.update_time,
)
)
progress.update(task, advance=1)
if progress.tasks[task].completed % 100 == 0:
db_session.commit()
db_session.commit()
logger.success('tetris_stats: data migration complete!')
def upgrade(name: str = '') -> None:
if name:
return
data_migrate()
def downgrade(name: str = '') -> None:
if name:
return


@@ -0,0 +1,87 @@
"""migrate nonebot_plugin_tetris_stats_triggerhistoricaldatav2
Migration ID: bc6abd57928f
Parent migration: ee76ae37d70a
Created: 2025-07-18 04:33:04.222045
"""
from __future__ import annotations
from typing import TYPE_CHECKING
from alembic import op
from nonebot.log import logger
from rich.progress import BarColumn, Progress, SpinnerColumn, TaskProgressColumn, TextColumn
from sqlalchemy import inspect
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
if TYPE_CHECKING:
from collections.abc import Sequence
revision: str = 'bc6abd57928f'
down_revision: str | Sequence[str] | None = 'ee76ae37d70a'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def data_migrate() -> None:
conn = op.get_bind()
insp = inspect(conn)
table_names = insp.get_table_names()
if 'nonebot_plugin_tetris_stats_triggerhistoricaldatav2' not in table_names:
return
Base = automap_base() # noqa: N806
Base.prepare(autoload_with=conn)
Old = Base.classes.nonebot_plugin_tetris_stats_triggerhistoricaldatav2 # noqa: N806
New = Base.classes.nb_t_trigger_hist_v2 # noqa: N806
with Session(conn) as db_session:
count = db_session.query(Old).count()
if count == 0:
return
logger.warning('tetris_stats: migrating data, please do not close the program...')
with Progress(
SpinnerColumn(),
TextColumn('[progress.description]{task.description}'),
BarColumn(),
TaskProgressColumn(),
) as progress:
task = progress.add_task('Migrating data...', total=count)
for i in db_session.query(Old).yield_per(1):
db_session.add(
New(
id=i.id,
trigger_time=i.trigger_time,
session_persist_id=i.session_persist_id,
game_platform=i.game_platform,
command_type=i.command_type,
command_args=i.command_args,
finish_time=i.finish_time,
)
)
progress.update(task, advance=1)
if progress.tasks[task].completed % 100 == 0:
db_session.commit()
db_session.commit()
logger.success('tetris_stats: data migration complete!')
def upgrade(name: str = '') -> None:
if name:
return
data_migrate()
def downgrade(name: str = '') -> None:
if name:
return


@@ -0,0 +1,85 @@
"""migrate nonebot_plugin_tetris_stats_tophistoricaldata
Migration ID: ce073d279d19
Parent migration: b96c8c18b79a
Created: 2025-07-18 04:28:13.820635
"""
from __future__ import annotations
from typing import TYPE_CHECKING
from alembic import op
from nonebot.log import logger
from rich.progress import BarColumn, Progress, SpinnerColumn, TaskProgressColumn, TextColumn
from sqlalchemy import inspect
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
if TYPE_CHECKING:
from collections.abc import Sequence
revision: str = 'ce073d279d19'
down_revision: str | Sequence[str] | None = 'b96c8c18b79a'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def data_migrate() -> None:
conn = op.get_bind()
insp = inspect(conn)
table_names = insp.get_table_names()
if 'nonebot_plugin_tetris_stats_tophistoricaldata' not in table_names:
return
Base = automap_base() # noqa: N806
Base.prepare(autoload_with=conn)
Old = Base.classes.nonebot_plugin_tetris_stats_tophistoricaldata # noqa: N806
New = Base.classes.nb_t_top_hist_data # noqa: N806
with Session(conn) as db_session:
count = db_session.query(Old).count()
if count == 0:
return
logger.warning('tetris_stats: migrating data, please do not close the program...')
with Progress(
SpinnerColumn(),
TextColumn('[progress.description]{task.description}'),
BarColumn(),
TaskProgressColumn(),
) as progress:
task = progress.add_task('Migrating data...', total=count)
for i in db_session.query(Old).yield_per(1):
db_session.add(
New(
id=i.id,
user_unique_identifier=i.user_unique_identifier,
api_type=i.api_type,
data=i.data,
update_time=i.update_time,
)
)
progress.update(task, advance=1)
if progress.tasks[task].completed % 100 == 0:
db_session.commit()
db_session.commit()
logger.success('tetris_stats: data migration complete!')
def upgrade(name: str = '') -> None:
if name:
return
data_migrate()
def downgrade(name: str = '') -> None:
if name:
return


@@ -26,6 +26,8 @@ depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('nonebot_plugin_tetris_stats_tetriohistoricaldata', schema=None) as batch_op:
batch_op.alter_column(
@@ -38,6 +40,8 @@ def upgrade(name: str = '') -> None:
def downgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
# ### commands auto generated by Alembic - please adjust! ###
logger.warning('New data may not support downgrade!')
logger.warning('Please make sure the data in the database can be migrated back to the old version!')


@@ -0,0 +1,84 @@
"""migrate nonebot_plugin_tetris_stats_bind
Migration ID: d61e6ae36586
Parent migration: b2075a5ce371
Created: 2025-07-17 23:58:13.408384
"""
from __future__ import annotations
from typing import TYPE_CHECKING
from alembic import op
from nonebot.log import logger
from rich.progress import BarColumn, Progress, SpinnerColumn, TaskProgressColumn, TextColumn
from sqlalchemy import inspect
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
if TYPE_CHECKING:
from collections.abc import Sequence
revision: str = 'd61e6ae36586'
down_revision: str | Sequence[str] | None = 'b2075a5ce371'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def data_migrate() -> None:
conn = op.get_bind()
insp = inspect(conn)
table_names = insp.get_table_names()
if 'nonebot_plugin_tetris_stats_bind' not in table_names:
return
Base = automap_base() # noqa: N806
Base.prepare(autoload_with=conn)
Old = Base.classes.nonebot_plugin_tetris_stats_bind # noqa: N806
New = Base.classes.nb_t_bind # noqa: N806
with Session(conn) as db_session:
count = db_session.query(Old).count()
if count == 0:
return
logger.warning('tetris_stats: migrating data, please do not close the program...')
with Progress(
SpinnerColumn(),
TextColumn('[progress.description]{task.description}'),
BarColumn(),
TaskProgressColumn(),
) as progress:
task = progress.add_task('Migrating data...', total=count)
for i in db_session.query(Old).yield_per(100):
db_session.add(
New(
id=i.id,
user_id=i.user_id,
game_platform=i.game_platform,
game_account=i.game_account,
)
)
progress.update(task, advance=1)
if progress.tasks[task].completed % 100 == 0:
db_session.commit()
db_session.commit()
logger.success('tetris_stats: data migration complete!')
def upgrade(name: str = '') -> None:
if name:
return
data_migrate()
def downgrade(name: str = '') -> None:
if name:
return


@@ -0,0 +1,85 @@
"""migrate nonebot_plugin_tetris_stats_toshistoricaldata
Migration ID: ee76ae37d70a
Parent migration: ce073d279d19
Created: 2025-07-18 04:29:52.976624
"""
from __future__ import annotations
from typing import TYPE_CHECKING
from alembic import op
from nonebot.log import logger
from rich.progress import BarColumn, Progress, SpinnerColumn, TaskProgressColumn, TextColumn
from sqlalchemy import inspect
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session
if TYPE_CHECKING:
from collections.abc import Sequence
revision: str = 'ee76ae37d70a'
down_revision: str | Sequence[str] | None = 'ce073d279d19'
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None
def data_migrate() -> None:
conn = op.get_bind()
insp = inspect(conn)
table_names = insp.get_table_names()
if 'nonebot_plugin_tetris_stats_toshistoricaldata' not in table_names:
return
Base = automap_base() # noqa: N806
Base.prepare(autoload_with=conn)
Old = Base.classes.nonebot_plugin_tetris_stats_toshistoricaldata # noqa: N806
New = Base.classes.nb_t_tos_hist_data # noqa: N806
with Session(conn) as db_session:
count = db_session.query(Old).count()
if count == 0:
return
logger.warning('tetris_stats: migrating data, please do not close the program...')
with Progress(
SpinnerColumn(),
TextColumn('[progress.description]{task.description}'),
BarColumn(),
TaskProgressColumn(),
) as progress:
task = progress.add_task('Migrating data...', total=count)
for i in db_session.query(Old).yield_per(1):
db_session.add(
New(
id=i.id,
user_unique_identifier=i.user_unique_identifier,
api_type=i.api_type,
data=i.data,
update_time=i.update_time,
)
)
progress.update(task, advance=1)
if progress.tasks[task].completed % 100 == 0:
db_session.commit()
db_session.commit()
logger.success('tetris_stats: data migration complete!')
def upgrade(name: str = '') -> None:
if name:
return
data_migrate()
def downgrade(name: str = '') -> None:
if name:
return


@@ -25,6 +25,8 @@ depends_on: str | Sequence[str] | None = None
def upgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
with op.batch_alter_table('nonebot_plugin_tetris_stats_iorank', schema=None) as batch_op:
batch_op.drop_index('ix_nonebot_plugin_tetris_stats_iorank_file_hash')
batch_op.drop_index('ix_nonebot_plugin_tetris_stats_iorank_rank')
@@ -66,23 +68,25 @@ def upgrade(name: str = '') -> None:
def downgrade(name: str = '') -> None:
if name:
return
if op.get_bind().dialect.name == 'postgresql':
return
op.create_table(
'nonebot_plugin_tetris_stats_iorank',
sa.Column('id', sa.INTEGER(), nullable=False),
sa.Column('rank', sa.VARCHAR(length=2), nullable=False),
sa.Column('tr_line', sa.FLOAT(), nullable=False),
sa.Column('player_count', sa.INTEGER(), nullable=False),
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('rank', sa.String(length=2), nullable=False),
sa.Column('tr_line', sa.Float(), nullable=False),
sa.Column('player_count', sa.Integer(), nullable=False),
sa.Column('low_pps', sa.JSON(), nullable=False),
sa.Column('low_apm', sa.JSON(), nullable=False),
sa.Column('low_vs', sa.JSON(), nullable=False),
sa.Column('avg_pps', sa.FLOAT(), nullable=False),
sa.Column('avg_apm', sa.FLOAT(), nullable=False),
sa.Column('avg_vs', sa.FLOAT(), nullable=False),
sa.Column('avg_pps', sa.Float(), nullable=False),
sa.Column('avg_apm', sa.Float(), nullable=False),
sa.Column('avg_vs', sa.Float(), nullable=False),
sa.Column('high_pps', sa.JSON(), nullable=False),
sa.Column('high_apm', sa.JSON(), nullable=False),
sa.Column('high_vs', sa.JSON(), nullable=False),
sa.Column('update_time', sa.DATETIME(), nullable=False),
sa.Column('file_hash', sa.VARCHAR(length=128), nullable=True),
sa.Column('update_time', sa.DateTime(), nullable=False),
sa.Column('file_hash', sa.String(length=128), nullable=True),
sa.PrimaryKeyConstraint('id', name='pk_nonebot_plugin_tetris_stats_iorank'),
)
with op.batch_alter_table('nonebot_plugin_tetris_stats_iorank', schema=None) as batch_op:

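The hunks above replace dialect-level uppercase types (`sa.VARCHAR`, `sa.DATETIME`, `sa.FLOAT`) with generic CamelCase types (`sa.String`, `sa.DateTime`, `sa.Float`). Generic types let each backend emit its own DDL, which is what makes the rewritten downgrade tables portable to PostgreSQL. A small sketch of the difference:

```python
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql, sqlite

# A generic type compiles to whatever the target backend prefers...
generic = sa.DateTime()
print(generic.compile(dialect=sqlite.dialect()))      # DATETIME
print(generic.compile(dialect=postgresql.dialect()))  # TIMESTAMP WITHOUT TIME ZONE

# ...while an uppercase type pins the exact SQL keyword on every backend.
pinned = sa.DATETIME()
print(pinned.compile(dialect=postgresql.dialect()))   # DATETIME
```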
View File

@@ -1,7 +1,7 @@
from asyncio import Lock
from collections.abc import AsyncGenerator
from contextlib import asynccontextmanager
from datetime import datetime, timezone
from datetime import datetime, timedelta, timezone
from enum import Enum, auto
from typing import TYPE_CHECKING, Literal, TypeVar, overload
@@ -11,6 +11,7 @@ from nonebot_plugin_orm import AsyncSession, get_session
from nonebot_plugin_user import User
from sqlalchemy import select
from ..utils.duration import DEFAULT_COMPARE_DELTA
from ..utils.typedefs import AllCommandType, BaseCommandType, GameType, TETRIOCommandType
from .models import Bind, TriggerHistoricalDataV2
@@ -18,8 +19,11 @@ UTC = timezone.utc
if TYPE_CHECKING:
from ..games.tetrio.api.models import TETRIOHistoricalData
from ..games.tetrio.models import TETRIOUserConfig
from ..games.top.api.models import TOPHistoricalData
from ..games.top.models import TOPUserConfig
from ..games.tos.api.models import TOSHistoricalData
from ..games.tos.models import TOSUserConfig
class BindStatus(Enum):
@@ -42,6 +46,8 @@ async def create_or_update_bind(
user: User,
game_platform: GameType,
game_account: str,
*,
verify: bool = False,
) -> BindStatus:
bind = await query_bind_info(
session=session,
@@ -53,11 +59,13 @@ async def create_or_update_bind(
user_id=user.id,
game_platform=game_platform,
game_account=game_account,
verify=verify,
)
session.add(bind)
status = BindStatus.SUCCESS
else:
bind.game_account = game_account
bind.verify = verify
status = BindStatus.UPDATE
await session.commit()
return status
@@ -80,12 +88,12 @@ async def remove_bind(
return False
T = TypeVar('T', 'TETRIOHistoricalData', 'TOPHistoricalData', 'TOSHistoricalData')
T_HistoricalData = TypeVar('T_HistoricalData', 'TETRIOHistoricalData', 'TOPHistoricalData', 'TOSHistoricalData')
lock = Lock()
async def anti_duplicate_add(model: T) -> None:
async def anti_duplicate_add(model: T_HistoricalData) -> None:
async with lock, get_session() as session:
result = (
await session.scalars(
@@ -104,6 +112,19 @@ async def anti_duplicate_add(model: T) -> None:
await session.commit()
T_CONFIG = TypeVar('T_CONFIG', 'TETRIOUserConfig', 'TOPUserConfig', 'TOSUserConfig')
async def resolve_compare_delta(
config: type[T_CONFIG], session: AsyncSession, user_id: int, compare: timedelta | None
) -> timedelta:
return (
compare
or await session.scalar(select(config.compare_delta).where(config.id == user_id))
or DEFAULT_COMPARE_DELTA
)
@asynccontextmanager
@overload
async def trigger(

View File

@@ -2,7 +2,7 @@ from collections.abc import Callable, Sequence
from datetime import datetime
from typing import Any
from nonebot.compat import PYDANTIC_V2, type_validate_json
from nonebot.compat import PYDANTIC_V2, type_validate_python
from nonebot_plugin_orm import Model
from pydantic import BaseModel, ValidationError
from sqlalchemy import JSON, DateTime, Dialect, String, TypeDecorator
@@ -30,27 +30,27 @@ class PydanticType(TypeDecorator):
if PYDANTIC_V2:
@override
def process_bind_param(self, value: Any | None, dialect: Dialect) -> str:
def process_bind_param(self, value: Any | None, dialect: Dialect) -> dict | list:
# Convert the Pydantic model instance to JSON-serializable data
if isinstance(value, tuple(self.models)):
return value.model_dump_json(by_alias=True) # type: ignore[union-attr]
return value.model_dump(mode='json', by_alias=True) # type: ignore[union-attr]
raise TypeError
else:
@override
def process_bind_param(self, value: Any | None, dialect: Dialect) -> str:
def process_bind_param(self, value: Any | None, dialect: Dialect) -> dict | list:
# Convert the Pydantic model instance to JSON-serializable data
if isinstance(value, tuple(self.models)):
return value.json(by_alias=True) # type: ignore[union-attr]
return value.dict(by_alias=True) # type: ignore[union-attr]
raise TypeError
@override
def process_result_value(self, value: Any | None, dialect: Dialect) -> BaseModel:
# Convert JSON back into a Pydantic model instance
if isinstance(value, str | bytes):
if isinstance(value, dict | list):
for i in self.models:
try:
return type_validate_json(i, value)
return type_validate_python(i, value)
except ValidationError: # noqa: PERF203
...
raise ValueError
@@ -65,13 +65,18 @@ class PydanticType(TypeDecorator):
class Bind(MappedAsDataclass, Model):
__tablename__ = 'nb_t_bind'
id: Mapped[int] = mapped_column(init=False, primary_key=True)
user_id: Mapped[int] = mapped_column(index=True)
game_platform: Mapped[GameType] = mapped_column(String(32))
game_account: Mapped[str]
verify: Mapped[bool]
class TriggerHistoricalDataV2(MappedAsDataclass, Model):
__tablename__ = 'nb_t_trigger_hist_v2'
id: Mapped[int] = mapped_column(init=False, primary_key=True)
trigger_time: Mapped[datetime] = mapped_column(DateTime)
session_persist_id: Mapped[int]

View File

@@ -7,7 +7,7 @@ from nonebot.typing import T_Handler
from nonebot_plugin_alconna import AlcMatches, Alconna, At, CommandMeta, on_alconna
from .. import ns
from ..i18n.model import Lang
from ..i18n import Lang
from ..utils.exception import MessageFormatError, NeedCatchError
command: Alconna = Alconna(
@@ -47,7 +47,7 @@ async def _(matcher: Matcher, matches: AlcMatches):
if (matches.head_matched and matches.options != {}) or matches.main_args == {}:
await matcher.finish(
(f'{matches.error_info!r}\n' if matches.error_info is not None else '')
+ f'Enter "{matches.header_result} --help" to see usage'
+ Lang.help.usage(command=matches.header_result)
)

View File

@@ -2,6 +2,7 @@ from abc import ABC, abstractmethod
from typing import Generic, TypeVar
from pydantic import BaseModel
from typing_extensions import override
from ..utils.typedefs import GameType
@@ -13,6 +14,7 @@ class BaseUser(BaseModel, ABC, Generic[T]):
platform: T
@override
def __eq__(self, other: object) -> bool:
if isinstance(other, BaseUser):
return self.unique_identifier == other.unique_identifier
@@ -22,3 +24,5 @@ class BaseUser(BaseModel, ABC, Generic[T]):
@abstractmethod
def unique_identifier(self) -> str:
raise NotImplementedError
__hash__ = BaseModel.__hash__
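The `__hash__ = BaseModel.__hash__` line added above works around a Python data-model rule: a class that defines `__eq__` gets `__hash__` implicitly set to `None` unless it restores one explicitly. A minimal stdlib demonstration of the same gotcha (class names invented for illustration):

```python
class Broken:
    def __init__(self, ident: str) -> None:
        self.ident = ident

    # Defining __eq__ makes Python set __hash__ = None on the class,
    # so instances become unhashable (unusable in sets or as dict keys).
    def __eq__(self, other: object) -> bool:
        return isinstance(other, Broken) and self.ident == other.ident

class Fixed(Broken):
    # Restore a hash explicitly, as the diff does with BaseModel.__hash__.
    # (object.__hash__ is identity-based and shown only to illustrate the
    # mechanism; pydantic's BaseModel.__hash__ stays consistent with its
    # field-based equality.)
    __hash__ = object.__hash__

try:
    hash(Broken('a'))
except TypeError:
    print('Broken is unhashable')

print(isinstance(hash(Fixed('a')), int))  # hashing works again → True
```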

View File

@@ -23,7 +23,7 @@ command = Subcommand(
)
from . import bind, config, list, query, rank, record, unbind # noqa: A004, E402
from . import bind, config, list, query, rank, record, unbind, verify # noqa: A004, E402
main_command.add(command)
@@ -36,4 +36,5 @@ __all__ = [
'rank',
'record',
'unbind',
'verify',
]

View File

@@ -17,7 +17,7 @@ UTC = timezone.utc
request = Request(config.tetris.proxy.tetrio or config.tetris.proxy.main)
request.request = limit(timedelta(seconds=1))(request.request) # type: ignore[method-assign]
request.request = limit(timedelta(seconds=1))(request.request) # type: ignore[method-assign] # pyright: ignore[reportAttributeAccessIssue]
class Cache:
@@ -32,7 +32,7 @@ class Cache:
logger.debug(f'{url}: Cache hit!')
return cached_data
response_data = await request.request(url, extra_headers, enable_anti_cloudflare=True)
parsed_data: SuccessModel | FailedModel = type_validate_json(SuccessModel | FailedModel, response_data) # type: ignore[arg-type]
parsed_data: SuccessModel | FailedModel = type_validate_json(SuccessModel | FailedModel, response_data) # type: ignore[arg-type] # pyright: ignore[reportArgumentType]
if isinstance(parsed_data, SuccessModel):
await cls.cache.add(
url,

View File

@@ -20,7 +20,7 @@ async def by(
by_type: Literal['league', 'xp', 'ar'], parameter: Parameter, x_session_id: UUID | None = None
) -> BySuccessModel:
model: By = type_validate_json(
By, # type: ignore[arg-type]
By, # type: ignore[arg-type] # pyright: ignore[reportArgumentType]
await get(
BASE_URL / f'users/by/{by_type}',
parameter,
@@ -69,7 +69,7 @@ async def records(
match records_type:
case '40l' | 'blitz':
model = type_validate_json(
Solo, # type: ignore[arg-type]
Solo, # type: ignore[arg-type] # pyright: ignore[reportArgumentType]
await get(
BASE_URL / 'records' / f'{records_type}{scope}{revolution_id if revolution_id is not None else ""}',
parameter,
@@ -77,17 +77,14 @@ async def records(
)
case 'zenith' | 'zenithex':
model = type_validate_json(
Zenith, # type: ignore[arg-type]
Zenith, # type: ignore[arg-type] # pyright: ignore[reportArgumentType]
await get(
BASE_URL / 'records' / f'{records_type}{scope}{revolution_id if revolution_id is not None else ""}',
parameter,
),
)
case _:
msg = f'records_type: {records_type} is not supported'
raise ValueError(msg)
if isinstance(model, FailedModel):
msg = f'Leaderboard info request error:\n{model.error}' # type: ignore[attr-defined]
msg = f'Leaderboard info request error:\n{model.error}'
raise RequestError(msg)
return model

View File

@@ -11,6 +11,8 @@ from .typedefs import Records, Summaries
class TETRIOHistoricalData(MappedAsDataclass, Model):
__tablename__ = 'nb_t_io_hist_data'
id: Mapped[int] = mapped_column(init=False, primary_key=True)
user_unique_identifier: Mapped[str] = mapped_column(String(24), index=True)
api_type: Mapped[Literal['User Info', Records, Summaries]] = mapped_column(String(32), index=True)

View File

@@ -114,7 +114,7 @@ class Player:
"""Get User Info"""
if self._user_info is None:
raw_user_info = await Cache.get(BASE_URL / 'users' / self._request_user_parameter)
user_info: UserInfo = type_validate_json(UserInfo, raw_user_info) # type: ignore[arg-type]
user_info: UserInfo = type_validate_json(UserInfo, raw_user_info) # type: ignore[arg-type] # pyright: ignore[reportArgumentType]
if isinstance(user_info, FailedModel):
msg = f'User info request error:\n{user_info.error}'
raise RequestError(msg)
@@ -146,7 +146,7 @@ class Player:
BASE_URL / 'users' / self._request_user_parameter / 'summaries' / summaries_type
)
summaries: SummariesModel | FailedModel = type_validate_json(
self.__SUMMARIES_MAPPING[summaries_type] | FailedModel, # type: ignore[arg-type]
self.__SUMMARIES_MAPPING[summaries_type] | FailedModel, # type: ignore[assignment, arg-type] # pyright: ignore[reportArgumentType] #! waiting for [PEP 747](https://peps.python.org/pep-0747/)
raw_summaries,
)
if isinstance(summaries, FailedModel):
@@ -166,7 +166,7 @@ class Player:
async def get_leagueflow(self) -> LeagueFlowSuccess:
if self._leagueflow is None:
leagueflow: LeagueFlow = type_validate_json(
LeagueFlow, # type: ignore[arg-type]
LeagueFlow, # type: ignore[arg-type] # pyright: ignore[reportArgumentType]
await Cache.get(BASE_URL / 'labs/leagueflow' / self._request_user_parameter),
)
if isinstance(leagueflow, FailedModel):
@@ -227,7 +227,7 @@ class Player:
raw_records = await Cache.get(
BASE_URL / 'users' / self._request_user_parameter / 'records' / mode_type / records_type,
)
records: RecordsSoloSuccessModel | FailedModel = type_validate_json(SoloRecord, raw_records) # type: ignore[arg-type]
records: RecordsSoloSuccessModel | FailedModel = type_validate_json(SoloRecord, raw_records) # type: ignore[arg-type] # pyright: ignore[reportArgumentType]
if isinstance(records, FailedModel):
msg = f'User Summaries data request error:\n{records.error}'
raise RequestError(msg)

View File

@@ -52,12 +52,12 @@ class P(BaseModel):
# fmt: off
class ArCounts(BaseModel):
bronze: int | None = Field(default=None, alias='1') # pyright: ignore [reportGeneralTypeIssues]
silver: int | None = Field(default=None, alias='2') # pyright: ignore [reportGeneralTypeIssues]
gold: int | None = Field(default=None, alias='3') # pyright: ignore [reportGeneralTypeIssues]
platinum: int | None = Field(default=None, alias='4') # pyright: ignore [reportGeneralTypeIssues]
diamond: int | None = Field(default=None, alias='5') # pyright: ignore [reportGeneralTypeIssues]
issued: int | None = Field(default=None, alias='100') # pyright: ignore [reportGeneralTypeIssues]
bronze: int | None = Field(default=None, alias='1')
silver: int | None = Field(default=None, alias='2')
gold: int | None = Field(default=None, alias='3')
platinum: int | None = Field(default=None, alias='4')
diamond: int | None = Field(default=None, alias='5')
issued: int | None = Field(default=None, alias='100')
top3: int | None = Field(default=None, alias='t3')
top5: int | None = Field(default=None, alias='t5')
top10: int | None = Field(default=None, alias='t10')
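The `ArCounts` fields above map awkward JSON keys (`'1'`…`'5'`, `'100'`, `'t3'`…) onto readable attribute names via pydantic's `alias=`, with every field optional. The same renaming can be sketched with the stdlib alone (field names follow the diff; the payload is invented):

```python
ALIASES = {'1': 'bronze', '2': 'silver', '3': 'gold',
           '4': 'platinum', '5': 'diamond', '100': 'issued',
           't3': 'top3', 't5': 'top5', 't10': 'top10'}

def parse_ar_counts(payload: dict) -> dict:
    # Every field is optional (default=None in the models above),
    # so absent keys simply stay None.
    return {name: payload.get(key) for key, name in ALIASES.items()}

counts = parse_ar_counts({'1': 12, '3': 4, 't3': 1})
print(counts['bronze'], counts['gold'], counts['top3'], counts['silver'])
# → 12 4 1 None
```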

View File

@@ -14,6 +14,7 @@ class RankType(IntEnum):
PERCENTILELAX = 4
PERCENTILEVLAX = 5
PERCENTILEMLAX = 6
PERCENTILEINVARIANT = 7
class ValueType(IntEnum):

View File

@@ -32,7 +32,7 @@ class PastInner(BaseModel):
class Past(BaseModel):
first: PastInner | None = Field(default=None, alias='1') # pyright: ignore [reportGeneralTypeIssues]
first: PastInner | None = Field(default=None, alias='1')
class BaseData(BaseModel):

View File

@@ -33,6 +33,11 @@ class Distinguishment(BaseModel):
type: str
class OldUsernames(BaseModel):
username: str
ts: datetime
class Data(BaseModel):
id: str = Field(default=..., alias='_id')
username: str
@@ -65,6 +70,7 @@ class Data(BaseModel):
achievements: list[int]
ar: int
ar_counts: ArCounts
oldusernames: list[OldUsernames]
class UserInfoSuccess(BaseSuccessModel):

View File

@@ -1,3 +1,4 @@
from asyncio import gather
from hashlib import md5
from secrets import choice
@@ -13,12 +14,13 @@ from yarl import URL
from ...config.config import global_config
from ...db import BindStatus, create_or_update_bind, trigger
from ...utils.host import HostPage, get_self_netloc
from ...i18n import Lang
from ...utils.host import get_self_netloc
from ...utils.image import get_avatar
from ...utils.lang import get_lang
from ...utils.render import Bind, render
from ...utils.render import render_image
from ...utils.render.schemas.base import Avatar, People
from ...utils.screenshot import screenshot
from ...utils.render.schemas.bind import Bind
from . import alc, command, get_player
from .api import Player
from .constant import GAME_TYPE
@@ -44,6 +46,42 @@ alc.shortcut(
humanized='io绑定',
)
try:
from nonebot.adapters.discord import MessageCreateEvent
@alc.assign('TETRIO.bind')
async def _(_: MessageCreateEvent, nb_user: User, account: Player, event_session: Uninfo, interface: QryItrface):
async with trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='bind',
command_args=[],
):
user, user_info = await gather(account.user, account.get_info())
verify = (
user_info.data.connections.discord is not None
and user_info.data.connections.discord.id == event_session.user.id
)
async with get_session() as session:
bind_status = await create_or_update_bind(
session=session,
user=nb_user,
game_platform=GAME_TYPE,
game_account=user.unique_identifier,
verify=verify,
)
if bind_status in (BindStatus.SUCCESS, BindStatus.UPDATE):
await UniMessage.image(
raw=await make_bind_image(
player=account,
event_session=event_session,
interface=interface,
verify=verify,
)
).finish()
except ImportError:
pass
@alc.assign('TETRIO.bind')
async def _(nb_user: User, account: Player, event_session: Uninfo, interface: QryItrface):
@@ -62,36 +100,45 @@ async def _(nb_user: User, account: Player, event_session: Uninfo, interface: Qr
game_account=user.unique_identifier,
)
if bind_status in (BindStatus.SUCCESS, BindStatus.UPDATE):
netloc = get_self_netloc()
async with HostPage(
await render(
'v1/binding',
Bind(
platform='TETR.IO',
type='unknown',
user=People(
avatar=str(
URL(f'http://{netloc}/host/resource/tetrio/avatars/{user.ID}')
% {'revision': avatar_revision}
)
if (avatar_revision := (await account.avatar_revision)) is not None and avatar_revision != 0
else Avatar(type='identicon', hash=md5(user.ID.encode()).hexdigest()), # noqa: S324
name=user.name.upper(),
),
bot=People(
avatar=await get_avatar(
(
bot_user := await interface.get_user(event_session.self_id)
or UninfoUser(id=event_session.self_id)
),
'Data URI',
'../../static/logo/logo.svg',
),
name=bot_user.nick or bot_user.name or choice(list(global_config.nickname) or ['bot']),
),
prompt='io查我',
lang=get_lang(),
),
await UniMessage.image(
raw=await make_bind_image(
player=account,
event_session=event_session,
interface=interface,
verify=None,
)
) as page_hash:
await UniMessage.image(raw=await screenshot(f'http://{netloc}/host/{page_hash}.html')).finish()
).finish()
async def make_bind_image(
player: Player, event_session: Uninfo, interface: QryItrface, *, verify: bool | None = None
) -> bytes:
(user, avatar_revision) = await gather(player.user, player.avatar_revision)
return await render_image(
Bind(
platform='TETR.IO',
type='unknown' if verify is None else 'success' if verify else 'unverified',
user=People(
avatar=str(
URL(f'http://{get_self_netloc()}/host/resource/tetrio/avatars/{user.ID}')
% {'revision': avatar_revision}
)
if avatar_revision is not None and avatar_revision != 0
else Avatar(type='identicon', hash=md5(user.ID.encode()).hexdigest()), # noqa: S324
name=user.name.upper(),
),
bot=People(
avatar=await get_avatar(
(
bot_user := await interface.get_user(event_session.self_id)
or UninfoUser(id=event_session.self_id)
),
'Data URI',
'../../static/logo/logo.svg',
),
name=bot_user.nick or bot_user.name or choice(list(global_config.nickname) or ['bot']),
),
prompt=Lang.prompt.io_check(),
lang=get_lang(),
),
)

View File

@@ -1,3 +1,5 @@
from datetime import timedelta
from arclet.alconna import Arg
from nonebot_plugin_alconna import Option, Subcommand
from nonebot_plugin_alconna.uniseg import UniMessage
@@ -8,6 +10,8 @@ from nonebot_plugin_user import User
from sqlalchemy import select
from ...db import trigger
from ...i18n import Lang
from ...utils.duration import parse_duration
from . import alc, command
from .constant import GAME_TYPE
from .models import TETRIOUserConfig
@@ -22,6 +26,12 @@ command.add(
alias=['-DT', 'DefaultTemplate'],
help_text='Set the default query template',
),
Option(
'--default-compare',
Arg('compare', parse_duration, notice='compare time offset'),
alias=['-DC', 'DefaultCompare'],
help_text='Set the default compare time offset',
),
help_text='TETR.IO query personalization config',
),
)
@@ -34,18 +44,28 @@ alc.shortcut(
@alc.assign('TETRIO.config')
async def _(user: User, session: async_scoped_session, event_session: Uninfo, template: Template):
async def _(
user: User,
session: async_scoped_session,
event_session: Uninfo,
template: Template | None = None,
compare: timedelta | None = None,
):
async with trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='config',
command_args=[f'--default-template {template}'],
command_args=([f'--default-template {template}'] if template is not None else [])
+ ([f'--default-compare {compare}'] if compare is not None else []),
):
config = (await session.scalars(select(TETRIOUserConfig).where(TETRIOUserConfig.id == user.id))).one_or_none()
if config is None:
config = TETRIOUserConfig(id=user.id, query_template=template)
config = TETRIOUserConfig(id=user.id, query_template=template or 'v1', compare_delta=compare)
session.add(config)
else:
config.query_template = template
if template is not None:
config.query_template = template
if compare is not None:
config.compare_delta = compare
await session.commit()
await UniMessage('Configuration saved').finish()
await UniMessage(Lang.bind.config_success()).finish()

View File

@@ -4,12 +4,10 @@ from nonebot_plugin_uninfo import Uninfo
from nonebot_plugin_uninfo.orm import get_session_persist_id
from ...db import trigger
from ...utils.host import HostPage, get_self_netloc
from ...utils.lang import get_lang
from ...utils.metrics import get_metrics
from ...utils.render import render
from ...utils.render import render_image
from ...utils.render.schemas.v2.tetrio.user.list import Data, List, TetraLeague, User
from ...utils.screenshot import screenshot
from .. import alc
from . import command
from .api.leaderboards import by
@@ -59,9 +57,8 @@ async def _(
country=country,
)
league = await by('league', parameter)
async with HostPage(
await render(
'v2/tetrio/user/list',
await UniMessage.image(
raw=await render_image(
List(
show_index=True,
data=[
@@ -92,5 +89,4 @@ async def _(
lang=get_lang(),
),
)
) as page_hash:
await UniMessage.image(raw=await screenshot(f'http://{get_self_netloc()}/host/{page_hash}.html')).finish()
).finish()

View File

@@ -1,8 +1,8 @@
from datetime import datetime
from datetime import datetime, timedelta
from uuid import UUID
from nonebot_plugin_orm import Model
from sqlalchemy import DateTime, ForeignKey, String
from sqlalchemy import DateTime, ForeignKey, Integer, Interval, String, UniqueConstraint
from sqlalchemy.orm import Mapped, MappedAsDataclass, mapped_column, relationship
from ...db.models import PydanticType
@@ -12,11 +12,16 @@ from .typedefs import Template
class TETRIOUserConfig(MappedAsDataclass, Model):
__tablename__ = 'nb_t_io_u_cfg'
id: Mapped[int] = mapped_column(primary_key=True)
query_template: Mapped[Template] = mapped_column(String(2))
compare_delta: Mapped[timedelta | None] = mapped_column(Interval(native=True), nullable=True)
class TETRIOLeagueStats(MappedAsDataclass, Model):
__tablename__ = 'nb_t_io_tl_stats'
id: Mapped[int] = mapped_column(init=False, primary_key=True)
raw: Mapped[list['TETRIOLeagueHistorical']] = relationship(back_populates='stats', lazy='noload')
fields: Mapped[list['TETRIOLeagueStatsField']] = relationship(back_populates='stats')
@@ -24,11 +29,13 @@ class TETRIOLeagueStats(MappedAsDataclass, Model):
class TETRIOLeagueHistorical(MappedAsDataclass, Model):
__tablename__ = 'nb_t_io_tl_hist'
id: Mapped[int] = mapped_column(init=False, primary_key=True)
request_id: Mapped[UUID] = mapped_column(index=True)
data: Mapped[BySuccessModel] = mapped_column(PydanticType([], {BySuccessModel}))
update_time: Mapped[datetime] = mapped_column(DateTime, index=True)
stats_id: Mapped[int] = mapped_column(ForeignKey('nonebot_plugin_tetris_stats_tetrioleaguestats.id'), init=False)
stats_id: Mapped[int] = mapped_column(ForeignKey('nb_t_io_tl_stats.id'), init=False)
stats: Mapped['TETRIOLeagueStats'] = relationship(back_populates='raw')
@@ -36,6 +43,8 @@ entry_type = PydanticType([], {Entry})
class TETRIOLeagueStatsField(MappedAsDataclass, Model):
__tablename__ = 'nb_t_io_tl_stats_field'
id: Mapped[int] = mapped_column(init=False, primary_key=True)
rank: Mapped[ValidRank] = mapped_column(String(2), index=True)
tr_line: Mapped[float]
@@ -49,5 +58,23 @@ class TETRIOLeagueStatsField(MappedAsDataclass, Model):
high_pps: Mapped[Entry] = mapped_column(entry_type)
high_apm: Mapped[Entry] = mapped_column(entry_type)
high_vs: Mapped[Entry] = mapped_column(entry_type)
stats_id: Mapped[int] = mapped_column(ForeignKey('nonebot_plugin_tetris_stats_tetrioleaguestats.id'), init=False)
stats_id: Mapped[int] = mapped_column(ForeignKey('nb_t_io_tl_stats.id'), init=False)
stats: Mapped['TETRIOLeagueStats'] = relationship(back_populates='fields')
class TETRIOUserUniqueIdentifier(MappedAsDataclass, Model):
__tablename__ = 'nb_t_io_uid'
id: Mapped[int] = mapped_column(init=False, primary_key=True)
user_unique_identifier: Mapped[str] = mapped_column(String(24), unique=True, index=True)
class TETRIOLeagueUserMap(MappedAsDataclass, Model):
__tablename__ = 'nb_t_io_tl_map'
__table_args__ = (UniqueConstraint('uid_id', 'hist_id', name='uq_nb_t_io_tl_map_uid_hist'),)
id: Mapped[int] = mapped_column(init=False, primary_key=True)
stats_id: Mapped[int] = mapped_column(ForeignKey('nb_t_io_tl_stats.id'), index=True)
uid_id: Mapped[int] = mapped_column(ForeignKey('nb_t_io_uid.id'), index=True)
hist_id: Mapped[int] = mapped_column(ForeignKey('nb_t_io_tl_hist.id'))
entry_index: Mapped[int] = mapped_column(Integer)
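The new `TETRIOLeagueUserMap` table ties a user id to one leaderboard snapshot entry and enforces `UniqueConstraint('uid_id', 'hist_id')` so a user cannot be mapped into the same snapshot twice. The constraint's effect can be sketched with stdlib sqlite3 (column names mirror the diff; the table name and data are illustrative):

```python
import sqlite3

con = sqlite3.connect(':memory:')
con.execute("""
    CREATE TABLE tl_map (
        id INTEGER PRIMARY KEY,
        uid_id INTEGER NOT NULL,
        hist_id INTEGER NOT NULL,
        entry_index INTEGER NOT NULL,
        UNIQUE (uid_id, hist_id)  -- one entry per user per snapshot
    )
""")
con.execute('INSERT INTO tl_map (uid_id, hist_id, entry_index) VALUES (1, 10, 3)')
try:
    # Same (uid_id, hist_id) pair: rejected by the unique constraint.
    con.execute('INSERT INTO tl_map (uid_id, hist_id, entry_index) VALUES (1, 10, 7)')
except sqlite3.IntegrityError as e:
    print('rejected:', e)
# A different snapshot for the same user is fine.
con.execute('INSERT INTO tl_map (uid_id, hist_id, entry_index) VALUES (1, 11, 0)')
print(con.execute('SELECT COUNT(*) FROM tl_map').fetchone()[0])  # → 2
```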

View File

@@ -1,11 +1,11 @@
from datetime import timezone
from datetime import timedelta, timezone
from arclet.alconna import Arg, ArgFlag
from nonebot import get_driver
from nonebot.adapters import Event
from nonebot.matcher import Matcher
from nonebot_plugin_alconna import Args, At, Option, Subcommand
from nonebot_plugin_alconna.uniseg import UniMessage
from nonebot_plugin_alconna.uniseg import Image, UniMessage
from nonebot_plugin_orm import get_session
from nonebot_plugin_uninfo import Uninfo
from nonebot_plugin_uninfo.orm import get_session_persist_id
@@ -13,8 +13,9 @@ from nonebot_plugin_user import User as NBUser
from nonebot_plugin_user import get_user
from sqlalchemy import select
from ....db import query_bind_info, trigger
from ....db import query_bind_info, resolve_compare_delta, trigger
from ....i18n import Lang
from ....utils.duration import parse_duration
from ....utils.exception import FallbackError
from ....utils.typedefs import Me
from ... import add_block_handlers, alc
@@ -53,6 +54,12 @@ command.add(
alias=['-T'],
help_text='Query template to use',
),
Option(
'--compare',
Arg('compare', parse_duration),
alias=['-C'],
help_text='Specify the compare time offset',
),
help_text='Query TETR.IO game info',
),
)
@@ -73,10 +80,10 @@ alc.shortcut(
add_block_handlers(alc.assign('TETRIO.query'))
async def make_query_result(player: Player, template: Template) -> UniMessage:
async def make_query_result(player: Player, template: Template, compare_delta: timedelta) -> UniMessage:
if template == 'v1':
try:
return UniMessage.image(raw=await make_query_image_v1(player))
return UniMessage.image(raw=await make_query_image_v1(player, compare_delta))
except FallbackError:
template = 'v2'
if template == 'v2':
@@ -92,12 +99,18 @@ async def _( # noqa: PLR0913
target: At | Me,
event_session: Uninfo,
template: Template | None = None,
compare: timedelta | None = None,
):
command_args: list[str] = []
if template is not None:
command_args.append(f'--template {template}')
if compare is not None:
command_args.append(f'--compare {compare}')
async with trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='query',
command_args=[f'--template {template}'] if template is not None else [],
command_args=command_args,
):
async with get_session() as session:
bind = await query_bind_info(
@@ -111,25 +124,44 @@ async def _( # noqa: PLR0913
template = await session.scalar(
select(TETRIOUserConfig.query_template).where(TETRIOUserConfig.id == user.id)
)
compare_delta = await resolve_compare_delta(TETRIOUserConfig, session, user.id, compare)
if bind is None:
await matcher.finish('No bind info found')
await matcher.finish(Lang.bind.not_found())
player = Player(user_id=bind.game_account, trust=True)
await (
UniMessage.i18n(Lang.interaction.warning.unverified) + await make_query_result(player, template or 'v1')
UniMessage.i18n(Lang.interaction.warning.unverified)
+ (
UniMessage('\n')
if not (result := await make_query_result(player, template or 'v1', compare_delta)).has(Image)
else UniMessage()
)
+ result
).finish()
@alc.assign('TETRIO.query')
async def _(user: NBUser, account: Player, event_session: Uninfo, template: Template | None = None):
async def _(
user: NBUser,
account: Player,
event_session: Uninfo,
template: Template | None = None,
compare: timedelta | None = None,
):
command_args: list[str] = []
if template is not None:
command_args.append(f'--template {template}')
if compare is not None:
command_args.append(f'--compare {compare}')
async with trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='query',
command_args=[f'--template {template}'] if template is not None else [],
command_args=command_args,
):
async with get_session() as session:
if template is None:
template = await session.scalar(
select(TETRIOUserConfig.query_template).where(TETRIOUserConfig.id == user.id)
)
await (await make_query_result(account, template or 'v1')).finish()
compare_delta = await resolve_compare_delta(TETRIOUserConfig, session, user.id, compare)
await (await make_query_result(account, template or 'v1', compare_delta)).finish()

View File

@@ -1,26 +1,223 @@
from asyncio import gather
from datetime import timedelta
from datetime import datetime, timedelta, timezone
from hashlib import md5
from typing import Literal, NamedTuple
from nonebot_plugin_orm import AsyncSession, get_session
from sqlalchemy import func, select
from yarl import URL
from ....utils.chart import get_split, get_value_bounds, handle_history_data
from ....utils.exception import FallbackError
from ....utils.host import HostPage, get_self_netloc
from ....utils.host import get_self_netloc
from ....utils.lang import get_lang
from ....utils.metrics import get_metrics
from ....utils.render import render
from ....utils.metrics import TetrisMetricsProWithPPSVS, get_metrics
from ....utils.render import render_image
from ....utils.render.schemas.base import Avatar, Trending
from ....utils.render.schemas.v1.base import History
from ....utils.render.schemas.v1.tetrio.user.info import Info, Multiplayer, Singleplayer, User
from ....utils.screenshot import screenshot
from ....utils.render.schemas.v1.tetrio.info import Info, Multiplayer, Singleplayer, User
from ..api import Player
from ..api.schemas.summaries.league import RatedData
from ..api.models import TETRIOHistoricalData
from ..api.schemas.leaderboards.by import Entry, InvalidEntry
from ..api.schemas.summaries.league import LeagueSuccessModel, NeverRatedData, RatedData
from ..constant import TR_MAX, TR_MIN
from ..models import TETRIOLeagueHistorical, TETRIOLeagueUserMap, TETRIOUserUniqueIdentifier
from .tools import flow_to_history, get_league_data
UTC = timezone.utc
async def make_query_image_v1(player: Player) -> bytes:
class Trends(NamedTuple):
pps: Trending = Trending.KEEP
apm: Trending = Trending.KEEP
adpm: Trending = Trending.KEEP
class HistoricalSnapshot(NamedTuple):
metrics: TetrisMetricsProWithPPSVS
delta: timedelta
async def get_nearest_historical(
session: AsyncSession,
unique_identifier: str,
target_time: datetime,
) -> HistoricalSnapshot | None:
before = await session.scalar(
select(TETRIOHistoricalData)
.where(
TETRIOHistoricalData.user_unique_identifier == unique_identifier,
TETRIOHistoricalData.api_type == 'league',
TETRIOHistoricalData.update_time <= target_time,
)
.order_by(TETRIOHistoricalData.update_time.desc())
.limit(1)
)
after = await session.scalar(
select(TETRIOHistoricalData)
.where(
TETRIOHistoricalData.user_unique_identifier == unique_identifier,
TETRIOHistoricalData.api_type == 'league',
TETRIOHistoricalData.update_time >= target_time,
)
.order_by(TETRIOHistoricalData.update_time.asc())
.limit(1)
)
candidates = [i for i in (before, after) if i is not None]
if not candidates:
return None
delta_seconds, selected = min(
(
abs((target_time - i.update_time.astimezone(UTC)).total_seconds()),
i,
)
for i in candidates
)
delta = timedelta(seconds=delta_seconds)
if not isinstance(selected.data, LeagueSuccessModel) or not isinstance(
selected.data.data, RatedData | NeverRatedData
):
return None
data = selected.data.data
return HistoricalSnapshot(get_metrics(pps=data.pps, apm=data.apm, vs=data.vs), delta)
async def _get_boundary_league_historical(
session: AsyncSession,
uid_id: int,
target_time: datetime,
*,
time_direction: Literal['before', 'after'],
) -> tuple[TETRIOLeagueUserMap, datetime] | None:
boundary_time = await session.scalar(
select((func.max if time_direction == 'before' else func.min)(TETRIOLeagueHistorical.update_time))
.select_from(TETRIOLeagueUserMap)
.join(TETRIOLeagueHistorical, TETRIOLeagueUserMap.hist_id == TETRIOLeagueHistorical.id)
.where(
TETRIOLeagueUserMap.uid_id == uid_id,
TETRIOLeagueHistorical.update_time <= target_time
if time_direction == 'before'
else TETRIOLeagueHistorical.update_time >= target_time,
)
)
if boundary_time is None:
return None
return (
(
await session.execute(
select(TETRIOLeagueUserMap, TETRIOLeagueHistorical.update_time)
.join(TETRIOLeagueHistorical, TETRIOLeagueUserMap.hist_id == TETRIOLeagueHistorical.id)
.where(
TETRIOLeagueUserMap.uid_id == uid_id,
TETRIOLeagueHistorical.update_time == boundary_time,
)
.order_by(TETRIOLeagueHistorical.id.desc())
.limit(1)
)
)
.tuples()
.first()
)
async def get_nearest_league_historical(
session: AsyncSession,
unique_identifier: str,
target_time: datetime,
) -> HistoricalSnapshot | None:
uid_id = await session.scalar(
select(TETRIOUserUniqueIdentifier.id).where(
TETRIOUserUniqueIdentifier.user_unique_identifier == unique_identifier
)
)
if uid_id is None:
return None
before = await _get_boundary_league_historical(
session,
uid_id,
target_time,
time_direction='before',
)
after = await _get_boundary_league_historical(
session,
uid_id,
target_time,
time_direction='after',
)
candidates = [i for i in (before, after) if i is not None]
if not candidates:
return None
delta_seconds, selected = min(
(
abs((target_time - i[1].astimezone(UTC)).total_seconds()),
i[0],
)
for i in candidates
)
delta = timedelta(seconds=delta_seconds)
historical = await session.get(TETRIOLeagueHistorical, selected.hist_id)
if historical is None or not isinstance(
(entry := find_entry(historical.data.data.entries, selected.entry_index, unique_identifier)), Entry
):
return None
return HistoricalSnapshot(get_metrics(pps=entry.league.pps, apm=entry.league.apm, vs=entry.league.vs), delta)
def find_entry(
entries: list[Entry | InvalidEntry],
entry_index: int,
unique_identifier: str | None = None,
) -> Entry | InvalidEntry | None:
if 0 <= entry_index < len(entries):
entry = entries[entry_index]
if unique_identifier is None or entry.id == unique_identifier:
return entry
if unique_identifier is None:
return None
for entry in entries:
if entry.id == unique_identifier:
return entry
return None
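`find_entry` tries the cached entry index first and only falls back to a linear scan when the index is stale. The same fast-path/slow-path lookup in isolation (the `Entry` dataclass here is a hypothetical stand-in for the leaderboard entry model):

```python
from dataclasses import dataclass

@dataclass
class Entry:  # hypothetical stand-in for the leaderboard entry model
    id: str

def find_entry(entries, entry_index, unique_identifier=None):
    # Fast path: the stored index still points at the right entry.
    if 0 <= entry_index < len(entries):
        entry = entries[entry_index]
        if unique_identifier is None or entry.id == unique_identifier:
            return entry
    # Without an identifier there is nothing else to match on.
    if unique_identifier is None:
        return None
    # Slow path: the snapshot shifted, so scan for the identifier.
    for entry in entries:
        if entry.id == unique_identifier:
            return entry
    return None

entries = [Entry('a'), Entry('b'), Entry('c')]
print(find_entry(entries, 1, 'b').id)  # index hit: b
print(find_entry(entries, 0, 'c').id)  # index stale, scan finds: c
```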
async def get_trends(player: Player, compare_delta: timedelta) -> Trends:
league = await player.league
if not isinstance(league.data, RatedData | NeverRatedData):
return Trends()
user = await player.user
async with get_session() as session:
target_time = (league.cache.cached_at - compare_delta).astimezone(UTC)
historical, league_historical = await gather(
get_nearest_historical(
session,
user.unique_identifier,
target_time,
),
get_nearest_league_historical(
session,
user.unique_identifier,
target_time,
),
)
selected = min((historical, league_historical), key=lambda x: x.delta if x is not None else timedelta.max)
if selected is None:
return Trends()
metrics = get_metrics(pps=league.data.pps, apm=league.data.apm, vs=league.data.vs)
return Trends(
pps=Trending.compare(selected.metrics.pps, metrics.pps),
apm=Trending.compare(selected.metrics.apm, metrics.apm),
adpm=Trending.compare(selected.metrics.adpm, metrics.adpm),
)
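`get_trends` picks between two optional snapshots with `min` and a `timedelta.max` sentinel, so `None` candidates sort last and the result is `None` only when every candidate is. A minimal sketch of that pattern (the `Snapshot`/`pick` names are illustrative, not from the plugin):

```python
from datetime import timedelta
from typing import NamedTuple, Optional

class Snapshot(NamedTuple):  # illustrative stand-in for HistoricalSnapshot
    delta: timedelta

def pick(*candidates: Optional[Snapshot]) -> Optional[Snapshot]:
    # None sorts last via the timedelta.max sentinel, so the result is
    # None only when every candidate is None.
    return min(candidates, key=lambda s: s.delta if s is not None else timedelta.max)

a = Snapshot(timedelta(hours=2))
b = Snapshot(timedelta(minutes=30))
print(pick(a, b) is b)           # True
print(pick(None, None) is None)  # True
```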
async def make_query_image_v1(player: Player, compare_delta: timedelta) -> bytes:
(
(user, user_info, league, sprint, blitz, leagueflow),
(avatar_revision,),
@@ -29,8 +226,6 @@ async def make_query_image_v1(player: Player) -> bytes:
gather(player.avatar_revision),
)
league_data = get_league_data(league, RatedData)
if league_data.vs is None:
raise FallbackError
histories = flow_to_history(leagueflow, handle_history_data)
values = get_value_bounds([i.score for i in histories])
split_value, offset = get_split(values, TR_MAX, TR_MIN)
@@ -40,61 +235,57 @@ async def make_query_image_v1(player: Player) -> bytes:
else:
sprint_value = 'N/A'
blitz_value = f'{blitz.data.record.results.stats.score:,}' if blitz.data.record is not None else 'N/A'
netloc = get_self_netloc()
dsps: float
dspp: float
# make mypy happy
async with HostPage(
page=await render(
'v1/tetrio/info',
Info(
user=User(
avatar=str(
URL(f'http://{netloc}/host/resource/tetrio/avatars/{user.ID}') % {'revision': avatar_revision}
)
if avatar_revision is not None and avatar_revision != 0
else Avatar(
type='identicon',
hash=md5(user.ID.encode()).hexdigest(), # noqa: S324
),
name=user.name.upper(),
bio=user_info.data.bio,
return await render_image(
Info(
user=User(
avatar=str(
URL(f'http://{get_self_netloc()}/host/resource/tetrio/avatars/{user.ID}')
% {'revision': avatar_revision}
)
if avatar_revision is not None and avatar_revision != 0
else Avatar(
type='identicon',
hash=md5(user.ID.encode()).hexdigest(), # noqa: S324
),
multiplayer=Multiplayer(
glicko=f'{round(league_data.glicko, 2):,}',
rd=round(league_data.rd, 2),
rank=league_data.rank,
tr=f'{round(league_data.tr, 2):,}',
global_rank=league_data.standing,
history=History(
data=histories,
split_interval=split_value,
min_value=values.value_min,
max_value=values.value_max,
offset=offset,
),
lpm=(metrics := get_metrics(pps=league_data.pps, apm=league_data.apm, vs=league_data.vs)).lpm,
pps=metrics.pps,
lpm_trending=Trending.KEEP,
apm=metrics.apm,
apl=metrics.apl,
apm_trending=Trending.KEEP,
adpm=metrics.adpm,
vs=metrics.vs,
adpl=metrics.adpl,
adpm_trending=Trending.KEEP,
app=(app := (league_data.apm / (60 * league_data.pps))),
dsps=(dsps := ((league_data.vs / 100) - (league_data.apm / 60))),
dspp=(dspp := (dsps / league_data.pps)),
ci=150 * dspp - 125 * app + 50 * (league_data.vs / league_data.apm) - 25,
ge=2 * ((app * dsps) / league_data.pps),
),
singleplayer=Singleplayer(
sprint=sprint_value,
blitz=blitz_value,
),
lang=get_lang(),
name=user.name.upper(),
bio=user_info.data.bio,
),
)
) as page_hash:
return await screenshot(f'http://{netloc}/host/{page_hash}.html')
multiplayer=Multiplayer(
glicko=f'{round(league_data.glicko, 2):,}',
rd=round(league_data.rd, 2),
rank=league_data.rank,
tr=f'{round(league_data.tr, 2):,}',
global_rank=league_data.standing,
history=History(
data=histories,
split_interval=split_value,
min_value=values.value_min,
max_value=values.value_max,
offset=offset,
),
lpm=(metrics := get_metrics(pps=league_data.pps, apm=league_data.apm, vs=league_data.vs)).lpm,
pps=metrics.pps,
lpm_trending=(trends := (await get_trends(player, compare_delta))).pps,
apm=metrics.apm,
apl=metrics.apl,
apm_trending=trends.apm,
adpm=metrics.adpm,
vs=metrics.vs,
adpl=metrics.adpl,
adpm_trending=trends.adpm,
app=(app := (league_data.apm / (60 * league_data.pps))),
dsps=(dsps := ((league_data.vs / 100) - (league_data.apm / 60))),
dspp=(dspp := (dsps / league_data.pps)),
ci=150 * dspp - 125 * app + 50 * (league_data.vs / league_data.apm) - 25,
ge=2 * ((app * dsps) / league_data.pps),
),
singleplayer=Singleplayer(
sprint=sprint_value,
blitz=blitz_value,
),
lang=get_lang(),
),
)


@@ -5,10 +5,10 @@ from hashlib import md5
from yarl import URL
from ....utils.exception import FallbackError
from ....utils.host import HostPage, get_self_netloc
from ....utils.host import get_self_netloc
from ....utils.lang import get_lang
from ....utils.metrics import get_metrics
from ....utils.render import render
from ....utils.render import render_image
from ....utils.render.schemas.base import Avatar
from ....utils.render.schemas.v2.tetrio.user.info import (
Achievement,
@@ -25,7 +25,6 @@ from ....utils.render.schemas.v2.tetrio.user.info import (
Zen,
Zenith,
)
from ....utils.screenshot import screenshot
from ..api import Player
from ..api.schemas.summaries.league import InvalidData, NeverPlayedData, NeverRatedData
from .tools import flow_to_history, handling_special_value
@@ -74,137 +73,133 @@ async def make_query_image_v2(player: Player) -> bytes:
except FallbackError:
history = None
netloc = get_self_netloc()
async with HostPage(
await render(
'v2/tetrio/user/info',
Info(
user=User(
id=user.ID,
name=user.name.upper(),
country=user_info.data.country,
role=user_info.data.role,
botmaster=user_info.data.botmaster,
avatar=str(
URL(f'http://{netloc}/host/resource/tetrio/avatars/{user.ID}') % {'revision': avatar_revision}
)
if avatar_revision is not None and avatar_revision != 0
else Avatar(
type='identicon',
hash=md5(user.ID.encode()).hexdigest(), # noqa: S324
),
banner=str(
URL(f'http://{netloc}/host/resource/tetrio/banners/{user.ID}') % {'revision': banner_revision}
)
if banner_revision is not None and banner_revision != 0
else None,
bio=user_info.data.bio,
friend_count=user_info.data.friend_count,
supporter_tier=user_info.data.supporter_tier,
bad_standing=user_info.data.badstanding or False,
badges=[
Badge(
id=i.id,
description=i.label,
group=i.group,
receive_at=i.ts if isinstance(i.ts, datetime) else None,
)
for i in user_info.data.badges
],
xp=user_info.data.xp,
ar=user_info.data.ar,
achievements=[
Achievement(
key=i.achievement_id,
rank_type=i.rank_type,
ar_type=i.ar_type,
stub=i.stub,
rank=i.rank,
achieved_score=i.achieved_score,
pos=i.pos,
progress=i.progress,
total=i.total,
)
for i in achievements.data
],
playtime=play_time,
join_at=user_info.data.ts,
),
tetra_league=TetraLeague(
rank=league.data.rank,
highest_rank='z' if isinstance(league.data, NeverRatedData) else league.data.bestrank,
tr=round(league.data.tr, 2),
glicko=round(league.data.glicko, 2),
rd=round(league.data.rd, 2),
global_rank=league.data.standing,
country_rank=league.data.standing_local,
pps=(metrics := get_metrics(pps=league.data.pps, apm=league.data.apm, vs=league.data.vs)).pps,
apm=metrics.apm,
apl=metrics.apl,
vs=metrics.vs,
adpl=metrics.adpl,
statistic=TetraLeagueStatistic(total=league.data.gamesplayed, wins=league.data.gameswon),
decaying=league.data.decaying,
history=history,
return await render_image(
Info(
user=User(
id=user.ID,
name=user.name.upper(),
country=user_info.data.country,
role=user_info.data.role,
botmaster=user_info.data.botmaster,
avatar=str(
URL(f'http://{netloc}/host/resource/tetrio/avatars/{user.ID}') % {'revision': avatar_revision}
)
if not isinstance(league.data, NeverPlayedData | InvalidData)
else None,
zenith=Zenith(
week=Week(
altitude=zenith.data.record.results.stats.zenith.altitude,
global_rank=zenith.data.rank,
country_rank=zenith.data.rank_local,
play_at=zenith.data.record.ts,
)
if zenith.data.record is not None
else None,
best=Best(
altitude=zenith.data.best.record.results.stats.zenith.altitude,
global_rank=zenith.data.best.rank,
play_at=zenith.data.best.record.ts,
)
if zenith.data.best.record is not None
else None,
if avatar_revision is not None and avatar_revision != 0
else Avatar(
type='identicon',
hash=md5(user.ID.encode()).hexdigest(), # noqa: S324
),
zenithex=Zenith(
week=Week(
altitude=zenithex.data.record.results.stats.zenith.altitude,
global_rank=zenithex.data.rank,
country_rank=zenithex.data.rank_local,
play_at=zenithex.data.record.ts,
)
if zenithex.data.record is not None
else None,
best=Best(
altitude=zenithex.data.best.record.results.stats.zenith.altitude,
global_rank=zenithex.data.best.rank,
play_at=zenithex.data.best.record.ts,
)
if zenithex.data.best.record is not None
else None,
),
statistic=Statistic(
total=handling_special_value(user_info.data.gamesplayed),
wins=handling_special_value(user_info.data.gameswon),
),
sprint=Sprint(
time=sprint_value,
global_rank=sprint.data.rank,
country_rank=sprint.data.rank_local,
play_at=sprint.data.record.ts,
banner=str(
URL(f'http://{netloc}/host/resource/tetrio/banners/{user.ID}') % {'revision': banner_revision}
)
if sprint.data.record is not None
if banner_revision is not None and banner_revision != 0
else None,
blitz=Blitz(
score=blitz.data.record.results.stats.score,
global_rank=blitz.data.rank,
country_rank=blitz.data.rank_local,
play_at=blitz.data.record.ts,
)
if blitz.data.record is not None
else None,
zen=Zen(level=zen.data.level, score=zen.data.score),
lang=get_lang(),
bio=user_info.data.bio,
friend_count=user_info.data.friend_count,
supporter_tier=user_info.data.supporter_tier,
bad_standing=user_info.data.badstanding or False,
badges=[
Badge(
id=i.id,
description=i.label,
group=i.group,
receive_at=i.ts if isinstance(i.ts, datetime) else None,
)
for i in user_info.data.badges
],
xp=user_info.data.xp,
ar=user_info.data.ar,
achievements=[
Achievement(
key=i.achievement_id,
rank_type=i.rank_type,
ar_type=i.ar_type,
stub=i.stub,
rank=i.rank,
achieved_score=i.achieved_score,
pos=i.pos,
progress=i.progress,
total=i.total,
)
for i in achievements.data
],
playtime=play_time,
join_at=user_info.data.ts,
),
tetra_league=TetraLeague(
rank=league.data.rank,
highest_rank='z' if isinstance(league.data, NeverRatedData) else league.data.bestrank,
tr=round(league.data.tr, 2),
glicko=round(league.data.glicko, 2),
rd=round(league.data.rd, 2),
global_rank=league.data.standing,
country_rank=league.data.standing_local,
pps=(metrics := get_metrics(pps=league.data.pps, apm=league.data.apm, vs=league.data.vs)).pps,
apm=metrics.apm,
apl=metrics.apl,
vs=metrics.vs,
adpl=metrics.adpl,
statistic=TetraLeagueStatistic(total=league.data.gamesplayed, wins=league.data.gameswon),
decaying=league.data.decaying,
history=history,
)
if not isinstance(league.data, NeverPlayedData | InvalidData)
else None,
zenith=Zenith(
week=Week(
altitude=zenith.data.record.results.stats.zenith.altitude,
global_rank=zenith.data.rank,
country_rank=zenith.data.rank_local,
play_at=zenith.data.record.ts,
)
if zenith.data.record is not None
else None,
best=Best(
altitude=zenith.data.best.record.results.stats.zenith.altitude,
global_rank=zenith.data.best.rank,
play_at=zenith.data.best.record.ts,
)
if zenith.data.best.record is not None
else None,
),
zenithex=Zenith(
week=Week(
altitude=zenithex.data.record.results.stats.zenith.altitude,
global_rank=zenithex.data.rank,
country_rank=zenithex.data.rank_local,
play_at=zenithex.data.record.ts,
)
if zenithex.data.record is not None
else None,
best=Best(
altitude=zenithex.data.best.record.results.stats.zenith.altitude,
global_rank=zenithex.data.best.rank,
play_at=zenithex.data.best.record.ts,
)
if zenithex.data.best.record is not None
else None,
),
statistic=Statistic(
total=handling_special_value(user_info.data.gamesplayed),
wins=handling_special_value(user_info.data.gameswon),
),
sprint=Sprint(
time=sprint_value,
global_rank=sprint.data.rank,
country_rank=sprint.data.rank_local,
play_at=sprint.data.record.ts,
)
if sprint.data.record is not None
else None,
blitz=Blitz(
score=blitz.data.record.results.stats.score,
global_rank=blitz.data.rank,
country_rank=blitz.data.rank_local,
play_at=blitz.data.record.ts,
)
if blitz.data.record is not None
else None,
zen=Zen(level=zen.data.level, score=zen.data.score),
lang=get_lang(),
),
) as page_hash:
return await screenshot(f'http://{netloc}/host/{page_hash}.html')
)


@@ -1,9 +1,9 @@
from collections import defaultdict
from collections.abc import Callable, Sequence
from collections.abc import Callable, Iterator, Sequence
from datetime import datetime, timedelta, timezone
from math import floor
from statistics import mean
from typing import TYPE_CHECKING
from typing import TYPE_CHECKING, TypeVar
from uuid import uuid4
from nonebot import get_driver
@@ -22,7 +22,13 @@ from ..api.schemas.base import P
from ..api.schemas.leaderboards import Parameter
from ..api.schemas.leaderboards.by import Entry
from ..constant import RANK_PERCENTILE
from ..models import TETRIOLeagueHistorical, TETRIOLeagueStats, TETRIOLeagueStatsField
from ..models import (
TETRIOLeagueHistorical,
TETRIOLeagueStats,
TETRIOLeagueStatsField,
TETRIOLeagueUserMap,
TETRIOUserUniqueIdentifier,
)
if TYPE_CHECKING:
from ..api.schemas.leaderboards.by import BySuccessModel
@@ -81,6 +87,14 @@ def find_special_player(
return sort(users, field)
T = TypeVar('T')
def _chunked(values: list[T], size: int) -> Iterator[list[T]]:
for i in range(0, len(values), size):
yield values[i : i + size]
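The `_chunked` helper above batches IDs so each `IN (...)` query stays within backend parameter limits. The slicing generator on its own (same logic, renamed `chunked` for the sketch):

```python
from collections.abc import Iterator
from typing import TypeVar

T = TypeVar('T')

def chunked(values: list[T], size: int) -> Iterator[list[T]]:
    # Yield successive slices of at most `size` items each.
    for i in range(0, len(values), size):
        yield values[i : i + size]

print(list(chunked(list(range(7)), 3)))  # [[0, 1, 2], [3, 4, 5], [6]]
```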
@scheduler.scheduled_job('cron', hour='0,6,12,18', minute=0)
async def get_tetra_league_data() -> None:
x_session_id = uuid4()
@@ -94,9 +108,7 @@ async def get_tetra_league_data() -> None:
if len(model.data.entries) < 100: # page size # noqa: PLR2004
break
players: list[Entry] = []
for result in results:
players.extend([i for i in result.data.entries if isinstance(i, Entry)])
players = [i for result in results for i in result.data.entries if isinstance(i, Entry)]
players.sort(key=lambda x: x.league.tr, reverse=True)
rank_player_mapping: defaultdict[Rank, list[Entry]] = defaultdict(list)
@@ -132,12 +144,41 @@ async def get_tetra_league_data() -> None:
]
stats.raw = historicals
stats.fields = fields
player_ids = {i.id for result in results for i in result.data.entries}
async with get_session() as session:
session.add(stats)
existing_ids: list[TETRIOUserUniqueIdentifier] = []
for chunk in _chunked(list(player_ids), 500):
existing_ids.extend(
(
await session.scalars(
select(TETRIOUserUniqueIdentifier).filter(
TETRIOUserUniqueIdentifier.user_unique_identifier.in_(chunk)
)
)
).all()
)
new_ids = [
TETRIOUserUniqueIdentifier(user_unique_identifier=i)
for i in player_ids - {i.user_unique_identifier for i in existing_ids}
]
session.add_all(new_ids)
await session.flush()
uid_mapping = {i.user_unique_identifier: i.id for i in list(existing_ids) + new_ids}
maps: list[TETRIOLeagueUserMap] = []
for i in stats.raw:
for index, entry in enumerate(i.data.data.entries):
maps.append(
TETRIOLeagueUserMap(
stats_id=stats.id, uid_id=uid_mapping[entry.id], hist_id=i.id, entry_index=index
)
)
session.add_all(maps)
maps.clear()
await session.commit()
if not config.tetris.development:
if not config.tetris.dev.enabled:
@driver.on_startup
async def _() -> None:


@@ -5,20 +5,18 @@ from nonebot_plugin_alconna import Option, Subcommand, UniMessage
from nonebot_plugin_orm import get_session
from nonebot_plugin_uninfo import Uninfo
from nonebot_plugin_uninfo.orm import get_session_persist_id
from sqlalchemy import func, select
from sqlalchemy import select
from sqlalchemy.orm import selectinload
from ....db import trigger
from ....utils.host import HostPage, get_self_netloc
from ....utils.lang import get_lang
from ....utils.metrics import get_metrics
from ....utils.render import render
from ....utils.render import render_image
from ....utils.render.schemas.v1.tetrio.rank import Data as DataV1
from ....utils.render.schemas.v1.tetrio.rank import ItemData as ItemDataV1
from ....utils.render.schemas.v2.tetrio.rank import AverageData as AverageDataV2
from ....utils.render.schemas.v2.tetrio.rank import Data as DataV2
from ....utils.render.schemas.v2.tetrio.rank import ItemData as ItemDataV2
from ....utils.screenshot import screenshot
from .. import alc
from ..constant import GAME_TYPE
from ..models import TETRIOLeagueStats
@@ -41,6 +39,7 @@ async def _(event_session: Uninfo, template: Template | None = None):
command_args=['--all'] + ([f'--template {template}'] if template is not None else []),
):
async with get_session() as session:
# Fetch the latest record
latest_data = (
await session.scalars(
select(TETRIOLeagueStats)
@@ -49,19 +48,42 @@ async def _(event_session: Uninfo, template: Template | None = None):
.options(selectinload(TETRIOLeagueStats.fields))
)
).one()
compare_data = (
await session.scalars(
# Compute the target time (24 hours ago)
target_time = latest_data.update_time - timedelta(hours=24)
# Find the nearest record before the target time
before = (
await session.scalar(
select(TETRIOLeagueStats)
.order_by(
func.abs(
func.julianday(TETRIOLeagueStats.update_time)
- func.julianday(latest_data.update_time - timedelta(hours=24))
)
)
.where(TETRIOLeagueStats.update_time <= target_time)
.order_by(TETRIOLeagueStats.update_time.desc())
.limit(1)
.options(selectinload(TETRIOLeagueStats.fields))
)
).one()
or latest_data
)
# Find the nearest record after the target time
after = (
await session.scalar(
select(TETRIOLeagueStats)
.where(TETRIOLeagueStats.update_time >= target_time) # use >= to avoid gaps
.order_by(TETRIOLeagueStats.update_time.asc())
.limit(1)
.options(selectinload(TETRIOLeagueStats.fields))
)
or latest_data
)
# Pick the record closer to the target time
compare_data = (
before
if abs((target_time - before.update_time).total_seconds())
< abs((target_time - after.update_time).total_seconds())
else after
)
match template:
case 'v1' | None:
await UniMessage.image(raw=await make_image_v1(latest_data, compare_data)).finish()
@@ -69,50 +91,66 @@ async def _(event_session: Uninfo, template: Template | None = None):
await UniMessage.image(raw=await make_image_v2(latest_data, compare_data)).finish()
_RANK_ORDER_INDEX = {
v: i
for i, v in enumerate(
('x+', 'x', 'u', 'ss', 's+', 's', 's-', 'a+', 'a', 'a-', 'b+', 'b', 'b-', 'c+', 'c', 'c-', 'd+', 'd', 'z')
)
}
def _rank_sort_key(rank: str) -> int:
try:
return _RANK_ORDER_INDEX[rank]
except KeyError as e:
msg = f'Unknown rank: {rank!r}'
raise ValueError(msg) from e
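The rank-order table and key function above let both field lists be sorted into the same canonical order before being zipped. The same lookup used as a sort key (standalone sketch, `rank_sort_key` mirroring `_rank_sort_key`):

```python
_RANK_ORDER_INDEX = {
    v: i
    for i, v in enumerate(
        ('x+', 'x', 'u', 'ss', 's+', 's', 's-', 'a+', 'a', 'a-',
         'b+', 'b', 'b-', 'c+', 'c', 'c-', 'd+', 'd', 'z')
    )
}

def rank_sort_key(rank: str) -> int:
    # Map a rank string to its canonical position; unknown ranks fail loudly.
    try:
        return _RANK_ORDER_INDEX[rank]
    except KeyError as e:
        raise ValueError(f'Unknown rank: {rank!r}') from e

print(sorted(['s', 'x+', 'b-', 'u'], key=rank_sort_key))  # ['x+', 'u', 's', 'b-']
```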
async def make_image_v1(latest_data: TETRIOLeagueStats, compare_data: TETRIOLeagueStats) -> bytes:
async with HostPage(
await render(
'v1/tetrio/rank',
DataV1(
items={
i[0].rank: ItemDataV1(
trending=round(i[0].tr_line - i[1].tr_line, 2),
require_tr=round(i[0].tr_line, 2),
players=i[0].player_count,
)
for i in zip(latest_data.fields, compare_data.fields, strict=True)
},
updated_at=latest_data.update_time,
lang=get_lang(),
),
)
) as page_hash:
return await screenshot(f'http://{get_self_netloc()}/host/{page_hash}.html')
return await render_image(
DataV1(
items={
i[0].rank: ItemDataV1(
trending=round(i[0].tr_line - i[1].tr_line, 2),
require_tr=round(i[0].tr_line, 2),
players=i[0].player_count,
)
for i in zip(
sorted(latest_data.fields, key=lambda x: _rank_sort_key(x.rank)),
sorted(compare_data.fields, key=lambda x: _rank_sort_key(x.rank)),
strict=True,
)
},
updated_at=latest_data.update_time,
lang=get_lang(),
),
)
async def make_image_v2(latest_data: TETRIOLeagueStats, compare_data: TETRIOLeagueStats) -> bytes:
async with HostPage(
await render(
'v2/tetrio/rank',
DataV2(
items={
i[0].rank: ItemDataV2(
require_tr=round(i[0].tr_line, 2),
trending=round(i[0].tr_line - i[1].tr_line, 2),
average_data=AverageDataV2(
pps=(metrics := get_metrics(pps=i[0].avg_pps, apm=i[0].avg_apm, vs=i[0].avg_vs)).pps,
apm=metrics.apm,
apl=metrics.apl,
vs=metrics.vs,
adpl=metrics.adpl,
),
players=i[0].player_count,
)
for i in zip(latest_data.fields, compare_data.fields, strict=True)
},
updated_at=latest_data.update_time,
lang=get_lang(),
),
)
) as page_hash:
return await screenshot(f'http://{get_self_netloc()}/host/{page_hash}.html')
return await render_image(
DataV2(
items={
i[0].rank: ItemDataV2(
require_tr=round(i[0].tr_line, 2),
trending=round(i[0].tr_line - i[1].tr_line, 2),
average_data=AverageDataV2(
pps=(metrics := get_metrics(pps=i[0].avg_pps, apm=i[0].avg_apm, vs=i[0].avg_vs)).pps,
apm=metrics.apm,
apl=metrics.apl,
vs=metrics.vs,
adpl=metrics.adpl,
),
players=i[0].player_count,
)
for i in zip(
sorted(latest_data.fields, key=lambda x: _rank_sort_key(x.rank)),
sorted(compare_data.fields, key=lambda x: _rank_sort_key(x.rank)),
strict=True,
)
},
updated_at=latest_data.update_time,
lang=get_lang(),
),
)


@@ -7,16 +7,14 @@ from nonebot_plugin_alconna import Option, UniMessage
from nonebot_plugin_orm import get_session
from nonebot_plugin_uninfo import Uninfo
from nonebot_plugin_uninfo.orm import get_session_persist_id
from sqlalchemy import func, select
from sqlalchemy import select
from sqlalchemy.orm import selectinload
from ....db import trigger
from ....utils.host import HostPage, get_self_netloc
from ....utils.lang import get_lang
from ....utils.metrics import get_metrics
from ....utils.render import render
from ....utils.render import render_image
from ....utils.render.schemas.v2.tetrio.rank.detail import Data, SpecialData
from ....utils.screenshot import screenshot
from .. import alc
from ..api.typedefs import ValidRank
from ..constant import GAME_TYPE
@@ -39,6 +37,7 @@ async def _(rank: ValidRank, event_session: Uninfo):
command_args=[f'--detail {rank}'],
):
async with get_session() as session:
# Fetch the latest record
latest_data = (
await session.scalars(
select(TETRIOLeagueStats)
@@ -47,19 +46,41 @@ async def _(rank: ValidRank, event_session: Uninfo):
.options(selectinload(TETRIOLeagueStats.fields))
)
).one()
compare_data = (
await session.scalars(
# Compute the target time (24 hours ago)
target_time = latest_data.update_time - timedelta(hours=24)
# Find the nearest record before the target time
before = (
await session.scalar(
select(TETRIOLeagueStats)
.order_by(
func.abs(
func.julianday(TETRIOLeagueStats.update_time)
- func.julianday(latest_data.update_time - timedelta(hours=24))
)
)
.where(TETRIOLeagueStats.update_time <= target_time)
.order_by(TETRIOLeagueStats.update_time.desc())
.limit(1)
.options(selectinload(TETRIOLeagueStats.fields))
)
).one()
or latest_data # fall back to the latest record
)
# Find the nearest record after the target time
after = (
await session.scalar(
select(TETRIOLeagueStats)
.where(TETRIOLeagueStats.update_time >= target_time)
.order_by(TETRIOLeagueStats.update_time.asc())
.limit(1)
.options(selectinload(TETRIOLeagueStats.fields))
)
or latest_data # fall back to the latest record
)
# Pick the record closer to the target time
compare_data = (
before
if abs((target_time - before.update_time).total_seconds())
< abs((target_time - after.update_time).total_seconds())
else after
)
await UniMessage.image(
raw=await make_image(
rank,
@@ -91,40 +112,36 @@ async def make_image(rank: ValidRank, latest: TETRIOLeagueStats, compare: TETRIO
max_vs = get_metrics(
pps=latest_data.high_vs.league.pps, apm=latest_data.high_vs.league.apm, vs=latest_data.high_vs.league.vs
)
async with HostPage(
await render(
'v2/tetrio/rank/detail',
Data(
name=latest_data.rank,
trending=round(latest_data.tr_line - compare_data.tr_line, 2),
require_tr=round(latest_data.tr_line, 2),
players=latest_data.player_count,
minimum_data=SpecialData(
apm=low_apm.apm,
pps=low_pps.pps,
lpm=low_pps.lpm,
vs=low_vs.vs,
adpm=low_vs.adpm,
apm_holder=latest_data.low_apm.username.upper(),
pps_holder=latest_data.low_pps.username.upper(),
vs_holder=latest_data.low_vs.username.upper(),
),
average_data=SpecialData(
apm=avg.apm, pps=avg.pps, lpm=avg.lpm, vs=avg.vs, adpm=avg.adpm, apl=avg.apl, adpl=avg.adpl
),
maximum_data=SpecialData(
apm=max_apm.apm,
pps=max_pps.pps,
lpm=max_pps.lpm,
vs=max_vs.vs,
adpm=max_vs.adpm,
apm_holder=latest_data.high_apm.username.upper(),
pps_holder=latest_data.high_pps.username.upper(),
vs_holder=latest_data.high_vs.username.upper(),
),
updated_at=latest.update_time.replace(tzinfo=UTC).astimezone(ZoneInfo('Asia/Shanghai')),
lang=get_lang(),
return await render_image(
Data(
name=latest_data.rank,
trending=round(latest_data.tr_line - compare_data.tr_line, 2),
require_tr=round(latest_data.tr_line, 2),
players=latest_data.player_count,
minimum_data=SpecialData(
apm=low_apm.apm,
pps=low_pps.pps,
lpm=low_pps.lpm,
vs=low_vs.vs,
adpm=low_vs.adpm,
apm_holder=latest_data.low_apm.username.upper(),
pps_holder=latest_data.low_pps.username.upper(),
vs_holder=latest_data.low_vs.username.upper(),
),
)
) as page_hash:
return await screenshot(f'http://{get_self_netloc()}/host/{page_hash}.html')
average_data=SpecialData(
apm=avg.apm, pps=avg.pps, lpm=avg.lpm, vs=avg.vs, adpm=avg.adpm, apl=avg.apl, adpl=avg.adpl
),
maximum_data=SpecialData(
apm=max_apm.apm,
pps=max_pps.pps,
lpm=max_pps.lpm,
vs=max_vs.vs,
adpm=max_vs.adpm,
apm_holder=latest_data.high_apm.username.upper(),
pps_holder=latest_data.high_pps.username.upper(),
vs_holder=latest_data.high_vs.username.upper(),
),
updated_at=latest.update_time.replace(tzinfo=UTC).astimezone(ZoneInfo('Asia/Shanghai')),
lang=get_lang(),
),
)


@@ -15,14 +15,13 @@ from yarl import URL
from ....db import query_bind_info, trigger
from ....i18n import Lang
from ....utils.exception import RecordNotFoundError
from ....utils.host import HostPage, get_self_netloc
from ....utils.host import get_self_netloc
from ....utils.lang import get_lang
from ....utils.metrics import get_metrics
from ....utils.render import render
from ....utils.render import render_image
from ....utils.render.schemas.base import Avatar
from ....utils.render.schemas.v2.tetrio.record.base import Finesse, Max, Mini, Tspins, User
from ....utils.render.schemas.v2.tetrio.record.blitz import Record, Statistic
from ....utils.screenshot import screenshot
from ....utils.typedefs import Me
from .. import alc
from ..api.player import Player
@@ -60,7 +59,7 @@ async def _(
game_platform=GAME_TYPE,
)
if bind is None:
await matcher.finish('Bind info not found')
await matcher.finish(Lang.bind.not_found())
player = Player(user_id=bind.game_account, trust=True)
await (
UniMessage.i18n(Lang.interaction.warning.unverified) + UniMessage.image(raw=await make_blitz_image(player))
@@ -81,73 +80,69 @@ async def _(account: Player, event_session: Uninfo):
async def make_blitz_image(player: Player) -> bytes:
user, blitz = await gather(player.user, player.blitz)
if blitz.data.record is None:
msg = f'Blitz record not found for user {user.name.upper()}'
msg = Lang.record.not_found(username=user.name.upper(), mode=Lang.record.blitz())
raise RecordNotFoundError(msg)
stats = blitz.data.record.results.stats
clears = stats.clears
duration = timedelta(milliseconds=stats.finaltime).total_seconds()
metrics = get_metrics(pps=stats.piecesplaced / duration)
netloc = get_self_netloc()
async with HostPage(
page=await render(
'v2/tetrio/record/blitz',
Record(
type='best',
user=User(
id=user.ID,
name=user.name.upper(),
avatar=str(
URL(f'http://{netloc}/host/resource/tetrio/avatars/{user.ID}') % {'revision': avatar_revision}
)
if (avatar_revision := (await player.avatar_revision)) is not None and avatar_revision != 0
else Avatar(
type='identicon',
hash=md5(user.ID.encode()).hexdigest(), # noqa: S324
),
return await render_image(
Record(
type='best',
user=User(
id=user.ID,
name=user.name.upper(),
avatar=str(
URL(f'http://{netloc}/host/resource/tetrio/avatars/{user.ID}') % {'revision': avatar_revision}
)
if (avatar_revision := (await player.avatar_revision)) is not None and avatar_revision != 0
else Avatar(
type='identicon',
hash=md5(user.ID.encode()).hexdigest(), # noqa: S324
),
replay_id=blitz.data.record.replayid,
rank=blitz.data.rank,
personal_rank=1,
statistic=Statistic(
keys=stats.inputs,
kpp=round(stats.inputs / stats.piecesplaced, 2),
kps=round(stats.inputs / duration, 2),
max=Max(
combo=max((0, stats.topcombo - 1)),
btb=max((0, stats.topbtb - 1)),
),
pieces=stats.piecesplaced,
pps=metrics.pps,
lines=stats.lines,
lpm=metrics.lpm,
holds=stats.holds,
score=stats.score,
spp=round(stats.score / stats.piecesplaced, 2),
single=clears.singles,
double=clears.doubles,
triple=clears.triples,
quad=clears.quads,
tspins=Tspins(
total=clears.realtspins,
single=clears.tspinsingles,
double=clears.tspindoubles,
triple=clears.tspintriples,
mini=Mini(
total=clears.minitspins,
single=clears.minitspinsingles,
double=clears.minitspindoubles,
),
),
all_clear=clears.allclear,
finesse=Finesse(
faults=stats.finesse.faults,
accuracy=round(stats.finesse.perfectpieces / stats.piecesplaced * 100, 2),
),
level=stats.level,
),
play_at=blitz.data.record.ts,
lang=get_lang(),
),
)
) as page_hash:
return await screenshot(f'http://{netloc}/host/{page_hash}.html')
replay_id=blitz.data.record.replayid,
rank=blitz.data.rank,
personal_rank=1,
statistic=Statistic(
keys=stats.inputs,
kpp=round(stats.inputs / stats.piecesplaced, 2),
kps=round(stats.inputs / duration, 2),
max=Max(
combo=max((0, stats.topcombo - 1)),
btb=max((0, stats.topbtb - 1)),
),
pieces=stats.piecesplaced,
pps=metrics.pps,
lines=stats.lines,
lpm=metrics.lpm,
holds=stats.holds,
score=stats.score,
spp=round(stats.score / stats.piecesplaced, 2),
single=clears.singles,
double=clears.doubles,
triple=clears.triples,
quad=clears.quads,
tspins=Tspins(
total=clears.realtspins,
single=clears.tspinsingles,
double=clears.tspindoubles,
triple=clears.tspintriples,
mini=Mini(
total=clears.minitspins,
single=clears.minitspinsingles,
double=clears.minitspindoubles,
),
),
all_clear=clears.allclear,
finesse=Finesse(
faults=stats.finesse.faults,
accuracy=round(stats.finesse.perfectpieces / stats.piecesplaced * 100, 2),
),
level=stats.level,
),
play_at=blitz.data.record.ts,
lang=get_lang(),
),
)


@@ -15,14 +15,13 @@ from yarl import URL
from ....db import query_bind_info, trigger
from ....i18n import Lang
from ....utils.exception import RecordNotFoundError
from ....utils.host import HostPage, get_self_netloc
from ....utils.host import get_self_netloc
from ....utils.lang import get_lang
from ....utils.metrics import get_metrics
from ....utils.render import render
from ....utils.render import render_image
from ....utils.render.schemas.base import Avatar
from ....utils.render.schemas.v2.tetrio.record.base import Finesse, Max, Mini, Statistic, Tspins, User
from ....utils.render.schemas.v2.tetrio.record.sprint import Record
from ....utils.screenshot import screenshot
from ....utils.typedefs import Me
from .. import alc
from ..api.player import Player
@@ -60,7 +59,7 @@ async def _(
game_platform=GAME_TYPE,
)
if bind is None:
await matcher.finish('Bind info not found')
await matcher.finish(Lang.bind.not_found())
player = Player(user_id=bind.game_account, trust=True)
await (
UniMessage.i18n(Lang.interaction.warning.unverified) + UniMessage.image(raw=await make_sprint_image(player))
@@ -81,7 +80,7 @@ async def _(account: Player, event_session: Uninfo):
async def make_sprint_image(player: Player) -> bytes:
user, sprint = await gather(player.user, player.sprint)
if sprint.data.record is None:
msg = f'40L record not found for user {user.name.upper()}'
msg = Lang.record.not_found(username=user.name.upper(), mode=Lang.record.sprint())
raise RecordNotFoundError(msg)
stats = sprint.data.record.results.stats
clears = stats.clears
@@ -89,65 +88,61 @@ async def make_sprint_image(player: Player) -> bytes:
sprint_value = f'{duration:.3f}s' if duration < 60 else f'{duration // 60:.0f}m {duration % 60:.3f}s' # noqa: PLR2004
metrics = get_metrics(pps=stats.piecesplaced / duration)
netloc = get_self_netloc()
async with HostPage(
page=await render(
'v2/tetrio/record/sprint',
Record(
type='best',
user=User(
id=user.ID,
name=user.name.upper(),
avatar=str(
URL(f'http://{netloc}/host/resource/tetrio/avatars/{user.ID}') % {'revision': avatar_revision}
)
if (avatar_revision := (await player.avatar_revision)) is not None and avatar_revision != 0
else Avatar(
type='identicon',
hash=md5(user.ID.encode()).hexdigest(), # noqa: S324
),
return await render_image(
Record(
type='best',
user=User(
id=user.ID,
name=user.name.upper(),
avatar=str(
URL(f'http://{netloc}/host/resource/tetrio/avatars/{user.ID}') % {'revision': avatar_revision}
)
if (avatar_revision := (await player.avatar_revision)) is not None and avatar_revision != 0
else Avatar(
type='identicon',
hash=md5(user.ID.encode()).hexdigest(), # noqa: S324
),
time=sprint_value,
replay_id=sprint.data.record.replayid,
rank=sprint.data.rank,
personal_rank=1,
statistic=Statistic(
keys=stats.inputs,
kpp=round(stats.inputs / stats.piecesplaced, 2),
kps=round(stats.inputs / duration, 2),
max=Max(
combo=max((0, stats.topcombo - 1)),
btb=max((0, stats.topbtb - 1)),
),
pieces=stats.piecesplaced,
pps=metrics.pps,
lines=stats.lines,
lpm=metrics.lpm,
holds=stats.holds,
score=stats.score,
single=clears.singles,
double=clears.doubles,
triple=clears.triples,
quad=clears.quads,
tspins=Tspins(
total=clears.realtspins,
single=clears.tspinsingles,
double=clears.tspindoubles,
triple=clears.tspintriples,
mini=Mini(
total=clears.minitspins,
single=clears.minitspinsingles,
double=clears.minitspindoubles,
),
),
all_clear=clears.allclear,
finesse=Finesse(
faults=stats.finesse.faults,
accuracy=round(stats.finesse.perfectpieces / stats.piecesplaced * 100, 2),
),
),
play_at=sprint.data.record.ts,
lang=get_lang(),
),
)
) as page_hash:
return await screenshot(f'http://{netloc}/host/{page_hash}.html')
time=sprint_value,
replay_id=sprint.data.record.replayid,
rank=sprint.data.rank,
personal_rank=1,
statistic=Statistic(
keys=stats.inputs,
kpp=round(stats.inputs / stats.piecesplaced, 2),
kps=round(stats.inputs / duration, 2),
max=Max(
combo=max((0, stats.topcombo - 1)),
btb=max((0, stats.topbtb - 1)),
),
pieces=stats.piecesplaced,
pps=metrics.pps,
lines=stats.lines,
lpm=metrics.lpm,
holds=stats.holds,
score=stats.score,
single=clears.singles,
double=clears.doubles,
triple=clears.triples,
quad=clears.quads,
tspins=Tspins(
total=clears.realtspins,
single=clears.tspinsingles,
double=clears.tspindoubles,
triple=clears.tspintriples,
mini=Mini(
total=clears.minitspins,
single=clears.minitspinsingles,
double=clears.minitspindoubles,
),
),
all_clear=clears.allclear,
finesse=Finesse(
faults=stats.finesse.faults,
accuracy=round(stats.finesse.perfectpieces / stats.piecesplaced * 100, 2),
),
),
play_at=sprint.data.record.ts,
lang=get_lang(),
),
)

View File
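Across these hunks the three-step `render` → `HostPage` → `screenshot` pipeline is collapsed into a single `render_image` call. A minimal sketch of what such a wrapper could look like, with stand-in helpers (the stand-in names and behavior are assumptions; the real project's signatures are not shown in this diff):

```python
import asyncio
from contextlib import asynccontextmanager

# Stand-ins for the project's helpers; behavior here is illustrative only.
async def render(template: str, data: dict) -> str:
    return f"<html><!-- {template} --></html>"

@asynccontextmanager
async def host_page(page: str):
    # The real HostPage serves the rendered page and yields its hash.
    yield "deadbeef"

async def screenshot(url: str) -> bytes:
    return url.encode()

async def render_image(template: str, data: dict) -> bytes:
    # One call replaces the render -> host -> screenshot boilerplate
    # previously repeated at every call site.
    async with host_page(await render(template, data)) as page_hash:
        return await screenshot(f"http://localhost/host/{page_hash}.html")

image = asyncio.run(render_image("v1/binding", {"type": "unbind"}))
```

Each caller is then reduced to building the schema object (`Bind`, `Record`, `Info`, …) and sending the returned bytes with `UniMessage.image(raw=...)`.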

@@ -13,12 +13,13 @@ from yarl import URL
from ...config.config import global_config
from ...db import query_bind_info, remove_bind, trigger
from ...utils.host import HostPage, get_self_netloc
from ...i18n import Lang
from ...utils.host import get_self_netloc
from ...utils.image import get_avatar
from ...utils.lang import get_lang
from ...utils.render import Bind, render
from ...utils.render import render_image
from ...utils.render.schemas.base import Avatar, People
from ...utils.screenshot import screenshot
from ...utils.render.schemas.bind import Bind
from . import alc, command
from .api import Player
from .constant import GAME_TYPE
@@ -44,19 +45,18 @@ async def _(nb_user: User, event_session: Uninfo, interface: QryItrface):
get_session() as session,
):
if (bind := await query_bind_info(session=session, user=nb_user, game_platform=GAME_TYPE)) is None:
await UniMessage('您还未绑定 TETR.IO 账号').finish()
resp = await suggest('您确定要解绑吗?', ['', ''])
await UniMessage(Lang.bind.no_account(game='TETR.IO')).finish()
resp = await suggest(Lang.bind.confirm_unbind(), ['', ''])
if resp is None or resp.extract_plain_text() == '':
return
player = Player(user_id=bind.game_account, trust=True)
user = await player.user
netloc = get_self_netloc()
async with HostPage(
await render(
'v1/binding',
await UniMessage.image(
raw=await render_image(
Bind(
platform='TETR.IO',
type='unlink',
type='unbind',
user=People(
avatar=str(
URL(f'http://{netloc}/host/resource/tetrio/avatars/{user.ID}')
@@ -77,10 +77,9 @@ async def _(nb_user: User, event_session: Uninfo, interface: QryItrface):
),
name=bot_user.nick or bot_user.name or choice(list(global_config.nickname) or ['bot']),
),
prompt='io绑定{游戏ID}',
prompt=Lang.prompt.io_bind(),
lang=get_lang(),
),
)
) as page_hash:
await UniMessage.image(raw=await screenshot(f'http://{netloc}/host/{page_hash}.html')).send()
).send()
await remove_bind(session=session, user=nb_user, game_platform=GAME_TYPE)

View File

@@ -0,0 +1,111 @@
from asyncio import gather
from hashlib import md5
from secrets import choice
from nonebot_plugin_alconna import Subcommand
from nonebot_plugin_alconna.uniseg import UniMessage
from nonebot_plugin_orm import get_session
from nonebot_plugin_uninfo import QryItrface, Uninfo
from nonebot_plugin_uninfo import User as UninfoUser
from nonebot_plugin_uninfo.orm import get_session_persist_id
from nonebot_plugin_user import User
from yarl import URL
from ...config.config import global_config
from ...db import create_or_update_bind, query_bind_info, trigger
from ...i18n import Lang
from ...utils.host import get_self_netloc
from ...utils.image import get_avatar
from ...utils.lang import get_lang
from ...utils.render import render_image
from ...utils.render.schemas.base import Avatar, People
from ...utils.render.schemas.bind import Bind
from . import alc, command
from .api import Player
from .constant import GAME_TYPE
command.add(Subcommand('verify', help_text='验证 TETR.IO 账号'))
alc.shortcut(
'(?i:io)(?i:验证|verify)',
command='tstats TETR.IO verify',
humanized='io验证',
)
try:
from nonebot.adapters.discord import MessageCreateEvent
@alc.assign('TETRIO.verify')
async def _(_: MessageCreateEvent, nb_user: User, event_session: Uninfo, interface: QryItrface):
async with (
trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='verify',
command_args=[],
),
get_session() as session,
):
if (bind := await query_bind_info(session=session, user=nb_user, game_platform=GAME_TYPE)) is None:
await UniMessage(Lang.bind.no_account(game='TETR.IO')).finish()
if bind.verify is True:
await UniMessage(Lang.bind.verify_already()).finish()
player = Player(user_id=bind.game_account, trust=True)
user_info = await player.get_info()
verify = (
user_info.data.connections.discord is not None
and user_info.data.connections.discord.id == event_session.user.id
)
if verify is False:
await UniMessage(Lang.bind.verify_failed(game='TETR.IO')).finish()
await create_or_update_bind(
session=session,
user=nb_user,
game_platform=GAME_TYPE,
game_account=bind.game_account,
verify=verify,
)
user, avatar_revision = await gather(player.user, player.avatar_revision)
await UniMessage.image(
raw=await render_image(
Bind(
platform='TETR.IO',
type='success',
user=People(
avatar=str(
URL(f'http://{get_self_netloc()}/host/resource/tetrio/avatars/{user.ID}')
% {'revision': avatar_revision}
)
if avatar_revision is not None and avatar_revision != 0
else Avatar(type='identicon', hash=md5(user.ID.encode()).hexdigest()), # noqa: S324
name=user.name.upper(),
),
bot=People(
avatar=await get_avatar(
(
bot_user := await interface.get_user(event_session.self_id)
or UninfoUser(id=event_session.self_id)
),
'Data URI',
'../../static/logo/logo.svg',
),
name=bot_user.nick or bot_user.name or choice(list(global_config.nickname) or ['bot']),
),
prompt=Lang.prompt.io_check(),
lang=get_lang(),
),
)
).finish()
except ImportError:
pass
@alc.assign('TETRIO.verify')
async def _(event_session: Uninfo):
async with trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='verify',
command_args=[],
):
await UniMessage(Lang.bind.only_discord()).finish()

View File
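The verify flow above boils down to one check: the Discord ID linked on the TETR.IO profile must equal the Discord ID of the user issuing the command, and a profile with no Discord connection can never verify. As a sketch (function and parameter names are assumptions, not the project's API):

```python
from __future__ import annotations

def discord_verified(connection_id: str | None, event_user_id: str) -> bool:
    # A missing connection never verifies; otherwise the linked Discord
    # account must be the one sending the command.
    return connection_id is not None and connection_id == event_user_id
```

This is why the handler is only registered when `nonebot.adapters.discord` imports successfully; on other adapters the fallback handler replies with `Lang.bind.only_discord()`.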

@@ -1,6 +1,7 @@
from arclet.alconna import Arg, ArgFlag
from nonebot_plugin_alconna import Args, At, Subcommand
from nonebot_plugin_alconna import Args, At, Option, Subcommand
from ...utils.duration import parse_duration
from ...utils.exception import MessageFormatError
from ...utils.typedefs import Me
from .. import add_block_handlers, alc, command
@@ -33,6 +34,16 @@ command.add(
'unbind',
help_text='解除绑定 TOP 账号',
),
Subcommand(
'config',
Option(
'--default-compare',
Arg('compare', parse_duration, notice='对比时间距离'),
alias=['-DC', 'DefaultCompare'],
help_text='设置默认对比时间距离',
),
help_text='TOP 查询个性化配置',
),
Subcommand(
'query',
Args(
@@ -49,6 +60,12 @@ command.add(
flags=[ArgFlag.HIDDEN, ArgFlag.OPTIONAL],
),
),
Option(
'--compare',
Arg('compare', parse_duration),
alias=['-C'],
help_text='指定对比时间距离',
),
help_text='查询 TOP 游戏信息',
),
help_text='TOP 游戏相关指令',
@@ -70,7 +87,12 @@ alc.shortcut(
command='tstats TOP query',
humanized='top查',
)
alc.shortcut(
'(?i:top)(?i:配置|配|config)',
command='tstats TOP config',
humanized='top配置',
)
add_block_handlers(alc.assign('TOP.query'))
from . import bind, query, unbind # noqa: E402, F401
from . import bind, config, query, unbind # noqa: E402, F401

View File
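Both the new `--compare` and `--default-compare` options feed their argument through `parse_duration` to obtain a `timedelta`. The parser's actual grammar is not shown in this diff; a hypothetical sketch of such a helper, assuming compact `7d` / `1h30m`-style input:

```python
import re
from datetime import timedelta

# Hypothetical duration grammar; the project's parse_duration may differ.
_UNITS = {"d": 86400, "h": 3600, "m": 60, "s": 1}

def parse_duration(text: str) -> timedelta:
    seconds = 0
    for value, unit in re.findall(r"(\d+)([dhms])", text):
        seconds += int(value) * _UNITS[unit]
    if seconds == 0:
        raise ValueError(f"not a duration: {text!r}")
    return timedelta(seconds=seconds)
```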

@@ -10,6 +10,8 @@ from .schemas.user_profile import UserProfile
class TOPHistoricalData(MappedAsDataclass, Model):
__tablename__ = 'nb_t_top_hist_data'
id: Mapped[int] = mapped_column(init=False, primary_key=True)
user_unique_identifier: Mapped[str] = mapped_column(String(24), index=True)
api_type: Mapped[Literal['User Profile']] = mapped_column(String(16), index=True)

View File

@@ -9,12 +9,12 @@ from nonebot_plugin_user import User
from ...config.config import global_config
from ...db import BindStatus, create_or_update_bind, trigger
from ...utils.host import HostPage, get_self_netloc
from ...i18n import Lang
from ...utils.image import get_avatar
from ...utils.lang import get_lang
from ...utils.render import Bind, render
from ...utils.render import render_image
from ...utils.render.schemas.base import People
from ...utils.screenshot import screenshot
from ...utils.render.schemas.bind import Bind
from . import alc
from .api import Player
from .constant import GAME_TYPE
@@ -37,9 +37,8 @@ async def _(nb_user: User, account: Player, event_session: Uninfo, interface: Qr
game_account=user.unique_identifier,
)
if bind_status in (BindStatus.SUCCESS, BindStatus.UPDATE):
async with HostPage(
await render(
'v1/binding',
await UniMessage.image(
raw=await render_image(
Bind(
platform=GAME_TYPE,
type='unknown',
@@ -62,11 +61,8 @@ async def _(nb_user: User, account: Player, event_session: Uninfo, interface: Qr
),
name=bot_user.nick or bot_user.name or choice(list(global_config.nickname) or ['bot']),
),
prompt='top查我',
prompt=Lang.prompt.top_check(),
lang=get_lang(),
),
)
) as page_hash:
await UniMessage.image(
raw=await screenshot(f'http://{get_self_netloc()}/host/{page_hash}.html')
).finish()
).finish()

View File

@@ -0,0 +1,32 @@
from datetime import timedelta
from nonebot_plugin_alconna.uniseg import UniMessage
from nonebot_plugin_orm import async_scoped_session
from nonebot_plugin_uninfo import Uninfo
from nonebot_plugin_uninfo.orm import get_session_persist_id
from nonebot_plugin_user import User
from sqlalchemy import select
from ...db import trigger
from ...i18n import Lang
from . import alc
from .constant import GAME_TYPE
from .models import TOPUserConfig
@alc.assign('TOP.config')
async def _(user: User, session: async_scoped_session, event_session: Uninfo, compare: timedelta):
async with trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='config',
command_args=[f'--default-compare {compare}'],
):
config = (await session.scalars(select(TOPUserConfig).where(TOPUserConfig.id == user.id))).one_or_none()
if config is None:
config = TOPUserConfig(id=user.id, compare_delta=compare)
session.add(config)
else:
config.compare_delta = compare
await session.commit()
await UniMessage(Lang.bind.config_success()).finish()

View File
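The config handler above is a select-then-insert-or-update on a one-row-per-user table. A synchronous stand-alone sketch of the same upsert using stdlib `sqlite3` (the real handler uses SQLAlchemy's async session and an `Interval` column, so this is an analogy, not the project's code):

```python
import sqlite3
from datetime import timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cfg (id INTEGER PRIMARY KEY, compare_seconds REAL)")

def set_default_compare(user_id: int, compare: timedelta) -> None:
    # INSERT ... ON CONFLICT mirrors the "add if missing, else update" branch
    # in the handler; the timedelta is stored as seconds for portability.
    conn.execute(
        "INSERT INTO cfg (id, compare_seconds) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET compare_seconds = excluded.compare_seconds",
        (user_id, compare.total_seconds()),
    )
    conn.commit()

set_default_compare(1, timedelta(days=7))
set_default_compare(1, timedelta(days=1))  # second call overwrites, not duplicates
row = conn.execute("SELECT compare_seconds FROM cfg WHERE id = 1").fetchone()
```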

@@ -0,0 +1,12 @@
from datetime import timedelta
from nonebot_plugin_orm import Model
from sqlalchemy import Interval
from sqlalchemy.orm import Mapped, MappedAsDataclass, mapped_column
class TOPUserConfig(MappedAsDataclass, Model):
__tablename__ = 'nb_t_top_u_cfg'
id: Mapped[int] = mapped_column(primary_key=True)
compare_delta: Mapped[timedelta | None] = mapped_column(Interval(native=True), nullable=True)

View File

@@ -1,38 +1,96 @@
from datetime import datetime, timedelta, timezone
from nonebot.adapters import Event
from nonebot.matcher import Matcher
from nonebot_plugin_alconna import At
from nonebot_plugin_alconna.uniseg import UniMessage
from nonebot_plugin_orm import get_session
from nonebot_plugin_alconna.uniseg import Image, UniMessage
from nonebot_plugin_orm import AsyncSession, get_session
from nonebot_plugin_uninfo import Uninfo
from nonebot_plugin_uninfo.orm import get_session_persist_id
from nonebot_plugin_user import User as NBUser
from nonebot_plugin_user import get_user
from sqlalchemy import select
from ...db import query_bind_info, trigger
from ...db import query_bind_info, resolve_compare_delta, trigger
from ...i18n import Lang
from ...utils.exception import FallbackError
from ...utils.host import HostPage, get_self_netloc
from ...utils.lang import get_lang
from ...utils.metrics import TetrisMetricsBasicWithLPM, get_metrics
from ...utils.render import render
from ...utils.render import render_image
from ...utils.render.avatar import get_avatar
from ...utils.render.schemas.base import People, Trending
from ...utils.render.schemas.v1.top.info import Data as InfoData
from ...utils.render.schemas.v1.top.info import Info
from ...utils.screenshot import screenshot
from ...utils.typedefs import Me
from . import alc
from .api import Player
from .api.models import TOPHistoricalData
from .api.schemas.user_profile import Data, UserProfile
from .constant import GAME_TYPE
from .models import TOPUserConfig
UTC = timezone.utc
async def get_compare_profile(session: AsyncSession, user_name: str, target_time: datetime) -> UserProfile | None:
before = await session.scalar(
select(TOPHistoricalData)
.where(
TOPHistoricalData.user_unique_identifier == user_name,
TOPHistoricalData.api_type == 'User Profile',
TOPHistoricalData.update_time <= target_time,
)
.order_by(TOPHistoricalData.update_time.desc())
.limit(1)
)
after = await session.scalar(
select(TOPHistoricalData)
.where(
TOPHistoricalData.user_unique_identifier == user_name,
TOPHistoricalData.api_type == 'User Profile',
TOPHistoricalData.update_time >= target_time,
)
.order_by(TOPHistoricalData.update_time.asc())
.limit(1)
)
if before is None:
selected = after
elif after is None:
selected = before
else:
selected = (
before
if abs((target_time - before.update_time).total_seconds())
<= abs((target_time - after.update_time).total_seconds())
else after
)
if selected is None or not isinstance(selected.data, UserProfile):
return None
return selected.data
def compare_metrics(
current: TetrisMetricsBasicWithLPM, compare: TetrisMetricsBasicWithLPM | None
) -> tuple[Trending, Trending]:
if compare is None:
return Trending.KEEP, Trending.KEEP
return Trending.compare(compare.lpm, current.lpm), Trending.compare(compare.apm, current.apm)
@alc.assign('TOP.query')
async def _(event: Event, matcher: Matcher, target: At | Me, event_session: Uninfo):
async def _( # noqa: PLR0913
user: NBUser,
event: Event,
matcher: Matcher,
target: At | Me,
event_session: Uninfo,
compare: timedelta | None = None,
):
async with trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='query',
command_args=[],
command_args=[f'--compare {compare}'] if compare is not None else [],
):
async with get_session() as session:
bind = await query_bind_info(
@@ -42,23 +100,44 @@ async def _(event: Event, matcher: Matcher, target: At | Me, event_session: Unin
),
game_platform=GAME_TYPE,
)
if bind is None:
await matcher.finish('未查询到绑定信息')
if bind is None:
await matcher.finish(Lang.bind.not_found())
compare_delta = await resolve_compare_delta(TOPUserConfig, session, user.id, compare)
player = Player(user_name=bind.game_account, trust=True)
profile = await player.get_profile()
compare_profile = await get_compare_profile(
session,
profile.user_name,
datetime.now(tz=UTC) - compare_delta,
)
await (
UniMessage.i18n(Lang.interaction.warning.unverified)
+ await make_query_result(await Player(user_name=bind.game_account, trust=True).get_profile())
+ (
UniMessage('\n')
if not (result := await make_query_result(profile, compare_profile)).has(Image)
else UniMessage()
)
+ result
).finish()
@alc.assign('TOP.query')
async def _(account: Player, event_session: Uninfo):
async def _(user: NBUser, account: Player, event_session: Uninfo, compare: timedelta | None = None):
async with trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='query',
command_args=[],
command_args=[f'--compare {compare}'] if compare is not None else [],
):
await (await make_query_result(await account.get_profile())).finish()
async with get_session() as session:
compare_delta = await resolve_compare_delta(TOPUserConfig, session, user.id, compare)
profile = await account.get_profile()
compare_profile = await get_compare_profile(
session,
profile.user_name,
datetime.now(tz=UTC) - compare_delta,
)
await (await make_query_result(profile, compare_profile)).finish()
def get_avg_metrics(data: list[Data]) -> TetrisMetricsBasicWithLPM:
@@ -70,61 +149,61 @@ def get_avg_metrics(data: list[Data]) -> TetrisMetricsBasicWithLPM:
return get_metrics(lpm=total_lpm / num, apm=total_apm / num)
async def make_query_image(profile: UserProfile) -> bytes:
async def make_query_image(profile: UserProfile, compare: UserProfile | None) -> bytes:
if profile.today is None or profile.total is None:
raise FallbackError
today = get_metrics(lpm=profile.today.lpm, apm=profile.today.apm)
history = get_avg_metrics(profile.total)
async with HostPage(
await render(
'v1/top/info',
Info(
user=People(avatar=get_avatar(profile.user_name), name=profile.user_name),
today=InfoData(
pps=today.pps,
lpm=today.lpm,
lpm_trending=Trending.KEEP,
apm=today.apm,
apl=today.apl,
apm_trending=Trending.KEEP,
),
historical=InfoData(
pps=history.pps,
lpm=history.lpm,
lpm_trending=Trending.KEEP,
apm=history.apm,
apl=history.apl,
apm_trending=Trending.KEEP,
),
lang=get_lang(),
compare_today = get_metrics(lpm=compare.today.lpm, apm=compare.today.apm) if compare and compare.today else None
compare_history = get_avg_metrics(compare.total) if compare is not None and compare.total is not None else None
today_lpm_trending, today_apm_trending = compare_metrics(today, compare_today)
history_lpm_trending, history_apm_trending = compare_metrics(history, compare_history)
return await render_image(
Info(
user=People(avatar=get_avatar(profile.user_name), name=profile.user_name),
today=InfoData(
pps=today.pps,
lpm=today.lpm,
lpm_trending=today_lpm_trending,
apm=today.apm,
apl=today.apl,
apm_trending=today_apm_trending,
),
)
) as page_hash:
return await screenshot(f'http://{get_self_netloc()}/host/{page_hash}.html')
historical=InfoData(
pps=history.pps,
lpm=history.lpm,
lpm_trending=history_lpm_trending,
apm=history.apm,
apl=history.apl,
apm_trending=history_apm_trending,
),
lang=get_lang(),
),
)
def make_query_text(profile: UserProfile) -> UniMessage:
message = ''
if profile.today is not None:
today = get_metrics(lpm=profile.today.lpm, apm=profile.today.apm)
message += f'用户 {profile.user_name} 24小时内统计数据为: '
message += f"\nL'PM: {today.lpm} ( {today.pps} pps )"
message += f'\nAPM: {today.apm} ( x{today.apl} )'
message += Lang.stats.daily_stats(name=profile.user_name)
message += Lang.stats.lpm(lpm=today.lpm, pps=today.pps)
message += Lang.stats.apm(apm=today.apm, apl=today.apl)
else:
message += f'用户 {profile.user_name} 暂无24小时内统计数据'
message += Lang.stats.no_daily(name=profile.user_name)
if profile.total is not None:
total = get_avg_metrics(profile.total)
message += '\n历史统计数据为: '
message += f"\nL'PM: {total.lpm} ( {total.pps} pps )"
message += f'\nAPM: {total.apm} ( x{total.apl} )'
message += Lang.stats.history_stats()
message += Lang.stats.lpm(lpm=total.lpm, pps=total.pps)
message += Lang.stats.apm(apm=total.apm, apl=total.apl)
else:
message += '\n暂无历史统计数据'
message += Lang.stats.no_history()
return UniMessage(message)
async def make_query_result(profile: UserProfile) -> UniMessage:
async def make_query_result(profile: UserProfile, compare: UserProfile | None) -> UniMessage:
try:
return UniMessage.image(raw=await make_query_image(profile))
return UniMessage.image(raw=await make_query_image(profile, compare))
except FallbackError:
...
return make_query_text(profile)

View File
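`get_compare_profile` above selects the historical snapshot nearest to the target time: the latest record at or before the target, the earliest at or after it, then whichever is closer (ties go to the earlier one). The selection logic, reduced to an in-memory sketch over plain timestamps (names are illustrative):

```python
from __future__ import annotations

from datetime import datetime, timedelta

def pick_nearest(times: list[datetime], target: datetime) -> datetime | None:
    # Latest snapshot at or before the target, earliest at or after it.
    before = max((t for t in times if t <= target), default=None)
    after = min((t for t in times if t >= target), default=None)
    if before is None:
        return after
    if after is None:
        return before
    # On a tie, prefer the earlier snapshot, matching the <= in the handler.
    return before if abs(target - before) <= abs(target - after) else after

base = datetime(2026, 3, 1)
snapshots = [base, base + timedelta(days=2), base + timedelta(days=10)]
```

In the real handler the two bounds come from two ordered `LIMIT 1` queries against `TOPHistoricalData`, and the result is additionally rejected unless its `data` payload is a `UserProfile`.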

@@ -10,12 +10,12 @@ from nonebot_plugin_waiter import suggest # type: ignore[import-untyped]
from ...config.config import global_config
from ...db import query_bind_info, remove_bind, trigger
from ...utils.host import HostPage, get_self_netloc
from ...i18n import Lang
from ...utils.image import get_avatar
from ...utils.lang import get_lang
from ...utils.render import Bind, render
from ...utils.render import render_image
from ...utils.render.schemas.base import People
from ...utils.screenshot import screenshot
from ...utils.render.schemas.bind import Bind
from . import alc
from .api import Player
from .constant import GAME_TYPE
@@ -37,19 +37,17 @@ async def _(
get_session() as session,
):
if (bind := await query_bind_info(session=session, user=nb_user, game_platform=GAME_TYPE)) is None:
await UniMessage('您还未绑定 TOP 账号').finish()
resp = await suggest('您确定要解绑吗?', ['', ''])
await UniMessage(Lang.bind.no_account(game='TOP')).finish()
resp = await suggest(Lang.bind.confirm_unbind(), ['', ''])
if resp is None or resp.extract_plain_text() == '':
return
player = Player(user_name=bind.game_account, trust=True)
user = await player.user
netloc = get_self_netloc()
async with HostPage(
await render(
'v1/binding',
await UniMessage.image(
raw=await render_image(
Bind(
platform='TOP',
type='unlink',
type='unbind',
user=People(
avatar=await get_avatar(
event_session.user,
@@ -69,10 +67,9 @@ async def _(
),
name=bot_user.nick or bot_user.name or choice(list(global_config.nickname) or ['bot']),
),
prompt='top绑定{游戏ID}',
prompt=Lang.prompt.top_bind(),
lang=get_lang(),
),
)
) as page_hash:
await UniMessage.image(raw=await screenshot(f'http://{netloc}/host/{page_hash}.html')).send()
).send()
await remove_bind(session=session, user=nb_user, game_platform=GAME_TYPE)

View File

@@ -1,6 +1,7 @@
from arclet.alconna import Arg, ArgFlag
from nonebot_plugin_alconna import Args, At, Subcommand
from nonebot_plugin_alconna import Args, At, Option, Subcommand
from ...utils.duration import parse_duration
from ...utils.exception import MessageFormatError
from ...utils.typedefs import Me
from .. import add_block_handlers, alc, command
@@ -38,6 +39,16 @@ command.add(
'unbind',
help_text='解除绑定 TOS 账号',
),
Subcommand(
'config',
Option(
'--default-compare',
Arg('compare', parse_duration, notice='对比时间距离'),
alias=['-DC', 'DefaultCompare'],
help_text='设置默认对比时间距离',
),
help_text='茶服 查询个性化配置',
),
Subcommand(
'query',
Args(
@@ -54,6 +65,12 @@ command.add(
flags=[ArgFlag.HIDDEN, ArgFlag.OPTIONAL],
),
),
Option(
'--compare',
Arg('compare', parse_duration),
alias=['-C'],
help_text='指定对比时间距离',
),
help_text='查询 茶服 游戏信息',
),
help_text='茶服 游戏相关指令',
@@ -75,7 +92,12 @@ alc.shortcut(
command='tstats TOS query',
humanized='茶服查',
)
alc.shortcut(
'(?i:tos|茶服)(?i:配置|配|config)',
command='tstats TOS config',
humanized='茶服配置',
)
add_block_handlers(alc.assign('TOS.query'))
from . import bind, query, unbind # noqa: E402, F401
from . import bind, config, query, unbind # noqa: E402, F401

View File

@@ -11,8 +11,10 @@ from .schemas.user_profile import UserProfile
class TOSHistoricalData(MappedAsDataclass, Model):
__tablename__ = 'nb_t_tos_hist_data'
id: Mapped[int] = mapped_column(init=False, primary_key=True)
user_unique_identifier: Mapped[str] = mapped_column(String(24), index=True)
user_unique_identifier: Mapped[str] = mapped_column(String(256), index=True)
api_type: Mapped[Literal['User Info', 'User Profile']] = mapped_column(String(16), index=True)
data: Mapped[UserInfoSuccess | UserProfile] = mapped_column(
PydanticType(get_model=[], models={UserInfoSuccess, UserProfile})

View File

@@ -68,7 +68,7 @@ class Player:
raw_user_info = await request.failover_request(
[i / path % query for i in BASE_URL], failover_code=[502], failover_exc=(TimeoutException,)
)
user_info: UserInfo = type_validate_json(UserInfo, raw_user_info) # type: ignore[arg-type]
user_info: UserInfo = type_validate_json(UserInfo, raw_user_info) # type: ignore[arg-type] # pyright: ignore[reportArgumentType]
if not isinstance(user_info, UserInfoSuccess):
msg = f'用户信息请求错误:\n{user_info.error}'
raise RequestError(msg)

View File

@@ -9,12 +9,12 @@ from nonebot_plugin_user import User
from ...config.config import global_config
from ...db import BindStatus, create_or_update_bind, trigger
from ...utils.host import HostPage, get_self_netloc
from ...i18n import Lang
from ...utils.image import get_avatar
from ...utils.lang import get_lang
from ...utils.render import Bind, render
from ...utils.render import render_image
from ...utils.render.schemas.base import People
from ...utils.screenshot import screenshot
from ...utils.render.schemas.bind import Bind
from . import alc
from .api import Player
from .constant import GAME_TYPE
@@ -43,9 +43,8 @@ async def _(
)
user_info = await account.get_info()
if bind_status in (BindStatus.SUCCESS, BindStatus.UPDATE):
async with HostPage(
await render(
'v1/binding',
await UniMessage.image(
raw=await render_image(
Bind(
platform=GAME_TYPE,
type='unknown',
@@ -68,11 +67,8 @@ async def _(
),
name=bot_user.nick or bot_user.name or choice(list(global_config.nickname) or ['bot']),
),
prompt='茶服查我',
prompt=Lang.prompt.tos_check(),
lang=get_lang(),
),
)
) as page_hash:
await UniMessage.image(
raw=await screenshot(f'http://{get_self_netloc()}/host/{page_hash}.html')
).finish()
).finish()

View File

@@ -0,0 +1,32 @@
from datetime import timedelta
from nonebot_plugin_alconna.uniseg import UniMessage
from nonebot_plugin_orm import async_scoped_session
from nonebot_plugin_uninfo import Uninfo
from nonebot_plugin_uninfo.orm import get_session_persist_id
from nonebot_plugin_user import User
from sqlalchemy import select
from ...db import trigger
from ...i18n import Lang
from . import alc
from .constant import GAME_TYPE
from .models import TOSUserConfig
@alc.assign('TOS.config')
async def _(user: User, session: async_scoped_session, event_session: Uninfo, compare: timedelta):
async with trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='config',
command_args=[f'--default-compare {compare}'],
):
config = (await session.scalars(select(TOSUserConfig).where(TOSUserConfig.id == user.id))).one_or_none()
if config is None:
config = TOSUserConfig(id=user.id, compare_delta=compare)
session.add(config)
else:
config.compare_delta = compare
await session.commit()
await UniMessage(Lang.bind.config_success()).finish()

View File

@@ -0,0 +1,12 @@
from datetime import timedelta
from nonebot_plugin_orm import Model
from sqlalchemy import Interval
from sqlalchemy.orm import Mapped, MappedAsDataclass, mapped_column
class TOSUserConfig(MappedAsDataclass, Model):
__tablename__ = 'nb_t_tos_u_cfg'
id: Mapped[int] = mapped_column(primary_key=True)
compare_delta: Mapped[timedelta | None] = mapped_column(Interval(native=True), nullable=True)

View File

@@ -1,4 +1,5 @@
from asyncio import gather
from collections.abc import Iterable
from datetime import datetime, timedelta, timezone
from http import HTTPStatus
from typing import Literal, NamedTuple
@@ -7,34 +8,38 @@ from zoneinfo import ZoneInfo
from nonebot.adapters import Event
from nonebot.matcher import Matcher
from nonebot_plugin_alconna import At
from nonebot_plugin_alconna.uniseg import UniMessage
from nonebot_plugin_orm import get_session
from nonebot_plugin_alconna.uniseg import Image, UniMessage
from nonebot_plugin_orm import AsyncSession, get_session
from nonebot_plugin_uninfo import Uninfo, User
from nonebot_plugin_uninfo.orm import get_session_persist_id
from nonebot_plugin_user import User as NBUser
from nonebot_plugin_user import get_user
from sqlalchemy import select
from ...db import query_bind_info, trigger
from ...db import query_bind_info, resolve_compare_delta, trigger
from ...i18n import Lang
from ...utils.chart import get_split, get_value_bounds, handle_history_data
from ...utils.exception import RequestError
from ...utils.host import HostPage, get_self_netloc
from ...utils.exception import FallbackError, RequestError
from ...utils.image import get_avatar
from ...utils.lang import get_lang
from ...utils.metrics import TetrisMetricsProWithLPMADPM, get_metrics
from ...utils.render import render
from ...utils.render import render_image
from ...utils.render.avatar import get_avatar as get_random_avatar
from ...utils.render.schemas.base import HistoryData, People, Trending
from ...utils.render.schemas.v1.base import History
from ...utils.render.schemas.v1.tos.info import Info, Multiplayer, Singleplayer
from ...utils.screenshot import screenshot
from ...utils.time_it import time_it
from ...utils.typedefs import Me, Number
from . import alc
from .api import Player
from .api.models import TOSHistoricalData
from .api.schemas.user_info import UserInfoSuccess
from .api.schemas.user_profile import Data as UserProfileData
from .api.schemas.user_profile import UserProfile
from .constant import GAME_TYPE
from .models import TOSUserConfig
UTC = timezone.utc
def add_special_handlers(
@@ -42,16 +47,18 @@ def add_special_handlers(
) -> None:
@alc.assign('TOS.query')
async def _(
user: NBUser,
event: Event,
target: At | Me,
event_session: Uninfo,
compare: timedelta | None = None,
):
if isinstance(event, match_event):
async with trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='query',
command_args=[],
command_args=[f'--compare {compare}'] if compare is not None else [],
):
player = Player(
teaid=f'{teaid_prefix}{target.target}'
@@ -60,16 +67,14 @@ def add_special_handlers(
trust=True,
)
try:
user_info, game_data = await gather(player.get_info(), get_game_data(player))
if game_data is not None:
await UniMessage.image(
raw=await make_query_image(
user_info,
game_data,
async with get_session() as session:
await (
await make_query_result(
player,
await resolve_compare_delta(TOSUserConfig, session, user.id, compare),
None if isinstance(target, At) else event_session.user,
)
).finish()
await make_query_text(user_info, game_data).finish()
except RequestError as e:
if e.status_code == HTTPStatus.BAD_REQUEST and '未找到此用户' in e.message:
return
@@ -108,58 +113,66 @@ except ImportError:
@alc.assign('TOS.query')
async def _(
async def _( # noqa: PLR0913
user: NBUser,
event: Event,
matcher: Matcher,
target: At | Me,
event_session: Uninfo,
compare: timedelta | None = None,
):
async with trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='query',
command_args=[],
async with (
trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='query',
command_args=[f'--compare {compare}'] if compare is not None else [],
),
get_session() as session,
):
async with get_session() as session:
bind = await query_bind_info(
session=session,
user=await get_user(
event_session.scope, target.target if isinstance(target, At) else event.get_user_id()
),
game_platform=GAME_TYPE,
)
bind = await query_bind_info(
session=session,
user=await get_user(event_session.scope, target.target if isinstance(target, At) else event.get_user_id()),
game_platform=GAME_TYPE,
)
if bind is None:
await matcher.finish('未查询到绑定信息')
message = UniMessage.i18n(Lang.interaction.warning.unverified)
await matcher.finish(Lang.bind.not_found())
player = Player(teaid=bind.game_account, trust=True)
user_info, game_data = await gather(player.get_info(), get_game_data(player))
if game_data is not None:
await (
message
+ UniMessage.image(
raw=await make_query_image(
user_info,
game_data,
await (
UniMessage.i18n(Lang.interaction.warning.unverified)
+ (
UniMessage('\n')
if not (
result := await make_query_result(
player,
await resolve_compare_delta(TOSUserConfig, session, user.id, compare),
None if isinstance(target, At) else event_session.user,
)
)
).finish()
await (message + make_query_text(user_info, game_data)).finish()
).has(Image)
else UniMessage()
)
+ result
).finish()
@alc.assign('TOS.query')
async def _(account: Player, event_session: Uninfo):
async with trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='query',
command_args=[],
async def _(user: NBUser, account: Player, event_session: Uninfo, compare: timedelta | None = None):
async with (
trigger(
session_persist_id=await get_session_persist_id(event_session),
game_platform=GAME_TYPE,
command_type='query',
command_args=[f'--compare {compare}'] if compare is not None else [],
),
get_session() as session,
):
user_info, game_data = await gather(account.get_info(), get_game_data(account))
await get_historical_data(user_info.data.teaid)
if game_data is not None:
await UniMessage.image(raw=await make_query_image(user_info, game_data, None)).finish()
await make_query_text(user_info, game_data).finish()
await (
await make_query_result(
account,
await resolve_compare_delta(TOSUserConfig, session, user.id, compare),
None,
)
).finish()
class GameData(NamedTuple):
@@ -170,48 +183,125 @@ class GameData(NamedTuple):
ge: Number
class GameAccumulator:
def __init__(self, target_num: int) -> None:
self._target_num = max(1, target_num)
self._weighted_total_lpm = 0.0
self._weighted_total_apm = 0.0
self._weighted_total_adpm = 0.0
self._total_time = 0.0
self._total_attack = 0
self._total_dig = 0
self._total_offset = 0
self._total_pieces = 0
self._total_receive = 0
self._num = 0
@property
def num(self) -> int:
return self._num
@property
def reached_target(self) -> bool:
return self._num >= self._target_num
def add(self, data: UserProfileData) -> bool:
# Exclude single-player games and games with zero duration
# Tea: games without digging are not counted, even for apm and lpm
if data.num_players == 1 or data.time == 0:
return False
seconds = data.time / 1000
self._weighted_total_lpm += 24 * data.pieces
self._weighted_total_apm += 60 * data.attack
self._weighted_total_adpm += 60 * (data.attack + data.dig)
self._total_attack += data.attack
self._total_dig += data.dig
self._total_offset += data.offset
self._total_pieces += data.pieces
self._total_receive += data.receive
self._total_time += seconds
self._num += 1
return True
def to_game_data(self) -> GameData | None:
if self._num == 0 or self._total_time == 0:
return None
metrics = get_metrics(
lpm=self._weighted_total_lpm / self._total_time,
apm=self._weighted_total_apm / self._total_time,
adpm=self._weighted_total_adpm / self._total_time,
)
return GameData(
game_num=self._num,
metrics=metrics,
or_=self._total_offset / self._total_receive * 100 if self._total_receive else 0.0,
dspp=self._total_dig / self._total_pieces if self._total_pieces else 0.0,
ge=2 * ((self._total_attack * self._total_dig) / self._total_pieces**2) if self._total_pieces else 0.0,
)
def get_game_data_from_profile(profile: UserProfile, query_num: int = 50) -> GameData | None:
accumulator = GameAccumulator(query_num)
for row in profile.data:
if accumulator.reached_target:
break
accumulator.add(row)
return accumulator.to_game_data()
def get_game_data_from_profiles(profiles: Iterable[UserProfile], query_num: int = 50) -> GameData | None:
accumulator = GameAccumulator(query_num)
for profile in profiles:
for row in profile.data:
if accumulator.reached_target:
return accumulator.to_game_data()
accumulator.add(row)
return accumulator.to_game_data()
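Because each game contributes `24 * pieces` (or `60 * attack`) to the weighted numerators and its duration in seconds to the shared denominator, the accumulator's averages reduce to totals over total time. A standalone sketch of that arithmetic (the `Game` fields are simplified stand-ins for `UserProfileData`):

```python
from dataclasses import dataclass


@dataclass
class Game:
    # Simplified stand-in for UserProfileData
    pieces: int
    attack: int
    time_ms: int  # game duration in milliseconds


def aggregate_lpm_apm(games: list[Game]) -> tuple[float, float]:
    # Duration-weighted averages: totals divided by total time,
    # not a plain mean of per-game rates.
    total_time = sum(g.time_ms / 1000 for g in games)
    lpm = 24 * sum(g.pieces for g in games) / total_time
    apm = 60 * sum(g.attack for g in games) / total_time
    return lpm, apm


# 100 pieces / 30 attack in 60 s, then 50 pieces / 10 attack in 120 s
lpm, apm = aggregate_lpm_apm([Game(100, 30, 60_000), Game(50, 10, 120_000)])
```

The longer game dominates the average, which is the point of weighting by time.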
async def get_game_data(player: Player, query_num: int = 50) -> GameData | None:
"""获取游戏数据"""
user_profile = await player.get_profile()
if user_profile.data == []:
return None
weighted_total_lpm = weighted_total_apm = weighted_total_adpm = total_time = 0.0
total_attack = total_dig = total_offset = total_pieces = total_receive = num = 0
for i in user_profile.data:
# Exclude single-player games and games with zero duration
# Tea: games without digging are not counted, even for apm and lpm
if i.num_players == 1 or i.time == 0 or i.dig is None:
continue
# Weighted calculation
time = i.time / 1000
lpm = 24 * (i.pieces / time)
apm = (i.attack / time) * 60
adpm = ((i.attack + i.dig) / time) * 60
weighted_total_lpm += lpm * time
weighted_total_apm += apm * time
weighted_total_adpm += adpm * time
total_attack += i.attack
total_dig += i.dig
total_offset += i.offset
total_pieces += i.pieces
total_receive += i.receive
total_time += time
num += 1
if num >= query_num:
break
if num == 0:
return None
# TODO)) If the number of valid games is less than {query count}, no games lacked dig info, and user_profile.data contains {request count} games, keep fetching older data
metrics = get_metrics(
lpm=weighted_total_lpm / total_time, apm=weighted_total_apm / total_time, adpm=weighted_total_adpm / total_time
return get_game_data_from_profile(user_profile, query_num)
async def get_compare_profile(
session: AsyncSession, unique_identifier: str, target_time: datetime
) -> UserProfile | None:
before = await session.scalar(
select(TOSHistoricalData)
.where(
TOSHistoricalData.user_unique_identifier == unique_identifier,
TOSHistoricalData.api_type == 'User Profile',
TOSHistoricalData.update_time <= target_time,
)
.order_by(TOSHistoricalData.update_time.desc())
.limit(1)
)
return GameData(
game_num=num,
metrics=metrics,
or_=total_offset / total_receive * 100,
dspp=total_dig / total_pieces,
ge=2 * ((total_attack * total_dig) / total_pieces**2),
after = await session.scalar(
select(TOSHistoricalData)
.where(
TOSHistoricalData.user_unique_identifier == unique_identifier,
TOSHistoricalData.api_type == 'User Profile',
TOSHistoricalData.update_time >= target_time,
)
.order_by(TOSHistoricalData.update_time.asc())
.limit(1)
)
if before is None:
selected = after
elif after is None:
selected = before
else:
selected = (
before
if abs((target_time - before.update_time).total_seconds())
<= abs((target_time - after.update_time).total_seconds())
else after
)
if selected is None or not isinstance(selected.data, UserProfile):
return None
return selected.data
@time_it
@@ -251,8 +341,41 @@ async def get_historical_data(unique_identifier: str) -> list[HistoryData]:
]
async def make_query_image(user_info: UserInfoSuccess, game_data: GameData, event_user_info: User | None) -> bytes:
class Trends(NamedTuple):
lpm: Trending = Trending.KEEP
apm: Trending = Trending.KEEP
adpm: Trending = Trending.KEEP
async def get_trends(player: Player, compare_delta: timedelta) -> Trends:
game_data = await get_game_data(player)
if game_data is None:
raise FallbackError
async with get_session() as session:
compare_profile = await get_compare_profile(
session,
(await player.user).teaid,
datetime.now(tz=UTC) - compare_delta,
)
if compare_profile is None or (old_game_data := get_game_data_from_profile(compare_profile)) is None:
raise FallbackError
return Trends(
lpm=Trending.compare(old_game_data.metrics.lpm, game_data.metrics.lpm),
apm=Trending.compare(old_game_data.metrics.apm, game_data.metrics.apm),
adpm=Trending.compare(old_game_data.metrics.adpm, game_data.metrics.adpm),
)
async def make_query_image(
player: Player,
compare_delta: timedelta,
event_user_info: User | None,
) -> bytes:
user_info, game_data = await gather(player.get_info(), get_game_data(player))
if game_data is None:
raise FallbackError
metrics = game_data.metrics
trends = await get_trends(player, compare_delta)
sprint_value = (
(
f'{duration:.3f}s'
@@ -264,69 +387,81 @@ async def make_query_image(user_info: UserInfoSuccess, game_data: GameData, even
)
data = handle_history_data(await get_historical_data(user_info.data.teaid))
values = get_value_bounds([i.score for i in data])
async with HostPage(
await render(
'v1/tos/info',
Info(
user=People(
avatar=await get_avatar(event_user_info, 'Data URI', None)
if event_user_info is not None
else get_random_avatar(user_info.data.teaid),
name=user_info.data.name,
),
multiplayer=Multiplayer(
history=History(
data=data,
max_value=values.value_max,
min_value=values.value_min,
split_interval=(split := get_split(value_bound=values, min_value=0)).split_value,
offset=split.offset,
),
rating=round(float(user_info.data.rating_now), 2),
rd=round(float(user_info.data.rd_now), 2),
lpm=metrics.lpm,
pps=metrics.pps,
lpm_trending=Trending.KEEP,
apm=metrics.apm,
apl=metrics.apl,
apm_trending=Trending.KEEP,
adpm=metrics.adpm,
vs=metrics.vs,
adpl=metrics.adpl,
adpm_trending=Trending.KEEP,
app=(app := (metrics.apm / (60 * metrics.pps))),
or_=game_data.or_,
dspp=game_data.dspp,
ci=150 * game_data.dspp - 125 * app + 50 * (metrics.vs / metrics.apm) - 25,
ge=game_data.ge,
),
singleplayer=Singleplayer(
sprint=sprint_value,
challenge=f'{int(user_info.data.pb_challenge):,}' if user_info.data.pb_challenge != '0' else 'N/A',
marathon=f'{int(user_info.data.pb_marathon):,}' if user_info.data.pb_marathon != '0' else 'N/A',
),
lang=get_lang(),
return await render_image(
Info(
user=People(
avatar=await get_avatar(event_user_info, 'Data URI', None)
if event_user_info is not None
else get_random_avatar(user_info.data.teaid),
name=user_info.data.name,
),
)
) as page_hash:
return await screenshot(f'http://{get_self_netloc()}/host/{page_hash}.html')
multiplayer=Multiplayer(
history=History(
data=data,
max_value=values.value_max,
min_value=values.value_min,
split_interval=(split := get_split(value_bound=values, min_value=0)).split_value,
offset=split.offset,
),
rating=round(float(user_info.data.rating_now), 2),
rd=round(float(user_info.data.rd_now), 2),
lpm=metrics.lpm,
pps=metrics.pps,
lpm_trending=trends.lpm,
apm=metrics.apm,
apl=metrics.apl,
apm_trending=trends.apm,
adpm=metrics.adpm,
vs=metrics.vs,
adpl=metrics.adpl,
adpm_trending=trends.adpm,
app=(app := (metrics.apm / (60 * metrics.pps))),
or_=game_data.or_,
dspp=game_data.dspp,
ci=150 * game_data.dspp - 125 * app + 50 * (metrics.vs / metrics.apm) - 25,
ge=game_data.ge,
),
singleplayer=Singleplayer(
sprint=sprint_value,
challenge=f'{int(user_info.data.pb_challenge):,}' if user_info.data.pb_challenge != '0' else 'N/A',
marathon=f'{int(user_info.data.pb_marathon):,}' if user_info.data.pb_marathon != '0' else 'N/A',
),
lang=get_lang(),
),
)
def make_query_text(user_info: UserInfoSuccess, game_data: GameData | None) -> UniMessage:
async def make_query_text(player: Player) -> UniMessage:
user_info, game_data = await gather(player.get_info(), get_game_data(player))
user_data = user_info.data
message = f'用户 {user_data.name} ({user_data.teaid}) '
message = Lang.stats.user_info(name=user_data.name, id=user_data.teaid)
if user_data.ranked_games == '0':
message += '暂无段位统计数据'
message += Lang.stats.no_rank()
else:
message += f', 段位分 {round(float(user_data.rating_now), 2)}±{round(float(user_data.rd_now), 2)} ({round(float(user_data.vol_now), 2)}) '
message += Lang.stats.rank_info(
rating=round(float(user_data.rating_now), 2),
rd=round(float(user_data.rd_now), 2),
vol=round(float(user_data.vol_now), 2),
)
if game_data is None:
message += ', 暂无游戏数据'
message += Lang.stats.no_game()
else:
message += f', 最近 {game_data.game_num} 局数据'
message += f"\nL'PM: {game_data.metrics.lpm} ( {game_data.metrics.pps} pps )"
message += f'\nAPM: {game_data.metrics.apm} ( x{game_data.metrics.apl} )'
message += f'\nADPM: {game_data.metrics.adpm} ( x{game_data.metrics.adpl} ) ( {game_data.metrics.vs}vs )'
message += f'\n40L: {float(user_data.pb_sprint) / 1000:.2f}s' if user_data.pb_sprint != '2147483647' else ''
message += f'\nMarathon: {user_data.pb_marathon}' if user_data.pb_marathon != '0' else ''
message += f'\nChallenge: {user_data.pb_challenge}' if user_data.pb_challenge != '0' else ''
message += Lang.stats.recent_games(count=game_data.game_num)
message += Lang.stats.lpm(lpm=game_data.metrics.lpm, pps=game_data.metrics.pps)
message += Lang.stats.apm(apm=game_data.metrics.apm, apl=game_data.metrics.apl)
message += Lang.stats.adpm(adpm=game_data.metrics.adpm, adpl=game_data.metrics.adpl, vs=game_data.metrics.vs)
if user_data.pb_sprint != '2147483647':
message += Lang.stats.sprint_pb(time=f'{float(user_data.pb_sprint) / 1000:.2f}')
if user_data.pb_marathon != '0':
message += Lang.stats.marathon_pb(score=user_data.pb_marathon)
if user_data.pb_challenge != '0':
message += Lang.stats.challenge_pb(score=user_data.pb_challenge)
return UniMessage(message)
async def make_query_result(player: Player, compare_delta: timedelta, event_user_info: User | None) -> UniMessage:
try:
return UniMessage.image(raw=await make_query_image(player, compare_delta, event_user_info))
except FallbackError:
...
return await make_query_text(player)

View File

@@ -10,18 +10,17 @@ from nonebot_plugin_waiter import suggest # type: ignore[import-untyped]
from ...config.config import global_config
from ...db import query_bind_info, remove_bind, trigger
from ...utils.host import HostPage, get_self_netloc
from ...utils.image import get_avatar
from ...utils.lang import get_lang
from ...utils.render import Bind, render
from ...utils.render import render_image
from ...utils.render.schemas.base import People
from ...utils.screenshot import screenshot
from ...utils.render.schemas.bind import Bind
from . import alc
from .api import Player
from .constant import GAME_TYPE
@alc.assign('TOP.unbind')
@alc.assign('TOS.unbind')
async def _(
nb_user: User,
event_session: Uninfo,
@@ -37,19 +36,17 @@ async def _(
get_session() as session,
):
if (bind := await query_bind_info(session=session, user=nb_user, game_platform=GAME_TYPE)) is None:
await UniMessage('您还未绑定 TOP 账号').finish()
await UniMessage('您还未绑定 TOS 账号').finish()
resp = await suggest('您确定要解绑吗?', ['', ''])
if resp is None or resp.extract_plain_text() == '':
return
player = Player(user_name=bind.game_account, trust=True)
user = await player.user
netloc = get_self_netloc()
async with HostPage(
await render(
'v1/binding',
await UniMessage.image(
raw=await render_image(
Bind(
platform='TOS',
type='unlink',
type='unbind',
user=People(
avatar=await get_avatar(event_session.user, 'Data URI', None),
name=user.name,
@@ -69,6 +66,5 @@ async def _(
lang=get_lang(),
),
)
) as page_hash:
await UniMessage.image(raw=await screenshot(f'http://{netloc}/host/{page_hash}.html')).send()
).send()
await remove_bind(session=session, user=nb_user, game_platform=GAME_TYPE)

View File

@@ -12,6 +12,58 @@
"scope": "error",
"types": [{ "subtype": "MessageFormatError", "types": ["TETR.IO", "TOS", "TOP"] }]
},
{ "scope": "template", "types": ["template_language"] }
{ "scope": "template", "types": ["template_language"] },
{
"scope": "bind",
"types": [
"not_found",
"no_account",
"confirm_unbind",
"config_success",
"verify_already",
"verify_failed",
"only_discord"
]
},
{
"scope": "record",
"types": ["not_found", "blitz", "sprint"]
},
{
"scope": "stats",
"types": [
"user_info",
"no_rank",
"rank_info",
"no_game",
"recent_games",
"daily_stats",
"no_daily",
"history_stats",
"no_history",
"lpm",
"apm",
"adpm",
"sprint_pb",
"marathon_pb",
"challenge_pb"
]
},
{
"scope": "template_ui",
"types": ["invalid_tag", "update_success", "update_failed"]
},
{
"scope": "help",
"types": ["usage"]
},
{
"scope": "prompt",
"types": ["io_check", "io_bind", "top_check", "top_bind", "tos_check", "tos_bind"]
},
{
"scope": "retry",
"types": ["message"]
}
]
}

View File

@@ -1,7 +1,7 @@
# This file is @generated by tarina.lang CLI tool
# It is not intended for manual editing.
# ruff: noqa: E402, F401, PLC0414
# ruff: noqa: E402
from pathlib import Path

View File

@@ -15,5 +15,55 @@
},
"template": {
"template_language": "en-US"
},
"bind": {
"not_found": "No binding information found",
"no_account": "You haven't bound a {game} account yet",
"confirm_unbind": "Are you sure you want to unbind?",
"config_success": "Configuration successful",
"verify_already": "You have already completed verification.",
"verify_failed": "Verification failed. Please confirm the target {game} account is linked to your current Discord account.",
"only_discord": "Currently only Discord account verification is supported"
},
"record": {
"not_found": "No {mode} record found for user {username}",
"blitz": "Blitz",
"sprint": "40L"
},
"stats": {
"user_info": "User {name} ({id})",
"no_rank": "No rank statistics available",
"rank_info": ", Rating {rating}±{rd} ({vol})",
"no_game": ", No game data available",
"recent_games": ", Recent {count} games data",
"daily_stats": "User {name} 24h statistics: ",
"no_daily": "User {name} has no 24h statistics available",
"history_stats": "\nHistorical statistics: ",
"no_history": "\nNo historical statistics available",
"lpm": "\nL'PM: {lpm} ( {pps} pps )",
"apm": "\nAPM: {apm} ( x{apl} )",
"adpm": "\nADPM: {adpm} ( x{adpl} ) ( {vs}vs )",
"sprint_pb": "\n40L: {time}s",
"marathon_pb": "\nMarathon: {score}",
"challenge_pb": "\nChallenge: {score}"
},
"template_ui": {
"invalid_tag": "{revision} is not a valid tag in the template repository",
"update_success": "Template updated successfully",
"update_failed": "Template update failed"
},
"help": {
"usage": "Type \"{command} --help\" for help"
},
"prompt": {
"io_check": "io check me",
"io_bind": "io bind {{gameID}}",
"top_check": "top check me",
"top_bind": "top bind {{gameID}}",
"tos_check": "tos check me",
"tos_bind": "tos bind {{gameID}}"
},
"retry": {
"message": "Retrying: {func} ({i}/{max_attempts})"
}
}

View File

@@ -31,7 +31,71 @@ class Template:
template_language: LangItem = LangItem('template', 'template_language')
class Bind:
not_found: LangItem = LangItem('bind', 'not_found')
no_account: LangItem = LangItem('bind', 'no_account')
confirm_unbind: LangItem = LangItem('bind', 'confirm_unbind')
config_success: LangItem = LangItem('bind', 'config_success')
verify_already: LangItem = LangItem('bind', 'verify_already')
verify_failed: LangItem = LangItem('bind', 'verify_failed')
only_discord: LangItem = LangItem('bind', 'only_discord')
class Record:
not_found: LangItem = LangItem('record', 'not_found')
blitz: LangItem = LangItem('record', 'blitz')
sprint: LangItem = LangItem('record', 'sprint')
class Stats:
user_info: LangItem = LangItem('stats', 'user_info')
no_rank: LangItem = LangItem('stats', 'no_rank')
rank_info: LangItem = LangItem('stats', 'rank_info')
no_game: LangItem = LangItem('stats', 'no_game')
recent_games: LangItem = LangItem('stats', 'recent_games')
daily_stats: LangItem = LangItem('stats', 'daily_stats')
no_daily: LangItem = LangItem('stats', 'no_daily')
history_stats: LangItem = LangItem('stats', 'history_stats')
no_history: LangItem = LangItem('stats', 'no_history')
lpm: LangItem = LangItem('stats', 'lpm')
apm: LangItem = LangItem('stats', 'apm')
adpm: LangItem = LangItem('stats', 'adpm')
sprint_pb: LangItem = LangItem('stats', 'sprint_pb')
marathon_pb: LangItem = LangItem('stats', 'marathon_pb')
challenge_pb: LangItem = LangItem('stats', 'challenge_pb')
class TemplateUi:
invalid_tag: LangItem = LangItem('template_ui', 'invalid_tag')
update_success: LangItem = LangItem('template_ui', 'update_success')
update_failed: LangItem = LangItem('template_ui', 'update_failed')
class Help:
usage: LangItem = LangItem('help', 'usage')
class Prompt:
io_check: LangItem = LangItem('prompt', 'io_check')
io_bind: LangItem = LangItem('prompt', 'io_bind')
top_check: LangItem = LangItem('prompt', 'top_check')
top_bind: LangItem = LangItem('prompt', 'top_bind')
tos_check: LangItem = LangItem('prompt', 'tos_check')
tos_bind: LangItem = LangItem('prompt', 'tos_bind')
class Retry:
message: LangItem = LangItem('retry', 'message')
class Lang(LangModel):
interaction = Interaction
error = Error
template = Template
bind = Bind
record = Record
stats = Stats
template_ui = TemplateUi
help = Help
prompt = Prompt
retry = Retry

View File

@@ -13,5 +13,55 @@
},
"template": {
"template_language": "zh-CN"
},
"bind": {
"not_found": "未查询到绑定信息",
"no_account": "您还未绑定 {game} 账号",
"confirm_unbind": "您确定要解绑吗?",
"config_success": "配置成功",
"verify_already": "您已经完成了验证.",
"verify_failed": "您未通过验证, 请确认目标 {game} 账号绑定了当前 Discord 账号",
"only_discord": "目前仅支持 Discord 账号验证"
},
"record": {
"not_found": "未找到用户 {username} 的 {mode} 记录",
"blitz": "Blitz",
"sprint": "40L"
},
"stats": {
"user_info": "用户 {name} ({id})",
"no_rank": "暂无段位统计数据",
"rank_info": ", 段位分 {rating}±{rd} ({vol})",
"no_game": ", 暂无游戏数据",
"recent_games": ", 最近 {count} 局数据",
"daily_stats": "用户 {name} 24小时内统计数据为: ",
"no_daily": "用户 {name} 暂无24小时内统计数据",
"history_stats": "\n历史统计数据为: ",
"no_history": "\n暂无历史统计数据",
"lpm": "\nL'PM: {lpm} ( {pps} pps )",
"apm": "\nAPM: {apm} ( x{apl} )",
"adpm": "\nADPM: {adpm} ( x{adpl} ) ( {vs}vs )",
"sprint_pb": "\n40L: {time}s",
"marathon_pb": "\nMarathon: {score}",
"challenge_pb": "\nChallenge: {score}"
},
"template_ui": {
"invalid_tag": "{revision} 不是模板仓库中的有效标签",
"update_success": "更新模板成功",
"update_failed": "更新模板失败"
},
"help": {
"usage": "输入\"{command} --help\"查看帮助"
},
"prompt": {
"io_check": "io查我",
"io_bind": "io绑定{{游戏ID}}",
"top_check": "top查我",
"top_bind": "top绑定{{游戏ID}}",
"tos_check": "茶服查我",
"tos_bind": "茶服绑定{{游戏ID}}"
},
"retry": {
"message": "Retrying: {func} ({i}/{max_attempts})"
}
}

View File

@@ -78,7 +78,7 @@ class BrowserManager:
"""启动浏览器实例"""
playwright = await async_playwright().start()
cls._browser = await playwright.firefox.launch(
headless=not config.tetris.development,
headless=not config.tetris.dev.enabled,
firefox_user_prefs={
'network.http.max-persistent-connections-per-server': 64,
},
@@ -95,9 +95,11 @@ class BrowserManager:
cls, context_id: str = 'default', factory: Callable[[], Coroutine[Any, Any, BrowserContext]] | None = None
) -> BrowserContext:
"""获取浏览器上下文"""
return cls._contexts.setdefault(
context_id, await factory() if factory is not None else await (await cls.get_browser()).new_context()
)
context = cls._contexts.get(context_id)
if context is None:
context = await (factory or (await cls.get_browser()).new_context)()
cls._contexts[context_id] = context
return context
@classmethod
async def del_context(cls, context_id: str) -> None:

View File

@@ -0,0 +1,28 @@
from datetime import timedelta
from .exception import MessageFormatError
DEFAULT_COMPARE_DELTA = timedelta(days=7)
_MIN_DURATION_LEN = 2
_DURATION_UNITS = {
'w': 'weeks',
'd': 'days',
'h': 'hours',
'm': 'minutes',
's': 'seconds',
}
def parse_duration(value: str) -> timedelta | MessageFormatError:
raw = value.strip().lower()
if raw.isdigit():
return timedelta(days=int(raw))
if len(raw) < _MIN_DURATION_LEN or not raw[:-1].isdigit():
return MessageFormatError('时间格式不正确')
amount = int(raw[:-1])
if amount <= 0:
return MessageFormatError('时间格式不正确')
unit = _DURATION_UNITS.get(raw[-1])
if unit is None:
return MessageFormatError('时间格式不正确')
return timedelta(**{unit: amount})

View File

@@ -1,12 +1,17 @@
from typing_extensions import override
class TetrisStatsError(Exception):
"""所有 TetrisStats 发生的异常基类"""
def __init__(self, message: str = ''):
self.message = message
@override
def __str__(self) -> str:
return self.message
@override
def __repr__(self) -> str:
return self.message

View File

@@ -20,7 +20,11 @@ from .templates import TEMPLATES_DIR
if TYPE_CHECKING:
from pydantic import IPvAnyAddress
app: FastAPI = get_app()
app = get_app()
if not isinstance(app, FastAPI):
msg = '本插件需要 FastAPI 驱动器才能运行'
raise RuntimeError(msg) # noqa: TRY004
driver = get_driver()
@@ -28,10 +32,6 @@ global_config = driver.config
BASE_URL = URL('https://tetr.io/user-content/')
if not isinstance(app, FastAPI):
msg = '本插件需要 FastAPI 驱动器才能运行'
raise RuntimeError(msg) # noqa: TRY004
NOT_FOUND = HTMLResponse('404 Not Found', status_code=status.HTTP_404_NOT_FOUND)
@@ -45,7 +45,7 @@ class HostPage:
async def __aenter__(self) -> str:
return self.page_hash
if not config.tetris.development:
if not config.tetris.dev.enabled:
async def __aexit__(self, exc_type, exc, tb) -> None: # noqa: ANN001
self.pages.pop(self.page_hash, None)

View File

@@ -1,6 +1,6 @@
from typing import cast
from ..i18n.model import Lang
from ..i18n import Lang
from .typedefs import Lang as LangType

View File

@@ -1,22 +1,10 @@
from typing import Literal, overload
from jinja2 import Environment, FileSystemLoader
from nonebot.compat import PYDANTIC_V2
from ..host import HostPage, get_self_netloc
from ..screenshot import screenshot
from ..templates import TEMPLATES_DIR
from .schemas.base import Base
from .schemas.bind import Bind
from .schemas.v1.tetrio.rank import Data as TETRIORankDataV1
from .schemas.v1.tetrio.user.info import Info as TETRIOUserInfoV1
from .schemas.v1.top.info import Info as TOPInfoV1
from .schemas.v1.tos.info import Info as TOSInfoV1
from .schemas.v2.tetrio.rank import Data as TETRIORankDataV2
from .schemas.v2.tetrio.rank.detail import Data as TETRIORankDetailDataV2
from .schemas.v2.tetrio.record.blitz import Record as TETRIORecordBlitzV2
from .schemas.v2.tetrio.record.sprint import Record as TETRIORecordSprintV2
from .schemas.v2.tetrio.tetra_league import Data as TETRIOTetraLeagueDataV2
from .schemas.v2.tetrio.user.info import Info as TETRIOUserInfoV2
from .schemas.v2.tetrio.user.list import List as TETRIOUserListV2
env = Environment(
loader=FileSystemLoader(TEMPLATES_DIR),
@@ -27,39 +15,19 @@ env = Environment(
)
@overload
async def render(render_type: Literal['v1/binding'], data: Bind) -> str: ...
@overload
async def render(render_type: Literal['v1/tetrio/info'], data: TETRIOUserInfoV1) -> str: ...
@overload
async def render(render_type: Literal['v1/tetrio/rank'], data: TETRIORankDataV1) -> str: ...
@overload
async def render(render_type: Literal['v1/top/info'], data: TOPInfoV1) -> str: ...
@overload
async def render(render_type: Literal['v1/tos/info'], data: TOSInfoV1) -> str: ...
@overload
async def render(render_type: Literal['v2/tetrio/rank'], data: TETRIORankDataV2) -> str: ...
@overload
async def render(render_type: Literal['v2/tetrio/rank/detail'], data: TETRIORankDetailDataV2) -> str: ...
@overload
async def render(render_type: Literal['v2/tetrio/record/blitz'], data: TETRIORecordBlitzV2) -> str: ...
@overload
async def render(render_type: Literal['v2/tetrio/record/sprint'], data: TETRIORecordSprintV2) -> str: ...
@overload
async def render(render_type: Literal['v2/tetrio/tetra-league'], data: TETRIOTetraLeagueDataV2) -> str: ...
@overload
async def render(render_type: Literal['v2/tetrio/user/info'], data: TETRIOUserInfoV2) -> str: ...
@overload
async def render(render_type: Literal['v2/tetrio/user/list'], data: TETRIOUserListV2) -> str: ...
async def render(
render_type: str,
data: Base,
) -> str:
if PYDANTIC_V2:
return await env.get_template('index.html').render_async(
path=render_type, data=data.model_dump_json(by_alias=True)
)
return await env.get_template('index.html').render_async(path=render_type, data=data.json(by_alias=True))
return await env.get_template('index.html').render_async(data=data.model_dump_json(by_alias=True))
return await env.get_template('index.html').render_async(data=data.json(by_alias=True))
async def render_image(
data: Base,
) -> bytes:
async with HostPage(page=await render(data)) as page_hash:
return await screenshot(f'http://{get_self_netloc()}/host/{page_hash}.html#/{data.path}')
__all__ = ['render', 'render_image']

View File

@@ -43,8 +43,8 @@ class Piece(Enum):
)
I5 = (
(True, True, True, True, True), # fmt: skip
)
(True, True, True, True, True),
) # fmt: skip
V = (
(True, False, False),

View File

@@ -1,3 +1,4 @@
from abc import ABC, abstractmethod
from datetime import datetime
from typing import Literal
@@ -7,7 +8,12 @@ from strenum import StrEnum
from ...typedefs import Lang, Number
class Base(BaseModel):
class Base(BaseModel, ABC):
@property
@abstractmethod
def path(self) -> str:
raise NotImplementedError
lang: Lang
@@ -35,3 +41,11 @@ class Trending(StrEnum):
UP = 'up'
KEEP = 'keep'
DOWN = 'down'
@classmethod
def compare(cls, old: float, new: float) -> 'Trending':
if old > new:
return cls.DOWN
if old < new:
return cls.UP
return cls.KEEP

View File

@@ -1,11 +1,18 @@
from typing import Literal
from typing_extensions import override
from .base import Base, People
class Bind(Base):
@property
@override
def path(self) -> str:
return 'v1/binding'
platform: Literal['TETR.IO', 'TOP', 'TOS']
type: Literal['success', 'unknown', 'unlink', 'unverified', 'error']
type: Literal['success', 'unknown', 'unbind', 'unverified', 'error']
user: People
bot: People
prompt: str

View File

@@ -1,9 +1,10 @@
from pydantic import BaseModel
from typing_extensions import override
from .......games.tetrio.api.typedefs import Rank
from ......typedefs import Number
from ....base import Base, People, Trending
from ...base import History
from ......games.tetrio.api.typedefs import Rank
from .....typedefs import Number
from ...base import Base, People, Trending
from ..base import History
class User(People):
@@ -45,6 +46,11 @@ class Singleplayer(BaseModel):
class Info(Base):
@property
@override
def path(self) -> str:
return 'v1/tetrio/info'
user: User
multiplayer: Multiplayer
singleplayer: Singleplayer

View File

@@ -1,6 +1,7 @@
from datetime import datetime
from pydantic import BaseModel
from typing_extensions import override
from ......games.tetrio.api.typedefs import ValidRank
from ...base import Base
@@ -13,5 +14,10 @@ class ItemData(BaseModel):
class Data(Base):
@property
@override
def path(self) -> str:
return 'v1/tetrio/rank'
items: dict[ValidRank, ItemData]
updated_at: datetime

View File

@@ -1,4 +1,5 @@
from pydantic import BaseModel
from typing_extensions import override
from .....typedefs import Number
from ...base import Base, People, Trending
@@ -14,6 +15,11 @@ class Data(BaseModel):
class Info(Base):
@property
@override
def path(self) -> str:
return 'v1/top/info'
user: People
today: Data
historical: Data

View File

@@ -1,4 +1,5 @@
from pydantic import BaseModel, Field
from typing_extensions import override
from .....typedefs import Number
from ...base import Base, People, Trending
@@ -37,6 +38,11 @@ class Singleplayer(BaseModel):
class Info(Base):
@property
@override
def path(self) -> str:
return 'v1/tos/info'
user: People
multiplayer: Multiplayer
singleplayer: Singleplayer

View File

@@ -1,6 +1,7 @@
from datetime import datetime
from pydantic import BaseModel
from typing_extensions import override
from .......games.tetrio.api.typedefs import ValidRank
from ......typedefs import Number
@@ -23,5 +24,10 @@ class ItemData(BaseModel):
class Data(Base):
@property
@override
def path(self) -> str:
return 'v2/tetrio/rank'
items: dict[ValidRank, ItemData]
updated_at: datetime

Some files were not shown because too many files have changed in this diff.