TacoSkill LAB

The full-lifecycle AI skills platform.


rule-validation

4.9

by majiayu000

185 Favorites
78 Upvotes
0 Downvotes

"validate_skills.pyを実行してSkills/Agents/Commandsの構文を検証し、エラーを修正してログを記録する。「バリデーション」「構文検証」「Skillsチェック」を依頼されたときに使用する。"

validation

Rating: 4.9
Installs: 0
Category: Testing & Quality

Quick Review

The skill provides a clear validation workflow with concrete steps for executing validate_skills.py, logging results, and iterating until all checks pass. The description adequately explains when to invoke the skill (validation, syntax checking, and Skills-check requests). The structure is logical, with preflight, execution, QC, and backlog phases, and task knowledge is strong, with specific commands, output requirements, and QC delegation patterns. However, novelty is limited: the skill primarily wraps the execution of a Python validation script, a task a CLI agent could perform with moderate prompting. The workflow adds value through structured error handling, QC loops, and logging standards, but the core task complexity is moderate.
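
The validate-fix-log loop the review describes can be pictured as a small driver script. The sketch below is illustrative only and is not taken from the skill itself: it assumes validate_skills.py exits non-zero on failure and prints its findings to stdout, and the log path and retry limit are made up for the example.

    # Hypothetical driver for the validate-fix-log loop described above.
    # Assumption: validate_skills.py exits non-zero on failure and reports to stdout.
    import subprocess
    from datetime import datetime
    from pathlib import Path

    LOG = Path("validation.log")      # illustrative log location
    MAX_ATTEMPTS = 5                  # illustrative retry limit

    for attempt in range(1, MAX_ATTEMPTS + 1):
        result = subprocess.run(
            ["python", "validate_skills.py"],
            capture_output=True, text=True,
        )
        with LOG.open("a") as log:
            log.write(f"[{datetime.now().isoformat()}] attempt {attempt}, "
                      f"exit {result.returncode}\n{result.stdout}\n")
        if result.returncode == 0:
            print("All checks passed.")
            break
        # Here the skill's agent would fix the reported errors before re-running;
        # this sketch only records the failures.
        print(f"Attempt {attempt} failed; details appended to {LOG}")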

LLM Signals

Description coverage: 7
Task knowledge: 8
Structure: 7
Novelty: 4

GitHub Signals

49 · 7 · 1 · 1 · Last commit 0 days ago

Publisher

majiayu000

Skill Author

Related Skills

  • code-reviewer by Jeffallan (6.4)
  • debugging-wizard by Jeffallan (6.4)
  • test-master by Jeffallan (6.4)
  • playwright-expert by Jeffallan (6.4)