seo-engine
Expert SEO auditing skill for analyzing URLs, HTML source code, robots.txt, and sitemaps. Evaluates on-page technical requirements, content basics, optimization signals, and spam-policy compliance (cloaking, hidden text, keyword stuffing). Trigger phrases: "Run an SEO audit", "Analyze this URL for SEO", "Check my webpage for SEO issues", "Is my robots.txt correct?", "Find cloaking or hidden text on this page".
Copy the command below and paste it into a terminal (Mac/Linux) or PowerShell (Windows). It handles everything automatically: download → extract → install.
mkdir -p ~/.claude/skills && cd ~/.claude/skills && curl -L -o seo-engine.zip https://jpskill.com/download/10390.zip && unzip -o seo-engine.zip && rm seo-engine.zip
$d = "$env:USERPROFILE\.claude\skills"; ni -Force -ItemType Directory $d | Out-Null; iwr https://jpskill.com/download/10390.zip -OutFile "$d\seo-engine.zip"; Expand-Archive "$d\seo-engine.zip" -DestinationPath $d -Force; ri "$d\seo-engine.zip"
When it finishes, restart Claude Code → then just ask normally, e.g. "Run an SEO audit on this URL", and the skill activates automatically.
💾 Manual download (if the command is too difficult)
- 1. Press the blue button below to download seo-engine.zip
- 2. Double-click the ZIP file to extract it → a seo-engine folder is created
- 3. Move that folder to C:\Users\<your name>\.claude\skills\ (Windows) or ~/.claude/skills/ (Mac)
- 4. Restart Claude Code
⚠️ Download and use at your own risk. This site accepts no responsibility for the Skill's content, behavior, or safety.
🎯 What this Skill can do
The description below explains what this Skill will do for you. When you ask Claude for work in this area, it activates automatically.
📦 Installation (3 steps)
- 1. Press the "Download" button above to get the .skill file
- 2. Rename the extension from .skill to .zip and extract it (macOS can extract it automatically)
- 3. Place the extracted folder in .claude/skills/ under your home folder
  - macOS / Linux: ~/.claude/skills/
  - Windows: %USERPROFILE%\.claude\skills\

Restart Claude Code and you're done. You don't need to say "use this Skill"; it is invoked automatically for related requests.
- Last updated: 2026-05-18
- Retrieved: 2026-05-18
- Included files: 1
📖 Original SKILL.md that Claude reads (contents expanded)
The text below is the original (English or Chinese) that the AI (Claude) reads. A Japanese translation is being added over time.
SEO Engine
The SEO Engine provides a standardized methodology for performing deterministic SEO audits using pre-defined rules and specialized diagnostic scripts. It handles everything from input preparation (fetching live data) to heuristic analysis of spam policies.
Use Case
Perform technical SEO audits on websites or local files to ensure compliance with search engine guidelines and identify potential ranking issues.
Triggering Logic
Use this skill when a user:
- Provides a website URL for analysis.
- Uploads/references HTML, robots.txt, or sitemap files.
- Asks for a "technical SEO check" or "audit".
- Specifically mentions SEO-related concerns like "indexability", "crawlability", or "spam signals".
When NOT to use
- For general SEO strategy advice (e.g., "how do I rank for 'best laptops'?").
- For off-page SEO (backlink analysis, domain authority).
- For live performance monitoring (use Lighthouse/CWV tools).
Workflow: SEO Audit Pipeline
Follow these steps sequentially. Do not skip validation gates.
Step 1 — Input Identification & Preparation
- Identify Input: URL or local files (`@filename.html`).
- Validation Gate (Input Type): Ensure the input is a valid URL or a supported file type (`.html`, `.htm`, `robots.txt`, `.xml`).
- Prepare Live Data (if URL):
  - `cd scripts/prepare_input/`
  - Run `python fetch_html.py <URL>`
  - Run `python fetch_robots_txt.py <URL>`
  - Run `python fetch_sitemap.py <URL>`
- Validation Gate (File Presence): Confirm that the fetched files (or provided local files) are readable and non-empty.
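The file-presence gate can be sketched as below. The output filenames (`page.html`, `robots.txt`, `sitemap.xml`) are assumptions for illustration; the actual names written by the `fetch_*.py` scripts are not specified here.

```python
from pathlib import Path

# Hypothetical filenames for the fetched data; the real names produced
# by the fetch_*.py scripts may differ.
EXPECTED_FILES = ["page.html", "robots.txt", "sitemap.xml"]

def validate_inputs(directory="."):
    """Return the names of expected input files that are missing or empty."""
    problems = []
    for name in EXPECTED_FILES:
        path = Path(directory) / name
        if not path.is_file() or path.stat().st_size == 0:
            problems.append(name)
    return problems
```

If `validate_inputs()` returns a non-empty list, stop and re-run the corresponding fetch step rather than auditing partial data.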
Step 2 — Basic Rule Application (Deterministic)
- Scan the `rules/` directory for applicable rules.
- Read the YAML frontmatter of each rule to match `inputFields` with your prepared data.
inputFieldswith your prepared data. - Execute Checks: Compare the content of your files against the "Incorrect/Correct" examples in the rule documentation.
- Apply Logic: Categorize findings into Pass/Fail.
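The frontmatter-matching step might look like the sketch below. The sample rule (`TITLE_TAG_PRESENT`) and the exact key format of `inputFields` are hypothetical illustrations, not the skill's actual rule schema; the parser is deliberately naive and stdlib-only.

```python
# Hypothetical rule file shape: YAML frontmatter listing inputFields,
# followed by Markdown documentation with Incorrect/Correct examples.
SAMPLE_RULE = """\
---
id: TITLE_TAG_PRESENT
inputFields: [html]
severity: High
---
Every page should contain exactly one <title> element...
"""

def parse_frontmatter(text):
    """Naive frontmatter parser: returns a key -> raw-value dict."""
    lines = text.splitlines()
    assert lines[0] == "---", "rule file must start with frontmatter"
    meta = {}
    for line in lines[1:]:
        if line == "---":
            break
        key, _, value = line.partition(":")
        meta[key.strip()] = value.strip()
    return meta

def rule_applies(rule_text, prepared):
    """A rule applies when every declared inputField has prepared data."""
    fields = parse_frontmatter(rule_text)["inputFields"].strip("[]").split(",")
    return all(f.strip() in prepared for f in fields)
```

A rule whose `inputFields` are not all present in the prepared data is skipped rather than marked Fail.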
Step 3 — Specialized Script Execution (Heuristic & Complex)
For complex checks, execute the dedicated Python scripts in the scripts/ directory:
| Task | Script Path |
|---|---|
| Cloaking | scripts/cloaking_detection/cloaking_detection.py |
| Hidden Text | scripts/hidden_text_detection/hidden_text_detection.py |
| Keyword Stuffing | scripts/keyword_stuffing_detection/keyword_stuffing_detection.py |
| Sneaky Redirects | scripts/sneaky_redirect_detection/sneaky_redirect_detection.py |
| Favicon Audit | scripts/favicon_dimensions/favicon_dimensions.py |
| Experience Diversity | scripts/page_experience_diversity/page_experience_diversity.py |
Instructions for running each script are found in their respective README.md files.
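A sketch of driving one of these scripts programmatically is below. It assumes, hypothetically, that each script prints a single JSON object to stdout; check each script's README.md for the real interface before relying on this.

```python
import json
import subprocess
import sys

def run_detector(script_path, *args):
    """Run one diagnostic script and parse its stdout as JSON.

    Assumption: the script emits exactly one JSON object on stdout.
    check=True raises CalledProcessError if the script exits non-zero.
    """
    result = subprocess.run(
        [sys.executable, script_path, *args],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)
```

Using `sys.executable` keeps the child process on the same Python interpreter as the caller, which avoids `python` vs `python3` PATH surprises.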
Step 4 — Result Synthesis & Reporting
- Normalize Results: Translate pass/fail outputs from rules and JSON results from scripts into a unified report.
- Prioritization: Sort results by "Critical", "High", "Medium", and "Low" priorities.
- Remediation: For every failure, provide the specific "Actionable Fix" found in the rule documentation or script output.
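The synthesis and prioritization steps can be sketched as follows; the severity names come from the scale above, but the finding-dict fields (`status`, `severity`) are illustrative assumptions.

```python
# Severity order follows the Critical/High/Medium/Low scale above.
PRIORITY = {"Critical": 0, "High": 1, "Medium": 2, "Low": 3}

def synthesize(findings):
    """Keep only failures, sorted most severe first, for the report."""
    failures = [f for f in findings if f.get("status") == "Fail"]
    return sorted(failures, key=lambda f: PRIORITY[f["severity"]])
```

Passing checks are dropped from the prioritized list but can still be summarized separately as a compliance count.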
Example Invocations
Example 1 — Website URL Audit
User: "Analyze https://example.com for SEO issues"
Action: Prepare inputs using fetch_*.py -> Apply Rules -> Run specialized scripts -> Generate prioritized report.
Example 2 — Local File Audit
User: "Check @index.html and @robots.txt for SEO compliance"
Action: Validate files -> Apply matching rules from rules/ -> Report findings.
Example 3 — Spam Detection
User: "Check if this page is using hidden text or cloaking: @page.html"
Action: Run cloaking_detection.py and hidden_text_detection.py -> Report similarity scores and hidden elements.
Troubleshooting
Skill Undertriggering
- Cause: Vague request like "Check my site".
- Fix: Ask the user to provide a specific URL or upload files. Re-trigger with "I will now run an SEO audit on [URL]".
Preparation Failures
- If `fetch_html.py` fails due to bot detection, advise the user that the site might be blocking headless browsers.
- If `fetch_robots_txt.py` returns 503, immediately flag the `ROBOTS_TXT_NOT_503` rule as a Critical Fail.
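A minimal sketch of that 503 check, independent of the bundled scripts; the User-Agent string and the Pass/Needs Review buckets are assumptions added for illustration.

```python
from urllib.error import HTTPError
from urllib.parse import urljoin
from urllib.request import Request, urlopen

def robots_txt_status(base_url, timeout=10):
    """Fetch /robots.txt for a site and return its HTTP status code."""
    req = Request(urljoin(base_url, "/robots.txt"),
                  headers={"User-Agent": "seo-engine-audit"})  # hypothetical UA
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code

def classify_robots_status(status):
    """Map the status to an audit verdict; any 5xx is treated as the
    Critical Fail described above (ROBOTS_TXT_NOT_503)."""
    if 500 <= status < 600:
        return "Critical Fail"
    if status in (200, 404):
        return "Pass"  # a 404 simply means no crawl restrictions
    return "Needs Review"
```

Treating all 5xx responses as Critical matches the spirit of the rule: search engine crawlers may throttle or skip crawling when robots.txt is unreachable with a server error.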
Trigger Testing Plan
Should Trigger
- "Run an SEO audit on example.com"
- "Audit this HTML file for technical SEO"
- "Are there any spam policy violations on this page?"
- "Check my sitemap and robots.txt for errors"
Should NOT Trigger
- "Write a blog post about SEO"
- "What is the keyword volume for 'cat toys'?"
- "How do I get more backlinks?"