SEO

SEO automation and integration for optimizing search engine visibility and rankings

SEO is a community skill for search engine optimization, covering keyword research, on-page optimization, technical SEO auditing, content strategy, and performance monitoring for improving organic search visibility.

What Is This?

Overview

SEO provides tools and practices for improving website visibility in search engine results pages. It covers keyword research to identify terms with traffic potential and ranking feasibility; on-page optimization to structure content with proper headings and meta tags; technical SEO auditing to check crawlability, indexing, and page speed; content strategy to plan topic clusters targeting search demand; and performance monitoring to track rankings and traffic metrics. The skill helps websites attract organic search traffic.

Who Should Use This

This skill serves content marketers optimizing web pages for search, developers implementing technical SEO requirements, and business owners seeking to increase organic website traffic.

Why Use It?

Problems It Solves

Websites without SEO optimization remain invisible in search results despite having valuable content. Pages missing proper meta tags lose click-through opportunities from search listings. Technical issues like slow page speed prevent search engines from properly indexing content. Content created without keyword research fails to match user search queries.

Core Highlights

Keyword researcher identifies terms with traffic potential and feasibility. On-page optimizer structures content with proper tags and linking. Technical auditor checks crawlability, speed, and indexing. Performance tracker monitors rankings and traffic trends.
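The keyword-research highlight can be sketched as a simple prioritization pass. The scoring below is an illustrative heuristic, not a standard formula: it weights an estimated monthly search volume against an assumed 0.0-1.0 difficulty score, and all sample numbers are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Keyword:
    term: str
    monthly_volume: int   # estimated searches per month (illustrative)
    difficulty: float     # 0.0 (easy to rank) to 1.0 (very competitive)

def opportunity(kw: Keyword) -> float:
    # Toy heuristic: favor volume, penalize difficulty.
    return kw.monthly_volume * (1.0 - kw.difficulty)

candidates = [
    Keyword('seo audit checklist', 2400, 0.45),
    Keyword('what is seo', 60000, 0.95),
    Keyword('sitemap coverage report', 320, 0.20),
]

# Rank candidates by opportunity, highest first.
for kw in sorted(candidates, key=opportunity, reverse=True):
    print(f'{kw.term}: {opportunity(kw):.0f}')
```

In practice the volume and difficulty inputs would come from a keyword research tool; the point is only that a transparent score makes trade-offs between head terms and long-tail terms explicit.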

How to Use It?

Basic Usage

from dataclasses import dataclass
import re

@dataclass
class SEOCheck:
    name: str
    passed: bool
    detail: str

class OnPageAuditor:
    def __init__(self, html: str, url: str):
        self.html = html
        self.url = url
        self.checks = []

    def check_title(self):
        # Title tags should be roughly 30-60 characters.
        match = re.search(r'<title>(.*?)</title>', self.html,
                          re.IGNORECASE | re.DOTALL)
        title = match.group(1) if match else ''
        self.checks.append(
            SEOCheck('Title tag', 30 <= len(title) <= 60,
                     f'{len(title)} chars'))

    def check_meta_desc(self):
        # Meta descriptions should be roughly 70-160 characters.
        match = re.search(
            r'<meta[^>]*name="description"[^>]*content="([^"]*)"',
            self.html, re.IGNORECASE)
        desc = match.group(1) if match else ''
        self.checks.append(
            SEOCheck('Meta description', 70 <= len(desc) <= 160,
                     f'{len(desc)} chars'))

    def check_headings(self):
        # A page should have exactly one H1.
        h1s = re.findall(r'<h1[^>]*>(.*?)</h1>', self.html,
                         re.IGNORECASE | re.DOTALL)
        self.checks.append(
            SEOCheck('H1 tag', len(h1s) == 1, f'{len(h1s)} found'))

    def report(self) -> list:
        return [{'name': c.name,
                 'status': 'pass' if c.passed else 'fail',
                 'detail': c.detail}
                for c in self.checks]

# html_content and url are the page being audited, supplied by the caller.
auditor = OnPageAuditor(html_content, url)
auditor.check_title()
auditor.check_meta_desc()
auditor.check_headings()
for item in auditor.report():
    print(f'{item["status"]}: {item["name"]} - {item["detail"]}')

Real-World Examples

import xml.etree.ElementTree as ET
from urllib.parse import urlparse

class SitemapAnalyzer:
    # Sitemap files use the sitemaps.org 0.9 XML namespace.
    NS = {'s': 'http://www.sitemaps.org/schemas/sitemap/0.9'}

    def __init__(self, xml_content: str):
        self.root = ET.fromstring(xml_content)

    def get_urls(self) -> list:
        urls = []
        for url in self.root.findall('.//s:url', self.NS):
            loc = url.find('s:loc', self.NS)
            if loc is not None and loc.text:
                urls.append(loc.text)
        return urls

    def coverage(self) -> dict:
        # Group URLs by their first path segment to see which site
        # sections the sitemap covers, and how heavily.
        urls = self.get_urls()
        sections = {}
        for u in urls:
            parts = urlparse(u).path.strip('/').split('/')
            sec = parts[0] if parts[0] else 'root'
            sections[sec] = sections.get(sec, 0) + 1
        return {'total': len(urls), 'sections': sections}

# sitemap_xml is the raw sitemap document, supplied by the caller.
analyzer = SitemapAnalyzer(sitemap_xml)
report = analyzer.coverage()
print(f'Total URLs: {report["total"]}')
for sec, count in report['sections'].items():
    print(f'  {sec}: {count}')

Advanced Tips

Build topic clusters with a pillar page and supporting content that interlinks for topical authority. Use structured data markup to enable rich snippets in search results. Monitor Core Web Vitals as page experience signals affecting rankings.
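The structured data tip can be illustrated with a minimal JSON-LD generator. This is a sketch using the schema.org Product and Offer vocabulary; the field values are placeholders, and real markup should be validated with a rich-results testing tool before deployment.

```python
import json

def product_jsonld(name: str, price: str, currency: str,
                   availability: str = 'https://schema.org/InStock') -> str:
    """Build a schema.org Product snippet as a JSON-LD script tag."""
    data = {
        '@context': 'https://schema.org',
        '@type': 'Product',
        'name': name,
        'offers': {
            '@type': 'Offer',
            'price': price,
            'priceCurrency': currency,
            'availability': availability,
        },
    }
    return ('<script type="application/ld+json">'
            + json.dumps(data) + '</script>')

print(product_jsonld('Example Widget', '19.99', 'USD'))
```

The snippet is emitted as a complete script tag so it can be dropped into a page template; a richer implementation would add fields like image, description, and aggregateRating where the data exists.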

When to Use It?

Use Cases

Audit a website for technical SEO issues including broken links, missing meta tags, and slow page speed. Research keywords for a content strategy targeting topics with search demand and manageable competition. Implement structured data markup for product pages to enable rich search result displays.
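The link-audit use case can be sketched as below. This is a minimal illustration, assuming regex-based parsing is acceptable; a production audit would use a real HTML parser and fetch each URL to check HTTP status codes for actual broken links. The function and parameter names are hypothetical.

```python
import re
from urllib.parse import urlparse

def audit_links(html: str, site_host: str) -> dict:
    """Classify links as internal/external and flag empty anchor text."""
    report = {'internal': [], 'external': [], 'empty_anchor': []}
    for href, text in re.findall(
            r'<a[^>]*href="([^"]*)"[^>]*>(.*?)</a>', html, re.S):
        host = urlparse(href).netloc
        # Relative links and links to the site's own host are internal.
        bucket = 'internal' if (not host or host == site_host) else 'external'
        report[bucket].append(href)
        # Strip tags and whitespace; what remains is the visible anchor text.
        if not re.sub(r'<[^>]+>|\s', '', text):
            report['empty_anchor'].append(href)
    return report

html = ('<a href="/pricing">Pricing</a>'
        '<a href="https://example.org/ref"></a>')
print(audit_links(html, 'example.com'))
```

Empty anchor text is flagged because links without descriptive text give search engines no context about the target page.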

Related Topics

Search engine optimization, keyword research, content marketing, technical SEO, web analytics, structured data, and page speed.

Important Notes

Requirements

Access to website HTML and server configuration for technical audits. Analytics platform such as Google Search Console for performance data. Understanding of search engine ranking factors and best practices.

Usage Recommendations

Do: write content for users first and optimize for search engines second. Use descriptive, keyword-informed title tags and meta descriptions for every page. Maintain a clean URL structure with descriptive paths and proper canonical tags.

Don't: stuff keywords unnaturally into content as search engines penalize this practice. Ignore mobile optimization since most search traffic comes from mobile devices. Create thin content pages solely to target keywords without providing real value.

Limitations

Search engine algorithms change frequently and ranking strategies require ongoing adaptation. SEO results take months to materialize and cannot guarantee specific rankings. Competitive keywords require sustained investment in content and authority.