postgresql
Assists with designing schemas, writing performant queries, managing indexes, and operating PostgreSQL databases. Use when working with JSONB, full-text search, window functions, CTEs, row-level security, replication, or performance tuning. Trigger words: postgresql, postgres, sql, database, jsonb, rls, window functions, cte.
Usage
Getting Started
- Install the skill using the command above
- Open your AI coding agent (Claude Code, Codex, Gemini CLI, or Cursor)
- Reference the skill in your prompt
- The AI will use the skill's capabilities automatically
Example Prompts
- "Analyze the sales data in revenue.csv and identify trends"
- "Create a visualization comparing Q1 vs Q2 performance metrics"
Documentation
Overview
PostgreSQL is an advanced relational database with features that often eliminate the need for separate tools: JSONB for semi-structured data, built-in full-text search, window functions for analytics, recursive CTEs for hierarchical queries, row-level security for multi-tenant isolation, and streaming replication for high availability. It supports partitioning, multiple index types (B-tree, GIN, GiST, BRIN), and connection pooling via PgBouncer.
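Two of the features listed above, recursive CTEs and window functions, can be combined in a single query. This is an illustrative sketch; the `employees` table and its columns are hypothetical:

```sql
-- Recursive CTE: walk an org chart from the root manager down,
-- tracking each employee's depth in the hierarchy.
WITH RECURSIVE org AS (
    SELECT id, name, manager_id, salary, 1 AS depth
    FROM   employees
    WHERE  manager_id IS NULL
    UNION ALL
    SELECT e.id, e.name, e.manager_id, e.salary, org.depth + 1
    FROM   employees e
    JOIN   org ON e.manager_id = org.id
)
-- Window function: rank employees by salary within each depth level.
SELECT name, depth,
       rank() OVER (PARTITION BY depth ORDER BY salary DESC) AS salary_rank
FROM   org;
```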
Instructions
- When designing schemas, use `UUID` primary keys with `gen_random_uuid()`, `TIMESTAMP WITH TIME ZONE` for all timestamps, appropriate constraints (CHECK, UNIQUE, foreign keys with ON DELETE), and partitioning for time-series data.
- When working with JSON, use `JSONB` for truly dynamic data with GIN indexes for containment queries, but prefer proper columns for known fields since they provide better validation and performance.
- When optimizing queries, add indexes based on `EXPLAIN ANALYZE` output rather than guesswork, use partial indexes for filtered queries, expression indexes for computed values, and covering indexes with `INCLUDE` for index-only scans.
- When building full-text search, create `tsvector` generated columns with GIN indexes, use `ts_rank()` for relevance scoring, and choose the appropriate language configuration for stemming.
- When implementing multi-tenancy, use row-level security (RLS) policies for database-level isolation rather than application-level checks, setting the user context via `current_setting()`.
- When managing production databases, use PgBouncer for connection pooling, monitor with `pg_stat_statements`, run `VACUUM ANALYZE` after bulk operations, and set up streaming replication with Patroni for high availability.
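The schema-design points above can be sketched as DDL. The table and column names (`events`, `payload`) are illustrative, not prescribed by the skill:

```sql
-- Hypothetical time-series table following the guidance above.
-- gen_random_uuid() is built in since PostgreSQL 13 (pgcrypto before that).
CREATE TABLE events (
    id          UUID NOT NULL DEFAULT gen_random_uuid(),
    tenant_id   UUID NOT NULL,
    kind        TEXT NOT NULL CHECK (kind IN ('click', 'view', 'purchase')),
    payload     JSONB NOT NULL DEFAULT '{}',
    created_at  TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now(),
    PRIMARY KEY (id, created_at)          -- the partition key must be part of the PK
) PARTITION BY RANGE (created_at);

CREATE TABLE events_2025_01 PARTITION OF events
    FOR VALUES FROM ('2025-01-01') TO ('2025-02-01');

-- GIN index for JSONB containment queries, e.g. payload @> '{"kind": "click"}'
CREATE INDEX events_payload_gin ON events USING GIN (payload);
```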
Examples
Example 1: Design a multi-tenant SaaS database with RLS
User request: "Set up a PostgreSQL database with row-level security for multi-tenant isolation"
Actions:
- Create tables with a `tenant_id` column and `UUID` primary keys
- Enable RLS with `ALTER TABLE ... ENABLE ROW LEVEL SECURITY`
- Create policies using `current_setting('app.tenant_id')` for per-request isolation
- Set up connection pooling with PgBouncer and configure `app.tenant_id` per connection
Output: A multi-tenant database where tenant data is isolated at the database level, preventing cross-tenant data leaks.
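A minimal sketch of the RLS steps in this example; the table name `orders` is hypothetical, and `app.tenant_id` follows the setting name used above:

```sql
ALTER TABLE orders ENABLE ROW LEVEL SECURITY;
ALTER TABLE orders FORCE ROW LEVEL SECURITY;  -- apply the policy to the table owner too

-- Rows are visible only when their tenant_id matches the current session setting.
CREATE POLICY tenant_isolation ON orders
    USING (tenant_id = current_setting('app.tenant_id')::uuid);

-- Each request (or pooled connection) sets its tenant before querying:
-- SET app.tenant_id = '5f2b...';
-- SELECT * FROM orders;  -- returns only this tenant's rows
```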
Example 2: Add full-text search to a content platform
User request: "Implement search across articles with relevance ranking and highlighting"
Actions:
- Add a `search_vector` generated column using `to_tsvector('english', title || ' ' || body)`
- Create a GIN index on the search vector column
- Build a search query using `@@` with `plainto_tsquery()` and rank results with `ts_rank()`
- Add `ts_headline()` for highlighting matched terms in results
Output: A fast full-text search with relevance ranking, highlighting, and GIN index-backed performance.
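The search setup above, sketched against a hypothetical `articles` table with `title` and `body` columns:

```sql
-- Generated tsvector column; coalesce() guards against NULLs, and the explicit
-- 'english' configuration keeps to_tsvector() immutable as generated columns require.
ALTER TABLE articles
    ADD COLUMN search_vector tsvector
    GENERATED ALWAYS AS (
        to_tsvector('english', coalesce(title, '') || ' ' || coalesce(body, ''))
    ) STORED;

CREATE INDEX articles_search_gin ON articles USING GIN (search_vector);

-- Match with @@, rank with ts_rank(), highlight with ts_headline().
SELECT id,
       ts_rank(search_vector, q)       AS rank,
       ts_headline('english', body, q) AS snippet
FROM   articles, plainto_tsquery('english', 'postgres full text') AS q
WHERE  search_vector @@ q
ORDER  BY rank DESC
LIMIT  10;
```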
Guidelines
- Use `UUID` primary keys to avoid sequential ID enumeration and merge conflicts.
- Use `TIMESTAMP WITH TIME ZONE` for all timestamps; never use `TIMESTAMP`, which loses timezone context.
- Add indexes based on `EXPLAIN ANALYZE` output, not guesswork; measure before optimizing.
- Use connection pooling (PgBouncer) for applications with more than 20 connections, since PostgreSQL forks a process per connection.
- Use RLS for multi-tenant applications since database-level isolation is more reliable than application-level checks.
- Use `JSONB` for truly dynamic data, not as a replacement for proper columns.
- Run `VACUUM ANALYZE` after bulk operations since stale statistics lead to bad query plans.
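As an illustration of the "measure before optimizing" guideline, a hypothetical slow query and the partial index a plan might suggest (`orders` and its columns are assumptions):

```sql
-- Inspect the actual plan and buffer usage before touching indexes.
EXPLAIN (ANALYZE, BUFFERS)
SELECT * FROM orders
WHERE  status = 'pending'
AND    created_at > now() - interval '7 days';

-- If the plan shows a sequential scan, a partial index covers just the hot rows
-- and stays far smaller than a full index on (status, created_at).
CREATE INDEX orders_pending_recent ON orders (created_at)
    WHERE status = 'pending';

-- After bulk loads, refresh statistics so the planner makes good choices.
VACUUM ANALYZE orders;
```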
Information
- Version
- 1.0.0
- Author
- terminal-skills
- Category
- Data & AI
- License
- Apache-2.0