db

Installs: 50
Rank: #14985

Install

npx skills add https://github.com/johnlindquist/claude --skill db

Database Manager

Query and manage databases across SQLite, PostgreSQL, and MySQL.

Prerequisites

Install database CLIs as needed:

SQLite (usually pre-installed on macOS/Linux)

sqlite3 --version

PostgreSQL

brew install postgresql

or

apt install postgresql-client

MySQL

brew install mysql-client

or

apt install mysql-client

CLI Reference

SQLite

Connect to database

sqlite3 database.db

Execute query

sqlite3 database.db "SELECT * FROM users LIMIT 10"

Output as CSV

sqlite3 -csv database.db "SELECT * FROM users"

Output as JSON (requires SQLite 3.33+)

sqlite3 -json database.db "SELECT * FROM users"

Column headers

sqlite3 -header database.db "SELECT * FROM users"

Execute SQL file

sqlite3 database.db < queries.sql

Schema commands

sqlite3 database.db ".schema"
sqlite3 database.db ".tables"
sqlite3 database.db ".schema users"
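Assuming sqlite3 is on PATH, the commands above chain into a quick smoke test against a scratch database (the -json flag additionally needs SQLite 3.33+):

```shell
# Scratch database: create a table, insert a row, read it back three ways.
db=$(mktemp /tmp/demo.XXXXXX.db)

sqlite3 "$db" "CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)"
sqlite3 "$db" "INSERT INTO users (email) VALUES ('a@example.com')"

sqlite3 -header "$db" "SELECT * FROM users"   # id|email header, then 1|a@example.com
sqlite3 -csv "$db" "SELECT * FROM users"      # 1,a@example.com
sqlite3 "$db" ".schema users"                 # echoes the CREATE TABLE statement

rm "$db"
```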

PostgreSQL

Connect

psql postgresql://user:pass@host:5432/dbname

Execute query

psql -c "SELECT * FROM users LIMIT 10" postgresql://...

Tuples only (no headers)

psql -t -c "SELECT count(*) FROM users" postgresql://...

No alignment (machine-readable)

psql -t -A -c "SELECT id,name FROM users" postgresql://...

Execute SQL file

psql -f queries.sql postgresql://...

List tables

psql -c "\dt" postgresql://...

Describe table

psql -c "\d users" postgresql://...

Output format

psql -c "SELECT * FROM users" --csv postgresql://...
psql -c "SELECT * FROM users" --html postgresql://...

MySQL

Connect

mysql -h host -u user -p dbname

Execute query

mysql -h host -u user -p -e "SELECT * FROM users LIMIT 10" dbname

Batch mode (no headers)

mysql -h host -u user -p -B -e "SELECT * FROM users" dbname

Execute SQL file

mysql -h host -u user -p dbname < queries.sql

Show tables

mysql -h host -u user -p -e "SHOW TABLES" dbname

Describe table

mysql -h host -u user -p -e "DESCRIBE users" dbname

Common Operations

Schema Inspection

SQLite

All tables

sqlite3 db.sqlite ".tables"

Table schema

sqlite3 db.sqlite ".schema tablename"

All schemas

sqlite3 db.sqlite ".schema"

PostgreSQL

All tables

psql -c "\dt" $DATABASE_URL

Table schema

psql -c "\d tablename" $DATABASE_URL

Table with indexes

psql -c "\d+ tablename" $DATABASE_URL

MySQL

All tables

mysql -e "SHOW TABLES" -h host -u user -p dbname

Table schema

mysql -e "DESCRIBE tablename" -h host -u user -p dbname

Create statement

mysql -e "SHOW CREATE TABLE tablename" -h host -u user -p dbname

Query Explanation

SQLite

sqlite3 db.sqlite "EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = 'x'"

PostgreSQL

psql -c "EXPLAIN ANALYZE SELECT * FROM users WHERE email = 'x'" $DATABASE_URL

MySQL

mysql -h host -u user -p -e "EXPLAIN SELECT * FROM users WHERE email = 'x'" dbname

Data Export

SQLite to CSV

sqlite3 -csv -header db.sqlite "SELECT * FROM users" > users.csv

PostgreSQL to CSV

psql -c "\COPY users TO 'users.csv' CSV HEADER" $DATABASE_URL

MySQL to CSV

mysql -e "SELECT * FROM users" -B dbname | tr '\t' ',' > users.csv
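The tr pipeline above breaks as soon as a value contains a comma, quote, or tab. A sketch of a safer conversion (tsv2csv is a name made up here) that quotes fields RFC 4180 style with awk, assuming tab-separated input with no embedded tabs or newlines inside fields, which matches mysql --batch output for simple data:

```shell
# tsv2csv: convert tab-separated rows to CSV with proper field quoting.
tsv2csv() {
  awk 'BEGIN { FS = "\t"; OFS = "," }
  {
    for (i = 1; i <= NF; i++) {
      gsub(/"/, "\"\"", $i)                 # double any embedded quotes
      if ($i ~ /[",]/) $i = "\"" $i "\""    # wrap fields that need quoting
    }
    $1 = $1                                 # force re-join with OFS
    print
  }'
}

# mysql -B -e "SELECT * FROM users" dbname | tsv2csv > users.csv
```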

AI-Assisted Query Generation

Use Gemini to help write queries:

Describe what you want

gemini -m pro -o text -e "" "Write a SQL query to:
- Find all users who signed up in the last 30 days
- Who have made at least one purchase
- Order by purchase count descending

Table schemas:
- users (id, email, created_at)
- purchases (id, user_id, amount, created_at)

Output PostgreSQL-compatible SQL."

Safe Query Review

Generate query

QUERY=$(gemini -m pro -o text -e "" "Write SQL for: [your request]")

Review before executing

echo "Generated query:"
echo "$QUERY"

Then execute if safe

psql -c "$QUERY" $DATABASE_URL
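The review step can be made harder to skip. A sketch of a guard (looks_destructive and run_reviewed are hypothetical names; the keyword list is illustrative, not a safety guarantee):

```shell
# Refuse to auto-run statements that match obviously destructive keywords;
# anything flagged must be executed by hand after a human reads it.
looks_destructive() {
  printf '%s' "$1" | grep -qiwE 'drop|truncate|delete|alter|update'
}

run_reviewed() {
  query="$1"
  echo "Generated query:"
  echo "$query"
  if looks_destructive "$query"; then
    echo "Refusing to auto-run a destructive statement; review it manually." >&2
    return 1
  fi
  psql -c "$query" "$DATABASE_URL"
}
```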

Migration Patterns

Schema Changes

Create migration file

cat > migrations/001_add_column.sql << 'EOF'
ALTER TABLE users ADD COLUMN status VARCHAR(50) DEFAULT 'active';
EOF

Apply migration

psql -f migrations/001_add_column.sql $DATABASE_URL

Safe Migration Workflow

1. Test on copy first

createdb test_migration
pg_dump $DATABASE_URL | psql test_migration

2. Run migration on test

psql -f migration.sql test_migration

3. Verify

psql -c "\d tablename" test_migration

4. Apply to production

psql -f migration.sql $DATABASE_URL

5. Cleanup

dropdb test_migration
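The five steps can be collected into one script. A sketch (migrate_safely is a name invented here; assumes createdb, dropdb, pg_dump, and psql on PATH and DATABASE_URL set):

```shell
# Rehearse a migration against a throwaway copy before touching production.
# Stops at the first failing step; the scratch database is dropped either way.
migrate_safely() {
  migration="$1"
  test_db="test_migration"

  createdb "$test_db" &&
  pg_dump "$DATABASE_URL" | psql "$test_db" &&   # 1. load a copy of production
  psql -f "$migration" "$test_db" &&             # 2. run migration on the copy
  psql -c '\d' "$test_db" &&                     # 3. inspect the result
  psql -f "$migration" "$DATABASE_URL"           # 4. apply for real
  status=$?
  dropdb "$test_db"                              # 5. clean up
  return $status
}
```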

Environment Variables

Store connection strings securely:

.env file (don't commit!)

DATABASE_URL=postgresql://user:pass@host:5432/dbname
SQLITE_DB=./data/app.db

Usage

psql $DATABASE_URL
sqlite3 $SQLITE_DB
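A small helper (load_env is a name made up here) exports everything the file defines into the current shell; it assumes plain KEY=value lines with no spaces or shell-hostile quoting:

```shell
# Export every assignment in an env file into the current shell.
load_env() {
  set -a          # auto-export variables assigned while this is on
  . "${1:-.env}"  # defaults to ./.env; fails loudly if missing
  set +a
}

# Usage:
#   load_env .env
#   psql "$DATABASE_URL"
```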

Best Practices

Never hardcode credentials - Use environment variables
Review AI-generated queries - Before executing
Use EXPLAIN - Check query performance
Test migrations - On a copy before production
Backup before changes - Especially destructive ones
Use transactions - For multi-statement changes
Limit results - Always use LIMIT during exploration
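The transactions point, sketched with SQLite since it needs no server; the same BEGIN/COMMIT SQL works through psql and mysql. Both updates succeed or fail together:

```shell
# Two statements that must commit atomically, wrapped in an explicit transaction.
db=$(mktemp /tmp/tx.XXXXXX.db)
sqlite3 "$db" "CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)"
sqlite3 "$db" "INSERT INTO accounts (balance) VALUES (100), (0)"

sqlite3 "$db" "
BEGIN;
UPDATE accounts SET balance = balance - 25 WHERE id = 1;
UPDATE accounts SET balance = balance + 25 WHERE id = 2;
COMMIT;
"

sqlite3 -csv "$db" "SELECT id, balance FROM accounts ORDER BY id"   # 1,75 then 2,25
rm "$db"
```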
