# ai-automation-workflows

Installs: 4.9K
Rank: #594

## Install

```bash
npx skills add https://github.com/inferen-sh/skills --skill ai-automation-workflows
```
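Before running any workflow, it can help to confirm the CLI is actually on your `PATH`. A minimal sketch using only standard POSIX `command -v` (no `infsh`-specific flags are assumed):

```shell
# Verify the inference.sh CLI is available before attempting any workflow.
# command -v prints the resolved path and exits 0 when the command exists.
if command -v infsh >/dev/null 2>&1; then
  echo "infsh found: $(command -v infsh)"
else
  echo "infsh not found - install it, then authenticate with: infsh login"
fi
```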

## AI Automation Workflows

Build automated AI workflows via the inference.sh CLI.

### Quick Start

Requires the inference.sh CLI (`infsh`). Follow the install instructions, then authenticate:

```bash
infsh login
```

Simple automation: generate a daily image.

```bash
infsh app run falai/flux-dev \
  --input '{"prompt": "Inspirational quote background, minimalist design, date: '"$(date +%Y-%m-%d)"'"}'
```

## Automation Patterns

### Pattern 1: Batch Processing

Process multiple items with the same workflow.

```bash
#!/bin/bash
# batch_images.sh - Generate images for multiple prompts

PROMPTS=(
  "Mountain landscape at sunrise"
  "Ocean waves at sunset"
  "Forest path in autumn"
  "Desert dunes at night"
)

for prompt in "${PROMPTS[@]}"; do
  echo "Generating: $prompt"
  infsh app run falai/flux-dev \
    --input "{\"prompt\": \"$prompt, professional photography, 4K\"}" \
    > "output_${prompt// /_}.json"
  sleep 2  # Rate limiting
done
```

### Pattern 2: Sequential Pipeline

Chain multiple AI operations.

```bash
#!/bin/bash
# content_pipeline.sh - Full content creation pipeline

TOPIC="AI in healthcare"

# Step 1: Research
echo "Researching..."
RESEARCH=$(infsh app run tavily/search-assistant \
  --input "{\"query\": \"$TOPIC latest developments\"}")

# Step 2: Write article
echo "Writing article..."
ARTICLE=$(infsh app run openrouter/claude-sonnet-45 \
  --input "{\"prompt\": \"Write a 500-word blog post about $TOPIC based on: $RESEARCH\"}")

# Step 3: Generate image
echo "Generating image..."
IMAGE=$(infsh app run falai/flux-dev \
  --input "{\"prompt\": \"Blog header image for article about $TOPIC, modern, professional\"}")

# Step 4: Generate social post
echo "Creating social post..."
SOCIAL=$(infsh app run openrouter/claude-haiku-45 \
  --input "{\"prompt\": \"Write a Twitter thread (5 tweets) summarizing: $ARTICLE\"}")

echo "Pipeline complete!"
```

### Pattern 3: Parallel Processing

Run multiple operations simultaneously.

```bash
#!/bin/bash
# parallel_generation.sh - Generate multiple assets in parallel

# Start all jobs in the background
infsh app run falai/flux-dev --input '{"prompt": "Hero image..."}' > hero.json &
PID1=$!
infsh app run falai/flux-dev --input '{"prompt": "Feature image 1..."}' > feature1.json &
PID2=$!
infsh app run falai/flux-dev --input '{"prompt": "Feature image 2..."}' > feature2.json &
PID3=$!

# Wait for all to complete
wait $PID1 $PID2 $PID3
echo "All images generated!"
```

### Pattern 4: Conditional Workflow

Branch based on results.

```bash
#!/bin/bash
# conditional_workflow.sh - Process based on content analysis

INPUT_TEXT="$1"

# Analyze content
ANALYSIS=$(infsh app run openrouter/claude-haiku-45 \
  --input "{\"prompt\": \"Classify this text as: positive, negative, or neutral. Return only the classification.\n\n$INPUT_TEXT\"}")

# Branch based on result
case "$ANALYSIS" in
  positive)
    echo "Generating celebration image..."
    infsh app run falai/flux-dev --input '{"prompt": "Celebration, success, happy"}'
    ;;
  negative)
    echo "Generating supportive message..."
    infsh app run openrouter/claude-sonnet-45 \
      --input "{\"prompt\": \"Write a supportive, encouraging response to: $INPUT_TEXT\"}"
    ;;
  *)
    echo "Generating neutral acknowledgment..."
    ;;
esac
```

### Pattern 5: Retry with Fallback

Handle failures gracefully.

```bash
#!/bin/bash
# retry_workflow.sh - Retry failed operations

generate_with_retry() {
  local prompt="$1"
  local max_attempts=3
  local attempt=1

  while [ $attempt -le $max_attempts ]; do
    echo "Attempt $attempt..."
    result=$(infsh app run falai/flux-dev --input "{\"prompt\": \"$prompt\"}" 2>&1)
    if [ $? -eq 0 ]; then
      echo "$result"
      return 0
    fi
    echo "Failed, retrying..."
    ((attempt++))
    sleep $((attempt * 2))  # Exponential backoff
  done

  # Fallback to a different model
  echo "Falling back to alternative model..."
  infsh app run google/imagen-3 --input "{\"prompt\": \"$prompt\"}"
}

generate_with_retry "A beautiful sunset over mountains"
```

## Scheduled Automation

### Cron Job Setup

```bash
# Edit crontab
crontab -e

# Daily content generation at 9 AM
0 9 * * * /path/to/daily_content.sh >> /var/log/ai-automation.log 2>&1

# Weekly report every Monday at 8 AM
0 8 * * 1 /path/to/weekly_report.sh >> /var/log/ai-automation.log 2>&1

# Every 6 hours: social media content
0 */6 * * * /path/to/social_content.sh >> /var/log/ai-automation.log 2>&1
```

### Daily Content Script

```bash
#!/bin/bash
# daily_content.sh - Run daily at 9 AM

DATE=$(date +%Y-%m-%d)
OUTPUT_DIR="/output/$DATE"
mkdir -p "$OUTPUT_DIR"

# Generate daily quote image
infsh app run falai/flux-dev \
  --input '{"prompt": "Motivational quote background, minimalist, morning vibes"}' \
  > "$OUTPUT_DIR/quote_image.json"

# Generate daily tip
infsh app run openrouter/claude-haiku-45 \
  --input '{"prompt": "Give me one actionable productivity tip for today. Be concise."}' \
  > "$OUTPUT_DIR/daily_tip.json"

# Post to social (optional)
# infsh app run twitter/post-tweet --input "{...}"

echo "Daily content generated: $DATE"
```

## Monitoring and Logging

### Logging Wrapper

```bash
#!/bin/bash
# logged_workflow.sh - With comprehensive logging

LOG_FILE="/var/log/ai-workflow-$(date +%Y%m%d).log"

log() {
  echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" | tee -a "$LOG_FILE"
}

log "Starting workflow"

# Track execution time
START_TIME=$(date +%s)

# Run workflow
log "Generating image..."
RESULT=$(infsh app run falai/flux-dev --input '{"prompt": "test"}' 2>&1)
STATUS=$?

if [ $STATUS -eq 0 ]; then
  log "Success: Image generated"
else
  log "Error: $RESULT"
fi

END_TIME=$(date +%s)
DURATION=$((END_TIME - START_TIME))
log "Completed in ${DURATION}s"
```

### Error Alerting

```bash
#!/bin/bash
# monitored_workflow.sh - With error alerts

run_with_alert() {
  local result
  result=$("$@" 2>&1)
  local status=$?
  if [ $status -ne 0 ]; then
    # Send alert (webhook, email, etc.)
    curl -X POST "https://your-webhook.com/alert" \
      -H "Content-Type: application/json" \
      -d "{\"error\": \"$result\", \"command\": \"$*\"}"
  fi
  echo "$result"
  return $status
}

run_with_alert infsh app run falai/flux-dev --input '{"prompt": "test"}'
```

## Python SDK Automation

```python
#!/usr/bin/env python3
# automation.py - Python-based workflow

import subprocess
import json
from datetime import datetime
from pathlib import Path


def run_infsh(app_id: str, input_data: dict) -> dict:
    """Run inference.sh app and return result."""
    result = subprocess.run(
        ["infsh", "app", "run", app_id, "--input", json.dumps(input_data)],
        capture_output=True,
        text=True,
    )
    return json.loads(result.stdout) if result.returncode == 0 else None


def daily_content_pipeline():
    """Generate daily content."""
    date_str = datetime.now().strftime("%Y-%m-%d")
    output_dir = Path(f"output/{date_str}")
    output_dir.mkdir(parents=True, exist_ok=True)

    # Generate image
    image = run_infsh("falai/flux-dev", {
        "prompt": f"Daily inspiration for {date_str}, beautiful, uplifting"
    })
    (output_dir / "image.json").write_text(json.dumps(image))

    # Generate caption
    caption = run_infsh("openrouter/claude-haiku-45", {
        "prompt": "Write an inspiring caption for a daily motivation post. 2-3 sentences."
    })
    (output_dir / "caption.json").write_text(json.dumps(caption))

    print(f"Generated content for {date_str}")


if __name__ == "__main__":
    daily_content_pipeline()
```

## Workflow Templates

### Content Calendar Automation

```bash
#!/bin/bash
# content_calendar.sh - Generate a week of content

TOPICS=("productivity" "wellness" "technology" "creativity" "leadership")
DAYS=("Monday" "Tuesday" "Wednesday" "Thursday" "Friday")

for i in "${!DAYS[@]}"; do
  DAY=${DAYS[$i]}
  TOPIC=${TOPICS[$i]}
  echo "Generating $DAY content about $TOPIC..."

  # Image
  infsh app run falai/flux-dev \
    --input "{\"prompt\": \"$TOPIC theme, $DAY motivation, social media style\"}" \
    > "content/${DAY}_image.json"

  # Caption
  infsh app run openrouter/claude-haiku-45 \
    --input "{\"prompt\": \"Write a $DAY motivation post about $TOPIC. Include hashtags.\"}" \
    > "content/${DAY}_caption.json"
done
```

### Data Processing Pipeline

```bash
#!/bin/bash
# data_processing.sh - Process and analyze data files

INPUT_DIR="./data/raw"
OUTPUT_DIR="./data/processed"

for file in "$INPUT_DIR"/*.txt; do
  filename=$(basename "$file" .txt)

  # Analyze content
  infsh app run openrouter/claude-haiku-45 \
    --input "{\"prompt\": \"Analyze this data and provide key insights in JSON format: $(cat "$file")\"}" \
    > "$OUTPUT_DIR/${filename}_analysis.json"
done
```

## Best Practices

- **Rate limiting** - Add delays between API calls
- **Error handling** - Always check return codes
- **Logging** - Track all operations
- **Idempotency** - Design for safe re-runs
- **Monitoring** - Alert on failures
- **Backups** - Save intermediate results
- **Timeouts** - Set reasonable limits
