
Overview

Shipstar’s content generation runs asynchronously — you trigger a request, and the content is generated in the background. This guide covers strategies for testing your integration during development without waiting for real AI generation.

Development Workflow

Understanding the Content Lifecycle

Every piece of content follows this lifecycle:
pending → processing → completed (or failed)
When you call a generation endpoint, you get back a content_id and status: "pending". You then poll for the result.
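The four states above can be encoded in a small helper that your poller and error handling can share. A minimal sketch; only the status names come from the lifecycle described here:

```javascript
// Status helpers for the content lifecycle.
// Status names (pending/processing/completed/failed) come from this guide.
const TERMINAL_STATUSES = new Set(['completed', 'failed']);

function isTerminal(status) {
  return TERMINAL_STATUSES.has(status);
}

function shouldKeepPolling(status) {
  return status === 'pending' || status === 'processing';
}
```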

Testing Content Generation

1. Trigger Generation

Call any content generation endpoint. The request body is optional — defaults work for testing:
curl -X POST "https://app.shipstar.ai/api/internal/sources/github/changelog" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{}'
Response:
{
  "content_id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890",
  "status": "pending"
}
2. Poll for Status

Check the content status until it’s completed or failed:
curl -X GET "https://app.shipstar.ai/api/internal/sources/content/a1b2c3d4-e5f6-7890-abcd-ef1234567890" \
  -H "Authorization: Bearer YOUR_API_KEY"
3. Inspect the Result

Once completed, the response includes the full generated content:
{
  "id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890",
  "source": "My Product",
  "output_type": "public_changelog",
  "status": "completed",
  "from_date": "2025-03-24T00:00:00Z",
  "to_date": "2025-03-31T00:00:00Z",
  "model": "anthropic/claude-sonnet-4-6",
  "content": "{ ... generated content ... }",
  "public_slug": null,
  "is_published": false,
  "created_at": "2025-03-31T10:00:00Z"
}
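Note that for JSON output types the content field arrives as a JSON-encoded string (the quoted value above), while Markdown and plain-text types return it as-is. A sketch of a parser for both cases; every output_type value here except public_changelog is an assumption, so check them against your actual responses:

```javascript
// Sketch: decode `content` based on output_type.
// Only "public_changelog" appears in this guide; the rest are assumed names.
const JSON_OUTPUT_TYPES = new Set([
  'public_changelog',
  'feature_page',
  'kb_articles',
  'release_notes_email',
]);

function parseContent(item) {
  // JSON output types encode `content` as a JSON string; decode it.
  if (JSON_OUTPUT_TYPES.has(item.output_type)) {
    return JSON.parse(item.content);
  }
  // Markdown / plain-text types: return the string unchanged.
  return item.content;
}
```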

Testing Strategies

Use a Custom Date Range

Test with a specific date range that you know has commits:
curl -X POST "https://app.shipstar.ai/api/internal/sources/github/blog-post" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "start_date": "2025-03-01T00:00:00Z",
    "end_date": "2025-03-31T23:59:59Z",
    "product_name": "Test Product"
  }'
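For repeatable tests, a helper that builds a whole-month UTC range in the same ISO-8601 shape as the start_date and end_date parameters above can be useful. A sketch:

```javascript
// Build an ISO-8601 UTC date range covering one calendar month,
// matching the start_date/end_date format shown above.
function monthRange(year, month) { // month: 1-12
  const start = new Date(Date.UTC(year, month - 1, 1));
  // Day 0 of the next month is the last day of this month.
  const end = new Date(Date.UTC(year, month, 0, 23, 59, 59));
  return {
    start_date: start.toISOString().replace('.000Z', 'Z'),
    end_date: end.toISOString().replace('.000Z', 'Z'),
  };
}
```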

Test All Content Types

Run through each content type to verify your integration handles all formats:
Content Type        | Endpoint                                 | Response Format
Product Update      | POST /sources/github/product-update      | Markdown text
Feature Blog        | POST /sources/github/feature-blog        | Markdown text
Blog Post           | POST /sources/github/blog-post           | Markdown text
LinkedIn Post       | POST /sources/github/linkedin-post       | Plain text
Twitter Thread      | POST /sources/github/twitter-thread      | Plain text
Feature Page        | POST /sources/github/feature-page        | JSON
KB Articles         | POST /sources/github/kb-articles         | JSON
Changelog           | POST /sources/github/changelog           | JSON
Release Notes Email | POST /sources/github/release-notes-email | JSON
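The table above can be mirrored as a small matrix in test code, so one loop can exercise every endpoint and assert on the expected format. A sketch; the format labels are just tags for your own assertions:

```javascript
// Endpoint suffix -> expected response format, mirroring the table above.
const CONTENT_TYPES = {
  'product-update': 'markdown',
  'feature-blog': 'markdown',
  'blog-post': 'markdown',
  'linkedin-post': 'text',
  'twitter-thread': 'text',
  'feature-page': 'json',
  'kb-articles': 'json',
  'changelog': 'json',
  'release-notes-email': 'json',
};

function endpointFor(type) {
  return `/api/internal/sources/github/${type}`;
}
```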

Test Publishing

After content is generated, test the publish flow:
# Publish
curl -X POST "https://app.shipstar.ai/api/internal/content/{content_id}/publish" \
  -H "Authorization: Bearer YOUR_API_KEY"

# Verify it's publicly accessible (no auth needed)
curl "https://app.shipstar.ai/api/v1/changelog/{public_slug}"

# Check the RSS feed
curl "https://app.shipstar.ai/api/v1/changelog/{public_slug}/feed"

# Unpublish when done testing
curl -X POST "https://app.shipstar.ai/api/internal/content/{content_id}/unpublish" \
  -H "Authorization: Bearer YOUR_API_KEY"
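Scripting the same flow in JavaScript needs only the URLs above; a sketch of the builders, using the paths exactly as shown:

```javascript
// URL builders for the publish flow, using the paths shown above.
const BASE_URL = 'https://app.shipstar.ai';

const publishUrl = (id) => `${BASE_URL}/api/internal/content/${id}/publish`;
const unpublishUrl = (id) => `${BASE_URL}/api/internal/content/${id}/unpublish`;
const publicChangelogUrl = (slug) => `${BASE_URL}/api/v1/changelog/${slug}`;
const feedUrl = (slug) => `${BASE_URL}/api/v1/changelog/${slug}/feed`;
```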

Test Error Scenarios

Verify your application handles these cases:
- No connected repositories: if no GitHub repos are connected, generation will fail. Test that your app handles the error gracefully.
- Empty date range: use a date range with no activity to test how your app handles empty content.
- Failed generation: check for status: "failed" responses with an error_message field. Your polling logic should handle this state.
- Invalid API key: test with an invalid key to ensure your app handles 401 responses properly.
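A small classifier can route each of these cases to a distinct handler. A sketch; the status and error_message fields match what this guide describes, while the label strings are arbitrary:

```javascript
// Route each failure scenario above to a label your app can act on.
function classifyFailure(httpStatus, body) {
  if (httpStatus === 401) return 'invalid_api_key';
  if (body && body.status === 'failed') return 'generation_failed';
  if (httpStatus >= 400) return 'http_error';
  return 'ok';
}
```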

Polling Implementation

Since content generation is asynchronous, you’ll need to poll for results. Here’s a robust implementation:
async function generateAndWait(contentType, options = {}) {
  const API_URL = 'https://app.shipstar.ai/api/internal';
  const headers = {
    'Authorization': `Bearer ${process.env.SHIPSTAR_API_KEY}`,
    'Content-Type': 'application/json'
  };

  // Trigger generation
  const genResponse = await fetch(`${API_URL}/sources/github/${contentType}`, {
    method: 'POST',
    headers,
    body: JSON.stringify(options)
  });

  if (!genResponse.ok) {
    throw new Error(`Generation request failed: HTTP ${genResponse.status}`);
  }

  const { content_id } = await genResponse.json();

  // Poll for result
  const maxAttempts = 30;
  const intervalMs = 2000;

  for (let i = 0; i < maxAttempts; i++) {
    await new Promise(resolve => setTimeout(resolve, intervalMs));

    const statusResponse = await fetch(`${API_URL}/sources/content/${content_id}`, {
      headers
    });

    if (!statusResponse.ok) {
      throw new Error(`Status check failed: HTTP ${statusResponse.status}`);
    }

    const content = await statusResponse.json();

    if (content.status === 'completed') {
      return content;
    }

    if (content.status === 'failed') {
      throw new Error(`Generation failed: ${content.error_message}`);
    }
  }

  throw new Error('Generation timed out');
}

// Usage
const changelog = await generateAndWait('changelog', {
  start_date: '2025-03-01T00:00:00Z',
  end_date: '2025-03-31T23:59:59Z'
});
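The implementation above polls at a fixed 2-second interval. For longer-running generations, exponential backoff with a cap reduces request volume; a sketch of the schedule (the base and cap values are arbitrary starting points):

```javascript
// Exponential backoff schedule with a cap (values in ms).
// Drop-in replacement for the fixed intervalMs in the poller above.
function backoffDelays(maxAttempts, baseMs = 1000, capMs = 15000) {
  return Array.from({ length: maxAttempts }, (_, i) =>
    Math.min(baseMs * 2 ** i, capMs)
  );
}
```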

CI/CD Integration

Automated Testing with GitHub Actions

# .github/workflows/test.yml
name: Shipstar Integration Tests

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Install dependencies
        run: npm ci

      - name: Run tests
        env:
          SHIPSTAR_API_KEY: ${{ secrets.SHIPSTAR_API_KEY }}
        run: npm test

Integration Test Example

describe('Shipstar API Integration', () => {
  const API_URL = 'https://app.shipstar.ai';
  const headers = {
    'Authorization': `Bearer ${process.env.SHIPSTAR_API_KEY}`,
    'Content-Type': 'application/json'
  };

  test('should verify API key is valid', async () => {
    const response = await fetch(`${API_URL}/api/v1/me`, { headers });
    expect(response.status).toBe(200);

    const user = await response.json();
    expect(user.is_active).toBe(true);
  });

  test('should list generated content', async () => {
    const response = await fetch(`${API_URL}/api/internal/sources/content`, {
      headers
    });
    expect(response.status).toBe(200);

    const content = await response.json();
    expect(Array.isArray(content)).toBe(true);
  });

  test('should list published changelogs', async () => {
    const response = await fetch(`${API_URL}/api/v1/changelogs`);
    expect(response.status).toBe(200);
  });
});

Best Practices

Create a separate Shipstar project for testing with its own tracked repositories. This keeps test-generated content separate from production.
Unpublish and delete test content to avoid cluttering your project:
# Unpublish
curl -X POST "https://app.shipstar.ai/api/internal/content/{id}/unpublish" \
  -H "Authorization: Bearer YOUR_API_KEY"

# Delete
curl -X DELETE "https://app.shipstar.ai/api/internal/sources/content/{id}" \
  -H "Authorization: Bearer YOUR_API_KEY"
Your polling logic should handle all possible statuses: pending, processing, completed, and failed. Don’t assume content will always succeed.
Use date ranges that include actual commits in your tracked repos. Empty date ranges will produce less useful test results.
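Automated cleanup can pair the content list endpoint with the delete call above. A sketch of the selection step, assuming your test runs set a product name starting with "Test" (that naming convention is an assumption for this sketch, not a Shipstar requirement):

```javascript
// Select unpublished test items for deletion.
// The "Test" prefix convention is an assumption for this sketch.
function selectTestContent(items) {
  return items.filter(
    (item) => !item.is_published
      && typeof item.source === 'string'
      && item.source.startsWith('Test')
  );
}
```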