Kaizen’s Built-in Security Architecture

This guide outlines how Kaizen’s architecture and technology choices protect your application against common security threats. Our stack is designed with security-first principles while remaining developer-friendly.

Infrastructure Protection

DDoS Protection

  • Vercel Edge Network automatically mitigates DDoS attacks through their global edge network, load balancing, and traffic filtering
  • Attack traffic is filtered before reaching your application servers

API Abuse Prevention

  • Convex Rate Limiting prevents API abuse with built-in rate limiting capabilities
  • Different throttling rules for sensitive endpoints (auth, payments)
  • IP-based throttling and client fingerprinting to identify abusive clients
  • Automatic rate limiting responses when limits are exceeded

Application Security

Injection Attacks

  • Convex Functions use parameterized queries and built-in validation, eliminating injection risks
  • TypeScript Validation ensures all inputs are strictly typed and validated before processing (see the sketch after this list)
  • Content Security Policy headers prevent XSS attacks
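
For example, every Convex function declares validators for its arguments, so malformed input is rejected before your handler runs. A minimal sketch (the table and field names are illustrative, not part of Kaizen):

// convex/profiles.ts: argument validation sketch; table and field names are illustrative
import { v } from "convex/values";
import { mutation } from "./_generated/server";

export const updateDisplayName = mutation({
  // Convex rejects any call whose arguments don't match these validators
  args: { profileId: v.id("profiles"), displayName: v.string() },
  handler: async (ctx, args) => {
    // args is fully typed here, and queries are never built from raw strings,
    // so there is no injection surface
    await ctx.db.patch(args.profileId, { displayName: args.displayName });
  },
});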

Authentication Security

  • Clerk provides enterprise-grade authentication (and DDoS protection) with advanced security features; a route-gating sketch follows this list
  • Secure HTTP-only cookies for session management
  • CSRF protection built into authentication flows
  • Multi-factor authentication options for additional security
  • Ban and impersonate users from your app with Clerk
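
As a minimal illustration of gating a server-rendered page on a Clerk session (helper names may differ slightly between @clerk/nextjs versions, and this is not the exact page shipped with Kaizen):

// app/dashboard/page.tsx: a sketch of requiring a signed-in user
import { auth } from "@clerk/nextjs/server";
import { redirect } from "next/navigation";

export default async function DashboardPage() {
  // auth() reads the session from Clerk's secure HTTP-only cookie
  const { userId } = await auth();

  // No valid session means no userId, so send the visitor to sign in
  if (!userId) {
    redirect("/sign-in");
  }

  return <p>Signed in as {userId}</p>;
}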

Data Protection

  • Convex mediates all database access through your server-side functions, where access rules are enforced
  • Row-level ownership checks for multi-tenant setups (see the sketch after this list)
  • Environment variables securely managed through Vercel and Convex
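
Access rules live in your Convex functions: each query or mutation can check the authenticated identity and scope reads to the caller’s own rows. A minimal sketch (the table and index names are illustrative):

// convex/documents.ts: ownership check sketch; table and index names are illustrative
import { query } from "./_generated/server";

export const listMyDocuments = query({
  args: {},
  handler: async (ctx) => {
    const identity = await ctx.auth.getUserIdentity();
    if (identity === null) {
      throw new Error("Not authenticated");
    }

    // Only return rows owned by the caller
    // (assumes a "by_owner" index on the documents table)
    return await ctx.db
      .query("documents")
      .withIndex("by_owner", (q) => q.eq("ownerId", identity.subject))
      .collect();
  },
});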

Payment Security

  • Polar.sh handles payment processing with secure payment flows
  • Card details never touch your servers, removing most of the PCI compliance burden
  • Strong customer authentication (SCA) for European payments
  • Built-in fraud detection and prevention mechanisms

Compliance & Best Practices

  • Framework Security Updates automatically applied through Dependabot
  • HTTP Security Headers implemented according to OWASP recommendations (see the sketch after this list)
  • Content Security Policy restricts resource loading to trusted sources
  • Input sanitization throughout the application
  • SOC 2 compliance for enterprise applications
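
As one concrete piece of this, security headers can be set globally in next.config.ts. The values below are a reasonable starting point rather than the exact set Kaizen ships; tighten the Content Security Policy to match your own asset origins:

// next.config.ts: OWASP-style security headers sketch; tune the CSP for your app
import type { NextConfig } from "next";

const securityHeaders = [
  { key: "X-Content-Type-Options", value: "nosniff" },
  { key: "X-Frame-Options", value: "DENY" },
  { key: "Referrer-Policy", value: "strict-origin-when-cross-origin" },
  { key: "Strict-Transport-Security", value: "max-age=63072000; includeSubDomains; preload" },
  // Restrict where scripts, styles, and other resources may be loaded from
  { key: "Content-Security-Policy", value: "default-src 'self'; img-src 'self' data: https:;" },
];

const nextConfig: NextConfig = {
  async headers() {
    // Apply the headers to every route
    return [{ source: "/(.*)", headers: securityHeaders }];
  },
};

export default nextConfig;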

Ongoing Protection

  • Dependabot alerts for vulnerable packages
  • Vercel’s monitoring detects unusual traffic patterns
  • Convex Analytics provides visibility into API usage and potential abuse

Remember that no system is 100% secure, but Kaizen implements industry best practices across the stack. The architecture uses battle-tested cloud services that handle billions of requests daily, each with dedicated security teams and infrastructure.

If you have specific security requirements or concerns, please reach out to the community on Discord or GitHub.


Security should be a priority from day one. While you can iterate on features, security breaches can be catastrophic for early-stage companies.

A good boilerplate and infrastructure will usually cover the basics, but you should always be on the lookout for new threats and vulnerabilities.

We’ve gathered some resources below to help you get started.

Essential Resources

  • DefendSaaS - Comprehensive guide for securing your SaaS application, including:
    • Preventing abuse and fraud
    • DDoS protection strategies
    • Authentication security best practices
    • API security patterns
    • Compliance considerations

Regular Security Tasks

Weekly

  • Update dependencies
  • Review user access
  • Check backup status

Monthly

  • Security patches
  • Access review
  • Policy updates

Quarterly

  • Security audit
  • Penetration testing
  • Policy review

If you have some capital, consider hiring an agency such as ShipSecure or Jolt Security to audit your application for you.

Additional Tools

  1. Security Scanning

    • ProjectDiscovery - Open-source vulnerability scanning platform that focuses on exploitable vulnerabilities, used by 100k+ security pros. Includes:
      • Real-time attack surface monitoring
      • Custom exploit detection via Nuclei framework
      • False positive elimination
      • CI/CD integration
    • OWASP ZAP - Free security testing tool
    • Snyk - Dependency vulnerability scanning
    • SonarQube - Code quality and security review
  2. Monitoring

    • Sentry - Error tracking and monitoring
    • Datadog - Infrastructure monitoring
    • PagerDuty - Incident response platform
  3. Compliance

    • Vanta - Security compliance automation
    • Drata - Security and compliance automation
    • ComplyCube - KYC and AML compliance

Code Security Tools

Rate Limiting with Convex

Kaizen leverages Convex’s built-in rate limiting capabilities to protect your API endpoints from abuse. Convex provides serverless-friendly rate limiting that scales automatically with your application.

Rate limiting is crucial for protecting your API endpoints from abuse and ensuring fair usage.

  1. Configure rate limiting in your Convex functions

    Convex provides built-in rate limiting that you can configure in your function definitions:

    // In your Convex function (e.g. convex/myFunction.ts)
    import { internalAction } from "./_generated/server";

    export const myFunction = internalAction({
      args: { /* your args */ },
      handler: async (ctx, args) => {
        // Your function logic here
      },
      // Rate limiting configuration
      rateLimit: {
        requests: 20,
        window: "10s"
      }
    });
    
  2. Different rate limits for different endpoints

    You can set different rate limits based on the sensitivity of your endpoints:

    // Sensitive operations (auth, payments)
    export const sensitiveFunction = internalAction({
      args: { /* your args */ },
      handler: async (ctx, args) => {
        // Your function logic here
      },
      rateLimit: {
        requests: 5,
        window: "1m"
      }
    });
    
    // Regular API operations
    export const regularFunction = internalAction({
      args: { /* your args */ },
      handler: async (ctx, args) => {
        // Your function logic here
      },
      rateLimit: {
        requests: 50,
        window: "30s"
      }
    });
    

Rate limiting configuration

Convex provides sensible defaults for rate limiting:

  • Default rate limit: 100 requests per minute per user
  • Customizable per function: You can override defaults for specific functions
  • Automatic scaling: Rate limits scale with your application usage

Testing rate limits locally

To verify your rate limits are working:

  1. Send multiple rapid requests to a Convex function

    You can use the Convex dashboard or your application to send multiple requests to a rate-limited function.

  2. Check rate limit responses

    When rate limits are exceeded, Convex returns an error response that includes rate limit information (a small test script is sketched after this list).

  3. Monitor in Convex dashboard

    The Convex dashboard provides visibility into rate limiting and API usage patterns.
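
For a quick local check, a small script can hammer a rate-limited function and log when calls start failing. This is a sketch only: the deployment URL variable and api.example.myFunction are placeholders, and the function you call must be public (internal functions can only be invoked from other Convex functions or the dashboard).

// scripts/test-rate-limit.ts: hypothetical test script; replace the function reference with your own
import { ConvexHttpClient } from "convex/browser";
import { api } from "../convex/_generated/api";

const client = new ConvexHttpClient(process.env.NEXT_PUBLIC_CONVEX_URL!);

async function main() {
  for (let i = 1; i <= 30; i++) {
    try {
      // Replace api.example.myFunction with the public, rate-limited function under test
      await client.action(api.example.myFunction, {});
      console.log(`request ${i}: ok`);
    } catch (err) {
      // Once the limit is exceeded, calls reject with a rate limit error
      console.log(`request ${i}: rejected:`, (err as Error).message);
    }
  }
}

main();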

Implementing rate limiting on new functions

To add rate limiting to a new Convex function:

  1. Add rate limiting configuration

    export const newFunction = internalAction({
      args: { /* your args */ },
      handler: async (ctx, args) => {
        // Your function logic here
      },
      rateLimit: {
        requests: 20,
        window: "10s"
      }
    });
    
  2. Use different rate limits for different operations

    // High-frequency operations
    export const highFrequencyFunction = internalAction({
      args: { /* your args */ },
      handler: async (ctx, args) => {
        // Your function logic here
      },
      rateLimit: {
        requests: 100,
        window: "1m"
      }
    });
    
    // Low-frequency, sensitive operations
    export const sensitiveFunction = internalAction({
      args: { /* your args */ },
      handler: async (ctx, args) => {
        // Your function logic here
      },
      rateLimit: {
        requests: 5,
        window: "5m"
      }
    });
    

Deploying to production

  1. Rate limits are automatically applied

    Convex rate limiting is automatically applied when you deploy your functions to production.

  2. Monitor usage in Convex dashboard

    The Convex dashboard shows per-function request volume and errors, so you can confirm that your limits match real traffic once deployed.

Rate Limiting with Upstash Redis (Next.js API routes)

For routes that live in Next.js rather than Convex, Kaizen also includes a lib/ratelimit.ts helper backed by Upstash Redis. A sketch of the helper itself follows step 2 below.

  1. Apply rate limiting in your API route

    // app/api/example/route.ts (the path and the "@/lib/ratelimit" alias are illustrative;
    // point the import at your lib/ratelimit.ts)
    import { NextRequest, NextResponse } from "next/server";
    import { applyRateLimit } from "@/lib/ratelimit";

    export async function GET(req: NextRequest) {
      // Apply rate limiting (using default 'api' limiter)
      const rateLimit = await applyRateLimit(req);
      if (!rateLimit.success) {
        return new NextResponse(
          JSON.stringify({ error: "Too many requests" }),
          { 
            status: 429, 
            headers: rateLimit.headers 
          }
        );
      }
      
      // Your API logic here
      
      // Add rate limit headers to successful responses
      return NextResponse.json({ data: yourData }, { headers: rateLimit.headers });
    }
    
  2. Use a specific limiter (optional)

    // Use the "auth" limiter instead of the default "api" limiter
    const rateLimit = await applyRateLimit(req, "auth");
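
For reference, the applyRateLimit helper in lib/ratelimit.ts looks roughly like the sketch below. Treat it as an outline rather than the exact implementation Kaizen ships: it assumes the @upstash/ratelimit and @upstash/redis packages, uses the caller’s IP as the identifier, and the limits shown are illustrative.

// lib/ratelimit.ts: a simplified sketch, not the verbatim Kaizen implementation
import { Ratelimit } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis";
import { NextRequest } from "next/server";

const limiters = {
  // General API traffic
  api: new Ratelimit({
    redis: Redis.fromEnv(),
    limiter: Ratelimit.slidingWindow(20, "30 s"),
    analytics: true,
    prefix: "ratelimit:api",
  }),
  // Stricter budget for auth-related routes
  auth: new Ratelimit({
    redis: Redis.fromEnv(),
    limiter: Ratelimit.slidingWindow(5, "60 s"),
    analytics: true,
    prefix: "ratelimit:auth",
  }),
};

export async function applyRateLimit(
  req: NextRequest,
  limiterName: keyof typeof limiters = "api"
) {
  // Identify the caller; behind Vercel the client IP arrives in x-forwarded-for
  const ip = req.headers.get("x-forwarded-for")?.split(",")[0]?.trim() ?? "anonymous";
  const { success, limit, remaining, reset } = await limiters[limiterName].limit(ip);

  return {
    success,
    headers: {
      "X-RateLimit-Limit": limit.toString(),
      "X-RateLimit-Remaining": remaining.toString(),
      "X-RateLimit-Reset": reset.toString(),
    },
  };
}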
    

Deploying Upstash rate limiting to production

  1. Create a production Upstash Redis database

    Follow the same steps as for development, but choose an appropriate plan and region for your production needs.

  2. Add environment variables to your hosting platform

    Add the Upstash credentials to your hosting platform (e.g., Vercel, Netlify):

    • UPSTASH_REDIS_REST_URL
    • UPSTASH_REDIS_REST_TOKEN
  3. Using Vercel integration (recommended)

    If you’re deploying to Vercel, you can use the Upstash integration:

    a. Go to the Vercel dashboard for your project

    b. Navigate to “Settings” > “Integrations”

    c. Search for “Upstash” and click “Add Integration”

    d. Follow the steps to link your Upstash account

    e. Select your Redis database to connect to your Vercel project

    This will automatically set up the environment variables for you.

Common Issues & Troubleshooting

Rate limiting isn’t working

  • Verify your Upstash Redis credentials are correctly set in your environment variables
  • Check if your Redis database is accessible from your application
  • Ensure you’re applying rate limiting correctly in your API route

Getting Redis connection errors

  • Verify your Upstash Redis database is active (a quick connectivity check is sketched after this list)
  • Check network connectivity between your application and Upstash
  • Ensure your token has the correct permissions
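
A quick way to confirm both connectivity and credentials is a one-off ping, assuming the @upstash/redis package and the UPSTASH_* environment variables from above:

// scripts/check-redis.ts: minimal connectivity check
import { Redis } from "@upstash/redis";

const redis = Redis.fromEnv();

redis
  .ping()
  .then((reply) => console.log("Upstash reachable:", reply)) // expect "PONG"
  .catch((err) => console.error("Upstash connection failed:", err));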

Rate limits are too restrictive or too lenient

Adjust the rate limit configuration in lib/ratelimit.ts based on your traffic patterns:

// Example: Increase API rate limit to 50 requests per 30 seconds
api: new Ratelimit({
  redis: Redis.fromEnv(),
  limiter: Ratelimit.slidingWindow(50, "30 s"),
  analytics: true,
}),

Rate limiting is only as good as its configuration. Monitor your API usage and adjust limits as needed to balance security and user experience.