Rate Limits
The VibePeak API implements rate limiting to ensure fair usage and maintain service quality for all users. Rate limits are based on your subscription plan.
Concurrent Task Limits
The primary rate limit is the number of concurrent video generation tasks you can have in progress at any time:
| Plan | Concurrent Tasks | API Access |
|---------|------------------|------------|
| Free | 0 | No access |
| Starter | 0 | No access |
| Plus | 1 | Yes |
| Pro | 3 | Yes |
A task is considered “in progress” while it’s in queued or processing status. Once a task reaches completed or failed status, it no longer counts against your limit.
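The status rule above can be expressed directly in code. A small sketch; the `status` field name follows the wording above:

```javascript
// Per the docs, only queued and processing tasks count against the limit.
const IN_PROGRESS_STATUSES = ['queued', 'processing'];

function countsAgainstLimit(task) {
  return IN_PROGRESS_STATUSES.includes(task.status);
}
```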
How It Works
When you submit a new slideshow request, the API checks your current number of in-progress tasks:
- If you’re under your limit, the task is accepted and queued
- If you’ve reached your limit, you receive a `429 Too Many Requests` error
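The check above can be modeled client-side. A sketch of the admission rule as described, not the server's actual implementation:

```javascript
// Mirrors the documented admission check: under the limit, the task is
// accepted and queued; at the limit, the request is rejected with a 429.
function admissionResult(inFlight, limit) {
  if (inFlight < limit) {
    return { accepted: true };
  }
  return { accepted: false, status: 429, code: 'CONCURRENCY_LIMIT_EXCEEDED' };
}
```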
Rate Limit Response
When you exceed your concurrent task limit, you’ll receive:
```json
{
  "error": {
    "code": "CONCURRENCY_LIMIT_EXCEEDED",
    "message": "You have reached your concurrent task limit of 1. Please wait for existing tasks to complete.",
    "request_id": "req_xyz123",
    "details": {
      "limit": 1,
      "in_flight": 1,
      "plan": "Plus"
    }
  }
}
```
| Field | Description |
|-------|-------------|
| `limit` | Your maximum allowed concurrent tasks |
| `in_flight` | Number of tasks currently in progress |
| `plan` | Your current subscription plan |
Plan Required Response
If you’re on a Free or Starter plan without API access:
```json
{
  "error": {
    "code": "PLAN_REQUIRED",
    "message": "API access requires a Plus or Pro plan. Please upgrade your subscription.",
    "request_id": "req_xyz123",
    "details": {
      "current_plan": "Free",
      "required_plans": ["Plus", "Pro"]
    }
  }
}
```
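When handling errors, it helps to distinguish the two codes above: a concurrency limit clears on its own as tasks finish, while a plan restriction will not. A minimal sketch of that distinction:

```javascript
// Decide whether a VibePeak error payload is worth retrying.
// Error codes are taken from the response examples above.
function isRetryable(errorBody) {
  const code = errorBody?.error?.code;
  // Concurrency limits clear as tasks complete, so retrying later can succeed.
  if (code === 'CONCURRENCY_LIMIT_EXCEEDED') return true;
  // A plan upgrade is required; retrying will never succeed.
  if (code === 'PLAN_REQUIRED') return false;
  // Treat unknown errors as non-retryable by default.
  return false;
}
```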
Handling Rate Limits
```javascript
async function createSlideshowWithRetry(images, apiKey, maxRetries = 3) {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    const response = await fetch('https://api.vibepeak.ai/v1/real-estate/narrated-slideshow', {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${apiKey}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({ images })
    });

    if (response.status === 429) {
      const error = await response.json();
      console.log(`Rate limited. In-flight: ${error.error.details.in_flight}`);
      // Exponential backoff: wait 1, 2, then 4 minutes before retrying
      await new Promise(resolve => setTimeout(resolve, 60000 * 2 ** attempt));
      continue;
    }

    if (!response.ok) {
      throw new Error(`API error: ${response.status}`);
    }

    return response.json();
  }

  throw new Error('Max retries exceeded');
}
```
Best Practices
- **Monitor Task Status**: Track your in-progress tasks to avoid hitting limits
- **Use Webhooks**: Get notified when tasks complete instead of polling
- **Queue Requests**: Implement a local queue to manage submission rate
- **Handle 429 Gracefully**: Implement exponential backoff for rate limit errors
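The backoff recommendation above reduces to a small helper. A minimal sketch; the 60-second base and 10-minute cap are illustrative assumptions, not documented values:

```javascript
// Exponential backoff schedule for 429 responses: the delay doubles with
// each attempt and is capped so a long queue never stalls a client forever.
function backoffDelayMs(attempt, baseMs = 60000, maxMs = 600000) {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}
```

Use the result with `setTimeout` (or an awaited `Promise`) between retry attempts, as in the retry example above.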
Implementing a Task Queue
For high-volume applications, implement a local queue:
```javascript
class TaskQueue {
  constructor(apiKey, concurrencyLimit) {
    this.apiKey = apiKey;
    this.concurrencyLimit = concurrencyLimit;
    this.inFlight = 0;
    this.queue = [];
  }

  submit(images) {
    return new Promise((resolve, reject) => {
      this.queue.push({ images, resolve, reject });
      this.processQueue();
    });
  }

  processQueue() {
    while (this.queue.length > 0 && this.inFlight < this.concurrencyLimit) {
      const { images, resolve, reject } = this.queue.shift();
      this.inFlight++;
      // Start the task without awaiting it, so the loop can launch
      // additional tasks in parallel up to the concurrency limit.
      this.createSlideshow(images)
        .then(resolve, reject)
        .finally(() => {
          this.inFlight--;
          this.processQueue();
        });
    }
  }

  async createSlideshow(images) {
    const response = await fetch('https://api.vibepeak.ai/v1/real-estate/narrated-slideshow', {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${this.apiKey}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({ images })
    });
    if (!response.ok) {
      throw new Error(`API error: ${response.status}`);
    }
    return response.json();
  }
}

// Usage: submit both before awaiting, so they are queued together
const queue = new TaskQueue(apiKey, 3); // Pro plan limit
const [first, second] = await Promise.all([
  queue.submit(images1),
  queue.submit(images2)
]);
```

Tasks are automatically queued and processed within your plan's limits.
Upgrading Your Plan
Need higher limits? Upgrade your plan to increase your concurrent task quota:
- **Plus**: 1 concurrent task
- **Pro**: 3 concurrent tasks
Monitoring Usage
Keep track of your in-flight tasks to avoid rate limiting. The task status endpoint shows your active tasks.
You can monitor your usage by:
- Tracking the `in_flight` count in 429 error responses
- Maintaining a local counter of submitted vs. completed tasks
- Using webhooks to get real-time completion notifications
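A local counter can be combined with webhook notifications. A minimal sketch; the webhook payload fields (`task_id`, `status`) are assumptions here, so check the webhook documentation for the actual field names:

```javascript
// Tracks in-flight tasks locally: increment on submit, decrement when a
// webhook reports a terminal status (completed or failed).
class InFlightTracker {
  constructor() {
    this.active = new Set();
  }

  onSubmitted(taskId) {
    this.active.add(taskId);
  }

  onWebhook(event) {
    // Per the docs, a task stops counting once it is completed or failed.
    if (event.status === 'completed' || event.status === 'failed') {
      this.active.delete(event.task_id);
    }
  }

  get inFlight() {
    return this.active.size;
  }
}
```

Before submitting a new task, compare `tracker.inFlight` against your plan's concurrency limit to avoid a guaranteed 429.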