Integrate with a line of code
You don’t have to rewrite your code to leverage Dispatch, nor are there any new APIs to learn. Our Python SDK exposes a single decorator that wraps your function to add automatic retries, execution resumability, rate limiting, and asynchronous execution.
As an added bonus, there’s no vendor lock-in. Rip out the @dispatch.function decorator and your code continues to work.
import dispatch

@dispatch.function
async def checkout(user_id):
    receipt_id = await charge_credit_card(user_id)
    await send_email_confirmation(user_id, receipt_id)

checkout.dispatch(123)
Failure recovery built in
Dispatch automatically retries your function in case of a failure, restores state from the last run, and resumes execution where it left off. Code that was successfully executed doesn’t run again, so you don’t need to worry about nightmare scenarios like charging a credit card twice or spamming mailboxes when a retry happens.
import dispatch

@dispatch.function
async def checkout(user_id):
    receipt_id = await charge_credit_card(user_id)
    await send_email_confirmation(user_id, receipt_id)

checkout.dispatch(123)
Rate limiting on autopilot
Configuring throttling, debouncing, or rate limits by hand is not how we should be writing code in 2024. With Dispatch, the execution rate adapts on the fly to keep you under your rate limits, with zero configuration involved.
As the number of tasks increases, Dispatch automatically scales up concurrency to work through the queue as quickly as possible. When failures start to appear, concurrency scales back down to avoid overloading your systems or any third-party systems you rely on.
We use AIMD, an algorithm similar to TCP’s congestion control, to optimally rate limit traffic.
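To make the idea concrete, here is a minimal sketch of an AIMD (additive increase, multiplicative decrease) concurrency limiter. This is an illustration of the algorithm, not Dispatch’s actual implementation, and the class and parameter names are hypothetical:

```python
class AIMDLimiter:
    """Toy AIMD limiter: grow the limit slowly on success,
    cut it sharply on failure (like TCP congestion control)."""

    def __init__(self, initial=1.0, increase=1.0, decrease=0.5, floor=1.0):
        self.limit = initial      # current concurrency limit
        self.increase = increase  # additive step on success
        self.decrease = decrease  # multiplicative factor on failure
        self.floor = floor        # never drop below this limit

    def on_success(self):
        # Additive increase: ramp up gradually while calls succeed.
        self.limit += self.increase

    def on_failure(self):
        # Multiplicative decrease: back off quickly when errors appear.
        self.limit = max(self.floor, self.limit * self.decrease)


limiter = AIMDLimiter()
for _ in range(4):
    limiter.on_success()   # limit: 1 -> 2 -> 3 -> 4 -> 5
limiter.on_failure()       # limit halves: 5 -> 2.5
```

The asymmetry is the point: ramping up linearly but backing off exponentially converges on the highest sustainable rate without hammering a struggling downstream service.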