Rate limit guidance for large-scale donation migration (100k+ records) #1400

@BrendanRomanDev

Description

I've seen related issues about bulk donations, among others, but we could use some guidance on using the API or the platform to make this transition smoothly.

I'm part of an org moving historical donation records from another system into Planning Center so we can maintain an accurate financial history. We have over 100k donations to import. I noticed the Planning Center web portal supports CSV uploads for some entities, but not for donations!

If we're limited to posting a single donation at a time, that's 100k API calls for 100k rows, not counting the prerequisite queries for each required person_id or the creation of the batches! Even after implementing a caching strategy, we're hitting bottlenecks and failed requests on jobs as small as 100 CSV rows.
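
For context, here's a minimal sketch of the throttling and caching we're doing now (Python). The `Retry-After` handling, the 20-second fallback, and the `where[search_name_or_email]` People filter are based on our reading of the public docs, and the credentials are placeholders:

```python
import time
import requests

API = "https://api.planningcenteronline.com"
AUTH = ("APP_ID", "SECRET")  # placeholder Personal Access Token credentials

# email -> person_id; avoids re-querying People for donors who repeat across rows
person_id_cache: dict[str, str | None] = {}

def pco_get(path: str, params: dict | None = None) -> dict:
    """GET with simple 429 handling: sleep for Retry-After, then try again."""
    while True:
        resp = requests.get(API + path, params=params, auth=AUTH)
        if resp.status_code == 429:
            # When the rate-limit window is exhausted, back off for however
            # many seconds the server asks for (fall back to 20 if absent).
            time.sleep(int(resp.headers.get("Retry-After", "20")))
            continue
        resp.raise_for_status()
        return resp.json()

def lookup_person_id(email: str) -> str | None:
    """Resolve a donor email to a person_id, caching results across CSV rows."""
    if email not in person_id_cache:
        data = pco_get("/people/v2/people",
                       params={"where[search_name_or_email]": email})
        matches = data.get("data", [])
        person_id_cache[email] = matches[0]["id"] if matches else None
    return person_id_cache[email]
```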

What's the recommended approach for this kind of one-time historical data migration?

  1. Is there a bulk import process or tool we're missing that would be better suited to this use case?
  2. Would it be possible to request a temporary rate limit increase for a one-time data migration like this?
  3. Are there any best practices you'd recommend for handling large-scale imports while minimizing impact on PCO's infrastructure? The inability to POST multiple donations in a single call feels at odds with the API's rate limiting - especially for conversions 😵 (see the sketch after this list for our current per-donation flow)
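
For reference, here's roughly the per-donation flow we've ended up with: one batch per chunk of rows, then one POST per donation, reusing the same back-off. The `/giving/v2/batches/{batch_id}/donations` endpoint and payload shape reflect our reading of the Giving API docs, and we've elided the designation/fund details:

```python
import time
import requests

API = "https://api.planningcenteronline.com"
AUTH = ("APP_ID", "SECRET")  # placeholder credentials

def pco_post(path: str, payload: dict) -> dict:
    """POST with the same naive 429 back-off as our GET helper."""
    while True:
        resp = requests.post(API + path, json=payload, auth=AUTH)
        if resp.status_code == 429:
            time.sleep(int(resp.headers.get("Retry-After", "20")))
            continue
        resp.raise_for_status()
        return resp.json()

def import_chunk(rows: list[dict]) -> None:
    """One batch per chunk of CSV rows, then one POST per donation."""
    batch = pco_post("/giving/v2/batches", {
        "data": {"attributes": {"description": "Historical import"}}
    })
    batch_id = batch["data"]["id"]
    for row in rows:
        # NOTE: the real payload also needs amount/fund designations per the
        # Giving API docs; we've left that shape out here.
        pco_post(f"/giving/v2/batches/{batch_id}/donations", {
            "data": {
                "attributes": {
                    "received_at": row["received_at"],
                    "payment_method": row["payment_method"],
                },
                "relationships": {
                    "person": {"data": {"type": "Person", "id": row["person_id"]}},
                },
            },
        })
```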

Appreciate any guidance you can provide. We want to make sure we're doing this the right way 🙏
