
Improving Performance for Large FishBase API Queries — Best Practices?

Hi everyone,

I'm working on a lightweight app that fetches fish species data in near real-time (e.g., taxonomy, ecology, locations) and aims to deliver fast responses even on mobile devices. I've been using simple GET requests to FishBase API routes like /species and /ecology, and pinning the database version via request headers, which works well for basic use cases.

As the app scales, I'm seeing increased response times, especially when combining multiple tables or paging through large datasets. I'm curious:

Has anyone experimented with batching queries, filtering fields, or leveraging specific API headers to reduce latency?
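On field filtering, here is a minimal sketch of the kind of URL builder I have in mind. The base URL (https://fishbase.ropensci.org) and the `fields`/`limit`/`offset` query parameter names are assumptions based on the rOpenSci deployment; check your deployment's docs before relying on them.

```python
from urllib.parse import urlencode

# Assumed base URL (rOpenSci FishBase API deployment); adjust if yours differs.
BASE_URL = "https://fishbase.ropensci.org"

def build_query_url(route, fields=None, limit=None, offset=None, **filters):
    """Build a GET URL that asks the server for only the columns we need.

    `fields`, `limit`, and `offset` are assumed query-parameter names;
    extra keyword arguments become plain filter parameters.
    """
    params = dict(filters)
    if fields:
        params["fields"] = ",".join(fields)  # e.g. "SpecCode,Genus,Species"
    if limit is not None:
        params["limit"] = limit
    if offset is not None:
        params["offset"] = offset
    query = urlencode(params)
    return f"{BASE_URL}/{route}?{query}" if query else f"{BASE_URL}/{route}"

# Usage: request just one column, capped at 10 rows.
url = build_query_url("species", fields=["SpecCode"], limit=10)
# -> "https://fishbase.ropensci.org/species?fields=SpecCode&limit=10"
```

Trimming unused columns this way cuts payload size before any caching or compression even enters the picture.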

Any tips for minimizing data transfer while retaining depth (e.g., excluding unused fields, parallel requests)?
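On parallel requests: since routes like /species and /ecology are independent GETs, a small thread pool is usually enough to overlap network latency. A sketch, with the fetch function injected (any wrapper around your HTTP client) so the fan-out logic itself needs no network:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_all(urls, fetch, max_workers=4):
    """Issue several independent API requests concurrently.

    `fetch` is any callable taking a URL and returning parsed data,
    e.g. lambda u: requests.get(u, timeout=10).json().
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves input order, so results line up with urls.
        return list(pool.map(fetch, urls))
```

Keeping `max_workers` modest is deliberate: it overlaps round-trip latency without hammering a shared public API.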

Are there preferred tools or lightweight tracking methods for monitoring API performance over time?
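For lightweight tracking, before reaching for a full dashboard, an in-process timer that records per-route durations answers most "which query got slow?" questions. A minimal sketch (the route labels and summary shape are my own choices, not anything the API provides):

```python
import time
from statistics import mean

class LatencyTracker:
    """Record per-route call durations and summarize them on demand."""

    def __init__(self):
        self.samples = {}  # route name -> list of durations in seconds

    def timed(self, route, fn, *args, **kwargs):
        """Run fn(*args, **kwargs), logging its wall-clock duration under route."""
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            self.samples.setdefault(route, []).append(time.perf_counter() - start)

    def summary(self):
        """Return count, mean, and max duration per route."""
        return {route: {"n": len(d), "mean_s": mean(d), "max_s": max(d)}
                for route, d in self.samples.items()}
```

Wrapping each API call in `tracker.timed("species", fetch, url)` and printing `tracker.summary()` periodically is often enough to spot regressions over time.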

I've also started using a minimal SEO tracking tool (https://t-ranks.com) that helps me monitor keyword visibility without bloated dashboards. It might be useful for anyone running a small-scale app who needs quick insights into performance metrics.

Would love to learn what approaches or tools others use to keep FishBase API queries fast and efficient!