Towards a systematic approach to proxy benchmarking for API Rate Limiting management





Published in

Actas de las XVIII Jornadas de Ciencia e Ingeniería de Servicios (JCIS 2023)

Creative Commons license


API proxies are a powerful option for abstracting away aspects of REST APIs that are not specific to the actual API logic, such as authentication and authorisation, load balancing, rate limiting, API versioning, caching, and security. Today's heterogeneous microservice architectures share common ground for managing these concerns: the configuration of an API proxy that supports them. This approach has become especially popular for rate limiting, since such proxies can enforce the business model contracted by API customers; in other words, they reject client requests that exceed specified limits. However, there are currently multiple proxies with similar features and capabilities, which makes it difficult to compare them and ultimately select the one that best satisfies one's needs. In this paper we introduce a first approach to the systematic analysis of the limitations and performance of different proxies, in order to build a map that helps identify which proxy best fits a specific use case.
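To illustrate the kind of enforcement the abstract describes, the sketch below shows a minimal token-bucket rate limiter, one common algorithm behind proxy rate limiting. This is an illustrative example only, not the mechanism of any specific proxy evaluated in the paper; the class name, parameters, and limits are hypothetical.

```python
import time

class TokenBucket:
    """Illustrative token-bucket limiter: refills `rate` tokens/second,
    allowing bursts of up to `capacity` requests."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill tokens proportionally to the elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        # Over the contracted limit: a proxy would reject here (e.g. HTTP 429).
        return False

# Hypothetical plan: bursts of up to 5 requests, slow refill thereafter.
bucket = TokenBucket(rate=0.1, capacity=5)
results = [bucket.allow() for _ in range(10)]
```

In a rapid burst of ten requests, the first five are admitted and the rest rejected until tokens refill, which is exactly the behaviour a proxy uses to enforce a customer's contracted request quota.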


Author: Peluaga, Ignacio

Keywords

API Proxy, Service Level Agreement, Rate Limiting, Benchmarking