CallerAPI: An Innovative Microservice Processing Solution
Feb 18, 2025
—

Microservice communication doesn’t have to be complex. In this article, written by our developer Eduard, we introduce CallerAPI—a powerful internal tool designed to streamline interaction between microservices. Built to support over 40 services in our document processing platform, CallerAPI reduces boilerplate code, simplifies error handling, and supports environment configuration with ease. Discover how Eduard’s solution improves productivity and makes service integration effortless.
Disclaimer
CallerAPI was developed exclusively for our company’s internal needs and is intended to simplify the interaction between microservices within our document processing platform. The main goal of the package was to provide a convenient and effective tool for integrating a large number of microservices, allowing our developers to focus on solving business problems rather than on the routine work of setting up and calling services.
I would like to share this experience and the accompanying best practices, as they may be useful to other developers facing similar problems.
Introduction
In the modern world, microservice architecture has become one of the standards for developing complex software solutions. It allows you to split an application into many small, independent services, making them easier to develop, test, and deploy. However, as the number of microservices grows, coordinating their interaction becomes a problem in itself. At our company I faced the need to integrate more than 40 microservices, and the CallerAPI package developed for this purpose greatly simplified the process.
Problem representation
With a large number of microservices, it becomes inconvenient and difficult to manage their interactions. Each service has its own API, and calling one service from another requires writing a lot of code to configure HTTP requests, handle errors, and so on. This leads to code duplication and increases development and testing complexity.
Below are some of the typical problems of microservice interaction:
- Difficulty of integrating multiple microservices.
- The need to write a large amount of code for each HTTP request.
- Increased complexity of error handling and session management.
- Difficulty in maintaining and updating the code.
Solution
To solve these problems, the CallerAPI package was developed: it lets you interact with microservices easily and conveniently through a unified interface. The main idea of CallerAPI is to collect all services and their APIs in one package that can be installed in any microservice and used with minimal effort.
The main advantages of CallerAPI
- Simplified microservice calls through a single interface.
- Reduced code volume thanks to unified interaction methods (see the before/after sketch below).
- Convenient error handling and session management.
- Easy code maintenance and updates.
- Convenient configuration of environment variables (env).
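To make the contrast concrete, here is a rough before/after sketch. The service URL, payload shape, and the `Foo` wrapper class are illustrative assumptions; `Foo` stands in for the kind of service class shown later in this article.
```python
import aiohttp


# Without CallerAPI: every caller configures its own session, headers and timeouts.
async def process_document_raw(payload: dict) -> dict:
    timeout = aiohttp.ClientTimeout(total=300, sock_read=180)
    async with aiohttp.ClientSession(
        base_url="http://foo-service:6001",  # illustrative URL
        headers={"Accept": "application/json"},
        timeout=timeout,
    ) as session:
        async with session.post("/v1/process", json=payload) as response:
            response.raise_for_status()
            return await response.json()


# With CallerAPI: the caller only names the service class and its endpoint method.
async def process_document_with_caller_api(payload: dict) -> dict:
    response = await Foo().v1_process(payload)  # Foo is shown later in the article
    return await response.json()
```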
Implementation
1. The main components of CallerAPI
CallerAPI consists of several main components, each of which plays an important role in providing the functionality of the package.
1.1. AIOHttpClient
The AIOHttpClient class is responsible for managing HTTP client sessions. It provides creation and configuration of aiohttp.ClientSession for each service.
```python
from typing import Optional, Union

import aiohttp

# ServiceID is the str-based Enum of service identifiers shown later in the article.


class AIOHttpClient:
    """
    A class for managing AIOHttp client sessions.
    """

    def __init__(self, base_url: str) -> None:
        self.base_url = base_url

    def aiohttp_client(
        self, service_id: Union[ServiceID, str], timeout: Optional[Union[int, aiohttp.ClientTimeout]] = None
    ) -> aiohttp.ClientSession:
        """
        Return aiohttp.ClientSession based on the given service ID and timeout.
        Creates a new session if it does not exist or the previous one is closed.

        Parameters:
            service_id: Service ID.
            timeout: Session timeout.

        Returns:
            aiohttp.ClientSession: Session instance.
        """
        if timeout is None:
            timeout_settings = aiohttp.ClientTimeout(total=300, sock_read=180)
        elif isinstance(timeout, int):
            timeout_settings = aiohttp.ClientTimeout(total=timeout, sock_read=180)
        else:
            timeout_settings = timeout
        headers = {
            "Accept": "application/json",
        }
        # Accept both ServiceID members and plain strings for the base URL template.
        use_service_id = service_id.value if isinstance(service_id, ServiceID) else service_id
        client = aiohttp.ClientSession(
            headers=headers,
            base_url=self.base_url.format(use_service_id=use_service_id.lower()),
            timeout=timeout_settings,
        )
        return client
```
It also makes it possible to configure timeouts for HTTP sessions flexibly, which is especially useful for microservices with different timing requirements.
```python
def aiohttp_client(self, service_id: Union[ServiceID, str], timeout: Optional[Union[int, aiohttp.ClientTimeout]] = None) -> aiohttp.ClientSession:
...
```
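For example, here is a minimal sketch of the three timeout options, assuming a `ServiceID.foo` member and an illustrative base URL template:
```python
import aiohttp


async def timeout_examples() -> None:
    # The base URL template and ServiceID.foo are illustrative placeholders.
    client_factory = AIOHttpClient(base_url="http://{use_service_id}-service:6001")

    # No timeout given: the defaults (total=300s, sock_read=180s) are applied.
    session_default = client_factory.aiohttp_client(ServiceID.foo)

    # An int is treated as the total timeout in seconds.
    session_short = client_factory.aiohttp_client(ServiceID.foo, timeout=30)

    # A ready-made aiohttp.ClientTimeout is used as-is for fine-grained control.
    session_custom = client_factory.aiohttp_client(
        ServiceID.foo,
        timeout=aiohttp.ClientTimeout(total=600, connect=10, sock_read=300),
    )

    for session in (session_default, session_short, session_custom):
        await session.close()
```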
1.2. BaseApi
The BaseApi class is the base class for all API classes. It combines all the methods for executing HTTP requests (GET, POST, DELETE, etc.), which allows you to minimize code repetition and centralize request processing.
```python
from typing import Any, Optional, Union

import aiohttp
from fastapi.encoders import jsonable_encoder


class BaseApi:
    """
    A base class for making API calls.
    """

    http_client = None

    def __init__(
        self,
        app_id: str,
        base_url: str,
        timeout: Optional[Union[int, aiohttp.ClientTimeout]] = None,
    ) -> None:
        """
        Initialize BaseApi with application ID, base URL, and optional timeout.

        Parameters:
            app_id: The application ID.
            base_url: The base URL.
            timeout: Timeout for the client session.
        """
        self.app_id = app_id
        self.http_client = AIOHttpClient(base_url=base_url)
        self.timeout = timeout

    async def _get_client(self) -> aiohttp.ClientSession:
        """
        Return ClientSession instance.

        Returns:
            Client session instance.
        """
        return self.http_client.aiohttp_client(self.app_id, self.timeout)

    async def _request(self, method: str, ep: str, data: Any = None, encode: bool = True) -> aiohttp.ClientResponse:
        """
        Make an HTTP request using the given method, endpoint, and data.

        Parameters:
            method: The HTTP method. Supported: 'post', 'get', 'delete', 'patch', 'put', 'delete_body', 'post_data'.
            ep: The endpoint to make the request to.
            data: The data to be sent with the request.
            encode: Whether to pass the data through jsonable_encoder first.

        Returns:
            The response from the server.

        Raises:
            ValueError: If an unsupported method is provided.
        """
        http_client = await self._get_client()
        json_compatible_data = jsonable_encoder(data) if encode else data
        if method == "post":
            response = await http_client.post(ep, json=json_compatible_data)
        elif method == "get":
            response = await http_client.get(ep, params=data)
        elif method == "delete":
            response = await http_client.delete(ep, params=json_compatible_data)
        elif method == "patch":
            response = await http_client.patch(ep, json=json_compatible_data)
        elif method == "put":
            response = await http_client.put(ep, json=json_compatible_data)
        elif method == "delete_body":
            response = await http_client.delete(ep, json=json_compatible_data)
        elif method == "post_data":
            response = await http_client.post(ep, data=json_compatible_data)
        else:
            raise ValueError(f"Unsupported method: {method}")
        return response
```
Please note that two additional methods, `post_data` and `delete_body`, are used here. They differ from the regular `post` and `delete` in how the payload is passed: `post` sends it via the `json` parameter, while `post_data` uses the `data` parameter; similarly, `delete_body` sends the payload as a `json` body, whereas `delete` passes it as query `params`.
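As a hedged sketch of when `delete_body` is useful, here is a hypothetical wrapper built on the `BaseApi` class above; the `Baz` class and the `/v1/items` endpoint are illustrative assumptions, and the imports from the previous listing are assumed to be in scope.
```python
class Baz(BaseApi):
    """A class for calling a hypothetical Baz service."""

    async def v1_delete_items(self, item_ids: list) -> aiohttp.ClientResponse:
        # "delete" would pass item_ids as query params; "delete_body" sends them
        # as a JSON body, which some services expect for bulk deletes.
        return await self._request("delete_body", "/v1/items", {"item_ids": item_ids})
```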
Now let’s take a look at how easy it is to configure environment variables (env):
CallerAPI includes a convenient mechanism for setting environment variables, which greatly simplifies the configuration and management of various parameters. This is achieved with a `settings.py` file, which reads values from the `.env` file and exposes them for use in the code. One of the key features is the separation of calls to local services from the production environment using a single parameter, `DEPLOY`.
```python
from starlette.config import Config

config = Config(".env")

# Cast to bool so that "False"/"0" in .env is not treated as a truthy string.
DEPLOY = config("DEPLOY", cast=bool, default=False)

BASE_URLS = {
    "SERVICE_NAME": "http://localhost:6001",
    # ...
}


def get_config(key: str) -> str:
    if DEPLOY:
        DEFAULT_URL = config("DEFAULT_URL", default="http://localhost:6000")
        return config(key, default=DEFAULT_URL)
    else:
        return config(key, default=BASE_URLS[key])


for key in BASE_URLS:
    globals()[key] = get_config(key)
```
The main features
- Switching between local and production services
The `DEPLOY` variable, set in the `.env` file, allows you to switch easily between local services and the production environment. If `DEPLOY` is set to `True`, all services use the URLs configured for the production environment; otherwise, local URLs are used.
```python
if DEPLOY:
DEFAULT_URL = config("DEFAULT_URL", default="http://localhost:6000")
return config(key, default=DEFAULT_URL)
```
- Universal configuration of service URLs
All base URLs for microservices are stored in the BASE_URLS dictionary, which makes their configuration and management centralized and convenient.
```python
BASE_URLS = {
"SERVICE_NAME": "http://localhost:6001",
    # ...
}
```
- Dynamic creation of global variables
Environment variables for each service are created dynamically, allowing flexible configuration management (see the usage sketch after this list).
```python
for key in BASE_URLS:
globals()[key] = get_config(key)
```
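As an illustration of how these dynamically created variables might be consumed, here is a hedged sketch; the module path `caller_api.settings`, the `FOO_URL` key, and the `.env` contents are assumptions, not the actual internal layout.
```python
# Hypothetical .env for a deployed environment (values are illustrative):
#   DEPLOY=True
#   DEFAULT_URL=http://gateway.internal:6000
#   FOO_URL=http://foo.internal:8080
#
# With DEPLOY unset or False, FOO_URL falls back to its BASE_URLS entry instead.
from caller_api import settings as S  # assumed module path

print(S.FOO_URL)  # the URL resolved by get_config() when settings.py was imported
```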
Among the possible extensions are the following:
- Support for several environments
The `settings.py` file can be extended to support multiple environments, such as test and staging, by adding additional checks and environment variables (see the sketch after this list).
- Logging
Adding logging to track configuration loading and usage can be useful for debugging and monitoring.
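As a hedged sketch of both extensions, `settings.py` could branch on an `ENVIRONMENT` variable instead of the boolean `DEPLOY` and log every resolved URL; the variable names and URL maps below are assumptions, not part of the current package.
```python
import logging

from starlette.config import Config

config = Config(".env")
logger = logging.getLogger("caller_api.settings")

# Hypothetical environment switch: "local", "staging" or "production".
ENVIRONMENT = config("ENVIRONMENT", default="local")

BASE_URLS_BY_ENV = {
    "local": {"FOO_URL": "http://localhost:6001"},
    "staging": {"FOO_URL": "http://foo.staging.internal:6001"},
    "production": {"FOO_URL": "http://foo.internal:6001"},
}


def get_config(key: str) -> str:
    url = config(key, default=BASE_URLS_BY_ENV[ENVIRONMENT][key])
    logger.debug("Resolved %s=%s for environment %s", key, url, ENVIRONMENT)
    return url
```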
Usage examples
The `ServiceID` class uses `Enum` to store service IDs, which improves readability and reduces errors when using string values.
```python
from enum import Enum


class ServiceID(str, Enum):
    foo = "Foo"
    ...
```
6.1. Example class for calling a service
Let’s look at an example of such a class in detail:
```python
class Foo(CommonEndpoints):
    """
    A class for calling Foo service.
    """

    def __init__(self, timeout: Optional[Union[int, aiohttp.ClientTimeout]] = None) -> None:
        super().__init__(
            app_id=ServiceID.foo,
            base_url=S.FOO_URL,
            timeout=timeout,
        )

    async def v1_process(self, data: model.FooInput) -> aiohttp.ClientResponse:
        """
        Make a post request to the /v1/process endpoint.

        Parameters:
            data: FooInput object to be sent with the request.

        Returns:
            The response from the server.
        """
        return await self._request("post", "/v1/process", data)
```
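For completeness, here is a hedged usage sketch of this class; the `FooInput` fields and the calling code are illustrative assumptions.
```python
async def run_processing() -> dict:
    # The FooInput fields below are assumed for illustration only.
    payload = model.FooInput(document_id="123", language="en")
    response = await Foo(timeout=60).v1_process(payload)
    response.raise_for_status()
    return await response.json()
```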
6.2. File transfer and usage examples of multipart/form-data
6.2.1. File transfer
File transfer in CallerAPI is done using `aiohttp.FormData` (multipart/form-data). Here is an example method for transferring a file:
```python
class Bar(CommonEndpoints):
    """
    A class for calling Bar service.
    """

    def __init__(self, timeout: Optional[Union[int, aiohttp.ClientTimeout]] = None) -> None:
        super().__init__(
            ServiceID.bar,
            S.BAR_URL,
            timeout,
        )

    async def v1_file(self, subdomain: str, client_id: str, file: UploadFile) -> aiohttp.ClientResponse:
        """
        Make a post request to the /v1/file endpoint.

        Parameters:
            subdomain: The tenant.
            client_id: The user identifier.
            file: UploadFile object.

        Returns:
            The response from the server.
        """
        ep = f"/v1/file?subdomain={subdomain}&client_id={client_id}"
        form = aiohttp.FormData()
        form.add_field("file", await file.read(), filename=file.filename, content_type="application/octet-stream")
        return await self._request("post_data", ep, form, False)
```
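Here is a hedged sketch of how this might be called from a FastAPI route that simply forwards an uploaded file; the route path, parameter handling, and the assumption that `Bar` is importable from the CallerAPI package are illustrative.
```python
from fastapi import FastAPI, UploadFile

app = FastAPI()


@app.post("/forward-file")
async def forward_file(subdomain: str, client_id: str, file: UploadFile) -> dict:
    # Forward the uploaded file to the Bar service through CallerAPI.
    response = await Bar(timeout=120).v1_file(subdomain, client_id, file)
    response.raise_for_status()
    return await response.json()
```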
Conclusion
CallerAPI greatly simplifies communication between microservices by providing a unified interface for calling their APIs. It reduces the amount of code required to integrate services and improves session management and error handling, which ultimately increases productivity and ease of development. Additionally, CallerAPI provides convenient mechanisms for configuring environment variables, as well as support for file transfer via multipart/form-data.
We are confident that the idea of CallerAPI will be useful for developers working with microservice architecture, and will allow them to focus on developing business logic, rather than on routine integration tasks.