level 300

Asynchronous Request Reply

Expose asynchronous process state by decoupling request and response

Context

HTTP uses a synchronous request-response model - a consumer makes a request, and the service sends a response. Exposing services as RESTful HTTP APIs binds our architecture to this request-response model.

Sometimes the work just doesn’t fit the synchronous model of HTTP. Tasks may take seconds, minutes or longer to complete. In a world where latency has a material effect on customer experience and conversion, every millisecond matters. Services cannot afford to wait for long-running tasks to complete before responding.

Solution

Decouple request and response by separating task initialization from completion. Individual API calls respond immediately, but the logical steps are implemented as separate endpoints. A request initiates the work and a reply endpoint can be polled to track the state.

Endpoints are implemented as Lambda functions exposed as web services with API Gateway. State is tracked in a DynamoDB table - this state can be as simple as “start/finish” or as complex as the transitions through a full workflow.

In a RESTful architecture this can be modeled as a task resource. If the long-running downstream process creates a new resource, a Location header can communicate where that new resource lives.

# Create a new task and return an id
POST /task
{id: "id", status: "started"}

# Read the status of the task using its id
GET /task/id
{status: "pending"}

GET /task/id
Location: /resource/id
{status: "complete"}
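From the consumer’s side the interaction is a simple poll loop. A minimal sketch, assuming a hypothetical base URL and the response shapes shown above:

# Sketch of a consumer polling the reply endpoint until the task completes.
# The base URL, polling interval and field names are illustrative assumptions.
import time

import requests

BASE_URL = "https://api.example.com"  # assumed API Gateway endpoint


def run_task_and_wait(poll_interval: float = 2.0, timeout: float = 300.0) -> str:
    # Initiate the work; the service responds immediately with a task id.
    task = requests.post(f"{BASE_URL}/task").json()
    task_id = task["id"]

    deadline = time.time() + timeout
    while time.time() < deadline:
        reply = requests.get(f"{BASE_URL}/task/{task_id}")
        if reply.json()["status"] == "complete":
            # The Location header points at the resource the process created.
            return reply.headers["Location"]
        time.sleep(poll_interval)

    raise TimeoutError(f"task {task_id} did not complete within {timeout} seconds")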

Components

  • API Gateway and Lambda Functions
  • DynamoDB Table
  • Long Running Downstream Process
API Gateway and Lambda Functions
Expose functions as API endpoints. If no processing is required before landing a message on the queue, consider a gateway proxy.
An initial endpoint is responsible for starting the long-running downstream process and storing the initial state in the database. It returns an HTTP 201 Created or 202 Accepted response that includes a unique identifier that can be used to query the status.
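A minimal sketch of the initiation function, assuming an API Gateway Lambda proxy integration, a hypothetical TaskStateTable and an SQS queue that feeds the downstream process:

# Sketch of the initiation endpoint: record the initial state, hand the work
# to the downstream process, then respond immediately with 202 Accepted.
# The table name and queue URL are illustrative assumptions.
import json
import os
import time
import uuid

import boto3

dynamodb = boto3.resource("dynamodb")
sqs = boto3.client("sqs")

TABLE_NAME = os.environ.get("TASK_TABLE", "TaskStateTable")  # assumed table name
QUEUE_URL = os.environ.get("TASK_QUEUE_URL")                 # assumed queue URL


def handler(event, context):
    task_id = str(uuid.uuid4())

    # Append the first state record; later records supersede it by timestamp.
    dynamodb.Table(TABLE_NAME).put_item(
        Item={"id": task_id, "timestamp": int(time.time()), "status": "Start"}
    )

    # Kick off the long-running downstream process, modeled here as an SQS message.
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps({"id": task_id}))

    return {
        "statusCode": 202,
        "headers": {"Location": f"/task/{task_id}"},
        "body": json.dumps({"id": task_id, "status": "started"}),
    }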
A status endpoint can receive the identifier and query the database for status information.
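A corresponding sketch of the status function, assuming the same table with id as the partition key and timestamp as the sort key:

# Sketch of the status endpoint: return the most recent state record for a task.
import json
import os

import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb")
TABLE_NAME = os.environ.get("TASK_TABLE", "TaskStateTable")  # assumed table name


def handler(event, context):
    task_id = event["pathParameters"]["id"]

    # Newest record first; the first item returned is the current state.
    result = dynamodb.Table(TABLE_NAME).query(
        KeyConditionExpression=Key("id").eq(task_id),
        ScanIndexForward=False,
        Limit=1,
    )
    if not result["Items"]:
        return {"statusCode": 404, "body": json.dumps({"error": "unknown task"})}

    current = result["Items"][0]
    headers = {}
    if current["status"] == "Finish":
        # If the downstream process created a resource, point the caller at it.
        headers["Location"] = f"/resource/{task_id}"

    return {
        "statusCode": 200,
        "headers": headers,
        "body": json.dumps({"status": current["status"]}),
    }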
DynamoDB Table
A table holds the state of the processing. Rather than performing update operations on a single item, consider modeling this as an append-only log of state information. The most recently written record is always the current state, and we can always retrieve the history of state changes for a particular id. Consider enabling a Time to Live (TTL) attribute on the table to manage table size; a provisioning sketch follows the example table below.
| id | timestamp  | status |
| -- | ---------- | ------ |
| A  | 1572647506 | Finish |
| A  | 1572643906 | Start  |
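One way to provision such a table, with id as the partition key, timestamp as the sort key and TTL enabled on a hypothetical expires_at attribute:

# Sketch of the append-only state table: composite key of id + timestamp,
# with TTL on an "expires_at" attribute to expire old state records.
import boto3

client = boto3.client("dynamodb")

client.create_table(
    TableName="TaskStateTable",  # assumed table name
    AttributeDefinitions=[
        {"AttributeName": "id", "AttributeType": "S"},
        {"AttributeName": "timestamp", "AttributeType": "N"},
    ],
    KeySchema=[
        {"AttributeName": "id", "KeyType": "HASH"},          # partition key
        {"AttributeName": "timestamp", "KeyType": "RANGE"},  # sort key
    ],
    BillingMode="PAY_PER_REQUEST",
)
client.get_waiter("table_exists").wait(TableName="TaskStateTable")

client.update_time_to_live(
    TableName="TaskStateTable",
    TimeToLiveSpecification={"Enabled": True, "AttributeName": "expires_at"},
)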
Long Running Downstream Process
The long-running process pushes new state information to the DynamoDB table as required. Direct SDK access may be appropriate, but we may not want to expose the internal implementation details to another system; the operation could instead be abstracted using a Gateway Proxy or Function as a Web Service.
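A sketch of the worker appending a Finish record directly with the SDK, including the TTL attribute (names are illustrative):

# Sketch of the downstream worker appending a new state record when it finishes.
# Direct SDK access is shown for brevity; the write could sit behind a gateway
# proxy or a small function if the worker should not know about the table.
import os
import time

import boto3

TABLE_NAME = os.environ.get("TASK_TABLE", "TaskStateTable")  # assumed table name
table = boto3.resource("dynamodb").Table(TABLE_NAME)


def record_finish(task_id: str, ttl_days: int = 7) -> None:
    now = int(time.time())
    table.put_item(
        Item={
            "id": task_id,
            "timestamp": now,
            "status": "Finish",
            # TTL attribute so old state records expire automatically.
            "expires_at": now + ttl_days * 24 * 60 * 60,
        }
    )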

Notes

Asynchronous Request Reply can be used to expose the state of work performed by processes constructed using the Queue Based Load Leveling pattern.

The AWS platform has some hard limits on request times that affect long-running operations:

  • maximum 30-second request timeout when using API Gateway
  • maximum 15-minute Lambda function execution time

Cost Profile

| Service     | Charge                  |
| ----------- | ----------------------- |
| API Gateway | Request                 |
| API Gateway | Data Transfer           |
| Lambda      | Request                 |
| Lambda      | Compute Time x Memory   |
| CloudWatch  | Log Data Ingestion      |
| DynamoDB    | Read + Write Throughput |
| DynamoDB    | Data Storage            |

The above cost profile does not include the components of the downstream worker process.

Related Patterns