How to bring fast data access to microservice architecture with in-memory data grids


Source: O’Reilly Radar

For stack scalability, elasticity at the business logic layer should be matched with elasticity at the caching layer.

Ever-increasing numbers of people are using mobile and web apps, and the average time spent in them continues to grow. That use comes with an expectation of real-time response, even during peak access times.

Modern, cloud-native applications have to stand up to all of this demand. In addition to user-initiated requests, applications respond to requests from other applications and handle data streamed in from sensors and monitors. Lapses in service add friction at customer touch points, cause sharp declines in operational efficiency, and make it difficult to capitalize on fleeting sales opportunities during periods of high activity, such as Cyber Monday.

What are your options for dealing with demand?

Scaling a large monolith dynamically on demand becomes impractical as the system grows. Meeting a spike like Cyber Monday by scaling up a clunky monolithic deployment, then scaling it back down once the extra capacity is no longer needed, rarely works in practice: as monoliths grow they become increasingly fragile, which severely limits what can safely be changed.
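
The alternative the article points toward is an elastic fleet of stateless services matched by an equally elastic caching layer. As a rough illustration only, the sketch below shows a stateless service reading and writing shared state through an in-memory data grid client. Hazelcast is assumed here purely as an example product, and the class name, map name, and grid addresses are hypothetical; the excerpt itself does not name a specific grid.

    import com.hazelcast.client.HazelcastClient;
    import com.hazelcast.client.config.ClientConfig;
    import com.hazelcast.core.HazelcastInstance;
    import com.hazelcast.map.IMap;

    public class ProductCacheService {
        public static void main(String[] args) {
            // Connect this stateless service instance to the shared grid cluster.
            // The grid-node addresses are placeholders for this sketch.
            ClientConfig config = new ClientConfig();
            config.getNetworkConfig().addAddress("grid-node-1:5701", "grid-node-2:5701");

            HazelcastInstance client = HazelcastClient.newHazelcastClient(config);

            // A distributed map partitioned across the grid; reads and writes stay in memory.
            IMap<String, String> products = client.getMap("products");

            products.put("sku-1001", "{\"name\":\"widget\",\"price\":9.99}");
            System.out.println("Cached entry: " + products.get("sku-1001"));

            client.shutdown();
        }
    }

Because the service keeps no state of its own, the service fleet and the grid cluster can each be scaled out or back independently, which is the elasticity match the subtitle calls for.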

Continue reading How to bring fast data access to microservice architecture with in-memory data grids.


