I am slightly hesitant to write this post, as it might attract some criticism. Nevertheless, I told myself there is nothing wrong with sharing my point of view (even though it might not be well received). In this post, I would like to share my personal experience with yesterday’s Monolithic and today’s Microservice architectures.

Yesterday’s Monolithic Application
20 years back, I was starting my career. At that point, I was working for a large financial corporation in N. America. This financial institution’s middleware platform was running on a CORBA/C++ platform. Since CORBA was becoming obsolete around that time, the technology vendor decided not to support it anymore. It was a huge risk for the enterprise to run its mission-critical middleware platform on unsupported technology. Thus, management decided to port the middleware application to a platform-independent technology stack: "SOAP" and "Java," which was considered the cool kid of the time.

There were 500+ services running on this CORBA/C++ stack. They ported all of those services to the SOAP/Java technology. It was one of the biggest porting efforts in the industry at the time. From then on, any new services developed to meet growing business demand were also written on this new platform. So, what’s the relationship between this technology migration and this post? Have I gone too far? Not really…

Astonishingly, all those 500+ services were built from one single code base and deployed as one single JVM instance – perfectly meeting today’s definition of a "Monolithic" architecture. But, interestingly, that JVM instance ran all 500+ services with only a 2 GB maximum heap size (-Xmx).
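For readers less familiar with JVM memory settings, here is a minimal sketch of what such a heap cap looks like at launch time. The application name and jar are hypothetical; only the -Xmx flag is the point:

```
# Hypothetical launch command for illustration only.
# -Xmx2g caps the maximum heap of the entire JVM (serving all 500+ services) at 2 GB.
java -Xmx2g -jar middleware-platform.jar
```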

When I tell this memory size to a few millennial engineers now, they can’t believe it. They ask: “Are you sure? Have you grown too old? Has your memory faded?” I am 100% sure, my memory hasn’t faded :-). To be fair, there were several of these 2 GB JVM instances servicing the incoming traffic. Many of these services handled business-critical and sensitive transactions like "List Transaction History," "Funds Transfer," "Get Account Details," "Get Customer Details"…. This middleware platform beautifully processed several million transactions a day, and that too with tight millisecond-level SLAs.

Today’s Microservice
Now, let’s fast forward 20 years. We are now in the 2021 world of Microservices, containers, and Kubernetes. In my personal career, after an unbelievable amount of struggle and pain, I progressed to build a *small*, profitable technology business. Our company specializes in building performance engineering tools (GCeasy, HeapHero, fastThread, yCrash). We are humbled to see the world’s premier enterprises use our toolsets.

Because of this current role, I get the opportunity to see the architectures and deployments of the large corporations that use our products. I see several enterprises either talking about switching to a "Microservices" architecture or already in the process of switching, and a few that have already made the switch. But one sharp observation here is: several of these microservice applications have pretty large memory sizes, in the range of 10 GB, 20 GB, … 100 GB. It’s getting harder to find modern enterprise applications that run under a 2 GB heap. Besides that, response times are also degrading, due to the many network hops between microservices and to garbage collection on these large heaps.
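For contrast with the 2 GB monolith above, here is a sketch of what a single modern microservice launch often looks like. The service name and the exact numbers are illustrative, not taken from any specific customer; the GC flags shown are standard HotSpot options:

```
# Illustrative only: one microservice granted a 20 GB heap,
# with a low-pause collector tuned to cope with GC work on that large heap.
java -Xmx20g -XX:+UseG1GC -XX:MaxGCPauseMillis=200 -jar account-service.jar
```

Multiply a line like this across dozens of services and the total memory footprint quickly dwarfs what the old monolith consumed.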

Does that mean modern applications are consuming more memory than their predecessors? Here is a study we published earlier highlighting the inefficiencies of one of the world’s most celebrated modern frameworks.

So, I am asking myself: Do modern/Microservice applications consume more resources than yesterday’s monolithic applications? Is the road we are heading down as an industry the right one?

Conclusion
Please don’t interpret this as lobbying against Microservice architecture. I very well see the benefits of Microservice architecture: rapid development, decoupled deployments, reduced test cycles, and polyglot programming. Here, I am only expressing concerns about growing memory sizes, degrading response times, and the associated computing cost and complexity. You are welcome to share your experiences/thoughts!