It doesn't take a lot of looking to find companies that claim to modernize your legacy applications by moving them to the cloud.
The question is: Does this actually modernize an application, or is it more akin to slapping a cheap paint job on an old jalopy?
It may look new and shiny, but it's still an old car.
It's still got squeaks, rattles, worn out upholstery, and a puddle of oil beneath the engine wherever it parks.
All the cloud players--Amazon, Microsoft, Google--are talking about "lift and shift" as a way to modernize apps by moving them to the cloud. If you're confused by the hype, you're not alone. What does lift and shift actually mean?
The simple answer: taking an application as-is and running it inside a VM on a cloud server.
What is the public cloud, actually? It's a set of services enabled by regional, physical data centers with hundreds or thousands of boxes, each with a CPU, disk, and memory. Those boxes run as VM hosts--so when you "buy" a cloud PC you're actually just instantiating a VM on a physical device, configured the way you want it.
That physical device may be running one, two, or 10 VMs, each seeming to be a dedicated computer.
This is the beauty of the cloud. You can access enormous amounts of computing power without owning any of the hardware. No updating the OS. No setting access protocols. No replacing failing blade disks.
All you need on your end is a credit card, a browser, and some bandwidth.
Applications built for a thin client (e.g., web applications) can be moved to the cloud by re-creating the web server's physical environment as an identical environment inside a cloud-hosted VM. Now you can throw away the physical server and simply connect to the IP address of the virtual instance.
This does provide benefits, particularly around physical access and security--SMB companies with servers in closets will especially benefit from hosting on Azure or AWS. They can dial capacity up or down easily, get built-in failover, and can easily and affordably take advantage of app monitoring and measurement tools like New Relic or Application Insights.
Thick clients are a different story. Although moving one to a cloud VM might appear identical to the "server in a VM" approach above, it's fundamentally a different (and worse) proposition. Applications that run on the desktop as native executables are the poster children for thick clients. Take Microsoft Excel, for example. It runs as a native Windows application, taking advantage of all the capabilities a physical Windows PC has: hardware access, the local file system, trusted code execution via macros, COM connections to other applications, and more. Basically none of this works inside a browser, which explains why Microsoft still has a LOT of customers using the desktop versions of Excel and the rest of the Office suite.
Many, many Windows apps exist in the real world--written in languages like Visual Basic 6.0, PowerBuilder 12.5, Delphi, and even C#--and they typically run as fat clients, with perhaps a network file server somewhere managing a database. To move those apps to the cloud, you have to re-create a Windows desktop, so you use Citrix or a similar remote desktop product to connect to VMs running the desktop app. Those VMs, in turn, connect to the file server (also running as a cloud instance).
While this is a super-quick way to get the app off the physical desktop, it doesn't solve many problems--and it creates a few new ones. For example: latency. Anyone who's used remote desktop knows how unresponsive the UX can be. Every mouse click and keystroke has to travel from the local client to the server, update the view there, and then the changed screen pixels have to be shipped back to the client. Ugh.
Also: money. Those Citrix licenses are not free, nor are they cheap. For a handful of users, you might not feel much pain. But for a virtualized app with a LOT of users, the recurring costs can really start to hurt.
So does any of this actually modernize the app? It just doesn't. It only moves the problem from one closet to another. The old code is still there. The old language is still there. The old runtimes are still there. If the app was written in PowerBuilder, none of these steps makes that app easier to update. Easier to find developers for. Easier to implement modern architectures or patterns. Easier to implement DevOps, CI/CD, containers. OK, maybe containers--a little.
In reality, these tactics are just another way to kick the can down the road.
Borrowing from Wikipedia, modernization entails "converting, rewriting, or porting of a legacy system to a modern computer programming language, software libraries, protocols, or hardware platform. Legacy transformation aims to retain and extend the value of the legacy investment through migration to new platforms."
Note "modern computer programming language, software libraries, protocols, or hardware platform." The lift and shift approach:

- doesn't change the programming language
- doesn't change the software libraries
- doesn't change the protocols
- doesn't change the hardware platform
This last bullet point may seem incorrect: isn't deploying on the cloud changing the hardware platform? No, not from the standpoint of modernizing a legacy app. Moving from a Windows server in your server room to a Windows server in Amazon's data center doesn't change the platform at all--only where the hardware is located and who owns it.
What about WebMAP?
Let's look at WebMAP through the lens of the four bullets above. Unlike lift and shift, it actually changes the language, the libraries, and the platform--which is what a modernized web app should look like.
Suppose your legacy VB6 app contains a typical data-access routine: open a recordset, add a record, update the database. Now you want to add a new flag to your database of seafood products: "Farmed" (bool). Once you've moved this VB6 app to the cloud, what does the code you have to modify look like? If you guessed that it's identical to the original, you're right.
Moving a desktop app to the cloud doesn't change the programming language, or the libs, or the platform.
You still have to open up the source code, find a VB6 dev, change the code, compile, fix bugs, rebuild the exe, and test.
Now imagine that after actually modernizing your application, the code you have to modify is C# running under ASP.NET. Now you've got a modern language (C#), a modern platform (ASP.NET), a modern pattern (model-view-controller), and a modern hardware platform (a scalable web server with a cross-platform, browser-based client).
In the migrated code, note the line "modConnection.rs.AddNew();". Helper classes ease the transition from old to new by implementing recordset-style syntax inside C#. No runtime--just source code classes.
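As a rough illustration of that idea--this is a hedged sketch with hypothetical names (`Recordset`, `Demo`), not WebMAP's actual helper classes--recordset-style syntax like `AddNew` and `Update` can be provided by ordinary C# source code:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical helper class: lets migrated VB6 code keep its familiar
// ADO-style recordset syntax while running as plain C#.
public class Recordset
{
    private readonly List<Dictionary<string, object>> _rows = new();
    private Dictionary<string, object> _current;

    // Mirrors ADO's rs.AddNew: start a fresh pending record.
    public void AddNew()
    {
        _current = new Dictionary<string, object>();
        _rows.Add(_current);
    }

    // Mirrors rs("FieldName") = value field access.
    public object this[string field]
    {
        get => _current[field];
        set => _current[field] = value;
    }

    // Mirrors rs.Update: in a real helper this would flush to the data store.
    public void Update() { }

    public int Count => _rows.Count;
}

public static class Demo
{
    public static int Run()
    {
        var rs = new Recordset();
        rs.AddNew();
        rs["ProductName"] = "Ikura";
        rs["Farmed"] = false;   // adding the new bool flag is one line
        rs.Update();
        return rs.Count;        // one record was added
    }
}
```

Because the helper is source code rather than a black-box runtime, the team can step through it, extend it, or gradually replace it with idiomatic data access.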
It's all about the code. Unless you have an app that will never need to be modified, you need to modernize the source code. Because it's getting harder and more expensive to find developers with legacy skills and knowledge. Because old code is garbage after years of uncoordinated attacks on structure, patterns, style, and methods. Because your AMC Javelin was cool when you got it but today it's just a clunker.
Even if you never modify it, you need to modernize the source code. Why? Because old code is a ticking time bomb: it invites attack through code-level holes like SQL injection or cross-site scripting, or through known--and as-yet-unknown--vulnerabilities in legacy platforms.
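To make "code-level holes" concrete, here is a minimal sketch (hypothetical names, assuming the classic string-building data layer found in many legacy apps) of why such code is injectable:

```csharp
using System;

// Hypothetical legacy-style data layer, for illustration only.
public static class LegacySql
{
    // Vulnerable pattern: user input is concatenated straight into the
    // SQL text, so the input can rewrite the statement itself.
    public static string BuildQuery(string name) =>
        "SELECT * FROM Products WHERE Name = '" + name + "'";
}

public static class Demo
{
    public static string Run()
    {
        // A hostile "product name" smuggles a second statement in:
        string evil = "x'; DROP TABLE Products; --";
        return LegacySql.BuildQuery(evil);
    }
}
```

The modern fix is a parameterized query (in ADO.NET, a `SqlCommand` with a `@name` parameter), where input travels as data rather than as SQL text--a retrofit that's far easier in a modernized C# code base than in a decades-old VB6 one.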
Many vendors promise to "modernize" your legacy apps by hoisting them up to the cloud. By all means, do it--it may save you some money and aggravation.
But it's not app modernization. If the code is the same, and the runtime is the same, it's just an old app up in the cloud.
No more, no less.