As an architect in a company that is essentially a “Microsoft shop”, most of my landscape (if not all of it) is made up of custom applications built on Microsoft technologies and the .NET Framework, with versions ranging from .NET 2.0 up to .NET 4.5, running alongside enterprise Microsoft applications like SharePoint, Dynamics, Office 365 and so on.
The custom applications landscape includes both desktop applications, built on a heavy-client philosophy and making use of .NET Remoting (long deprecated), and more modern web applications built on .NET 4.0.
As we move the landscape to the Microsoft Azure cloud, I'm facing some challenges in bringing old applications into the current century and making them ready for a cloud migration: deciding what will be migrated as lift and shift, where we will replatform, and which applications will be refactored.
With any of the above migration strategies, a major showstopper is, in many cases, the legacy technology used by the applications. Old versions of the .NET Framework, or obsolete technologies that prevent us from even trying a lift-and-shift approach, must be dealt with somehow, and a long-term approach to modernise and further support our business applications must be established.
The Microsoft .NET Framework was released in 2002, and since then the company has invested heavily in the entire landscape, to the point where about 70% of our business applications portfolio is based on .NET.
In 2019, Microsoft announced that its development effort will be focused on .NET Core, with the Windows .NET Framework moving to LTS (Long Term Support). The consequence is that only bug fixes and security patches will be provided, and no new features will be added to the .NET Framework.
All of our applications still meet their business requirements, and the business relies heavily on them to perform and bring revenue. At the same time, we are trying to accelerate application delivery and move to cloud-native architectures for the new applications we provide, and to decide how old applications will fit into the cloud landscape. We also intend to move away from traditional n-tier application design and use modern architecture patterns like APIs, microservices and so on.
From a technical perspective, there are some standardised steps to be taken when reviewing the applications landscape.
Identify end of life platforms and .NET Framework versions for each application
When talking about application support and compatibility with newer .NET versions, I've found the following criteria to be of significant importance:
- .NET Framework and .NET Core version support
- Windows OS end of life
- Cloud migration
.NET Framework and .NET Core version support
I've summarised Microsoft's official support policy for the .NET Framework, accurate at the time of writing, in the table below:
| MS .NET Version | Currently supported | End of life | LTS |
|---|---|---|---|
| .NET 1.0 and SPs | No | Yes, since 2009 | No |
| .NET 2.0 | No | Yes, since 2011 | No |
| .NET 3.0 | No | Yes, since 2011 | No |
| .NET 3.5 | Yes | No | Yes, up to 2028 |
| .NET 4.0 | No | Yes, since 2016 | No |
| .NET 4.5 | No | Yes, since 2016 | No |
| .NET 4.5.1 | No | Yes, since 2016 | No |
| .NET 4.8 | Yes | No | Yes, ongoing |
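When inventorying servers, the installed .NET Framework 4.x version can be determined from the registry value `HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full\Release`. Below is a minimal sketch of the documented mapping from that `Release` DWORD to a version string (only a subset of Microsoft's published minimum thresholds; check the official documentation for the full list):

```python
# Map the registry "Release" DWORD to a .NET Framework 4.x version.
# Thresholds are a subset of Microsoft's documented minimum values,
# ordered from newest to oldest so the first match wins.
RELEASE_THRESHOLDS = [
    (528040, "4.8"),
    (461808, "4.7.2"),
    (460798, "4.7"),
    (394802, "4.6.2"),
    (393295, "4.6"),
    (379893, "4.5.2"),
    (378675, "4.5.1"),
    (378389, "4.5"),
]

def framework_version(release: int) -> str:
    """Return the highest framework version whose threshold is <= release."""
    for threshold, version in RELEASE_THRESHOLDS:
        if release >= threshold:
            return version
    return "pre-4.5 (Release value not recognised)"
```

On the server itself, the `Release` value can be read with PowerShell or, from a Python inventory script, with the stdlib `winreg` module; the mapping function above is then enough to classify the machine against the LTS table.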
Microsoft .NET Core versions support
| .NET Core Version | Currently supported | End of life | LTS |
|---|---|---|---|
| .NET Core 1.0 | No | Yes, since 2019 | No |
| .NET Core 1.1 | No | Yes, since 2019 | No |
| .NET Core 2.0 | No | Yes, since 2018 | No |
| .NET Core 2.1 | Yes | No | Yes, up to 2021 |
| .NET Core 2.2 | No | Yes, since 2019 | No |
| .NET Core 3.0 | No | Yes, since March 2020 | No |
| .NET Core 3.1 | Yes | No | Yes, up to 2022 |
After compiling the tables above, I've decided that applications using a non-LTS version of the .NET Framework or .NET Core must be migrated to an LTS version. In some cases this is just a matter of recompiling and redeploying the application (the fortunate cases where the company owns the source code). Others, where we have lost the knowledge about the source code, requirements and so on, are perfect candidates for an assessment of their business value, consolidation of their functionality into another application or platform, or a plain rewrite from scratch, if the business revenue justifies it.
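This triage rule is easy to encode in a small inventory script. A sketch (the portfolio entries and target-framework strings are hypothetical; the LTS targets are taken from the tables above):

```python
# Flag applications that target a non-LTS .NET version and must be migrated.
# LTS targets per the tables above: .NET Framework 3.5 and 4.8,
# .NET Core 2.1 and 3.1 (expressed as target framework monikers).
LTS_VERSIONS = {"net35", "net48", "netcoreapp2.1", "netcoreapp3.1"}

def needs_migration(inventory):
    """Return the names of applications whose target framework is not LTS."""
    return sorted(app for app, tfm in inventory.items() if tfm not in LTS_VERSIONS)

# Hypothetical portfolio entries: app name -> target framework moniker.
portfolio = {
    "InvoicePortal": "net40",
    "CrmSync": "netcoreapp3.1",
    "LegacyDesktop": "net20",
}
```

Running `needs_migration(portfolio)` over a real inventory (extracted from project files or deployment records) gives the first cut of the migration backlog.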
The same applies to work-in-progress applications, where it is easier to switch to an LTS version of the .NET Framework or .NET Core because we were already using a newer version of .NET anyway.
For externally sourced applications, things are a bit trickier, because in some cases we have to wait for a new major version release from the vendor (if we are lucky enough that the vendor has something like this on its roadmap).
In other cases, the vendor may have a SaaS equivalent solution that might work for us (if all constraints allow the use of a SaaS solution instead of a hosted one).
If none of the above can be done, then it all comes down to a business decision.
Windows OS versions support
Currently, the picture for Windows OS support looks like this:
| Windows Version | Currently supported | End of life | LTSC |
|---|---|---|---|
| Windows Server 2019 | Yes, until January 2024 (extended until January 2029); also available as a container image | No | Yes |
| Windows Server 2016 | Yes, until January 2022 (extended until January 2027); also available as a container image | No | Yes |
| Windows Server 2012 R2 | No, ended October 2018 (extended until 2023) | Not yet | No |
| Windows Server 2012 | No, ended October 2018 (extended until 2023) | Not yet | No |
| Windows Server 2008 R2 | No, ended January 2015 (extended until January 2020) | Yes | No |
As you can see, Windows Server 2019 and 2016 are under support on the Long Term Servicing Channel, meaning we can rely on them for the next 5 to 8 years. Another important aspect is that they are also available as Windows container images, and we can take advantage of that by moving some applications to Windows containers, Docker and Kubernetes on Azure.
My end goal is to have most of the applications prepared to run in the Azure cloud, in a hybrid environment. Of course, there will still be things on premises: things that can't be moved to the cloud due to regulations, or that are just too expensive to refactor and run in the cloud.
Also, as we operate in many countries (across Europe, South America and Africa), another goal is data center consolidation, and for this to be accomplished efficiently, I need all applications modernised to be monitoring- and automation-friendly. (Remember: never send a human to do a machine's job.)
Now, having established a clear course of action regarding the .NET platform and Windows OS versions, I had to come up with a prioritised list of applications and a modernisation method for each of them.
The first step is to identify which modernisation approach works best for each application, a decision which is more than a technical one. First, you have to assess the business value of each application, then see which apps will benefit most from modernisation and what kind of modernisation should be applied. Some high-value, high-revenue applications are worth investing in, and for them it makes sense to go down the microservices path, but others will do just fine with a rehosting approach. The modernisation effort should also be estimated realistically and aligned with budgeting and the projects portfolio (for the execution phase, the TOGAF ADM might be a valuable tool).
In my case, for prioritising applications, I've used Gartner's definitions to create viewpoints through which the applications portfolio can be examined to identify potential modernisation candidates.
Gartner identifies three main application categories:
- Systems of record: capabilities with a clear focus on standardisation or operational efficiency.
- Systems of differentiation: business capabilities that enable unique processes or industry specific capabilities.
- Systems of innovation: new business capabilities to address emerging business needs, new business opportunities and modes.
Applications that fall under systems of differentiation or systems of innovation are sure candidates for the modernisation effort.
Actually, the discussion here is a bit broader, as many applications in those two categories are tightly coupled with some system of record, which in turn must be touched somehow to sustain the modernisation effort. For example, if a system of differentiation is a mobile banking application tightly coupled with the CRM (a system of record), and the CRM is an old monolithic application, then implementing, say, real-time push of data from the CRM to mobile banking means the CRM must be modified, or another layer added in between.
Another goal of modernisation was to reduce the operational overhead associated with application support, and here there are a couple of options:
- Move to a PaaS solution, switching the overhead to a service provider
- Migrate to a better cloud-integrated platform like Azure Kubernetes Service (AKS) and use all its available functionality for support, monitoring and maintenance
- Long shot: adopt a DevOps approach. This wasn't taken into account in the first phase of the modernisation, as the organisation was not ready
For assessing each application's current state and estimating the actions and effort required for modernisation, I've used the list of steps below.
1. In depth application assessment
- Review the current state of the application
- Do we have the latest version of the source code available? If yes, is it checked into a version control tool? Is it available and documented? Is there some knowledge about it in the dev teams?
- Is the source code functional? Can a running version of the application be built?
- Do we know all the 3rd party components and libraries used? Do we have them in a build-ready state?
- Are the 3rd party components available for an LTS .NET Framework version? Are they available for .NET Core? Are they still maintained and supported?
- Is the application using .NET Framework or .NET Core? And which version?
- On what version of Windows is it currently running? Does that version of Windows match our LTS requirements?
- What kind of application are we assessing? Is it a heavy-client desktop application, a web app, a Windows service or an IIS-hosted service?
- Assess the use of technologies that are not compatible with a move to the Azure cloud.
- Legacy applications often use technologies that were appropriate for on-premises intranet hosting, but those approaches stop working when you try to move the applications to the cloud. The issues I've faced are listed below:
| Issue | Description | Remediation |
|---|---|---|
| Local storage | Application stores files, either temporarily or long term, on a local file system, on hard-coded or configured paths | Remove absolute paths and migrate to blob storage (Azure Storage accounts) |
| Embedded logging | Application uses a non-standard logging system and/or writes logs locally | Use a centralised logging system; use standard .NET logging components; use Azure Application Insights as much as possible |
| Embedded configuration parameters | Configuration parameters are stored in config files or hard-coded in the application | Use a centralised config repository and modify the app to extract config parameters from it; use Azure Key Vault for secure storage of config parameters |
| Unmanaged web application state | State management is not dealt with, leaving web servers to handle it, or state is stored in databases | To take advantage of cloud horizontal scaling and elasticity, state must be taken out of the application itself and stored and managed separately; use Azure Cache for Redis or another in-memory caching tool for state management |
| SQL databases | Application uses SQL databases | In some cases, if the application is not overly complex or heavily transactional, it can make sense to change the database from a SQL-based one to a NoSQL DB like Cosmos DB for cloud hosting. A properly designed NoSQL data model can simplify the data access layer and can also reduce cloud hosting costs: instead of paying for a SQL instance, you just pay for Cosmos DB capacity. Of course, such a shift in DB technology brings other things to consider (whether the app really needs a NoSQL DB, dev team skills, cost estimation and so on) |
| Multicast broadcasting | Sending one-to-many messages on a network segment (haven't met this case yet) | Change to message queues |
| Localhost IP addresses | Application uses localhost or 127.0.0.1 addresses | Ensure that the application can look up its own hostname |
| Hostnames or DNS dependencies | Hard-coded addresses or URLs are used | As with config parameters, use a centralised repository or service discovery |
| Full trust code or admin user | Application requires elevated privileges to run | Identify what needs elevation and why, then change it |
| Application security | Application uses a built-in login mechanism | Use single sign-on; if moving to the cloud is an option, use Azure AD or Azure AD B2C |
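Taking state out of the web tier usually means hiding the session store behind a small interface, so the application can move from in-process state to an external store without touching the rest of the code. A language-neutral sketch of the pattern in Python (in our .NET applications the equivalent would be a distributed-cache abstraction backed by Azure Cache for Redis; the class names here are illustrative):

```python
# Session state behind a minimal store interface: the web tier keeps no state
# in process, so any instance can serve any request (horizontal scaling).
class SessionStore:
    def get(self, session_id: str) -> dict: ...
    def set(self, session_id: str, data: dict) -> None: ...

class InMemoryStore(SessionStore):
    """Stand-in for local development; a real deployment would use Redis."""
    def __init__(self):
        self._data = {}
    def get(self, session_id):
        return self._data.get(session_id, {})
    def set(self, session_id, data):
        self._data[session_id] = data

def handle_request(store: SessionStore, session_id: str) -> int:
    """Toy request handler: counts visits per session via the shared store."""
    state = store.get(session_id)
    state["visits"] = state.get("visits", 0) + 1
    store.set(session_id, state)
    return state["visits"]
```

Because the handler depends only on the `SessionStore` interface, swapping `InMemoryStore` for a Redis-backed implementation is a configuration change, not a rewrite.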
- .NET Standard is a formal specification of .NET APIs that are intended to be available on all .NET implementations; these APIs are included in modern .NET implementations as part of the Base Class Library (BCL).
- .NET Standard versions have two defining properties:
- Additive: .NET Standard versions are logically concentric circles; higher versions incorporate all APIs from previous versions.
- Immutable: once shipped, a .NET Standard version is frozen.
- I advise targeting the latest LTS version of .NET Core and also the latest LTS version of the .NET Framework (4.8).
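In practice, shared code that must serve both the LTS .NET Framework and .NET Core is usually placed in a .NET Standard class library, which both runtimes can consume. A hedged sketch of what such an SDK-style project file can look like (the property names are standard MSBuild; the specific target is a common choice, not a prescription):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- netstandard2.0 libraries are consumable from both
         .NET Framework 4.6.1+ (including 4.8) and .NET Core 2.0+ -->
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
</Project>
```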
- If we decide to rewrite an application for .NET Core, let's see which components are not compatible with .NET Core and will never be added to it:
- Windows Communication Foundation (WCF): a framework for building service-oriented applications. Using WCF, you can send data as asynchronous messages from one service endpoint to another. A service endpoint can be part of a continuously available service hosted by IIS, or it can be a service hosted in an application.
- ASP.NET Web Forms
- .NET Remoting: a framework with which you can invoke or consume methods or objects on a remote computer (the server) from your computer (the client). .NET Remoting was superseded by WCF in later versions of the .NET Framework and only remains a component of the .NET Framework for backward-compatibility purposes.
- Windows Workflow Foundation (WF): a technology that provides an API, an in-process workflow engine, and a rehostable designer to implement long-running processes as workflows within .NET applications.
If you are using one of the above technologies, then alternatives must be found, or a major rewrite of the application undertaken.
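The usual replacement path for .NET Remoting or WCF-style calls is to put the server-side logic behind a plain HTTP (REST) or gRPC endpoint and have clients call it over the network. A minimal language-neutral sketch of that pattern using only Python's standard library (in a real migration this would be an ASP.NET Core Web API; the endpoint and payload here are hypothetical):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Server side: the old in-process/remoted method becomes an HTTP endpoint.
class PriceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Hypothetical business method formerly invoked via .NET Remoting.
        body = json.dumps({"price": 42}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the sketch quiet

def start_server() -> HTTPServer:
    """Start the endpoint on a free local port, served from a daemon thread."""
    server = HTTPServer(("127.0.0.1", 0), PriceHandler)  # port 0 = any free port
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

def get_price(port: int) -> int:
    """Client side: a network call replaces the remoted object invocation."""
    with urlopen(f"http://127.0.0.1:{port}/price") as resp:
        return json.loads(resp.read())["price"]
```

The key shift is that the contract moves from a shared .NET type to a wire format (JSON over HTTP), which decouples client and server runtimes entirely.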
- Check whether any kind of Entity Framework is used
Official Definition: “Entity Framework is an object-relational mapper (O/RM) that enables .NET developers to work with a database using .NET objects. It eliminates the need for most of the data-access code that developers usually need to write.”
If you are not already using it, I strongly suggest adopting Entity Framework (or another O/RM), because it saves a lot of time in the development process and also helps standardise data access and manipulation across your applications.
As per the figure above, Entity Framework sits between the business entities (domain classes) and the database. It saves the data stored in the properties of business entities, and also retrieves data from the database and converts it to business entity objects automatically.
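To make the mapping idea concrete, here is a deliberately naive illustration of what an O/RM does, sketched in Python with the stdlib sqlite3 module (Entity Framework and other production O/RMs do far more, including change tracking and query translation; the `Customer` entity is hypothetical):

```python
import sqlite3
from dataclasses import dataclass
from typing import Optional

@dataclass
class Customer:
    """The business entity (domain class) the application works with."""
    id: int
    name: str

class CustomerMapper:
    """Maps Customer objects to rows and back, hiding the SQL from callers."""
    def __init__(self, conn: sqlite3.Connection):
        self.conn = conn
        conn.execute(
            "CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT)"
        )

    def save(self, customer: Customer) -> None:
        # Persist the entity's properties as a row.
        self.conn.execute(
            "INSERT OR REPLACE INTO customers VALUES (?, ?)",
            (customer.id, customer.name),
        )

    def find(self, customer_id: int) -> Optional[Customer]:
        # Retrieve a row and convert it back into a business entity.
        row = self.conn.execute(
            "SELECT id, name FROM customers WHERE id = ?", (customer_id,)
        ).fetchone()
        return Customer(*row) if row else None
```

Application code deals only in `Customer` objects; the data-access SQL lives in one place, which is the standardisation benefit described above.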
Entity Framework is also available for .NET Core, as per the table below.
End of part I.