The Strategies for Converting Large, Tightly Integrated VFP Legacy Systems

By Dr. Ronald Mueller | Published on August 2nd, 2017 | Last updated on June 25th, 2019

Over the last couple of years, we have had the opportunity to convert several ultra-large VFP legacy applications (~1-2M LOC) that were also very tightly integrated. Such systems pose a huge challenge for developers: the various components of functionality are closely linked, and there is no distinct separation of database tables by module or major area of functionality, because the code and the database are deeply interconnected.

For small-to-modest sized VFP systems (<=500K LOC), we recommend a ‘do-it-all’ two-phase approach: a single, one-time database migration with no batch updates and no downtime. First, we write scripts to migrate all of the data to the new database; then the client application is modified to work with the new database. This approach provides the shortest timeframe to a fully complete new system, with provision for enhancements once it is up and running. Best of all, only one round of user acceptance testing (UAT) is required before implementation.
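To make the two-phase migration concrete, here is a minimal sketch of a one-time data migration script. The article targets .NET, but the sketch uses Python purely for brevity; the ODBC driver names, connection strings, and table and column names (customers, Customers, and so on) are illustrative assumptions, not details from any actual project.

```python
# Minimal sketch of a one-time table migration: read the legacy VFP
# tables over ODBC and insert into the new SQL Server database.
# All paths, table names, and column names are hypothetical.
import pyodbc

# Source: free VFP tables in a directory (assumed path; the legacy
# Microsoft VFP ODBC driver is 32-bit only).
vfp = pyodbc.connect(
    r"Driver={Microsoft Visual FoxPro Driver};"
    r"SourceType=DBF;SourceDB=C:\legacy\data;"
)
# Target: the new database (assumed connection string).
sql = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=localhost;Database=NewApp;Trusted_Connection=yes;"
)

src, dst = vfp.cursor(), sql.cursor()
src.execute("SELECT cust_id, cust_name, balance FROM customers")
for row in src.fetchall():
    # Straight copy; a real script would also map VFP-specific types
    # (empty dates, logical fields, memo fields) to SQL Server types.
    dst.execute(
        "INSERT INTO Customers (CustId, CustName, Balance) VALUES (?, ?, ?)",
        row.cust_id, row.cust_name, row.balance,
    )
sql.commit()
```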

For really large VFP systems, we first assess the application to see whether there are independent modules that can be converted one at a time. This method lets the client start taking advantage of the new application as each independent module is converted. It also gives clients a much clearer picture of the new application and allows them to request changes well before development is complete. The module-based approach is only workable when there are individual modules that can be cleanly separated from the old application.

For very large systems that are also tightly integrated, the above approaches do not apply directly. We weigh three strategies for systems that are both very large and tightly interconnected.

Do-it-all two-phase approach: The ‘do-it-all’ two-phase approach (the first strategy referred to earlier) is a one-time database migration, with no downtime and no batch updates required. At this scale it is not practically feasible: the full conversion cycle may take approximately two years to complete, which is far too long for any client to wait before seeing at least some part of the new system up and functional.

Multi-phase, module-based approach: The second strategy is a multi-phase, module-wise migration in which we convert the application one module at a time. This is a workable deployment strategy: the new and old applications run in parallel without data conflicts. A batch application has to be built to synchronize data between the new module’s database and the VFP application’s database, updating the tables on both sides at frequent intervals (a sketch follows below). The batch application is no longer required once all of the modules in the VFP application have been converted to .NET and the new application is fully functional.
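As an illustration of the batch update described above, the sketch below pulls rows changed in a VFP table since the previous run and upserts them into the new database; the reverse direction would mirror the same logic. It assumes each synchronized table carries a last-modified timestamp column (which may have to be added to the VFP tables), and all table, column, and function names are hypothetical.

```python
# Sketch of one direction of the batch synchronization: VFP -> new DB.
# Assumes pyodbc cursors for both databases and a 'modified' timestamp
# column on the VFP side; all names are hypothetical.
def sync_table(vfp_cur, sql_cur, since):
    vfp_cur.execute(
        "SELECT order_id, status, modified FROM orders WHERE modified > ?",
        since,
    )
    for row in vfp_cur.fetchall():
        # MERGE acts as an upsert, so re-running the batch is safe.
        sql_cur.execute(
            """
            MERGE Orders AS t
            USING (SELECT ? AS OrderId, ? AS Status) AS s
            ON t.OrderId = s.OrderId
            WHEN MATCHED THEN UPDATE SET Status = s.Status
            WHEN NOT MATCHED THEN INSERT (OrderId, Status)
                 VALUES (s.OrderId, s.Status);
            """,
            row.order_id, row.status,
        )

# e.g., run from a scheduled job, passing the previous run's timestamp:
#   sync_table(vfp.cursor(), sql.cursor(), last_run_timestamp)
```

A scheduled job would call such a routine for every shared table in each direction, recording the run timestamp so that the next pass only touches rows changed since.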

Sequence-based selected-functionality approach: The third strategy migrates selected functionality in a multi-phase approach (similar to the second strategy referred to earlier), but ordered by a planned sequence of the selected functionalities rather than by module boundaries. Here too we need a batch program for data synchronization between the VFP database and the new database, but its scope now depends on which functionality has been converted (see the sketch after this paragraph).
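One simple way to tie the batch program to the conversion sequence is an ordered list of functionality slices, each naming the tables it shares with the still-live VFP code. This is only an illustration of the idea under assumed names; the point is that the set of tables needing two-way sync is determined by which slices have already gone live.

```python
# Hypothetical conversion sequence: each functionality slice lists the
# tables it shares with functionality still running in VFP.
CONVERSION_SEQUENCE = [
    {"slice": "order entry", "shared_tables": ["orders", "order_lines"]},
    {"slice": "invoicing",   "shared_tables": ["invoices", "payments"]},
    {"slice": "inventory",   "shared_tables": ["stock", "warehouses"]},
]

def tables_to_sync(converted_slices):
    """Shared tables of already-converted slices need two-way sync
    until the whole VFP application is retired."""
    return [
        table
        for step in CONVERSION_SEQUENCE
        if step["slice"] in converted_slices
        for table in step["shared_tables"]
    ]

# After the first slice goes live, only its tables need the batch sync:
print(tables_to_sync({"order entry"}))  # ['orders', 'order_lines']
```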

It is evident from the above that combining the second and third strategies is the best possible option. These two strategies allow the client to take a firsthand look at the new application even before the conversion process is complete. Laying out the best sequence of modules and functionality before beginning the project is key to ensuring a seamless migration.

Please contact us to learn more about how we can make this work.

Dr. Ronald Mueller
Founder & Chairman at Macrosoft Inc
Ron is Chairman and Founder of Macrosoft, Inc. Ron heads up all company strategic activities and directs day-to-day work of the Leadership Team at Macrosoft. Ron is also Macrosoft’s Chief Scientist, defining and structuring Macrosoft’s path forward on new technologies and products, such as Cloud, Big Data, and AI. Ron has a Ph.D. in Theoretical Physics from New York University, and worked in physics for over a decade at Yale University, The Fusion Energy Institute in Princeton, NJ, and at Argonne National Laboratory. Ron also worked at Bell Laboratories in Murray Hill, NJ, where he managed a group on Big Data, including very early work on neural networks. Ron has a career-long passion for ultra-large-scale data processing and analysis, including predictive analytics, data mining, machine learning, and neural networks.