Wednesday, August 9, 2017

Time to Replace and Rip? Yes!

Until recently, the concept of Rip and Replace carried a heavy dose of fear. My mother would rather have heard a litany of curses than hear “Rip and Replace” anywhere in sight of her data center. But, alas! Times change.

Informatica, IBM, Tibco, and others have gone the way of punched cards, COBOL, and Fortran. Data Warehouses have served well for a couple of decades now, but their overhead and slowness keep piling up tech debt as tech teams fail to keep pace with the business. Some businesses will keep trying to cajole their ancient software into mimicking today’s technologies as they plod forward, trying to remain competitive. They won’t succeed, though. You just can’t squeeze agility out of a pipe wrench.


I’m convinced that if my mother had met Data Virtualization, for instance, before she walked out the door, she would have been the first to jump in. She always embraced new ideas, but she also exercised a pragmatic skepticism.

Well Ma, it’s time. All the smart companies are doing it. Not Rip and Replace, really. It’s more like Replace and Rip.

What I’m talking about is an orderly modernization path that replaces, surprisingly quickly, those ancient approaches to data integration that businesses have poured so much effort into, not to mention money. Huge teams are still spending years integrating across multiple systems, and the cost of every small modification could feed an army. It’s time to get serious about the relatively new Data Virtualization (DV) paradigm. If you don’t know about DV, you had better wake up and check it out. And while you’re at it, take a look at Agile ETL™. The two together will take you quickly from what Gartner calls your “Mode 1” clunky IT infrastructure to a “Mode 2” that embraces mobile, IoT, cloud/hybrid, and all manner of digital.


Here’s a quick overview of Data Virtualization, sometimes referred to as a "Logical Data Warehouse": Instead of gathering data physically into a staging database or warehouse, a virtual data model is defined, and all of the participating data sources are logically aligned with transformations, validations, and business rules. The virtual models are published as ODBC, JDBC, OData, and other services. When the virtual data model is queried, the DV layer reaches out live to the sources, applies all the configured rules, resolves the query, and delivers the data to the calling program. Many companies are getting familiar with DV by leveraging it for their latest wave of Business Intelligence and Analytics.
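To make that concrete, here is a minimal sketch of what the consuming side looks like when a DV layer exposes a virtual model as an ODBC service. The DSN name, credentials, and the "customer_360" view are hypothetical, invented for illustration; the point is simply that the caller queries one logical model while the DV layer federates out to the live sources behind it.

```python
# Minimal sketch of the consumer's view of Data Virtualization.
# The DSN ("EnterpriseDV") and the virtual view ("customer_360") are
# hypothetical names, not tied to any specific product's catalog.
import pyodbc

# Connect to the DV layer exactly as you would to an ordinary database;
# behind this single endpoint sit the CRM, ERP, and other live sources.
conn = pyodbc.connect("DSN=EnterpriseDV;UID=analyst;PWD=secret")
cursor = conn.cursor()

# Query the virtual model. There are no staging tables; the DV layer
# federates the query to the underlying systems at call time, applies
# the configured transformations and rules, and returns the result.
cursor.execute(
    """
    SELECT customer_id, customer_name, open_orders, lifetime_value
    FROM customer_360
    WHERE region = ?
    """,
    "Gulf Coast",
)

for row in cursor.fetchall():
    print(row.customer_id, row.customer_name, row.open_orders, row.lifetime_value)

conn.close()
```

Notice that the calling program never knows, or cares, which physical systems supplied which columns.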

Here is a quick overview of Agile ETL: there is finally a technology to significantly streamline ETL, and it works by leveraging the same kind of federation used in DV to move data physically to another application or database. Stone Bond’s Enterprise Enabler® (EE) supports rapid configuration of the federation, validations, and business rules, which are executed live across all the sources, delivering the data in the exact form required by the destination, live, without any staging. Just think about the amount of infrastructure you can eliminate. Ma would be all over it!
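For readers who think in code, the sketch below illustrates the idea of federate-transform-deliver with no staging layer, using plain Python and SQLite stand-ins for the real sources and destination. It is not Enterprise Enabler’s actual API, and every table and column name here is invented for the example; it only shows the shape of the pattern.

```python
# Conceptual sketch of "Agile ETL": federation without staging.
# SQLite databases stand in for the real sources and destination.
# This is illustrative only -- not Enterprise Enabler's API.
import sqlite3

crm = sqlite3.connect("crm.db")       # source 1: customer master (assumed schema)
erp = sqlite3.connect("erp.db")       # source 2: order history (assumed schema)
target = sqlite3.connect("mart.db")   # destination application database

# Pull only what the destination needs, live from each source.
customers = {cid: name for cid, name in
             crm.execute("SELECT customer_id, customer_name FROM customers")}
orders = erp.execute(
    "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id"
).fetchall()

# Federate and transform in flight: join across sources, apply a
# business rule, and shape each record exactly as the destination expects.
rows = [
    (cid, customers.get(cid, "UNKNOWN"), round(total, 2))
    for cid, total in orders
    if total > 0  # example validation rule
]

# Deliver straight into the destination: no staging tables,
# no intermediate warehouse, no landing zone to maintain.
target.execute(
    "CREATE TABLE IF NOT EXISTS customer_revenue "
    "(customer_id INTEGER, customer_name TEXT, total_revenue REAL)"
)
target.executemany("INSERT INTO customer_revenue VALUES (?, ?, ?)", rows)
target.commit()
```

Everything between source and destination happens in flight, which is exactly the infrastructure you get to stop building and maintaining.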

So, there you are:  Rip out the old and Slip in the new. Or rather, Slip in the new and Rip out the old.

