Yet, as digital needs have evolved, many wealth management firms have struggled to scale core applications to keep pace with demand. The result has been a rise in "bolted-on" data structures as applications have continued to grow, many of which rely on on-premises software. Many firms find themselves saddled with unwieldy data structures that make data difficult, if not impossible, to access. These structures also require undue amounts of time, resources, and project dollars simply to keep the existing physical systems operating.
Despite these challenges, wealth managers recognize the substantial value in data. Data is not only the backbone of their applications; it also drives future growth and product innovation.
In the pursuit of making customer data actionable, wealth management firms should first address any challenges they face with their data management structures and practices. Data refactoring and service reengineering, which make data easily accessible for external client insights and internal product innovation, are necessary before going to market.
Innovation-focused wealth management firms are coming to an important realization: decomposing data into microservices and APIs at the business-domain level points to a future in which a marketplace of data and services can drive value and insights both internally and externally. This is a short-term investment that can drive long-term gains.
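To make the idea of domain-level decomposition concrete, here is a minimal, hypothetical sketch: a flat "monolithic" client record is split along business-domain lines, with each slice destined for its own service and API. The domain names and fields are illustrative assumptions, not a prescribed model.

```python
# Hypothetical monolithic client record that today lives in one shared schema.
monolithic_record = {
    "client_id": "C-1001",
    "name": "A. Example",
    "account_balance": 250_000,
    "risk_profile": "moderate",
    "advisor_id": "ADV-7",
}

# Each future domain service owns only the fields within its domain.
# These domain boundaries are illustrative.
DOMAIN_FIELDS = {
    "profile":   ["client_id", "name"],
    "portfolio": ["client_id", "account_balance", "risk_profile"],
    "advisory":  ["client_id", "advisor_id"],
}

def decompose(record: dict) -> dict:
    """Split one flat record into per-domain payloads, one per future API."""
    return {
        domain: {field: record[field] for field in fields}
        for domain, fields in DOMAIN_FIELDS.items()
    }

services = decompose(monolithic_record)
```

Once each slice sits behind its own API, internal teams and external consumers alike draw from the same domain-owned source rather than reaching into a shared schema.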
More than 70% of Fortune 500 companies still rely heavily on mainframes, but many recognize the benefits of a cloud or hybrid cloud environment for managing data. And with a migration, those firms have an opportunity to reengineer systems. The challenge becomes how to replace the engines on a plane while it is still in the air: taking a system offline for six months while it is reengineered isn't acceptable in most cases. This is where a clear target-state architecture, with microservices and APIs on the front end and data organized by domains with bounded contexts on the back end, becomes vitally important.
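The target-state shape described above can be sketched in a few lines. In this hypothetical registry, each business domain exposes its own API routes and owns exactly one data store, and a resolver enforces the boundary by mapping every request path to a single owning domain. All names here are illustrative.

```python
from dataclasses import dataclass

@dataclass
class DomainService:
    name: str
    routes: list[str]   # API endpoints this domain exposes on the front end
    datastore: str      # the single back-end store this domain owns

# Illustrative domain registry: one owner per slice of data.
REGISTRY = {
    "accounts": DomainService("accounts", ["/accounts", "/accounts/{id}"], "accounts-db"),
    "holdings": DomainService("holdings", ["/holdings/{account_id}"], "holdings-db"),
    "advisors": DomainService("advisors", ["/advisors/{id}/book"], "advisors-db"),
}

def owner_of(path: str) -> str:
    """Resolve an API path to the one domain that owns it.

    Raises KeyError for paths outside every context boundary, so
    cross-domain access must go through the owning domain's API.
    """
    domain = path.strip("/").split("/")[0]
    if domain not in REGISTRY:
        raise KeyError(f"no domain owns {path}")
    return REGISTRY[domain].name
```

The design point is that no service reaches into another domain's data store directly; the registry makes ownership explicit, which is what allows front-end systems to migrate domain by domain rather than all at once.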
Once the vision is set, an iterative cloud migration with in-kind testing between existing and target services comes into play. As services are built and data is transformed, front-end systems can begin to migrate methodically to the target state. The safety net is in replicating data back on-premises so that a canary approach to service migration may be taken (i.e., move 10% of users to the target system, then fail back if smoke is detected). This is a heavy lift in the best of circumstances, but bringing expertise with a proven track record to the table can greatly mitigate the risk of this type of effort.
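The canary approach above can be sketched as a small routing function, assuming a deterministic hash to bucket users and a single fail-back flag flipped by monitoring. This is a simplified illustration, not a production traffic manager.

```python
import hashlib

CANARY_PERCENT = 10        # move 10% of users to the target system
failback_engaged = False   # flipped to True when monitoring detects smoke

def route_user(user_id: str) -> str:
    """Return 'target' or 'legacy' for this user.

    A stable hash means the same user always lands in the same bucket,
    and engaging fail-back sends everyone back to the legacy system.
    """
    if failback_engaged:
        return "legacy"
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
    return "target" if bucket < CANARY_PERCENT else "legacy"
```

Because data is replicated back on-premises, flipping `failback_engaged` is safe: legacy serves from a current copy, and the canary percentage can be raised again once the smoke clears.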
Long-Term Benefits
There are many benefits to putting the right data closer to customers and end users. In the future, this will allow large institutions to be nimbler as they develop applications, localize problem resolution, and accelerate innovation. Benefits will be seen in dollars saved and new revenue generated. The days of standing up entire initiatives just to reach the right data set and establish ownership are waning. Winning organizations will be those that can go to market with the speed to meet the demands of data-hungry customers.