How hard could it be? A new Data Mart was being constructed for a national health care provider, and the client activity reports used on a monthly, quarterly, and yearly basis needed to be modified to use this new Data Mart and report this new aggregate data. I thought this project was going to be simple and straightforward. I had never been a part of a Business Intelligence remediation effort before, but how hard could it be? The logic was already complete. The reports were already built. The combination of both these elements was already tried, tested, and in production. All that needed to be done was rename some tables, update some column names, and update the database targets, right? Could it really be as simple as using Find & Replace?
Despite my initial expectations, this project was full of challenges. In order to minimize costs and maximize efficiency, organizations often conduct multiple projects in parallel, and so it was with this project. The Data Mart and its data-loading processes were being tested and finalized in parallel with the reports being modified and tested. Simply having data to test with was an initial hurdle that often halted development. Additionally, having a moving target as a data source presented some unique and often frustrating challenges. Tables would truncate, seemingly at random, producing unexpected results during development and testing. Keeping the entire team up to date on the status of the Data Mart became a day-to-day necessity.
Another challenge with modifying in-production reports occurs when those reports are receiving incremental production improvements. Bugs are going to be fixed and new enhancements are going to be implemented. Oftentimes, the BI support team is separate from the BI remediation team. Communication between the two is critical to ensure that any updates to the current production reports make it into the future production reports. Otherwise, there will be rework, and no one likes rework.
Don't forget testing!
Testing revamped reports from a new data source has its own set of challenges. Presumably, the easiest approach would have been to run the old reports and the new reports with the same set of parameters and compare the results for a 1:1 match. Unfortunately, the limited availability of current production data in the TEST version of the new Data Mart, as well as inherent differences between the old and new marts, prevented this direct, 1:1 comparison. Instead, modified SQL scripts had to be created from the report logic to pull the old data using the 'new' logic. Using those scripts, metrics from each report were hand-selected and compiled manually to provide a way to compare the legacy reports to the new ones.
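The manual comparison step above can be sketched in a few lines of Python. This is only an illustration: the metric names, values, and the `compare_metrics` helper are all hypothetical, not taken from the actual project.

```python
# Compare hand-compiled metrics from a legacy report against the repointed
# report. All metric names and values below are invented for illustration.

def compare_metrics(legacy, new, tolerance=0.0):
    """Return a list of (metric, legacy_value, new_value) mismatches."""
    mismatches = []
    for metric, legacy_value in legacy.items():
        new_value = new.get(metric)
        if new_value is None or abs(new_value - legacy_value) > tolerance:
            mismatches.append((metric, legacy_value, new_value))
    return mismatches

legacy_metrics = {"Active Cases": 1250, "Closed Cases": 430}
new_metrics = {"Active Cases": 1180, "Closed Cases": 430}

for metric, old, new in compare_metrics(legacy_metrics, new_metrics):
    print(f"{metric}: legacy={old}, new={new}")
```

Any mismatch the sketch flags is exactly the kind of difference that has to be investigated and explained before sign-off.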
As with all changes to the status quo, an important goal for any IT-related project is user acceptance. It is important to account for User Acceptance Testing (UAT) when scheduling the project. For UAT to be successful, the end users of the reports must be comfortable with the data they are seeing before approving the reports for production, and any large differences in the metrics must be explainable. The new Data Mart, for instance, requires a 1:1 relationship between memberID and caseID, whereas the old mart allowed a 1:M relationship. The resulting drop-off in the Active Cases metric for a given population over a given time period had to be explained to the users conducting UAT before the reports could receive a passing grade.
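The effect of that cardinality change on the Active Cases metric can be shown with a small sketch; the member/case rows below are invented for illustration:

```python
# Illustrate how moving from a 1:M member-to-case relationship (old mart)
# to a 1:1 relationship (new mart) lowers the Active Cases count.
# The sample data is invented for illustration.

old_mart_rows = [
    ("M001", "C100"), ("M001", "C101"),                    # one member, two cases
    ("M002", "C200"),
    ("M003", "C300"), ("M003", "C301"), ("M003", "C302"),  # one member, three cases
]

# Old mart: every member/case row counts as an active case.
active_cases_old = len(old_mart_rows)

# New mart: one case per member, so the count collapses to distinct members.
active_cases_new = len({member for member, _ in old_mart_rows})

print(active_cases_old)  # 6 active cases under 1:M
print(active_cases_new)  # 3 active cases under 1:1
```

The metric drops even though no data was lost; explaining precisely this kind of difference is what earned the reports a passing grade in UAT.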
Despite its initial appearance of being fairly straightforward, a BI Repoint project presents its own set of unique challenges. Developing and testing reports in parallel with the data mart requires both patience and active communication between team members. The testing strategy must be carefully developed to account for differences in data availability between the legacy and future data marts. Finally, any drastic differences in established metrics must be accounted for and explainable in order for the new reports to find acceptance among the users and enter production.