M.I. & Data Warehousing Case Study

Client: Top 5 global banking group

Objective: To deliver an integrated marketing data warehouse and a suite of MI reports

Requirements:

  • Data warehousing processes must be automated and must complete before the start of the business day
  • The data warehouse must integrate data from the call centre, web analytics, the core operational system, direct mail contact histories, and business targets and forecasts
  • The end-to-end process must be delivered with existing software (SAS/BASE and Microsoft Office); there must be no requirement for new technology
  • The end result must be data views that can be exported directly into MS Office for reporting, with no additional processing required

We followed our tested and proven road map for data warehouse and M.I. system delivery:

  • Gather reporting requirements from data consumers and review existing processes
  • Identify and implement quick tactical wins to improve efficiency and free resources from manual work
  • Identify appropriate strategic data warehousing software (a requirement of the project was to work with existing software)
  • Design the data warehouse data model
  • Develop data mining and ETL processes to populate the data model

For more information on our approach to data warehousing and M.I., please visit our data warehousing consultancy services and M.I. consultancy services pages.

This case study highlights the value of this approach, and why disciplined design before implementation is so important.

  • Gather reporting requirements from data consumers and review existing processes

The client for this project was an extremely large organisation, with outstanding analytic and M.I. capabilities across the group. However, the marketing team had been neglected and did not have ready access to the data they needed to measure the success of their marketing effort. The main issues were:

  • Other areas of the business were satisfied with weekly M.I. because the business drivers they influenced had longer-term impacts on performance. The marketing department required daily M.I.: they wanted to know the following day what impact a DM campaign sent to 50,000 people had had on sales volumes.
  • There was no single version of the truth in the business.  The processes that the marketing team had developed produced different results to the processes that the commercial and finance teams used.  As a result, the marketing figures were considered “wrong” and lacked credibility.
  • Processes were extremely resource-intensive and required a lot of manual intervention. This was undesirable for the reporting and insight manager, because it took his resources away from producing insights that added value to the business, and undesirable for the marketing community, because the processes took a long time to run and M.I. was often late.
  • Processes were messy and not well understood. As so often happens, processes had been handed down from one generation of analysts to the next, with new pieces added by each generation, leading to complex and fragmented code that was not fully understood by its current owner.

Details of the remaining steps in the road map:

  • Identify and implement quick tactical wins to improve efficiency and free resources from manual work

Having listened to the woes of the consumers and producers of the M.I., and before setting off on a delivery plan for a full production data warehouse and M.I. system, we identified a set of fast changes we could make to the processes that would deliver immediate benefit.

  • Review and improve core data mining and ETL code. The code had been augmented and developed over the years without a great deal of change control. By improving only the code used to generate the results, without changing the underlying processes, we were able to cut the weekly M.I. run time from 3 days to 0.5 days.
  • We were also able to automate the entire process and run it as a morning batch job, so that it required no human intervention and was complete before the start of each business day.
  • We also identified that the data was being exported to MS Excel too early in the process, with a lot of data manipulation and calculations being performed in Excel with worksheet functions and VBA macros. We changed the processes to keep the data in SAS/BASE until it was in the summarised form required for final reporting, which further reduced M.I. production time and manual intervention (a sketch of this pattern follows this list).
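
The sketch below illustrates the pattern we adopted, using hypothetical library, dataset and variable names: the data stays in SAS/BASE until it is fully summarised, and only the finished summary is exported for reporting in MS Office.

  /* Hedged sketch - library, dataset and variable names are hypothetical. */
  libname mart "/data/marketing_dw";

  /* Summarise the detailed response data in SAS rather than in Excel. */
  proc summary data=mart.daily_responses nway;
     class report_date campaign_code channel;
     var applications sales;
     output out=work.mi_summary (drop=_type_ _freq_) sum=;
  run;

  /* Export the finished summary for reporting in MS Office -  */
  /* no further worksheet functions or VBA required.           */
  proc export data=work.mi_summary
     outfile="daily_mi_summary.csv"
     dbms=csv replace;
  run;

Because the summary is produced entirely in SAS, the exported file needs no further processing before it is used for reporting.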

With 3 days' effort, we were able to save the data & insight team 4 days per week of human resource, which could then be re-invested into supporting delivery of the production data warehouse. A great win!

  • Design the data warehouse data model

The client in question is a very large business, with several of the biggest UK brands in its portfolio, each with a range of financial products on offer.  New brands and products are added regularly, and so scalability was key to the design.

Also, given that this was a marketing data warehouse, and marketing campaigns are added regularly, the data model needed to properly identify dimensional elements such as campaigns, campaign categories, channels, brands and products, and join these to the business measures (sales, applications, etc.).

A Star Schema Dimensional Model was the best fit for the requirements.

Example Data Model

[Figure: star schema dimensional data model example]

The example above allows data views to be overlaid on the data model that report applications, sales and conversion (the ratio of sales to applications) against any combination of the attached dimensions.
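
As a hedged illustration of such a view (all library, table and column names below are hypothetical, not the client's actual model), a SAS SQL view over a star schema of this shape might look like this:

  /* Hedged sketch - fact and dimension names are hypothetical. */
  libname mart "/data/marketing_dw";

  proc sql;
     create view mart.v_campaign_performance as
     select d.report_date,
            c.campaign_name,
            c.campaign_category,
            ch.channel_name,
            b.brand_name,
            p.product_name,
            sum(f.applications)                          as applications,
            sum(f.sales)                                 as sales,
            (calculated sales / calculated applications) as conversion
     from mart.fact_marketing f
          inner join mart.dim_date     d  on f.date_key     = d.date_key
          inner join mart.dim_campaign c  on f.campaign_key = c.campaign_key
          inner join mart.dim_channel  ch on f.channel_key  = ch.channel_key
          inner join mart.dim_brand    b  on f.brand_key    = b.brand_key
          inner join mart.dim_product  p  on f.product_key  = p.product_key
     group by d.report_date, c.campaign_name, c.campaign_category,
              ch.channel_name, b.brand_name, p.product_name;
  quit;

Queries against the view can then re-summarise applications and sales over any combination of these dimensions and re-derive conversion from the totals.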

  • Develop data mining and ETL processes to populate the data model

The data sources required to populate the data model were diverse, ranging from telephony statistics available on an online system and direct access to a DB2 mainframe, to flat files submitted by finance. Efficient, automated ETL processes were built for each source, pulling the data from its root source (for example, downloading data from the web) and loading the daily increments into the data warehouse. Error trapping and logging were built into the system so that, if unexpected data was found, the process was halted for investigation.
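
As a hedged sketch of one such load (the URL, file layout, dataset names and validation rule are all hypothetical), the pattern for a web-based source was roughly: pull the file down, validate the increment, halt the batch if anything unexpected appears, and only then append to the warehouse.

  /* Hedged sketch - source URL, layout and names are hypothetical. */
  libname mart "/data/marketing_dw";

  /* Pull the daily telephony extract down from the web. */
  filename webstats url "http://example.com/telephony/daily_stats.csv";

  data work.telephony_stage;
     infile webstats dsd firstobs=2 truncover;
     input report_date :yymmdd10. calls_offered calls_answered;
  run;

  /* Count rows in the increment and rows outside the expected date. */
  proc sql noprint;
     select count(*) into :n_rows from work.telephony_stage;
     select count(*) into :n_bad
        from work.telephony_stage
        where report_date ne today() - 1;
  quit;

  /* Halt the batch for investigation if the increment looks wrong. */
  %macro check_increment;
     %if &n_rows = 0 or &n_bad > 0 %then %do;
        %put ERROR: Unexpected telephony data - halting load for investigation.;
        %abort cancel;
     %end;
  %mend check_increment;
  %check_increment

  /* Append the validated daily increment to the warehouse. */
  proc append base=mart.fact_telephony data=work.telephony_stage;
  run;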

The result:

An efficient, automated set of ETL processes to pull the business data into a single, structured data warehouse, to produce daily M.I. reports that matched results from other business areas.

Please get in touch with us to discuss your requirements.
