Blogpost

Unlocking the Strategic Potential of Big Data

KEY FACTS

  • Big Data presents both challenges and potential

  • Current benefits come from cost reduction, process acceleration, and the analysis of customer needs

  • In addition, Big Data is an investment with future benefit potential

  • A comprehensive Big Data strategy is essential for organizations

  • Big Data architecture and technology are the foundation of this strategy

1. CHALLENGES AND POTENTIAL

Financial institutions are currently faced with the challenge of leveraging the potential of Big Data. On the one hand, they are driven by factors such as decreasing margins and increasing competition in environments characterized by low interest rates and rising regulatory requirements. On the other hand, financial institutions seek to address the growing potential of digitalization and to integrate it into a comprehensive digitalization strategy. By keeping an eye on both sides – the requirements for increased efficiency as well as the chance to leverage business potential – financial institutions can use Big Data to create benefits in terms of both the cost and the speed of doing business, and can thereby operationalize Big Data as a competitive advantage.

At the moment, banks are employing Big Data technologies in just a few specific areas, even though they are collecting increasing amounts and types of data from different sources through the use of multichannel strategies. The use of this data is currently still very limited, as the analysis and evaluation of large amounts of data – hence the term “Big Data” – poses a significant challenge to all organizations. At the same time, the growing importance of Big Data is increasing the pressure on established market participants to act. Some describe this as a structural transformation occurring in the digital world, one that is greatly expanding the importance of data in comparison with hardware and software.

Although Big Data is increasingly being used across different industries, in its current state of development it remains an innovative technology whose benefit is only partially realized and which serves as an investment in the future. In order to work towards the integration of Big Data, organizations need a comprehensive Big Data strategy.

2. CASE STUDIES

One prerequisite for the comprehensive use of Big Data is a strategy that makes technological possibilities consistently usable throughout different areas of a business. This strategy must increase data transparency, facilitate access to relevant data, and increase predictability in specific areas of application. The areas in which users can leverage the potential of Big Data currently include:

  • Cost reduction

  • Acceleration of processes

  • Analysis of customer needs

 

Figure 1: Business potential through the use of Big Data technologies

Cost reduction

TESCO, the multinational grocery and general merchandise retailer, uses internal and external data to make predictions about customer demand for individual stores. The modelling software, based on a 100-terabyte data warehouse, takes into consideration the full range of data, from TESCO’s customer loyalty program to the local weather report. The overall result is a reduction in wasted stock worth about £100m per year.

Process acceleration

Singapore-based bank UOB (United Overseas Bank) uses in-memory technology combined with an analysis solution from SAS for its bank-wide risk calculation. Taking into account risk spread across 45,000 different financial instruments and more than 100,000 market parameters, the software calculates the overall risk by performing over 8.8 billion individual value-at-risk (VaR) calculations. Previously, calculating the effects of changing market parameters on the bank’s risk took approximately 18 hours; thanks to Big Data analysis, this process has been cut to minutes. The timeliness of the results allows the bank to react rapidly to market changes and to evaluate different scenarios and their respective risks in advance.
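The idea behind such a VaR calculation can be illustrated with a toy Monte Carlo sketch. This is not UOB’s actual system – the portfolio, position sizes, and volatilities below are entirely hypothetical – but it shows why the number of individual calculations grows so quickly: every scenario revalues every position.

```python
import random

def simulate_portfolio_var(n_scenarios=100_000, confidence=0.99, seed=42):
    """Toy Monte Carlo value-at-risk: simulate portfolio P&L under random
    market scenarios and take the loss quantile at the given confidence."""
    rng = random.Random(seed)
    # Hypothetical portfolio: position sizes and daily volatilities.
    positions = [1_000_000, 500_000, 250_000]
    vols = [0.02, 0.05, 0.10]
    pnls = []
    for _ in range(n_scenarios):
        # One scenario = an independent random shock per instrument.
        pnl = sum(pos * rng.gauss(0, vol) for pos, vol in zip(positions, vols))
        pnls.append(pnl)
    pnls.sort()
    # VaR = loss that is not exceeded with the given confidence (a positive number).
    return -pnls[int((1 - confidence) * n_scenarios)]

var_99 = simulate_portfolio_var()
```

With 45,000 instruments and 100,000 scenarios, the same loop structure already implies billions of individual valuations, which is why in-memory, massively parallel execution makes the difference between hours and minutes.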

Analysis of customer needs

Otto, traditionally the world’s largest mail order company and currently one of the biggest e-commerce companies, is using Big Data to forecast product return rates based on predictive analytics technology from software provider Blue Yonder (in which it holds a 50% stake). By tailoring its range of goods to customer needs and improving its inventory forecast by as much as 30%, the business has been able to reduce return rates and overstocking. The software is fed 300 million data records each week and produces approximately 1 billion predictions on product demand and sales each year. The result is a range of goods directly aligned with customer needs.

3. ELEMENTS OF THE BIG DATA STRATEGY

In order to make use of existing potential and to lay the foundation for the aforementioned and future areas of application, organizations must develop a dedicated Big Data strategy. In general, such a strategy consists of five elements:

  • Big Data technology and architecture

  • Data management and data governance

  • Security and conformity management

  • Skills and qualifications

  • Optimization of business processes

 

Figure 2: 5 Elements of Big Data Strategy

3.1 Big Data Technology and Architecture

Big Data technology and architecture are particularly important elements of the foundation of a Big Data strategy. The amount of data available is growing exponentially. In the 1990s the gigabyte was the largest common unit of data storage; by the start of the new millennium it had already been replaced by the terabyte. Today corporate databases contain multiple petabytes, and even this unit will soon be history. At the same time, the ever greater digitalization of products and processes is accelerating business processes, making the rapid evaluation of large amounts of data ever more important from a business perspective and turning such evaluation into a differentiating competitive advantage. Classic data warehouse systems are no longer able to store the increasing amounts of data while simultaneously providing the business side with flexible evaluation of the data in real time. Such requirements call for new IT technologies and concepts – meaning a new architecture for Big Data.

 

Figure 3: Enterprise Data Layer (EDL)

Enterprise Data Layer

At the moment, enterprise data are typically stored in structured form within the individual silos of the enterprise, directly in the applications. In order to produce a comprehensive overview of the data, the portions relevant to an evaluation are extracted from the operational systems, standardized, and stored anew in specialized evaluation systems. During this process, the data are compressed for one specific application, so that part of the information is lost. On top of that, the distribution of the data across multiple systems creates considerable transport and consolidation costs that have negative effects, particularly on processing speed.

The concept of an Enterprise Data Layer (EDL) provides a solution to these problems. Instead of storing the data in area-specific silos, the EDL stores all the data – structured as well as unstructured – in a single logical system. This logical system must be technically and economically highly scalable and fault-tolerant. In a classic warehouse system the structure of the data must be established prior to storage, while in the EDL this step does not occur until the data are read. Delivery systems therefore do not have to worry about target formats, and the data are stored just as they are. No information is lost, and it is the task of the evaluative components to structure the data as needed. Another strength of modern EDL concepts lies in the colocation of the processing logic and the data. Instead of transporting the data to the processing logic, massively parallel processing logic is brought to the data. This greatly accelerates the processing steps. Thanks to the principle of a uniform data store, the evaluated results are immediately available to subsequent processes and do not need to first be transferred into another system.
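The schema-on-write versus schema-on-read distinction described above can be sketched in a few lines. This is a deliberately minimal illustration, not a real EDL implementation; the record fields and schema are invented for the example.

```python
import json

# Schema-on-write (classic warehouse): the structure is fixed before
# storage, so any field outside the schema is dropped and lost forever.
WAREHOUSE_SCHEMA = ("customer_id", "amount")

def store_schema_on_write(raw_record: dict) -> dict:
    return {k: raw_record[k] for k in WAREHOUSE_SCHEMA if k in raw_record}

# Schema-on-read (EDL): raw records are stored verbatim; structure is
# applied only when the data are read for a particular evaluation.
def store_schema_on_read(raw_record: dict) -> str:
    return json.dumps(raw_record)  # stored exactly as delivered

def read_with_schema(stored: str, fields: tuple) -> dict:
    record = json.loads(stored)
    return {k: record.get(k) for k in fields}

event = {"customer_id": 7, "amount": 42.5, "channel": "mobile"}

written = store_schema_on_write(event)   # "channel" is gone
stored = store_schema_on_read(event)     # nothing is lost
later = read_with_schema(stored, ("customer_id", "channel"))
```

In the schema-on-write path, a field nobody anticipated at design time (here, the delivery channel) is irrecoverable; in the schema-on-read path the same field can be extracted years later by a new evaluation.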

Innovative analytical applications

Data does not add value to an enterprise until a business benefit can be derived from linking the different data sources. New analytical applications are needed in order to leverage this potential. In classical warehouse systems, even small changes require great developmental effort. Predefined dashboards and reports are often not available until weeks after the data have been collected, by which time the information they provide is no longer relevant for decisions. New analytical applications enable analysts to evaluate large amounts of data in real time. This is possible thanks to highly scalable distributed databases and in-memory technologies that can process enormous amounts of data within seconds. Here, classic BI solutions merge ever more closely with statistical tools. Furthermore, the predictive aspect of data evaluation is gaining in significance for many companies. Businesses are collecting increasing amounts of data that can be used to fine-tune predictive models. Thanks to the computational power available today, these models can be refined at an ever finer granularity, leading to increasingly accurate predictions.
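At its simplest, such a predictive model is a regression fitted to historical observations. The sketch below uses ordinary least squares on invented demand data – the temperatures and sales figures are purely illustrative – to show the basic mechanics that larger models refine with more data and more variables.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b, the simplest predictive model."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var          # slope: change in y per unit of x
    b = mean_y - a * mean_x  # intercept
    return a, b

# Hypothetical observations: daily temperature vs. units sold.
temps = [10, 15, 20, 25, 30]
sales = [100, 150, 200, 250, 300]

slope, intercept = fit_linear(temps, sales)
predicted = slope * 22 + intercept  # forecast demand for a 22-degree day
```

Real systems of the kind described above replace this single variable with hundreds of features and re-fit continuously as new data arrive, but the principle – parameters estimated from history, then applied to forecast the future – is the same.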

3.2 Data Management and Data Governance

A key factor required for deriving benefits from business data while simultaneously providing the necessary security and conformity management is the introduction of a clearly defined structure for data management and data governance. This requires the establishment of a centralized team to work across all business units within an organization. Traceable data management with clearly assigned responsibilities demonstrates trustful handling of information, which is especially important within the highly regulated banking industry.

3.3 Security and Compliance Management

The appropriate management of security and compliance for data storage and data access in a Big Data environment requires new approaches to meet new challenges. The complexity of Big Data environments makes it increasingly difficult to know who accessed what data or information and what they did with it. Moreover, data is often stored in multiple places, which makes it even more difficult to assign and trace user rights. Therefore tool-based monitoring and sensible access restrictions are necessary. These must allow users to access the data they require to carry out their work while preventing the misuse of data.
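The combination of access restriction and traceability described above can be sketched as a minimal role-based check with an audit trail. This is an illustration only – the roles, datasets, and in-code permission table are invented, and a real deployment would rely on a dedicated policy engine and tamper-proof logging.

```python
from datetime import datetime, timezone

# Hypothetical role-to-dataset permissions.
PERMISSIONS = {
    "risk_analyst": {"market_data", "positions"},
    "marketing": {"customer_profiles"},
}

audit_log = []

def access(user: str, role: str, dataset: str) -> bool:
    """Grant access only if the role covers the dataset, and record every
    attempt so that it can later be traced who accessed what."""
    allowed = dataset in PERMISSIONS.get(role, set())
    audit_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "dataset": dataset,
        "granted": allowed,
    })
    return allowed

ok = access("alice", "risk_analyst", "positions")   # granted
denied = access("bob", "marketing", "positions")    # denied, but still logged
```

The key point is that denied attempts are logged as well: monitoring who tried to access what is as important for compliance as the access decision itself.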

3.4 Skills and Qualifications

The effective use of Big Data requires new qualifications and skills linked to data processing, modelling, analysis, and output management within a Big Data environment. Demand for experts in platform management, multivariate statistics, data mining, predictive analysis and modelling, natural language processing, content analysis, text analysis, and social network analysis is increasing. Enterprises must identify their specific needs and build up knowledge accordingly through training and recruiting.

3.5 Optimization of Business Processes

In order to apply Big Data analyses to the optimization of business processes, organizations must decide on the steps necessary for the implementation of Big Data within the context of appropriate use cases. As most financial institutions pursue a gradual introduction of Big Data architecture platforms, the selection of use cases is of considerable significance. Through coordinated prioritization, these use cases enable an organization to structure the implementation of its Big Data strategy and to steer it clearly towards specific processes.


4. CONCLUSION

Financial institutions can integrate Big Data into their business strategies – in the same way as other actors – in order to realize specific benefits from it within the framework of targeted areas of application. While the immediate increase in efficiency that derives from such strategies is of great importance, organizations must simultaneously assign the same degree of importance to understanding Big Data as an investment in their future. Establishing a strategic framework at an early stage enables organizations to tap the as yet unforeseen opportunities that will arise alongside the already visible benefits of applying Big Data and linking it to different purposes – such as through the fundamentally different approach to the analysis of unstructured data. As a result, increasing numbers of banks are deciding to make the timely development of Big Data elements a cornerstone of their future. The challenge lies not so much in becoming active in the field of Big Data in general, but in coming up with specific ways of applying this technological innovation strategically.

SOURCES

TESCO
http://www.retail-week.com/topics/analysis-how-tesco-and-otto-are-using-data-to-forecast-demand/5053784.article

United Overseas Bank (UOB)
http://www.t-systems.de/news-media/examples-of-successes-companies-analyze-big-data-in-record-time-l-t-systems/1029702

Otto
http://www.retail-week.com/topics/analysis-how-tesco-and-otto-are-using-data-to-forecast-demand/5053784.article
http://www.ethority.de/weblog/2013/11/15/wie-otto-und-dm-von-big-data-profitieren/


Meet our authors

Karsten Trostmann
Expert Director

Karsten Trostmann is an Expert Director at CORE. As a computer scientist, Karsten covers the main topics of IT strategy, evolutionary IT architecture, domain-driven design, agile software development, and cloud infrastructures. Karsten’s previous project experience at CORE includes evaluating the platform strategy for a digital insurer, developing cloud operations for an identity provider, and architecture reviews in various projects.