Blog

Data-driven learning systems – extracting operational value from data warehouses


Since the early 2000s, virtually all modern companies have invested heavily in different kinds of data collection and warehousing solutions. These massive investments have now been in use for a decade or so, and the outcome is that most firms hold significant amounts of data about their business. The question now being raised is: what practical impact have these systems actually had?

Of course, much has changed in companies, both for good and for bad. Much of the development has been at the managerial level, where, for example, periodic reporting could be largely automated. But when looking at these systems from the operational level, the picture starts to look different. There seems to be a growing feeling that these large databases could have great value for a firm's daily operations, but so far their use has been limited or neglected altogether. I have even heard these databases referred to as 'graveyards of information', as most of the data recorded will never be analysed or used to improve the firm's operations. It seems that something is missing from the link between a firm's information systems and its daily operations.

“It seems that something is missing from the link between a firm's information systems and its daily operations.”

In a traditional organization, the link between large data systems and the firm's operational processes is organized so that analysts play the central role. They implement tailored data analyses to understand selected problems. The outcome of an analysis is a report or presentation, which is then disseminated to personnel in the firm's operational processes.


Figure 1. Traditional data analysis structure

The benefit of an analyst-driven system is that there is virtually no limit on the potential analyses, other than the skill level and persistence of the analyst. The downside is that the level of automation in the cycle is limited; in particular, the delivery of results relies on traditional reporting or training.

The development of computational data analysis methods has made it possible to create automated decision support systems that provide a continuous connection between a firm's data stores and its daily operations. Technically, these systems can be based on, for example, machine learning, where the massive data stores are used to teach the system to understand the firm's operational processes. In the machine learning world, the more data is available, the better the suggestions made by the system.
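To make the idea concrete, here is a minimal sketch of such a learning step in Python, using synthetic data in place of a real warehouse export. The column names, the 'rework' label and the suggested action are hypothetical placeholders for illustration, not details of any particular system; the same pattern would apply to whatever operational outcome a firm wants the system to learn.

```python
# Minimal sketch of a data-driven decision support step (illustrative only).
# Assumes pandas, numpy and scikit-learn; all columns and labels are hypothetical.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Stand-in for a table exported from the data warehouse; in practice this
# would be a query over years of operational records.
rng = np.random.default_rng(0)
history = pd.DataFrame({
    "order_size": rng.integers(10, 500, size=2000),
    "machine_load": rng.uniform(0.2, 1.0, size=2000),
    "cycle_time": rng.normal(40, 8, size=2000),
})
# Hypothetical outcome label: did the batch need rework?
history["rework"] = ((history["machine_load"] > 0.85) & (history["cycle_time"] < 35)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    history.drop(columns="rework"), history["rework"], test_size=0.2, random_state=0)

# The model "learns" the operational process from accumulated history;
# the more data available, the more reliable its suggestions tend to be.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("hold-out accuracy:", model.score(X_test, y_test))

# In daily use, each new case is scored and a suggestion is shown to the operator.
new_case = pd.DataFrame([{"order_size": 120, "machine_load": 0.9, "cycle_time": 30}])
print("suggested action:", "inspect batch" if model.predict(new_case)[0] else "no action")
```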

The integration of these automated decision support systems offers an alternative connection between the firm's data warehouse and its daily processes, and it should be seen as a method parallel to the traditional analyst. Analyst-based learning still offers the most flexible way to find new insights from data, while the automated system offers continuous and scalable support for a specific problem area. The key in designing these systems is to focus on ease of use and effective delivery of the key findings from analytics. If these can be achieved, such systems can have a significant effect on the efficiency of a firm's daily operations.

“The integration of these automated decision support systems offers an alternative connection between the firm's data warehouse and its daily processes, and it should be seen as a method parallel to the traditional analyst.”


Figure 2. Data analysis structure with automated analysis

The expected benefit of a computational support system is that the quality of operational decisions can increase significantly, leading to either increased production capability or better process efficiency. A further benefit of an automated system is that it can be integrated into normal daily operations, so it is always present to support them. The downside of such systems is that they can be used only for their intended purpose(s), which creates a need for an outside analyst in a support role. Additionally, developing these systems is a relatively large software project, but the costs are likely small compared to the high costs of building and maintaining data collection systems and the potential benefits the system brings.

These kinds of systems are currently still rare, but given the current hype around analytics they are likely to become more common. Most public cases of such systems come from service industries such as banking, where machine learning based systems are used to appraise loan applications[1].

“…the quality of operational decisions can increase significantly leading to either increased production capability or process efficiency.”

On a more philosophical level, this new link creates a new form of organizational learning. In this learning process the firm's data warehouses are treated as an 'extremely large memory', where the amount of data is so high that computational analysis methods are the only way to rapidly make sense of it – hence, data-driven learning systems. We are only now starting to reach the level of maturity in analytics that allows creating these data-driven learning systems. The future is full of great opportunities, and it is going to be exciting to see how far the boundaries of these systems can be pushed in the near future.

[1] See a case description e.g. in Siegel (2013); the book also contains additional case stories. Siegel, Eric (2013), 'Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die', Wiley.

Knowledge is broken


“Data is the new oil” is a well-hyped slogan. As with oil, data is meaningless unless you use and capitalize on it. The amount of data is growing, but is our sense-making capacity keeping pace?

The way to make money with oil is to pump it out and refine it; only then does it power modern infrastructure. Making money with data is similar: it needs to be refined into valuable information before it can truly drive development.

Intriguingly, combining different sources and reusing data only increases its value. We can save atoms, nature and human lives with smart traffic, smart cities and smart healthcare.

“In addition, tools perfect for a linear world are still mainstream, even though our world and business environment are growing in complexity.”

But do we have the capacity for data refinement in firms and society at large?

Our experience from different industries and the public sector tells a rather grim story: most organizations still work in process silos supported by process-specific data silos. A horizontal view or enterprise architecture is either completely missing or exists only as a theoretical concept-level design. In addition, tools perfect for a linear world are still mainstream, even though our world and business environment are growing in complexity.

However, there is a growing minority of leaders and professionals who accept the fact that in a complex world you cannot survive with command-and-control management systems and tools. These businesses are coming up with tools to refine their data and create new value from it.

In the future, they will be the ones controlling their business, just as the likes of Google and Facebook currently dominate information platforms.

What should business analytics learn from music legends?


In music, many legends claim that 'less is more'. However, not all musicians share this view; in heavier music circles, for example, it is common to argue that 'more is more'. This is an old debate among musicians, and it now seems that the same discussion should be brought into modern boardrooms.

Large databases hold a lot of strategic information. It is also clear that ever-increasing computational capabilities and storage capacities allow a wide range of new analytics. As new buzzwords like the 'internet of things' and 'big data' come true, the number of data sources keeps increasing at mind-blowing speed. From a measurement perspective this is great, but from an analytics perspective it creates challenges. The immediate outcome is a sharp increase in the number of metrics that can be measured, which leads to immense databases of historical row-level measurement data.

Modern data-based analytics is driven by size and speed. The spotlight in this discussion has been on the rapid development of computational technology, and hence those developing new technologies control the content of the discussion. By focusing on 'more is more', the most certain outcome is continuous investment in larger and faster IT infrastructure, while true business benefits are too often left on the sidelines. This discussion is dominated by the question 'How': how to build the infrastructure for analytics?

The critical question in strategic information management should, however, be 'Why': why does the firm perform this way? The ultimate goal of analytics should be achieving the firm's strategic goals, which in many cases means the firm's competitiveness, and ultimately maximizing long-term financial performance. We want to shift the focus of the analytics discussion to realized business value.

It is comforting to know that achieving business value with analytics does not mean analysing the whole world, but focusing on the critical and substantial parts of the firm's processes. Too often we see analysts put in an almost impossible situation: a common task is to analyse all the data the firm collects and come up with significant business conclusions. In most cases this is close to impossible, at least with the tools at hand!

In terms of data management, experts should assess which measurements are most valuable for understanding the firm's core business. Scaling down the data requirements calls for a deep understanding of the business problem, as well as a thorough understanding of various analytical methods and their capabilities. This task requires a distinctive yet broad skill set. The optimum would be a sort of 'strategic analyst' who possesses a deep understanding of both worlds. However, many companies have found out the hard way that these specialists are a rare breed. Alternatively, the same can be achieved with a dedicated team of business professionals and analytics experts.

“Answering these questions will start your firm’s transition from how-analytics towards more impactful why-analytics.”

Customers are often surprised at how great an impact an analysis can have with a very limited set of measurements. Don't get me wrong: although we limit the number of different metrics, there is often still a significant amount of row-level data from a systems perspective. Our experience has also shown that many companies have missed the value of some simple metrics that need to be measured in order to truly understand their business. Collecting almost incomprehensible amounts of data has led to false confidence in the completeness of current data collection systems. Simplifying the data stream makes it easier to find data gaps. In the end, by identifying valuable data and using relatively simple metrics in this focused approach, we have been able to increase the business impact of analytics.

We encourage all managers to challenge their current data management with simple questions:

1) Why do we do analytics?

2) What do we want to achieve and why?

3) What are the most essential measurements we need in order to achieve the target?

Answering these questions will start your firm's transition from how-analytics towards more impactful why-analytics. This is the road towards smarter data analytics that has real business value.

R.I.P. B.B. King – a legend who understood the meaning of 'less is more'.

Latest in disturbance-free logistics


Logproof was a TEKES-funded research project that focused on the management of logistics disturbances. The final seminar of the project was held this month in Helsinki, and the event featured a wide range of interesting new research results, solutions, and future development areas in logistics management. SimAnalytics also had the opportunity to present the possibilities of using simulation modelling to enhance and develop logistics management.

The range of topics in the seminar was wide, including e.g. processes for grocery transport safety and the development of longer-range RFID technologies. Two safety-related topics recurred in a rather intriguing way: 1) the current state of transport safety and 2) real-time tracking and monitoring of deliveries.

At first, the former topic seemed like an overreaction. However, it was surprising to see that even in relatively safe countries (like Finland) there is some level of threat against logistics transports. The situation is much worse especially in eastern and southern Europe. Evidence shows that 1.7% of shipments from Finland to Russia are subjected to some sort of crime (CHS Logistics Oy). In some extreme cases trucks have even been broken into on freeways. Take a look at this video, for example. Sometimes reality is stranger than fiction.

The second trend was partly a reaction to the problem mentioned above. Firms are increasingly trying to find ways to protect their transports. One way to do this is to create more effective tracking and monitoring solutions. One firm presented its own improvised solution for following trucks with a combination of a hunting camera and a mobile phone. This simple solution had proven relatively effective in tracking down illegal entries into vehicles in Russia.

“…it will lead to a rapid accumulation of data and eventually create interesting new possibilities for modelling-based solutions.”

Israel-based Starcom Systems provided an alternative, more commercial solution: Triton, for container tracking and monitoring. The idea of the product is that the tracking device is attached to the container door and activates as the container closes. The unit then keeps track of the container's location through GPS, collects basic telemetry such as temperature and G-forces, and sends SMS notifications when the monitored parameters are exceeded (e.g. container door open or too strong a G-force). A Finnish firm presented a use case where this device was used to track a shipment of machine tools from Finland to the mid US. The case revealed that the container was subjected to surprisingly strong G-forces, especially in US harbors where the container was moved from ship to land transport. The shipment's movement could also be tracked to within a couple of meters for the whole trip – impressive.
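Purely as an illustration, the notification logic behind such a device can be boiled down to a few lines of Python; the record format, thresholds and messages below are assumptions made for the example, not Starcom's actual implementation.

```python
# Illustrative sketch of threshold-based telemetry alerts (not the real product logic).
from dataclasses import dataclass

@dataclass
class Reading:
    timestamp: str
    lat: float
    lon: float
    g_force: float
    door_open: bool

G_FORCE_LIMIT = 2.0  # assumed threshold, in g

def alerts(readings):
    """Yield one alert message per threshold violation."""
    for r in readings:
        if r.door_open:
            yield f"{r.timestamp}: container door opened at ({r.lat:.4f}, {r.lon:.4f})"
        if r.g_force > G_FORCE_LIMIT:
            yield f"{r.timestamp}: impact of {r.g_force:.1f} g exceeds {G_FORCE_LIMIT} g"

# Two made-up readings: a quiet moment in transit and a rough harbor lift.
sample = [
    Reading("2015-05-12T08:30", 60.1699, 24.9384, 0.4, False),
    Reading("2015-05-20T14:05", 41.3083, -72.9279, 3.1, False),
]
for message in alerts(sample):
    print(message)  # the real unit would send this as an SMS
```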

All in all, the seminar showed interesting new developments in the management of logistics disturbances. This seems to be a vibrant business field that is developing strongly thanks to recent advances in wireless communication technologies.

As this telemetry becomes more widely used, it will lead to a rapid accumulation of data and eventually create interesting new possibilities for modelling-based solutions.

First SimAnalytics Academy simulation workshop in Finland


Simulation modelling understandably draws attention across industries and academic disciplines. This was evident last week when SimAnalytics Academy organized its first open modelling workshop in Finland. Altogether eight participants, ranging from researchers to industrial warehousing experts and university lecturers, attended the two-day “Simulation Modelling & Logistics Management” workshop in Kerava.

The program acquainted motivated learners with the basics of complexity thinking and of building simulation models with the AnyLogic software. The instructors kept the lectures as short as possible and got the participants hands-on building their own simulation models. This structure was clearly appreciated.

“Illustrative exercises with discussions gave many new ideas for utilizing simulation modelling in various situations and challenges. Thank you!”

The possibilities of agent-based modelling stimulated the most ideas among the attendees. Controlling real-life randomness through Monte Carlo principles attracted a great deal of attention too. Ideas for utilizing modelling methods in teaching also came up during conversations.
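For readers who did not attend, the Monte Carlo idea behind those exercises can be condensed into a few lines of Python: instead of a single 'average' delivery time, the random variation of each leg is sampled thousands of times to reveal the whole outcome distribution. The distributions and parameters below are purely illustrative assumptions, not material from the workshop.

```python
# Minimal Monte Carlo sketch of a delivery time under random variation.
import random

def one_delivery():
    loading = random.triangular(0.5, 2.0, 1.0)    # hours: low, high, most likely
    driving = random.gauss(8.0, 1.5)              # hours: mean, standard deviation
    unloading = random.triangular(0.5, 3.0, 1.0)  # hours
    return loading + driving + unloading

random.seed(1)
runs = sorted(one_delivery() for _ in range(10_000))
print("median delivery time:", round(runs[len(runs) // 2], 1), "h")
print("95th percentile:", round(runs[int(len(runs) * 0.95)], 1), "h")
```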

“An intensive, well-guided workshop. Well-informed instructors. Thank you!”

The feedback gave us many ideas for developing future Academy events. In advanced-level events we will definitely focus even more on running experiments with the created models.

SimAnalytics Academy organized the workshop in cooperation with AnyLogic Company and Laurea University of Applied Sciences. We want to extend special thanks to the partners and of course to all the attendees.

Upcoming Academy events are already being planned for spring 2014!

P.S. Interest in these modern analytics methods seems to be growing: after the workshop, a customer initiated plans for tailored modelling training. If you are interested in familiarizing yourself and your team with modelling methods and/or the AnyLogic software, you may contact us to tailor a program suited to your ambitions.

Building competitive advantage with simulation supported logistics


In a recent article, the SCDigest editorial staff analyzes the role of logistics design as a source of a firm's competitiveness. Many companies still aren't utilizing advanced methods in designing their logistics chain and see the topic only as “a necessary but not a critical routine”. According to supply chain management experts, these companies are missing an opportunity to minimize costs and maximize effectiveness. The key is to view logistics as a process that can be constantly developed, not as a separate function in the company.

Our experience at SimAnalytics is very much in line with the SCDigest authors, and similar conclusions have been drawn in other outlets. A logistics survey in Finland concluded that 35%-43% of the competitiveness of large organizations originates from logistics. A further conclusion was that 40%-50% of this competitiveness could be affected by the company's own actions and decisions. Altogether, it's fair to say that logistics is an important process in building a firm's competitive advantage, and it is reasonable to argue that advanced methods hold great potential for logistics development.

The design of logistics is especially critical during times of change. Decision makers face questions like “what kind of logistics system does the company need when extending its operations?” or “how should the company organize the manufacturing and delivery of a new product?”. These types of questions are critical in designing a firm's processes, but they cannot be answered with the static data provided even by the latest ERP systems. More advanced tools are needed, and luckily they are also available.

SCDigest: “If the analysis is done on spreadsheets, can you really factor in all the right variables, do you really have an optimal answer in the end, and what is your level of confidence you have telling executives what should be done?”

We see simulation modelling as one of the most exciting methods in advanced business intelligence for logistics. It offers highly customizable tools for designing the supply chain with scientifically rigorous approaches. When the simulation model is designed, utilized and interpreted properly, the method can provide very detailed analyses to support successful supply chain design.

A company with a proper supply chain design system gains many benefits:

  1. Optimization of the existing supply chain, ranging from raw materials to the distribution of products. The possibility to deeply understand the firm's logistics operations is also eye-opening.
  2. Performing different scenario analyses to test the firm's system in a changing business environment (a toy sketch of such a comparison follows this list).
  3. Speeding up the firm's ability to react to changing business conditions such as costs, demand, etc. The risks related to deliveries within the logistics chain can also be tested.
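As a toy illustration of the scenario analysis in point 2, the sketch below runs the same simple inventory model under a baseline and a 'demand +20%' scenario. A real supply chain model (for example one built in AnyLogic) would of course be far richer; all numbers here are made-up assumptions.

```python
# Toy scenario comparison for a single-product inventory (illustrative assumptions only).
import random

def simulate(days, mean_demand, reorder_point=80, order_qty=200):
    """Return units of demand lost to stockouts over the simulated period."""
    stock, lost = 150, 0
    for _ in range(days):
        demand = max(0, int(random.gauss(mean_demand, 10)))
        if demand > stock:
            lost += demand - stock
            stock = 0
        else:
            stock -= demand
        if stock <= reorder_point:
            stock += order_qty  # simplification: replenishment arrives immediately
    return lost

random.seed(1)
for name, demand in [("baseline", 50), ("demand +20%", 60)]:
    print(f"{name}: {simulate(365, demand)} units lost over a year")
```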

Do your company a big favor and find out how your supply chain design could benefit from the advanced BI tools available. You’ll be amazed.

Undetected Hidden Biases in Data Lead to Poor Decisions


The discovery of big data has reached an almost epiphanic scale. Such hype around a new idea makes many people blind to the traps hidden behind the great potential.

Harvard Business Review has lately published many insights regarding big data in its blog network. The latest take on this hot topic, by Kate Crawford, discusses the hidden biases in big data. Crawford describes examples where big data originating from consumer mobile devices with social media features contains hidden biases caused by social and ethnographic factors. In such cases, generalizing the results evidently leads to misinterpretation even if the data at first seems comprehensive.

Hidden biases are not only an issue with big social media data; the same principles also apply to carefully collected corporate data. No matter how lavish your data is, it never makes a detailed understanding of its nature and shape redundant. In fact, quite the opposite: the more extensive your data is, the more accurately tailored your supporting tools need to be.

“As soon as the data was deeply understood, the model producing important predictions was created accordingly and the adaptive tool started to pay off…”

Our team worked with an industrial company that had recently deployed sophisticated mathematical tools to refine its abundant data into demand predictions. These predictive analytics formed the foundation for operational management and decisions. The only issue was that the forecasts were far from accurate, which led to poor managerial decisions affecting operational efficiency in over 30 countries.

The data was complete and the tools were based on rigorous statistical methods, so how could the derived predictions be wrong? The situation was frustrating, as those predictions were the only meaningful information supporting successful decision-making.

There was a hidden bias in the fundamental assumptions regarding the distribution of the data. The false assumptions made the predictions inaccurate, which directly led to uninformed decisions. As soon as the data was deeply understood, the model producing the important predictions was created accordingly and the adaptive tool started to pay off. Decisions were supported by accurate analyses.
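To illustrate the kind of check that exposes such a bias, the sketch below compares a 'normal demand' assumption against synthetic, skewed data. The data and the specific test are illustrative assumptions, not the client's actual workflow.

```python
# Illustrative check of a distribution assumption before trusting a forecast.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
demand = rng.lognormal(mean=3.0, sigma=0.8, size=500)  # skewed, heavy-tailed data

# A forecast built on a normality assumption misjudges the typical order size.
print("mean (what a normal assumption would centre on):", round(demand.mean(), 1))
print("median (a more robust view of a typical order):", round(float(np.median(demand)), 1))

# A normality test: a tiny p-value says the normal assumption does not hold.
statistic, p_value = stats.shapiro(demand)
print("Shapiro-Wilk p-value:", p_value)
```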

Specialized business analysts are able to tell you how reliable your data is for different predictive analyses. Once the biases in the data have been detected, multi-method approaches are often needed to interpret and analyze it accurately.

Don't waste the vast potential that lies in your business data. Take action to first understand it thoroughly and build your predictive analytics tools on that understanding. It will pay off.

Gartner Taps Predictive Analytics as Next Big Business Intelligence Trend


Analyses of a firm's internal processes and business environment are the driving engine of modern companies. In money terms, this is evident in the large-scale investments that big firms are still making to build new business intelligence tools. If the current growth continues, Gartner estimates that the total value of the business intelligence market will more than triple by 2020 (its current size is estimated at 57 billion USD).

“Gartner proposes that in the near future simulation-based models will be required to make more sophisticated predictions…”

The fastest growing subclass of business intelligence is data discovery or, more generally, large-scale data analysis tools. These tools provide an opportunity to describe and understand what the data implies. The current trend is that these tools are developing towards more visual and illustrative ways of presenting datasets.

Predicting the future means looking forward – and according to Gartner, this is also the future of business intelligence. The article identifies predictive analytics as the next breakthrough in business intelligence, increasing the accuracy of predictions. Currently used methods such as extrapolation enable predictions to a certain accuracy, but Gartner proposes that in the near future simulation-based models will be required to make more sophisticated predictions. Gartner foresees that having an efficient way to utilize big data in predictions, and then building decision-making on them, is a critical component in laying the foundations for a firm's sustained competitive advantage.

It’s hard not to agree.

Read the full article HERE.