Recent years have seen a strong expansion of technologies based on data analysis - or analytics - to support decision-making. Whether in everyday situations like finding the best route to a destination, or in more complex problems typical of industrial environments, such as planning production batches, evaluating the risks of a given project, or determining the most important factors in the performance of an operation, decision support systems have become essential elements in the search for better performance.
Decision Theory suggests that good decisions are informed - using relevant data and information on the problem at hand - and rational - maximizing some set of performance criteria. In fact, a formal decision-making process becomes all the more important the greater the amount of resources involved (or the greater their scarcity), the greater the risks, and the greater the demand for effective results.
Good decision support systems rest on three fundamental elements: data, models, and decisions. Data are used to build and feed models, which in turn suggest decisions; these decisions are evaluated, combined with human judgment and experience, and then put into practice.
Models are simplified representations of reality, mechanisms that capture the main elements of interest for the analysis of a given problem. Good models are simple, flexible, and robust to the variability of the data that feed them.
However, the quality of the decisions recommended by a model is intrinsically linked to the quality of the data used. Hence the importance of counting on technologies capable of robustly integrating and effectively managing data from different sources, in different volumes and of widely varying levels of quality, as well as providing the manager with the possibility of performing descriptive, predictive and prescriptive analyses.
Such analyses may involve well-known techniques based on statistics, mathematics and probability, or even more advanced techniques, also supported by these same pillars, but boosted by the rapid increase of computer processing and communication capacity.
An insightful book by Thomas Davenport and Jeanne Harris - Competing on Analytics: The New Science of Winning - puts these techniques in perspective. We will use the organization they propose to discuss the power of each of these types of analysis.
Descriptive Analytics involves collecting, cleaning and presenting data about quantities of interest such as consumption and generation of energy, production, and emission levels, among others. The objective is to present the data in a coherent way, through graphs, reports, dashboards, and hierarchical analyses, at the level of temporal and/or contextual aggregation suitable for interpretation by human analysts.
Descriptive analysis helps answer questions such as “what happened,” “how often,” “what requires immediate action,” and so on. Because they provide significant value and are easy to use, descriptive analysis tools have been increasingly adopted by industry.
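As a minimal sketch of the idea, the following Python snippet aggregates raw readings into per-day summaries - totals, means, and peaks - of the kind a dashboard or report would present. The data and field names are hypothetical, purely for illustration.

```python
from statistics import mean
from collections import defaultdict

# Hypothetical hourly energy readings: (day, kWh consumed)
readings = [
    ("mon", 120.0), ("mon", 135.5), ("mon", 128.2),
    ("tue", 140.1), ("tue", 152.3),
    ("wed", 118.7), ("wed", 121.9), ("wed", 125.0),
]

def summarize(data):
    """Aggregate raw readings into per-day descriptive statistics."""
    by_day = defaultdict(list)
    for day, kwh in data:
        by_day[day].append(kwh)
    return {
        day: {"total": round(sum(vals), 1),
              "mean": round(mean(vals), 1),
              "peak": max(vals)}
        for day, vals in by_day.items()
    }

summary = summarize(readings)
print(summary["tue"])  # total, mean, and peak consumption for Tuesday
```

In practice this kind of aggregation is done at whatever temporal or contextual level suits human interpretation - per shift, per production line, per tariff period - and fed into charts and dashboards.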
In turn, predictive analytics is based on models built from mathematical and probabilistic foundations and on a considerable amount of data about the system of interest. Machine learning models fall into this category.
Regression, classification, and clustering algorithms give these models the ability to learn from the past behavior of complex systems, and enable them to predict future behavior. Estimating future levels of energy consumption, determining the main factors that influence the energy efficiency of an operation, or identifying correlations of interest among process variables are some examples of the great analytical power of these techniques.
In other words, predictive analytics helps answer questions like “what are the main causes of the problem,” or “what results can we expect from a given system configuration.” Perhaps because they demand a significant volume of good-quality data, and because they require greater sophistication both in their construction and in their interpretation, predictive analysis tools have been adopted more slowly - despite their great potential to create value.
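To make the regression idea concrete, here is a small sketch that fits a simple least-squares line relating production volume to energy use and then forecasts consumption for a future batch. The figures are invented for illustration; real predictive models would use far more data and richer algorithms.

```python
from statistics import mean

# Hypothetical history: production volume (tonnes) vs. energy used (MWh)
production = [10, 20, 30, 40, 50]
energy     = [32, 49, 72, 88, 110]

def fit_simple_regression(x, y):
    """Ordinary least squares for a single predictor: y ≈ a + b*x."""
    mx, my = mean(x), mean(y)
    # Slope: covariance of x and y divided by variance of x
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx  # intercept passes through the means
    return a, b

a, b = fit_simple_regression(production, energy)
forecast = a + b * 60  # predicted energy for a hypothetical 60-tonne batch
```

The fitted slope also answers a "main causes" question in miniature: it quantifies how strongly production volume drives energy consumption in this (hypothetical) operation.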
Prescriptive analytics seeks to determine, or prescribe, the decisions that lead to the best possible performance of a given system. Such analyses require the construction of mathematical models that represent the behavior of a system through decision variables, constraints or limits to its behavior, and a mathematical function that quantifies its performance.
The optimal solution of a prescriptive analysis model thus corresponds to the set of values for the decision variables that maximize or minimize system performance. Optimization models make it possible to determine, for example, the energy matrix that gives the highest efficiency (or the lowest cost) to a given operation, or the best operating regime for generating plants, or the contracting of rate classes and demand levels that ensure the lowest operating costs in a given planning horizon. Prescriptive analysis tools still have a good way to go in terms of their adoption in the corporate world, perhaps because of their high degree of sophistication and the high level of specificity of their application, despite the enormous potential for value creation.
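As a toy instance of a prescriptive model, the sketch below chooses how much energy to draw from each available source to meet demand at minimum cost. Because each source here has a fixed unit cost and a capacity limit, filling the cheapest sources first (merit-order dispatch) solves this simple linear program exactly; all names, capacities, and prices are hypothetical.

```python
# Hypothetical sources: (name, capacity in MWh, cost per MWh)
sources = [
    ("grid_peak", 80, 90.0),
    ("grid_offpeak", 100, 40.0),
    ("self_generation", 60, 55.0),
]

def dispatch(sources, demand):
    """Meet demand at minimum cost by filling cheapest sources first
    (an exact solution for this simple capacitated linear program)."""
    plan, remaining, total_cost = {}, demand, 0.0
    for name, capacity, cost in sorted(sources, key=lambda s: s[2]):
        take = min(capacity, remaining)
        if take > 0:
            plan[name] = take          # decision variable: MWh from source
            total_cost += take * cost
            remaining -= take
    if remaining > 0:
        raise ValueError("demand exceeds total capacity")
    return plan, total_cost

plan, cost = dispatch(sources, 150)  # prescribe the mix for 150 MWh of demand
```

Real prescriptive models add many more decision variables and constraints - contract tiers, demand charges, multi-period planning horizons - and are typically solved with dedicated optimization solvers rather than a greedy rule.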
It is a fact that organizations that invest consistently in the development of strategies and in the adoption of data analysis tools have achieved significant results, guaranteeing high levels of competitiveness.
The subject is as extensive as it is interesting. Perhaps the most important message here is the recognition that each analysis technique may be more suited to a given context or problem. Identifying opportunities to choose the most appropriate combination of the above tools requires a good deal of human analytical capability, but is essential for pursuing greater efficiency and performance.
We will talk more about the advantages and challenges of each approach in upcoming articles.
The Viridis platform provides functionalities for collecting, cleaning, and visualizing data on energy consumption and generation, facilitating the analysis of complex problems and the identification of opportunities. Capable of processing large volumes of data in real time, the Viridis platform incorporates algorithms for building models that combine information on energy consumption and generation, production context, and financial data, providing important insights for the manager. The Viridis platform's planning capabilities permit the identification of optimal operational configurations for large energy consumers, enabling scenario and sensitivity analyses that ensure greater robustness of the decision-making process. Click here and read more about Viridis's Products.