Abstract
The paper analyzes the problem of entropy at the moments of transition from a normal economic situation (2015–2019) to the pandemic period (2020–2021) and the period of Russia’s attack on Ukraine (2022–2023). The research in the article is based on the analysis of electricity, oil, coal, and gas prices in 27 countries of the European Union and Norway. The daily data cover the period from January 1, 2015, to March 30, 2023, and were analyzed using two-dimensional sets of electricity and commodity prices. The work uses a time-dependent James-Stein estimator of Shannon information entropy.
Citation: Papla D, Siedlecki R (2025) Analysis of entropy on the European markets of energy and energy commodities prices. PLoS ONE 20(1): e0315348. https://doi.org/10.1371/journal.pone.0315348
Editor: Alessandro Mazzoccoli, Roma Tre University: Universita degli Studi Roma Tre, ITALY
Received: August 22, 2024; Accepted: November 24, 2024; Published: January 16, 2025
Copyright: © 2025 Papla, Siedlecki. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: All relevant data are within the manuscript and its Supporting Information files.
Funding: The author(s) received no specific funding for this work.
Competing interests: The authors have declared that no competing interests exist.
1. Introduction
Issues with energy price analysis and forecasting are well known but still current and interesting. In today’s "turbulent" times, there is no doubt that forecasting electricity prices matters not only to participants of commodity and derivative markets (day traders and speculators) but also to energy systems planning and operations [1, 2].
In the literature, energy prices are often analyzed using statistical and econometric methods developed for financial market data, where either a normal distribution of returns and their "stationarity" is assumed or the variability of the variance is modeled [3–5]. In practice, attempts are made to make time series stationary by computing logarithmic rates of return, which only "reduces" their non-stationarity but does not eliminate it [5]. The literature often notes that research based on these methods or on artificial neural networks (ANN) runs into difficulties in application, and the forecasts are often acceptable only once or for a short period (they are not universal) [6, 7]. This is because prices in the energy market are characterized by very high volatility and outliers caused by many factors, both measurable and unmeasurable: weather conditions, economic and political factors, technology development and problems with energy storage, among others. Such prices can therefore be treated as probabilistic quantities [1, 8]. Prices and consumption are also influenced by the emergence of new energy sources (renewable, green energy) and the variability (fashion) of using old ones, such as nuclear energy. The use and share of these sources in overall consumption varies across countries and continents (see Fig 1). In European countries, the United States and Canada, there is a significant decrease in the share of coal among energy sources, while in Asia it is rapidly increasing, making coal the main energy source in the world, along with gas and oil.
Source: Energy Institute Statistical Review of World Energy (2023).
Currently, we are dealing with significant instability not only in the financial and commodity (e.g. energy) markets but also in the economy and social policy (ecology), caused, among other things, by the COVID-19 pandemic, the armed conflicts in Ukraine and the Middle East, and the very rapid development of new technologies such as AI, which leads to high energy demand. In times without turbulence and shocks, the analysis of prices on financial (e.g. stock exchange) and energy markets indicated the possibility of forecasting long-term trends and the absence of random walks for daily or monthly observations [3, 7].
In this contribution, entropy analysis is used as a statistical method to analyze energy prices on markets. The concept of entropy originates in thermodynamics but also appears in probability theory, information theory, stochastic processes, and economics and finance [9–11].
As stated in [12], "Thermodynamics as this drives the world economy from its ecological foundations as solar energy passes through food chains in dissipative process of entropy rising and production fundamentally involving the replacement of lower entropy energy states with higher entropy ones". Thus, tools and methods for measuring entropy can effectively model and analyze financial and commodity markets. Based on financial market theory, the prices of energy drivers such as coal, oil, or gas behave, like the prices of financial assets during "normal" times, as a random walk. Therefore, it can be concluded that the phenomenon of energy price drivers can be treated as an isolated system that tends, more or less slowly, towards maximum entropy. In finance and economics, entropy is often used as a measure of system order (which can be understood as moments of a high degree of market efficiency), which is a prerequisite for making effective forecasts [3, 12]. This means that high entropy of financial asset prices can lead to severe mistakes by financial analysts and economists, which can deepen a crisis. According to Bejan's Constructal Law [13]: "For a finite-size flow system to persist in time, it must evolve with freedom such that it provides easier access to its flows". The paper aims to investigate whether, in crisis situations in which external interference occurs, entropy decreases, which means that the possibility of forecasting the prices of electricity and its carriers (namely oil, coal and gas) increases. The following hypothesis was put forward in the article: in analogy with the second law of thermodynamics, during crises, external interference in the market (e.g., price regulation) causes a decrease in the entropy of energy prices, and the factors affecting energy prices become more predictable. According to the principles of physics, in an isolated system entropy increases, and it can decrease only at the moment of external intervention [14, 15]; similarly in finance, during crises and economic slowdowns governments interfere in financial markets and introduce new regulations [6, 16]. Our research supports the hypothesis that such regulations make market behavior less random.
To verify the hypothesis, Shannon entropy with the James-Stein estimator [17] is used. The method allows us to take into account the non-linear nature of the studied phenomena and also to analyze current data, from the period of the COVID-19 pandemic and from the current period, including, among other events, the armed conflict in Ukraine. This method seems best adapted to our case: since it does not require an analysis of the stationarity of the series, it is simple and requires few assumptions, which makes it attractive and applicable even without deep knowledge of statistics and econometrics. This entropy estimator also performs very well in small-sample situations [17]. Classical statistical methods assume normality and stationarity of the series, and the results are therefore often borderline. When studying prices, it would be necessary to remove the trend and assume a normal distribution, yet rates of return also do not follow a normal distribution. The chosen method is thus the best suited, though not necessarily the most effective; a simple method may be less effective but, in the long run, requires less interference in the model.
For entropy estimation, especially time-dependent entropy, we used our own software written in R for this paper. A robustness analysis of the results was also conducted: we calculated the entropy estimator for the daily data with a one-year moving window and for the monthly data with a two-year moving window, and both sets of results corroborate our original findings. These results are included in the appendix.
2. Methodology
Entropy is a measure of the degree of uncertainty associated with random variables in statistical systems or probability theory.
In order to numerically quantify the amount of "lost information" in phone-line signals, C.E. Shannon proposed a measure of uncertainty in 1948 that was later dubbed Shannon entropy. Building on the work of Nyquist [18, 19] and Hartley [20], the measure was first presented in his well-known paper A Mathematical Theory of Communication [21]. It was a key component in the development of information theory, the first comprehensive mathematical theory of communication. Shannon made a major contribution when he demonstrated that entropy could be applied to any series in which probabilities exist. This was a major advancement over the earlier research of Clausius and Boltzmann, which was limited to thermodynamic systems. Shannon formally defined entropy as the average quantity of "information, choice, and uncertainty" encoded in patterns extracted from a signal or message. According to some interpretations, entropy is a system’s degree of disorder and unpredictability. The applicability of Shannon’s entropy to any series with a distinct probability distribution was recognized early on and was used extensively, especially in finance.
We can define the Shannon entropy [21] for a categorical random variable with corresponding cell probabilities (frequencies) p_1, …, p_n and alphabet size n, where p_k > 0 and \sum_k p_k = 1. In our setting, n is fixed and known. The Shannon entropy in natural units of information [21], i.e. units of information based on natural logarithms, is given by:

H = -\sum_{k=1}^{n} p_k \ln p_k .
Since the underlying frequencies, i.e. the values of the probability mass function, are unknown in practice, it is necessary to estimate H and p_k from the observed cell counts y_k ≥ 0.
The maximum likelihood (ML) estimator, which is obtained by plugging the ML frequency estimates into the Shannon formula, is an especially straightforward and popular entropy estimator:

\hat{H}^{ML} = -\sum_{k=1}^{n} \hat{p}_k^{ML} \ln \hat{p}_k^{ML}, \qquad \hat{p}_k^{ML} = \frac{y_k}{N},

with

N = \sum_{k=1}^{n} y_k

being the total number of counts.
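For illustration, a minimal base-R sketch of the plug-in (ML) estimator described above (the function name and the toy counts are illustrative and not taken from the paper’s software):

```r
# Maximum likelihood (plug-in) estimate of Shannon entropy in nats,
# computed from a vector of observed cell counts y_k >= 0.
entropy_ml <- function(y) {
  p <- y / sum(y)   # ML frequency estimates p_k = y_k / N
  p <- p[p > 0]     # treat 0 * log(0) as 0
  -sum(p * log(p))
}

# Toy example: counts over a 4-cell alphabet
entropy_ml(c(10, 5, 3, 2))  # about 1.21 nats, below the maximum log(4) ~ 1.39
```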
After some consideration, we have chosen the James-Stein shrinkage estimator [17], which is well suited to our case because it performs better for data with little information, that is, where the number of observations (in our case 250 for the one-year and 500 for the two-year window) is small compared with the number of cells (in our case 100 × 100 = 10 000). This estimator is based on a weighted average of two models: a high-dimensional model with low bias and high variance (the maximum likelihood estimator) and a lower-dimensional model with larger bias but smaller variance (the shrinkage target) [17]:

\hat{p}_k^{shrink} = \lambda t_k + (1 - \lambda)\, \hat{p}_k^{ML},

where λ ∈ [0, 1] is the shrinkage intensity, which ranges from 0 (no shrinkage) to 1 (complete shrinkage), and t_k is the shrinkage target. The uniform distribution

t_k = \frac{1}{n}

is a practical option [17]. The entropy estimate is then obtained by plugging the shrunken frequencies \hat{p}_k^{shrink} into the Shannon formula.
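As a sketch (not the paper’s own R code, which is not reproduced here), the shrinkage estimator can be implemented in a few lines of base R; the closed-form shrinkage intensity below follows Hausser and Strimmer [17], and the entropy R package by the same authors offers a ready-made equivalent (entropy.shrink):

```r
# James-Stein shrinkage estimate of Shannon entropy (nats) from cell counts y,
# shrinking the ML frequencies towards the uniform target t_k = 1/n.
entropy_shrink <- function(y) {
  N <- sum(y)              # total number of counts
  n <- length(y)           # alphabet size (number of cells)
  u <- y / N               # ML frequency estimates
  t <- rep(1 / n, n)       # uniform shrinkage target
  # shrinkage intensity lambda, estimated as in Hausser & Strimmer (2009)
  lambda <- (1 - sum(u^2)) / ((N - 1) * sum((t - u)^2))
  lambda <- max(0, min(1, lambda))        # clamp to [0, 1]
  p <- lambda * t + (1 - lambda) * u      # shrunken frequencies
  p <- p[p > 0]                           # guard against 0 * log(0)
  -sum(p * log(p))                        # plug-in Shannon entropy
}

# Toy example: 250 observations spread over 100 cells, where shrinkage matters
set.seed(1)
y <- tabulate(sample(1:100, 250, replace = TRUE), nbins = 100)
entropy_shrink(y)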
There are, of course, other forms of entropy, such as the Tsallis entropy [22–24] or the Rényi entropy [25–28], but for our research the Shannon entropy seems the best choice. Unlike the Tsallis and Rényi entropies, it was created from the outset with information theory in mind, whereas the other two were created mainly with reference to physics (chemistry, thermodynamics and even quantum physics). It would nevertheless be very interesting to adapt and modify other entropy measures for finance.
A common presumption in many statistical techniques is stationarity. It ensures that a process’s mean, variance, and autocorrelation structure are invariant under time translations. For the purpose of modeling and forecasting financial systems, this is particularly crucial. However, in finance, series are usually nonstationary: their statistical characteristics vary over time, exhibiting drifts and trends. Mathematical transformations such as logarithms and first differences can make time series almost stationary, making them suitable for conventional statistical techniques such as ARMA or GARCH models. Time-dependent methods become good substitutes when transformations to stationary processes are not feasible, for example when some processes remain nonstationary even after first differencing. In information theory, time-dependent entropy (TDE) methods have been introduced to produce a temporal evolution of entropy. TDE is able to capture changes in the local irregularity of a signal, which can provide valuable information about its nonstationary dynamics [29]. Applications come from a variety of fields, including finance and physics [30–35].
The time-dependent entropy method is defined formally as follows. Consider a nonstationary time series {z(k), k = 1, …, N} with time-varying statistical characteristics, whose time-varying information cannot be captured by standard entropy methods. We define a sliding window

Z_t = \{ z(k) : k = 1 + t\Delta, \ldots, w + t\Delta \}, \qquad t = 0, 1, \ldots, \left[ \frac{N - w}{\Delta} \right],

of size w ≤ N at each time step, with a sliding step of Δ ≤ w. The operator [·] denotes casting the argument into an integer. A temporal evolution of entropy is produced by computing the desired entropy at a specific time t using the values of the time series in each window Z_t. It is well known that variables like window size and time lag can impact the outcomes; a few studies using EEG signals are presented in [36].
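A minimal sketch of this sliding-window scheme, reusing the entropy_shrink() helper sketched in the previous section; the window size, step, and number of bins used to discretize each window are illustrative choices, not the paper’s settings:

```r
# Time-dependent entropy: slide a window of size w with step delta over the
# series z, discretize each window into n_bins cells, and estimate entropy.
tde <- function(z, w, delta, n_bins = 10) {
  N <- length(z)
  starts <- seq(1, N - w + 1, by = delta)                # window start indices
  breaks <- seq(min(z), max(z), length.out = n_bins + 1)
  sapply(starts, function(s) {
    win <- z[s:(s + w - 1)]                              # window Z_t
    y <- table(cut(win, breaks, include.lowest = TRUE))  # cell counts
    entropy_shrink(as.vector(y))
  })
}

# Toy example: a nonstationary series (random walk) of length 1000
set.seed(2)
z <- cumsum(rnorm(1000))
h <- tde(z, w = 250, delta = 1)   # one entropy value per window position
```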
3. Results
The application proposed in this paper is based on the analysis of electricity, oil, coal, and gas prices in 22 countries of the European Union and in Norway (an associated country). Data were obtained from the Our World in Data, Eurostat, and Reuters databases. Daily data from January 1, 2015, to March 30, 2023 are used. To check the robustness of our results, we also used monthly data.
For each country, three two-dimensional sets of the country’s electricity price versus the oil, coal, and gas price are constructed. For each set and each country, time-dependent Shannon entropy [21] is estimated using, as stated earlier, the James-Stein shrinkage estimator [17], with a two-year window and a one-day step. To obtain aggregated results for Europe, the average of the estimated entropies across countries is calculated for each day.
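The computation described above can be sketched for a single country and a single commodity as follows; the 100 × 100 grid, the roughly 500-trading-day (two-year) window and the one-day step follow the description in the text, while the variable names, the equal-width binning and the Europe-wide averaging line are illustrative assumptions:

```r
# Two-dimensional time-dependent entropy for one country: electricity price
# versus one commodity price, binned on a 100 x 100 grid within each window.
tde_2d <- function(elec, comm, w = 500, delta = 1, bins = 100) {
  stopifnot(length(elec) == length(comm))
  N <- length(elec)
  bx <- seq(min(elec), max(elec), length.out = bins + 1)
  by <- seq(min(comm), max(comm), length.out = bins + 1)
  starts <- seq(1, N - w + 1, by = delta)
  sapply(starts, function(s) {
    i <- s:(s + w - 1)
    y <- table(cut(elec[i], bx, include.lowest = TRUE),
               cut(comm[i], by, include.lowest = TRUE))  # 100 x 100 cell counts
    entropy_shrink(as.vector(y))                         # helper defined above
  })
}

# Europe-wide curve: day-by-day average over countries (country_list is a
# hypothetical list of per-country data frames with elec and gas columns)
# h_eu <- rowMeans(sapply(country_list, function(d) tde_2d(d$elec, d$gas)))
```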
In line with the research hypothesis, the prices of the main energy commodities relative to electricity prices were unpredictable in the period 2015–2019 (see Figs 2 and 3). A high level of entropy (the random walk phenomenon) was visible in every analyzed country. The decline during the armed conflict in Ukraine was less pronounced in countries such as Denmark or Sweden. Throughout Europe, the price of coal had the greatest impact on the price of electricity, despite the evident decline in coal consumption in Europe.
Source: own calculations.
Source: own calculations.
During the pandemic, the decrease in entropy was small, so forecasting ability did not change despite the crisis and turbulence in the energy market in some countries. The entropy decline came in March 2022, i.e. at the beginning of the conflict in Ukraine and the gradual abandonment of gas, oil and coal of Russian origin. This was political and legal interference (in the closed system of production and distribution on the electricity market), mainly in European countries. The greatest decline occurred in the post-communist countries, i.e. those most at risk and, at the same time, most dependent on Russian gas and oil.
Our research confirms previous studies [37–39] stating that during a crisis, when governments support financial markets, entropy decreases, whereas during stable times entropy is close to its maximum and, due to the random behaviour of prices, there can be significant problems with short-term forecasts [40].
In the near future, on the one hand, a transition to green energy and a decrease in demand for oil, coal, and gas as sources of electricity are expected; on the other hand, dynamic technological development significantly increases electricity demand, which in turn will extend the transition time to green energy. This may cause another period of increased entropy and decreased predictability.
Results from monthly data with a two-year window and from daily data with a one-year window (see Figs 4 and 5) largely corroborate our findings: the same sharp decline in entropy appears in March 2022, so the method is robust with respect to data frequency and window length. Only for the lower-frequency data are differences visible, in the form of a temporary increase in entropy followed by a decline at the beginning of 2023.
Source: own calculations.
Source: own calculations.
4. Conclusions
In finance, statistical and econometric methods are most often used to analyze relationships and "behaviours"; such methods require knowledge of the probability distribution and, in the case of prices on financial markets, most often assume the normal distribution. In our approach, we did not seek the distribution of prices, as we had detailed information about them; instead, we calculated the entropy of the systems in different periods.
Methods used in physics and engineering sciences can be used in finance and economics because there are many analogies with phenomena existing in physics, such as the Constructal Law and the second law of thermodynamics (entropy, the Verhulst and Bejan curves) [10], market temperature and price volatility on financial markets, or the analysis of business cycles and economic forecasts. This is also confirmed by the Nobel Prize winners’ Black-Scholes model, which is derived from the analysis of physical processes and is used to value derivatives. We also confirm that one may look for analogies with the Constructal Law and entropy in finance and economics. We may suspect that after some time the free market should reach an "equilibrium state" characterized by a "random walk" of the investigated quantities, particularly prices. In such a case, the price would be an analogue of velocity, and thus some function of prices should correspond to the total energy of the system. Our research confirms the above theories on the example of the prices of primary energy sources (oil, gas and coal) in the EU-27 countries, i.e. the poor ability to forecast prices under normal conditions and an increase in forecasting possibilities and in the impact of other factors on electricity prices in times of crisis, when "outside" actions are taken (e.g. interventions by governments or international institutions). Entropy did not drop during the pandemic, i.e. the shock on the financial and commodity markets, but only after Russia attacked Ukraine and European governments interfered in the energy market, which resulted in greater forecasting ability. However, this was temporary and, as in isolated systems, entropy began to increase despite prolonged government regulations, returning to its level before the attack (March 2022).
Electricity prices, like the prices of financial instruments over short periods, are subject to a random walk, i.e. high entropy, which makes building effective econometric and statistical models much more difficult and often impossible.
A broader statistical treatment, especially confidence analysis and a comparison of results using other methods, such as the Hurst exponent, econometric methods (ARIMA, GARCH, etc.) and different methods of examining entropy, is part of our ongoing research. We hope to present it in subsequent publications.
References
- 1. Nowotarski J., Weron R., Recent advances in electricity price forecasting: A review of probabilistic forecasting Renewable and Sustainable Energy Reviews, Volume 81, (2018).
- 2. Weron R., Electricity price forecasting: A review of the state-of-the-art with a look into the future, International Journal of Forecasting, Volume 30, Issue 4, 2014.
- 3. Godfrey M.D.; Granger C.W.J.; Morgenstern O. The Random Walk Hypothesis of Stock Market Behavior Kyklos 2007.
- 4. Contreras-Reyes J.E.; Rényi entropy and complexity measure for skew-gaussian distributions and related families, Physica A 433 2015.
- 5. Berk I., Yetkiner H. Energy prices and economic growth in the long run: Theory and evidence Renewable and Sustainable Energy Reviews, Volume 36, 2014.
- 6. Darbellay G., Wuertz D., The entropy as a tool for analyzing statistical dependences in financial time series, Physica A: Statistical Mechanics and its Applications, Volume 287 2000.
- 7. Fama Eugene F. "Random Walks in Stock Market Prices." Financial Analysts Journal, vol. 21, no. 5, 1965, pp. 55–59. JSTOR, http://www.jstor.org/stable/4469865. Accessed December 14 2023.
- 8. Amjady N.; Hemmati M.; "Energy price forecasting—problems and proposals for such predictions," IEEE Power and Energy Magazine, vol. 4, no. 2, pp. 20–29, 2006.
- 9. Bejan A.; Tsatsaronis G. Purpose in Thermodynamics. Energies 14(2), 408; 2021.
- 10. Bejan A. Fundamentals of exergy analysis, entropy generation minimization, and the generation of architecture. Int. J. Energy Res. p. 26:545; 2002.
- 11. Maasoumi E.; Racine J. Entropy and predictability of stock market returns Journal of Econometrics 107 2002.
- 12. Barkley Rosser J. Jr. Entropy and econophysics The European Physical Journal vol 225, p. 3091–3104, 2016.
- 13. Bejan A. The Physics of Life: The Evolution of Everything; St. Martin’s Press: New York, NY, USA, 2016.
- 14. Schrödinger E. What is life? the physical aspect of the living cell & Mind and matter Cambridge University Press 1944.
- 15. Bejan A., Method of entropy generation minimization, or modeling and optimization based on combined heat transfer and thermodynamics Revue Générale de Thermique, Volume 35, 1996.
- 16. Fumo N., Rafe Biswas M.A. Regression analysis for prediction of residential energy consumption, Renewable and Sustainable Energy Reviews, Volume 47, 2015.
- 17. Hausser J.; Strimmer K. Entropy inference and the James-Stein estimator, with application to non-linear gene association networks. J. Mach. Learn. Res. Vol. 10, p. 1469–1484 2009.
- 18. Nyquist H. Certain factors affecting telegraph speed. Bell System Technical Journal, Blackwell Publishing Ltd, v. 3, n. 2, p. 324–346, 1924.
- 19. Nyquist H. Certain topics in telegraph transmission theory. Transactions of the American Institute of Electrical Engineers, v. 47, n. 2, p. 617–644, April 1928.
- 20. Hartley R. V. L. Transmission of information. Bell System Technical Journal, Blackwell Publishing Ltd, v. 7, n. 3, p. 535–563, 1928.
- 21. Shannon C. E. A mathematical theory of communication. SIGMOBILE Mob. Comput. Commun. Rev., ACM, New York, NY, USA, v. 5, n. 1, p. 3–55, Jan. 2001.
- 22. Boon J. P.; Tsallis C. Special issue overview nonextensive statistical mechanics: new trends, new perspectives. Europhysics News, v. 36, n. 6, p. 185–186, 2005.
- 23. Cartwright J. Roll over, Boltzmann. Physics World, v. 27, n. 05, p. 31, 2014.
- 24. Tsallis C. Possible generalization of boltzmann-gibbs statistics. Journal of Statistical Physics, v. 52, n. 1, p. 479–487. 1988.
- 25. Rényi A. On measures of entropy and information. In: Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics. Berkeley, Calif.: University of California Press, 1961. p. 547–561.
- 26. Harremoes P. Interpretations of rényi entropies and divergences. Physica A:Statistical Mechanics and its Applications, v. 365, n. 1, p. 57–62, 2006.
- 27. Xu D.; Erdogmus D. Renyi’s entropy, divergence and their nonparametric estimators. In: Information Theoretic Learning. [S.l.]: Springer New York, 2010, (Information Science and Statistics). p. 47–102.
- 28. Baez J. C. Rényi entropy and free energy, 2011.
- 29. Gamero L.; Plastino A.; Torres M. Wavelet analysis and non-linear dynamics in a nonextensive setting. Physica A: Statistical Mechanics and its Applications, v. 246, n. 3–4, p. 487–509, 1997.
- 30. Dunkel J.; Trigger S. A. Time-dependent entropy of simple quantum model systems. Phys. Rev. A, American Physical Society, v. 71, May 2005.
- 31. Pelap F. et al. Time dependent entropy and decoherence in a modified quantum damped harmonic oscillator. Journal of Quantum Information Science, v. 4, p. 214, 2014.
- 32. Özcan Ö.; Aktürk E.; Sever R. Time dependence of joint entropy of oscillating quantum systems. International Journal of Theoretical Physics, v. 47, n. 12, p. 3207–3218, 2008.
- 33. Ishizaki R.; Inoue M. Time-series analysis of foreign exchange rates using time-dependent pattern entropy. Physica A: Statistical Mechanics and its Applications, v. 392, n. 16, p. 3344–3350, 2013.
- 34. Alvarez-Ramirez J.; Rodriguez E.; Alvarez J. A multiscale entropy approach for market efficiency. International Review of Financial Analysis, v. 21, p. 64–69, 2012.
- 35. Martina E. et al. Multiscale entropy analysis of crude oil price dynamics. Energy Economics, v. 33, n. 5, p. 936–947, 2011.
- 36. Tong S.; Thakor N. Quantitative EEG Analysis Methods and Clinical Applications. [S.l.]: Artech House, 2009. (Artech House engineering in medicine & biology series).
- 37. Lin B.; et al. Does COVID-19 open a Pandora’s box of changing the connectedness in energy commodities? Res Int Bus Finance 2021.
- 38. Lin B.; et al. Oil prices and economic policy uncertainty: evidence from global, oil importers, and exporters’ perspective Res Int Bus Finance 2021
- 39. Asl M.G.; et al. Dynamic asymmetric optimal portfolio allocation between energy stocks and energy commodities: evidence from clean energy and oil and gas companies Resour Pol 2021.
- 40. Siedlecki R.; Papla D.; Bem A.; A logistic law of growth as a base for methods of company’s life cycle phases forecasting Proceedings of the Romanian Academy Series A-Mathematics Physics Technical Sciences Information Science, vol. Special Issue, p. 141–146 2018.