Economy Informatics, vol. 12, no. 1/2012

SOA and Web Technology for Computing the Intrinsic Entropy of BSE Listed Stocks

Claudiu VINŢE (1), Ion SMEUREANU (1), Ionuţ-Alexandru LIXANDRU (2)
(1) Bucharest University of Economic Studies, (2) Bucharest Stock Exchange
claudiu.vinte@ie.ase.ro, smeurean@ase.ro, alexandru.lixandru@bvb.ro

Measuring investors' level of interest in a traded equity product can provide, along with the status of the stock market as a whole, a consistent means for building a hierarchy among the traded equities. The concept of intrinsic entropy associated with an exchange-traded equity has the ability to capture the factual perception of investors regarding the performance of a publicly traded company. Our ongoing research has been conducted on the transactions executed on the Bucharest Stock Exchange (BSE), and aims to prove that price-variation-weighted entropy can offer a synthetic and readily computed indicator for evaluating the direction and intensity of trading activity. This paper focuses on how trade data is captured, and how stock intrinsic entropy is computed and presented, within a SOA solution.

Keywords: Service-Oriented Architecture (SOA), Message-Oriented Middleware (MOM), Java Message Service (JMS), Stock Intrinsic Entropy, Web Services

1 Introduction
The entropy concept originated in physics, and it plays important roles in numerous other disciplines, ranging from logic and statistics to biology and economics. However, there is no unique or unified interpretation of the concept. Entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. Even when entropy is defined in terms of probabilities, these can be either chances (physical probabilities) or credences (degrees of belief) [1].

In 1878, Gibbs defined the most general formula for the entropy of a thermodynamic system, building on earlier work by Boltzmann (1872). The Gibbs entropy is

S = -k_B \sum_i p_i \ln p_i

where k_B is the Boltzmann constant and p_i is the probability that the system is in the i-th microstate; the summation runs over all possible microstates of the system [1]. For most practical purposes, this can be taken as the fundamental definition of entropy, since all other formulas for S can be mathematically derived from it, but not vice versa.

The most general interpretation of entropy is as a measure of our uncertainty about a system. The equilibrium state of a system maximizes the entropy because we have lost all information about the initial conditions except for the conserved variables. This uncertainty is not of the everyday subjective kind, but rather the uncertainty inherent in the experimental method and the interpretative model. The interpretative model has a central role in determining entropy: the set of macroscopic variables one chooses must include everything that may change in the experiment; otherwise, one might observe decreasing entropy [2]. We will see later on that the number of states observed, or taken into consideration, can have a significant impact on the value of entropy, and in shaping one conclusion or another during the analysis.
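The Gibbs formula lends itself to a direct numeric illustration. The sketch below is written in Java, since the paper's solution is built around JMS; the class name, method name, and the microstate probabilities are our own hypothetical choices for illustration, not part of the authors' implementation.

```java
// Minimal sketch: Gibbs entropy S = -k_B * sum_i p_i * ln(p_i).
// The microstate probabilities in main() are hypothetical.
public class GibbsEntropy {

    // Boltzmann constant in J/K
    static final double K_B = 1.380649e-23;

    // Entropy of a discrete probability distribution over microstates.
    static double gibbsEntropy(double[] p) {
        double sum = 0.0;
        for (double pi : p) {
            if (pi > 0.0) {              // p*ln(p) -> 0 as p -> 0, so zero-probability states are skipped
                sum += pi * Math.log(pi); // Math.log is the natural logarithm
            }
        }
        return -K_B * sum;
    }

    public static void main(String[] args) {
        double[] p = {0.5, 0.25, 0.25};  // hypothetical microstates
        System.out.printf("S = %.6e J/K%n", gibbsEntropy(p));
    }
}
```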
In information theory, entropy is a measure of the uncertainty associated with a random variable. In this context, the term usually refers to the Shannon entropy [3], which quantifies the expected value of the information contained in a message, usually in units such as bits.
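Since Shannon entropy is defined over a probability distribution, the per-symbol entropy of a message can be estimated from the symbol frequencies observed in it. The following minimal sketch illustrates this reading (hypothetical class and method names, our own illustration rather than code from the paper's SOA solution): a constant message carries 0 bits per symbol, while four equally likely symbols carry 2 bits.

```java
import java.util.HashMap;
import java.util.Map;

public class MessageEntropy {

    // Shannon entropy H = -sum_i p_i * log2(p_i), in bits per symbol,
    // with p_i estimated from symbol frequencies in the message.
    static double entropyBits(String message) {
        Map<Character, Integer> counts = new HashMap<>();
        for (char c : message.toCharArray()) {
            counts.merge(c, 1, Integer::sum);
        }
        double h = 0.0;
        int n = message.length();
        for (int count : counts.values()) {
            double p = (double) count / n;
            h -= p * (Math.log(p) / Math.log(2.0)); // log base 2 gives bits
        }
        return h;
    }

    public static void main(String[] args) {
        System.out.println(entropyBits("aaaa")); // 0.0 bits: no uncertainty
        System.out.println(entropyBits("abab")); // 1.0 bit: like a fair coin
        System.out.println(entropyBits("abcd")); // 2.0 bits: four equally likely symbols
    }
}
```

In the context of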