CONCURRENCY AND COMPUTATION: PRACTICE AND EXPERIENCE
Concurrency Computat.: Pract. Exper. 2012; 24:921–933
Published online 14 June 2011 in Wiley Online Library (wileyonlinelibrary.com). DOI: 10.1002/cpe.1772

SPECIAL ISSUE PAPER

Fast learning and predicting of stock returns with virtual generalized random access memory weightless neural networks

Alberto F. De Souza 1, Fabio Daros Freitas 2,*,† and André Gustavo Coelho de Almeida 1

1 Departamento de Informática, Universidade Federal do Espírito Santo, Vitória, Espírito Santo, Brazil
2 Receita Federal do Brasil, Vitória, Espírito Santo, Brazil

SUMMARY

We employ virtual generalized random access memory weightless neural networks (VG-RAM WNN) for predicting future stock returns. We evaluated our VG-RAM WNN stock predictor architecture in predicting future weekly returns of the Brazilian stock market and obtained the same error levels and properties as baseline autoregressive neural network predictors; however, our VG-RAM WNN predictor runs 5000 times faster than autoregressive neural network predictors. This allowed us to employ VG-RAM WNN predictors to build a high-frequency trading system able to achieve a monthly return of approximately 35% in the Brazilian stock market. Copyright © 2011 John Wiley & Sons, Ltd.

Received 19 March 2011; Accepted 2 April 2011

KEY WORDS: high-performance time-series prediction; weightless neural networks; high-frequency trading

1. INTRODUCTION

Neural network-based predictors have been successfully applied in predicting future stock returns and other financial variables, exhibiting many advantages over alternative methods [1–9]; however, little attention has been given to their computational performance requirements. Recently, trading firms started using sophisticated computer algorithms that must place hundreds of millions, even billions, of buy and sell orders a day, in some cases spending less than 10 ms per order.
Today, high-frequency trading is a multibillion-dollar business, accounting for an estimated 50 to 70% of the total US equity market volume, 28% of European order flows, and 16% of Asian order flows [10–12].

The standard approach to neural network time-series prediction is the autoregressive neural network (ARNN) predictor [13], also known as a focused time-lagged feedforward network [14], with p inputs (the present value and the p − 1 past values of the series) whose output is an estimate of the series value for the next time period. After being trained, the ARNN predictor implements a nonlinear multiple regression model of the time series, using its past values as explanatory variables and the network's weights as regression coefficients. However, its training procedure is intrinsically time consuming and difficult to parallelize. On the other hand, in random access memory (RAM)-based neural networks, also known as weightless neural networks (WNN) [15], training can be done in one shot and basically consists of storing the desired output in a memory position associated with the input of the neuron [16]; WNN are therefore more time efficient and easier to parallelize than ARNN.

In this work, we present a new WNN-based time-series predictor that uses virtual generalized random access memory weightless neural networks (VG-RAM WNN) [15] to predict future stock returns.

*Correspondence to: Fabio Daros Freitas, Receita Federal do Brasil, Vitória, Espírito Santo, Brazil.
E-mail: alberto@lcad.inf.ufes.br, freitas@computer.org, andre@lcad.inf.ufes.br
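To make the ARNN model concrete, the sketch below builds the p-lag design matrix described above and fits a small one-hidden-layer feedforward network by gradient descent. This is a minimal illustration under our own assumptions (the synthetic sine series, the 8-unit tanh hidden layer, the learning rate, and the function names are all ours), not the baseline predictor used in the paper:

```python
import numpy as np

def make_windows(series, p):
    """Build the ARNN inputs: each row holds the present value and the
    p - 1 past values; the target is the next value of the series."""
    X = np.array([series[t - p + 1 : t + 1] for t in range(p - 1, len(series) - 1)])
    y = np.array(series[p:])
    return X, y

rng = np.random.default_rng(0)
# toy series standing in for a weekly returns series
series = np.sin(np.linspace(0, 20, 200)) + 0.05 * rng.standard_normal(200)
p = 5
X, y = make_windows(series, p)

# one tanh hidden layer, linear output; weights are the regression coefficients
W1 = rng.standard_normal((p, 8)) * 0.1
b1 = np.zeros(8)
W2 = rng.standard_normal(8) * 0.1
b2 = 0.0
lr = 0.05
for _ in range(2000):                     # iterative training: the slow part
    h = np.tanh(X @ W1 + b1)              # hidden activations
    err = (h @ W2 + b2) - y               # prediction error
    gW2 = h.T @ err / len(y)              # mean-squared-error gradients
    gb2 = err.mean()
    gh = np.outer(err, W2) * (1 - h**2)   # backpropagate through tanh
    gW1 = X.T @ gh / len(y)
    gb1 = gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

h = np.tanh(X @ W1 + b1)
mse = np.mean((h @ W2 + b2 - y) ** 2)
print(f"in-sample MSE: {mse:.4f}")
```

The many gradient-descent passes over the data in the loop above are what make ARNN training intrinsically slow, in contrast with the one-shot WNN training discussed next.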
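The one-shot training of RAM-based neurons can be illustrated with a toy sketch: training merely stores the desired output under the binary input pattern, and (in the VG-RAM variant) testing returns the output of the stored pattern nearest in Hamming distance to the query, which is what provides generalization. The class below is our own simplified illustration, not the architecture of the paper:

```python
class VGRAMNeuron:
    """Toy VG-RAM neuron: training appends (binary input, desired output)
    pairs to memory; testing does a nearest-Hamming-distance lookup."""

    def __init__(self):
        self.memory = []  # list of (input tuple, output) pairs

    def train(self, bits, output):
        # one-shot learning: a single store, no iterative weight updates
        self.memory.append((tuple(bits), output))

    def predict(self, bits):
        # return the output associated with the closest stored pattern
        def hamming(a, b):
            return sum(x != y for x, y in zip(a, b))
        return min(self.memory, key=lambda m: hamming(m[0], bits))[1]

n = VGRAMNeuron()
n.train([1, 0, 1, 1], "up")
n.train([0, 0, 0, 1], "down")
# the query differs from the second stored pattern in only one bit
print(n.predict([0, 1, 0, 1]))  # prints: down
```

Because each training example costs a single memory write, and lookups over independent neurons are trivially data-parallel, this style of learning is both faster and easier to parallelize than the gradient-based ARNN training.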