Online Risk Analytics on the Cloud

Hyunjoo Kim, Shivangi Chaudhari, Manish Parashar*, and Christopher Marty†

*NSF Center for Autonomic Computing, Department of Electrical & Computer Engineering, Rutgers, The State University of New Jersey, 94 Brett Road, Piscataway, NJ, USA
Email: {hyunjoo, shivangc, parashar}@rutgers.edu
†Bloomberg LP, New York, NY, USA
Email: cmarty@bloomberg.net

Abstract—In today's turbulent market conditions, the ability to generate accurate and timely risk measures has become critical to operating successfully, and necessary for survival. Value-at-Risk (VaR) is a market-standard risk measure used by senior management and regulators to quantify the risk level of a firm's holdings. However, the time-critical nature and dynamic computational workloads of VaR applications make it essential for computing infrastructures to handle bursts in computing and storage resource needs. This requires on-demand scalability, dynamic provisioning, and the integration of distributed resources. While emerging utility computing services and clouds have the potential to cost-effectively support such spikes in resource requirements, integrating clouds with computing platforms and data centers, as well as developing and managing applications to utilize the platform, remains a challenge. In this paper, we focus on the dynamic resource requirements of online risk analytics applications and how they can be addressed by cloud environments. Specifically, we demonstrate how the CometCloud autonomic computing engine can support online multi-resolution VaR analytics using an integration of private and Internet cloud resources.

I. INTRODUCTION

In today's turbulent market conditions, financial firms must carefully monitor and manage risk or face severe consequences, up to and including bankruptcy. Accordingly, the ability to generate accurate and timely risk measures has become critical to operating successfully, and necessary for survival.
Value-at-Risk (VaR) is a market-standard risk measure used by senior management and regulators to quantify the risk level of a firm's holdings. The VaR measure looks at the entirety of the firm's holdings at a given confidence interval and time horizon, and reports an expected loss number. For example, a 1-day 99% VaR number of $1 million means that, with 99% confidence, the firm's holdings will not decrease in value by more than $1 million over the next day. The non-linear nature of instrument pricing models and the requirement to preserve correlations of price movements make developing a closed-form solution to this problem virtually impossible for all but the most trivial portfolios. Large, complex VaR calculations are therefore typically done using computationally intensive Monte Carlo simulations.

Consider a medium-size firm holding positions in 20,000 different financial instruments. Running a 100,000-simulation Monte Carlo VaR calculation requires generating 2 billion simulated instrument prices. With a conservative estimate of 10 ms per pricing, this calculation requires more than 5,500 hours of processor time, or roughly 700 processors working concurrently over an 8-hour window. As a result, the capital cost of hardware plus the operational costs of data center space, power, cooling, and maintenance make this prohibitive to all but the largest firms.

(The research presented in this paper is supported in part by the National Science Foundation via grant numbers IIP-0758566, CCF-0833039, DMS-0835436, CNS-0426354, IIS-0430826, and CNS-0723594, and by the Department of Energy via grant number DE-FG02-06ER54857, and was conducted as part of the NSF Center for Autonomic Computing at Rutgers University. The research is also supported by Amazon Elastic Compute Cloud.)
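The two calculations above, estimating VaR as a quantile of simulated portfolio P&L and sizing the processor pool needed to finish within the overnight window, can be sketched as follows. This is an illustrative NumPy sketch, not the paper's implementation: the function names and the normally distributed toy P&L are our own assumptions (a real calculation would reprice every instrument under correlated market scenarios).

```python
import numpy as np

def monte_carlo_var(pnl_scenarios, confidence=0.99):
    """Estimate VaR as the loss at the given confidence level from
    a vector of simulated portfolio P&L outcomes (hypothetical helper)."""
    # VaR is the (1 - confidence) quantile of the P&L distribution,
    # reported as a positive loss number.
    return -np.percentile(pnl_scenarios, 100 * (1 - confidence))

def required_processors(n_instruments, n_sims, ms_per_pricing, window_hours):
    """Back-of-envelope sizing from the text: total processor-hours
    and the concurrent processors needed to finish in the window."""
    total_hours = n_instruments * n_sims * ms_per_pricing / 1000 / 3600
    return total_hours, total_hours / window_hours

# Toy example: 100,000 simulated 1-day P&L outcomes for the whole
# portfolio, drawn from a normal distribution (assumption for brevity).
rng = np.random.default_rng(0)
pnl = rng.normal(loc=0.0, scale=1_000_000, size=100_000)

var_99 = monte_carlo_var(pnl, confidence=0.99)
# For normal P&L with sigma = $1M, the 1-day 99% VaR is roughly
# 2.33 * sigma, i.e. about $2.33M.

# The paper's sizing example: 20,000 instruments x 100,000 simulations
# at 10 ms per pricing is about 5,555 processor-hours, or roughly
# 700 concurrent processors over an 8-hour overnight window.
hours, procs = required_processors(20_000, 100_000, 10, 8)
```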
The tradeoff between increased cost and complexity on one hand, and careful and accurate risk measures on the other, has driven financial firms to look for innovative ways to decrease computing costs while, at the same time, increasing the quality of their risk measurements. A number of properties of the VaR calculation make it a compelling candidate for a cloud computing architecture. A VaR calculation will typically start after the end of the trading day, when market data and final positions have been verified. It must complete, and updated risk numbers must be available, before the start of the next trading day. As the number and complexity of positions change, the computational requirements for the calculation can change significantly; however, the completion deadline of the beginning of the next trading day remains fixed. Furthermore, as market conditions change, a firm may want to vary the number of Monte Carlo scenarios run (and thus the resolution of the calculation), which adds further variability to the computation time. This need for additional computation arises irregularly. Moreover, given the significant computational resources necessary for VaR, statically over-provisioning computing resources to account for worst-case requirements does not make economic sense. As a result, the elastic nature of cloud computing resources makes it a very attractive solution.

Additionally, the IT staff of a typical financial institution lacks the specialized development and operational resources necessary to build and operate a large-scale distributed processing environment. The ability to outsource this complexity, as well as to take advantage of favorable pricing due to the off-hours nature of the VaR calculation, is very compelling. How-