MetaTree: Augmented Reality Narrative Explorations of Urban Forests

Ruth West*ᵃ, Todd Margolisᵃ, Jarlath O'Neil-Dunneᵇ, Eitan Mendelowitzᶜ

ᵃUniversity of California, San Diego, 9500 Gilman Drive MC 0037, La Jolla, CA 92093-0037;
ᵇUniversity of Vermont, Rubenstein School of the Environment and Natural Resources, 81 Carrigan Drive, Aiken Center, Spatial Analysis Laboratory, Burlington, VT 05405;
ᶜComputer Science Department, Ford Hall 252, Smith College, Northampton, MA 01063

ABSTRACT

As cities worldwide adopt and implement reforestation initiatives to plant millions of trees in urban areas, they are engaging in what is essentially a massive ecological and social experiment. Existing air-borne, space-borne, and field-based imaging and inventory mechanisms fail to provide key information on urban tree ecology that is crucial to informing management and policy and to supporting citizen initiatives for the planting and stewardship of trees. The shortcomings of current approaches include limited spatial and temporal resolution, poor vantage points, cost constraints, and restrictions on the biological metrics they can capture. Collectively, these limitations reduce their effectiveness as real-time inventory and monitoring tools. Novel methods for imaging and monitoring the status of these emerging urban forests, and for encouraging their ongoing stewardship by the public, are required to ensure their success. This art-science collaboration proposes to re-envision citizens' relationship with urban spaces by foregrounding urban trees in relation to local architectural features while simultaneously creating new methods for urban forest monitoring. We explore creating a shift from overhead imaging and field-based tree survey data acquisition methods to continuous, ongoing monitoring by citizen scientists as part of a mobile augmented reality experience.
We consider the possibilities of this experience as a medium for interacting with and visualizing urban forestry data and for creating cultural engagement with urban ecology.

Keywords: mobile augmented reality, citizen science, urban forestry

1. INTRODUCTION

As major cities worldwide adopt and implement reforestation initiatives to plant millions of trees in urban areas, they are engaging in what is essentially a massive ecological experiment. For example, the City of New York is planting one million trees from 2008 to 2017¹; Albuquerque, New Mexico, has completed its first planting of one million trees and is in the process of planting its second million²; and the City of Los Angeles is also engaged in a one-million-tree planting initiative³. Non-profit and government organizations in Washington, DC are planting trees on public and private land in order to reach a 40% tree canopy goal. Monitoring programs and urban forestry research are struggling to keep up with the rapid pace of these tree-planting initiatives, leaving urban forestry managers without the comprehensive understanding they require to better inform their activities. Providing the comprehensive, timely, and actionable understanding that urban forest stewardship requires presents both challenges and opportunities in designing new methods for data collection and public participation [1]. Existing overhead remote sensing and field inventories have been used to provide key information for informing management and policy and for supporting citizen initiatives for the planting and stewardship of trees [2,3,4,5], but these data collection approaches suffer from limitations. These limitations range from the lack of spatial resolution and vantage point, to the cost of acquiring multi-temporal imagery over a growing season, to the inability to obtain cost-effective measurements of tree structure and health for an entire city.
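The paper does not specify a data model for the citizen-science monitoring it envisions, but the shift it describes, from episodic overhead or field surveys to continuous, crowd-sourced reports, can be illustrated with a minimal sketch. Everything below (the `TreeObservation` record, its fields, and the `merge_observations` helper) is our assumption, not part of the MetaTree system:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class TreeObservation:
    """One crowd-sourced report about an urban tree (hypothetical schema)."""
    tree_id: str                        # stable ID, e.g. from a city inventory
    lat: float                          # GPS fix from the contributor's phone
    lon: float
    species_guess: Optional[str] = None # optional; experts may refine it later
    dbh_cm: Optional[float] = None      # diameter at breast height, if measured
    condition: str = "unknown"          # e.g. "good", "fair", "poor", "dead"
    observed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

def merge_observations(obs: list) -> dict:
    """Keep the most recent report per tree -- a trivial stand-in for the
    aggregation a real monitoring backend would perform over time."""
    latest: dict = {}
    for o in obs:
        if o.tree_id not in latest or o.observed_at > latest[o.tree_id].observed_at:
            latest[o.tree_id] = o
    return latest
```

Under this toy model, repeated sightings of the same tree over a growing season accumulate into a time series per `tree_id`, which is precisely the multi-temporal, ground-level view that overhead imagery cannot deliver cost-effectively.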
For example, while overhead remote sensing approaches using high-resolution data from air- or space-borne platforms have been able to accurately map tree canopy [6], the complexity of the urban forest makes it impossible to apply techniques that have been used successfully in rural forests to estimate such factors as basal area and stem

¹ http://www.milliontreesnyc.org
² http://www.treenm.com
³ http://www.milliontreesla.org

The Engineering Reality of Virtual Reality 2012, edited by Ian E. McDowall, Margaret Dolinsky, Proc. of SPIE-IS&T Electronic Imaging, SPIE Vol. 8289, 82890G · © 2012 SPIE-IS&T · CCC code: 0277-786X/12/$18 · doi: 10.1117/12.912051