EEG-based methods to characterize memorised visual space⋆

Mauro Nascimben 1,2, Thomas Zoëga Ramsøy 1, and Luis Emilio Bruni 2

1 Neurons Inc, Taastrup, Denmark
https://neuronsinc.com/
2 Augmented Cognition Laboratory, Aalborg University Copenhagen, Denmark
https://augcog.aau.dk/

Abstract. One second of memory maintenance was evaluated to determine the ability of EEG metrics to track memory load and its variations connected with the lateral presentation of objects in the visual hemifield. An initial approach focused on features gathered from the N2pc time series to detect the memory load using ensemble learners. Conversely, the secondary approach employed a regularised support vector classifier to predict the area of N2pc event-related components, identifying six combined levels of memory load and stimulus location.

Keywords: Visual working memory · Memory load · Retention period.

1 Introduction

1.1 Visual working memory and cognitive load

Visual working memory (VWM) supports higher cognitive functions by providing temporary storage for information retained from one fixation to the next. The visual features primarily maintained in mind are the position, shape, colour and texture of objects in space, commonly referred to as an object's attributes above the perceptive threshold [1]. The number of items stored in memory relates to the concept of capacity [2] and is limited to 3-4 multi-attribute objects [3], depending on individual performance and task characteristics (for example, encoding time [4]). Multitasking affects both the number of memory items that can be maintained and the amount of cognitive resources expended [5]. Indeed, working memory involves not only the maintenance of information but also information processing during encoding, such as filtering out irrelevant stimuli (distractor avoidance) [6]. Such multiprocessing is commonly termed the cognitive load.
The time-based resource-sharing model [7] aims to theorize the relationship between cognitive load and memory performance by identifying four major mental stages: encoding, filtering distractors, recall, and refreshing, the last of

⋆ This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie Grant Agreement No 813234.
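As a rough illustration of the second approach summarized in the abstract (a regularised support vector classifier predicting six combined levels of memory load and stimulus location from N2pc-derived measures), the sketch below fits a linear SVC with L2 regularisation on placeholder features. The feature count, class coding (three loads × two hemifields), and the synthetic data are assumptions for demonstration only, not the study's actual pipeline.

```python
# Hedged sketch: six-class prediction (memory load x hemifield) with a
# regularised linear SVC. All features and labels below are synthetic
# placeholders standing in for N2pc-derived measures (e.g. component area).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_features = 300, 8                 # hypothetical per-trial features
X = rng.normal(size=(n_trials, n_features))   # placeholder N2pc-derived features
y = rng.integers(0, 6, size=n_trials)         # 6 classes: 3 loads x 2 hemifields

# L2-regularised linear SVC (C controls regularisation strength);
# standardising features first is standard practice for SVMs.
clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0, max_iter=5000))
scores = cross_val_score(clf, X, y, cv=5)
print(round(scores.mean(), 3))
```

With random labels the cross-validated accuracy hovers near the 1/6 chance level, which is the useful baseline against which real N2pc features would be judged.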