CCF Transactions on Pervasive Computing and Interaction
https://doi.org/10.1007/s42486-020-00028-0
REGULAR PAPER
Research on human–AI co-creation based on reflective design practice

Zhiyong Fu¹ · Yuyao Zhou¹
Received: 6 June 2019 / Accepted: 5 January 2020
© China Computer Federation (CCF) 2020
Abstract
Artificial intelligence has entered every aspect of our lives: from large-scale data analysis and repetitive work to everyday scanning and recognition, AI has become an essential application technology. How humans and AI will coexist in the future is a thought-provoking question. The involvement of AI will lead to changes in design methods and processes, so designers and researchers need to study future design from a new perspective. We selected the field of design practice and put forward the concept of human–AI co-creation (HACC), aiming to study how AI can better design and create together with people. We use ethnographic design research methods to summarize how teachers guide students in design practice, and to compile the normative vocabulary that AI uses to guide students in experiments. Finally, we summarize the basic elements of HACC, simulate an HACC experiment using the "Wizard of Oz" method, and further refine the HACC model through video observation and data analysis.
Keywords Human–AI · Co-creation · Artificial intelligence · Design practice
1 Introduction
Artificial intelligence is currently one of the hottest areas of research. While nurturing new technologies and new products, it is also driving change across industries. Within the art of artificial intelligence, robot art is the most direct and expressive artistic medium, material, and subject, with advantages in the visual arts (Zhang 2018). In AI painting today, an agent can paint strokes on a canvas in sequence to generate a painting that resembles a given target image, and some work has studied teaching machines painting-related skills such as sketching, doodling, and writing characters (Huang 2019). Human–machine art has triggered our thinking about AI: it shows the current state of human life through machines, draws attention to people's real living environment, and uses the "machine" to imagine possible futures and reflect on reality, thereby breaking with conventional artistic description and mirroring real society. In the future, with the development of science and technology, we will gradually expand from the relationship of "harmony between man and nature" to that of "man–machine integration". The two relationships may develop in parallel or in combination. Man–machine integration is the ultimate spiritual concept of the man–machine relationship and a state of subconscious integration between you and me (Zhang 2018).
Whether most jobs will be replaced by artificial intelligence in the future is also a heated topic. To achieve mutual enhancement between humans and AI, both parties need to honor and realize each other's full potential. In this process, the notion of "engagement" (Ma 2018) offers a lens into the synergistic relationship between human users and AI technologies (Niksirat et al. 2018), while co-creation is more inclusive than engagement (Feldman 2017). In the future, the human–machine relationship will become a new ecological relationship alongside those between man and man, and man and nature, so it is particularly important to formulate norms or models for it. Because machine learning (ML) is becoming an increasingly commonplace feature of new interactive systems, we should now expect UX designers to regularly instigate innovative products and services, generating many new forms that have not been imagined by the engineers who focus on making the technology work (Dove et al. 2017).
Therefore, this paper proposes the concept of human–AI co-creation (HACC), aiming to study how AI can better
* Zhiyong Fu
fuzhiyong@tsinghua.edu.cn
Yuyao Zhou
zhouyy17@mails.tsinghua.edu.cn
¹ Department of Information Art and Design, Tsinghua University, Beijing 100084, People's Republic of China