Journal of Business Ethics
https://doi.org/10.1007/s10551-019-04205-9

ORIGINAL PAPER

Technological Unemployment, Meaning in Life, Purpose of Business, and the Future of Stakeholders

Tae Wan Kim (1) · Alan Scheller-Wolf (2)

Received: 1 September 2017 / Accepted: 27 May 2019
© Springer Nature B.V. 2019

Abstract
We offer a precautionary account of why business managers should proactively rethink what kinds of automation firms ought to implement, by exploring two challenges that automation will potentially pose. We engage the current debate concerning whether life without work opportunities will incur a meaning crisis, offering an argument in favor of the position that if technological unemployment occurs, the machine age may be a structurally limited condition for many without work opportunities to have or add meaning to their lives. We term this the axiological challenge. This challenge, if it turns out to be persuasive, leads to a second challenge, to which managers should pay special attention: the teleological challenge, a topic especially relevant to the broad literature about corporate purpose and governance. We argue that both the shareholder profit-maximization model and its major alternative, stakeholder theory, are insufficient to address the meaning crisis. Unless rebutted, the two challenges compel business leaders to proactively rethink the purpose of business for future society. Otherwise, businesses will be contributors to a major ethical crisis and societal externality in the coming society.

Keywords Automation · Meaning of work · Stakeholder

I think everybody should be a machine.
—Andy Warhol

Whether one agrees with Andy Warhol or not, there is no denying that the relationship between people and machines—more specifically workers and machines—is rapidly evolving.
And as this evolution unfolds, it is yet to be determined whether the two groups will find a mutualistic equilibrium, or whether machines will emerge as dominant in the workplace, greatly diminishing, if not essentially extinguishing, the role of workers. As we write this, businesses are automating workplaces with advanced technologies, including but not limited to driverless cargo trucks, artificially intelligent mortgage approvals, machine-learning-based paralegals, and algorithmic managers. Such technological advancement raises a host of normative questions (Bhargava and Kim 2017; Hooker and Kim 2018; Martin 2016; Parmar and Freeman 2016). As Thomas Donaldson recently remarked, "It's an instance of a problem that more sophisticated engineering cannot solve, and that requires a more sophisticated appeal to values" (Ufberg 2017). One normative question that has received much public attention is whether the government ought to offer a basic income to everyone if robots take over human jobs on an unprecedented scale in the near future (Ito et al. 2016; Van Parijs 2004), often called "the second machine age" (Brynjolfsson and McAfee 2014). Although this is an important public policy question, its seeming focus on the government as the sole agent responsible for mitigating societal problems can obscure questions about the role and accountability that businesses themselves should accept, especially regarding workplace automation and its potential impact upon unemployment.(1) For example, it could lead one to

* Tae Wan Kim
twkim@andrew.cmu.edu
Alan Scheller-Wolf
awolf@andrew.cmu.edu

1 Tepper School of Business, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA
2 Richard M. Cyert Professor of Operations Management, Tepper School of Business, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA

(1) We use the terms "accountability" almost interchangeably with "role" or "responsibility." Our uses of "accountability" are close to Scanlon's (1998) uses of his term "substantive responsibility," when