Medical Hypotheses
Dietary assessment can be based on pattern recognition rather than recall
D.L. Katz a,⁎,1, L.Q. Rhee a, C.S. Katz a, D.L. Aronson a, G.C. Frank b, C.D. Gardner c, W.C. Willett d, M.L. Dansinger e

a Diet ID, Inc, Detroit, MI, United States
b Department of Family and Consumer Sciences, California State University, Long Beach, United States
c Stanford Prevention Research Center, Department of Medicine, Stanford University Medical School, Stanford, CA, United States
d Harvard T.H. Chan School of Public Health, Harvard Medical School, Boston, MA, United States
e Boston Heart Diagnostics, Framingham, MA, United States
ABSTRACT
Diet is the leading predictor of health status, including all-cause mortality, in the modern world, yet is rarely measured; whereas virtually every adult in a developed
country knows their approximate blood pressure, hardly anyone knows their objective diet quality. Leading authorities have called for the inclusion of nutrition in every
electronic health record as one of the many remedial steps required to give dietary quality the routine attention it warrants. Existing tools to capture dietary intake
are based on either real-time journaling or recall. Journaling, or logging, is time and labor intensive. Recall is notoriously unreliable, as humans are notably bad at
remembering detail. Even allowing for the challenge of recall, these dietary intake methods are labor and time intensive, and require analysis at the n-of-1 level. We
hypothesize that dietary intake assessment can be “reverse engineered”—predicating assessment on the recognition of fully formed dietary patterns—rather than
endeavoring to assemble such a representation one food, meal, dish, or day at a time. This pattern recognition-based method offers potential advantages over existing
methods, including speed, efficiency, cost, and applicability. We have developed and provisionally tested such a system, and the results thus far support our
hypothesis. We are convinced that leveraging pattern recognition to make dietary assessment quick, user-friendly, economical, and scalable can allow for the
conversion of dietary quality into a universally measured and routinely managed vital sign. In this paper, we present the supporting case.
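As a conceptual illustration only (this is not the authors' actual system), pattern-recognition-based assessment can be thought of as matching a coarse food-group profile against a small library of fully formed dietary patterns and reporting the nearest one. All pattern names and serving counts below are hypothetical.

```python
# Hypothetical sketch: match a user's coarse food-group profile to the
# nearest fully formed dietary pattern. All patterns and values are
# invented for illustration, not drawn from the paper.
import math

# Reference patterns as weekly servings of
# (vegetables, fruit, whole grains, red meat, sweets).
PATTERNS = {
    "Mediterranean-style": (28, 21, 14, 2, 3),
    "Typical Western":     (10,  5,  4, 9, 12),
    "Plant-based":         (30, 24, 16, 0, 2),
}

def nearest_pattern(profile):
    """Return the name of the reference pattern closest
    (by Euclidean distance) to the given food-group profile."""
    return min(
        PATTERNS,
        key=lambda name: math.dist(profile, PATTERNS[name]),
    )

print(nearest_pattern((26, 20, 13, 3, 4)))  # nearest to Mediterranean-style
```

In practice a system of this kind would use far richer pattern representations; the point of the sketch is only that assessment becomes a recognition step over whole patterns rather than an item-by-item reconstruction of intake.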
Introduction
Diet quality is the single most critical predictor of all-cause mortality, chronic disease risk, longevity, and vitality in the United States and much of the modern world [1,2]. Key indicators of health risk such as blood pressure, lipid levels, glucose levels, and various other biomarkers are measured routinely to facilitate both treatment to specific goal levels, and the monitoring of trends that dictate a course for management and treatment [3]. Diet is a notable exception, in whole or in part due to the intrinsic limitations of prevailing dietary assessment instruments [4].
Dietary assessment is the collection and measurement of a person’s food intake to determine dietary pattern, diet quality, and/or health risk. Traditional methods for routine use in dietetics, clinical care, research, and epidemiology rely either on recall of dietary details, or real-time logging of food intake [5]. These methods are typically tedious, time-consuming, labor-intensive, and in the case of recall-based methods (see Table 1), prone to considerable inaccuracies [6–9]. This problem is well-known, as illustrated by recent calls for improvements in dietary assessment methods by both the Gates Foundation [10] and the National Institutes of Health [11].
Many diet assessment tools, such as 24-hour recalls, dietary histories, and food frequency questionnaires, differ in methodology but share an ultimate dependence on recall—that is, memory—on the part of the patient. Unfortunately, relying solely on memory introduces significant inaccuracy and error, while food and meal logging (whether written/typed or via photo image capture) suffers from poor estimates and reporting bias.
In spite of their inherent limitations, existing dietary assessment methods are of established value, correlating with biomarkers [12,13], and, when stratified into quintiles of diet quality, with total chronic disease risk [14,15] and all-cause mortality [16–18]. The semi-quantitative food frequency questionnaire routinely serves as the “gold standard” for assessing dietary intake in large study populations [19]; the 7-day weighed food record is the standard when more intensive and time-consuming methods are tenable.
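For intuition, the quintile stratification mentioned above can be sketched as cutting a diet-quality score at the 20th, 40th, 60th, and 80th percentiles of a study population. The scores and scale below are invented; real indices (such as published diet-quality scores) have their own scoring conventions.

```python
# Hypothetical sketch: stratify diet-quality scores into population
# quintiles. Scores are invented for illustration.
import statistics

scores = [72, 55, 88, 41, 63, 90, 50, 77, 68, 59,
          45, 82, 71, 66, 53, 94, 48, 61, 79, 57]

# Four cut points at the 20th/40th/60th/80th percentiles.
cuts = statistics.quantiles(scores, n=5)

def quintile(score):
    """Return 1 (lowest diet quality) through 5 (highest)."""
    return 1 + sum(score > c for c in cuts)

print(quintile(90))  # a high score lands in the top quintile
```

Epidemiologic analyses then compare outcome rates (e.g., chronic disease incidence, all-cause mortality) across these strata rather than against raw scores.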
The evolution of dietary assessment methods has been quite limited over a span of decades. Pen-and-paper assessment tools have been adapted electronically for online use; intake questionnaires have become more responsive and “smarter,” and food logging now entails electronic inputs backed by comprehensive nutrient databases [21]. While these adaptations have reduced the need for manual calculations,
https://doi.org/10.1016/j.mehy.2020.109644
Received 2 January 2020; Received in revised form 24 February 2020; Accepted 25 February 2020
⁎ Corresponding author at: 1001 Woodward Ave, Suite 500, Detroit, MI 48226, United States. E-mail address: dkatz@dietid.com (D.L. Katz).
1 ORCID ID: http://orcid.org/0000-0001-6845-6192.
Medical Hypotheses 140 (2020) 109644
0306-9877/ © 2020 Elsevier Ltd. All rights reserved.