Wearables: A Path to Pandemic Management

A note on the authors: This article was co-authored and published simultaneously by Keith Deutsch and Peter Orban.

In “A New Division of Labor: IoT, Wearables and the Human Workforce”, the authors argued that the emergence of increasingly autonomous systems in the industrial world, including AI and robotics, calls for a new human-machine interaction paradigm centered on wearable technology.

Prompted by the challenges created by insufficient capacity to detect the novel coronavirus that causes COVID-19, this article expands that wearable-centered rationale for human-digital integration, explores its potential in the medical and consumer worlds, and considers where it can be expected to go next.

To frame the benefit of such an approach, imagine that the availability of self-administered test kits (and processing capacity) is unlimited, results are returned in a few seconds, and the associated cost is negligible. This could immediately identify those affected, enabling individualized quarantine (and treatment) while responsibly eliminating the need for a widespread lockdown and its associated economic penalty.

Given the current state of medical technology, which relies on physical samples, reagents, and dedicated laboratories, this is not realistic. But a wearable path to human-digital interaction may offer another approach.

A Wearable Universe Emerging

For the purpose of this article, we consider wearables to be portable devices that people attach to their bodies and operate primarily without the use of hands. The majority are worn on the wrist, but there are also eye, ear, finger, neck, and body-worn devices. The category has roots in defense, sports, entertainment, and health, but started to coalesce into a single consumer category with the introduction of the iconic Fitbit more than a decade ago. Measured by value, the vast majority of consumer wearable devices today are smartwatches and fitness trackers, with an increasingly blurred line between the two categories.

The major innovation that has driven the emergence of these devices is the ability to make, and make use of, detailed measurements of the Wearer's physical state. Whereas the earliest wearables were limited to using accelerometers to detect and count steps, as sensor technology evolved, wrist-worn devices were packed with additional sensors measuring pulse, heart rate variability, blood oxygen level, ECG, blood pressure, and galvanic skin response. These, in turn, were combined with other sensor data (e.g. GPS) and increasingly sophisticated analytics to create “virtual” measurements of sleep, stress, calories burned, and overall health indicators. However, all these share one common attribute: the data produced has been good enough for consumer use, but not regarded as adequate for clinical use in a more formal healthcare setting.
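
To make the idea of a “virtual” measurement concrete, the minimal sketch below shows how raw step and pulse readings might be blended into a derived metric such as calories burned. The coefficients and structure are illustrative placeholders, not validated physiology or any vendor's actual algorithm.

```python
# Minimal sketch: deriving a "virtual" metric from raw wearable signals.
# The coefficients below are illustrative placeholders, not validated physiology.

from dataclasses import dataclass

@dataclass
class SensorSample:
    steps_per_min: float      # from accelerometer step counting
    heart_rate_bpm: float     # from optical pulse sensor

def estimate_calories_per_min(sample: SensorSample, weight_kg: float = 70.0) -> float:
    """Very rough energy-expenditure estimate blending movement and pulse."""
    movement_term = 0.04 * sample.steps_per_min                # activity contribution
    pulse_term = 0.06 * max(sample.heart_rate_bpm - 60, 0)     # elevated-pulse contribution
    return (movement_term + pulse_term) * (weight_kg / 70.0)

samples = [SensorSample(0, 58), SensorSample(110, 95), SensorSample(20, 72)]
print([round(estimate_calories_per_min(s), 2) for s in samples])
```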

Upping the Ante: From Casual Measurement to Clinical Imperative

The healthcare industry today spends vast sums of money on complex devices that produce extremely high-quality but narrowly scoped data snapshots of the inner workings of a patient’s body. Think MRI or CAT scan machines. The monitoring equipment in an ICU also produces very high-quality data, and does so over a longer interval, but again in a highly constrained setting. The problem with these snapshots, as valuable as they are, is that they cover only a narrow window of time. Because they’re so expensive and focused, these tools are primarily diagnostic in nature. A physician is generally already observing a problem and seeking specific information to guide diagnosis and, because the data is being used in such a definitive way, there is very little tolerance for noise or uncertainty. This emphasis on definitive diagnosis places an enormous premium on precision and certainty, which has come to characterize the notion of clinical data use.

Most of life, and most of what happens in and to our bodies, happens outside the clinical diagnostic window, and over much longer periods of time. Capturing insights from that stream of life requires a more ubiquitous approach to data gathering, including data from inherently noisy environments. Moreover, most of the time outside the clinical setting, we are clinically “normal”. The data gathered, and any insights garnered, would be mostly “baseline” observations, perhaps tracking the baseline as it drifts over time. When things go wrong, they might show up as variances or anomalies in that baseline, but detecting these requires ongoing collection and storage of the baseline data. Because the clinical community has never had the means to capture broad baseline characterizations on an individual basis, it has never developed methodologies that depend on it. Ever wonder why your primary care provider has no interest in all that Fitbit data you’ve accumulated?
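
As a rough illustration of what baseline tracking could look like, the sketch below keeps a rolling personal baseline for a single signal (resting heart rate is assumed here) and flags values that drift far outside it. The window length and z-score threshold are illustrative assumptions, not clinical parameters.

```python
# Minimal sketch: tracking a personal baseline and flagging anomalies against it.
# The signal choice (resting heart rate) and threshold are illustrative assumptions.

from collections import deque
from statistics import mean, stdev

class BaselineTracker:
    def __init__(self, window_days: int = 30, z_threshold: float = 2.5):
        self.history = deque(maxlen=window_days)   # rolling personal baseline
        self.z_threshold = z_threshold

    def observe(self, resting_hr: float) -> bool:
        """Return True if today's value is anomalous relative to the rolling baseline."""
        anomalous = False
        if len(self.history) >= 7:                 # need some history before judging
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(resting_hr - mu) / sigma > self.z_threshold:
                anomalous = True
        self.history.append(resting_hr)            # the baseline drifts as new data arrives
        return anomalous

tracker = BaselineTracker()
readings = [62, 61, 63, 60, 62, 61, 63, 62, 74]    # the last value is unusually elevated
print([tracker.observe(r) for r in readings])
```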

On the other hand, as the current crisis so deeply illustrates, there is a need to expand the clinical window from a reactive view to something more proactive and predictive. This is leading to a transformation in how the data captured from wearable devices is used. In part, this transformation is being driven by improvements in the quality of the data produced by wearable sensors (e.g. Lifesignal’s unique sensor), in some cases sufficient to warrant FDA approval. However, equally important is the growing sophistication of the analytic methods applied to the data. In the industrial context, it is increasingly a given that large, diverse streams of data, analyzed over prolonged periods of time, can produce remarkably reliable and precise predictive, pre-emptive, and diagnostic indicators. Rather than service a jet engine on a schedule, or wait until it fails, a detailed and comprehensive picture of the internal condition of the equipment is continually derived from hundreds to thousands of sensors embedded within it. Just as important, by examining its historical data, and that of many other engines in the field, anomalies can be detected even when the equipment seems to be in perfect operating condition. In this way, data can be used to detect not only known failure conditions but also unknown ones.

Of Sensors and Humans

Unlike a modern jet engine, humans don’t come with a comprehensive suite of embedded sensors, and the number of streams of data available for analysis is therefore much more limited. Still, the combination of wearable and portable devices (e.g. cell phones) provides a surprisingly rich source of data streams. Moreover, unlike a jet engine, where each sensor reports exactly one metric about one element of the engine, the sensors we carry with us are configured to monitor much richer signals. For example, the Samsung Galaxy Watch Active 2’s motion sensor data can be processed to reveal signals that inform analysis of our gait, how much exercise we’re getting, sleep patterns, and breathing patterns, or, combined with pulse data, even the Wearer’s stress level. Moreover, despite technical challenges in doing so, ambient sensors already present in the environment can be added to the mix. For example, a security camera in an office, which may already be doing facial recognition, could also be harnessed to detect changes in complexion, or its range could be extended into the infrared spectrum to look for evidence of elevated body temperature. Additionally, human beings have a critical feature that is lacking in industrial machinery: we can be directly queried, and the assessment of our data can be iterative based on our direct input. Finally, all of this field data can be combined with clinical data from Electronic Health Records.
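
A minimal sketch of this kind of sensor fusion follows: motion and pulse streams are combined into a crude stress indicator, and, because humans can be queried directly, the wearer is asked to confirm before any flag is raised. The heuristic and its weights are illustrative assumptions, not a published algorithm.

```python
# Minimal sketch: fusing two wearable streams (motion + pulse) into a crude
# stress indicator, then confirming with a direct query to the wearer.
# The heuristic (high pulse with low movement suggests stress) is an illustrative assumption.

def stress_score(heart_rate_bpm: float, steps_per_min: float) -> float:
    """Elevated pulse without matching physical activity is treated as a stress signal."""
    expected_hr = 60 + 0.3 * steps_per_min               # crude activity-adjusted expectation
    return max(heart_rate_bpm - expected_hr, 0) / 40.0   # scaled to roughly 0..1

def assess(heart_rate_bpm: float, steps_per_min: float) -> str:
    score = stress_score(heart_rate_bpm, steps_per_min)
    if score < 0.3:
        return "normal"
    # Humans, unlike jet engines, can be asked directly.
    answer = input("Elevated readings. Are you feeling unwell? (y/n) ")
    return "flag_for_follow_up" if answer.strip().lower() == "y" else "monitor"

print(assess(heart_rate_bpm=72, steps_per_min=40))       # -> "normal"
```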

Once assembled, this data collection can be processed in at least a couple of ways using Machine Learning (ML) techniques pioneered in the industrial world. First, individual baselines can be used to detect anomalies that might indicate that an individual is under health stress, even when he or she feels perfectly fine. Second, when a known pathogen is being tracked, data from individuals with confirmed cases of the disease can be used to train ML models. In this way, digital “fingerprints” of a disease are created that can be applied to detect the disease in people as they go about their lives. In effect, these techniques provide a path to data-based testing and detection regimens that can be broadly and continually applied to a large population.
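
For the second approach, the sketch below hints at what training a disease “fingerprint” might look like, using synthetic daily-summary features for confirmed cases and healthy controls and an off-the-shelf classifier. The feature set, numbers, and model choice are illustrative only; a real study would use labeled clinical outcomes.

```python
# Minimal sketch: learning a disease "fingerprint" from wearable-derived daily features
# of confirmed cases vs. healthy controls. All features and data here are synthetic
# illustrations, not real study data.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Features per person-day: [resting HR delta, skin temp delta (C), sleep change (h)]
healthy = rng.normal([0.0, 0.0, 0.0], [1.5, 0.2, 0.5], size=(200, 3))
infected = rng.normal([8.0, 0.8, -1.0], [2.0, 0.3, 0.7], size=(200, 3))

X = np.vstack([healthy, infected])
y = np.array([0] * 200 + [1] * 200)   # 1 = confirmed case

model = LogisticRegression().fit(X, y)

# Apply the learned "fingerprint" to a new day's readings from the field.
new_day = np.array([[6.5, 0.6, -0.8]])
print("infection probability:", round(model.predict_proba(new_day)[0, 1], 2))
```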

In fact, interested parties are already experimenting with partially similar approaches.[1,2,3,4] The upshot is that, despite the inherent resistance to change in the clinical community, the critical need to expand the clinical window is driving the acceptance of data analysis techniques applied to wearable sensors as a clinically valid source of predictive and pre-emptive health indicators.

Closing the Loop: From Prediction to Prescription

While approaches like the ones currently being explored differ on a technical level, they all share a key attribute: a one-way flow. Data flows from the field (wearers, the environment, or labs) to a central platform where the detection algorithms are applied. In cases where the condition or disease is detected, the path to action is at best indirect.

On the other hand, as noted above, human intelligence creates the possibility of iterative, human-in-the-loop, analytic models. This, however, is just a starting point: the universe of wearable equipment is not limited to smartwatches that send data. It includes devices like earbuds (e.g. Jabra Elite Sport) and smart glasses (e.g. Focals) that can provide a real-time feedback path to the user. In this way, wearables enable the creation of a full, human-centered, behavioral loop. This behavioral loop presents a previously overlooked dynamic: the ability to directly influence, and possibly change, individual behavior in response to emerging threats. The behavioral loop literally allows for prescribing and monitoring the behavior of individuals in response to a possible emerging adversarial health condition, like a pandemic.
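
To illustrate what closing the loop might involve, the sketch below turns a detection result into an individualized instruction routed to an appropriate wearable channel. The message schema, risk tiers, and push_to_device() transport are hypothetical illustrations, not an existing API.

```python
# Minimal sketch of the "behavioral loop": detection results flowing back to the
# wearer as individualized, device-appropriate instructions. The schema, risk tiers,
# and transport below are hypothetical.

from dataclasses import dataclass

@dataclass
class Advisory:
    wearer_id: str
    risk_tier: str       # e.g. "baseline_anomaly" or "fingerprint_match"
    instruction: str
    channel: str         # "watch_haptic", "earbud_audio", "glasses_hud"

def build_advisory(wearer_id: str, risk_tier: str) -> Advisory:
    if risk_tier == "fingerprint_match":
        return Advisory(wearer_id, risk_tier,
                        "Possible infection signature detected. Please shelter in place "
                        "and complete the symptom survey.", "glasses_hud")
    return Advisory(wearer_id, risk_tier,
                    "Readings outside your baseline. Reduce close contact today.",
                    "watch_haptic")

def push_to_device(advisory: Advisory) -> None:
    # Placeholder for whatever transport a real platform would use.
    print(f"[{advisory.channel}] -> {advisory.wearer_id}: {advisory.instruction}")

push_to_device(build_advisory("wearer-042", "baseline_anomaly"))
```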

Think of the current system of Wireless Emergency Alerts (WEA) that warns the public about dangerous weather, missing children, and other critical situations. But unlike the WEA, the behavioral loop is individualized, dynamic, and able to provide specific instructions in a non-intrusive manner. A fully integrated system of wearables can be thought of as a bidirectional communication and control system. The possibilities for real-time analysis, and both group and individual behavioral influence are enormous.

In Summary

Put differently, a wearables-based approach would enable society to identify members under threat at various levels of risk and to take individualized action via an established behavioral loop. This could mean instructing suspected patients to shelter in place in dense urban environments, or simply “flagging” the person to other, uninfected people for safe distance keeping in areas of low population density.

Any of these approaches will inevitably encounter the sensitivities around Personally Identifiable Information (PII): individuals will want the ability to control and regulate their data flow. In this regard, the behavioral loop enables a form of negotiation; for example, individuals can be incentivized to opt into higher levels of sharing in exchange for higher levels of benefit, such as earlier warning or faster testing. A good illustration of this principle is the Global Traveler program, where registered participants can bypass security lines at airports by volunteering PII.
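
One way such a negotiation might be encoded is sketched below: hypothetical opt-in tiers mapping what a wearer agrees to share onto the benefits they receive. The tier names, data categories, and benefits are illustrative assumptions, not a real policy.

```python
# Minimal sketch: hypothetical opt-in tiers trading data sharing for benefits,
# part of the negotiation the behavioral loop makes possible.

SHARING_TIERS = {
    "minimal": {
        "shares": ["anonymous_aggregates"],
        "benefits": ["regional alerts"],
    },
    "standard": {
        "shares": ["daily_summaries"],
        "benefits": ["personal baseline alerts"],
    },
    "enhanced": {
        "shares": ["daily_summaries", "symptom_surveys", "approximate_location"],
        "benefits": ["early warning", "priority testing"],
    },
}

def consented_fields(tier: str) -> list:
    """Return only the data categories the wearer has agreed to share."""
    return SHARING_TIERS.get(tier, SHARING_TIERS["minimal"])["shares"]

print(consented_fields("enhanced"))
```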

Taken together, the result is that society could accomplish objectives like “flattening the curve” without major economic disruption since the reaction to infection would be individually prescribed and proportionate to the risk a single individual may represent.

Footnotes:

  1. One of the most visible initiatives is archrivals Apple & Google teaming up for contact tracing.
  2. Another group of researchers, led by MIT, aims to do the same, albeit in a more privacy-friendly manner.
  3. Fitbit (now part of Google) has teamed up with Stanford University and Scripps Research and unveiled an initiative that intends to use Fitbit devices to identify the early signs of infection.
  4. Oura, another wearable manufacturer, is sponsoring research at the “...University of California, San Francisco (UCSF) to study whether physiological data collected by the Oura ring, combined with responses to daily symptom surveys, can predict illness symptoms.”

A New Division of Labor: IoT, Wearables and the Human Workforce

A note on the authors: This article was co-authored and published simultaneously by Keith Deutsch and Peter Orban.

The personal computer has had a monumental impact on the productivity of desktop workers. The era of blueprints, typewriters, desk phones, and paper files has given way to CAD, speech recognition, email, and software platforms. Just about every aspect of how work gets done has completely changed in the past three decades, ranging from creation to communication, to control.

Like previous generations of technology innovation, deploying desktop computers initially required a considerable amount of abstraction and steep learning curves: creating even a simple sketch on a screen required coding and math skills. Experts and many mediating layers of knowledge were required to use this newly created window into the world of work effectively. As computers and software evolved, using them became more intuitive, but their users were still tied to desks.

The ensuing mobile era helped greatly. It unchained the device from the desk and led to the creation of wholly new solutions, addressing the challenges of location, real-time interaction, and visual consumption of the world.

But there is a group who, comparatively speaking, benefited much less from all these changes: the legions of deskless workers. Workers on the factory floor, on telephone poles, in mines, on oil rigs, or on the farm, for whom even a rugged laptop or tablet is impractical or inconvenient: either you do what you need to do or you’re staring at a screen. The mobile era unchained desk workers from their desks, but its contribution to the world of workers in the field, the folks who work on things rather than information, was negligible. Working on things often requires both hands to get the job done, and it doesn’t map well to a desktop abstraction.

Enter the wearable device, a new device class enabled by mobile-driven miniaturization of components, the proliferation of affordable sensor technology, and the movement to the cloud.

Wearable devices started as a consumer phenomenon (think smartwatches), mostly built around sensors. Initially, they focused on elevating the utility of the incorporated sensor, and their market success was commensurate with how well the sensor data stream could be laddered up to meaningful, personalized insights. With the entrance of the ‘traditional’ mobile players, wearables’ role expanded into facilitating access, in a simplified way, to the more powerful devices in a user’s possession (e.g. their smartphone). The consumer market for wearables continues to pivot around the twin notions of access and self-monitoring. However, to understand the deeper and longer-term implications of the emergence of intelligent wearable devices, we need to look to the industrial world.

An important, new chapter in wearable history was written by Google Glass, the first affordable commercial Head-Mounted Display (HMD). Although it failed as a consumer device, it successfully catalyzed the introduction of HMDs in the Enterprise. Perhaps even more importantly, this new device type led the way in integrating with other enterprise systems, aggregating the compute power of a node and the cloud, centered on the wearer. Unlike the shift to mobile devices, however, this has the potential to drive profound changes in the lives of field workers and could be a harbinger of even deeper changes in how all of us interact with the digital world.

Division of Labor: Re-empowering the Human Workforce

Computers and handheld devices had a limited impact on deskless workers. But technological changes like automation, robotics, and IoT had a profound impact, effectively splitting the industrial world into work that is fit for robots and work that isn’t. And the line demarcating this division itself is in continuous motion.

Early robotic systems focused on automating precise, repetitive, and often physically demanding activities. More recent advances in analytics and decision support technology (e.g. Machine Learning and “AI”) and integration via the Internet of Things have led to the extension of physical robots into the digital domain, coupling them with software counterparts (software agents, bots, etc.) capable of more dynamic response to the world around them. Automation is thus becoming more autonomous and, as it does so, it’s increasingly moving out of its isolated, tightly controlled confines and becoming ever more entwined with human activity.

Because automation inherently displaces human participation in industrial processes, the rapid advances in analytics, complex event processing, and digital decision making have prompted concerns about the possibility of “human obsolescence”. In terms of the role of bulk labor, this is a real concern. However, the AI community has perpetually underestimated the sophistication of the human brain, and the limits to “AI”-based machine autonomy in the real world have remained clear: creativity, decision making, complex, non-repetitive activity, untrainable pattern recognition, self-directed evolution, and intuition are still largely the domains of the human workforce, and are likely to remain so for some time.

Even the most sophisticated autonomous machines can only operate in a highly constrained environment. Self-driving vehicles, for example, depend on well-marked, regular roads, and the goal of an ‘unattended autonomous vehicle’ will very likely require extensive orchestration, physical infrastructure, and the resolution of some very serious security challenges. By contrast, the human brain is extraordinarily well adapted to operating in the extreme fuzziness of the real world and is a marvel of efficiency. Rather than try to replace it with fully digital processes, a safer and more cost-effective strategy would be to find ever better and closer ways to integrate human processing with the digital world. Wearable technology provides a first path forward in this regard.

Initial industrial use cases for wearables have tended to emphasize human productivity through the incorporation of monitoring and “field appropriate” access to task-specific information. The first use cases included training and enabling less experienced field personnel to operate with less guidance and oversight. Some good examples are Librestream’s Onsight, which creates “virtual experts”; Ubimax’s X-pick, which guides warehouse pickers; and Atheer’s AR training solutions. Honeywell’s Connected Plant solution goes a step beyond: it is an “IIoT-style” platform that already connects industrial assets and processes for diagnostic and maintenance purposes, a new dimension of value.

The introduction of increasingly robust autonomous machines and the consideration of productivity and monitoring across more complex use cases involving multiple workers and longer spans of time will drive the next generation of use cases.

Next Reality

Consider the following use case, which is still hypothetical today although grounded in reality:

Iron ore mining is a complex operation involving machines (some of which are very large), stationary objects, and human workers, all sharing the same confined space with limited visibility. It is critical not only to direct the flow of these participants for safety reasons but also to optimize it for maximum productivity.

The first step in accomplishing this requires deploying sensors at the edge that create awareness of context: state, condition, location. Sensors on large machines or objects are not new, and increasingly miners carry an array of sensors built into their hard hats, vests, and wrist-worn devices. But ‘sense’ is not enough: optimization requires a change in behavior. For this, a feedback loop is needed, which is comparatively easy to accomplish with machines. For workers, a display mounted on the hard hat and haptic actuators embedded in their vest and wrist devices close the feedback loop, as sketched below.
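
As a rough sketch of how such a feedback loop might behave at the edge, the code below checks the distance between a worker and a vehicle and selects a haptic or HMD response. Position sources, distance thresholds, and actuator channels are illustrative assumptions.

```python
# Minimal sketch: edge-side proximity awareness between a miner and a haul truck,
# closing the feedback loop through haptic and HMD channels. Thresholds and
# channel names are illustrative assumptions.

import math

def distance_m(a, b) -> float:
    """Planar distance between two (x, y) positions in metres."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def proximity_feedback(worker_pos, vehicle_pos, warn_at=30.0, stop_at=10.0) -> str:
    d = distance_m(worker_pos, vehicle_pos)
    if d <= stop_at:
        return "haptic_vest: strong pulse + HMD: STOP, vehicle approaching"
    if d <= warn_at:
        return "wrist_device: vibration + HMD: vehicle within 30 m"
    return "no_action"

print(proximity_feedback((0.0, 0.0), (18.0, 12.0)))   # inside the warning radius
```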

eHelmet concept for Jannatech - Studio Norcat

Thus equipped, both human and machine participants in the mining ecosystem can be continuously aware of each other, getting a heads-up, or even a warning, about proximity. Beyond awareness, this also allows for independent action: for example, stopping vehicles or giving directional instructions via the HMD or haptic feedback.

Being connected in this way helps to promote safety, but isn’t enough for optimization. For that, a backend system that uses historical data, rules, and ML algorithms to predict and, ultimately, prescribe optimum paths is required. This provides humans with key decision support capabilities and a means to provide guidance to machines without explicitly having to operate them. Practically speaking, they operate machines via their presence. Considering the confined environment, this means that sometimes the worker needs to give way to the 50-ton hauler and other times the other way around. What needs to happen is deduced from the actual conditions and decided in real time, at the edge.
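
A toy version of such an edge decision is sketched below: a small rule set standing in for the historical-data and ML-driven prescription a production backend would compute, deciding whether the worker or the hauler yields. The rules and parameters are illustrative assumptions.

```python
# Minimal sketch: deciding, at the edge, whether the worker or the 50-ton hauler
# yields, based on real-time context. The rule set is an illustrative stand-in for
# a learned, data-driven policy.

def right_of_way(hauler_loaded: bool, hauler_speed_kmh: float,
                 worker_on_critical_task: bool) -> str:
    if hauler_loaded and hauler_speed_kmh > 15:
        return "worker_yields"          # a loaded hauler at speed is hard to stop safely
    if worker_on_critical_task:
        return "hauler_yields"          # interrupting the task is judged costlier
    return "worker_yields"              # default: the vehicle keeps its planned path

print(right_of_way(hauler_loaded=True, hauler_speed_kmh=22, worker_on_critical_task=True))
```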

Wearable Experience for Knowledge Intensive Training (WEKIT) - EU

As this use case illustrates, wearable devices are emerging as a new way for humans to interact with machines (physical or digital). The sensors on these devices are also being used in a new and more dynamic way. Whereas each sensor in a traditional industrial context provides a very tightly defined window into a specific operating parameter of a specific asset, sensor data in the emerging paradigm is interpreted situationally. Temperature, speed, and vibration may carry very different meanings depending on the task and situation at hand. The Key Performance Indicators (KPIs) to be extracted from these data streams are also task- and situation-specific, as are the ways in which these KPIs are used to validate, certify, and optimize both the individual tasks and the overarching process or mission in which these tasks are embedded.
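
As a small illustration of situational interpretation, the sketch below applies different limits to the same vibration reading depending on the task at hand. The tasks and thresholds are hypothetical, chosen only to show how one raw signal can yield task-specific KPIs.

```python
# Minimal sketch: the same raw reading interpreted situationally. A vibration level
# that is normal while drilling may be an alert while a machine idles.
# The task thresholds below are hypothetical illustrations.

TASK_VIBRATION_LIMITS = {"drilling": 12.0, "hauling": 6.0, "idle": 1.5}   # mm/s RMS

def interpret_vibration(task: str, vibration_mm_s: float) -> str:
    limit = TASK_VIBRATION_LIMITS.get(task, 3.0)
    return "within_task_norm" if vibration_mm_s <= limit else "flag_for_review"

print(interpret_vibration("drilling", 8.0))   # normal for this task
print(interpret_vibration("idle", 8.0))       # anomalous in this context
```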

A key takeaway in considering this new human-machine interaction paradigm is that almost everything is dynamic and situational. And, at least in the industrial context, the logical container for managing all of this is what we’re calling the “Mission”. This has important ramifications for considering what systems need to be in place to enable workers and machines to interoperate in this way and to make possible an Industrial Internet of Things that effectively leverages the unique features of the human brain. A discussion of the nature and implications of “Mission-Driven Enablement” will be the subject of the next article in this series.
