Indiana University: Integrating multimodal wearable biosensing and multi-domain artificial intelligence on the path to decentralized healthcare

Wearable biosensors have emerged as a platform for continuous, real-time health monitoring, capable of collecting physiological, biochemical, and environmental signals from different parts of the body in a non-invasive or minimally invasive manner. With their compact design, conformal fit, and comfort, these devices can be integrated into daily life to collect multimodal biosignals with performance comparable to traditional clinical instruments. Beyond point-of-care diagnosis, they are shifting healthcare toward non-invasive, decentralized home monitoring, supporting continuous surveillance, early detection, and precise intervention while reducing the clinical burden.

Despite this powerful sensing capability, the bottleneck lies in interpretation: high-volume, heterogeneous data streams are difficult to parse manually and do not scale. Multi-domain artificial intelligence addresses this problem by jointly learning across modalities and injecting structured domain knowledge. Multi-sensor fusion aligns complementary signals into a shared representation; knowledge graphs encode curated clinical relationships to strengthen reasoning and reduce brittleness; cross-domain transfer reuses abstractions learned in data-rich settings to guide inference where data are scarce, improving robustness under sparse data and advancing the system from sensor to decision-maker. Large language models (LLMs) add a conversational layer to this stack, translating complex outputs into clear guidance for patients and clinicians and supporting interactive exploration of uncertainty and preferences. Combined with continuous sensing, such intelligent agents can carry the pipeline from raw signals to decisions, triggering alerts, stratifying risk, and proposing follow-up steps without face-to-face consultation, thereby supporting patient self-management and reducing the workload of medical staff.
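The multi-sensor fusion step described above can be illustrated with a minimal sketch: each modality's window is summarized by a small feature extractor, and the resulting vectors are normalized and concatenated into one shared representation. The signal names, sampling rates, and feature choices below are hypothetical illustrations, not a prescribed pipeline.

```python
import numpy as np

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize a 1-D signal window with basic statistics."""
    return np.array([window.mean(), window.std(), window.min(), window.max()])

def fuse(modalities: dict) -> np.ndarray:
    """Align per-modality features into a single shared vector."""
    feats = [extract_features(sig) for _, sig in sorted(modalities.items())]
    stacked = np.concatenate(feats)
    # Normalize so no single modality dominates downstream models.
    return (stacked - stacked.mean()) / (stacked.std() + 1e-8)

# Example: synthetic 10-second windows from two hypothetical wearable channels.
rng = np.random.default_rng(0)
windows = {
    "ppg": rng.normal(75, 5, size=250),    # heart-rate proxy, 25 Hz
    "accel": rng.normal(0, 1, size=500),   # wrist acceleration, 50 Hz
}
shared = fuse(windows)
print(shared.shape)  # one fused 8-dimensional vector per time window
```

In practice the statistical features would be replaced by learned encoders, but the core idea, projecting heterogeneous streams into a common representation before joint modeling, is the same.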

In this review, we survey recent advances in multimodal biosensing and multi-domain artificial intelligence and examine how their integration can enable decentralized, patient-centered healthcare. We first review recent developments in multimodal wearable biosensing to illustrate their complementary coverage of physiological processes, highlighting representative diagnostic and screening studies. We then focus on data fusion, detailing how first-hand sensor streams can be combined with electronic health records and the medical literature to support more accurate patient phenotyping, differential diagnosis, and decision support. Finally, we discuss the remaining challenges, including standards for interoperable data integration, privacy and governance, and generalizability and trust, and we outline opportunities for closed-loop systems that deliver comprehensive, vital-sign-level assessment.