
The Theme - Biomedical Research

[Harvard University - World Book]


 "Biomedical Research in A New Health ICT Framework"




1. Overview

- Biomedical Research Is More Interdisciplinary Than Ever

"Biotech is the New Digital" -- [Prof. Nicholas Negroponte, Founder, MIT Media Lab.] 

In the 21st century, groundbreaking research and discovery in biomedical research are more interdisciplinary than ever. Biomedical research encompasses basic and applied research activities in the areas of Medicine, Public Health, Pharmacology, Life Sciences, Biology, Biochemistry, Chemistry, Physics, Mathematics, Statistics, Engineering, New Materials, Information and Communication Technology (ICT), and other health-related topics. These scientists work to understand the biological principles that govern the function of the human body, to discover the mechanisms of disease, and to find innovative ways to treat or cure disease by developing advanced diagnostic tools or new therapeutic strategies for physicians - especially new smart devices that could help transform the detection, prevention, and management of disease. The increased longevity of humans over the past century can be significantly attributed to advances resulting from biomedical sciences research. 

- Biomedical and ICT Convergence

The field of biomedicine is concerned with the application of various natural science disciplines for the development of knowledge, interventions, and/or technologies for use in healthcare. The Information and Communications Technology (ICT) sector focuses on telecommunications, computing, and the integration of both. The convergence of biomedical and ICT represents a high opportunity area for both healthcare and ICT industries. In addition to seeking cures to disease and ailment prevention, the convergence of biomedical and ICT technologies and solutions will lead to improved treatment efficacy as well as overall improvements in healthcare service delivery efficiency and effectiveness. 

- Biotech is the New Digital

In the past decade, tremendous advances in expanding computing capabilities - sensors, data analytics, networks, advanced imaging, and cyber-physical systems - have enhanced, and will continue to enhance, healthcare and health research, with resulting improvements in health and wellness.

We’re at the cusp of a major revolution in understanding the workings of the human body. The biomedical sciences are being radically transformed by advances in our ability to monitor, record, store, integrate and analyze information characterizing human biology and health at scales that range from individual molecules to large populations of subjects. 

Data intensive biomedical science is developing new methods for analyzing large-scale biomedical data sets to understand how living systems function and to harness this knowledge in order to understand disease mechanisms and provide improved health care at lower costs. Precision medicine has become a common label for data-intensive and patient-driven biomedical research. Its intended future is reflected in endeavours such as the Precision Medicine Initiative in the USA.

- The 4 P's of 21st Century Medicine

According to Google Ventures, the following top eight life sciences technologies are the most promising and will transform medicine: Artificial Intelligence, Understanding the Brain, Reinventing Antibiotics, Battling Cancer, Genetic Repair, Understanding the Microbiome, Organ Generation, and Stem Cells. For example, stem cell research has the potential to revolutionize the way we treat many conditions, including degenerative diseases for which few effective treatments currently exist. Stem cell research is rapidly advancing towards potential therapeutic applications such as tissue and organ replacement, disease modelling and drug testing. Dr. Aaron Ciechanover, Nobel Prize in Chemistry 2004, characterizes 21st century medicine with four P’s: it’s personalized, predictive, preventive – and it should be participatory.


(Bald Eagle, Seattle, Washington, U.S.A. - Jeff M. Wang)

2. Life Sciences, Pharmaceutics and ICT

- The Themes and Concepts of Biology and The Levels of Organization of Living Things

Key characteristics or functions of living beings are order, response to stimuli, reproduction, growth/development, regulation, homeostasis, and energy processing.

Order can include highly organized structures such as cells, tissues, organs, and organ systems. Interaction with the environment is shown by response to stimuli. The ability to reproduce, grow and develop is a defining feature of life. The concepts of biological regulation and maintenance of homeostasis are key to survival and define major properties of life. Organisms use energy to maintain their metabolic processes. Populations of organisms evolve to produce individuals that are adapted to their specific environment.

Biology is the study of life. Since life is such a broad topic, scientists break it down into several different levels of organization to make it easier to study. These levels start from the smallest unit of life and work up to the largest and most broad category. The levels, from smallest to largest, are: molecule, cell, tissue, organ, organ system, organism, population, community, ecosystem, biosphere.
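The ordering of these levels can be captured in a short script, useful as a sanity check; a minimal sketch where the helper `is_smaller` is ours, not a standard function:

```python
# Levels of biological organization, smallest to largest,
# exactly as listed in the text above.
LEVELS = [
    "molecule", "cell", "tissue", "organ", "organ system",
    "organism", "population", "community", "ecosystem", "biosphere",
]

def is_smaller(a: str, b: str) -> bool:
    """Return True if level `a` sits below level `b` in the hierarchy."""
    return LEVELS.index(a) < LEVELS.index(b)
```

For example, `is_smaller("cell", "organ")` is true, while `is_smaller("biosphere", "molecule")` is not.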

- Life Sciences in the ICT Age

The ICT Age has arrived. We are in a stage where we have access to all information all the time. This is likely to have a great impact not only on our medical care but also on our behavior and how we live our lives. Much of our software and data have moved off our desktops, yet they remain accessible both locally and remotely.

Life sciences and ICT are coming together to revolutionize scientific and medical discovery, comprising the acquisition, transmission, processing, storage and retrieval of biomedical and health information. The general computing trend is to leverage shared web resources and massive amounts of data over the Internet. 

- A New Era of Data-Driven Medicine

Recent technological advances that enable the assessment and management of human health at an unprecedented level of resolution have set the foundation for a new era of data-driven medicine. Telemedicine, predictive diagnostics, wearable sensors and a host of new apps will transform how people manage their health. With today's high-throughput sequencing technology, it is much easier to generate genomic data than to transform it into information or knowledge that can improve human health. We are at the beginning of the genomics revolution. 

The promise of genomics is to revolutionize treatment of disease, to personalize treatment. The unprecedented abundance of medically relevant data (e.g. molecular, cellular, organismal, ecological, behavioral, clinical), from detailed information about genes and genetic diseases and the relative efficacy of drugs in diverse patient populations, to three-dimensional imaging of living cells giving researchers a more detailed and accurate spatial visualization of the interplay of cells and their components, is driving the use of quantitative methods in medicine. 
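As a small illustration of quantitative methods applied to molecular data, the sketch below computes two textbook genomic quantities; the sequence and genotypes are invented for illustration:

```python
def gc_content(seq: str) -> float:
    """Fraction of G and C bases in a DNA sequence."""
    seq = seq.upper()
    return sum(base in "GC" for base in seq) / len(seq)

def allele_frequency(genotypes: list, allele: str) -> float:
    """Frequency of `allele` among diploid genotypes given as strings like 'AG'."""
    alleles = [a for g in genotypes for a in g]
    return alleles.count(allele) / len(alleles)
```

For instance, `gc_content("ATGC")` is 0.5, and in a tiny cohort genotyped as `["AA", "AG", "GG"]` the G allele has frequency 0.5.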

- Beyond Precision Medicine: High Definition Medicine

The foundation for a new era of data-driven medicine has been set by recent technological advances that enable the assessment and management of human health at an unprecedented level of resolution - what we refer to as high-definition medicine. Our ability to assess human health in high definition is enabled, in part, by advances in DNA sequencing, physiological and environmental monitoring, advanced imaging, and behavioral tracking. 

For example, recent advances have made 3D imaging (e.g., enabling 3D images of living organisms to be obtained with greater speed and precision) a valuable tool for many applications, such as cell biology, developmental biology, neuroscience and cancer research. These new approaches will improve our understanding of disease and help us find better diagnostics, treatments and therapies. 

- New Technologies are Accelerating Drug Discovery and Development

Many patients and their doctors wait for years before promising treatments become available. All too often, unforeseen side effects send researchers back to the drawing board, just when they thought they were close to bringing a new medication to market. It takes, on average, at least 10 years for a drug to make the journey from discovery to the marketplace at an average cost of $2.6 billion. The overall cost includes not only the development costs for drugs that successfully made it to market, but also for the drugs that failed along the way. 

Today, the likelihood that a drug entering clinical testing will eventually be approved is estimated to be quite low (less than 12%). What researchers are learning is that by using certain technologies early in the drug-development process (for example, machine learning - training a machine to see more than we can - mining big data, deep text mining and analysis, sentiment analysis, and facilitating collaborations across sectors and organizations), they can identify issues that might cause a drug to fail, in many cases before the compound even goes into clinical testing. Then they can either modify the compound to address the issues while maintaining the therapeutic effects, or make an early decision to no longer pursue the drug candidate, thereby averting a more expensive later-stage failure. 
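The idea of flagging likely failures early can be sketched with a toy classifier. Everything here is hypothetical: the descriptors (logP, scaled molecular weight, a toxicity flag) and the training compounds are invented for illustration, and a real pipeline would use far richer features and models:

```python
import math

# Hypothetical training compounds: (logP, molecular weight / 100, toxicity flag)
# labeled with whether they progressed past preclinical testing.
TRAIN = [
    ((1.2, 3.1, 0), "progressed"),
    ((4.8, 6.5, 1), "failed"),
    ((0.9, 2.4, 0), "progressed"),
    ((5.5, 7.0, 1), "failed"),
]

def predict(features):
    """Label of the nearest training compound by Euclidean distance (1-NN)."""
    return min(TRAIN, key=lambda t: math.dist(features, t[0]))[1]
```

A new compound with descriptors close to the known failures would be flagged for redesign before any clinical spend.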


3. Pharmaceutics, Virology and Vaccinology

- Pharmaceutics
Pharmaceutics is a multidisciplinary science that examines the development, production and characterization of dosage forms, as well as the disposition and action of drugs in the body. It is the study of relationships between drug formulation, delivery, disposition and clinical response. Pharmaceutics encompasses a molecular evaluation of drug metabolism and transport processes and the study of genetic, environmental and disease-related factors that regulate or perturb those processes, as well as the fundamental mathematical relationships between enzyme/transporter function, blood concentration-time profiles and the spectrum of pharmacological effects. Subfields of pharmaceutics include: Pharmacokinetics, Pharmacodynamics, Pharmacoepidemiology, Pharmacogenomics, Pharmacovigilance, Pharmaceutical formulation and Pharmaceutical technology. 

- Virology
Virology is the study of viruses and virus-like agents, including (but not limited to) their taxonomy, disease-producing properties, cultivation and genetics. It is often considered a part of microbiology or pathology. In the early years this discipline was dependent upon advances in the chemical and physical sciences, but viruses soon became tools for probing basic biochemical processes of cells. 

It is no accident that virologists have played major roles in the biological revolutions of the last century. Viral gene products engage all the key nodes of biology, ranging from the atomic to the organismal, and thus serve as ideal tools to dissect the most intricate life processes. The challenges are to identify and understand these biological nodes and extrapolate from this information how viruses replicate, disseminate, and sometimes cause disease. Virology in the 21st century will continue to prosper. 

- Vaccinology  

Vaccinology is the science of vaccines, and historically includes basic science, immunogens, the host immune response, delivery strategies and technologies, manufacturing, and clinical evaluation. More recently, the science has expanded further to include the safety, regulatory, ethical and economic considerations of vaccine development and utilisation. Veterinary vaccines are equally important in the field of vaccinology for their contribution not only to animal health but also to the security of the food supply for humans. Although traditionally vaccinology has focused on infectious diseases, as we move forward in the 21st century vaccines will also potentially make significant contributions to the control of non-infectious diseases such as cancers, neurodegenerative diseases and addictions. 

The field of vaccinology continues to expand and innovate in basic-science discovery, product development and implementation, and evaluation of effectiveness. Innate and induced immune regulatory pathways are being unraveled, new adjuvants and antigen constructs are proving effective, and recently licensed products are achieving high coverage, yielding noticeable decreases in disease incidence. These achievements are moving the field forward, with the expectation that many current, challenging diseases (including chronic, noninfectious, and neoplastic conditions) might become vaccine-preventable or vaccine-treatable.  


4. Connecting the Dots: New Media, the Social Web, Cloud Computing, and Integration

- New and Growing Electronic Resources Are Transforming Healthcare

Modern healthcare is being transformed by new and growing electronic resources, with hospitals generating terabytes of imaging, diagnostic, monitoring, and treatment data. Machine learning (ML) is central to utilizing these rapidly expanding datasets, combing through data across patients, clinics, and hospitals to uncover more effective treatments and practices that increase the quality and longevity of human life. 

- New Media Will Have Major Impact on Healthcare

The rise of new media has increased communication between people all over the world. New media gives people on-demand access to content anytime, anywhere, on any digital device, and enables interactive user feedback and creative participation. It allows the real-time generation of new, unregulated content, including (at least for now) blogs, websites, computer multimedia (e.g., medical audio or speech, real-time or recorded video, high-resolution still images, and so forth), pictures, and other user-generated media. The field of medicine is expected to gain a large benefit from the explosion of wearables and Internet-connected sensors that surround us, which acquire and communicate unprecedented data on symptoms, medication, food intake, and daily-life activities impacting one's health and wellness. New media will have major impact on healthcare delivery and, perhaps, on costs as well.

- Healthcare Fog Computing is Rising

Pushing computing, control, data storage and processing into the cloud has been a key trend in the past decade. However, cloud alone is encountering growing limitations in meeting the computing and intelligent networking demands of many new systems and applications. Local computing both at the network edge and among the connected things is often necessary to, for example, meet stringent latency requirements, integrate local multimedia contextual information in real time, reduce processing load and conserve battery power on the endpoints, improve network reliability and resiliency, and overcome the bandwidth and cost constraints for long-haul communications. Healthcare fog computing can offer healthcare organizations a new way to support their IT solutions as they continue their digital transformations.


5. Fog (Edge) Computing in Healthcare

The cloud is now "descending" to the network edge and sometimes diffused onto end-user devices, which forms the "Fog". Fog computing is a new computing paradigm. As a derivative of cloud computing, fog computing can address the problems of high latency, overloaded central servers and constrained network bandwidth.

- Distributed Fog Computing Architecture

Fog computing is a system-level horizontal architecture that distributes resources and services of computing, storage, control, and networking anywhere along the continuum from cloud to things. This distributed architecture allows computing, analytics and decision-making to be done near the data source, which is especially efficient for organizations using the Internet of Things (IoT). Fog can also help healthcare organizations solve interoperability issues with the IEEE 1934 standard, which addresses the need for an end-to-end interoperable solution for the things-to-cloud architecture used by fog computing. 

- Service-Oriented Intermediate Layer in IoT

Fog computing is a service-oriented intermediate layer in the Internet of Things (IoT), providing the interfaces between sensors and cloud servers to facilitate connectivity, data transfer, and queryable local databases. The centerpiece of fog computing is a low-power, intelligent, wireless, embedded computing node that carries out signal conditioning and data analytics on raw data collected from wearables or other medical sensors and offers efficient means to serve telehealth interventions. Open standards for fog computing are critical as IT systems become more complex with cloud and IoT solutions.
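A minimal sketch of the signal-conditioning step such an edge node might perform, assuming invented heart-rate samples and an arbitrary alert threshold: smooth the raw stream with a moving average, then forward only readings above the threshold, so routine data never leaves the edge:

```python
from statistics import mean

def condition(raw, window=3, threshold=100.0):
    """Smooth raw sensor samples with a trailing moving average and keep
    only smoothed readings above `threshold` for upstream transmission."""
    smoothed = [mean(raw[max(0, i - window + 1):i + 1]) for i in range(len(raw))]
    return [round(s, 1) for s in smoothed if s > threshold]
```

Given samples `[72, 74, 73, 130, 135, 128, 75]`, only the smoothed values around the spike are forwarded to the cloud; the resting readings are filtered out locally.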

- Extending the Cloud to the Network Edge

Fog computing extends the concept of cloud computing to the network edge, making it ideal for Internet of Things (IoT) and other applications that require real-time interactions. It enables key applications in wireless 5G, IoT, and big data. Fog computing and networking present a new architecture vision where distributed edge and user devices collaborate with each other and with the clouds to carry out computing, control, networking, and data management tasks. The IoT may more likely be supported by fog computing, in which computing, storage, control and networking power may exist anywhere along the architecture, either in data centers, the cloud, edge devices such as gateways or routers, edge equipment itself such as a machine, or in sensors. Fog computing distributes the services of computation, communication, control and storage closer to the edge, access and users. 


6. The Healthcare Internet of Things (IoT)

- Digital Transformation in Healthcare

The fundamental objective of the Internet of Things (IoT) is to obtain and analyze data from things (devices) that were previously disconnected from most data processing tools. This data is generated by physical things (devices) deployed at the very edge of the network - such as motors, generators, pumps, and relays - that perform specific tasks to support a business process. The IoT is about connecting these unconnected devices (things) and sending their data to the cloud or Internet to be analyzed.

The Internet of Things (IoT) has numerous applications in healthcare (i.e., healthcare IoT), from remote monitoring to smart sensors and medical device integration. It has the potential not only to keep patients safe and healthy, but to improve how physicians deliver care as well. Healthcare IoT can also boost patient engagement and satisfaction by allowing patients to spend more time interacting with their doctors. Wireless 5G has the capacity to impact the Internet of Medical Things (IoMT); it will help enable medical innovations using augmented reality, virtual reality, artificial intelligence (AI), remote medical learning, remote patient monitoring, and more.

IoT-enabled devices capture and monitor relevant patient data and allow providers to gain insights without having to bring patients in for visits. This process can help improve patient outcomes and prevent potential complications for those who might be considered high risk. These deployments and use cases are just the beginning. More advanced and integrated approaches within the scope of the digital transformation of healthcare are starting to be used.  

- The Internet of Bodies (IoB) 

IoT and self-monitoring technologies are moving closer to and even inside the human body. Consumers are comfortable with self-tracking using external devices (such as fitness trackers and smart glasses) and with playing games using augmented reality devices. Digital pills are entering mainstream medicine, and body-attached, implantable, and embedded IoB devices are also beginning to interact with sensors in the environment. These devices yield richer data that enable more interesting and useful applications, but also raise concerns about security, privacy, physical harm, and abuse. 

- IoT Device Interoperability - A Need to Accelerate Innovation

But healthcare IoT isn't without its obstacles. The number of connected devices and the tremendous amount of data they collect can be a challenge for hospital IT to manage. One of the most central challenges facing IoT (still very immature, with a long way to go) is the enablement of seamless interoperability between each connection (i.e., the lack of interoperability at the application level). Merely connecting "things" gives you very little or almost no benefit. The vast majority of "things" and data that we might be collecting may have no relevance to the decision we want to make. A business case is needed to justify the investment. It's about getting the right data to the right person at the right time to make the right decision. There is a need for a consolidated common standard that makes devices communicable, operable, and programmable, regardless of make, model, manufacturer, or industry. 

- Integration with Electronic Health Record Systems 

Electronic health records (EHRs) serve as comprehensive structured data repositories. But they lack the functionality to make critical and time-sensitive elements of that data actionable. While the data collected from IoT devices can include a patient's vital signs, physical activity or glucose levels while at home, that information does not typically travel to an EHR system and, in most cases, is not centralized or made easily available to providers. This limits the information's value, since it is not always presented to the provider in a clinical context. Some EHR systems allow patients to import data into their record, but this remains relatively limited to a few dominant EHR players and leaves many providers uncertain of how to handle information that lives outside their records systems. 
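One common integration path is to normalize device readings into FHIR-shaped resources before handing them to an EHR. The sketch below wraps a hypothetical home glucose reading in an Observation-like dict; the field names follow the shape of the FHIR Observation resource, but this is an illustrative fragment, not a validated resource:

```python
from datetime import datetime, timezone

def to_observation(patient_id: str, glucose_mmol_l: float) -> dict:
    """Wrap a home glucose reading in a FHIR-style Observation dict."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "subject": {"reference": f"Patient/{patient_id}"},
        "code": {"text": "Blood glucose"},
        "valueQuantity": {"value": glucose_mmol_l, "unit": "mmol/L"},
        # Timestamp the reading so it can be ordered in the patient's chart.
        "effectiveDateTime": datetime.now(timezone.utc).isoformat(),
    }
```

Serialized as JSON, such a record carries the clinical context (patient, measurement type, units, time) that raw device payloads usually lack.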

- Data Privacy and Security

In addition, IoT technology implementations will likely raise concerns around data privacy and security: how to keep all of that data secure, especially as it is exchanged with other devices. While most of today's devices use secure methods to communicate information to the cloud, they could still be vulnerable to hackers.


(Stata Center, MIT - Yu-Chih Ko)

7. The Web of Things (WoT)

- Building the Web of Things (WoT)

The Web of Things (WoT) is the principal way the Internet of Things (IoT) is enabled or implemented. WoT aims to build the IoT in a truly open, flexible, and scalable way, using the Web as its application layer. WoT is a refinement of IoT that integrates smart things not only into the Internet (network) but into the Web architecture (application). 

Connecting every thing to the Internet and giving each an IP address is only the first step towards the IoT. Things could then easily exchange data with each other, but not necessarily understand what that data means. This is what Web standards such as HTTP and HTML brought to the Internet: a universal way to transfer and describe images, text, and other media elements so that machines could "understand" each other. WoT is simply the next stage in this evolution: using and adapting Web protocols to connect anything in the physical world and give it a presence on the World Wide Web. 
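A minimal sketch of the WoT idea using only Python's standard library: a hypothetical "thing" (a room thermometer) exposed over plain HTTP as JSON. The device name and property are invented for illustration:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical device state; a real node would read this from hardware.
THING = {"name": "room-thermometer", "properties": {"temperature_c": 21.5}}

def describe() -> str:
    """Serialize the thing's description and current state as JSON."""
    return json.dumps(THING)

class ThingHandler(BaseHTTPRequestHandler):
    """Expose the thing at GET /thing using ordinary Web machinery."""
    def do_GET(self):
        if self.path == "/thing":
            body = describe().encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

# To put the thing on the Web:
#     HTTPServer(("", 8080), ThingHandler).serve_forever()
```

Any HTTP client (a browser, `curl`, another device) can then read the thermometer without knowing anything about its underlying protocol, which is the essence of using the Web as the IoT application layer.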

- The Web of Things (WoT) Architecture

WoT is a term used to describe approaches, software architectural styles and programming patterns that allow real-world objects to be part of the World Wide Web. Just as the Web (application layer) builds on the Internet (network layer), WoT provides an application layer that simplifies the creation of IoT applications. Rather than re-inventing completely new standards, WoT reuses existing and well-known Web standards used in the programmable Web, semantic Web, real-time Web and social Web. 

WoT is a distributed platform built on the latest Web technology. It can paper over the differences among legacy IoT protocols defined by various standards organizations and can provide interoperability between those legacy standards. Using this interoperability, IoT application developers can ship a single WoT-compliant application instead of shipping a separate application for each legacy standard.

- A Hyper-Connected World 

In WoT, any device can be accessed using standard Web protocols. Connecting heterogeneous devices to the Web makes the integration across systems and applications much simpler. The use of Web technologies is expected to dramatically reduce the cost for implementing and deploying IoT services. Correspondingly, WoT brings into focus a wide variety of challenges and opportunities while paving a way to a variety of exciting applications for individuals to industries. The reality of a hyper-connected world is here today.


8. Artificial Intelligence (AI) in Medicine

- Knowledge Synthesis: From Medical Research to Medical Practice
The value of medical research derives from its ability to impact further research and medical practice. We are currently struggling to find the right information about lifestyle or therapeutic decisions. Medicine is a field in which technology is much needed. Our increasing expectations of the highest-quality healthcare and the rapid growth of ever more detailed medical knowledge leave the physician without adequate time to devote to each case and struggling to keep up with the newest developments in their field. Due to lack of time, most medical decisions must be based on rapid judgments of the case, relying on the physician's unaided memory. This could change with Artificial Intelligence (AI). 

Medical knowledge synthesis, bridging the gap between current research, future research and medical practice, is a rapidly changing industry. The expanding mass of medical information makes knowledge synthesis ever more essential to enable and inform evidence-based decision-making. Systematic reviews (SRs), clinical practice guidelines (CPGs), textbooks and electronic information tools are the dominant modes of medical knowledge synthesis.

 - AI is Transforming Medicine

Artificial Intelligence (AI) is a pervasive trend that is rapidly accelerating thanks to vast amounts of data and progress in both algorithms and the processing capacity of modern devices. Biomedical data integrated with high performance computing allows for the analysis of several terabytes of data involved with modern machine learning and AI tools. Science as a service is evolving into a viable practice for delivering scientific solutions integrated in software. Researchers are creating high-throughput software for extracting symbolic and ontologic information from massive data sets using machine learning.

AI has the ability to interpret and analyse a lot of information quickly, which is very promising in the field of medicine where more and more digital data is being generated. Tasks such as the development of new drugs, the sequencing of DNA, the use of implants and smart patches, the remote monitoring of patients and the carrying out of epidemiological studies with thousands of patients are some of the fields that could benefit from this technology in the near future.

- AI and the Future of Medicine

AI in medicine is a new research area that combines sophisticated representational and computing techniques with the insights of expert physicians to produce tools for improving health care. AI scans data and uses statistical methods, probability theory, and machine and deep learning to find patterns that are difficult for the human mind to see. One of the most fertile grounds to take advantage of AI in medicine is the acquisition and interpretation of images for diagnosis - such as ultrasound, computerized tomography (CT) or magnetic resonance imaging (MRI). The results, in the form of digital images, must be interpreted by the doctors, who with their training and expertise can extract useful information to reach a diagnosis. As the number of images that are acquired - and their quality, sensitivity and resolution - increase steadily, researchers are working to develop technologies to help radiologists assess these images more quickly, accurately and effectively. This high-level computing augments physicians' knowledge to help doctors make predictions and treatment recommendations that are personalized for individual patients.  
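A building block behind much automated image analysis is convolution with small filter kernels. The sketch below runs a horizontal-gradient (Sobel-style) kernel over a tiny synthetic image whose left half is dark and right half is bright; the strong response marks the vertical boundary, the kind of edge that image-analysis software highlights for radiologists:

```python
def convolve(img, kernel):
    """Valid-mode 2D convolution (cross-correlation) on nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for y in range(len(img) - kh + 1):
        row = []
        for x in range(len(img[0]) - kw + 1):
            row.append(sum(img[y + i][x + j] * kernel[i][j]
                           for i in range(kh) for j in range(kw)))
        out.append(row)
    return out

# Horizontal-gradient kernel: responds strongly at vertical boundaries.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
```

On a 4x4 image with a dark left half and bright right half, every output cell straddles the boundary and registers a large gradient value.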

Machine Learning (ML), referring to computer algorithms that can learn to perform particular tasks on their own by analyzing data, is the science of getting computers to act without being explicitly programmed. ML is an approach to achieve AI. Like a human, a ML application learns by experience and/or instruction. By applying the advanced ML capabilities, patients and healthcare providers benefit from more rapid and thorough analysis to translate DNA insights, understand a person’s genetic profile and gather relevant information from medical literature to personalize treatment options for patients. Deep Learning (DL), a technique for implementing ML, has enabled many practical applications of ML and by extension the overall field of AI. 
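A minimal, self-contained illustration of "learning from data": logistic regression fitted by stochastic gradient descent on a tiny synthetic data set. The single feature (say, a normalized biomarker level) and the labels are invented; real clinical models involve far more data, features and validation:

```python
import math

# Tiny synthetic data set: (feature, label); separable around x = 0.5.
DATA = [(0.1, 0), (0.3, 0), (0.4, 0), (0.6, 1), (0.8, 1), (0.9, 1)]

def train(data, lr=0.5, epochs=2000):
    """Fit w, b for P(y=1|x) = sigmoid(w*x + b) by stochastic gradient descent."""
    w = b = 0.0
    for _ in range(epochs):
        for x, y in data:
            p = 1 / (1 + math.exp(-(w * x + b)))  # predicted probability
            w -= lr * (p - y) * x                 # gradient step on weight
            b -= lr * (p - y)                     # gradient step on bias
    return w, b

def predict(w, b, x):
    """Classify as 1 when the model's probability is at least 0.5."""
    return 1 if 1 / (1 + math.exp(-(w * x + b))) >= 0.5 else 0
```

After training, the learned decision boundary falls between the two groups, so the model classifies every training example correctly; no rule was ever explicitly programmed.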

- AI Could Help with the Next Pandemic
It was an AI that first saw COVID-19 coming. On December 30, 2019, an AI company called BlueDot, which uses machine learning to monitor outbreaks of infectious diseases around the world, alerted clients -- including various governments, hospitals, and businesses -- to an unusual bump in pneumonia cases in Wuhan, China. It would be another nine days before the World Health Organization officially flagged what we’ve all come to know as COVID-19. BlueDot wasn’t alone. An automated service called HealthMap at Boston Children’s Hospital also caught those first signs. As did a model run by Metabiota, based in San Francisco. That AI could spot an outbreak on the other side of the world is pretty amazing, and early warnings save lives.

In April 2020, at the height of the lockdown, Professor Àlex Arenas of Spain predicted that a second wave of coronavirus was highly possible in summer in Spain. At the time, many scientists were still confident that high temperature and humidity would slow the impact and spread of the virus over the summer months, as happens with seasonal flu. Unfortunately, Professor Arenas' predictions turned out to be accurate. His predictions were based on mathematical modeling and underline the important role technology can play in the timing of decisions about the virus and in understanding its spread. "The virus does as we do," as he put it, so analyzing epidemiological, environmental and mobility data becomes crucial to taking the right actions to contain the spread of the virus.

- Potential AI Applications in Medicine

Potential applications for AI include, for example: guidance for decisions about the best medication to treat an individual with conditions such as Alzheimer's disease or depression; rapid processing of thousands of medical images to enhance diagnoses; algorithms to identify individuals who might benefit from genetic testing for a predisposition to certain cancers; predictions of risk for heart infection in people with implanted heart devices; AI-assisted robotic surgery; virtual nursing assistants; and workflow and administrative tasks.


9. Machine Learning (ML) in Pharmaceuticals


- Machine Learning (ML) is the Future of Pharma

The ability to spot patterns in massive volumes of data gives machine learning (ML), one of the most prominent approaches in AI, a wide range of applications; indeed, ML is widely regarded as the future of pharma. The human genome project and thousands of subsequent discoveries at the DNA, RNA, and protein levels were made possible by ML's ability to detect patterns across large and often messy data sets. ML has the potential to expedite the clinical drug discovery and development process by applying sophisticated algorithms to the analysis and mining of different data sources to predict molecule behavior and suitability as drug targets or therapeutic entities. ML therefore plays a crucial role in improving our health today.
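To make the pattern-spotting idea concrete, here is a minimal sketch of how a learned model might flag candidate molecules. All descriptor names, values and labels are invented for illustration; real screening pipelines use far richer features and models than this toy nearest-neighbor classifier.

```python
# Toy nearest-neighbor classifier labeling molecules "active"/"inactive"
# from a few hypothetical numeric descriptors (weight, logP, polar area).
import math

# (descriptors, label) pairs -- invented training data
train = [
    ((320.0, 2.1, 60.0), "active"),
    ((480.0, 4.8, 120.0), "inactive"),
    ((350.0, 1.9, 70.0), "active"),
    ((510.0, 5.2, 140.0), "inactive"),
]

def predict(descriptors):
    """Label a molecule by its single nearest training neighbor."""
    nearest = min(train, key=lambda pair: math.dist(pair[0], descriptors))
    return nearest[1]

print(predict((340.0, 2.0, 65.0)))  # a molecule resembling the actives
```

The same shape of problem (features in, label out) underlies far larger drug-target and toxicity models.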

The current drug discovery process is lengthy, complex, and costly, and fraught with a high degree of uncertainty about whether a drug will actually succeed. It can take up to 15 years to translate a drug discovery idea from initial inception to a market-ready product. Industry is currently said to spend well over $1 billion per drug, partly because all the drugs that didn't make it have to be paid for. As our understanding of biology deepens thanks to the availability of new data and algorithms capable of learning from it, the drug discovery process is being transformed. ML presents the pharmaceutical industry with a real opportunity to do R&D differently, so that it can operate more efficiently and substantially improve success at the early stages of drug development.

- Drug Development Challenges 

The drug discovery process, and the researchers that drive the pipelines, can be greatly aided by the latest innovations in ML technology. The average biomedical researcher is dealing with a huge amount of new information every day. An estimated 10,000 new publications are uploaded daily from across the globe into a huge variety of biomedical databases and journals. It is therefore impossible for researchers to know, let alone process, all of the scientific knowledge out there relating to their area of investigation. What's more, without the ability to correlate, assimilate and connect all this data, it's impossible to create new usable knowledge - knowledge which can be used to develop new drug hypotheses.

ML has a vital role to play in augmenting the work of drug development researchers so that an informed first analysis of the mass of scientific data can be conducted in order to form essential new knowledge. What was once an entirely hypothesis-driven approach, where humans posed the questions, is shifting toward scientists starting with an outcome and using machine learning to help discover important relationships to that outcome within the data.
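The "start with an outcome" idea can be sketched very simply: given an outcome of interest, rank candidate variables by how strongly they track it, then investigate the top hits. The variable names and numbers below are invented; a real analysis would use thousands of variables and more robust statistics than Pearson correlation.

```python
# Outcome-first discovery sketch: rank variables by absolute Pearson
# correlation with an outcome of interest. Data is invented.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

outcome = [0.9, 0.1, 0.8, 0.2, 0.7]          # e.g. treatment response
variables = {
    "gene_A_expression": [0.8, 0.2, 0.9, 0.1, 0.6],   # tracks outcome
    "patient_age":       [55, 61, 47, 70, 52],
    "noise_marker":      [0.3, 0.4, 0.2, 0.5, 0.3],
}

ranked = sorted(variables,
                key=lambda v: abs(pearson(variables[v], outcome)),
                reverse=True)
print(ranked[0])  # the variable most strongly related to the outcome
```

The researcher's question becomes "what relates to this outcome?" rather than "is my hypothesis about gene A correct?".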

- AI and Machine Learning (ML) for Clinical Trials

The clinical trial is a foundational pillar of the pharmaceutical drug discovery process. Essentially, clinical trials are research studies which seek to determine if a medical treatment or device is safe and effective for humans. However, researchers structuring clinical trials face major challenges: high costs and low rates of success. According to the U.S. Food and Drug Administration, roughly 1 in 10 drugs tested in human subjects receives FDA approval, even though millions of dollars are invested in the research process.

ML will also help in terms of the industry’s selection of patients for clinical trials and enable companies to identify any issues with compounds much earlier when it comes to efficacy and safety. So the industry has much to gain by adopting ML approaches. It can be used to good effect to build a strong, sustainable pipeline of new medicines.
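Patient selection is one place where this plays out concretely: candidates are screened against trial criteria drawn from their records. The sketch below uses hand-written rules with invented field names and thresholds; in an ML-driven workflow, a learned model would score eligibility and predicted response instead.

```python
# Rule-based screening of candidate patients for a trial (illustrative).
patients = [
    {"id": "P1", "age": 54, "egfr": 72, "prior_therapy": False},
    {"id": "P2", "age": 81, "egfr": 40, "prior_therapy": True},
    {"id": "P3", "age": 63, "egfr": 65, "prior_therapy": False},
]

def eligible(p):
    # Hypothetical inclusion criteria: adults under 75 with adequate
    # kidney function (eGFR >= 60) and no prior therapy.
    return 18 <= p["age"] < 75 and p["egfr"] >= 60 and not p["prior_therapy"]

cohort = [p["id"] for p in patients if eligible(p)]
print(cohort)  # -> ['P1', 'P3']
```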



10. 5G Wireless Network and 5G Technologies

- The Critical Role of Mobile Wireless 5G Transforming Digital Healthcare (Connected Healthcare)

The mobile revolution has changed everything, and our future is a world of connected devices. That means enormous needs for infrastructure, speed and support. The next-generation wireless telecommunications technology known as 5G offers speeds much faster than the 4G LTE networks most IoT devices currently connect with.

In the healthcare industry, connected healthcare - a new era of healthcare - could increase efficiencies and revenue, helping health systems create faster, more efficient networks to keep up with the large amounts of data involved. Connected healthcare technology could also enable the use of remote monitoring devices to improve health outcomes.

- Mobile Edge Computing

While 5G is not yet widely available, fog computing can eventually take advantage of the wider 5G bandwidth and act as a platform for 5G applications that require near-real-time communication. By running these applications through a fog layer, even more latency is eliminated, because the data doesn't need to be communicated from the edge all the way to the cloud.
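The latency and bandwidth savings come from where the computation runs. A minimal sketch of the idea (names, thresholds and readings all invented, not a real fog or 5G API): a node near the device reduces the raw stream to a compact summary, and only that summary, plus any urgent values, travels onward to the cloud.

```python
# Fog-layer sketch: reduce raw sensor readings near the device; forward
# only a small aggregate (and alarms) instead of the full stream.
raw_readings = [72, 74, 71, 150, 73, 72]   # e.g. heart-rate samples

def fog_summarize(readings, alarm_above=120):
    """Runs near the edge: keep alarms, forward only an aggregate."""
    alarms = [r for r in readings if r > alarm_above]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "alarms": alarms,          # urgent values are still forwarded
    }

summary = fog_summarize(raw_readings)       # tiny payload for the cloud
print(summary["alarms"])  # -> [150]
```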

- 5G Will Provide the Backbone for IoT

5G wireless technology will provide the backbone for the IoT (e.g., the Health IoT), greatly improving data transfer speeds and processing power over its predecessors. This combination of speed and computing power will enable new applications for mobile technologies, especially in health care. It was projected that by 2020, 5G networks would support more than 20 billion connected devices and 212 billion connected sensors, and enable access to 44 zettabytes of data gathered from a wide range of devices, from smartphones to remote monitoring devices. Healthcare organizations are eager to embrace IoT devices because they save money by keeping patients out of the hospital. If IoT devices can diagnose problems in advance, that saves huge costs.

- Virtual Healthcare (or Telemedicine)

5G networks open up new avenues for the delivery of health care. Instead of bringing patients to a doctor for treatment, 5G networks can connect patients and doctors from across the globe. It isn't always possible for people to meet face-to-face with their doctors, due to the provider's or patient's location or ability to travel, so virtual healthcare, or telemedicine, is emerging to bridge that gap. 5G technology could elevate virtual healthcare to new levels, enabling even more effective remote care. A very fast 5G network can make it possible for people to communicate better and perform tasks together, whether they are located in the same place or in different places.

Digital imaging can be sent anywhere in the world for analysis, expanding access for patients who live far away from health care providers. 5G might even be used for wireless remote surgery. The point of care will move rapidly into the home: with ubiquitous mobile broadband-enabled internet access, connectivity and networking are becoming completely independent of location.


11. Smart and Pervasive Healthcare Computing

- The Vision of Pervasive Healthcare

Pervasive healthcare is the conceptual system of providing healthcare to anyone, at any time, and anywhere by removing constraints of time and location while increasing both the coverage and the quality of healthcare. Pervasive Healthcare Computing is at the forefront of this research, and presents the ways in which mobile and wireless technologies can be used to implement the vision of pervasive healthcare. This vision includes prevention, healthcare maintenance and checkups; short-term monitoring (home healthcare monitoring), long-term monitoring (nursing home), and personalized healthcare monitoring; and incidence detection and management, emergency intervention, and transportation and treatment.

- Pervasive Healthcare Applications

The pervasive healthcare applications include pervasive health monitoring, intelligent emergency management system, pervasive healthcare data access, and ubiquitous mobile telemedicine. 

Today the promise of precision and personalised medicine needs to take into consideration the potential offered by new technologies for collecting and managing environmental, healthcare and lifestyle data, as pervasive healthcare does. Pervasive healthcare focuses on technologies and human factors related to the use of ubiquitous computing in healthcare and for wellbeing. Recent advances in technology have led to the development of small, intelligent, wearable sensors capable of remotely performing critical health monitoring tasks and then transmitting patients' data back to health care centres over a wireless medium. Such wireless health monitoring platforms aim to continuously monitor mobile patients needing permanent surveillance. Patients benefit from continuous ambulatory monitoring as part of a diagnostic procedure, optimal maintenance of a chronic condition, or supervised recovery from an acute event or surgical procedure.
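A core building block of such continuous monitoring is a check that runs over the incoming stream and flags readings far from the patient's recent baseline. The sketch below is illustrative only: the window size, multiplier and heart-rate values are invented, and clinical monitors use far more sophisticated detectors.

```python
# Sliding-window anomaly check over a stream of readings (illustrative).
from collections import deque

def monitor(stream, window=5, factor=1.5):
    """Yield indices of readings above factor x the recent-window mean."""
    recent = deque(maxlen=window)
    for i, value in enumerate(stream):
        if len(recent) == window and value > factor * (sum(recent) / window):
            yield i
        recent.append(value)

heart_rate = [70, 72, 71, 69, 73, 74, 72, 140, 73, 71]
print(list(monitor(heart_rate)))  # -> [7], the anomalous spike
```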

- Pervasive Wireless Healthcare

The rapid evolution of wireless technologies, coupled with advances in related fields such as biosensor design, low-power battery-operated systems, diagnosing and reporting for intelligent information management, genome sequencing, and analytic software, has opened up many new applications for wireless systems in medicine (uHealth, or ubiquitous Health). With the inclusion of electronic health care, point-of-care technologies, E-Health and M-Health protocols, and personalized healthcare/medicine, the medical informatics area is entering another era of massive amounts of information. Medical and health care information databases will lead to new knowledge bases, discoveries in medical research, engineering-oriented developments, and clinical translational research and practice.

Rapid advances in biomedical sensors, low-power electronics, and wireless communications have brought the vision of pervasive wireless healthcare to the verge of reality. At the individual level, health tracker apps on our mobile devices are sending data to health care providers to improve patient care and provide early-warning signs in at-risk patients. Early detection and monitoring are critical to mounting effective cancer treatments, and speed is critical because cancer treatment is a race against fast-replicating cells. By combining implantable cancer detectors (using new methods in molecular imaging and micro-electromechanical systems (MEMS) technologies) with wireless data transmission technologies, new tools for continuous monitoring during and after cancer treatment are on the horizon: tools that could signal remission and relapse, or even trigger micro-scale drug delivery systems for automatic therapeutic interventions.


12. A Digital Revolution in Health Care

- Digital Health, Innovation, Value-Based Care

The convergence of several trends - wider adoption of electronic health records (EHRs) or electronic medical records (EMRs), advances in mobile technology, and payment reform - is accelerating the pace of change in how healthcare is delivered. A digital revolution in health care is speeding up. Telemedicine, predictive diagnostics, wearable sensors and a host of new apps (i.e., FDA-approved mobile devices) will transform how people (or e-patients - individuals who are equipped, enabled, empowered and engaged in their health and health care decisions) manage their health. The age of digital health/medicine is here.

- Digital Technologies Are Transforming Healthcare Delivery

The tools of modern healthcare are increasingly rooted in a variety of digital technologies. Many of these tools have already dramatically transformed healthcare delivery in a number of settings (and even created new medical settings).

At the intersection of health, technology and health care are the devices and instruments that capture physiological data. Digital health is an approach focused on using such technology to monitor and provide relevant health-related data about individuals. These technologies include a rapidly expanding array of consumer products and wearables, as well as complex clinical care platforms in academic medical centers. These new devices need to be tested and validated, which also falls under the digital health rubric. For example, In-Vivo Networking (IVN) is a new technology that can wirelessly power and communicate with tiny devices implanted deep within the human body. Such devices could be used to deliver drugs, monitor conditions inside the body, or treat disease by stimulating the brain with electricity or light. The implants are powered by radio frequency waves, which are safe for humans. Medical devices that can be ingested or implanted in the body could offer doctors new ways to diagnose, monitor, and treat many diseases.

- Medicine 2.0: Peer-to-Peer Healthcare
Peer-to-peer healthcare is a way for people to do what they have always done - lend a hand, lend an ear, lend advice - but at Internet speed and at Internet scale. Peer-to-peer healthcare acknowledges that patients and caregivers know things - about themselves, about each other, about treatments - and they want to share what they know to help other people. Technology helps to surface and organize that knowledge to make it useful for as many people as possible.

Social media and mobile devices have swiftly become more ubiquitous in the healthcare industry and integrated into daily life (for example, our incessant need for instantaneous medical diagnoses via the web). And digital health tools like smartphones certainly do make it easier. As patients continue to gain access and share healthcare information (such as sleep patterns, heart rate, activity levels, blood oxygen, glucose levels, and even stress, etc.) through various forms of online media (such as via a smartphone, smartband, or glucose monitor, etc.), healthcare organizations have started to use social media (i.e., Internet-based applications) to better connect with patients and their community on a wide range of healthcare issues. This initiative could appeal to anyone with an interest in a healthier lifestyle or, more specifically, to patients who suffer from chronic illnesses like heart disease, diabetes, stroke, hypertension (high blood pressure). We have really entered the era of peer-to-peer healthcare.


13. Mobile Health and Biometric Data

- Technology to Manage Our Health

Chronic diseases are long-term medical conditions that are generally progressive and are a significant cause of illness and death. These patients need closer health status monitoring, and the study of their biometric data could allow physicians to foresee crises. Heart disease and high blood pressure affect a large number of people, and lipid and cholesterol disorders are likewise widespread and well understood. Heart disease is the No. 1 killer in the world, and strokes are among the leading causes of death. There are a variety of tools that can help us take greater control of our well-being and make managing our health more convenient. Sensors and apps can monitor our heartbeat, breathing, sleep patterns, activity level, posture and stress - and even help us to adjust treatments and communicate with remote care teams. The information from these devices can empower us to play a more active role in managing illnesses, and offer early warnings about potential problems. There's evidence that consumer products can help. For example, Apple's smartwatch now includes an electrocardiogram, which can detect heart rhythm irregularities. People have used such devices to discover they were pregnant, at risk for a heart attack, or experiencing a dangerous irregular heart rhythm.

- Technologies for Chronic Disease Care

A number of technologies can reduce overall costs for the prevention or management of chronic illnesses. These include devices that constantly monitor health indicators, devices that auto-administer therapies, or devices that track real-time health data when a patient self-administers a therapy. Because they have increased access to high-speed Internet and smartphones, many patients have started to use mobile applications (i.e., FDA-approved mobile devices or apps) to manage various health needs. For example, having an accurate electrocardiogram (EKG or ECG) monitor strapped to large numbers of wearers throughout the day could be hugely beneficial to the study of heart disease. Accurate EKG data generated throughout a normal day, collected and sent automatically to scientists and doctors and combined with other metrics, could help researchers understand more about heart performance.
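As a rough illustration of what such devices compute, here is a toy heart-rate estimate from an ECG-like trace. The waveform is synthetic and the simple threshold detector stands in for the much more robust QRS-detection algorithms real devices use.

```python
# Estimate heart rate from a synthetic ECG-like trace by detecting
# threshold crossings (toy stand-in for a real QRS detector).
def detect_beats(samples, threshold=0.8):
    """Return sample indices where the signal crosses above threshold."""
    beats = []
    for i in range(1, len(samples)):
        if samples[i - 1] < threshold <= samples[i]:
            beats.append(i)
    return beats

# Synthetic trace sampled at 250 Hz with peaks 200 samples apart.
trace = [0.1] * 1000
for peak in (100, 300, 500, 700, 900):
    trace[peak] = 1.0

beats = detect_beats(trace)
rr = [b - a for a, b in zip(beats, beats[1:])]   # R-R intervals (samples)
bpm = 60 * 250 / (sum(rr) / len(rr))             # samples -> beats/minute
print(round(bpm))  # -> 75
```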

- Healthcare Software and Biometric Data

Healthcare software could be used to warn users of a heart attack or stroke days in advance. Such biometric data provide a practitioner with immediate information about a patient and, when collected from large numbers of people, can reveal patterns and trends that are clinically useful. Data collection by patients gives them "ownership" of the process; they become more motivated to track and adjust their behavior to prevent disease, to recognize changes and to follow care plans developed in consultation with their providers. The "Internet of Things and Beyond" will make health monitoring, diagnostics and treatment more personalized (i.e., personalized and precision medicine), timely and convenient, while also lowering costs.


14. Electronic Health Records (EHRs) and Big Data

- Electronic Health Records - A Digital Version of A Patient’s Paper Chart 

An electronic health record (EHR), the technology that underpins our health care system and holds data that could transform how millions of people engage in their long-term wellness, is a digital version of a patient’s paper chart. EHRs are real-time, patient-centered records that make information available instantly and securely to authorized users. EHRs can: (a) contain a patient’s medical history, diagnoses, medications, treatment plans, immunization dates, allergies, radiology images, and laboratory and test results; (b) allow access to evidence-based tools that providers can use to make decisions about a patient’s care; (c) automate and streamline provider workflow.  
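As a toy illustration of the kind of structured, patient-centered record an EHR holds, the sketch below models a record as a small dataclass. The field names are invented for illustration and do not reflect any particular EHR vendor's schema.

```python
# Minimal model of an EHR entry (illustrative field names only).
from dataclasses import dataclass, field

@dataclass
class HealthRecord:
    patient_id: str
    allergies: list = field(default_factory=list)
    medications: list = field(default_factory=list)
    lab_results: dict = field(default_factory=dict)   # test name -> value

    def add_lab_result(self, test, value):
        self.lab_results[test] = value

record = HealthRecord("patient-001", allergies=["penicillin"])
record.add_lab_result("HbA1c", 6.1)
print(record.lab_results["HbA1c"])  # -> 6.1
```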

- Diagnostic EHR Data Sharing in Health Systems Improves Outcomes 

One of the key features of an EHR is that health information can be created and managed by authorized providers in a digital format capable of being shared with other providers across more than one health care organization. EHRs are built to share information with other health care providers and organizations - such as laboratories, specialists, medical imaging facilities, pharmacies, emergency facilities, and school and workplace clinics - so they contain information from all clinicians involved in a patient's care. This allows two things: First, patients, using secure passwords or key codes, can access their own records and thus participate more in their own healthcare. Second, multiple providers can access individual patient records and have complete histories as they provide care to new patients. A national database of medical records is not far off.
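Cross-organization sharing ultimately comes down to a common wire format both systems understand. The sketch below serializes a record to JSON so another provider's system can consume it; the shape loosely echoes interoperability standards such as HL7 FHIR, but the fields here are invented for illustration.

```python
# Sharing sketch: serialize a record to JSON for another organization.
import json

record = {
    "resourceType": "Patient",        # hypothetical, FHIR-inspired shape
    "id": "patient-001",
    "allergies": ["penicillin"],
    "observations": [{"code": "blood-pressure", "value": "120/80"}],
}

wire = json.dumps(record)             # what travels between systems
received = json.loads(wire)           # what the other provider sees
print(received["observations"][0]["value"])  # -> 120/80
```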

There are undeniable clinical, operational, and administrative benefits of embracing EHR in medical care. It helps in having a clear overview of the patient history and relevant data, it can safely store clinical notes, provide a thorough list of patient’s allergies, make viewing lab and imaging results a lot easier, and much more. It truly can improve patient care and help with increasing the level of safety when it comes to medical practice. 

- Major Challenge Facing EHRs

However, there are many different EHR systems used in the U.S., each with its own language for representing and sharing data. Interoperability is one hurdle. Critical information is often scattered across multiple facilities, and sometimes it isn't accessible when it is needed most - a situation that plays out every day around the U.S., costing money and sometimes even lives. Patient engagement, activation and participation are another hurdle. Privacy concerns also abound: patients worry about their genetic data, what happens to it, and how it can be used when it is contributed to a research cohort.



15. Big Data Applications and Analytics in Biomedical Research and Healthcare

- Big Data in Medical Research and Application

Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression.
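Of the analytic methods just named, clustering is easy to show in miniature. The sketch below runs a tiny one-dimensional k-means over invented patient risk scores to separate a low-risk and a high-risk group; real analyses operate on many dimensions and use library implementations.

```python
# Tiny 1-D k-means over invented patient risk scores (two groups).
def kmeans_1d(values, centers, rounds=10):
    for _ in range(rounds):
        groups = [[] for _ in centers]
        for v in values:                 # assign each value to nearest center
            i = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            groups[i].append(v)
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return centers

risk_scores = [0.1, 0.2, 0.15, 0.8, 0.9, 0.85]
print(sorted(kmeans_1d(risk_scores, centers=[0.0, 1.0])))
```

The two final centers land near the means of the low-risk and high-risk groups (about 0.15 and 0.85).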

 - Biomedical Data Integration

New medical breakthroughs as well as the effective management of healthcare in the future requires the integration of data (e.g., crowdsourced data collection) and methods across the different realms of fundamental research, development of therapeutics (e.g., nanotechnology-based cancer therapeutics), healthcare practice, and massive high performance computing infrastructure. 

The growing volume and diversity of data types and sources requires a flexible, scalable storage infrastructure and software to manage it all.

- Big Data Infrastructure

The large volume of data coming from all the different health-monitoring devices and constituting the ‘individualome’ requires large-capacity hardware infrastructures for storage and processing. Such resources can be implemented locally at the data centers associated with hospitals or deployed on secured cloud computing or virtual private server computing environments. In particular, given the sensitivity of the information, only secured and HIPAA-approved architectures should be considered. 

High-throughput platforms such as microarray, mass spectrometry, and next-generation sequencing are producing an increasing volume of 'omics data that needs large data storage and computing power. Cloud computing offers massive scalable computing and storage, data sharing, on-demand anytime and anywhere access to resources and applications, and thus, it may represent the key technology for facing those issues. 

High-performance analytics, high-speed connections and affordable data storage have made large data-sharing projects possible in healthcare too. Healthcare analytics can not only help reduce the costs of healthcare facilities, including treatments, medication, and diagnosis; analytics in this area can also contribute to predicting the outbreak of endemic and epidemic diseases like SARS and the flu.

- Reshaping the Healthcare System Through Data Science and Analytics

In healthcare, big data tools and technologies have the potential to create significant value by improving outcomes while lowering costs for each individual patient.

Diagnostic images, genetic test results and biometric information are increasingly generated and stored in electronic health records, presenting challenges of data that is by nature high in volume, variety and velocity, thereby necessitating novel ways to store, manage and process big data. This presents an urgent need to develop new, scalable and expandable big data infrastructure and analytical methods that can give healthcare providers access to knowledge for the individual patient, yielding better decisions and outcomes.
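One simple pattern for handling high-velocity data is to process records as a stream, keeping only running aggregates rather than the full dataset in memory. The record shape and values below are invented for illustration.

```python
# Streaming aggregation sketch: one pass, constant memory.
def running_mean(stream, key="glucose"):
    total, count = 0.0, 0
    for record in stream:        # records arrive one at a time
        total += record[key]
        count += 1
    return total / count

# A lazy generator stands in for a feed of incoming records.
records = ({"glucose": g} for g in [5.1, 6.4, 5.9, 7.2])
print(round(running_mean(records), 2))  # -> 6.15
```

The same one-pass idea underlies the map-reduce and stream-processing frameworks used at real big-data scale.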

With these combined data sources from hundreds of studies and dozens of companies, researchers – from large academic institutions, commercial organizations, or small research labs in remote corners of the world – are finding deeper insights than ever before, getting answers faster, reducing duplication of effort and improving efficiency.

Data-driven healthcare has its own set of obstacles. Medical data spans different hospitals, districts, and states, and is held in several administrative systems. This calls for new tools that help data providers and data users collaborate with each other, which is why the creation of new analytics tools, strategies, and data applications is so significant right now. Healthcare needs the kinds of graph analytics, machine learning and predictive analysis that other industries are already enjoying.


16. Reality Technology (MR, AR, VR) in Medicine and Healthcare

- The Difference Between VR, MR, and AR

Virtual reality (VR), augmented reality (AR) and mixed reality (MR) are emerging technologies utilizing a variety of digital (artificial) immersion and overlays on the real world that users can interact with. To cut a long story short, here’s the difference between virtual, augmented, and mixed reality technologies: 

Virtual Reality (VR) immerses users in a fully artificial digital environment. VR encompasses immersive experiences and content via a VR headset or HMD (head-mounted display). The content is 100% digital and computer generated. Current reality is replaced with a new 3D digital environment in which the user is isolated from the real world.

Augmented Reality (AR) overlays computer-generated content on top of the real-world environment. This superimposed digital overlay can superficially interact with the environment in real time. AR is primarily experienced via a wearable glass device or through smartphone applications.

Mixed Reality (MR) brings together real world and digital elements. In MR, you interact with and manipulate both physical and virtual items and environments, using next-generation sensing and imaging technologies. MR allows you to see and immerse yourself in the world around you even as you interact with a virtual environment using your own hands - all without ever removing your headset. It provides the ability to have one foot (or hand) in the real world, and the other in an imaginary place, breaking down basic concepts between real and imaginary, offering an experience that can change the way you game and work today.

Extended reality (XR) is an umbrella term that encompasses all real and virtual environments which include VR, AR and MR.

- VR in Medicine

Healthcare is one of the biggest adopters of virtual reality, with applications spanning surgery simulation, phobia treatment, robotic surgery and skills training. Virtual Reality (VR), with its ability to fully immerse the user in a simulated environment, is a natural fit for medicine. One of the advantages of this technology is that it allows healthcare professionals to learn new skills, as well as refresh existing ones, in a safe environment without causing any danger to patients.

Patients and doctors alike are thankful for anything that will increase a medical procedure’s chance of success. As such, healthcare is leading the charge to widespread adoption of VR. As VR, as well as Mixed Reality (MR) and Augmented Reality (AR), continue to find a place in the mainstream public consciousness, more and more healthcare applications are developed all the time. 

- Examples of Virtual Reality and Healthcare

Human Simulation Software - The Human Simulation System enables doctors, nurses and other medical personnel to interact with others in an interactive environment. They engage in training scenarios in which they have to interact with a patient but within a 3D environment only. This is an immersive experience which measures the participant’s emotions via a series of sensors. 

Virtual Reality Diagnostics - VR is often used as a diagnostic tool in that it enables doctors to arrive at a diagnosis in conjunction with other methods such as MRI scans. This removes the need for invasive procedures or surgery. 

Virtual Robotic Surgery - A popular use of this technology is in robotic surgery, where surgery is performed by means of a robotic device controlled by a human surgeon, which reduces time and risk of complications. Virtual reality has also been used for training purposes and in the field of remote telesurgery, in which surgery is performed by a surgeon at a separate location from the patient.

- Looking Ahead 

VR is arguably already an ideal telehealth platform, because the actual place you are in is no longer relevant. In a VR world, it doesn't matter where a patient is. VR will change healthcare as we know it: it will reduce costs, improve access, and improve patients' experience.


17. Supercomputing and Biomedical Research

Supercomputing (HTC and HPC) efficiently solves extremely complex or data-intensive problems by concentrating the processing power of multiple, parallel computers. It involves a system working at the maximum potential performance of any computer, typically measured in petaflops. The majority of supercomputers today run Linux-based operating systems.

Supercomputing enables problem solving and data analysis that would be simply impossible, too time-consuming or costly with standard computers. It enables a revolutionary approach to improve biological understanding, human health and biosecurity through application of advanced computational technology -- bringing together large-scale simulation, deep analysis of complex and diverse data and new targeted sensor and measurement technologies. Supercomputing opens up new horizons, offering the possibility of discovering new ways to understand life’s complexity.
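The core idea behind this concentration of processing power is decomposing a large analysis into independent chunks that run concurrently. The sketch below shows only that pattern in miniature; a real HPC job would use MPI, GPUs, or a cluster scheduler rather than a thread pool, and the workload here is an invented stand-in.

```python
# Parallel-decomposition sketch: split work into chunks, process
# concurrently, then combine the partial results.
from concurrent.futures import ThreadPoolExecutor

def analyze_chunk(chunk):
    """Stand-in for an expensive per-chunk computation."""
    return sum(x * x for x in chunk)

data = list(range(1000))
chunks = [data[i:i + 250] for i in range(0, len(data), 250)]

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(analyze_chunk, chunks))

# The combined parallel result matches the serial computation.
print(sum(partials) == sum(x * x for x in data))  # -> True
```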

For example, supercomputers can help identify effective treatments for diseases such as cancer. They have the potential to greatly accelerate the development of cancer therapies by finding patterns in massive datasets too large for human analysis - helping us better understand the complexity of cancer development, identify novel and effective treatments, and elucidate patterns in vast and complex data sets that advance our understanding of the disease.


18. Healthcare 4.0, Smart Hospitals and the Connected Healthcare Ecosystem

- Telemedicine and Telehealth 2.0

Telemedicine and telehealth provide convenience, efficiency, and quality care for both medical providers and patients. Over the past decade, the telemedicine technologies and use models have matured. Today we live in the era of Telemedicine 2.0, where physicians and patients can interact remotely and transfer medical data using their own general-purpose computing devices.

In the digital age, health interoperability - the ability of different devices, IT systems and software to communicate, exchange and use shared data - will play an increasingly important role in providing timely, accurate care based on access to real-time patient health data and records. An emerging open, standards-based technology platform enables health systems, providers and app vendors to share and integrate health data from multiple sources, making pertinent patient information securely accessible when and where it's needed. Telemedicine success hinges on connections. Hospitals (the digital hospitals) need to integrate hardware and software to improve remote care delivery experiences for patients. Fueled by three converging trends - increasing government support and reimbursement for telehealth services; purpose-built, integrated hardware-software solutions; and the "consumerization of medical devices" - telemedicine is expected to grow rapidly.
Big Data and machine learning, the Internet of Things, and mobile patient platforms are starting to change everyday healthcare practice. As a result, the industry is facing a second wave of digitalization, sometimes also referred to as "Healthcare 4.0". The goal is for more patients to be seen, diagnosed and cared for in more affordable and effective ways, and to recognize that health and care management needs a patient-centric business model: care should occur wherever the patient is, not just in hospitals or physician offices.

- Smart Hospitals and Future Technology  

Smart hospitals are those that optimize, redesign or build new clinical processes, management systems and potentially even infrastructure, enabled by an underlying digitized networking infrastructure of interconnected assets. The aim is to provide valuable services or insights that were not possible or available earlier, achieving better patient care, experience and operational efficiency. Artificial Intelligence (AI) command centers (or AI-driven care coordination centers) are helping hospitals harness analytics to manage operations. They can process streams of real-time data from multiple sources and offer alerts and suggested actions to help the hospital track patient progression, predict and prevent safety risks, and manage the workload of its staff.

The smart hospital framework involves three essential layers: data, insight, and access. Data is already being collected today, although not necessarily from all systems in a hospital, but it is rarely integrated to derive 'smart' insight - something that can be done by feeding it into analytics or machine-learning software. That insight must then be accessible to the user - a doctor, a nurse, facilities personnel, or any other stakeholder - through an interface such as a desktop, smartphone, or similar handheld device, empowering them to make critical decisions faster and improving their efficiency. 
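The three layers can be sketched as a simple pipeline, with each layer as a function. This is a toy model under stated assumptions: real systems would swap in hospital IT integrations for the data layer, machine-learning models for the insight layer, and dashboards or mobile apps for the access layer. All record fields and values here are made up.

```python
# Sketch of the data -> insight -> access layering described above.
# Every name and value is illustrative.

def collect_data(sources):
    """Data layer: merge records from multiple hospital systems."""
    merged = []
    for source in sources:
        merged.extend(source)
    return merged

def derive_insight(records):
    """Insight layer: stand-in for analytics/ML - here, a bed-occupancy rate."""
    occupied = sum(1 for r in records if r["occupied"])
    return {"occupancy_rate": occupied / len(records)}

def present(insight):
    """Access layer: format the insight for a dashboard or handheld device."""
    return "Occupancy: {:.0%}".format(insight["occupancy_rate"])

emr = [{"bed": "A1", "occupied": True}, {"bed": "A2", "occupied": False}]
facility = [{"bed": "B1", "occupied": True}, {"bed": "B2", "occupied": True}]
print(present(derive_insight(collect_data([emr, facility]))))
```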

- Smart Hospitals and Beyond

Any smart hospital addresses three areas: operations, clinical tasks, and patient centricity. Operational efficiency can be achieved by employing building automation systems and smart asset maintenance and management solutions, along with improving the internal logistics of mobile assets, pharmaceutical, medical device, supplies, and consumables inventory, as well as control over people flow (staff, patients, and visitors). These solutions not only reduce operational costs such as energy use, but also reduce the need for capital expenditures - for example, by improving utilization rates of existing mobile equipment. Addressing patient-flow bottlenecks improves efficiency, allowing more patients to move through the system and creating more revenue opportunities at lower cost. 
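The utilization-rate argument above can be made concrete with a short calculation: asset tracking tells the hospital how many hours each mobile device is actually in use, and a low fleet average suggests redeploying existing equipment rather than buying more. The asset names and hours below are invented for the example.

```python
# Illustrative mobile-asset utilization calculation. A smart tracking
# system would supply (hours_in_use, hours_available) per asset; here
# the numbers are made up.

def utilization_rate(hours_in_use, hours_available):
    return hours_in_use / hours_available

fleet = {
    "infusion_pump_01": (9, 24),
    "infusion_pump_02": (3, 24),
}
rates = {name: utilization_rate(u, a) for name, (u, a) in fleet.items()}
avg = sum(rates.values()) / len(rates)
print("fleet average: {:.0%}".format(avg))  # a low average argues against new purchases
```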
The widespread use of the Internet of Things (IoT), especially smart wearables, will play an important role in improving the quality of medical care, bringing convenience to patients, and improving hospital management. However, because of the limitations of existing communication protocols, there is no unified architecture that can connect all intelligent things in a smart hospital - something the emergence of Narrowband IoT (NB-IoT) makes possible. Accordingly, researchers have proposed NB-IoT-based architectures for connecting intelligent things in smart hospitals, introducing edge computing to meet the latency requirements of medical processes.
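The edge-computing idea can be sketched as a routing decision: latency-critical device data is processed at an on-site edge node, while delay-tolerant telemetry goes to the cloud. The device classes, latency budgets, and threshold below are assumptions for illustration, not figures from any NB-IoT specification.

```python
# Sketch of edge-vs-cloud routing by latency budget. All device types
# and millisecond values are hypothetical.

LATENCY_BUDGET_MS = {
    "cardiac_monitor": 50,    # latency-critical clinical device
    "infusion_pump": 100,
    "asset_tag": 5000,        # delay-tolerant tracking telemetry
    "env_sensor": 10000,
}
EDGE_THRESHOLD_MS = 200  # budgets tighter than this stay on-site

def route(device_type):
    """Decide whether a device's data is processed at the edge or in the cloud."""
    budget = LATENCY_BUDGET_MS.get(device_type, EDGE_THRESHOLD_MS)
    return "edge" if budget < EDGE_THRESHOLD_MS else "cloud"

print(route("cardiac_monitor"))  # edge
print(route("asset_tag"))        # cloud
```

The design choice mirrors the paragraph above: NB-IoT provides a unified, low-power link for many device classes, while the edge node absorbs the traffic that cannot tolerate a round trip to the cloud.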



<updated by hhw: 3/26/2021>


