Does technology really help save millions of lives? How will the medicine of the future be?


Futurists predict that we are living on the edge of the technological singularity, an era in which technological development becomes so unstoppable that it fundamentally changes human civilization. Medicine will change too, because it depends entirely on technical development, new inventions and scientific discoveries. In this joint project, Lenta.ru and Sberbank look at the changes to come.


Thirty years from now, a person wakes up and, as usual, takes a shower. But the shower cabin is essentially a robot with a CT scanner and sensors that track dozens of health indicators. The data is analyzed by artificial intelligence and then reviewed by a specialist, after which the person receives an injection of an individually selected medical "cocktail" of anti-inflammatory drugs. By the time the shower ends, the user already feels the positive effect...

This is how Miles Romney, co-founder of the medical software company eVisit, describes the near future. Romney is convinced that dramatic changes await medicine as it integrates with information technology, artificial intelligence and robotics.

Patients will no longer need to come to the hospital in person and wait in line at the doctor's office to consult a specialist and learn a diagnosis. All of this will become possible thanks, among other things, to telemedicine.

Humanity became convinced of telemedicine's importance in 2020, when the entire planet faced the COVID-19 pandemic. Remote care played a role in keeping patients and physicians alike safe.

According to experts, online consultations not only saved time but also significantly reduced the cost of transporting seriously ill patients: in the UK alone, the NHS reports, more than half a billion pounds were saved in a single year.



Telemedicine can be traced back to the end of the 19th century, when wired communication appeared. In 1897, a doctor diagnosed a child over the telephone in the middle of the night. The Lancet, which reported the case, was the first to raise the question of monitoring patients remotely.

Several decades later, in 1925, inventor Hugo Gernsback, publisher of Science and Invention, predicted that in the future doctors would treat patients remotely using a device he called the "teledactyl". This ingenious unit consisted of manipulators with long, slender fingers, connected remotely to similar devices whose movements they repeated. With their help, the doctor could remotely touch the patient and observe him on a huge screen. As Gernsback envisioned it, the teledactyl could measure body temperature and pulse, listen to the lungs, and transmit the data to a doctor almost instantly. The doctor could even write a prescription by asking the patient to place a pen in the teledactyl's fingers.

Gernsback literally predicted modern technology. He understood that the telephone, radio and television would completely change many aspects of daily life. His teledactyl is essentially the surgical robot that already exists today, with which a doctor can perform a procedure without standing in the operating room next to the patient.

Robotic surgery began to develop in the 1980s, 55 years after Gernsback's predictions. One of the most successful surgical systems is da Vinci, whose development was funded by the US military.


The original plan was for such a system to perform operations in combat zones, essentially right on the battlefield, reducing combat losses while the operating surgeon controlled the robot via telemedicine from somewhere else. A prototype was built: a mobile vehicle with automated surgical equipment into which a wounded soldier was placed, while the doctor operated from a nearby mobile hospital. The development was successfully tested on animals, but its widespread adoption is still a matter of the future.

Robotic operations have nevertheless become a reality: the first was performed in 2001 using the same da Vinci system. By 2012, more than 200,000 surgeries had been performed, and by 2020 there were about 5,000 da Vinci surgical robots operating around the world.

The list of surgeries robots can potentially perform is truly astounding: from repairing heart tissue and treating herniated discs to removing tumors and open brain surgery.

However, robots do not have to be surgeons. They can become a kind of "teledactyl on wheels", helping doctors examine patients from a distance. Autonomous robots will move from room to room and, when necessary, automatically return to a docking station to charge, saving doctors and nurses time. Such robots need artificial intelligence (AI) and computer vision to recognize obstacles and plot routes along hospital corridors.

One such robot is Dr Rho, developed by the Indian company Vyas Labs. It consists of a mobile body and a screen for communication between patient and doctor. The device is equipped with an optical system that tracks the doctor's gestures and movements, a set of manipulators, and medical instruments: an electronic stethoscope, a tonometer, a thermometer, an electrocardiograph and a pulse oximeter.

Another robot, Stevie, helps care for the elderly by playing and interacting with them. Equipped with autonomous navigation, Stevie can move through the corridors of a nursing home without assistance. It recognizes simple voice commands, such as "help me", and alerts staff to urgent care needs.


Artificial intelligence and robotics

Humanity is entering the era of high-speed internet and cloud technologies. Connections are becoming more reliable and secure, and the quality of telemedicine services is improving with them.

Artificial intelligence helps doctors identify diseases, make diagnoses, and analyze images and test results. It can free doctors from routine operations and improve the quality of medical services overall.

For example, if the AI detects that a patient's indicators deviate from the norm, it can bring this to the attention of the attending physician, saving valuable time. AI can also help with another important aspect of treatment: medications. In the future, it will monitor the treatment regimen and reduce the likelihood of non-compliance with the doctor's recommendations.
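At its simplest, this kind of alerting amounts to comparing incoming measurements against reference ranges. The sketch below is purely illustrative: the indicator names, thresholds and function are hypothetical, not part of any system described in this article, and real clinical systems use far more sophisticated, personalized models.

```python
# Minimal sketch (hypothetical names and thresholds): flag vital signs
# that fall outside a reference range so a physician can be alerted.

# Hypothetical reference ranges (low, high) for a few routine indicators.
REFERENCE_RANGES = {
    "heart_rate_bpm": (60, 100),
    "spo2_percent": (95, 100),
    "temperature_c": (36.1, 37.2),
}

def flag_abnormal(vitals: dict) -> list:
    """Return the names of indicators outside their reference range."""
    alerts = []
    for name, value in vitals.items():
        low, high = REFERENCE_RANGES[name]
        if not (low <= value <= high):
            alerts.append(name)
    return alerts

patient = {"heart_rate_bpm": 112, "spo2_percent": 93, "temperature_c": 36.8}
print(flag_abnormal(patient))  # ['heart_rate_bpm', 'spo2_percent']
```

In practice the "norm" would be adjusted per patient (age, medications, baseline), which is exactly where AI promises to outperform fixed thresholds.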

According to Yuri Krestinsky, Vice President and Head of the Healthcare Sector at Sberbank, Sberbank's current developments help doctors and medical personnel in their daily work by automating and streamlining routine operations.

One of Sberbank's projects is the TOP-3 smart doctor assistant, developed jointly with the Moscow government. Using artificial intelligence, it identifies the three most likely diagnoses out of 265 disease groups, which cover 95 percent of all diagnoses Russians receive at a first doctor's visit. The final decision, however, always rests with the doctor. The model is installed in all adult clinics in Moscow, and more than 3,500 doctors use it.


"SberMedII, together with the Sberbank Artificial Intelligence Laboratory, created the CT of Lungs service," Krestinsky gives another example. "The service searches for signs of disease, including those caused by COVID-19, and segments affected areas of lung tissue, indicating the extent of the lesion. Based on a computed tomography scan, it identifies patients with lung changes characteristic of viral pneumonia and estimates the volume and extent of those changes within seconds."

Sberbank's developments do not end there. In particular, Yuri Krestinsky describes the "CT Stroke" service:

Based on artificial intelligence algorithms, it automatically marks up CT scans and allows the type of stroke and the extent of damage to be assessed quickly and accurately. This helps clinicians, regardless of their experience and knowledge, make fast and consistent treatment decisions.

These and many other products and solutions from SberMedII, as well as from other companies in the Sber ecosystem and its partners, are combined in a single platform: the Digital Medical Diagnostic Center (MDDC). The platform was created to help clinicians make diagnoses and medical decisions based on initial input data and on automated and laboratory diagnostics.

In many regions of Russia, hospitals are already using another AI product, Voice2Med (developed by the STC group of companies), which fills in medical documents in real time by converting the doctor's voice into text.


The Voice2Med solution from the STC group of companies is successfully used in medical institutions in 30 regions of Russia by doctors of various specialties, with radiologists the most active users. The Center for Diagnostics and Telemedicine recorded a 22 percent reduction in the time needed to prepare medical reports in Moscow Health Department facilities, where radiologists have already prepared more than 95 thousand reports using speech recognition technology. This year, two new medical terminology dictionaries were added: a surgeon's dictionary, used to fill in patient diaries and protocols for operations and in the emergency room, and a cardiologist's dictionary.


Another voice robot, from the RTC group of companies, helps medical institutions work more efficiently, saving doctors' time and letting them focus on examining and talking with patients. The virtual assistant can remind patients of upcoming appointments, freeing canceled or rescheduled slots for other patients and improving the utilization of medical facilities, which is especially important today. During a reminder call, the bot can also advise the patient on how to prepare for a procedure and collect information about symptoms.


Algorithms around the world can already run tests and help doctors with diagnosis. For example, AI can analyze a patient's medical records and suggest treatment based on them. This is not a prediction of the future but a working reality: IBM Watson Health is already being used to develop cancer treatment plans at Jupiter Medical Center in Florida. Watson can review a patient's history and suggest treatment options, and analyze the effectiveness of current therapy to determine which treatment will ultimately give the best result.


Telemedicine is convenient because the patient saves time and can talk to a doctor from home or any other place with internet access, while avoiding contact with potential sources of infection. An appointment can be arranged in minutes, with no queues and no time spent on the road or in waiting rooms. A large proportion of questions for the doctor can be resolved remotely at the initial appointment.


The remote monitoring platform has been deployed in more than 38 regions of Russia and provides remote monitoring of COVID-19 and chronic non-communicable diseases for more than 50 thousand patients. It launched during the first wave of the COVID-19 epidemic, so many patients were able to receive help at a time when doctors' workloads had risen sharply and specialists could not devote much time to each patient. Since April 2020, telemedicine doctors have monitored the health of tens of thousands of people online, identifying approximately five thousand patients whose condition was deteriorating. The total number of consultations exceeded 20,000, with an average "appointment" lasting about 15 minutes.


By the power of thought

Patients are not always able to move freely. They may suffer from serious neuromuscular disorders, be paralyzed, and be unable to tell a doctor what is bothering them. To improve these patients' quality of life, neural interfaces are being developed that let a person exchange information directly with a computer and other devices, such as a prosthesis or even an exoskeleton that restores movement.


Neural interfaces combined with virtual reality technology could help rehabilitate patients who have survived a stroke and lost the ability to move normally. The device reads signals from the brain when a person intends, for example, to raise an arm, and electrical muscle stimulation moves the limb in the desired direction. Meanwhile, the patient can be inside a VR environment with specific tasks to perform, such as grasping a virtual ball or doing simple exercises.


Brain-computer interfaces are still at an early stage of development, but it is already possible to imagine networks connecting many people's brains at once to exchange information literally by the power of thought. Such applications could be a breakthrough in treating autism spectrum disorders, in which a person has difficulty communicating with others, sometimes to the point of being completely unable to socialize.


Artificial sensory organs already exist: the cochlear implant, for example, attaches to the auditory nerve and compensates for severe hearing loss when conventional hearing aids are powerless.


An estimated 2.2 million people with amputations live in the United States alone. Prosthetics help people who have lost limbs return to full lives. Yet prosthetic arms and legs can be uncomfortable, unreliable or painful, leading patients to abandon them, while in science fiction prosthetics are almost indistinguishable from real limbs and even give their users extra abilities.


An important task is to give the patient full control over the prosthesis and sensory feedback from it. This problem lies in the same area as neural interfaces. Prototypes of neural prostheses already exist that contain networks of electrodes placed in nerves, muscles and the brain. When a patient with quadriplegia (paralysis of the arms and legs) tries to move an arm, or simply imagines the movement, the corresponding neural activity arises in the motor cortex. The neural interface can read this activity, decode it, and perform the desired action.
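Conceptually, the "decode" step maps recorded neural activity to an intended movement. The toy sketch below shows the idea with a fixed linear mapping; the channel count, weights and function are invented for illustration, while real interfaces train personalized decoders (often Kalman-filter based) on each patient's recorded data.

```python
# Illustrative sketch only: a toy linear decoder mapping per-channel
# firing rates to a 2-D intended velocity (vx, vy). The weights are
# made up; real systems calibrate them per patient.
import numpy as np

# Hypothetical decoding weights: 4 recorded channels -> (vx, vy).
W = np.array([
    [ 0.5, -0.1],
    [-0.2,  0.4],
    [ 0.3,  0.3],
    [ 0.0, -0.5],
])

def decode_velocity(firing_rates: np.ndarray) -> np.ndarray:
    """Map a vector of per-channel firing rates to an intended velocity."""
    return firing_rates @ W

rates = np.array([10.0, 5.0, 8.0, 2.0])  # spikes/s on each channel
print(decode_velocity(rates))  # [6.4 2.4]
```

The bidirectional interfaces described below add the reverse path as well: stimulating sensory cortex so that touches on the prosthesis are felt by the patient.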


In 2016, scientists first tested a prototype of an advanced neural interface based on the "Utah Array" electrode grid on 28-year-old quadriplegic patient Nathan Copeland. Paralyzed after a car accident, he had retained only the ability to move his forearms. Copeland became the first person in the world to have electrodes implanted in both the somatosensory and motor cortex, allowing him not only to move a prosthesis but also, to some extent, to feel it: Nathan could experience touches to the robotic hand's individual fingers as if they were touches to his own fingertips. The interface was bidirectional, transmitting signals both from the prosthesis to the brain (touch) and from the brain to the prosthesis (hand movement).



