Would you be comfortable going in for a physical if your doctor were a robot?
Source: www.fwthinking.com
Not long ago, researchers published an interesting article in the Journal of Radiology. The researchers used software to analyze patients with pulmonary hypertension, a serious condition in which high blood pressure in the lungs can weaken the heart.
The researchers trained the software using a machine learning strategy to analyze MRI scans as well as the results of blood tests.
The software could observe 30,000 points on the heart in motion during a heartbeat. Researchers fed eight years' worth of data into the software, representing 256 different patients.
Sadly,
many of those patients had passed away. More than half of those diagnosed with
pulmonary hypertension died within five years. There are treatments ranging
from moderate to extreme.
But doctors can't always be sure which path to take. The software's job is to give doctors more information in an effort to save patients' lives.
The software turned out to be more effective at predicting which patients would survive for a year than human doctors.
Humans scored at 60 percent accuracy, while the software hit 80 percent. Using the software could help doctors make critical decisions for their patients.
This is
just one way artificial intelligence is lending a hand in the medical industry.
Another is through simple Image Recognition software.
Source: www.samadimd.com
Well, I say simple because we see examples of image recognition in our daily lives, such as social media platforms that prompt you to tag your friends in pictures. But teaching machines to analyze and recognize elements in photographs and video isn't easy. Then there's IBM.
You might remember that IBM planned for Watson, the computer that's also a Jeopardy! champion, to put its simulated brain power to work for the medical industry.
IBM calls it cognitive healthcare. Watson's job is to look at many points of data in an effort to find meaning in them.
According to IBM, 80 percent of health data is unstructured, and most modern systems can't make any use of it.
Watson's ability to find patterns and links can lead to more effective diagnosis and treatment. Watson can also help doctors mitigate risks to patients.
In a
white paper, IBM researchers explain that effective predictive models can give
doctors more information about likely outcomes of any specific course of action.
Doctors
can work with patients to explore different approaches to treatment with a
deeper understanding of potential consequences and side effects.
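As a rough illustration of that "likely outcome of a course of action" idea (and not IBM's actual model), here is a small Python sketch: a model is trained on synthetic records that include which treatment was chosen, and is then asked for the predicted chance of a good outcome for the same patient under each option. Every name and number in it is a made-up placeholder.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic records: a few clinical features plus a column encoding which
# treatment was given, and a binary outcome label.
rng = np.random.default_rng(1)
n_records = 500
clinical = rng.normal(size=(n_records, 5))          # stand-in clinical features
treatment = rng.integers(0, 2, size=(n_records, 1)) # 0 or 1 = treatment chosen
X = np.hstack([clinical, treatment])
outcome = rng.integers(0, 2, size=n_records)        # 1 = good outcome (synthetic)

model = LogisticRegression(max_iter=1000).fit(X, outcome)

# For one new patient, compare the predicted chance of a good outcome
# under each treatment option.
patient = rng.normal(size=(1, 5))
for option in (0, 1):
    x = np.hstack([patient, [[option]]])
    p_good = model.predict_proba(x)[0, 1]
    print(f"treatment {option}: predicted chance of good outcome = {p_good:.2f}")
```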
The goal is to create the best treatment for the patient without putting them at unnecessary risk. IBM is also working on deep learning strategies to train software to recognize medical problems, like tumors, from images.
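For a sense of what that looks like in code, here is a hedged sketch of a tiny convolutional network that classifies an image as "tumor" versus "no tumor." The 64x64 grayscale input, the two-class setup, and the architecture are illustrative assumptions; real systems like IBM's are far more elaborate.

```python
import torch
import torch.nn as nn

# A minimal convolutional classifier for a grayscale medical image.
class TinyTumorClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x64x64 -> 16x64x64
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 16x32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 32x32x32
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 32x16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, 2)      # two classes: tumor / no tumor

    def forward(self, x):
        x = self.features(x)
        x = x.flatten(start_dim=1)
        return self.classifier(x)

model = TinyTumorClassifier()
dummy_batch = torch.randn(4, 1, 64, 64)  # stand-in for a batch of real scans
logits = model(dummy_batch)
print(logits.shape)  # torch.Size([4, 2])
```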
Johns
Hopkins has run trials with doctors analyzing mammogram results with computer
assistance. But it's still early days for this technology and it may be some
time before it's effective enough to warrant wide deployment.
We're definitely not at the point where a machine is going to do all the thinking. These tools are meant to augment a doctor's ability to treat patients.
If we ever do see a future in which machines diagnose and treat human patients, it will likely take decades of advancements in machine intelligence and robotics, and those advancements could have downsides as well.
But in the meantime, we can enjoy the benefits of powerful computers containing the sum total of our medical knowledge, which is pretty forward-thinking if you ask me.
My question to you guys this week: how long do you think it will be before we see the world's first honest-to-goodness robot doctor? Let me know your guess in the comments below.
Thanks for reading.