AI Analyzing Voice Data to Create “Predictive Medical Diagnoses”


NEW PODCAST HIGHLIGHTS HOW AI IS ANALYZING VOICE DATA TO MAKE PREDICTIVE MEDICAL DIAGNOSES

AI, or artificial intelligence, has been creating an impressive news footprint for the past year or so, and with good cause.  We can expect more to come, at least for the next few human generations.  But this particular story feels like the lid coming off a more specific Pandora’s box, one that involves healthcare and individual privacy.  A new episode of the podcast “In Machines We Trust” focused on the way that universities, medical researchers, practitioners, and private industry are using AI to make medical diagnoses.  This entire path makes sense in terms of utility.  But like most things involving AI, the pitfalls seem bottomless.


SCIENCE FICTION IS ONCE AGAIN SCIENCE FACT, AS OUR VOICE DATA IS OUT IN THE WILD

One notable quote from the podcast: “Hidden away in our voices are signals that may hold clues to how we’re doing, what we’re feeling, and even what’s going on with our physical health. Now, AI systems tasked with analyzing these signals are moving into healthcare.”  Anyone who has watched enough movies or TV shows has been exposed to the idea of a computer asking a human whether they are feeling okay because it can detect high “stress levels” in their voice.  But that science fiction has yet again moved into science fact.  Now, today, this minute, and forever moving forward.
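For a concrete sense of what “analyzing these signals” can look like in practice, here is a minimal, hypothetical sketch: a recording is summarized as acoustic features and scored by a simple statistical model.  The file names, labels, and classifier below are illustrative assumptions, not any system discussed in the podcast.

```python
# Minimal, hypothetical sketch: summarize voice recordings as acoustic
# features (MFCCs) and score them with a simple classifier. File paths,
# labels, and the model itself are placeholders for illustration only.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def extract_features(path: str) -> np.ndarray:
    """Represent one recording as time-averaged MFCCs, a common voice feature."""
    audio, sample_rate = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sample_rate, n_mfcc=13)
    return mfcc.mean(axis=1)  # one 13-dimensional vector per recording

# Assume a handful of labeled recordings exist: 0 = "baseline", 1 = "elevated stress".
train_paths = ["calm_01.wav", "calm_02.wav", "stressed_01.wav", "stressed_02.wav"]
train_labels = [0, 0, 1, 1]

X = np.stack([extract_features(p) for p in train_paths])
model = LogisticRegression().fit(X, train_labels)

# Score a new recording: the output is a probability, not a medical diagnosis.
new_vector = extract_features("new_caller.wav").reshape(1, -1)
print("estimated stress probability:", model.predict_proba(new_vector)[0, 1])
```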


AS A MEDICAL TOOL, AI ANALYSIS OF VOICE DATA IS FANTASTIC, BUT BAD OR GREY ACTORS COULD TURN IT TO THEIR OWN ENDS

Already, AI voice analysis is starting to outperform human doctors on some diagnostic tasks.  While that’s a good thing, especially as AI learns to pick up on more diagnostic nuance, it should give us pause.  Why?  Because if AI can infer how we are feeling emotionally, as well as diagnose hidden physical or mental illnesses, bad actors (or even supposed grey actors) could use these tools without our knowledge or consent.  Do you know how many times your voice has been recorded without your knowledge or consent?  Is your voice already part of a database, a sellable resource to anyone in the world?

Knowledge is power.  And now there is a new possible (or already realized) power over the very knowledge of yourself and your wellbeing.  Perhaps more knowledge than you yourself possess.
