California doctors to record patient visit notes with Apple Watch

Posted in General Discussion
Clinicians in California are to use an Apple Watch system that lets them dictate their notes after a visit, and then have the transcription, plus relevant health data, automatically appended to a patient's medical records.

Apple Watch Series 5 has many new features


A new service for doctors from Altais will have them wear Apple Watches during patient visits and then use the technology to drastically cut the time needed to write up their notes. The system, a platform developed in conjunction with Notable Health and Blue Shield of California, uses machine learning to automate parts of the process.

"Our goal is to help physicians seamlessly leverage technology to improve the health and well-being of their patients," said Jeff Bailet, M.D., president and CEO of Altais, "all while reducing administrative hassles and enhancing their professional gratification."

Instead of typing notes and entering details into a patient's Electronic Health Record (EHR), a doctor will be able to dictate them into his or her Apple Watch.

This can be done during or after the visit, but once the notes are entered into the Apple Watch system this way, natural language processing will determine the key points. The most relevant data will then be added to the EHR automatically.
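The article doesn't describe how Notable Health's pipeline actually works. As a rough illustration only of the "dictate, extract key points, append to the record" flow it describes, a toy keyword/regex sketch might look like the following; the function name, field names, and patterns are all hypothetical, and a real system would use trained clinical NLP models rather than hand-written rules:

```python
import re

def extract_key_points(transcript: str) -> dict:
    """Toy sketch: pull a few structured fields out of a dictated visit note.

    Purely illustrative -- real clinical NLP is far more involved.
    """
    points = {}

    # Look for a spoken blood pressure reading, e.g. "blood pressure is 138 over 85"
    bp = re.search(
        r"blood pressure (?:is |of )?(\d{2,3})\s*(?:over|/)\s*(\d{2,3})",
        transcript, re.IGNORECASE,
    )
    if bp:
        points["blood_pressure"] = f"{bp.group(1)}/{bp.group(2)}"

    # Look for a prescription, e.g. "Prescribed lisinopril 10 mg"
    rx = re.search(
        r"prescrib\w*\s+([A-Za-z]+)\s+(\d+)\s*mg",
        transcript, re.IGNORECASE,
    )
    if rx:
        points["prescription"] = {"drug": rx.group(1).lower(), "dose_mg": int(rx.group(2))}

    return points

note = "Patient's blood pressure is 138 over 85. Prescribed lisinopril 10 mg daily."
print(extract_key_points(note))
# → {'blood_pressure': '138/85', 'prescription': {'drug': 'lisinopril', 'dose_mg': 10}}
```

The interesting design question, which the commenters below raise, is what happens when extraction like this is wrong: anything appended to an EHR automatically still needs clinician review.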

"As a general internist taking primary care of an elderly population with multiple complex illnesses," says Richard Thorp, MD, president and CEO of Paradise Medical Group, "I will now have a maximally efficient workflow, streamlined data entry, and patient input pre-built into each of my patient encounters, and that is extremely exciting."

As well as cutting down admin time for doctors and making sure data is captured accurately, the service is intended to directly assist patients, too.

A Blue Shield app will allow patients in the region whose doctors are on the program to be reminded of appointments, check their insurance, and run self-assessment health surveys.

The Apple Watch is increasingly being used by health professionals, though it is chiefly assisting researchers looking into hearing, reproductive and general health.

Comments

  • Reply 1 of 15
    I record health visits with my doctor.
  • Reply 2 of 15
    How long until it turns out that snippets of audio are being reviewed for quality by an off-shore third party?
  • Reply 3 of 15
    How long until it turns out that snippets of audio are being reviewed for quality by an off-shore third party?
    I doubt that will be necessary. The quality of expert systems continues to improve and such systems analyzing X-ray data are more accurate than the average human. I suspect machine learning based expert systems will surpass their human counterparts in terms of diagnosis within the next 5 years or so.
  • Reply 4 of 15
chasm Posts: 3,275 member
    How long until it turns out that snippets of audio are being reviewed for quality by an off-shore third party?
This seems very unlikely, given that these transcriptions are part of the patient's record, therefore HIPAA standards come into play.
  • Reply 5 of 15
roake Posts: 809 member
As a physician, working at a hospital system that utilizes the “Cadillac” of electronic medical records (EMR) software, I can tell you that the machine learning and “automation” have a long, long way to go.

    The “automation” part of the machine learning basically looks for key words in my documentation as well as looking at lab results and diagnoses that have been entered by humans to try to match patterns and suggest additional medical issues.  The suggestions are wrong probably 70 percent of the time, and have been already documented 25 percent of the time (but the software doesn’t recognize this).

    The few times that the suggestions are “relevant,” I’m getting asked to clarify something like which bacteria is responsible for causing a pneumonia.  How the f*%k should I know?  Most of the time respiratory cultures don’t grow the culprit bacteria; if it DOES grow, it usually takes a few days before there is enough to identify, and we document that anyway.

I’ve found that the AI portion of this hundreds-of-millions-of-dollars EMR is more of a distraction than anything else.  I don’t know any physician who doesn’t simply ignore it.

On a related topic, I would point out that the documentation that a physician dictates must meet an EXTREMELY complex set of criteria, covering tons of nonsense bullet points that change based on the patient and various diagnoses.  Dictating an entire note as a single recording into a watch would bypass all the advantages that the crazy expensive EMR software grants you, such as automating certain parts of the note (insertion of lab values, test results, etc.).  The article also doesn’t say whether (A) Siri does the transcribing (which works extremely poorly when medical terminology is involved), (B) custom dictation software such as Dragon Medical is involved, or (C) a human transcription service types this up.

Lastly, to those of you who think computers are “good” at reading studies such as X-rays, EKGs, etc., I can assure you that while this may be true in science fiction movies, these things suck in real life.  The automated EKG interpretation algorithms have been progressing for decades, but still suck.  They get some things right, but get just as much wrong.  We aren’t going to see these things get anywhere near as good as humans anytime in the foreseeable future.
edited October 2019
  • Reply 6 of 15
roake Posts: 809 member
    chasm said:
    How long until it turns out that snippets of audio are being reviewed for quality by an off-shore third party?
This seems very unlikely, given that these transcriptions are part of the patient's record, therefore HIPAA standards come into play.
    To be HIPAA-compliant, these contractors would simply have to sign the appropriate agreements with the medical facility, or get a release.  It’s the same for medical facility office personnel, insurance agents, and anyone else that comes into contact with patient data.  The standard that must be met to show “need” to legally access those records is very low.  A TON of different people and organizations touch your medical records.

Ironically, medical staff cannot easily access their OWN medical records at all until the medical encounter (such as a hospitalization) is over.  They then have to go to Medical Records and request paper copies, unless they have been to a facility that allows access to the records online.  However, a 2-3 day hospitalization will generate one to two hundred pages of information (mostly automated crap).  The online access gives you access to only the tiniest portion of your hospital data.  Getting it through Medical Records gives you theoretical access to virtually everything.
edited October 2019
  • Reply 7 of 15
GeorgeBMac Posts: 11,421 member
    How long until it turns out that snippets of audio are being reviewed for quality by an off-shore third party?
    I doubt that will be necessary. The quality of expert systems continues to improve and such systems analyzing X-ray data are more accurate than the average human. I suspect machine learning based expert systems will surpass their human counterparts in terms of diagnosis within the next 5 years or so.
You may have missed the point: a third party "review" wouldn't be to improve accuracy but to clue in "health partners" on sales (oops!) I mean "health" opportunities.

    EHRs are not there to improve your health but to improve profits.   While physicians generally hate them, the large health care organizations keep rolling them out and expanding them because they are good for the health of the bottom line.
  • Reply 8 of 15
GeorgeBMac Posts: 11,421 member
    chasm said:
    How long until it turns out that snippets of audio are being reviewed for quality by an off-shore third party?
This seems very unlikely, given that these transcriptions are part of the patient's record, therefore HIPAA standards come into play.
HIPAA does not apply to "health partners" -- only to your friends and relatives.
  • Reply 9 of 15
GeorgeBMac Posts: 11,421 member
    roake said:
As a physician, working at a hospital system that utilizes the “Cadillac” of electronic medical records (EMR) software, I can tell you that the machine learning and “automation” have a long, long way to go.

    The “automation” part of the machine learning basically looks for key words in my documentation as well as looking at lab results and diagnoses that have been entered by humans to try to match patterns and suggest additional medical issues.  The suggestions are wrong probably 70 percent of the time, and have been already documented 25 percent of the time (but the software doesn’t recognize this).

    The few times that the suggestions are “relevant,” I’m getting asked to clarify something like which bacteria is responsible for causing a pneumonia.  How the f*%k should I know?  Most of the time respiratory cultures don’t grow the culprit bacteria; if it DOES grow, it usually takes a few days before there is enough to identify, and we document that anyway.

I’ve found that the AI portion of this hundreds-of-millions-of-dollars EMR is more of a distraction than anything else.  I don’t know any physician who doesn’t simply ignore it.

On a related topic, I would point out that the documentation that a physician dictates must meet an EXTREMELY complex set of criteria, covering tons of nonsense bullet points that change based on the patient and various diagnoses.  Dictating an entire note as a single recording into a watch would bypass all the advantages that the crazy expensive EMR software grants you, such as automating certain parts of the note (insertion of lab values, test results, etc.).  The article also doesn’t say whether (A) Siri does the transcribing (which works extremely poorly when medical terminology is involved), (B) custom dictation software such as Dragon Medical is involved, or (C) a human transcription service types this up.

Lastly, to those of you who think computers are “good” at reading studies such as X-rays, EKGs, etc., I can assure you that while this may be true in science fiction movies, these things suck in real life.  The automated EKG interpretation algorithms have been progressing for decades, but still suck.  They get some things right, but get just as much wrong.  We aren’t going to see these things get anywhere near as good as humans anytime in the foreseeable future.
Thanks, that was a good insider's insight.
As a patient, I find that any time I visit a new physician who has access to my EHR, I then spend half the visit correcting the misinformation in it.  Adding automatic transcription -- especially combined with AI editing -- would certainly compound the problem.
  • Reply 10 of 15
jmc54 Posts: 207 member
    roake said:
As a physician, working at a hospital system that utilizes the “Cadillac” of electronic medical records (EMR) software, I can tell you that the machine learning and “automation” have a long, long way to go.

    The “automation” part of the machine learning basically looks for key words in my documentation as well as looking at lab results and diagnoses that have been entered by humans to try to match patterns and suggest additional medical issues.  The suggestions are wrong probably 70 percent of the time, and have been already documented 25 percent of the time (but the software doesn’t recognize this).

    The few times that the suggestions are “relevant,” I’m getting asked to clarify something like which bacteria is responsible for causing a pneumonia.  How the f*%k should I know?  Most of the time respiratory cultures don’t grow the culprit bacteria; if it DOES grow, it usually takes a few days before there is enough to identify, and we document that anyway.

I’ve found that the AI portion of this hundreds-of-millions-of-dollars EMR is more of a distraction than anything else.  I don’t know any physician who doesn’t simply ignore it.

On a related topic, I would point out that the documentation that a physician dictates must meet an EXTREMELY complex set of criteria, covering tons of nonsense bullet points that change based on the patient and various diagnoses.  Dictating an entire note as a single recording into a watch would bypass all the advantages that the crazy expensive EMR software grants you, such as automating certain parts of the note (insertion of lab values, test results, etc.).  The article also doesn’t say whether (A) Siri does the transcribing (which works extremely poorly when medical terminology is involved), (B) custom dictation software such as Dragon Medical is involved, or (C) a human transcription service types this up.

Lastly, to those of you who think computers are “good” at reading studies such as X-rays, EKGs, etc., I can assure you that while this may be true in science fiction movies, these things suck in real life.  The automated EKG interpretation algorithms have been progressing for decades, but still suck.  They get some things right, but get just as much wrong.  We aren’t going to see these things get anywhere near as good as humans anytime in the foreseeable future.
    My wife uses EPIC, couldn't agree more!
  • Reply 11 of 15
neilm Posts: 985 member
    jmc54 said:
    roake said:
As a physician, working at a hospital system that utilizes the “Cadillac” of electronic medical records (EMR) software, I can tell you that the machine learning and “automation” have a long, long way to go.

    The “automation” part of the machine learning basically looks for key words in my documentation as well as looking at lab results and diagnoses that have been entered by humans to try to match patterns and suggest additional medical issues.  The suggestions are wrong probably 70 percent of the time, and have been already documented 25 percent of the time (but the software doesn’t recognize this).

    The few times that the suggestions are “relevant,” I’m getting asked to clarify something like which bacteria is responsible for causing a pneumonia.  How the f*%k should I know?  Most of the time respiratory cultures don’t grow the culprit bacteria; if it DOES grow, it usually takes a few days before there is enough to identify, and we document that anyway.

I’ve found that the AI portion of this hundreds-of-millions-of-dollars EMR is more of a distraction than anything else.  I don’t know any physician who doesn’t simply ignore it.

On a related topic, I would point out that the documentation that a physician dictates must meet an EXTREMELY complex set of criteria, covering tons of nonsense bullet points that change based on the patient and various diagnoses.  Dictating an entire note as a single recording into a watch would bypass all the advantages that the crazy expensive EMR software grants you, such as automating certain parts of the note (insertion of lab values, test results, etc.).  The article also doesn’t say whether (A) Siri does the transcribing (which works extremely poorly when medical terminology is involved), (B) custom dictation software such as Dragon Medical is involved, or (C) a human transcription service types this up.

Lastly, to those of you who think computers are “good” at reading studies such as X-rays, EKGs, etc., I can assure you that while this may be true in science fiction movies, these things suck in real life.  The automated EKG interpretation algorithms have been progressing for decades, but still suck.  They get some things right, but get just as much wrong.  We aren’t going to see these things get anywhere near as good as humans anytime in the foreseeable future.
    My wife uses EPIC, couldn't agree more!
    So EPIC Fail then!

Perhaps they should have chosen a different name for that product.
  • Reply 12 of 15
mobird Posts: 752 member
    roake said:
As a physician, working at a hospital system that utilizes the “Cadillac” of electronic medical records (EMR) software, I can tell you that the machine learning and “automation” have a long, long way to go.

    The “automation” part of the machine learning basically looks for key words in my documentation as well as looking at lab results and diagnoses that have been entered by humans to try to match patterns and suggest additional medical issues.  The suggestions are wrong probably 70 percent of the time, and have been already documented 25 percent of the time (but the software doesn’t recognize this).

    The few times that the suggestions are “relevant,” I’m getting asked to clarify something like which bacteria is responsible for causing a pneumonia.  How the f*%k should I know?  Most of the time respiratory cultures don’t grow the culprit bacteria; if it DOES grow, it usually takes a few days before there is enough to identify, and we document that anyway.

I’ve found that the AI portion of this hundreds-of-millions-of-dollars EMR is more of a distraction than anything else.  I don’t know any physician who doesn’t simply ignore it.

On a related topic, I would point out that the documentation that a physician dictates must meet an EXTREMELY complex set of criteria, covering tons of nonsense bullet points that change based on the patient and various diagnoses.  Dictating an entire note as a single recording into a watch would bypass all the advantages that the crazy expensive EMR software grants you, such as automating certain parts of the note (insertion of lab values, test results, etc.).  The article also doesn’t say whether (A) Siri does the transcribing (which works extremely poorly when medical terminology is involved), (B) custom dictation software such as Dragon Medical is involved, or (C) a human transcription service types this up.

Lastly, to those of you who think computers are “good” at reading studies such as X-rays, EKGs, etc., I can assure you that while this may be true in science fiction movies, these things suck in real life.  The automated EKG interpretation algorithms have been progressing for decades, but still suck.  They get some things right, but get just as much wrong.  We aren’t going to see these things get anywhere near as good as humans anytime in the foreseeable future.
    Sounds like they have been using Siri. /s ;)
  • Reply 13 of 15
kave Posts: 1 member
In Sweden, medical staff are not allowed to wear any wearables at all during practice: no watches, no jewelry. Why not just dictate to a phone or iPad?
  • Reply 14 of 15
AppleExposed Posts: 1,805 unconfirmed, member
    Hope Watch becomes the iPod of the medical industry.
  • Reply 15 of 15
GeorgeBMac Posts: 11,421 member
    Hope Watch becomes the iPod of the medical industry.
    That won't happen till the medical industry realizes that you can't dispense health from a bottle of pills and switches over to being an (actual) healthcare industry.