Soon after I wrote about people using AI chatbots such as ChatGPT to challenge medical bills, Loyal Reader Terry messaged me with concerns about medical privacy:
Why would I ever want to share my medical history with AI to have it ingested into every other chatbot through their now or soon-to-be shared hive mind? It’s one thing to assume there could be a data breach with my own medical provider or Medicare. It seems a whole other level to willingly share that information with AI, where you have little to no control over where that information ends up.
I’ll admit, I hadn’t thought much about privacy. Perhaps this is not surprising — I’m the guy who writes about his Parkinson’s symptoms on the internet. Privacy isn’t my default setting. But Loyal Reader Terry has a point.
For better or worse, America is turning to “Dr. AI” in large numbers. OpenAI reported in January that over 5% of ChatGPT messages globally are about healthcare — about 40 million messages each day.
In the US alone, reports the company,
- Around 2 million messages per week are about health insurance—comparing plans, understanding prices, or disputing claims.
- Rural users send nearly 600,000 healthcare questions every week.
- And seven in ten of these chats happen after clinic hours.
People aren’t just curious—they’re desperate for answers, even at 2 a.m.
Caitlin Owens of Axios recently reported the story of her husband Luke, who started a late-night AI chat about his abdominal pains because he didn’t want to wake his wife and 3-month-old son. To its credit, ChatGPT correctly suspected gall bladder issues and urged him to get to an ER.
Owens’ article also discussed her mother’s illness and subsequent death — and the family’s use of ChatGPT to help evaluate the advice they were getting from the medical team. The family
…presented it with lab panels and echocardiogram summaries and asked it to translate the results, explain how the pieces fit together and tell us what should happen next in the diagnostic process. The same information went to my [heart specialist] friend. Though his word carried more weight, we were surprised by how consistent the two sets of responses were.
Abdominal pain symptoms. Lab panels. Echocardiogram summaries. Medical bills, often with patient information and diagnostic codes.
40 million times a day, people are uploading this stuff into chatbots without any idea where this information will go or how it will be used.
In most cases, generic chatbots are not HIPAA-compliant (specific services such as ChatGPT for Healthcare may be compliant in some circumstances). Each company has its own data rules, and few users read them. Once you upload sensitive information, your control over it is largely gone.
As the New York Times points out, there’s another risk few users consider:
One issue is that many people don’t opt out of handing over their data for training purposes. This creates the possibility that, if one person uploads medical data and someone else asks a future model about that person, a chatbot “might accidentally leak very sensitive information,” said Karni Chagal-Feferkorn, an assistant professor at the Bellini College of Artificial Intelligence, Cybersecurity and Computing at the University of South Florida.
I wrote to Caitlin Owens of Axios to ask for her take on the privacy issue. “Your husband shared his symptoms with ChatGPT; your family shared your mom’s lab results with the same bot,” I wrote. “What happens to that information? Millions of Americans are probably doing the same thing, without thinking of the possible privacy consequences. What guardrails are in place to protect us?”
Her reply, in full: “It’s definitely a good question and one that I understand could also become very legally relevant.”
If she is at all concerned about her own family’s privacy, she didn’t mention it. Which probably makes sense: like me, she writes online about her household’s medical issues.
Which brings us back to last issue’s topic:
How Can We Use AI to Challenge a Medical Bill
AND Protect Our Privacy?
I fired up my chatbot of choice — Perplexity — and asked the question. Here’s what Perplexity recommended:
Option 1: Redact your bill first
Before you paste anything into a chatbot, delete or obscure all personally identifiable information (PII) and protected health information that isn’t needed for the analysis.
Redact things like:
Full name, initials, username, signature
Street address, city, ZIP code, email, phone
Date of birth, age if very specific (e.g., “93-year-old”), Social Security number, medical record number, insurance ID, claim numbers
Provider account numbers, login links, barcodes/QR codes
Names of family members, employers, or rare personal details (like “only Parkinson’s patient in XYZ town”)
You can:
Copy the bill into a text editor and delete or replace identifiers with placeholders (e.g., “[PATIENT]”, “[MRN]”).
If you only have a PDF, print it, black-marker out identifiers, scan or photograph the redacted version, then use OCR (e.g., your phone’s “copy text from image”) to get text to paste.
De-identified health information like “CPT 99213, Parkinson’s disease visit, $285” is far less risky to share than a full bill that includes your name, address, medical record number, and insurer details.
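If you’re comfortable with a little scripting, the redact-and-replace step above can be partly automated. Here’s a minimal Python sketch; the patterns are illustrative assumptions, not a complete PII scrubber, and no pattern list catches everything — always review the result by eye before pasting it anywhere.

```python
import re

# Illustrative patterns only -- tune them to your bill's actual layout,
# and review the output by hand before sharing it with a chatbot.
REDACTIONS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),                 # Social Security number
    (re.compile(r"\b(?:MRN|Acct|Account)[:#]?\s*\w+", re.I), "[MRN]"),  # record/account numbers
    (re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"), "[PHONE]"),     # phone numbers
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),         # email addresses
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),          # dates like 3/14/2025
]

def redact(text: str) -> str:
    """Replace common identifiers with placeholders before pasting into a chatbot."""
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text
```

Paste your bill text in, and lines like “MRN: 8675309” come back as “[MRN]” — the CPT codes, descriptions, and dollar amounts you actually want analyzed stay untouched.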
.
Option 2: Share only parts of the bill
You don’t have to paste the whole thing. You can:
Paste just a few suspect lines (codes + descriptions + prices), redacted.
Ask the AI to explain those codes and compare typical costs.
Repeat for other sections instead of giving it everything at once.
Example prompt:
“Here are some anonymized lines from a medical bill (patient name, MRN, address, and ID numbers removed). Please: (1) explain each charge in plain English, (2) flag anything that looks like duplicate or unusual billing, and (3) suggest language I can use to question these charges with the billing office.”
I’m careful about financial risks — my credit files are frozen, I use two-factor authentication everywhere, my passwords are gobbledygook and I change all of them every few months.
But beyond that, I can’t get excited about privacy. It feels too late for that.
The cat’s already out of the bag, the horse is already out of the barn, and the beans have already been spilled.
Ultimately, we need to fashion our own balance between control and convenience, and know the risks each time we hit “Send.”
.
.
My Ping Pong Table Came With a Surprise
Last week, I took delivery of a ping pong table, to be used for our new Ping Pong for Parkinson’s program at Fremont United Methodist Church in Northeast Portland. We paid a bit extra to have the retailer’s delivery guys put the table together… because you really wouldn’t want me to do it.
Two nice young men brought the table in, unpacked everything, and quickly assembled it. While they worked, they asked what the table would be used for. I told them about the clinical research showing benefits for people with Parkinson’s, and shared with them that I had the disease.
When they were done I tipped them, thanked them, and they left.
A couple of minutes later there was a knock on the door. It was one of the delivery guys. I let him in, and he asked if it would be okay if he prayed for me.
I have never been a religious person, and nobody has ever asked me that before.
I said sure, and assumed he would go back to his truck and pray on the way to their next stop.
Au contraire — he was ready to pray right then and there. He took my hand, bowed his head, and asked Jesus to deliver his love and healing power to his new friend Phil.
Sixty seconds later he wrapped it up and headed out the door.
I still don’t know what to think about it.
_________________________________
Bonus for Those Who Read to the Bottom
From the Ted Mack Amateur Hour, five Mamaroneck, NY teenagers do the surf thing. Notable primarily for lead singer Ralph DiForio’s dancing, beginning at the 46-second mark.
.
Paul Anka swings Nirvana.
.
People keep asking me if there’s a Japanese hip hop version of the theme to “The Godfather.” It turns out there is.
.
Ringo Starr, Elton John, Marc Bolan and T. Rex attack “Tutti Frutti.” Sound quality isn’t great, but the enthusiasm and charisma make this a winner.