Between Doctors
Blog

Second opinions

How to use AI safely to organize a second-opinion packet

A safety-first guide to using AI for organizing medical records, timelines, and questions for a second opinion without asking AI to diagnose or decide treatment.

AI Safety · AI-safety informational · Reviewed 2026-05-11 · 8 min

Second-opinion prep: 5 key points to organize before the visit

  1. What AI can safely organize
  2. Keep sources attached
  3. Prompt AI for organization, not judgment
  4. Build the packet sections
  5. Protect privacy

Quick Answer

AI can be useful for organizing a second-opinion packet. It can help turn scattered notes into:

  • a short medical story,
  • a timeline of visits, reports, and treatment advice,
  • a medicine and supplement list as reported,
  • a source-document checklist,
  • a missing-record list,
  • questions for the clinician.

AI should not decide:

  • your diagnosis,
  • whether a treatment is right,
  • whether one doctor is wrong,
  • whether a symptom is urgent,
  • whether to start, stop, restart, or change medicines,
  • which doctor to trust.

WHO AI health guidance emphasizes ethics, human autonomy, safety, transparency, and accountability. FDA material on AI/ML-enabled medical devices emphasizes that AI technologies used in medical products require careful lifecycle management and regulatory review when they function as medical devices. A patient-facing second-opinion packet should treat AI as an organizer, not a clinician.

What AI can safely organize

AI can help with low-risk organization when you keep source documents attached and clinician review central.

Useful tasks:

  • sort reports by date,
  • extract report names, dates, and ordering clinician,
  • summarize your story in your own words,
  • turn a long note into a short timeline,
  • list current medicines and supplements from your notes,
  • create questions for the second-opinion visit,
  • flag missing records,
  • draft a respectful explanation of why you are seeking another opinion.

MedlinePlus and NIH communication guidance support preparing questions, writing down symptoms, taking notes, and tracking records, test results, treatment plans, and medicines. AI can help make that preparation tidier, but it should not replace the clinician's interpretation.
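One of the tasks above, sorting reports into a timeline, is simple enough to sketch directly. This is an illustrative example only; the report entries and field names are made up, not a Between Doctors format.

```python
from datetime import date

# Hypothetical report entries as a patient might record them.
reports = [
    {"name": "CBC panel", "date": date(2026, 3, 2), "ordered_by": "Dr. A"},
    {"name": "Chest X-ray", "date": date(2025, 11, 18), "ordered_by": "Dr. B"},
    {"name": "Thyroid panel", "date": date(2026, 1, 9), "ordered_by": "Dr. A"},
]

# Sort oldest-first so the packet reads as a timeline.
timeline = sorted(reports, key=lambda r: r["date"])

for r in timeline:
    print(f'{r["date"].isoformat()}  {r["name"]}  (ordered by {r["ordered_by"]})')
```

The point of the sketch is the ordering rule, not the tooling: whatever you use, the packet should present records oldest-first with name, date, and ordering clinician attached.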

Keep sources attached

A safe second-opinion packet should not contain free-floating AI statements. For every important point, include where it came from:

  • Diagnosis label or working diagnosis: report, visit note, prescription, or "patient was told"
  • Medicine: prescription, label photo, discharge summary, or patient list
  • Lab value: original lab report with date, units, reference range
  • Imaging finding: original imaging report
  • Treatment advice: visit note, message, discharge note, or patient recollection
  • Symptom timeline: patient/caregiver note

If the source is missing, mark it "missing." Do not let AI invent the source.

Prompt AI for organization, not judgment

Safer prompts:

  • "Turn these notes into a timeline for doctor discussion. Do not diagnose or recommend treatment."
  • "List missing records I should ask the clinic for. Do not interpret results."
  • "Create respectful second-opinion questions based on these notes."
  • "Summarize the current medicines exactly as written. Do not suggest changes."
  • "Separate facts from questions and uncertainties."

Unsafe prompts:

  • "Which doctor is right?"
  • "What diagnosis do I have?"
  • "Should I take this medicine?"
  • "Can I wait to see a doctor?"
  • "Which treatment should I choose?"
  • "Is this report dangerous?"

Between Doctors should keep AI output visibly tied to source material and labeled for doctor discussion only.
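The safer-prompt pattern above can be enforced mechanically: append the same non-judgment guardrail to every organizing request so it is never forgotten. This is a sketch of the idea, not a real Between Doctors feature; the guardrail wording is taken from the examples in this section.

```python
GUARDRAIL = (
    "Organize for doctor discussion only. Do not diagnose, "
    "recommend treatment, interpret results, or decide urgency."
)

def safe_prompt(task: str) -> str:
    """Append the non-judgment guardrail to an organizing task."""
    return f"{task.strip()} {GUARDRAIL}"

print(safe_prompt("Turn these notes into a timeline."))
```

A wrapper like this does not make an AI tool safe by itself, but it keeps the "organization, not judgment" boundary in every request by default.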

Build the packet sections

Use these sections:

  1. Reason for second opinion
  2. Short patient story
  3. Timeline
  4. Current medicines and supplements
  5. Prior advice received
  6. Key reports and documents
  7. Questions for the clinician
  8. Missing or unclear details
  9. Safety note

AHRQ's QuestionBuilder helps patients and caregivers prepare questions for medical appointments. NICE shared decision-making guidance supports conversations that include reliable information, options, risks, benefits, consequences, and patient preferences. Your AI-assisted packet should make that conversation easier.

Protect privacy

Before using any AI tool, consider what information you are entering and who controls it. WHO AI guidance highlights privacy, data protection, transparency, responsibility, and accountability concerns in AI for health.

Practical safety steps:

  • avoid uploading more personal information than needed,
  • remove unrelated identifiers where possible,
  • keep original files separate from AI summaries,
  • check AI output against the source documents,
  • label uncertain points,
  • do not publish or share someone else's health details without permission,
  • ask the patient before involving family members or caregivers unless they cannot safely participate.

This is privacy hygiene, not legal advice.
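The "remove unrelated identifiers" step can be partially automated before pasting notes into a tool. The patterns below are illustrative US-style examples only; automated scrubbing misses a great deal, does not guarantee de-identification, and is no substitute for manual review.

```python
import re

# Illustrative identifier patterns; real records need careful manual review.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN removed]"),
    (re.compile(r"\b\d{3}[.-]\d{3}[.-]\d{4}\b"), "[phone removed]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[email removed]"),
]

def scrub(text: str) -> str:
    """Strip a few obvious identifiers before sharing notes with a tool."""
    for pattern, replacement in PATTERNS:
        text = pattern.sub(replacement, text)
    return text
```

Treat this as a first pass at most: names, addresses, dates of birth, and record numbers still need to be checked by hand.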

What not to ask AI to decide

Do not ask AI:

  • whether your diagnosis is correct,
  • whether your doctor is wrong,
  • whether to choose surgery, a procedure, a medicine, dialysis, antibiotics, hormones, chemotherapy, psychiatric medicines, supplements, or any other treatment,
  • whether to start, stop, restart, or change a medicine,
  • whether a symptom is an emergency,
  • whether a lab or scan is normal or abnormal for you,
  • whether you can delay care,
  • which clinician is better.

AI can summarize. Clinicians diagnose, prescribe, set monitoring plans, and decide urgency.

When to seek urgent help

Do not wait for AI, a second-opinion packet, or a routine appointment if symptoms feel urgent, severe, rapidly worsening, or connected to emergency instructions you were already given. Use local emergency services, urgent care, or your clinician's emergency instructions.

Seek urgent or emergency medical care for severe symptoms, rapidly worsening symptoms, fainting, severe breathlessness, chest pain, confusion, signs of stroke, severe allergic reaction, heavy bleeding, severe pain, or any symptom that feels like an emergency. MedlinePlus emergency guidance lists warning signs such as breathing problems, change in mental status, chest pain or discomfort, fainting, severe allergic reaction, and sudden inability to speak, see, walk, or move.

Create your Between Doctors profile

Between Doctors is designed for this safe use case: organizing a patient-owned profile for doctor discussion.

Your profile can include:

  • story,
  • timeline,
  • medicines and supplements as reported,
  • reports and source documents,
  • prior advice,
  • questions,
  • missing details,
  • safety note that the profile is not diagnosis, prescription, dose-change advice, emergency-care advice, or doctor replacement advice.

Start here: Create Patient Profile.

Read the site AI boundary: What AI can and cannot do with your health profile.

For crawler/AI-facing product context, see llms.txt.

Frequently Asked Questions

Can AI create my second-opinion packet?

AI can help organize your records, timeline, medicine list, and questions. You should check every output against the original source documents, and a clinician must review medical decisions.

Can AI tell me which doctor is right?

No. AI should not judge doctor quality or decide which recommendation is medically correct. A second-opinion visit should focus on facts, reasoning, missing information, risks, benefits, and next steps.

Is it safe to upload medical records into AI?

It depends on the tool, privacy controls, and what information you upload. Use the least information needed, keep source files secure, and avoid sharing someone else's health information without permission.

What should an AI-assisted packet include?

Include the reason for the second opinion, short story, timeline, medicines and supplements, key reports, prior advice, questions, and missing details. Keep the source documents attached.

Can AI interpret my lab report?

AI may extract test name, date, units, and reference range for organization. It should not interpret what the result means for you or decide urgency.

What is the safest AI prompt?

Use prompts that say "organize for doctor discussion only" and "do not diagnose, recommend treatment, interpret results, or decide urgency."

Sources

  1. Ethics and governance of artificial intelligence for health

    World Health Organization • WHO guideline • 2021-06-28

  2. Regulatory considerations on artificial intelligence for health

    World Health Organization • WHO publication/guidance • 2023-10-19

  3. Digital health

    World Health Organization • Global public-health resource • Page date not listed; page includes 2025-2026 updates

  4. Artificial Intelligence in Software as a Medical Device

    U.S. Food and Drug Administration • Regulator medical-device AI resource • Not listed; page references 2024-2025 FDA AI actions

  5. QuestionBuilder App

    Agency for Healthcare Research and Quality • Government patient question-preparation tool • Last reviewed June 2022

  6. Make the most of your doctor visit

    MedlinePlus Medical Encyclopedia • NIH patient education • Review date 2024-09-15

  7. Talking With Your Doctor or Health Care Provider

    National Institutes of Health • NIH patient communication guidance • Last reviewed 2025-03-04

  8. Shared decision making, NICE guideline NG197

    National Institute for Health and Care Excellence • Clinical guideline • Last reviewed 2021-06-17

  9. Recognizing medical emergencies

    MedlinePlus Medical Encyclopedia • NIH patient emergency education • Published 2025

Medical information only

This article summarizes public medical sources to help you organize questions, records, and next steps for a doctor visit. It is not a diagnosis, treatment recommendation, medication-change guide, or emergency advice. For personal medical advice, contact a licensed clinician. If symptoms feel urgent or severe, seek local emergency care.