The moment you realize something is wrong
It might be a staff member who can't log into the electronic health record (EHR) system. A screen covered in encrypted filenames. An email from a patient asking why they received a strange message from your practice. Or just a gut feeling that something's off.
Discovering a breach is disorienting. Your pulse goes up. Your first instinct is to do something — anything. Most people reach for the power button. That's the wrong move, and it's one of the most consequential mistakes you can make in the first hour.
Before anything else: take a breath, pull out a notepad, and start writing. Record the exact time, which system or device is involved, who found it, and what they saw. This timestamp matters — HIPAA's notification clock starts from the moment of "discovery," and your documentation needs to be airtight.
Resist the urge to immediately shut down all systems. Resist the urge to call every person you know. Resist the urge to delete the suspicious email. Your first job is to slow down and contain — not destroy.
Hour 0–12: Contain without destroying
// Hour 0–4: Isolate the affected system
Containment means cutting off the affected system's ability to communicate — without wiping out the evidence. The right move is to unplug the ethernet cable and disable WiFi on the affected machine. Do not power it off.
If you're dealing with ransomware, speed matters. Ransomware encrypts files and then typically tries to spread laterally across your network. Every minute the infected system stays connected is another minute it can reach your EHR, your billing system, and every shared drive your staff uses. Pull the network plug first, then think.
After isolating the machine, your next call should be to your IT provider — not your front desk staff, not a family member who "knows computers." Your IT provider needs to assess what happened before anyone else touches anything.
- Unplug the ethernet cable from the affected device
- Disable WiFi on the device (if possible without powering off)
- Do not log out, restart, or power off the machine
- Do not delete logs, files, or emails related to the incident
- Do not run antivirus scans that might overwrite forensic artifacts
- Call your IT provider and tell them what you saw and when
Powering off an infected system can destroy the forensic evidence you'll need to prove what happened — or didn't happen — to OCR. Volatile memory (RAM) contains active processes, attacker tooling, and encryption keys that disappear the moment the machine loses power. A forensics firm needs that machine running.
// Hour 4–12: Assess the scope
Once you've contained the immediate threat, the next job is understanding how bad it actually is. This is where most practices either overcorrect (assume everything is compromised) or undercorrect (assume nothing important was touched). Both are expensive mistakes.
Work with your IT provider to answer these questions — and write down every answer with a timestamp:
- Which systems were affected? Just one workstation, or the server too?
- Was ePHI (electronic Protected Health Information) on any of those systems?
- Was the ePHI accessed, exfiltrated, or just encrypted and unavailable?
- How many patient records could potentially be involved?
- What type of incident is this: ransomware (availability), unauthorized access (confidentiality), or data exfiltration (both)?
- Are any other systems on the network showing signs of compromise?
The distinction between encryption and exfiltration matters enormously for HIPAA purposes. If a ransomware attack encrypted your files but there's no evidence data left your network, your documented risk assessment may support rebutting the breach presumption — meaning you may not need to notify patients. If data was actually exfiltrated to an external server, that's a different situation entirely.
Hour 12–48: Make the hard calls
// Hour 12–24: Legal, insurance, and law enforcement
This is the window where most small practices make their most costly mistake: they wait. They want to know more before they call anyone. They don't want to worry people. They're hoping it turns out to be nothing.
Don't wait. Three calls need to happen in this window, in roughly this order:
- Legal counsel with healthcare experience. General business attorneys are not equipped for HIPAA breach response. You need someone who has handled OCR investigations. They'll guide every decision from this point forward and their communications may be protected by attorney-client privilege.
- Your cyber insurance carrier. Most policies have a 24–72 hour notification requirement. Miss it, and you may lose coverage for the entire incident. Your carrier will also often direct you to a preferred forensics firm and breach coach — use them.
- The FBI, if ransomware is involved. This surprises many practice managers, but the FBI's Cyber Division actively investigates healthcare ransomware. Reporting doesn't obligate you to anything, and the FBI sometimes has decryption keys or intelligence about the specific ransomware variant that can help you recover faster. Call your local field office.
Your insurer will typically recommend — or require — bringing in a forensics firm. Don't skip this step to save money. The forensics report is what gives you the documented basis for your HIPAA risk assessment, and it's what you'll hand to OCR if they come calling.
// Hour 24–48: Documentation and notification prep
By now you should have a clearer picture of what happened. This is when your written documentation becomes critical. Start a formal incident log — a running record of every action taken, by whom, at what time. This log will be central to your OCR response if a complaint is filed.
- Compile a list of all potentially affected patients (name, contact information, what PHI was exposed)
- Begin drafting patient notification letters with your attorney's guidance — OCR specifies what these must include
- Do not send any notifications yet — wait for legal review
- Notify any affected business associates (EHR vendor, billing company, clearinghouse) — they may have their own notification obligations
- Preserve all system logs, email records, and forensic images — treat them as legal documents
Your notification letters must include: a description of what happened, the types of PHI involved, steps patients should take to protect themselves, what your practice is doing to investigate, and contact information for questions. OCR's website has sample language. Your attorney should review before anything goes out.
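The incident log doesn't need special software; an append-only text file with one timestamped line per action is enough, as long as entries are never edited after the fact. A minimal sketch in Python (the file name and field layout are illustrative, not a regulatory requirement):

```python
from datetime import datetime, timezone
from pathlib import Path

LOG_FILE = Path("incident_log.txt")  # illustrative path; use whatever your practice standardizes on

def log_action(actor: str, action: str, log_file: Path = LOG_FILE) -> str:
    """Append one timestamped entry; never rewrite or delete earlier lines."""
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    entry = f"{stamp} | {actor} | {action}"
    with log_file.open("a", encoding="utf-8") as f:
        f.write(entry + "\n")
    return entry

# Example entries mirroring the containment steps above (names are hypothetical)
log_action("J. Rivera (office manager)", "Unplugged ethernet on front-desk workstation")
log_action("J. Rivera (office manager)", "Called IT provider; described ransom lock screen")
```

The append-only discipline is the point: a log that shows real-time entries, in order, with no retroactive edits is far more credible to OCR than a narrative reconstructed weeks later.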
The HIPAA timeline you need to understand
HIPAA's Breach Notification Rule sets specific deadlines, and they run from the date of discovery — not the date the breach actually happened. If you discovered a breach on March 1st, your clock started March 1st, even if the attacker had been in your system since January.
- Affected individuals: Must be notified without unreasonable delay, and in no case later than 60 calendar days after discovery
- HHS/OCR (500+ patients): Report to the HHS Breach Portal within 60 days of discovery
- HHS/OCR (fewer than 500 patients): Report annually, within 60 days of the end of the calendar year in which the breach was discovered
- Media notification: Required if 500 or more residents of a single state or jurisdiction are affected — notification goes to major print or broadcast media in that area
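The deadlines above are mechanical date arithmetic from the discovery date, which makes them worth pinning to the calendar on day one. A sketch of the calculation (confirm the resulting dates with counsel before relying on them):

```python
from datetime import date, timedelta

def notification_deadlines(discovery: date, affected: int) -> dict:
    """Outer HIPAA Breach Notification Rule deadlines, computed from discovery."""
    deadlines = {
        # Individuals: without unreasonable delay, no later than 60 days
        "individuals": discovery + timedelta(days=60),
    }
    if affected >= 500:
        # Large breach: HHS Breach Portal report within the same 60-day window
        deadlines["hhs_portal"] = discovery + timedelta(days=60)
    else:
        # Small breach: within 60 days of the end of the calendar year of discovery
        year_end = date(discovery.year, 12, 31)
        deadlines["hhs_annual"] = year_end + timedelta(days=60)
    return deadlines

# Breach discovered March 1st, 2024, affecting 120 patients
d = notification_deadlines(date(2024, 3, 1), affected=120)
# d["individuals"] -> 2024-04-30; d["hhs_annual"] -> 2025-03-01
```

Note that these are outer limits, not targets — "without unreasonable delay" means notifying sooner if you're ready sooner.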
Not every unauthorized access automatically triggers full notification. HIPAA allows covered entities to conduct a risk assessment to determine whether there is a "low probability that the PHI has been compromised." If you can document that conclusion, notification may not be required. But this determination has to be documented — you can't just decide it informally and move on.
OCR applies a four-factor test when evaluating whether notification is required:
- The nature and extent of the PHI involved, including the types of identifiers and the likelihood of re-identification
- Who accessed or could have accessed the PHI — an authorized employee or an external attacker?
- Whether the PHI was actually acquired or viewed, or whether access was opportunistic without meaningful exposure
- The extent to which the risk to the PHI has been mitigated — was the data recovered, was the attacker identified?
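Whatever form the assessment takes, it has to end up written down. One way to force all four factors to be answered explicitly is a simple structured record — the field names and example values below are illustrative, not OCR's terminology:

```python
from dataclasses import dataclass, asdict

@dataclass
class BreachRiskAssessment:
    """Documented four-factor risk assessment under the Breach Notification Rule."""
    phi_nature: str          # Factor 1: identifiers involved, re-identification risk
    recipient: str           # Factor 2: who accessed or could have accessed the PHI
    acquired_or_viewed: str  # Factor 3: was the PHI actually acquired or viewed
    mitigation: str          # Factor 4: extent to which the risk has been mitigated
    low_probability_of_compromise: bool  # the documented conclusion
    assessed_by: str
    assessed_on: str

# Hypothetical example for a contained ransomware incident
assessment = BreachRiskAssessment(
    phi_nature="Names and appointment dates; no SSNs or financial data",
    recipient="External attacker via phishing-compromised workstation",
    acquired_or_viewed="No evidence of exfiltration in firewall or endpoint logs",
    mitigation="Credentials rotated; workstation imaged for forensics, then rebuilt",
    low_probability_of_compromise=True,
    assessed_by="Privacy officer with outside counsel",
    assessed_on="2024-03-05",
)
```

A blank field is a question you haven't answered yet — and an assessment with unanswered factors won't support a decision not to notify.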
Hour 48–72: Keeping the practice running
While the response team works the breach, your practice still has patients. Staff need to know what to do — and what not to do. This is where a lot of practices improvise poorly.
- Activate paper-based protocols for intake, prescriptions, and scheduling — if you don't have them written down, you're improvising under pressure
- Brief staff on what happened at a high level: "We've had a security incident and we're working with IT and legal to respond. Please do not discuss this externally or post anything on social media."
- Designate a single point of contact for external questions — this should not be a front desk employee
- Contact your EHR vendor if their systems were involved — they may have their own notification obligations under their Business Associate Agreement
- Document any patient care impacts that resulted from system downtime
The social media instruction matters more than most administrators realize. A well-meaning staff member posting "rough week at the office, dealing with a hacker situation" can compromise your legal position, trigger media attention before you're ready, and may violate HIPAA if it discloses information about the incident prematurely.
What OCR actually looks for
When OCR investigates a breach — whether triggered by a patient complaint or a report you filed yourself — they are not primarily looking to punish you for being attacked. They're looking at how you responded.
The questions OCR asks in an investigation are revealing:
- Did you have a written incident response plan before the breach occurred?
- Did you follow that plan when the incident happened?
- Did you document your actions in real time with timestamps?
- Did you notify affected individuals and HHS within the required timeframes?
- Did you conduct and document a proper risk assessment?
- Did you take corrective action after the breach to prevent recurrence?
Practices that respond well — that can hand OCR a documented timeline of their response, a completed risk assessment, and evidence that notifications went out on time — frequently avoid fines even after significant breaches. Practices that can't produce documentation, missed notification deadlines, or never had a response plan are the ones that end up in settlement agreements and corrective action plans.
OCR's Resolution Agreements are public. Reading through them, you'll notice a pattern: the fines aren't just for being breached. They're for the pre-existing failures — no risk analysis, no policies, no workforce training — that OCR discovers when they start digging.
The thing that actually determines your outcome
After all the timelines and checklists, two factors determine whether a healthcare breach becomes a manageable incident or a practice-altering disaster: your backups and your plan.
If you have current, tested, offsite backups — and by "current" we mean updated within the last 24 hours, and by "tested" we mean you've actually restored from them — ransomware becomes a nuisance rather than a catastrophe. You lose a day of operations, not your entire practice history. If you don't have those backups, you're at the mercy of the attacker.
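"Current" is a checkable property, not an aspiration, and a small script run daily can verify it. A sketch (the backup path is hypothetical — point it at your actual offsite backup mount):

```python
import time
from pathlib import Path

MAX_AGE_HOURS = 24

def backup_is_current(backup_path: Path, max_age_hours: int = MAX_AGE_HOURS) -> bool:
    """True if the newest file under backup_path was modified within max_age_hours."""
    files = [p for p in backup_path.rglob("*") if p.is_file()]
    if not files:
        return False  # an empty backup location is never "current"
    newest = max(p.stat().st_mtime for p in files)
    return (time.time() - newest) <= max_age_hours * 3600

# Hypothetical mount point for the offsite backup share:
# backup_is_current(Path("/mnt/offsite-backup"))
```

No script can verify "tested," though — that still requires periodically restoring real data from the backup and confirming it opens.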
And if you have a written incident response plan — one that's been reviewed with your team, that has specific names and phone numbers in it, that your staff has practiced at least once — the first four hours of a breach look completely different. You're not improvising. You're executing.
If you don't have either of those things today, that's the gap to close first. It doesn't require a massive project — a 15-minute conversation with someone who knows what they're looking at is a reasonable starting point.