If your SaaS product touches health data in any way, you need to understand HIPAA. Violations can result in fines from $100 to $50,000 per violation, capped at $1.5 million per year for each violation category, and criminal penalties can include prison time. Compliance is not optional. But it does not require a massive budget or a dedicated compliance team at the early stage. This guide breaks down exactly what SaaS founders need to know: what HIPAA actually requires, which technical safeguards to implement, how to execute a Business Associate Agreement, and the most common mistakes that get startups in trouble.
First, determine if HIPAA applies to you. HIPAA applies to covered entities (healthcare providers, health plans, clearinghouses) and their business associates. If your SaaS product stores, processes, or transmits Protected Health Information (PHI) on behalf of a covered entity, you are a business associate. PHI includes any individually identifiable health information: names combined with diagnoses, treatment records, billing information, or even IP addresses if they are associated with health data.
If a hospital, clinic, therapist, health insurer, or any healthcare organization uses your software and their data flows through your systems, you are almost certainly a business associate. Even if you think you are not handling PHI, review carefully. A scheduling app for doctors that stores patient names and appointment types is handling PHI.
The Business Associate Agreement (BAA). Before any healthcare customer sends you PHI, you must have a signed BAA. This is a legal contract that specifies: what PHI you will handle, how you will protect it, how you will report breaches, and what happens when the relationship ends. Your cloud infrastructure providers also need to sign BAAs with you. AWS, Google Cloud, and Azure all offer BAAs. Supabase, Vercel, and many other platforms offer them on their enterprise or healthcare-specific plans. Do not store PHI on any service that will not sign a BAA.
Technical safeguards required by HIPAA. Encryption: PHI must be encrypted at rest and in transit. Use TLS 1.2 or higher for all data in transit. Use AES-256 encryption for data at rest. Most cloud databases and storage services provide this by default, but verify and document it.
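To make the at-rest requirement concrete, here is a minimal sketch of application-level AES-256 encryption using Python's widely used `cryptography` package. The function names and the idea of prepending the nonce are illustrative choices, not a prescribed design; many teams instead rely on (and document) their database's built-in encryption.

```python
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM


def encrypt_phi(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt a PHI payload with AES-256-GCM; the 12-byte nonce is prepended."""
    nonce = os.urandom(12)  # never reuse a nonce with the same key
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)


def decrypt_phi(key: bytes, blob: bytes) -> bytes:
    """Split off the nonce and decrypt; raises if the ciphertext was tampered with."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)


# A 32-byte key gives AES-256. In production, keep keys in a KMS, never in code.
key = AESGCM.generate_key(bit_length=256)
```

GCM mode also authenticates the ciphertext, so tampering is detected at decryption time, which supports the integrity controls discussed below.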
Access controls: implement role-based access control (RBAC) so that only authorized users can access PHI. Every user must have a unique identifier (no shared accounts). Implement automatic session timeouts. Require multi-factor authentication for any access to systems containing PHI.
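A minimal RBAC gate can be sketched in a few lines. The role map, permission strings, and function names here are hypothetical; a real product would load roles from configuration and tie the check into its authentication layer.

```python
from functools import wraps

# Hypothetical role map for illustration; real systems load this from config.
ROLE_PERMISSIONS = {
    "clinician": {"phi:read", "phi:write"},
    "billing": {"phi:read"},
    "engineer": set(),  # minimum necessary: no default access to production PHI
}


class AccessDenied(Exception):
    pass


def requires(permission):
    """Decorator that gates a function on the caller's role permissions."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(user, *args, **kwargs):
            if permission not in ROLE_PERMISSIONS.get(user["role"], set()):
                raise AccessDenied(f"{user['id']} lacks {permission}")
            return fn(user, *args, **kwargs)
        return wrapper
    return decorator


@requires("phi:read")
def fetch_record(user, patient_id):
    # Placeholder for a real database read; each call should also be audit-logged.
    return {"patient_id": patient_id}
```

Note that the engineer role deliberately has no PHI permissions, which anticipates the minimum necessary rule covered later.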
Audit controls: log all access to PHI. Who accessed what data, when, and from where. Retain logs for at least six years (the HIPAA retention requirement). Use tamper-evident logging so that logs cannot be altered after the fact. Review audit logs regularly for suspicious access patterns.
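One common way to make logs tamper-evident is a hash chain: each entry includes the hash of the previous entry, so editing any record breaks every hash after it. The sketch below shows the idea under assumed field names; production systems typically use append-only storage or a managed logging service with the same property.

```python
import hashlib
import json
import time


def _digest(entry: dict) -> str:
    """Hash the entry fields, including the previous entry's hash."""
    payload = json.dumps(
        {k: entry[k] for k in ("ts", "user", "action", "resource", "prev")},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode()).hexdigest()


def append_access(log: list, user: str, action: str, resource: str) -> dict:
    """Record who accessed what, when; chain it to the prior entry."""
    prev = log[-1]["hash"] if log else "0" * 64
    entry = {"ts": time.time(), "user": user, "action": action,
             "resource": resource, "prev": prev}
    entry["hash"] = _digest(entry)
    log.append(entry)
    return entry


def verify_chain(log: list) -> bool:
    """Any edit to an earlier entry invalidates every hash after it."""
    prev = "0" * 64
    for entry in log:
        if entry["prev"] != prev or entry["hash"] != _digest(entry):
            return False
        prev = entry["hash"]
    return True
```

Running `verify_chain` as part of regular log review catches after-the-fact alterations, which is exactly what the audit-control requirement is aimed at.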
Integrity controls: implement mechanisms to ensure PHI is not improperly altered or destroyed. Use database transactions, checksums, and backup verification. Have a documented data backup and recovery plan with regular testing.
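The checksum half of this can be sketched simply: record a SHA-256 fingerprint when a backup is written, and compare it on restore. The function names are illustrative; the constant-time comparison is a conservative habit rather than a HIPAA mandate.

```python
import hashlib
import hmac


def checksum(data: bytes) -> str:
    """SHA-256 fingerprint recorded when a record or backup is written."""
    return hashlib.sha256(data).hexdigest()


def verify_restore(recorded: str, restored: bytes) -> bool:
    """Compare the stored fingerprint against a restored copy, in constant time."""
    return hmac.compare_digest(recorded, checksum(restored))
```

Running this check during scheduled restore drills gives you the "regular testing" evidence your documented recovery plan should reference.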
Transmission security: beyond encryption, implement controls to guard against unauthorized access during transmission. This means TLS for web traffic, encrypted email or secure messaging for any PHI sent via email, and VPN or private network connections for administrative access to production systems.
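If you terminate TLS in your own application rather than at a load balancer, Python's standard `ssl` module can enforce the TLS 1.2 floor directly. This is a sketch; the certificate paths are placeholders for wherever your deployment stores them.

```python
import ssl


def make_server_context(certfile=None, keyfile=None):
    """Build a server-side TLS context that refuses anything older than TLS 1.2."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject TLS 1.0/1.1 clients
    if certfile:
        ctx.load_cert_chain(certfile, keyfile)
    return ctx
```

If TLS terminates at a managed load balancer instead, the equivalent control is selecting a security policy that excludes TLS 1.0 and 1.1, and documenting that choice in your risk assessment.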
The Risk Assessment. HIPAA requires a documented risk assessment. This is not a one-time task. It must be updated annually or whenever you make significant changes to your systems. The risk assessment identifies: what PHI you handle and where it is stored, threats and vulnerabilities to that data, the likelihood and impact of each threat, and the safeguards you have in place (or plan to implement). The Department of Health and Human Services provides a Security Risk Assessment Tool that walks you through the process. Use it. It is free and it produces the documentation you need.
Breach notification requirements. If a breach of unsecured PHI occurs, you must notify the covered entity (your healthcare customer) without unreasonable delay, and no later than 60 days after discovery. The covered entity then notifies affected individuals, HHS, and potentially the media (for breaches affecting 500 or more individuals). Document every breach, no matter how small, in a breach log. Even near-misses should be documented.
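A breach log can start as something very simple, as long as it captures the discovery date and the resulting deadline. The field names below are illustrative; the 60-day outer limit and the 500-individual media threshold come straight from the notification rules described above (the media notice itself is the covered entity's obligation, so the flag here is just a reminder).

```python
from datetime import date, timedelta

BREACH_LOG = []


def log_breach(discovered: date, description: str, individuals_affected: int) -> dict:
    """Record every incident, however small; regulators expect a complete log."""
    entry = {
        "discovered": discovered,
        "description": description,
        "individuals_affected": individuals_affected,
        # Notify the covered entity well before this date; 60 days is the outer limit.
        "notify_by": discovered + timedelta(days=60),
        # Media notice (handled by the covered entity) applies at 500+ individuals.
        "media_notice_required": individuals_affected >= 500,
    }
    BREACH_LOG.append(entry)
    return entry
```

Logging near-misses through the same function keeps the record complete and makes patterns easier to spot during your annual risk assessment update.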
Common mistakes SaaS founders make. Using non-compliant infrastructure: storing PHI in a standard S3 bucket without encryption, or using a database service that will not sign a BAA. Neglecting employee training: every employee who could potentially access PHI must receive HIPAA training. Document the training. Ignoring the minimum necessary rule: only access and disclose the minimum amount of PHI necessary for the task. Your entire engineering team should not have access to production health data. Skipping the risk assessment: this is the single most-cited deficiency in HIPAA audits. Do the assessment and document it. Not having an incident response plan: know exactly what you will do if a breach occurs, before it occurs.
Getting started on a budget. Phase one: sign BAAs with your infrastructure providers. Phase two: implement encryption at rest and in transit. Phase three: implement access controls and audit logging. Phase four: conduct your first risk assessment. Phase five: create policies and procedures documentation. Phase six: train your team. This entire process can be completed in two to four weeks for an early-stage startup. As you grow, invest in automated compliance monitoring, penetration testing, and potentially SOC 2 certification, which covers many of the same controls as HIPAA.