
2026 California Privacy Law Update - Risk Assessments

  • Deserae Abed-Rabbo
  • Jan 20
  • 4 min read

Updated: Feb 4

While several states have introduced new privacy laws, California remains at the forefront of U.S. privacy regulation. The Delete Act and other updated privacy rules will bring fresh compliance challenges for businesses serving California residents in 2026. Companies should prepare by developing a plan to navigate these changes.


Understanding the New Privacy Regulations


Among the changes is the requirement for businesses to conduct formal risk assessments when they engage in certain processing activities, as discussed below.


Step 1 – Determine Whether a Risk Assessment Is Required


Starting January 1, 2026, businesses must conduct formal risk assessments before processing personal information in a manner that presents a “significant risk” to consumer privacy. “Significant risks” include:


  • Selling or sharing personal information.

  • Processing sensitive personal information, such as gender, sexual orientation, or immigration status.

  • Using Automated Decision-Making Technology (ADMT) for significant decisions concerning consumers. ADMT is broadly defined as “any technology that processes personal information to replace or substantially replace human decision-making.”

  • Profiling, which means using automated processing to infer or extrapolate a consumer’s intelligence, ability, aptitude, performance at work, economic situation, health (including mental health), personal preferences, interests, reliability, predispositions, behavior, or movements, based on that consumer’s presence in a sensitive location.


Sensitive Locations


“Sensitive location” can include healthcare facilities, such as hospitals, doctors’ offices, urgent care facilities, and community health clinics; pharmacies; domestic violence shelters; food pantries; housing/emergency shelters; educational institutions; political party offices; legal services offices; union offices; and places of worship.


Additionally, profiling includes using automated processing to infer or extrapolate a consumer’s intelligence, ability, aptitude, performance at work, economic situation, health (including mental health), personal preferences, interests, reliability, predispositions, behavior, location, or movements, based on systematic observation of that consumer when they are acting in their capacity as an educational program applicant, job applicant, student, employee, or independent contractor for the business.


Note: This will be particularly significant for employers that use AI tools to hire, evaluate, or promote employees.


Step 2 – Complete the Risk Assessment


The risk assessment requirements include:


  • Stakeholder Involvement: The business must include relevant stakeholders in the risk assessment process.

  • Written Risk Assessment: Each written assessment must determine whether the risks to consumers’ privacy outweigh the benefits of the processing to the business, consumers, and the public. To do so, the written report must include:

1. Purpose of Processing: The business's purpose of processing the personal information.

2. Personal Information Processed: Categories of personal information involved, including sensitive personal information and minimization practices.

3. Operational Details: Outline the processing, including the method of collection/processing, length of retention, applicable disclosures, approximate number of consumers affected, and the categories of third parties involved, if any.

4. Benefits: The potential benefits from the processing to both the business and consumers, e.g., what value the processing creates, whether through improved services, enhanced security, cost savings, or other outcomes.

5. Harm: The potential negative impacts/harm to consumers’ privacy associated with the processing. This critical element requires assessment of risks, such as unauthorized access, discriminatory outcomes, loss of autonomy, surveillance concerns, and/or reputational harm.

6. Safeguards to Mitigate Risks: These may include, for example, encryption, privacy-enhancing technologies, network segmentation, data minimization, and/or bias testing.


NOTE: If ADMT is used, the report should also include:


  • The ADMT’s logic, assumptions, and limitations.

  • Output and how the output will be used to make a significant decision.

  • If the business makes ADMT available to another business for use in a significant decision, it must provide the facts necessary for that recipient to conduct its own risk assessment.


  • Decision: Whether the business will proceed with the processing after weighing the benefits against the risks.

  • Approval by an authorized decision-maker: Names and positions of all stakeholders involved should be included.


After the Risk Assessment is Completed


  • Ongoing Updates to Assessments: All risk assessments must be updated within forty-five (45) days of any material change in the processing or risks.

  • Retention of Assessments: All versions of the assessments must be retained for as long as the processing continues, or for at least five (5) years after the assessment is completed, whichever is later.


Step 3 – Complete Risk Assessments on Time


For new processing activities that begin on or after January 1, 2026, businesses must:


  1. Complete risk assessments before processing begins.

  2. Review those assessments at least every three (3) years.


For existing processing activities initiated prior to January 1, 2026, businesses must conduct a risk assessment no later than December 31, 2027.


Step 4 – Draft and Submit Annual Summary


Starting on April 1, 2028, businesses must submit an annual summary to the California Privacy Protection Agency (CPPA) with the following:


  • Time period covered.

  • Number of assessments completed.

  • Data categories assessed.

  • Certification by an executive attesting to compliance under penalty of perjury.


The CPPA and the California Attorney General may request a copy of any risk assessment report, which must be submitted within thirty (30) days of the request. Information concerning risk assessments conducted in 2026 and 2027 must be submitted to the CPPA by April 1, 2028.


Conclusion


This is just one of many updates to privacy law in California that went into effect on January 1, 2026. If you need a partner in navigating these changes, please contact us!


Note: ADMT is broadly defined as “any technology that processes personal information and uses computation to replace human decision-making or substantially replace human decision-making.” ADMT “substantially replaces” human decision-making when its output is used to make a decision without meaningful human review. ADMT does not include purely technical tools like web hosting, spellcheckers, calculators, or anti-virus software, provided they do not replace human decision-making.
