The California Privacy Protection Agency (CPPA) announced draft regulations for artificial intelligence (AI) and automated decision-making technology (ADMT) in November 2023.
The proposed rules are still being developed, but organisations should monitor their progress. Because California is home to many of the world’s largest technology companies, any new AI rules could have global repercussions.
ADMT uses covered by the draft regulations
Making significant decisions
ADMT used to make significant decisions about consumers would be subject to the draft rules. Significant decisions are those that affect consumers’ rights or access to essential goods, services, and opportunities.
For example, the draft rules would cover automated decisions that affect a person’s employment, education, healthcare, or access to credit.
Extensive profiling
Profiling means automatically processing personal data to evaluate, analyse, or predict attributes such as job performance, product interests, and behaviour.
“Extensive profiling” refers to specific kinds of profiling:
- Profiling consumers in work or education, such as by tracking employee performance with a keyboard logger.
- Profiling consumers in publicly accessible places, such as using facial recognition to analyse shoppers’ emotions in a store.
- Profiling consumers for behavioural advertising, which uses personal data to target adverts to them.
Training ADMT
Businesses that use consumers’ personal data to train certain ADMT tools would also be subject to the proposed rules. The rules would cover training ADMT to make significant decisions, establish individual identity, generate deepfakes, or perform physical or biological identification and profiling.
Who would be protected by the AI and ADMT regulations?
As a California law, the CCPA protects only California consumers, and the draft ADMT rules would do the same.
However, these rules define “consumer” more broadly than many other data privacy laws. They would apply not only to people in ordinary business interactions but also to employees, students, independent contractors, and applicants for jobs and school places.
How does the CCPA regulate AI and automated decision-making?
The draft CCPA AI regulations have three main requirements: businesses that use covered ADMT must notify consumers, offer opt-out options, and explain how their ADMT use affects them.
The CPPA has revised the regulations before and may do so again before adoption, but these core requirements appear in every draft. They are likely to survive in the final rules, even if the details of their implementation change.
Pre-use notices
Before using ADMT for a covered purpose, organisations must clearly and conspicuously notify consumers. The notice must explain, in plain language, how the business uses ADMT and consumers’ rights to learn more and to opt out.
A business cannot simply say, “We use automated tools to improve our services.” It must describe the specific use, for example: “We use automated tools to assess your preferences and deliver targeted ads.”
The notice must direct consumers to further information about how the ADMT works and how the business uses its outputs. This information need not appear in the notice itself; the business can provide a link or other means of access.
If consumers can appeal automated decisions, the pre-use notice must explain how.
Opt-out rights
Consumers can opt out of most ADMT uses. To support this right, businesses must offer at least two opt-out methods.
At least one opt-out method must use the business’s main channel for interacting with customers. For example, a business that deals with customers mainly online might offer a web form.
Opt-out procedures must be easy and not require account creation.
Within 15 days of receiving an opt-out request, a business must stop processing the consumer’s data. It can no longer use any of that consumer’s data it has already processed. The business must also notify any service providers or third parties with whom it shared the data.
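As an illustrative sketch of the 15-day window described above (the dates here are hypothetical), the compliance deadline can be computed like this:

```python
from datetime import date, timedelta

# The draft rules give businesses 15 days to honour an opt-out request.
OPT_OUT_WINDOW = timedelta(days=15)

def opt_out_deadline(received: date) -> date:
    """Return the last day to stop processing after an opt-out request."""
    return received + OPT_OUT_WINDOW

# Hypothetical request received on 1 March 2025:
print(opt_out_deadline(date(2025, 3, 1)))  # 2025-03-16
```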
Exemptions
Organisations need not let consumers opt out of ADMT used for safety, security, or fraud prevention. The draft rules cite uses such as detecting and responding to data security incidents, preventing and prosecuting fraud and other illegal activity, and protecting human physical safety.
Opt-outs are also not required if the organisation lets people appeal automated decisions to a qualified human reviewer with the authority to overturn them.
Organisations can also forgo opt-outs for certain narrow ADMT uses in work and school settings, including:
- Evaluating performance for admission, acceptance, and hiring decisions.
- Workplace task allocation and compensation.
- Profiling used solely to assess student or employee performance.
For these work and school uses to qualify for the opt-out exemption, the following criteria must be met:
- The ADMT must be necessary for, and used solely for, the business’s stated purpose.
- The business must formally evaluate the ADMT to ensure it is accurate and non-discriminatory.
- The business must maintain the ADMT’s accuracy and impartiality.
These exemptions do not apply to behavioural advertising or ADMT training; consumers can always opt out of those uses.
Access to ADMT use data
Customers can request information on how a company uses ADMT. Businesses must make it easy for customers to request this information.
When responding to access requests, organisations must explain why they used ADMT, what the outcome was for the consumer, and how the ADMT reached its decision.
Access request responses should also explain how consumers can exercise their CCPA rights, such as filing complaints or deleting their data.
Notice of adverse decisions
If a business uses ADMT to make a significant decision that adversely affects a consumer, such as terminating their employment, it must notify the consumer of their access rights.
The notice must include:
- An explanation that the business used ADMT to make the adverse decision.
- Notification that businesses cannot retaliate against consumers who exercise CCPA rights.
- How the consumer can get more ADMT usage information.
- If applicable, appeal instructions.
AI/ADMT risk assessments
The CPPA is drafting risk assessment regulations alongside the AI and ADMT requirements. Although they are separate sets of requirements, the risk assessment rules would affect how organisations use AI and ADMT.
The risk assessment rules would require organisations to conduct assessments before using ADMT for significant decisions or extensive profiling, and before training ADMT or AI models with personal data.
Risk assessments must identify the risks ADMT poses to consumers, the benefits to the organisation and other stakeholders, and measures to mitigate or eliminate the risks. Organisations are expected to refrain from using AI and ADMT when the risks outweigh the benefits.
The EU AI Act strictly regulates AI development and use in Europe.
The Colorado Privacy Act and Virginia Consumer Data Protection Act allow US consumers to opt out of having their personal data processed for important decisions.
In October 2023, President Biden signed an executive order mandating federal agencies and departments to establish, use, and oversee AI standards.
California’s planned ADMT regulations are more notable than other state laws because they may affect companies beyond the state.
Because California is home to much of the global technology industry, many of the most advanced makers of automated decision-making tools would have to comply with these rules. The consumer protections would apply only to California residents, although organisations might extend the same options to everyone for convenience.
Because it set a de facto national standard for data privacy, the original CCPA is sometimes called the US version of the GDPR. The new AI and ADMT rules may have a similar effect.
When do the CCPA AI and ADMT regulations take effect?
As the rules are still being finalised, it’s impossible to say. Many analysts expect the rules to take effect in mid-2025.
Further discussion of the regulations is scheduled for the July 2024 CPPA board meeting, where many expect the board to begin formal rulemaking. Because the agency would then have a year to finalise the rules, mid-2025 is the expected implementation date.
How will the rules be enforced?
As with other parts of the CCPA, the CPPA can investigate violations and fine organisations. The California attorney general can also levy civil penalties for noncompliance.
Organisations can be fined USD 2,500 per unintentional violation and USD 7,500 per intentional one. Each affected consumer counts as a separate violation, and because many violations affect many consumers, penalties can compound significantly.
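To make the compounding concrete, here is a minimal sketch of the penalty arithmetic; the per-violation amounts are the figures stated above, while the consumer count is hypothetical:

```python
# Per-violation fines under the CCPA, as described above (USD).
UNINTENTIONAL_FINE = 2_500
INTENTIONAL_FINE = 7_500

def total_penalty(affected_consumers: int, intentional: bool) -> int:
    """Each affected consumer counts as one violation."""
    per_violation = INTENTIONAL_FINE if intentional else UNINTENTIONAL_FINE
    return affected_consumers * per_violation

# Hypothetical incident affecting 1,000 consumers:
print(total_penalty(1_000, intentional=False))  # 2500000 (USD 2.5M)
print(total_penalty(1_000, intentional=True))   # 7500000 (USD 7.5M)
```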
What is the status of the CCPA AI and ADMT regulations?
The draft rules are still evolving. The CPPA continues to gather public comments and hold board discussions, so the rules may change before adoption.
The CPPA has already made major changes based on those comments. After the December 2023 board meeting, the agency expanded the opt-out exemptions and placed restrictions on physical and biological profiling.
The CPPA also changed the definition of ADMT to limit which tools the rules apply to. The earlier draft covered any technology that facilitated human decision-making, while the current draft applies only to technology that substantially facilitates it.
Privacy advocates worry the amended definition creates exploitable gaps, but industry groups say it better reflects ADMT use.
Even the CPPA board disagrees over the final rules. In March 2024, two board members raised concerns that the proposal exceeds the agency’s authority.
Given how consistently they have survived revisions, pre-use notices, opt-out rights, and access rights are likely to persist. Still, organisations may have open questions, such as:
- Which AI and automated decision-making technologies will the final rules cover?
- How will practical consumer protections be implemented?
- What exemptions will organisations receive?
These rules will shape how AI and automation are governed nationwide and how consumers are protected as these technologies continue to grow.