TechMoran

Kenya Probes Ray-Ban Meta Smart Glasses Over Privacy & Surveillance Concerns


Kenya’s Office of the Data Protection Commissioner has opened an investigation into the use of Ray-Ban Meta smart glasses, citing growing concerns over privacy, surveillance, and the potential misuse of personal data in artificial intelligence training.

The regulator said it had initiated suo moto proceedings following reports that the AI-powered wearable devices may be capturing and processing personally identifiable information without adequate user consent, raising questions about compliance with Kenya’s data protection laws.

The inquiry comes after a formal petition by The Oversight Lab, which earlier in March called for scrutiny of the glasses’ surveillance capabilities and their broader human rights implications.

“The ODPC is taking this matter seriously and has decided to investigate,” said Mercy Mutemi, Executive Director at The Oversight Lab, who urged authorities to ensure a transparent and consultative review process. She added that the findings could set a precedent for how Kenya governs emerging digital technologies.

Public pressure has intensified, with more than 150 organizations and individuals backing calls for an inquiry and demanding greater accountability from both regulators and technology firms.

Concerns escalated further following local reports alleging that an individual used the smart glasses to secretly record women without their consent, an incident that has amplified fears over the misuse of discreet recording features embedded in wearable devices.

Separately, international media investigations have raised questions about how data collected through such devices is processed. Some reports suggest that footage captured globally may have been reviewed in Nairobi by contracted workers, including sensitive and private recordings.

Regulators ‘playing catch-up’ on wearables

According to Maria Buza, Senior Policy Analyst at Digital Policy Alert, existing data protection frameworks, including Kenya’s, apply in principle to wearable technologies but were not designed with always-on, body-worn devices in mind.

“These frameworks regulate the processing of personal data regardless of the device used,” Buza said. “The challenge lies in their level of specificity, particularly for technologies that continuously generate and infer data beyond traditional contexts.”

She noted that wearable devices can capture sensitive personal data such as biometric identifiers and behavioural patterns, making compliance more complex, especially when data is collected passively in public or semi-public spaces from individuals who are unaware they are being recorded.

Traditional consent models, she added, assume a clear relationship between users and data controllers, an assumption that breaks down when devices collect data continuously and from bystanders.

Maria Buza, Senior Policy Analyst at Digital Policy Alert

“Obtaining consent from all affected individuals may not always be feasible,” Buza said, pointing to the need for additional safeguards such as transparency measures, data minimisation, and privacy-by-design systems.

Global policy response taking shape

Buza pointed to emerging regulatory responses globally, noting that policymakers are beginning to adapt frameworks to address wearable surveillance risks.

In Switzerland, authorities have issued guidance on connected devices, warning that covert recording via wearables could constitute an offence. Brazil and parts of the United States are also considering laws requiring visible or audible recording indicators and stricter safeguards for AI-enabled glasses.

“These developments suggest a shift towards complementing consent with stronger transparency requirements and default safeguards,” she said.

The Digital Policy Alert database provides a growing body of evidence tracking how governments are responding to such challenges. The platform monitors regulatory developments across the G20 economies, including the European Union and its member states, as well as Switzerland and several Asia-Pacific countries such as Malaysia, New Zealand, the Philippines, Singapore, Thailand, and Vietnam.

More recently, it has expanded coverage in Africa, tracking how countries including Algeria, Egypt, Ethiopia, Ghana, Kenya, Morocco, Nigeria, and Rwanda regulate the digital economy and enforce emerging rules.

The database also tracks international cooperation frameworks addressing the digital economy, including those focused on artificial intelligence governance.

Cross-border data risks under scrutiny

The investigation also highlights concerns around cross-border data processing, particularly where sensitive or non-consensual recordings are involved.

Buza warned that transferring data across jurisdictions can create compliance and accountability risks, especially where legal protections differ.

“Where data is collected in one country and processed in another, the level of protection depends on the safeguards in the receiving jurisdiction,” she said.

Kenya’s Data Protection Act provides several mechanisms for cross-border transfers, including adequacy decisions, appropriate safeguards such as binding corporate rules, necessity-based transfers, and explicit consent, which is mandatory for sensitive personal data.

The issue is particularly relevant as Kenya and the European Union continue discussions on a potential data adequacy agreement, a move that could shape future digital trade and data governance.

Rethinking consent in the age of invisible recording

Buza said policymakers may need to rethink how consent is defined in a world where recording devices are embedded in everyday objects and are not easily detectable.

“The assumption that individuals are aware of data collection and able to make informed choices becomes less applicable,” she said.

Instead, regulators are increasingly exploring complementary approaches, including visible recording indicators, clearer disclosures, and mechanisms that allow individuals to opt out where feasible.

Global implications for Meta and AI regulation

The scrutiny extends beyond Kenya. Meta Platforms, which produces the glasses in partnership with Ray-Ban, is facing regulatory attention in multiple jurisdictions. In the United Kingdom, the Information Commissioner’s Office has launched a similar review into potential privacy breaches linked to the devices.

Meta is also confronting legal challenges in the United States over its handling of user data, adding to mounting global pressure on the company.

Buza noted that investigations such as Kenya’s could have broader ripple effects.

“Past cases show that investigative reporting and civil society engagement can influence regulatory responses,” she said, citing recent global scrutiny of AI systems and biometric data collection practices.

She added that findings from such probes could lead to new enforcement actions and legislation requiring stronger safeguards including clearer disclosures, visible recording indicators, and stricter limits on how data from wearable devices is used, particularly in AI training.

The ODPC said it would provide further updates once the investigation is complete.

The case underscores the growing tension between rapid innovation in AI-powered consumer devices and the ability of regulators to safeguard privacy, as technologies increasingly blur the boundaries between public and private life.

 
