Digital Policy Alert, an independent, public repository tracking policy changes shaping the digital economy, is quickly becoming essential infrastructure in a regulation-driven tech landscape.
Launched in 2021, Digital Policy Alert (DPA) covers more than 50 jurisdictions and areas such as artificial intelligence, data governance, online safety, and digital markets. Guided by a mission to build a more interoperable, transparent, and accessible regulatory ecosystem, DPA is not just tracking rules but helping shape how the world understands and navigates digital governance.
Speaking to TechMoran, Maria Buza, Senior Policy Analyst at Digital Policy Alert, said the DPA transforms complex regulation into actionable intelligence for decision-makers. Through tools like “Topical Threads,” “Digital Digests,” and comparative reports, users can analyze global trends, compare cross-border regulations, and quickly understand new markets.
“The Digital Policy Alert database, which has recently expanded to include Kenya, provides a repository of impartial evidence on how the digital economy is regulated. It covers policies currently in force, how they are implemented, and those under deliberation across approximately 60 jurisdictions,” she said.
DPA’s legal analysis suite, Clairk, simplifies dense legal texts into clear, actionable insights, supported by a verified-source chatbot, cross-country comparison tools, and privacy-first infrastructure.
With real-time alerts, data tracking tools, and a growing footprint across African markets including Kenya, Nigeria, Rwanda, Ethiopia, Ghana, Morocco, Egypt, and Algeria, DPA positions regulatory awareness as a strategic advantage. For founders and CEOs scaling across borders, it shifts compliance from a reactive burden to a proactive growth strategy. One telling example is the Kenyan government's probe into Meta's Ray-Ban smart glasses over privacy and surveillance concerns, where cross-border processing of data can create security and misuse risks.
“Differences in legal frameworks and oversight mechanisms may further increase compliance and accountability risks,” Buza told TechMoran. “Where data is collected in one jurisdiction and subsequently stored or processed in another, the applicable safeguards depend on the legal basis for the transfer and the protections available in the receiving jurisdiction.”
The Digital Policy Alert database indicates ongoing regulatory activity in this area across a range of jurisdictions. Several governments are updating their cross-border data transfer frameworks to ensure that personal data transferred abroad remains subject to an equivalent level of protection, relying on mechanisms such as adequacy decisions, appropriate safeguards, or consent. In addition, some jurisdictions restrict or prohibit the transfer or sharing of certain categories of sensitive data altogether.
Here is the rest of the interview on the Kenyan government's probe into Meta's Ray-Ban smart glasses and what it means for other tech giants.
Are existing global data protection frameworks equipped to handle emerging wearable technologies like smart glasses, or are regulators fundamentally playing catch-up?
Data protection frameworks, including Kenya's Data Protection Act, apply in principle to wearable technologies, as they regulate the processing of personal data regardless of the device used. The challenge lies in their level of specificity: these frameworks were not designed with always-on, body-worn, ambient data-capture devices in mind, which continuously generate and infer data in ways that extend beyond traditional processing contexts.
In particular, wearable devices may process different categories of personal data, including sensitive personal data such as biometric data, or behavioural patterns inferred from movement, voice, or facial characteristics. While such data typically requires explicit consent and is subject to additional safeguards, the practical application of these safeguards becomes complex when data is collected passively and continuously, including in public or semi-public environments and from individuals other than the device user.
Existing consent mechanisms are generally built on the assumption of a clear, informed, and relatively stable relationship between the data subject and the controller. This model may be less directly applicable where wearable devices collect data passively and continuously, including data relating to individuals other than the device user. In such situations, individuals who are not users of the device may be unaware that their personal data is being processed, and obtaining consent from all affected individuals may not always be feasible.
In these contexts, controllers may rely on a combination of legal bases such as legitimate purposes, depending on the applicable framework, and may implement additional safeguards such as transparency measures, data minimisation, purpose limitation, and technical and organisational measures aligned with data protection by design and by default. Transparency may be supported through user-facing disclosures, visible indicators of recording, or other context-appropriate notifications.
In response to these developments, some jurisdictions have begun to issue guidance or consider legislative proposals addressing wearable technologies. For example, in March 2026, Switzerland’s Federal Data Protection and Information Commissioner adopted guidelines on connected wearable devices that explicitly address risks associated with cameras and microphones capturing data relating to third parties, not only the device user. The guidelines emphasise data protection by design, transparency, and consent for secondary uses, and note that covert recording through such devices may, in certain circumstances, constitute an offence under Swiss law. Similarly, Brazil introduced a bill in February 2026 addressing AI-enabled glasses, proposing requirements such as visible or audible recording indicators, default-off facial recognition, and data protection impact assessments prior to market entry. In the United States, a bill introduced in the California Senate in the same period proposes restrictions on the use of wearable recording devices in contexts where individuals have a reasonable expectation of privacy, including provisions relating to consent and recording indicators.
Overall, while existing frameworks provide a baseline, current developments indicate that regulators are adapting the regulatory landscape to address the data processing practices of these technologies.
What risks arise when data captured in one country is processed or reviewed in another, especially when it may include sensitive or non-consensual recordings?
Cross-border processing of data can create security and misuse risks. Differences in legal frameworks and oversight mechanisms may further increase compliance and accountability risks. Where data is collected in one jurisdiction and subsequently stored or processed in another, the applicable safeguards depend on the legal basis for the transfer and the protections available in the receiving jurisdiction.
The Digital Policy Alert database indicates ongoing regulatory activity in this area across a range of jurisdictions. Several governments are updating their cross-border data transfer frameworks to ensure that personal data transferred abroad remains subject to an equivalent level of protection, relying on mechanisms such as adequacy decisions, appropriate safeguards, or consent. In addition, some jurisdictions restrict or prohibit the transfer or sharing of certain categories of sensitive data altogether.
Within African jurisdictions, the African Union Convention on Cyber Security and Personal Data Protection provides a regional framework for addressing these issues, although implementation at the domestic level varies. Kenya’s Data Protection Act and the Data Protection General Regulations, for example, establish four mechanisms for cross-border transfers of personal data.
- Proof of appropriate data protection safeguards, based on legal instruments, such as binding corporate rules, and the circumstances in the recipient country. Transfers based on such safeguards must be documented.
- An adequacy decision from the Office of the Data Protection Commissioner, confirming sufficient protection levels in the recipient country. In May 2024, Kenya and the European Union launched an adequacy dialogue, the first such dialogue in Africa.
- Necessity, including for the performance of a contract, the protection of vital interests, or the pursuit of legitimate interests that do not override the rights of data subjects.
- Explicit informed consent from the data subject. Consent is always required for transfers of sensitive personal data.
How should policymakers redefine consent when recording devices are embedded in everyday objects and are not easily detectable?
Data protection frameworks generally require a legal basis for processing personal data. Consent as a legal basis assumes that a data subject is aware of the presence of a device, understands what data is being collected, and is able to make an informed choice. These assumptions may be less applicable where devices such as smart glasses operate continuously and are not readily noticeable, particularly for bystanders who may be recorded without direct interaction.
Controllers may rely on alternative legal bases for processing personal data, such as legitimate interest, although these are increasingly subject to stricter interpretation in case law. Where processing rests on a legal basis other than consent, safeguards remain relevant. Both in regimes that require a legal basis and in those that do not, transparency and user-control measures, such as clear disclosures, visible recording indicators, and opt-out or equivalent mechanisms where feasible, can support awareness and help mitigate risks to individuals.
Regulatory developments reflected in the Digital Policy Alert database suggest that some policymakers are exploring complementary approaches. For example, Brazil’s proposed bill on AI-enabled glasses introduces requirements for visible or audible signalling by default. The bill introduced in California similarly addresses the use and visibility of recording indicators. In Switzerland, guidance on connected wearable devices highlights that users may bear certain responsibilities in ensuring that others are informed when data is being captured.
These developments indicate a gradual shift towards supplementing consent as a legal basis with additional safeguards, including transparency measures such as signalling by default and limits on the use of such devices in public spaces.
Could investigations like this influence how regulators in other regions approach AI training data and wearable surveillance technologies, particularly for companies like Meta?
Past developments suggest that investigative reporting and civil society engagement can play a role in shaping regulatory responses over time. For example, in early January 2026, reports emerged that the Grok AI chatbot on the X platform was used to generate and disseminate non-consensual sexualised images, including undressed images of individuals and sexualised images of children, prompting investigations into that conduct in several jurisdictions as well as access restrictions. The reports and investigations led to an increase in legislative proposals addressing AI-generated non-consensual sexual content and child sexual abuse material. Regarding the processing of sensitive data, including biometric information, several authorities have investigated Worldcoin. These investigations found that offering cryptocurrency tokens in exchange for biometric data does not meet the threshold for valid and freely given consent, leading to measures such as bans on data processing and orders to delete the collected data in several jurisdictions.
In this context, emerging investigative findings, particularly those concerning non-consensual recordings or data collected via wearable devices, may give rise to further enforcement actions and legislative proposals addressing the risks associated with such technologies. These may cover both the use of such technologies by private individuals and obligations on companies that develop or provide them to introduce appropriate safeguards, such as user-facing disclosures, visible indicators of recording, or other context-appropriate notifications.