Digital Trust and Identity for Young Canadians

Protection That Respects Privacy

April 16, 2026

#PrivacyInPracticeCA

Introduction: Protection and Privacy Together

Protecting children online and respecting their privacy are complementary objectives that thoughtful design can achieve together. The false choice between protection and privacy has too often dominated the discussion of children’s digital trust and identity. Better approaches exist.

Children face real risks online. Age-inappropriate content, predatory behaviour, and exploitation are genuine threats that justify protective measures. Parents and policymakers are right to seek protections.

But children also have privacy rights. Privacy is a fundamental right that applies to people of all ages.[1] Systems that protect children by surveilling them may create new harms alongside those they prevent. Protection that requires sacrificing privacy is not the only option.

This article examines digital trust and identity among young Canadians: how digital service delivery and, by extension, age verification can respect privacy; how youth access and technologies like wallets should evolve as autonomy develops; and how all stakeholders can protect young people without compromising their rights.

The Age Verification Opportunity

Privacy-respecting age verification approaches exist and are being deployed. The technical capability to verify age without collecting identity is available today.

Attribute-Based Verification

Digital credentials can attest to age category without disclosing name, birthdate, or other identifying information. A credential can prove “over 18” or “under 13” without revealing a specific birthdate. The verifier learns only what they need to know.

BC Wallet’s selective disclosure capabilities directly enable this approach.[2] Using this technology, a British Columbian can share age-relevant attributes without revealing their full birthdate or other personal details to the verifier.[3] The privacy protection that adults enjoy through selective disclosure is equally applicable to verifying youth status.

This capability transforms the equation for age verification. Typical practices like checking physical ID and collecting a birthdate pose privacy risks and lead to unnecessary data collection. Attribute-based verification reduces these risks while achieving the verification purpose.
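The data-minimization idea behind attribute-based verification can be sketched in a few lines. This is an illustrative model only, not BC Wallet's actual protocol: real selective-disclosure credentials (for example, AnonCreds-style predicate proofs) use cryptography so the verifier cannot see the birthdate at all, whereas the `Credential` class, `prove_over` function, and response shape below are hypothetical stand-ins for the flow.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Credential:
    birthdate: date  # held inside the wallet; never sent to the verifier

def prove_over(credential: Credential, years: int, today: date) -> dict:
    """Answer the verifier's predicate with only a yes/no attribute."""
    cutoff = date(today.year - years, today.month, today.day)
    return {
        "predicate": f"age_over_{years}",
        "satisfied": credential.birthdate <= cutoff,
    }

wallet = Credential(birthdate=date(2006, 3, 14))
response = prove_over(wallet, 18, today=date(2026, 4, 16))
# The verifier learns only that the "over 18" predicate is satisfied;
# the birthdate never leaves the wallet.
```

The design point is that the response contains no identifying fields: the verifier learns exactly one bit of information and nothing that could be reused elsewhere.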

Device-Based Estimation

AI-powered age estimation can estimate age from facial features without requiring document submission. The Office of the Privacy Commissioner of Canada has examined these technologies in the context of age assurance, noting both their potential and their limitations.[4] When properly designed, the estimate occurs on-device without transmitting biometric data. No name, no birthdate, no document is needed to produce an age estimate.

This approach has limitations. Estimation is approximate, not precise. Some contexts require certainty that estimation cannot provide. The OPC’s guidance on biometric processing emphasizes that any use of biometric information, even on-device, must meet privacy requirements, including the necessity and proportionality requirements.[5] But for many purposes (e.g., age-gating content and applying age-appropriate settings), estimation may be sufficient.

Device-based estimation combined with a privacy-protective architecture keeps sensitive data local. The estimate occurs on the user’s device; the service receives only the result. No central database of children’s biometrics is created.
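A minimal sketch of this privacy-protective architecture, under the assumption that an on-device model produces a rough age and only a coarse band is transmitted. The estimator here is a hard-coded placeholder, and all function names and band labels are hypothetical:

```python
def estimate_age_on_device(image_bytes: bytes) -> int:
    # Stand-in for an on-device ML model; a real estimator would run
    # locally against the camera frame. Placeholder value for illustration.
    return 16

def age_band(estimate: int) -> str:
    """Collapse the raw estimate to the minimum the service needs."""
    if estimate < 13:
        return "under_13"
    if estimate < 18:
        return "13_to_17"
    return "18_plus"

def verify(image_bytes: bytes) -> str:
    estimate = estimate_age_on_device(image_bytes)  # image and estimate stay local
    return age_band(estimate)  # only this coarse band leaves the device

result = verify(b"...camera frame...")
```

Because the image and the precise estimate never leave the device, no central repository of children's biometrics or ages accumulates on the service side.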

Privacy-Respecting Verification Services

The digital identity ecosystem includes organizations that develop age-verification services aligned with privacy-protective principles. These services aim to verify age without retaining personal information beyond the verification interaction and to support multiple verification methods to accommodate different user circumstances.

The private sector has invested in privacy-protective age verification, though implementation maturity varies across the ecosystem. Parents want protection, platforms want compliance, and privacy-protective options that achieve both goals are increasingly finding market traction. Realizing this potential consistently across the ecosystem remains a work in progress.

Commissioner Guidance on Age Assurance

Commissioner Dufresne has prioritized children’s privacy, including work on age-assurance approaches, as reflected in the OPC’s recent annual report and in ongoing regulatory engagement.[6] This regulatory attention reflects the importance of getting children’s digital trust and identity right. The commissioner’s office has recognized that many existing approaches do not adequately balance protection with privacy.

Drawing on commissioner guidance and the FPT Joint Resolution’s privacy principles,[7] several key considerations emerge for age assurance design:

Proportionality: Age verification measures should be proportionate to the risk. Accessing content with modest age restrictions requires less rigorous verification than accessing higher-risk content. Purchasing age-restricted products online requires different verification than accessing adult content. One-size-fits-all approaches that apply maximum verification everywhere create unnecessary privacy intrusion for lower-risk contexts.

Data minimization: Age verification should collect only the information necessary to verify age. Collecting complete identity when age-only verification would suffice results in unjustified data collection. Organizations often default to collecting government ID for age verification when attribute-based approaches would serve the purpose with less data exposure.

Child-centred design: Systems should consider children’s perspectives and developmental needs, not just adult concerns about children. Children are rights-holders, not just objects of protection. Their views about privacy, autonomy, and appropriate protections deserve consideration in system design.

Parental role recognition: Parents have legitimate roles in children’s digital lives, but parental oversight should not eliminate children’s privacy. The balance between parental involvement and child autonomy should evolve with age. A system that gives parents complete visibility into a 17-year-old’s every digital interaction may not serve healthy development.

For both government and private digital services, transparency in delivery is critical for trust. How and when data is accessed can be part of that transparency and should explicitly distinguish between a youth accessing their own records and a guardian acting as a proxy. This creates accountability and transparency that protect the youth from unauthorized oversight and give parents a clear framework for their supportive role.

Transparency: Children should understand what information is collected about them and how it is used. Age-appropriate explanations help young people develop privacy awareness. Data collection that is not disclosed to the child, even for protective purposes, undermines trust and learning opportunities.

A single plain-language message may work in some cases, but the target audience still needs to be considered. A tailored approach that meets young people's needs as they progress through different developmental stages may be necessary.
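The proportionality and data-minimization considerations above can be sketched as a tiered mapping in which each context gets the least intrusive assurance method that fits its risk. All tier names, method names, and example contexts below are hypothetical illustrations, not drawn from any standard:

```python
# Hypothetical risk-tiered age-assurance mapping: verification rigour
# scales with context risk instead of applying maximum verification everywhere.
ASSURANCE_BY_RISK = {
    "low": "self_declaration",       # e.g., age-appropriate default settings
    "medium": "device_estimation",   # e.g., age-gating general content
    "high": "attribute_credential",  # e.g., age-restricted purchases
}

def required_method(risk_level: str) -> str:
    """Select the least intrusive method that matches the stated risk."""
    return ASSURANCE_BY_RISK[risk_level]
```

A design like this avoids the one-size-fits-all trap: low-risk contexts never trigger the data collection that only high-risk contexts can justify.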

Youth Digital Wallet Access

“Protecting young Canadians online requires a move beyond age-gates, and static access control policies toward a model of ‘Progressive Autonomy.’ At IDENTOS, we’ve seen oversimplified trust models fail when faced with nuanced scenarios. By leveraging dynamic technologies like Policy-Based Access Control and digital wallets, we can allow digital trust to evolve alongside a minor’s needs – seamlessly shifting from parental oversight to individual autonomy, ensuring protection never comes at the cost of a young person’s developing right to privacy.”


– Mike Cook, CEO, IDENTOS

As digital wallets become more prevalent, questions arise about youth access and autonomy.

Progressive Autonomy

Children’s autonomy develops progressively. A 7-year-old appropriately has less autonomous control than a 15-year-old. A 15-year-old appropriately has less than an 18-year-old. Digital wallet design should accommodate this progression.

Rigid approaches that treat all minors identically fail to recognize developmental differences. A system that gives a 17-year-old no more autonomy than a 10-year-old does not serve either well. Age-appropriate design requires nuanced approaches.

The emerging progressive consent model recognizes that a youth’s capacity evolves with age rather than hinging on a single age threshold. Parental consent remains essential as youth engage in the digital world, but a youth’s fundamental right to privacy, and to have a say in what happens with their personal information, must also be respected. As the technological landscape evolves, a design that takes into account the maturity of the individual user, developmental stages, and other considerations such as cultural inclusiveness will provide meaningful privacy protection while respecting the rights of individual youth. The model also considers the information itself, recognizing that certain personal information requires different safeguards.

Industry and government will need to meet this moment of accelerated digital adoption and consumer demand with greater flexibility in the technical tools that support progressive, dynamic conditions. Specifically, wallets, consent frameworks, and Policy-Based Access Control (PBAC) will become foundations for privacy and for our collective responsibility for the safety of young people.
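The PBAC idea can be sketched as attribute-driven decisions computed at request time, rather than a static role list. The policy rules, age thresholds, sensitivity labels, and decision strings below are hypothetical illustrations of how autonomy might widen with age, not any real framework's policy:

```python
from dataclasses import dataclass

@dataclass
class Request:
    subject_age: int           # age of the youth the record is about
    resource_sensitivity: str  # "low" | "medium" | "high"
    requester: str             # "youth" | "guardian"

def decide(req: Request) -> str:
    """Evaluate an illustrative policy at request time."""
    if req.requester == "youth":
        # Autonomy widens with age; younger users need a co-signer
        # for anything beyond low-sensitivity records.
        if req.subject_age >= 16 or req.resource_sensitivity == "low":
            return "permit"
        return "permit_with_guardian"
    if req.requester == "guardian":
        # Guardians of young children see everything; sensitive records
        # become private as the youth matures.
        if req.subject_age < 13 or req.resource_sensitivity != "high":
            return "permit"
        return "deny"
    return "deny"
```

The same request yields different decisions as the youth ages, which is exactly the "sliding scale" behaviour that static access-control lists cannot express.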

Parental Controls and Child Privacy

“Safeguarding the privacy of children and youth in the digital ecosystem is grounded in the best interests of the child. This includes meaningfully involving children and youth in decisions that affect their digital lives, ensuring they can develop their identity, maintain autonomy, and participate in society without undue surveillance or permanent consequences.”


– Erin Hardy, General Counsel and Corporate Secretary / Chief Privacy Officer, Service New Brunswick

Parents legitimately want visibility into their children’s digital activities, and children have privacy interests even from their parents. Balancing these interests is genuinely difficult.

Younger children, appropriately, have less privacy from their parents. Older children appropriately have more. The transition should be gradual, not sudden. Systems that enable parents to monitor every transaction by a 16-year-old may not serve that teenager’s healthy development.

Parental control mechanisms should be transparent to children. Children should know what parents can see. Undisclosed monitoring undermines trust and may not effectively serve protection goals.

Article 12 of the Convention on the Rights of the Child makes it clear that children have the right to express an opinion on issues that affect them. While parental oversight is essential to safeguarding the privacy of children and youth, it cannot be an all-or-nothing approach. As with any transformation, the way we approach it must evolve as well. Meaningful models should include ongoing conversations with children and youth, who are actively involved in the decisions that affect them. Parents should take more of a mentoring role rather than an authoritative one.

It is equally important to recognize that not everyone starts from the same baseline, so any model must be flexible and account for the fact that risk levels and risk tolerance are not always viewed in the same way.

Privacy for young Canadians should include privacy within the family unit as well as privacy-respecting experiences online. The digital wallets of the future could play a critical role in how the next generation achieves agency across their digital experiences with consent, consumer-directed data, real-time access control and management of delegated access and relationships. For example, a ‘mature minor’ might authorize a parent to manage their educational records while keeping sensitive health or social service data private. Access should be a negotiable set of permissions. That same individual might someday consent to have their data donated to health research or exercise their GDPR right to be forgotten on a social network.
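The negotiable-permissions idea can be sketched as a scoped, revocable delegation grant. The class, scope strings, and method names below are illustrative assumptions about what such a wallet API might look like, not a description of any existing product:

```python
class DelegationGrant:
    """A minor's grant of specific, revocable permissions to a delegate."""

    def __init__(self, delegate: str, scopes: set[str]):
        self.delegate = delegate
        self.scopes = set(scopes)

    def permits(self, delegate: str, scope: str) -> bool:
        # Access is a set of explicit permissions, not all-or-nothing.
        return delegate == self.delegate and scope in self.scopes

    def revoke(self, scope: str) -> None:
        # Permissions stay negotiable: the grantor can narrow them over time.
        self.scopes.discard(scope)

# A mature minor shares education records while keeping health data private.
grant = DelegationGrant("parent", {"education:read", "education:manage"})
grant.revoke("education:manage")  # later narrowed to read-only
```

Anything not explicitly granted (here, any "health:*" scope) is simply absent, so privacy is the default rather than something subtracted from total visibility.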

Case study – Policy-Based Delegation in Pediatric Care: Practical implementation of these wallet principles can be seen in recent Canadian digital health initiatives—such as the TrustSphere Project with BC Children’s Hospital[12]—designed to address the fragmentation of pediatric health data.

To solve this, the digital architecture was designed around the policy-based nuance of the “Circle of Care.” A youth’s digital identity is securely linked to a guardian’s through Delegated Authority, enabling a collaborative environment in which access is not a binary “on/off” switch. Instead, it functions as a sliding scale that supports the legal and developmental transition from caregiver-led management to youth-led autonomy.[13]

Youth Voice in Design

Youth perspectives should inform design decisions. Young people understand their own needs and concerns better than adults designing for them. Consultation with young people themselves, not just parents, advocates, and experts who speak for them, should shape youth digital trust and identity verification systems.

This consultation requires age-appropriate methods. Engaging a 10-year-old requires different approaches than engaging a 17-year-old. But both can contribute meaningfully to design decisions that affect them.

Challenges in Current Age Verification Approaches

Many current age verification approaches do not fully align with privacy principles. Understanding these challenges clarifies what better approaches must address. These are systemic issues across the ecosystem, and many organizations are actively working to improve their practices.

Data Over-Collection

Many age verification systems collect full identity documents to verify age. Users submit driver’s licences, passports, or other identity documents, creating databases of identity information when only age was needed. This over-collection creates breach risk and privacy concerns regardless of data-handling practices.

Centralized Data Repositories

Some age verification approaches create centralized databases of verified users. These databases, even if they contain only age-verification status rather than complete identity information, enable tracking and become targets for breaches. Users’ verification history can reveal their activities.

Identity Linkage

Age verification that uses persistent identifiers can enable linking activity across services. Even if individual services do not share data, common identifiers may enable third parties to correlate behaviour. The verification intended to protect privacy can itself become a privacy concern.

Exclusion Effects

Some age verification approaches exclude users who cannot or will not provide identity documents. Users without government-issued IDs, those with privacy concerns, and those in sensitive circumstances may be unable to access services. This exclusion may disproportionately affect vulnerable youth, the very population that protection measures are designed to serve.

What Better Approaches Require

Age verification that respects privacy requires deliberate design choices.

Minimal Data Collection

Verification should collect only age-relevant information. If the purpose is to confirm “over 18,” the system should not collect names, addresses, or images of documents. Selective disclosure credentials enable this minimal collection.[8]

No Persistent Tracking

Verification should not enable tracking across sessions or services. Each verification should be unlinkable to previous verifications. Users should be able to repeatedly verify their age without creating correlated data.
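Unlinkability can be sketched as a fresh, single-use token per verification that carries no stable identifier. This is only a model of the property: a real deployment would use blind signatures or zero-knowledge proofs, and the token shape and function name here are hypothetical; `secrets.token_hex` simply stands in for the absence of a persistent, correlatable ID.

```python
import secrets

def issue_age_token(over_18: bool) -> dict:
    """Issue a single-use age attestation with no persistent identifier."""
    return {
        "claim": "age_over_18" if over_18 else "age_under_18",
        "nonce": secrets.token_hex(16),  # fresh randomness per verification
    }

first = issue_age_token(True)
second = issue_age_token(True)
# Same claim both times, but nothing links the two verifications
# to each other or to one person.
```

Because the only varying field is random, no verifier, or coalition of verifiers, can correlate a user's repeated age checks into a behavioural profile.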

Accessible Alternatives

Multiple verification pathways should accommodate different circumstances. Users without smartphones, government ID, or a willingness to share identity documents should have options available to them. Exclusion from age-verified services should not be the cost of privacy protection.

Age-Appropriate Design

Systems should recognize that “minor” encompasses a wide range of development. A 7-year-old and a 17-year-old have different needs, capabilities, and appropriate levels of autonomy. Treating all minors the same serves no one well.

Interoperability

Where possible, age verification should be designed to support interoperability so that children and youth benefit from a tell-you-once model and do not have to re-verify their age with every digital interaction.

The Global Context

Canada is not alone in grappling with youth digital trust and identity. International developments provide both warnings and models.

The UK’s Age Appropriate Design Code (also known as the Children’s Code) establishes requirements for services likely to be accessed by children, including default privacy settings and limitations on data collection.[9] The code has influenced platform design globally, demonstrating that regulatory requirements can drive privacy-protective approaches.

Australia has enacted age-related legislation, including the Online Safety Amendment (Social Media Minimum Age) Act 2024, which sets a minimum age of 16 for accessing social media.[10] The debates around these measures have highlighted tensions between protection goals and privacy concerns that Canada also faces, particularly around how age is verified in practice.

The EU addresses children’s rights in the Digital Services Act and the GDPR, including specific provisions for parental consent, age-appropriate design, and protection from harmful content.[11] European approaches tend to emphasize rights-based frameworks that recognize children as rights-holders.

Different jurisdictions make different trade-offs. Some prioritize verification certainty over privacy, accepting the collection of identity data as the price of protection. Others prioritize privacy over verification certainty, accepting that some underage users may slip through as the price of minimizing data collection. Canada has the opportunity to demonstrate that both goals can be advanced together through thoughtful technical and policy design.

The Global Privacy Assembly, which brings together privacy commissioners from around the world, has prioritized children’s digital rights, providing a forum for sharing international best practices.[14] Commissioner participation in these international discussions helps inform Canadian approaches with global perspectives.

DIACC’s Recommendations

DIACC recommends exploring the establishment of a Youth Digital Trust Forum that brings together diverse expertise: technologists who understand what is technically possible, child development experts who understand developmental needs, privacy professionals who understand protection requirements, and young people themselves, who understand their own perspectives. Such a forum would require meaningful youth engagement and structured participation that gives young people genuine influence over decisions that affect them.

This forum could guide the design of age-appropriate digital trust, privacy-respecting age-verification approaches, progressive autonomy frameworks for wallet access, and parental-involvement models that balance oversight with child privacy.

The Pan-Canadian Trust Framework should also evolve to address youth-specific considerations.[15] As the framework develops, certified organizations would be encouraged to demonstrate appropriate handling of youth-related data, and DIACC welcomes member input on what these expectations should look like.

The Essential Balance

Protecting young Canadians is essential. Parents, educators, and society rightly seek to shield children from harm. Digital systems that protect children serve essential goals.

Respecting young people’s privacy is equally essential. Privacy is a right, not a privilege granted by adults.[16] Systems that protect children by eliminating their privacy may create harms of their own. Comprehensive monitoring of children’s every digital action is not the only path to protection.

The good news is that protection and privacy can advance together. Attribute-based verification protects age-restricted access without collecting foundational identity data. Privacy-preserving age estimation protects without biometric databases. Thoughtful wallet design can involve parents appropriately while respecting developing autonomy.

DIACC is working to help Canada advance this balance. Through the Privacy in Practice series, our information-sharing with privacy commissioners, and our work with members, we are actively exploring how Canadian digital trust and identity services can protect young people while respecting their rights. We believe this is achievable, and we invite the ecosystem to help make it real.

Next Week

Article 12, From Principle to Practice: Closing the Implementation Gap Together, examines why privacy principles without implementation are insufficient and what serious implementation requires.

Footnotes

[1] Federal, Provincial and Territorial Privacy Commissioners and Ombuds with Responsibility for Privacy, Joint Resolution on Digital Identity, September 20-21, 2022, St. John’s, Newfoundland and Labrador. The Joint Resolution establishes that digital identity systems must respect privacy rights for all users, including protections that apply regardless of age. 

[2] Government of British Columbia, BC Wallet. BC Wallet uses verifiable credentials technology that enables selective disclosure, sharing only specific attributes (such as “over 19”) without revealing underlying personal details. 

[3] Government of British Columbia, BC Wallet Privacy Policy. The privacy policy describes how BC Wallet enables users to control what information is shared with verifiers, including the ability to share age-related attributes without disclosing full personal information. 

[4] Office of the Privacy Commissioner of Canada, Issue Sheets on Bill S-210: An Act to restrict young persons’ online access to sexually explicit material, May 2024. The OPC examined age estimation and age verification technologies, noting the privacy implications of different approaches, including facial age estimation, digital identity-based verification, and token-based systems. 

[5] Office of the Privacy Commissioner of Canada, Guidance for processing biometrics, August 11, 2025. The OPC’s guidance addresses requirements for biometric processing, including necessity, proportionality, and privacy impact assessment. 

[6] Office of the Privacy Commissioner of Canada, Annual Report to Parliament 2024-2025. Commissioner Dufresne identified children’s privacy as a strategic priority, including work on age assurance and the privacy implications of digital systems that affect young people. 

[7] Federal, Provincial and Territorial Privacy Commissioners and Ombuds with Responsibility for Privacy, Joint Resolution on Digital Identity, September 20-21, 2022, St. John’s, Newfoundland and Labrador. The principles of data minimization, proportionality, and transparency articulated in the Joint Resolution apply directly to the design of age assurance. 

[8] Government of British Columbia, BC Wallet. Selective disclosure credentials, as implemented in BC Wallet, demonstrate how attribute-based verification can confirm age-relevant information without collecting or transmitting full identity documents. 

[9] UK Information Commissioner’s Office, Age Appropriate Design: A Code of Practice for Online Services (the Children’s Code), 2021. The code establishes 15 standards for online services likely to be accessed by children, including requirements for default privacy settings, data minimization, and age-appropriate applications. 

[10] Australian Government, Online Safety Amendment (Social Media Minimum Age) Act 2024. Australia enacted legislation establishing a minimum age of 16 for social media access, with ongoing debate over the privacy implications of age-verification mechanisms required to enforce the restriction. The Australian eSafety Commissioner has published guidance on age verification approaches. 

[11] Regulation (EU) 2022/2065 of the European Parliament and of the Council (Digital Services Act), Articles 28 and 35 on protection of minors; Regulation (EU) 2016/679 (GDPR), Article 8 on conditions applicable to a child’s consent in relation to information society services. These provisions establish rights-based frameworks recognizing children as rights-holders in the digital environment.

[12] Canada’s Digital Technology Supercluster, TRUSTSPHERE project overview. The TrustSphere project, led by Careteam Technologies in partnership with BC Children’s Hospital Research Institute, IDENTOS, Smile CDR, SecureKey, MedStack, the University of British Columbia, and Interac Corp., developed a digital health platform with a pilot focused on children living with Type 1 diabetes, integrating strong digital identity and consent management to enable patient-directed data sharing across the circle of care. 

[13] Abdulhussein FS, Pinkney S, Görges M, van Rooij T, Amed S. “Designing a Collaborative Patient-Centered Digital Health Platform for Pediatric Diabetes Care in British Columbia: Formative Needs Assessment by Caregivers of Children and Youths Living With Type 1 Diabetes and Health Care Providers.” JMIR Pediatrics and Parenting, 2023;6:e46432. The peer-reviewed study documents the user-centered design of TrustSphere, including its approach to delegated authority and collaborative access to pediatric health data across patients, caregivers, and healthcare providers. 

[14] Global Privacy Assembly. The GPA has addressed children’s digital rights through multiple resolutions, including work on age-appropriate design, children’s data protection, and the balance between protection and privacy in digital services. 

[15] DIACC, Pan-Canadian Trust Framework. The PCTF establishes requirements for trusted digital identity systems in Canada. As the framework evolves, youth-specific considerations represent an important area for development. 

[16] Article 16 of the United Nations Convention on the Rights of the Child establishes children’s right to privacy. General Comment No. 25 (2021) on children’s rights in relation to the digital environment further articulates that children’s privacy rights apply in digital contexts and that protective measures should not disproportionately infringe on children’s privacy.

The Privacy Scorecard

The Privacy Scorecard is a practical tool for measuring digital identity services against the FPT privacy principles. Use it to assess your organization’s implementation across architecture, policy, user experience, and ecosystem coverage. It is not a compliance checklist or legal advice; it is meant to spark conversation, explore unfamiliar concepts, and identify areas worth digging into further.

Access the Privacy Scorecard

Follow the Series