Digital privacy and cybersecurity experts, as well as lawyers ET spoke to, also said that additional deliberation is needed on the finer workings of the consent artefact architecture, including how the granting and revoking of consent will work.
“The tricky part is how consent provided under Section 7(a) of the Act will be later located to withdraw such consent, which was not originally recorded, or provided using a consent artefact,” a technology lawyer, who has seen a copy of the proposed rules and was present during the industry consultations on DPDP Rules, told ET.
Section 7(a) of the Act envisages transactional situations where neither the user's consent has been documented nor a privacy notice has been exchanged with the said user.
The DPDP Act defines companies processing personal data as data fiduciaries and the users whose data is processed as data principals.
The proposed final draft of the executive rules under the Act, set to be released soon, has introduced the concept of a consent artefact, which could be any machine-readable electronic record such as an e-mail, a digital signature or an electronic document of any kind. This electronic document, a kind of digital form, should give both data fiduciaries and data principals the option to notify each other on various aspects such as granting, managing, reviewing or withdrawing consent for personal data processing.
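In software terms, a consent artefact of the kind described above is essentially a structured record tying a user, a company and a processing purpose to the lifecycle of a consent grant. The sketch below is a minimal illustration of that idea; the class and field names are the author's assumptions for illustration and do not come from the draft rules.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentArtefact:
    """Illustrative machine-readable consent record (field names assumed)."""
    principal_id: str   # identifies the data principal (the user)
    fiduciary_id: str   # identifies the data fiduciary (the company)
    purpose: str        # the personal data processing this consent covers
    granted_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )
    withdrawn_at: Optional[datetime] = None

    def is_active(self) -> bool:
        # Consent remains active only while it has not been withdrawn.
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        # Record when consent was withdrawn; locating this record later
        # is exactly the difficulty experts flag for data provided
        # without a recorded consent under Section 7(a).
        self.withdrawn_at = datetime.now(timezone.utc)

# Usage: grant consent for one purpose, then withdraw it.
artefact = ConsentArtefact("user-42", "fiduciary-7", "marketing e-mails")
assert artefact.is_active()
artefact.withdraw()
assert not artefact.is_active()
```

Even this toy version makes the verification problem visible: nothing in the record itself proves that `principal_id` or `fiduciary_id` is accurate, which is the gap the social media executive quoted below points to.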
“The draft has proposed that the consent artefact must also contain information to correctly identify both the user and the company. The problem that could arise here is how does one verify whether such data, provided either by the user or the data fiduciary, is correct,” a senior executive at a social media intermediary told ET.
Another issue, experts said, could be with the IT ministry's proposal to allow, but not mandate, the use of a digital locker service.
“It is a recognition that the government’s DigiLocker is being used by only around 234 million users currently, which is a fraction of this country (population). The government has kept it technology agnostic. It is not going into how reliable age and identity is established,” a senior public policy research executive, who has seen a copy of the draft rules, said.
Though the government had indicated that there could be more reliance on verification of children's age based on government-issued identity cards or digital lockers, the stance now seems to have changed after civil society groups raised concerns about the exclusionary impact of such lockers or ID cards, the executive said.
“Now the government has left it to the intermediary to choose the technology that is convenient to them,” she said.
Though the US, countries in the EU and other parts of the world have turned to consent managers and facial recognition technology to obtain verifiable parental consent, these methods have their limitations.
“There is enough room for technology innovation. Every method has its limitations. It cannot be uniform globally,” she said.
Apart from these, experts also raised concerns about the draft's exemptions from the restrictions on the processing of children's data mandated under the Act, which are granted to educational institutions, health establishments and certain government entities.
The entities eligible for this exemption include government entities that perform functions related to the ‘interests of a child’, healthcare professionals, crèche or childcare institutions and government bodies that issue subsidies and licences.
For healthcare institutions, the exemption may be restricted to the provision of healthcare services to a child. Educational institutions will be allowed to undertake behavioural monitoring and tracking of educational activities.
“This kind of classification is not very useful in giving exemptions. For example, YouTube is considered a social media platform, but it is used for educational purposes. Kids from low-income backgrounds learn a lot from YouTube. So, the classification and exemptions should be risk-based,” the public policy executive said.