
Is the Data Protection Bill right for Sri Lanka?

11 Dec 2021

By Prof. Rohan Samarajiva

Data protection is considered an esoteric subject, but it can have a powerful effect on the emerging digital economy. Depending on the success of digitisation efforts, pretty much every organisation may soon fall within the scope of data protection regulation.

The use of data, and of insights extracted from data, predates computers. The difference is that a lot more can be done with data, and more easily, now that computers are involved. Hospitals and dispensaries have long maintained medical records of patients. But there is a big difference between records maintained in paper files and those in databases: copies can be made and transferred easily, analyses can be conducted rapidly to identify patterns and relationships, and medical records can be combined with data from other sources to gain even deeper insights. This has enormous potential for transforming patient care. But it also poses the danger of data being misused by unauthorised parties. If you think about what mischief can be done with the leaked medical records of political leaders, and why they guard such data carefully, the dangers become evident. Data protection laws seek to minimise the risks of misuse of data stored in computer databases of various forms.

There has been a worldwide surge in interest in data protection since the General Data Protection Regulation (GDPR), a European legal instrument with extra-territorial impact, came into effect in May 2018. Many in the private sector, especially those engaged in business process outsourcing (BPO), have lobbied for GDPR-like legislation to improve their business prospects by having the country meet the adequacy test. This is a certification non-European jurisdictions must obtain if their enterprises are to be permitted to work with European Union (EU)-originated data.

Drafting of the Sri Lankan legislation commenced in January 2019, with multiple rounds of consultations. The gazetted Bill presents a well-crafted and sophisticated regulatory design. There is no “right answer” in terms of regulatory design; the right answer is what fits a country’s conditions. The right design depends on the nature of the country’s judicial system, its administrative law, and the kind of staff the regulatory agency can realistically hire. Responding to the demands of vocal interests, the drafters have adapted the European model. But even Europeans find the GDPR model, with its stand-alone and procedure-focused data protection authorities, difficult to implement. Will their creative adaptation suffice for Sri Lanka?

Impact on individuals

Increasingly, individuals maintain databases in computerised form; a family’s invitee list for a wedding is an example. Section 2(3) of the Bill excludes “personal data processed purely for private, domestic, or household purposes by an individual”. If the invitee list is maintained by an event organiser, however, it is subject to the provisions of the Bill.

Citizens need not concern themselves with the obligations imposed on data controllers by the proposed law; the law affects them in their roles as data subjects. For example, an individual may suffer serious repercussions because of a data breach, wherein sensitive personal data such as credit card information and passwords stored in a government or company database are taken unlawfully by a third party and sold on the dark web or used for extortion. Breaches may also occur accidentally.
Because of damage to reputation or the desire to avoid paying damages, companies may not disclose breaches in a timely manner, causing further harm. Section 23 sets out an obligation to report breaches but leaves the details to rules that are to be formulated under the Act.

Citizens give personal data to the Government and to companies in order to obtain services (e.g., loans) or because of legal compulsion. Increasingly, data is also collected as a byproduct of transactions. For example, data on one’s location and movements are recorded as a byproduct of providing mobile communication services and billing for them. The principles of informed consent and purpose specification have been central to data protection regimes since the Organisation for Economic Co-operation and Development (OECD) Guidelines on the Protection of Privacy and Transborder Flows of Personal Data were adopted in 1980, when the processing of transaction-generated data was unthought of. There is a lively academic debate on the continued relevance of consent in the qualitatively different circumstances of today. Consent is especially problematic with regard to jointly produced transaction-generated data, where the giving of informed consent is quite onerous. Most people do not read the information provided when consent is requested, because doing so would leave them little time for anything else. Broadly worded consent language inserted into customer agreements may satisfy the legal requirements, though there may be little substance to consent so given.

Section 18 gives every data subject the right to request a controller to review a decision based solely on automated processing which has created, or is likely to create, an irreversible and continuous impact on the rights and freedoms of the data subject under any written law. This may be illustrated using the procedure by which job applications are processed. Assume 300 applications are received for a position. Some form of screening would be necessary to develop a shortlist of those to be interviewed. If the screening is done by junior staff, or by an automated procedure that has a human component, an applicant will not have the right to request a review. If it is done solely using some form of software or artificial intelligence (AI)-based process (arguably more objective), the organisation would have to be ready to conduct a review upon request.

Section 27 on unsolicited messages is likely to be quite popular. Those sending out messages in bulk (spam), usually for marketing purposes, by electronic means or through the post, will have to obtain the consent of the addressees and provide them with opt-out facilities. This will also apply to political messaging via bulk SMS and the postal service. Because legislators are unlikely to accept constraints on their own campaign communications, these sections are likely to be amended prior to approval by Parliament.

Impact on businesses

Data protection laws modelled on the GDPR impose considerable burdens on controllers (those who determine the purposes and means of the processing of personal data), such as ensuring that informed consent is obtained; that data subjects are provided with information about the personal data held about them; that data is rectified or completed upon request; that data subjects are permitted to withdraw consent, which will result in the erasure of their data; and so on. Controllers are also mandated to appoint data protection officers with academic and professional qualifications to be specified by rules. Large organisations can absorb these costs.
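To make this burden concrete, here is a minimal, hypothetical sketch in Python of what honouring just one of these obligations (recording consent and erasing a data subject’s records when consent is withdrawn) might look like for a small controller. The names and structures are illustrative assumptions, not drawn from the Bill or from any real system.

```python
# Illustrative sketch only (not from the Bill): a toy in-memory "controller"
# that stores personal data together with recorded consent and erases it on
# withdrawal. Real compliance would also involve processors, backups,
# audit logs, breach reporting, and more.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class CustomerRecord:
    name: str
    birthday: str
    preferences: list[str]
    consent_given_at: datetime | None = None


class SmallController:
    """A hypothetical small controller keeping customer details."""

    def __init__(self) -> None:
        self._records: dict[str, CustomerRecord] = {}

    def record_consent(self, subject_id: str, record: CustomerRecord) -> None:
        # Store personal data only together with timestamped consent.
        record.consent_given_at = datetime.now(timezone.utc)
        self._records[subject_id] = record

    def withdraw_consent(self, subject_id: str) -> None:
        # Withdrawal of consent triggers erasure of the subject's data.
        self._records.pop(subject_id, None)

    def export_for_subject(self, subject_id: str) -> CustomerRecord | None:
        # Access request: return whatever is held about the data subject.
        return self._records.get(subject_id)


controller = SmallController()
controller.record_consent("C001", CustomerRecord("A. Perera", "1990-03-14", ["chocolate"]))
controller.withdraw_consent("C001")
assert controller.export_for_subject("C001") is None  # erased on withdrawal
```

Even this toy version demands deliberate bookkeeping; add access requests, rectification, breach reporting, and a suitably qualified data protection officer, and the compliance costs discussed next follow.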
Compliance costs will be onerous for small businesses and organisations engaged in the processing of personal data. An example is a cake supplier who keeps the birthdays, addresses, and preferences of regular customers in a spreadsheet, and would have to appoint a suitably qualified data protection officer to be fully compliant.

Jurisdictions that follow the European model of data protection require all entities, large and small, that fall within the scope of the legislation to register and to renew their registrations periodically. This gives the data protection regulator a record of all entities subject to its jurisdiction and enables it to conduct inspections, serve papers, and so on. Usually, registration must be accompanied by a fee, and in many countries these fees are a source of revenue for the regulator. The scope of those required to register is so vast and the transaction costs are so high that many small businesses and organisations do not register. Even in Europe, data protection authorities do not have the personnel to actively compel registration and compliance. This poses two dangers. First, laws that are disregarded breed disrespect for the law. Second, in the event a non-registered firm experiences a data breach and has to interact with the regulator, it may find itself charged with multiple infractions.

Unusually, there is no registration requirement in the Bill. Non-registration is not an offence. No registration fees are charged, and the Data Protection Authority will be reliant on budgetary allocations. However, organisations large and small that fall within the scope of the law are bound to conduct their data processing and related activities as specified in the Bill. It is possible that the regulator may, in some instances, especially with regard to small controllers, experience difficulties in serving papers and therefore in regulating them. But locating large controllers in the private and public sectors is unlikely to be difficult. The reduction of compliance costs for the many thousands of micro, small, and medium enterprises (SMEs) is well worth the cost of locating an entity against which a complaint has been made. Indeed, proper design of the forms used for lodging complaints can overcome the problem by mandating the inclusion of the location and contact details of the offending controller.

Extra-territorial implications

Google Maps provides an extraordinarily valuable service. By processing the travel patterns and speeds of millions of persons with map applications installed on their phones, Google provides dynamic routing instructions and travel-time estimates for various modes of travel. Arguably, these actions fall within the scope of Section 2(1)(v): “Specifically monitors the behaviour of data subjects in Sri Lanka including profiling with the intention of making decisions in relation to the behaviour of such data subjects in so far as such behaviour takes place in Sri Lanka.” This would make Google, which engages in machine learning-based processing of the travel behaviours of millions of Sri Lankans, a controller subject to the provisions of the law. Had the registration requirement been retained, it is doubtful whether a global internet service company such as Google could have been compelled to register, let alone establish a physical presence in Sri Lanka. Nepal tried, and was ignored. The elimination of the registration requirement is a creative solution to that problem.
The law, as drafted, imposes duties and obligations on global entities without a presence in Sri Lanka; it creates rights against such entities that the regulator is bound to safeguard, through the procedures laid down in the law, in the event a citizen chooses to seek redressal. How the law may be enforced against such entities is a problem left for the future.

Section 26 of the Bill restricts the processing of data outside Sri Lankan territory. In the case of public authorities (ministries, departments, and corporations, including companies in which the state holds more than 50% of shares), processing cannot be done outside the country, other than for specific categories of data in countries classified as “adequate” by the Minister of Technology. This means that entities such as SriLankan Airlines, Litro Gas Lanka Ltd., and possibly even Sri Lanka Cricket, are precluded from using cloud-based services such as those offered by AWS and Google. They will be limited to cloud services with storage in the few Tier 3 data centres located in Sri Lanka, whose price-quality packages are inferior to those offered by global providers. With competition from the big cloud services absent, the local data centres will have less incentive to lower prices or enhance quality. The usual protectionist justifications about creating opportunities for local data centres are likely to be made, even though the supra-normal profits they earn will be repatriated by their foreign owners.

Somewhat peculiarly, adequacy provisions have also been extended to private entities that are not public authorities. The difference appears to be that public authorities may process only specific subsets of data even in countries that pass the adequacy test, while the entirety of the data held by private entities may be processed in such countries without restriction. It is unlikely that the powerful cloud-based processing capacities of companies such as Google will be fragmented and located within national territories to satisfy data localisation rules. The granting of adequacy status to countries even in well-resourced Europe has been slow and apparently political. The specified procedures are so complicated that it would be fair to surmise either that no adequacy determinations will be made, or that such decisions will be made for political reasons, bypassing the specified procedures. Another likely outcome is that large internet service companies such as Facebook and Google will simply ignore the data localisation provisions because they are impossible to comply with. If the Government presses for compliance, services are likely to be withdrawn.

Implications for innovation

Machine learning or deep learning, commonly described as artificial intelligence (AI), is one of the most exciting innovations today. In the past, an analyst had to develop a complex model with multiple variables. With machine learning, software is instead trained using large amounts of data. For example, a model can distinguish between cat and dog images if it has been trained on a large enough set of labelled images. The exact method by which the results are obtained cannot be reduced to a set of rules. Google Chief Economist Hal Varian has questioned the requirement of explainability, imported into Sri Lanka through Section 18 of the Bill, asking whether it is reasonable to demand more of a machine learning model than we ask of ourselves. Can an individual explain stepwise how they identify an image of their spouse from among many photographs?
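As an illustration of what such training involves, here is a minimal, hypothetical sketch using the scikit-learn library, with synthetic data standing in for labelled cat and dog photographs; it is not anything prescribed by or derived from the Bill.

```python
# Illustrative sketch only: training a classifier on labelled examples.
# Synthetic numeric features stand in for labelled cat/dog images.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# A stand-in labelled dataset (label 0 = "cat", 1 = "dog").
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# "Training" adjusts the model's internal parameters to fit the labelled examples.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("accuracy on held-out data:", model.score(X_test, y_test))

# What the model "knows" is encoded in learned coefficients, not in a
# human-readable rule book that can be recited on demand.
print("learned parameters:", model.coef_.shape)
```

A simple model like this one can still be inspected, but the deep learning systems actually used for image recognition contain millions of learned parameters, which is why a stepwise explanation of any single decision is so hard to produce.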
Provisions such as Section 18 suggest that inadequate weight has been given to innovation, perhaps driven by the false hope of winning adequacy certification from the EU. The incorporation of purpose specification and informed consent as core principles is inimical to the development of AI and data analytics in Sri Lanka. Section 6(2) and the important Schedules I and II do make some exceptions for research and for actions taken in the broader public interest, but the incorporation of the purpose specification principle as a central design element reduces the scope of those exceptions: “Every controller shall, ensure that personal data is processed for…(a) specified; (b) explicit; and (c) legitimate purposes and such personal data shall not be further processed in a manner which is incompatible with such purposes.”

LIRNEasia has done research on the geographical incidence of poverty and on how formerly residential areas in the city of Colombo are gradually turning into business-dominated areas, among other topics. These policy-relevant data analytics research activities were undertaken using pseudonymised call detail records and related data kept by providers of mobile telecommunication services. Will such data still be available to researchers after the importation of rigid forms of data protection anchored on purpose specification and informed consent? Service providers can satisfy the legal requirements by inserting broadly worded statements of purpose, such as improvement of services, into the contracts they enter into with all customers. But it will be impossible to specify, among those purposes, the kinds of novel research uses that are necessary for innovation. The result will be the shutting down of access to the large data streams essential for machine learning research. Big companies will be further strengthened, because the only people able to use their massive data streams will be those who work directly for the companies or are their contractors. It is no accident that most of the breakthroughs in AI are happening in China and North America. Europe is a laggard, despite the availability of highly trained data scientists, because those scientists are constrained in their access to data. The drafters of the data protection law have, unfortunately, been responsive to the lobbying of firms wanting a piece of the low-tech, low-skill business process outsourcing business. There is no one to lobby for the AI companies that have yet to come into being.

Capacity of the Data Protection Authority

It is commendable that the Bill focuses on the overall architecture of the regulatory scheme and leaves the details for rules to be made in the future. Those actions are to be taken by the state agency designated to serve as the Data Protection Authority. As stated above, the Authority will have to rely on the Consolidated Fund; it will not have its own sources of revenue. Even in Europe, the heartland of data protection, data protection authorities are under-resourced, lack staff with the necessary technical skills, and take inordinately long to respond to complaints. The New York Times reported last year that all but three European data protection authorities (those of Germany, Italy, and the UK) had annual budgets of less than £25 million. That benchmark may be interpreted to mean that a minimum of £25 million a year is required to run an efficient data protection authority. That is over Rs. 5 billion in operational funds.
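As a rough back-of-the-envelope check (the exchange rate used here is an assumption, loosely reflecting late-2021 levels, and not a figure from the sources cited above):

```python
# Rough, illustrative arithmetic only; the exchange rate is an assumed figure.
BENCHMARK_GBP = 25_000_000          # the budget benchmark discussed above
ASSUMED_LKR_PER_GBP = 260           # illustrative late-2021 rate
benchmark_lkr = BENCHMARK_GBP * ASSUMED_LKR_PER_GBP
print(f"benchmark: about Rs. {benchmark_lkr / 1e9:.1f} billion per year")
print(f"one-fifth: about Rs. {benchmark_lkr / 5 / 1e9:.1f} billion per year")
```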
The likelihood of the Sri Lankan Data Protection Authority being given even one-fifth of that by the Treasury is small. It will be bound by government-wide rules and may face difficulties in paying appropriate salaries, as most recently demonstrated by the Board of Investment (BOI) controversy. The commendable removal of the registration requirement may not be enough of an adaptation.

Sri Lanka has well-crafted laws, but rarely are they implemented satisfactorily. If the regulator is under-resourced, little more than ticking boxes so that Sri Lanka can pass the EU’s adequacy test is likely to be achieved, and even that is uncertain. The best law is not the one that is optimal in a technical sense, but the one that is most appropriate to local conditions.

