The warning from the Information Commissioner’s Office (ICO) concerns the use of data stemming from research initiatives in which universities are involved. Considered alongside the growing cyber threat, it should spur universities to improve the security of all the data they control.
Universities hold vast amounts of data and, in the context of research projects, this can include commercial and personal information of a sensitive nature, such as medical or genetic data. As major employers with a large student population, universities also hold personal data regarding academics, staff and students.
Protecting all of this information is a challenge. Universities often operate with disparate IT systems which can vary in age and complexity, and contain multiple points of vulnerability.
The volume and value of data that universities hold make them targets for increasingly sophisticated cyber-attacks, and a number have fallen victim to phishing and ransomware attacks.
Indicating the seriousness with which the ICO treats data security in the higher education sector, the watchdog fined the University of Greenwich £120,000 over a data breach, the first time the ICO had issued a fine against a university under the then-current data protection law.
A recent ICO report highlighting concerns about data protection practices at universities said: “What is clear is that there is room for improvement in how higher education institutions overall handle data in the context of academic research.
“It is therefore essential that higher education institutions have in place the correct processes and due diligence arrangements to minimise the risk to data subjects and to the integrity of academic research practices.”
The severe financial penalties that can now be imposed on organisations that breach data protection law – fines of up to 4 per cent of annual global turnover, or €20 million, whichever is higher – provide a regulatory and reputational incentive to address cyber and data risks properly.
There are also positive reasons why universities should update cyber and data security practices, including being well placed to play a key role in helping the UK government achieve the vision set out in its artificial intelligence (AI) “sector deal”.
That deal sets out how business, academia and government might work in partnership to drive improvements in UK productivity through support for AI. It is made up of a package of measures, including up to £950 million of support for the AI sector from public and private investment, and improved tax credits for AI research and development.
Data is at the heart of AI and the government has recognised the need to work with industry to “explore frameworks and mechanisms for safe, secure and equitable data transfer”. This includes through the use of data trusts, which were last year recommended in a government-commissioned review into how to grow the AI industry in the UK.
Data trusts can facilitate the sharing of data across organisations to develop AI. This type of “sharing framework” will encourage data to be created, shared and traded efficiently.
It is not yet clear which organisations might act as data trusts in the context of AI projects, but universities that embed robust cyber and data security practices, processes and procedures across their whole organisation will certainly be well placed to lead in this area of innovation. Embedding such processes will not be easy and will take time, but with opportunities and threats ahead, universities should not wait for a major data breach to hit them before taking action.
Joanne McIntosh, legal director and technology law specialist in the higher education sector, Pinsent Masons.