Simon Woodworth: TikTok isn't the problem — lack of a timely response to security risks is

While it is unlikely that a foreign state entity will be particularly concerned with our latest cat videos, European and Irish GDPR legislation affords us distinct privacy rights. Those rights might not be vindicated by a social media platform that is subject to coercion by the Chinese government.
Yesterday an article described how a Holocaust survivor and her grandson were using TikTok to educate teenagers about the grim reality of Nazi death camps during World War Two. The 84-year-old Tova Friedman now has 500,000 followers on her account. She is an excellent example of how social media can be used to reach people where other means might fail. It is also an example of how effective TikTok's platform can be.
So why is there so much talk about banning TikTok? Why is it being singled out for special treatment? And why is there concern about government departments' responses to the perceived threat of TikTok and other cybersecurity issues?
Let’s start with TikTok. It’s owned by Chinese company ByteDance and is therefore subject to Chinese law and regulation. The Chinese Communist Party has greatly tightened control of the industry of which ByteDance is a part. Specifically, Chinese law can force any organisation to assist with state intelligence work. This raises the concern that at some stage ByteDance might be forced to hand over TikTok user data.
Does this matter? Yes, because first of all it is a data privacy issue for TikTok users. While it is unlikely that a foreign state entity will be particularly concerned with our latest cat videos, European and Irish GDPR legislation affords us distinct privacy rights.
While the problem is concerning at an individual level, the attendant risk to the state is probably low. Identity theft and financial compromise are, of course, serious issues, and there are numerous cases in which scams or data breaches have caused great distress and inconvenience to individuals. The personal effect of such breaches can be devastating, but the effect at national level is minimal. But what happens when the individual affected is party to knowledge or data that pertains to the business of the state?
This is where Catherine Murphy’s question to all government departments comes in. A concern has been expressed about the use of TikTok in organisations and governments where the involuntary disclosure of data could do substantial and lasting damage. The European Commission and European Parliament have both banned TikTok for that reason. Deputy Murphy has asked what cybersecurity measures are in place or are likely to be taken here.
Some departments are more permissive than others, and different combinations of security measures are implemented across different departments. It is hard to tell which departments have implemented security measures sufficient for the risks that their staff face. Additionally, since each department apparently decides its own policy, this represents duplicated and wasted effort.
Individual departments, of course, have different data security needs and their risk profiles will also vary. Matters of national security are likely to be far more sensitive than, for example, policy around making provision for school books. This is not to say school books are less important; it is rather a recognition that the damage done by disclosing certain types of information is potentially far greater than that done by disclosing others.
Nevertheless, a consistent level of basic security policy across all departments should be expected, with some departments perhaps exercising tighter security measures due to the particular sensitivity of their data. A consistency of approach would save money, reduce duplicated effort, and make it easier and less confusing for civil servants transferring between departments. This in turn reduces the risk of data breach through user error or confusion.
There is an additional concern that some departments are waiting for guidance from the National Cyber Security Centre. While developing a consistent policy is exactly what is needed, inaction in the interim is not an option, especially in the face of immediate threats. Foot-dragging is not reasonable, least of all in the aftermath of the devastating HSE cyberattack. Any concern that TikTok usage represents a possible risk, however small, to the security of the state warrants an immediate ban, subject to periodic review. The same applies to any other app or software where there is a similar security concern. There is no need to wait for the NCSC, as such a ban would reflect prudent practice elsewhere. In particular, where mobile and desktop devices are supplied by the department, it is perfectly reasonable to restrict how those devices are used.
The HSE cyberattack should have been a wake-up call to the government that, while comprehensive measures take time to develop and implement, immediate remedial action against the greatest risks is also needed. In this case, TikTok is not the problem; our lack of a timely response to it is.