The surveillance sphere no one is talking about.
When half of workers are unsure what data their employers hold on them, we should be concerned. That’s the worrying statistic that came back when Prospect union spoke to over 7,500 workers across specialist, technical and professional jobs.
The Fourth Industrial Revolution promised that digital and data would be the answer to our problems of stagnating productivity, sustainable growth, tackling disease and improving quality of life. But the big story of the past couple of years is of how tech has run into a trust and accountability backlash that risks delaying or derailing the benefits it could bring.
Recurrent data breaches, scandals over the commercial or political uses of personal information, and controversy over new technologies such as facial recognition have shaken public acceptance of how citizen and consumer data is gathered and used. They have prompted a belated but necessary bout of soul-searching about how the ethical reputation and social legitimacy of companies and governments can be maintained.
What can go wrong?
Workers are right to be concerned. The rapid advance of digital technologies into workplaces means employers are accumulating huge amounts of data on their workforces. For all the focus on individual data privacy, little attention has been paid to data rights at work and the fast-moving frontier of workplace technologies. Just compare the justified scale of concern about facial recognition in public places to the near silence on how monitoring or surveillance is applied at work.
Big tech is driving a growing trend of the commercial capture of data in what author and scholar Shoshana Zuboff has termed surveillance capitalism. This isn’t just limited to our private use of technology. Miller and Adler-Bell have explored how tech has led to the digitisation of employment. These trends are not limited to the United States. Even the protections extended under the EU’s GDPR rules have not stopped this forward march of data manipulation into Europe.
Rapidly developing technologies and business practices have already resulted in serious infringements of workers’ rights and dignity, in both the public and private sectors. In 2012, for example, an Employment Tribunal found that an online test used by the Home Office to assess candidates for promotion was indirectly discriminating on grounds of race and age. Last year, Amazon was forced to scrap an AI program it was using to sort applications for jobs at its Edinburgh engineering hub when it emerged it had been discriminating against women.
There are clearly benefits from how AI and technology can improve work, assist innovation and increase productivity. But they are also all too easily vulnerable to misuse. Employers are increasingly using data to recruit, manage, promote, discipline or reward their workforces in ways that can lack transparency or accountability, increasing the risk of opaque, ill-founded, unfair or discriminatory decision-making. Fundamentally, any system that is designed by humans will carry and sometimes amplify the inherent, and often inadvertent, bias of those who make it.
Technology is making it possible for employers to track workers through location tracking; keystroke monitoring; audio recording; CCTV or workplace sensors; facial “coding” software; or wearable devices such as cameras or Fitbits. It is not just data collection; there is also the question of how data is used to make inferences about people, and the harm that can cause (see Wachter & Mittelstadt 2019).
This may range from personal information used to target direct marketing or credit scoring, to the logging of behavioural patterns used in data inferencing. Workers at call centres can now be monitored by software that uses algorithms to assess their tone, mood and success in pleasing customers. UNI Global Union has reported a case where these assessments were then used in appraisals, despite being inaccurate and discriminatory.
Prospect members are pro-technology. Most of them work in technology-intensive sectors, such as IT and telecoms, energy, aviation, regulation and R&D. Many of them are directly involved in developing the new technologies that are at the forefront of the digital revolution. They are optimistic about the beneficial impact of technological change on our economy and society.
But a recent survey of over 7,000 of our members revealed a dangerous deficit of transparency and trust with regard to their own employers. We found that almost half (48%) were “not confident” or “not confident at all” that they knew what data their employer collected about them at work. And over a third (34%) were “not confident” or “not confident at all” that their employer used data collected about them at work for appropriate purposes.
What are unions doing?
One of the first things we can do is be aware of how technology and data can be used and act accordingly. Our members don’t want us to be a barrier to new technology – they want us to ensure it is introduced and used in a way that involves them.
Unions are getting serious about both organising workers in tech and advocating for data rights that protect workers. UNI Global Union, of which Prospect is a member, has successfully negotiated Europe-wide right-to-disconnect agreements with Telefonica and Orange to set new boundaries for how workers can switch off their devices outside of work. The Irish Financial Services Union has recently secured an agreement from Ulster Bank/RBS to write protections against employee data being sold or misused into staff contracts.
In the UK, unions have used collective bargaining to secure protections for workers. For example, unions at the Environment Agency negotiated an agreement for vehicle tracking systems to include a “privacy switch” that workers can use when not on business. Prospect, working with sister unions, successfully moved policy at the TUC Congress in 2019 calling for greater work around collective voice and new technology.
We are also working with UNI Global Union on a project called Spotlight to develop our own workers’ FitBit, with data controlled and owned by workers, allowing us to build our own evidence base to challenge employers.
How do we improve things?
As a recent study by the OECD recognised, trade unions can play a critical role in enabling technological and workplace innovation, by engaging and involving workers in the conversations needed to make the most of the opportunities. The reality though is that we don’t have a clear picture of the technology used at work and what it means for workers’ data.
That is one of the reasons we have called for greater transparency. We have set out some initial ideas for how we get the human/data balance right, including:
- A right to privacy – explicit commitments on employers’ collection and use of employee data should be included in employee contracts, collective agreements and bargaining processes, and in employee privacy notices required by GDPR rules;
- A right to disconnect – to challenge the ‘always-on’ culture and blurring lines between family life and work;
- A right to be involved – employees and unions should have the right to check and challenge how their data is used, and what inferences are made, in employers’ decision-making processes;
- A right to equality – using existing data and equality laws to ensure new technology, automated decisions and algorithms challenge bias and prejudice;
- A right to be consulted – union representation on national bodies determining guidance on the design and use of AI and new technology.
Andrew Pakes is Research Director at Prospect Union.