This column began with a glimpse at some new technology, jaw-dropping but also terrifying if we stop to think about how it can be used. And yet, we've come quite far in what we get used to. Over the last few years there has been a slow change to the psychological contract we make when we grant apps and organizations access to our private and personal information in exchange for services, tools and capabilities. We use euphemisms such as "access to information" and "privacy" to make this exchange comfortable. But the reality of the exchange is that we tell unknown parties where we are in order to get navigation tools, we let them listen in on us so we can use voice commands, we hand over our credit card details to simplify purchasing and our passwords to save us the need to remember them. And through all this, we allow the companies that supply these services to keep track of every keystroke, every search, every word, in order to create for us a digital world that is personalized, free-flowing, and convenient.
And we're used to it. Now, however, this exchange is slowly entering the workplace through smart tools that manage buildings, calendars and meetings, and that's just the tip of the iceberg. Today's column on Work-Futures explores the changing psychological contract required as new technology clashes with privacy in the workplace.
We've already grown accustomed to the presence of sensors, cameras and smart devices around us, and even on us. It began with sports, health and lifestyle, and the value of tracking movement and vital signs. In those cases, it was relatively easy to allow access to very personal information in exchange for valuable feedback. Now the business world is also gradually recognizing the value of gathering and processing information about our movements, communications and conduct within the workplace, and new players are exploring how this information can influence organizational strategy. This is apparent in the data analysis services popping up, offering to gather and analyze information we're still not used to gathering in everyday work life. Who is talking to whom? When? How often? Through which communication tools? And what does this indicate about the communication patterns of people in the organization?
We can now discover where people work and meet, and use this information to determine the need for physical offices. How often are different tools being used? How much time do employees spend working or learning? What part of our conversations is spent listening versus talking? And how much time do we spend working alone versus in teams?
In the workplace of the future we'll probably have much less privacy. And we're likely to accept it and get used to it, because technology will make our workspace safer, more efficient, and easier to use. Take, for example, Humanyze, a service that gathers and analyzes data in order to enhance office and employee efficiency. Within its own offices, the company keeps track of employee movements and interactions. Everyone wears an identification badge the size of a matchbox, which includes a microphone and sensors and gathers information about the employee's meetings: who they speak to, for how long, at what voice level, and where. Adding all this to data gathered from emails, online chats, phone conversations, boardrooms and calendars, the company paints a comprehensive picture of how employees spend their workday.
Worried? Humanyze will try to reassure you, insisting it doesn't listen in on conversations or collect their content. According to Humanyze, the content is not necessary; the mere fact that a conversation took place, and where, how, and with whom, is enough to make the data valuable. Another aspect aimed at making us feel comfortable is that only the employee has access to his or her personal information. The organization can only see data at team level and up, covering a wide range of criteria: gender breakdown, talk-to-listen ratio, the usage of various tools and workspaces, and so on. To be fair, it's easy to see how this can benefit an organization, for example by revealing that certain teams don't communicate enough with each other, or that part of the building is under-utilized. It is also possible to see how such analysis, at a personal level, could even help us understand how we use, and maybe waste, our own time.
Hitachi, for example, argues that these capabilities can even help people be happier. In 2016 it launched a product called the "happiness meter". Here too, employees wear a smart card, which keeps track of their every movement. It sends 50 signals per second to a database, including data on conversations (e.g. whom they've spoken to, when, and whether it was face to face or while walking) and data on location (e.g. when and where they sat at a desk or in the conference room). The company's algorithm then draws conclusions about employees' moods and identifies hidden business issues. One interesting conclusion was that when young employees spend more than an hour in a conference room, the whole team's mood gets worse. Hitachi too reassures us that it doesn't have access to individual information, only the group's.
And how is all this connected to happiness? The company discovered, for example, that people at a customer service center who engaged in lively conversation during their breaks were happier. As a result, the break schedule was changed so that people could meet employees from other shifts and other parts of the organization, whose company they were more likely to enjoy. According to that customer, this tripled the service center's efficiency.
If you've ever wondered why there is more focus on privacy in Europe, and why, for example, Facebook and WhatsApp are not popular in Germany, this became clear to me on a recent visit to Berlin. The tour guide explained that it is the result of years of living in the shadow of a secret police force that constantly gathered information from people about people, including who was where and who met whom.
History has taught them what a regime can do with such information. It is therefore not surprising that the thought of employers collecting and using such information in the workplace is unsettling.
It's not hard to imagine how organizations and authorities might take these capabilities too far. Veriato, for example, promises employers software that stores all employee computer activity. It also claims it can identify signs of low efficiency, scan emails to gauge the organizational mood, and flag whatever the company chooses to define as irregular activity, such as suspicious downloads.
In his latest book, 21 Lessons for the 21st Century, Yuval Noah Harari outlines a future in which smart systems will know more about our private world than we know about ourselves. And they will be able to use this knowledge to influence us, so that our choices will, in essence, no longer be our own. This is where it becomes clear why this whole conversation is unsettling.
It begins with control. Are we able to choose to share this information in return for the value obtained, or to decide not to share it if we believe the value isn't worth the risk? Take, for example, the blue V marks in WhatsApp chats, which tell the sender whether you have seen their message. You may not want that, and the app lets you opt out, which means others will not see whether you have read their messages, but you too will not see whether they have read yours. It is, however, far more difficult to choose not to use Waze so that Google doesn't know where you are at any given moment, or not to use Google so that it doesn't learn your preferences.
And it doesn't end there. The tough questions we will have to deal with concern the ownership of the information. AI is better than we are at identifying weak signals, connections we are not always aware of ourselves. So when all this information about us is gathered, a private world of connections and considerations will be revealed to others, a world that we ourselves might not be aware of. Do you really know what part of the time you speak and what part you listen? How your time is divided between different kinds of work? Which things, or which people, reliably annoy you? Whoever gains access to such information, especially if it isn't us, will know how to push our buttons, provoke our reactions, even manipulate us into making decisions. Decisions we believe we are making of our own free will. But that might not really be the case.
This discussion is much broader than merely that of the workplace. But one thing is certain: a new contract is needed in the workplace, between employer and employees, as well as legislation that will protect employee rights in connection with workplace data. On a daily basis, we are already being asked to confirm that we accept an invasion of our privacy, without really having a choice to add or remove aspects of this consent. We might need to ask for more trade-offs of the kind the WhatsApp blue V offers: WhatsApp didn't deny us the use of its tool, it just let us choose whether or not to use a certain option. The question is whether an employer who asks us to wear a tag that tracks us and listens in on us is really asking our permission, and whether we even have the option of saying no. And what happens if we do?
Published by Globes, Israel business news - en.globes.co.il - on January 1, 2020
© Copyright of Globes Publisher Itonut (1983) Ltd. 2020