Every so often something comes along and disrupts the normal order of things, and out of that disruption something new emerges. It’s certainly not a stretch to say that 2020 has brought plenty of disruptions with it, and according to a recent report by Gartner, businesses are starting to “reset” how they operate and implement new strategies reliant on emerging, more sophisticated technologies. In the report, Gartner lists a number of predictions for what the future of business will look like. Perhaps the most startling is its prediction of increased workplace surveillance: “By 2025, 75% of conversations at work will be recorded and analyzed, enabling the discovery of added organizational value or risk.” Whether this prediction will turn out to be true is up for debate, but the tone of the report seems to imply there isn’t much we can do about it.

The problem, of course, is that these changes don’t appear out of thin air. People create the change. This means that if Gartner’s prediction turns out to be true, we aren’t completely helpless; we could even play a role in building new technologies based on the values and ethics people share. Just as there is a movement in cybersecurity to create technologies based on privacy by design, as we move toward a new future we also need to focus on creating technology based on ethics by design, an approach that promotes the well-being and rights of individuals.

While the idea of having every conversation and interaction you have at work recorded and analyzed probably doesn’t sound too appealing to employees, Gartner’s report highlights the possible benefits for businesses. As Magnus Revang, research vice president at Gartner, explained to TechRepublic, “By analyzing these communications, organizations could identify sources of innovation and coaching throughout a company.” This may certainly be true. In fact, organizations could even use this data to help improve the workplace for employees.

Of course, if we’ve learned anything in the past decade, it’s that technology used for good can also be used for bad. And Revang recognizes the risk involved with this shift: “I definitely think there [are] companies that are going to use technology like this and misuse it, and step over the line of what you would call ethical or moral.” When used correctly, however, Revang believes the benefits of this technology will outweigh any possible risks.

The problem with this argument, however, is that it locates the problem not in the technology itself, but in the people who use it. According to TechRepublic, Revang believes “technology is inherently neutral, however the way an organization chooses to deploy and use a technology is another consideration.” What this way of thinking doesn’t consider is that technology is built by people, and people are certainly far from neutral. As Joan Donovan, a social science researcher at Harvard University, recently put it, the technology we build encodes “a vision of society and the economy.”

Humans are flawed, and technology is stained with our flaws before it is even operationalized. So when we look toward the future of technology in business, unless these new innovations are designed with ethics in mind, our underlying biases and flaws will shape the consequences this technology has for our everyday lives. This has huge implications for every facet of society, and unfortunately, our ethical oversight structures are too weak to mitigate these threats.

There’s talk about privacy by design principles, and AI-bias frameworks are being developed. But to create technologies that support our better angels rather than our worst impulses, we need experts across all fields and sectors to work together to understand and develop ethics by design principles — an approach that can help build technologies that are not only useful, but that reflect the values and ideals of a more just and equitable society.