As Tom Friedman writes in his March 27, 2018 opinion column, we’ve entered the second inning of one of the world’s great technological leaps, the implications of which we’re just beginning to understand. We’ve started to feel beat up by the same platforms and technologies that, in inning one, had enriched, empowered and connected our lives, he writes, pointing to the consequential news story of a self-driving car — with an emergency backup driver behind the wheel — having struck and tragically killed a woman on a street in Tempe, Ariz. “For problems like this, I like to consult my teacher and friend Dov Seidman, CEO of LRN,” Friedman says.
“The first inning’s prevailing ethos was that any technology that makes the world more open by connecting us or makes us more equal by empowering us individually must, in and of itself, be a force for good,” Seidman began. “But, in inning two, we are coming to grips with the reality that the power to make the world more open and equal is not in the technologies themselves. It all depends on how the tools are designed and how we choose to use them. The same amazing tech that enables people to forge deeper relationships, foster closer communities and give everyone a voice can also breed isolation, embolden racists, and empower digital bullies and nefarious actors.”
Equally important, Seidman added, these “unprecedented and valuable tools of connection” are being used with great accuracy and potency “to assault the foundations of what makes our democracies vibrant, capitalism dynamic and our societies healthy — namely, truth and trust.”
And they have begun to be used “to assault our personal foundations — our privacy and sense of identity,” Seidman said: “It is one thing to use our data to enable better shopping experiences, but when my beliefs and attitudes are mined and manipulated for someone’s political campaign, a campaign that may be antithetical to my beliefs, that is deeply harmful and unmooring.”
So what to do?
“Precisely because we are in just the beginning of a technological revolution with a long, uncertain, up-and-down road ahead, we need to start by pausing to reflect on how our world, reshaped by these technologies, operates differently — and on the kind of values and leadership we will need to realize their promise.”
“Values are more vital now than ever,” Seidman insisted. “Because sustainable values are what anchor us in a storm, and because values propel and guide us when our lives are profoundly disrupted. They help us make the hard decisions.” Hard decisions abound, because everything is now connected. “The world is fused. So there is no place anymore to stand to the side and claim neutrality — to say, ‘I am just a businessperson’ or ‘I am just running a platform.’ ”
In the fused world, Seidman said, “the business of business is no longer just business. The business of business is now society. And, therefore, how you take or don’t take responsibility for what your technology enables or for what happens on your platforms is inescapable. This is the emerging expectation of users — real people — who’ve entrusted so much of their inner lives to these powerful companies.”
To be sure, Facebook, Twitter and YouTube should all be commended for trying to find engineering solutions to prevent their platforms from being hacked and weaponized. “But this is not just an engineering problem, or just a business model problem,” he said. “Software solutions can increase our confidence that we can stay a step ahead of the bad guys. But, fundamentally, it will take more ‘moralware’ to regain our trust. Only one kind of leadership can respond to this kind of problem — moral leadership.”
“Moral leadership means truly putting people first and making whatever sacrifices that entails,” said Seidman. “That means not always competing on shallow things and quantity — on how much time people spend on your platform — but on quality and depth. It means seeing and treating people not just as ‘users’ or ‘clicks,’ but as ‘citizens,’ who are worthy of being accurately informed to make their best choices. It means not just trying to shift people from one click to another, from one video to another, but instead trying to elevate them in ways that deepen our connections and enrich our conversations.”
It means, Seidman continued, being “fully transparent about how you operate, and make decisions that affect them — all the ways in which you’re monetizing their data. It means having the courage to publish explicit standards of quality and expectations of conduct, and fighting to maintain them however inconvenient. It means having the humility to ask for help even from your critics. It means promoting civility and decency, making the opposite unwelcome. It means being truly bold — proclaiming, for example, that you will not sleep until you’re certain that our next democratic election won’t be hacked.”
“Once you see that your technologies are having unintended consequences, you cannot maintain your neutrality — especially when you’ve become so central to the lives of billions of people.”