The idea that technology and social media could have such a profound impact on society and politics would have sounded crazy in the past, but now that technology is so ingrained in our lives, it raises legitimate questions about Silicon Valley’s ethical responsibility.
Should technology companies create and enforce guidelines within their platforms if they believe such policies are in the interest of society as a whole, or should leaders allow technology to evolve organically without filter or manipulation?
One authority on this subject is Casey Fisler, a researcher, assistant professor, and technology ethics expert at the University of Colorado Boulder. She is a graduate of Vanderbilt Law School, where she discovered her passion for the intersection of law, ethics, and technology.
She was further motivated when she realized that, given the obvious overlaps, more empirical research was needed on the ethics of modern technology companies. With a sense of urgency, she dedicated her career to researching and teaching the legal and ethical aspects behind the impact of technology.
Today, Casey is dedicated to educating leaders across the tech industry, from young entrepreneurs to seasoned CEOs, about designing and supporting ethically sound platforms. She believes a renewed focus on tech ethics will create a better future for companies, technology users, and society at large.
What is technology ethics?
We're probably all familiar with the general concepts behind ethics and technology. But what exactly is technology ethics? There's always been debate and discussion about the role of social media in society, and even Casey herself admits that there's no clear answer.
“Ethics is a loaded word,” she says. “We also talk about responsibility. Are the people designing technology responsible and thinking about the potential harm of their products? Ultimately, I think tech ethics is about designing technology that does more good than evil for the world.”
Let's look at Facebook, the most popular social media platform on the planet, through an ethical lens. “Before Facebook was Facebook, it was built as a platform to rate girls,” Casey says. “The idea that it would one day become something so deeply rooted in society, with the power to affect democracy, would have sounded ridiculous!”
Casey doesn't fault Mark Zuckerberg for not considering the ethics of banning a sitting president when he was designing Facebook in his Harvard dorm room, but as tech companies grow, so does their responsibility to engage in ethical thinking.
“At some point, it should have been obvious to Facebook that this was the direction things were heading,” Casey said. It's ultimately incumbent on leaders to anticipate potential problems, plan for them, and take action, rather than ignoring ethics. And with examples of poor tech ethics everywhere, startups have no excuse for putting it off until they find some measure of success.
She also encourages tech designers and entrepreneurs to look at more than just the bottom line. Financial health is certainly an important number, but it shouldn't be the only metric. “Don't let revenue be your only metric,” Casey says. “When it comes to social media, decisions have been made based on how much attention you can get for your ads.”
Casey points to the controversy over YouTube's algorithm, which blatantly pushed conspiracy theories on viewers to boost ad revenue. After significant backlash, “YouTube made the decision to purposefully change their algorithm so that certain types of content would not be recommended as highly,” Casey says. “It was an ethical decision, and it certainly led to lost revenue, but sometimes you have to do what's right for society.”
“Think early and often about what could go wrong,” Casey says. “Don't wait until a problem occurs to try to solve it, because by that time the damage has already been done.”
Social Media and Technology Ethics in Politics
Unsurprisingly, Casey's focus today is often on social media's role in politics, from moderation tactics to decisions to flag information or remove accounts entirely. “Content moderation decisions are a huge example of why ethics is so difficult, and why it's so difficult to teach and so difficult to learn,” Casey says.
“There's no right or wrong answer. A lot of the time it has to do with people's different values,” she continues. “Some people will say that freedom of speech is the most important value. Others will say that protecting people from hate speech and harassment is the most important value. Platforms have to make a decision about this.”
One big tech company in the middle of this ethical tech debate is Twitter, which currently maintains a civic integrity policy. Violations include sharing misinformation about election procedures or results, such as tweeting false polling place locations or falsely stating that an election has been postponed. Twitter reserves the right to remove Tweets that violate this policy.
“Another element of this policy is that you can't share content that misleads people about the outcome of an election,” Casey said. “When President Trump tweeted that he had won the election after Joe Biden was announced as the winner, that content was flagged under this policy.”
Until recently, Twitter had a policy of not removing content from political leaders; instead, it labeled it as misinformation. But the events of January 6, 2021, forced Twitter to reconsider its ethical position on that policy.
Twitter now removes Tweets from users who consistently violate this policy, regardless of their status. Although this decision has had a negative impact on Twitter's revenue, management believes it was the right ethical course of action.
Why technology ethics matter to every business
Beyond politics, there are countless reasons why technology ethics matter. Before the 2020 COVID pandemic, few could have predicted the now-famous phenomenon of Zoombombing (when uninvited visitors join private Zoom chats).
Sometimes, an unexpected Zoombomb just elicits laughs. Other times, it poses a serious security or harassment threat to a company. Either way, many are wondering why Zoom's leaders and designers didn't foresee the possibility that people would misuse the platform and devise a solution before it was released.
“To a certain extent, I understand Facebook,” Casey said. “Facebook was built for one thing. I didn't expect it to get to this point. But Zoom was built for business meetings. Maybe their process could have involved some ethical consideration.”
To avoid Zoom's mistakes, remember: many problems are easier to solve before the platform gets into the hands of millions of people. “Sometimes you can predict where the technology is going,” Casey says. “It's probably a slow, gradual process, and it can be difficult. But that's why you need the company's leadership team to check in.”
During startup and development, Casey encourages tech leaders and designers to put themselves in the shoes of “terrible people” and think about how such people will use the technology once it becomes widespread.
“We know this happens,” Casey said. “When we design technology, part of the process is to think about how a bad actor might use it, and then we design it to make it harder for them to do that.”
Ultimately, Zoom addressed many of the serious issues that led to Zoombombing, but it was all a reactive response to a problem that had already surfaced. Imagine the time, money, and headaches they could have saved if they had integrated these solutions from the start, and the PR issues they could have avoided.
“I don’t expect software designers to be precognitive, but I think it’s really important to have some ethical reflection during the design process, even if it’s motivated by wanting to avoid a PR disaster,” Casey says with a laugh.
The conversation with Casey Fisler continues on the Leading with Genuine Compassion podcast. Hear about how ethics have changed during the COVID-19 pandemic, why increasing diversity and equity in tech can solve many ethical issues, and much more. Sign up for my mailing list so you never miss an article or episode; you'll also get a free guide to my favorite mindfulness resources. You can also follow my work on Twitter and LinkedIn.