Dealing with Stupid People

It’s easy and fun to call people names. Slotting them into categories lets us guess what they think without them telling us. We can shut down intolerant babble before it begins. In promoting tolerance, why should we listen to perspectives that do the opposite?

I'm with Stupid
By Kevin Marks [CC BY 2.0 (http://creativecommons.org/licenses/by/2.0)], via Wikimedia Commons
Okay. Imagine refusing to hear people out because their opinions promote intolerance and even oppression. In so doing, we are gleefully engaging in the very behaviours we claim to condemn: silencing, excluding, and imposing our own subjective position as a morally objective standpoint.

Name-calling happens across any divide. At its core, the very notion of the divide lays the foundation for separation, even segregation. Can we build a world with no divide?

I think that world already exists, if we would just help it emerge. We construct and reproduce the divide ourselves. We talk and behave as if seven billion individual views can effectively be split into two or three camps: you’re either this or that or the other thing. Anyone who resents being pigeonholed knows that such a division hardly represents reality.

Collectively, we like to reinforce the notion of the divide. New Atheists talk about the “regressive left” as harmful because they promote cultural tolerance. Those who advocate cultural relativism point fingers at “evangelical” atheists who steamroll diversity in their push for rational solutions to social problems.

In the US, Republicans say Democrats are living in “la la land,” while Democrats call Republicans “racist” and “uninformed.” The same dynamic happens elsewhere, and not just in politics. We seem to think people who don’t agree with us are stupid or unaware. If only they knew what we know, they would inevitably come to the same conclusions we do.

In 2005, writer David Foster Wallace gave a commencement speech to Kenyon College’s liberal arts graduates. He argued that the benefit of a liberal arts education lies in its ability to teach us to choose how we think and what we pay attention to. It pushes us beyond our default stance, the self-centered positionality we all grapple with.

Of course it’s hard to get out of our own heads when that’s where we live. Doing so requires coaching and effort. Sometimes, we’ll fail and that’s okay too.

The goal is to try and try again. Recognize our fallibility. Embrace uncertainty.

Hannah Arendt famously said, “The most radical revolutionary will become a conservative the day after the revolution.” Once we establish a new way of doing things, we can get stuck in our ability to receive new ideas.

In sticking to our guns, we might even restrict the liberties of others. Yes, we have good intentions. We believe we are fighting to create or conserve a “better” society. But better for whom?

The only good society is one that allows all members to speak and be heard, regardless of how much power they have or how little they toe the line. This applies to small groups as much as entire nations.

So whether we call ourselves liberal or conservative, progressive or radical, anarchist or socialist, religious or humanist, or none of the above – whatever we call ourselves, we need to strive not to reduce others to something less than human. As if we’ve read their hearts and minds and distilled the contents down to tidy, facile labels.

Anyone can have a good idea. Find the common ground. Listen and learn. If we engage life assuming we don’t have all the answers, we’ll end up wiser in the long run. And nobody has to get hurt along the way.


On Not Knowing

In my last original post, I mentioned that I’d heard water has memory, but I didn’t know enough to judge the claim either way.

I did some digging, and it turns out that the water studies were not conducted in a scientific way, nor were they peer reviewed. There’s no way to substantiate any of the claims made. In other words, pseudoscientific mumbo jumbo. Good to know.

Rory the Tiger, confused – Artwork by Ruby Wang

Now that that’s been cleared up, let me turn to the main point of this post. I want to reiterate something I brought up previously: it’s both extremely uncomfortable and incredibly rewarding to admit that we don’t know something.

I think we internalize this discomfort with not knowing from a young age. We learn to only speak when we know the answer.

Here is a good place to practice what I’m advocating: I don’t know how people in other parts of the world feel about saying they don’t know something. Or even here, really, so I’m speculating based on things I’ve read and my own observations in North America and Europe.

Within this realm, parents and older siblings often chide children for asking too many questions. It’s annoying and time-consuming, and half the time they probably don’t know the answer themselves. That can be uncomfortable, especially for parents, who are socially responsible for their children’s education. If they don’t know what information to give their child, have they failed in their role?

While there are many outstanding teachers in the world, many are forced to silence curious children. The goal of public education is to get children to pass standardized tests and learn the required material laid out in a narrow curriculum based on memorization, rather than research and discovery.

This trend of consuming information instead of seeking and building knowledge continues into adulthood. At work, seeming unsure might lead clients, bosses, and coworkers to distrust your abilities. Saying you don’t know merits chastisement, even to the point of getting fired for “incompetence.” (I’m not saying there aren’t actually incompetent people, just that learning is a process that takes time and acceptance of errors.)

Instead of providing training and allowing employees to grow into their positions, many companies – including those in the arts – want their new recruits to come fully formed, with years of experience and lots of energy. When David Bowie put out his first album, no one expected a hit, but the label gave him the benefit of the doubt and let him explore his sound in the studio over multiple subsequent recordings. Nowadays, if you don’t sell enough copies right off the bat, you risk being cut from the label.

So we feel that it’s uncomfortable and inhibiting to admit we don’t know something. Yet in reality, no one can claim to know everything. If we can start accepting this condition in ourselves and others, we may build a more honest world.

We will also start to recognize our own strengths and to value those of others. If I can admit that I don’t know something about medicine or physics or politics, then I can find someone who does know. In this way, I can become smarter and more informed.

We will all become better at discerning good information from bad if we practice finding out what we don’t know. It’s not enough to click on the first Google hit. SEO, paid ads, and popularity play a huge role in what’s at the top; facticity and accuracy do not.

It’s also not enough to read a few lines or even a few paragraphs on a topic and call it “learning.” Most things that are “known” are way more complex than a simple explanation will suggest. It’s always good to examine opposing views, because someone can be very convincing without being right. It’s also important to look at the person’s claim to authority, which can mean many different things. Their positionality in the world (which social labels they fall under, where they live, and many other factors) will influence their perspective as well.

Learning through discovery is hard, much harder than reading a “fact” and then regurgitating it. But we don’t really know anything until we’ve looked at it closely from many angles and sat with it for some time. Otherwise, we’re just repeating words that someone else said. We can’t actually own the knowledge as a personal intellectual asset. We’ve accepted information as true without having any understanding of what makes it so.

Sometimes new information emerges and what we previously “knew” becomes obsolete, shown to be incorrect. Other things can change from moment to moment, place to place, person to person. Democracy doesn’t just mean one thing; neither does religion, or even a specific type or subtype of religious belief. Sufi Islam is very different from the Sunni persuasion. Pagans vary across the board. And of course individual circumstances affect interpretations and practices, so you can never come up with a timeless, immutable law of Sufi practice or any other human endeavour.


I don’t know a lot of things. When I write a post expressing an opinion, I often second-guess the validity of my claims. In other parts of my life, I try to present myself as knowledgeable whenever possible; obviously I’m not immune to the socialization process. I also tend to simply remain quiet when I don’t know something, hoping that someone else will fill in the blanks.

However, training in research methods has spilled over into my daily life. The first thing we all should know is how to find solid, valid information. Then we should know how to examine it, test it, shake it up, and see if it still stands.

Of course, we all have limited time and resources at hand, including life experience or education in various subject areas. To some, something might seem obviously untrue, while someone else might be convinced by its apparent value and what looks like supporting evidence. We also need to respect this type of difference (without being afraid to challenge another person’s viewpoint, of course).

We can’t all know the same things, no matter how hard we try. In this light, it seems worthwhile to question our own received knowledge and try to really understand why we think we know what we do. This approach just might help us all get smarter, while simultaneously producing the joy of discovery and the solace of mutual respect.

Free Speech and the Right To Not Be Offended

The right to free speech has been invoked to defend all sorts of things, from promoting drugs for off-label uses to parodying trademarks. At the same time, people who make insulting or simply inaccurate statements sometimes expect that freedom of expression means that everyone must let those statements slide.

I’ve encountered this attitude many times. It goes something like this:

“You can’t tell me I shouldn’t say x because it’s offensive or untrue. I have my own truth and I’m allowed to say whatever I want.”

It’s a contradictory idea, of course: “I have freedom of speech, but you don’t have the freedom to say you disagree with me.” The whole point of free expression is to allow dissent. It was conceived as a legal protection to prohibit government crackdowns on citizens who protest or even just hold alternative views.

Not all speech is protected under these laws, and the extent of protection varies from country to country. For a number of reasons, religion seems to fall into a whole separate category – in many cases, the freedom to practice your religion has its own legislation.

Socially, too, criticizing someone’s religion raises more hackles than criticizing their political or lifestyle choices. A magazine can make fun of someone’s vegan diet or views on immigration, but that freedom seems to stop with religion.

Outside of a country’s laws, no one has the right to impose a religion on someone else. Some versions of secularism (the division of church and state) mean everyone can publicly practice their religion, while others lean more toward limiting religious expression to private occasions. Either way, it is illegal for the state to mandate individual religion in secular countries. This is what freedom of religion means.

In places with religious government, everyone has to respect and even appear to practice the national religion. To many, the most obvious examples are Iran and Saudi Arabia, but Myanmar also punishes citizens for even suggesting disrespect for Buddhism. Earlier this year, the country put three people in jail for an advertisement showing the Buddha wearing headphones.

According to the New York Times, the court convicted them because the advertisement “offended the majority religion in the country,” which violates the country’s religion act.

This incarceration shows that Buddhism is not just about a jolly fat guy who thinks everybody should live in peace. But more importantly, it reinforces a dangerous notion made apparent in the aftermath of the Charlie Hebdo attacks: people who hold religious beliefs have the right to not be offended. Not only that, but if they are offended and they take revenge, it is at least partly the fault of those who committed the offence.

Of course, ethically, people probably shouldn’t offend other people just because they can. Ideally, all criticism should be constructive and based on facts. However, and here’s the key to all this, people are allowed to offend other people, no matter what their motivation – even if they just want to be annoying. Yes, it’s childish and irritating and unproductive. But that’s not the point. Freedom of speech only works if we protect speech that we disagree with.

No matter what we might think of the Charlie Hebdo cartoonists’ crude drawings, we have to recognize their right to publish them. Religious belief does not exempt anyone from human rights law, nor does it shield anyone from being offended. If it did, we would have to say that it’s okay for Myanmar Buddhists to imprison people for depicting the Buddha in a comical way, while extremist Christians are right to blow up abortion clinics (which, by the way, is the biggest terrorism threat in the United States).

For good reasons, secular nations and the United Nations Declaration of Human Rights have made many religiously sanctioned acts illegal, from honor killings (murder) to spousal disciplining (abuse).  Hate speech, or speech that incites people to violence, is also banned in many places, although its limits tend to be a bit fuzzy.

Freedom of expression is not a one-way street (Photo: Ryan McGuire)

In the end, legislation is an attempt to stop people from harming other people. Freedom of expression protects us from fear of political retaliation, so that we can all contribute to a discussion on how to make the world a better place. Other laws forbid person-to-person violence, including as an act of revenge.

Free speech doesn’t justify structural, physical, or any other kind of violence.  And it certainly doesn’t protect us from getting our feelings hurt.

Pope Francis Announces What We Already Knew

Lots of people have been getting excited about Pope Francis. He seems moderate and progressive, a humanitarian Christian voice in a world plagued by religious extremism. Recently, instead of staying for dinner with politicians, he decided to eat with some of the homeless in Washington, DC.

That’s a great action and he seems like a decent person. He’s made waves by refusing to ride a bulletproof Popemobile and speaking out about climate change. What’s not to like?

He even went so far as to accept evolution and the Big Bang theory. Good for him. He has caught up with the rest of us.

Except not quite. According to The Independent, his acceptance of these theories relies on the fact that they necessarily incorporate a creator – that they don’t work without intentional design:

“The Big Bang, which today we hold to be the origin of the world, does not contradict the intervention of the divine creator but, rather, requires it.

“Evolution in nature is not inconsistent with the notion of creation, because evolution requires the creation of beings that evolve.”

These statements reveal how little he understands evolution. If “beings” automatically require a creator, then nothing can exist without something else existing to design it.

So wouldn’t the creator also require a creator and so on ad infinitum? This is an example of the infinite regress fallacy.

Okay, so maybe he’s imposing God on a theory that pretty much negates the possibility of a creator. But he’s a Catholic, so of course, he’s going to find a way to work God into proven scientific facts, right?

He’s entitled to express his opinion, even if it is based on a fallacy. What I don’t understand is why people applaud him for announcing a distorted, unfounded version of what scientists have already been telling us for a long time. Is Catholicism so far behind on the facts that even inaccurate science has become worthy of praise?

Then we have Elton John, the famous gay musician, calling Pope Francis his “hero” for promoting gay rights in the Catholic Church.  The Advocate, a gay rights magazine, named the pope Person of the Year, allegedly for nudging the church in the direction of greater tolerance and inclusion of the LGBT community.

Wouldn’t that be great if it were true? Sadly, before becoming pope, Francis (then known as Archbishop Jorge Bergoglio) spoke out against legalizing gay marriage, calling it “an attempt to destroy God’s plan.”

People seem to have misunderstood what he meant by his now-famous quote, which appeared on the cover of The Advocate:

“If someone is gay and seeks the Lord with good will, who am I to judge?”

Of course, I can’t say what’s in his heart, but judging by his track record and other comments he has made, this statement seems more like a reference to casting the first stone only if you’re without sin yourself. He hasn’t actually said that homosexuality is not a sin, just that Catholics should stop judging others. For a more detailed analysis of the Vatican’s current stance on homosexuality, check out this insightful article from TIME. It’s not as radical as you might think.

The fact that people get so excited about this guy shows how restrictive and judgmental the Catholic clergy often are, not to mention hypocritical. At least Pope Francis appears to practice what he preaches.

It’s like the difference between a two-year-old drawing a stick person and an adult doing the same: the feat seems more impressive when the artist isn’t fully developed.

I guess the same goes for the church taking baby steps. We’re so impressed by this pope that we forget how completely Catholicism would have to reinvent itself if it wanted to achieve any kind of progressive status.

Saudi Arabia, Land of Human Rights

“Farasan Island 3” by Bandar Yuosef – Flickr: Farasan Island_0392. Licensed under CC BY 2.0 via Wikimedia Commons – https://commons.wikimedia.org/wiki/File:Farasan_Island_3.jpg#/media/File:Farasan_Island_3.jpg

Last year, all atheists became terrorists – at least according to new laws introduced in Saudi Arabia. Non-believers aren’t alone though: anyone who criticizes the state, its rulers, or the Saudi version of Islam could be charged with terrorism. The Penal Law for Crimes of Terrorism and its Financing explicitly includes non-violent acts, effectively prohibiting any semblance of free speech, association, or religion.

The International Humanist and Ethical Union explains some of the key terms of the legislation:

The provisions of the “terrorism” law define and outlaw numerous acts and forms of expression as “terrorism”, including:

  • “Calling for atheist thought in any form”

  • any disloyalty “to the country’s rulers”, or anyone “who swears allegiance to any party, organization, current [of thought], group, or individual inside or outside [the kingdom]”;

  • anyone who aids, affiliates, or holds “sympathy” with any “terrorist” organization as defined by the act, which “includes participation in audio, written, or visual media; social media in its audio, written, or visual forms; internet websites; or circulating their contents in any form”;

  • contact with groups or individuals who are “hostile to the kingdom”

  • and the article on “Seeking to shake the social fabric or national cohesion” prohibits all protest, without qualification as to its message or intent, by outlawing “calling, participating, promoting, or inciting sit-ins, protests, meetings, or group statements in any form”.

Additionally, apostasy (denying Islam by adopting another faith or becoming an atheist) is punishable by death. The International Business Times states that 100 people have been put to death already this year, in compliance with laws prescribing capital punishment for “murder, rape, armed robbery, using recreational drugs, and smuggling, in addition to homosexuality, false prophecy, apostasy, adultery, witchcraft and sorcery.”

Raif Badawi is a case in point: he’s a Saudi Arabian blogger sentenced to ten years in prison and a thousand lashes for political criticism. It sounds crazy, and it is, yet he is only one example of extreme corporal punishment among countless others that remain invisible to the international community.

With this lovely human rights record, Saudi Arabia somehow remains a full member of the United Nations. Not only that, but one of its representatives was quietly selected in June to head a panel of independent experts on the UN Human Rights Council.

This appointment followed on the heels of Saudi Arabia’s job opening for eight new executioners, described in the ad as “religious functionaries” working in the civil service, according to The Independent.

It’s like putting the head of ISIS in charge of human rights. Actually, the folks at UN Watch say that Saudi Arabia has beheaded more people than the famous extremist group this year.

Somehow, this is the real world, where farce sometimes merges with tragedy.

Science Is Not Truth

How do we know what is true? It’s an age-old question that hasn’t been fully resolved.

We do know that evaluating evidence and recognizing the role of subjectivity are part of the most reliable approach we’ve discovered so far in our trajectory as a species.

Generally, we call this approach “science.” But science is not the same as truth.

A few years ago, the American Anthropological Association (AAA) removed the word “science” from its description of the discipline of anthropology. Unsurprisingly, this act created an uproar, with partisans arguing for and against defining anthropology as a science.

Unfortunately, a lot of them seemed unable to articulate what science really is.

It’s a common mistake and one you will encounter in other areas. When people debate the validity of science, we tend to take for granted that everyone knows what the term means. Fun fact: we don’t.

You may have come across the phrase “the scientific method.” This phrase provides a clearer indication of what’s going on than the single word “science.” “Science tells us that the Earth orbits the sun” would become “The scientific method tells us that the Earth orbits the sun.” It’s a subtle difference, but revealing nonetheless.

The first statement implies some sort of oracle or god (“Science, goddess of the Sky”) revealed an absolute truth. In the second version, we understand that a rigorous process was involved.

In the early twentieth century, anthropologist Alfred Kroeber (drawing on his teacher Franz Boas) wrote down three principles of science:

  • The method of science is to begin with questions, not with answers, least of all with value judgements.

  • Science is dispassionate inquiry and therefore cannot take over outright any ideologies “already formulated in everyday life”, since these are themselves inevitably traditional and normally tinged with emotional prejudice.

  • Sweeping all-or-none, black-and-white judgements are characteristic of categorical attitudes and have no place in science, whose very nature is inferential and judicious.

Kroeber’s teacher, Franz Boas, divided science into two branches, which he called the general sciences and the historical sciences. For him, the general sciences try to discover universal laws, while the historical sciences uncover the processes behind things that happen only once, usually in a specific time and place. The social sciences, including history, linguistics, sociology, and anthropology (among many others), fall into the latter category.

Both use the scientific method. Although the type of knowledge they produce is not identical, both the general sciences and the historical sciences contribute “true” ideas to the collective body of knowledge.

I put “true” in quotation marks because scientists of all stripes constantly work to disprove accepted theories. When evidence supports a theory, that’s great, but it’s more important to see what other evidence might disprove the theory. In this way, while it is a challenge to conclusively prove a scientific theory beyond any possibility of dispute, it can be quite simple to disprove the same theory. This is how we know that vaccines don’t cause autism (a disproven theory) and that gravity is likely more of a push than a pull (a theory modified from its original version).

As The Skeptical Raptor suggests, evidence against a hypothesis is more powerful than evidence in favour of it. In other words, if you have three bits of evidence in favour of a theory and only one against it, the theory is wrong. It might not be entirely wrong (as in the gravity example above), but something about it needs to be changed to more accurately represent what is happening.

This is why so-called scientific laws are not carved in stone. Just like national laws, they can change.

Scientists face a lot of criticism when they announce that they were wrong about something, or when they refuse to state anything with 100% certainty.

But what they are trying to say is that, while there is no absolute truth, there are degrees of validity.

Additionally, a theory is only based on the scientific method if people can imagine a way to test or disprove that theory.

If there is no way to test it, then there is also no way to prove it or argue it either way. We’ve left the realm of science and entered the realm of philosophy (which, by the way, has a lot of value in its own right).

One of psychoanalyst Sigmund Freud’s more infamous theories says that all girls experience penis envy, recognizing the male sex organ as superior to their own. When challenged with evidence that went against his theory, namely that many girls and women say they do not have penis envy, Freud simply asserted that they were in denial.

This is an example of how logic can fail us by denying or explaining away contradictory evidence. Evidence matters more than logic: the hypothetical plausibility of a theory based on logic alone says little about how a process might actually play out in the real world, since we can imagine all kinds of logically possible beings, processes, and events. Both Freud’s theory and his explanation of the contradictory evidence sound plausible. But as soon as he explains away the gaps with circular reasoning that draws on no evidence, the theory stops being scientific. It can no longer be falsified: there is no way to test it, because nothing observable is at stake. We are forced to toss the whole thing out the window.

Real scientists and seekers of knowledge are always trying to disprove their own theories. They don’t need something to be right just because they have always thought it must be true. Instead, they constantly re-examine their assumptions and come up with new ways to disprove existing explanations.

Of course, if people test a theory for decades and centuries with no successful disproof, then the theory generally becomes established as fact. In careless everyday speech, many people, including scientists, will say it is therefore “true,” but I would recommend being more precise. Otherwise, we risk obscuring that in this context the word “true” simply means “rigorously tested but never disproven.” If contradictory evidence surfaced, we would accept the error in the original theory.

Science requires a flexible mind, as well as an acceptance of uncertainty.

On a final note, I use the word “disproof” above, which is probably a more recent coinage than the word “proof.” Originally, to “prove” something meant to test it, as in the related word “probation.” Knowing this helps us understand what the scientific method means by proving something is true – it actually means the opposite of what we might think: you can only prove a theory by seeking to show it is false, yet failing to find evidence against it.