Science Isn’t Everything

Artists often desire social recognition for their contribution to society, recognition that might come in the form of funding, validation, or visibility. Teachers do the same, perhaps striking to foster awareness of the constraints within which they conduct their work. Even politicians have to convince the population that their efforts are worthwhile; otherwise, they don’t get reelected.

Increasingly, scientists, too, seek legitimacy for their work. The March for Science on April 22, 2017, had this claim to legitimacy as one of its goals: that politics should take scientific knowledge into account when making decisions.

Image: E=MC^2, photo by Christopher Michel, CC BY 2.0, https://commons.wikimedia.org/w/index.php?curid=24810134

Here’s the rub: science changes its mind, and sometimes scientists (including social scientists) can’t seem to agree on what constitutes scientific rigour and scholarly success, or even how to interpret results. This apparent conflict does nothing for the public perception of an endeavour genuinely plagued by practical and social problems, from funding to publishing pressures.

Most people don’t seem to understand the scientific method. In fact, most people don’t seem to understand that science is in fact a method, not an object. Yes, the word “science” can be used as shorthand for “knowledge garnered through scientific research,” but I suspect most serious scholars would claim that the power of science lies in its principles, not in the body of knowledge itself.

This is why social science is a science, even though many studies are neither replicable nor generalizable, two cornerstones of what we might think of as science. That’s not a flaw in the method; it’s a limitation in the object of study. People and culture and society are complex, shape-shifting objects.

Philosopher of science Ian Hacking coined the term “looping effect” to denote the process whereby ideas about how society or identity works directly influence how society or identity actually works. New studies of these things modify the scientific or social consensus, which then further influences how people view or experience them. And so on, ad infinitum.

In one example, certain trends become apparent through research about what it means to be a refugee. Identifying people as refugees only becomes possible once the notion of a refugee becomes socially widespread. People thus identified become aware of what it means to be a refugee through how that identity is constructed by institutions and their fellow refugees. They will have multiple social scripts available to them, but those scripts are shaped by the culturally accepted notion of “refugee-ness.”

Science is like this. It is messy and some things change. Other things don’t. For instance, some scientific disciplines have a condition humorously known as physics envy. Physics doesn’t really change and it can be measured with mathematical precision, no matter how social conditions and individual biases might shift. (I’m talking about the actual workings of physics, not the study and practice of physics, which of course change with societal variations.) In many other fields, including so-called “hard” sciences like biology or psychology, the actual facts might mutate along with external conditions. Life expectancy lengthens with improved hygiene, education, and access to healthcare. Menstruation starts at a younger age with a change in diet. Being LGBTQ+ is a risk factor for depression, due to antipathetic cultural norms.

Image: Homicide rates for select countries, 2012, by Rcragun (own work), CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=34549866

These factual changes are uncomfortable. Something can be true only under specific conditions, but we imagine that it is always true, because it is currently true or it has been true in our experience. People might think science is just making an “educated guess” based on preexisting ideas. To a certain extent, that is true, because scientists are people and we are all biased and prone to errors in thinking. In fact, however, in science, an educated guess is called a hypothesis and it is the starting point, not the end point, of research. The hypothesis guesses at the outcome based on prior results, but the study’s purpose is to attempt to disprove the hypothesis. In other words, what we think is likely true needs to be tested and challenged and disputed in as many ways as possible and as many times as possible before we can accept it as true. Scientists call this process “falsification.” They are trying to prove their ideas wrong, not right.

Unlike many others, (good) scientists of all stripes welcome criticism. Scientists want to improve their craft. They want to know what they are collectively doing wrong, so that they can do it better.

Obviously, individual scientists with ego issues or tunnel vision might not welcome criticism, especially from another field, but overall, the scientific community aims to home in on all the factors that go into knowing reality. These factors include subjectivity, interdisciplinary awareness, cross-cultural understanding, and myriad other intangible ingredients. Good scientists welcome the challenge.

The rest of the world looks at them and thinks, “Man, they keep changing their minds. They keep making mistakes. They aren’t paying attention to what’s going on over here.” And so on. Those are true statements. But instead of rejecting all science because of these issues, let’s try opening up a conversation. Let me say it again: good scientists will welcome the criticism and change tack accordingly.

Do scientists have all the answers? Of course not. Certainly not as individuals, and even collectively there is much yet unknown. And some things can’t be known through science.

Science is flawed and it is imperfect and it is sometimes wrong. But it is the best method we have for tackling large-scale problems, like polio or religious war or climate change or the origins of humanity, and for figuring out how things work, like weather, socialization, affection, and bridge stability.

We absolutely need social science and philosophy and art and literature and personal experience to live meaningful lives, and even sometimes to give us personal answers. But cutting out science means ignoring an intimate, integral part of our humanity. We have an innate and cultivated desire to know and understand, an urge to observe and reason, and an impulse to test our ideas against those of others to determine what might be true.

Science is not everything. But it sure is something.

Dealing with Stupid People

It’s easy and fun to call people names. Slotting them into categories lets us guess what they think without them telling us. We can shut down intolerant babble before it begins. In promoting tolerance, why should we listen to perspectives that do the opposite?

Image: I'm with Stupid, by Kevin Marks, CC BY 2.0 (http://creativecommons.org/licenses/by/2.0), via Wikimedia Commons

Okay. But when we refuse to hear people out because their opinions promote intolerance and even oppression, we are gleefully engaging in the very behaviours we claim to condemn: silencing, excluding, and imposing our own subjective position as a morally objective standpoint.

Name-calling happens across any divide. At its core, the very notion of the divide lays the foundation for separation, even segregation. Can we build a world with no divide?

I think that world already exists, if we would just help it emerge. We construct and reproduce the divide ourselves. We talk and behave as if seven billion individual views can effectively be split into two or three camps: you’re either this or that or the other thing. Anyone who resents being pigeonholed knows that such a division hardly represents reality.

Collectively, we like to reinforce the notion of the divide. New Atheists dismiss the “regressive left” as harmful because it promotes cultural tolerance. Those who advocate cultural relativism point fingers at “evangelical” atheists who steamroll diversity in their push for rational solutions to social problems.

In the US, Republicans say Democrats are living in “la la land,” while Democrats call Republicans “racist” and “uninformed.” The same dynamic happens elsewhere, and not just in politics. We seem to think people who don’t agree with us are stupid or unaware. If only they knew what we know, they would inevitably come to the same conclusions we do.

In 2005, writer David Foster Wallace gave a commencement speech to Kenyon College’s liberal arts graduates. He argued that the benefit of a liberal arts education lies in its ability to teach us to choose how we think and what we pay attention to. It pushes us beyond our default stance, the self-centered positionality we all grapple with.

Of course it’s hard to get out of our own heads when that’s where we live. Doing so requires coaching and effort. Sometimes, we’ll fail and that’s okay too.

The goal is to try and try again. Recognize our fallibility. Embrace uncertainty.

Hannah Arendt famously said, “The most radical revolutionary will become a conservative the day after the revolution.” Once we establish a new way of doing things, we can get stuck in our ability to receive new ideas.

In sticking to our guns, we might even restrict the liberties of others. Yes, we have good intentions. We believe we are fighting to create or conserve a “better” society. But better for whom?

The only good society is one that allows all members to speak and be heard, regardless of how much power they have or how little they toe the line. This applies to small groups as much as entire nations.

So whether we call ourselves liberal or conservative, progressive or radical, anarchist or socialist, religious or humanist, or none of the above – whatever we call ourselves, we need to strive not to reduce others to something less than human. As if we’ve read their hearts and minds and distilled the contents down to tidy, facile labels.

Anyone can have a good idea. Find the common ground. Listen and learn. If we engage life assuming we don’t have all the answers, we’ll end up wiser in the long run. And nobody has to get hurt along the way.

It’s the culture, stupid! Or is it?

Bringing complexity to the Cologne attacks.

Reblogged from Eriksen's blog:

The events in Cologne have sparked controversies across Europe. This time, the topic is not the economic and social costs of the refugee crisis, but questions concerning culture and gender. We need a proper language in which to address these issues.

A shorter version of this article was published in Norwegian in Morgenbladet on 15 January 2016.

There is no simple answer as to what exactly happened in Cologne on New Year’s Eve. There was a large number of people partying in the city centre, in varying states of intoxication, and no version of the events is the only valid or possible one. You and I may easily perceive and interpret identical situations in quite different ways, even if we were both present. But a few facts seem indisputable. A rather large number of young men (a few hundred? a thousand?), most of them with a background in Arabic-speaking countries…


Lived Realities and the Concept of Race

Image: Etno icon, from Wikimedia Commons, User: Xil

Some people perceive women to be inferior to men and sexually available to whoever wants to have a go at them. They then behave as if this were true. Women’s inferiority is a “lived reality” to those people. Yet nobody says, “We need to acknowledge women’s inferiority and sexual availability because we need to talk about rape.” To get at the root of the problem, we need to understand that some people believe that women are inferior and/or sexually available, not that this is factually true. It might even be useful to question the meaning of a category like “women” (which many people have done more or less successfully).

On a related note, the divine right of kings used to be a “lived reality,” meaning that people (including the kings) experienced it as real and lived their lives accordingly. Yet nobody today would argue that we have to acknowledge the divine right of kings in order to understand historical political systems or even how people experienced them. No, we would simply agree that people believed in the divine right of kings and behaved as if it were real, thereby contributing to their own oppression and that of others.

Thus far, we can agree.

Many people perceive arbitrarily divided groups of people to be different based on physical characteristics. Somehow, our logic changes in this case and we come to the conclusion that, “We have to acknowledge race.” We don’t. We have to acknowledge racism.

By acknowledging race as a useful ontological category, we are telling the racists they are right, that there are concrete, observable differences that indelibly separate groups of humans from one another in insurmountable ways. We are saying that lines can be drawn to distinguish these groups in clear-cut ways. We are saying there is an objective reality to the perception of race. We are saying “Race is real, but we shouldn’t treat people differently because of it.” With good intentions, we are trying to make racist views less damaging, instead of rejecting them altogether. In short, we are accepting racism as a valid worldview, just one that needs to be tweaked.

If we acknowledge that some people believe in race, and that this belief creates problems, without accepting the premise of race ourselves, then we refuse to give any credence to the underlying assumption of racists. We already have useful concepts that work much better than “race.” As a starting point, I suggest “genetic ancestry,” “historical community,” or “shared cultural experience.”

Can we really assert that a right-leaning politician in India has much in common with a retired performance artist in the United States just because they have the same skin tone? Does a social worker in Ireland readily relate to a Zimbabwean farmer who fled to Zambia during the civil war? My guess is no. “Race” has become shorthand for too many conflicting ideas. If we want to talk about oppression and lived realities, we need a better, more precise vocabulary to say what we really mean.

Outrage and Compassion

The world is full of outrage.

It’s normal to get upset when we see someone being treated unfairly, even more so when an entire group of people is oppressed by a system founded on prejudice. The right thing is to speak up and try to make the world a better place.

Social media can be an echo chamber, but it can also be a space for encountering alternative views. Unless we unfriend or unfollow everyone who disagrees with us, we can open ourselves up to a greater awareness of how others think.

I know as well as anyone that some people’s views are unpalatable and hard to deal with day after day. Racism, sexism, homophobia, transphobia, xenophobia, ageism, classism, fundamentalism, and a million other things can catch us off guard. Sometimes we’re tired and really just want to look at some cute animals or read the news about our favorite show. Some days, we just want it to go away. We can choose to ignore it or hide it. That’s okay. Contrary to popular belief, silence doesn’t always mean acceptance.

On other occasions, we may be fired up enough to challenge the view with reasoned arguments and solid evidence. We won’t stop until the person admits their view is wrong.

Unfortunately, in the midst of our activist zeal, we sometimes forget about compassion.

Despite what absolute relativists say (if such people really exist, which is doubtful), some views are more valid than others, because they are based on evidence and analysis and experience. I’m not saying that we should respect all views equally, regardless of who they might harm. But I do think that we should respect all people – if not equally, at least to a minimum degree.

There’s a difference between saying “Your view is wrong” and “You’re a dumb person.” It’s also unfair to assume we know how a person feels about an issue (“You shouldn’t be so angry about this”) if they haven’t told us (maybe they’re not angry at all). And telling people that how they feel is wrong and that they should feel some other way is about as unhelpful and unproductive as we can get.

Additionally, we should avoid slotting people into categories because of a single aspect of their opinion. The thought process goes something like this: “This person doesn’t like homosexuality, and in my experience homophobic people are generally on the right. Therefore this person is on the right and must also be a creationist Christian, fiscally conservative, and more concerned with security than equality.” Wrong. In the Netherlands, for example, the right is not necessarily religious, and many on the right openly support gay rights.

Our biases are just as biased as anyone else’s.

Any view must undergo a lot of scrutiny to prove its worth and staying power. Our own views are vulnerable to logical fallacies and misinformation, just like other people’s. We need to recognize this before jumping on our high horse. Questioning our own position will help rein in any tendency toward arrogance we might have.

At the same time, we need to remember that not everyone has had access to intellectual training or positive mentors or accurate information. Many of our opinions come from emotional experiences, not facts, and those experiences and emotions need to be acknowledged, even if the conclusions are problematic.

Finally, though, the most important thing is that we remember that very few people are bad. At some point, Hitler was an aspiring artist who was kind to dogs. Instead of always focusing on what divides us, we might get further by trying to figure out what connects us. By finding common ground, we will be able to see our shared humanity and trigger empathy.

We can’t expect other people to behave more empathetically toward people they don’t agree with if we can’t do it ourselves. Let’s practice compassion whenever we can. After all, at its root, social justice is about people being nicer to each other. Maybe we can start by being nicer ourselves.

On Not Knowing

In my last original post, I mentioned that I had heard that water has memory, but that I didn’t know enough to decide either way.

I did some digging, and it turns out that the water studies were not conducted in a scientific way, nor were they peer reviewed. There’s no way to substantiate any of the claims made. In other words, pseudoscientific mumbo jumbo. Good to know.

Image: Rory the Tiger, confused – Artwork by Ruby Wang

Now that that’s been cleared up, let me turn to my main point for this post. I want to reiterate a point I brought up in that previous post: it’s both extremely uncomfortable and incredibly rewarding to admit that we don’t know something.

I think we internalize this discomfort with not knowing from a young age. We learn to only speak when we know the answer.

Here’s a good place to practice what I’m advocating: I don’t know how people feel about admitting they don’t know something in other parts of the world. Or even here, really, so I’m speculating based on things I’ve read and my own observations in North America and Europe.

Within this realm, parents and older siblings often chide children for asking too many questions. It’s annoying and time-consuming, and the parents and siblings probably don’t know the answer half the time. The situation can be uncomfortable, especially for parents, since they are socially responsible for their children’s education. If they don’t know what information to give their child, have they failed in their role?

While there are many outstanding teachers in the world, many are forced to silence curious children. The goal of public education is to get children to pass standardized tests and learn the required material laid out in a narrow curriculum based on memorization, rather than research and discovery.

This trend of consuming information instead of seeking and building knowledge continues into adulthood. At work, seeming unsure might lead clients, bosses, and coworkers to distrust your abilities. Saying you don’t know can merit chastisement, even to the point of getting fired for “incompetence.” (I’m not saying there aren’t actually incompetent people, just that learning is a process that takes time and acceptance of errors.) Instead of providing training and allowing employees to grow into their positions, many companies – including those in the arts – want their new recruits to come fully formed with years of experience and lots of energy. When David Bowie put out his first album, no one expected a success, but his label gave him the benefit of the doubt and let him explore his sound in the studio for multiple subsequent recordings. Nowadays, if you don’t sell enough copies right off the bat, you risk being cut from the label.

So we feel that it’s uncomfortable and inhibiting to admit we don’t know something. Yet in reality, no one can claim to know everything. If we can start accepting this condition in ourselves and others, we may build a more honest world.

We will also start to recognize our own strengths and to value those of others. If I can admit that I don’t know something about medicine or physics or politics, then I can find someone who does know. In this way, I can become smarter and more informed.

We will all become better at discerning good information from bad if we practice finding out what we don’t know. It’s not enough to click on the first Google hit. SEO, paid ads, and popularity play a huge role in what’s at the top; facticity and accuracy do not.

It’s also not enough to read a few lines or even a few paragraphs on a topic and call it “learning.” Most things that are “known” are way more complex than a simple explanation will suggest. It’s always good to examine opposing views, because someone can be very convincing without being right. It’s also important to look at the person’s claim to authority, which can mean many different things. Their positionality in the world (which social labels they fall under, where they live, and many other factors) will influence their perspective as well.

Learning through discovery is hard, much harder than reading a “fact” and then regurgitating it. But we don’t really know anything until we’ve looked at it closely from many angles and sat with it for some time. Otherwise, we’re just repeating words that someone else said. We can’t actually own the knowledge as a personal intellectual asset. We’ve accepted information as true without having any understanding of what makes it so.

Sometimes new information emerges and what we previously “knew” becomes obsolete, shown to be incorrect. Other things can change from moment to moment, place to place, person to person. Democracy doesn’t just mean one thing; neither does religion, or even a specific type or subtype of religious belief. Sufi Islam is very different from the Sunni persuasion. Pagans vary across the board. And of course individual circumstances affect interpretations and practices, so you can never come up with a timeless, immutable law of Sufi practice or any other human endeavour.

Image: Artwork by Ruby Wang

I don’t know a lot of things. When I write a post expressing an opinion, I often second-guess the validity of that statement. In other parts of my life, I try to present myself as knowledgeable as frequently as possible; obviously I’m not immune to the socialization process. I also tend to simply remain quiet when I don’t know something, hoping that someone else will fill in the blanks.

However, my training in research methods has spilled over into my daily life. The first thing we all should know is how to find solid, valid information. Then we should know how to examine it, test it, shake it up, and see if it still stands.

Of course, we all have limited time and resources at hand, including life experience or education in various subject areas. To some, something might seem obviously untrue, while someone else might be convinced by its apparent value and what looks like supporting evidence. We also need to respect this type of difference (without being afraid to challenge another person’s viewpoint, of course).

We can’t all know the same things, no matter how hard we try. In this light, it seems worthwhile to question our own received knowledge and try to really understand why we think we know what we do. This approach just might help us all get smarter, while simultaneously producing the joy of discovery and the solace of mutual respect.