Science Isn’t Everything

Artists often desire social recognition for their contribution to society, recognition that might come in the form of funding, validation, or visibility. Teachers do the same, perhaps striking to foster awareness of the constraints within which they conduct their work. Even politicians have to convince the population that their efforts are worthwhile; otherwise, they don’t get reelected.

Increasingly, scientists, too, seek legitimacy for their work. The March for Science on April 22, 2017, had this claim to legitimacy as one of its goals: policymakers should take scientific knowledge into account when making decisions.

[Image: E=mc², by Christopher Michel, CC BY 2.0, via Wikimedia Commons: https://commons.wikimedia.org/w/index.php?curid=24810134]

Here’s the rub: science changes its mind, and sometimes scientists (including social scientists) can’t seem to agree on what constitutes scientific rigour and scholarly success, or even how to interpret results. This apparent conflict does nothing for the public perception of an endeavour genuinely plagued by practical and social problems, from funding to publishing pressures.

Most people don’t seem to understand the scientific method. Indeed, most people don’t seem to understand that science is a method, not an object. Yes, the word “science” can be used as shorthand for “knowledge garnered through scientific research,” but I suspect most serious scholars would claim that the power of science lies in its principles, not in the body of knowledge itself.

This is why social science is a science, even though many studies are neither replicable nor generalizable, two cornerstones of what we might think of as science. That’s not a flaw in the method; it’s a limitation in the object of study. People and culture and society are complex, shape-shifting objects.

Philosopher of science Ian Hacking coined the term “looping effect” to denote the process whereby ideas about how society or identity works directly influence how society or identity actually works. New studies of these things modify the scientific or social consensus, which then further influences how people view or experience them. And so on ad infinitum.

Take refugees as an example. Identifying people as refugees only becomes possible once the notion of a refugee becomes socially widespread. People thus identified become aware of what it means to be a refugee through how that identity is constructed by institutions and by their fellow refugees. They will have multiple social scripts available to them, but those scripts are shaped by the culturally accepted notion of “refugee-ness.”

Science is like this. It is messy and some things change. Other things don’t. For instance, some scientific disciplines have a condition humorously known as physics envy. Physics doesn’t really change and it can be measured with mathematical precision, no matter how social conditions and individual biases might shift. (I’m talking about the actual workings of physics, not the study and practice of physics, which of course change with societal variations.) In many other fields, including so-called “hard” sciences like biology or psychology, the actual facts might mutate along with external conditions. Life expectancy lengthens with improved hygiene, education, and access to healthcare. Menstruation starts at a younger age with a change in diet. Being LGBTQ+ is a risk factor for depression, due to antipathetic cultural norms.

[Image: Homicide rates for select countries, 2012, by Rcragun, CC BY-SA 4.0, via Wikimedia Commons: https://commons.wikimedia.org/w/index.php?curid=34549866]

These factual changes are uncomfortable. Something can be true only under specific conditions, but we imagine that it is always true, because it is currently true or it has been true in our experience. People might think science is just making an “educated guess” based on preexisting ideas. To a certain extent, that is true, because scientists are people and we are all biased and prone to errors in thinking. In science, however, an educated guess is called a hypothesis, and it is the starting point, not the end point, of research. The hypothesis guesses at the outcome based on prior results, but the study’s purpose is to attempt to disprove the hypothesis. In other words, what we think is likely true needs to be tested and challenged and disputed in as many ways as possible and as many times as possible before we can accept it as true. Scientists call this process “falsification.” They are trying to prove their ideas wrong, not right.

Unlike many others, (good) scientists of all stripes welcome criticism. Scientists want to improve their craft. They want to know what they are collectively doing wrong, so that they can do it better.

Obviously, individual scientists with ego issues or tunnel vision might not welcome criticism, especially from another field, but overall, the scientific community aims to home in on all the factors that go into knowing reality. These factors include subjectivity, interdisciplinary awareness, cross-cultural understanding, and myriad other intangible ingredients. Good scientists welcome the challenge.

The rest of the world looks at them and thinks, “Man, they keep changing their minds. They keep making mistakes. They aren’t paying attention to what’s going on over here.” And so on. Those are true statements. But instead of rejecting all science because of these issues, let’s try opening up a conversation. Let me say it again: good scientists will welcome the criticism and change tack accordingly.

Do scientists have all the answers? Of course not. Certainly not as individuals, and even collectively there is much yet unknown. And some things can’t be known through science.

Science is flawed and it is imperfect and it is sometimes wrong. But it is the best method we have for tackling large-scale problems, like polio or religious war or climate change, for probing big questions, like the origins of humanity, and for figuring out how things work, like weather, socialization, affection, and bridge stability.

We absolutely need social science and philosophy and art and literature and personal experience to live meaningful lives, and even sometimes to give us personal answers. But cutting out science means ignoring an intimate, integral part of our humanity. We have an innate and cultivated desire to know and understand, an urge to observe and reason, and an impulse to test our ideas against those of others to determine what might be true.

Science is not everything. But it sure is something.


Lived Realities and the Concept of Race

[Image: Etno icon, from Wikimedia Commons, User: Xil]

Some people perceive women to be inferior to men and sexually available to whoever wants to have a go at them. They then behave as if this were true. Women’s inferiority is a “lived reality” to those people. Yet nobody says, “We need to acknowledge women’s inferiority and sexual availability because we need to talk about rape.” To get at the root of the problem, we need to understand that some people believe that women are inferior and/or sexually available, not that this is factually true. It might even be useful to question the meaning of a category like “women” (which many people have done more or less successfully).

On a related note, the divine right of kings used to be a “lived reality,” meaning that people (including the kings) experienced it as real and lived their lives accordingly. Yet nobody today would argue that we have to acknowledge the divine right of kings in order to understand historical political systems or even how people experienced them. No, we would simply agree that people believed in the divine right of kings and behaved as if it were real, thereby contributing to their own oppression and that of others.

Thus far, we can agree.

Many people perceive arbitrarily divided groups of people to be different based on physical characteristics. Somehow, our logic changes in this case and we come to the conclusion that “we have to acknowledge race.” We don’t. We have to acknowledge racism.

By acknowledging race as a useful ontological category, we are telling the racists they are right, that there are concrete, observable differences that indelibly separate groups of humans from one another in insurmountable ways. We are saying that lines can be drawn to distinguish these groups in clear-cut ways. We are saying there is an objective reality to the perception of race. We are saying “Race is real, but we shouldn’t treat people differently because of it.” With good intentions, we are trying to make racist views less damaging, instead of rejecting them altogether. In short, we are accepting racism as a valid worldview, just one that needs to be tweaked.

If we acknowledge that some people believe in race, and that this belief creates problems, without accepting the premise of race itself, then we refuse to give any credence to the underlying assumption of racists. We already have useful concepts that work much better than “race.” As a starting point, I suggest “genetic ancestry,” “historical community,” or “shared cultural experience.”

Can we really assert that a right-leaning politician in India has much in common with a retired performance artist in the United States just because they have the same skin tone? Does a social worker in Ireland readily relate to a Zimbabwean farmer who fled to Zambia during the civil war? My guess is no. “Race” has become shorthand for too many conflicting ideas. If we want to talk about oppression and lived realities, we need a better, more precise vocabulary to say what we really mean.

On Not Knowing

In my last original post, I mentioned having heard that water has memory, but I didn’t know enough to decide either way.

I did some digging, and it turns out that the water studies were not conducted in a scientific way, nor were they peer reviewed. There’s no way to substantiate any of the claims made. In other words, pseudoscientific mumbo jumbo. Good to know.

[Image: Rory the Tiger, confused – Artwork by Ruby Wang]

Now that that’s been cleared up, let me turn to the main point of this post, one I brought up previously: it’s both extremely uncomfortable and incredibly rewarding to admit that we don’t know something.

I think we internalize this discomfort with not knowing from a young age. We learn to only speak when we know the answer.

Here is a good place to practice what I’m advocating: I don’t know how people feel about saying they don’t know something in other parts of the world. Or even here, really, so I’m speculating based on things I’ve read and my own observations in North America and Europe.

Within this realm, parents and older siblings often chide children for asking too many questions. It’s annoying and time-consuming, and the parents and siblings probably don’t know the answer half the time. This can be especially uncomfortable for parents, since they are socially responsible for their children’s education. If they don’t know what information to give their child, have they failed in their role?

While there are many outstanding teachers in the world, plenty are forced to silence curious children. The goal of public education is to get children to pass standardized tests and learn the required material laid out in a narrow curriculum based on memorization rather than research and discovery.

This trend of consuming information instead of seeking and building knowledge continues into adulthood. At work, seeming unsure might lead clients, bosses, and coworkers to distrust your abilities. Saying you don’t know merits chastisement, even to the point of getting fired for “incompetence.” (I’m not saying there aren’t actually incompetent people, just that learning is a process that takes time and acceptance of errors.) Instead of providing training and allowing employees to grow into their positions, many companies – including those in the arts – want their new recruits to come fully formed, with years of experience and lots of energy. When David Bowie put out his first album, no one expected a hit, but his label gave him the benefit of the doubt and let him explore his sound in the studio over multiple subsequent recordings. Nowadays, if you don’t sell enough copies right off the bat, you risk being dropped by the label.

So we feel that it’s uncomfortable and inhibiting to admit we don’t know something. Yet in reality, no one can claim to know everything. If we can start accepting this condition in ourselves and others, we may build a more honest world.

We will also start to recognize our own strengths and to value those of others. If I can admit that I don’t know something about medicine or physics or politics, then I can find someone who does know. In this way, I can become smarter and more informed.

We will all become better at discerning good information from bad if we practice finding out what we don’t know. It’s not enough to click on the first Google hit. SEO, paid ads, and popularity play a huge role in what’s at the top; facticity and accuracy do not.

It’s also not enough to read a few lines or even a few paragraphs on a topic and call it “learning.” Most things that are “known” are way more complex than a simple explanation will suggest. It’s always good to examine opposing views, because someone can be very convincing without being right. It’s also important to look at the person’s claim to authority, which can mean many different things. Their positionality in the world (which social labels they fall under, where they live, and many other factors) will influence their perspective as well.

Learning through discovery is hard, much harder than reading a “fact” and then regurgitating it. But we don’t really know anything until we’ve looked at it closely from many angles and sat with it for some time. Otherwise, we’re just repeating words that someone else said. We can’t actually own the knowledge as a personal intellectual asset. We’ve accepted information as true without having any understanding of what makes it so.

Sometimes new information emerges and what we previously “knew” becomes obsolete, shown to be incorrect. Other things can change from moment to moment, place to place, person to person. Democracy doesn’t just mean one thing; neither does religion, or even a specific type or subtype of religious belief. Sufi Islam is very different from, say, Salafi Islam. Pagans vary across the board. And of course individual circumstances affect interpretations and practices, so you can never come up with a timeless, immutable law of Sufi practice or any other human endeavour.

[Image: B&W Happiness – Artwork by Ruby Wang]

I don’t know a lot of things. When I write a post expressing an opinion, I often second-guess its validity. In other parts of my life, I try to present myself as knowledgeable as often as possible; obviously I’m not immune to the socialization process. I also tend to simply remain quiet when I don’t know something, hoping that someone else will fill in the blanks.

However, training in research methods has spilled over into my daily life. The first thing we all should know is how to find solid, valid information. Then we should know how to examine it, test it, shake it up, and see if it still stands.

Of course, we all have limited time and resources at hand, including life experience or education in various subject areas. To one person, a claim might seem obviously untrue, while someone else might be convinced by its apparent plausibility and what looks like supporting evidence. We also need to respect this type of difference (without being afraid to challenge another person’s viewpoint, of course).

We can’t all know the same things, no matter how hard we try. In this light, it seems worthwhile to question our own received knowledge and try to really understand why we think we know what we do. This approach just might help us all get smarter, while simultaneously producing the joy of discovery and the solace of mutual respect.

Combating Terrorism with Cultural Relativism

Some of my otherwise favorite people scoff at the notion of cultural relativism. Usually, they are pointing out that we can’t tolerate human rights abuses just because “it’s their culture.”

I agree. We can’t tolerate abuse or violence or oppression based on some notion of culture as sacred and inviolable. But that’s not what cultural relativism means. In fact, cultural relativism is vital to combating the very things some people suggest it supports.

Let me explain. Cultural relativism comes from anthropology, which just happens to be something I have a clue about.

I’m a social or cultural anthropologist, depending on where in the world you live. Personally, I prefer “social anthropologist,” because many people seem to think “culture” is a bounded, predetermining, static force. I think of it more like an ongoing process.

Culture is what we do, not what we have. Culture exists through the interactions of people, the ways we think about things, and how we express ourselves, individually, collectively, and systemically.

Culture shapes us even as we shape culture. As a group, we engage in cultural innovation through creativity and agency as much as we reinforce conventions through ritual or conformity.

[Image: Anthropologist Ruth Benedict]

So what is cultural relativism and what is it good for? The notion came into the spotlight largely through the work of anthropologist Franz Boas and his students, particularly Ruth Benedict. In her book Patterns of Culture (1934), Benedict argued that we can’t understand – and shouldn’t judge – a particular kind of human behaviour through our own norms alone, and especially not without understanding its cultural context.

While many have since separated moral relativism from cultural relativism, Benedict’s general definition still stands – and the concept remains key to building a better world.

Rejections of cultural relativism typically miss two key points:

  1. Cultural relativism is more of a tool than an attitude.

  2. Cultural relativism allows us to understand what’s really going on, so that we can respond to the situation appropriately.

Let’s start with the first one. In contrast to cultural relativism, moral relativism is an attitude. It means not judging things by your own learned sense of right and wrong. It’s important to distinguish between moderate moral relativism and absolute moral relativism. In their common hyperbolic style, politicians and xenophobes usually mean the most extreme version of whatever it is they’re talking about, and many of those who reject cultural relativism are really talking about absolute moral relativism.

Absolute moral relativism means anything goes. If this were the case, life would be terrible. Just about everybody agrees on this point, because humans seem to have at least a basic sense of right and wrong that would be offended by the idea of true amorality.

Moderate moral relativism means finding out how people involved in the situation perceive what is going on. Do they think it’s wrong? Are they being harmed in some way?

For example, among the Solinké of Mali, sleeping and waking alone are considered negative and even harmful experiences that should be avoided at all costs (see Sleep Around the World: Anthropological Perspectives). Meanwhile, the Globe and Mail recently reported that as many as 40 percent of Canadian couples prefer to sleep in separate bedrooms to improve their ability to sleep when and how they want as individuals.

The Solinké would be appalled. Maybe they would even want to start educational programs to teach those poor Canadians how to fix their bad habits, so that they no longer have to suffer.

Of course, sharing a bed is a sign of romantic intimacy in Canada, which speaks to another moral code that overlaps or even clashes with the idea of individual comfort. That’s one reason why the Globe called the article “The Night Divorce,” as if these separate sleepers are breaking a social contract by defying the norms of monogamous relationships. Morality is complex and even conflicted in any cultural milieu.

Approaching these sleeping situations from a culturally relativistic point of view enables us to see that no version is inherently right or wrong. Different moral codes are in play and all are equally valid. Just because I might be offended or hurt by something doesn’t mean everyone will be. We know that on an individual level. We just need to start applying it to cultural practices as well.

It should be easy for us to understand how cultural relativism works in this example. But what about something like honour killings or spousal rape or crucifixion for apostasy? Am I saying that if the general community thinks it’s okay, we should just accept it?

This is the place where moral and cultural relativism diverge. We can still use cultural relativism to understand a situation without thereby saying that the situation is morally acceptable.

So, if cultural relativism is a method or a tool, as I claimed above, what is it a tool for? And, following my second claim, how can it teach us to respond to a situation appropriately?

Let’s use an extreme example that’s on everybody’s lips (or screen) these days: ISIS. How can we use cultural relativism to understand and even solve this situation?

First, we need to understand something about ISIS, Islam, and Syria. That means we need to allow ISIS supporters to speak for themselves. Yes, I know that’s a scary thought, but how else are we supposed to know why they do what they do?

[Image: Coffee – Did you know Syria was one of the first places in the world where people drank coffee?]

I read an insightful article in the Atlantic called “What ISIS Really Wants.” I suggest reading it for yourself. It’s a bit lengthy, but that’s usually a sign of thorough research and representational complexity that far surpasses the average daily newsbite. You’ll come away with a more solid foundation for thinking about ISIS, as well as those who are fleeing them.

For example, I now know that ISIS members are not trying to get to other countries. Quite the opposite. Based on their interpretation of Islamic scripture, they have a spiritual and moral obligation to live inside the caliphate (Islamic state). In other words, everyone who supports ISIS is trying to get to Syria – not Canada or Australia or Greece or France. The majority of Muslim Syrians are the prime target of ISIS, which deems them apostates worthy of death. That’s why so many Muslims have become refugees.

Using cultural relativism to understand motivations and behaviours will allow us to engage more appropriately with people who commit heinous acts. Doing so will teach us what to expect and how worried we should be about refugees coming to our own country. With stringent screening methods already in place in Canada, I personally have little concern that an ISIS supporter will pass the gates. That concern has now been almost entirely quashed by my new understanding of ISIS culture.

The only way to combat violence and extremism and terrorism is to learn something about the culture behind perpetrators’ motivations and behaviours. And the only solid way to learn these things is through cultural relativism as a tool for understanding.

We need to be willing to suspend our own cultural value judgments long enough to wrap our heads around totally different ways of thinking. We can’t assume that “they” must think like “us”; otherwise, we will never be able to grasp how they can be so evil.

Nobody thinks of themselves as evil. Only when we know how people justify their own violence can we tackle the broader cultural values that promote and allow it. Cultural relativism is the antithesis of extremism. By learning to understand other cultures, no matter how unpalatable, we will empower ourselves with the necessary knowledge to root out hatred and intolerance at the source.

Mansplaining and Feminist Chromosomes

"Battleofthesexes" by Welleman - Own work. Licensed under Public Domain
“Battleofthesexes” by Welleman – Own work. Licensed under Public Domain

“Mansplaining” is an increasingly nebulous term used in some feminist circles to criticize men who condescend to women by relaying their apparently superior knowledge in self-important ways. From my understanding, the term is applied most commonly in situations where the woman already knows a lot about the topic at hand, but the man assumes that she doesn’t and that what he has to say is more pertinent.

In many cases, the topic is sexism or misogyny or, you know, feminism.

It seems obvious that women would know more about women’s rights or experiences of sexism than men, doesn’t it? How could a man possibly have anything to say about it that a woman wouldn’t already know?

I hope the rhetorical device is apparent in the above two questions. Asking how a man could possibly know more than a woman on a specific topic is essentializing to both genders. But the problem runs deeper than that. It may be the case that many people distrust experts in a number of realms, and rightly so! Science is a method, not a religion to be taken at face value, and questioning every claim is integral to its progress. (So is accepting solid evidence, but that’s for another day.)

But yes, there have been many scientists who failed at their projects because they refused to consider laypeople’s knowledge about their own lifeworlds. It’s a give and take. Other so-called experts from educators to politicians have intentionally deceived or accidentally misled those in their care.

It’s not a perfect world and experts aren’t immune to hubris or other human pitfalls.

Wait, so are the men who start telling women about so-called women’s issues exhibiting hubris and a false sense of expertise? Or are women who throw the mansplaining label around claiming absolute authority for themselves with a de facto rejection of anyone else’s perspective?

The old saying goes, “Everybody you meet knows something you don’t.” If we actually interacted with one another on that basis, communication would improve in all directions.

Formal experts have studied and trained, and they’ve often spent years researching issues by examining and deconstructing primary and secondary evidence of many kinds. They know something.

Laypeople have also gained years of experience, sometimes their own, sometimes an accumulation of social knowledge that has been passed down from one generation to the next. They also know something.

Both so-called experts and so-called laypeople have some knowledge that the other can’t immediately access.

Can a middle-class social theorist discover a fact about the systemic roots of poverty that someone living in the ghetto might not have known? Does a light-skinned person who suffered from genocidal attacks in Rhodesia have to kowtow to the opinion of someone whose ideas of racialized discrimination are based purely on North American definitions of being black?

Is a person with a penis incapable of knowing anything about women’s issues? Can a man not read a book or watch the news or talk to women or even other informed men and come to a reasonable, respectable conclusion about rights or sexism or feminism?

Does being a male feminist mean shutting up and listening while the women talk?

Before entering any discussion, let’s take a moment to acknowledge everyone’s unique positionality in the world. I know what I know and you know what you know. The goal is to question what we know, as much as possible, so that we can all get closer to some semblance of truth. Of course, like everyone, I might mistakenly wax polemical for a while, until someone more informed shuts it down or at least provides evidence that forces me to embrace more nuance in my stance.

Women’s experiences are crucial to understanding feminism and sexism and misogyny. Assuming women don’t know about economics or astrophysics is sexist and insulting. In addition to personal experiences, many female feminists have spent years studying social issues and engaging in empirical research on which they base their knowledge and arguments.

So have male feminists, like Michael Kimmel. I agree that being condescending and silencing others is reprehensible. But it’s not “mansplaining”: condescension is neither unique to men nor shared by all men (although some men do automatically assume their own superiority to women, of course).

And it’s not only about gender – some men speak that way to other men, as do some women to other women. Some queer and trans people do it to one another, and to cis people and vice versa. People with lighter skin do it to people with darker skin and the reverse is equally true. If you have differently shaped eyes, or a different accent, the condescending tones might come out. (And, by the way, so-called “white male Westerners” don’t have a monopoly on arrogance.)

Having a certain type of chromosome or physical appearance doesn’t make you an expert. And it certainly doesn’t give you the right to drown out another person’s voice.

Before we engage in a shouting match, let’s take some time to listen. Everyone’s positionality has taught them something we don’t know. And some people know a lot, while others know a little. Some people can spout statistics, while others can tell us how something feels.

So if we each know something, maybe we can let go of our egos and put the parts together. That way we can all become smarter.

Something about Science, Gender, and Jobs

Recently, the Globe and Mail sought readers’ opinions on getting more women into male-dominated professions, the sciences in particular. According to the article, more Canadian men than women pursue a career in the sciences. While the numbers are closer for those who study science in university (“less than 40 percent” are women), after graduation the discrepancy widens when it comes to employment (“less than 22 percent”).

The writers don’t offer any explanation for this gap. However, the piece’s title, “How can we encourage more girls into science careers?” suggests a tacit assumption. “We” (whoever that is) are not doing enough to promote science careers to young women.

Education, parents, media, marketing, and whatever else constitutes “we” might very well be guilty of persuading women that science is for men. It’s hard to say; the article provides no evidence, which is to be expected considering it never states the claim explicitly anyway.

Since we’re in speculating mode, I can come up with a few other reasons for the gender difference in employment. Please bear in mind that we’d need actual research to substantiate any of these.

  • The gender gap is wider among older people than among younger ones
    • It wouldn’t surprise me if accounting for age or length of time in the field changes the way we understand the data. If recent numbers show less of a gap among science graduates, it’s likely that we’ll see less of a gap in employment once the older generation retires.
  • Women have babies
    • Yes, I know. More men are staying home with their kids these days, and that’s great if that’s what both partners want. However, I’d guess it’s still more common for women to stay home out of choice and/or tradition. More importantly, many women get pregnant, which requires at least some time off. Creating a human being is hard work, but not the kind you can put on your CV (unless you’re creating a homunculus in a lab). Even with the most supportive family, childbearing can put women behind in their careers when compared to their childless counterparts, including men. The more children you have, the further behind you will fall. A male commenter on the Globe article made this point quite well.
  • Employers are sexist
    • Not all employers are sexist. Obviously. But unless things have changed drastically since 2012, many employers have an implicit bias that they might not even be aware of. One study gave potential science mentors the exact same student application, but changed the name from male to female on half of them. They discovered that a gender bias really does exist: “Results found that the ‘female’ applicants were rated significantly lower than the ‘males’ in competence, hireability, and whether the scientist would be willing to mentor the student.”

These are just a few possible roots of the gender gap. Luckily, it does appear to be shifting. So yay.

Now here’s an issue nobody talks about in these discussions: why is no one encouraging boys to enter female-dominated professions? Where are the articles decrying the lack of men in nursing, social work, counseling, event planning, or teaching?

To be fair, earlier this year, Business Insider did note which jobs tend to employ more women than men. However, the brief article was bereft of the sense of alarm so often used to highlight the relative lack of women in traditionally masculine fields.

So why the paucity of interest in getting men into traditionally feminine careers? Let’s speculate some more.

  • Work traditionally viewed as masculine is more highly valued than work viewed as feminine

That’s the only reason I can think of. The work that women have traditionally done just doesn’t garner the same level of respect, as evidenced by the higher salaries that many traditionally masculine jobs command.

The respective valuation of traditionally masculine and feminine work may be the real crux of ongoing gender inequality in the labour force. Today’s movement encourages women to be like men. On a large scale, “we” still tend to value masculine things over feminine things. The goal is to raise women up to the level of men, because women’s work does not have the same social standing, no matter how much it contributes to our health and economic function (e.g. social work or primary education).

In other words, it’s great to encourage women to do the same work as men. But we won’t have true equality until men can do the same work as women, without losing their social standing.

Saudi Arabia, Land of Human Rights

[Image: “Farasan Island 3” by Bandar Yuosef, CC BY 2.0, via Wikimedia Commons: https://commons.wikimedia.org/wiki/File:Farasan_Island_3.jpg]

Last year, all atheists became terrorists – at least according to new laws introduced in Saudi Arabia. Non-believers aren’t alone though: anyone who criticizes the state, its rulers, or the Saudi version of Islam could be charged with terrorism. The Penal Law for Crimes of Terrorism and its Financing explicitly includes non-violent acts, effectively prohibiting any semblance of free speech, association, or religion.

The International Humanist and Ethical Union explains some of the key terms of the legislation:

The provisions of the “terrorism” law define and outlaw numerous acts and forms of expression as “terrorism”, including:

  • “Calling for atheist thought in any form”

  • any disloyalty “to the country’s rulers”, or anyone “who swears allegiance to any party, organization, current [of thought], group, or individual inside or outside [the kingdom]”;

  • anyone who aids, affiliates, or holds “sympathy” with any “terrorist” organization as defined by the act, which “includes participation in audio, written, or visual media; social media in its audio, written, or visual forms; internet websites; or circulating their contents in any form”;

  • contact with groups or individuals who are “hostile to the kingdom”

  • and the article on “Seeking to shake the social fabric or national cohesion” prohibits all protest, without qualification as to its message or intent, by outlawing “calling, participating, promoting, or inciting sit-ins, protests, meetings, or group statements in any form”.

Additionally, apostasy (denying Islam by adopting another faith or becoming an atheist) is punishable by death. The International Business Times states that 100 people have been put to death already this year, in compliance with laws prescribing capital punishment for “murder, rape, armed robbery, using recreational drugs, and smuggling, in addition to homosexuality, false prophecy, apostasy, adultery, witchcraft and sorcery.”

Raif Badawi is a case in point: he’s a Saudi Arabian blogger sentenced to ten years in prison and a thousand lashes for political criticism. It sounds crazy, and it is; yet his is only one case of extreme corporal punishment among countless others that remain invisible to the international community.

With this lovely human rights record, Saudi Arabia somehow remains a full member of the United Nations. Not only that, but one of its representatives was quietly selected in June to head a panel of independent experts on the UN Human Rights Council.

This appointment followed on the heels of Saudi Arabia’s job opening for eight new executioners, described in the ad as “religious functionaries” working in the civil service, according to The Independent.

It’s like putting the head of ISIS in charge of human rights. Actually, the folks at UN Watch say that Saudi Arabia has beheaded more people than the famous extremist group this year.

Somehow, this is the real world, where farce sometimes merges with tragedy.