It would be really nice to know how many LGBT people there are in the US. I often scoff at studies that say "X% of LGBT people do/believe Y" because, since we don't even know the size of the queer population, how can we say what percent of us does or believes something?
And finding the size of the LGB population is immensely difficult for a variety of reasons. No, I take that back: it's impossible for a variety of reasons. So imagine my surprise when the Williams Institute announced that nine million Americans are LGBT, around 3.5% of the population. How could they know?
The author of the brief, Gary Gates, acknowledges that it's hard to know how many people are queer. He spends over a page describing just how difficult it is to know how many LGBT people there are in the US. Then... he lays out how many there are, without explaining how he overcame any of the many problems he cited, let alone the many problems he didn't cite.
In fact, I'd say that the author of this brief showed a pretty flagrant disregard for the rules of basic logic, much less demographics. Instead of producing something with facts that could be published in a peer-reviewed journal, he produced a research-lite brief to be distributed via press release to draw media attention to the number, using methodology that would make the average high school statistics teacher send the paper back for a rework.
Can LGBT activism be advanced when it's based on a shared suspension of disbelief? Are we allowed to make fun of the fundies' manipulation of numbers when we do the same, only less effectively? Does poor reasoning help advance our cause? More on that after the jump.
Here's one of the problems with finding out how big the LGB population is, as Gates describes it:
In measuring sexual orientation, lesbian, gay, and bisexual individuals may be identified strictly based on their self-identity or it may be possible to consider same-sex sexual behavior or sexual attraction.
That's not a trifle. The US is a huge country with many religions, many regions, many races, and many, many subcultures, all with varying understandings of how sexual orientation works. Asking them "Are you gay, bisexual, or straight?" will give you a percentage for each, but people who may answer the same way are thinking different things.
That's not merely a difficulty. That right there makes these sorts of studies impossible. If 5% of the adult male population, for example, prefers sexual activity with other men but still identifies as straight because they have wives or girlfriends (who knows how much or how little gay sex they're having), well, that's enough to render the results of these surveys useless.
A long time ago being gay meant having gay sex and partaking in the homosexual lifestyle. Many people still believe that today.
For some other people, being gay means how someone identifies... I even knew a straight girl at college who identified as lesbian "in solidarity" with LGBT people. There was a time, about 15 to 20 years ago, when the dominant position in the movement was that sexual identity was 100% yours to choose and didn't have to have anything to do with reality.
And lots of people today, as I've posted about before, see sexual orientation as a profound desire that someone can't control and that's independent of how someone identifies.
I'm not judging any of these definitions, but acknowledging that when there are tens of millions of people who believe in each one, it makes studying this subject more than difficult. A study that just asks people will produce numbers in the end. The numbers will be useless, but they'll be numbers.
That's something if you're not concerned with accuracy.
Here's another problem cited in the brief:
Feelings of confidentiality and anonymity increase the likelihood that respondents will be more accurate in reporting sensitive information.
Definitely true. Some people just won't answer a survey honestly about their sexuality, even if they really do identify a certain way. That makes finding out how many people are queer hard.
So we can't even get to the debate about what sexual orientation means. For the purpose of these studies, sexual orientation isn't someone's identity, desires, or behavior; sexual orientation is what someone is willing to tell a pollster.
Those are just two of the problems with these sorts of surveys that Gates lays out. Those are big hurdles! How will Gates jump over them?
However, combining information from the population-based surveys considered in this brief offers a mechanism to produce credible estimates for the size of the LGBT community. Specifically, estimates for sexual orientation identity will be derived by averaging results from the five US surveys identified in Figure 1.
Read that a few times and perhaps you'll see the glaring problem in Gates's reasoning. As an editor, I would just like to remove the word "credible"; his methodology can definitely produce "estimates," but that's about it.
He cites five surveys on sexuality and health in the US that asked people their sexual orientation. The percent who identified as gay, lesbian, and bisexual (combined) was, from top to bottom, 5.6%, 3.7%, 3.2%, 2.9%, and 1.7%. When it came to just bisexual people, the answers ranged from 0.7% to 3.1%. For just gays and lesbians, it ranged from 1.0% to 2.5%.
The estimates of the LGB population spanned a range of 3.9 percentage points. The range was bigger than his final result! (I remember something about a p-value from college....) The biggest estimate was over three times the smallest. Among bisexuals, the biggest estimate was almost four and a half times the smallest. Those differences translate into millions of people answering differently depending on how a study is conducted.
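To make the arithmetic concrete, here's a quick sketch, using only the figures quoted above, of what simple averaging does and doesn't tell you about these five surveys:

```python
# The five surveys' LGB self-identification estimates quoted above, in percent.
lgb = [5.6, 3.7, 3.2, 2.9, 1.7]
bi_low, bi_high = 0.7, 3.1  # low and high bisexual-only estimates

mean = sum(lgb) / len(lgb)
spread = max(lgb) - min(lgb)

print(f"average:        {mean:.2f}%")               # about 3.4%, roughly the brief's 3.5%
print(f"range:          {spread:.1f} points")       # 3.9 points, bigger than the average itself
print(f"high/low ratio: {max(lgb) / min(lgb):.1f}x")  # over 3x
print(f"bisexual ratio: {bi_high / bi_low:.1f}x")     # about 4.4x
```

The average does come out near the brief's headline number, but the spread dwarfs it, which is exactly the precision problem: averaging five wildly disagreeing measurements doesn't make the disagreement go away.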
In the scientific world, one wouldn't even evaluate if those numbers are accurate because they aren't even precise. If you keep on repeating the same experiment and getting wildly different results each time, you don't average your data and publish. Instead, you find out what keeps changing. After you start getting results that repeat themselves, then you go on to defend your results as accurate.
Gates never gets far enough to try to defend his numbers as accurate because he's not even in the right ballpark yet. The question isn't whether the data are accurate; we already know they can't all be accurate.
Why did one study say 3.1% of the population was bisexual and another say that 0.7% of the population was bisexual? We don't find out; Gates doesn't see fit to explain why the data he uses vary so much. Doing so would presumably induce too much laughter for the study to get coverage in major newspapers.
I notice from his references that one is a study of California. Does anyone think that the queer population of California is the same as, say, North Dakota? Another one studied only adults below the age of 44. Does anyone think that there hasn't been any change in attitudes towards homosexuality and bisexuality in the last century? Or that a gay 18-year-old is just as likely to be out as a gay 45-year-old? At least two of the links for the studies in the references go to the same place - a sloppy mistake, I know, but it makes it hard to evaluate what sort of questioning was used for one of the studies with the word "interview" in the title. Were these face-to-face interviews while some of the other studies were on paper?
That's just what I can notice from his brief. Who knows how big the methodology differences were, differences that Gates acknowledged just several pages before can have a large impact on the results. Those differences can't just be averaged together to cancel each other out.
Now, a real study of the data would evaluate which of the five surveys has the best methodology and best represents the American population (and the answer is probably none of them). But this isn't a real study; it's working the media. So Gates averages the data. Take good data, mix them with bad data, and, Gates asserts, you can produce "credible" estimates!
As for the trans estimate, that 700,000 Americans are transgender, I don't even need to get into a discussion here on Bilerico about the vastness of the term "transgender." Gates cites several studies that range from 0.1% to 2% depending on whether people were asked if they had "strong feelings" of being transgender or if they had "take[n] steps" towards transition. Some of these studies are state-level; for others, the scope isn't specified in the brief.
And Gates comes to the conclusion that 0.2% is a good number to use. He doesn't explain why.
But the media just gobble it up. Usually I blame the media when statistics are poorly reported on - often academics will explain all the caveats behind their data and its limitations, only to be ignored by deadline-crunched journalists and math-challenged media stars. This brief, though, is a think tank brief, so it's designed to produce headlines.
A few years back, before I was doing the copy-editing work that led me into blogging, I had a job for a small publisher that included doing research for opposing viewpoints-type materials for high school students. I was given a topic and I'd find studies to prove either side.
Some of the topics were more balanced than others. The worst was the one on the need for increased government health care - every academic study and professor and anyone with any credibility was on the increased care side of that topic.
So when I needed someone to say something wrong, I'd hunt down think tanks. There was always a rightwing think tank willing to take the wrong side of the debate. Even if there was no real evidence to support a conservative proposition, they'd find a way to manipulate data and philosophy into their little box. (Did you hear about those waiting lines in Europe? Or that infant mortality is actually lower in the US because other countries don't know how to count babies?)
I'd get my $11 a page, the publisher would get whatever he charged schools, and the think tank would propagate its views. The only people who suffered were the high school students who were presented with academic work and think tank work as equally valid. Oh, and anyone concerned with the country being run on fact-based policy.
This brief is standard-issue think tank work. While the number is smaller than some of us would like, just having a number to cite is important in a lot of arguments. And they've provided it and we'll cite it and the right will say that it's smaller (they prefer 2%, for some reason) and everyone will be all the dumber for it.
One last thing: I agree with Cathy Renna that this shouldn't even matter in politics. But it does.
[Image: Alex Blaze, from Paris Pride 2010]