'83 percent of Google+ users are inactive' screams a headline on a well-known tech site. But where does that number come from? How is inactive defined? Is it accurate?
The answer to that last question looks to be a resounding no. But that hasn't stopped several well-respected Web sites and social media players from passing it along.
I first came across the story on Friday via IT World blogger Peter Smith on Plus, and he was not one of those forwarding it along as if it were true. Instead, he complained that if there had really been a survey of 10 million Google+ users, surely someone he knew would have been asked.
A survey of 10 million Google+ users? That would be quite an undertaking; I know how much work we go through to poll fewer than 10,000 for Computerworld original research. Could that have happened under the radar, without any of the journalists on Plus hearing of it in progress? That didn't sound right.
The story on GigaOm said that 83% of Google+ users are inactive, a figure that came from "Bime Analytics, which polled a voluntary sample of more than 10 million Google+ users." So, off I went to the Bime Analytics blog; but Bime wasn't the original source of the data after all. Instead, Bime was looking at numbers from FindPeopleOnPlus.com, reporting the data as a "voluntary sample [not survey] of around 10 million Google+ users." Ah, that made more sense. So, off we go to that original data source to discover where the number comes from.
It doesn't appear that FindPeopleOnPlus is "surveying" anyone; nor does the site depend on people "voluntarily" adding themselves to the directory. Instead, it likely gets its information from a Web scraper that grabs all the public Google+ profiles it can find, with a privacy option of allowing people to remove themselves from the index.
FindPeopleOnPlus has a fair amount of data that looks useful, as far as a Web scraper can deduce from public Google+ profiles: gender (especially back when Google+ required public reporting of gender; it no longer does), education, occupation. However, the numbers for activity, with an alleged 12 million "inactive" and only 2.5 million "active," are, to be charitable, open to further scrutiny.
First of all: How is active defined? Does it mean posting in public? If so, how often? If people post only to their private circles, can FindPeopleOnPlus.com see that? What if people just comment instead of posting? Does that count as active? What about logging in regularly to read but not post; does that count? If you're comparing with competing social networks, readers-only should count as users: Facebook counts you as "active" if you've logged in at least once within the prior 30 days -- no content creation required.
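The definition matters more than it might seem. Here's an illustrative sketch -- not FindPeopleOnPlus's actual method, and using entirely hypothetical profile data -- showing how the same four users produce wildly different "inactive" rates depending on whether "active" means public posts only, posts or comments, or Facebook-style recent logins:

```python
# Hypothetical example: how the choice of "active" definition changes
# the computed "inactive" percentage. The data below is invented.
from datetime import date, timedelta

TODAY = date(2011, 8, 8)  # assumed reference date for the sketch

# What a scraper of public profiles might (or might not) be able to see.
profiles = [
    {"name": "A", "last_public_post": date(2011, 8, 1),
     "last_comment": None, "last_login": date(2011, 8, 7)},
    {"name": "B", "last_public_post": None,
     "last_comment": date(2011, 7, 30), "last_login": date(2011, 8, 5)},
    {"name": "C", "last_public_post": None,
     "last_comment": None, "last_login": date(2011, 8, 6)},
    {"name": "D", "last_public_post": None,
     "last_comment": None, "last_login": None},
]

def recent(d):
    """True if the date exists and falls within the prior 30 days."""
    return d is not None and (TODAY - d) <= timedelta(days=30)

def inactive_pct(is_active):
    """Percentage of profiles NOT classed as active under a given rule."""
    active = sum(1 for p in profiles if is_active(p))
    return round(100 * (len(profiles) - active) / len(profiles))

# Rule 1: active = posted publicly recently (all a scraper can verify).
by_posts = inactive_pct(lambda p: recent(p["last_public_post"]))

# Rule 2: active = posted OR commented recently.
by_posts_or_comments = inactive_pct(
    lambda p: recent(p["last_public_post"]) or recent(p["last_comment"]))

# Rule 3: Facebook-style, active = logged in within the prior 30 days.
by_login = inactive_pct(lambda p: recent(p["last_login"]))

print(by_posts, by_posts_or_comments, by_login)  # 75 50 25
```

Same users, three defensible rules, and the "inactive" share swings from 75% down to 25% -- which is exactly why a bare "83% inactive" figure means little without the definition behind it.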
FindPeopleOnPlus.com doesn't say what it means by its active and inactive designations -- nor does it even mention that particular info in its own recent Google+ Statistics Report. Bime skirts this problem with the caveat "We are not sure how these figures were determined, but it was amazing to see that 83% of users were classed as inactive." In other words: We have no idea what this number says, but never mind! Wow! (I expect their commercial Business Intelligence service isn't quite that cavalier about how it uses data.)
But back to FindPeopleOnPlus. I searched for a few people I know who use Google+. I'm listed as active, so that's accurate. But by search #5, I found one of my friends listed as inactive, even though she has been posting from time to time -- including publicly a couple of weeks ago. Hmmm. A 20% error rate in my (not statistically valid) random sample. Not to mention all the other users who may be posting (or commenting) in private.
I commented about the inaccurate inactive listing both on Peter's Google+ posting and in one of my own. I also headed over to the original story and posted a somewhat caustic comment (in a tone that Web-based comment threads somehow seem to bring out, even in those of us who generally pride ourselves on more civil discourse) about the 83% stat coming "from garbage data."
Bime's response in the GigaOm comment thread:
[W]e understand your concerns as we were also really surprised to see the high number of 'inactive' users, but since this was the only data we could find on the subject we decided to add the line about not being sure how it was determined in order to clarify that the data may not reflect reality, and not lead people to take the data at 100% face value. We are waiting for a response from findpeopleonplus as to how they came up with this figure. So please sit tight and we'll try to clarify soon
Seems like it would have been better to find out what the data actually meant before reporting on it, but OK, at least they tossed in a scrap of a question in their blog. The resulting story simply claimed "A silent majority exists -- in a big way" on Google+, no caveat at all.
To her credit, story author Colleen Taylor responded to me on Google Plus (although not yet on GigaOm) that the data criticism "is totally warranted. I'm looking forward to hearing more from Bime about the figures." She still hadn't gone to the original source of the data, FindPeopleOnPlus, but at least she acknowledged that the questions about the data had merit.
Information designer Ville Kilkku, who commented after me on GigaOm that "without the definition of inactive ... this is pretty much just noise," joined me in trying to squelch buzz about this. But it can be tough to go up against a juicy headline -- especially one that fits into your existing ideas. Fast Company, for example, linked to it with the comment: "it's a trend we highlighted last month called 'Circles fatigue.' " Ah, we believe it, so of course it's true.
It was also difficult to resist for a few of those seeking to be experts on all things Google+, whether or not they agreed with what the number claimed to say. Dan Schawbel, who is in more than 5,000 people's circles, posted a link to that GigaOm story with the sole comment "Surprised? I am." Steve Rubel, who's been added by more than 23,000 people on Plus, posted the GigaOm piece without comment. Happily, a couple of people did complain, with one noting that the numbers are likely "way off" for not counting commenters as well as posters, and another speculating that much data like this is in fact "subjective guesswork." One commenter on Google+ even took the time to read the comments on the original site (and cited my complaint). Most, though, responded to that 83% number as if it were true, with one of the first commenters complaining that "they should talk about the quality of discussions versus noise."
The good news? While this rather questionable statistic has cropped up in places it shouldn't have, it hasn't (yet) gone viral. Perhaps people learned something from the "Internet Explorer users are dumb" fiasco after all?
UPDATES: While Bime did include a caveat about not knowing specifics of the "inactive" statistic in its blog post about the numbers, it did not have a similar caution in an infographic that it invites others to re-post on their Web sites. There's been an interesting back-and-forth discussion about Bime's role in this on Google+ between consultant Matt Ridings and Bime's Kirsty Lee.
@FndPeopleonPlus tweeted to me early this morning that its site does indeed only see public posts, adding "We are renaming field [because] 'activity' isnt clear."
Additional update: Bime Analytics has removed information about Google+ inactive users from its infographic, according to a blog post.
Final update: GigaOm responded to the criticism and Bime's response by pulling the "83% of Google+ users are inactive" stat from its story and focusing instead on the rise of student users. I've written a follow-up post.
Sharon Machlis is online managing editor at Computerworld. Her e-mail address is firstname.lastname@example.org. You can follow her on Twitter @sharon000, on Facebook, on Google+ or by subscribing to her RSS feeds:
articles | blogs.