
View Full Version : Compete.com an Alexa competitor



KLB
03-22-2007, 11:14 AM
I was over at http://whois.domaintools.com running some whois queries on IP addresses and noticed a banner ad for http://www.compete.com/, which was billing itself as a competitor to Alexa. Out of curiosity, I went to the site, queried my own site, and was extremely impressed by the accuracy with which it reported my U.S. traffic. Its page views are in line with what I know them to be, as are the number of visits and page views per month. In a way, it was pretty freaky. :eek:

It appears that they may be buying click stream data from ISPs, which would explain why their data is so accurate. For more on how your ISP is involved see:
http://www.techdirt.com/articles/20070313/213014.shtml
http://internet.seekingalpha.com/article/29449

On one hand I'm sure a lot of advertisers will like the ability to get really accurate U.S. traffic data about websites, but on the other hand as a U.S. Internet subscriber, it really pisses me off to know that my ISP is selling my web surfing habits to others.:flare:

Chris
03-22-2007, 12:54 PM
It is nice and accurate, and there isn't a webmaster skew.

http://snapshot.compete.com/online-literature.com+sitepoint.com+digitalpoint.com+

rpanella
03-22-2007, 06:08 PM
There's also another site called quantcast.com. Both of these places show similar levels of traffic for my site, but both are about half of what Google Analytics reports for absolute unique US visitors over a month.

MaxS
03-22-2007, 06:16 PM
I'm a bit wary to believe those rankings -- I compared Sitepoint/DP with a site of mine that receives around 15,000 uniques a day; my site ranks higher.

Do you really think SP/DP receive less than 15,000 uniques a day?

ses5909
03-22-2007, 06:17 PM
That's a really interesting site!

Chris
03-22-2007, 06:26 PM
I'm a bit wary to believe those rankings -- I compared Sitepoint/DP with a site of mine that receives around 15,000 uniques a day; my site ranks higher.

Do you really think SP/DP receive less than 15,000 uniques a day?
They aren't as active as you may think. They serve a specific niche and have a high number of repeat visitors but neither is a general audience site.

MaxS
03-22-2007, 06:40 PM
They aren't as active as you may think. They serve a specific niche and have a high number of repeat visitors but neither is a general audience site.
Interesting. In that case, I'm happy :).

I'll have to start using Compete instead of Alexa.

KLB
03-22-2007, 07:17 PM
Also, people need to keep in mind that Compete.com only looks at U.S. traffic. Thus it will look lower than Google Analytics' overall stats. It should, however, look similar to GA's U.S. traffic numbers.

At least in my case its numbers for U.S. traffic seem to be very close to my real numbers. The reason for this higher accuracy is the way it gets its data: instead of relying on a toolbar install like Alexa does, they buy clickstream data from U.S. ISPs. This means they get data on all pages viewed by millions of users across all walks of life, not just data from those who have opted to install a toolbar. From a statistics standpoint, this is a much more reliable way to conduct a survey.

rpanella
03-22-2007, 07:55 PM
Compete also has a toolbar along with their ISP data, and ISP data can be just as skewed as toolbar data. Here's a study that was done to determine the correlation between true traffic and the various free external metrics: http://www.seomoz.org/article/search-blog-stats

KLB
03-22-2007, 08:04 PM
The only real skew with ISP data is if it doesn't cover a broad enough cross section of ISPs including dialup, broadband and wireless. Remember this isn't a function of some tracking software on the computer, rather a matter of page requests that were captured and logged as the requests hopped across routers owned by ISPs.

Quite literally it is akin to the phone company logging every phone call you make. Scientifically speaking this methodology would be very sound, but from a social aspect there are some tremendous privacy concerns about ISPs selling the Internet equivalent of phone logs.

Chris
03-22-2007, 08:34 PM
Compete also has a toolbar along with their ISP data, and ISP data can be just as skewed as toolbar data. Here's a study that was done to determine the correlation between true traffic and the various free external metrics: http://www.seomoz.org/article/search-blog-stats
What bull**** that was. I see them yet again touting their arbitrary "Page Strength" thing. It's a neat tool, but they assign way too much importance to something they made up.

rpanella
03-22-2007, 08:34 PM
To remove the skew, though, they would need a very large sample from lots of different ISPs. They currently claim to track 2 million users, many of whom come through the toolbar, and even that is less than 1% of Internet users, and they likely buy data from very few ISPs.

More akin to tracking all calls made from MetroPCS phones in New York.

Chris
03-22-2007, 08:46 PM
The only real skew with ISP data is if it doesn't cover a broad enough cross section of ISPs including dialup, broadband and wireless. Remember this isn't a function of some tracking software on the computer, rather a matter of page requests that were captured and logged as the requests hopped across routers owned by ISPs.

Quite literally it is akin to the phone company logging every phone call you make. Scientifically speaking this methodology would be very sound, but from a social aspect there are some tremendous privacy concerns about ISPs selling the Internet equivalent of phone logs.
The issue really is about sample size: you get a huge sample size with ISP data, which makes it statistically accurate.

But popularity matters for the sites being measured too. They took a bunch of niche blogs, none of which is popular in the grand scheme of things. They see it as comparing a 100 to a 75 to a 10 to a 4. In truth it was like comparing a 0.75 to a 0.10 to a 0.04.

This is just another example of people with uncertain educational backgrounds trying to do experiments and drawing improper conclusions.

No kidding they couldn't see any accuracy: they have a sample size of probably hundreds of millions, and they're comparing sites that, top to bottom, only span a few hundred thousand uniques. When graphed as a percentage of total Internet traffic, or something like Compete's sample size, the traffic of those sites would form an almost perfectly horizontal sliver of a line right at the bottom.

A real comparison would need to make sure all participating sites used the same tracking system (say Analytics), and then you'd want sites that get 10 million+ uniques a month, 1-10 million uniques a month, and under 1 million uniques a month.

Heh... maybe I should do one.
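Chris's margin-of-error point can be made concrete. This is a rough sketch with invented numbers, assuming a hypothetical panel that sees 1% of all users: the relative sampling error of a scaled-up unique-visitor count follows from the binomial standard deviation, and it balloons for small sites.

```python
from math import sqrt

PANEL_RATE = 0.01  # assumption: the panel covers 1% of all users

def rel_error(true_uniques, p=PANEL_RATE):
    """Relative standard error of a panel-based uniques estimate.

    Each real visitor lands in the panel with probability p, so the
    panel count is binomial with std dev sqrt(N*p*(1-p)); dividing
    the scaled-up estimate's std dev by the true count N gives the
    relative error reported below.
    """
    return sqrt(true_uniques * p * (1 - p)) / (p * true_uniques)

for n in (10_000_000, 1_000_000, 100_000, 10_000):
    print(f"{n:>10,} uniques -> ~{rel_error(n):.1%} sampling error")
```

With these made-up numbers a 10-million-unique site is measured to within a fraction of a percent, while a 10,000-unique site carries roughly a 10% random error before any bias in who joins the panel is even considered.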

rpanella
03-22-2007, 09:15 PM
What bull**** that was. I see them yet again touting their arbitrary "Page Strength" thing. It's a neat tool, but they assign way too much importance to something they made up.

Yeah, well, basically the point of it was to show that sites with such little traffic can't be all that accurately represented by Alexa etc., due to the small sample sizes those services have. Obviously the larger the traffic a site gets, the more accurate these services with small sample sizes will be; no one's arguing that. My point was just that at KLB's traffic level, the fact that Compete is very accurate is more of a fluke than the norm.


Heh... maybe I should do one.

That would be an interesting study to see how accurate all the different metrics are at different traffic levels.

KLB
03-22-2007, 09:46 PM
The issue really is about sample size: you get a huge sample size with ISP data, which makes it statistically accurate.
This is what I was thinking.


But popularity matters for the sites being measured too. They took a bunch of niche blogs, none of which is popular in the grand scheme of things. They see it as comparing a 100 to a 75 to a 10 to a 4. In truth it was like comparing a 0.75 to a 0.10 to a 0.04.
Okay I'm really confused. :confused:

What is this comment referring to? Is it referring to the seomoz article or to compete.com???


This is just another example of people with uncertain educational backgrounds trying to do experiments and drawing improper conclusions.
This is what I thought of the seomoz article.


When graphed as a percentage of total Internet traffic, or something like Compete's sample size, the traffic of those sites would form an almost perfectly horizontal sliver of a line right at the bottom.
Yep. Although I hate the "big brother" aspects of how compete gets their data, what I like about it is that it eliminates the "webmaster" skew that Alexa has. This could actually help dispel the popularity myth that some web publisher related sites have. For instance, according to Compete.com, my chemistry site is more popular than Sitepoint.

Quite simply, because webmasters seem to be the primary ones who install the Alexa toolbar, Sitepoint's popularity is being heavily overstated by Alexa. At the same time, because the groups that tend to visit my site are prohibited from using the Alexa toolbar (corporate and academic IT departments), my site's popularity was being understated.

What I've observed about Compete.com's data for my site is that at least at my traffic levels its methodology is producing some very accurate results. Compete.com could really shake up the web advertising marketplace.

KLB
03-22-2007, 09:51 PM
That would be an interesting study to see how accurate all the different metrics are at different traffic levels.

It would be interesting to know, although I suspect that Alexa loses accuracy on any site that gets less than a couple million visits per month. One thing that is interesting about Compete.com is that I get the distinct impression that their popularity ranking is based upon individual users, not visits or page views, whereas Alexa ranks (or at least did) based on pageviews. This difference in methodology could really reduce the "forum" factor.

Chris
03-23-2007, 06:03 AM
Okay I'm really confused.

What is this comment referring to? Is it referring to the seomoz article or to compete.com???

The SEOmoz one. They couldn't find any accuracy because all the sites had traffic so small that the difference between them was within a reasonable margin of error for the total sample size.

Westech
03-23-2007, 07:17 AM
This is a really cool tool. Recently a new competitor of ours appeared out of nowhere and started going around claiming huge success and traffic numbers in an insanely short period of time. They pointed to their Alexa ranking as "proof" of this. Looking them up on compete.com confirms my suspicion that they were artificially inflating their Alexa rank. :p

KLB
03-23-2007, 09:37 AM
This is a really cool tool. Recently a new competitor of ours appeared out of nowhere and started going around claiming huge success and traffic numbers in an insanely short period of time. They pointed to their Alexa ranking as "proof" of this. Looking them up on compete.com confirms my suspicion that they were artificially inflating their Alexa rank. :p

Although I'm seriously freaked about my click stream data being sold, I do see that compete.com could help "out" those who are inflating their popularity via Alexa. For years many of us have been screaming about how flawed Alexa data is, but the rebuttal excuse was "well it's the best we got". Now we can prove just how inaccurate Alexa is AND provide people with a better alternative--at least for U.S. traffic stats.

Now if we could just convince advertisers to look at compete.com stats instead of Alexa stats.

ZigE
03-24-2007, 03:49 AM
Hmm, is it accurate for anyone else?

I've been running some rough numbers on a few sites, and it seems way too low compared to what it should be.

rpanella
03-24-2007, 05:01 AM
For my biggest site it shows about 50% of the actual US uniques reported by Google Analytics.

KLB
03-24-2007, 06:53 AM
How much traffic are you talking about per day? Also, it looks at absolute "users," not visits, which could be the reason for the "low" number compared to what you expect.

As Chris pointed out the lower the traffic numbers for a site the more inaccurate the data will be from any survey.

Really, the way this data should be used is to compare different sites, because at least with the free, no-login-required option we don't see the number of visits or page views for a specific site, and we cannot extrapolate this from the information provided: the first tab is "people," the second is "rank," and the third is "pages per visit." From this we cannot determine the number of visits a site gets per month.

For advertisers these three tabs really are the most important pieces of information, as they can help determine how many people their ads will get in front of, and the ranking will help give a relevant popularity comparison between different sites. The added benefit is that this data is not skewed the way Alexa's is. Even if it does turn out to undercount the actual number of people who visit a site, it will do so in a consistent manner across all sites, so site comparisons will still be relevant.

BTW here is a comparison for three popular webmaster related forums (Webmaster world, Sitepoint & Digitalpoint): http://snapshot.compete.com/webmasterworld.com+sitepoint.com+digitalpoint.com+

Contrary to what some posts in these forums have suggested, Sitepoint's popularity continues to climb in spite of many of the changes they have made over the last year and a half.

Chris
03-24-2007, 07:17 AM
SP also, I think, has a more international audience, especially Australian.

KLB
03-24-2007, 07:30 AM
Right, but Compete.com does not report international traffic. It only reports U.S. traffic.

rpanella
03-24-2007, 04:05 PM
How much traffic are you talking about per day? Also, it looks at absolute "users," not visits, which could be the reason for the "low" number compared to what you expect.

Google Analytics reports 1.8m absolute uniques in Feb and about 52% US so about 900k absolute unique US, and Compete reports between 400k-500k uniques from the US in Feb.

KLB
03-24-2007, 07:50 PM
The question comes down to the definition of unique visitors. GA may be defining it differently because of its ability to place cookies on individual users' computers and users' ability to delete/refuse cookies. Compete, on the other hand, may be counting uniques as unique IP addresses and/or unique ISP user accounts. This difference in the definition of uniques could easily explain the difference.

rpanella
03-24-2007, 08:10 PM
The question comes down to the definition of unique visitors. GA may be defining it differently because of its ability to place cookies on individual users' computers and users' ability to delete/refuse cookies. Compete, on the other hand, may be counting uniques as unique IP addresses and/or unique ISP user accounts. This difference in the definition of uniques could easily explain the difference.


The difference between uniques based on cookies vs. IPs would be less than a percent, and we are talking about them missing half a million uniques.

I personally would assume that Google Analytics (which matches up with all my ad network impressions, server logs, etc.) is probably closer to the actual number, and that Compete.com, which uses toolbar statistics and a tiny fraction of ISP logs, is actually reporting only about half my visitors.

You are basically saying that this is not the case, that Compete is accurate and Google is probably overcounting by a factor of two since it might use cookies as well. This is a ridiculous conclusion; it is far more likely that Compete is simply not as accurate as you concluded based on the single instance where it is accurate for your site.

Chris
03-24-2007, 08:29 PM
Yes, you cannot say Compete is 100% accurate, merely more accurate than Alexa. No matter what their sample size, they're still just extrapolating a guess, and that isn't as accurate as anything that directly logs traffic on your site.

rpanella
03-24-2007, 08:51 PM
Agreed Chris, it is likely more accurate than Alexa in many regards (mostly webmaster skew) but probably also less accurate in some regards.

The fact these competitors to Alexa are coming out is great though, because by using them all together (triangulating the data if you will), it is likely possible to get more accurate figures than any one service could provide by itself. Also, just having competition will push them to provide more accurate data.
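The value and the limits of this kind of triangulation can be sketched with invented numbers: averaging several independent, noisy estimates shrinks random error, but a skew that every service shares (say, the same under-sampled audience) survives the average untouched.

```python
import random

random.seed(1)
TRUE_UNIQUES = 100_000  # hypothetical site

def service_estimate(bias=1.0, noise=0.25):
    """One service's guess: the true value times a shared bias,
    times independent random noise (all figures invented)."""
    return TRUE_UNIQUES * bias * random.uniform(1 - noise, 1 + noise)

# Three noisy but unbiased services: the average hugs the truth.
unbiased_avg = sum(service_estimate() for _ in range(3)) / 3

# Three services sharing the same 50% undercount: averaging
# reduces the jitter but cannot recover the missing half.
biased_avg = sum(service_estimate(bias=0.5) for _ in range(3)) / 3

print(f"unbiased average: {unbiased_avg:,.0f}")
print(f"biased average:   {biased_avg:,.0f}")
```

Triangulation helps against independent noise; it does nothing for a flaw every source shares.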

KLB
03-25-2007, 08:10 AM
Yes, you cannot say Compete is 100% accurate, merely more accurate than Alexa. No matter what their sample size, they're still just extrapolating a guess, and that isn't as accurate as anything that directly logs traffic on your site.

Yes, this is part of what I'm saying. I'm also saying that GA is not 100% accurate either. It may be more accurate than Compete.com, BUT it still isn't as accurate as server logs. I'm also saying that the definitions of what qualifies as a unique person are different for Compete.com and GA, and thus they really aren't comparable. If one is comparing two sites, one would draw false conclusions if one tried to compare Compete.com uniques for one site against GA uniques for the other site.

Anytime one tries to compare numbers from two different statistical methodologies, one will draw false conclusions.

KLB
03-25-2007, 08:29 AM
The fact these competitors to Alexa are coming out is great though, because by using them all together (triangulating the data if you will), it is likely possible to get more accurate figures than any one service could provide by itself. Also, just having competition will push them to provide more accurate data.
1) "Triangulating" using bad data will not result in good data--it just can't happen.

2) What is important is not an exact traffic level for a specific site, but its traffic level compared to other sites. For the comparison to be relevant the numbers MUST NOT be skewed due to serious flaws in methodology.

3) Alexa's methodology is flawed beyond repair. It relies upon scientifically and statistically invalid methods of data capture. Sample data is not collected from random individuals, but rather from ones who seek out and install Alexa's toolbar. This requires the individual to know about Alexa, be able to use the toolbar and be willing to install it. The only way Alexa's stats could be fixed would be to abandon the toolbar and shift to buying clickstream data directly from ISPs (as Compete is doing). Alexa's data and toolbar are nothing more than a marketing scheme used by Amazon.com disguised as traffic data. Quite simply, in the case of Alexa, the emperor has no clothes.

4) GA will be inaccurate (although very useful) because it relies on the browser supporting JavaScript and allowing cookies.

5) Comparing server log stats between sites will also be inaccurate because of different methodologies to detect and weed out robots/bots/spiders and different calculations of defining what is a "unique".

Now granted some methods will return better data than others, but one should never be deluded into believing one method is truly accurate.

rpanella
03-25-2007, 04:13 PM
1) "Triangulating" using bad data will not result in good data--it just can't happen.

By taking 3 small samples of data, you create a much bigger sample size, which will reduce the skew of each sample individually. This is basic statistics, which you do not seem to understand. Everything you have said in this thread is along the lines of "Compete knows my exact traffic, therefore they are completely accurate, possibly even more so than direct metrics such as Google Analytics".

rpanella
03-25-2007, 04:44 PM
This thread has gotten very off topic but here is a useful site I found that lets you quickly compare traffic graphs between different sites with all 3 of the services: Alexa, Compete, and Quantcast.

http://www.attentionmeter.com

KLB
03-25-2007, 05:53 PM
By taking 3 small samples of data, you create a much bigger sample size, which will reduce the skew of each sample individually. This is basic statistics, which you do not seem to understand.
But if that data is skewed, it doesn't matter how many data points you pick; the results will still be skewed. There is no way around it.


Everything you have said in this thread is along the lines of "Compete knows my exact traffic, therefore they are completely accurate, possibly even more so than direct metrics such as Google Analytics".
NO, that is what you may be reading into what I said, but it is not what I'm saying. What I'm saying is that Compete is the most accurate indirect traffic reporting service I have seen. Notice the key word "indirect". GA and server logs are direct traffic reporting tools.

I'm also trying to get people to realize that GA uniques are not the same thing as Compete uniques, which are not the same thing as server log uniques. Even if GA, Compete and server logs had recorded all of the traffic to a site, all three would still show a different number of uniques due to the different ways uniques are defined and determined.

The most important things to take away from this thread are that, for those who need a third-party overview of another site's traffic, Compete is a very good source, and that there is absolutely no excuse for still using Alexa.

rpanella
03-25-2007, 06:37 PM
But if that data is skewed, it doesn't matter how many data points you pick; the results will still be skewed. There is no way around it.

So if Alexa toolbars were installed on every computer, they would still have a webmaster skew? That's what you are saying.


NO, that is what you may be reading into what I said, but it is not what I'm saying. What I'm saying is that Compete is the most accurate indirect traffic reporting service I have seen. Notice the key word "indirect". GA and server logs are direct traffic reporting tools.

I'm also trying to get people to realize that GA uniques are not the same thing as Compete uniques, which are not the same thing as server log uniques. Even if GA, Compete and server logs had recorded all of the traffic to a site, all three would still show a different number of uniques due to the different ways uniques are defined and determined.

The most important things to take away from this thread are that, for those who need a third-party overview of another site's traffic, Compete is a very good source, and that there is absolutely no excuse for still using Alexa.

I agree Compete is more accurate for most comparisons, especially across different niches. I never said it wasn't. I'm just saying it is not a reason to completely ignore data from Alexa. Both can provide useful information depending on what you are comparing/measuring.

The only thing I am disagreeing with is when you said the difference between my Compete numbers and Analytics numbers could be explained by counting differences. A unique is a unique; the difference is simply explained by the fact that the <1% of Internet users that Compete samples use my site about half as much as the Internet as a whole does.

KLB
03-25-2007, 09:13 PM
So if Alexa toolbars were installed on every computer, they would still have a webmaster skew? That's what you are saying.
No, that is not what I was saying, because I wasn't discussing Alexa in my comparison. I was comparing Compete, GA and server logs. If the Alexa toolbar were installed on every computer, the skew would be eliminated, but the data still wouldn't agree with the other three because of different definitions of "uniques". What is a "unique"? How do you determine this?


I agree Compete is more accurate for most comparisons, especially across different niches. I never said it wasn't. I'm just saying it is not a reason to completely ignore data from Alexa. Both can provide useful information depending on what you are comparing/measuring.
Alexa should have been totally ignored for years now. Its data is worse than worthless for the following reasons: 1) some anti-spyware applications label its toolbar as spyware and remove it; 2) many IT departments in academic institutions, government and corporations ban the install of Alexa; 3) most regular users never install it on their own; and 4) webmaster/SEO types typically do install Alexa. Those four factors cause Alexa's sample to be extremely skewed and way too small to be statistically relevant. The only one who gains from Alexa is Amazon.com, in that it theoretically improves their ability to target book and purchase suggestions when a person goes to Amazon's website.


A unique is a unique; the difference is simply explained by the fact that the <1% of Internet users that Compete samples use my site about half as much as the Internet as a whole does.
Again, define unique and explain how a unique would be calculated. How do you ensure that automated processes like site scrapers/bots/spiders/etc. don't get counted as uniques?

rpanella
03-25-2007, 10:17 PM
Regardless of how you define unique visitors, using direct metrics will give you a very accurate count of the true number of unique people that come to your site. These would include Google Analytics or various server stat packages, ad network stats, etc.

Regardless of how you define unique visitors, indirect metrics such as Alexa, Compete, and Quantcast can only "guess" how much traffic you get based on a very small sample of statistics they collect.

You have an extreme bias against Alexa; that's fine, no one is forcing you to use it. I'm simply saying you cannot compare a direct metric such as Analytics to an indirect one such as Compete and explain the difference in numbers by different definitions of what a unique is. It has nothing to do with that.

I've wasted enough time trying to explain this to you, and you continue to go on tangents bashing Alexa or comparing counting uniques with IPs or cookies, which is not what this thread or I was ever talking about.

AmbulanceBlues
03-25-2007, 11:07 PM
Not to step into somebody else's argument, but...

I was intrigued by the "Big Brother" aspect of how they get their clicks. I went to the Compete FAQ, and it does an excellent job of using a whole lot of words to tell you almost nothing about that subject.

It seems to me that when they refer to getting click information from ISPs, that could mean one of two things (actually, I guess it could be both or neither):
1. They buy the logs of an ISP's customers' activities OR
2. They buy the logs of random internet traffic going through an ISP's Internet backbone exchange.

The first I would have a serious problem with, because a company with whom I hold an account that I pay for should assume that any data I generate is part of my account's activities. To me, that's just like my bank selling "randomized" transaction information. It would be wrong.

The latter, however, I don't think I really have a problem with. The nature of the internet means that your web surfing is going to be routed through numerous exchanges owned by entities that owe you no privacy. If I operate an exchange on the internet backbone, I'm accepting every signal that comes to me without condition and routing them appropriately without fealty to anyone but my own clients who benefit from the service. In that situation, I wouldn't have any compunction about selling logs that don't directly relate to my clients or anybody else who doesn't have a reasonable expectation of privacy.

It seems Compete would want to go with some version of that mechanism because it's also less geographically limited. If they were buying surfing information about an ISP's customers, it would necessarily be limited to that ISP's customers. (Duh...) What I mean to say is: what if all of their "Two Million Users" are customers of ISPs in the San Francisco Bay area, or the Houston area, or any other limited (or dispersed but patchy) area?

The "Two Million Users" statement does kind of put a kink in my theory. But if they're doing what I think they're doing, I don't see any privacy problems there. You don't have a reasonable expectation that information you send over the internet (without efforts made to secure it via tunneling and/or encryption or whatever) will remain private. If the ISPs are doing that, it seems to me they're only monetizing an opportunity that presented itself.

EDIT:

Okay, I went back and re-read the articles in the first post, and it appears I was wrong. When I first perused them it sounded like they were just making conjecture like me. But I still think my idea is better (and more ethical) than what Compete is doing. When they rip me off, remember you heard it here first.

rpanella
03-26-2007, 12:00 AM
Very good point AmbulanceBlues, data from a backbone provider would be more ethical and much more random.

Data from a particular ISP alone can be very biased. Most ISPs have a start page whose search is powered by either Google or Yahoo. My site is basically banned from Yahoo, and 85% of my traffic comes from Google, with the rest from links or type-in. Do you think my site would be fairly represented by data taken from all the Yahoo-powered SBC DSL users? Not at all, but an ISP powered by Google would probably overestimate my traffic. Also, dialup users are overall probably less tech savvy, so those numbers would undercount stats for tech sites. You would need to buy random stats from all ISPs to get rid of the bias.

KLB
03-26-2007, 06:20 AM
AmbulanceBlues has touched on an aspect of this issue that I tried to touch on and has not been well discussed.

Rpanella, you are absolutely correct that getting data from only a few ISPs would be skewed, and I touched on this in post #10 (http://www.websitepublisher.net/forums/showpost.php?p=56086&postcount=10)

The backbone idea for collecting random data would be good; however, it would still need to be the backbones of several providers, AND it would miss data that never made it to a backbone. For instance, if the website and the user were on the same ISP, those requests might never get routed to a backbone.

I suspect that Compete is buying data that is culled from ISP logs. The question is how detailed this data is. I do suspect a serious potential for privacy issues if this is not handled properly (a la the AOL search log release).

Westech
03-26-2007, 08:52 AM
compete.com is showing less than 1/3 of the unique US visitors that Google Analytics is reporting for me. Compete seems better than Alexa for comparing relative traffic across multiple sites, but they seem pretty worthless as far as reporting actual visitor numbers.

KLB
03-26-2007, 08:58 AM
compete.com is showing less than 1/3 of the unique US visitors that Google Analytics is reporting for me. Compete seems better than Alexa for comparing relative traffic across multiple sites, but they seem pretty worthless as far as reporting actual visitor numbers.

I'll have to look closer at the "unique" traffic comparisons between GA and Compete for my site.

Unlike Alexa, Compete might simply be reporting the actual traffic it sees rather than trying to extrapolate "real" traffic numbers. If this is true then it would only be useful for comparing one site against another, but not for getting actual traffic levels of a specific site.

For my own traffic, what I really care about is pageviews and visits per day, and pageviews per visit. I don't really care about total unique visitors per month because I see that as a totally fictitious number regardless of methodology (nobody has given me a definition of "uniques" yet).

Westech
03-26-2007, 09:06 AM
For me what I really care about with my own traffic is pageviews and visits per day and total pageviews per visit. I don't really care about total unique visitors per month because I see this as a totally fictitious number regardless of methodology (nobody has given me a definition of "uniques" yet).

What I'm referring to as "uniques" here is monthly unique visitors -- how many unique people have visited the site within a one-month period. The same person visiting the site multiple times during that month is only counted once. Of course, even with this specific definition the numbers will vary somewhat depending on how unique visitors are tracked (cookies or IPs, JavaScript or images, etc.).

Absolute monthly uniques is the main metric that most big ad buyers want to know when evaluating direct ad buys. Their main concern seems to be how many unique eyeballs will be seeing their ad.
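Mechanically, that deduplication is simple. Here is a minimal sketch, using an invented log format and cookie IDs as the visitor key, of the counting that "monthly uniques" implies. It also makes the limitation concrete: the dedup is only as good as the key, and cookies get deleted and computers get shared, which is exactly the objection raised later in this thread.

```python
from collections import defaultdict

# Hypothetical log: (month, cookie_id) pairs, one per visit.
log = [
    ("2007-03", "abc"), ("2007-03", "abc"), ("2007-03", "def"),
    ("2007-03", "ghi"), ("2007-04", "abc"), ("2007-04", "xyz"),
]

# Deduplicate visits within each month by cookie ID.
uniques_per_month = defaultdict(set)
for month, cookie in log:
    uniques_per_month[month].add(cookie)

for month in sorted(uniques_per_month):
    print(month, len(uniques_per_month[month]))
# 2007-03 has 3 uniques and 2007-04 has 2, even though "abc" visited
# March twice: repeat visits within the month count once. Note "abc"
# counts again in April, since each month is deduplicated separately.
```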

KLB
03-26-2007, 09:40 AM
What I'm referring to as "uniques" here is monthly unique visitors -- how many unique people have visited the site within a one-month period.
Great, but how do you determine this? No method can truly track it. People share computers, IP addresses, browser profiles, etc. People also use different computers, connect from different locations, use different browsers, etc. Cookies get deleted, IP addresses change, and so on. Quite simply, the concept of "unique" visitors per month is a fraud. At best you can determine unique visitors per day.

For example, if a teacher uses a site like mine in her class, she might teach the same subject to three or four classes. These students would share classroom computers, so multiple "uniques" would actually be sharing an IP address and cookies. A user connecting via dialup could get a different IP address each time, or share the same address with thousands of other users (e.g. AOL), so you can't rely on IP addresses to determine uniques. Many users could use the same computers in an Internet cafe or library and thus share the same cookies.

And how do you deal with bots and spiders that spoof regular browsers? More and more of these bots are part of zombie networks that infect regular computers and scrape pages from multiple "regular" IP addresses, so non-human visitors are becoming nearly impossible to separate from human visitors.
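To make the bot problem concrete: the traditional filter is to match the User-Agent header against known crawler strings, as in this minimal sketch (the token list here is illustrative, not exhaustive). It works for honest bots and does nothing at all against a zombie-network scraper sending a stock browser UA, which is the point above.

```python
# Naive bot filtering by User-Agent substring. Honest crawlers identify
# themselves; spoofing bots do not, so they pass straight through.
KNOWN_BOT_TOKENS = ("googlebot", "slurp", "msnbot", "spider", "crawler")

def looks_like_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

# An honest crawler is caught:
print(looks_like_bot("Googlebot/2.1 (+http://www.google.com/bot.html)"))  # True
# A scraper spoofing a stock browser UA sails right through:
print(looks_like_bot("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"))  # False
```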

Chris
03-26-2007, 10:00 AM
Great, but how do you determine this? No method can truly track it. People share computers, IP addresses, browser profiles, etc. People also use different computers, connect from different locations, use different browsers, etc. Cookies get deleted, IP addresses change, and so on. Quite simply, the concept of "unique" visitors per month is a fraud. At best you can determine unique visitors per day.

Daily uniques aren't immune to those same issues. The same percentage of the whole should be affected, and thus the margin of error is the same.

KLB
03-26-2007, 10:30 AM
Daily uniques aren't immune to those same issues. The same percentage of the whole should be affected, and thus the margin of error is the same.

You are correct that daily uniques are not immune, but they would be less susceptible to factors like different users sharing the same computer or different computers sharing the same IP address. There would also be less skew from the factors described above. Notice the key word "less".

They say there are three types of lies: lies, damn lies and statistics. I'd say there is a fourth lie, "uniques". :p