1) "Triangulating" using bad data will not result in good data--it just can't happen.
2) What matters is not the exact traffic level of a specific site, but its traffic level compared to other sites. For that comparison to be meaningful, the numbers MUST NOT be skewed due to serious flaws in methodology.
3) Alexa's methodology is flawed beyond repair. It relies on scientifically and statistically invalid methods of data capture. Sample data is not collected from random individuals, but from those who seek out and install Alexa's toolbar. That requires the individual to know about Alexa, be able to use the toolbar, and be willing to install it. The only way Alexa's stats could be fixed would be to abandon the toolbar and buy clickstream data directly from ISPs (as Compete is doing). Alexa's data and toolbar are nothing more than a marketing scheme for Amazon.com disguised as traffic data. Quite simply, in the case of Alexa, the emperor has no clothes. (The first sketch below shows how this kind of self-selected sample can even invert a ranking.)
4) Google Analytics will be inaccurate (although very useful) because it relies on the browser supporting JavaScript and accepting cookies; visitors with either disabled are simply never counted (second sketch below).
5) Comparing server log stats between sites will also be inaccurate, because different tools use different methods to detect and weed out robots/bots/spiders, and different definitions of what counts as a "unique" (third sketch below).
Now, granted, some methods will return better data than others, but one should never be deluded into believing that any method is truly accurate.
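To make point 3 concrete, here is a minimal simulation of self-selection bias. Every number in it (the "true" traffic shares, the toolbar install rates) is hypothetical and invented purely for illustration; this is not Alexa's actual data or method, just the general mechanism:

```python
import random

random.seed(42)

# True traffic: site A gets 70% of visits, site B gets 30% (assumed).
TRUE_SHARE = {"siteA": 0.70, "siteB": 0.30}

# Hypothetical install rates: site B's tech-savvy audience installs the
# toolbar far more often, so its visits are over-sampled.
INSTALL_RATE = {"siteA": 0.01, "siteB": 0.05}

def simulate(visits=100_000):
    sampled = {"siteA": 0, "siteB": 0}
    for _ in range(visits):
        site = "siteA" if random.random() < TRUE_SHARE["siteA"] else "siteB"
        # A visit is only observed if the visitor has the toolbar installed.
        if random.random() < INSTALL_RATE[site]:
            sampled[site] += 1
    return sampled

observed = simulate()
total = sum(observed.values())
for site, count in observed.items():
    print(f"{site}: true share {TRUE_SHARE[site]:.0%}, "
          f"measured share {count / total:.0%}")
# Typical output: siteA measured near 32%, siteB near 68% -- the ranking
# is inverted, and collecting more toolbar data never fixes it.
```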
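Point 4 is a simple multiplication. The rates below are assumptions for illustration, not measured browser statistics: a JavaScript page tag only records visitors whose browsers both execute the script and accept its cookie.

```python
# Back-of-the-envelope sketch (rates are hypothetical, not measured):
# a page-tag counter only sees visitors whose browsers run JavaScript
# AND accept its cookie, so raw counts understate real traffic.

true_visitors = 10_000          # assumed actual unique visitors
js_enabled = 0.97               # hypothetical share with JavaScript on
cookies_allowed = 0.90          # hypothetical share accepting the cookie

measured = true_visitors * js_enabled * cookies_allowed
print(f"measured: {measured:.0f} of {true_visitors} "
      f"({measured / true_visitors:.0%})")   # ~8730, roughly a 13% undercount
```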
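And for point 5, a sketch of why two log analyzers can report different "unique" counts from the same raw log. The log entries, the bot-signature list, and both definitions here are hypothetical; real packages each bake in their own choices, which is exactly why cross-site comparisons drift apart.

```python
from datetime import datetime

logs = [
    # (ip, user_agent, timestamp) -- made-up entries for illustration
    ("1.2.3.4", "Mozilla/5.0", datetime(2008, 1, 1, 9, 0)),
    ("1.2.3.4", "Mozilla/5.0", datetime(2008, 1, 1, 21, 0)),
    ("5.6.7.8", "Googlebot/2.1", datetime(2008, 1, 1, 10, 0)),
    ("9.9.9.9", "Mozilla/4.0", datetime(2008, 1, 1, 11, 0)),
]

BOT_SIGNATURES = ("bot", "spider", "crawl")  # one tool's list; others differ

def uniques_by_ip_per_day(entries):
    """Definition A: unique = distinct IP per calendar day, bots filtered."""
    seen = set()
    for ip, ua, ts in entries:
        if any(sig in ua.lower() for sig in BOT_SIGNATURES):
            continue
        seen.add((ip, ts.date()))
    return len(seen)

def uniques_by_ip_ua_no_filter(entries):
    """Definition B: unique = distinct IP + user-agent, no bot filtering."""
    return len({(ip, ua) for ip, ua, _ts in entries})

print(uniques_by_ip_per_day(logs))        # 2
print(uniques_by_ip_ua_no_filter(logs))   # 3 -- same log, different answer
```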