
View Full Version : sandbox proof based on my definition



Kyle
04-05-2007, 06:56 PM
I will defend the sandbox effect to my death, based on my definition. I get offended when people who I know do not develop or interact with numerous new sites on a regular basis instantly say the sandbox is false. In other words, I am annoyed with people who maintain one or two big sites, rather than being involved with new sites launching regularly, and so have no real view of the latest effects Google applies to newly established sites. Keep in mind, I am all for maintaining fewer large sites rather than regularly creating small, insignificant ones. However, I interact with many different webmasters who create small sites regularly, consulting for them and having complete access to, and understanding of, their link building methods.

My definition of the sandbox is a delay before a NEW site ranks on the primary term targeted by the anchor text of external incoming links. What is weird is, if your site has a unique name that doesn't appear regularly on other sites (e.g. something brandable like "Hoppity Woppity.com"), then this "sandbox" effect does not happen.

This "sandbox effect" likely happens due to ONE or ALL of the following...

- Google delaying the benefit of a link. As always, the more incoming links you have, the less time it will take to rank.

- Beyond the pagerank value of the incoming link always being important (its importance may have changed over the years; this thread is not a debate on pagerank), how topic-related the link is MAY have an effect on the sandbox effect.

- As I already mentioned, the term you are targeting. It would be very easy for the Google algo to know the difference between a unique site name and a name based on a competitive term. This is why a site with a unique name does not experience the sandbox when ranking on the two keywords making up its unique name.
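The first bullet, a delay before an incoming link confers any benefit, can be written down as a toy scoring model. To be clear, the 180-day probation window, the link weights, and the function itself are invented for illustration; nothing here is a confirmed Google mechanism.

```python
from datetime import date, timedelta

# Toy model of the "delayed link benefit" hypothesis: a link contributes
# nothing until the crawler has known about it for some probation period,
# then contributes its full weight. The 180-day window is a made-up guess.
PROBATION = timedelta(days=180)

def link_score(links, today):
    """links: list of (weight, first_seen_date) pairs."""
    return sum(w for w, first_seen in links if today - first_seen >= PROBATION)

links = [(1.0, date(2006, 7, 1)), (2.0, date(2006, 8, 15))]
print(link_score(links, date(2006, 9, 1)))  # 0: both links still on probation
print(link_score(links, date(2007, 3, 1)))  # 3.0: both links now count
```

Under a model like this, a site whose links all arrive in its first two months would show exactly the pattern described below: no movement for months, then a sudden jump once the whole batch of links matures at roughly the same time.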

My proof?

I have had controlled experiments with friends MFA sites.

Example 1: www.socialanxietydisorder.net. It took 9 months for this site to rank on Google for 'social anxiety disorder'. ALL of the incoming links were obtained in the first 2 months after launch. There was no gradual increase in the rankings: 9 months after the site launched, it suddenly appeared in the top 20 on Google, and THEN it slowly increased in rankings like a normal site from that 9 month mark.

Example 2: www.anxietyinsight.com - Same story, but it only took 7 months to rank on 'agoraphobia'.

Example 3: www.dust-mites.org - Not in the top 1000 results on Google for 6 months. All incoming links were established early on, and no additional ones have been obtained. This site received traffic from day 1 on obscure random terms related to mites, even terms that included 'dust mites' (it would receive traffic on terms like "bed dust mites"). But when searching on 'dust mites' itself, it wouldn't be in the top 1000. As of the past month, it has been jumping around the top 100; at the time of this post it is ranked 101.

Example 4: www.insectidentification.org - Obtained a DMOZ listing almost immediately after launching with the anchor text Insect Identification (among many other links). Took over 6 months to rank in the top 100 on the term 'insect identification'. Then normal, gradual increases in rank happened, now it is #1.

Out of all these sites, insect identification is my best example. It is not an MFA site, and it is not targeting some hugely competitive term. I realize DMOZ isn't everything, so please don't bash me with "DMOZ links aren't gold, stop placing so much emphasis on that." BUT a DMOZ listing should at least put you in the top 500 on Google fairly quickly for the not-so-competitive term 'insect identification', shouldn't it? It took 6 months for www.insectidentification.org to crack the top 100 on a not-so-competitive term. Then it just jumped to #1 quickly once it cracked the top 100.

I consider myself a very logical and experienced web developer, and I think those who I interact with regularly here would acknowledge my experience. I do not believe in any of the garbage that comes out of WMW, from the -950 theory to the "pray to Google 7 times a day to increase your rankings" theory. However, wherever the term 'sandbox' came from, I believe the originator of this theory is on to something, especially since this theory can be tested.

Kyle
04-05-2007, 07:19 PM
The simplified point is..

There is definitely some delay that exists related to new sites. And this delay did not exist pre-2003 (approximately).

That is what the sandbox is. The details can be debated, but there is something. And ever since the term was first coined, it has always referred to a delay in ranking for new sites.

sandbox = delay

details = debatable

ozgression
04-05-2007, 09:30 PM
Well said Kyle. I believe the age of the links is the crucial factor.

ZigE
04-06-2007, 01:51 AM
Nice post Kyle. I'm in the process of creating a bunch of new sites, so it's good to get a realistic timeframe for ranking in Google.

Did you manage to measure the effect of incoming pagerank with regard to time?

Mike
04-06-2007, 02:15 AM
I'm not too sure...

I recently launched a new site, got links and within weeks I was ranking for terms. It has to be said I was going for a lot of terms for my subpages, not just one big one for my homepage.

andyf
04-06-2007, 04:52 AM
I'm not too sure...

I recently launched a new site, got links and within weeks I was ranking for terms. It has to be said I was going for a lot of terms for my subpages, not just one big one for my homepage.

Yes...that's very true...
Try to get as many backlink for innerpages also as you do for your home page;

Chris
04-06-2007, 08:03 AM
I've experienced similar things. However, I don't think we fully understand the system, especially since Google reps have said that there is no sandbox as per the initial definition, and that instead what we sometimes see is a perfect storm of filters and other systems that can create a sandbox-like effect.

Kyle
04-06-2007, 10:23 AM
Nice post Kyle. I'm in the process of creating a bunch of new sites, so it's good to get a realistic timeframe for ranking in Google.

Did you manage to measure the effect of incoming pagerank with regard to time?

Ever since the Google dances stopped being part of their system, Google reps have said that pagerank calculations are supposed to be near instant.

They crawl an incoming link to your site, and you get the pagerank.

The system that is in place is complicated, and I have no desire to figure it out. I only posted this thread to let people know that there is a delay... that is all I'm sure of.

This delay is just another example of how and where you should spend your time. Spend it on your business. KNOW that you are writing good content. Don't worry about search engines; worry about your content and getting links from sites that respect your content and find it interesting.

Realize that the fundamentals of SEO have not changed much, and the challenge is in your content writing: making it more unique and interesting than the competing sites'.

Kyle
04-06-2007, 10:37 AM
I've experienced similar things. However, I don't think we fully understand the system, especially since Google reps have said that there is no sandbox as per the initial definition, and that instead what we sometimes see is a perfect storm of filters and other systems that can create a sandbox-like effect.

We definitely don't understand the system, nor will we. Chances are the system will have changed even more by the time we are close to understanding it. Experimentation, as I mentioned in a blog comment, is really going out the window compared to how it used to be.

All that's important is using the same fundamental SEO techniques that haven't really changed over the years.

Billyray
04-06-2007, 09:47 PM
I recently registered a domain and put up a basic page, and somehow the Google bot came along and had a sniff.

The weird thing is there are no links to it and I haven't submitted it anywhere. The only thing I can think of is that I may have looked up the domain history to see if it had been registered before.

Anyway, it was ranked number 9 for about a week and then nothing. The domain name is somethingcalculator dot com. Weird stuff.

KLB
04-06-2007, 10:26 PM
I have a perfect site to run an experiment on. It has been live for over a year, but I haven't done anything beyond getting its skin up. It has maybe two links to it and only a PR1.

The site's domain is RemoteComputerNetworking.com (please don't link to it yet). My plan was to work with my brother to create a resource dealing with building and maintaining computer networks under the most extreme and remote conditions on earth. The problem is my brother shipped off to Armenia with the Peace Corps to build computer networks in an extremely impoverished country with whatever scraps of hardware he could scrounge up on almost zero budget (I already sent him most of my boneyard). Simply put, he is the computing world's equivalent of MacGyver.

So basically the site isn't getting very far because the chief brain on this topic is off in some extremely remote destination gaining great experience for the mission of the site.

This site is perfect because it is indexed by Google and has aged past the "sandbox" effect cut off everyone seems to believe in, but essentially there is nothing to the site that would have helped it build up all the characteristics that sites traditionally build up. If and when I can finally start to get original articles for the site we'll be able to see how long it takes to get traction in the SERPs.

Todd W
04-06-2007, 10:39 PM
I have a perfect site to run an experiment on. It has been live for over a year, but I haven't done anything beyond getting its skin up. It has maybe two links to it and only a PR1.

The site's domain is RemoteComputerNetworking.com (please don't link to it yet). My plan was to work with my brother to create a resource dealing with building and maintaining computer networks under the most extreme and remote conditions on earth. The problem is my brother shipped off to Armenia with the Peace Corps to build computer networks in an extremely impoverished country with whatever scraps of hardware he could scrounge up on almost zero budget (I already sent him most of my boneyard). Simply put, he is the computing world's equivalent of MacGyver.

So basically the site isn't getting very far because the chief brain on this topic is off in some extremely remote destination gaining great experience for the mission of the site.

This site is perfect because it is indexed by Google and has aged past the "sandbox" effect cut off everyone seems to believe in, but essentially there is nothing to the site that would have helped it build up all the characteristics that sites traditionally build up. If and when I can finally start to get original articles for the site we'll be able to see how long it takes to get traction in the SERPs.


IMHO you can't "test" with one site. Since we don't know exactly how the system/delay/"sandbox trigger" works, we can't test one site and come to any conclusion. I think it's pretty clear some sites get penalized or "reviewed" by Google's "special sandbox system" while others make it through and are in the clear.

s2kinteg916
04-06-2007, 11:04 PM
BillyRay, that's a temp listing. It's normal; it only lasts for a week or so.

KLB
04-07-2007, 06:23 AM
IMHO you can't "test" with one site. Since we don't know exactly how the system/delay/"sandbox trigger" works, we can't test one site and come to any conclusion. I think it's pretty clear some sites get penalized or "reviewed" by Google's "special sandbox system" while others make it through and are in the clear.

No, you can't, but I do have another site like the one mentioned above that was set up but never pursued. I also have some domains that have been registered for well over a year but never developed beyond being parked at Sedo.

The favored "cause" of the "sandbox effect" is BS. There is no evidence of a causal relationship between the age of a site and it being "penalized" in the SERPs. At most there is a correlation, and correlation does not mean causation. It isn't the responsibility of people like me to disprove the causation.

It is the responsibility of those who want to perpetuate this conspiracy theory to provide evidence that supports the claim of causation AND to show evidence that exonerates other potential causes of the "penalty".

In other words, it isn't the responsibility of the skeptics to disprove this myth (one cannot disprove a conspiracy theory); it is the responsibility of the "believers" to provide verifiable evidence of causation.

People REALLY need to read Chris' latest SEO blog about scientific principles (http://www.websitepublisher.net/blog/2007/04/04/is-seo-science-or-marketing/).

For those who want to continue to believe in things like the sandbox effect, go ahead and be a self-defeatist. The rest of us can move forward and become more successful by listening to the wisdom of people like Chris, who actually apply scientific principles to their research.

By believing in unsubstantiated claims like a domain's age being a penalty factor in search algorithms, all you are doing is hurting yourself, thereby being less of a competitive threat to the rest of us.

Selkirk
04-07-2007, 08:46 AM
Google has a patent on information retrieval based on historical data. It is hard to say what they use and what they don't.

My theory is that when Google first discovers a page, it assigns it a PR based on the PR of the domain. That allows established domains to rank immediately with "fresh" content.

Then Google comes along and calculates the "real" ranking a few days to a week later. At that point, the page lives on its own.

The secret sauce that I think a lot of people miss is the click data that Google collects from the SERPs page. I think Google will put weight on this data if it has it. A surprising amount of weight, I think. This keeps the top of the SERPs more stable. If Google is showing a result for a search and nobody is clicking on it, Google can see it's obviously not relevant for that term and weight it down. On the other hand, if a marginal page gets better than average click-through, Google can boost it.

The interesting thing happens when an established domain gets ranked high in the SERPs for a term due to "freshness" but happens to get very good click-through. Google will keep the term in the SERPs despite the lack of other SEO factors, like incoming links. To outside observers it looks like the page ranks because it is from a "trusted domain". That is only partially true: being from a trusted domain got it ranked during the "freshness" period; getting clicked on keeps it there.

Of course, new domains miss out on the freshness opportunity and click data. They have to fight their way up from the bottom, while established domains have the opportunity to start at the top.
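The click-feedback idea above can be sketched as a toy re-scoring function. The expected-CTR table, the 50/50 blend, and the function itself are all assumptions made up for this sketch, not anything Google has documented.

```python
# Toy sketch of the click-feedback hypothesis: blend a page's base relevance
# score with its observed click-through rate relative to an assumed average
# CTR for its SERP position. All numbers are invented for illustration.
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10}  # assumed per-position averages

def adjusted_score(base, position, clicks, impressions, click_weight=0.5):
    if impressions == 0:
        return base  # no click data yet, so rank on the base score alone
    ratio = (clicks / impressions) / EXPECTED_CTR[position]
    return base * ((1 - click_weight) + click_weight * ratio)

# A page at #2 drawing double the expected clicks gets boosted,
# while a page nobody clicks sinks toward half its base score.
print(adjusted_score(10.0, 2, 30, 100))  # 15.0
print(adjusted_score(10.0, 2, 0, 100))   # 5.0
```

Note how this reproduces the "freshness" story: a page that starts with no click data ranks purely on its base score, and only keeps its spot if users actually click it.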

KLB
04-07-2007, 09:50 AM
Selkirk, you have put forth some very interesting ideas that may have a great deal of merit. This goes back to looking for the real causes of the "sandbox". If the "sandbox" is completely arbitrary and applied against a site primarily because of the site's age, then there is nothing one can do but wait it out. For this reason I look at the traditional view of the sandbox as self-defeating.

Based on the ideas Selkirk put forth, one can develop a strategy that will greatly help a site succeed even if the ideas the strategy is founded upon are wrong.

What I see from Selkirk's post is that one needs to focus on the user by creating compelling content or services. Accurate and compelling page titles, good descriptions, and descriptive initial text on the page are all important to getting compelling listings in SERPs that will encourage users to click through to the page in question. When doing SEO one must not forget to do UO (user optimization).

Now, in regards to new sites having to fight their way up from the bottom: why do they have to fight their way up from the bottom? Is it because they are "new" and Google penalizes "new" sites, or is it because new sites have not established a track record of favorable characteristics, like NATURALLY/ORGANICALLY collecting back links from lots of diverse as well as related sites?

Going on a link buying binge, while maybe helpful, isn't a replacement for the organic links that a good site builds up over time. Chris recently posted a great and really long article on link building methodologies (http://www.websitepublisher.net/blog/2007/03/30/link-building-methodologies/). I highly recommend reading it if you haven't already.

Kyle
04-07-2007, 05:23 PM
All good thoughts... but what about the most important point of my post: a site going from not being ranked in the top 1000 (for 6+ months) to suddenly ranking in the top 50 one day, and from that day on being treated as a regular site?

Kyle
04-07-2007, 05:27 PM
Selkirk, your theories are awesome, but they do not apply to what I was bringing up.

I apologize for being critical of some of the responses so far... but your example of a fresh boost has nothing to do with the classic sandbox symptoms related to ranking on the "two word" name of your site (if the two word name is competitive).

I was not bringing up the classic up/down rank life based on freshness for a new article added to your site.

The most complicated and confusing aspect, AND the reason the 'sandbox' theory got any attention from anyone, was the FRONT page of your site (e.g. www.dust-mites.org) not ranking in the top 1000 for MONTHS on the obvious competitive terms the site is targeting (dust mites).

This thread so far is like hundreds of other threads on other webmaster forums: getting nowhere in addressing why people believe in the sandbox (the chief example above; yes, I have repeated myself many times now).

Kyle
04-07-2007, 05:37 PM
Ken, I would appreciate it if you addressed my main point before doubting my definition of the sandbox (note again, MY definition).

I don't like your strong words; they are irritating when you dance around my definition, then throw in points that have nothing to do with the definition I provided (like domain age; who said domain age?)

Maybe we should delete this thread and start a new one, where I clearly outline the ONLY issue that should be debated/discussed?

KLB
04-08-2007, 08:48 AM
I will defend the sandbox effect to my death based on my definition.
Okay Kyle, you asked for it, prepare to die. :p

Even by your definition I say the sandbox effect is a myth, at least as regards the causation you claim. Do new sites jump to the top of the SERPs quickly? Normally they don't. While there is a correlation between the age of a site and how well it does in the SERPs, I would submit that there is no proof of causation and thus NO proof to support the sandbox conspiracy theory.

The reason new sites tend to not do well in SERPs has nothing to do with some penalty. The reason new sites don't rank well is that they don't have the favorable characteristics that caused other sites to rank well.

I always notice that any such sandbox definition excludes "non-competitive" search terms. Gee, I wonder why? Could it be that making an exception to the "rule" is the only way to make any sandbox conspiracy theory work? Who defines what a competitive term is anyway? Are search phrases defined as competitive if they appear to "obey" the "rules" of the sandbox effect? The moment one has to make exceptions to the rule is the moment a "theory" starts to have validity problems.


My definition of the sandbox is a delay before a NEW site ranks on the primary term targeted by the anchor text of external incoming links.
This is a pretty vague definition, and it IS NOT in line with the general definition/meaning applied to the term "sandbox effect", which is that the effect is a penalty applied to new sites that trip some "secret" spam triggers in Google's algorithms. That said, your definition implies that in some way new sites are treated differently, using different algorithms, than older sites.

No one can create a new site and expect it to rank well for competitive search terms "overnight". This isn't a "sandbox"; it is a simple truth. It takes time to build up sufficient quantities of favorable characteristics like organic (non-solicited/paid/spammed) back links.

In the real world a company doesn't go from working out of a home garage to being a mighty corporation like HP overnight. The company has to work very hard to promote itself and build up a reputation to grow. This takes time.


What is weird is, if your site has a unique name that doesn't appear regularly on other sites (ex: something brandable like "Hoppity Woppity.com"), then this "sandbox" effect does not happen.
I'm not following what you are saying at all.


This "sandbox effect" likely happens due to ONE or ALL of the following...

- Google delaying the benefit of a link.
Why would Google go to the trouble and added processing load of having two sets of link weighting calculations (one for new sites and one for old sites)? More than likely Google would take the simpler and cheaper route of ranking the value of a link based on the characteristics of the link itself.

What benefit would there be to Google or to the search results in unnecessarily complicating the calculation of the value of links with such irrelevant characteristics as the age of the target site? After all, the age of the target site is irrelevant to the value of the link. There would be no benefit but lots of added cost in terms of processing power. Quite simply, adjusting the value of a link based on the age of the target site makes no sense.

On the other hand it would make total sense to not count or weight ANY links to ANY site unless Google "knew" the link had been around for a while. A time delay to crediting links would be a great way to reduce the impact of link spammers as it would give Google an automated way to ignore fly-by-night FFA (free for all links) sites, and other transient types of pages (e.g. referrer spam in server log reports).

Delaying the "benefit" of ALL new links to ALL sites (regardless of site's age) quite simply would be a very nice automated way to ignore spammy/transient links.

So the reason one sees a sudden jump in SERP position with a site after a period of time may have absolutely nothing to do with the age of the site, but rather the age of the links, which just happened to all come about at about the same time.

Again, this wouldn't be a penalty that specifically targets new sites (i.e. a sandbox); rather, it targets new links. It just so happens that new sites by definition ONLY have new links.


As always the more incoming links you have, the less time it will take to rank.
Well of course; to Google (and now many other SEs) the "link popularity" of a site is an important way to measure the value/popularity of a site. The quantity of well established organic links a site has accumulated is one good "third-party" measure of the value of a site. Google was basically founded on this principle.

The best I can tell is that incoming links to a site have both a page-specific impact and a site-wide impact. The "value" of links to page 'X' is partially passed on to page 'Y' if page 'X' links to page 'Y' (thus the site-wide impact). This is the whole pagerank thing in a nutshell. This is also why the internal linking structure of a site is so very important.

Large and/or well established sites tend to have lots and lots of links to their various pages. Due to the way pagerank is spread within a site (via internal linking), a new page on a site will quickly benefit from the ALREADY ESTABLISHED links to other pages on said site; thus a new page on a well established site will climb faster than a new page on a poorly established site.

Put simply, this isn't a penalty placed against a new site, rather a bonus given to sites that have accumulated lots of organic incoming links and have over time proven themselves. This is why it may be a more successful strategy to focus on a few large sites that stick around for a long time than to focus on lots of small "disposable" sites.
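The point above, that established internal links lift a brand-new page, can be seen in a textbook power-iteration PageRank over a tiny hypothetical site graph. The graph and the 0.85 damping factor are the standard toy setup, not anything specific to Google's actual deployment.

```python
# Minimal power-iteration PageRank. links maps each page to the pages it
# links to; every page here has at least one outlink, so no dangling-node
# handling is needed and the scores keep summing to 1.
def pagerank(links, iters=50, d=0.85):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            share = d * rank[p] / len(outs)
            for q in outs:
                new[q] += share
        rank = new
    return rank

site = {
    "home": ["article", "new_page"],   # both established pages link
    "article": ["home", "new_page"],   # to the brand-new page
    "new_page": ["home"],
}
ranks = pagerank(site)
# The new page, fed by two established pages, already outranks "article".
print(ranks["new_page"] > ranks["article"])  # True
```

The same new page added to a site with no accumulated incoming links would start with far less weight to inherit, which is the whole argument in miniature.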


- Beyond the pagerank value of the incoming link always being important (its importance may have changed over the years; this thread is not a debate on pagerank), how topic-related the link is MAY have an effect on the sandbox effect.
Again, to attribute this to the sandbox effect you are assuming that Google treats links differently for old sites vs. new sites. This doesn't make sense from any perspective. Why treat links differently simply because of the target site's age? It is far simpler and more logical to "score" a link based on its own merits alone, and not add in unrelated factors like the age of the target site. The more complicated an explanation becomes, the less likely it is to be true.

Now, it would make total sense to give different values to links based on how "topic-related" they were, if that were technically possible. This, however, is all about the quality of the link and has no relevance to the age of the target site (which is what the sandbox effect is all about). Why would Google use this quality scoring only for new sites? Wouldn't it make more sense to apply it to all links?


- As I already mentioned, the term you are targeting. It would be very easy for the Google algo to know the difference between a unique site name and a name based on a competitive term. This is why a site with a unique name does not experience the sandbox when ranking on the two keywords making up its unique name.
What does this have to do with the sandbox effect? Nothing. It has long been established that a succinct site title and domain name containing the relevant search terms are highly important factors for ranking well in search results. But again, these are only two of the countless factors Google uses to rank sites. So a good domain name and site title combination might help offset a lack of established back links.


My proof?
I'm not convinced that this proved anything.


I consider myself a very logical and experienced web developer, and I think those who I interact with regularly here would acknowledge my experience. I do not believe in any of the garbage that comes out of WMW, from the -950 theory to the "pray to Google 7 times a day to increase your rankings" theory. However, wherever the term 'sandbox' came from, I believe the originator of this theory is on to something, especially since this theory can be tested.
I have seen NOTHING EVER, not from you, not from anyone else, that substantiates a causal effect between the age of a site and its performance in search results.

Is there a correlation between the age of a site and its inability to crack the top of SERPs? Yes, there probably is, but this does not mean causation, as is being claimed with the "sandbox effect". It could simply be that new sites have not built up the characteristics (like established back links) that Google is "looking for".

I would submit that if I purchased a thousand spammy links for my long established site, those links would have no more benefit to said site than if I bought the same links for a new site, and in both instances any effects would take the same amount of time to be realized.

Again as I pointed out above, if there were any "sandbox", it isn't based on the age of the target site, rather the age of the links themselves (which makes total sense to do).


Kyle, you and I will agree on one thing: it is hard for a new site to break into the top of some SERPs. But this should not be called a sandbox, because that implies the wrong thing. Let's kill off the term "sandbox effect" and look at this matter for what it is. It is not a penalty against new sites; rather, new sites have not accumulated the necessary favorable characteristics.

Calling it a "sandbox" implies that the site is being artificially suppressed exclusively because of its age (or lack thereof) and that once the site has passed that "magical" age it will no longer have those penalties. One could thus be under the mistaken belief that by registering a site and then sitting on it for six months or a year before working on it they can magically avoid the "sandbox", when in fact the sooner they start promoting their site, the sooner the fruits of those marketing efforts will be realized.

In the end, the launching and final success of a site in the SERPs is a process that takes time and lots of work. One should not expect instant gratification. Sure there was a time when one could stuff a site with lots of on site SEO trickery and buy lots of links to instantly get to the top of the SERPs, but this only benefited spammers and those producing disposable sites. Google has simply become smarter to on page factors and then became more effective in the way they handle links (especially new links). Thus one now needs to take a more long term approach to web publishing.

Kyle
04-08-2007, 11:01 AM
Ok KLB, time to pull out my 9mm and put on "white boy e-thug mode".

I can't put things any clearer or more accurately than the following. And Ken, please read the entire thread that I mention later in this post; it's short and is from 3 years ago.

No more using the term sandbox.

What is interesting is, I agree with everything you said. I am not insulting your points, but they are "beginner" thoughts and ideas related to standard ranking timelines. They have nothing to do with what I'm trying to bring up.

I realize that when I don't proofread my statements, I write many run-on sentences and can get repetitive and confusing. I apologize for this; it is a fault of mine, and I should always proofread.

Here is the final summary of what I was trying to point out...

If you choose to address any of these points, I would love for you to address #1 and #2 (combined).

1) Something DID happen post-2003. This change involved delays either in how quickly Google calculates the weight you obtain from external links, or in your site becoming 'respected' in Google. This is by far the key point here: anyone here with lots of experience launching new sites pre-2003 knows that if you called your site "blue widgets" and obtained a DMOZ listing, a Yahoo listing, and, let's say, a PR6 or PR7 link from your "company's" home page, you would be in the top 50 or top 100 for "blue widgets" by the next Google dance.

2) I never meant to say this is a delay on your entire site's rankings. That is the scary thing here that I have been TRYING TIME AND TIME AGAIN to bring up. This delay only affects a single search term: the name of your site, if the name of your site is competitive. It also seems to apply to your sub-pages.

A real life example of a crazy hypothetical? How many articles exist on "dust mite detection", regardless of the quality of the article? Not many; I hope you agree it is safe to assume that. My friend's low quality MFA site has a sub-page called "Dust Mite Detection". This sub-page took over 6 months to rank anywhere on Google (the same article, with no external links pointing to it, is now #7). Again, this is not a quality site, it is an MFA site. Which means it doesn't really obtain random incoming links, because other sites know it is garbage!!! So all real links were obtained early on, by the webmaster! The beauty of these piece of **** sites is that the webmaster generally has full control over the incoming links. Can't get much more of a controlled example than that.

What is happening here? Why did that article not rank in the top few hundred on Google earlier? Why did it take over 6 months?

Because there was some delay in how Google was deciding to rank the site and calculate its weight.

You still rank on obscure terms across your sub-pages, just not the obvious targeted ones, generally related to the primary keywords (or topic of your site/site section).

I think what is happening here is that Google has a delay in calculating the topic-related local rank for a new site. We can all agree on the value difference between an incoming link from another dust mite site and an incoming link from a high-PR casino site. The real question is, how long does it take for Google to give each of your sub-pages the "dust mite" tag? What do I mean by "dust mite" tag? I mean that once Google considers all your sub-pages to be about dust mites, extra weight is given when your internal pages link to each other, resulting in a huge boost to all your internal page rankings!
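To make the "dust mite tag" idea concrete, here is a toy sketch of topic-weighted internal linking. Everything in it (the topic labels, the base weights, the bonus multiplier) is invented for illustration; it is not a claim about Google's actual algorithm:

```python
# Toy model of "topic-related" internal link weight: an internal link from
# a page on the same topic counts for more than an off-topic link.
# The topic labels, base weights, and bonus multiplier are all invented.

TOPIC_BONUS = 2.0  # hypothetical multiplier for same-topic links

def link_weight(source_topic, target_topic, base_weight=1.0):
    """Weight that a single internal link passes to its target page."""
    if source_topic == target_topic:
        return base_weight * TOPIC_BONUS  # on-topic links carry extra weight
    return base_weight

def page_score(target_topic, inbound):
    """Total weight a page receives; `inbound` is (source_topic, base_weight) pairs."""
    return sum(link_weight(src, target_topic, w) for src, w in inbound)

# A "dust mite detection" page linked from two on-topic pages and one
# off-topic page:
inbound = [("dust mites", 1.0), ("dust mites", 1.0), ("casinos", 1.0)]
print(page_score("dust mites", inbound))  # 2.0 + 2.0 + 1.0 = 5.0
```

In this toy, once a site's pages have all earned the same topic tag, every new internally linked page gets the bonus, which matches the behavior described above for established sites.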

This is why Wikipedia does so well (besides their numerous incoming links pointing all over their site, not just their front page).

Wikipedia has an excellent internal linking system which allows for perfect flow of TOPIC-RELATED link weight. So when they put up a new article on an SEO topic, many or most of their previous SEO articles then link to it. And because Wikipedia is not a new site, Google has already established their local rank across all their sections and pages.

This same example can be applied to my friend's site on dust mites. If he were to create a new page on some dust mite topic now, it would quickly rank in the top 100 after the page went online. He now has that topic-related trust with Google. This trust may extend past the topic of your site, I'm not sure there... but it isn't an important thing to address right now.

Ok, here is the rather lame thread I started almost 3 years ago.

http://www.websitepublisher.net/forums/showthread.php?t=2070

On this thread, Chris offers his thoughts on local rank.

Shawn mentions it takes 6-12 months for his new sites to start ranking.

Regardless of whether their opinions have changed since then, it shows how long this has been going on. And the people who still like to bring it up are those who not only created many different sites pre-2003, but are still creating many small sites post-2003 (a large sample size).

Before becoming self-employed in 2003, the company I worked for managed hundreds of clients. We hosted, designed, programmed, and applied standard SEO practices to hundreds of different sites on all kinds of topics. Our company was only 5 people, so you could say I was able to engage in every aspect of the web development, marketing, hosting, and sales business.

Predicting potential rankings on terms, and the time to obtain them, was easy then. You could work out a logical timeline of how long it would take to rank in certain niches and the PageRank needed; the PageRank didn't have to be topic-related. It was an easy time.

Now that has all changed. And the best explanation for these delays relates to your local rank / topic-related weight, AND how long Google takes to give your own internal pages that 'local rank' respect or acknowledgement.

Ken, we agree on everything you said in your previous post, however your thoughts did not cut to the issue. They were too general...

In the end, of course I am happy with how Google is now. I'm not in the business of making my living from tricky SEO...Anything that makes it harder on the webmasters means less competition from spammers and 'get rich quick' mentalities. It is all about your dedication to the content or service your site offers. SEO has become a set of fundamentals that COULD be improved with certain additional theories (like using CSS over tables, or proper hierarchical use of the H1-H6 heading tags). The real challenge is in the product you produce.

Kyle
04-08-2007, 11:26 AM
No one can expect to create a new site and have it rank well for competitive search terms "overnight". This isn't a "sandbox"; it is a simple truth. It takes time to build up sufficient quantities of favorable characteristics like organic (non-solicited/paid/spammed/etc.) back links.

Ok seriously Ken, I do speaka the SEO english.. who said overnight?

Going from one extreme to the other isn't fair friend.

KLB
04-08-2007, 09:57 PM
Okay, this took like hours to read, think about and comment on. Now my head really hurts.


No more using the term sandbox.
Excellent, because I think a lot of what you are saying has merit, but has absolutely nothing to do with the sandbox effect as most people define it.


What is interesting is, I agree with everything you said... I am not insulting your points, but they are "beginner" thoughts, and ideas related to standard ranking timelines. It has nothing to do with what I'm trying to bring up.
I'm not insulted at all. My efforts were nothing more than to dispel the sandbox conspiracy theory in the most basic terms.


I realize that when I don't proofread my statements, I write many run-on sentences... and can get repetitive and confusing. I apologize for this; it is a fault of mine. I should always proofread.
Ya, with as much as all of us write, this can be a challenge.


1) Something DID happen post-2003. This change involved delays either in how quickly Google calculates the weight you obtain from external links, or in your site becoming 'respected' in Google. This is by far the key point here... anyone here with lots of experience launching new sites pre-2003 knows that if you called your site "blue widgets" and obtained a DMOZ listing, a Yahoo listing, and, let's say, a PR6 or PR7 link from your "company's" home page, you would be in the top 50 or top 100 for "blue widgets" by the next Google dance.
I could buy that something changed in 2003. The question is what changed? To begin to figure out what changed, we need to look at why Google would want to make a change and then what would be the easiest way to achieve the desired objective.

I think the problem was that it was too easy to spam SERPs by simply buying lots and lots of links. Basically we were seeing an explosion in the buying and selling of Pagerank. If Google wanted to reduce the sale of Pagerank and reduce the ability to spam SERP results via buying links, what would be the best way to do this and would Google really want to limit the change to just new sites?


2) I never meant to say this is a delay with your entire site's rankings. That is the scary thing here that I have been TRYING TIME AND TIME AGAIN to bring up. This delay only affects a single search term... the name of your site, if the name of your site is competitive. It also seems to apply to your sub-pages.
What I don't get is how you could determine this. How could you isolate all of the various factors?


A real-life example of a crazy hypothetical? How many articles exist on "dust mite detection", regardless of the quality of the article? Not many... I hope you agree it is safe to assume that. My friend's low-quality MFA site has a sub-page called "Dust Mite Detection". This sub-page took over 6 months to rank anywhere on Google (the same article, with no external links pointing to it, is now #7). Again, this is not a quality site, it is an MFA site. Which means... it doesn't really attract random incoming links, because other sites know it is garbage!!! So all real links were obtained early on, by the webmaster! The beauty of these piece of **** sites is that the webmaster generally has full control over the incoming links. You can't get a much more controlled example than that.
Dust mite detection does seem like a good obscure search term (my search returned 370,000 results).


What is happening here? Why did that article not rank in the top few hundred on Google earlier? Why did it take over 6 months?

Because there was some delay in how Google was deciding to rank the site and calculate its weight.

You still rank on obscure terms across your sub-pages, just not the obvious targeted ones, generally related to the primary keywords (or topic of your site/site section).
Call me sarcastic, but it sounds to me like Google's changes, whatever they are, are working perfectly to put a crimp in spamming efforts. ;)


I think what is happening here is Google has a delay in calculating the topic-related local rank for a new site.
But what happens to a new page/section on an existing site that targets a new obscure term and is promoted in an identical fashion to the MFA link spamming described above?

Maybe the "penalty" isn't so much about the site being new, as it is about the tactics being used to promote the site. What you have described is so obviously spamming. Have you tried to systematically verify the fact that this doesn't happen to established sites?


This is why Wikipedia does so well (besides their numerous incoming links pointing all over their site, not just their front page).
Actually I suspect it is the numerous organic incoming links that are the primary reason Wikipedia does so well in the SERPs. Their excellent internal linking system simply ensures that the value of the incoming links is effectively spread across the site.


Wikipedia has an excellent internal linking system which allows for perfect flow of TOPIC-RELATED link weight. So when they put up a new article on an SEO topic, many or most of their previous SEO articles then link to it. And because Wikipedia is not a new site, Google has already established their local rank across all their sections and pages.
Or could Wikipedia and other "established" sites become "trusted" by Google through the age and quantity of the organic (e.g. non-spammed) links they have attracted over time? Just like we can use Bayesian filters to free our email of spam, why can't Google use similar tools to detect and "quarantine" suspicious link development? And why would Google want to apply such filters only to new sites?
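As a rough illustration of the Bayesian-filter analogy, here is a toy, single-feature classifier for a burst of new links. The feature choice, the likelihood curves, and the prior are all invented numbers; this is only a sketch of the kind of math such a filter could use, not a description of anything Google has confirmed:

```python
# Toy Bayesian-style link filter: estimate the probability that a burst of
# new links is spam from a single feature, the fraction of links that share
# the exact same anchor text. Both likelihood curves are invented numbers.

def p_spam_given_repetition(repeat_fraction, prior_spam=0.5):
    """Bayes' rule for one observation of anchor-text repetition."""
    # Hypothetical likelihoods: heavy repetition is assumed common under
    # spam and rare under organic linking.
    p_obs_given_spam = 0.2 + 0.8 * repeat_fraction
    p_obs_given_organic = 1.0 - 0.9 * repeat_fraction
    numer = p_obs_given_spam * prior_spam
    denom = numer + p_obs_given_organic * (1.0 - prior_spam)
    return numer / denom

# 95% of links using identical anchor text looks suspicious...
print(round(p_spam_given_repetition(0.95), 2))  # 0.87
# ...while 10% repetition looks organic.
print(round(p_spam_given_repetition(0.10), 2))  # 0.24
```

The point of the sketch is only that the same machinery used against email spam could score a link profile on any site, new or old, which is exactly the question being raised above.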


This same example can be applied to my friend's site on dust mites. If he were to create a new page on some dust mite topic now, it would quickly rank in the top 100 after the page went online. He now has that topic-related trust with Google. This trust may extend past the topic of your site, I'm not sure there... but it isn't an important thing to address right now.
Actually I can answer that question for you. The vast majority of the tens of thousands of links to my site are focused on a couple sections of my site and a handful of related terms. Yet anytime I add a new page or section the new page/section will quickly dominate searches for obscure search phrases that are contained within said pages. As an example I added a new careers section to my site and have been working on adding new career listings pages targeting specific job types and those pages have been gaining strength in Google's search results very nicely. In fact much of the traffic to those pages is directly from search results.


Shawn mentions it takes 6-12 months for his new sites to start ranking.
But we go back to the old question of where I think the traditional view of the sandbox "theory" is wrong. It attributes the delay to a penalty placed on new sites, when it may simply be the delayed crediting of new links to a site, especially if those links accumulate in a "suspicious" manner. In other words, if I had an "older" site that had simply been allowed to go fallow and hadn't accumulated any links, and then one day I decided to really promote the site and generated massive amounts of links, Google could delay the crediting of that massive influx of links.

Simply put, in such a scenario any unnatural accumulation of links would be ignored for a period of time. For an established site this may not be very noticeable, because the already-established links give a boost to the new pages. A new site, on the other hand, or a site with virtually no preexisting back links, would not have the benefit of existing links to help boost its search position while the new links "proved" themselves out and Google got around to giving the site credit for said links.
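That delayed-crediting idea can be sketched as a simple ramp-in model. The 180-day probation window and the linear ramp are assumptions of mine, chosen only to echo the 6-12 month delays described in this thread:

```python
# Toy model of delayed link crediting: a new link's weight ramps in
# linearly over a probation window instead of counting immediately.
# The 180-day window and linear ramp are assumptions for illustration.

PROBATION_DAYS = 180  # hypothetical ramp-in period (~6 months)

def credited_weight(link_age_days, full_weight=1.0):
    """Fraction of a link's weight credited at a given age, capped at full."""
    return full_weight * min(link_age_days / PROBATION_DAYS, 1.0)

def site_score(link_ages):
    """Total credited weight for a site, given the age in days of each link."""
    return sum(credited_weight(age) for age in link_ages)

# 100 links acquired in one burst: barely credited at 30 days, fully
# credited once every link has aged past the window.
print(site_score([30] * 100))   # about 16.7
print(site_score([200] * 100))  # 100.0
```

Under this sketch an established site barely notices the ramp, because its older links are already fully credited, while a brand-new site scores near zero until the window passes, mirroring the scenario described above.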


Regardless of whether their opinions have changed since then, it shows how long this has been going on. And the people who still like to bring it up are those who not only created many different sites pre-2003, but are still creating many small sites post-2003 (a large sample size).
Yes, something changed around 2003, but we must be very careful as to how we attribute causation in regards to those changes. Obviously Google wanted to reduce the impact of search spam and link buying. The question is did Google simply apply a penalty against new sites, or did they delay the crediting of new links to a site's PR regardless of the age of the site?


Predicting potential rankings on terms, and the time to obtain them, was easy then. You could work out a logical timeline of how long it would take to rank in certain niches and the PageRank needed; the PageRank didn't have to be topic-related. It was an easy time.
And this was a problem because it allowed for the easy gaming of Google's SERPs by SEOs and spammers. Now Google's algorithm is a little more robust against these games. In many respects, this turned the tide against fly-by-night build-and-abandon sites in favor of sites that are in this for the long haul.


Ken, we agree on everything you said in your previous post, however your thoughts did not cut to the issue. They were too general...
I have no real desire to try and dig really deep into this issue. If I'm wrong and there is a sandbox that is applied against new sites, there is nothing I could do differently to avoid the sandbox so it wouldn't matter. If, however, I'm right then based on this general information I know I need to focus on the things I've always focused on with any site. This would include creating compelling content as well as buying/soliciting links, with a focus on starting to build links as early in the game as possible so that the site will start to gain credit for said back links as quickly as possible. The one thing I wouldn't do (and would have never done in the past) is go crazy buying links for a very specific search phrase from scores of different sites. To me, the sudden appearance of countless links from scores of different sites using the exact same anchor text is tantamount to branding a big "I am a spammer" notice across one's forehead.

I may not know exactly how Google's link valuation system works, but I do know that Google really only wants to give value to organic links thus one should assume that spamming the exact same anchor text across scores of sites is sure to raise red flags.


In the end, of course I am happy with how Google is now. I'm not in the business of making my living from tricky SEO...Anything that makes it harder on the webmasters means less competition from spammers and 'get rich quick' mentalities. It is all about your dedication to the content or service your site offers.
I fully agree. I have always thought that a dedication to content and services for REAL users was the number one priority. The problem was that the old Google was too easy to game. Now that the get-rich-quick mentality is not rewarded, this is a great improvement.


SEO has become a set of fundamentals that COULD be improved with certain additional theories (like using CSS over tables, or proper hierarchical use of the H1-H6 heading tags). The real challenge is in the product you produce.
YES YES YES!!

This is why I'm trying so hard to kill off the whole sandbox conspiracy theory: because it distracts people from the fundamentals of what they should be doing and what they always should have been doing. In fact, with my chemistry site, I hardly put any thought into optimizing new pages for any particular search phrase. I simply try to produce quality content and let the "chips" fall where they may when it comes to keywords.


Ok seriously Ken, I do speaka the SEO english.. who said overnight?

Going from one extreme to the other isn't fair friend.
I wasn't referring to a literal 8-hour overnight. I was referring to a very rapid climb (e.g. weeks or a few months) in the SERPs. How many times have we seen some noobie upset because their brand new site isn't #1 for their target search phrase after only a few months of existence? The overnight comment is directed at those who expect to pull a few SEO tricks and magically begin to conquer the SERPs in a short period of time. Then they turn around and blame some Google sandbox penalty because they didn't get what they wanted as fast as they wanted it.

Kyle
04-08-2007, 10:38 PM
Great thread, and thanks for continuing to reply.

In the end, nothing is convincing either way. The primary reason you (Ken) and I were back and forth on this is because Ken was looking at the traditional definition of the sandbox, and I was basing my statements on my general, non-specific definition.

This quote shows we are still not connecting. Damn the Internet and not being able to interact in person!


Yes, something changed around 2003, but we must be very careful as to how we attribute causation in regards to those changes. Obviously Google wanted to reduce the impact of search spam and link buying. The question is did Google simply apply a penalty against new sites, or did they delay the crediting of new links to a site's PR regardless of the age of the site?

Here, you bring up penalty again. I have never thought it was a penalty. I have always thought of it as a delay. Where we disagree is whether it applies to new sites.

Your examples of adding new content to your chemistry site, then immediately ranking on scattered random terms... YES, this is always the case for established sites (and new sites!!). Completely agree there...

Not much more to say on this, I'm tired, opening day Cubs game tomorrow (even though I hate the cubs, I live in Chicago and love baseball).

What can we conclude from this wordy thread?

Care about your content, and the traffic will follow. Worry about profit first, and you'll be in the real sandbox (working in a cubicle for 'the man').

KLB
04-09-2007, 05:27 AM
In the end, nothing is convincing either way.
This is correct.


The primary reason you (Ken) and I were back and forth on this is because Ken was looking at the traditional definition of the sandbox, and I was basing my statements on my general, non-specific definition.
Which is why I took issue with you labeling your definition as a sandbox. You would be better off not using any label and simply explaining the observations.


This quote shows we are still not connecting. Damn the Internet and not being able to interact in person!
It is good and bad. Having to type responses requires more work and thus slows us down and forces us to think rather than simply blurt out a response.



Here, you bring up penalty again. I have never thought it was a penalty. I have always thought of it as a delay. Where we disagree is whether it applies to new sites.
Again this is why you shouldn't be labeling your observations as a sandbox, because most people refer to the sandbox as a penalty.


What can we conclude from this wordy thread?
Your observations have considerable merit, but they are not a sandbox as most people know it.


Care about your content, and the traffic will follow. Worry about profit first, and you'll be in the real sandbox (working in a cubicle for 'the man').
EXACTLY!!!

Kyle
04-11-2007, 05:48 PM
People REALLY need to read Chris' latest SEO blog about scientific principles (http://www.websitepublisher.net/blog/2007/04/04/is-seo-science-or-marketing/).

For those who want to continue to believe in things like the sandbox effect, go ahead and be a self-defeatist. The rest of us can move forward and become more successful by listening to the wisdom of people like Chris, who actually apply scientific principles to their research.

Woah, I totally missed this part. I decided to read this thread one more time, and stumbled upon something I totally overlooked.

This isn't the first time you throw something Chris has written to aid in whatever mystical point you're trying to make. Chris's experience has nothing to do with this thread, yet you bring it in regularly when replying to given issues. It really gets annoying, Ken.

Chris does scientific research related to SEO? That is news to me. But thanks for the info Ken. As far as I can tell, I have never seen Chris talk about SEO past the fundamentals, except when he attempted Search Engine Labs.

Why does he not speak of SEO past the fundamentals? Why does he not do research?

Because it is a waste of time, and any experienced web publisher knows this.

Experimentation related to SEO has become almost impossible. Not only is it almost impossible to have a 'control' in your experiment, but the erratic behavior from Google makes it even more so.

Would you agree Ken? Or do you think scientific research is actually possible (or maybe the right word is productive..)?

This thread is the best we can do... and inexperienced developers like you don't realize that. You take threads like these, twist them around, and then they turn into debates NOT centered around the core point of why the thread was created!

This thread is advice, Ken. It is data that other webmasters can use. Forums like WebmasterWorld ban all data; you cannot mention specifics there. Forums like SitePoint are full of post monkeys and spammers ready to turn the simplest thread into a 5-page discussion.

You need to chill with your negative attitude towards things, and look at the big picture.

The proof of your negative attitude is in this thread... I made it perfectly clear from the beginning that this thread was proof of a delay in rankings based on MY DEFINITION. I titled the thread with the keyword "sandbox" to obviously draw attention (now websitepublisher ranks #1 on Google for 'sandbox proof'). But I CLEARLY outlined what I meant in the FIRST TWO POSTS.

What did you do?

You decided to ignore the "my definition" part, and instead picked at me for using the term sandbox, saying the general definition is not the same as 'my definition'. (O RLY?) Then you criticized my definition for being too vague.

What did Chris do?

He replied with a productive post. He stated the following:


I've experienced similar things. However, I don't think we fully understand the system, especially since Google reps have said that there is no sandbox as per the initial definition, and instead what we sometimes see is a perfect storm of filters and other systems that can create a sandbox-like effect.

This is productive. I then replied with...


We definitely don't understand the system... nor will we. Chances are the system will change even more by the time we are close to understanding it. Experimentation, as I mentioned in a blog comment, is really going out the window compared to how it used to be.

All that's important is using the same fundamental seo techniques that haven't really changed over the years.

Here we are, making progress, so early on before you posted! The thread was over. New developers were seeing that, in the end, it's all about the fundamentals. Then you throw in your hocus pocus, and it all goes to hell.

organ
04-11-2007, 06:49 PM
The sandbox targets new websites which are trash.

KLB
04-11-2007, 07:12 PM
This thread is the best we can do... and inexperienced developers like you don't realize that.
Inexperienced MY ***!:flare:

I didn't stumble into web publishing yesterday. I've been doing this since 1995. I've been at the SEO game since before SEO was even a coined term. My main site has even been around since before there was a Google.

When I talk about this stuff, I'm talking about it based on 12 years of experience. If I'm inexperienced NOBODY here is qualified to talk on the subject.

You owe me an apology.:flare:

Oh, and yes, if you actually read Chris' blog, he has made several references over time to the idea of applying scientific principles to one's SEO research. It is why I listen to and respect what Chris says: because he is extremely analytical in his approach, and methodical in his research.

KLB
04-11-2007, 07:43 PM
Now that I got that off my chest I'll reply to some other comments:


This isn't the first time you throw something Chris has written to aid in whatever mystical point you're trying to make.
The only mystical points that ever get made are to try and prove that there is a sandbox, which strangely has an ever changing definition to adapt to the debate at hand. The sandbox conspiracy is the only mystical "hocus pocus".



Chris's experience has nothing to do with this thread, yet you bring it in regularly when replying to given issues. It really gets annoying Ken.
Chris' experience has everything to do with this thread and other SEO debates. Not because he claims to be able to prove or disprove the sandbox, but because he is willing to analytically look at issues and try to recognize the difference between correlation and causation. Chris applies critical thinking to his research, something so many "SEO experts" fail to do.


Chris does scientific research related to SEO? That is news to me. But thanks for the info Ken. As far as I can tell, I have never seen Chris talk about SEO past the fundamentals, except when he attempted Search Engine Labs.
Just because one can not conduct controlled experiments does not mean one can not apply critical thinking and scientific principles to one's research, it just means that one can not "prove" something to an absolute certainty. Scientific principles aren't just about controlled experiments, they are also about systematic observation and documentation of observations. One can not conduct an experiment to prove the "big bang" but this doesn't make research involving it any less scientific.


Why does he not speak of SEO past the fundamentals? Why does he not do research?

Because it is a waste of time, and any experienced web publisher knows this.
Then why try and defend the "sandbox effect", which is an attempt to describe something beyond the "fundamentals"?


Would you agree Ken? Or do you think scientific research is actually possible (or maybe the right word is productive..)?
Applying scientific principles and critical thinking to SEO is possible, and too few SEO experts apply even the most basic level of them to their "studies" of SEO and the crap they spread.


This thread is the best we can do... and inexperienced developers like you don't realize that.
I covered this in my last post, but I've been at this game since 1995, so if I'm inexperienced almost no one here has any experience.


This thread is advice, Ken. It is data that other webmasters can use.
And bad data was being spread. By your INCORRECT use of the term sandbox to apply to your definition, you helped reinforce the belief in the sandbox effect as most people define it. It wasn't until I started challenging you on this that I was able to show that what you are talking about isn't the sandbox effect as most people define it. Your observations have merit, but applying the term "sandbox" to them does them a disservice.


I made it perfectly clear from the beginning that this thread was proof of a delay in rankings based on MY DEFINITION. I titled the thread with the keyword "sandbox" to obviously draw attention (now websitepublisher ranks #1 on Google for 'sandbox proof'). But I CLEARLY outlined what I meant in the FIRST TWO POSTS.
Your use of the word "sandbox" completely muddied the waters in this discussion and did nothing but help spread the belief in the sandbox. The belief in this conspiracy theory needs to be stamped out, and confusing the issue for sensational purposes serves no greater good.


You decided to ignore the "my definition" part, and instead pick at me for using the term sandbox, that the general definition is not the same as 'my definition'.
You can not take a term that has a loosely defined meaning and then apply your own definition to it and not expect a backlash. You also can not claim to be perfectly clear yet take a term that most people know to mean one thing and totally redefine it to mean something very different. You set this whole thread up for a great deal of confusion. It wasn't until I started debating you on this issue and forced you to be more clear about what you meant that it became clear to me that you weren't talking about the sandbox effect as it is normally presented.

Chris
04-11-2007, 08:15 PM
Fellas, Fellas, settle down.

Here is what I think, since you both seem to be referring to me so much.

1. Kyle's experiment is a nice try, but it isn't scientifically perfect; it isn't definitive.

2. It'd be really, really, really hard to do a valid controlled experiment on this, so I gotta forgive #1.

3. The idea that there is something sandbox-like shouldn't be up for discussion. It has been confirmed by Google. They have said that they have a combination of filters/algorithms that can result in a sandbox-like effect, but that the original "sandbox" definition was wrong. What we don't know is whether it is a devaluation of the site, just its links, both, neither, etc. But we do know that new sites often fail to rank well on their main keywords for months and then suddenly leap up.

4. On a personal note, I do think I talk about things beyond fundamentals; it's just that I do not do so in comparison to others who include unverified conjecture in their discussions. There isn't a whole lot beyond fundamentals that actually is real.

So. I think Ken is getting bent out of shape because he misunderstood Kyle's original claim. Kyle is right, you just misunderstood what he was saying. I almost did too, I read the thread title, and was all prepared to write him up, but after reading and rereading his post I realized he wasn't really talking about the typical sandbox. I also think Kyle does need to apologize to Ken for the inexperienced comment. We all know Ken is a professional with lots of experience.

So, come on guys, hug it out.

Kyle
04-11-2007, 08:30 PM
-

KLB
04-11-2007, 08:33 PM
Yes Chris, you are right on all counts. And yes I did misunderstand what Kyle's original claim was.

This isn't a perfect medium for discussions and debating. It is much easier for us to misunderstand each other than we sometimes realize. Thus I misunderstood Kyle's original claim and initially started to knock him around about the sandbox issue.

I think we should go back to posts #24 & #25 where we had begun to find that we do actually agree on the "fundamentals" of this discussion and realized that it was a simple mislabeling of those fundamentals that caused confusion in the first place.

Kyle once I understood what your fundamental claim in this thread was, I couldn't find any major points of disagreement with it, I just wish you wouldn't have called it a "sandbox" due to the baggage that term carries. Oh and Kyle, you are right about Sitepoint and Webmaster World.

Kyle
04-11-2007, 08:34 PM

Working on new post, my prev post was done before I saw Chris's reply.

KLB
04-11-2007, 08:36 PM
Working on new post, my prev post was done before I saw Chris's reply.

Understood. As such I'm going to ignore said post.

Kyle
04-11-2007, 08:45 PM
Fellas, Fellas, settle down.

Here is what I think, since you both seem to be referring to me so much.

1. Kyle's experiment is a nice try, but it isn't scientifically perfect, it isn't definitive.

2. It'd be really, really, really hard to do a valid controlled experiment on this, so I gotta forgive #1.

3. The idea that there is something sandbox-like shouldn't be up for discussion. It has been confirmed by Google. They said that they have a combination of filters/algorithms that can result in a sandbox-like effect, but that the original "sandbox" definition was wrong. What we don't know is whether it is a devaluation of the site, just its links, both, neither, etc. But we do know that new sites often fail to rank well on their main keywords for months and then suddenly leap up.

4. On a personal note, I do think I talk about things beyond fundamentals; it's just that I do not do so in comparison to others who include unverified conjecture in their discussions. There isn't a whole lot beyond fundamentals that actually is real.

So. I think Ken is getting bent out of shape because he misunderstood Kyle's original claim. Kyle is right, you just misunderstood what he was saying. I almost did too, I read the thread title, and was all prepared to write him up, but after reading and rereading his post I realized he wasn't really talking about the typical sandbox. I also think Kyle does need to apologize to Ken for the inexperienced comment. We all know Ken is a professional with lots of experience.

So, come on guys, hug it out.

Re: calling it the sandbox. Of course I admit that this could mislead people, but it could also shed a LOT OF LIGHT on things, due to the attention the term receives, and the traffic it receives from Google. So I admit that Ken could have been honestly misled.

Re: my experiment. It isn't an experiment so much as data from MFA sites I have been involved in. In my opinion, they are among the best sites to study, because they don't generate many random incoming links. Most links are controlled by the webmaster.

Re: calling it the sandbox. I knew that Google said it did not exist based on the original definitions; however, referencing these symptoms will always bring that term up. I of course apologize for irritating Ken due to my choice of words. There's a lot of history here regarding this subject, even on this forum!

Re: Chris talking beyond the fundamentals. Well said, but the issue I'm trying to bring up is how much things have changed due to these "delay symptoms" AND erratic Google behavior. Our time is better used working on our content than figuring out new SEO "tricks" (for lack of a better word).

Re: apologizing to Ken. My inexperienced statement is leftover from your way of defending putting the co-op on your chemistry site. This really made me judge you... As I cannot apologize for that statement with regard to SEO, I do apologize for making a generalized comment regarding your experience. You are definitely NOT an inexperienced web developer. Your chemistry site is the ultimate success, and you should be proud of how long you have kept it online, and the authority your site displays. I have never judged success by monetary gain, so when I look at your chemistry site, it is definitely a wonderful thing to analyze :).

I will also say that whenever you reply to beginners posts here, your style of teaching them is excellent. I have always been impressed with your methods of phrasing things and educating the new web developers who come to this forum.

Thanks for getting involved Chris.
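The "delay then sudden leap" pattern from point 3 above can be illustrated with a toy model. To be clear, everything below is invented for illustration only — the `link_value` function, the 270-day constant, and the probation formula are my assumptions, not anything Google has confirmed:

```python
def link_value(link_age_days: int, total_links: int,
               base_delay_days: int = 270) -> float:
    """Return 0.0 while a link is 'on probation', 1.0 afterward."""
    # More inbound links shorten the probation slightly, mirroring
    # the claim that more links mean less waiting time to rank.
    delay = base_delay_days / (1 + total_links / 100)
    return 1.0 if link_age_days >= delay else 0.0

def site_score(link_ages_days: list[int]) -> float:
    """Sum the (delayed) value of every inbound link."""
    n = len(link_ages_days)
    return sum(link_value(age, n) for age in link_ages_days)

# A site that got all 25 of its links in its first month shows a
# zero score for months, then leaps to full strength all at once.
links = [30] * 25  # every link is ~30 days old after month one
for month in range(1, 12):
    ages = [age + 30 * (month - 1) for age in links]
    print(f"month {month}: score {site_score(ages)}")
```

Under these made-up numbers the score stays at 0.0 through month 7 and jumps to 25.0 at month 8 — roughly the "suddenly appeared in the top 20" pattern from the socialanxietydisorder.net example, rather than a gradual climb.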

Kyle
04-11-2007, 08:59 PM
I also wanted to add that Ken's decision not to use www in his URL has caused me to do the same on future projects!

I always chose to use www out of habit, but when searching Google on "environmental chemistry" and seeing how nice his result looks with the domain in bold (without a WWW in the way!), I thought this could be a more effective way of trying to generate clicks from SERPs!

Thanks Ken.
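For anyone who wants to standardize on the non-www form the way Ken did, one common approach (assuming an Apache server with mod_rewrite enabled — other servers have their own equivalents) is a permanent redirect in .htaccess so both forms don't compete in the index:

```apache
# Permanently redirect www.yourdomain.com/... to yourdomain.com/...
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ http://%1%{REQUEST_URI} [R=301,L]
```

The 301 status tells search engines the www form has moved for good, so whichever form you pick becomes the only one that appears in the SERPs.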

KLB
04-11-2007, 09:15 PM
Re: calling it the sandbox. Of course I admit that this could mislead people, but it could also shed a LOT OF LIGHT on things, due to the attention the term receives, and the traffic it receives from Google. So I admit that Ken could have been honestly misled.
I understand your thoughts behind the labeling.


Re: my experiment. It isn't an experiment so much as data from MFA sites I have been involved in. In my opinion, they are among the best sites to study, because they don't generate many random incoming links. Most links are controlled by the webmaster.
It may not have been a perfect experiment, but it was good for observations and sometimes one has to settle for systematic observations (e.g. studying the weather).



Re: calling it the sandbox. I knew that Google said it did not exist based on the original definitions; however, referencing these symptoms will always bring that term up. I of course apologize for irritating Ken due to my choice of words. There's a lot of history here regarding this subject, even on this forum!
Your observations deserve much better than that badly abused term. If we could get people past that term and what it historically means, I think your observations can provide for real discussions beyond "the fundamentals".

Our time is better used working on our content than figuring out new SEO "tricks" (for lack of a better word).
You'll never hear a disagreement from me on this comment.


Re: apologizing to Ken. My inexperienced statement is leftover from your way of defending putting the co-op on your chemistry site. This really made me judge you...
I never put co-op on my site for SEO purposes. I didn't even know what Digitalpoint was at the time. I was paid $500 per month to put five rotating text links on my site (plus another $100 per month to put it on another site). The links weren't going to be contextually targeted so they didn't violate the AdSense TOS, and I was looking for creative ways to monetize my site.

It was a pretty big sum that was hard to refuse, and I simply didn't ask the questions I should have. It was a stupid monetization mistake, not an SEO mistake. Although when I look back at it, it might not have hurt that much financially since I did earn around $4,000+ from it.

Once I learned what the links were all about and realized it could have been a source of my problems I terminated the links. I'm no longer willing to allow an advertiser to blindly place ad links on my site.

When trying to find creative ways to make more money from a site, one can sometimes make bad choices even if they have been at this business for a long time.


As I cannot apologize for that statement with regard to SEO
Just remember it wasn't an SEO stunt from my stance, it was simple advertising agreement.


I do apologize for making a generalized comment regarding your experience. You are definitely NOT an inexperienced web developer.
Thank you, apology accepted.


Your chemistry site is the ultimate success, and you should be proud of how long you have kept it online, and the authority your site displays. I have never judged success by monetary gain, so when I look at your chemistry site, it is definitely a wonderful thing to analyze :).

Thank you. I look at this as a long term investment. I plan to be still working on the site and still earning a living from it thirty or forty years from now. I dream of it becoming a highly respected authority site where articles are peer reviewed and getting published on it is as sought after as is getting published in any serious medical journal.


I will also say that whenever you reply to beginners posts here, your style of teaching them is excellent. I have always been impressed with your methods of phrasing things and educating the new web developers who come to this forum.
Thank you. Lately I have felt that I have started to get a little short or irritable in my posting style. It is frustrating starting over at the beginning time after time. Sometimes posts by new users make me feel like I'm stuck in Groundhog Day.


Thanks for getting involved Chris.
Yes sometimes we need a good slap in the face to snap out of it.

Kyle
04-11-2007, 09:23 PM
Thank you. Lately I have felt that I have started to get a little short or irritable in my posting style. It is frustrating starting over at the beginning time after time. Sometimes posts by new users make me feel like I'm stuck in Groundhog Day.

I hear ya man, that is in me as well. Thanks for saying that.

Kyle
04-11-2007, 09:30 PM
I never put co-op on my site for SEO purposes. I didn't even know what Digitalpoint was at the time. I was paid $500 per month to put five rotating text links on my site (plus another $100 per month to put it on another site). The links weren't going to be contextually targeted so they didn't violate the AdSense TOS and I was looking for creative ways to monitorize my site.

Oh I know you didn't put it on for SEO purposes. Don't get mad at me, but part of SEO is knowing what not to put on your site, and unrelated text links have been a big no-no for a while.

Go look at www.surviveoutdoors.com. See that link to the shoes site in the right navigation at the bottom? Why am I doing this? Because it makes a ton of money, and Survive Outdoors is a dead project. It isn't where my heart is, and the writer (my dad) does not have the time either. Survive Outdoors was a huge success for the 5 years we gave it tons of attention, and it made 6 figures multiple years during the Christmas season using our authority to milk the affiliate business.

So I'm definitely in the same boat as you when it comes to sometimes putting profit over common sense... but it really is part of SEO. I am consciously doing something negative to Survive Outdoors by putting the shoe link there.

I'm glad you're here Ken, I really mean that. There are not enough old school webmasters who are still around... most have vanished, become millionaires and ignored the community, or are silent about their specific projects.

KLB
04-11-2007, 09:50 PM
Oh i know you didn't put it on for SEO purposes. Don't get mad at me, but part of SEO is knowing what not to put on your site. And un related text links have been a big no no for a while.
Unrelated links don't seem to hurt me, I've sold them for a long time and still do. It was links to bad neighborhoods that hurt. And anyone who knows what co-op is knows that it often contains links to bad neighborhoods. I'm absolutely certain it was bad neighborhood links that hurt me. I say this because my other "unrelated" links stayed on my site and it recovered just fine once I got rid of co-op.

Also, sites have "unrelated" links on them all the time. Think of all the affiliate programs we link to and syndication links we put on our sites. We also routinely link our sites back to our "main" web developer site via a "site developed by link". It would be really hard to penalize a site for unrelated links without having undesired consequences. However, penalizing sites for linking to bad neighborhoods would be quite easy.


So I'm definitely in the same boat as you when it comes to looking at profit sometimes over common sense...but it really is part of SEO. I am conciously doing something negative to Survive Outdoors by putting the shoe link there.
It is really scary having to depend upon one primary advertiser and diversifying is very hard. I'm finally getting some good partnerships that have long term potential, but one has to step on the occasional rotten egg to make a living in this business.


I'm glad you're here Ken, I really mean that. There are not enough old school webmasters who are still around... most have vanished, become millionaires and ignored the community, or are silent about their specific projects.
Man I'd like to be one of those millionaires.:p

Seriously though, I gain way too much knowledge from communities like this to turn my back on them. Even when we seriously butt heads we can still learn something once we finally swallow our pride. I may be quick to anger, but I'm also quick to forgive. Holding a grudge is counterproductive.

Kyle, I do really like your thinking overall and I think we all can learn a lot from you. So don't take the occasional head butting personally. You can't make an omelet without breaking a few eggs and we can't have a spirited discussion without a few misunderstandings.

Oh and you're right there are not enough old school webmasters left around. Sometimes I long for the days before WYSIWYGs when webmasters had to actually know the difference between a tag and an attribute.

Kyle
04-11-2007, 09:56 PM
Unrelated links don't seem to hurt me, I've sold them for a long time and still do. It was links to bad neighborhoods that hurt. And anyone who knows what co-op is knows that it often contains links to bad neighborhoods. I'm absolutely certain it was bad neighborhood links that hurt me. I say this because my other "unrelated" links stayed on my site and it recovered just fine once I got rid of co-op.

Also, sites have "unrelated" links on them all the time. Think of all the affiliate programs we link to and syndication links we put on our sites. We also routinely link our sites back to our "main" web developer site via a "site developed by link". It would be really hard to penalize a site for unrelated links without having undesired consequences. However, penalizing sites for linking to bad neighborhoods would be quite easy.

True true... but we'll save examples of sites getting their pagerank spreading abilities blocked for another debate! I guess it is all about your level of paranoia. No good data here either way.


Seriously though, I gain way too much knowledge from communities like this to turn my back on them. Even when we seriously butt heads we can still learn something once we finally swallow our pride. I may be quick to anger, but I'm also quick to forgive. Holding a grudge is counterproductive.

I am very quick to anger as well, we definitely have that in common!


Kyle, I do really like your thinking overall and I think we all can learn a lot from you. So don't take the occasional head butting personally. You can't make an omelet without breaking a few eggs and we can't have a spirited discussion without a few misunderstandings.

Oh and you're right there are not enough old school webmasters left around. Sometimes I long for the days before WYSIWYGs when webmasters had to actually know the difference between a tag and an attribute.

Thanks Ken, misunderstandings happen. The subject matter of this made it more heated than a normal misunderstanding.

Ya, I miss my old site on Geocities. I even remember the old URL: www.geocities.com/TimesSquare/Castle/2100. They didn't have a WYSIWYG then, just some generic template choices or coding it by hand.

KLB
04-11-2007, 10:29 PM
True true... but we'll save examples of sites getting their pagerank spreading abilities blocked for another debate! I guess it is all about your level of paranoia. No good data here either way.
Yep another debate, although blocking of the spread of PR to other sites is different from penalizing a site for linking to unrelated sites the way Google penalizes sites that link to bad neighborhoods.


I am very quick to anger as well, we definitely have that in common!
I think it is a common trait among those of us in this business. Lots of high IQs, big egos, and a lack of patience from answering the same damn questions too many times. For me the button that pushes me over the edge is the whole sandbox issue; I'm sick of people coming up with bizarre conspiracy theories to justify their lack of instant success when in reality it is because they created an MFA site with virtually no content.

I had put tens of thousands of hours of hard effort, sweat and tears into my site before it started turning into a commercial success. Noobies coming into this game and expecting to "instantly" get rich by putting up a crap site just sets me off.


Thanks Ken, misunderstandings happen. The subject matter of this made it more heated than a normal misunderstanding.
I've been more irritable than normal this past couple of weeks so I'm a little quicker to crank up the heat than normal. Just lots of little frustrations combined with the extraction of my wisdom teeth has made me really irritable.



Ya, I miss my old site on Geocities. I even remember the old URL: www.geocities.com/TimesSquare/Castle/2100. They didn't have a WYSIWYG then, just some generic template choices or coding it by hand.
I had a Geocities site, just to play around with. It really sucked. I first started coding in 1995. My site was on a college NeXT server and I would telnet in and use pico to edit my HTML. Talk about a brutal way to code. It was a bit like walking a mile to school each day in blinding blizzards, uphill in both directions. :lol: For me, migrating to Notepad was a major leap forward.

organ
04-16-2007, 04:01 AM
I think I have a sandbox problem with my 1-year-old website. No traffic from Google, but I get tons of traffic from Yahoo and MSN.

KLB
04-16-2007, 05:44 AM
Probably not related to the sandbox as defined originally in this thread or as it is commonly defined. Instead, you should look at on-page factors that Google may be penalizing you for (e.g. links to bad neighborhoods, overdoing SEO, etc.).