
Thread: PR is Dead. HostRank rullz.

  1. #1
    Registered
    Join Date
    Jan 2004
    Posts
    183

    PR is Dead. HostRank rullz.

    I think I've uncovered Google's real PageRank implementation. It agrees with everything I've seen and experienced.

    Your input is much appreciated. The link to my article is here: HostRank Article

  2. #2
    Web Monkey MarkB
    Join Date
    Nov 2003
    Location
    London, UK
    Posts
    1,783
    Man, that's even more confusing than Page Rank!
    Stepping On Wires - the new blog

  3. #3
    Registered
    Join Date
    Jan 2004
    Posts
    183
    Man, that's even more confusing than Page Rank!
    It is much simpler. Just look at the example graphs in Ian Rogers' article and imagine these are not pages but sites. Also take a look at the papers at the bottom of my article. I think HostRank is a brilliant idea. It just kicks spammers' ***** big time.

  4. #4
    Administrator Chris
    Join Date
    Feb 2003
    Location
    East Lansing, MI USA
    Posts
    7,055
    Normally I wouldn't allow such self-promotion, but I'm letting this stay as it's worth pointing out to people just how ridiculous HostRank is.

    Putting aside the fact that you have no factual basis for your theory, I can quite easily point out two reasons why it wouldn't work.

    1. Google has no way of knowing what a site is. Some sites span multiple domains, some domains have multiple sites.

    2. If sites were ranked, and not pages, then similar pages on a site would have similar rankings. Likewise, a new page on a high-ranked domain, say MIT.edu or GEOCITIES.com, would automatically receive a bonus (which rankings show they don't).

    HostRank might be your idea of how things should work. And I'm sure you'd love to think you discovered something no other SEO knows. The simple truth, though, is that it's not true. The web is made up of pages; sites are arbitrary.

    You also assume Google is having problems running its algorithms due to the size of its index, claiming that index size has increased faster than computing power. Actually, over the past 2 years processor power has increased, and costs have decreased, by a greater percentage than Google's index has grown. Also, Google's income/capital has increased dramatically (what with AdWords taking off and the IPO). Finally, Google is doing more frequent updates.

    Brin & Page had a far worse resource situation when they started Google, and obviously they didn't think ranking sites was a good idea then; I doubt they'd do it now.


    You also claim a problem with PR is the ability to manufacture it:

    You can hoard PageRank by artificially generating pages and making them link to another page that you want to stuff with PR. Remember, every new page adds a PR of 1 to all the PR on the web and then you can transfer it to another page. By simply adding lots of pages you generate new PR and can transfer it to the pages you want. In this way, you can build monstrous PR with a huge site that does not have a single link from another site! The cost of building new pages (adding PR to the web) is ZERO!
    Yet you fail to realize the endless feedback loop you have created with your idea that, under HostRank, external links don't siphon off rank.

    Under the HostRank model, external links won't leak your HostRank (they may do that in very limited contexts, but in most cases they won't). Under the classic PageRank model, outgoing links leaked some PR.

    All-in-all, HostRank encourages host-to-host voting (linking). You won't lose your rankings if you vote for other sites.

    All outgoing links carry the same weight. You can't bury your link partners 5 levels deep. They'll get an even share of your total HostRank.
    This system would quite literally break Google. Literally. It's like doing this:

    $x = 1;
    while ($x > 0) {
        $x++;
    }

    Eventually the program crashes.

    Each link must have a penalty equal to its benefit, or you're simply manufacturing more and more rank with each iteration of the algorithm; instead of approaching 1, total rank approaches infinity until the numbers grow too large and the system crashes.

    It's a mathematical certainty. By keeping a cost for each link you keep total global rank the same. Then, by adding a damping factor, you slowly decrease it until it reaches 1.
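
    Here's a rough sketch of that on a made-up 3-page graph (the graph, the starting values, and the PHP are mine, purely to illustrate the published PageRank formula, not Google's actual code). Because every page splits its rank among its outgoing links, and the damping factor shaves a bit off each pass, the global total falls toward 1 instead of blowing up:

    <?php
    // Toy example: classic PageRank power iteration on a 3-page graph.
    // A links to B, B links to A and C, C links to A.
    $links = [
        'A' => ['B'],
        'B' => ['A', 'C'],
        'C' => ['A'],
    ];
    $d  = 0.85;   // damping factor
    $n  = count($links);
    $pr = array_fill_keys(array_keys($links), 1.0);   // start every page at rank 1 (total = 3)

    for ($i = 1; $i <= 60; $i++) {
        $next = array_fill_keys(array_keys($links), (1 - $d) / $n);
        foreach ($links as $page => $outlinks) {
            $share = $pr[$page] / count($outlinks);   // the "cost": rank is split, not duplicated
            foreach ($outlinks as $target) {
                $next[$target] += $d * $share;
            }
        }
        $pr = $next;
        if ($i % 20 === 0) {
            printf("iteration %d: total rank = %.4f\n", $i, array_sum($pr));
        }
    }

    Run it and the total drifts from 3 down toward 1 and stays there. Change $share to $pr[$page] -- i.e. let every link pass on its full rank with no cost -- and the total just keeps growing, which is exactly the runaway loop above.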



    You've tried to design a utopian system that is "fair" to all webmasters. And like communism, it works in theory, but not in practice.
    Chris Beasley - My Guide to Building a Successful Website

  5. #5
    Web Monkey MarkB
    Join Date
    Nov 2003
    Location
    London, UK
    Posts
    1,783
    See? I told you it was confusing
    Stepping On Wires - the new blog

  6. #6
    Registered
    Join Date
    Jan 2004
    Posts
    183
    Google has no way of knowing what a site is. Some sites span multiple domains, some domains have multiple sites.
    Under HostRank's model every domain/subdomain is a separate entity.

    If sites were ranked, and not pages, then similar pages on a site would have similar rankings. Likewise, a new page on a high-ranked domain, say MIT.edu or GEOCITIES.com, would automatically receive a bonus (which rankings show they don't).
    It will receive a bonus if the page is linked to from every other page on the site with anchor text full of keywords (to bump up the IR Score). It will also receive good LocalRank if it gets linked from the home page. What's wrong with that? Say a medical site with a huge HostRank publishes a new page about a new disease. Does that page deserve top ranking, or does some lame site with a few link swaps and inflated anchor hits deserve it?
    The Geocities type of site is tackled in the Google paper "Who links to whom", published in 2001, which you haven't read.

    You also assume Google is having problems running its algorithms due to the size of its index.
    Where did I say this? HostRank requires fewer resources, but the major advantage is fighting spam.

    Yet you fail to realize the endless feedback loop you have created with your idea that, under HostRank, external links don't siphon off rank.
    You don't seem to understand it. Under the classic PR model, external links leak PR because a big part of a page's PR comes from internal laundering of external PR: by linking out, you lower the PR that gets fed back from your internal pages. Under HostRank, rank comes from other sites, so leakage happens only in artificial networks. Example: you have a high-HostRank site that links to a bunch of sites of yours, which link back to it. If you start putting external links on the high-HostRank site, your other sites will lose HostRank and will feed you back less HostRank. But most sites rely on links from sites you don't control. You only leak HostRank when you launder it through your own other sites. In the real world, HostRank leads to much less leakage than PageRank.
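
    To make the internal feedback concrete, here's a toy calculation (a made-up 3-page site and my own PHP, nothing from Google): the classic per-page PR iteration run once on a closed site and once after one subpage gets an external outbound link. The home page's PR drops in the second run only because its own subpage now feeds back less -- that's the laundering effect I'm describing:

    <?php
    // Toy sketch of classic per-page PageRank; links that leave the graph simply leak.
    function toy_pagerank(array $links, int $iterations = 60, float $d = 0.85): array {
        $n  = count($links);
        $pr = array_fill_keys(array_keys($links), 1.0 / $n);
        for ($i = 0; $i < $iterations; $i++) {
            $next = array_fill_keys(array_keys($links), (1 - $d) / $n);
            foreach ($links as $page => $outlinks) {
                $share = $pr[$page] / max(count($outlinks), 1);
                foreach ($outlinks as $target) {
                    if (isset($next[$target])) {   // targets outside the site get nothing back
                        $next[$target] += $d * $share;
                    }
                }
            }
            $pr = $next;
        }
        return $pr;
    }

    // Closed site: home links to two subpages, both subpages link only back to home.
    $closed = ['home' => ['sub1', 'sub2'], 'sub1' => ['home'], 'sub2' => ['home']];

    // Same site, but sub1 now also links out to some external site.
    $open = $closed;
    $open['sub1'] = ['home', 'some-external-site.example'];

    printf("home PR, no external link:   %.4f\n", toy_pagerank($closed)['home']);
    printf("home PR, with external link: %.4f\n", toy_pagerank($open)['home']);

    Home drops roughly from 0.49 to 0.25 here, purely because sub1 now returns only half of its rank -- that's the classic-PR leakage through your own pages. Under HostRank the same effect only shows up when the sites doing the feeding back are also yours.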

    This system would quite literally break Google.
    Nonsense.

    Each link must have a penalty equal to its benefit, or you're simply manufacturing more and more rank with each iteration of the algorithm; instead of approaching 1, total rank approaches infinity until the numbers grow too large and the system crashes.
    PR/HostPR is based only on incoming links. You don't leak/penalize explicitly, but implicitly, by getting less back from the pages/sites you control.

    You've tried to design a utopian system that is "fair" to all webmasters. And like communism, it works in theory, but not in practice.
    Chris, you gave it a quick look and didn't get any of it. Your comments are too rushed. Think first, write next. Sit down, think about it, read the papers (one of them discusses the HostRank model in detail and explains what I am trying to tell you) and then come back with comments.

    Normally I wouldn't allow such self-promotion, but I'm letting this stay as it's worth pointing out to people just how ridiculous HostRank is.
    The paper that discusses HostRank won an award. Yet your 45 seconds of reading proclaimed it ridiculous. If you think I am a spammer, delete my account and all my posts.

  7. #7
    Administrator Chris
    Join Date
    Feb 2003
    Location
    East Lansing, MI USA
    Posts
    7,055
    Under HostRank's model every domain/subdomain is a separate entity.
    So then we'd have subdomain spam? Domains are cheap too.

    Also, I didn't give it a quick look. This idea has been brought up before, I've discussed it before, and it doesn't work.

    Just because someone working for Google wrote a paper on something does not mean Google uses it. Google has dozens of PhDs working for it, and each one of those people has written multiple papers.

    Before you come up with the idea that Google has made a radical change to how they rank pages and start patting yourself on the back for your insight, get actual empirical evidence.



    Oh... I just thought of another thing that disproves your theory. You claim that all external host links on a site get equal benefit. Well then how come you receive a far greater jump in the rankings when you're linked to on the homepage of a highly ranked site than when your link is buried deep?
    Chris Beasley - My Guide to Building a Successful Website

  8. #8
    Registered
    Join Date
    Jan 2004
    Posts
    183
    So then we'd have subdomain spam? Domains are cheap too.
    It is possible. Pages don't cost a dime either. If I am right about HostRank, not many people know that subdomain/domain spamming works. It is obvious that more people try to spam with pages than with domains.

    Also, I didn't give it a quick look. This idea has been brought up before, I've discussed it before, and it doesn't work.
    Any link? What are the reasons it does not work?

    Before you come up with the idea that Google has made a radical change to how they rank pages and start patting yourself on the back for your insight, get actual empirical evidence.
    I have evidence, and I am running a test on my site. On my link exchange pages there were about 80-90 internal and 10 external links. That prevents PR leakage under the PR model. I have removed almost all internal links and left only the 3 links I care about + 2 more (add your link, link directory). Under the page-to-page PR model, I will have PR leakage and my rankings will drop. Under HostRank/LocalRank, I will move up, because I've bumped up the LocalRank of the 3 important pages that were diluted by all those unimportant links before. I'll report the results when Google picks up the changes. I am not working on any other incoming links. If I'm right I will get a boost; if I'm wrong I will drop in the SERPs, right? Let's see what happens.

    Oh... I just thought of another thing that disproves your theory. You claim that all external host links on a site get equal benefit. Well then how come you receive a far greater jump in the rankings when you're linked to on the homepage of a highly ranked site than when your link is buried deep?
    I claim they get the same HostRank benefit. I will also claim in my next articles that the IR Score, which takes anchor text into account, actually looks at where the linking page is placed and what's on that page.

  9. #9
    Administrator Chris
    Join Date
    Feb 2003
    Location
    East Lansing, MI USA
    Posts
    7,055
    I have evidence, and I am running a test on my site. On my link exchange pages there were about 80-90 internal and 10 external links. That prevents PR leakage under the PR model. I have removed almost all internal links and left only the 3 links I care about + 2 more (add your link, link directory). Under the page-to-page PR model, I will have PR leakage and my rankings will drop. Under HostRank/LocalRank, I will move up, because I've bumped up the LocalRank of the 3 important pages that were diluted by all those unimportant links before. I'll report the results when Google picks up the changes. I am not working on any other incoming links. If I'm right I will get a boost; if I'm wrong I will drop in the SERPs, right? Let's see what happens.
    That's hardly a controlled experiment.

    You might not have enough PR for it to be statistically significant.

    The value of incoming links to the site could change.

    You're not actually testing any site-to-site relations.

    The change might be small enough that it'd go unnoticed in rankings.

    A change in rankings might happen not because of anything you've done, but rather because other pages ranked for your terms have gotten better/worse.

    Do you have any background in science or math?





    The idea that Google is looking at unique domains in its linking algorithm is not far-fetched. It is likely done by page, not by site, though. Looking at unique domains is a good anti-spam tool.

    However, you seem to have morphed that concept into, as I said before, your own utopian view of how things should work. You are inventing concepts and trying to pass them off as fact without any empirical evidence (even if your experiment were a valid test of site-to-site links, you yourself admit it's not done yet). This type of behavior is not conducive to credibility.
    Chris Beasley - My Guide to Building a Successful Website

  10. #10
    Registered
    Join Date
    Jan 2004
    Posts
    183
    You might not have enough PR for it to be statistically significant.
    All 70 link exchange pages are PR4. The home page is PR6. I get outranked by PR5 and PR4 home pages, and the #1 ranked site is PR4 with no DMOZ or Yahoo listing and a poor internal linking structure (based on the classic PR model, with lots of leakage). The competitors that outrank me don't actually protect against PR leakage. I am the only one who does.

    The value of incoming links to the site could change.
    Maybe. My rankings have been stable for the last 8 months. A noticeable change up or down would be indicative of either my experiment or a change in Google's algo.

    You're not actually testing any site-to-site relations.
    I am testing LocalRank. If it exists, HostRank must also exist.

    The change might be small enough that it'd go unnoticed in rankings.
    Yes. In that case this experiment won't show definite results.

    A change in rankings might happen not because of anything you've done, but rather because other pages ranked for your terms have gotten better/worse.
    Yes, but as I said, my rankings have been too stable for the last 8 months for me to believe the changes would come from outside my site.

    Do you have any background in science or math?
    Yep. I graduated from a math high school. I took first place at a national math competition (best in my country), and I've competed in various national/international programming competitions, with one national 1st place. I took 1st place at an international conference of young scientists with a data compressor (it was faster than all the compressors of the time, zip, arj, etc., and it compressed better). I have also worked on various programming projects with many other top programmers, been the captain of the university programming team, and studied with the coach of our national programming team. We actually won a competition against a team that was ranked 17th at the ACM finals. So, I know a thing or two about programming and algorithms. At these competitions we solve problems against the clock: if your program does not run within the allotted time, you get 0 points. Of course, if you want points you have to use the smart algorithm, etc.

    However, you seem to have morphed that concept into, as I said before, your own utopian view of how things should work. You are inventing concepts and trying to pass them off as fact without any empirical evidence (even if your experiment were a valid test of site-to-site links, you yourself admit it's not done yet). This type of behavior is not conducive to credibility.
    My article is based on papers and on experience (both with search engines and with math/algorithms). You may or may not believe me. In fact, I don't care, but your comments are insulting. I understand your point: there's a truckload of SEOs who have never written a line of code in their lives trying to explain how Google works. I understand your reaction. But I am not an idiot.

  11. #11
    Administrator Chris
    Join Date
    Feb 2003
    Location
    East Lansing, MI USA
    Posts
    7,055
    All 70 link exchange pages are PR4. The home page is PR6. I get outranked by PR5 and PR4 home pages, and the #1 ranked site is PR4 with no DMOZ or Yahoo listing and a poor internal linking structure (based on the classic PR model, with lots of leakage). The competitors that outrank me don't actually protect against PR leakage. I am the only one who does.
    This illustrates my point nicely. You're not ranked as you think you should be, so you concoct this explanation. Maybe the pages ranking higher than you have better-quality incoming links.

    I am testing LocalRank. If it exists, HostRank must also exist.
    This is why I wonder if you have a scientific background. You express absolutely zero knowledge of the scientific method. You cannot make a jump like that without proof.

    Maybe. My rankings have been stable for the last 8 months. A noticeable change up or down would be indicative of either my experiment or a change in Google's algo.
    Maybe = improper experiment. You must isolate all variables other than the one you're testing, and you must have a control group. It's a lot of work, I know, but it's the only way to know for sure.



    Also, like I've said, your whole theory makes no sense.

    According to your article, a site is given a HostRank score, a page is given a LocalRank score, and the two are multiplied to give the page its rank. The LocalRank score is based only on internal links, and the HostRank score is based only on external links from different domains.

    On my literature site I have 13,000 pages of content arranged hierarchically. The internal linking structure is almost constant, so under your system each author page would have roughly the same LocalRank score, as would each book page and each chapter page. If anything, the authors with more books published on the site (or more pages of content) would have the highest scores.

    Any incoming link pointing to any page on the site would raise the site's HostRank, but do nothing directly for the page the link points to.

    Why, then, do some pages rank better than other pages? Why do my pages with Yahoo listings rank better than pages without Yahoo listings? Why is it that the more links an author page gets, the better it ranks? Because of off-page content? Surely you recognize, with your math background, that that has to be a percentage-based score?
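
    To put that objection in concrete terms, here's your scoring model as I read it, with numbers I invented purely for illustration (my reading of your article, not anything Google has published): two author pages on the same host, with identical internal linking, get identical scores no matter how many outside links point directly at one of them.

    <?php
    // The model as described: page score = HostRank(site) * LocalRank(page),
    // where LocalRank comes only from internal links and HostRank only from
    // links coming in from other domains. All numbers are invented.
    function page_score(float $hostRank, float $localRank): float {
        return $hostRank * $localRank;
    }

    $siteHostRank    = 6.0;   // one score for the whole literature site
    $authorLocalRank = 0.7;   // identical internal linking => identical LocalRank

    // Author page with a Yahoo listing and dozens of direct incoming links:
    $popularAuthor = page_score($siteHostRank, $authorLocalRank);
    // Author page that nobody links to directly:
    $obscureAuthor = page_score($siteHostRank, $authorLocalRank);

    printf("popular author page: %.2f\n", $popularAuthor);   // 4.20
    printf("obscure author page: %.2f\n", $obscureAuthor);   // 4.20 -- the same score

    Yet in the actual rankings the page with more direct incoming links does better, which this model can't explain.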

    I'm going out of town now, so I won't be responding to this anymore until tomorrow. My final thought is: why is it that you, a person with no reputation in the search engine world, are the person who came up with this? You could argue that reputation is meaningless, but in reality reputation roughly equates to experience. Why are none of the well-recognized, experienced people in this industry talking about PageRank being over and HostRank "rullz"?

    Okay... final thought #2....

    Your system allows webmasters to rank their own content in order of importance. This is a step behind the current system, which allows the Internet as a whole to rank each page on your site as it sees fit. Why would your system be better? To fight spam? There are easier ways. To reduce processing time? That's not needed.
    Chris Beasley - My Guide to Building a Successful Website

  12. #12
    Registered
    Join Date
    Jan 2004
    Posts
    183
    Chris,
    arguing with you is pointless. The guys who won an award for their HostRank paper are idiots. I am an idiot. And you are the search engine expert.

    Finally:
    Why are none of the well-recognized, experienced people in this industry talking about PageRank being over and HostRank "rullz"?
    Experience = leverage, not search engine knowledge. Most "reputable" SEOs have only leverage and have never written even "Hello World", let alone solved 7 difficult problems in 5 hours involving graph algorithms, dynamic optimization, etc., with all the huge test cases running in under 1 second. According to your logic, Jill Whalen is a reputable expert. Any person of average intelligence could have been 1000 times more reputable if he had started some years ago. All the "reputable experts" simply started long ago. That's leverage, not SEO knowledge. The only real experts work for the search engines.

  13. #13
    Administrator Chris
    Join Date
    Feb 2003
    Location
    East Lansing, MI USA
    Posts
    7,055
    Actually I was specifically not thinking of Jill Whalen -- she isn't a technical SEO person.

    I was thinking more of someone like Dan Thies. He does a lot of SE algorithm testing and has previously written papers on major changes.
    Chris Beasley - My Guide to Building a Successful Website

  14. #14
    Registered
    Join Date
    Feb 2005
    Location
    Kaliningrad
    Posts
    22
    I only skimmed the article, but it looks like nonsense. I'm pretty sure Google has a sandbox. A few of my sites are ranked no. 1 on MSN and Yahoo for competitive keywords, but because they are only a couple of months old they are not ranked AT ALL in the Google SERPs for the same keywords. One of them is in DMOZ and Zeal, and I've bought links to it on several similar sites. I made a press release for the site, like you said; I give away articles, like you said; but my site still isn't ranked on Google. I'm in the 'sandbox', no doubt about it. I've talked to loads of people who are full-time SEO consultants, and almost all of them agree with the sandbox theory.

    Yes, internal linking is important, and PageRank isn't as important as it was a few years ago, but I dismiss HostRank. LocalRank could be real; I'm not going to comment on that.
    Anna Gorbachev.

  15. #15
    Senior Member chromate
    Join Date
    Aug 2003
    Location
    UK
    Posts
    2,348
    I don't dismiss what nohaber is saying. I certainly wouldn't call the whole thing nonsense without reading it (I have read it). I don't necessarily agree with all of it either though.

    It's actually a very interesting idea. I can see why whoever wrote it got an award for the paper. I need time to fully think it through though (I'm slow I know!!)
