  • Has Google run out?

    A rumour is spreading like the plague along the infinite paths of the Internet: Google is overbooked. The most popular creature in the virtual universe has a space problem, or, more precisely, could have one in the near future. It seems that the world's most popular search engine is about to reach the limit on the number of pages it can list: 4,294,967,296. It is a numeric problem, and it stems mainly from a miscalculation made when the index was designed.

    In Google's giant database, each link is identified by only 4 bytes, which allows for just over four billion distinct entries. In other words, since the engine's home page already boasts a count of roughly three billion web pages, it is quite possible that the index is about to fill up. If the system of listing links is not changed, the Internet will keep growing and expanding behind Google's back: newly created sites will be left out, while the database remains packed with an enormous quantity of obsolete pages crowding the search results.
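    For the curious, the arithmetic behind that figure is easy to check. The sketch below (in Python, purely for illustration) assumes, as the rumour claims, that each indexed page is identified by a single unsigned 4-byte integer and that roughly three billion pages are already listed; both numbers come from the article, not from Google.

        # Back-of-the-envelope check of the claimed 4-byte document-ID limit.
        ID_BYTES = 4                      # one unsigned 32-bit identifier per indexed page (as claimed)
        max_ids = 2 ** (8 * ID_BYTES)     # 4,294,967,296 distinct IDs

        pages_indexed = 3_000_000_000     # roughly the count shown on Google's home page at the time
        remaining = max_ids - pages_indexed

        print(f"Maximum addressable pages: {max_ids:,}")
        print(f"Pages already indexed:     {pages_indexed:,}")
        print(f"Head-room left:            {remaining:,}")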

    Analysts arrived at this conclusion by studying the system's recent "strange behaviour" during the latest refresh of its contents. Roughly once a month, Google reorders its listed pages. During this process, the system recalculates the so-called PageRank of each page, based on the number and importance of the links pointing to it, and thus ranks it in order of importance. It then incorporates the newly discovered pages into its list of available web sites, periodically reshuffling the search results. In jargon this updating process is called the Google Dance, and it lasts approximately four days. During the last Google Dance, however, many pages changed position in the rankings in ways no one could explain. This, together with other anomalous events, has created unrest among users.
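    For readers who want a feel for what "calculating the PageRank of each page" involves, here is a toy power-iteration sketch on a made-up four-page link graph. It only illustrates the general idea published by Brin and Page, not Google's actual code; the graph, damping factor and iteration count are all arbitrary choices.

        # Toy PageRank by power iteration on a tiny, invented link graph.
        # Each key links out to the pages listed in its value.
        links = {
            "A": ["B", "C"],
            "B": ["C"],
            "C": ["A"],
            "D": ["C"],
        }

        damping = 0.85                    # damping factor from the original PageRank paper
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}

        for _ in range(50):               # a few dozen iterations converge for a graph this small
            new_rank = {}
            for p in pages:
                incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
                new_rank[p] = (1 - damping) / len(pages) + damping * incoming
            rank = new_rank

        for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
            print(f"{page}: {score:.3f}")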

    Since Google became a reality on the Internet in 1997, the search engine has evolved into what is, without a doubt, the most powerful instrument on the Internet. Its creators, Sergey Brin and Larry Page, have not only become multimillionaires, but also two of the greatest heroes of a virtual community whose dominion extends all around the world, thanks in particular to their list of links. In a certain sense, Google is the spine of the Internet, and any problem that afflicts its workings can influence the growth of the Internet itself. But what can be done to keep efficiently listing a sea of pages that continues to multiply exorbitantly?

    If the problem really exists, and is as grave as claimed, it is ultimately a matter of miscalculation. The Internet itself can grow without limit, but the systems of protocols underneath it (the maps of letters and numbers that allow us to navigate) cannot. The mistake was to design a system with a built-in ceiling on growth, as has already happened with IP addressing. Similar near-panics have arisen before: at the end of 1999, the entire world was apprehensive about the consequences of rolling over from 99 to 00. Would every file go haywire, with computers convinced they were working in 1900, creating an infinity of problems? This collective anxiety, whipped up by nervous media all over the world, was called the Year 2000 (Y2K) bug. In the end everything went well, and not the smallest trace of the problem has been seen since.
    GSMBOX

  • #2
    So is everyone going back to Excite?

    Of course the web will indeed continue to grow. Question is how many of these sites-to-come will be of consequence.

    Likely to be a small percentage indeed. Not difficult to concieve of them being added manually, so to speak as sites of merit are found.

    Google has never been a static creature, and development is ongoing.

    I'd certainly hate to see development stop, but at this time I just can't imagine that is going to happen in the near future.

    Ah well, perhaps it is time to dust off the links to those "deep web" search engines, they always did turn up some bizarrely interesting reading anyway.



    • #3
      All problems have a solution; I'm sure they will find one for this problem too... of course, that is only if this 'rumor' is true.

      I'd like to hear from Google whether this is actually the case, and whether they can/will do anything about it, before everyone gets in a panic over losing their current favorite search engine.



      • #4
        I guess this is why they developed that PageRank system.
        They might be getting rid of the pages with very low PageRank, as those are mostly irrelevant or not offering good content (and thus getting few hits), but this is just speculation.
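        If they really did reclaim IDs that way, the mechanics would look something like the sketch below: keep the index at a fixed capacity and evict the lowest-ranked entry whenever something new has to come in. The capacity, scores and URLs are all invented; this is just the speculation above dressed up as code.

            # Hypothetical eviction of the lowest-ranked page once a fixed-size ID space is full.
            CAPACITY = 5                                 # tiny, invented, for illustration only

            index = {                                    # page -> rank score (made-up numbers)
                "example.com/a": 0.91,
                "example.com/b": 0.12,
                "example.com/c": 0.55,
                "example.com/d": 0.03,
                "example.com/e": 0.47,
            }

            def add_page(index, url, score, capacity=CAPACITY):
                """Insert a page, evicting the lowest-ranked entry if the index is full."""
                if url not in index and len(index) >= capacity:
                    worst = min(index, key=index.get)    # victim: the entry with the lowest rank
                    del index[worst]
                index[url] = score

            add_page(index, "example.com/new", 0.70)     # example.com/d (rank 0.03) gets dropped
            print(sorted(index.items(), key=lambda kv: -kv[1]))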

        we should rather move this thread to "rumours and speculations" forums :p



        • #5
          Originally posted by asklepios:
          we should rather move this thread to "rumours and speculations" forums :p
          I'll not only 2nd the motion but move it as well. :laugh:



          • #6
            I too am confident in Google's ability to cope. If this really is a problem, then they will just start (as was mentioned) bumping off the sites at the very bottom until it is convenient to upgrade. Since they are constantly upgrading anyway, this is no big deal whatsoever, and no doubt they have been planning for it since day one.



            • #7
              Could someone sort of clarify what's happening here? I somewhat understand, but then I don't get it at all :o



              • #8
                It's pretty much exactly like reaching the limit of a file system (the 2 GB barrier, the 32 GB barrier, the 137 GB barrier), which in turn is pretty much like running out of phone numbers. Say you are assigning homes phone numbers, you start at 555-0000, and you reach 555-9999: all of a sudden you have to change the format to accommodate more. That isn't a great example, but suppose we fill up all the area codes, so that every number from 000-000-0000 to 999-999-9999 is taken; then, all of a sudden, the format would have to change.

                So why don't they just start with some HUGE number? Because each number takes up space, and in an extremely extensive file system that becomes bulky and inefficient. For Google's system to have lasted six years is a lot, and they were very forward-thinking in 1997 to do it the way they did.

                Honestly, from an engineering perspective, this is nothing more than a system-wide renovation.
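                The "the format would have to change" point can be demonstrated directly by trying to squeeze one-past-the-limit into a fixed 4-byte field. The snippet below uses Python's struct module purely as an illustration; it has nothing to do with how Google actually stores its IDs.

                    import struct

                    limit = 2 ** 32                      # 4,294,967,296: one more than the largest 4-byte unsigned value

                    # The last representable ID still fits in 4 bytes...
                    print(struct.pack(">I", limit - 1))  # b'\xff\xff\xff\xff'

                    # ...but the next one does not: packing it raises struct.error.
                    # That is the moment the "format" has to change.
                    try:
                        struct.pack(">I", limit)
                    except struct.error as exc:
                        print("overflow:", exc)

                    # Widening the field fixes it, at the cost of more space per ID.
                    print(struct.pack(">Q", limit))      # 8 bytes: b'\x00\x00\x00\x01\x00\x00\x00\x00'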



                • #9
                  This is why 64-bit processing will eventually take over from our current 32-bit, and servers will likely go to 128-bit to address this need. There are a couple of other ways around it too, but they're far too technical for me to explain (it needs someone with more grey matter than me to explain them properly).
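                  Strictly speaking the question is how wide the per-page IDs are rather than the CPU word size, but the trade-off is easy to put numbers on. The comparison below assumes the roughly three-billion-page figure quoted earlier and counts only the raw ID storage, nothing else.

                      # Capacity vs. raw storage cost of widening the per-page ID.
                      pages = 3_000_000_000                   # rough figure quoted earlier in the thread

                      for id_bytes in (4, 8):
                          capacity = 2 ** (8 * id_bytes)
                          raw_cost_gb = pages * id_bytes / 10 ** 9
                          print(f"{id_bytes}-byte IDs: up to {capacity:,} pages, "
                                f"~{raw_cost_gb:.0f} GB just to store one ID per page")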

