views: 118
answers: 4

Hi all! There are a lot of websites that look professional in Google results. Try searching for 'stackoverflow' and at the top you'll see a result with a title, a description, and a table of 8 links to stackoverflow categories. That's what I'm interested in producing for future websites.

So what must be done? Does it depend on the number of visitors? How long does it take until the results start looking like that?

+2  A: 

I think you are referring to "sitelinks". Google generally does not make public exactly how those are generated (to prevent abuse, for example). I suspect the subpages need to be very strongly linked, perhaps about as strongly as, or more strongly than, the top-level page; there is no way to know for sure. The best way to make your website look good in Google is to make it as user-friendly and human-friendly as possible. Google typically looks for clues as to whether a website will be relevant to humans, and it very likely penalizes content that degrades the interface just to be search-engine optimized.

Michael Aaron Safyan
+1  A: 
  1. Make sure that each page (not just your home page) has a title.

  2. Include a description meta tag, which search engines may (or may not) use for the snippet they display.
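
A minimal sketch of both points; the page title, description text, and site name here are just placeholders:

    <!-- hypothetical page head: every page gets its own title and description -->
    <head>
      <title>Widget Pricing - Example Store</title>
      <meta name="description"
            content="Compare prices and specs for all Example Store widgets.">
    </head>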

Rob Lachlan
A: 

If an unordered list (<ul><li><a href="http://...">Home...) is used for navigation on the page, Google will pick that up and display it underneath the page listing when that listing is in the #1 or #2 position.
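
A fuller sketch of that navigation pattern (the link targets and labels are placeholders):

    <!-- hypothetical site navigation: a plain unordered list of links -->
    <ul id="navigation">
      <li><a href="http://example.com/">Home</a></li>
      <li><a href="http://example.com/products">Products</a></li>
      <li><a href="http://example.com/support">Support</a></li>
      <li><a href="http://example.com/about">About</a></li>
    </ul>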

Google may also use the description meta tag, or the first few lines of text that appear on the page, for the snippet underneath the entry. It usually does this for listings in the other positions.

Chris Perkins
A: 

Once a robot reaches your site, it walks through the pages and downloads the information they contain (on a site this can be text, images, video, and other files). This process is called "crawling". The robot also needs to decide when it will visit the site again, so it builds a schedule that determines the time of its next visit, both to re-check the information it already has about the site and to add new pages to its database if they exist. It is important to "meet" the search robot effectively and to make indexing the site as simple as possible, because its time is limited and it has a great number of sites besides yours in its "visit schedule". That is why you should create the right conditions for it, which you can do as follows:

  • First, make sure the site is actually reachable; you can check this by typing the site's domain name into the browser's address bar.
  • Make sure there are no problems with site navigation. Try not to build menus with JavaScript or Flash; robots do not parse menus made with these technologies reliably (see the plain-HTML fallback sketch after this answer).
  • Check your site for 404 errors.
  • Don't send the robot to pages that are accessible only to registered users.
  • Watch the nesting depth of your pages, because crawlers limit how deep into a site they will go.
  • Keep in mind that the maximum size of the text on a page is 256 KB; try not to exceed this limit.

In general, it is difficult to answer your question directly. Try reading http://phpforms.net/tutorial/tutorial.html to understand Google and other search engines.
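
The plain-HTML fallback mentioned in the navigation point above might look like this minimal sketch (the link targets and labels are placeholders):

    <!-- hypothetical fallback: robots and no-script browsers still see plain links -->
    <noscript>
      <ul>
        <li><a href="http://example.com/">Home</a></li>
        <li><a href="http://example.com/articles">Articles</a></li>
        <li><a href="http://example.com/contact">Contact</a></li>
      </ul>
    </noscript>

Since crawlers read the raw HTML, they can follow these links whether or not they execute the script-driven menu.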