nukeSEO.com FAQ (Frequently Asked Questions)





Questions
·  What is Search Engine Optimization or SEO?
·  What is robots.txt?
·  How does robots.txt differ from the ROBOTS META tag?
·  Do all robots follow the rules for robots.txt?
·  Are META keywords important?
·  What is keyword relevancy?
·  What is keyword density?
·  What is competitive analysis by keyword?
·  Why are META tags important?
·  Should I submit my site to every search engine?
·  What is a Google Sitemap / sitemap.xml?
·  What is SERP?
·  What is search engine saturation?
·  What is link popularity?
·  What are ethical methods of SEO?
·  What factors affect search engine rankings?
·  Are dynamic titles important?


What is Search Engine Optimization or SEO?

from Wikipedia:
Search engine optimization (SEO) is a set of methods aimed at improving the ranking of a website in search engine listings. The term also refers to an industry of consultants that carry out optimization projects on behalf of clients' sites.

Using search engines, visitors can find sites in a variety of ways: via paid-for advertisements in the search engine results pages (SERPs), via third parties who are listed in the search engines, or via "organic" listings, i.e. the results the search engines present to users. SEO is primarily about the latter.

High rankings in the organic SERPs usually translate into a lot of free, targeted traffic for the sites that rank well. Obtaining that traffic by other means can be expensive. For particularly competitive terms the cost of acquisition of a single visitor can be in excess of US$100. For even moderately competitive terms the cost can range from a few cents to several tens of dollars per visitor. Given those costs it often makes sense for site owners to optimize their sites for the free traffic search engines can send.

Not all sites have identical goals in mind when they optimize for search engines. Some sites are seeking any and all traffic, and may be optimized to rank highly for any commonly searched phrase. Other sites target a certain population, and are therefore concerned with their rank on specific search phrases that they believe that population will use. Sites may optimize to attract native speakers of a particular language, to attract visitors from a specific geographical location, or even to attract visitors who use certain misspellings of the target keyword.

What is robots.txt?

robots.txt, found in the root directory of a website, is a specially formatted file that allows a web site administrator to indicate which parts of the site should not be visited by a robot.  Its format is defined by the Robots Exclusion Protocol, and the file must be named robots.txt, in lower case.

You can use robots.txt to help legitimate robots (e.g. Googlebot) find what you want them to find, avoid traps that could cause them to be banned by a tool like NukeSentinel, and most importantly, conserve your limited bandwidth.  Unfortunately, bad robots (e.g. spambots, linkbots) often ignore the robots.txt instructions.  Fortunately, there are ways to stop them.
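
For illustration, a robots.txt along these lines might suit a PHPNuke-style site (the paths and the robot name are examples only; adjust them to your own site's layout):

User-agent: *
Disallow: /admin/
Disallow: /admin.php
Disallow: /includes/

User-agent: BadBot
Disallow: /

The first record asks all robots to skip administrative and internal files; the second bans a specific robot (the hypothetical "BadBot") from the entire site. Of course, records like the second only deter robots that actually honor the protocol.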

from The Web Robots Pages:
The Robots Exclusion Protocol is a method that allows Web site administrators to indicate to visiting robots which parts of their site should not be visited by the robot.

In a nutshell, when a Robot visits a Web site, say http://www.foobar.com/, it first checks for http://www.foobar.com/robots.txt. If it can find this document, it will analyse its contents for records like:

User-agent: *
Disallow: /
to see if it is allowed to retrieve the document. The precise details on how these rules can be specified, and what they mean, can be found on The Web Robots Pages.
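
As a further sketch of how such records combine (the agent names and path are illustrative):

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /cgi-bin/

An empty Disallow value means nothing is disallowed, so this file grants Googlebot access to everything while asking all other robots to stay out of /cgi-bin/. A robot obeys the record whose User-agent line best matches its own name, and each Disallow value is treated as a simple path prefix.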

How does robots.txt differ from the ROBOTS META tag?

robots.txt is used to tell robots which parts of a site should not be visited, whereas the ROBOTS META tag tells a robot how to treat a specific page.

from The Web Robots Pages:
The ROBOTS META tag allows HTML authors to indicate to visiting robots if a document may be indexed, or used to harvest more links. No server administrator action is required.

Note that currently only a few robots implement this.

In this simple example:

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
a robot should neither index this document, nor analyse it for links.

Full details on how this tag works are provided on The Web Robots Pages.
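
For reference, the CONTENT attribute combines an indexing directive with a link-following directive, with INDEX and FOLLOW as the defaults:

<META NAME="ROBOTS" CONTENT="INDEX, FOLLOW">
<META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW">
<META NAME="ROBOTS" CONTENT="INDEX, NOFOLLOW">

The first line states the default behaviour explicitly (it may simply be omitted), the second asks robots not to list the page but still to follow its links, and the third the reverse.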



Do all robots follow the rules for robots.txt?

The robots.txt file is only a guide, and some bad robots ignore it.  Fortunately, there are tools for enforcing compliance with the Robots Exclusion Protocol, such as NukeSentinel (mentioned above), which can ban robots that ignore robots.txt and wander into disallowed traps.

Are META keywords important?

According to many search engine experts, META keywords have lost prominence as search engines have become more advanced.  However, most also agree that META keywords should not be ignored.  Intelligent search engines that assess content quality may also calculate keyword density based on the META keywords and use that to rate a site.  If keyword density for META keywords is too high, a site's ranking may be negatively impacted, as the search engine may view this as keyword spamming, an unethical SEO practice.
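
As a sketch, a META keywords tag for a page like this one might look as follows (the keyword list is purely illustrative; keep it short and drawn from the page's actual content):

<META NAME="KEYWORDS" CONTENT="search engine optimization, SEO, robots.txt, META tags">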

What is keyword relevancy?

Keyword relevancy describes how important a META keyword actually is to a web page's content.  It is usually defined in terms of keyword density - the ratio of a keyword's occurrences to the total number of words on a page.  Low keyword density means that a keyword isn't relevant to the page and may be treated negatively by search engines.  High keyword density may indicate an unethical SEO practice known as keyword stuffing or spamming - repeating keywords frequently on a page simply to increase density - which may also be treated negatively by search engines.

What is keyword density?

From Wikipedia:
Keyword density is the percentage of words on a web page that match a specified set of keywords. In the context of search engine optimization, keyword density can be used as a factor in determining whether a web page is relevant to a specified keyword or keyword phrase. Because keyword density is easy to manipulate, search engines usually implement other measures of relevancy to prevent unscrupulous webmasters from creating search spam through practices such as keyword stuffing.
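
As a concrete example, here is a minimal keyword-density calculation in Python (the function name and sample text are our own, not taken from any particular tool):

# Percentage of words on a page that match a given (single-word) keyword.
def keyword_density(text, keyword):
    words = [w.strip('.,!?;:') for w in text.lower().split()]
    if not words:
        return 0.0
    matches = sum(1 for w in words if w == keyword.lower())
    return 100.0 * matches / len(words)

# A 10-word sample in which "root" appears twice -> 20% density.
sample = "Root beer is a classic soda; root beer fans agree."
print(keyword_density(sample, "root"))  # 20.0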


What is competitive analysis by keyword?

Competitive analysis by keyword, also referred to as keyword analysis, involves determining the top-ranking URLs (the competition) for a keyword, along with additional information for each of those URLs such as backlinks, indexed pages, and PageRank.  Keyword analysis can be used to identify the most appropriate search terms for optimizing a site or page for search engines.

Why are META tags important?

From Wikipedia:
Meta tags have been the focus of a field of marketing research known as search engine optimization or SEO. In the mid to late 1990s, search engines were reliant on Meta tag data to correctly classify a web page. Webmasters quickly learned the commercial significance of having the right Meta tag, as it frequently led to a high ranking in the search engines - and thus, high traffic to the web site.

As search engine traffic achieved greater significance in online marketing plans, consultants were brought in who were well versed in how search engines perceive a web site. These consultants used a variety of techniques (legitimate and otherwise) to improve ranking for their clients. You can see any web site's Meta tags by viewing the source code.

The two most popular Meta tags used for search engine optimization are the Meta description tag and the Meta keywords tag. The Meta description tag allows web site owners to describe what the web page is about. Search engines such as Google still use this to display the description of a web site in the search results. The Meta keywords tag allows site owners to list keywords that help search engines decide what a web site is about.

In the early 2000s, search engines veered away from reliance on Meta tags, as many web sites used inappropriate keywords or were keyword stuffing to obtain any and all traffic possible.

Some search engines, however, still take Meta tags into some consideration when delivering results. In recent years, search engines have become smarter, penalizing web sites that are cheating (by repeating the same keyword several times to get a boost in the search ranking). Instead of going up in the ranking, these sites will go down in ranking or on some search engines, will be kicked out of the search engine completely.
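
To make the two tags concrete, a Meta description tag looks like this (the wording is illustrative only):

<META NAME="DESCRIPTION" CONTENT="Answers to frequently asked questions about search engine optimization, robots.txt, META tags and sitemaps.">

The keywords tag shown in the META keywords answer above follows the same NAME/CONTENT pattern.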

Should I submit my site to every search engine?

Several factors may influence your decision about which search engines you want your site listed in.  Just as you determine which search terms you want to rank highly for, where those searches occur can affect your site traffic - the number and frequency of visitors.  Having high rankings on a search engine no one uses doesn't help, but limiting your submissions to only a few engines may also limit your traffic.

Over the years, SearchEngineWatch.com has used several measures to report search engine statistics, including searches per day, search hours per day, visits per day, surveys and more.  The latest statistics for searches per day, as of July 2005, show that the top 4 search engines (Google, Yahoo, MSN and AOL) account for almost 90% of daily searches.  Clearly, these four search engines should be on your submission list.

Some of these search engines (and many others) look for inclusion in web directories like DMOZ to determine whether a site should be included and how it should be ranked.  Thus, it is important to submit to those directories as well.

What is a Google Sitemap / sitemap.xml?

A Google sitemap is a specially formatted file, often named sitemap.xml, used by Google to quickly identify the pages on a website.  Currently in Beta, the Google Sitemap program has proven to be very effective at quickly improving inclusion for new sites, page ranking, and saturation.
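
A minimal sitemap.xml is simply a list of URL records (the address, date and values below are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/index.php</loc>
    <lastmod>2005-10-27</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

Each <url> record gives a page address plus optional hints about how often it changes and how important it is relative to the site's other pages.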

What is SERP?

SERP is an acronym for Search Engine Result Pages.  It commonly refers to a site's keyword ranking or position in a search engine and can be defined in terms of page and / or position.  For example, as of October 27, 2005, the Google rankings for http://www.root-beer.org for the keyword "root beer" are first page, first position, while the MSN rankings are first page, second position.

Having a high keyword ranking is the ultimate goal of search engine optimization (SEO), but achieving that goal can be very difficult, depending on the popularity of the keyword, the number of competing sites, and a large number of other factors, depending on the specific search engine.

What is search engine saturation?

Search engine saturation refers to the number of indexed pages for a website.  Although a higher number is generally better, the important measure is the percentage of a site's important pages that are indexed.  Low search engine saturation can be the result of problems with the robots.txt file (incorrectly disallowing pages), pages missing from the sitemap, or unlinked pages.
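
You can check saturation directly with the site: operator that the major engines support; for example, searching Google for:

site:www.example.com

lists the pages Google has indexed for that host (www.example.com being a placeholder for your own domain).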

What is link popularity?

From Wikipedia:
Link popularity is a measure of the quantity and quality of other web sites that link to a specific site on the World Wide Web. It is an example of the move by search engines towards off-the-page criteria to determine quality content. In theory, off-the-page criteria add an aspect of impartiality to search engine rankings.

Link popularity plays an important role in the visibility of a web site among the top of the search results. Indeed, some search engines require at least one link pointing to a web site; otherwise they will drop it from their index.

Search engines such as Google use a special link analysis system to rank web pages. Citations from other WWW authors help to define a site's reputation. The philosophy of link popularity is that important sites will attract many links. Content-poor sites will have difficulty attracting any links. Link popularity assumes that not all incoming links are equal, as an inbound link from a major directory carries more weight than an inbound link from an obscure personal home page. In other words, the quality of incoming links counts more than sheer numbers of them.

NOTE:  Google's latest ranking update, called Jagger, decreases the significance of link popularity, favoring content quality instead.

What are ethical methods of SEO?

From Wikipedia:

So-called "Ethical" methods of SEO involve following the search engines' guidelines (Google, Yahoo) as to what is and what isn't acceptable. Their advice generally is to create content for the user, not the search engines; to make that content easily accessible to their spiders; and to not try to game their system. Often webmasters make critical mistakes when designing or setting up their web sites, and "poison" them so that they will not rank well. Ethical SEO attempts to discover and correct mistakes, such as unreadable menus, broken links, temporary redirects, or a generally poor navigation structure that places pages too many clicks from the home page.

Because search engines are text-centric, many of the same methods that are useful for web accessibility are also advantageous for SEO. Methods are available for optimizing graphical content, even Flash animation (by placing a paragraph or division within, and at the end of the enclosing OBJECT tag), so that search engines can interpret the information.

Some methods considered ethical by the search engines:

  • Using a robots.txt file to grant permissions to spiders to access, or avoid, specific files and directories in the site
  • Using a short and relevant page title to name each page
  • Using a reasonably sized description meta tag without excessive use of keywords, exclamation marks or off topic comments
  • Keeping the page accessible via links from other pages on the site and, preferably, from a sitemap
  • Developing links via natural methods: Google doesn't elaborate on this somewhat vague guideline, but buying a link from an off-topic page purely because it has a high PageRank is probably not considered acceptable. Dropping an email to a fellow webmaster telling him about a great article you've just posted, and requesting a link, is most likely acceptable.


What factors affect search engine rankings?

If there were a simple answer or formula for search engine ranking success, there would be no need for the excellent, almost scientific reviews of search engine ranking factors that SEO experts publish; rankings depend on a large number of on-page and off-page factors that vary by engine.

Are dynamic titles important?

First, to clarify, the use of dynamic titles refers to changing the page <TITLE> element based on the contents of the page, rather than having a single TITLE element for the entire site. 

The value of the TITLE element appears at the top of the browser window and may be used by search engines as a short description of a web page.  This dual-purpose use makes the TITLE element extremely important both for usability and for search engine optimization.  In addition, the TITLE element frequently appears at the top of a printed web page.

Since the text in the TITLE element is usually truncated when it is displayed, the most important text (i.e. that which describes the page) should come first, and the TITLE should generally be kept to 64 characters or fewer.  For example, "Search Engine Optimization Frequently Asked Questions - nukeSEO.com" would be better than "nukeSEO.com - Frequently Asked Questions - Search Engine Optimization" because an important word, Optimization, might be truncated in the latter.
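
As a sketch of the idea in Python (the function name, default site name, and truncation handling are illustrative, not taken from any particular CMS):

# Build a dynamic title: page-specific text first, site name last,
# trimmed so the descriptive part survives truncation.
def page_title(page_name, site_name="nukeSEO.com", limit=64):
    title = page_name + " - " + site_name
    if len(title) <= limit:
        return title
    return title[:limit - 3].rstrip() + "..."

print(page_title("Search Engine Optimization Frequently Asked Questions"))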
