Featured Offer
Lead Generation Program - get listed in 48 hrs for $99/month!
Would you like to see your website appearing in the first two
pages of Google or Yahoo?
To get highly qualified, targeted
traffic that generates leads and
increases your sales? Ready to
start seriously promoting your
website so it's earning you the
money it should?
If any of the above rings true, then consider our Lead
Generation Program (from $99 per month): this is a specialized
program that will help you target customers who are already
interested in your products and services and who are ready to
buy from you. Just send us a Quote Enquiry and tell us a bit
about your website and your expectations.
According to a recent study by
Jupiter Research (2006), 90% of
online customers click on search
results within the first 3 pages.
But 62% of them click only on
the first few results! If your
website has not been correctly optimized, the chances of your
customers finding you through search engines are slim to none.
And if nobody can find your website,
no business will be sent your
way.
Despite the alarming statistics, many businesses still do not
effectively promote their websites. So why
don't they? "Too expensive,"
"Too difficult" and
"Takes too long" are
the three most common reasons.
At Apex Pacific we can help your
website grow quickly and all at
a low cost. Our experienced web
consultants will individually
work with you to tailor an online
strategy that is right for your
business.
The message is clear:
Online customers are getting more
sophisticated and demanding. If
your website does not rank well,
you're missing out on significant,
highly qualified traffic. Contact
us today to get your website listed
in 48 hours for $99/month.
Expert Article
How URLs Can Affect Top Search Engine Rankings
Strategies you need to know about... and everything else that's nice to know about them too!
— By John Heard
You've
seen it a million times; you even
know it by name—URL.
You know that URL stands for Uniform
Resource Locator
and you probably refer to it by
its 3-letter acronym: "U-R-L."
Or maybe you're one of the cool
kids who calls it an "Earl." Either
way, you may not know how
URLs can affect your search
engine optimization (SEO)
strategies. Well, move over cool
kids, 'cause you're about to learn
something new...
Let's begin with the basics so
that later, when we drill down
into the important need-to-know
details pertaining to SEO strategy,
you'll be perched on a solid knowledge-base
and primed to follow through when
it comes time to implement what
you've learned.
The nice-to-know stuff...
A typical URL, like the one
seen below, can be broken down
into the following individual
components:
http://www.domain.com/subdirectory/filename.html
- The http stands for Hypertext Transfer Protocol,
which defines the method to be used to view the resource.
Basically, that's what tells
us a webpage is reachable
via a web-browser and that
search engines can index it.
It also defines which Communication
Port to use—in this
case the default for http
is Port 80. Different protocols
typically use different ports.
- WWW
is the section of a domain
name commonly referred to
as the subdomain.
Most websites use either www
or else no subdomain at all.
However, many large composite
websites make use of unique
subdomain names to differentiate
between different major sections,
services, or topics within
their sites. For example news.google.com
or blog.company.com.
Subdomains can have multiple
levels, for example new.pressreleases.company.com
or, if you look closely at some of the emails where people
are trying to trick you into giving out your private
account details (i.e., phishing schemes),
you might see something like www.citicorp.com.domain.somewhereelse.ru.
Subdomains can have many
levels and, technically
speaking, there are few
restrictions to their length
or number. Subdomains can, and typically do, resolve to a
different IP address from the primary domain name.
For non-SEO applications,
a unique subdomain often
is a different webserver
but, nevertheless, it remains
under the control of the
same primary domain owner.
For example, only Google
can make use of uniquename.google.com
because the DNS (Domain
Name Server) addressing
is tied to their primary
domain, google.com, which
is solely under Google's
control.
Subdomain names are not
case-sensitive. In other
words, News.Google.com
is the same as news.google.com.
- The last part of the domain
name, .com,
refers to the Top Level
Domain or TLD.
Like subdomains and domain
names, the TLD is not case
sensitive. TLDs are classified
by three types:
- Generic: .com,
.org, .gov, .edu, .biz,
etc.
- Country Codes:
.us, .uk, .jp
- Infrastructure:
.arpa (the
only one)
In this report we need only be concerned with Generic
and Country Code TLDs for SEO purposes.
- Finally, let's define the
elements beyond the TLD that
complete the breakdown of
our sample URL.
- subdirectory –
Refers to a subdirectory
or what appears to be a
subdirectory on the webserver.
- filename –
Refers to the document file
on the webserver.
- .html – the filename extension that typically tells us
what type of document it is. In this
case, .html
tells us the document is
written in Hypertext
Markup Language;
in other words, it's a typical
webpage.
Everything in the URL beyond the TLD—i.e., the subdirectory,
filename, and extension—is case-sensitive. In other
words, index.html
is not the same as
Index.Html
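To make the breakdown concrete, here is a minimal Python sketch (standard library only) that pulls the sample URL above apart into the same components and shows which pieces are case-sensitive; the URL itself is just the hypothetical example used in this article.

```python
from urllib.parse import urlsplit

url = "http://www.domain.com/subdirectory/filename.html"
parts = urlsplit(url)

print(parts.scheme)    # 'http'            -> the protocol
print(parts.hostname)  # 'www.domain.com'  -> subdomain + domain + TLD
print(parts.port)      # None              -> port 80 is implied for http
print(parts.path)      # '/subdirectory/filename.html'

# Hostnames are not case-sensitive, so the parser normalizes them...
print(urlsplit("http://News.Google.com/Index.html").hostname)  # 'news.google.com'
# ...but the path portion keeps its case exactly as written.
print(urlsplit("http://News.Google.com/Index.html").path)      # '/Index.html'
```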
The need-to-know stuff...
Now, let's examine how each
of these elements should be
integrated into your search
engine optimization and marketing
strategies.
Subdomains
It's common knowledge
that the root of SEO is
keywords. Since the
beginning, all strategies
have revolved around the
ever-changing and constantly
evolving targets for keyword
placement. In the
swirling sea of anchor links,
meta tags, headlines, body
copy, and what-not, one
of the most consistently
useful placements in terms
of SEO has been within the
URL itself. And the degree
to which this strategy remains
effective is simply a matter
of how far the engines scale-it-up
on the algorithmic dial.
This is why the use of
keyword-specific subdomains
has been a long standing
and typically effective
SEO strategy. When viewed
from the search engine's
point of view (SEPOV),
this also makes sense. That's
because when a keyword is
found within the URL there
is typically a very high
probability that the keyword
is relevant
to the webpage's content—and
search engines hunger to
provide relevant results.
So, keyword placement in
the subdomain accomplishes
two things:
- It provides a clue to
a potential site visitor
what the site is about
prior to clicking the
link in the search results
(as
would be the case with
a domain like CheapAirlineTickets.mysite.com).
- It gives the search
engines an additional
relevancy indicator which
they can
choose to use—a
little, a lot, or not
at all—within their
overall relevancy algorithm.
Over the years, we have
found that search engines
cannot resist using it
to varying degrees—even
if only a little.
Think about it like this:
If all else is equal (which
is only hypothetically
possible), the site
with the keyword in the
subdomain will likely rank
higher than an otherwise
equally optimized site without
it. And, perhaps more importantly,
given a choice between two
otherwise equally attractive
selections within the search
results, the average
potential site visitor will
choose the link containing
the keyword in the subdomain
over the link that does
not.
Utilizing a subdomain is
one of the best legitimate
strategies for placing your
primary keyword into the
URL when the matching keyword domain name is already taken or
otherwise unavailable.
Currently, Google, Yahoo,
and MSN each appear to be
giving some ranking boost
to pages that contain the
keywords in the subdomain
of the URL. We believe that
Yahoo places the most significance
on keywords in the domain
name, closely followed by
Google. MSN doesn't seem
to place quite so much emphasis
on this, but it does appear
to factor it into their
scoring.
We can't help but notice
that most of the top pages
in the search results for
very competitive
searches contain the keyword
either in the domain or
the subdomain name. One
good example is a search
for music.
On each of the big three
engines you'll find many
of the top results have
the keyword music
either in the subdomain,
primary domain, or in some
cases, the subdirectory—or
else you'll find a keyword
(like
mp3 or mtv
or itunes)
that is synonymous with
music.
Although there isn't any
question that keywords in
the subdomain can
help rankings, some cautions
are in order. Taking into
account that subdomains
can be separated by a period—making
possible certain combinations
like keyword.keyword.domain.com—we
recommend sticking to the more customary single-level
format: keyword.domain.com.
Rarely do you see multilevel
subdomains ranking for the
more difficult generic searches.
Another possibility is to delimit your subdomain using dashes;
keyword-keyword.domain.com, for example, is technically possible.
However, we don't recommend using more than one dash in a
subdomain, even though we have seen some good results at MSN
using more than that. Regardless, unless you're optimizing
solely for MSN (unlikely), you should limit subdomain dashes
to one, or none.
A keyword placed in a subdomain
is not only a ranking factor,
but also a linking factor.
Frequently, when another
site links to your site,
they use the domain name
as the anchor text of the
link. Placing your best
keyword in the subdomain
means that other sites will
be compelled to use that
keyword in their link to
you. That will score your
page some extra relevancy
points from the search engines
point of view.
Another caution to be aware
of is wildcard subdomains
where anything.domain.com
results in the same page
as anythingelse.domain.com.
Search engines have a
VERY dim view of this practice
and we don't advise using
it.
Here's another warning:
Don't create a subdomain
when there isn't a good user-focused reason
to do so. Avoid having subdomains
with only one or two pages
on them—a small number
of pages on a subdomain
(other
than www.)
is a red flag to a search
engine. You can expect that
ranking penalties or outright
bans could be levied on
any site that combines the
typical www.domain.com
format with a bunch of low-populated
keyword-laced subdomains
in an obvious effort to
manipulate rankings.
If in doubt, it's always
a good idea to review how
some of the major search
engines are structuring
their file systems. For
example, look at Google.
Most of their site is at
www.google.com. However
they assign a subdomain
to certain specific and
large areas of their site.
For example news.google.com,
groups.google.com,
froogle.google.com,
and local.google.com.
Each of these subdomains is a logical separation
and, clearly, they have
good reason for dividing
their site into these subdomains.
A contrasting example would
be a shoe site that
uses nike.sitename.com,
adidas.sitename.com
and so forth. We're not
saying this wouldn't be
effective, only that it's
pushing the envelope and
the next algorithm tweak
might land the whole site
in the penalty box. If you
do it, be sure to place
plenty of unique content
within each of the subdomains.
Otherwise it's sure to be
viewed as a spam technique—so
beware.
Domains
Your domain, of
course, is your registered
domain name. For example,
this site's domain name
is SearchEngineNews. Naturally
you don't have the flexibility
of modifying it in the way
that you can with subdomains.
Regardless, the ranking
advantages enjoyed by keyword-smart
subdomains also apply to
domains.
Clearly, it helps considerably
to have good keywords in
your primary domain—both
from a search engine and
a consumer perspective.
The ranking boost is most
profound when the domain
name exactly matches the
keywords being searched
on. However, due to past
abuses, there are some unwritten
restrictions and warnings
to be aware of.
At one time, keyword-keyword-keyword-keyword.com
had a boosting effect on
rankings, but that didn't
last long. The engines were quick to realize they were
being gamed and began counting
hyphens and domain name
character lengths. As you
might expect, they were
able to correlate multiple
dashes and long domain names
with spam sites. It didn't
take them long to restrict
how high a long, multi-dashed
domain name could rank.
Today we recommend no more
than one dash, and the shorter
your domain name, the better.
In a perfect scenario, your
best domain name is typically
your primary keyword or
keyword combination. Whenever
that isn't possible, at
least try to get the most
important keyword mentioned
somewhere in your domain
name. Otherwise, expect
to settle for placing it
into your subdomain.
To hyphen or not?
A few years back, hyphenated
keywords within some domain
names enjoyed a small ranking
advantage. Hyphens were
used to delimit the text
so the search engines could
more easily distinguish
each keyword within a phrase
without mistaking it for
some unique term. However,
today we're seeing more
indications that the major
engines are getting better
at picking keywords out
of a phrase without the
help of dashes—at
least in the English language.
So today, whenever faced
with a choice, you should
favor the keywordkeyword.com
over the keyword-keyword.com
URL. While it's true you
may want to secure both
URLs to keep the other one
out of the hands of your
competitor, you would be
better advised to develop
the URL without the hyphen
as your primary site.
Subdirectories
If you can't get the keyword
into the domain name and
it isn't advisable to put
it into the subdomain, then
your next best option is
to place it in the subdirectory
name.
Bear in mind that it doesn't really matter whether you use
a subdirectory or a file name to contain the keyword;
typically either one (but not both) will help
search ranking. In most
cases, /keyword/index.html,
ranks equally with /keyword.html.
Regardless, you will likely
have better results if part
of the file structure contains
at least some part of your
keyword phrase. Subdirectory
names are also sometimes
important for other reasons.
For example, take /cgi-bin/.
Historically that's been
a subdirectory avoided
by search engine robots
for fear of getting trapped
in an infinite loop and
indexing millions of unique
URLs that are actually just
duplicate pages.
For the most part, however,
search spiders have solved
this problem and will index
URLs that contain cgi-bin.
Still, we suspect some limitations remain, so, ideally,
you should avoid using that
specific subdirectory name
whenever possible.
Other similar directory
names that may have built-in
limitations are popular
software program names like
/phpBB/
for the PHP Bulletin
Board software and /Gallery2/.
If you install one of these
programs we would recommend
that you use a unique directory
name whenever possible, just to ensure you avoid whatever
indexing limitations might still be lingering.
File names
Just as with subdirectories,
having the keyword as the
file name is usually a good
idea. However, be forewarned
that you don't want to overdo
any part of this. One mention
of the keyword in the URL
is often good enough. Multiple
repetitions in the URL are
typically associated with
spam and ranked lower.
For example http://www.domain.com/keyword.html
is ok, but http://www.keyword.com/samekeyword.html
is probably overkill. Examine
the search results for your
keywords and look at what
is scoring at or near the
top. You'll see the pattern
quite clearly as to what
is good and what isn't.
File Extensions
This is a frequent question that pops up in our tech
Q&As — Does
the file extension affect
my search ranking? Typically
NO, it does not, at least
as long as the extension
is one that is commonly
associated with a web page.
We have not seen a case
where this mattered at all
in the last couple years.
Of course, .html
or .htm
is the most often used file
extension. But more and
more we are seeing file
extensions in the top 10
search results that include
.cfm,
.php,
.asp,
and .aspx.
Ranking-wise, all of these
file extensions are equal
in the eyes of the engines.
More advanced need-to-know stuff...
Here are some important points
to help you close the technical
loopholes on your SEO marketing
strategies...
Fine tune your URL structure
In general, URL structure for SEO follows this rule...
the more generic
your keyword, the earlier
you want it in your URL
structure.
For instance, if you want
to score for the extremely
generic term music
that returns more than a
billion search
results at Google, you should
definitely place it in either
your domain or subdomain
name.
However, if you want to
score for a specific model
number for a cell phone
your keyword wouldn't be
anywhere near as popular
as the keyword music.
So, using your keyword—the
model number—as a
subdirectory or file name
will typically work quite
well.
Also, you should be wary of overdoing it. A good
rule of thumb is this:
If the URL
looks like spam, it probably
will be treated as spam.
The search engines caught
on a long time ago to the
www.viagra-pills-porn-casino.com/
style domain names and such
similar file structures.
Today you want to use domain
and file name structures
that appear to be common
sense to the human visitors
of your site. Always bear
in mind that people do
look at the file names within
the search results. And
seeing the keywords highlighted
in the URL does increase
click-through rates.
Static vs Dynamic URLs
A static URL looks like
this:
http://www.domain.com/pagename.html
A Dynamic URL is typically
characterized by having
certain characters like
?
after the file extension.
For example:
http://www.domain.com/subdirectory/filename.php?variable=xyz&variable=123&2ndvariable=456&3rdvariable=789
Everything after the question mark is typically a variable.
Remember, many dynamic
systems do not need all
of their variable information
in order to work correctly.
Many programmers go overboard
on the variables they add
to the URL structure.
Today's major search engines
have added capabilities
for crawling dynamic URLs
that weren't available a
few years ago. Still, there
are limitations. A site
rarely gets fully indexed
when there are more than
three variables in a dynamic
URL. Ideally you should
avoid using more than two.
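If you want to check how many variables one of your dynamic URLs is actually carrying, a minimal standard-library Python sketch like the following will count them; the sample URL is the hypothetical one from above.

```python
from urllib.parse import urlsplit, parse_qsl

url = ("http://www.domain.com/subdirectory/filename.php"
       "?variable=xyz&variable=123&2ndvariable=456&3rdvariable=789")

# Everything after the '?' is the query string; each key=value pair is a variable.
pairs = parse_qsl(urlsplit(url).query)
print(pairs)       # [('variable', 'xyz'), ('variable', '123'),
                   #  ('2ndvariable', '456'), ('3rdvariable', '789')]
print(len(pairs))  # 4 -- more than the two or three a crawler handles comfortably
```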
There are also some specific variable names, like ID,
that indexing bots often avoid. That's due
to what are called session
variables—a variable
that is unique to each visitor
to the site. The use of
session variables often
results in many, many, duplicate
URLs being indexed in the
search engines. That's why
the bots do their best to
avoid URLs that appear to
have a session variable.
If you are operating a
dynamic site you should,
ideally, avoid using session
variables whenever there
is an alternative. And,
if not, find a way (like
using robots.txt)
to prevent the search engines
from crawling those URL
paths.
The most effective way
to prevent indexing problems
with session variables is
to use a good IP delivery
program that will recognize
search engine spiders and
make sure they only receive
URLs which have had the
session variables removed.
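As a rough illustration of that idea, here is a hedged Python sketch that hands a session-free URL to anything that looks like a search engine spider. The parameter names and bot signatures are hypothetical placeholders, and a production IP-delivery setup would normally do this inside the web server rather than in a standalone script.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical names -- adjust to whatever your own platform actually uses.
SESSION_PARAMS = {"sessionid", "sid", "phpsessid"}
BOT_SIGNATURES = ("googlebot", "slurp", "msnbot")

def strip_session_vars(url: str) -> str:
    """Return the URL with any session-style query variables removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

def url_for_visitor(url: str, user_agent: str) -> str:
    """Serve spiders a session-free URL; regular visitors keep theirs."""
    if any(sig in user_agent.lower() for sig in BOT_SIGNATURES):
        return strip_session_vars(url)
    return url

print(url_for_visitor(
    "http://www.domain.com/page.php?item=42&sessionid=abc123",
    "Mozilla/5.0 (compatible; Googlebot/2.1)"))
# -> http://www.domain.com/page.php?item=42
```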
Note: a large number of URLs with session variables
have begun appearing in
Google recently. This appears
to be a glitch in Google's
indexing process that should
be resolved soon, as Google
states on their webmaster
guidelines page that they
do not index pages with
session variables.
It's actually a major headache
for many webmasters, since
it means that many web pages
are getting indexed multiple
times under different URLs.
All the more reason to employ
a solution which prevents
search engines from being
served session variables.
Absolute Vs. Relative
URLs
Another question that
often comes up about URLs
is how to refer to them
in the HREF section of a
link. There are two options
here: absolute or relative.
An absolute link
means you use the entire
URL:
http://www.domain.com/filename.html
...where a relative
link simply refers to:
filename.html
While either link will
work just fine when referring
to pages on the same domain,
absolute URLs are
preferred. They're better
for a few minor reasons,
like...
- The links will work
if someone steals your
content or saves it to
their desktop.
- Absolute URLs help avoid
getting a dumb-bot stuck
in a loop that increases
server load and generates
404 errors.
Admittedly, it's a minor
point. But, if you're striving
toward SEO perfection, absolute
links are part of the
perfect package. Anything
you can do to make it easier
for search engines to index
your pages is a good thing.
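To see why that matters, here is a minimal Python sketch that resolves links the way a browser or crawler would; the domains and file names are placeholders.

```python
from urllib.parse import urljoin

page = "http://www.domain.com/subdirectory/index.html"

# A relative href is resolved against the page it appears on...
print(urljoin(page, "filename.html"))
# -> http://www.domain.com/subdirectory/filename.html

# ...so if the page is copied somewhere else, the same relative link
# now points at the copier's site instead of yours.
print(urljoin("http://scraper.example/stolen/index.html", "filename.html"))
# -> http://scraper.example/stolen/filename.html

# An absolute href resolves to itself no matter where the page lives.
print(urljoin("http://scraper.example/stolen/index.html",
              "http://www.domain.com/subdirectory/filename.html"))
# -> http://www.domain.com/subdirectory/filename.html
```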
An Extremely Important Point
Keywords in URLs can help
rankings, but remember that
changing an existing URL without redirecting it to
the new one will break all
the links and bookmarks
pointing towards that URL,
causing the page to drop
out of the search engines.
This means that if you
have an existing high-ranking
page, it's almost never
a good idea to change the
URL. If you must change
it, be sure to use a 301
redirect to make sure links,
user traffic, and search
engines are properly sent
to the new URL. For a complete
tutorial on using the 301
redirect to change URLs,
see our report:
Unraveling the Versatile
301 Redirect
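How you actually issue a 301 depends on your server or framework. As one hedged illustration, here is a minimal sketch using Python's Flask framework, with hypothetical old and new paths; on Apache-style servers the same permanent redirect is typically a one-line Redirect or rewrite rule in the server configuration instead.

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical example: the page used to live at /old-page.html and now
# lives at a keyword-rich URL. The 301 status tells search engines the
# move is permanent, so links and rankings transfer to the new address.
@app.route("/old-page.html")
def old_page():
    return redirect("/blue-widgets.html", code=301)

@app.route("/blue-widgets.html")
def blue_widgets():
    return "The page content lives here now."

if __name__ == "__main__":
    app.run()
```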
Silver Bullet?
The continuously evolving arms
race of search engine optimization
and marketing is the science
of piecing together all of the
useful components in ways that can only help, and never hurt,
your ranking and consumer marketing
efforts. Intelligent placement
of keywords within URLs is an
integral part of SEO strategy.
And, when done correctly, it's
likely to pay ranking dividends
far into the foreseeable future.
Whether the search engines
like it or not, there is no
escaping the fact that keywords
in the URL assist them in their
mission to provide relevant
results. Therefore, by utilizing
this strategy, and keeping it
looking natural, you're
capitalizing on the one single
gimme that search engines
can never be expected to completely
eliminate.
However, one should also bear
in mind that, as a stand-alone
strategy, putting the keyword
in your URL won't matter much—it
isn't the silver bullet for
high rankings. But, when all
else is equal, it is a fact
that the webpage with the keyword
in the URL will outrank and
receive more click-throughs
than a page without it!
Because success depends on getting all the little things
right within a concerted effort, keyword-smart URLs are one
of those little things well worth getting right.
John Heard – Head Researcher
Courtesy of Planet Ocean Communications,
the top rated source of search
engine marketing information.
Tips and Strategies
How Google Indexes Your Site
I can't tell you how many times
I've answered this question in
forums, so I figured since so
many are asking, it would make
for a great article.
First off, let's describe what
we are talking about. A "bot"
is a piece of software from a
search engine that is built to
go through every page of your
site, categorize it, and place
it into a database.
Google has three well-known bots:
the Adsense bot, the Freshbot, and the DeepCrawl.
The Adsense bot,
as you could probably guess, is
used for publishers who have Adsense
on their sites. As soon as a new
page is created, the JavaScript
within the Adsense code sends
a message to the Adsense bot,
and it will come within 15 minutes
to index the page so that it can
serve up the most relevant ads.
But, for this conversation we
are only concerned about the DeepCrawl
and the Freshbot.
The Freshbot
crawls the most popular pages
on your website. It doesn't matter
if that is one page or thousands.
Sites like Amazon.com and CNN.com
have pages that are crawled every
ten minutes, since Google has learned that those pages
change that frequently.
A typical site should expect to
have a Freshbot visit every 1
to 14 days, depending on how popular
those pages are.
On a Freshbot visit, the bot finds all of the deeper links
in your site and places them into a database so that when
the DeepCrawl occurs, it has a reference.
Once a month, the DeepCrawl
bot visits your site
and goes over all the links found
by the Freshbot. This is the reason
why it can take up to a month
for your entire site to be indexed
in Google - even with the addition
of a Google Sitemap.
So, be patient and keep on adding
content to your site, and work
on getting valuable in-bound links
to your site - Google will reward
you for it.
-To your online success!
Paul Bliss, CeM
SEO Certified Professional
www.SEOforGoogle.com