How to get more pages indexed by Google?

Google is seriously concerned about duplicate content in its index. If Google (or any other search engine) finds a lot of duplicate content on your site, it may stop indexing pages, even new ones, and start removing the duplicated pages from its index.

If you find Google deindexing your site, duplicate content may be part of the problem. Webmasters face this deindexing situation every day. In fact, we have been experimenting with this duplicate content issue on the Google search engine.

Since we don't really know Google's actual algorithm, we started removing as much duplicate content as possible from our sites, and the result was astonishing.

We got, on average, 28% more pages indexed in Google's database by removing most (or all) duplicate content (more than 1,000 pages) across our sites (more than 100 sites) within two weeks. This is just one technique for getting more pages indexed by Google.
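Before Google finds your duplicates, you can hunt for exact duplicates yourself. A rough sketch in Python: hash each page body (after collapsing whitespace) and group identical hashes. The page paths and contents here are invented for illustration.

```python
# Group pages whose normalized bodies are byte-for-byte identical.
import hashlib
from collections import defaultdict

def find_duplicates(pages):
    """Group page URLs by the SHA-256 hash of their whitespace-normalized body."""
    groups = defaultdict(list)
    for url, body in pages.items():
        digest = hashlib.sha256(" ".join(body.split()).encode()).hexdigest()
        groups[digest].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

pages = {
    "/a": "Unique article about indexing.",
    "/b": "Same   text twice.",
    "/c": "Same text twice.",  # duplicate of /b after whitespace normalization
}
print(find_duplicates(pages))  # [['/b', '/c']]
```

Real near-duplicate detection (boilerplate stripped, shingling, etc.) is more involved, but even this exact-match pass catches printer-friendly pages, session-URL copies, and the like.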

Just trim your site, or put it on a diet. "Less is more" is a phrase from Robert Browning's 1855 poem "Andrea del Sarto" (called "The Faultless Painter"). How true it is.

What if someone copies your content and puts it on their site? Since your content carries a digital date stamp, we hope Google can identify the original source. All you have to do is keep posting unique, quality content.

Most article directories suffer from the duplicate content issue, because authors tend to syndicate the same articles to many different sites.

The duplicate content issue is contagious. Just stay away from it.

The size of the World Wide Web (The Internet) Saturday, 15 October, 2011

Posted in Duplicate Content, SEO

Google Toolbar PageRank Still Alive

Google just pressed the panic button for the SEO world a few days ago, without any notice. However, Google Toolbar PageRank is still alive and well. PageRank is important not only for the SEO world but also for Google itself, because PageRank plays one of the key roles in search engine ranking. Who would want to remove something that plays such an important role?

Booh! Most of the high-PageRank pages occupy the top positions on the first page of search engine results. PageRank reflects not only ranking but also the number of strong backlinks to a site.

To quote Seroundtable: “The issue is not with Toolbar PageRank, but rather how 3rd party tools lookup the PR values. Google did make a change, they changed the lookup URL for the PageRank value.”


This code change has nothing to do with Google's PageRank patent expiring in 2011.

Many online PageRank-checking sites stopped working until they corrected the lookup code mentioned above. Many directory sites were also affected by this change, since most directories use PageRank as a ranking factor displayed on their pages.

This site still checks your site's PageRank for free.

Still Coooooool.

Sample fix for a PHP-based checker: edit the query string by changing the lookup path from /search to /tbr.

Old URL:
$queryString = '/search?client=navclient-auto&ch=' . $checksum . '&features=Rank:&q=info:' . $url;

New URL:
$queryString = '/tbr?client=navclient-auto&ch=' . $checksum . '&features=Rank:&q=info:' . $url;

Posted in Google Pagerank, SEO

Google Pagerank Algorithm Scenario

We'd like to compare two of our sites: a PR4 site and a PR2 site. Each has about 20 inner pages. The sites are about 2 years and 1 year old, respectively. The PR4 site gets a lot of traffic, with an Alexa "real" traffic rank of about 30,000; the PR2 site has almost no traffic data (maybe 5,000,000, ouch).

However, there is absolutely no PageRank on any inner page of the PR4 site, while every inner page of the PR2 site has PR1. How is that possible, for Chris's sake?

In fact, we are sure the conventional Google PageRank algorithm doesn't apply to this scenario. If we introduce the term total PageRank (PRtotal), then

PRtotal for the PR4 site is 4 (homepage only), while PRtotal for the PR2 site is 22 (2 for the homepage plus 20 inner pages at PR1 each).
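The PRtotal arithmetic can be sketched in a couple of lines; the per-page PR values mirror the two example sites above and are otherwise illustrative.

```python
# A toy "total PageRank" tally: homepage PR plus the PR of every inner page.

def pr_total(homepage_pr, inner_page_prs):
    """Sum the toolbar PR of the homepage and all inner pages."""
    return homepage_pr + sum(inner_page_prs)

pr4_site = pr_total(4, [0] * 20)  # PR4 homepage, 20 inner pages at PR0
pr2_site = pr_total(2, [1] * 20)  # PR2 homepage, 20 inner pages at PR1

print(pr4_site, pr2_site)  # 4 22
```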

We believe our PR2 site is more powerful than the PR4 site based on this PRtotal measure. It should be mentioned that the PR4 site is a directory site with no content, while the PR2 site is built on unique content. If we think of PageRank as potential and unique content as mass, the PR4 site has high potential but little mass, and the PR2 site has low potential but high mass.

In conclusion, PageRank flow to inner pages requires high mass, that is, unique content; otherwise PageRank won't pass any juice even if you have high potential (PageRank) on the homepage. A professional site requires not only high PageRank but also unique content.

Posted in Google Pagerank, SEO

Google Pagerank vs Standard & Poors Credit Rating

The most famous credit rating agencies in the United States are Moody's, Standard & Poor's (S&P), and Fitch. Most of the bond market relies on their ratings. The S&P credit ratings are as follows:

AAA, AA, A, BBB, BBB-, BB+, BB, B, CCC, CC, C, and D. The definitions of the letter symbols are available at the end of this article.



Actually, Google PageRank is the equivalent: it is effectively a credit rating for websites, running from PageRank 10 down to N/A. In fact, the two concepts are analogous.

Google Pagerank    S&P Credit Rating
10                 AAA
9                  AA
8                  A
6                  BBB-
5                  BB+
4                  BB
3                  B
1                  CC
0                  C

That is one of the reasons Google PageRank is very important, whether you agree or not. The only differences between Google PageRank and an S&P credit rating are that Google has no competition and no regulation.

Google PageRank may not be important to you, but many advertisers really do care about such factors, and Google PageRank is indeed one of the parameters. If customers have a choice, which site would they prefer to do business with: PR=N/A or PR=7? It is analogous to bond market investors: who wants to buy a bond rated D rather than BBB?

Standard & Poor's Ratings

The general meaning of S&P's credit rating opinions is summarized below.

‘AAA’—Extremely strong capacity to meet financial commitments. Highest Rating.

‘AA’—Very strong capacity to meet financial commitments.

‘A’—Strong capacity to meet financial commitments, but somewhat susceptible to adverse economic conditions and changes in circumstances.

‘BBB’—Adequate capacity to meet financial commitments, but more subject to adverse economic conditions.

‘BBB-‘—Considered lowest investment grade by market participants.

‘BB+’—Considered highest speculative grade by market participants.

‘BB’—Less vulnerable in the near-term but faces major ongoing uncertainties to adverse business, financial and economic conditions.

‘B’—More vulnerable to adverse business, financial and economic conditions but currently has the capacity to meet financial commitments.

‘CCC’—Currently vulnerable and dependent on favorable business, financial and economic conditions to meet financial commitments.

‘CC’—Currently highly vulnerable.

‘C’—Currently highly vulnerable obligations and other defined circumstances.

‘D’—Payment default on financial commitments.


Posted in Google Pagerank

Google PR Dance

The last time Google updated PR, on 27th June 2011, most of my sites (more than 200) lost PR. More than 50 sites with PR4 went to zero. Zip. It was really ground zero for me. Google even, amazingly, downgraded its own PR from 10 to 9. I was frustrated and left all the sites in the dark for many days. Surprisingly, all my sites' PR has now returned to normal (I don't know when it happened), as has Google's (now PR10 again). However, I've noticed that one is still PR 9.

There is nothing you can do about the Google PR dance. I've now learned to just be patient and keep checking that what you are doing is right. Everything will be OK again (maybe), but keep your fingers crossed.


Jump on the Google PR bandwagon again.

Posted in Google Pagerank, SEO

The new world order in Internet

A few days ago, Google updated PageRank for all the sites on the Internet, including Google itself. Statistically speaking, most websites had their PR reduced (-1 or more), including Google and YouTube themselves. I am impressed that Google hit its own head with its own baseball bat. The problem is that Google also hit all of us.

Well, it's fair enough. It could be a part of Internet democracy, without an uprising.

I feel that we have the new world order prescribed by Google.

1. Facebook becomes the emperor of the Internet, PR=10.            [External Ref: PR 10 sites]

2. For all other websites, current PR = previous PR - 1 (or more).

3. If your site has no content, little content, or no unique content, your PR will drop drastically. In some cases, your high-PR website becomes nada, zip, zero, zilch, nichts, niente, nulla, ni gota, niets, niks, rien. (Many of my sites had PR 4+; now all zilch.)

It doesn't matter whether it is fair or not. It's "the new world order in Internet". Live with it!

Some SEOs mention the importance of frequent updates for PR. It's not true. This site hasn't been updated for many weeks, yet its PR improved from N/A to 2. Am I happy? Not really, because I don't understand the new algorithm.

Some SEOs also mention the importance of quality backlinks for PR. That's not true anymore either. This site hasn't participated in any backlink campaign whatsoever.

Moreover, some SEOs still claim that PR is not important. If that were true, why is Google still updating this unimportant, stupid, airheaded, birdbrained, boneheaded, brain-dead, brainless, bubbleheaded, chuckleheaded, dense, dim, dim-witted, doltish, dopey, dorky, dull, dumb, dunderheaded, empty-headed, fatuous, gormless, half-witted, knuckleheaded, lamebrained, lunkheaded, mindless, oafish, obtuse, pinheaded, senseless, thickheaded, thick-witted, unintelligent, unsmart, vacuous, weak-minded, witless parameter, PageRank?

I was just blogging my own stuff, from my own head, for my own record. There are many SEO gurus on the Internet, all telling you what to do and what not to do. Actually, they don't know what to do or what not to do themselves, like the gurus on CNBC, who all predict the financial markets wrongly.

So what is the new algorithm?

the new algorithm = the old algorithm – 1 (or more)

I feel that we have all just fallen off the cliff together. Well, it could be a part of Internet democracy.

Posted in Google Pagerank, SEO

Google Pagerank Update May 2011

I just surprisingly found out that one of my sites improved its Google PageRank from 0 to 4. After checking all of my sites, many had improved their PageRank significantly, yet I haven't heard any news about a Google PageRank update on the net yet. I don't really care when it happened, as long as my sites got a lot of PageRank improvement.

It’s really a Judgment Day! May 21, 2011 for me.

Here are my results.

Pagerank from 0 to 1

Pagerank from 0 to 2

Pagerank from 0 to 3

Pagerank from 0 to 4

Please check your sites now. I hope you get some improvement as well!

Posted in Google Pagerank | Tagged , | Leave a comment

Google Ranking Factor – the secret of genius

There must be a scoring system in the Google algorithm that ranks your site for your targeted keywords, somewhat similar to the expression for Google PageRank. The targeted keywords may be one or more, such as Travel (2,130,000,000 results) vs Yamoussoukro Travel (504,000 results). (I don't really know where Yamoussoukro is on Earth!)

Google does not officially publish its ranking algorithm, because it is a company secret, and the algorithm may change frequently, as noted by Google's CEO. So what is the secret of genius?

Let's say the ranking factor is R.

Fig. 1 Google Ranking Factor

I : the total number of targeted keywords
J : the total number of ranking factors; maybe 10 or 500 (Google's CEO said 200)
w : the weighting factor for each ranking factor ( 1 > w > 0 )
X : the ranking factors (function variables), not pure mathematical numbers
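The weighted-sum idea behind R can be sketched in a few lines of Python. The factor names, weights, and scores below are invented purely for illustration; they are not Google's actual factors or weights.

```python
# Toy ranking score for one keyword: a weighted sum over ranking factors
# (the inner j-loop of the expression in Fig. 1). All numbers are made up.

def ranking_score(factor_scores, weights):
    """Weighted sum of per-factor scores for a single keyword."""
    return sum(weights[name] * score for name, score in factor_scores.items())

weights = {"keyword_in_url": 0.4, "keyword_density": 0.3, "site_speed": 0.3}

site_a = {"keyword_in_url": 1.0, "keyword_density": 0.9, "site_speed": 0.8}
site_b = {"keyword_in_url": 0.7, "keyword_density": 0.9, "site_speed": 0.8}

print(ranking_score(site_a, weights))  # site A scores higher than site B
print(ranking_score(site_b, weights))
```

With multiple targeted keywords, you would sum this score over the outer i-loop as well (i = 1 … I).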

Let's start with two of my high-ranking sites in the Google SERPs. (The sites' URLs cannot be mentioned in this analysis.)

Site A:
Position: ranking 2 out of 144,000,000 results
1. Keywords = URL  (only one keyword)
2. Content = many keywords included on the main page (High Keyword Density)
3. No advertisement
4. Site Speed = Fast
5. no SEO optimization
6. Very old (16 years)
7. PR = 7
8. Many good relevant backlinks
9. Yslow = Grade B
10. Hosting = Dedicated IP
11. Does not follow webmaster rules (no robots.txt and no sitemap.xml)
12. No outbound links
13. Very heavy traffic
14. Well-known

Site B:
Position: ranking 11 out of 1,820,000   results
1. Keywords = URL  (two keywords)
2. Content = many keywords included on the main page (High Keyword Density)
3. No advertisement
4. Site Speed = Fast
5. Full SEO optimization
6. Very young (1 month)
7. PR = 0
8. No good relevant backlinks
9. Yslow = Grade C
10. Hosting = Shared IP
11. Follow all webmaster rules.
12. A few outbound links
13. Much lighter traffic
14. Not well-known

Based on the two sites above, the major weighting factors appear to be
1. Keywords = URL
2. Content = included many keywords (High Keyword Density)
3. No advertisement.

and one of the minor weighting factors appears to be
4. Site Speed = Fast

It should be noted that Wikipedia, the free encyclopedia, doesn't carry any advertisements. That's one of the main reasons it ranks very well for almost any search keyword.

We cannot beat many commercial sites in the battle of the SERPs because of tight budgets; however, we may change tactics in this battle, like guerrilla warfare in the SERPs.

For example, you may never reach the first page for the keyword Travel (2,130,000,000 results) in your lifetime. However, if your site promotes the Yamoussoukro region, you may well reach the first page for the combined keyword Yamoussoukro Travel (504,000 results), according to the modified ranking factor expression:

Fig 2. Modified Ranking Factor

The coefficients in the above expression are just made-up numbers. For a combined keyword there is no summation loop over I (the total number of targeted keywords), because the combined keyword is treated as a single keyword. With only one keyword, I = 1 and expression 2 reduces to expression 1.

In fact, it's guerrilla warfare in the SERPs: just go local. This concept is already proven by Site B. Somebody who wants to travel to Yamoussoukro will never use just the word "Travel" in a search engine; instead, he will use "Yamoussoukro Travel" as the search keywords. Just grab your piece of the pie!

External Resources
Google SEO News and Discussion Forum

Posted in SEO

High Server Load

The server load average gives the average number of active processes in the run queue over the last 1, 5, and 15 minutes. It is a simplistic indicator of how much work your server is doing. The numbers use the x.xx format and range from 0.00 upward, with no fixed upper limit. In essence, server load counts processes that are running or waiting to run.

Fig 1. High Server Load on Managed Server

The server load averages are computed in the UNIX kernel using CALC_LOAD, which is based on an exponentially damped moving average: each new load average value is based in part on the previously recorded value.
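The CALC_LOAD update can be sketched in a few lines of Python. The 5-second sampling interval and 60-second (1-minute) window follow the usual UNIX constants; the task counts are illustrative assumptions.

```python
# A minimal sketch of the exponentially damped moving average described
# above (modeled on the kernel's CALC_LOAD idea).
import math

def calc_load(old_load, active_tasks, interval=5.0, window=60.0):
    """One update step: new = old*e + active*(1 - e), e = exp(-interval/window)."""
    e = math.exp(-interval / window)
    return old_load * e + active_tasks * (1.0 - e)

load = 0.0
for _ in range(120):           # ten minutes of 5-second samples
    load = calc_load(load, 2)  # a constant two runnable tasks
print(round(load, 2))          # converges toward 2.00
```

Because of the damping, a short burst of activity decays away gradually, which is why the 1-, 5-, and 15-minute figures differ after a spike.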

A few factors determine what load average is acceptable. If your server has dual processors, the acceptable load average is 2.00; this is considered the "normal load average". The normal load average equals the number of CPUs (cores) in the server. In this case, a server load above 2.00 is bad, and anything under 2.00 is fine.

In some cases, even a server with a load of 100 can remain extremely responsive if its CPUs are 80% idle (for example, when the queued processes are waiting on I/O rather than CPU). There is no clear-cut load average number that counts as a "warning" level for performance degradation. However, if the server load stays at 5.00 or greater for a consistent period of time, it's worth finding out the exact cause.

High Server Load Spikes

Fig. 2 High Server Load Spikes on Managed Server

It's perfectly normal to have "spikes" in your server load. Peak visitor times, log processing, database backup scripts, automated cron jobs: all can cause spikes. The time to start asking questions is when the server load is constantly above 5.00.

Fig. 3 Normal Spikes for 7 days time frame on Managed Server

There are many potential causes of high server load, and load levels depend very much on what is running on your server. The following list gives a good overview of some areas that could place your server under higher load.

  • High server load can be caused by one or several resource-intensive applications. Examples include very high-traffic websites, database-driven websites, forums, gaming sites, file download sites, and so on.
  • A high server load can also be caused by a malicious or "runaway" script that loops continuously, dragging down the server's resources.
  • Too many websites on one server, whose cumulative resource usage results in high server load.
  • Running out of memory and swapping to the swap file.

Fig. 4 Memory Statistics on Managed Server

In Fig. 4, free memory is very low, which may cause high server load. If you run your website on a shared server hosting many sites on the same IP, how can you compete with a site running on a managed server with something like 12 GB of disk cache? Such a site loads like a thunderstrike for the same amount of content as yours. Please consider that Google (and/or other search engines) now uses site speed as a search ranking factor.

  • Server backups or server updates are taking place.
  • The server comes under intermittent or continuous Internet attack, such as a distributed denial-of-service (DDoS) attack. However, bandwidth usage (Fig. 5) was normal at the time of the high server load (Fig. 1), so this case is unlikely to be a DDoS attack.

Fig 5. Daily Bandwidth Usage on Managed Server

  • Misconfigured software causing errors.
  • Users sending huge mailing lists.
  • Users trying to bounce spam.
  • Users/spammers sending spam email.
  • Hardware issues, including memory leaks, a bad hard drive, or a bad network card.

The problems above concern professional webmasters and SEOs who manage websites on dedicated or managed servers. If you are on a shared server, all you can do is write a complaint to your web host. As Fig. 3 suggests, sometimes when the server load is too high you can simply wait a moment and the problem will go away by itself; otherwise, write to your web host again.

External Resources
Server load
ServerTune KnowledgeTune / What Causes High Server Load?

Posted in SEO

Analysis on Dashes in Domain Name

There has been a debate about dashes in domain names on the Internet for many years. That issue is not going away.

The golden rule of thumb is to avoid dashes (as well as numbers) in the domain name. It's the safe choice, like choosing the .com TLD rather than .net, .org, etc.

Nowadays the Internet is so crowded that even IPv4 addresses are running out. So how the heck could we get a .com domain for our words of choice these days? In fact, .net, .org, or another TLD is a fine alternative. What about a dash or dashes in the domain name?

Let's recall a favourite song of the late Michael Jackson: "You Are Not Alone".

When we search variants of that phrase in Google:

Phrase               Search engine results
youarenotalone       11,800,000
you-arenotalone      4,040,000
you-are-notalone     8,050,000
you-are-not-alone    130,000,000

Thus, you-are-not-alone performs as well as youarenotalone, because search engines treat a dash as a blank. However, the resale value of you-are-not-alone may be lower, because it violates the golden rule of thumb mentioned above. In keyword search results, you-are-not-alone is as good as youarenotalone; I believe you get the picture.

Nowadays most people browse via search engines and relevant links on pages (web pages, forums, blogs, social networks, directories), so a long domain name, or many dashes in it, doesn't really matter. The only thing that matters, again, is content. You can still cash in with you-are-not-alone, because your customers are there with you, not at the domain auction.

In fact, if I had no other choice for my words, I would definitely choose Dashes-in-Domain-Name. The out-of-date claim that "a domain with more than one hyphen or dash looks spammy or unprofessional" is now obsolete. Search engines judge whether your site is spammy based on the content of your site, not on the hyphens or dashes in your domain name. Likewise, people judge whether your site is professional based on its content.

It is recommended that you grab both youarenotalone and you-are-not-alone whenever possible. Then host your site at youarenotalone and use a 301 redirect from you-are-not-alone.
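One way to set up that 301 redirect is at the web-server level. A sketch for an Apache host, as a .htaccess fragment placed on the dashed domain; the .example TLD and the exact rule are illustrative assumptions (other servers have equivalent directives):

```apache
# .htaccess on you-are-not-alone.example: permanently redirect every
# request to the canonical non-dash domain, preserving the path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?you-are-not-alone\.example$ [NC]
RewriteRule ^(.*)$ http://youarenotalone.example/$1 [R=301,L]
```

The R=301 flag tells search engines the move is permanent, so any link value accrued by the dashed domain is passed on to the main one.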

Note: a hyphen is a punctuation mark used to join two words; it is, however, a subset of the dash.

Posted in Domain Name, SEO