Tuesday, October 06, 2009

Surf's up Wednesday: Google Wave update (9/29/2009 08:46:00 AM)

Starting Wednesday, September 30, we'll be sending out more than 100,000 invitations to preview Google Wave.
We'll ask some of these early users to nominate people they know to also receive early invitations — Google Wave is a lot more useful if your friends, family and colleagues have it too. This, of course, will just be the beginning. If all goes well we will soon be inviting many more to try out Google Wave.

Some of you have asked what we mean by preview. This just means that Google Wave isn't quite ready for prime time. Not yet, anyway. Since
first unveiling the project back in May, we've focused almost exclusively on scalability, stability, speed and usability. Yet, you will still experience the occasional downtime, a crash every now and then, part of the system being a bit sluggish and some of the user interface being, well, quirky.

There are also still key features of Google Wave that we have yet to fully implement. For example, you can't yet remove a participant from a wave or define groups of users, draft mode is still missing and you can't configure the permissions of users on a wave. We'll be rolling out these and other features as soon as they are ready — over the next few months.

Despite all this, we believe you will find that Google Wave has the potential for making you more productive when communicating and collaborating. Even when you're just having fun! We use it ourselves every day for everything from planning pub crawls to sharing photos, managing release processes and debating features to writing design documents. In fact, we collaborated on this very blog post with several colleagues in Google Wave.

Speaking of ways you could potentially use Google Wave, we're intrigued by the many detailed ones people have taken the time to describe. To mention just a few: journalist Andy Ihnatko on producing his Chicago Sun-Times column, filmmaker Jonathan Poritsky on streamlining the movie-making process, scientist Cameron Neylon on academic papers and lab work, Alexander Dreiling and his SAP research team on collaborative business process modelling, and ZDNet's Dion Hinchcliffe on a host of enterprise use cases.

The Wave team's most fun day since May? We invited a group of students to come spend a day with us at Google's Sydney office. Among other things, we asked them to collaboratively write stories in Google Wave about an imaginary trip around the world. They had a ball! As did we...



Finally, a big shoutout to the thousands of developers who have patiently taken part in our ongoing
developer preview. It has been great fun to see the cool extensions already built or being planned and incredibly instructive to get their help planning the future of our APIs. To get a taste for what some of these creative developers have been working on, and to learn more about the ways we hope to make it even easier for developers to build new extensions, check out this post on our developer blog.

What’s an update?

What is an update? Google updates its index data, including backlinks and PageRank, continually and continuously. We only export new backlinks, PageRank, or directory data every three months or so though. (We started doing that last year when too many SEOs were suffering from “B.O.”, short for backlink obsession.) When new backlinks/PageRank appear, we’ve already factored that into our rankings quite a while ago. So new backlinks/PageRank are fun to see, but it’s not an update; it’s just already-factored-in data being exported visibly for the first time in a while.

Google also crawls and updates its index every day, so different or more index data usually isn’t an update either. The term “everflux” is often used to describe the constant state of low-level changes as we crawl the web and rankings consequently change to a minor degree. That’s normal, and that’s not an update.

Usually, what registers as an update with the webmaster community is when we update an algorithm (or its data), change our scoring algorithms, or switch over to a new piece of infrastructure. Technically, Update Gilligan is just backlink/PageRank data becoming visible once more, not a real update. There haven't been any substantial algorithmic changes in our scoring in the last few days. I'm happy to try to give weather reports when we do update our scoring/algo data, though.

Google doesn’t use the keywords meta tag in web search

We went ahead and did this post on the official Google webmaster blog to make it super official, but I wanted to echo the point here as well: Google does not use the keywords meta tag in our web search.

To this day, you still see courts mistakenly believe that meta tags occupy a pivotal role in search rankings. We wanted to debunk that misconception, at least as far as Google is concerned. Google uses over two hundred signals in our web search rankings, but the keywords meta tag is not currently one of them, and I don't believe it will be.
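
For readers who have not seen it, this is the tag in question. The snippet below is a generic, made-up example (not taken from any real site) of how the keywords meta tag typically appears in a page's head; per the post above, the last meta line is simply ignored by Google's web search:

<head>
  <title>Acme Widgets - Handmade Widgets Shipped Worldwide</title>
  <meta name="description" content="Handmade widgets, custom orders and worldwide shipping.">
  <!-- Not used by Google web search rankings -->
  <meta name="keywords" content="widgets, handmade widgets, buy widgets online">
</head>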


Friday, August 21, 2009

WAKE UP CHAMPU

Champu's wife was a troubled soul,
For her life had become dull on the whole.

Champu never got a moment of rest,
At the office it was work, work and more work.

Champu's boss was a pretty cool guy,
He forgot about the promotion every single time.

But he never once forgot the deadline,
And made Champu work every day till nine.

Champu, too, wanted to be the best,
Which is why he never took any rest.

Day and night he slaved for his boss,
Saluting him in the hope of an onsite posting.

Days went by, and then the years,
And Champu's condition only got worse.

Champu could no longer remember a thing,
By mistake he even called his wife "Behenji" (sister).

Finally, one day Champu understood,
And gave up the illusion of the onsite dream.

He said to his boss, "Why do you torment me so?"
"You make a fool of me with the lure of onsite."

"Give me a promotion or I will quit,"
"Even if you offer onsite now, I won't come back."

The boss laughed and said, "That's no problem,"
"I still have plenty more Champus with me."

"This world is full of Champus,"
"Everyone is just desperate to get ahead."

"If you won't do the work, I'll get someone else to,"
"And I'll make another Champu just like you."

Monday, August 17, 2009

HTML 5 and SEO

HTML 5 is still in the making, but for any SEO expert who tries to look ahead, some knowledge of HTML 5 and how it will impact SEO is worth having. It is true that the changes and the new concepts in HTML 5 will impact Web developers and designers much more than SEO experts, but it would still be far from the truth to say that HTML 5 will not mean changes in SEO practice.

What's New in HTML 5?

HTML 5 follows the way the Net has evolved over the last years and includes many useful new tags and elements. At first glance, it might look as if HTML 5 is going in the direction of a programming language (e.g. PHP), but actually this is not so – it is still a markup language used for presentation. The new tags and elements might make HTML 5 look more complex, but this is only at first glance.

HTML 5 is not very different from HTML 4. One of the basic ideas in the development of HTML 5 was to ensure backward compatibility and because of that HTML 5 is not a complete revamp of the HTML specification. So, if you had worries that you will have to start learning it from scratch, these worries are groundless.

How Will the Changes in HTML 5 Affect SEO?

As an SEO expert, you are most likely interested mainly in those changes in the HTML 5 specification that will affect your work. Here are some of them:

  • Improved page segmentation. Search engines are getting smarter and there are many reasons to believe that even now they are applying page segmentation. Basically, page segmentation means that a page is divided into several separate parts (i.e. main content, menus, headers, footers, links sections, etc.) and these parts are treated as separate entries. At present, there is no way for a Web master to tell search engines how to segment a page but this is bound to change in HTML 5.

  • A new <article> tag. The new <article> tag is probably the best addition from an SEO point of view. The <article> tag allows you to mark the separate entries in an online publication, such as a blog or a magazine. It is expected that when entries are marked with the <article> tag, this will make the HTML code cleaner because it will reduce the need to use <div> tags. Also, search engines will probably put more weight on the text inside the <article> tag as compared to the contents of the other parts of the page.

  • A new <section> tag. The new <section> tag can be used to identify separate sections on a page, chapter or book. The advantage is that each section can have its own separate HTML heading. As with the <article> tag, it can be presumed that search engines will pay more attention to the contents of separate sections. For instance, if the words of a search string are found in one section, this implies higher relevance as compared to when these words are found all across the page or in separate sections.

  • A new <header> tag. The new <header> tag (which is different from the head element) is a blessing for SEO experts because it gives a lot of flexibility. The <header> tag is very similar to the <h1> tag, but the difference is that it can contain a lot of stuff, such as H1, H2 and H3 elements, whole paragraphs of text, hard-coded links (and this is really precious for SEO), and any other kind of info you feel is relevant to include.

  • A new <footer> tag. The <footer> tag might not be as useful as the <header> one, but it still allows you to include important information there and it can be used for SEO purposes as well. The <header> and <footer> tags can be used many times on one page – i.e. you can have a separate header/footer for each section, and this gives really a lot of flexibility.

  • A new <nav> tag. Navigation is one of the important factors for SEO and everything that eases navigation is welcome. The new <nav> tag can be used to mark the collection of links to the other pages of the site.

As you see, the new tags follow the common structure of a standard page, and each of the parts (i.e. header, footer, main section) gets a separate tag. The tags described here are just some (but certainly not all) of the new tags in HTML 5 that will affect SEO in one way or another.
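
To make this concrete, here is a minimal, purely illustrative sketch (not taken from the HTML 5 specification or from any real site) of how these new structural tags might fit together on a single page:

<body>
  <header>
    <h1>My Online Magazine</h1>
    <nav>
      <a href="/">Home</a>
      <a href="/archive">Archive</a>
      <a href="/about">About</a>
    </nav>
  </header>
  <section>
    <article>
      <header><h2>First story headline</h2></header>
      <p>The main text of the first story goes here...</p>
      <footer>Posted by the editor</footer>
    </article>
    <article>
      <header><h2>Second story headline</h2></header>
      <p>The main text of the second story goes here...</p>
    </article>
  </section>
  <footer>Site-wide copyright notice and contact links</footer>
</body>

Note how each article can carry its own header and footer – exactly the kind of per-section structure that search engines could use for page segmentation.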

For now, HTML 5 is still far in the future. When more pages become HTML 5-compliant, search engines will pay more attention to HTML 5, and only then will it be possible to know how exactly search engines will treat HTML 5 pages. The mass adoption of HTML 5 won't happen soon, and it is a safe bet to say that for now you can keep to HTML 4 and have no concerns. Additionally, it will take some time for browsers to adjust to HTML 5, which further delays the moment when HTML 5 will be everywhere.

However, once HTML 5 is accepted and put to use, it will be the dominant standard for years to come, and that is why you might want to keep an eye on what other webmasters are doing, just to make sure that you don't miss the moment when HTML 5 becomes the de facto standard.


SEO and Web Hosting

NOWHERE to be found on Search Engines?

If you have submitted your site again and again to search engines but you are still unable to get it indexed, it MAY be your Web hosting provider who is responsible.
If a domain sharing your server and IP address has been penalized by a search engine on account of spamming, then your website can also end up banned or penalized. This can happen with virtual (shared-IP) hosting.

Another situation may arise if your website resides on a server containing illegal adult content: if that site is on a search engine's black list, your website may be banned as well. While you should never assume that your site is blacklisted, you should still take precautions. Even if your site is dropped from the rankings temporarily, you can probably clear up the mistake by contacting the search engine.

To be safe, it's best to host with a reputable Web hosting provider.
Your Web hosting provider should be up 24/7. For proper website service it must be up 100% of the time, so that users and search engines aren't faced with a blank page or a 404 error. Search engines have no specific schedules for crawling, so your website must be up at all times in order to maintain your search engine ranking.

Reputable Web Hosting Providers

The reputation of the web host should be good with search engines.
Although there is no clear way to identify them, web hosts that are very popular can generally be trusted.
Some of the popular SEO-friendly hosting providers are:

  • KVC Hosting
  • Directi
  • Interland
  • 1and1
  • Enom etc.
Sites hosted by free web hosting providers do not usually rank well in Search Engines for competitive keywords.

Conclusion

You may benefit from sticking with the most popular hosting service providers, even if they are a little more expensive, rather than going for the best deal from a host with no reputation.
It is also preferable to have an individual IP address so that the risk of index removal is minimized (hosting providers usually offer dedicated IP addresses only with the purchase of an SSL certificate).

Tuesday, July 28, 2009

Know About Google Sandbox

It's never easy for newcomers to enter a market and there are barriers of different kinds. For newcomers to the world of search engines, the barrier is called a sandbox – your site stays there until it gets mature enough to be allowed to the Top Positions club. Although there is no direct confirmation of the existence of a sandbox, Google employees have implied it and SEO experts have seen in practice that new sites, no matter how well optimized, don't rank high on Google, while on MSN and Yahoo they catch quickly. For Google, the jailing in the sandbox for new sites with new domains is on average 6 months, although it can vary from less than a month to over 8 months.

Sandbox and Aging Delay

While it might be considered unfair to stop new sites by artificial means like keeping them at the bottom of search results, there is a fair amount of reasoning why search engines, and above all Google, have resorted to such measures. With blackhat practices like bulk buying of links, creation of duplicate content or simply keyword stuffing to get to the coveted top, it is no surprise that Google chose to penalize new sites, which overnight get tons of backlinks, or which are used as a source of backlinks to support an older site (possibly owned by the same company). Needless to say, when such fake sites are indexed and admitted to top positions, this deteriorates search results, so Google had to take measures for ensuring that such practices will not be tolerated. The sandbox effect works like a probation period for new sites and by making the practice of farming fake sites a long-term, rather than a short-term payoff for site owners, it is supposed to decrease its use.

Sandbox and aging delay are similar in meaning and many SEO experts use them interchangeably. Aging delay is more self-explanatory – sites are “delayed” till they come of age. Well, unlike in legislation, with search engines this age is not defined and it differs. There are cases when several sites were launched in the same day, were indexed within a week from each other but the aging delay for each of them expired in different months. As you see, the sandbox is something beyond your control and you cannot avoid it but still there are steps you can undertake to minimize the damage for new sites with new domains.

Minimizing Sandbox Damages

While the Google sandbox is not something you can control, there are certain steps you can take in order to make the sandbox effect less destructive for your new site. As with many aspects of SEO, there are ethical and unethical tips and tricks, and the unethical tricks can get you additional penalties or a complete ban from Google, so think twice before resorting to them. The unethical approaches will not be discussed in this article because they don't comply with our policy.

Before we delve into more detail about particular techniques to minimize sandbox damage, it is necessary to clarify the general rule: you cannot fight the sandbox. The only thing you can do is to adapt to it and patiently wait for time to pass. Any attempts to fool Google – starting from writing melodramatic letters to Google, to using “sandbox tools” to bypass the filter – can only make your situation worse. There are many initiatives you can take while in the sandbox, for example:

  • Actively gather content and good links – as time passes by, relevant and fresh content and good links will take you to the top. When getting links, have in mind that they need to be from trusted sources – like DMOZ, CNN, Fortune 500 sites, or other reputable places. Also, links from .edu, .gov, and .mil domains might help because these domains are usually exempt from the sandbox filter. Don't get 500 links a month – this will kill your site! Instead, build links slowly and steadily.

  • Plan ahead – contrary to the general practice of launching a site when it is absolutely complete, launch a couple of pages when you have them. This will start the clock and time will be running parallel to your site development efforts.

  • Buy old or expired domains – the sandbox effect is more serious for new sites on new domains, so if you buy old or expired domains and launch your new site there, you'll experience less problems.

  • Host on a well-established host – another solution is to host your new site on a subdomain of a well-established host (however, free hosts are generally not a good idea in terms of SEO ranking). The sandbox effect is not so severe for new subdomains (unless the domain itself is blacklisted). You can also host the main site on a subdomain, and on a separate domain host just some content linked to the main site. You can also use redirects from the subdomained site to the new one, although the effect of this practice is questionable because it can also be viewed as an attempt to fool Google.

  • Concentrate on less popular keywords – the fact that your site is sandboxed does not mean that it is not indexed by Google at all. On the contrary, you could be able to top the search results from the very beginning! Looking like a contradiction with the rest of the article? Not at all! You could top the results for less popular keywords – sure, it is better than nothing. And while you wait to get to the top for the most lucrative keywords, you can discover that even less popular keywords are enough to keep the ball rolling, so you may want to make some optimization for them.

  • Rely more on non-Google ways to increase traffic – it is often reminded that Google is not the only search engine or marketing tool out there. So if you plan your SEO efforts to include other search engines, which either have no sandbox at all or the period of stay there is relatively short, this will also minimize the damages of the sandbox effect.

Sunday, June 14, 2009

How to get Traffic from Social Bookmarking sites

Sites like digg.com, reddit.com, stumbleupon.com, etc. can bring you a LOT of traffic. How about getting 20,000 or more visitors a day when your listing hits the front page?
Getting to the front page of these sites is not as difficult as it seems. I have been successful with digg and del.icio.us (and not so much with Reddit though the same steps should apply to it as well) multiple times and have thus compiled a list of steps that have helped me succeed:

1. Pay attention to your Headlines

Many great articles go unnoticed on social bookmarking sites because their headline is not catchy enough. Your headline is the first (and very often the only) thing users will see from your article, so if you don't make the effort to provide a catchy headline, your chances of getting to the front page are small.
Here are some examples to start with:

Original headline: The Two Types of Cognition
Modified headline: Learn to Understand Your Own Intelligence

Original headline: Neat way to organize and find anything in your purse instantly!
Modified headline: How to Instantly Find Anything in Your Purse

Here is a good blog post that should help you with your headlines.

2. Write a meaningful & short description

The headline is very important to draw attention but if you want to keep that attention, a meaningful description is vital. The description must be slightly provocative because this draws more attention but still, never use lies and false facts to provoke interest. For instance, if you write “This article will reveal to you the 10 sure ways to deal with stress once and forever and live like a king from now on.” visitors will hardly think that your story is true and facts-based.

You also might be tempted to use a long tell-it-all paragraph to describe your great masterpiece but have in mind that many users will not bother to read anything over 100-150 characters. Additionally, some of the social bookmarking sites limit descriptions, so you'd better think in advance how to describe your article as briefly as possible.

3. Have a great first paragraph

This is a rule that is always true but for successful social bookmarking it is even more important. If you have successfully passed Level 1 (headlines) and Level 2 (description) in the Catch the User's Attention game, don't let a bad first paragraph make them leave your site.

4. Content is king

However, the first paragraph is not everything. Going further along the chain of drawing (and retaining) users' attention, we reach the Content is King Level. If your articles are just trash, bookmarking them is useless. You might cheat users once but don't count on repetitive visits. What is more, you can get your site banned from social bookmarking sites, when you persistently post junk.

5. Make it easy for others to vote / bookmark your site

It is best when other people, not you, bookmark your site. Therefore, you must do your best to make it easy for them. You can put a bookmarking button at the end of the article, so if users like your content, they can easily post it. If you are using a CMS, check if there is an extension that allows you to add Digg, Del.icio.us, and other buttons, but if you are using static HTML, you can always go to the social bookmarking site and copy the code that will add their button to your pages (see the sketch below).
Here is a link that should help you add Links for Del.icio.us, Digg, and More to your pages.
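
As a rough, hypothetical illustration only (the exact submission URLs and query parameters differ per site and change over time, so copy the real code from each site's own button instructions rather than from here), such hand-coded buttons are usually just plain links along these lines:

<!-- Hypothetical example: substitute your own page URL and title -->
<a href="http://del.icio.us/post?url=http://www.example.com/my-article&title=My+Article">
  Save to del.icio.us
</a>
<a href="http://digg.com/submit?url=http://www.example.com/my-article">
  Digg this
</a>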

6. Know when to submit

The time when you submit can be crucial for your attempts to get to the front page. On most social bookmarking sites you have only 24 hours to get to the front page and stay there. So, if you post when most users (and especially your supporters) are still sleeping, you are wasting valuable time. By the time they get up, you might have gone to the tenth page. You'd better try it for yourself and see if it works for you but generally posting earlier than 10 a.m. US Central Time is not good. Many people say that they get more traffic around 3 p.m. US Central Time. Also, workdays are generally better in terms of traffic but the downside is that you have more competitors for the front page than on weekends.

7. Submit to the right category

Sometimes a site might not work for you because there is no right category for you. Or because you don't submit to the right category – technology, health, whatever – but to categories like General, Miscellaneous, etc. where all unclassified stuff goes. And since these categories fill very fast, your chance to get noticed decreases.

8. Build a top-profile

Not all users are equal on social bookmarking sites. If you are an old and respected user who has posted tons of interesting stuff, this increases the probability that what you submit will get noticed. Posting links to interesting articles on other sites is vital for building a top-profile. Additionally, it is suspicious, when your profile has links to only one site. Many social bookmarking sites frown when users submit their own content because this feels like self-promotion.

9. Cooperate with other social bookmarkers

The Lonely Wolf is a suicidal strategy on sites like StumbleUpon, Digg and Netscape. Many stories make it to the front page not only because they are great but because they are backed up by your network of friends. If in the first hours after your submission you get at least 15 votes from your friends and supporters, it is more likely that other users will vote for you. 50 votes can get you to the top page of Digg.

10. Submit in English

Linguistic diversity is great, but the majority of users are from English-speaking countries and they don't understand exotic languages. So, for most of the social bookmarking sites, submitting anything in a language other than English is not advisable. The languages that are at a particular disadvantage are Chinese, Arabic, Slavic languages and all the others that use a non-Latin alphabet. German, Spanish and French are more understandable, but still they are not English. If you really must submit your story (i.e. because you need the backlink), include an English translation at least of the title. But the best way to proceed with non-English stories is to post them where they belong. Check this link for a list of non-English sites.

11. Never submit old news

Submitting old news will not help you in becoming a respected user. Yesterday's news is history. But if you still need to submit old stuff, consider feature articles, howtos and similar pieces that are up-to-date for a long time.

12. Check your facts

You must be flattered that users read your postings but you will hardly be flattered when users prove that you haven't got the facts right. In addition to sarcastic comments, you might also receive negative votes for your story, so if you want to avoid this, check your facts - or your readers will do it for you.

13. Check your spelling

Some sites do not allow you to edit your posts later, so if you misspell the title, the URL, or a keyword, it will stay that way forever.

14. Not all topics do well

But sometimes even great content and submitting to the right category do not push you to the top. One possible reason could be that your stories are about unpopular topics. Many sites have topics that their users love and topics that don't sell that well. For instance, Apple sells well on Digg and The War in Iraq on Netscape. Negative stories - about George Bush, Microsoft, evil multinational companies, corruption and crime also have a chance to make it to the front page. You can't know these things in advance but some research on how many stories tagged with keywords like yours have made the front page in the last year or so can give you a clue.

15. Have Related Articles / Popular Articles

Traffic gurus joke that traffic from social bookmarking sites is like an invasion – the crowds pour in and in a day or two they are gone. Unfortunately this is true – after your listing rolls from the front page (provided that you reached the front page), the drop in traffic is considerable. Besides, many users come just following the link to your article, have a look at it and then they are gone. One of the ways to keep them longer on your site is to have links to Related Articles / Popular Articles or something similar that can draw their attention to other stuff on the site and make them read more than one article.

16. RSS feeds, newsletter subscriptions, affiliate marketing

RSS feeds, newsletter subscriptions, affiliate marketing are all areas in which the traffic from social bookmarking sites can help you a lot. Many people who come to your site and like it, will subscribe to RSS feeds and/or your newsletter. So, you need to put these in visible places and then you will be astonished at the number of new subscriptions you got on the day when you were on the front page of a major social bookmarking site.

17. Do not use automated submitters

After some time of active social bookmarking, you will discover that you are spending hours on end posting links. Yes, this is a lot of time and using automated submitters might look like the solution but it isn't. Automated submitters often have malware in them or are used for stealing passwords, so unless you don't care about the fate of your profile and don't mind being banned, automated submitters are not the way to go.

18. Respond to comments on your stories

Social bookmarking sites are not a newsgroup, but interesting articles can trigger a pretty heated discussion with hundreds of comments. If your article gets comments, you should be proud. Always respond to comments on your stories and, even better, post comments on other stories you find interesting. This is a way to make friends and to create a top-profile.

19. Prepare your server for the expected traffic

This is hardly a point of minor importance, but we take it for granted that you are hosting your site on a reliable server that does not crash twice a day. Have in mind, though, that your presence on the front page of a major social bookmarking site can drive you a lot of traffic, which can cause your server to crash - literally!
I remember one of the times I was on the front page of Digg, I kept restarting Apache on my dedicated server because it was unable to cope with the massive traffic. I have many tools on my site and when visitors tried them, this loaded the server even more.
Well, for an articles site getting so much traffic is not so devastating, but if you are hosting on a so-so server, you'd better migrate your site to a machine that can handle a lot of simultaneous hits. Also, check if your monthly traffic allowance is enough to handle 200,000-500,000 or even more visitors. It is very amateurish to attract a lot of visitors and not be able to serve them because your server crashed or you have exceeded your bandwidth!

20. The snowball effect

But despite the differences in the likes of the different social bookmarking communities, there are striking similarities. You will soon discover that if a post is popular on one of the major sites, this usually drives it up on the other big and smaller sites. Usually it is Digg posts that become popular on StumbleUpon and Reddit but there are many other examples. To use this fact to your best advantage, you may want to concentrate your efforts on getting to the front page of the major players only and bet on the snowball effect to drive you to the top on other sites.
An additional benefit of the snowball effect is that if your posting is interesting and people start blogging about it, you can get tons of backlinks from their blogs. This happened to me and the result was that my PR jumped to 6 on the next update.

Tuesday, June 09, 2009

Difference between AdWords and Yahoo PPC

I have been avoiding learning how Yahoo works for some time. Today I decided to really get into it. Our agency has a Yahoo agency rep who helps us out a lot and I just spent a lot of time talking to him on the phone. Come to find out, Yahoo is very different. Here are a few items; I hope this turns into a big discussion.

1. Quality Score:
Google - Quality score is on an individual keyword and is figured out at time of search. You can have low and high QS keywords in the same ad group. Broad match and phrase match keywords only show the average QS. Only exact match shows a close to real QS and even then it is still an average since QS is based on things that happen at time of search.

Yahoo - Quality score is only shown at the ad group level. This means that everything in the ad group contributes to the QS. A bad-QS keyword can drag down the others. They recommend that we organize ad groups like in Google but then split up the ad groups by traffic. This means that I get 3 times as many ad groups in Yahoo as I do in Google.

2. Match Type.
Google - Phrase, Broad, Exact, Negative Phrase, Negative Broad, Negative Exact. Broad is a wild card that will find all searches containing any of the keywords in your term plus synonyms, plurals, singulars and anything that could be remotely related. Phrase will take the exact match of what you put in the term and allow any other terms that have that exact phrase plus anything before or after the term. Exact is exactly that: nothing more, nothing less. Negatives work the same way except they don't do any guessing: if you put in a singular you have to put in the plural as well. If you put in a negative match you can't have the exact match of the same term. Google considers a plural and a singular keyword to be two different words. Negatives can be done at the campaign and ad group level.

Yahoo - Advanced, Standard, Excluded Words. This has been very confusing to me. It seems that you have broad and really broad. Yahoo does not have an exact match. Advanced is just like Google broad, and standard is just like Google broad used to be. If you put in a negative match keyword it won't be excluded if you have it as a standard or advanced match. If you put in a singular it will also exclude the plural. Yahoo considers plural and singular keywords to be the same keyword; there is no way to separate them. Negatives can be done at the account level and ad group level.

There are many other differences, like campaign settings and geo targeting and so on, but these are the 2 main things that are not obvious. I'm not saying that all of this is true; it is just my understanding. I hope we can talk about this and maybe clear up some of this stuff.

Sunday, May 24, 2009

Best Free SEO Tools

Best Free SEO Tools - Suggested By Jitendra Singh Jat

Google Adwords Keyword Tool
(https://adwords.google.com/select/KeywordSandbox)

Keyword tool from Google that provides specific and similar keywords. It provides you with two lists of words: specific keywords are those that include your keyword and have been searched on Google, and similar keywords are relevant terms searched for by users who also searched for your keyword.

Overture Search Term Suggestion Tool
(http://inventory.overture.com/d/searchinventory/suggestion/)

It allows you to enter a prospective keyword and returns the related keyword phrases and the number of times they have been searched in the last month on Overture search engines such as Yahoo, MSN, Altavista and Hotbot. You use this kind of data to help you choose the keyword phrases you want to target. This tool is a "must use", as you don't want to target keywords that no one is searching for!

Google Suggest Tool
(http://www.google.com/webhp?complete=1)

It offers completions when you type in your keyword for searches. It is very helpful in keyword mining.

Search Term Research
(http://www.keyworddiscovery.com)

This tool requires a username and password but it has lots of features. They claim to have the biggest database, with over 9 billion searches compiled from 37 sources that include major international, pay per click, meta and regional engines. You can search on spell:keyword, related:keyword, like:keyword and category:keyword. The tool will search Overture's data from 16 different countries. There is just too much to list here. Go sign up and take a peek. It is a paid tool.

Keyword Search Tool
(http://www.nichebot.com)

It displays keyword data using Wordtracker and Google search results. For the entered keyword and similar keywords it shows the number of times they were searched in the last 60 days according to WordTracker, along with Google results and Google exact-match result counts.

Wordtracker Keyword Research Tool
(http://www.wordtracker.com)

Wordtracker compiles a database of terms that people search for. You enter some keywords, and it tells you how often people search for them and how many competing sites use those keywords. Wordtracker helps you find all keyword combinations that bear any relation to your business, service or product. It is a paid tool.

GoogleDuel Original
(http://www.googleduel.com/original.php)

It enables you to compare two different words to see which is more popular on the internet. This tool is a fast way to check your keywords to see whether the plural or the singular is the more common form used on the internet.

Keyword Density Tools
(http://www.seochat.com/seo-tools/keyword-density/)
(http://www.keyworddensity.com/)
(http://www.virtualpromote.com/tools/keyword-analyzer/)

Keyword density is a measure of the "density" of the given keyword in a body of text: the density percentage indicates what percentage of the total text is made up of the specified keyword (for example, a keyword that appears 5 times in a 500-word page has a density of 1%). These tools will analyze the URL you enter and return a list of key phrases with their corresponding density values for one, two, or three word key phrases. Keyword Density Analyzer gives you the keyword density in your title, meta description, meta keywords, visible text, alt tags, comment tags, image tags, option tags, reference tags, linked text and URL.

Ontology Finder
(http://www.gorank.com/seotools/ontology/)

It provides synonyms of your keywords to consider as possible keywords or just to add to your webpage.

Keyword Competition Analyzer
(http://www.redalkemi.com/search-engine-optimization-seo/search-engine-keyword-competition.php)

It displays the number of web pages on Google, Yahoo, MSN & HotBot that contain the keyword you enter. A very nice tool that gives you all the numbers in one shot!

Combine Words Script for PPC Systems
(http://www.combinewords.com/)

It is a script designed to combine lists of keywords for use in AdWords, Overture and other Pay Per Click systems.

Advanced Google Search Methods
(http://www.algotech.dk/googlesearches.asp/)

Enter a domain and keyword and select from numerous searches including allintitle, allinurl, allinanchor, allintext, site, define, sponsored links and results with the sandbox penalty suspended.

Free Google Keyword Ranking Tool
(http://www.nichebot.com/ranking/)

Enter keywords, a domain, the depth of search (10-1000), any/all or exact phrase, and choose one of the 101 Google country sites to search from. It will show the position, the number of backlinks and the total number of pages indexed for the domain.

Google Datacenter Watch Tool
(http://www.mcdar.net/dance/index.php)

It is a tool that will let you do a search on Google at all the different datacenters. This shows whether the different datacenters are giving different ranking results for the same key phrase.

Google One Line Results
(http://www.google.com/ie?q=&num=100&hl=en)

It enables you to view Google's search results with each entry on one line, with the title as anchor text.

Google Rank Position
(http://www.prsearch.net/ranking.php)

It allows you to enter a search term and the domain to search for. It returns the position that search term has on Google, the number of entries that search term is indexed for, the PR of your home page, the size of the file, the number of inbound links the site has, and lets you know if you are in the DMOZ or Google Directory.

Google Rankings
(http://www.googlerankings.com/)

It searches for the entered URL and keyword within the first 1000 results.

Datacenter Quick Check
(http://www.mcdar.net/q-check/datatool.asp)

It lets you add your URL and check all Google datacenters for backlinks, PR and the ranking of a keyword.

Future PageRank Tool
(http://www.seochat.com/seo-tools/future-pagerank/)

It will query all of Google's various datacenters and display your PageRank value for each one. If queried during a Google update you will get a glimpse of upcoming PR changes.

Page Rank Tool
(http://www.pageranktool.net/)

It shows PR on the different Google datacenters. Good for monitoring a PR update.

Yahoo Search Rankings
(http://www.yahoosearchrankings.com/)
(http://www.prsearch.net/yahmass.php)

Enter a URL and keyword, and it searches for the entered URL and keyword within the first 1000 results.

MSN Ranking Tools

MSN Position Search
(http://www.prsearch.net/msnmass.php)
(http://www.devppl.com/msnbetarankings.php)

You can search up to 15 keywords for the entered domain, from the top 100 to the top 500 results.

Robots.txt Tools

Search engines will look in your root domain for a special file named "robots.txt" (http://www.mydomain.com/robots.txt). The file tells the robot (spider) which files it may spider (download). This system is called the Robots Exclusion Standard. The format of the robots.txt file is special: it consists of records, and each record consists of two fields – a User-agent line, which specifies the robot, and one or more Disallow: lines, which specify files and/or directories that are not to be indexed by robots.

The format is:

#Robots.txt
User-agent: *      # the "*" wildcard matches all robots
Disallow: /cgi/    # keep robots out of the executable tree
Disallow: /temp/   # temp directory
Disallow: /stats/  # stats directory

Robots.txt Generator
(http://www.123promotion.co.uk/tools/robotstxtgenerator.php)

It generates a simple robots file for your website, allowing you to hide files or directories.

Robots.txt Syntax Checker
(http://www.sxw.org.uk/computing/robots/check.html)
(http://www.searchengineworld.com/cgi-bin/robotcheck.cgi)

It checks your robots.txt file and informs you of any errors that it finds.

Google Banned Tool
(http://123promotion.co.uk/tools/googlebanned.php)

This tool uses two methods to check whether a URL you enter is in the Google database. It's a great way to check whether you have been banned by Google for various reasons. If your site is quite new, you may also not show in the Google database. This tool is aimed at people who think that they have been punished.

Meta Tag Generator
(http://www.123promotion.co.uk/tools/meta_tag_creator.php)
(http://www.searchbliss.com/free_scripts_metatag.htm)

It lets you fill in your web page title, description, and keywords and then creates the meta tags for you.

Similar Page Checker
(http://www.webconfs.com/similar-page-checker.php)

It helps you determine whether the content of your web pages is too similar, as you can be penalized for having similar web pages on the same website.

Poodle Predictor
(http://www.gritechnologies.com/tools/spider.go)

It shows you a spider's view of the text on your web page. Ignore where it shows what your page will look like on a search engine; go into the Diagnostics view and there you will see your page in text only, with all your alt, title, menu and copy displayed in the order the spider will see it.

Search Engine Spider Simulator
(http://www.webconfs.com/search-engine-spider-simulator.php)
(http://www.webmaster-toolkit.com/search-engine-simulator.shtml)
(http://www.searchwho.com/sw5-spider.html)

It shows you the content order of your webpage as the search engine robot sees it. This will help you verify that your keywords are at the top of your page where they will be more relevant. It shows title, meta description, meta keywords, page size, content text, word count, distinct words, keyword density, links, and the page in HTML.

Cool SEO Tool
(http://www.webuildpages.com/cool-seo-tool/)

It provides Google, Yahoo, and MSN rankings, pages indexed in Google and AltaVista, backlinks, allinanchor ranking, age of domain and phrase on page.

Search Engine Index Checker
(http://www.marketleap.com/siteindex/default.htm)

It will display the number of your site's pages that are in the database of the Alltheweb, AltaVista, Google/AOL and HotBot search engines. It also enables you to enter up to 5 URLs of competing sites and compares your site to those sites in a graph.

Check Yahoo WebRank
(http://www.webconfs.com/check-yahoo-webrank.php)

Allows you to check the Yahoo WebRank of up to 5 URLs at a time.

Search Engine Position Checker
(http://www.webmaster-toolkit.com/search-engine-position-checker.shtml)
(http://www.seo-guy.com/seo-tools/se-pos.php)

It allows you to enter your keyword phrases and site URL and then displays the top 50 positions for those keyword phrases in Altavista, Excite, Google, Yahoo, HotBot, MSN, WiseNut, Alltheweb and Teoma.

Search Results Compared
(http://www.yahoogooglemsn.com)

Searches all three major search engines at the same time and displays the results on a single page.

Domain Directory Checker
(http://www.123promotion.co.uk/directory/)

It checks your domain to see if it appears in 10 Internet directories, i.e., Business.com, DMOZ, Gimpsy, GoGuides.com, Jayde, JoeAnt, Skaffe, Websavvy, Yahoo, and Zeal.

Backlink Anchor Text Analyzer
(http://www.webconfs.com/anchor-text-analysis.php)
(http://www.555webtemplates.com/backlinks-tool.asp)

A backlink analyzer that provides a report of the URL and anchor text of up to 1000 of your backlinks that Google is showing. If you have more than one link on the inbound page, it shows each of the anchor texts.

Link Popularity & Site Analysis Tool
(http://www.linkvendor.com/seo-tools/site-analysis.html)
(http://www.webmaster-toolkit.com/link-appeal.shtml)
(http://www.prsearch.net)

It shows your inbound links and pages in Google, Yahoo, MSN, Ask Jeeves, and Alexa. It also shows PR, the archive date (Wayback Machine), the number of outbound links on the page and whether the site is in the DMOZ and Yahoo directories. It shows the number of words and characters in the title, meta keywords and meta description, and shows on a graph where the page has too many or too few.

Link Popularity Report
(http://www.reportbot.com)
(http://uptimebot.com/sql/one.php)

It reports links and pages indexed from Google, Alltheweb, MSN, HotBot & Yahoo, toolbar PageRank, Alexa ranking and checks whether you are listed in DMOZ - and it's fast!

Link Exchange Tools

Broken Link Checker
(http://www.dead-links.com/)

It tests all links site-wide and reports any broken links.

Reciprocal Link Checker
(http://www.recip-links.com/)

It checks the list of reciprocal links you cut & paste into their form. It checks the page, not the whole site, and reports back the URLs that it found had a link to your domain, the URLs that didn't have a link to yours, and any URLs that errored out.

URL Rewriting Tool
(http://www.webconfs.com/url-rewriting-tool.php)

Generates Apache mod_rewrite code to help you convert dynamic URLs to static-looking URLs. Create a file called '.htaccess' and paste the generated code into it.

Domain Age Tool
(http://www.webconfs.com/domain-age.php)

It displays the approximate age of a website on the Internet and allows you to view how the website looked when it first started. It also helps you find out the age of your competitors' domains; older domains may get a slight edge in search engine rankings.

Domain Stats Tool
(http://www.webconfs.com/domain-stats.php)

Provides domain information like age, Yahoo WebRank, count of backlinks and the number of pages indexed in search engines like Google, Yahoo, and MSN.

Web CEO
(http://www.websiteceo.com/download.htm)

Web CEO is a suite of search engine marketing tools in one software program, including a keyword generator, web page optimization analyzer, auto and manual submitter, ranking monitor, top-ranked web page analyzer, HTML editor, FTP client, and traffic counter and analyzer.

Webposition Gold
(http://www.web-positiongold.com/)

WebPosition Gold is a search engine software program that optimizes web pages, submits to top search engines worldwide, and tracks your search engine rankings and web site traffic.

Arelis
(http://www.axandra-link-popularity-tool.com/)

Arelis is a link popularity software program for finding and managing reciprocal link partners to increase web site link popularity ratings, which in turn improves search engine rankings.

Google Toolbar for IE for Google Search
(http://www.toolbar.google.com/)

It is a toolbar that installs on IE and shows you the PageRank for each webpage you visit. Beware - this toolbar will make you more PR conscious.

Search MSN Toolbar
(http://toolbar.msn.com/)

It is a toolbar that installs on Internet Explorer and has a popup blocker and a number of other features.

Yahoo Toolbar for IE for Yahoo Search
(http://www.toolbar.yahoo.com/)

It is a toolbar that installs on Internet Explorer. It has a pop-up blocker and anti-spyware built in.

Saturday, May 23, 2009

SEO and SEM Marketing as a Career

Some Good Reasons to Choose SEO as Your Career

1. High demand for SEO services

Once SEO was not a separate profession - Web masters performed some basic SEO for the sites they managed and that was all. But as sites began to grow and make money, it became more reasonable to hire a dedicated SEO specialist than to have the Web master do it. The demand for good SEO experts is high and is constantly on the rise.

2. A LOT of people have made a successful SEO career

There are many living proofs that SEO is a viable business. The list is too long to be quoted here, but some of the names include Rob from Blackwood Productions, Jill Whalen from High Rankings, Rand Fishkin from SEOmoz and many others.

3. Search Engine Optimizers make Good Money!

SEO is a profession that can be practiced while working for a company or as a solo practitioner. There are many job boards like Dice and Craigslist that publish SEO job advertisements. It is worth noting that the compensation for SEO employees is equal to or even higher than that of developers, designers and marketers. Salaries over $80K per annum are not an exception for SEO jobs.
As a solo SEO practitioner you can make even more money. Almost all freelance sites have sections for SEO services, and offers for $50 an hour or more are quite common. If you are still not confident that you can work on your own, you can start with an SEO job, learn a bit and then start your own company.
If you already feel confident that you know a lot about SEO, you can take this quiz and see how you score. Well, don't get depressed if you didn't pass - here is a great checklist that will teach you a lot, even if you are already familiar with SEO.

4. Only Web-Designing MAY NOT be enough

Many companies offer turn-key solutions that include Web design, Web development AND SEO optimization. In fact, many clients expect that when they hire somebody to make their site, the site will be SEO friendly, so if you are good both as a designer and as an SEO expert, you will be a truly valuable professional.
On the other hand, many other companies are dealing with SEO only because they feel that this way they can concentrate their efforts on their major strength – SEO, so you can consider this possibility as well.

5. Logical step ahead if you come from marketing or advertising

The Web has changed the way companies do business, so to some extent today's marketers and advertisers need to have at least some SEO knowledge if they want to be successful. SEO is also a great career for linguists.

6. Lots of Learning

For somebody who comes from design, development or web administration, SEO might not look technical enough and you might feel that you will be downgrading if you move to SEO. Don't worry so much - you can learn a LOT from SEO, so if you are a talented techie, you are not downgrading; you are actually upgrading your skills package.

7. SEO is already recognized as a career

Finally, if you need some more proof that SEO is a great career, have a look at the available courses and exams for SEO practitioners. Well, they might not be a CISCO certification but still they help to institutionalize the SEO profession.

Some Ugly Aspects of SEO

1. Dependent on search engines

It is true that in any career there are many things that are outside of your control, but for SEO this is rule number one. Search engines frequently change their algorithms and, what is worse, these changes are not made public, so even the greatest SEO gurus admit that they make a lot of educated guesses about how things work. It is very discouraging to make everything perfect and then to learn that due to a change in the algorithm, your sites dropped 100 positions down. But the worst part is that you need to communicate this to clients, who are not satisfied with their sinking rankings.

2. No fixed rules

Probably this will change over time, but for now the rule is that there are no rules – or at least no written ones. You can work very hard, follow everything that looks like a rule and still success may not come. Currently you can't even take a search engine to court for the damage done to your business, because search engines are not obliged to rank highly sites that have made efforts to get optimized.

3. Rapid changes in rankings

But even if you somehow manage to get to the top for a particular keyword, keeping the position requires constant effort. Well, many other businesses are like that, so this is hardly a reason to complain – except when an angry customer starts shouting at you that this week their rankings are sinking and of course this is all your fault.

4. SEO requires Patience

The SEO professional and the customer both need to understand that SEO takes constant effort and time. It could take months to move ahead in the rankings, or to build tens of links. Additionally, if you stop optimizing for some time, most likely you will experience a considerable drop in rankings. You need lots of motivation and patience not to give up when things are not going your way.

5. Black hat SEO

Black hat SEO is probably one of the biggest concerns for the would-be SEO practitioner. Fraud and unfair competition are present in any industry and those who are good and ethical suffer from this but black hat SEO is still pretty widespread. It is true that search engines penalize black hat practices but still black hat SEO is a major concern for the industry.

So, let's hope that by telling you about the pros and cons of choosing SEO as your career we have helped you make an informed decision about your future.
