Tuesday, March 11, 2014

LAST Life-Changing Seminar


Bhagwat Gita Saar in Hindi


Lord Krishna asks if it is not wrong to blame someone else for our unfulfilled desires


Sri Krishna sheds light on how decisions affect our lives


Lord Krishna examines the basis on which we make our decisions


Thursday, October 03, 2013

How to Build Links Using Expired Domains

Many people have had great success snapping up expired domains and using those sites for link building purposes. One of the main reasons for this was that it saved work, as you could grab a site that already had content and backlinks and at least a baseline established presence.
However, after the past year of Google changes that make link building trickier than ever, this process is no longer as easy or as safe as it once was. It can still be valuable, though, if you think about what you're doing. Don't just buy every domain that has your desired keyword in it, then hastily 301 redirect it to your own site or trash the content with links to your main site, expecting miracles.
Affiliate marketers are also fond of using expired domains in their work, so while we won't go into detail on that, we will cover some topics relevant to that specific use.

How to Find Dropped/Expired/Expiring Domains

Domain Tools is one of the main places I check, but many sites list expired or about-to-expire domains that are up for grabs. Network Solutions offers custom email alerts: enter a keyword and you'll get an email when matching domains are expiring, which is a nice option for those of you who prefer a more passive approach.
Snap Names is also good, as is Drop Day. You may find that there are certain sites that are best for your purposes (whether it's keeping an eye on ones you want or getting ones that just expired) so look around and figure out what best suits you.
Want a domain that's at least 9 years old and has a listing in DMOZ? Domain Tools is where I'd go for that, for example:
Of course if you come across a domain that you like and it's not set to expire any time soon, there's nothing wrong with emailing the owner and asking to buy it.
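If you're watching a shortlist of candidates yourself, the expiry check is easy to script. Here's a minimal sketch using the third-party python-whois package (my own choice of tool, not one the services above require; the domains are placeholders):

```python
# Check WHOIS expiration dates for a few candidate domains.
# Requires: pip install python-whois
import whois

def expiration(domain):
    record = whois.whois(domain)
    exp = record.expiration_date
    if isinstance(exp, list):  # some registries return several dates
        exp = min(exp)
    return exp

for domain in ["example.com", "example.org"]:  # placeholder candidates
    print(domain, "expires", expiration(domain))
```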

How to Vet Expired Domains

  • Check to see what domains 301 redirect to them. I use Link Research Tools for this, as you can run a backlink report on the domain in question and see the redirects. If you find a domain that has 50 spammy 301s pointing to it, it may be more trouble than it's worth. Preventing a 301 from coming through when you don't control the redirecting site is almost impossible. You can block it at the server level, but that won't stop your site from receiving bad link karma from Google. In that case, you may have to disavow those domains.
  • Check their backlinks using your link tool of choice. Is the profile full of nothing but spam that will take ages to clean up or will you have to spend time disavowing the links? If so, do you really want to bother with it? If you want to buy the domain to use for a 301 redirect and it's full of spammy links, at least wait until you've cleared that all up before you 301 it.
  • Check to see if they were ever anything questionable using the Wayback Machine (a quick way to script this check is sketched after this list). If the site simply wasn't well done 2 years ago, that's not nearly as big a problem as if you're going to use the site to educate people about the dangers of lead and it used to sell Viagra.
  • Check to see if the brand has a bad reputation. Do some digging upfront so you can save time disassociating yourself from something bad later. You know how sometimes you get a resume, ask a current employee whether they remember this Susan from the place they both worked years ago, and hear "oh yes, I remember her. She tried to burn the building down once"? Well, Susan might try to burn your building down, too.
  • Check to see if they were part of a link network. See what other sites were owned by the same person and check them out too.
  • Check to see if they have an existing audience. Is there an attached forum with active members, are there people generally commenting on posts and socializing them, etc.?
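For the Wayback Machine check mentioned above, the Internet Archive has a public availability endpoint you can script against. A minimal sketch (the domain is a placeholder and the dates are arbitrary checkpoints):

```python
# Fetch the closest archived snapshot of a domain at a few past dates,
# using the Internet Archive's availability API.
import requests

def closest_snapshot(domain, timestamp):
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": domain, "timestamp": timestamp},
        timeout=10,
    )
    closest = resp.json().get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest else None

domain = "expired-candidate.example"  # placeholder
for ts in ["20060601", "20090601", "20120601"]:
    print(ts, closest_snapshot(domain, ts) or "no snapshot")
```

Open each snapshot it returns and skim what the site actually was at that point in its history.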

How Should You Use Expired Domains?

Many people 301 redirect these domains to their main sites or secondary sites in order to give them a boost. Others turn them into part of their legitimate online arsenal and use them as a proper standalone resource.
Some people add them to their existing blog network and interlink them. Some people keep them and use them to sell links. Some people keep them and try to resell them. Some people use them to try their hand at affiliate marketing.
That's how people do use them, though, not how they should; how you should use them depends on your own goals and appetite for risk.
I once worked with an account where we used tons of microsites. They were standalone sites that each linked to the main brand site, and we built links to them. It worked for a while (and still works for many people, according to what I see in forums), but as far as I can tell, most of those microsites are no longer in Google's index or no longer contain live links to the brand site. In that case, the tactic stopped working and became more of a danger than anything else. The sites served no purpose other than to host a link to the brand site, and since they gained no authority, they weren't worth the trouble of keeping up.
I've also dealt with someone who successfully bought expired domains and redirected them to subdomains on his main site in order to split it up into a few niche subdomains. He didn't overdo it, and each expired domain had a good history with content relevant to what the subdomain was, so it all worked very well.
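If you do go the redirect route, it's worth confirming that every hop in the chain is a permanent 301 and that it lands where you intend; a temporary 302 isn't treated the same way. A quick sketch with the requests library (the domain is a placeholder):

```python
# Walk the redirect chain from an expired domain and print each hop.
import requests

def redirect_chain(url):
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [(r.status_code, r.headers.get("Location")) for r in resp.history]
    return hops, resp.url

hops, final = redirect_chain("http://old-expired-domain.example/")  # placeholder
for status, location in hops:
    print(status, "->", location)
print("final destination:", final)
```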
As mentioned early on, affiliate marketers also use expired domains. One big benefit here is that if you plan to drive traffic purely with PPC, you don't have to be as concerned about the domain's backlink profile, since you may not care much about its organic rankings.

Some Good Signs of Expired Domains

Some of these depend upon the purpose you have in mind, but here are a few things I like to see in an expired or expiring domain. Keep in mind that these aren't defining features of a quality domain; they are simply signs that it might be a good one to use:
  • Authority links that will pass through some link benefit via a 301 redirect (if I'm going that route).
  • An existing audience of people who regularly contribute, comment, and socialize the site's content (if I'm going to use it as a standalone site). If I'm looking to buy a forum, for example, I'd want to make sure there are contributing members with something to offer already there. If I want a site that I will maintain, add to, and build out further, seeing an audience of people reading the content, commenting on it, and socializing it would make me very happy.
  • A decent (and legitimate) Toolbar PageRank (TBPR) that is in line with where I think it should be. If I see a site that is 7 months old and has a TBPR of 6, I'll obviously be suspicious, and if I found one that was 9 years old with a TBPR of 1, I would hesitate before using it. I also have to admit that while I don't rely on TBPR as a defining metric of quality, I'd be crazy to pretend it means nothing, so it's definitely something I look at.
  • A domain age of at least 2 years if I was going to do anything other than hold it and try to resell it.
  • Internal pages that have TBPR. If there are 5000 pages and only the homepage has any TBPR, I'd be a bit suspicious about why no internal pages had anything.

A Few Red Flags of Expired Domains

  • Suspicious TBPR as mentioned above.
  • The domain isn't indexed in Google. Even if a recently expired site has a TBPR of 4 with good Majestic flow metrics, is 5 years old, and was updated in some way right up until it expired (whether through new blog posts, comments, social shares, etc.), it's safe to assume it's not indexed for a good reason, and you probably want to stay away from it.
  • Backlink profile is full of nothing but spam.
  • All comments on the site's posts are spammy ones and trackbacks.

Bottom Line: Is Using Expired Domains a Good Idea?

As with almost anything in SEO right now, some tactics aren't really great ideas for the long-term but since they work for the short-term, people still use them. Some tactics that won't work in one niche will still work well in certain other niches and some sites seem to be able to weather just about any algorithmic change in Google.
That's why it's hard to say that you shouldn't do this, or you should do that, because every case is different, every webmaster/site owner has a different idea about risk, and a lot of people have made a lot of money off doing things that I personally wouldn't do.
I don't have time to keep up the blogging on my own site, so I would never expect to keep it up on five sites, each devoted to a specific area of my industry; but with the right resources and the right people, this can be a successful strategy.
If you plan to use them for affiliate marketing and you're going to use PPC for that, you don't have to worry about some of the things that you would have to be concerned with if you planned to rank well.

After '(Not Provided)' & Hummingbird, Where is Google Taking Us Next?

We've come a long way in a little over two decades of search. Archie, Veronica, Jughead, Excite, Wanderer, Aliweb, Altavista, WebCrawler, Yahoo, Lycos, LookSmart, Google, HotBot, Ask, dmoz, AllTheWeb, Goto (Overture), Snap, LiveSearch, Cuil, Bing, Blekko, DuckDuckGo, Yandex, Baidu... and too many other also-rans to name.
The earliest were simply a collection of resources, initially just in alphabetical order, then some introducing an internal search capability. Eventually, some began to crawl the web, while others contented themselves with using the indexes of others.
Among them all, Google now stands out as the giant. About two-thirds of all global searches happen on Google. That means those of us who want our sites to be found in Google's search results need to color between the (webmaster guide)lines, while trying to figure out what Google wants to see, today and, hopefully, tomorrow.

Search Today

Figuring out what Google prefers to rank isn't really that complex. Pay attention, use some common sense, don't look for silver bullets, and provide quality and value. Get that down pat and you're in pretty good shape.
Most folks who find themselves crosswise of Google got there because they (or someone they hired) tried to take a shortcut. Do shortcuts still work? You bet! Do they still last? Not so much!
Google has gotten a lot better at detecting and handling manipulative tactics. No, they're not perfect – not by a far cry. But the improvement is undeniable, and a couple of recent developments offer hope.
What happened?
Google unleashed a one-two punch recently, with two important changes that stirred up a lot of chatter in SEO and marketing communities. And I'm not convinced they're unrelated. They just mesh too well to be coincidence (not to be confused with correlation, my friends).

1. '(Not Provided)'

The recent extension to "(not provided)" for 100 percent of organic Google keywords in Google Analytics got a lot of people up in arms. It was called "sudden", even though it ramped up over a period of two years. I guess "it suddenly dawned on me" would be more accurate.
As my bud, Thom Craver, stated perfectly, if you're one of those who is saying that no keywords means SEO is dead or you can't do your job, then you shouldn't be doing SEO to begin with.
That sums it up pretty well. There are still ways to know what brought users to your pages. It's just not handed to you on a silver platter any more. You'll have to actually work for it.

2. Hummingbird

Now let's look at the other half of that double-tap: Hummingbird. Since Google's announcement of the new search algorithm, there have been a lot of statements that fall on the inaccurate end of the scale. One common theme seems to be referring to it as the biggest algo update since Caffeine.
Wrong on both counts, folks! First, Caffeine is a software set for managing the hardware that crawls and indexes, not search. As such, it's not an algorithm. It was also new, not updated, but we'll let that slide.
That second point, however, applies strongly to Hummingbird. There is no such thing as a Hummingbird update. It's a brand new search algorithm.
Jeez-Louise. If you're going to speak out, at least try not to misinform, OK?

Why Might They Be Related?

Now understand, there's a bit of conjecture from here on out. I can't point to any evidence that supports this theory, but I think many of you will agree it makes some sense.
Killing the easy availability of keywords makes sense to me. People have focused on keywords to a degree that approaches (and often passes) ridiculous. Google has finally, however, achieved a sufficient level of semantic ability to allow them to ascertain, with a reasonable amount of accuracy, what a page is about, without having exact keywords to match to a query.
Methinks it's a good idea for the folks who are generating content to try the same.
So... we can no longer see the exact keywords that visitors used to find us in organic search. And we no longer need to use exact keywords to be able to rank in organic search.
Yeah, I know, pure correlation. But still, a pattern, no?
My theory is that there's no coincidence there. In fact, I think it runs deeper.
Think about it. If you're no longer targeting the keywords, you can actually *gasp* target the user. Radical concept for folks who are still stuck in a 2005 rut.
Bottom line: You need to start building your content with concept and context in mind. That'll result in better content, more directed to your visitors – then you can stop worrying about whether Google has a clue about the topic your page is focused on.
Just communicate. If you do it right, it'll come through, for both. Just think things, not strings.

Where is Search Heading Next?

Here's where I think the Knowledge Graph plays a major role. I've said many times that I thought Google+ was never intended to be a social media platform; it was intended to be an information harvester. I think that the data harvested was intended to help build out the Knowledge Graph, but that it goes still deeper.
Left to its own devices, Google could eventually build out the Knowledge Graph. But it would take time, and it would undoubtedly involve a lot of mistakes, as they dialed their algos in.
With easily verified data via Google+, Google has a database against which they can test their algos' independent findings. That would speed the development process tremendously, probably shaving two or three years off the process.
But my theory doesn't end there. Although I suspect it wasn't a primary motivation, the removal of keywords, coupled with the improved semantic ability of Hummingbird, puts a whole new level of pressure on people to implement structured data. As adoption cranks up, the Knowledge Graph will be built out even faster.
As I said, I doubt that motivating people to implement structured data markup was a primary focus of the recent changes. But I'll bet it was a major benefit that didn't go unnoticed at the 'Plex.
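If you want to experiment with structured data yourself, the schema.org vocabulary is the natural starting point. A minimal, hypothetical sketch that emits an Organization description as a JSON-LD script block (one markup format among several; all values here are placeholders, not anything Google prescribes):

```python
# Build a schema.org Organization snippet and print it as JSON-LD.
import json

organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Widgets Inc.",    # placeholder
    "url": "https://www.example.com",  # placeholder
    "sameAs": ["https://plus.google.com/+ExampleWidgets"],  # placeholder profile
}

print('<script type="application/ld+json">')
print(json.dumps(organization, indent=2))
print("</script>")
```

Note the sameAs link back to a Google+ profile: exactly the kind of easily verified connection that feeds the harvesting theory above.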
The last week has definitely brought some changes to the way we'll be handling our online marketing and SEO efforts. The Internet continues to evolve. Those who don't follow suit may soon be extinct.

Tuesday, June 25, 2013

How to Make Your Keywords Fit Your Marketing Messaging

There is so much more to keywords than traffic.
They're the biggest descriptor of your business, your self-identifier, and you don't just use them in search. When you have those uncomfortable "So, what do you do?" conversations with the person next to you on an airplane, a keyword is one of the first things out of your mouth.
My point is that keywords are how people remember you, so you need to think more about how they reflect what you do rather than how much they're searched when choosing the right ones for you.
My digital agency is rebranding, which means a new website, new logo, new target audience and new marketing messaging. That also means a whole new set of keywords. Here's how we figured them out.

Who Are You?

Many dive into keyword research without first taking a hard look at the business itself. That feels backwards. Before you can know how you want users to find you, you need to know how the company positions itself.
You may already have a good idea if you're working with a company that's been around for a while, but since we're rebranding, we started from scratch. During our branding discussions, we asked questions like:
  • If your company was a car, how would you describe it?
  • How would you describe your company culture?
  • What are your company's core values?
  • What's your company's mission? The vision?
  • Who do you want to buy from your company?
  • Who are the decision makers? Decision influencers?

Market Research

The first place to start keyword research is with your users. Don't get me wrong: Google's Keyword Tool is great for brainstorming variations and validating your opinions with numbers, but you should never rely on a bot to tell you what your users are doing.
By putting together a simple survey in SurveyMonkey (or, if your target audience is broader, simply asking people in your local coffee shop), you can glean a lot about how users actually search.
How many of you have had the "you really don't know what keywords your users would use to search for your business" conversation with a client?
Ask them questions like:
  • How do you find a company for XXXX?
  • What would you type into Google to find these companies?
  • If you were looking for advice on XXXX, what would you do?
  • What's the most important thing you look for on a XXXX company's home page?
When we did this, we sent the survey to existing clients who fit our new target (CMOs and marketing organizations) and promoted it through social media to garner feedback.

Informational vs. Commercial Searches

Your user research will tell you a lot about how people actually search, but you do ultimately need to run what they gave you through Google's Keyword Tool to get an idea of traffic, and through Moz's Keyword Difficulty tool to get an idea of what it's going to take to see the fruits of your labor.
Traffic isn't the last stop in the vetting process, though. Next, think about the type of query your keyword triggers. Users search because they're looking for an answer to a question, and that comes with some sort of action. The two biggest are:
  • Informational: Your user is still researching and just wants more information about a topic.
  • Commercial: Your user is looking for a business and ready to buy.
You don't want a commercial-based page ranking for an informational-based keyword because you're failing to reach people at the right phase of the buying process.
[Figure: the AIDA buying funnel]
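As a rough first pass, you can even script the sorting of surveyed phrases into these two buckets. A naive sketch; the marker words are my own assumptions, not a standard list:

```python
# Naive intent bucketing for surveyed search phrases.
COMMERCIAL = ("buy", "price", "pricing", "hire", "quote", "agency", "company")
INFORMATIONAL = ("how", "what", "why", "guide", "tips", "advice")

def intent(query):
    q = query.lower()
    if any(word in q for word in COMMERCIAL):
        return "commercial"
    if any(word in q for word in INFORMATIONAL):
        return "informational"
    return "unclear"

for q in ["how to redesign a website", "web design agency dallas", "best cms"]:
    print(q, "->", intent(q))
```

Anything the heuristic marks "unclear" still needs a human look; the point is triage, not classification.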

Lastly, Go to Google

Ultimately, you want to ensure you're comparable with the company you keep, meaning the companies who are using your potential keywords should be your actual competitors, not just your search competitors. So, search for your keywords and see who's ranking.
Look for things that your company and these companies share, like:
  • Target audience
  • Services or products
  • Price points
  • Messaging and positioning
For example, "web design company" has much more traffic but "web design agency" brings up a higher quality of businesses. This was further validated by our market research when we found more C-level marketers use agency over company or firm, and those are the people we are trying to reach.

6 Reasons Why Your Google AdWords for Small Business Is Not Working



    Small businesses encounter many challenges while managing their AdWords campaigns: they have limited budgets, their websites are not optimized, and many owners don't have a clear understanding of Google AdWords. I truly believe you should do your homework before hiring external help. Let's have a look at the six biggest mistakes I have noticed over and over again in the AdWords accounts of small business owners...

    Thursday, May 23, 2013

    Tuesday, December 27, 2011

    Good to know

    The one you saw as forever walking in search of destinations seemed, to me, to be changing homes his whole life.

    Thursday, January 20, 2011

    Too Good

    Three people stood at the gates of heaven.

    God
    : Only one of you may enter....

    1st
    : I am a Brahmin; I have served you all my life. Heaven is my right....

    2nd
    : I am a doctor; I have served people all my life. Heaven is my right....

    3rd
    : I WORKED IN IT.... ......

    God
    : Stop... say no more.... Are you trying to make me cry, you fool..? Come on in......... Your forwarded mails, follow-ups, two years on the bench, night shifts, run-ins with the PM, deductions bigger than your CTC, pick-up and drop hassles, client meetings, delivery dates, working weekends, hair falling out and going grey young, weight problems, etc. etc.... you've made me emotional, my friend... come in, come in quickly.....

    Monday, November 08, 2010

    Footer Link Optimization for Search Engines and User Experience

    Site after site I visit lately shows a tendency to run internal SEO link structure and anchor text optimization through footer links. While this practice held value in years past, today I rarely recommend it (and yes, SEOmoz itself will soon be moving away from using our footer for category links). Here's why:

    1. Footer links may be devalued by search engines automatically
      Check out the evidence - Yahoo! says they may devalue footer links, Bill Slawski uncovers patents suggesting the same and anecdotal evidence suggests Google might do this (or go further) as well. Needless to say, if you want to make sure your links are passing maximum value, it's wise to avoid the footer (particularly the footer class itself).
    2. Footer links are often not the first link on the page to a URL
      Since we know that the first link on a page is the one whose anchor text counts, and footer links (however anchor-text optimized) are often a second link to an already-linked-to target, they are likely not to have the desired impact.
    3. Footer links get very low CTR
      Naturally, since they're some of the least visible links on a webpage, they receive very little traffic. Thus, if algorithms like BrowseRank or other traffic metrics start to play a role in search rankings, footers are unlikely to have a positive impact.
    4. Footer links often take a page beyond a healthy link total
      Many pages that already have 80-100 links on the page are going to lose out when they add a footer with another 30-50 links embedded. The link juice passed per link will go down and the value of each individual link is lowered.
    5. Footer links can be a time suck
      The time you spend crafting the perfect link structure in the footers could be put towards more optimal link structures throughout the site's navigation and cross-linking from content, serving both users and engines better.

    That's not to say I don't suggest doing a good job with your footers. Many sites, large and small, will continue to use the footer as a resource for link placement and, just as with all other SEO tactics that fade, they do carry some residual value. Let's walk through a few examples of both good and bad to get a sense for what works:

    Thumbs Up: Shopper.Cnet.com


    I like the organization, the clear layout, the visibility, and the fact that they've distinguished (through straight HTML links vs. drop-downs) which links deserve to pass link juice and ranking value. I'm also impressed that although I see a "Paris Hilton" link in the footer, I can accept that it might be there entirely naturally, simply as a result of what's popular on CBS.

    Thumbs Down: Hawaii-Aloha.com


    These are my least favorite kinds of footers. The links are squashed together, the focus is obviously on anchor text rather than relevance, the links are hard to see and read, and there's little thought given to users. The links don't even look clickable until you hover.

    Thumbs Up: VIPRealtyInfo.com


    When I searched for "Dallas Condos", I was sure I'd find some examples for thumbs down, which is why I was so thrilled to find VIPRealtyInfo, a clearly competitive site in a tough SEO market doing a lot of things right. Yeah, there's some reasonable optimization in the anchor text, but it's definitely not overboard and the links are useful to people and search engines. The visual layout and design quality gives it an extra boost, too - something that can't be overstated in importance when it comes to potential manual reviews by the engines.

    Thumbs Down: ABoardCertifiedPlasticSurgeonResource.com


    The site's done a great job with design - it's really quite an attractive layout and color scheme. The links in the "most popular regions" aren't that bad; it's really the number of them that makes me worried. If they'd stuck to one column, I think they'd easily pass a manual review and pass good link juice (rather than spreading it out with so many links in addition to everything else on the page). The part that really sent me over the edge though was the two sentences in the green box, laden with links I didn't even realize were there until I hovered. Technically, there's nothing violating the search guidelines, but I wouldn't put it past the engines to come up with smart ways to devalue links like these, particularly when their focus is so clearly on anchor text, not user value.

    Thumbs Up: Food.Yahoo.com


    Again - great organization, good crosslinking (remaining relevant, then branching out to other network properties) and solid design. Even the most aggressive of the links on the right hand side appear natural and valuable to users, making it hard for an engine to argue they shouldn't pass full value.

    Thumbs Sideways: DeviantArt.com


    It's huge - seriously big. And while it's valuable for users and even contains some interesting content, it's not really accomplishing the job of a footer - it's more like a giant permanent content block on the site. The arrow that lets you close it is a good feature, and the design is solid, too. However, the link value really isn't there and the potential for big blocks of duplicate content across the site makes me a bit nervous, too.

    So what can we take away from these analyses? A few general footer-for-SEO rules of thumb:

    • Don't overstuff keywords in anchor text
    • Make the links relevant and useful
    • Organize links intelligently - don't just throw them into a big list
    • Cross-linking is OK, just do it naturally (and in a way that a manual review could believe it's not solely for SEO purposes)
    • Be smart about nofollows - nearly every footer on the web has a few links that don't need to be followed, so think about whether your terms of service and legal pages really require the link juice you're sending
    • Make your footers look good and function well for users to avoid being labeled "manipulative" during a quality rater's review
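To sanity-check a footer against these rules, a quick audit script helps. A minimal sketch with requests and BeautifulSoup (the URL and the footer selectors are assumptions; adjust them to the markup of the site you're checking):

```python
# Count a page's total links, its footer links, and how many footer links
# are followed (i.e., lack rel="nofollow").
import requests
from bs4 import BeautifulSoup

def audit_footer(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    all_links = soup.find_all("a", href=True)
    footer = (soup.find("footer")
              or soup.find(id="footer")
              or soup.find(class_="footer"))
    footer_links = footer.find_all("a", href=True) if footer else []
    followed = [a for a in footer_links
                if "nofollow" not in (a.get("rel") or [])]
    print("total links:", len(all_links))
    print("footer links:", len(footer_links), "| followed:", len(followed))

audit_footer("https://www.example.com/")  # placeholder URL
```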

    Best Practices for Content Optimization

    Is it possible that in all the years we've been writing at SEOmoz, there's never been a solid walkthrough on the basics of content optimization? Let's fix that up.

    First off, by content, I don't mean keyword usage or keyword optimization. I'm talking about how the presentation and architecture of the text, image and multimedia content on a page can be optimized for search engines. The peculiar part is that many of these recommendations are second-order effects. Having the right formatting or display won't necessarily boost your rankings directly, but through it, you're more likely to earn links, get clicks and eventually benefit in search rankings. If you regularly practice the techniques below, you'll not only earn better consideration from the engines, but from the human activities on the web that influence their algorithms.

    Content Structure

    Because SEO has become such a holistic part of website improvement, it's no surprise that content formatting - the presentation, style, and layout choices you select for your content - is a part of the process. Sans serif fonts like Arial and Helvetica are wise choices for the web; Verdana in particular has received high praise from usability/readability experts, such as in this article from WebAIM:

    Verdana is one of the most popular of the fonts designed for on-screen viewing. It has a simple, straightforward design, and the characters or glyphs are not easily confused. For example, the upper-case "I" and the lower-case "L" have unique shapes, unlike Arial, in which the two glyphs may be easily confused.

    Another advantage of Verdana is the spacing between letters. One consideration to take into account with Verdana is that it is a relatively large font. The words take up more space than words in Arial, even at the same point size.

    The larger size improves readability, but also has the potential of disrupting carefully-planned page layouts.

    Font choice is accompanied in importance by sizing and contrast issues. Type smaller than 10pt is typically very challenging to parse, and in all cases relative font sizes are recommended so users can employ browser options to increase or decrease size if necessary. Contrast - the color difference between the text and the background - is also critical: legibility usually drops for anything that isn't black (or very dark) on a white background.

    Content length is another critical piece of the optimization puzzle that's mistakenly placed in the "keyword density" or "unique content" buckets of SEO. In fact, content length can have a big role to play in whether your material is easy to consume and easy to share. Lengthy pieces often don't fare particularly well on the web, while short form and easily-digestible content often has more success. Sadly, splitting long pieces into multiple segments frequently backfires, as abandonment increases while link-attraction falls - the only benefit is page views per visit (which is why so many CPM-monetized sites employ this tactic).

    Last but not least in content structure optimization is the display of the material. Beautiful, simplistic, easy-to-use and consumable layouts garner far more readership and links than poorly designed content wedged between ad blocks that threaten to overtake the page. I'd recommend checking out The Golden Ratio in Web Design from NetTuts, which has some great illustrations and advice on laying out web content on the page.

    CSS & Semantic Markup

    CSS is commonly mentioned as a "best practice" for general web design & development, but its principles coincide with many SEO guidelines as well. First, of course, is web page size. Google used to recommend keeping pages under 101K and, although most suspect that's no longer an issue, keeping file size low means faster load times, lower abandonment rates and a higher probability of being fully indexed, fully read and more frequently linked-to.

    CSS can also help with another hotly debated issue: code to text ratio. Some SEOs swear that making code to text ratio smaller (so there's less code and more text) can help considerably on large websites with many thousands of pages. My personal experience showed this to be true (or, at least, appeared to be true) only once, but since good CSS makes it easy, there's no reason not to make it part of your standard operating procedure for webdev. Use tableless CSS stored in external files & keep Javascript calls external and follow in the path of CSS Zen.
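If you want to measure where a page stands before and after a CSS cleanup, the ratio is easy to approximate. A rough sketch with requests and BeautifulSoup (visible text length over raw HTML length; a crude proxy, not an official metric):

```python
# Approximate a page's text-to-code ratio: visible text vs. raw HTML size.
import requests
from bs4 import BeautifulSoup

def text_ratio(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()  # scripts and styles aren't readable text
    text = soup.get_text(separator=" ", strip=True)
    return len(text) / max(len(html), 1)

print(f"text is {text_ratio('https://www.example.com/'):.1%} of the page")  # placeholder URL
```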

    Finally, CSS provides an easy means for "semantic" markup. For a primer, see Digital Web Magazine's article, Writing Semantic Markup. For SEO purposes, only a few primary tags apply, and the extent of microformats interpretation (using dedicated tags to mark up items like contacts, reviews, and events) is less critical - the engines tend to sort out semantics largely on their own, since so few web publishers participate in this coding fashion. Using CSS to provide emphasis, to quote/reference, and to reduce the use of tables and other bloated HTML mechanisms for formatting, however, can make a positive difference.

    Content Uniqueness & Depth

    The final portion of our content optimization discussion is the most important. Few can debate the value the engines place on robust, unique, value-adding content. Google in particular has had several rounds of kicking "low quality content" sites out of their indices, and the other engines have followed suit.

    The first critical designation to avoid is "Thin Content" - an insider phrase that (loosely) means that which the engines do not feel contributes enough unique material to display a page competitively in the search results. The criteria have never been officially listed, but I have seen & heard many examples/discussions from engineers and would place the following on my list:

    • 30-50 unique words, forming unique, parseable sentences that other sites/pages do not have
    • Unique HTML text content, different from other pages on the site in more than just the replacement of key verbs & nouns (yes, this means all those sites that build the same page and just change the city and state names thinking it's "unique")
    • Unique titles and meta description elements - if you can't write unique meta descriptions, just exclude them. I've seen similarity algos trip up pages and boot them from the index simply for having near-duplicate meta tags (a quick check for this is sketched below).
    • Unique video/audio/image content - the engines have started getting smarter about identifying and indexing pages for vertical search that wouldn't normally meet the "uniqueness" criteria

    BTW - You can often bypass these limitations if you have a good quantity of high value, external links pointing to the page in question (though this is very rarely scalable) or an extremely powerful, authoritative site (note how many one sentence Wikipedia stub pages still rank).
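The near-duplicate meta tag trap in particular is cheap to check for before the engines do it for you. A minimal sketch using difflib's similarity ratio (the URLs are placeholders, and the 0.9 threshold is an assumption, not a known engine cutoff):

```python
# Flag pairs of pages whose meta descriptions are near-duplicates.
import itertools
from difflib import SequenceMatcher

import requests
from bs4 import BeautifulSoup

def meta_description(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    meta = soup.find("meta", attrs={"name": "description"})
    return meta.get("content", "").strip() if meta else ""

urls = ["https://www.example.com/a", "https://www.example.com/b"]  # placeholders
descriptions = {u: meta_description(u) for u in urls}
for (u1, d1), (u2, d2) in itertools.combinations(descriptions.items(), 2):
    if d1 and SequenceMatcher(None, d1, d2).ratio() > 0.9:
        print("near-duplicate descriptions:", u1, "vs", u2)
```

The same comparison works for titles; swap in soup.title.string.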

    The next criteria from the engines demands that websites "add value" to the content they publish, particularly if it comes from (wholly or partially) a secondary source. This most frequently applies to affiliate sites, whose re-publishing of product descriptions, images, etc. has come under search engine fire numerous times. In fact, we've recently dealt with this issue on several sites and concluded it's best to anticipate manual evaluations here even if you've dodged the algorithmic sweep. The basic tenets are:

    • Don't simply re-publish something that's found elsewhere on the web unless your site adds substantive value to users
    • If you're hosting affiliate content, expect to be judged more harshly than others, as affiliates in the SERPs are one of users' top complaints about search engines
    • Small things like a few comments, a clever sorting algorithm or automated tags, filtering, a line or two of text, or advertising do NOT constitute "substantive value"

    For some exemplary cases where websites fulfill these guidelines, check out the way sites like C|Net (example), UrbanSpoon (example) or Metacritic (example) take content/products/reviews from elsewhere, both aggregating AND "adding value" for their users.

    Last, but not least, we have the odd (and somewhat unknown) content guideline from Google, in particular, to refrain from "search results in the search results" (see this post from Google's WebSpam Chief, including the comments, for more detail). Google's stated feeling is that search results generally don't "add value" for users, though others have made the argument that this is merely an anti-competitive move. Whatever the motivation, here at SEOmoz, we've cleaned up many sites' "search results," transforming them into "more valuable" listings and category/sub-category landing pages, and have had great success recovering rankings and gaining traffic from Google.

    In essence, you want to avoid the potential for being perceived (not necessarily just by an engine's algorithm but by human engineers and quality raters) as search results. Refrain from:

    • Pages labeled in the title or headline as "search results" or "results"
    • Pages that appear to offer a query-based list of links to "relevant" pages on the site without other content (add a short paragraph of text, an image, and formatting that makes the "results" look like detailed descriptions/links instead)
    • Pages whose URLs appear to carry search queries, e.g. ?q=seattle+restaurants vs. /seattle-restaurants
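That last pattern is easy to scan for across a site's URL list. A small sketch; the parameter names are common conventions I'm assuming, not an exhaustive set:

```python
# Flag URLs whose query strings look like raw search results.
from urllib.parse import urlparse, parse_qs

SEARCH_PARAMS = {"q", "query", "s", "search", "keyword"}

def looks_like_search_results(url):
    params = parse_qs(urlparse(url).query)
    return bool(SEARCH_PARAMS & set(params))

for url in ["https://example.com/?q=seattle+restaurants",
            "https://example.com/seattle-restaurants"]:
    print(url, "->", looks_like_search_results(url))
```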
