Competing with the Big Names

by: Derek Croote

If you have ever been in an industry with long-established, big-name companies, then you know it is hard to beat them in the search engine rankings. You go to their pages to see why they rank so high, but they are not using the techniques you were told to use. Furthermore, the word you searched for appears only once in their body text, and not in their title tag, headings or meta tags. This can't be right! However, it is. The only way to compete with the big names is with links, frequently updated content and name recognition.

Long-established sites have received so many links over the years that they don't have to plaster their pages with certain keywords to rank highly.

Links can be hard to come by, especially if your site is new. There are countless ways to get them, but here are a few that will earn you high-quality links:

Writing articles

Reciprocal linking to relevant sites

Participating in forums

Frequently updating content

Creating free tools

Submitting to directories

Buying text links

Frequently updated content will generate a large number of one-way links, if people know about it. For people to find your compelling content, you have to get your site's name out there. If no one knows about your content, how will you get links? To get your name out there you have to:

Become the leader in your industry

Write press releases, articles and free ebooks

Use paid advertising

Start a newsletter/ezine and a blog

Target the right keywords

Beating the leading names may be difficult, but it is achievable. Gaining links and getting your name out there are key to competing with these powerful sites. With time and hard work, you will reach the top of the search engine rankings.

About The Author

Derek Croote is an SEO, web design and usability enthusiast. Derek is the webmaster of http://www.saratogalakesideacresassociation.org, a small homeowners association. You can reach him at [email protected].

This article was posted on April 16 by Derek Croote

How To Avoid The Google.com Duplicate Content Filter?

by: Christoph Puetz

More and more webmasters are building websites with publicly available content (data feeds, news feeds, articles). The result is duplicate content across the Internet. In the case of websites built on news feeds or data feeds, you can even find sites that match each other 100% (except for the design). Several copies of the same content do a search engine no good, so Google apparently decided to weed out some of this duplicate content in order to deliver cleaner, better search results.

Plain copies of websites were hit hardest. If a webmaster published the exact same content on more than one domain, all domains in question were eventually removed from Google's index. Many websites based on affiliate programs suddenly took a big hit in traffic from Google.com. Shortly after this started, the same complaints and stories appeared on webmaster forums, and when one and one were put together, a clear picture of the situation emerged: a duplicate content filter was being applied.

Duplicate content is not always bad and will always exist in one form or another. News websites are the best example; nobody expects those to be dropped from Google's index.

So, how can webmasters avoid the duplicate content filter? There are quite a few things webmasters can do to use duplicate content of any sort and still create unique pages from it. Let's look at some of these options.

1) Unique content on pages with duplicate content

On pages where duplicate content is used, unique content should be added. I do not mean just a few different words or a link/navigation menu. If you (the webmaster) add 15%-30% unique content to pages that display duplicate content, the ratio of duplicate content to the overall content of the page goes down. This reduces the risk of having the page flagged as duplicate content.
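As a rough illustration of that ratio, here is a quick sketch in Python; measuring "content" in word counts is purely an assumption of mine, since the actual signals a search engine uses are not public:

    # Hypothetical sketch: estimate the share of unique content on a page.
    # Word counts stand in for "content" here, which is an assumption.
    def unique_ratio(duplicate_words, unique_words):
        total = duplicate_words + unique_words
        return unique_words / total

    # A page with 800 words of syndicated feed content plus 250 words
    # of original commentary is roughly 24% unique:
    print(round(unique_ratio(800, 250) * 100))  # -> 24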

2) Randomization of content

Ever seen those 'Quote of the Day' boxes on some websites? They add a random quote to the page each time it loads, so every time you come back the page looks different. With just a few code changes, such scripts can be used for far more than displaying a quote of the day. With some creativity, a webmaster can use one to give the impression that pages are constantly updated and always different. This can be a great tool for preventing Google from applying the duplicate content filter.
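A minimal version of such a script, sketched here in Python, might look like this; the quote pool and the surrounding markup are illustrative only:

    import random

    # Illustrative snippet pool; in practice this could be any set of
    # quotes, facts or tips relevant to the site.
    QUOTES = [
        "Content is king.",
        "Links are votes.",
        "Fresh pages get crawled more often.",
    ]

    def random_quote():
        # Pick a different snippet on each request, so repeat visitors
        # (and crawlers) see slightly different page content.
        return random.choice(QUOTES)

    # Embed the result wherever the page template expects it:
    print("<div class='daily-quote'>" + random_quote() + "</div>")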

3) Unique content

Yes, unique content is still king. But sometimes you simply cannot avoid using duplicate content. That is alright, but how about adding unique content to your website, too? If the ratio of unique to duplicate content is well balanced, the chances that the duplicate content filter applies to your website are much lower. I personally recommend that a website offer at least 30% unique content (I admit I sometimes have difficulty reaching that level myself, but I try).

Will this guarantee that your website stays in Google's index? I don't know. To be most successful, a website should be completely unique. Unique content is what draws visitors to a website; everything else can be found elsewhere, and visitors have no reason to visit one particular website if they can get the same thing somewhere else.

About The Author

Christoph Puetz is a successful entrepreneur and international book author. Websites currently operated by Christoph are http://www.highlandsranch.us and http://www.smallbusinessland.com.

[email protected]

This article was posted on August 17 by Christoph Puetz