Robert Nyman has questions he’d love to have answered about SEO. I’m certainly not the person to answer them, but I can provide some commentary.
In particular, it’s nice to see people from the web standards community discussing search optimization. There’s no question that creating a website which applies web standards and the principles of accessibility also creates a nice landing spot for search engines. When you build accessibly, you remove barriers to access for search engines as well as users. Although accessibility and web standards are certainly not necessary for search engine success, they can be an excellent way to kickstart your campaign. New websites in particular are likely to benefit from the crawlability and easy navigation aided by conscientious construction.
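To make that concrete, here’s a minimal sketch of my own (the markup is hypothetical, not from Robert’s post) contrasting purely presentational markup with a semantic, accessible equivalent. The second version gives a screen reader and a search robot the same things to work with: a real heading, a navigable list of links, and alternative text.

```html
<!-- Presentational markup: nothing here tells a robot or a
     screen reader what the page is about -->
<div class="big-bold">Widget Reviews</div>
<img src="widget.jpg">

<!-- Semantic, accessible equivalent: the heading, list-based
     navigation, and alt text describe the content to users and
     crawlers alike -->
<h1>Widget Reviews</h1>
<ul id="nav">
  <li><a href="/reviews/">All reviews</a></li>
  <li><a href="/reviews/widgets/">Widget reviews</a></li>
</ul>
<img src="widget.jpg" alt="The Acme widget, shown from the front">
```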
Robert comments on the fact that there are a lot of shady SEO companies out there. It’s important to mention that, but it’s also important to acknowledge, as he does, that there are large communities of search marketers who won’t use those methods. A solid search marketing company will emphasize long-term results, and will therefore avoid techniques that leave your site open to future penalties.
He raises a number of interesting questions about how logical web design and implementation decisions can influence a search engine’s assessment of your site. Does hidden text which is part of a DOM (Document Object Model)-manipulable interface raise red flags? What about using <em> to highlight an entire block of text, such as an introductory paragraph or preamble? These may be sensible decisions, but they may also raise undesirable red flags with a search robot.
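To picture both cases, here’s a minimal sketch (my own hypothetical markup; the toggle function and ids are invented for illustration) of a script-driven show-and-hide interface. The answer text is legitimately hidden until a visitor asks for it, and the preamble is emphasized with <em>, yet a robot that only sees display: none and a block-level <em> might read either one as a red flag.

```html
<h2>Frequently asked questions</h2>
<p><em>All prices below include shipping within the continental US.</em></p>

<dl>
  <dt><a href="#a1" onclick="toggle('a1'); return false;">Do you ship overseas?</a></dt>
  <!-- Hidden by default; revealed by the visitor, not by a search robot -->
  <dd id="a1" style="display: none;">Yes, for an additional fee.</dd>
</dl>

<script type="text/javascript">
function toggle(id) {
  // Flip the element between hidden and shown
  var el = document.getElementById(id);
  el.style.display = (el.style.display === 'none') ? 'block' : 'none';
}
</script>
```

The href fallback keeps the link usable without JavaScript, which is exactly the kind of good-reason justification argued for below.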
My opinion is that search engines are out to find your content. If you’re providing unique, valuable content, they’ll persevere until they find and index it. If you’re using a particular technique for a good reason, you should be OK.
Should be OK. That deserves repetition: you may NOT be OK. Like I said, I’m not an authority. However, if you accept the basic premise that search engines want to find your unique information, it follows that they’ll discount red flags as long as they continue to find value at your site. Algorithmic decisions will always create conflicts, however. Whether your content outweighs the flags raised by your design decisions is an open question.
To generalize: if you’re making a decision on semantic grounds or to provide added functionality, you should be all right. If you’re making a decision because you’re trying to influence search engines, you should think again. A search engine should never be the sole reason for a coding decision.
Frank Meyer
I think that only content makes a website good or bad. If your website has a good theme and many backlinks, it will be indexed by the spiders and found by users…
Joe Dolson
Writing for search engines is a bit of a different beast: there’s a line to be walked between headlines which are interesting to people and strong for search engines, and headlines which are interesting to people but very weak for search engines. The thing to avoid is headlines which are weak for people but strong for search engines…
Your rule of thumb is pretty sound!
Mike Cherim
I agree, to a point. I never code for search engines, but when I create headings I try to write them with indexing in mind. I love being creative with my headings, but on the web that may not produce good results, so I save that for print work.
A safe rule of thumb, I think: if you think you’re playing games to trick the ’bots, then you probably are, and it may prove disastrous. Want good SEO? Write good content and add new content regularly. Make sure your site is accessible to the blind, choose headings and wording carefully so the site is properly descriptive (which is important on the web), promote the site so it is linked to, and the rest should just fall into place. Naturally.
Joe Dolson
I’ll have to admit that MSN would be my last choice of those three…but I’m sure it depends somewhat on what you’re searching for! You’re right that knowing exactly what could be a problem is very challenging — which is exactly why I’d recommend always having a solid justification for any action which doesn’t revolve around search engine behaviors.
Stevie D
It’s very difficult to know exactly what search engines will downscore a website for, given that they refuse to tell us! They claim that more and more weight is given to incoming links over the page itself.
Given how fickle the search engines can be (Google and Yahoo in particular), with pages jumping up and down the rankings every day for no apparent reason and no way of knowing when the next big launch will turn everything on its head, we have to be careful about how much time we invest in deliberate SEO techniques. As you say, the best strategy is to include relevant and unique content, and hope!
FWIW, my current search engine of choice is MSN, for consistently picking the most relevant sites.