Code:
{if $head_title}
<meta name="robots" content="index,follow">
{else}
<meta name="robots" content="noindex,follow">
{/if}
P.S.
I did not use {if $is_single_entry} because that won't catch static pages.
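If the single-entry check were still preferred, it could be extended to cover static pages explicitly. A sketch, assuming the staticpage plugin exposes a variable such as $staticpage_pagetitle — that variable name is an assumption here; verify what your Serendipity version actually provides:

Code:
{* Hypothetical variant: $staticpage_pagetitle is assumed to be set by the
   staticpage plugin on static pages; check your template's available variables. *}
{if $is_single_entry || $staticpage_pagetitle}
<meta name="robots" content="index,follow">
{else}
<meta name="robots" content="noindex,follow">
{/if}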
Code:
{if $head_title}
<meta name="robots" content="index,follow">
{else}
<meta name="robots" content="noindex,follow">
{/if}
I don't think I mentioned penalties. Anyhow, the noindex on the overview page still makes sense to me: having one chunk of content indexed only once seems reasonable. So the above code snippet can be interpreted as making sure the right page is being indexed (and later offered as a search result), ensuring that search engines recognize the original source of the information (as in "the entry", not some overview page), and ensuring that archive/list pages don't push the individual entry pages out of the result list.

johncanary wrote:
There is nothing like "available slots"! And there is no "Duplicate Content Penalty" if you run your blog
* in a natural fashion
* with original content
* without stealing from other sites
* without overloading it with advertisements
The only thing that happens is that Google hides result sets it believes to be redundant to a specific search. Those pages are then "hidden" behind the "More results" link.
Yeah, it should - but I don't see any benefit in returning archive pages as search results: all content therein is also located on a single page (which may link to more interesting, related articles). Except when the search query overlaps two blog entries, which are then returned together - but I don't know.

johncanary wrote:
Those category, monthly archive, and tag pages are just a different mix of the content. It can be those pages that get a higher ranking for a particular search term than a single entry page. Google knows how blogs work and it knows what archives, .... are about. It is very simple:
* The more pages you allow to be indexed,
* the more pages will be indexed,
* the more free traffic you will get.
Not with less LOC though - especially when you think about static pages.

johncanary wrote:
Don't limit your potential.
You could use the ROBOTS.TXT file to achieve the same effect more easily.
JohnCanary
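For comparison, a robots.txt that keeps crawlers out of the overview-style pages might look like this - the paths below are only examples and must be adjusted to your blog's actual permalink structure:

Code:
User-agent: *
Disallow: /archives/
Disallow: /categories/
Disallow: /plugin/tag/

One difference worth noting: robots.txt stops crawling of those URLs entirely, so links on the blocked pages are not followed at all, whereas the noindex,follow meta tag still lets search engines follow the links on those pages.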
Scratch that. After reviewing the intent and code of the plugin I had in mind, I really do not think it is the place to incorporate this concept, since it applies to so many different types of pages (entries, overviews, archives, static pages, and other plugin-generated non-entry pages).

Don Chambers wrote:
I look forward to any further input on this. I have an idea as to where this COULD be included in a plugin but will only do so if the concept has merit.
You are right, you didn't.

jhermanns wrote:
I don't think I mentioned penalties.
LOC?

jhermanns wrote:
Not with less LOC though - especially when you think about static pages

johncanary wrote:
You could use the ROBOTS.TXT file to achieve the same effect more easily.
JohnCanary
Or the overview page could "dominate" for some reason and push the article page out of the search results - and the article page contains links to related blog entries. That's what I tried to say.

johncanary wrote:
Having an overview page indexed simply gives you more chances that your blog turns up in some search results (SERPs) for some users. That's what I know.
How would an accessible site that is not very huge (like adobe.com) profit from submitting sitemaps? If you search for adobe (on Google), e.g., you see the effect of submitting a sitemap for a large site. But for regular sites I don't really see the use...

I vote for having these three functions in the S9Y core:
* Announcement (ping popular services)
* Trackback control
* Full pingback support (currently only in the development version)

johncanary wrote:
Instead of setting some pages to 'noindex', I would focus on:
* Providing a sitemap, which is very effective
* Getting as many inbound links from many different, relevant sources as possible. With this it makes much sense to concentrate on the "entries". That's the best for publicity. Publicity brings inbound links. Inbound links are the most effective SEO technique.
Lines of Code.

johncanary wrote:
LOC?

jhermanns wrote:
Not with less LOC though - especially when you think about static pages

johncanary wrote:
You could use the ROBOTS.TXT file to achieve the same effect more easily.
JohnCanary
Sitemaps reviewed:

jhermanns wrote:
How would an accessible site that is not very huge (like adobe.com) profit from submitting sitemaps? If you search for adobe (on Google), e.g., you see the effect of submitting a sitemap for a large site. But for regular sites I don't really see the use...
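For reference, a Google sitemap is just an XML file following the sitemaps.org protocol, and even a small blog can submit one. A minimal example (the URL and date are hypothetical):

Code:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/archives/1-first-entry.html</loc>
    <lastmod>2007-01-01</lastmod>
  </url>
</urlset>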
Code:
{if ($view == "entry" || $view == "plugin") && $smarty.server.REQUEST_URI|truncate:17:"" != "/daily/plugin/tag"}
<meta name="robots" content="index,follow" />
{else}
<meta name="robots" content="noindex,follow" />
{/if}
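The truncate:17:"" modifier in the snippet above cuts REQUEST_URI down to its first 17 characters (the length of "/daily/plugin/tag") with no ellipsis appended, so every URL under the tag plugin matches. Adapted to a different path - the path below is only an example, and the truncate length must always equal the length of the compared string:

Code:
{* Example adaptation: "/daily/categories" is also 17 characters long.
   Adjust both the path and the truncate length to your own URL layout. *}
{if ($view == "entry" || $view == "plugin") && $smarty.server.REQUEST_URI|truncate:17:"" != "/daily/categories"}
<meta name="robots" content="index,follow" />
{else}
<meta name="robots" content="noindex,follow" />
{/if}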
Code:
{if $head_title}
<meta name="robots" content="index,follow">
{elseif $startpage}
<meta name="robots" content="index,follow">
{else}
<meta name="robots" content="noindex,follow">
{/if}
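Since the $head_title and $startpage branches emit the same tag, the snippet above can be merged into a single condition without changing behavior:

Code:
{* Same behavior, one condition: index the start page and titled pages,
   noindex everything else. *}
{if $head_title || $startpage}
<meta name="robots" content="index,follow">
{else}
<meta name="robots" content="noindex,follow">
{/if}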