Hello !
I would like Google to index my entries and only my entries! Not the category pages, the author pages, the month pages, etc., because this causes the same text to appear several times in a Google search.
Is there an easy way to do that?
Thanks, Fabien
Google indexation
Fabien Chabreuil (blog)
Re: Google indexation
Create a robots.txt and exclude the pages you don't want to be indexed.
For example, this is the one I use for my weblog (it may be the one that ships as the default).
Code:
Sitemap: http://bernd.distler.ws/sitemap.xml.gz
User-agent: *
Disallow: /feeds/
Disallow: /bundled-libs/
Disallow: /deployment/
Disallow: /docs/
Disallow: /htmlarea/
Disallow: /include/
Disallow: /lang/
Disallow: /plugins/
Disallow: /sql/
Disallow: /templates_c/
Disallow: /*.php$
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.tar$
Disallow: /*.tgz$
Disallow: /*.sh$
Disallow: /*.zip$
Disallow: /*.tpl$
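As a sanity check, Python's standard-library `urllib.robotparser` can tell you whether a given URL would be blocked by a set of rules. A small sketch (note that the stdlib parser only implements plain prefix matching, so it will not evaluate Google's `*` and `$` wildcard patterns used above; the directory rules behave as expected):

```python
from urllib import robotparser

# A subset of the directory rules from the robots.txt above.
rules = """User-agent: *
Disallow: /plugins/
Disallow: /templates_c/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Blocked: falls under Disallow: /plugins/
print(rp.can_fetch("*", "http://example.org/plugins/foo.php"))      # False

# Allowed: no rule matches an entry page
print(rp.can_fetch("*", "http://example.org/archives/1-entry.html"))  # True
```

This is handy for verifying your rules before deploying them; Google Search Console also offers a robots.txt tester that does understand the wildcard syntax.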
Re: Google indexation
Thank you Bernd for your answer. So I have to write something like
Disallow: /myblog/authors/
Disallow: /myblog/categories/
Disallow: /myblog/archives/2014/ (one line per year)
Is this correct?
Thanks, Fabien
Fabien Chabreuil (blog)
Re: Google indexation
I am not a robots.txt expert, but I think these exclusions are correct.
One thing you have to ensure: robots.txt must be in the root of your domain, so you may have to change the paths in my example if you have s9y in a subfolder such as /weblog/.
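To make that last point concrete, here is a sketch of how the entries-only rules from earlier in the thread might look when s9y is installed in a subfolder /weblog/ (the paths are illustrative; adjust them to your actual install):

```
User-agent: *
Disallow: /weblog/authors/
Disallow: /weblog/categories/
Disallow: /weblog/archives/2014/
Disallow: /weblog/plugins/
Disallow: /weblog/templates_c/
```

The file itself still lives at http://yourdomain/robots.txt in the document root, not inside /weblog/ — only the rule paths gain the subfolder prefix.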