[p2pu-dev] robots.txt

Ricardo Soares botequilha at gmail.com
Tue May 15 09:22:35 UTC 2012


Well, basically you already pointed out the big pros and cons.
IMO, the patterns will not change that much, so there's really no need
to rely on Django to serve robots.txt. You can add robots.txt to the git
repo and create a symlink to the file wherever you want, so you don't
have to upload the file whenever it changes.
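
Something along these lines should do it; the paths below are only
placeholders and I haven't tested it against your exact setup, so treat
it as a rough sketch:

    # symlink robots.txt from the git checkout into a path Apache can read
    # (paths are placeholders; the directory needs Options FollowSymLinks):
    #
    #   ln -s /path/to/checkout/robots.txt /var/www/robots.txt
    #
    # then, inside the existing vhost, map the URL straight to that file so
    # the request never reaches Django (and is unaffected by the locale prefix):
    Alias /robots.txt /var/www/robots.txt
    <Location /robots.txt>
        Order allow,deny
        Allow from all
    </Location>

With that in place, a git pull on the server is enough to pick up changes
to robots.txt, with no separate upload step.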

Cheers,
Ricardo

On Mon, May 14, 2012 at 10:26 AM, Dirk Uys <dirk at p2pu.org> wrote:

> On Mon, May 14, 2012 at 11:16 AM, Ricardo Soares <botequilha at gmail.com>
> wrote:
> > Hi Dirk!
> >
> > I hope it's not too late for a suggestion :)
> > Regarding robots.txt, I usually leave that work to Apache by adding an
> > Alias entry to the vhost file.
>
> Not a bad idea. I guess you avoid the dependency on django-robots,
> save some overhead and avoid locale prefix issues. On the other hand,
> the file needs to be uploaded to the server whenever it changes.
>
> What are the big pros and cons of this in your mind?
>
> Thanks!
> d
> _______________________________________________
> p2pu-dev mailing list
> p2pu-dev at lists.p2pu.org
> http://lists.p2pu.org/mailman/listinfo/p2pu-dev
>

