Old 01-09-2010, 13:32   #68
Rchivist
Inactive
 
Join Date: Apr 2008
Posts: 831
Rchivist has a fine set of Quads
Re: TalkTalk tracking you, phorm?

Quote:
Originally Posted by Chris
If someone breaches our T&Cs, they get warned. If they do it repeatedly, they get 'banned' - that is, their posting rights are permanently revoked and we actively seek to prevent them from re-registering. What we don't do is attempt to prevent them from accessing all those parts of the site which are publicly viewable without registration.

The fact that a website is published means that the publisher implicitly accepts that certain things may be done with it. The question here is whether the creation of user accounts with additional privileges is analogous to the use of certain lines of code to attempt to control the behaviour of other, automated internet systems.

Our membership system requires active human involvement and specific agreement to a set of conditions, in order to gain access to parts of the site that are inaccessible otherwise. I don't see that as being much of a comparison with a line of code that asks a spider or other web cataloguing system not to record certain content when that content is visible and not protected by any password.

Of course, you might then want to argue that the code in question amounts to a password protection against web-crawling systems, and that by ignoring it, the operator of that system is effectively 'hacking' your site. Personally I can't see a judge going for that.
Thanks. I think that all makes my point quite nicely. Of course we are talking about YOUR T&Cs, aren't we, so the things you don't do aren't really relevant to MY case. The fact is, you have terms and conditions and you intend to enforce them.

Some of your comments above suggest you are not aware of the basics of the dispute between certain websites and TalkTalk.

I suggest that you read up on it again, because the things you are saying above clearly indicate some mistaken assumptions about the actions certain website owners are taking - especially when you make reference to what judges might or might not "go for".

The TalkTalk members' forum should give you a grasp of the basics, and it is open for anyone to read.
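Since the dispute hinges on what you dismiss as "a line of code that asks a spider ... not to record certain content", it's worth being concrete about the mechanism. The sketch below is purely illustrative - the forum address and the Disallow rule are invented, and it assumes the directive in question is an ordinary robots.txt exclusion - but it shows, using Python's standard urllib.robotparser, how a well-behaved crawler is supposed to ask before it takes. An ill-behaved one simply never asks.

Code:
# Illustrative only: the site address and rule are made up for the example.
# A robots.txt telling crawlers to keep out of /members/ looks like this:
#
#     User-agent: *
#     Disallow: /members/
#
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://www.example-forum.co.uk/robots.txt")  # hypothetical site
rp.read()

page = "http://www.example-forum.co.uk/members/index.php"
if rp.can_fetch("SomeCrawler/1.0", page):
    print("robots.txt permits fetching this page")
else:
    print("robots.txt asks crawlers to stay out - honouring it is voluntary")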

Another hypothetical question - how often and how comprehensively does your C/F site get scraped? If I were to use - say - the Firefox add-on Scrapbook Plus to download the content of the entire C/F site, twice a day? Using a dedicated server and a nice big fat terabyte-level hard drive?

What would you do about that? Let's assume for the sake of argument that I was not a member of C/F - just a commercial operator, compiling a marketing or malware database for which your site's content was a valuable component.

Any comparison to any actual scraping exercise carried out by any particular company is, of course, entirely accidental.

Or suppose I got tired of the screen scraping and decided on a DDoS attack instead. Would you act to prevent that sort of abuse?

I suspect a point would come at which you would enforce your rights. How?
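And just so "How?" isn't left hanging entirely: spotting either kind of abuse in your own server logs is not difficult. The sketch below is a rough illustration only - the log path is made up and it assumes a standard combined-format access log, so it says nothing about how C/F actually monitors anything - but a scraper pulling an entire site twice a day would stick out a mile in output like this.

Code:
# Illustrative only: hypothetical log path, standard combined log format
# assumed (client address is the first whitespace-separated field).
from collections import Counter

hits = Counter()
with open("/var/log/apache2/access.log") as log:   # hypothetical path
    for line in log:
        fields = line.split()
        if fields:                     # skip any blank lines
            hits[fields[0]] += 1       # count requests per client address

# The heaviest clients float straight to the top.
for ip, count in hits.most_common(10):
    print(f"{count:7d} requests from {ip}")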

Website owners have rights. They set Terms and Conditions, they give notice of them, and others can either abide by them or not access the sites. I think the way C/F cleverly puts it is like this:

Quote:
You acknowledge and accept that in certain circumstances we have the right to provide information to your ISP at our discretion following but not limited to severe disruption by you, the provision of illegal or abusive content, any attempts made by you to re-register after being banned, or for any other reason that involves you breaking these Terms and Conditions.


You will acknowledge that reproduction of material from this web site without prior written permission is strictly prohibited. All contributions to this site are also copyright of the site owner.


You acknowledge and accept that if you are banned from use of the CF web site, you will not in any way attempt to re-register using any other name or identity not known to us, or any other email address not known to us. If you do so, we will pursue the maximum penalty available under your Internet Service Providers Acceptable Use Policy.


You have the absolute right to free speech. If you find it impossible to abide by this document, then please feel free to contact one of the many good web hosting companies out there and set up an account, create your own discussion board and exercise that right to your hearts content.
It is quite clear that you take website owners' rights to protect their sites very seriously. The C/F Terms of Service show that. Or perhaps I should say - they suggest that YOU clearly take your OWN right to protect YOUR OWN site seriously. I'm not really very sure how you feel about MY right to protect MY sites.

TalkTalk are of course free NOT to visit my site, NOT to scrape it, NOT to download material from it, and NOT to impersonate their customers while visiting it. I've told them that very, very clearly. It's a very simple point, but one they are struggling to grasp - and they don't seem to be the only ones.