favor

Ted Mittelstaedt tedm at toybox.placo.com
Tue Feb 8 02:45:34 PST 2005



> -----Original Message-----
> From: owner-freebsd-questions at freebsd.org
> [mailto:owner-freebsd-questions at freebsd.org]On Behalf Of Anthony
> Atkielski
> Sent: Sunday, February 06, 2005 3:20 AM
> To: freebsd-questions at freebsd.org
> Subject: Re: favor
>
>
> Ted Mittelstaedt writes:
>
> TM> This is a bit of twisting of the definition of "site that is
> TM> public" in my opinion.
>
> The key distinction is between a venue to which access must be
> explicitly requested and one that a person can visit without any
> formalities.

How exactly do you visit it without any formalities?  What if a
site is coded with HTML that is indigestible to Netscape but
digestible to Internet Explorer?  Visiting it then requires
installing IE, which is a way of filtering out some of the people
who want to visit.

And if there is a guarantee that any request for access will be
automatically granted, then it really is no longer a request.
A request assumes that the requestee has to grant permission -
which assumes that the requestee has the authority and ability
to say no, to deny access.

If the site has an automatic program that immediately grants access,
then the requestee has no power to deny access - which means it's no
longer a request.

If you go to an HTTPS site that uses a self-signed certificate,
you must explicitly accept that certificate in order to install it
into your browser's root certificate store, if you do not want to
be prompted every single time you visit the site.  If you answer
No to the prompt asking whether to accept the certificate, then
your access is denied.

Yet HTTPS sites with self-signed certificates are considered
public sites.
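
To make that concrete, here is a minimal sketch in Python of the
two outcomes - the host name and certificate file are made up:

    import ssl
    import urllib.error
    import urllib.request

    URL = "https://self-signed.example.org/"  # hypothetical site

    # Against the default root store the self-signed certificate
    # is rejected outright - the equivalent of answering No.
    try:
        urllib.request.urlopen(URL)
    except urllib.error.URLError as err:
        print("access denied:", err.reason)

    # Access works only once the certificate has been explicitly
    # trusted - the equivalent of answering Yes and installing it.
    ctx = ssl.create_default_context(cafile="that-sites-cert.pem")
    print(urllib.request.urlopen(URL, context=ctx).status)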

Suppose I write a program that intercepts all your outbound mail.
Whenever you send an e-mail message to a subscription-only mailing
list that you are not subscribed to, the program holds your
message, sends in a subscription signup, answers the confirmation
response to activate the subscription, and then posts your
original message.  From your point of view that mailing list is
identical to a totally open, non-request mailing list.  If you run
this program on your PC and set it to not notify you when this
happens, then subscription lists become transparent: you will not
even know whether a list you post to is a subscription list or not.
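
A rough sketch of that logic in Python - every name and hook here
is hypothetical, just to make the mechanism concrete:

    import re

    def handle_outbound(msg, send, wait_for_reply, subscribed):
        # msg: the outbound email.message.Message; send(addr, text)
        # and wait_for_reply(addr) are stand-ins for hooks into the
        # local mail system; subscribed: lists already joined.
        list_addr = msg["To"]
        if list_addr not in subscribed:
            # Typical Majordomo/Mailman-style signup: mail the
            # -request address, then echo back the confirmation
            # token from the autoresponder's reply.
            req_addr = list_addr.replace("@", "-request@")
            send(req_addr, "subscribe")
            reply = wait_for_reply(req_addr)  # reply text, a str
            token = re.search(r"confirm\s+(\S+)", reply)
            if token:
                send(req_addr, "confirm " + token.group(1))
            subscribed.add(list_addr)
        # The original post now goes out as if the list were open.
        send(list_addr, msg.get_payload())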

And, do you really believe that spammers AREN'T doing this already?

> Asking people to register or subscribe in order
> to post to
> a forum is requiring an explicit request for access from participants.

No, it is not - not unless there's a real person reviewing those
requests, filtering them, and declining the ones they don't like.

> This creates a private venue with access limited only to those who have
> gone through the formalities.

Those formalities must be a bit more serious than a mailing list
autoresponder to be considered real authentication and a real
access request.

>
> This has to do with the "moral rights" of copyright holders to control
> the manner in which their work is presented. More practically,
> it has to
> do with the wish of some copyright owners to provide their content for
> viewing only under certain conditions beneficial to themselves (such as
> on a specific page with advertising).
>
> I think that linking should be permitted by default, unless
> the owner of
> a site specifically prohibits it.  This allows maximum
> flexibility while
> still affording protection to people who don't want deep links into
> their content.
>

My feeling is that if a site is extremely difficult to navigate
within - as many news sites are (cnn.com, etc.) - then this
encourages deep linking.  If the site owners don't want deep
linking then they can make their sites easier to navigate within.

I do at times deep link because of this, but I would prefer not
to, because quite often deep linking makes it harder to find
ancillary material that the user would want to see.  My experience
is that deep linking is normally counterproductive, and if people
did a decent job with their websites it wouldn't be an issue,
because nobody would deep link to them.

Putting a 5-minute Flash presentation on the index page of a site
is guaranteed to provoke deep linking, for example.

Also, if you do have a decent site, and still have problems with
referring sites deep linking to you and not changing those links
when you politely request them to do so, it's pretty easy to
replace the deep-linked page with an HTML page that redirects to
your site index.
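
For example, a minimal redirect page parked at the old deep-linked
URL (the path is made up) could be as simple as:

    <html>
    <head>
    <title>Moved</title>
    <!-- send stale referrers straight to the index -->
    <meta http-equiv="refresh" content="0; url=/index.html">
    </head>
    <body>
    This page has moved to the <a href="/index.html">site
    index</a>.
    </body>
    </html>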

As for the idea of giving any protection to a site holder who
wants to prohibit linking at all, that is, as you say, a can of
worms.  Probably best to let politeness handle this, and if the
referring site isn't polite about it, there are ways to retaliate.

> I think that search engines should respect exclusion policies as a
> matter of courtesy.

Do any not?  If they didn't, then their search results would not
carry as much value.
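
For reference, the usual way of declaring such an exclusion policy
is a robots.txt file at the site root - a made-up example:

    # robots.txt - keep crawlers out of these directories
    User-agent: *
    Disallow: /images/
    Disallow: /members/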

>
> On my site, I prohibit deep links, except for search engines under
> certain conditions.  I don't enforce this much, except on rare
> occasions
> when someone is pulling just an image off my site, wasting my bandwidth
> and effectively pirating the image (but whether or not such linking
> constitutes infringement is yet another can of worms).
>

If they are using it as a component of their site, then I think it does.

Ted


