rest/opacity

(Adapted from a post by RESTfather Roy Fielding on rest-discuss; used by permission. The context was a discussion about what Tim Berners-Lee meant by his so-called 'Opacity Axiom', specifically how it applies to REST.)

Roy on Opacity in REST

Historical Context

...In order to understand TimBL's design notes, you have to know about the context in which he is writing a response. In this case, various thoughts were recorded as the "Opacity Axiom" in response to a discussion about client behavior and the perceived need for URNs. It has long since been taken out of context and abused in various ways.

Also, keep in mind that TimBL's design note was written two years after Henrik and I [Roy Fielding] worked out the important bits of what would become the HTTP object model, later re-named REST to get away from OOM terms, about 18 months after we had similar discussions at MIT with TimBL and the rest of the W3C team, and more than a year after HTTP/1.0 was finished and HTTP/1.1 proposed. So, saying it was something that I "didn't really care to integrate so much" is missing the mark by quite a bit -- he is trying to describe our model to people who did not understand it.

Original Intent

The opacity principle, as actually used on the Web, refers only to the machine interpretation of request processing as being dependent on control data (e.g., hypertext anchors and message field names) rather than on metadata appearing within the URI. It is the same reason why we distinguish media types from data formats -- the fact that a string of bytes looks like angle tags doesn't mean we want to process it as HTML. Ignoring any semantically significant data in a URI allows operations on a resource to be orthogonal to identification of the resource.

REST does include the opacity axiom in the original sense of that phrase. I did not use it by name in REST because it isn't a principle at all -- opacity is just a name TimBL used for the set of constraints around URI processing by clients (a byproduct of the constraints that you will find in REST). The principle involved is orthogonal design.

Applicability to Clients

"Opacity of URI" only applies to clients and, even then, only to those parts of the URI that are not defined by relevant standards. Origin servers, for example, have the choice of interpreting a URI as being opaque or as a structure that defines how the server maps the URI to a representation of the resource. Cool URIs will often make a transition from being originally interpreted as structure by the server and then later treated as an opaque string (perhaps because the server implementation has changed and the owner wants the old URI to persist). The server can make that transition because clients are required to act like they are ignorant of the server-private structure.

Clients are allowed to treat a URI as being structured if that structure is defined by standard (e.g., scheme and authority in "http") or if the server tells the client how its URI is structured. For example, both GET-based FORM actions and server-side image map processing compose the URI from a server-provided base and a user-supplied suffix constructed according to an algorithm defined by a standard media type.
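
As a hedged illustration of the GET-form case (the action URL and field names below are made up), the HTML media type tells the client to compose the request URI from the server-provided action and the url-encoded, user-supplied fields:

    from urllib.parse import urlencode

    # Server-provided base (the form's action attribute) and the
    # user-supplied field values; the URL and field names are hypothetical.
    action = "http://example.org/search"
    fields = {"q": "uri opacity", "lang": "en"}

    # For GET-based forms, the HTML media type defines this composition:
    # the action, then "?", then the urlencoded field names and values.
    request_uri = action + "?" + urlencode(fields)
    print(request_uri)   # http://example.org/search?q=uri+opacity&lang=en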

The Bottom Line

Note, however, that some people have taken the mere title of "opacity" and assumed that it meant URIs should not have meaningful construction at all. TimBL's axiom doesn't say that and neither does REST.

Summary by Jon Hanna

(extracted from the microformats-rest mailing list; headers added later)

Opacity Option

I think it's important that URIs *can* be treated opaquely - you can link to them, you can cache their resources without processing the URIs beyond naive string-matching, you can process RDF graphs, again without going beyond naive string-matching, and so on.
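
As a small illustration of that opaque treatment, a cache can key responses on the exact URI string without parsing it at all (the fetch callable here is a stand-in for any transport):

    # Treating URIs opaquely: the cache key is the URI string itself,
    # compared only by exact string match, never parsed or canonicalised.
    cache: dict[str, bytes] = {}

    def get(uri: str, fetch) -> bytes:
        if uri in cache:          # naive string-matching, nothing more
            return cache[uri]
        body = fetch(uri)         # 'fetch' is a hypothetical transport call
        cache[uri] = body
        return body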

Non-Opacity Option

This does not preclude agents from constructing or parsing URIs in more sophisticated ways; it just means that agents don't have to, especially agents that are not aware of the specific purpose of the application.

Further Clarification by Roy

Not Quite

Er, not quite. The key is that the server is free to tell the client that there does exist structure in a given URI-space, and then the client is free to make use of that knowledge in future requests. That is how server-side imagemaps worked -- HTML says that the src URI is structured such that appending "?X,Y" to that URI, where X and Y are non-negative integers, corresponds to points on a map that can respond to future GET requests.
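
A minimal sketch of that imagemap rule, with made-up values: the client may append "?X,Y" to the src URI only because the HTML media type says that structure exists.

    # Server-side imagemap: the media type (HTML) tells the client that
    # appending "?X,Y" to the image's src URI is meaningful.
    src = "http://example.org/map/world"   # server-provided base, hypothetical
    x, y = 120, 45                         # the user's click coordinates

    click_uri = f"{src}?{x},{y}"
    print(click_uri)   # http://example.org/map/world?120,45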

Thus, one way for the server to tell the client that a given URI is structured is to provide the URI in a standard element of a standard media type that has been defined as such. Another is to include the URI in a response header field.
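
As a hedged sketch of the header-field route (the header name and template syntax below are invented for illustration, not taken from any standard), a client could learn about URI structure from a template carried in a response header and expand it for later requests:

    from urllib.parse import quote_plus

    # Hypothetical response header advertising URI structure; the header
    # name and the {query} template syntax are illustrative only.
    response_headers = {
        "X-Search-Template": "http://example.org/search?q={query}",
    }

    def build_search_uri(query: str) -> str:
        template = response_headers["X-Search-Template"]
        return template.replace("{query}", quote_plus(query))

    print(build_search_uri("uri opacity"))
    # http://example.org/search?q=uri+opacity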

Caveat to the Caveat

Note, however, that I am also one of the creators of the robots.txt standard. That particular scenario requires that the spider find out the constraints on a site prior to the very first access of a real hypertext link. It is too late to obtain that information by looking at header fields after a normal request is made. Adding a new method to HTTP was out of the question because it would have to be deployed before it could be used [OPTIONS could fulfill that purpose today, 12 years after robots.txt was defined]. Adding fields to DNS wasn't an option back then, either. Reserving /robots.txt remains the best option given the alternatives available.
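
For reference, Python's standard library can consult /robots.txt the way a spider does, reading the site-wide constraints before the first real link is followed (the host and user agent below are placeholders):

    from urllib.robotparser import RobotFileParser

    # A spider fetches the reserved /robots.txt URI before its first real
    # request, since no earlier response exists to carry such constraints.
    rp = RobotFileParser()
    rp.set_url("http://example.org/robots.txt")   # placeholder host
    rp.read()

    if rp.can_fetch("ExampleSpider/1.0", "http://example.org/private/page"):
        print("allowed to crawl")
    else:
        print("disallowed by robots.txt")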

Note, however, that no other "reserved URI" features have the same requirements as spiders, so it is still true that favicon and pics chose the wrong solution.

Final Comment By Dr. Ernie Prabhakar

This implies to me that the server could use a specific microformat to indicate that a given URL can be treated in a particular way. In the absence of such a microformat, clients should not make any such assumptions; further, servers should avoid requiring clients to understand that microformat in order to receive service.
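
A minimal sketch of how a client might honour such a convention, assuming a purely hypothetical rel value ("constructible") that no published microformat defines: the client only treats URIs as constructible under a base it has been explicitly told about, and treats everything else as opaque.

    from html.parser import HTMLParser

    # Hypothetical convention: a link marked rel="constructible" tells the
    # client it may compose URIs under that base; no real microformat
    # defines this value, it is purely illustrative.
    class ConstructibleLinks(HTMLParser):
        def __init__(self):
            super().__init__()
            self.bases = []

        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "a" and "constructible" in (a.get("rel") or "").split():
                self.bases.append(a.get("href"))

    parser = ConstructibleLinks()
    parser.feed('<a rel="constructible" href="http://example.org/items/">items</a>')
    print(parser.bases)   # ['http://example.org/items/']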

See Also

The treatment of URI opacity on the RestWiki: http://rest.blueoxen.net/cgi-bin/wiki.pl?RestAndUriOpacity