ruby gemcutter ports - how to handle fetching?

Peter Schuller peter.schuller at
Thu Dec 24 13:47:29 UTC 2009


I recently submitted two ports with the master site set to
http://gemcutter.org/gems/ because rubyforge did not have the .gem
files. The ruby community seems to be transitioning to gemcutter, so
it would be good if gemcutter gems were easily maintained in ports. I
suppose gemcutter should be added as a known master site (see patch
below), but the other problem is that I ended up adding to my port
Makefile:

   # the point is to avoid the default -A (do not follow redirects)
   FETCH_ARGS=     -pRr

While this works, it is not maintainable (what if fetch isn't used?
what if other FETCH_ARGS overrides are in effect? etc).
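For context, a minimal sketch of what such a port Makefile fragment
looks like (the port name and version are placeholders, not a real
port):

```make
# Hypothetical port Makefile fragment; names are illustrative only.
PORTNAME=	somegem
PORTVERSION=	1.0.0
CATEGORIES=	rubygems
MASTER_SITES=	http://gemcutter.org/gems/

# The default FETCH_ARGS include -A, which makes fetch(1) fail on the
# 302 redirect that gemcutter issues; override to drop it.
FETCH_ARGS=	-pRr
```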

The problem is that the official location of gemcutter gem downloads
returns (correctly) a 302 redirect - currently to Amazon S3. But the
default FETCH_ARGS contain -A, which tells fetch(1) not to follow
redirects, so the 302 is treated as a fetch error.

I understand that many times a 302 just indicates a broken site whose
redirect we do not want to follow. The question is: how is this
supposed to be handled in a maintainable way?


* Create FETCH_DISTFILE_REDIRECTS which, if set to yes, means that
redirects must be followed when fetching
(is a USE_* knob appropriate for this?).
* Have the ports framework add -A to the fetch arguments only if
FETCH_DISTFILE_REDIRECTS is not set to yes.
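The second bullet could be sketched roughly like this (the variable
name and the surrounding logic are assumptions for illustration, not
the actual ports framework code):

```make
# Hypothetical sketch for the ports framework: only pass -A
# (treat redirects as errors) when the port has not opted in to
# following redirects.
.if defined(FETCH_DISTFILE_REDIRECTS) && ${FETCH_DISTFILE_REDIRECTS} == "yes"
FETCH_ARGS?=	-pRr	# follow redirects
.else
FETCH_ARGS?=	-ApRr	# -A: fail on redirects (current behaviour)
.endif
```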

Thoughts? A patch for gemcutter follows (it will probably be mangled
by gmail). Should one also add the current direct destination as a
fall-back? In other words, although it is presumably an
implementation detail of gemcutter, should one add the S3 location to
the master sites in this case, so that fetching has a high chance of
working even if gemcutter is down (as long as S3 is up, which should
be reliable)?
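The fall-back idea would amount to something like this (the S3 URL
below is a made-up placeholder, since the real bucket location is an
implementation detail of gemcutter):

```make
# Hypothetical: list gemcutter first, then the direct S3 location as
# a fall-back. The second URL is an invented example, not the real one.
MASTER_SITES=	http://gemcutter.org/gems/ \
		http://example-gems.s3.amazonaws.com/gems/
```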

---	2009-12-24 13:53:17.958746367 +0100
+++	2009-12-24 13:56:57.472840650 +0100
@@ -458,6 +458,11 @@


/ Peter Schuller

More information about the freebsd-ports mailing list