OT: Software's Orderly-Chaos (Re: build option survey)
bsd at afields.ca
Mon Aug 1 15:29:48 GMT 2005
OK.. time to let loose: arbitrary comparison of computer
science to religion exception caught. Theory to follow. =)
Keywords: BSD, Linux, life, chaos (Yes, I have a chaos theory :), disk
encryption, file systems, physics, corn chips
On Sun, Jul 31, 2005 at 03:23:47PM +0200, Poul-Henning Kamp wrote:
> I'm working on a script which will survey the various build options
> (NO_FOO, NO_BAR etc) and what their impact is on the installed system
> The table shows size of the total installworld image, including
> the GENERIC kernel.
> Delta is the number of blocks compared to a full installworld, and
> no, I'm not quite sure yet why some of them take more space.
> Number of extra files and number of eliminated files (these link
> to a "mtree -f -f" output file so you can see which they were).
> When done chewing, all the options will be tested for their effect
> during "installworld only", "buildworld only" and "build+installworld"
> When completed, this script will end up under src/tools/tools
> PS: Interestingly, this survey has theological implications as well
> as it proves the long held belief that C++ is evil: The NO_CXX
> option which prevents the installation of this evilness eliminates
> exactly 666 files :-)
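The mtree diff at the heart of such a survey could be sketched roughly like this (a hypothetical toy, not PHK's actual script; paths and inputs are made up):

```python
# Toy sketch of the build-option survey: compare the file list of a
# baseline installworld against one built with an option like NO_CXX.

def read_mtree_paths(mtree_output):
    """Extract file paths from simplified mtree-style output lines."""
    paths = set()
    for line in mtree_output.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            paths.add(line.split()[0])   # first field is the path
    return paths

def survey(baseline, variant):
    """Report files eliminated by (and extra files in) a variant build."""
    base, var = read_mtree_paths(baseline), read_mtree_paths(variant)
    return {"eliminated": sorted(base - var), "extra": sorted(var - base)}

# Tiny fabricated example inputs:
baseline = "./usr/bin/c++ type=file\n./usr/bin/cc type=file\n"
variant = "./usr/bin/cc type=file\n./usr/bin/newtool type=file\n"
result = survey(baseline, variant)
# result["eliminated"] -> ["./usr/bin/c++"]
# result["extra"]      -> ["./usr/bin/newtool"]
```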
Which reminds me: at the local Desktop Linux conference I attended,
my pizza lunch came to exactly 6.66 CAD. (I'd recommend a Brio w/
pizza.) Since by the nature of my existence, I choose not to draw
an inference on this fact correlated to a specific project at project
granularity, I'll let you be the judge of further meaning.
Just thought I'd let you know anyway..
> Poul-Henning Kamp | UNIX since Zilog Zeus 3.20
> phk at FreeBSD.ORG | TCP/IP since RFC 956
> FreeBSD committer | BSD since 4.3-tahoe
> Never attribute to malice what can adequately be explained by incompetence.
By the way, OLS was every bit as good as BSDCan: lots of good,
informative presentations, etc. Is it a coincidence that both the
BSD Kernel dev. and Linux Kernel developer summit were held in
Ottawa? (I tried to sneak in, but the quite friendly USENIX people
at the door asked me for my name and turned me back -- better social
engineering required -or- I could just write some important kernel
code for next year.)
It must be one of those strange attractors alluded to in the physics /
Chaos theory. Some time I should take a trip to Europe, maintaining
symmetry in the migratory patterns.
So what's so mysterious/evil about 2/3 (.666...) anyway? Other than
that it's a rational number whose decimal expansion never terminates,
indicating dissonance between two small prime quantities (an infinite
stream of sixes). But then isn't 1/3 that much more evil? (threes forever! >:)
I've already got the holy grail, so this one shouldn't be that
difficult to puzzle out. (or there is always gg:666)
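For the record, long division makes the stream of sixes (and of threes, for 1/3) easy to see; a minimal sketch:

```python
def decimal_digits(num, den, n):
    """First n decimal digits of num/den (0 <= num < den), by long division."""
    digits = []
    rem = num
    for _ in range(n):
        rem *= 10                 # bring down the next "zero"
        digits.append(rem // den) # next digit of the expansion
        rem %= den                # carry the remainder forward
    return digits

decimal_digits(2, 3, 5)  # -> [6, 6, 6, 6, 6]  (sixes forever)
decimal_digits(1, 3, 5)  # -> [3, 3, 3, 3, 3]  (threes forever)
```

Because the remainder repeats (here it is the same every step), the digits repeat too, which is exactly what makes the number rational rather than irrational.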
OK so on the topic of evil: Chaos can seem pretty evil when it
overwhelms our best attempts at the higher order. So how about
Chaos as it pertains to Software?
I found it astonishing to note the lack of a Universal symbol
for Chaos in the Unicode mathematical table, yet Wikipedia has
one for us: http://en.wikipedia.org/wiki/Symbol_of_Chaos
I invented one of my own Saturday night, but since people are fixed on
the idea of expansion from the singularity / big bang as representative
of Chaos.. I guess that one sticks? I then tried in vain to merge the two:
Chaos == <---0---> ?
Interesting: and isn't that similar to the Kleene star from a distance?
A very greedy match? Maybe this symbol needs to be simplified if
we are to eventually type it in 9-point font.
Software engineering is the art of overcoming Chaos in preservation
of the information order while supposedly achieving some end
either computationally or otherwise. The art of infinity, this is
to understand the nature of computation. The computational
nature of life that is.. ;)
"Spooky" action at a distance and the Quantum secrets. It's all
out there for us. Just more Crypto-Computational-Philosophical-
Religious-Technical-Analytical type stuff with deductions to draw
and upon which to build our theories. Very Neal Stephenson or Larry
Wall because everyday life starts to make technical sense and things
like corn chips suddenly start to relate to mankind's larger technical
pursuits. (The world is a complex place? But then it has to make
sense, how else would it be implemented?)
Note that there are a great number of books on these subjects
that you might have read or be interested in reading.
If you have any ongoing questions that you've puzzled over that
you think are worthy challenges, I'd be interested to see them.
I can't remember if/what I was supposed to ask you yet, but perhaps
you've been seeking some larger answer. Have you any unsolved
problems or conCERNs? I have been developing some Universe theories
and it just makes sense from a computational perspective. I'm also
listening to Weinberg's Dreams of a Final Theory (audiobook).
You're still not alone in recognizing the parallels, and they are
VERY applicable ... ;)
Simplicity, Generality, Clarity are probably the most religious
comp-sci ideas yet, hopefully *NIX can continue to get it right.
I know with talk of Plan 9 at OLS people are starting to become
more bent on bringing higher theoretical purity to the operating
system in what primitives are provided. Yet surprisingly, the
Linux community is still rather conservative as to what code
actually makes it into the kernel, to the exclusion of Reiser4,
it seems. That's not to suggest Linus should open the flood gates
to any and all code, wouldn't you also agree? There may be some
implementation-centric technical reasons for that, but Stephen
Tweedie doesn't buy it yet. :)
So what can be learned from this resistance (without commenting on
the validity of the logic)? Should we all be using FUSE, and
should it be ported to BSD? Then why not stop using filesystems
altogether and just use databases mapped to raw disks, going through
VFS to userspace and finally back through VFS?
There are reasons that people don't care for some of the ideas
presented in these seemingly abnormal, borderline-*nix cases that
push the envelope. But can they be made kosher with the rest of
the normative cases, i.e., clean in implementation?
Abstract concepts can be drawn upon in finding answers - these could include:
Completeness (0..N Generality)
Generality / Recombination
Applied at multiple scopes. In the kernel the reason your ioctl
reform makes sense from some theoretical angle is indeed because
it offers generality and closure (of interfaces). These concepts
are areas of traditional software design that could seek improvement.
The most ingenious elements of current-day *NIX are the abilities
to simplify and recombine primitives of operation on files. But a
whole new phase/generation of function could be introduced without
disrupting the theoretically minimalist underpinnings of the operating
system (i.e. unnecessary code).
Regulator: Given a pool of Chaos, the creator as the regulator
assembles based on the patterns and stored memory available to them.
The act of creation is an act of feedback between the system and
the creator. Software developers are creative so the structility
of each developer/author/committer and their decisions determines
the extent to which software becomes chaotic at multiple scopes.
But also the structility of the collective in cooperation to deliver
the combined entity of software is dependent upon many factors.
Scope: The importance of scope is key in regulating chaos: even
today scope seems to hide that which could be done more efficiently.
That which can be common but isn't or that which must be divergent
but seeks some unification or common infrastructure.
We have not yet exercised the full extent of the magic upon scope.
Repetition: there are two types of repetition; necessary and unnecessary.
The necessary brings redundancy, the unnecessary brings evil as in Spam
or tasks which require continual repetition.
At each scope of life some repetition can be helpful, but absent that
which holds these entities commonly divergent, the repetition is simply
unnecessary.
In an ideal world there is no need for repetition. Software isn't
always ideal and forks both in scope and solution space (at common
scope) bring much developer challenge. Some challenge leads
to insight while other challenge leads to drudgery and futility.
I commented on how the BSD forks seem to paradoxically offer new
opportunities for development. http://afields.ca/bsdcan/2005/reflection
I'm still waiting to update the quotes, a few of which are incorrect.
I'll rsync an updated version of the reflections, with more info and
check marks in the TODO boxes, hopefully. I've started in on a Wiki,
as it seems the only logical format for quickly recording my ideas
and keeping them up to date.
If some architecture were so cleanly coordinated that everything
was done exactly once at one scope (no duplication between device-level
and vnode-level subsystems), then it would all "just fit". But take the
contrary case, where we are trying to implement fine-grained disk
encryption to supplement the device-level kind: how do we keep all the
encryption in a single layer only? It makes sense instead to use
the common kernel crypto infrastructure and build on that at multiple
layers, unless we simply choose one layer or the other, with its various
trade-offs.
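To make the layering concrete, here is a toy sketch of one shared transform reused at both the device level and the file level. XOR stands in for a real cipher, and every name here is hypothetical; this is not the actual kernel crypto framework's API:

```python
# Toy sketch: a single shared "crypto core" reused by two stacked layers,
# instead of each layer rolling its own encryption.

def xor_transform(key: bytes, data: bytes) -> bytes:
    """Symmetric placeholder transform shared by every layer (NOT real crypto)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

class DeviceLayer:
    """Whole-'disk' encryption below the filesystem."""
    def __init__(self, key):
        self.key, self.blocks = key, {}
    def write(self, blkno, data):
        self.blocks[blkno] = xor_transform(self.key, data)
    def read(self, blkno):
        return xor_transform(self.key, self.blocks[blkno])

class FileLayer:
    """Fine-grained per-file encryption stacked on top of the device layer."""
    def __init__(self, dev, file_key):
        self.dev, self.key = dev, file_key
    def write(self, blkno, data):
        self.dev.write(blkno, xor_transform(self.key, data))
    def read(self, blkno):
        return xor_transform(self.key, self.dev.read(blkno))

dev = DeviceLayer(b"devkey")
f = FileLayer(dev, b"filekey")
f.write(0, b"secret")
assert f.read(0) == b"secret"       # round-trips through both layers
assert dev.blocks[0] != b"secret"   # at rest, doubly transformed
```

The point of the sketch is only that both layers call the one shared transform; neither duplicates the crypto logic, which is the "build on common infrastructure at multiple layers" option described above.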
Structility/Crystallinity (The structility of the regulator): This is
important to BSD and is a traditional hallmark of *BSD. IMO, BSD
should make it practical and simple, where it makes solid rational
sense, but too much conservatism would stifle the potential for
improvement.
The crystal regulator can separate and recombine in derivation of
the purest form while maintaining fundamental qualities.
Unification: Maintain the strength of Unix (while emphasizing the
UNI-upon-*). Perhaps we got it backward; maybe it was supposed to be
UNI* instead of *NIX. The age of the holy unification might bring
us an advantage in the Open Source world, proving incorrect those
who wish for failure and naysay on the grounds that software is
bedeviled by its very exponential complexity/growth. This
would imply a type of paradoxical software development: spiritually
motivated reductionism/refactoring (a trend toward this would be
the emergence of a new movement.)
Reductionistic software development: The software author that
seeks not to add new features but only to bring closure, order
and general calmness to the existing solution space. One who
seeks to enhance the logical where the process of creation has
left its chaotic signatures in the amorphous elements of the
incarnation. It would be a refinement of the software design which
would improve, further strengthen, or rebuttress the walls and add
that so-desired strength back into the more unsettled areas of the
OS. FreeBSD needs this not because it has failed, but because it has
areas in which improvements can be found that weren't available
at the genesis of the creation. Linux definitely needs
this, and even GNU needs this.
I've got a few projects in mind. But really, I'll have to start
spooling some of these ideas off, 'cause the queue is severely
backed up and the bottleneck is _me_ (in my mortal incarnation).
I continue to focus on what is obtainable in the theoretical (abstract) and
more pragmatic elements (concrete), and the quest marches onward..
I have ideas, but it's the barrier between the abstract and concrete.
More...: Even all these ideas are a subset, expedited specifically
for the purposes of software contemplation.
So I need a Concurrent-Distributed-Process-Forking-Exec thingy.
Since we are cellular automata of a sort, I can simply toss it
into the ether and hope that some pattern matches occur resulting
in construction of the ideas within the digital-upon-physical realm
by an entity of similar, specific disposition.
Some ideas, it seems, we cannot yet fathom, as the concrete incarnation
lags behind the abstract, and it has been a challenge to practically
convey these to those who, understandably, just want to maintain the
current stability/state of the kernel, filesystem, etc. But these
ideas should be SIMPLE.
I'm convinced these are sufficiently important concepts that if we
don't solve the fundamental disorder we'll end up with the current
situation of userspace bloat and systems upon systems in compensation.
Surprisingly, this may raise the barrier to entry in
scientific computing. And why can't we easily use the pool of knowledge
available on the Internet? Ted Nelson might know why, and it's also
a problem in the OS space. So, enter the age of nanophasic computing?
I think eventually BSD/Linux developers and companies like Google
might show an interest. But missteps (why system architecture
matters) will only set us all back (as a race), and it's all about
what makes the most rational sense for the OS, the core libraries,
the userland: the technology.
Have a happy non-denominational (belated) comp-sci/*NIX religious day..