[OT] Why did "enterprise" crap seem to win? (re: Virtualization, Java, Microsoft, Outsourcing, etc.)...

Rod Person rodperson at rodperson.com
Thu Oct 5 15:42:15 UTC 2017


Polytropon,

I just want to also thank you for your comments. They hit so many nails
on the head and made me feel better about the despair I feel toward
the direction of corporate IT.  I really want to send this thread out
to everyone in my department, but I know it will fall on deaf ears. But
again, thanks for letting me know I'm not as crazy as the rest of my
co-workers think!



On Thu, 5 Oct 2017 06:29:19 +0200
Polytropon <freebsd at edvax.de> wrote:

> Allow me a few comments. It will be mostly statements from my
> very limited individual point of view and experience.
> 
> 
> On Wed, 4 Oct 2017 10:10:04 -0400, Alejandro Imass wrote:
> > Why did C++ and Java seem to win over C and shared object libraries?  
> 
> Java is the new COBOL, 'nuff said.
> 
> No, actually not.
> 
> Mastering C requires a proper understanding of how things work
> at a quite low level, and it also requires the ability to
> search for (and find!) modular components to incorporate into
> their own projects (because not everyone wants to write an XML or
> CSV parser in C on their own). Those skills aren't taught at
> university or in vocational schools anymore, so C became a
> "magic language" which only old people and "nerds" can use
> today.
> 
> So a self-amplifying mechanism has been built: Projects require
> the use of Java, and because they do, nobody learns C. And
> as there is nobody who can write C, those responsible for
> projects suggest Java.
> 
> Nobody ever got fired for suggesting Java.
> 
> 
> 
> > In the Unix model the general philosophy is precisely that it is made
> > up of tiny little things that do one thing really, really well and
> > are able to operate with other tiny things by means of a common
> > protocol such as pipes etc.  
> 
> On the other hand, there is no problem creating LOB software
> entirely in C. Sure, this is hard, _really_ hard, but possible.
> 
> 
> 
> > What more encapsulation than shared objects, header files, and C
> > structs? if you want higher level there is for example, Perl with an
> > elegant XS interface to C which can in turn leverage all the
> > underlying O/S. Not only that, Perl (amongst other great "scripting"
> > langs) is multi-paradigm, allowing you to write software that
> > tackles problems in different ways, which is much clearer than
> > the square-peg-in-a-round-hole approach of single-paradigm
> > languages such as Java.  
> 
> When running on a Unix system, programs can also use omnipresent
> external tools, for example a "report generator" dynamically
> created as an awk script. Java may be a multi-OS choice ("write
> once, run everywhere"), but the systems it actually runs on may
> not have such tools...
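> 
> To illustrate the idea (the file name and the column layout are
> made up here), such a "report generator" can be nothing more than
> a small awk script fed through an ordinary pipe:
> 
>     # sum the second CSV column per value of the first column,
>     # e.g. amounts per department from some program's output
>     awk -F, '
>         { total[$1] += $2 }
>         END {
>             for (key in total)
>                 printf "%-20s %10.2f\n", key, total[key]
>         }
>     ' report-data.csv
> 
> Any program that can print CSV to stdout gets a "reporting
> backend" for free - no library, no framework, just tools that
> are already part of every Unix system.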
> 
> 
> 
> > And only after 30 years or so, do you start seeing some functional
> > paradigms come to Java and new languages over the JVM such as Scala.
> > When in fact these things have been with us for ages since Scheme,
> > Perl, etc. ? and these integrate beautifully with the underlying
> > O/S.  
> 
> In marketing, it's important to tell people things are new, even
> though they aren't. ;-)
> 
> 
> 
> > Why did the industry degrade these languages as "scripting languages"
> > when in fact this is precisely what you want: simplicity, elegance
> > and tight integration with the O/S!  
> 
> Because they are only seen as "programming glue" and highly
> discouraged in the world of monolithic "one thing does all"
> applications.
> 
> 
> 
> > So why did the Java Virtual Machine concept win, instead of
> > leveraging the underlying O/S? Was portability the only siren
> > song?  
> 
> This often-mentioned portability is mostly nonsense. As soon
> as you use some OS-specific hooks in your Java program - and
> you _do_ -, portability is gone. "This CRUD CRM only works
> with this one database subsystem which only runs on 'Windows'."
> 
> 
> 
> > or was it
> > the promise of a language to protect us from the mediocre
> > programmer ?  
> 
> No. While you can write bad programs in any programming language,
> some languages encourage writing shit more than others. Just
> have a look at "The Daily WTF" and their CodeSOD (code snippet of
> the day) articles. Guess what languages you'll find? Mostly
> Java and C#, the "business languages" of today.
> 
> 
> 
> > What is the point of portability if it doesn't really leverage the
> > underlying O/S? I personally think portability is not only a myth
> > (i.e. Java is not as portable as people may think) but actually
> > pretty stupid; it's what you DON'T want.  
> 
> Users don't want to run their software on a changing OS; they
> want a "solution", i. e., a monolithic thing where they have
> someone to blame and to sue if things break.
> 
> Improving things means change, and nobody wants to change.
> 
> 
> 
> > What you really want IMHO is a fine-tuned architecture that not
> > only plays well with the underlying O/S but that actually leverages
> > the O/S, which makes it NOT portable by definition. Why do we want
> > portability in the first place? Does portability foster competition
> > and innovation or just makes everybody mediocre at best? Does it
> > increase security or performance? NO, it's actually the total
> > opposite!  
> 
> You can easily see the limits of portability even for applications
> written exclusively for "Windows": do the update, break the program.
> I sadly have to deal with such crap on a weekly basis... :-/
> 
> Sometimes you'd say certain platforms (hardware and OS) are
> better suited for specific jobs. This is the reason why so much
> computing is still done on mainframes, or why hardware control
> is done on things that are not regular PCs. Those platforms are
> different for a reason, but choosing the right platform can be
> hard when "qualified consultants" (quotes deserved) suggest you
> do it on a "Windows" PC.
> 
> 
> 
> > Code reusability is mostly bullshit too, and what you wind up with,
> > AGAIN, is piles upon piles of crap, wheel re-invention and
> > duplication of effort.  
> 
> Definitely true.
> 
> And add interoperability to the list. You have a program that
> needs data A and does X, Y, and Z. And you also need to do V and W,
> but that's done with another program, where you also have to enter
> data A. You cannot transfer the data once acquired: no interface,
> no standard exchange format.
> 
> 
> 
> > A quick look at the Java ecosystem is enough to show this
> > is fundamentally flawed. Even the simplest Java application winds up
> > bloated with megabytes (sometimes gigabytes) of crap in there that
> > is not doing anything but hampering performance and opening up all
> > sorts of security holes.  
> 
> This can be easily compensated with hardware resources. :-)
> 
> 
> 
> > The same goes for the Windows world and C++ where
> > it gets even worse as every application you install is able to
> > overwrite critical system libraries.  
> 
> Again, this is fully intended and a "feature". A "security feature",
> if I remember correctly. :-)
> 
> 
> 
> > Why did virtualization and outsourcing (AWS) seem to win over
> > concepts such as chroot and FreeBSD Jails and a good old sysadmin?  
> 
> You just said it: "old". Administration of a Unix system requires
> skills. Those are probably too expensive when you can easily take
> any person who is "in IT" (read: sits at a desk with a PC on it),
> and advertising tells you that it's so easy, no knowledge needed,
> does everything on its own, just click on the funny pictures and
> make money.
> 
> 
> 
> > Why do we
> > really need full virtualization in the first place? Does it help in
> > performance or security? Does it reduce costs? On the contrary it
> > does neither, at least IMHO.  
> 
> An interesting approach is to run programs under a dedicated user.
> That "chroots" the process to the respective $HOME while it is
> still able to benefit from the OS's infrastructure and services.
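> 
> A rough sketch of that approach on FreeBSD - the user name, the
> home directory and the program are made up for the example:
> 
>     # create an unprivileged user with its own $HOME
>     pw useradd reportgen -m -d /home/reportgen -s /bin/sh
> 
>     # run the program as that user, limited to what it may touch
>     su - reportgen -c '/home/reportgen/bin/generate-reports'
> 
> The process still sees cron, syslog, pipes and all the other OS
> services, but any damage stays within that one user's permissions.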
> 
> 
> 
> > HTF did we get such aberrations as Docker on an already virtualized
> > environment such as AWS??? I mean WTF is that? really? Why pile up
> > all those layers of shit to get what you could get with real
> > hardware running FreeBSD and EzJail?  
> 
> With solutions like Docker and immutable images, you are basically
> back to the past: Software loaded from ROM modules, immune to change,
> and everything is present in one big block: the OS, the libraries,
> the application program.
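> 
> For comparison, the jail route mentioned above is a handful of
> commands on FreeBSD (the jail name and address are made up):
> 
>     pkg install ezjail
>     sysrc ezjail_enable=YES
>     ezjail-admin install               # populate the base jail
>     ezjail-admin create appjail 192.0.2.10
>     ezjail-admin start appjail
>     ezjail-admin console appjail       # get a shell inside the jail
> 
> One kernel, one copy of the base system, and the application is
> still isolated - no extra hypervisor, no extra image format on top.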
> 
> 
> 
> > I write these reflections at a time when all these security breaches
> > such as Yahoo's 3 billion account breach and Equifax's 145 million
> > account (and much more serious) breach are happening, and the
> > situation will only get worse because of all the piles upon piles
> > of shit that most companies run on.  
> 
> Nobody cares. The end user pays the bills anyway.
> 
> 
> 
> > So how did we get here? Why does industry insist on complicating
> > stuff instead of a complete back to basics approach?  
> 
> Because there are parasites that benefit from the status quo. Those
> are companies that offer "solutions" which only work with heavy
> vendor lock-in - otherwise the customer could move to a more
> efficient solution that costs less. In order to avoid this,
> extensive marketing and PR are being performed to convince the
> decision-makers.
> Always keep in mind that those who make the decisions about what
> software to use aren't the ones who are forced to use that software
> (and deal with crashes, missing features, multiple work, etc.).
> Add long-running contracts with penalty fees, and everyone is
> happy. As I said, the end user pays.
> 
> 
> 
> > Why is the answer to
> > these problems more and more outsourcing and throwing even more
> > piles of crap and wasting money in the hope that it will fix the
> > fundamentally broken systems we have in place? What we need is a
> > radical back to basics approach to software engineering.  
> 
> This is due to the belief that you don't need to build anything
> yourself; you just take what's already there (because it's promised
> to be "the solution"). So no more thinking is needed. Every layer
> of abstraction, be it 3rd party OCO libraries, Internet-based
> services using an API, or simply handing off the work to a cheap
> shop (the cheapest one is just good enough) adds more crap. But
> as I mentioned, this can be compensated by machine power. :-)
> 
> 
> 
> > Well at least not ALL industry. I think Apple made the right choice
> > when they decided to embrace NeXT (which is heavily based on and
> > inspired by FreeBSD) and Objective-C from the start, creating
> > highly-tuned systems that are not only NOT portable in software
> > terms but not even portable in hardware terms!  
> 
> With this approach, Apple wanted to be able to offer a hardware and
> software combination (!) that they verified to work. Control over
> hardware makes software creation much easier, as you know your
> "moving parts" and don't have to introduce guesswork and abstractions
> for hardware. Apple's primary goal is not to sell an OS or a
> computer - they want to sell an "experience" where many things
> add up to the product.
> 
> 
> 
> > Pardon the rant and have a wonderful day.  
> 
> I'd like to expand on one thing I briefly mentioned, because I
> think it could be important for the process:
> 
> With articles like "Why can't programmers... program?" that
> question fundamental knowledge that "we" (skilled and experienced
> "old people") take for granted, the question comes up: Why are
> there no programmers who want to do good work?
> 
> The reason is simple: Good work doesn't pay.
> 
> This is for two reasons:
> 
> Reason 1 is related to taxes. If you want to employ a really good
> programmer, this will cost you money. You don't get one for minimum
> wage. But personnel costs are bad on your balance sheet, as you
> cannot deduct those costs. So what do you do? Turn personnel
> costs into material costs, i. e., get consultants and external
> services who finally write you an invoice which can be deducted.
> Sure, they are often more expensive (!) than your own programmer,
> but hey, everyone does it that way! And now that the contract has
> been signed, there is no way back. Three rules justify all those
> actions: (1st) "We've always done it that way." (2nd) "We've
> never done it that way." (3rd) "Then anybody could do it."
> 
> (The previous paragraph applies extremely in Germany.)
> 
> Reason 2 is education. As I said, it requires a structured mind
> and a passionate heart to learn and use (!) languages like C or
> systems like Unix. From the beginning, you're not "business-y",
> because that would be "Windows" and Java and C#. Copypasting from
> the Internet without understanding what you're doing and still
> getting something that "somehow works, sometimes" seems to be
> sufficient to get a job today. Even when graduates fail in a
> simple FizzBuzz interview, hey, they are fresh out of university,
> so they must be good, right? A degree and lots of certificates
> open many doors, skills do not.
> 
> You'd be surprised how many programmers you could find in the
> "jobless" section, but they'll probably never get a job again
> even though they educate themselves. This is because they often
> don't have access to an important resource: money.
> 
> Let me explain: In order to get "monkey see, monkey do" jobs in
> IT where you don't need a fully functioning brain because you
> just press buttons when the bell rings, you need to be certified.
> This costs lots of money. But because it is being paid for, the
> result is easy to predict: No knowledge needed, just be visible
> when the courses are being held; pay your money, get your shiny
> sheet of paper. In many cases, employers send their underlings
> to such certification courses just to make the customers pay
> more ("But our consultants are certified!"). Only with money,
> you get access to this world. If novice programmers aren't
> lucky, they'll always stay on minimum wage, even though their
> skills and experience are better than those of the HPCs who
> tell them what to do.
> 
> I've seen more than enough of those "qualified consultants"
> who make boatloads of money, but are stupid as hell. They
> are, in fact, excellent actors and influencers...
> 
> Reason 3 - three! three reasons! - is social appeal. Just imagine
> you actually understand your work and do it better than others;
> then you're not a "team player", because you "make the other
> branch offices look bad and lazy". That's why improving things
> is not encouraged.
> 
> Example from reality: I once worked for a shop where one person
> got a high wage for creating timesheet templates. Manually. She
> basically entered years, months, days and hours manually into a
> PC as if it was a typewriter. With additional microformatting.
> Every day. Every month. For years. When one of her coworkers
> suggested automating the task, basically by piping the output
> of "ncal" into a little awk script (he was a Linux person),
> he got yelled at for being "hostile to coworkers" and "threatening
> the staff". That's what you get when you use your brain.
> 
> PC on, brain off.
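> 
> Just to show how little was being asked for, something in the
> spirit of the following would have done the job (the exact
> template format is of course made up here):
> 
>     # one line per weekday of October 2017, with empty time columns
>     ncal -h 10 2017 | awk '
>         NR > 1 && $1 != "Sa" && $1 != "Su" {
>             for (i = 2; i <= NF; i++)
>                 printf "%s %2d   start: _____  end: _____\n", $1, $i
>         }
>     ' | sort -n -k 2
> 
> Years of manual typing, replaced by a few lines of shell.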
> 
> 
> 
> 
> 
> 
> Sorry for the rant. :-)
> 
> 
> 
> 



-- 
Rod

http://www.rodperson.com

For we are both ready to believe the thing we want to believe, and also
hope other people feel what we feel ourselves.

  Julius Caesar
    The Civil War, book II, 27
     Expedition of Curio to Africa


