Programming has changed a lot since the days of punched cards and
unruly mechanical behemoths. However, programmers still have a lot on
their minds. So, anything that reduces the mental load is hailed as a
fantastic improvement in the computing world. Let’s contemplate the
evolution of programming for a moment.
Life began with assembly, where every instruction was painstakingly
detailed right down to the hardware specs. Then came the first
high-level languages and their compilers. Suddenly, you didn’t need to
hand-craft the math and data-moving operations; you could express a
more complex idea with broader logical strokes. Once personal
computers proliferated, programmers faced the daunting task of writing
code for multiple platforms. The dividing lines were drawn. You were
a Mac dev or a PC dev, but not both without a whole lot of duplicate
work. So virtual machines rose up to ease the coder’s burden. By
having software simulate a common platform, programmers could focus on
writing once and running everywhere.
Time rolled on and languages grew smarter, with more built-in memory
management and improved error catching. Code failed more gracefully
and products became more resilient. Life was getting easier for
developers, or at least it should have been. But the truth is that
modern systems are built upon several layers of technological
complexity. Many newer technologies rely upon even older code. And
should that old code harbor old, overlooked security flaws, “modern”
sites fall.
Computers keep getting faster but developers do not. Human cognitive
limits constantly rein in efforts to slingshot us into the fantastic
tech Hollywood keeps trumpeting as the world of tomorrow. We’re often
stuck because writing code is still a tough business on the whole.
Software is brittle, insecure and often slower than it should be. In
fact, there is a rather large chasm between how fast things can run
and how fast they do simply because developers rely on the speed of a
computer to make up for the lack of attention to optimization. And
there are now so many layers of abstraction sweeping the complexity
under the rug that the most sensible thing to do when the computer
drags its feet is to hit reset.
Welcome to the mantra of the 21st century: Turn it off and on again.
Is this really progress?
A lot of industry folks are asking themselves what the next step for
computer development is. What do we arm developers with in order to
take on the complexity challenge today? How can fewer developers be
more productive? One camp proclaimeth that code visuals manifested in
real time shall be the next best thing since sliced bread. But I don’t
think that’s the whole story. Although I’ve always been a fan of
visual representation, I don’t think it’s the magic bullet for
software. Visuals aid developers in understanding code, yes. But the
truth is that writing and maintaining code requires a lot of
supporting tasks to ensure that reliable, maintainable projects serve
their end users well.
Any sizeable, living, breathing software project requires
documentation, unit testing, bug tracking and creative devotion to
maintain top performance. There’s such a large swath of tasks to take
care of that
it hardly seems fair to expect one person to hoist them all onto their
shoulders. Often projects suffer from growing pains as minor
weaknesses turn into major structural issues. Quality gets dicey,
updates get late and people abandon broken code as layers of support
shift beneath them.
But we can do better. One lone developer can do better.
In fact, we should take a lesson from Castlevania.
(Symphony of the Night to be exact.) No one wants to storm the castle of
the undead without a little help. And that help came in the form of
the trusty familiars that accompanied Alucard during his heroic
rebellion. He had bats, skulls, faeries and other flying whatevers to
aid him in the fairly common task of storming a castle and rooting
out evil under the dark of night. I mean, someone has to do that job.
Why not write AIs to handle the multiple tasks of development? Why not
write more AI oriented code in the first place? Programs can become
collections of interoperating intelligent agents. I feel that this
is the natural progression in the long march of easing the developer’s
cognitive load and thus freeing their mind up for the more creative
aspects of coding. Building an entire virtual environment to better
handle and adapt to failures should lead to more robust code. It’s
time to roll out some powerful new automated assistants.
That’s the conclusion I arrived at, but the journey is fun to recap.
The notion began to dawn on me as I worked on the compiler
web series. I was in the process of distilling the steps to compile
open source software projects when I realized a lot of the knowledge
required was pretty much the same. You really just loop through a
cycle of attempts, problems and research until you arrive at a
polished executable binary file. If this knowledge could be
represented by a flow chart, why couldn’t I write a program to move
through the task itself? Truthfully, developers have created a
marvelous cacophony of tools for compiling software projects. I wanted a system
that could run some basic steps and look up the error codes that
inevitably result from failed compilation attempts. I began to
speculate what it would take to have an AI “learn” the
process and guess possible command sequences.
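That attempt–problem–research cycle can be sketched as a simple loop. Here is a minimal sketch in Python, assuming a hand-built knowledge base of known error patterns; the `KNOWN_FIXES` entries and the `attempt_build` helper are invented for illustration, not a real tool:

```python
import subprocess

# Hypothetical knowledge base mapping a known error snippet to a fix-up
# command worth trying next. Real entries would come from research/lookup.
KNOWN_FIXES = {
    "No such file or directory: 'configure'": ["autoreconf", "-i"],
}

def attempt_build(commands, max_attempts=5):
    """Loop through attempt -> problem -> research until the build succeeds."""
    queue = list(commands)
    for _ in range(max_attempts):
        if not queue:
            return True  # every step succeeded
        result = subprocess.run(queue[0], capture_output=True, text=True)
        if result.returncode == 0:
            queue.pop(0)  # step done, move to the next one
            continue
        # "Research": scan the error output for a known pattern.
        fix = next((f for pattern, f in KNOWN_FIXES.items()
                    if pattern in result.stderr), None)
        if fix is None:
            return False  # unknown error: a human (or smarter AI) steps in
        queue.insert(0, fix)  # try the fix before retrying the failed step
    return False
```

The interesting part is the lookup step: a smarter familiar would replace the static dictionary with something that actually learns from past failures.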
I often ponder what I need to do to pull off all the big projects of
my dreams. So, I began to ask myself how I would run a large software
development project. Would I make unit tests? I could see their
value, but I wasn’t keen on having to write all that supporting code
just to keep a system maintainable. I couldn’t deny the necessity,
but it sure felt like something was wrong with having to spend extra
time on code whose only job is to help maintain other code. Automatic
analysis, testing and updating of supporting resources could be
accomplished by AI driven tools.
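As a toy example of what that automated analysis could look like, here is a minimal sketch; the `audit_module` helper is hypothetical, and it only flags public functions missing docstrings, the kind of chore a small "analysis familiar" could own:

```python
import inspect

def audit_module(module):
    """Report public functions that lack docstrings -- a tiny analysis familiar."""
    missing = []
    for name, obj in inspect.getmembers(module, inspect.isfunction):
        if not name.startswith("_") and not inspect.getdoc(obj):
            missing.append(name)
    return missing
```

A fuller familiar would go further: generate stub tests for the flagged functions, file tracker entries, or nag the developer at commit time.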
I want my code to be robust, malleable and an intelligent aide in its
own life cycle. I want my AI Familiars to aid me in all the different
aspects of code analysis and development. I want smarter code and a
smarter development cycle. I want the next step in the evolution of
coding. It’s time to summon my familiars and find out what tomorrow
brings.
 Gross simplification here. Linux, embedded systems and other OSs are
all very valid, but people generally understand what I mean when I say
Mac/PC.
 I could even release a variant online to let others test/teach the system.
 I’m speaking about the Shellshock bug. Old Bash code opened up a can of worms.