Once the nouns and verbs I need in my architecture start to solidify, I look at organizing them across multiple
dimensions. I tend to think of the noun/verb organization exercise in the physical
terms of surface area and moving parts. By "surface area" I mean
the sheer size of the model. I freely admit that page
count is a crude-sounding measure for a software architecture, but I
have found over the years that the total size of the document
required to adequately explain the architecture is an excellent proxy
for its total cost of ownership.
It is vital, for a good
representation of a software architecture, that both the data side and the computation side are
covered. I have seen many architectures where the data side is
covered well but the computation side has many gaps. This is the
infamous “and then magic happens” part of the software
architecture world. It is most commonly seen when there is too much
use of convenient real-world analogies, e.g. thematic modules that
just snap together like jigsaw/Lego pieces, data layers that sit
perfectly on top of each other like the layers of a cake, objects that
nest perfectly inside other objects like Russian dolls, and so on.
When I have a document
that I feel adequately reflects both the noun and the verb side of the architecture, I
employ a variety of techniques to minimize its overall size. On the
noun side, I can create type hierarchies to explore how nouns can be
considered special cases of other nouns. I can create relational
decompositions to explore how partial nouns can be shared by other
nouns. I will typically “jump levels” when I am doing this, i.e.
I will switch between thinking of the nouns in purely abstract terms
(“what is a widget, really?”) and thinking about them in
physical terms (“how best to create/read/update/delete
widgets?”). I think of it as working downwards towards
implementation and upwards towards abstraction at the same
time. It is head-hurting at times, but in my experience it produces
better practical results than the simpler stepwise refinement
approach of moving incrementally downwards from abstraction to concrete
implementation.
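To make the noun-side techniques concrete, here is a minimal sketch in Python (the Widget, PremiumWidget and Address names are purely invented for illustration): a type hierarchy captures the idea of one noun being a special case of another, and factoring out a shared partial noun captures the relational decomposition idea.

```python
from dataclasses import dataclass

# A shared "partial noun": several nouns carry the same address fields,
# so the address is factored out once rather than repeated in each noun.
@dataclass
class Address:
    street: str
    city: str

# The general noun...
@dataclass
class Widget:
    widget_id: str
    ship_to: Address

# ...and a special case of it, adding only what genuinely differs.
@dataclass
class PremiumWidget(Widget):
    warranty_years: int = 3
```

At the physical level, the same decomposition is what I interrogate with the create/read/update/delete questions, rather than inventing storage shapes separately for every noun.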
On the verb side, I
tend to focus on the classic engineering concept of "moving parts".
Just as in the physical world, it has been my experience that the
smaller the number of independent moving parts in an architecture,
the better. Giving a lot of thought to opportunities to reduce the
total number of verbs required pays handsome dividends. I think of
it in terms of combinatorics: what are the fundamental operators I
need, from which all the other operators can be created by
combination? Getting to this set of fundamental operators is almost like finding the architecture inside the architecture.
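As a rough sketch of what I mean (the account noun and the credit, debit and transfer verbs here are hypothetical examples, nothing prescribed): two fundamental verbs, and a derived verb that is nothing more than a combination of them.

```python
# Hypothetical fundamental verbs over a simple "account" noun.
def credit(account, amount):
    return {**account, "balance": account["balance"] + amount}

def debit(account, amount):
    return {**account, "balance": account["balance"] - amount}

# A derived verb built purely by combining the fundamental ones:
# a transfer is a debit on one account compounded with a credit on another.
def transfer(accounts, src, dst, amount):
    accounts = dict(accounts)
    accounts[src] = debit(accounts[src], amount)
    accounts[dst] = credit(accounts[dst], amount)
    return accounts

accounts = {"a": {"balance": 100}, "b": {"balance": 0}}
print(transfer(accounts, "a", "b", 25))
# {'a': {'balance': 75}, 'b': {'balance': 25}}
```

The smaller the fundamental set, the fewer independent moving parts there are to reason about, test and document.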
I also think of verbs
in terms of complexity generators. Here I am using the word
“complexity” in the mathematical sense. Complexity is not a
fundamentally bad thing! I would argue that all system behavior has a
certain amount of complexity. The trick with complexity is to find
ways to create the amount required but in a way that allows you to be
in control of it. The compounding of verbs is the workhorse for
complexity generation. I think of data as a resource that undergoes
transformation over time. Most computation – even the simplest
assignment of the value Y to be the value Y + 1 – has an implicit time
dimension. Assuming Y is a value that lives over a long period of
time – i.e. is persisted in some storage system – then Y today is
just the compounded result of the verbs applied to it from its date
of creation.
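A minimal sketch of that idea (the particular verbs and the history are invented for illustration): the value of Y today is simply the result of folding its verb history over it, from the day it was created.

```python
# A hypothetical verb history for a long-lived, persisted value Y.
history = [
    ("create", 0),       # Y comes into existence with value 0
    ("increment", 1),    # Y = Y + 1
    ("increment", 1),
    ("scale", 3),        # Y = Y * 3
]

def apply_verb(y, verb):
    name, arg = verb
    if name == "create":
        return arg
    if name == "increment":
        return y + arg
    if name == "scale":
        return y * arg
    raise ValueError(f"unknown verb: {name}")

y_today = None
for verb in history:
    y_today = apply_verb(y_today, verb)

print(y_today)  # 6: the compounded result of every verb applied since creation
```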
There are two main
things I watch for as I am looking into my verbs and how to compound
them and apply them to my nouns. The first is to always include the
ability to create an ad-hoc verb “by hand”, by which I mean always
having the ability to edit the data in nouns using purely interactive
means. This is especially important in systems where downtime for
the creation of new algorithmic verbs is not an option.
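Continuing the sketch (again, the names are invented), the escape hatch can be as simple as being able to push a one-off, hand-written verb through the same machinery as the predefined ones:

```python
# Predefined, algorithmic verbs.
verbs = {
    "increment": lambda y, arg: y + arg,
    "scale": lambda y, arg: y * arg,
}

def apply(y, name, arg):
    return verbs[name](y, arg)

# The interactive escape hatch: a one-off verb written "by hand"
# (e.g. at a console), applied without deploying a new algorithmic verb.
def apply_ad_hoc(y, fix):
    return fix(y)

y = apply(10, "increment", 1)       # a normal, predefined verb
y = apply_ad_hoc(y, lambda v: 42)   # an interactive, one-off correction
print(y)  # 42
```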
The second is
watching out for feedback/recursion in verbs. Nothing generates complexity
faster than feedback/recursion, and when it is used, it must be used
with great care. I have a poster on my wall of a fractal with its simple
mathematical formula written underneath it. It is incredible that
such bottomless complexity can be derived from such a harmless-looking
feedback loop. Used wisely, it can produce architectures capable of highly complex behaviors but with small surface areas and few moving parts. Used unwisely.....
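To show how little it takes, here is one classic feedback loop of that kind (whatever the fractal on the poster happens to be, the familiar Mandelbrot iteration z -> z*z + c will do as an illustration): a one-line feedback rule with bottomless behavior behind it.

```python
def escape_time(c, max_iter=50):
    """Iterate z -> z*z + c and report how quickly the feedback loop diverges."""
    z = 0j
    for i in range(max_iter):
        z = z * z + c          # the entire "formula": the output fed straight back in
        if abs(z) > 2:
            return i           # diverged: c lies outside the Mandelbrot set
    return max_iter            # stayed bounded: c is (probably) inside it

print(escape_time(complex(-0.5, 0.5)))  # stays bounded -> 50
print(escape_time(complex(1.0, 1.0)))   # escapes almost immediately -> 1
```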