One of the major drivers in application design is infrastructure economics, i.e. the costs, in both capex and opex terms, of things like RAM, non-volatile storage, compute power, bandwidth, fault tolerance and so on.
These economic factors have changed utterly in the 35 years I have been involved in IT, but we still have a strong legacy of designs/architectures/patterns that are ill suited to the new economics of IT.
Many sacred cows of application design, such as the run-time efficiency of compiled code versus interpreted code, or the consistency guarantees of ACID transactions, can be traced back to the days when CPU cycles were costly, when RAM was measured in dollars per kilobyte and when storage was measured in dollars per megabyte.
My favorite example of a deeply held paradigm which I believe has little or no economic basis today is the design that only keeps a certain amount of data in online form, dispatching the rest, at periodic intervals, to offline forms, e.g. tape or disk, that require "restore" operations to get them back into usable form.
I have no problem with the concept of backups :-) My problem is with designs that only keep, say, one year's worth of data online. This made a lot of sense when storage was expensive, because the opex cost of occasional manual retrieval was smaller than the opex cost of keeping everything online.
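To put rough numbers on how much that trade-off has shifted, here is a back-of-envelope sketch in Python. Every figure in it is my own ballpark assumption (transaction size, daily volume, and a cloud-object-storage price of roughly 30 US cents per GB per year), not a measurement:

```python
# Back-of-envelope cost of keeping *all* transaction history online.
# All figures below are ballpark assumptions, not measurements.

TXN_SIZE_BYTES = 1000        # assumed size of one transaction record
TXNS_PER_DAY = 10000         # assumed daily transaction volume
YEARS = 10                   # how much history we refuse to delete
COST_PER_GB_YEAR = 0.30      # rough cloud object storage price, USD

total_gb = TXN_SIZE_BYTES * TXNS_PER_DAY * 365 * YEARS / 1e9
print("History size: %.1f GB" % total_gb)
print("Annual storage cost: $%.2f" % (total_gb * COST_PER_GB_YEAR))
# Roughly 36.5 GB and about eleven dollars a year - a rounding error
# compared to the opex of manual restore-from-offline workflows.
```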
I think of these designs as backup-and-delete designs. My earliest exposure to such a design was on an IBM PC with twin 5.25 inch floppy disk drives. An accounting application ran from Drive A. The accounting data file was in Drive B. At each period-end, the accounting system rolled forward ledger balances and then, after a backup floppy was created, deleted the individual transactions on Drive B. That was around 1984.
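For illustration only, here is roughly what that period-end logic looks like in Python. The names and data structures are mine; the original application is long gone:

```python
# Sketch of a 1980s-style "backup-and-delete" period-end close.
# Names and structures are illustrative, not the original program's.

def period_end_close(opening_balances, transactions, backup):
    """Roll transaction totals into the ledger balances, back up the
    detail, then delete it from online storage."""
    closing_balances = dict(opening_balances)
    for txn in transactions:
        account = txn["account"]
        closing_balances[account] = closing_balances.get(account, 0) + txn["amount"]
    backup(transactions)     # copy the detail to the backup floppy
    transactions.clear()     # ...then delete it from Drive B
    return closing_balances  # only the rolled-forward balances stay online
```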
As organizations identified value in their "old" data, for regulatory reporting or training or predictive analytics, designs appeared to extract that value. This led to a flurry of activity around data warehousing, ETL (extract, transform, load), business intelligence dashboards and so on.
My view is that these ETL-based designs are a transitional phase. Designers in their twenties working today, steeped as they are in the new economics of IT, are much more likely to create designs that eschew the concept of ever deleting anything. Why would you, when online storage (local disk or remote disk) is so cheap and there is always the possibility of latent residual value in the "old" data?
Rather than have one design for day-to-day business and another for business intelligence, regulatory compliance and predictive analytics, why not have one design that addresses all of these? Apart from the feasibility and desirability of this brought about by the new economics of IT, there is another good business reason to do it this way. Simply put, it removes the need for delays in reporting cycles and predictive analytics. I.e. rather than pull all the operational data into a separate repository and crunch it once a quarter or once a month, you can be looking at reports and indicators in near-realtime.
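As a minimal sketch of what I mean, assuming nothing more than an in-memory list standing in for durable storage, here is an append-only design in Python where the day-to-day balance query and the near-realtime reporting view are just two different reads over the same, never-deleted event log:

```python
from datetime import datetime, timezone

# Minimal sketch of a never-delete design: one append-only event log
# serves both operational queries and near-realtime reporting.

EVENTS = []  # the only write path is append; nothing is ever deleted

def record(account, amount):
    """Operational write: append, never update or delete."""
    EVENTS.append({
        "at": datetime.now(timezone.utc),
        "account": account,
        "amount": amount,
    })

def balance(account):
    """Day-to-day operational query, derived from the full history."""
    return sum(e["amount"] for e in EVENTS if e["account"] == account)

def activity_since(cutoff):
    """Near-realtime reporting/compliance view over the same event log."""
    return [e for e in EVENTS if e["at"] >= cutoff]
```

Swap the in-memory list for a durable append-only store and the shape of the design stays the same: derived views for operations, reporting and analytics, all computed from one ever-growing history.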
I believe that the time is coming when the economic feasibility of near-realtime monitoring and reporting becomes a "must have" in regulated businesses, because the regulators will take the view that well-run businesses should have it. In the same way that a well-run business today is expected to have low communication latencies between its global offices (thanks to the cheap availability of digital communications), it will be expected to have low-latency reporting for its management and for the regulators.
We are starting to see terminology form around this space. I am hopelessly biased because we have been creating designs based on never deleting anything for many years now. I like the terms "time-based repository" and "automatic audit trail". Others like "temporal database", "provenance system", "journal-based repository"... and the new kid on the block (no pun intended!): the blockchain.
The blockchain, when all is said and done, is a design based on never throwing anything away, *combined* with a trust-free mechanism that lets observers of the audit trail have confidence in what they see.
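The trust part can be sketched in a few lines of Python using hash chaining: each entry commits to the hash of the entry before it, so any retrospective edit breaks every subsequent link. (This is only the tamper-evidence idea; a real blockchain adds distributed consensus on top.)

```python
import hashlib
import json

# Sketch of a hash-chained audit trail: each entry commits to the one
# before it, so altering any past entry invalidates everything after it.

def append_entry(chain, payload):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"payload": payload, "prev": prev_hash}, sort_keys=True)
    chain.append({
        "payload": payload,
        "prev": prev_hash,
        "hash": hashlib.sha256(body.encode()).hexdigest(),
    })

def verify(chain):
    """Recompute every link; any tampering shows up as a broken chain."""
    prev_hash = "0" * 64
    for entry in chain:
        body = json.dumps({"payload": entry["payload"], "prev": prev_hash},
                          sort_keys=True)
        if entry["prev"] != prev_hash:
            return False
        if entry["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev_hash = entry["hash"]
    return True
```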
There is a lot of hype at present around blockchain, and with hype comes the inevitable "silver bullet" phase, where all sorts of problems not really suited to the blockchain paradigm are shoe-horned into it because it is the new thing.
When the smoke clears around blockchain, which it will, I believe we will see many interesting application designs emerge that break away completely from the backup-and-delete models of a previous economic era.
Monday, January 25, 2016
The biggest IT changes in the last 5 years: Multi-gadget User Interfaces
In the early days of mobile computing, the dominant vision was to get your application suite on "any device, at any time". Single-purpose devices such as SMS messengers and email-only messengers faded from popularity, largely replaced by mobile gadgets that could, at least in principle, do everything that a good old-fashioned desktop computer could do.
Operating system visions started to pop up everywhere, aimed at enabling a single user experience across a suite of productivity applications, regardless of form factor, weight and so on.
Things (as ever!) have turned out a little differently. Particular form factors, e.g. the smartphone, tend to be used as the *primary* device for a subset of the user's full application suite. Moreover, many users like to use multiple form factors *at the same time*.
Some examples from my own experience. I can do e-mail on my phone but I choose to do it on my desktop most of the time. I will do weird things like e-mail myself when on the road, using a basic e-mail sender, essentially to put something in my own in-box. (Incidentally, my in-box is my main daily GTD focus.) I can make notes to myself on my desktop but I tend to accumulate notes on my smartphone. I keep my note-taking app open on the phone even when I am at the desktop computer and often pick it up to make notes.
I can watch YouTube videos on my desktop but tend to queue them up instead and then pick them off one by one from my smartphone, trying to fit as many of them into "down time" as I can. Ditto with podcasts. I have a TV that has all sorts of "desktop PC" features, from web browsers to social media clients, but I don't use any of them. I prefer to use my smartphone (sometimes my tablet) while in couch-potato mode and will often multi-task my attention between the smartphone/tablet and the TV. I find it increasingly annoying to have to sit through advertising breaks on TV and automatically flick to the smartphone/tablet for ad breaks.
I suspect there is a growing trend towards a suite of modalities (smartphone, tablet, smart TV, smart car) and a suite of applications that, in practical use, have smaller functionality overlaps than the "any device, at any time" vision of the early days would have predicted. A second, related trend is the increasingly common use case where users wield multiple devices *at the same time* to achieve tasks.
Each of us in this digital world is becoming a mini-cloud of computing power, hooked together over a common WiFi hub or a Bluetooth connection or a physical wire. As we move from home to car to train to office and back again, we reconfigure our own little mini-cloud to get things done. The trend towards smartphones becoming remote controls for all sorts of other digital gadgets is accelerating this.
I suspect that the inevitable result of all of this is that application developers will increasingly have to factor in the idea that the "user interface" may be a hybrid mosaic of gadgets, rather than any one gadget, with some gadgets being the primary for certain functionality.