These days, I mostly post my tech musings on LinkedIn: https://www.linkedin.com/in/seanmcgrath/

Tuesday, January 05, 2016

The biggest IT changes in the last 5 years

The last time I sat down and seriously worked on content for this blog was, amazingly, over 5 years ago now in 2010.

It coincided with finalizing a large design project for a Legislative Informatics system and resulted in a series of blog posts attempting to answer the question "What is a Legislature/Parliament?" from an informatics perspective.

IT has changed a lot in the intervening 5 years. Changes creep up on all of us in the industry because they arrive, for the most part, as a steady stream rather than a rushing torrent. We have to deal with change every day of our lives in IT. It goes with the territory.

In fact, I would argue that the biggest difference between Computer Science in theory and Computer Science in practice is that practitioners have to spend a lot of time and effort dealing with change. Dealing with change effectively is, itself, an interesting design problem and one I will return to here at some point.

If I had to pick out one item to focus on as the biggest change, it would, without a doubt, be the emergence - for good or ill - of a completely different type of World Wide Web. A Web based not on documents and hyperlinks, but on software fragments that are typically routed to the browser in "raw" form and then executed when they get there.

I.e., instead of thinking about http://www.example.com/index.html as a document that can be retrieved and parsed to extract its contents, much of the Web now consists of document "wrappers" that serve as containers for payloads of JavaScript, which are delivered to the browser in order to be evaluated as programs.
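To make that concrete, here is a minimal sketch of what such a "wrapper" looks like; the element id and the text are purely illustrative:

    <!DOCTYPE html>
    <html>
      <head><title>App shell</title></head>
      <body>
        <!-- The document itself is nearly empty... -->
        <div id="app"></div>
        <!-- ...the real payload is a program, evaluated on arrival -->
        <script>
          document.getElementById('app').textContent =
            'This text was never in the HTML that was fetched.';
        </script>
      </body>
    </html>

Fetch that URL with a tool that does not run JavaScript and you get the empty shell; open it in a browser and you get the content.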

It can be argued that this is a generalization of the original Web, in that anything that can be expressed as a document in the original Web can be expressed as a program. It can be argued that the modern approach loses nothing but gains a lot - especially in the area of rich interactive behavior in browser-based user interfaces.

However, it can equally be argued that we risk losing some things that were just plain good about the original Web. In particular, the idea that content can usefully live at a distance from any given presentation of that content, and the idea that content can be retrieved and processed with simple tools as well as with big JavaScript-enabled browsers.
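A small sketch in that "simple tools" spirit, assuming Node.js 18 or later so that fetch is built in; the URL and the crude regular expression are illustrative only:

    // Treat the page as a document: fetch it and pull out the <title>.
    // This only works when the content is actually in the HTML,
    // not generated by scripts after the fact.
    fetch('http://www.example.com/index.html')
      .then(response => response.text())
      .then(html => {
        const match = html.match(/<title>(.*?)<\/title>/i);
        console.log(match ? match[1] : '(no title found)');
      });

Run the same approach against a script-generated page and all it retrieves is the wrapper.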

I can see both sides of it. At the time I did the closing keynote at XTech 2008 I was firmly in the camp mourning the loss of the web-of-documents. I think I am still mostly there, especially when I think about documents that have longevity requirements and documents that have legal status. However, I can see a role that things like single-page web apps can play. As is so often the case in IT, we have a tendency to fix what needed fixing in the old model while introducing collateral damage to what was good about the old model.

Over time, in general, the pendulum swings back. I don't think we have hit "peak JavaScript" yet but I do believe that there is an increasing realization that JavaScript is not a silver bullet, any more than XHTML was ever a silver bullet.

The middle way, as ever, beckons as a resting place. Who knows when we will get there? Probably just in time to make room for a new pendulum swing that is gathering pace on the server side: the re-emergence of content-addressable storage, which is part of the hashification of everything. I want to get to that next time.
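In case the term is unfamiliar, content-addressable storage is the idea that content is stored and retrieved under a name derived from a hash of its bytes, so identical content always has the same address. A toy sketch in Node.js; the put/get/contentAddress names are made up for illustration, not any particular system's API:

    const crypto = require('crypto');

    // The "address" of a piece of content is simply a hash of its bytes.
    function contentAddress(content) {
      return crypto.createHash('sha256').update(content).digest('hex');
    }

    const store = new Map();

    function put(content) {
      const address = contentAddress(content);
      store.set(address, content);   // storing identical content twice is a no-op
      return address;
    }

    function get(address) {
      return store.get(address);
    }

    const addr = put('Hello, world');
    console.log(addr);       // same bytes in, same address out, on any machine
    console.log(get(addr));  // 'Hello, world'

Git's object store is built on the same principle.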




3 comments:

Brant Watson said...

It's interesting to note that the current incarnation of things is a pretty wild set of shifts.

1. [Mainframes] - Multiple users access a central multi-user computer remotely
2. [Personal Computers] - Users each have their own dedicated computer
3. [PCs to Early Web] - Users have their own dedicated computer to run an application that accesses a central multi-user computer remotely.
4. [PCs to Modern Web] - Users have their own dedicated computer to run an application that accesses a central multi-user computer (or cloud of computers) remotely to retrieve an application that runs in the JavaScript virtual machine sandboxed within said application, which often in turn communicates back to the same computer or cloud of computers for additional data/requests.

WHEW. We're getting to a point where there are a lot of abstraction layers going on. The pendulum swung one way, back again, then back once more, got tangled up, melted and blended into some higher-level thing - what that will look like in the end I have no idea. The complexity therein is tremendous.

Sean McGrath said...

Yes, indeed. The complexity is tremendous. Ironically, way, way down the stack of most unix-flavoured machines (including Android devices?) is an X11 windowing system which allows applications to run on one machine while all the GUI/interaction stuff runs on another machine. It allows application developers to intercept mouse/keyboard events, paint lines and text etc. on remote machines... basically all the stuff that is now being added to HTML 5-style browsers umpteen levels further up the stack :-)

I think it is doubly ironic that the X11 windowing system had a "consortium" based at MIT whose role was to steward the technology, create standards etc. Sound familiar? :-)

A fun thing to figure out would be this: get a modern web app to draw a square using the canvas element, via JavaScript. Figure out how much CPU time is consumed before that request to draw a square gets turned into the corresponding X11 function calls that do the same thing, way, way down the call stack :-)
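For reference, the browser-level half of that experiment is only a few lines; a sketch, assuming the page contains <canvas id="c" width="200" height="200"></canvas> (the id and dimensions are illustrative):

    // Draw a square via the canvas 2D API.
    const canvas = document.getElementById('c');
    const ctx = canvas.getContext('2d');

    ctx.strokeStyle = 'black';
    ctx.strokeRect(50, 50, 100, 100);   // x, y, width, height

Tracing those few calls down through the browser, the toolkit and the display server to the eventual drawing primitives is the instructive (and sobering) part.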

Mano Marks said...

One of the biggest changes I've seen in the last few years is StackOverflow and the related StackExchange sites. They've clearly become THE place to go for developers, often coming up before official reference docs in a search.