
Tuesday, April 27, 2021

Linkedin

 These days, I mostly post my tech musings on Linkedin. 

https://www.linkedin.com/in/seanmcgrath/


Monday, March 30, 2020

Fully Virtual Legislative and Parliamentary sessions coming in 2020. Forty years ahead of schedule

It goes without saying that we live in interesting times at the moment. Never could I have imagined that the world would change utterly, so rapidly...

In times like this thoughts turn immediately to those most seriously affected. Fingers crossed the measures being taken at the moment will allow us to defeat this virus soon.

When we return to "normal" it is clearly going to be a new "normal". All over the world, organizations of all sorts are finding ways to function in fully virtualized environments. The word "Zoom" - already in the dictionary as a verb coming into this crisis - has acquired a new meaning in record time. There has never been a more intense focus on finding ways to get things done digitally. It is no longer hyperbole to say that our futures and our lives depend on it.

Legislatures and Parliaments are not immune to this new digital impetus. All around the world they are exploring ways to work fully digitally - decades ahead of when I had envisaged they would.

These institutions face some unique challenges in making this transition. It is not as simple as using conference calls for voice votes and switching on a few video conferencing cameras. Decades - indeed centuries - of precedents, statutory provisions and constitutional/charter provisions need to be complied with, so that continuity of the rule of law is maintained and legal authority is preserved.

Rules of Legislative Procedure such as Mason's provide a wealth of good guidance that functions regardless of whether a Session is being conducted "in person" or virtually. 77 of the 99 legislative chambers in the US use Mason's as the bedrock of their chamber rules. Mason's allows for the suspension of rules and, on the face of it, this appears to provide a lot of scope for US legislative chambers/houses to adapt quickly to working in a fully virtualized environment.

However, Mason's is also clear that its rules, or the rules in a House or Senate resolution or in the joint rules of a legislature, are all ultimately limited by whatever the constitution may require.

Simply put, the Constitution rules. And this is where the difficulty may lie in some cases. Most constitutions are old documents - some of them hundreds of years old. Some specifically state, for example, that legislative sessions are conducted "in person".

This has higher precedence than anything in the joint rules, the chamber-specific rules, or in Mason's/Jefferson's/Robert's etc. So, perhaps a change to the constitution is required in states where there is a constitutional requirement for "in person" meetings?

This is of course possible in all states, but I know of no state where the constitution can be changed quickly. In some states it takes a minimum of a full session to pass an amendment, plus a referendum - all of which takes time.

I am not a constitutional scholar, or indeed a lawyer for that matter, but it may be that the nature of legal language comes to the rescue here. I have written before at length about how I think about law and how it is not simply a large set of "rules". For example, What is Law? and also a series of blog posts starting here.

Simply put, my view is that the open-ended nature of legal language - which may appear to be a "bug" at first glance, especially if you are a software engineer seeking to create "rules" from it - is actually law's most brilliant feature.

Perhaps the time has come - in these extraordinary times - to revisit the interpretation of "in person" meetings to include "virtual" meetings in certain circumstances. I believe a lot of the groundwork for this already exists in the gradual adoption, over the last few decades, of digital technology - from faxes to digital signatures - in legally binding environments.

In one fell swoop, such an interpretation of "in person" would pave the way for fully virtual legislative sessions in states that have "in person" language in their constitutions. It is a matter for the courts obviously, as that is where the "interpretation" function lies, at least in common law jurisdictions.

I do not mean to suggest that there is some sort of magic wand to be waved here, but I do believe that with all the dedicated, talented legal people looking at enabling virtual legislative sessions at the moment, a legal solution will be found in all states that need it.

In parallel, of course, the technology needed to facilitate all this has come on in leaps and bounds over the last two decades - it just needs the legal framework to be in place to enable it. The technology in question is my day job and has been for about thirty years. It is very exciting to be in a position to help. Having spent decades building out digital systems in Legislatures/Parliaments, we already have all the technology modules we need.

Indeed, we are already working with a number of legislatures (and also with local government with PrimeGov), to extend our existing solutions to be fully virtual for legislatures and parliaments, covering chamber floors and committees. Deployment time can be as low as a matter of weeks.

For many legislatures, it is already full steam ahead towards support for virtual sessions. For those that need to make some preparatory legal framework modifications first, my advice would be to do that work in parallel rather than wait. The time to start is now. Today.

This is all happening in my lifetime. I never would have guessed it, but I am delighted to be part of it, having spent 30 years thinking about it. The new normal is upon us.


Thursday, January 03, 2019

An alternative model of computer programming : Part 2


This is part two of a series of posts about an alternative model of computer programming that I have been mulling for, oh, decades now. The first part is here: http://seanmcgrath.blogspot.com/2018/08/an-alternative-model-of-computer.html

The dominant conceptual model of computer programming is that it is computation, which in turn, of course, is a branch of mathematics. This is incredibly persuasive on many levels. George Boole's book, An Investigation of the Laws of Thought, sets out a powerful way of thinking about truth/falsity and conditional reasoning in ways that are purely numerical and thus mathematical and, well, really beautiful. Hence the phrase "boolean logic". Further back in time still, we find al-Khwārizmī in the ninth century working out sequences of mathematical steps to perform complex calculations. Hence the word "algorithm". Further back again, in the time of the ancient Greeks, we find Euclid and Eratosthenes with their elegant algorithms for finding greatest common divisors and prime numbers respectively.

Pretty much every programming language on the planet has a suite of examples/demos that include these classic algorithms turned into math-like language. They all feature the “three musketeers” of most computer programming. Namely, assignment (e.g. y = f(x)), conditional logic (e.g. “if y greater than 0 do THIS otherwise THAT”) and branching (e.g. “goto END”).
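To make the three concrete, here is the kind of tiny program that shows all three at work in Python (f here is an arbitrary placeholder; Python has no goto, so the branching is expressed as a loop and a break):

```python
# The "three musketeers": assignment, conditional logic and branching.

def f(x):
    # A stand-in for any calculation at all.
    return x * x - 4

x = 0
while True:     # branching: control loops back to this point...
    y = f(x)    # assignment: y = f(x)
    if y > 0:   # conditional logic
        break   # ...or jumps out, the equivalent of "goto END"
    x = x + 1

print(x)  # prints 3, the first x for which f(x) > 0
```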

These three concepts get dressed up in all sorts of fine clothes in different programming languages but, as Alan Turing showed in the Nineteen Thirties, you only need to be able to assign values and to "branch on 0" in order to be able to compute anything that is computable by a classical computer - a so-called Turing Machine. (This is significantly less than everything you might want to compute, but that is another topic for another day. For now, we will stick to classical computers as exemplified in the so-called Von Neumann Architecture and leave quantum computing for another day.)
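As a toy illustration of how far assignment plus "branch on 0" can take you (a sketch, not a formal Turing-machine encoding): multiplication of non-negative integers can be built from nothing more than assignments and a single test against zero:

```python
# Multiplication built only from assignment and "branch on 0".

def multiply(a, b):
    acc = 0
    while True:
        if b == 0:        # the one and only branch: is it zero?
            return acc
        acc = acc + a     # assignment
        b = b - 1         # assignment

print(multiply(6, 7))  # prints 42
```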

So what's the problem? Mathematics clearly maps very elegantly onto expressing the logic and the calculations needed to formalize algorithms for classical computers. And this mathematics maps very nicely onto today's mainstream programming languages.

Well, buried deep inside this beautiful mapping are some ugly truths that manifest themselves as soon as you go from written software to shipped software. To see these truths we will take a really small example of an algorithm. Here it is:
“Let y be the value of f(x)
If y is 0
then set x to 0
otherwise set x to 1”

The meaning of the logic here doesn't matter and it doesn't matter what f(x) actually calculates. All we need is something that has some assignments and some conditional logic such as the above snippet.

Now ask any programmer to code this in Python or C++ or Java and they will be done expressing the algorithm in their coding environment in a matter of minutes. It is mostly a question of adding the right "boilerplate" code around the edges, and finding whatever the correct syntax is in the chosen programming language for "if" and for "then" and for demarcating statements and expressing assignments etc.
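In Python, for example, one rendering might look like this (what f computes is a placeholder; any function will do):

```python
def f(x):
    # Placeholder: the actual calculation is beside the point.
    return x % 2

x = 5
y = f(x)      # Let y be the value of f(x)
if y == 0:    # If y is 0
    x = 0     # then set x to 0
else:
    x = 1     # otherwise set x to 1
```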

But in order to ship the code to production, items such as error handling, reliability, scalability, predictability... - sometimes referred to as the "ilities" of programming - end up taking up a lot of time and a lot of coding. So much so that the "ilities" code that needs to surround shipped code is often many times larger than the lines of code required for the original, purely mathematical mapping into the programming language.

All of this ancillary code - itself liberally infused with its own assignments and conditional logic - becomes part of the total code that needs to be created to ship code, and most of it needs to be managed for the lifetime of the core code itself. So now we have code for numeric overflows, function call timeouts, exception handlers etc. We have code for builds, running test scripts, shipping to production, monitoring, tracing...the list goes on and on.
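To make that concrete, here is a hedged sketch of what the same four-line algorithm can start to look like once a few "ilities" move in - logging, a timeout and retries. The names (MAX_RETRIES, TIMEOUT_SECONDS) and the thread-pool timeout mechanism are illustrative choices, not a recipe:

```python
import concurrent.futures
import logging

logger = logging.getLogger(__name__)

TIMEOUT_SECONDS = 5   # illustrative value
MAX_RETRIES = 3       # illustrative value

_pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)

def f(x):
    return x % 2  # same placeholder as before

def compute(x):
    """The four-line algorithm, now wearing some of its "ilities"."""
    for attempt in range(1, MAX_RETRIES + 1):
        future = _pool.submit(f, x)
        try:
            y = future.result(timeout=TIMEOUT_SECONDS)
        except concurrent.futures.TimeoutError:
            logger.warning("f(%s) timed out on attempt %d", x, attempt)
            continue
        except Exception:
            logger.exception("f(%s) raised on attempt %d", x, attempt)
            continue
        return 0 if y == 0 else 1
    raise RuntimeError("f gave no answer after %d attempts" % MAX_RETRIES)
```

Four lines of "pure" algorithm; roughly five times that in scaffolding - before builds, tests, deployment and monitoring even enter the picture.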

The pure world of pure math rarely needs to have any of these as concerns because in math we say "let y = f(x)" without worrying whether f(x) will actually fall over and not give us an answer at all, or, perhaps worse, work fine for a year and then start getting slower and slower for some unknown reason.

This second layer of code - the code that surrounds the "pure" code - is very hard to quantify. It's very hard to explain to non-programmers how important it might prove to be and how much time it might take and, to make matters worse, it is very unusual to be able to say it's "done" in any formal sense. There are always loose ends: error conditions the code doesn't handle, either because they are believed to be highly unlikely or because there is an open-ended set of potential error scenarios and it's simply not possible to code for every conceivable eventuality.

Pure math is a land of zero computational latency. A land where calculations are independent of each other and cannot interfere with each other. A land where all communications pathways are 100% reliable. A land where numbers have infinite precision. A land of infinite storage capacity. A land where the power never dies...etc. Etc.
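To take just one of those assumptions - infinite precision - here is how quickly it breaks down in shipped code. This is standard IEEE 754 floating point behaviour, shown in a Python session:

```python
>>> 0.1 + 0.2 == 0.3   # true in pure math
False
>>> 0.1 + 0.2          # binary floats cannot represent 0.1 exactly
0.30000000000000004
```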

All this is to make the point that, in my opinion, for all the appeal of the mapping from pure math to pure algorithms, actual computer programming involves adding many other layers to cater for the fact that the real world of shipped code is not a pure math "machine".

Next up. My favorite subject. Change with respect to time....


Friday, August 31, 2018

An alternative model of computer programming : Part 1

Today, I googled "How many programming languages are there?" and the first hit I got said, "256".

I giggled - as any programmer would when a power of two pops up in the wild like that. Of course, it is not possible to say exactly how many because new ones are invented almost every day and it really depends on how you define "language"...It is definitely in the hundreds at least.

It is probably in the thousands, if you rope in all the DSLs and all the macro-pre-processors-and-front-ends-that-spit-out-Java-or-C.

In this series of blog posts I am going to ask myself and then attempt to answer an odd question. Namely, "what if language is not the best starting point for thinking about computer programming?"

Before I get into the meat of that question, I will start with how I believe we got to the current state of affairs - the current programming-linguistic Tower of Babel, with its high learning curve to enter its hallowed walls, with all its power, and with the complexities that seem to be inevitable in accessing that power.

I believe we got here the day we decided that computing was best modelled with mathematics.

Friday, July 27, 2018

The day I found Python....

It was 21 years ago. 1997. I was at an SGML conference in Boston (http://xml.coverpages.org/xml97Highlights.html). It was the conference where the XML spec. was launched.

Back in those days I mostly coded in Perl and C++ but was dabbling in the dangerous territory known as "write your own programming language"...

On the way from my hotel to a restaurant one evening I took a shortcut and stumbled upon a bookshop. I don't walk past bookshops unless they are closed. This one was open.

I found the IT section and was scanning a shelf of Perl books. Perl, Perl, Perl, Perl, Python, Perl....

Wait! What?

A misfiled book....The name seems familiar. Why? Ah, Henry Thompson. SGML Europe. Munich, 1996. I attended Henry's talk where he showed some of his computational linguistics work. At first glance his screen looked like the OS had crashed, but after a little while I began to see that it was Emacs with command shell windows and the command-line invocation of scripts, doing clever things with markup, in Python. A very productive setup, fusing editor and command line...

I bought the misfiled Python book in Boston that day and read it on the way home. By the time I landed in Dublin it was clear to me that Python was my programming future. It gradually replaced all my Perl and C++ and today, well, Python is everywhere.




Monday, July 23, 2018

Thinking about Software Architecture & Design : Part 14

Of all the acronyms associated with software architecture and design, I suspect that CRUD (Create, Read/Report, Update, Delete) is the most problematic. It is commonly used as a very useful sanity check to ensure that every entity/object created in an architecture is understood in terms of the four fundamental operations: creating, reading, updating and deleting. However, it subtly suggests that the effort/TCO of these four operations is the same for each.

In my experience, the "U" operation - update - is the one where the most "gotchas" lurk. A create operation is - by definition - one per object/entity. Reads are typically harmless (ignoring some scaling issues for simplicity here). Deletes are one per object/entity, again by definition: more complex than reads generally, but not too bad. Updates, however, often account for the vast majority of operations performed on objects/entities - the vast majority of the life cycle is spent in updates. Not only that, but each update - by definition again - changes the object/entity, and in many architectures updates cascade, i.e. updates cause other updates. This is sometimes exponential, as updates trigger further updates. It is also sometimes truly complex, in the sense that updates end up, through event cascades, causing further updates to the originally updated objects...
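A minimal sketch of what such a cascade looks like in code (the entity names and the observer-style wiring here are invented for illustration; real systems tend to use events or database triggers): one update fans out to dependents, and without a guard a cycle between two entities would recurse forever.

```python
class Entity:
    """An object that notifies its dependents whenever it is updated."""

    def __init__(self, name):
        self.name = name
        self.dependents = []
        self._updating = False  # guard against cyclic cascades

    def update(self, reason):
        if self._updating:
            return  # already mid-update: stop the cycle here
        self._updating = True
        print(f"updating {self.name}: {reason}")
        for dep in self.dependents:
            dep.update(f"{self.name} changed")  # the cascade
        self._updating = False

order = Entity("order")
invoice = Entity("invoice")
shipment = Entity("shipment")
order.dependents = [invoice, shipment]
invoice.dependents = [order]  # a cycle: invoice updates order

order.update("customer edited quantity")
# One edit to "order" cascades to "invoice" and "shipment"; the guard
# is all that stops invoice -> order -> invoice looping forever.
```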

I am a big fan of the CRUD checklist for covering off gaps in architectures early on, but I have learned through experience that dwelling on the Update use-cases and thinking through the update cascades can significantly reduce the total cost of ownership of many information architectures.