
Friday, December 12, 2008

Fractal Friday - Computer Science is broken

Joe Gregorio says that computer science, at a deep and fundamental level, is broken. Nothing significant has happened for many years to move the state of the "science" forward.

I tend to agree. A lecturer of mine in college, in, oh what was it...1984 or so, used to say "nothing significant has happened since 1968". That was the year of Algol 68. I think he could make the same statement today.

Now I am more of an information-guy than an algorithmics-guy but I would be inclined to say that nothing significant has happened since the invention of the word processor.

Joe asks, "Where are the fractals?" I think they are in the documents. Headings and sub-headings to any depth, with anything you like in between the headings, in any order. Self-similar. Arbitrarily deep and fiendishly difficult to process automatically. That is the real world. A messy world of "unstructured" stuff that we call "content" lest we sully the word "data".

The un-real world is the so-much-simpler concept of tables and fields and scalar values... That is what is broken, IMO. The science of computer science knows a lot about the theory of computation - it is a branch of math. But data... content... documents... well, I think we are still in diapers on that subject.
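That self-similarity is easy to state as a toy data structure. A minimal sketch in Python - the `Section` type and its field names are my own illustrative choices, not anything from the post:

```python
from dataclasses import dataclass, field

@dataclass
class Section:
    """A document section: self-similar, since each child
    is itself a full Section, nested to arbitrary depth."""
    heading: str
    content: list = field(default_factory=list)   # paragraphs, tables, anything, in any order
    children: list = field(default_factory=list)  # sub-sections, arbitrarily deep

def max_depth(section):
    # recursion mirrors the structure: a section's depth is
    # one more than the deepest of its sub-sections
    if not section.children:
        return 1
    return 1 + max(max_depth(c) for c in section.children)

doc = Section("Report", ["intro para"], [
    Section("Background", ["some prose"], [
        Section("History", ["more prose"]),
    ]),
    Section("Findings", ["a table, a list, anything"]),
])
```

The contrast with the relational world is the point: a table schema fixes its shape up front, while this structure recurses without limit, which is exactly what makes documents hard to process automatically.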

8 comments:

René Dudfield said...

hi,

I think computer science has moved so far ahead since 1968 it's not funny.

Our understanding of computers has progressed by many orders of magnitude since then.

I could list examples of significant learning, but I won't. It's easy enough to search most of humanity's knowledge through an internet search engine to find how much we have learned about computers since 1968.

Or perhaps search wikipedia --- a compendium of knowledge made by thousands of collaborators from around the world.

But first, maybe you'd like to get some breakfast? Pull your 600MHz phone out of your pocket, msg a couple of friends for suggestions on where to go, and if they want to meet up. Then look up one of the suggestions, and get street directions there -- optimized for the best way to walk. Then take a look at the menu of the place you are going to, and some photos of the area... then you see a picture of the cafe across the road, and check out its website and menu, and decide it is a much better place to go.


I'm hungry... cu.

Tucanae Services said...

Sean,

Complaining about the current state of computing is like complaining that the laws of gravity have not advanced since Galileo first made observations on the matter, or that the fundamentals of general relativity have likewise stood still.
Most of the general fundamentals were defined long before transistors showed up on the scene.

But there is a lot going on. Parallel computing design is moving forward. In fact, the chance for everyone to have the equivalent of an old Cray-1 in their basement is only 3-5 years away. Some interesting research in nanoprotein memory storage is going on. In fact, the whole field of nanotechnology in many ways owes its existence and continued success to computing. Both will proceed in lockstep to new ventures.

A lot is going on. But you need to look for it in nontraditional CS type venues.

mnot said...

That faster processor in your pocket is more of a feat of materials science and iterative design -- i.e., evolution -- than it is something genuinely new.

Yes, we've found new ways to use computers, and new layers of complexity on top of them, but there's little that's actually significantly, ground-breakingly new.

Anonymous said...

I don't think that the existence of Wikipedia or the X-ray lithography that allows us to have 400MHz CPUs in our pocket is really relevant to computer science.

But here are some things in computer science that have been invented since 1968: relational databases, generational garbage collectors, graph-reduction evaluators for lazy functional programming languages, classes, prototype-based programming languages, skip lists, higher-order control flow analysis, supercombinators, treaps, LALR parsers, Earley parsers, JIT compilers, Bloom filters, object-capability systems that can run on stock hardware, SSA representations in compilers, monitors, semaphores, gradual underflow, bitblt, Chez Scheme's efficient implementation of call/cc, SIFT feature extraction, most of the field of statistical machine learning, the Burrows-Wheeler transform, packrat parsers, the Hindley-Milner type system, the related realization that algorithms are associated with algebras (on which the STL is based), Lempel-Ziv data compression, alpha compositing, Phong and Gouraud shading, SHRDLU, logic programming and Prolog, Linda, any number of better ways of solving large systems of linear equations, public-key cryptography, Shor's algorithm (plus hardware that can run it for small problems), and many other things I don't even know about.

These are not the most significant computer-related inventions since 1968 --- those would be the GUI, the IDE, the spreadsheet, and the factor of 64 million increase in transistor density --- but unlike these more practically significant inventions, they are unambiguously in the realm of computer science.
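To pick one concrete item from that list: a Bloom filter (Bloom, 1970) fits in a few lines of Python. This is a toy sketch of my own - the class, method, and parameter names are illustrative, not from any particular library:

```python
import hashlib

class BloomFilter:
    """A toy Bloom filter: probabilistic set membership where
    false positives are possible but false negatives are not."""

    def __init__(self, size_bits=1024, num_hashes=3):
        self.size = size_bits
        self.k = num_hashes
        self.bits = bytearray(size_bits)  # one byte per logical bit, for clarity

    def _indexes(self, item):
        # derive k roughly independent indexes by salting one hash function
        for salt in range(self.k):
            digest = hashlib.sha256(f"{salt}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item):
        for i in self._indexes(item):
            self.bits[i] = 1

    def might_contain(self, item):
        # True means "possibly present"; False means "definitely absent"
        return all(self.bits[i] for i in self._indexes(item))
```

The whole trick - trading a small, tunable false-positive rate for constant space - is a genuinely post-1968 idea, which is the commenter's point.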

Scott Aaronson's lecture series on great ideas in theoretical computer science includes a number of things invented since 1968.

Sean McGrath said...

Illume, Johnc,

You both point to examples where hardware (bitmap displays, networks, chip density) has improved greatly. No argument there, but that is electronics/physics, not computer science.

When I speak of Computer Science I'm thinking of the activity that resulted in Boolean Logic, Turing Completeness, the Omega Number, Information Theory, the Relational Model, Object Oriented Programming...

When I look at that lot, I cannot help but think that George Boole and Alan Turing and Claude Shannon were the glory days.

Sean

flow said...

maybe the state of the science has not changed much, ahdunno. but when you look at a language like python and ask yourself: how could i have written this or that piece of software in 1978 (the year i started with commodore basic, later assembler---yes there were more advanced tools out even back then), then you see that a few things did move in the state of the art. brushes, paints, and canvasses have not changed much on a fundamental level, but the pictures we draw and the way we weave things together has advanced considerably.

Anonymous said...

"When I speak of Computer Science I'm thinking of the activity that resulted in Boolean Logic, Turing Completeness, the Omega Number, Information Theory, the Relational Model, Object Oriented Programming..."

Chaitin's publication of Omega was in 1975, in "A theory of program size formally identical to information theory", and the relational model was in 1970. Object-oriented programming in its current form (although not, as I said in my earlier comment, the concept of the class) was developed throughout the 1970s. And although Turing's discovery of the Universal Turing Machine was sometime in the 1930s, work on simpler universal automata continues even today; several interesting new results were published in the 1980s and 1990s. And when it comes to complexity, quantum computation might upset the whole complexity apple-cart! Unless it turns out to be impossible, which would obsolete our present theories of physics.

So, while on one hand you may be right that many of the most fundamental results were discovered many years ago, there are new important results being discovered today.

Perhaps they are buried in a much larger collection of trivia because computers are more practically useful? Or perhaps it will take another 40 years to see as clearly the significant CS ideas of 2008 as we can now see the significant CS ideas of 1968?

Sean McGrath said...

Kragen,

The 1968 figure was from my lecturer in college. I think he is in the ballpark, but let's add some latitude into the early Seventies. Many of the items (not all, but many) fall into the <= Seventies bracket.

Quantum Computation stands out for me as being the potential game changer, and a lot of that has happened post-'68, even if the Physics and Math bits that underpin it go back further.

Sean