One of the challenges of working with older software is obsolescence. It is a
challenge that I face in my professional life, and it appears to be something that is affecting
the open source world. I write software in C++. Core GNOME developers write their software in C.
We would all love to offer our platform in languages programmers commonly work with today.
GNOME offers some of its libraries to languages other than C through bindings. This can be
useful, but for some technically sound reasons Java developers prefer "pure Java" code, and the
same can be true of other languages. Language bindings themselves can be a problem: maintaining
an interface to a C library in a way that makes sense in Python is a full-time job in itself.
So what do we do with ourselves when the software we write doesn't fit into how people want
to use it? What options do we have, and how do we maintain a useful software base as languages
and technologies come and go?
To get the game into full swing, I would like to separate the notions of platform and
architecture. For the purposes of this entry I'll define platform as the software you link
into your process to make your process do what it should. I'll define architecture as the way
different processes interact to form a cohesive whole. Within a process you need the platform
to integrate pretty naturally with the developer's code. Defined protocols can be used between
processes to reduce coupling and reduce the need for direct language bindings. From those base
assumptions and definitions, whatever software we can keep out of the process is not going to
have to sway with the breeze of how that process is implemented. This extricated software is a
part of the architecture we can keep, no matter what platform we use to implement it.
The closest link I can find for the moment is
this one,
but discussion has cropped up from time to time over the last few years. It centres on whether
GNOME is a software platform, or simply a set of specifications. Are the parts of GNOME going
to be reimplemented in various languages, each with their own bugs and quirks? Will that be
good for the platform or bad? Should these implementations be considered competing products,
or parts of the same product?
The simplest answer for now is to sidestep the question. There are two approaches that allow
us to do this, which I would characterise as model-driven or lesscode approaches. The
model-driven approach involves taking what is common between the various implementations, and
defining it in a platform-neutral way. This can often be done simply by reading configuration
files or network input. You define this model once, and provide individual mappings into the
different platforms. These mappings may still be expensive to maintain, but they would allow
developers to keep working on "common code" when it comes to real applications. A working
example of this in the GNOME platform is
glade.
Various implementations or language bindings for "libglade" can be created, or the widget
hierarchy model can be transformed into platform-specific code directly.
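As a rough sketch of that idea, a widget hierarchy defined once in a glade file can be mapped
straight into a process through libglade's C API. The file name "app.glade" and the widget id
"main_window" below are invented for illustration; this is an outline of the mapping, not a
recommended implementation.

    #include <gtk/gtk.h>
    #include <glade/glade.h>

    /* Sketch: load a widget hierarchy that was defined once, in a
     * platform-neutral glade file, and map it into this process.
     * "app.glade" and "main_window" are hypothetical names. */
    int main(int argc, char **argv)
    {
        gtk_init(&argc, &argv);

        GladeXML *ui = glade_xml_new("app.glade", NULL, NULL);
        if (!ui)
            return 1;

        /* Pull a widget out of the model by its id and show it. */
        GtkWidget *window = glade_xml_get_widget(ui, "main_window");
        gtk_widget_show_all(window);

        /* Connect handlers named in the model to code in this process. */
        glade_xml_signal_autoconnect(ui);

        gtk_main();
        return 0;
    }

An equivalent mapping could just as easily be generated for another platform from the same
file, which is the point of keeping the model common.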
Lesscode is an approach where we make architectural decisions that reduce the amount of
platform-specific code we need to implement. Instead of trying to map a library that implements
a particular feature into your process, split it out into another process. Do it in a way that
is easy to interact with without having to write a lot of code on your side of the fence.
The goal is to write less code overall, include less platform code, and implement more
functionality while we are at it.
While lesscode is something of an ideal, the tools are already with us. Instead of using
an object-oriented interfacing technology, consider using REST. Map every object in the
architecture to a URI. Now you only have to implement identifier handling once. Access every
object in the architecture using a resource abstraction, such as a pure virtual C++ class or
Java Interface. Find these resources through a resource finder abstraction.
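As a minimal sketch, with every name invented for illustration, those two abstractions might
look something like this in C++:

    #include <memory>
    #include <string>

    // Uniform interface: every object in the architecture is a resource
    // addressed by a URI and accessed through the same few operations.
    class Resource
    {
    public:
        virtual ~Resource() {}
        virtual std::string get() = 0;                  // read a representation
        virtual void put(const std::string& body) = 0;  // update the resource
    };

    // The resource finder: resolves a URI to a Resource without the caller
    // knowing which library or protocol sits behind it.
    class ResourceFinder
    {
    public:
        virtual ~ResourceFinder() {}
        virtual std::shared_ptr<Resource> resolve(const std::string& uri) = 0;
    };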
What this does is put everyone on a common simple playing field. You no longer have to worry
about which protocol is spoken at the application level. Your platform reads the URL scheme and
maps your requests appropriately. The uniform interface means you only have to interface to one
base class, not multiple libraries and base classes. The platform concept is transformed from
an implementation technology into an interfacing technology.
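Continuing the sketch above, with both concrete resource types invented purely as stand-ins,
scheme-based dispatch inside the finder could look like this:

    #include <stdexcept>

    // Stand-in resources; a real platform would back these with actual
    // HTTP and filesystem access.
    class HttpResource : public Resource
    {
    public:
        explicit HttpResource(const std::string& uri) : uri_(uri) {}
        std::string get() override { return "GET " + uri_; }   // would fetch over HTTP
        void put(const std::string&) override {}                // would PUT over HTTP
    private:
        std::string uri_;
    };

    class FileResource : public Resource
    {
    public:
        explicit FileResource(const std::string& uri) : uri_(uri) {}
        std::string get() override { return "read " + uri_; }  // would read the file
        void put(const std::string&) override {}                // would write the file
    private:
        std::string uri_;
    };

    // The finder inspects only the scheme; application code never sees
    // which library actually serves the request.
    class SchemeFinder : public ResourceFinder
    {
    public:
        std::shared_ptr<Resource> resolve(const std::string& uri) override
        {
            if (uri.rfind("http://", 0) == 0)
                return std::make_shared<HttpResource>(uri);
            if (uri.rfind("file://", 0) == 0)
                return std::make_shared<FileResource>(uri);
            throw std::runtime_error("no handler for " + uri);
        }
    };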
Implementing REST in your system is not sufficient. GNOME is composed of a number of
important libraries, not the least of which is gtk+ itself. Perhaps it is time to rearchitect,
taking a leaf out of the web browser's book. Perhaps we should have a separate program dealing
with the actual user interface. That handling could be based on a model just a little more
expressive than that of glade's widget hierarchy. Desired widget content and attributes could
be derived from back-end processes written in whatever way is most appropriate at the time.
Widget interactions could be transmitted back to back-end processes over a defined protocol.
Perhaps Model-View-Controller isn't enough when expressed as three objects. Perhaps what is
needed is two or more processes.
If a special interface is developed for speaking to this front-end process, nothing has been
gained. It would be equivalent to providing the language bindings of today. What would be
required is a general interfacing approach based around REST. The widget hierarchy model would
specify where to get information from as URIs, and where to send notifications to as URIs.
Alternatively, the model could simply leave its data open for subscription and leave it up
to the other side to query and react to. The same RESTful signals and slots implementation could
be used for interaction between all processes in the architecture.
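A small sketch of that binding, reusing the Resource and ResourceFinder abstractions from
earlier and with all identifiers invented for illustration:

    // Each entry in the widget hierarchy model names a widget along with
    // the URIs it reads its content from and reports interactions to.
    struct WidgetBinding
    {
        std::string widget_id;    // e.g. "temperature_label"
        std::string content_uri;  // where the widget's content comes from
        std::string notify_uri;   // where user interactions are sent
    };

    // When the front-end builds the UI, fill the widget from its content URI.
    std::string initial_content(ResourceFinder& finder, const WidgetBinding& b)
    {
        return finder.resolve(b.content_uri)->get();
    }

    // When the user interacts with the widget, emit a RESTful "signal" by
    // putting the event to the notification URI of some back-end process.
    void emit_interaction(ResourceFinder& finder, const WidgetBinding& b,
                          const std::string& event)
    {
        finder.resolve(b.notify_uri)->put(event);
    }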
My architectural vision is that each process incorporates a featherweight platform defined
around RESTful communications. Which platform is chosen is irrelevant to the architecture.
The fact that each platform implementation would be specific to the language or environment most
suitable at the time would not be considered a problem. The features the platform implements are
simply the essentials of writing all software. Specialty behaviours such as user interaction
should be directed through processes that are designed to perform those functions.
Linking in libraries to perform those interactions is something only a small number of processes
in the system should be doing.
Web browsing is built around exactly this combination of lesscode and model-driven
approaches. I think it is a template for the desktop as well.
Benjamin