OS/2 eZine - http://www.os2ezine.com
November 16, 2003
John Martin started using OS/2 in 1987 (IBM OS/2 EE 1.0 beta) and after a week he had his first OS/2 application ready (an ASCII chart). He works as a consulting software developer in Gothenburg, Sweden, specializing in delivering whatever his customers want. He is married with three children, and is an aspiring salsero. He started using OS/2 because it was the future, and his server and ThinkPad still run eComStation.




Whatever happened to the revolution?

I have a weakness for old computer books. It's fun to read about the revolution that is just around the corner, based on the impending arrival of affordable client/server computing, ubiquitous CORBA, or cross-platform, network-centric Java. Some of the better books even backed their revolutionary fervor with working code. So what happened?

In the beginning...

Cast your mind back to the beginning of the '90s. The '80s had seen personal computing platforms transform from Apple IIs through IBM PCs, the GUI Lisa, Macintosh, and Atari ST (with Digital Research's GEM as the UI), the multi-tasking Amiga, and the 32-bit RISC Acorn Archimedes. Several hardware manufacturers had looked at the Transputer as a way to bring serious parallel processing power to the masses. On the software front, the many versions of DOS (MS, PC, DR, etc.) had been "replaced" by OS/2 1.x, the joint venture between Microsoft and IBM. The Amiga OS had been substantially improved by its user community in the form of the ARP (AmigaDOS Replacement Project) - a rewrite in another language performed without access to the source of the original! Truly these were revolutionary times.

At the beginning of the nineties, the future was a vast array of possibilities. IBM had its 32-bit OS/2 2.0 in beta, with its OOUI (Object-Oriented User Interface), the Workplace Shell. It looked as good as the Mac, ran on PCs, and had IBM behind it - what could go wrong? There was an object revolution going on!

Cynics might answer that Microsoft happened, but it was bigger than Microsoft. Apple started the rot by suing Digital Research over GEM. Microsoft decided it didn't want to compete head to head with Microsoft OS/2 vs. IBM OS/2 (a rerun of MS-DOS vs. PC-DOS vs. DR-DOS) and went off to make Windows, some say after sabotaging/delaying OS/2. After succeeding in laying waste to Digital Research for copying Apple's own copy of the Xerox desktop, Apple tried to sue Microsoft over Windows. And somewhere in there, emboldened by Apple's success against DR, Lotus sued a competitor for a look and feel borrowed from VisiCalc. The big boys were not playing nice.


I don't believe the phrase "paradigm shift" was particularly popular back then, but it was pretty much what was needed. The Amiga had retreated to making videos, the Atari to music, the ARM-chip powered Acorn had morphed into a school computer in the UK, and the mainstream players and architectures were at war. In addition, application developers were being forced to choose sides.

The rise to prominence of the GUI had led to the elevation of object-oriented ideas, if not of languages for implementation. Printers and displays were amongst the first things to be "encapsulated" behind the desktop shell - no more worrying whether WordPerfect 5.1 had a driver for the particular printer you needed to use, or configuring the video card so you could get a wider-columned screen. Freed from the burden/opportunity of writing device drivers, more software houses fell.

When OS/2 2.0 finally came out of beta, it was the new high-water mark in this trend of encapsulation. OS/2 had an object-oriented shell, the Workplace Shell, with standard classes for much of the behavior of the system accessible for re-use by application developers. Behind this was the mystical System Object Model (SOM), making this OO magic available, though not exactly accessible, to the masses. Indeed, at the time of its release IBM could not even provide a SOM-aware object-oriented language to make use of these classes; it took quite a while for CSet++ to arrive.

Despite the difficulties, there were still vendors that took the WPS by the horns and implemented replacement classes, most notably Stardock. However, many stuck with the style of programming they knew, and wrote Presentation Manager applications or even VIO ones. As a measure of how little the WPS was utilized, you could still run most applications for OS/2 in a replacement shell, like the one you get with maintenance partitions - no WPS active. Counter-revolutionaries even ported X to OS/2, or made more primitive Win95-like shells.


In retrospect it seems like IBM did everything wrong. Everyone now believes that in making OS/2 a better DOS than DOS and a better Windows than Windows, IBM provided developers with little compulsion to produce native applications. However, few people continue the progression and argue that in constructing an OS architecture with the base OS, the window manager layer (PM), and the OO desktop shell on top, IBM provided little compulsion for developers to study the WPS. What would have happened if IBM had not provided the PM API, and let the OS be accessed only via SOM classes? Probably disaster, but then that happened anyway, eventually.

When IBM and Microsoft fell out over OS/2, IBM retained some rights to Windows source code, enough to build a copy of Windows into the "blue-spine" OS/2 versions. Microsoft must not have wanted IBM to get the rights to a component model they were working on at the time (OLE), so they made it part of their Office application suite. With this subterfuge yet another impediment to document sharing was foisted upon the computing world. Eventually, in a rare display of cross-vendor co-operation, Apple and IBM would bring out OpenDoc, a cross-platform compound document framework to rival OLE/COM/DCOM/whatever it was called by then. This too was a victim of bad timing, being supplanted at the low end by HTML and at the high end by JavaBeans. The only OpenDoc application of note to surface was Apple's original lean-and-mean Web browser, Cyberdog. With CORBA an in-theory-only standard, dogged by interoperability issues, and Microsoft choking the life out of Netscape, and with it the possibility of ubiquity for the Visigenic ORB, the revolution was practically over.

By the end of 1996 the world had pretty much started to regress. Perfectly good UIs were being replaced by clunky browser-based front-ends (think Feature Installer), Microsoft was making the Active Desktop to consume CPU cycles, Apple was in trouble, and everyone was learning Java. Microsoft was right: Java wasn't just a language, it was a platform, and a pretty limited one. Attempts to make application suites based on Java (Lotus eWorkPlace, Corel Office) and even operating systems based on Java (IBM's Java OS for Business) failed or disappeared quickly. Java as a language is "Smalltalk semantics obfuscated by C++ syntax", and as a platform it is still a work in progress - witness the Swing vs. SWT debate now taking place.

The years since 1996 have pretty much been dominated by the "failure" of client/server, the move of function back to the server side (where Java is now a significant force), and the polishing of existing operating systems rather than the creation of new ones (apart from BeOS). Microsoft's "Cairo" answer to OS/2's object prowess was revealed as just another marketing ploy to put off purchases. The rise of Linux has put a 30-year-old OS on every geek's desktop, but done little to advance the state of the art. The platform of the future is arguably the smart phone, running a former PDA manufacturer's OS on a chip derived from the CPU of the old Acorn Archimedes - the British are coming!


So here we are, the present day, the revolution is dead, long live the revolution! But how to give it the kiss of life?

One of the things that gets in the way of the object and component nirvanas - readily available, quality, ready-to-assemble objects and ad hoc stitched-together applications - is the fundamental conflict between this kind of development and commercial realities. Even if they could, what application software vendor in their right mind would deconstruct their applications and make their best work readily available for someone else to utilize, or base their application on components from a competing application vendor, knowing that at any time it could be deliberately broken? Competition is supposed to be good for the consumer, but is it always? As Jennifer Saunders said in her TV sitcom "Absolutely Fabulous": "I don't want lots of choices, I just want nice things!"

Component-oriented vendors want run-time licenses, and application vendors don't want these eating into their profit margins - so the object revolution only really improved the lot of corporations building their own in-house systems, not the public at large. In fact, even in the world outside computer software, the world of real objects and components, there are very few cases of best-of-breed components/services being offered, unencumbered, to the public. How many times have you gone to change the spark plugs on your car, only to find that the exact plug is available solely under the manufacturer's own parts brand - not even from the OEM that makes it for them! Truly co-operative components, designed specifically for use in combination with competitors' components, seem mostly to occur in the music world: the world of Hi-Fi, the world of musical instruments, and even the output of musicians themselves. However, in these cases the purveyors are often not really competitors, because of their high degree of specialization. And it has to be said that in terms of integration - for example, the convenience of having all your sound system's components running off a single remote control - single-vendor systems still win the day.

So commercial considerations counter component revolutions; what can be done?

Fortunately(?), there is little remaining in the OS/2 world that can be described as commercial. It seems perfectly feasible to ask most of the community to forgo the impossible dream of commercial gain from OS/2 coding and "limit" themselves to the kind of non-commercial behavior compatible with the licenses of many otherwise expensive products. For example, the VisualAge for Smalltalk tool covered in September's OS/2 e-Zine is a rather expensive tool to purchase for commercial use, and as such I believe it finds most of its use in the telecommunications industry. However, as noted, it is now offered free for non-commercial use. In this guise, there is what is known as a "goodie" for VAST that might prove of interest to the largely non-technical OS/2 community: the Visual UML Designer. [Ed: How timely! See the article in this month's issue.]

What is UML?

The Unified Modeling Language is a visual and written syntax for describing the behaviors, roles, responsibilities, state transitions, object and class hierarchies, and inheritance structures of systems. It verges on an oxymoron, being Unified in the same sense that the United Kingdom is United. Travel from one end of UML to the other and you basically have to learn a new language (just like the UK); however, it is a standard, and it does have major-league backing from nearly every developer-tools player (except Microsoft?). The wonderful folks at the OMG (the Object Management Group) even think it is the basis of a whole new way of developing applications, the Model Driven Architecture.

It seems nobody really knows all of UML. "Functional" people might acquaint themselves with the use-case syntax, a way of telling the story of individual stereotypical transactions within a system of "actors". "Technical" people might be asked to devise an inheritance scheme that allows maximum re-use of logic across the system. This division of labor into functional and technical mirrors the workplace of many commercial enterprises but might not correspond to the typical enthusiast/amateur programmer "scratching a personal itch" as typified in the Linux community. However, the OS/2 community is unlike the Linux community in many ways; the ideas encompassed in the LazyWeb might actually work in the OS/2 world.

Let us imagine a utopian future where Netlabs, or something like it, houses a repository of use-cases outlined as VAST projects. "Domain experts" create descriptions of the scenarios they want to see supported in a future application, and enthusiastic amateur programmers think "that sounds like a challenge" and propose designs. Together the domain expert and programmer agree on the state transitions etc. that implement the desired behavior, and while one goes away to build test cases, the other gets on with programming in/generating code for whatever implementation language best suits the requirements.
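To make that concrete, here is a hedged sketch of the kind of artifact such a pairing might agree on: a tiny state machine for a hypothetical "book checkout" scenario. Every name here is invented for illustration - nothing is drawn from an actual repository, and the target language could just as well be Smalltalk.

```java
// Hypothetical example: the state transitions a domain expert and a
// programmer might agree on for a "book checkout" use-case.
public class Checkout {
    enum State { AVAILABLE, CHECKED_OUT, ARCHIVED }

    private State state = State.AVAILABLE;

    public State state() { return state; }

    // Transition: AVAILABLE -> CHECKED_OUT; anything else is an error.
    public void checkOut() {
        if (state != State.AVAILABLE)
            throw new IllegalStateException("cannot check out from " + state);
        state = State.CHECKED_OUT;
    }

    // Transition: CHECKED_OUT -> AVAILABLE.
    public void checkIn() {
        if (state != State.CHECKED_OUT)
            throw new IllegalStateException("cannot check in from " + state);
        state = State.AVAILABLE;
    }

    // Transition: AVAILABLE -> ARCHIVED; an archived book leaves circulation.
    public void archive() {
        if (state != State.AVAILABLE)
            throw new IllegalStateException("cannot archive from " + state);
        state = State.ARCHIVED;
    }

    public static void main(String[] args) {
        Checkout book = new Checkout();
        book.checkOut();   // AVAILABLE -> CHECKED_OUT
        book.checkIn();    // CHECKED_OUT -> AVAILABLE
        book.archive();    // AVAILABLE -> ARCHIVED
        System.out.println(book.state()); // ARCHIVED
    }
}
```

The point is not the code itself but that the transition table is small enough for the domain expert to verify by reading, and mechanical enough for the programmer to write test cases against.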

So does this language have to be Smalltalk? Well, actually, no. VAST can target Java in addition to Smalltalk. But there are gotchas. Java is a simple language (remember "Smalltalk semantics obfuscated by C++ syntax"), so it limits the kinds of designs permissible. In short, you have to choose which language you want to implement in ahead of the class hierarchy design stage. Thankfully the functional people never need think about this; as the Unified part of UML doesn't really apply to the development methodology, they remain free to concentrate on the use-case scenarios.
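The gotchas are easy to see in miniature. In Smalltalk every value, including a small integer, is an object you send messages to; Java splits the world into primitives and wrapper objects, which constrains designs that rely on uniform message-passing. A hedged illustration in plain Java (class and variable names are invented; this says nothing about what VAST actually generates):

```java
// Illustrative only: contrasts Smalltalk's uniform object model with
// Java's primitive/object split.
public class Uniformity {
    public static void main(String[] args) {
        int n = 3;                           // a primitive: it has no methods
        // In Smalltalk, "3 printString" is a message sent to the object 3.
        // In Java the same idea needs a detour through a wrapper class:
        Integer boxed = Integer.valueOf(n);  // wrap the primitive explicitly
        System.out.println(boxed.hashCode());    // Integer's hashCode is its value: 3
        System.out.println(Integer.toString(n)); // prints "3"
    }
}
```

The practical consequence for the design stage is that anything meant to be handled polymorphically must be modeled as an object from the start, which is exactly why the implementation language has to be chosen before the class hierarchy is drawn.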

One of the coolest quotes regarding these sorts of endeavors is "good ideas and bad code build communities, the other three combinations do not" (see LazyWeb and RSS: Given Enough Eyeballs, Are Features Shallow Too). If this is true, the worst result that can be expected from downloading VAST and trying the UML designer is a stronger community. Start your downloads now.

We know from the activity on the eComStation Yahoo group, and the OS/2 e-Zine features, that there is a desire at the grass-roots level to pitch in and help the platform (the learn C++ class). However, to a certain extent the books and sample applications out there are not going to help beginner programmers make significant, lasting contributions to the community. We are not, by and large, a C++ community. How then to perform these miracles? Obviously we must start with co-operation instead of competition; "a rising tide lifts all boats," they say. Identification of popular ideas, expert distillation to their (hopefully common) component requirements, and well-architected construction of these components to allow their heavy re-use in new applications must help. There is a need for all the WPS/SOM gurus to make themselves known, to pool their knowledge in a common forum, perhaps at Netlabs, and to mentor these new self-helpers. The revolution re-starts here.

Next month, a set of example use-cases, and maybe even a stab at generating code.


Copyright (C) 2003. All Rights Reserved.