You really ought to be using a thin client - by Bob St. John

Summary: Bob tries to talk some sense into you about the merits of thin clients, managed PCs, and why thin is fatter than fat.

Given the statement as it stands, most OS/2 e-Zine! readers would snort and snuffle. But you, there! You snorting and snuffling! You would be wrong. There is immediate resistance to the idea of thin clients when there really ought to be exploration. For example, what do you mean when you say thin client? Since we have terms like thin client and fat client, it only stands to reason that a thin client must be less than a fat client. Right? The problem is that the answer may be yes, or it may be no.

So, if the source of the immediate resistance is the concern by the users that they are losing something, are they? What if I could persuade you that the user loses nothing and, in fact, gains something? What if I could persuade you that a thin client is fatter than a fat client? Is such a thing possible? Ah, yes! In the land of computers, anything is possible. Even the conundrum of fat being thin and thin being fat.

One issue here is that terms are very poorly defined. What do we mean when we say "thin client"? Let's take a look at the new IBM Network Station 2800, the Intel-based network computer from IBM which was announced on May 25. It does not support local drives. Thin? Of course! Even IBM says so.

When is a thin client not a thin client?

The Network Station 2800 could be used as a Windows Based Terminal (WBT), in which case it would be sharing applications like Lotus Notes with about a dozen other workstations. These workstations would be attached to, and the applications would be running in, a Windows Terminal Server. Imagine: the applications are not even executing in the client. These Windows workstations are just appendages hanging off a server, and even the Windows Terminal Server can't be considered "fat" because of its implementation dependencies. But that strays off topic. This environment fits everyone's idea of a thin client. Microsoft has implemented it as a very restrictive environment. But it doesn't have to be that way.

However, the same IBM NS 2800 could be running as a WorkSpace on Demand client, remote boot, server managed, with applications running in the client. Still thin? Yes! According to IBM. Though not covered in the IBM announcement, we (at Serenity Systems) have successfully tested the new IBM Network Station Series 2800 as a client running OS/2 V4, a full Merlin client, using our WiseManager product. Still thin? Not according to me, speaking as the vendor. After it boots from the server it is running like any other OS/2 Warp 4 client. Excuse me, but I don't see anything thin here. It's a PC.

Would you disagree with me? Would you say that the NS 2800, running diskless, remote boot, server managed, but running a complete copy of Warp 4 is thin? Would you like to step outside and say that? OK, then stay here with me because we are about to step things up.

Managed Clients by Serenity Systems can be diskless, like the NS 2800, or the client can be something like an IBM P300PL PC with a hard drive. I mention the IBM P300PL specifically because it is clearly a PC, a brand-name business desktop computer which tends to use the same components throughout its production runs. This means that something that works on one P300PL is likely to work on another P300PL. If the P300PL and the NS 2800 are running the same software, is one still thin and the other fat? Essentially it's the same implementation. If the client on the P300PL is a fat client and the NS 2800 is a thin client, when did we cross the line?

It must have happened when it was configured with a local hard drive. But keep in mind that neither client workstation knows where the storage actually resides. From the user perspective both workstations behave the same way. The user certainly doesn't know where the applications and data reside.

So, when is a thin client fatter than a fat client?

Now we're coming to the point where someone could argue that my PS/2 Model 70, with 12 meg of RAM, a 386, and a 120 meg hard drive, is a fat client. But my diskless client with 64 meg of RAM and a 450 MHz CPU, sitting on a 100 Mbps Ethernet LAN and accessing programs and data stored on any number of devices, is actually a thin client. Of course, the PS/2 Model 70 uses data and programs from storage located on servers around the network, too.

In fact, I might be able to outfit my diskless Network Station with an 80 meg flash card. Now, does that make it a fat client? Of course not. By the way, which one would you rather use? The truly irascible reader would retort, "It depends on what you're doing. The Model 70 might be a perfectly fine client!" Bullseye! Correctamundo! And now, before we leave the Model 70 behind, a moment for all the OS/2 fans who watched, amazed, as David Barnes gave OS/2 demos on his classic P70. OK, now let's move on.

What is the user doing? How is the workstation being used? In this context we may find that a fat client, even connected to the network, offers the user far less than a thin client which is not simply connected to the network but is actually managed over the network. Hardware configuration should not be the key. I can take an IBM P300PL and use it as a Managed Client with or without local storage. Although the PC may behave differently depending on how the storage is "mapped", the PC won't realize that. You can hide where the drives are located from the PC and the user won't realize it.
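
To make the "mapping" point concrete, here is a minimal sketch of my own, not the actual WiseManager or WorkSpace on Demand mechanism, of how storage can be assigned at logon so that the workstation and the user see ordinary drive letters whether the bits live locally or on a server. The server and share names below are hypothetical:

    REM Hypothetical logon-time drive assignments for a managed client
    REM H: looks like any other drive to applications, but lives on the boot server
    NET USE H: \\BOOTSRV\HOMEDIRS
    REM J: carries the shared application images
    NET USE J: \\APPSRV\APPS

An application opening a file on H: neither knows nor cares whether that drive letter is a local partition or a network alias, which is exactly the point.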

We are Borg

As near as I can determine, in a completely unscientific way, this resistance to thin clients comes down to the user's concern over loss of 'control'. An emotional issue. Not having local permanent storage is associated with not having control. But, is that a fair assessment? I would, could, and will now argue that local, permanent storage creates a lack of control. All that local storage constitutes de-centralized control, which is no control at all. It's "uncontrol". Anarchy. After all, if you are supporting a group of users, just how much control over their workstations do you want them to have compared to the control you should have?

There's good control; the control that I have. And bad control; the control that others have. Who wants users coming in and attaching devices to their PCs? Or loading "unauthorized" software, or maybe worse: removing programs or data. Good control and bad control may be determined by your function in the group.

Maybe the user is losing some control and the function which is managing the workstation for the user is gaining control. But isn't that where the control is supposed to be? In the business desktop area, control is not supposed to be with the user. So, control isn't being lost. It is being gained. But it is moving.

Fat clients with local storage can boot and execute activities without the network. Managed clients should not. I won't say they cannot, but it would be a bad design.

The first time I heard the term thin client was when Lou Gerstner spoke to the analysts in June of 1995. A black day for OS/2 because it was part of his concession to Windows. I went to the forums and asked what all this talk of thin clients and fat clients was about. I didn't like the answers. But think about how you were using your computer in 1995 and how you use it today. Think about the role the network plays in operations today compared to 5 years ago. Think about the Internet 5 years ago. And think about how things will be changing in the next, say, eighteen months. Back in 1995 I was on a LAN but I didn't use it. Now my LAN logon is in my start-up folder.

Control versus Production

Obviously my point of view centers on PCs in the workplace. I have a home user scenario too, but let's save that for next month. I think I'm a pretty typical business computer user. I don't feel compelled to tinker with my computer. It is there to serve me. Over time, the business PC is evolving into an appliance and some of the basic rules about appliances apply.

I was recently speaking with Mike Persell of IBM and I want to credit (or blame) him for expressing the following point of view. I think we can all appreciate it. Mike pointed out that back in the mid-1970s, cars did not use computers and there was a gas shortage. In the early 1980s, processors started to become part of the car. Among other things, these CPUs managed fuel mix and consumption based on driving characteristics: stop-and-go or highway driving. Mike believes that these computers were in no small part responsible for improved fuel economy, which would have had a dramatic effect on the gas shortages experienced a decade earlier.

You can agree or disagree in part or in whole, but some characteristics of what Mike described struck me as parallel to what I'm discussing now. First, the computer is there, serving the user. But the user doesn't know about it and only cares that it does the job required. The computer is not accessible to the user and no tinkering is permitted.

And mechanics, the power users and support function, tend to complain about these computers and pine for the old days. Back then they could tell you what was wrong with your car from the sight, sound, and smell of it as you drove in. Now they must hook your car up to the "system". But how many of them drive that kind of car, or would suggest one for family use?

To bring this back to the desktop computer, we implement technology over time. The focus is not to provide compelling technology to users. It is to meet a business objective. To create a computing environment which supports the ability to do business, service customers and drive revenue.

What does that mean? Simply put, it means providing users with access to applications and data. What does this best, a fat client or a thin client? Think about the service and support you can provide the user of a managed client. Software distribution and deployment are now simple drag and drop operations. One drag and drop can distribute a software upgrade throughout an organization. Users get new or updated applications on their desktop without even having to reboot. Support people can resolve problems from any place in the network, or even from remote locations dialing into the network.

Most "worst case" resolutions may be a reload from the server which is fast and reliable, with no information lost. Let's take a catastrophic failure which requires replacing the user's workstation. With a managed client implementation, the user could be back up on a new machine in a few minutes with no data lost. These are the things that members of the Southern California OS/2 Users Group saw with Kim Cheung's demonstration at the March meeting.

Compare this with how fat clients must be supported. How available are applications or data when something fails on the traditional fat client PC? If there is something particularly wasteful in the current business computing environment, it is the dreaded phrase "Hey, this ever happened to you? You ever seen this?" coming from an adjacent cubicle.

This phrase generally indicates that the dreaded Peer Support function is about to be engaged. Soon two or three colleagues from nearby cubicles can be found clucking and commiserating with the user. Generally a support technician has to be dispatched to help the user and must wade past the Peer Support function like an emergency medical technician trying to reach an accident victim who is surrounded by sympathetic onlookers. This Peer Support is usually ineffective and expensive in terms of lost productivity. It is rarely calculated into the Total Cost of Ownership figure, and it should be. It is also part of the very fabric of the fat client environment.

Can we redefine our terms here?

Probably not. It would seem that fat and thin will be with us for a while. Too bad, because I hope we are making some progress establishing that what some people call a "thin client" can be "fatter" than a fat client in terms of functionality and resources. Though I grant you that not all of those resources are "local", that is, part of the specific workstation.

I much prefer the term server managed client. When this is implemented, the design can address many of the vagaries which plague the typical business desktop computing environment. It can make that environment less expensive, yes, and that is beginning to receive more attention in board rooms this year. Y2K had a part to play in that happening. But the key is that managed clients can also make computers a better business tool, more able to support the act of doing business, which is the point, or at least it should be.

But, alas, we appear to be mired in "fat" and "thin". The trade press and some significant vendors are using the terms. The Computer Reseller News recently featured an article headlined: Study: Thin Clients Reduce Support Costs 80%, But End Users Still Resist Switch From PCs.

Perhaps you see the problem? Why does the headline suggest that you must switch from PCs? Because of the concept of a thin client being different from a PC? It certainly doesn't have to be that way. While the article represents an awareness in the industry, it exemplifies the muddled thinking regarding thin clients. But, for advocates of thin clients, it is positive trade press. What is good is that it reflects that user organizations are prepared to consider alternatives. In short, to undertake a thought process. Even shorter: to think!

The article also indicates increased interest from service oriented VARs. Good! These are the right people to carry the new computing models out to users because VARs tend to be solution focused, not necessarily product or technology focused. Of course, not every VAR fits that complimentary description.

Datapro's Peter Lowber is quoted as saying that users have to be engaged early to overcome resistance. That has always been a valid suggestion when implementing any change. However, I would suggest that I could implement a Managed Client solution without users even noticing the change. The change can occur "below the water line", in places in the environment which are not visible to the user.

Oh, I might change the appliance -- the PC -- because not every PC on the desktop is suitable for this environment. But most users consider getting their workstation replaced as a good thing. There is generally no resistance to getting a new machine since it's thought of as a step up. And what appears on the screen could be made to look identical to what appeared on the former "desktop".

But even if I were to implement change and engage users early, as Lowber suggests, I probably wouldn't make an issue of "thin" or "traditional PCs". I would show the user the new applications and benefits. The technical elegance of the implementation would not interest the typical business desktop PC user. Nor should it.

But the technical elegance can be there. When I look at Serenity Systems' Managed Client products and IBM's WorkSpace on Demand, I'm struck by how simple things can be kept. The ability to hold on to things that were working well as change is executed. How the best of different computing models can be converged into a new model: thesis + antithesis = synthesis. It can be a thing of beauty, or it can be Microsoft Windows Based Terminals.

All thin is not created equal

The CRN article then mentions that the survey sample of 25 companies tended to use Microsoft Windows NT Server 4.0, Terminal Server Edition, with Citrix Systems' MetaFrame "for load balancing and session shadowing". Well, support costs may have decreased (according to this limited survey, down 80%) but I'd love to know what happened to Cost of Acquisition, because Windows Based Terminals (WBTs) are not an inexpensive solution.

Windows Based Terminals are not my strong suit, but based on what I think I know, "thin" is not intrinsically good. Do not simply accept what I'm about to say; by all means, check me out. But in my opinion this has to be about the most inappropriate implementation of technology I can imagine for general business use. If the intent was to create an alternative to the evils and vagaries of fat client computing, this is not a step forward. It is simply the other end of the spectrum.

The first element in setting up a Windows Based Terminal solution is that you need an existing NT Server network. Then you need an NT Terminal Server to attach to that NT Server network. Why? I can only presume it's because the applications execute in the Windows Terminal Server, so it's fully utilized in that role.

Then consider what the account is using for workstations, say an IBM NS 2800. With WSOD or Serenity Systems' Managed Clients, the applications would be running on the client: the IBM NS 2800. With those OS/2 based implementations, the client hardware would be functioning as a PC. With WBT, the application is executing in the server. It would seem that the Windows Terminal implementation is very wasteful regarding the client hardware resource. And users will pay for that waste.

And where WBT requires an NT Server and a Terminal Server, WSOD and WiseClients can be managed by the same Warp Server which is performing the LAN's NOS activities. Sure, you can have domain servers, database servers, application servers, file and print servers, boot servers, and so forth. But those would be part of the selected network design. If my network requirements are light enough I could use one Warp Server to address all the user requirements. I get to design the network the way that best meets my user requirements. Not to mention the fact that I consider Warp Server the most reliable Intel NOS available, and Warp Server for e-business to be the superior SMP NOS. But that was really last month's article.

Another serious shortcoming of WBTs is my understanding that a WBT desktop will be the same for all users. If you want station A to have a different desktop than station B, you are going to need some significant services performed. But if identical desktops are OK, it's not a problem, and pretty easy to set up. Unless things change. Compare that with the flexibility the SCOUG members saw in March.

Some thin can also be small

Serenity Systems has a Managed Client SOHO program which supports up to five clients on Warp Server Entry, the only server which should be required for SOHO. Imagine the two implementations: One is essentially five OS/2 V4 clients running off Warp Server, able to be completely managed by a remote workstation dialing into the server. The second is five Windows Based Terminals with two NT servers. And these can only execute Win32 apps inside the Terminal Server. The OS/2 implementation has a great deal more functionality and inherent systems and network management.

When I asked someone how many Windows Based Terminals one should consider attaching to the Terminal Server for planning purposes, I received the expected and reasonable caveat: "Well, that, of course, depends on what they are doing" ("of course") "but reasonable planning would be 12 to 18 workstations." Now that Warp Server for e-business is available, we may learn more about managed client implementations. User activity, design, and hardware configuration would all be very important factors. But I would be extremely disappointed if I couldn't support twice as many managed clients with either our own WiseManager implementation or IBM's WorkSpace on Demand.

Can you imagine a network supporting several hundred Windows Based Terminals? No wonder VARs are smiling and suggesting this is the promised land. And all you wind up with is a Windows Based Terminal. Although you would have Internet Explorer, you would not even have the flexibility of a network computer. And here's a flash: you would not have a choice of browsers. The WBT hardware has to use RDP, a proprietary MS transport protocol which severely limits what a WBT can do.

If you want the flexibility of a network computer, if you want to expand the capabilities of the workstation beyond WBT, you need to go with the added expense of Citrix WinFrame and MetaFrame. Citrix has its own transport protocol, ICA, which allows you to duplicate the WinFrame capability on a supported client accessing Windows Terminal Server. The CRN article mentions that this software is for load balancing and session shadowing, as though they were options like power windows. These are spark plugs. No options here.

So you have the cost of a couple of servers for 12 workstations, the cost of the 12 workstations themselves, and a significant software investment, and we haven't even gotten to the applications. Win32 applications and MS licensing are likely to be what drive any legitimate reason for implementing Windows Terminals. That is, if the user insists on running some Win32 type applications and finds the traditional fat client PC environment too expensive, especially with regard to support expenses, then Windows Terminals are an option.

The CRN article implies that browser based Java applications could reduce the value of Windows Terminals. It cites Datapro and Lowber indicating that Java applications offer more options and "could promote the use of network computers over Windows based terminals". Imagine an implementation which is so restricted that a network computer would constitute an upgrade. Now that's thin.

The issue with network computers is the fact that they are generally very specific appliances. Adding flexibility to a network computing environment is likely to be expensive from a services and customization standpoint. I think network computing tends to work best in a very complex and specialized network. That's what makes network computing a poor choice for general business computing, not the thinness of the hardware client.

I've started to form the view that network computers are actually best positioned as a migration path from legacy host applications to managed clients. Quite the opposite of the strategy IBM announced with WorkSpace on Demand Version 1.0.

What's the point?

The point is that server managed clients are going to be big. It's happening now and will be a big trend this year. Server managed clients are changing the face of business computing. Nothing short of that!

This has nothing much to do with bringing support costs down 80% (though if that's true, it ought to create some interesting "right sizing" personnel actions around the US of A). It isn't about Java based apps and the network. Those are parts of it and significant parts. Continuing the "face" of computing analogy, it would be akin to talking about a person's features instead of his whole face. These are not things that should be separate and discrete. A face is a set of integrated features; what someone looks like. A change in a feature changes a person's face, his "look".

And I'm not talking about some subset of transaction based computer users or users limited to Java based apps on the network. Those are segments of the computing environment. Still features of the face. I'm talking about the face in its entirety, its totality. I'm talking about most of the computers in use today.

But, wait! There's Windows!

How far can one go in changing the face of business computing if one ignores Microsoft Office? Not far. Microsoft has used the file format of Word to determine the NOS for many business LANs. You want to read Word documents? You'd better be using Office, and a particular version, too. No MS Office for 3.1 to MS Office 2000. I've been told the entire MS Office 2000 product is over 500 megabytes. Be afraid. Be very afraid.

The installation program for Office 2000 asks users what they do and then selects components to be installed. Even so, think back to Nick Petreley's comment years ago that giving a user an office suite was like giving them a desk the size of Cleveland. Well, continuing Nick's analogy, a desk the size of Ohio has arrived on the loading dock. Where do you want it? Do you really want Office at all?

If you use Office, you need Win9x or NT. If you're going to use the MS apps and OS, you should use the MS NOS, since you already have so many of the parts and the skills. And BackOffice fits sooooo well into this. Essentially, Microsoft is using a word processing format to leverage the selection of the NOS. Amazing. If this area isn't addressed, change will be slower and more segmented.

Keep in mind that thin clients are not thought of as general business desktops. Those are PCs. But managed clients are PCs, too. Managed Clients can work up to the capacities of business desktop computers and deliver the benefits associated with network computing or the thin client environment. That is why managed clients offer the opportunity to change the face of business computing. Managed Clients can go places thin clients can't, but they can bring the same set of user related benefits with them.

The applications are a different story. A 500 meg application like MS Office 2000 running across the network? An office suite has really been a fat client application, not a network friendly application. But Star Office is a reasonable network office suite solution. And IBM, Lotus, and NCSD seem to have made a decision to address this. A new e-Suite is available for public testing and is positioned as a WSOD solution. While e-Suite has limitations regarding document exchange with a product like Office, putting the document exchange filters from SmartSuite on the server would address the requirements of most users. Will Lotus do this? I don't know, but it would be a good idea. It would create a real market for the products.

Lotus SmartSuite Millennium 1.0 was not a network friendly product. But groups in IBM are working on getting release 1.1 working on WSOD. This shows an awareness of the requirement.

Stop thinking thin! Stop worrying about users losing control. For one thing, business computing is not going monolithic. One type of client isn't going to do it all. Some folks need different computers and features, just like some folks need different phone sets. But most business users can be supported by one model, whether we're talking phones or computers.

This is why organizations will be taking a harder look at why they need to run Windows and Windows apps. Windows is expensive enough, but start adding Intel's LANDesk and Novell's Application Launcher to the equation. Nice products and ideas, but you only need them because you stuck yourself with Windows.

People are prepared to consider alternatives for their computing environment. To retake control of their operations. This is the reason that NT has stalled in server market share. In my first article (March) I mentioned that Windows had reached its high-water mark. Now watch; the tide is going to start to go out. And just like the tide, don't expect to see it move by sitting and watching. It will be gradual. Windows and Microsoft are not about to be displaced, but the tide has turned.

Return of Fire Fred Syndrome?

A digression, but as we talk about things to watch for: Business users are beginning to experience a renewal of the old "Fire Fred" syndrome, although they are still very early in the curve. Old timers, like me, remember IBM's executive briefing program in San Jose called "the Institute". It was there that I heard about the "Fire Fred" syndrome which plagued early DP (Data Processing) centers.

IBM would go in and sell a bunch of stuff to the "glass house", as the DP shops were called. These big new computers were a source of pride to these companies. Edifices were created, with raised floors and a cool environment. These companies couldn't bear to hide the computers from view. So the walls were glass, allowing employees to see and be inspired by the technology used by the company. This was, after all, the 1970s.

At some point, the executive team would become concerned over the spiraling costs associated with the DP organization. They would come to the inescapable conclusion that the MIS director was a megalomaniac creating an empire. It was now time to "fire Fred". Of course, it was the good IBM rep's job to nip such activities in the bud by helping to establish a glowing Return on Investment presentation, thus saving Fred.

The Fire Fred syndrome was not new. It existed before DP. This was just a new implementation. And I am beginning to see the same early indicators in business today. There is a reason Bill Gates is the richest man in the world. That money did not always belong to him. Microsoft is an expensive way to build a computing environment, and it is also a slippery slope. Start to add MS content and you could find that you need to convert everything to MS, or so MS would have you believe.

There was a time when IBM could count on getting a certain amount of business based on the paradigm, "Nobody ever got fired for ordering IBM products". Then the paradigm shifted. Now you hear "Nobody ever got fired for buying Microsoft." But, you know, I do believe I'm hearing it less. It might still be true, but when it is said, it's said more quietly now. So, free advice to Windows shops: Don't be a Fred.

But Microsoft is not alone

None other than industry leader IBM, fast becoming a champion and advocate for managed clients, is here to help and to try to improve Windows as a managed client with WorkSpace on Demand. WSOD began as IBM's attempt to resurrect OS/2 as a migration platform for OS/2 users, encouraging them to move off OS/2 and on to NCs with Java. Actually, not a bad idea for those who needed it. Just not many people appear to need or perhaps want it yet.

But we will bypass the historical perspective. What IBM NCSD seems to be discovering is that managing workstations is a valid business requirement which they could possibly address. Not simply a stepping stone in a stream. Managed clients can be a terminal point, a destination, a product, even if the product is more of a service.

The initial WorkSpace on Demand (WSOD) was completely OS/2 focused, marketed only to existing OS/2 users. But the recent announcement of the WSOD Feature for Windows Clients (FWC) denotes a change from supporting OS/2 to an attempt to manage the dominant client: Windows. Now that this door is open it will be interesting to see where IBM goes with this. WSOD can claim to support OS/2, DOS, Windows, and Java clients. But the server is still OS/2 and the managing workstation is still OS/2. Considering IBM's customer set and IBM's track record, it makes sense for server support to be broadened as well.

But the issue I have with IBM right now is that they have tended to ignore the obvious, natural client: OS/2. I know that WSOD delivers an OS/2 based client, but the haste to be "thin" caused them to miss an opportunity. Clearly, that's my opinion, since Serenity Systems elected to use Warp 4 as a client. Now IBM appears less focused on OS/2 than on Windows. No news there. But instead of trying to manage the clients that are out there, IBM could create a superior managed client and move users over to it. OS/2 would be a clear choice as the platform for such a client.

In my opinion, managed clients are suitable for the vast majority of business users. They are an excellent choice for niche users like bank tellers, reservation clerks, and so forth, but the same benefits can be made available to so many more users. Warp 4 is a world class platform for a managed client: reliable, responsive, reasonable in cost, and, very, very importantly, good at managing hardware resources.

I think one of the unintended results of the WSOD FWC may be that it will demonstrate just how good OS/2 is as a managed client. The Windows client cannot touch the functionality of the OS/2 based client. The fact is that today, Warp 4 is the best platform for a managed client. Linux will give it a run for its money over time, as support grows. But for now, the best choice would be OS/2.

WSOD opens a Window

Now that IBM has enfolded Win32 clients into WSOD, users are positioned to learn more lessons. And these users will teach the rest of us. Steven Sharrad of Hensley College (UK) outlined his experiences and observations using the beta of the WSOD 2.0 Feature for Windows Clients. I like the document, which was submitted to the Aurora beta distribution list on the OS/2 Super Site.

It is worthwhile in the same way that every document created by critical users, based on their experiences and related directly to their requirements, is worthwhile. Mr. Sharrad describes his challenge this way: "How to recreate this perfect, secure and cost-effective environment (OS/2 RIPL clients running Win3.1 environment) using either Windows 95 or Windows NT as the desktop platform."

I'll wait to see what type of response, if any, IBM offers. I think the key issue here is that WSOD does not provide a panacea to Windows users. It requires that you begin with an installed Windows client, and when the dust clears, you still have a Windows client. One wag said that the essence of Mr. Sharrad's complaints is that there is too much Windows content in the Windows feature.

There might be something to that, but Mr. Sharrad does demonstrate a clear picture of a user who wants managed clients. And his challenge is that the clients he wants managed are Windows clients. Well, that's unfair. The people he works for want Windows clients. It rather reminds me of the Monty Python pet store skit (not the Norwegian Blue one) in which Cleese requests a pet. Palin, not having the type requested, offers to convert a terrier: chopping off the dog's legs, adding some fins, and turning it into a fish.

What makes the skit so funny isn't the fact that it cannot be done. But the fact that both Palin and Cleese discuss the various conversions (terrier to parrot, terrier to fish) in such a deadpan and nonchalant fashion. Once Cleese asks "Can you do that?" and Palin says "Oh, yes", they're off. Let's face it. There are some things man was never intended to do. Running Windows in a "perfect, secure, and cost effective environment" would appear to be one of those.

The important thing

The important thing is that these issues and problems are receiving more focus. Part of why that's important is that it means focus is moving away from products and toward problems which need to be solved. And that is important because of the thought process that is implied.

As these new product solutions emerge, features, functions, benefits, costs, and return on investment all become factors. And that is how it should work. The recent past has been, and the immediate future will be, dominated by how to make Windows work. That is not a good business objective unless you are Microsoft.

As the exploration and research is undertaken, the value of managed clients emerges. At present, managed clients are not in the mainstream of computing. In the next 18 months, managed clients will gain equal footing with traditional PCs in a way that thin clients never could.

Conclusion

If it helps acceptance, I'm not above calling a managed client a thin client. As long as thin clients are perceived as a subset of a PC they can never be a major component of the computing environment. Thin clients need to be repositioned as a superset of a PC by explaining that this is not a connection to the network but an integration with the network. The managed client uses the network as part of itself, not as an extension of itself. The managed client is not connected to the network. The managed client is the network.

And that's how a thin client winds up fatter than a fat client.

Copyright © 1999 - Falcon Networking ISSN 1203-5696
June 1, 1999