Application delivery approaches
The way you deliver applications to your clients is the fundamental issue discussed in this article. Web client, Server Based Computing, Smart Clients, Thin Client computing and Consolidated Client Architectures are all terms that are used and misused in this context. This article attempts to cover just the key concepts; talk to me or comment if you want to discuss further. This is an article I wrote a couple of years ago, but I thought it was worth a repost as it complements a previous post where I commented on a useful discussion by Brian Madden.
It’s important to note that Server Based Computing is not the only approach; this article discusses all of the mainstream approaches and the main evolving ones. However, because Server Based Computing is the most popular generic term in use, it has been reused here.
What problem is Server Based Computing trying to solve?
First we need to understand the issues we are trying to solve with these technologies:
- Installing applications on many client devices is difficult for a whole host of reasons, but mainly because:
  - the clients are not all exactly the same, so what works on one does not work on all
  - it’s difficult to predict who needs which applications
  - some of the people we want to use our applications work on client devices that are not under our control, so we are not able to install software on them
  - it takes time and effort to package up an application in such a way that it can be automatically deployed and does not conflict with other applications or the PC operating system in undesirable ways
- Installing applications requires administrative privileges on the PC; most companies do not allow users to have these privileges because of the security, acceptable use, compliance management and TCO implications.
- Once an application is installed it needs to be maintained with bug fixes, configuration changes and new versions
- When people move location, visit another location or work from home they want to be able to continue to access their environment
- When PC hardware fails people want to be able to just plug in a new machine, or sit at the desk next to them and carry on
- Deploying applications to many PCs takes time, introduces risk, and costs money; these three factors create an inertia that resists change. The result is that the client software gets out of date, or different versions exist on different machines, resulting in inflexibility.
- In environments where theft is a risk, PCs are very attractive targets!
- Using appropriate technology, SBC can provide desktop platform independence and increase the usable life of desktop equipment.
- It eliminates the need to upgrade desktop hardware in order to support new application services or upgrades.
- Using appropriate technology, the solution can provide more predictable WAN utilisation.
- Centralised processing resources can be rapidly reassigned during temporary periods of increased utilisation of business-critical application services. This reduces the need to procure additional equipment to accommodate these periodic peaks in utilisation, for example year-end processing.
Why do we continue to deploy PCs?
These problems are pretty serious ones, so next it’s important to understand why we still often deploy PCs:
- Many applications do not work, or are not supported unless they are running locally on a PC
- Many applications can only be cost-effectively deployed on PCs
- We have automated management tools, packaging tools and conflict resolution tools that help us get closer to the objective of managing thousands of PCs with a similar fixed cost to managing hundreds, and a very small variable cost per extra PC.
- Many applications integrate at the client, so multiple applications delivered from different servers do not provide the same user experience
- Users often get confused by the extra complexity of some alternatives
- Some people need access to applications when they don’t have access to the network, or when that connection is too slow or unreliable
Where is Server Based Computing most popular today?
Despite some of the issues raised above, Server Based Computing is increasingly popular, and very popular for certain scenarios:
- Delivery of applications to clients that are unmanaged or managed by a third party
- Delivery of line of business applications to large numbers of casual users
- Delivery of software for test and evaluation
- Rapid, on demand deployment
- Rapid removal of application services
- Delivery of applications to unsupported locations like branch offices
- Delivery of applications to hostile environments, or high theft risk environments, or environments needing maximum flexibility
- Delivery of applications to task workers and structured task workers with a small number of applications and well-defined processes
- Enforces “business use only” disciplines
- Centralised data management and security
- Provides the flexibility to rapidly and securely enable access to business applications for external business partners or new business units, without having to invest in additional infrastructure.
What are the alternatives?
There is, however, a wide array of technologies that solve these problems; it’s worth restating the basic approaches to solving the traditional PC application delivery problem:
- True thin client. Make the client device as simple as possible, ideally stateless, i.e. you can plug in another one and it will just work. Don’t allow any applications to execute on the client; just allow presentation. In this alternative all applications are server-based.
- Re-buildable client. Maintain a record of the desired state of a device on a server; if a device fails for whatever reason, its ‘state’ can be recreated fairly rapidly from the server. Many systems management tools allow this, and operating systems are getting better at it all of the time (a brief sketch of the desired-state idea follows this list). In this alternative, applications can be delivered using all of the techniques described below.
- Connect to your PC from anywhere. Rather than run all of your applications on the server, it’s possible to use a traditional PC most of the time; however, when you are at home or working at another business location, you connect over the network to your PC and use remote display technologies.
- House your PC in the data centre. Solutions are emerging that allow users to connect to an individual PC (a blade-format device) housed in a data centre. The PC system unit is accessed over the network, and if your PC fails it’s easy to swap to another. This option is commonly described as Consolidated Client Infrastructure or CCI.
- Execute applications on the server, and run the minimum client-side code to render the display and manage keyboard, mouse and peripheral connectivity. X Windows, Windows Terminal Services and terminal emulation products all fall into this category. More than 80% of Windows Terminal Services and Citrix deployments are actually delivering applications to Windows clients rather than thin clients.
- Download web pages and scripts in real time. Clients allow simple presentation and sometimes validation code to execute on the client, but download the application in real time, every time you need it. The key thing is that no change to the configuration of the client is needed for the application to download and run. Many web applications fall into this category, and the vast majority execute JavaScript/JScript on the client.
- Download applications that rely on client platform extensions. These clients have a fairly rich set of standard services installed that let application code be downloaded in real time and executed safely, normally in the browser, but they are not as general purpose as .NET or Java. Internet Explorer itself falls into this category, as it includes significant functionality that’s not pure HTML and CSS. Flash and other ActiveX controls or alternative plug-in standards are more obvious examples.
- Download complex applications in real time. Allow more complex applications to execute on the client, but download the application in real time, every time you need it. The key thing is that no change to the configuration of the client is needed for the application to download and run, and the applications are self-maintaining, i.e. new versions are downloaded in real time from the server. Java applications and some Microsoft .NET Framework v2 applications using ClickOnce deployment fall into this category.
There is a variant of this option where the application does change the configuration of the client; these applications often provide tight integration with the operating system, high-performance graphics, integration with local peripherals, etc. However, they are still deployed in real time and self-maintaining.
- Store the file on a file server, but execute it on the client. Some applications will work that way, but often applications need to be installed on the client to run correctly.
- Package an application in such a way that it is installed in real time when a user first invokes it. Some Linux distributions and SoftGrid for Windows provide specialist tools to achieve this that provide application isolation features and optimise the packaging to minimise download delays. These products also ensure that the configuration of the PC operating system is unchanged, ensuring that other applications are not affected by the installation and that the application can be automatically uninstalled.
- Manual installation. Manually install an application by running an installation script; a refinement of this is to provide some form of online catalogue from which people can download and then install applications. For most applications (see above for different approaches) the installation will change the configuration of the client (create shortcuts, install files into shared areas, change the registry) and will therefore require administrative privileges (see the installation-script sketch after this list).
- Push installation. A refinement of the previous approach: applications are automatically distributed to the person’s PC using an automation tool (SMS, Unicenter, LANDesk) and the system executes the installation in the background. The decision to distribute the application may be manual (a list of PCs), or may be based on the PC being identified automatically as the result of matching a query (all ThinkPad X23s), being added to a group (everyone in Finance) or some other policy (everyone at location B). A further refinement of this is publishing, where a stub is installed which shows the application’s icons, file type registrations etc.; when the application is first invoked it is installed on demand (see the targeting and publishing-stub sketches after this list).
- Application appliances or virtual machines. Using a technology like VMware Player, a whole operating system and set of applications can be installed by copying a file (or a couple of files) to the PC. This virtual environment may contain a whole managed or unmanaged PC, in which case all of the above application delivery discussions apply equally to the virtual PC. However, the virtual PC may in fact be thought of as an application (or an appliance); examples might be a virtual PC that is actually a complete functioning database server, a proxy server, or an isolated (hence safe) web browsing environment. Using VMware ACE it is possible to provide a PC appliance that is configured and locked down to provide a very well-defined role.
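
To make the re-buildable client idea more concrete, here is a minimal sketch of desired-state reconciliation, assuming the server simply holds a target package list per device. The package names are hypothetical, and the print statements stand in for calls to a real systems management tool.

```python
"""Minimal sketch of the 'desired state' idea behind a re-buildable client.
Package names are hypothetical; prints stand in for a real installer."""

# Desired state held centrally for this device (package name -> version).
desired = {"office-suite": "11.0", "erp-client": "4.2", "antivirus": "7.1"}

# What a freshly re-imaged or partially broken PC actually reports.
installed = {"antivirus": "6.9"}

def reconcile(desired: dict, installed: dict) -> None:
    # Install or upgrade anything that is missing or at the wrong version.
    for name, version in desired.items():
        if installed.get(name) != version:
            print(f"install {name} {version}")
    # Remove anything that should no longer be present.
    for name in set(installed) - set(desired):
        print(f"remove {name}")

reconcile(desired, installed)
```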
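
As a companion to the manual installation bullet, here is a minimal sketch of what an installation script typically does on Windows: copy files to a machine-wide location and record the install in the registry. The application name, file-share path and install directory are hypothetical, and writing to HKEY_LOCAL_MACHINE is exactly what makes administrative privileges necessary.

```python
"""Minimal sketch of a manual installation script (Windows assumed;
the application name and paths are hypothetical)."""
import shutil
import sys
import winreg
from pathlib import Path

APP_NAME = "ExampleApp"                                   # hypothetical
SOURCE_DIR = Path(r"\\fileserver\packages\ExampleApp")    # hypothetical share
INSTALL_DIR = Path(r"C:\Program Files\ExampleApp")

def install() -> None:
    # Copy the application files into a machine-wide location.
    shutil.copytree(SOURCE_DIR, INSTALL_DIR, dirs_exist_ok=True)

    # Register the application so it appears in Add/Remove Programs;
    # writing under HKEY_LOCAL_MACHINE requires admin privileges.
    key_path = (r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"
                + "\\" + APP_NAME)
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
        winreg.SetValueEx(key, "DisplayName", 0, winreg.REG_SZ, APP_NAME)
        winreg.SetValueEx(key, "InstallLocation", 0, winreg.REG_SZ,
                          str(INSTALL_DIR))

if __name__ == "__main__":
    try:
        install()
    except PermissionError:
        sys.exit("Installation requires administrative privileges.")
```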
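
This next sketch illustrates the targeting decision behind push installation: a query on hardware model, membership of a group, or a location policy. The inventory records are invented for illustration; real tools such as SMS, Unicenter or LANDesk evaluate these rules against their own inventory databases.

```python
"""Minimal sketch of push-installation targeting.
Inventory data and rules are hypothetical."""
from dataclasses import dataclass

@dataclass
class PC:
    name: str
    model: str
    department: str
    location: str

# Hypothetical inventory, normally collected automatically by the tool.
inventory = [
    PC("WS001", "ThinkPad X23", "Finance", "Location A"),
    PC("WS002", "ThinkPad X23", "Sales", "Location B"),
    PC("WS003", "Desktop 745", "Finance", "Location B"),
]

def matches_query(pc: PC) -> bool:     # "all ThinkPad X23s"
    return pc.model == "ThinkPad X23"

def in_group(pc: PC) -> bool:          # "everyone in Finance"
    return pc.department == "Finance"

def matches_policy(pc: PC) -> bool:    # "everyone at Location B"
    return pc.location == "Location B"

targets = [pc.name for pc in inventory
           if matches_query(pc) or in_group(pc) or matches_policy(pc)]
print("Distribute package to:", targets)
```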
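
Finally, a minimal sketch of the publishing idea: the shortcut the user sees is just a stub that installs the application on first use and then launches it. The executable path and the MSI package location are hypothetical.

```python
"""Minimal sketch of a publishing stub that installs on first invocation.
Windows assumed; the paths are hypothetical."""
import subprocess
from pathlib import Path

APP_EXE = Path(r"C:\Program Files\ExampleApp\example.exe")  # hypothetical
PACKAGE = r"\\fileserver\packages\ExampleApp.msi"           # hypothetical

def launch() -> None:
    if not APP_EXE.exists():
        # First invocation: install silently on demand.
        subprocess.run(["msiexec", "/i", PACKAGE, "/qn"], check=True)
    # Then start the application as normal.
    subprocess.run([str(APP_EXE)], check=True)

if __name__ == "__main__":
    launch()
```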
Which mix of these approaches fits your business need will depend on trading off performance, flexibility, usability and cost. It will also depend on your application portfolio and which of the approaches your applications support.
Hi Steve, I know this is an ancient article but I was impressed by the content and the clarity. Maybe you can help me as a lot of research has still not helped.
I am trying to use ClickOnce to deploy an application at a client that has several checking counters for outgoing orders.
The application uses a fixed scanner like the checkouts at supermarkets. This all works great with a conventional installation but, given that the client has multiple outlets about 1000 miles apart, this would be a painful distribution.
Using a ClickOnce deployment, how do I use local peripherals? The DLLs are listed in the distribution manifests but don’t seem to initialise the local scanner.
Your input greatly appreciated.