Ron Munitz

Display protocols have been around for decades. It may even be argued that they have been around for centuries: ever since the prehistoric days of the computing industry’s predecessors, back in the late 19th century, technology has offered the wonder of taking data stored in one’s mind, feeding it to a “Terminal Server” such as the typewriter, which performed some sort of processing, and emitting the output to a display such as a sheet of paper.

Nubo CTO Ron Munitz examines the evolution of display protocols, from typewriters to distributed mobile OS display servers, in this first post of a series on the Nubo Blog.

Ever since then, technology has evolved greatly, introducing complex graphical platforms such as Microsoft Windows and GNU/Linux that obtain input from various sources (keyboards, mice, applications talking to one another and so on), perform complex computations, and emit state-of-the-art graphics to some display hardware.

Naturally, given the distributed nature of computing in the 20th century, the need arose to have input provided remotely, at an arbitrary location, transmitted “over the wire” to the Terminal Server to do its magic, and to have the resulting display emitted remotely as well. All of this happens over some network, completely remotely, and putting this entire framework together is commonly known as using Terminal Services via Remote Display Protocols.

Remote display protocols work in the following way: they rely on intimate knowledge of how the operating system processes input events (e.g. the way the kernel receives an interrupt for an incoming keystroke) and how it renders information to the display (e.g. how a matrix of pixels is written to the graphics card’s memory), and they apply this knowledge to reproduce the same actions on two different entities: the client and the Terminal Server.

The client is software responsible for interacting with the user. It gathers input events on the hardware and operating system it is running on, puts them “on the wire” and sends them to the remote Terminal Server. It is also responsible for receiving display information from the Terminal Server and emitting it locally to the user. In other words, the client is an I/O pipe: it receives input from the user, displays output to the user, and runs no business logic whatsoever.

The Terminal Server is the system that actually executes the business logic of the applications, has exclusive access to the data, and is the only entity that really runs the applications of concern. It receives no direct user input or output on its local hardware; instead, it translates the input events it gets from the client into its operating system’s internal structure format and injects them into the application of concern, as if these events had been received locally by the Terminal Server hardware, at its local interfaces. In a similar fashion, the graphics the application would otherwise send to local display hardware are projected “onto the wire” rather than onto a screen, and sent over to the client.
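
To make this division of labor concrete, here is a minimal sketch in Python of the round trip described above. It is purely illustrative: the length-prefixed JSON framing, the event fields and the helper names (send_message, recv_message, inject_into_application) are assumptions invented for this example, not the wire format of any real protocol such as RDP, VNC or X11.

```python
import json
import socket
import struct

# Hypothetical wire format, for illustration only: a 4-byte length prefix
# followed by a JSON-encoded event. Real protocols use far more compact,
# carefully versioned binary encodings.

def send_message(sock, payload):
    """Serialize an event and put it 'on the wire'."""
    data = json.dumps(payload).encode("utf-8")
    sock.sendall(struct.pack("!I", len(data)) + data)

def recv_message(sock):
    """Read one length-prefixed message off the wire."""
    (length,) = struct.unpack("!I", sock.recv(4))
    return json.loads(sock.recv(length).decode("utf-8"))

def inject_into_application(event):
    """Stand-in for the Terminal Server translating the event into its
    operating system's internal input structures and delivering it to the
    application as if it had been typed locally."""
    print(f"server injects: {event}")

if __name__ == "__main__":
    # socketpair() stands in for the real network so the sketch runs as-is.
    client_end, server_end = socket.socketpair()

    # The client captures a local keystroke and forwards it...
    send_message(client_end, {"type": "key_down", "keycode": 30})

    # ...and the Terminal Server replays it for the application it hosts.
    inject_into_application(recv_message(server_end))

    # Display data flows the other way: the server sends rendered regions
    # (in practice compressed and incremental) back to the client to draw.
    send_message(server_end, {"type": "dirty_rect", "x": 0, "y": 0,
                              "w": 16, "h": 16, "pixels": "..."})
    print(f"client draws: {recv_message(client_end)}")
```

Everything that makes real terminal services hard, such as compression, caching, ordering and latency hiding, lives behind those two trivial helpers.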

Since both input and output go over the wire (or over the network, to be more formal), remote display protocols are sensitive to network conditions, which can greatly impact the user experience, making latency and bandwidth considerations the Achilles’ heel of all of them. Input processing is in most cases small in terms of the bandwidth it requires. The other part of the puzzle, however, emitting the display from the server onto the client’s screen, can be extremely challenging and bandwidth-expensive.
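
To see why the display side dominates, a back-of-the-envelope calculation helps; the numbers below are illustrative assumptions, not measurements of any particular protocol.

```python
# Rough comparison of input traffic versus naive, uncompressed display traffic.

EVENT_SIZE_BYTES = 32        # assumed size of one encoded keystroke/mouse event
EVENTS_PER_SECOND = 20       # a fast typist plus some mouse movement

WIDTH, HEIGHT = 1920, 1080   # Full HD screen on the Terminal Server
BYTES_PER_PIXEL = 4          # 32-bit RGBA
FRAMES_PER_SECOND = 30

input_bandwidth = EVENT_SIZE_BYTES * EVENTS_PER_SECOND
display_bandwidth = WIDTH * HEIGHT * BYTES_PER_PIXEL * FRAMES_PER_SECOND

print(f"input:   {input_bandwidth} bytes/s")               # 640 bytes/s
print(f"display: {display_bandwidth / 1e6:.0f} MB/s raw")  # ~249 MB/s
```

That gap of several orders of magnitude is exactly why real protocols compress, cache, and send only the screen regions that actually changed.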

Many display protocols have been developed to optimize these aspects for the various operating systems around; the most famous ones are VNC, X11 and RDP, to name a few. These have been around for decades, all targeting a desktop or even laptop client running a general-purpose operating system such as Windows, Unix, Linux or their likes, executing software on a remote operating system of the same kind.

But what about mobile apps? These have significantly changed the rules of the game. No more desktop UI, and no more stable, wired Ethernet LANs. Everything now goes mobile, for better and worse. The game has changed, and when the game changes, technology catches up. The next article will explain how, why and when, describing the challenges behind the adoption of mobile terminal servers.

Kenny Sahr

It’s time we had a mature discussion on device lifecycles. We expect most things that we buy to last a few years. When you buy a new pair of jeans or a shirt, you expect it to last 3-5 years. You probably own sweaters whose purchase date you’d rather not admit. They look nice and do the job, so why throw them away? When it comes to furniture or anything in the kitchen, we expect things to last 5-20 years.

Why is it that we expect items in our homes to last for ages yet accept one year lifecycles for our mobile devices?

Things are different in the mobile world. If you listen to the pop culture and advertisements around you, you may think that the only way to keep up with the times is to buy a new smartphone and tablet with every new major release. I confess that I own a Galaxy S1 Mini. It dials, lets me receive calls and has an app that tells me when the next train leaves the station. I’d rather invest in my Beatles collection than a 4 inch screen.

Before you buy a new smartphone or tablet, make your own decision as to the acceptable lifecycle. How long will you use it before you buy “the latest, greatest model?” If you don’t decide, society will decide for you. Samsung released the Galaxy S in June 2010. Three years later, we see the media clamoring for the Galaxy S5. The first iPhone was released in 2007. The “latest new” iPhone 5C and 5S models are the 7th generation of iPhones (iPhone, 3G, 3GS, 4, 4S, 5 and the 5C/5S).
I sure hope most of you haven’t bought every model in between. Don’t get me wrong – I’m a big fan of both ecosystems and hope they both keep pumping out amazing mobile technologies for decades to come. I just think it’s time we took a deep breath to reflect on how long we should keep devices that cost between $300 and $800.

What is the perfect device lifecycle? It depends. If you work in a tech startup, expect to succumb to “device peer pressure” every 18 to 24 months. If the only time you are mobile is during your commute to work, I don’t think there’s anything wrong with making a rule – “I will keep my new device for 3 years.” If you only make it to 2 and a half years before you break down and buy a “killer smartphone,” you still win.

If you are a heavy tablet user, go with every 3 years and keep your smartphone for 4 years. If that’s hard to swallow, think of it this way – do you really want to buy 5-8 smartphones every decade?! Unless you’re buying that many houses, restrain yourself. I have a friend here at Nubo who is proud of his inexpensive Chinese smartphone. He and I use our tablets more often and we prefer a smartphone that is a “square rather than a large rectangle.”

I encourage all of you to sit down with your friends and family and talk about how long you intend to keep your smartphones and tablets. Remember the lyric from The Eagles’ Hotel California? “We are all just prisoners here, of our own device.” Are you?