Ron Munitz

Display protocols have been around for decades; it may even be argued for centuries. Ever since the prehistoric days of the computing industry’s predecessors, back in the late 19th century, technology has brought the wonder of taking data stored in one’s mind, feeding it to a “Terminal Server” such as the typewriter, which did some sort of Processing and emitted output to a Display such as a sheet of paper.

Nubo CTO Ron Munitz examines the evolution of display protocols, from typewriters to distributed mobile OS display servers, in this first post of a Nubo Blog series.

Ever since then, technology has evolved greatly, introducing complex graphical platforms such as Microsoft Windows and GNU/Linux: systems that obtain input from various sources such as keyboards, mice, and applications talking to one another, perform complex computations, and emit state-of-the-art graphics to some Display hardware.

Naturally, given the distributed nature of computing in the 20th century, the need arose to have input provided Remotely, at an arbitrary location, transmitted “over the wire” to the Terminal Server to do its magic, with the output emitted Remotely to a display. All of this happens over some network, entirely remotely, and putting this whole framework together is commonly known as using Terminal Services via Remote Display Protocols.

Remote display protocols work in the following way: they use intimate knowledge of the way the operating system processes input events (e.g., the way the kernel receives an interrupt for an incoming keystroke) and of the way it renders information to the display (e.g., how a matrix of pixels is written to graphics card memory), and apply this knowledge to reproduce the same actions on two different entities: the Client and the Terminal Server.
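
To make the idea concrete, here is a minimal sketch of what an input event traveling “over the wire” might look like. The event types and encoding below are illustrative assumptions, not the format of any real protocol:

```python
import struct

# Hypothetical wire format, for illustration only: a 1-byte event type
# followed by a 4-byte payload, big-endian. Real protocols (X11, VNC, RDP)
# each define their own, far richer encodings.
EVENT_KEY_PRESS = 1
EVENT_MOUSE_MOVE = 2

def encode_key_press(keycode: int) -> bytes:
    """Pack a keystroke the way a toy client might put it 'on the wire'."""
    return struct.pack(">BI", EVENT_KEY_PRESS, keycode)

def decode_event(data: bytes) -> tuple:
    """Unpack (event_type, payload) on the Terminal Server side."""
    return struct.unpack(">BI", data)

message = encode_key_press(65)   # keycode for 'A' in this toy keymap
print(decode_event(message))     # (1, 65)
```
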
The Client is a piece of software responsible for interacting with the user. It gathers input events on the hardware and operating system it is running on, puts them “on the wire,” and sends them to the remote Terminal Server. It is also responsible for receiving display information from the Terminal Server and emitting it locally to the user. In other words, the client is an I/O pipe: it receives input from the user, displays output to the user, and runs no business logic whatsoever.
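
A client built along these lines reduces to a simple I/O loop. The sketch below is a toy illustration; read_local_input and draw_to_screen are stubs standing in for platform-specific input capture and rendering:

```python
import socket

def read_local_input() -> list:
    """Stub: a real client would poll the local keyboard/mouse here."""
    return []

def draw_to_screen(frame: bytes) -> None:
    """Stub: a real client would decode and blit the frame to the display."""
    print(f"received {len(frame)} bytes of display data")

def run_client(server_host: str, server_port: int) -> None:
    """Toy remote-display client: a pure I/O pipe with no business logic."""
    with socket.create_connection((server_host, server_port)) as sock:
        while True:
            # 1. Forward local input events to the Terminal Server.
            for event in read_local_input():
                sock.sendall(event)
            # 2. Receive display updates and emit them locally to the user.
            frame = sock.recv(65536)
            if not frame:
                break  # server closed the session
            draw_to_screen(frame)
```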

The Terminal Server is the system that actually executes the business logic of the applications, has exclusive access to the data, and is the only entity that really runs the applications of concern. It receives no direct user input or output on its local hardware; rather, it translates the input events it gets from the Client into its operating system’s internal structure format and injects them into the application of concern as if these events had been received locally, at the Terminal Server’s own interfaces. In a similar fashion, the graphics the application would otherwise send to local display hardware are projected “onto the wire” rather than onto a display, and sent over to the Client.
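
The Terminal Server side is the mirror image: decode events arriving from the wire, inject them as if they came from local hardware, and project the rendered pixels back. Again a toy sketch; inject_input and capture_framebuffer are stubs for OS-specific mechanisms:

```python
import socket

def inject_input(event: bytes) -> None:
    """Stub: a real server would translate the event into the OS's internal
    input-event format and inject it as if it came from local hardware."""
    print(f"injecting {len(event)}-byte input event")

def capture_framebuffer() -> bytes:
    """Stub: a real server would read the pixels the application would
    otherwise send to a local display."""
    return b"\x00" * 1024  # a dummy 1 KB frame

def run_server(port: int) -> None:
    """Toy Terminal Server loop: input in, pixels out."""
    with socket.create_server(("", port)) as listener:
        conn, _addr = listener.accept()
        with conn:
            while True:
                event = conn.recv(4096)
                if not event:
                    break  # client disconnected
                inject_input(event)
                # Project the resulting graphics "onto the wire" instead
                # of onto a local display.
                conn.sendall(capture_framebuffer())
```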

Since both input and output go over the wire (or over the network, to be more formal), remote display protocols are sensitive to network conditions, which can greatly impact the user experience, making latency and bandwidth considerations the Achilles’ heel of all of them. The input direction is in most cases small in terms of the bandwidth it requires. The other part of the puzzle, however, emitting the display from the server onto the client’s screen, can be extremely challenging and bandwidth-expensive.
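
A back-of-the-envelope calculation shows why the display direction dominates. Assuming, purely for illustration, an uncompressed 1280x800 screen at 4 bytes per pixel refreshed 30 times per second:

```python
width, height = 1280, 800   # assumed screen resolution
bytes_per_pixel = 4         # 32-bit RGBA
fps = 30                    # assumed refresh target

display_bandwidth = width * height * bytes_per_pixel * fps
print(f"{display_bandwidth / 1e6:.0f} MB/s")  # ~123 MB/s, roughly 983 Mbit/s

# The input direction, by contrast, is tiny: even 20 key events per second
# at a few bytes each stays well under 1 KB/s.
```

This is exactly why real protocols resort to sending only damaged screen regions, compressing aggressively, or transmitting drawing commands instead of raw pixels.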

Many display protocols have been developed to optimize these aspects for the different operating systems around; the most famous would be VNC, X11 and RDP, to name a few. These have been around for decades, all targeted at a desktop or even laptop client running a general-purpose operating system such as Windows, Unix, Linux, or the like, executing software on a remote operating system of the same kind.

But what about mobile apps? They have significantly changed the rules of the game. No longer a desktop UI, and no longer stable Ethernet networks over a wired LAN. Everything now goes mobile, for better and worse. The game has changed. And when the game changes, technology catches up. The next article will explain how, why and when, describing the challenges behind the adoption of mobile terminal servers.

Kenny Sahr

It’s time we had a mature discussion on device lifecycles. We expect most things that we buy to last a few years. When you buy a new pair of jeans or a shirt, you expect it to last 3-5 years. You probably own sweaters you bought longer ago than you’d care to admit. They look nice and do the job, so why throw them away? When it comes to furniture or anything in the kitchen, we expect things to last 5-20 years.

Why is it that we expect items in our homes to last for ages yet accept one year lifecycles for our mobile devices?

Things are different in the mobile world. If you listen to the pop culture and advertisements around you, you may think that the only way to keep up with the times is to buy a new smartphone and tablet with every new major release. I confess that I own a Galaxy S1 Mini. It dials, lets me receive calls and has an app that tells me when the next train leaves the station. I’d rather invest in my Beatles collection than a 4 inch screen.

Before you buy a new smartphone or tablet, make your own decision as to the acceptable lifecycle. How long will you use it before you buy “the latest, greatest model?” If you don’t decide, society will decide for you. Samsung released the Galaxy S in June 2010. Three years later, we see the media clamoring for the Galaxy S5. The first iPhone was released in 2007. The “latest new” iPhone 5C and 5S models are the 7th generation of iPhones (iPhone, 3G, 3GS, 4, 4S, 5 and the 5C/5S).
I sure hope most of you haven’t bought every model in between. Don’t get me wrong – I’m a big fan of both ecosystems and hope they both keep pumping out amazing mobile technologies for decades to come. I just think it’s time we took a deep breath to reflect on how long we should keep devices that cost between $300 and $800.
What is the perfect device lifecycle? It depends. If you work in a tech startup, expect to succumb to “device peer pressure” every 18 to 24 months. If the only time you are mobile is during your commute to work, I don’t think there’s anything wrong with making a rule – “I will keep my new device for 3 years.” If you only make it to 2 and a half years before you break down and buy a “killer smartphone,” you still win.

If you are a heavy tablet user, go with every 3 years and keep your smartphone for 4 years. If that’s hard to swallow, think of it this way – do you really want to buy 5-8 smartphones every decade?! Unless you’re buying that many houses, restrain yourself. I have a friend here at Nubo who is proud of his inexpensive Chinese smartphone. He and I use our tablets more often and we prefer a smartphone that is a “square rather than a large rectangle.”

I encourage all of you to sit down with your friends and family and talk about how long you intend to keep your smartphones and tablets. Remember the lyric from The Eagles’ Hotel California? “We are all just prisoners here, of our own device.” Are you?

Israel Lifshitz

Welcome to the new Nubo blog! This is where you can read about the future of BYOD. Nubo takes a revolutionary approach to BYOD, and the articles you read here will reflect that. In the beginning, there was MDM; the Enterprise Cloud is going to replace MDM in 2014. I wrote the article below for Wired and The Next Web, and there’s no better way to provide background for the amazing BYOD solution that is Nubo.

In the corporate world, there has been an ongoing battle between people who want to use technology and people who want to control it. That battle is coming to an end. The concept of “device management” is going to become as extinct as car phones and beepers. In a few years, no corporation is going to care what device you use, where you use it and what apps you have installed. “Mobile Device Management” (MDM), “Enterprise Mobility Management” (EMM) and their overgrown approach to security are creating more problems for IT than they solve. IT is sick of managing devices. They don’t want to block Pandora and BuzzFeed from your computer. They don’t want to update your software, drivers and operating system. And they really don’t want to issue mobile devices or manage your personal phone.

The ‘bring your own device’ (BYOD) movement became a nightmare for IT because they were suddenly expected to control our personal smartphones and tablets as if they were company devices.

Hardware weighs IT down. I’ve spoken with CIOs who implemented MDM a few months or a few years ago, and they are more than willing to replace it. They want to focus on valuable projects like big data systems, not babysitting our iPhones. They want data and applications in the cloud where they require far less supervision. If this is the case, why is MDM so mainstream and what is the future of BYOD? To understand, we have to look at how tech disruptions actually work.

The Second Wave

Technological development is imitation punctuated with imagination. The most disruptive technologies – PCs, the internet, cell phones and tablets – create problems and opportunities that never existed before. When companies try to solve these problems and capture these opportunities, the first wave of solutions almost always relies on past experience. Search engines are one of the very best examples of this dynamic. The internet created a challenge: how do people find information?

Remember Yahoo Directory? It belongs in a museum, but it actually still exists. Yahoo Directory tried to help us find information by organizing it like the paper systems we were used to using: library directories, phone books, encyclopedias and other things we don’t miss. Yahoo Directory was inefficient, but we used it because there was nothing better. Then one day Google said, “Hey, instead of making people search based on our categories, why don’t we just let them type in anything they want?” There was no precedent for this type of searching because we never had technology that permitted it. Search engines rapidly replaced directories. And since then, search engines have created a thousand more challenges and opportunities: search advertising, SEO and search-based financial trading algorithms. Imitation and imagination all over again. After a ground-breaking invention like the internet, the first wave of supporting innovation tends to rely on imitation. The second wave of innovation realizes everything that the first wave got wrong, and so it becomes more imaginative.

The MDM Imitation

MDM was built based on IT’s past experience with PCs, which required a lot of babysitting (and still do). They constantly needed updates, constantly broke and most people at a given company were incapable of troubleshooting them. So when companies wanted to give everyone BlackBerries, MDM made it possible for IT to manage BlackBerries just like PCs. Since then, MDM has become overwhelmingly complicated. Originally, the goal was to let executives take calls, send text messages and check emails. Today, MDM tries to do everything – lock-and-wipe devices, push apps, track location, geofence apps and camera usage and generally create more work for IT and less freedom for employees.

When BYOD began, the MDM guys tacked on EMM and tried to extend their dominance over company-issued devices to personal devices. They secured individual business apps, placed them in separate containers and pushed clumsy enterprise email clients on employees. Essentially, they created a solution that undercut the choice and efficiencies that BYOD promised to create, and they made this solution IT’s responsibility. Employees don’t want useless corporate apps forced onto their devices. They don’t want to be tracked, babysat and coaxed into ceding control over their smartphones and tablets.

Enter the (Enterprise) Cloud

Everyone understands that MDM and EMM are inelegant solutions. They’re fundamentally broken. They cannot keep pace with the wave of innovations that make people the masters of their own digital world. Overall, they have created more difficulties for IT than they have solved. In my opinion, MDM will be wiped out by technology that ignores devices and relies strictly on the cloud. CRM and marketing analytics platforms contain some of the most sensitive information that companies have, and they now live in the cloud. Many more applications and categories of data will go in that direction.

If nothing is stored on our personal devices, IT has nothing to manage. They can take data out of your control. Ultimately, isn’t that the purpose of MDM and EMM? A bundle of disparate cloud apps, however, will not do. A single operating system or platform must link the apps together through one enterprise cloud. This is how IT can manage access to email, file storage, company applications and third-party programs on a user level rather than a device level. Ultimately, the enterprise cloud becomes a closed ecosystem where data can travel from app to app without ever passing through an outside device.
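
One way to picture user-level rather than device-level management is as an access policy keyed by identity, with no device attributes anywhere in the check. A purely illustrative sketch; the user names, app names and policy fields below are made up:

```python
# Purely illustrative: access is granted per user, not per device.
# User names, app names and policy fields are invented for the example.
enterprise_policy = {
    "alice@example.com": {
        "email": True,
        "file_storage": True,
        "crm": False,          # no CRM access for this role
    },
}

def can_access(user: str, app: str) -> bool:
    """No device attributes appear in this check: the device is just a
    screen, and the data never leaves the enterprise cloud."""
    return enterprise_policy.get(user, {}).get(app, False)

print(can_access("alice@example.com", "email"))  # True
print(can_access("alice@example.com", "crm"))    # False
```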

BYOD is ready for its second wave of innovation. IT is ready to kiss hardware goodbye, and those of us who have lived under the reign of MDM are ready to get our devices back. MDM’s time has passed. Let’s see if the cloud can deliver.