A Brief History of Applications

For those of you old enough to know what a vacuum tube is, you’ll remember that computers used to be the domain of those on the cutting edge of engineering. Only highly specialized and trained people could build the machines that did the thinking for us. At the time, the entire concept of a computer would probably have confused most anyone non-technical who had it explained to them. Even those who did understand most likely saw them as nothing more than giant, automatic calculators. Useful, but not life-altering, much in the same way many of us viewed the iPhone when it first came out. Interesting, yes, but who really wanted a touch screen over those beautiful, tactile buttons? And unlike with the iPhone’s arrival, computers were not something the average person would ever interact with. Often the technology that changes society the most does so quietly in the background until it reaches some critical mass of cost-effectiveness, usefulness, and public understanding.

These complex machines needed much more than just the parts they were made from; they needed a task. Is a computer even running if it’s not calculating anything? Enter software. In a vast oversimplification of what is a far more complex topic we can deal with in a future post, these machines needed apps, just like the ones you download on your phone. An application is nothing more than a set of instructions telling a computer what to do with the data you give it. These apps used to be installed using punch cards, tape drives, and an unbelievable amount of patience. We quickly graduated to more ergonomically friendly methods of input using keyboards, mice, and CD-ROMs. These days we hit a virtual button on our pocket computers and in 5 seconds a new app is ready to go, no physical medium or wires needed. The usefulness of these apps, combined with the increasing ease with which we could interact with computers, turned them into a household name.
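To make that idea a little more concrete, here is a minimal sketch in Python (one of the languages mentioned later in this post). The tip-calculator scenario and its numbers are purely illustrative, but it captures the essence of an app: a fixed set of instructions applied to whatever data you hand it.

```python
# A tiny "application": a fixed set of instructions applied to the data you give it.
# The scenario (a tip calculator) is purely illustrative.

def tip_calculator(bill_total, tip_percent):
    """Take the data you supply (a bill and a tip rate) and return the amount to pay."""
    tip = bill_total * (tip_percent / 100)
    return bill_total + tip

# You provide the data; the instructions decide what happens to it.
print(tip_calculator(42.50, 20))  # prints 51.0
```

Whether the instructions arrive on a punch card or through an app store, the principle is the same: data in, instructions run, result out.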

Computers, in their various forms, are now a staple of daily existence. Even our washing machines have microprocessors, WiFi, and Bluetooth. They went from cutting-edge to pedestrian in the span of 50 years. It’s no surprise, either, that some of the most common names of our age are apps: Facebook, TikTok, Netflix, WhatsApp, Instagram. Take the DeLorean out for a drive back to 1985 and the concept of an app was still decades away from mainstream lingo. Microsoft Excel had barely hit the market!

As accessible as computers and their applications have become these days, the creation of these “systems within a system” is still far removed from what the average person can do on their own. Further adding to the complexity is the value of form, not just function. An old DOS or UNIX platform was 99% function and 1% form, with that 1% being the bare minimum needed to interact with it. A command prompt and a flashing underscore were all you got for visual flair. These days, apps are expected to “look the part” and not just “do the part” to appeal to their user base. An application or website can go from the pinnacle of visual design to dated in just a few years.

As a result, software and application development, despite all its advances, is still far beyond the reach of the average user. The population of developers and programmers is vast and varied these days, but programming is still a heavily learned skill. Adding to the barrier to entry is the sheer number of programming languages. Like Earth with its many spoken languages, the digital world has languages of its own. Are you fluent in JavaScript? How’s your Python these days? How does C# sound to you? If that’s not your thing, what about C++? You’d be surprised just how many languages exist, some widely used and others obscure. If you’re new, where do you start? The world only seems to be able to easily handle two options for things: Windows or Mac. iOS or Android. Coke or Pepsi. Then we nerds take an interest in the remaining 2% of obscurity that resides in the background, hidden from mainstream society.

Another reason for this barrier to entry is the need for these applications to perform their part correctly. Let’s use photography as an example. In your pocket, you have a powerful camera that probably takes better photos than the most advanced digital camera of just a couple of decades ago. Combine that with all the automatic post-processing that happens in a split second after you snap a shot and the large assortment of editing tools at your disposal, and even an unskilled user can probably take a somewhat decent photo. Despite having all of this at your fingertips (or even better, an actual DSLR or mirrorless camera), you are still most likely going to have a professional take the real estate photos of your house. You are not going to have your best friend Jim with his iPhone take your wedding photos. You are not going to just use the “food” feature on your Samsung to shoot the menu items for the new restaurant you are about to open. Taking the photo isn’t the hard part; it’s the lighting and composition, the things that can’t be done without the skill for them. Wedding photos, real estate images, and business photos can’t be just “good enough”; they need to be done correctly.

Applications are no different. An application that is just “good enough” means a frustrated user who probably will not be using said app for much longer. Apps are not cute photos of your pup that you cherish for the memory regardless of how bad the lighting was. Apps are meant to serve a purpose, and they are meant to come as close to fully serving that purpose as possible. This is a major barrier for the average person. People can easily learn and do things where mistakes or imperfection can be forgiven, or, in the case of the arts, even cherished. When we expect our things to function as intended 100% of the time, we will almost always have an expert make that thing and do that task for us.

Sadly, most of us live in a world of “good enough” with our apps. Even if an app functions without error, it most likely lacks features we desire or operates in a way that is not perfectly in line with how we would like it to. In fact, some of the most popular and widely used apps today are often the ones with the most thorns in the user experience. Until all of us know how to write our own code as well as we know how to use Instagram, there will always be a need for new applications to be created. Developers are constantly at work perfecting the applications they make for us on the ongoing journey toward applications that are both 100% form and 100% function.
