Post-PC.
The term has been in our collective minds for a good while now and still, it feels inadequate, inching towards inappropriate. A Post-Personal Computer? Have we passed the time of the computer? What is the stage past personal? Has the computer become intimate? Computers, arguably, are more personal than they’ve ever been. Perhaps we mean to say we’ve crossed over to an era where personal computers look vastly different than they ever have, a reasonable proposition. Tablets, in particular, seem as if conjured directly from a cherished science-fiction novel.
If that were the case, post-PC would still be a bad description; tablets bring so much more to the table than a design that separates itself from any pre-2010 computer. Certainly, a revolution is taking place. Computers no longer look the same and are no longer used in the same fashion, in the same places, or aimed at the same customers. Hence the hook of Post-PC; we are definitely witnessing a changing of the guard. All around us, things are no longer what they used to be.
At best, post-PC is a quantitative term, merely informing us that this phase follows the “PC” stage and redundantly serving to differentiate between a tablet and a notebook or desktop. The term offers no significant meaning or metaphor for computers today. What is needed is a definition that accurately reflects its time and place. The era of personal computers defines not only a category of devices but a whole generation of computers that were affordable, relatively simple to use and designed with the individual, not the corporation, in mind. Progressively, it came to define the era of digital media, connectivity, and social networking. PC, insofar as it can still be considered a term today, is qualitative.
Surely, then, our aim should be to find a term to replace Post-PC? What wording can adequately describe the way we interact with computers now?
I would propose we’ve now begun the era of the curated computer.
Curated?
Wikipedia describes the role of a curator as follows:
In smaller organizations, a curator may have sole responsibility for the acquisition and care of objects. The curator will make decisions regarding what objects to collect, oversee their care and documentation, conduct research based on the collection, provide proper packaging of art for transport, and share that research with the public and scholarly community through exhibitions and publications…
…In larger institutions, the curator’s primary function is as a subject specialist, with the expectation that he or she will conduct original research on objects and guide the organization in its collecting. Such institutions can have multiple curators, each assigned to a specific collecting area (e.g. Curator of Ancient Art, Curator of Prints and Drawings, etc.) and often operating under the direction of a head curator. In such organizations, the physical care of the collection may be overseen by museum collections managers or museum conservators, and documentation and administrative matters (such as insurance and loans) are handled by a museum registrar.
One line in this definition is of particular interest.
the curator’s primary function is as a subject specialist
To this I would also add that a curator is responsible for crafting and informing experiences. If you’ve ever attended a gallery exhibition, for instance, you have witnessed the work of a curator. Curators gather resources and, combined with their expertise, create a particular experience intended to be shared with a specific audience. The term has even crept online: take a quick glance at some of your favorite websites and you’ll find passing references to “curation”. In this case, the term refers to the care and attention the author places in the exhibition and content of his website.
Today, with the technology available to us, we have begun to curate our computing experience. Curated computing can describe the following:
- The user acting as a curator, able to select hardware and software based on his needs.
- Hardware and software that can be curated, presenting limitless, interchangeable choices while posing no restrictions to the user.
- The specific and refined computing experiences resulting from the relationship between the curator and the material he curates.
Of course, it’s also important to describe the conditions of the technological world the era of curation defines. After all, if this era is to succeed the era of the PC, it must allude to specific changes that have taken place. More specifically:
1. A general performance standard that is no longer limited by physical constraints.
While users have always required specific things from their computers, it is only recently that hardware has been able to provide solutions for almost any possible usage scenario. Only five years ago, it was unfathomable to think a cinematographer could shoot, edit and publish a feature-length film using only his mobile phone. That type of work used to be the sole domain of expensive and complex workstations. Even something as simple as checking email or browsing the web, until recently, required a capable machine for the experience to be enjoyable. Pre-2007 smartphones may have been capable of email and browsing, but the experience was mediocre at best. Design, portability and functionality almost always came at the expense of performance and usability.
Things have come a long way since. Arguably, almost anything a typical user could want to accomplish with his computer can be done equally well on a traditional computer as on a smartphone or tablet. Technological advancements in component efficiency and miniaturization have long since closed the performance gap between device categories, nearly removing the need to consider components in purchasing decisions altogether. Today, design and functionality have become the primary considerations of most users.
The MacBook Air perfectly encapsulates this transition. Consistently the most underperforming Mac in Apple’s lineup since its inception, the Air has gone from a device with limited applications to, in its current incarnation, arguably the ideal notebook for almost everyone.
To illustrate my point, imagine someone who needs a device capable of handling email, web browsing, text editing, basic picture and video editing, games and communication. Now try to name a device available commercially today that can’t reasonably do all these things. (Software experience notwithstanding, for the sake of the argument.)
Today, instead of having to limit our choices to only what is powerful enough to edit spreadsheets, we can choose devices that truly fit our lifestyles. Computer hardware no longer differentiates itself only by what it can do; it stands out for who will be using it, when and where it will be used, and why.
2. Software design that values specificity, optimization and human interaction above all else.
Better hardware has also meant better software. Whether by harnessing the computing capabilities of multi-core processors and 64-bit architectures or through more efficient coding, rich and capable software is now available on almost any computing device you can get your hands on. Optimizations, crafty workarounds and better resource management have also helped maximize the potential of what we once considered underpowered hardware.
Not only is software faster and more powerful than it previously was, you can now find the same applications on your mobile phone as on your workstation. Apple’s iWork suite is available on every device it sells outside the iPod. iMovie is on our cellphones, and pro applications like Final Cut no longer require top-of-the-line hardware to be useful. No matter what sort of device you own today, it’s almost certain that if you can think it, you can achieve it.
Even more interesting to us is the growing trend towards specialized applications, propagated mainly by the vast array of “apps” available on mobile devices. A simple, non-exhaustive description of software today could include:
- Apps that do one thing, and one thing well: Instead of catch-all applications such as Microsoft Word or iTunes, applications are now designed with as few as just one point of emphasis. Think iA Writer, Instapaper, Tweetbot. You can also find it in operating systems: iOS 5, for instance, splits the iPod app into separate Music and Videos apps. With a single focus, the quality and ease of use of desktop and mobile software have never been as great as they are today.
- Design that focuses the user on the task at hand: A key innovation that iOS helped germinate in Mac OS is the full-screen app. Dedicating the entire screen’s real estate to a single application is driving innovation in software design and human interface philosophy, changing the way we experience and interact with new software while bringing a fresh twist to old applications. Remember when we looked at inboxes as a single column? Beyond functionality alone, full-screen apps are also transforming devices from multi-purpose tools into specific, single-purpose devices at any given moment. Ask 10 different iPad users how they use their tablets and you’re bound to get 10 wildly different answers. One may use it exclusively as an e-reader while another packs it with his DJ equipment. Although as multi-purpose as ever, specificity in software design causes users to reconsider the identity of the devices they use. What nerds call a tablet computer is someone else’s cash register, medical filing cabinet or personal bookstore. Combined with gadgets that require less and less maintenance, the user can custom-craft his experience with as little or as much computer savvy as those experiences require.
3. Content becomes device-independent.
Finally, the magic of the internet and its related services has connected almost any and every device you can make use of. While emails, contacts and calendars have long been transferable between devices, we’re fast approaching the stage where almost any kind of content can be accessed and manipulated in the same way across a multitude of devices. As Shawn Blanc describes it, computing today gives you “[A]ll your stuff on all your devices in any place.” Experiences are now repeatable from one device to another.
Content becoming independent of its client device is a good thing. When Steve Jobs stood on stage at WWDC this past summer and called the new role of the Mac a demotion, he probably didn’t intend for it to sound as pejorative as many understood it to be. Sure, you could say the role of the traditional PC has been diminished, but that ignores the net benefit to computer use as a whole. The homogenization of your data across all types of hardware, as well as the client software powering it, is enabling experiences that were previously unimaginable. Nerds may roll their eyes and point to the myriad ways we’ve been able to share data between devices for decades, but they would do well to consider that syncing and storing basic data has never been as simple and accessible to the mass market as it is today. Furthermore, we’re fast reaching the point where what you can do with a file on one device is the same as what you can do on any device. Content is free at last.
Steve Lyb:
The iPad is an empty slate, and what you make of it is ultimately what comes to define its role in your life.
Lyb, in one sentence, perfectly captures the essence of curation. Although he is focused on the iPad’s chameleon-like qualities, he’s also describing the kinds of experiences we are beginning to have with every device we own, tablet or otherwise. Every part of a computer, from its hardware to its most basic applications, can be customized, carefully selected and arranged to achieve a specific goal. While we’ve always had choices, we’re no longer beholden to their consequences. A smartphone needn’t be an emergency mail client. A desktop no longer presents Rube Goldbergian complexity to the average user. Choosing one computer over another is no longer making a sacrifice or picking one thing over another; it is simply one more flower in your own customized technological bouquet.
The upsides are obvious. Purchasing a device to stay on top of your Facebook timeline and TMZ no longer requires you to deal with anti-virus software, disk defragmenting, RAM usage and CPU temperatures. Any Android, webOS, iOS or Windows Phone device will get you browsing the web in no time. If you did nothing but fire up mobile Safari for as long as you owned your iPhone, you’d happily be none the wiser to everything else it can offer. Conversely, gamers can still, to their hearts’ content, trick out custom rigs while creative professionals use workstations capable of rendering the frames of the next great American film.
These are simple, one-dimensional examples. Curation doesn’t pose restrictions. Imagine your local coffee house offering its beverages on iPad menus, where your order is processed through a dedicated payment solution on your barista’s iPhone, all the sales data seamlessly sent over to the accountant’s laptop some unknown distance away. And again, with hardware and software working together to make the “computer” as invisible as possible to its user, the experience becomes solely what the user makes of it. A common expression in the past might have been: “I’m going to use this computer to write my paper.” Today, you’d say: “I’ve written my paper using this computer.”
Many claim this is the reason behind the appeal of the iPad. They correctly emphasize its ability to transform into whatever the user requires and its ease of use as its strong suits. Yet what most people, and many tech companies, fail to realize is that those qualities shouldn’t and aren’t exclusive to tablets. The iPad’s popularity reflects a mass desire and need for a device delivering the kind of experiences the iPad does. A device that could be curated. All the iPad did was answer the call.
Our experiences with computers today contradict our preconceptions of what computers are, or rather, used to be. Whether through industrial design or software engineering, tablets, phones, notebooks and desktops are becoming more and more abstracted. Apple products are again a good example. Here’s another exercise: take your iPhone, iPad or MacBook. Turn it off. Imagine all the logos and FCC markings have disappeared. What does that object now resemble? You know it is a computer, but what would someone in 1990 say it was? Someone in 2000? Taken to the extreme, imagine a MacBook Air with no ports and with the lid closed. Could someone today even identify it as a computer? Try the same experiment with your iPhone.
Of course, the optimization of hardware components has changed the design of computers. Integrated circuitry and non-replaceable parts enable miniaturization and design creativity, but the biggest impact may come from the psychological effects of those advancements. When a user no longer has to replace a battery, hear fans churning along or wonder which port is a USB port, his relationship with his computer becomes second nature.
You turn on your computer. You launch a program. Add more RAM. Plug in a monitor. Change batteries.
While those types of behaviors still exist, new paradigms of interaction have changed the user/device relationship. The most glaring is the popularization of touch interfaces, which removes almost every layer of separation between the user and the computer.
Touch interfaces have also influenced the industrial design of tablets, notebooks and smartphones. As devices are touched and held more frequently, the design of computers has had to focus on ergonomics beyond only the keyboard and mouse. Now the entire machine must be designed with physical interaction in mind. Using computers today is an increasingly sensory experience, providing experiences that were never possible in the past, as well as new challenges to overcome.
Apple, Microsoft, Google and Amazon all have different approaches to the design of modern computers. Whether or not you believe one tech company’s philosophy to be better than another’s, one thing is sure: prioritizing intuitiveness, ergonomics and usability has created entirely new classes of devices that invite and inspire users to interact with them regularly.
Not everything is rosy, of course. Four years into the revolution the iPhone (arguably) started, there are still kinks to work out. Even if most would agree Apple’s iOS ecosystem is the standard against which to hold all others, it isn’t without reproach. Apple’s penchant for skeuomorphic design, for instance, has received mixed reactions. On the surface, skeuomorphic design makes sense: if computing devices are going to replace almost any real-world object, perhaps it’s best to mimic those objects through software, for the user’s sake. In practice, results are mixed, with applications varying between kitsch and pointless. Computers should strive to be more than just a replacement for your address book or calendar; they can and should perform more efficiently. Apple isn’t alone in suffering from growing pains. Microsoft, despite a beautiful, well-designed Metro UI, seems dedicated to tacking it onto the same old version of Windows: both the future and the past. Android, for its part, oscillates somewhere in the middle, an OS built from the ground up yet seemingly designed by committee, committing to neither old nor new paradigms of software design (one assumes Google leaves this up to the third parties licensing its software). Transitioning notebook and desktop OSs is proving to be another challenge. Mac OS X Lion is a bold step in the right direction, but it remains to be seen how best to accommodate a slew of new customers who are developing habits in iOS first and Mac OS second, instead of the other way around. Microsoft, for which the Windows platform is arguably vital to its well-being, will have to step up to a similar challenge with Windows 8.
Computers are no longer about finding the best way to input and analyze data. Both hardware and software must engage and relate to their user. If the attention garnered by any new mobile OS release is any indication, it’s an avenue still ripe for exploration.
In many ways, much of what has been described above sounds like the “hyper-PC”, and that’s a perfectly reasonable stance to take. But it would be shortchanging the situation. Merely stating that computers are more personalized is a shallow reading, as is calling the future “touch-based”, or any derivative thereof. While touch interfaces are certainly a new and innovative paradigm in computer design, it’s reductive to claim that the future is about computers “designed for touch”. It is but one component.
Computers are finally powerful and malleable enough to truly consider the user’s needs in a meaningful way. Who is the user? How old are they? How many users are there? Where are they using their devices? In what orientation? How many devices? With fingers, or keystrokes? All-day use or intermittent? In their pockets or on a desk? Developers and designers now have near-infinite possibilities when crafting unique experiences.
And yet, that alone would insufficiently describe our “post-PC” era. What we do with all those possibilities is just as important as the devices that can achieve them.
Hence curation. Curation characterizes the result of our relationship with technology. The smartphone you use to check email can be the same one you use at work to present keynotes. You might use that same device to reach loved ones on vacation or to glance at while reading your children to sleep. And in each case, that phone, tablet or laptop seamlessly conforms to you, delivering everything you might need from it at that precise moment and nothing else.
Like art gallery curators, we choose the lighting, the sounds, the sights, the mechanisms of our technological exhibitions and create a definitive experience that changes along with our whims. They are as simple or as complex as we make them out to be. Auxiliary considerations (battery management, system requirements, storage, memory allocation) are by and large invisible.
To wit, I’m writing this article on my iPhone with an external keyboard, completely absorbed in the pulsating blue cursor on the screen in front of me. At this precise moment, I need only focus on my role as a writer, with my iPhone as my typewriter. In a few moments, I’ll put the keyboard away, my iPhone will become a phone again, and I’ll call my girlfriend to say I’m coming home. Later tonight, this phone will be our movie theatre, showing whatever movie we might find on Netflix. This is what computers are capable of today, and how we think about them. As a curator, the experiences I have with my phone are entirely of my choosing, and there are no unwanted intrusions.
And on Monday morning, at 5 AM, this iPhone will be my nemesis: my alarm clock for work. I do wish I could curate that one differently.