Smarterbits

Paraphrasing Google’s Definition of Roboto

Google, from last night’s ICS event, describing their new font slash design philosophy, Roboto:

Roboto has a dual nature. It has a mechanical skeleton and the forms are largely geometric. At the same time the font’s sweeping semi-circular curves give it a cheerful demeanor.

There are a lot of big, hyperbolic, jumbled words in that paragraph. What exactly is Roboto? Let’s see if we can’t paraphrase that passage into something clearer.

Roboto has two characteristics. It is primarily a mechanical and geometric font. Thanks to its curved ligatures, it is also pleasing and youthful to the eye aesthetically and practically.

Still too much font-speak.

Roboto is a geometric and modern font, a design that is both stylized and practical.

Getting better, but this paraphrase is lacking in meaning.

Roboto is a font that attempts to maintain visual flair while remaining legible and useful.

Almost there…

Roboto is a font getting as close as possible to Helvetica Neue without having to be called Helvetica Neue.

Hmm…

Roboto has multiple personalities. It is edgy, techno and hip. It is also classic, understated and respectful.

One last one.

Roboto is Android’s way of being a little more like iOS.

I think that gets to the heart of the matter.

    Close Reading of Apple and Google Presentation Slides

    There is something fascinating about choosing words and the results of those choices. Why people choose a specific word over another is equally fascinating: What does it say about its author? What message do the chosen words convey? Who is that message addressed to?

    I thought it would be interesting to compare Android and iOS not by a checklist of features, but rather by a checklist of the words their developers choose to use on their presentation slides. From that alone so much can be inferred. For example, here are what I think are the boldest statements Apple and Google made during their keynotes.

    Apple simply stated:

    The most amazing iPhone yet

    Google asked:

    Can a machine have a soul?

    Talking about Siri:

    Your intelligent assistant that helps you get things done just by asking

    Describing Roboto:

    Roboto has a dual nature. It has a mechanical skeleton and the forms are largely geometric. At the same time the font’s sweeping semi-circular curves give it a cheerful demeanor.

    Isn’t there so much you can tell about these two companies by those words alone?

    Here’s a list of some other snippets of presentation slide text from the iPhone 4S and Ice Cream Sandwich keynotes. You can click through the links to see which presentations they come from, but for a fun challenge, try guessing from which company’s mouth the words were spoken (or projected, in this case).

    • Bold and Typographic
    • Social Integration
    • Intelligently Stored on Device
    • 33% Faster Capture
    • Zero Shutter Lag
    • Better Color Accuracy
    • Revamped UI
    • Easily Locate Friends and Family
    • Make Me Awesome
    • Full HD Capture
    • Ultra Thin Design
    • Sleek and Curved Design

    Depending on how well you guessed, you’ll have a different impression of how those words shape your perception of the company. If you were able to guess each one correctly, what does that say about the power of words in creating brand identity? Or is the opposite true? What can you surmise about each company’s different values and philosophies from those snippets alone? I would say there’s a definite contrast, nuanced as it is, between the two, but I wouldn’t go so far as to claim one is better than the other. I’m simply fascinated by how much you can learn from paying close attention to words.

      “It’s hard to beat free” 1 may not have been the very first sentence I typed on the chiclet panel, but it was surely amongst the second or third. 2 What other way would you do it? 3 The truth is I don’t know how to tell those stories 4 and, trust me, there are people and companies out there specifically designing to answer that [why] for you. 5

      All Together Now - October 16th 2011

      1. Tom Bradley
      2. Joanna Stern
      3. Ben Brooks
      4. Aaron Sorkin
      5. Patrick Rhone

        In Exchange With Patrick Rhone

        Patrick Rhone is the editor of Minimal Mac and co-host of the Enough Podcast, both of which deal with minimalist practices in the area of technology. Patrick is also the author of Keeping it Straight, a series of meditations and essays on productivity and useful living. Patrick was gracious enough to sit and talk with me about the importance of striking a balance with our technology. What follows is that exchange.

        Patrick, I wanted to thank you first for taking time to sit and talk with me today. I’d like to discuss your writing and more specifically, the theme of “enough” that seems to be behind it all. So to start at the beginning, how did you discover this idea of “enough”?

        Well, I think the arrival at “enough” is similar to finding one’s balance on a tightrope or balance beam. Everyone has a different and unique center of balance. In order to find it, and keep it, one must make constant adjustments. Even once one has found it, as conditions change, one must adjust accordingly.

        Finding what is enough in one’s life is much the same. It is about searching for balance in all areas of life and knowing the right questions to ask and adjustments to make as conditions change in order to keep from falling in either direction. But, also, should one fall, knowing how to find the steps to get back up.

        Was there a specific “aha!” moment for you, or has this philosophy always been with you?

        I think it has grown out of my natural curiosity and desire to seek balance in all areas of life. Which, in part, is driven by my practice of Buddhist mindfulness.

        That’s exactly what it reminds me of, this notion of spirituality. Many people will recognize you from Minimal Mac, your technology website, where this idea of balance is pervasive. It’s a rather unique perspective, because it seems to go against the current of most tech writing: more, more, more. New, new, new. What’s the advantage of finding a balance with our technology?

        I’ve been re-exploring this question of “Why?” a lot lately in an effort to try to distill it. What I am coming to is the idea of allowing technology in your life (and this is about more than just tech) that is purpose driven. That is, the idea that one asks what the purpose of these items is and then intentionally chooses, tailors, and uses these items to fit that purpose.

        For example, I recently brought this approach to my iPhone. I thought long and hard about what I use it for and how I go about doing that. Then, I examined and arranged every application installed to serve that purpose (and deleted the ones that had no discernible place within that purpose).

        Finally, I think there is another aspect as well. That is, when one has optimized for intention and purpose, one can then focus in on other things with more balance and less friction. If my tools are purpose driven then there is less that stands between my intentions and my actions. The tool then becomes one with that connection and effectively disappears.

        That’s what I’ve noticed too as I’ve tried to optimize my tools. And indeed, it’s my focus that’s really benefited from this. I’m more engaged and interested in what I write and I’ve stopped worrying about my workflow on a regular basis. Is that a common response you get from readers?

        Yes. Along those lines. I think, ultimately, we want these tools to be a natural part of our connections and to facilitate them as seamlessly as possible.

        I have a regular meeting with a good friend. We get together every Monday morning to talk about our weeks, the week ahead, what is driving us, etc. Yesterday, I had my iPad out and was telling him about a wonderful email I received. I looked it up on the iPad, passed it to him to read, he took it, read the email, and handed it back. At the time we thought nothing of this. It was as natural as handing a piece of paper back and forth to read.

        Then later, driving home, it hit me. IT WAS AS NATURAL AS PAPER! I never once thought about the fact that we were passing a computer back and forth. All I thought about was adding to the conversation by showing him what I was discussing. No different than if I had printed it out. That is what our technology should be when purpose driven. We should not think about it. Intention flows to action naturally.

        Absolutely. You’re getting to something I wanted to talk about. This convergence of tools and intentions seems to be best exemplified by the iPad and the iPhone. Even Apple products in general. When you think about it, it seems so elementary to want technology to feel natural. Any ideas why Apple seems to be the only company onto this idea?

        I think it is largely driven by the values and culture that Steve Jobs fostered at Apple. In fact, with much talk about his passing and punditry about the future of Apple without him, it is the values and culture that have not been talked about enough. Because this is, in fact, his greatest and most long-lasting creation. It is the reason why we will continue to see this influence in Apple products for as long as the mind’s eye can see.

        This is why companies try and imitate Apple products almost down to the letter and still, something is missing. They can’t replicate it because it is not something you can capture in the design or execution. The company and culture have to be there too. Steve famously said, “It’s not just what it looks like and feels like. Design is how it works.” I think the ethos and culture drive the how-it-works part to a greater extent than gets discussed.

        I wonder about that, how systemic that culture has to be. Obviously, no one at Microsoft or Google is thinking to themselves “Hey, how can we make this more complicated and less intuitive?”, but there’s always a stark difference when comparing the results. Side by side, an iPhone and a Nexus One have so many technical similarities, yet you can tell there’s also a huge gulf between them in terms of experience.

        Agreed. I actually have a very close friend who works for Apple. He is an amazing guy. The sort of guy who strives for excellence in everything he sets his mind to.

        I’ll never forget one discussion we had about a year ago. About a year prior, he had gone to the doctor and, well, he was read the riot act. His blood pressure was high, he was overweight, out of shape. His doctor basically told him to shape up or, with his family history, he would likely have a heart attack before he hit 40.

        He took that advice to heart. He did a complete 180. Within a few months he was running marathons. A few months more and he was doing triathlons. Today, he is besting and placing ahead of everyone else in his class.

        In this discussion, I said, “One of the things I really admire about you is that you are always pushing the limits of what you can do with everything you do.” He looked at me and, with no hint of humor at all, said, “What limits? I don’t believe in limits. The moment you believe in limits, even in pushing them, you have them.”

        At that moment, I thought: Yep. That is what Apple is made of. Thousands of people who think just like him. That is what makes them different. That is what is imbued in each Mac, iPhone, iPad, and everything they make or do. Relentlessness.

        I imagine it’s just as challenging trying to find our own personal balance with technology. You write for a large tech-savvy audience, but you also do consulting work for a variety of clients. Where do people go astray? What are some of the common misconceptions people still hold today that cause them to “fall off the tightrope”, so to speak, with technology?

        I think most people fail to ask the simplest of questions: Why? Because the why is important. If you look at every Apple commercial, they don’t show you tech and specs; they show why. Why do you want this device? Why would you use it? Why does it matter?

        And, trust me, there are people and companies out there specifically designing to answer that why for you. They would very much like it if you did not ask that question. They would like you to just do what they think you should do.

        There is an interesting effect called the Gruen Transfer, which is the moment when a consumer enters a shopping mall and, surrounded by an intentionally confusing layout, loses track of their original intentions and thus is more likely to impulse shop. If you think a company like, say, Google is not designing this into their products, you are kidding yourself.

        By asking why, by being purpose driven, you are less likely to be guided by someone else’s purpose for you.

        I don’t think it ends there. Having worked at electronics stores, I can tell you firsthand it doesn’t need to be as complex as the Gruen Transfer. Most salespeople don’t even bother to ask why the customer came looking for a computer in the first place. Or they ask only in the most superficial of ways. As consumers, we’re told what we “need”. I’d imagine people are surprised when they hire you then?

        I would hope people are refreshingly surprised when they hire me, especially if they have been working with another consultant. Because I come in not just to “fix” something, take the money, and leave. I come in to find out what they want to do, why they want to do it, what they hope to achieve, and how I might help them in getting there. I listen to their intention and purpose and that desire becomes the driver for the actions we take.

        Again, it seems so simple a concept: find out what you need. Yet, it speaks volumes about how we have an entire culture geared towards never asking ourselves “why?”. If we can say Apple has nurtured a culture that’s become its identity, you could say the same about society as a whole. Our example here is technology, but as you’ve explored in your work beyond Minimal Mac, the same issues arise in almost every aspect of our lives. Which goes back to what you said at the beginning of our conversation.

        It struck me just now how your perspective takes our approach to technology as a vehicle to open a dialogue about larger issues. Is that a fair assessment?

        I think it is. Remove the word “technology” and replace it with “clothing” or “dishes” or “books” and the concepts still stand. For me, discussing these issues within the frame of technology is just one that resonates with a lot of people right now because we are so immersed in it and, for some of us, overwhelmed by it.

        Immersed is such a good word. There’s a Gandhi quote that goes “There is more to life than increasing its speed.” In this instance, I like to replace the word “life” with “technology”.

        Exactly.

        We’ve looked at one side of the coin. Let’s flip it. Is it possible to go too minimal?

        Yes. Like I said, the goal is not too much, or too little. The goal is having just enough. I think “minimalism” has gotten a bad rap and that people take it to its extremes. I think practicality is important. I think one should omit only needless things. Things that have need, have purpose, should be treasured. Even if that need is not one that is tangible.

        Any tangible examples of doing too little?

        Well, I certainly went to, and thus advocated, extremes in the early days of Minimal Mac. One need only look back to my cold war against menubar items to see that. But, here’s the thing: Sometimes we must go to such extremes in order to find out what the right balance is. To go to the balance beam metaphor again, we must hold our arms out and teeter from side to side before we can find that spot that makes us still.

        Another example is the time-tested uncluttering technique where one clears a space completely, sets a deadline and then brings items back as needed. Anything not back within that time frame was likely not needed in the space.

        Both of these are examples of using extremes in order to find balance. In these cases, I think it is OK to go to those extremes. It is a tool to ask questions and determine answers.

        You talk about “minimalism” getting a bad rap. I think it’s because people consider it a pejorative. As if it means to be lacking in some way, or that it’s sacrificial to be minimalist. My response to that is always to think of minimal in this instance as what is essential. How do you counter the bad rap?

        I counter that bad rap by advocating its use as a tool and journey to the idea of enough. That our journey should revolve around that destination. And what that means, what that looks like, will be highly personal.

        I guess, to frame it with Buddhism, that’s a good way of finding the path to liberation?

        Perhaps. In many ways, the answers we seek as human beings, the questions that drive our search and discovery for purpose, are twofold: “Is this all there is?” and “Is this all I am?”. However we answer those questions, there is only one natural follow-up: Why?

        This is an answer I seek.

        I hope you’ll let us know the answers you find. Patrick, thanks so much for taking the time to talk with me.

        My pleasure. Thank you. I’ve enjoyed this.

          Great Expectations

          John Gruber:

          It may or may not have ideally launched a few months sooner, but the plan was always for an iPhone 4 successor that looked like the 4 but had improved internal components. I wouldn’t be surprised if the next iPhone doesn’t change, or doesn’t change much, either.

          Apple isn’t going to make a new form factor just for the sake of newness itself — they make changes only if the changes make things decidedly better. Thinner, stronger, smaller, more efficient. If they don’t have a new design that brings about such adjectives, they’re going to stick with what they have.

          This seems to be Apple’s overarching philosophy when it comes to the design of all their products: regular iteration of the components and major aesthetic redesigns only when necessary and/or beneficial.

          People’s expectations are a whole n’other thing of course. Watching all the complaints this week about the iPhone 4S retaining the same case design as the iPhone 4 is somewhat fascinating. There’s an expectation of the iPhone that none of Apple’s other products must endure.

          Certainly, people don’t want a new case because the current one is lacking. If the iPhone 4’s design debuted this year instead of last, people would undoubtedly be praising its merits now as they were then. Arguably, it is still the most iconic smartphone design you can buy today.

          So then why the fuss over being stuck with it another year?

          My guess? People have grown to expect “new” phones every year because for the longest time, aesthetics was the only factor cellphone OEMs could - or cared to - significantly change and control. Different colored candy wrappers on top of identical mounds of empty calories.

          Apple has never been about flash over substance. And despite all the protests to the contrary, we all know what wins out in the end.

            The Dent in Me

            As you’ve pointed out, I’ve helped with more computers in more schools than anybody else in the world and I am absolutely convinced that is by no means the most important thing. The most important thing is a person. A person who incites your curiosity and feeds your curiosity; and machines cannot do that in the same way that people can. The elements of discovery are all around you. You don’t need a computer. Here - why does that fall? You know why? Nobody in the entire world knows why that falls. We can describe it pretty accurately but no one knows why. I don’t need a computer to get a kid interested in that, to spend a week playing with gravity and trying to understand that and come up with reasons why.

            In my wildest I’m-an-accomplished-writer fantasy, there’s a scene where I get to interview Steve Jobs, asking him insightful questions that elicit responses similar to the one above, sourced from an old interview he did back when he founded NeXT.

            That’ll never happen now, which is somewhat beside the point. Since last night, I’ve felt like I needed to write something - anything - in acknowledgement of his passing. Yet with so many others lighting - and in some cases dimming - so many pixels with their own acknowledgements, I struggled to find my own voice.

            My feelings weren’t reverence or Hallmark idolism. Without a doubt he was a titan and visionary in his field and certainly the closest thing to a modern legend my generation is likely to ever know. But I couldn’t muster the honesty to say I was thankful I had him in my life, even whilst recognizing his importance to society. Without being disparaging, suffice to say that it wasn’t Apple specifically that kindled my love for technology or led me to become who I am today. Without Apple, I still would have made meaningful connections with people in my life. While certainly true for many, it would have been disingenuous to pretend that had been my experience.

            Instead, I thought of sharing stories of my experiences with Apple. Except, my story of being 3rd in line at the launch of the iPad and ending up at the CBC being asked by a radio DJ to give a live demo of my iPad - though entertaining - pales next to the resonance of the many other, more relevant tales you’ve certainly already read.

            I considered talking about the impact of death; after all, I lost my only parent weeks before my 21st birthday. I know something of the swift kick in the ass, call-to-action effect the realization of your mortality can have. Still, there’s no novel revelation in this. We already know we need to be passionate about life and that we should strive for the things we love aggressively, relentlessly. Fearlessly. Jobs embodied those ideals perfectly, and did so long before his passing. Death shouldn’t be the only time we remember them.

            Coming up short, I just asked myself what I would come to miss most about Steve Jobs. It was in coming across the above passage that I finally realized it.

            I will miss Steve Jobs’s mind. Not the one that created the iPhone and iPod but the one in that interview, the one found in the simple, eloquent treatises on Flash or music. Above all, Jobs was a man of great intellect, not only about technology, but life. Wisdom is the word that comes to mind. And like a child at storytime, I’ll miss getting lost in his intelligence and insight. What excites me most in life are ideas and arguments, discussions and debates. And, perhaps more importantly, the people behind them. Those are the reasons I want to write, and Jobs set a bar for me I could always look up to. The output you find on this website is my attempt, my progress - however meager - towards that goal.

            Steve Jobs is that great historical, inspirational figure you read about in books. Except this time, we’re the lucky ones who got to witness him firsthand with our own eyes. We are the dent. I’m saddened by the thought that his dialogue with us was cut short. It makes what he left behind - what little I managed to learn from him - that much more valuable.

              Siri’s Social Dilemma

              Apple’s vignette for Siri, the new voice assistant for the iPhone 4S, looks promising, verging on the spectacular. Apple is playing up its importance and rightly so; not only is it the 4S’s biggest standout feature, it also represents the mainstream arrival of voice commands. Resembling a more subdued version of Jarvis, Tony Stark’s computerized home butler, Siri promises to revolutionize the way we interact with our technology. Like Jarvis, however, there’s a possibility that Siri’s potential is better illustrated in cinematographic vacuums, where the challenges of real life can be ignored.

              Apple’s implementation of voice assistance may be the best example yet, but there are still some hurdles to overcome, or at least, skeptics to convince. Already, many are asking about Siri’s localization efforts, or how effectively it will handle accents and slang, or deal with nuanced language and complex sentences. Apple states that Siri will keep asking questions until it finds the right answer, but we’ve yet to see exactly how many questions that could entail, or how frustrating the experience could become. How much effort will be required to teach Siri to differentiate between Ryan your son, Ryan your boss and Ryan your aunt? And while Apple states Siri can have access to most of the iPhone’s built-in apps, what of third-party ones?

              Supposing it can get through all those technical hurdles, Siri still faces two difficult challenges to user adoption:

              • An absence of social norms and values around the use of voice commands.
              • Privacy considerations.

              While Apple’s promotional videos show Siri working fantastically in private, there’s no telling how it will be perceived out in public and how willing people will be to use it in front of friends, family, co-workers and complete strangers. While it will be awesome to demonstrate to everyone you know when you’re the first to get the new iPhone, how likely are people to tolerate its use in public or at work regularly? Will new iPhone owners be faced with ridicule or awkward stares from onlookers and people they know? In a group, how do individuals discern whether you are talking to them or to your phone? Will it feel strange to be out with friends at a bar and witness everyone simply dictating to their phones the entire night? There are no established boundaries for using voice commands publicly on a consistent basis. Think of how text messaging is still affecting and changing social norms and behaviors. Voice assistance software like Siri will have the same kind of impact and implications as we try to discover what sorts of behaviors are acceptable. Doing so at the scale of an entire society can take time.

              Secondly, voice assistance carries a whole host of privacy issues. Some are basic: How do you prevent others from using your phone with voice commands? What kind of access security is built into a system like Siri? In some cases, it can become more complex. Consider how much private information is contained in your smartphone. How much of that information would you be comfortable broadcasting around you to strangers and people you know? Would you be willing to have Siri dictate your texts, emails or appointments out loud in your workplace? At home? You can imagine a thousand and one scenarios where that would be undesirable. Would you want a secretary dictating your next appointment out loud into your doctor’s iPhone for everyone in the waiting room to hear?

              Unfortunately, it isn’t enough to say that you could simply turn Siri’s notifications on and off when necessary: the inconvenience would turn users away. And remember, a text message or email is not the same as a phone call. The contents and contexts in which they are used are worlds apart and there are reasons why one is used over the other. Some forms of communication are silent for a reason. Replacing text input with voice commands may improve the methods by which we transmit information, but it is unclear whether doing so is actually more beneficial or more desirable than its tactile counterpart. Let’s face it, no one uses text messaging solely to alert their spouses they’ll be late coming home. There’s an inherent privacy to smartphones, and computers in general, that undercuts the attractiveness of voice commands. That is why voice control is always demonstrated in sterile and mostly private situations: checking the weather, asking for measurements, using your car’s navigation system. Voice assistants aren’t great at keeping your extramarital affairs under wraps.

              The latter issue informs the former, and vice versa. If the situations where Siri is used are limited to private and simple ones, it’s unlikely people will commit to voice commands meaningfully, making it more difficult for voice assistance to define and integrate itself in a larger social context. This in turn lengthens the period of time needed for voice assistance to become socially acceptable (read: not embarrassing), exacerbating people’s reluctance towards it. If people are only going to use Siri some of the time, it’s more likely they won’t develop the habit of using it consistently, even in private. Hence the difference between Siri becoming an “essential service” and remaining a “gimmick”.

              The technological limitations, if any, will work themselves out. Projects like Siri have been in the oven for a long time and have come a long way: future iterations of the iPhone are sure to improve upon any flaw we might yet find. What’s more pressing is finding Siri’s identity, its raison d’être, outside the confines of our vehicles, homes and promotional videos.

                Predictions For Today’s Pre 4 Event

                Here it is, my best guess on what will be announced at today’s Palm Pre event:

                • Only one new Pre, called the Pre 4.
                • Same form factor as the Pre 3 
                • webOS 3.0 release! Updates for TouchPad, Pre 3 and Veer 4G in time for release.
                • Twitter & Facebook integration.
                • Meg Whitman takes the stage for her first public appearance as CEO. No appearance from Leo Apotheker.
                • New dual core Snapdragon, better camera.
                • Finally on T-Mobile!
                • Dual GSM/CDMA chip.
                • “Just Talk” exclusive to Pre 4.
                • 16GB/$149, 32GB/$199, 64GB/$299 on 2-year agreements. Pre 3 cut to $99.

                I’m so excited. Let’s hope I’m right. Excuse me while I go put in my pre-order.

                  An Era of Curation

                  Post-PC.

                  The term has been in our collective minds for a good while now and still, it feels inadequate, inching towards inappropriate. A Post-Personal Computer? Have we passed the time of the computer? What is the stage past personal? Has the computer become intimate? Computers, arguably, are more personal than they’ve ever been. Perhaps we mean to say we’ve crossed over to an era where personal computers look vastly different than they ever have, a reasonable proposition. Tablets, in particular, seem as if conjured directly from a cherished science-fiction novel.

                  If that were the case, post-PC would still be a bad description; tablets bring so much more to the table than a design that separates itself from any pre-2010 computer. Certainly, a revolution is taking place. Computers no longer look the same and are no longer used in the same fashion or the same places, nor aimed at the same customers. Hence the hook of Post-PC; we are definitely witnessing a changing of the guard. All around us, things are no longer what they used to be.

                  At best, post-PC is a quantitative term, merely informing us that this phase follows the “PC” stage and redundantly serving to differentiate between a tablet and a notebook or desktop. The term offers no significant meaning or metaphor for computers today. What is needed is a definition that accurately reflects its time and place. The era of personal computers defines not only a category of devices but a whole generation of computers that were affordable, relatively simple to use and designed with the individual, not the corporation, in mind. Progressively, it came to define the era of digital media, connectivity, and social networking. PC, insofar as it can still be considered a term today, is qualitative.

                  Surely, our aim should be to find a term to replace Post-PC? What wording can adequately describe the way we interact with computers presently?

                  I would propose we’ve now begun the era of the curated computer.

                  Curated?

                  Wikipedia describes the role of a curator as follows:

                  In smaller organizations, a curator may have sole responsibility for the acquisition and care of objects. The curator will make decisions regarding what objects to collect, oversee their care and documentation, conduct research based on the collection, provide proper packaging of art for transport, and share that research with the public and scholarly community through exhibitions and publications…

                  …In larger institutions, the curator’s primary function is as a subject specialist, with the expectation that he or she will conduct original research on objects and guide the organization in its collecting. Such institutions can have multiple curators, each assigned to a specific collecting area (e.g. Curator of Ancient Art, Curator of Prints and Drawings, etc.) and often operating under the direction of a head curator. In such organizations, the physical care of the collection may be overseen by museum collections managers or museum conservators, and documentation and administrative matters (such as insurance and loans) are handled by a museum registrar.

                  One line in this definition is of particular interest.

                  the curator’s primary function is as a subject specialist

                  To this I would also add that a curator is responsible for the crafting and informing of experiences. If you’ve ever attended a gallery exhibition, for instance, you are witnessing the work of a curator. Curators gather resources and, combined with their expertise, create a particular experience intended to be shared with a specific audience. The term has even crept online: a quick glance at some of your favorite websites and you’ll find passing reference to “curation”. In this case, the term refers to the care and attention the author is placing in the exhibition and content of his website.

                  Today, with the technology available to us, we have begun to curate our computing experience. Curated computing can describe the following:

                  • The user acting as a curator, able to select hardware and software based on his needs.
                  • Hardware and software that can be curated, presenting limitless, interchangeable choices while posing no restrictions to the user.
                  • The specific and refined computing experiences resulting from the relationship between the curator and the material he curates.

                  Of course, it’s also important to describe the conditions of the technological world the era of curation defines. After all, if this era is to succeed the era of the PC, it must allude to specific changes that have taken place. More specifically:

                  1. A general performance standard that is no longer limited by physical constraints.

                  While users have always required specific things from their computers, it is only recently that hardware has been able to provide solutions for almost any possible usage scenario. Only five years ago, it was unfathomable to think a cinematographer could shoot, edit and publish a feature-length film using only his mobile phone. That type of work used to be the sole domain of expensive and complex workstations. Even something as simple as checking email or browsing the web, until recently, required a capable machine for the experience to be enjoyable. Pre-2007 smartphones may have been capable of email and browsing, but the experience was mediocre at best. Design, portability and functionality almost always came at the expense of performance and usability.

                  Things have come a long way since. Arguably, almost anything a typical user could want to accomplish with his computer can be done equally well on a traditional computer as on a smartphone or tablet device. Technological advancements in component efficiency and miniaturization have long since closed the performance gap between device categories, nearly removing altogether the need to consider computer components in purchasing decisions. Today, design and functionality have become the primary considerations of most users.

                  The MacBook Air perfectly encapsulates this transition. Consistently the most underperforming Mac in Apple’s lineup since its inception, the Air has gone from a device with limited applications to, in its current incarnation, arguably the ideal notebook for almost everyone.

                  To illustrate my point: imagine someone who needs a device capable of handling email, web browsing, text editing, basic picture and video editing, games and communication. Now try to name a device available commercially today that can’t reasonably do all these things. (Software experience notwithstanding, for the sake of the argument.)

                  Today, instead of having to limit our choices to only what is powerful enough to edit spreadsheets, we can choose devices that truly fit our lifestyles. Computer hardware today does not differentiate itself only by what it can do; it stands out for who will be using it, when it will be used, where, and why.

                  2. Software design that values specificity, optimization and human interaction above all else.

                  Better hardware has also meant better software. Whether by harnessing the computing capabilities of multi-core processors and 64-bit architectures or through more efficient coding, rich and capable software is now available on almost any computing device you can get your hands on. Optimizations, crafty workarounds and resource management in coding languages have also helped maximize the potential of what we once considered underpowered hardware.

                  Not only is software faster and more powerful than it previously was, you can now find the same applications on your mobile phone as on your workstation. Apple’s iWork suite is available on every device it sells outside the iPod. iMovie is on our cellphones and pro applications like Final Cut no longer require top-of-the-line hardware to be useful. No matter what sort of device you own today, it’s almost certain that if you can think it, you can achieve it.

                  Even more interesting to us is the growing trend towards specialized applications, propagated mainly by the vast array of “apps” available on mobile devices. A simple, non-exhaustive description of software today could include:

                  • Apps that do one thing, and one thing well: Instead of catch-all applications such as Microsoft Word or iTunes, applications are now designed with as few as just one point of emphasis. Think iA Writer, Instapaper, Tweetbot. You can also find it in operating systems. iOS 5, for instance, splits the iPod app into two separate Movies and Music apps. With a single focus, the quality and ease of use of desktop and mobile software has never been as great as it is today.

                  • Design that focuses the user on the task at hand: A key innovation that iOS helped germinate in Mac OS is the full-screen app. Dedicating the entire screen’s real estate to a single application is helping drive innovation in software design and human interface philosophy, changing the way we experience and interact with new software and simultaneously bringing a fresh twist to old applications. Remember when we looked at inboxes as a single column? Beyond functionality, full-screen apps are also transforming devices from multi-purpose tools into specific, single-purpose devices at any given moment. Ask 10 different iPad users how they use their tablets and you’re bound to get 10 wildly different answers. One may use it exclusively as an e-reader while another includes it in his DJ equipment boxes. Although as multi-purpose as ever, specificity in software design causes users to reconsider the identity of the devices they use. What one nerd calls a tablet computer is another person’s cash register, medical filing cabinet or personal bookstore. Combined with gadgets that require less and less maintenance, the user can custom-craft his experience with as little or as much computer savvy as those experiences require.

                  3. Content becomes device independent.

                  Finally, the magic of the internet and its related services has come and connected together almost any and every device you can make use of. While emails, contacts and calendars have long been transferable between devices, we’re fast approaching the stage where almost any kind of content can be accessed and manipulated in the same way across a multitude of devices. As Shawn Blanc describes it, computing today has “[A]ll your stuff on all your devices in any place.” Experiences are now repeatable from one device to another.

                  Content becoming independent of its client device is a good thing. When Steve Jobs stood on stage at WWDC this past summer and called the new role of the Mac a demotion, he probably didn’t intend for it to sound as pejorative as many understood it to be. Sure, you could say the role of the traditional PC has been diminished, but that’s ignoring the net benefit to computer use as a whole. The homogenization of your data across all types of hardware, as well as the client software powering it, is enabling experiences that were previously unimaginable. Nerds may roll their eyes and point to the myriad ways we’ve been able to share data between devices for decades, but they would do well to consider that syncing and storing basic data has never been as simple and accessible to the mass market as it is today. Furthermore, we’re fast reaching the point where what you can do with a file on one device is the same as what you can do on any other device. Content is free at last.

                  Steve Lyb:

                  The iPad is an empty slate, and what you make of it is ultimately what comes to define its role in your life.

                  Lyb, in one sentence, perfectly captures the essence of curation. Although he is focused on the iPad’s chameleon-like qualities, he’s also describing the kinds of experiences we are beginning to have with every device we own, tablet or otherwise. Every part of a computer, from its hardware to its most basic applications, can be customized, carefully selected and arranged to achieve a specific goal. While we’ve always had choices, we’re no longer beholden to their consequences. A smartphone needn’t be an emergency mail client. A desktop no longer presents Rube Goldbergian complexity to the average user. Choosing one computer over another is no longer making a sacrifice or picking one thing over another; it is simply one more flower in your own customized technological bouquet.

                  The upsides are obvious and apparent. Purchasing a device to stay on top of your Facebook timeline and TMZ no longer requires you to deal with anti-virus, disk defragmenting, RAM usage and CPU temperatures. Any Android, webOS, iOS or Windows Phone OS device will get you browsing the web in no time. If you do nothing but fire up mobile Safari for as long as you own your iPhone, you’d happily be none the wiser to everything else it can offer. Conversely, gamers can still, to their hearts’ content, trick out custom rigs while creative professionals use workstations capable of rendering the frames of the next great American film.

                  These are simple, one-dimensional examples. Curation doesn’t pose restrictions. Imagine your local coffee house offering its beverages on iPad menus, where your order is processed through a dedicated payment solution on your barista’s iPhone, all the sales data seamlessly being sent over to his accountant’s laptop some unknown distance away. And again, with hardware and software colluding to make the “computer” as invisible as possible to its user, the experience becomes solely what the user makes of it. A common expression in the past might have been: “I’m going to use this computer to write my paper”. Today, you’d say: “I’ve written my paper using this computer.”

                  Many claim this is the reason behind the appeal of the iPad. They correctly emphasize its ability to transform into whatever the user requires and its ease of use as its strong suits. Yet what most people, and many tech companies, fail to realize is that those qualities shouldn’t and aren’t exclusive to tablets. The iPad’s popularity reflects a mass desire and need for a device delivering the kind of experiences the iPad does. A device that could be curated. All the iPad did was answer the call.

                  Our experiences with computers today contradict our preconceptions of what computers are, or rather, used to be. Whether through industrial design or software engineering, tablets, phones, notebooks and desktops are becoming more and more abstracted. Apple products are again a good example. Here’s another exercise: Take your iPhone, iPad or MacBook. Turn it off. Imagine all the logos and FCC markings have disappeared. What does that object now resemble? You know it is a computer, but what would someone in 1990 say it was? Someone in 2000? Taken to the extreme, imagine a MacBook Air with no ports and with the lid closed. Could someone today even identify it as a computer? Try the same experiment with your iPhone.

                  Of course, the optimization of hardware components has changed the design of computers. Integrated circuitry and non-replaceable parts help miniaturization and design creativity, but the biggest impact may be caused by the psychological effects of those advancements. When a user no longer has to replace a battery, hear fans churning along or wonder which port is a USB port, his relationship to his computer becomes second nature.

                  You turn on your computer. You launch a program. Add more RAM. Plug in a monitor. Change batteries.

                  While those types of behaviors still exist, new paradigms of interaction have come and changed the user/device relationship. The most glaring is the popularization of touch interfaces, which removes almost all levels of separation between the user and the computer.

                  Touch interfaces have also influenced the industrial design of tablets, notebooks and smartphones. As devices are touched and held more frequently, the design of computers has had to focus on ergonomics beyond only the keyboard and mouse. Now the entire machine must be designed with physical interaction in mind. Using computers today is an increasingly sensory experience, one that provides experiences never possible in the past, as well as new challenges to overcome.

                  Apple, Microsoft, Google and Amazon all have different approaches towards the design of modern computers. Whether you believe one tech company’s philosophy to be better than another, one thing is sure: Prioritizing intuitiveness, ergonomics and usability has created entire new classes of devices which invite and inspire users to interact with them regularly.

                  Not everything is rosy and peachy, of course. Four years into the revolution the iPhone (arguably) started, there are still kinks to be worked out. Even if most would agree Apple’s iOS ecosystem is the standard against which to hold all others, it isn’t without reproach. Apple’s penchant for skeuomorphic design, for instance, has received mixed reactions. On the surface, skeuomorphic design makes sense: if computing devices are going to replace almost any real-world object, perhaps it’s best to mimic those objects through software, for the user’s sake. In practice, results are mixed, with applications varying between kitsch and pointless. Computers should strive to be more than just a replacement for your address book or calendar; they can and should perform more efficiently.

                  Apple isn’t alone in suffering from growing pains. Microsoft, despite a beautiful, well designed Metro UI, seems dedicated to tacking it onto the old, regular version of Windows. Both the future and the past. Android, for its part, oscillates somewhere in the middle, developing an OS built from the ground up yet seemingly designed by committee, committing to neither old nor new paradigms of software design. (One assumes they leave this up to the third parties licensing their software.)

                  Transitioning notebook and desktop OSs is proving to be another challenge. Mac OS X Lion is a bold step in the right direction, but it remains to be seen how best to transition and accommodate a slew of new customers who are developing habits in iOS first and then Mac OS, instead of the other way around. Microsoft, whose Windows platform is arguably vital to its well-being, will have to step up to a similar challenge with Windows 8.

                  Computers are no longer about finding the best way to input and analyze data. Both hardware and software must engage and relate to the user. If the attention garnered by any new mobile OS release is any indication, it’s an avenue still ripe for exploration.

                  In many ways, much of what has been described above sounds like the “hyper-PC”, and that’s a perfectly reasonable stance to take. But it would be shortchanging the situation. Merely stating that computers are more personalized is a shallow perspective. So is calling the future “touch-based”, or any derivative thereof. While touch interfaces are certainly a new and innovative paradigm in computer design, it’s reductive to claim that the future is about computers “designed for touch”. It is but one component.

                  Computers are finally powerful and malleable enough to truly consider the user’s needs in a meaningful way. Who is the user? How old are they? How many users are there? Where are they using their devices? In what orientation? How many devices? With fingers, or keystrokes? All-day use or intermittent? In their pockets or on a desk? Developers and designers now have near-infinite possibilities when crafting unique experiences.

                  And yet, that alone would insufficiently describe our “post-PC” era. What we do with all those possibilities is just as important as the devices that can achieve them.

                  Hence curation. Curation characterizes the result of our relationship with technology. The smartphone you use to check email can be the same you use at work to present keynotes. You might use that same device to reach loved ones on vacation or to glance at while reading your children to sleep. And in each case, that phone, tablet or laptop seamlessly conforms to you, delivering everything you might need from it at that precise moment and nothing else.

                  Like art gallery curators, we choose the lighting, the sounds, the sights, the mechanisms of our technological exhibitions and create a definitive experience that changes along with our whims. They are as simple or as complex as we make them out to be. Auxiliary considerations (battery management, system requirements, storage, memory allocation) are by and large invisible.

                  To wit, I’m writing this article on my iPhone with an external keyboard, completely absorbed in the pulsating blue cursor on the screen in front of me. At this precise moment, I need only focus on my role as a writer, with my iPhone as my typewriter. In a few moments, I’ll put the keyboard away, my iPhone will become a phone again and I’ll call my girlfriend to say I’m coming home. Later tonight, this phone will be our movie theatre, showing whatever movie we might find on Netflix. This is what computers are capable of today and how we think about them. As a curator, the experiences I have with my phone are entirely of my choosing and there are no unwanted intrusions.

                  And on Monday morning, at 5 AM, this iPhone will be my nemesis: my alarm clock for work. I do wish I could curate that one differently.

                    iPhone Essential

                    Patrick Rhone, on the iPhone being the only computer you might need:

                    The real challenge is overcoming our comfort, convenience, limits and pre-conceived notions.

                    Like many others I’m sure, reading Rhone’s account of using the iPhone exclusively to get through the day made me frown with skepticism. Seeing how I had tried using the iPad as my only computer in the past and came to the conclusion it wasn’t enough, I had my doubts the iPhone could fare any better. So initially, I just dismissed it as Patrick Rhone being Patrick Rhone: explorer of minimalist workflows.

                    Still, something about his article struck a chord with me: Was I pigeonholing the iPhone based on my actual experience using it or because I wanted to keep telling myself I was still a “power user”?

                    So this week I ditched my PowerBook and spent the week using my iPhone as my only computer, augmented by an Apple wireless keyboard to ease the writing load. I’d like to say it was a revelation, an experience, or that there are some exciting anecdotes I can entertain you all with. Really, all it did was confirm my suspicions. My iPhone was already the best and most frequently used computer I own. Leaving my other devices behind was, as Rhone puts it, “no challenge at all.” Most of all, encouraging myself to only reach for my 3GS helped me realize how effortlessly I had already been doing that every day.

                    A simple inventory of my computer use during a typical day reveals how much I already depend on my iPhone for almost everything I do. If I had to put a number on it, I’d say that almost 80% of all my tweeting, RSS-reading, email checking and web browsing is done on my iPhone, even at home. In those regards, the iPhone applications dedicated to those activities are arguably some of the best there are.

                    These days, about the only activities I didn’t use my iPhone for were writing and maintaining this website. And like Rhone, a keyboard and Plain Text are the only two things I needed to change that. Previously, before I bothered to learn Markdown, I’d ditched the iPad as a publishing device because formatting articles for the web on iOS devices was a chore and quite often frustrating. Of course, those frustrations were of my own making. With Markdown, formatting is as simple as I choose to make it. Which is to say quite simple.
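
                    To give a rough sense of what I mean (a sketch of my own, not from Rhone’s piece): instead of fiddling with raw HTML tags on a glass keyboard, Markdown lets me type something like

                        ## iPhone Essential

                        *Some* articles are worth [reading twice](http://example.com).

                    and have it converted to proper HTML headings, emphasis and links at publishing time. Nothing to open, nothing to close, nothing to fumble on a touchscreen.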

                    Is using an iPhone a perfect solution for everyone? No, of course not. Even a year ago, my computing demands would have far surpassed what my iPhone is capable of, unless there’s an iOS version of Adobe Lightroom floating around I haven’t heard of. But for what I need today, it’s plenty enough, save perhaps for the few times I need to do some CSS or HTML work, or edit some images in Photoshop. Those tasks, in aggregate, are so infrequent as to confirm that my laptop is now my secondary device. Call it a demotion if you will, but I’ll be making use of it from here on out as a tool dedicated to specific goals.

                    I’d talk about my specific, iOS-only workflow, save for the fact that it would be mostly uninteresting. Most of you can guess which apps I use and how. In fact, I suspect my habits with those apps aren’t far from yours. In most cases, you’ll already have made up your mind on the pros and cons of using them instead of the “applications” you use on “real” computers. That includes the constant inquisitive stares you’ll receive everywhere when strangers see you typing on a free-floating keyboard with no apparent screen in front of you.

                    As a geek, I like to rationalize my need for the latest and greatest gear. Like many, I lust for “cores” and RAM numbers that have more than a single digit. I want to say I’d make use of dual, triple or even quadruple Thunderbolt displays pumping out petabytes of RAID backups of all my seasons of The Office. That scenario sounds great. Unfortunately, I’ll likely never need any such behemoth, certainly not to keep updating a Tumblr blog. I’m not advocating you throw everything away because of how awesome the iPhone is. There are caveats to everything and this device is no exception. I’m not quite sure I’m even trying to advocate anything at all. What I can say, however, is that I’ve never been so happy with my iPhone as I have been this week, and at the very least, my technological self has gotten a little wiser, a little more attuned to itself.

                    This week, living exclusively with my iPhone has had nothing to do with something that is “minimal”, but everything to do with what’s essential.