Smarterbits

A Week With iOS 5

So it was an interesting chance to test out some of the features in iOS 5 by leaning on them heavily for the week.

Spears’ article echoes both the motivation and delightful surprise I felt dedicating myself to my iPhone.

I purchased an iPhone 4 last week on a spur-of-the-moment epiphany. I had started a fund for my next computer purchase but found myself asking: why? If I had already come to the conclusion that my iPhone 3GS was my favorite and most used computer, why not invest in a new favorite computer, one with better battery life, a gorgeous screen and snappier performance?

So that’s what I did.

Spears concludes that he’s no longer dependent on his Mac, and I couldn’t agree more. Using a Mac remains as enjoyable an experience as ever, but it’s becoming harder and harder to find reasons to come back to it, other than for specific tasks my iPhone can’t yet handle. More resonant, however, is how dedicating oneself to iOS for a brief period, as Spears and I have, can profoundly unplug you from the perception of the PC as overlord and of smartphones and tablets as the subservient, second-class citizens “power users” make them out to be.

I thought I was fringe, but now I’m encouraging - no, daring - you to try leaving your computer behind for a week.

Be careful, you might learn something about yourself.


    Close Reading of Apple and Google Presentation Slides

    There is something fascinating about choosing words and the results of those choices. Why people choose a specific word over another is equally fascinating: What does it say about its author? What message do the chosen words convey? Who is that message addressed to?

    I thought it would be interesting to compare Android and iOS not by a checklist of features, but rather by a checklist of the words their developers choose to use on their presentation slides. From that alone, so much can be inferred. For example, here are what I think are the boldest statements Apple and Google made during their keynotes.

    Apple simply stated:

    The most amazing iPhone yet

    Google asked:

    Can a machine have a soul?

    Talking about Siri:

    Your intelligent assistant that helps you get things done just by asking

    Describing Roboto:

    Roboto has a dual nature. It has a mechanical skeleton and the forms are largely geometric. At the same time the font’s sweeping semi-circular curves give it a cheerful demeanor.

    Isn’t there so much you can tell about these two companies by those words alone?

    Here’s a list of some other snippets of presentation slide text from the iPhone 4S and Ice Cream Sandwich keynotes. You can click through the links to see which presentations they come from, but for a fun challenge, try guessing from which company’s mouth the words were spoken (or projected, in this case).

    • Bold and Typographic
    • Social Integration
    • Intelligently Stored on Device
    • 33% Faster Capture
    • Zero Shutter Lag
    • Better Color Accuracy
    • Revamped UI
    • Easily Locate Friends and Family
    • Make Me Awesome
    • Full HD Capture
    • Ultra Thin Design
    • Sleek and Curved Design

    Depending on how well you guessed, you’ll have a different impression of how those words shape your perception of the company. If you were able to guess each one correctly, what does that say about the power of words in creating brand identity? Or is the opposite true? What can you surmise about each company’s different values and philosophies from those snippets alone? I would say there’s a definite contrast, nuanced as it is, between the two, but I wouldn’t go so far as to claim one is better than the other. I’m simply fascinated by how much you can learn from paying close attention to words.

      Ben Brooks on iMessage

      Overall iMessage isn’t a feature that is going to change the world, but if your family and friends are all iPhone users (increasingly more likely), it’s a very nice tool to have.

      While I agree with Brooks’ counterpoint to Dr. Drang, I also disagree with his passive underestimation of iMessage’s importance, especially with the 13-21 crowd. Here’s why.

      My sister, who turned 18 over the summer, sends on average a thousand text messages a month. When I asked her about it, she wouldn’t even describe herself as a heavy user amongst her friends, many of whom, coincidentally, own iPhones or iPod Touches. As for myself, the number of texts I send per month hovers in the low hundreds, most of which are to my girlfriend, who also owns an iPhone. I’ve already dropped my $15 unlimited text messaging plan and I’m encouraging her to do the same.

      Of course, these two examples are merely anecdotal. Yet, I’ll go out on a limb and say that they aren’t uncommon. You might even recognize yourself in there.

      Not much ink has been spilled about iMessage in all the iOS 5 reviews I’ve read, other than questions about multi-device synchronization. Unfortunately, it seems their authors fail to realize how ubiquitous text messaging has become as a mode of communication: a generational gap that even takes me by surprise sometimes.

      iMessage is absolutely a game changer. BlackBerry’s own messaging system was already a hit amongst its users, and the iOS install base is exponentially larger, in the hundreds of millions. Apple’s entry is the tipping point: Android will surely follow suit to match the iPhone’s feature checklist, as will Windows Phone 7. Nor would it be a stretch to imagine that future Facebook integration into mobile OSes will make messaging a standout feature. If you think iMessage is only a handy bonus, you’re missing the larger picture.

      Combine those growing platforms with ever increasing Wi-Fi coverage and commoditization of wireless data and it’s easy to see that the writing is on the wall for SMS.

      Update - 5:20pm

      Ben responded to my post, reaffirming his initial position. Perhaps we don’t see eye to eye, but I rather think we’re approaching the issue from different angles. Like him, I agree that in its current implementation, iMessage won’t take over SMS by itself. Ben makes two great points that would help iMessage really take off:

      • It is integrated into Mac OS X.

      • It is opened up so that other platforms can use it (Android mainly).

      I concur, but I still stand by my claim that iMessage is a game changer. Let me clarify:

      • iMessage may not be good enough to overtake SMS entirely, but I’m convinced it’s good enough for many people to reconsider their current SMS plans, perhaps even dropping to a lower, cheaper plan. This will hurt the carriers, especially at iOS’s rate of growth.

      • iMessage has a large enough install base to make SMS alternatives viable and, more importantly, popular beyond the comparatively small niche of BBM users. As I previously stated, Google and Microsoft will want to play a part in this as well, so I’d predict seeing something similar on Android and Windows Phone in the near future. But it doesn’t have to end with the big players; even if iMessage only stimulates interest in SMS alternatives, it’ll open the door for another third party to come up with the appropriate solution.

      • Text-based messaging is a vital component of communication for a young but growing generation of kids who will take those habits into adulthood. There’s incentive to find a free alternative to SMS. Likewise, you can expect vigorous resistance from the carriers for the same reasons.

      • The individual platforms may not want to adopt a standardized system, which would hurt adoption. Brooks is correct in pointing out the importance of interoperability. Something akin to system wide Facebook integration could be the answer: gargantuan install base and platform independence.

      iMessage is the first jab in the direction of SMS, not the final blow.

      Update - 6:01pm

      Stephen Hackett astutely brings up another vital point.

        The Sorcerer's Apprentice

        In many ways, Forstall is a mini-Steve. He’s a hard-driving manager who obsesses over every detail. He has Jobs’s knack for translating technical, feature-set jargon into plain English. He’s known to have a taste for the Mercedes-Benz SL55 AMG, in silver, the same car Jobs drove, and even has a signature on-stage costume: black shoes, jeans, and a black zippered sweater.

        A fascinating piece of writing. One part insightful profile of one of Apple’s key players, one part hard-nosed attempt at recasting a new Jobs figure for the company’s narrative.

        On the whole, not sure what to make of it.


          An Era of Curation

          Post-PC.

          The term has been in our collective minds for a good while now and still, it feels inadequate, inching towards inappropriate. A Post-Personal Computer? Have we passed the time of the computer? What is the stage past personal? Has the computer become intimate? Computers, arguably, are more personal than they’ve ever been. Perhaps we mean to say we’ve crossed over to an era where personal computers look vastly different than they ever have, a reasonable proposition. Tablets, in particular, seem as if conjured directly from a cherished science-fiction novel.

          If that were the case, post-PC would still be a bad description; tablets bring so much more to the table than a design that separates itself from any pre-2010 computer. Certainly, a revolution is taking place. Computers no longer look the same and are no longer used in the same fashion, in the same places, or aimed at the same customers. Hence the hook of post-PC; we are definitely witnessing a changing of the guard. All around us, things are no longer what they used to be.

          At best, post-PC is a quantitative term, merely informing us that this phase follows the “PC” stage and redundantly serving to differentiate between a tablet and a notebook or desktop. The term offers no significant meaning or metaphor for computers today. What is needed is a definition that accurately reflects its time and place. The era of personal computers defines not only a category of devices but a whole generation of computers that were affordable, relatively simple to use and designed with the individual, not the corporation, in mind. Progressively, it came to define the era of digital media, connectivity, and social networking. PC, insofar as it can still be considered a term today, is qualitative.

          Surely, then, our aim should be to find a term to replace post-PC. What wording can adequately describe the way we interact with computers presently?

          I would propose we’ve now begun the era of the curated computer.

          Curated?

          Wikipedia describes the role of a curator as such:

          In smaller organizations, a curator may have sole responsibility for the acquisition and care of objects. The curator will make decisions regarding what objects to collect, oversee their care and documentation, conduct research based on the collection, provide proper packaging of art for transport, and share that research with the public and scholarly community through exhibitions and publications…

          …In larger institutions, the curator’s primary function is as a subject specialist, with the expectation that he or she will conduct original research on objects and guide the organization in its collecting. Such institutions can have multiple curators, each assigned to a specific collecting area (e.g. Curator of Ancient Art, Curator of Prints and Drawings, etc.) and often operating under the direction of a head curator. In such organizations, the physical care of the collection may be overseen by museum collections managers or museum conservators, and documentation and administrative matters (such as insurance and loans) are handled by a museum registrar.

          One line in this definition is of particular interest.

          the curator’s primary function is as a subject specialist

          To this I would also add that a curator is responsible for the crafting and informing of experiences. If you’ve ever attended a gallery exhibition, for instance, you have witnessed the work of a curator. Curators gather resources and, combined with their expertise, create a particular experience intended to be shared with a specific audience. The term has even crept online: take a quick glance at some of your favorite websites and you’ll find passing references to “curation”. In this case, the term refers to the care and attention the author places in the exhibition and content of his website.

          Today, with the technology available to us, we have begun to curate our computing experience. Curated computing can describe the following:

          • The user acting as a curator, able to select hardware and software based on his needs.
          • Hardware and software that can be curated, presenting limitless, interchangeable choices while posing no restrictions to the user.
          • The specific and refined computing experiences resulting from the relationship between the curator and the material he curates.

          Of course, it’s also important to describe the conditions of the technological world the era of curation defines. After all, if this era is to succeed the era of the PC, it must allude to specific changes that have taken place. More specifically:

          1. A general performance standard that is no longer limited by physical constraints.

          While users have always required specific things from their computers, it is only recently that hardware has been able to provide solutions for almost any possible usage scenario. Only five years ago, it was unfathomable to think a cinematographer could shoot, edit and publish a feature-length film using only his mobile phone. That type of work used to be the sole domain of expensive and complex workstations. Even something as simple as checking email or browsing the web, until recently, required a capable machine for the experience to be enjoyable. Pre-2007 smartphones may have been capable of email and browsing, but the experience was mediocre at best. Design, portability and functionality almost always came at the expense of performance and usability.

          Things have come a long way since. Arguably, almost anything a typical user could want to accomplish with his computer can be done equally well on a traditional computer as on a smartphone or tablet. Technological advancements in component efficiency and miniaturization have long since closed the performance gap between device categories, nearly removing the need to consider computer components in purchasing decisions altogether. Today, design and functionality have become the primary considerations of most users.

          The MacBook Air perfectly encapsulates this transition. Consistently the most underperforming Mac in Apple’s lineup since its inception, the Air has gone from a device with limited applications to arguably the ideal notebook for almost everyone in its current incarnation.

          To illustrate my point, imagine someone who needs a device capable of handling email, web browsing, text editing, basic picture and video editing, games and communication. Now try to name a device available commercially today that can’t reasonably do all these things. (Software experience notwithstanding, for the sake of the argument.)

          Today, instead of having to limit our choices to only what is powerful enough to edit spreadsheets, we can choose devices that truly fit our lifestyles. Computer hardware today does not differentiate itself only by what it can do; it stands out for who will be using it, when it will be used, where, and why.

          2. Software design that values specificity, optimization and human interaction above all else.

          Better hardware has also meant better software. Whether by harnessing the computing capabilities of multi-core processors and 64-bit architectures or through more efficient coding, rich and capable software is now available on almost any computing device you can get your hands on. Optimizations, crafty workarounds and resource management in coding languages have also helped maximize the potential of what we once considered underpowered hardware.

          Not only is software faster and more powerful than it previously was, you can now find the same applications on your mobile phone as on your workstation. Apple’s iWork suite is available on every device it sells outside the iPod. iMovie is on our cellphones, and pro applications like Final Cut no longer require top-of-the-line hardware to be useful. No matter what sort of device you own today, it’s almost certain that if you can think it, you can achieve it.

          Even more interesting to us is the growing trend towards specialized applications, propagated mainly by the vast array of “apps” available on mobile devices. A simple, non-exhaustive description of software today could include:

          • Apps that do one thing, and one thing well: Instead of catch-all applications such as Microsoft Word or iTunes, applications are now designed with as few as just one point of emphasis. Think iA Writer, Instapaper, Tweetbot. You can also find it in operating systems: iOS 5, for instance, splits the iPod app into separate Music and Videos apps. With a single focus, the quality and ease of use of desktop and mobile software has never been as great as it is today.

          • Design that focuses the user on the task at hand: A key innovation that iOS helped germinate in Mac OS is the full-screen app. Dedicating the entire screen’s real estate to a single application is helping drive innovation in software design and human interface philosophy, changing the way we experience and interact with new software while simultaneously bringing a fresh twist to old applications. Remember when we looked at inboxes as a single column? Beyond functionality, full-screen apps are also transforming devices from multi-purpose tools into specific, single-purpose devices at any given moment. Ask 10 different iPad users how they use their tablets and you’re bound to get 10 wildly different answers. One may use it exclusively as an e-reader while another includes it in his DJ equipment. Although the devices are as multi-purpose as ever, specificity in software design causes users to reconsider the identity of the devices they use. What nerds call a tablet computer is someone else’s cash register, medical filing cabinet or personal bookstore. Combined with gadgets that require less and less maintenance, the user can custom-craft his experience with as little or as much computer savvy as those experiences require.

          3. Content becomes device independent.

          Finally, the magic of the internet and its related services has connected together almost any and every device you can make use of. While emails, contacts and calendars have long been transferable between devices, we’re fast approaching the stage where almost any kind of content can be accessed and manipulated in the same way across a multitude of devices. As Shawn Blanc describes it, computing today has “[A]ll your stuff on all your devices in any place.” Experiences are now repeatable from one device to another.

          Content becoming independent of its client device is a good thing. When Steve Jobs stood on stage at WWDC this past summer and called the new role of the Mac a demotion, he probably didn’t intend for it to sound as pejorative as many understood it to be. Sure, you could say the role of the traditional PC has been diminished, but that ignores the net benefit to computer use as a whole. The homogenization of your data across all types of hardware, as well as the client software powering it, is enabling experiences that were previously unimaginable. Nerds may roll their eyes and point to the myriad ways we’ve been able to share data between devices for decades, but they would do well to consider that syncing and storing basic data has never been as simple and accessible to the mass market as it is today. Furthermore, we’re fast reaching the point where what you can do with a file on one device is the same as what you can do on any other. Content is free at last.

          Steve Lyb:

          The iPad is an empty slate, and what you make of it is ultimately what comes to define its role in your life.

          Lyb, in one sentence, perfectly captures the essence of curation. Although he is focused on the iPad’s chameleon-like qualities, he’s also describing the kinds of experiences we are beginning to have with every device we own, tablet or otherwise. Every part of a computer, from its hardware to its most basic applications, can be customized, carefully selected and arranged to achieve a specific goal. While we’ve always had choices, we’re no longer beholden to their consequences. A smartphone needn’t be an emergency mail client. A desktop no longer presents Rube Goldbergian complexity to the average user. Choosing one computer over another is no longer a sacrifice or a matter of picking one thing over another; it is simply one more flower in your own customized technological bouquet.

          The upsides are obvious. Purchasing a device to stay on top of your Facebook timeline and TMZ no longer requires you to deal with anti-virus software, disk defragmenting, RAM usage and CPU temperatures. Any Android, WebOS, iOS or Windows Phone device will get you browsing the web in no time. If you did nothing but fire up mobile Safari for as long as you owned your iPhone, you’d happily be none the wiser to everything else it can offer. Conversely, gamers can still, to their hearts’ content, trick out custom rigs while creative professionals use workstations capable of rendering the frames of the next great American film.

          These are simple, one-dimensional examples. Curation doesn’t pose restrictions. Imagine your local coffee house offering its beverages on iPad menus, where your order is processed through a dedicated payment solution on your barista’s iPhone, all the sales data seamlessly being sent over to his accountant’s laptop some unknown distance away. And again, with hardware and software colluding to make the “computer” as invisible as possible to its user, the experience becomes solely what the user makes of it. A common expression in the past might have been: “I’m going to use this computer to write my paper.” Today, you’d say: “I’ve written my paper using this computer.”

          Many claim this is the reason behind the appeal of the iPad. They correctly emphasize its ability to transform into whatever the user requires and its ease of use as its strong suits. Yet what most people, and many tech companies, fail to realize is that those qualities shouldn’t be and aren’t exclusive to tablets. The iPad’s popularity reflects a mass desire and need for a device delivering the kind of experiences the iPad does. A device that could be curated. All the iPad did was answer the call.

          Our experiences with computers today contradict our preconceptions of what computers are, or rather, used to be. Whether through industrial design or software engineering, tablets, phones, notebooks and desktops are becoming more and more abstracted. Apple products are again a good example. Here’s another exercise: Take your iPhone, iPad or MacBook. Turn it off. Imagine all the logos and FCC markings have disappeared. What does that object now resemble? You know it is a computer, but what would someone in 1990 say it was? Someone in 2000? Taken to the extreme, imagine a MacBook Air with no ports and with the lid closed. Could someone today even identify it as a computer? Try the same experiment with your iPhone.

          Of course, the optimization of hardware components has changed the design of computers. Integrated circuitry and non-replaceable parts help miniaturization and design creativity, but the biggest impact may come from the psychological effects of those advancements. When a user no longer has to replace a battery, hear fans churning along or wonder which port is the USB port, his relationship to his computer becomes second nature.

          You turn on your computer. You launch a program. Add more RAM. Plug in a monitor. Change batteries.

          While those types of behaviors still exist, new paradigms of interaction have come along and changed the user/device relationship. The most glaring one is the popularization of touch interfaces, which removes almost all levels of separation between the user and the computer.

          Touch interfaces have also influenced the industrial design of tablets, notebooks and smartphones. As devices are touched and held more frequently, the design of computers has had to focus on ergonomics beyond only the keyboard and mouse. Now the entire machine must be designed with physical interaction in mind. Using computers today is an increasingly sensory experience, providing experiences that were never possible in the past, as well as new challenges to overcome.

          Apple, Microsoft, Google and Amazon all have different approaches towards the design of modern computers. Whether or not you believe one tech company’s philosophy to be better than another, one thing is sure: Prioritizing intuitiveness, ergonomics and usability has created entirely new classes of devices which invite and inspire users to interact with them regularly.

          Not everything is rosy and peachy, of course. Four years into the revolution the iPhone (arguably) started, there are still kinks to be worked out. Even if most would agree Apple’s iOS ecosystem is the standard against which to hold all others, it isn’t without reproach. Apple’s penchant for skeuomorphic design, for instance, has received mixed reactions. On the surface, skeuomorphic design makes sense: if computing devices are going to replace almost any real world object, perhaps it’s best to mimic those objects through software, for the user’s sake. In practice, results are mixed, with applications varying between kitschy and pointless. Computers should strive to be more than just a replacement for your address book or calendar; they can and should perform more efficiently.

          Apple isn’t alone in suffering from growing pains. Microsoft, despite a beautiful, well-designed Metro UI, seems dedicated to tacking it onto the old, regular version of Windows. Both the future and the past. Android, for its part, oscillates somewhere in the middle, developing an OS built from the ground up yet seemingly designed by committee, committing to neither old nor new paradigms of software design (one assumes they leave this up to the third parties licensing their software). Transitioning notebook and desktop OSes is proving to be another challenge. Mac OS X Lion is a bold step in the right direction, but it remains to be seen how best to transition and accommodate a slew of new customers who are developing habits in iOS first and then Mac OS, instead of the other way around. Microsoft, for which the Windows platform is arguably of paramount importance to its well-being, will have to step up to a similar challenge with Windows 8.

          Computers are no longer about finding the best way to input and analyze data. Both hardware and software must engage and relate to their user. If the attention garnered by any new mobile OS release is any indication, it’s an avenue still ripe for exploration.

          In many ways, much of what has been described above sounds like the “hyper-PC”, and that’s a perfectly reasonable stance to take. But it would be shortchanging the situation. Merely stating that computers are more personalized is a shallow perspective. So is calling the future “touch-based”, or any derivative thereof. While touch interfaces are certainly a new and innovative paradigm in computer design, it’s reductive to claim that the future is about computers “designed for touch”. It is but one component.

          Computers are finally powerful and malleable enough to truly consider the user’s needs in a meaningful way. Who is the user? How old are they? How many users are there? Where are they using their devices? In what orientation? How many devices? With fingers or keystrokes? All-day use or intermittent? In their pockets or on a desk? Developers and designers now have near-infinite possibilities when crafting unique experiences.

          And yet, that alone would insufficiently describe our “post-PC” era. What we do with all those possibilities is just as important as the devices that can achieve them.

          Hence curation. Curation characterizes the result of our relationship with technology. The smartphone you use to check email can be the same one you use at work to present keynotes. You might use that same device to reach loved ones on vacation or to glance at while reading your children to sleep. And in each case, that phone, tablet or laptop seamlessly conforms to you, delivering everything you might need from it at that precise moment and nothing else.

          Like art gallery curators, we choose the lighting, the sounds, the sights, the mechanisms of our technological exhibitions and create a definitive experience that changes along with our whims. They are as simple or as complex as we make them out to be. Auxiliary considerations (battery management, system requirements, storage, memory allocation) are by and large invisible.

          To wit, I’m writing this article on my iPhone with an external keyboard, completely absorbed in the pulsating blue cursor on the screen in front of me. At this precise moment, I need only focus on my role as a writer, with my iPhone as my typewriter. In a few moments, I’ll put the keyboard away, my iPhone will become a phone again and I’ll call my girlfriend to say I’m coming home. Later tonight, this phone will be our movie theatre, showing whatever movie we might find on Netflix. This is what computers are capable of today and how we think about them. As a curator, the experiences I have with my phone are entirely of my choosing and there are no unwanted intrusions.

          And on Monday morning, at 5 AM, this iPhone will be my nemesis: my alarm clock for work. I do wish I could curate that one differently.

            Subtle Difference

            Seth Clifford:

            That doesn’t seem to be happening with iPads, because I think people’s expectations are set accordingly when they buy them. These are not full computing devices; they’re not built to be - and yet when you watch the commercials, what do you hear? The “full” internet. Flash. Do it all. Why wouldn’t people be disappointed when they can’t actually replace a computer with a device that promised they could?

            Clifford is wrong, but not in the way he thinks. The iPad may never advertise itself as a computer replacement, but it does market itself as a device that can do individual tasks: checking email, browsing the web, watching a movie, preparing a presentation, playing games, etc.

            As it turns out, most average consumers use their computers for a handful of these lightweight tasks, sometimes only one or two. The magic happens when they purchase an iPad for a specific reason, discover how wonderful it is to use, and never return to their comparatively grotesque and unpleasant computers.

            Apple doesn’t have to market the iPad as a computer replacement because they believe the iPad is replacing the computer. Subtle difference.

            Via Daring Fireball


              Games Defeated Apple

              Hmm… reading… drawing… oh, and what’s this we have here?

              Imagine if Apple did have a plan for gaming.


                Time to Tab

                Myself, a few months ago, speaking about what makes Apple’s advertising so effective:

                Simple premises. Concise messages. Clear, distraction-free presentations. They invoke an emotional and personal response. And while they involve a narrative, the product has an active, rather than passive, role.

                I’m mentioning this because of the new Galaxy Tab 10.1 commercials that have begun airing, appropriately titled “Time to Tab”. Somewhere in the ad agency representing Samsung, someone has been taking notes on the iPad’s marketing.

                Compared to the original Tab commercials, or even those of its rival Android tablets, this new spot seems almost too good. So far, Android commercials have largely been about spacecraft, strange elemental phenomena, absurd geekery and Tegra chipsets. “Time to Tab” finally places you, the user, in the leading role, demonstrating how the Tab can fit into your lifestyle through a series of vignettes showing people, presumably just like you, achieving wonderful things with the Galaxy Tab.

                This commercial is much more in the mold of Apple’s recent iPad commercials, such as the “We’ll Always” clip. Denying the influence would be a bold endeavour. Answering which is a better commercial is moot; how would one go about guessing the intentions of each company and whether or not those have been met?

                There’s no harm in comparison though:

                • Pay attention to the soundtracks. What sort of emotions do they elicit in the viewer? The Tab’s soundtrack is energetic, youthful, even dramatic in an action-sport sort of manner. The iPad spot is softer, reflective, emotive. One isn’t necessarily better than the other, but the Tab’s music doesn’t fit thematically as well as it could with each vignette. On the same topic, compare the voices of the two narrators.
                • Curiously, the Tab spot doesn’t spend much time with each of the various scenarios. Unfortunate, since some segments, like the museum and safari scenes, are actually quite original and interesting. It would have been nice to see some of these expanded into individual 30-second spots.
                • Samsung still emphasizes features of the device through overlaid text. It complicates the ad needlessly and distracts from the potential connection viewers could make with the narrative on display, which is why Apple eschews such things completely. Why do they keep doing this?
                • Every line in a commercial is specifically and strategically chosen, maximizing the limited time inherent to the format. “Time to Tab” starts with the line “People, it’s time for a better tablet”. “We’ll Always” starts with “We will never stop sharing our memories”. Apple also makes heavy use of repetition, starting each sentence almost the same way. I’m not sure if this is merely stylistic, or if it is of any actual use.
                • Notice how every action in the “We’ll Always” spot is one that any average person might do numerous times on a regular day. Not everyone gets the chance to paint murals, scale mountains, chase lions in the savannah or be some high-ranking company man. For Apple, the iPad integrates into and improves your current life. In many of its commercial’s vignettes, the Galaxy Tab is presented as an aspirational and empowering device. Presumably, owning one should turn you into someone with an enviable life. Simplicity versus complexity. It’s a subtle and important distinction to make. Does one commercial make it easier to envision oneself using the device?
                • Interesting to note how Apple homes in on “continuing” to do the same things we’ve always done.

                The endgame here isn’t to criticize one commercial or another but simply to study the different methodologies of each company’s marketing. Creatively, ads are a window into a brand’s psyche. They are a chance to see how companies view themselves and how they wish to project themselves to others. This new advertisement for the Tab is much improved over the previous crop of Android tablet clips. Yet, despite adopting some of the ideas and themes from Apple’s own commercials, the message, and the manner in which it is perceived, continues to be vastly different. Notice how little is even discussed about the iPad itself. Everything is about you, today, as you are and how you will be tomorrow.

                The iPad associates itself with your life.

                Comparatively, the Galaxy Tab’s ad is the technological equivalent of the Marlboro Man: some fantasized, idyllic and ultimately illusory identity only achievable through the purchase of a particular product.

                 

                  The Fire

                  Jerry “Tycho” Holkins: 

                  The reality for me is that even as a “gamer,” an enthusiast, a true adherent, my phone is enough even on a trip. Period. The premium cachet associated with Portability capital-P has been obliterated, being portable is insufficient as a motivator when I have multipurpose computational sliver perpetually en pocké.

                  As I was saying.


                    Just a Multipurpose PDA

                    Speaking of skating to where the puck was, Alex Penny disagrees with my assessment of Nintendo’s slumping 3DS sales:

                    iOS devices have absolutely nothing to do with the drop in price of the 3DS. No one still considers them to even be portable gaming devices, as they’re still closely linked to being multipurpose PDAs.

                    Penny, if he actually believes iOS sales aren’t affecting Nintendo, is in denial. I say this because he manages to nail exactly why the 3DS can’t compete with iOS devices when he calls them multipurpose PDAs. He probably uses the term pejoratively, but he’s on the right track. The iPod Touch, iPad and iPhone aren’t just gaming devices, they’re almost anything you want them to be at any given time in an incredibly portable form factor. Which do you think seems more attractive to mass consumers?

                    Penny instead blames the unreleased Playstation Vita, which he claims:

                    has sparked more attention from portable gamers as it virtually outdoes the 3DS in every aspect now.

                    That must be it.
