Paul Miller's Search for Digital Authenticity
In an article titled The Condescending UI, Paul Miller achieves the truly bizarre: user interface status-seeking. What starts out as a defensible critique of the slow evolution of user interfaces quickly turns into a solemn lament about what essentially amounts to Miller's own loss of geek credibility.
BeOS is purely digital, with a sort of 8-bit charm, complete with pixel-perfect isometric icons. It’s much like the appeal of “retro” indie games, which deal in our native, shared gaming language and metaphors, not something borrowed from action movies or an overblown sense of virtual reality.
Miller points to BeOS as the counterpoint to the overblown UI metaphors of modern-day operating systems. But it's his comparison to retro gaming that gives him away as a status-seeking elitist. The appeal of retro gaming certainly is nostalgia, but in this case, Miller isn't describing nostalgia for childhood memories of weekends spent playing MegaMan. He's nostalgic for a time when games weren't mainstream, and for the distinction granted by being known as a "gamer". Now that games are more accessible and geared towards the interests of a huge and varied demographic, that distinction has disappeared. Most everyone can and does play games. What retro indie games did was provide an avenue for games to be edgy again. They often require more technical proficiency to enjoy and are geared to the tastes of the few rather than the many. Indie games restore that long-lost distinction.
Now Miller is trying to do the same with user interfaces. He's tired of having his hand held by modern-day operating systems.
A huge graphical icon representing an app might look incredible and enticing, but after a while it’s sort of oversharing. It’s constantly reintroducing itself, in case I didn’t catch its name the first time.
Miller hates that UI language speaks to everyone and for everyone. He may be disguising it as a critique of UI design, but he’s mad that computers have become geared towards the layman user. Finally, he offers up his temporary solution.
In my personal quest to escape the condescension, I recently switched my Windows 7 install over to the “Classic Theme,” which is basically Windows 95 incarnate, just with all the under-the-hood improvements I’ve come to rely on. I really like it. It feels right, and if it isn’t beautiful, at least it’s honest.
Putting aside all questions about exactly what constitutes an "honest" user interface, Miller's best answer to the lack of evolution and digital authenticity in UI design is to return to something resembling a clunky, dysfunctional and abstracted artifact from the past that nearly no one but the most technically savvy of users could enjoy navigating. In other words, he desires the same thing retro indie games did for gamers, namely, the restitution of the distinction that came with being someone who could make the best of a shitty computing experience.
A contrived and convoluted article about nothing more than cool-hunting. What’s not honest and authentic about that?
In Exchange With Gets Its Own Website!
If you're on this site even semi-frequently, then you already know all about In Exchange With, my series of interviews with various web personalities. This project has been close to my heart and tons of fun, so I've decided to give it a proper home: its very own website. The interviews look, and read, better than ever. It's an exciting step forward in this hopefully long-term project.
Don’t wait for my permission, go ahead and subscribe to In Exchange With on RSS and Twitter.
An Impromptu Real-World Test of Siri and iOS 5 Voice Dictation
Unfortunately, my iPhone 4S voice experiments didn’t go well for at least one party involved: my dogs. We’ve trained them fairly well and they know quite a few voice commands. Nearly every time I started talking to Siri or dictating part of the blog post, they would look back or hesitate at the next turn. I wager they can get used to this as time goes on, and maybe I can get in a better habit of addressing Oscar and Maddy specifically when a voice command is for them, and not my phone.
Conversing with other people has never caught the dogs' attention whenever I've had the pleasure of taking them for walks. I'm left to wonder, then, whether there's a commonality between the way we speak to Siri and the way we talk to pets. And what should we make of the difference between that mode and how we communicate with other humans?
"Command" stands out in my mind, but that feels like an unsatisfying answer.
How a Quantum of Transparency Could Have Made Carrier IQ a Non-Issue
As MG Siegler puts it, this whole Carrier IQ thing is being blown out of proportion. However, I don't blame the media for running wild with the story; I blame Carrier IQ and its clients (whether they are the carriers or handset manufacturers) for lobbing them such an easy pitch by not having one ounce of transparency in their operations. Having failed at that, Carrier IQ is now the latest taint no one wants to be seen with at the dance.
If it seems as though Carrier IQ is a sinister entity, it’s probably because popular culture has taught us that organizations with secrets are always concealing some evil plot from our favorite protagonists. In reality, one of two things probably happened:
- Carrier IQ and whoever implements it on their products didn't see why they would need to inform users about what was going on.
- Carrier IQ and whoever implements it on their products probably thought more people would opt out if they were well informed about it, and wanted to avoid hindering the value of the data being collected.
Let's be honest: there isn't any baleful character sitting behind a sea of monitors collecting our information and plotting against us. At best, Carrier IQ likely provides handset manufacturers and carriers valuable information that, even if we might not understand how, helps them improve their services and products. At worst they… what? Sell our information to advertisers? Businesses are in the business of making money, and there isn't much of it to be made in knowing who you called last week, unless it helps them subliminally make you dial restaurant delivery services more often. Whatever the case may be, the least all parties involved could have done was inform users about the data monitoring. People rightly deserve to know what is being done with their personal information, and to be given a clear and obvious opportunity to opt out if they so desire.
Carriers and handset makers, this isn't even hard to implement. Simply display an easy-to-understand (read: non-legalese) statement about your desire to collect users' information. Then clearly state what information you plan on collecting and give them the choice to opt in or out.
This would seem such an obvious, common-sense approach that not taking it only adds to the dubiousness of the whole situation. People end up thinking: well, if none of Carrier IQ's clients chose to even reveal its existence, it must be because they have something to hide.
In contrast, Apple does in fact make an effort to inform users about its data collection policies. Yet the experience could be improved. I'd confidently venture that most people never dig deep enough into the settings menu to know the "About Diagnostics and Privacy" document exists. And having likely never read it, chances are a user won't know the data collection is turned off by default. Too often, an uninformed person is also one prone to blowing things out of proportion.
Which is precisely how we’ve gotten to this point.
In summary: no one will mistake you for the bogeyman if you remember not to stand in the shadows.
Kill Screen Reviews Skyward Sword
This sense of being trapped in a pedagogic exercise is inseparable from the core meaning of Zelda. Skyward Sword is a learning simulator, a near-perfect progression of object lessons—physical and logical—that revive in adults the forgotten sense of amazement in learning, while remaining uncomplicated enough to engender a sense of grown-up mastery in younger players. This trick cannot be completed without the invention of a latticework of dishonesty so imaginative it veers into the supernatural. It is a world where dragons can send you underwater to collect dozens of fish shaped like musical notes and where playing a harp causes glowing cryptograms to shine through your skin.
I've been championing the death of spec-based reviews for a while now, so here's a fantastic example of what's possible when you stop focusing on technicalities. Even if you don't plan on playing the game, it's a fantastic read and a thought-provoking analysis of the Zelda series as a whole.
The Case Against Curation
A web site or a blog with links pointing to other web sites or blogs is not being curated.
The problem with Murray’s arguments is that the emphasis on what is being curated is all wrong: Writers curate the entire content and structure of a website, not only links to outside articles.
Writers producing a website certainly do “reporting” and “editorializing”, as Murray and Hackett suggest, but they also design, maintain, research, advertise, network and connect with a larger audience. Considering a website such as this one as the sum of those many different actions, curation seems an adequate term to describe them.
Murray defines curation as involving only the preservation or organization of a body of work. While the profession includes components relating to preservation and organization of a gallery’s collection, curators are also responsible for the presentation of those collections to the public. Here’s my definition, from a previous article on the subject:
…a curator is responsible for the crafting and informing of experiences. If you’ve ever attended a gallery exhibition, for instance, you are witnessing the work of a curator. Curators gather resources and, combined with their expertise, create a particular experience intended to be shared with a specific audience.
I don't work for a museum, but I certainly feel like the above precisely describes what I do here every day. Last time I checked, "reporting" wasn't the exclusive domain of people with journalism degrees anymore, so I don't see why only professional curators should be allowed to curate.
Following this week's duo of fantastic articles by Brent Simmons and Rian van der Merwe attempting to explain the dearth of enjoyable reading experiences on the web, it struck me that a big part of the problem is determining who your audience is.
In an essay titled Investing in the Future of News, Alberto Ibargüen outlines a few basic questions any publisher or journalistic entity should ask themselves as they transition their projects into the digital realm:
There is no easy route to success in this emerging digital space. But I think these questions, taken together, hold the key:
Why do people need the information you provide?
Do you provide utility?
Do the things you cover matter to the community?
What is your point of view and how will you reflect it?
Where and how do people want the information?
How will you engage the audience?
While seemingly self-evident, there is one factor that can wildly influence the sort of answers a publication comes up with: determining who, precisely, the "audience" is.
Ostensibly, the "audience" for any publication should be you and me, its readers. However, judging from the copious examples inspiring Simmons' and van der Merwe's arguments, it would seem publications have determined their actual audience to be shareholders, investors and boards of directors. So while they are in fact trying to answer those aforementioned questions, their judgement is clouded by misconception. Advertisers are the ones provided with utility. Focus is directed towards delivering trite content through the latest fad delivery mechanism, all in the hopes of attracting fleeting eyeballs and clicks. The result is a concerted effort to nudge profitability ever so slightly ahead in order to appease the "audience".
Meanwhile, another audience suffers. Disconnects. Moves on.
Ibargüen believes the "future will be written on the answers" to those questions. It certainly will, one way or another.
The obvious conclusion one can draw from this article is that Craig Mundie is likely a dunce on account of claiming, straight-faced, that Apple's success is due only to "good marketing".
A more insidious conclusion would be that Mundie is a dunce, since he’s completely oblivious to his own admission that Microsoft spends no effort adequately marketing their own products, some of which actually do have compelling features consumers would be interested in knowing existed.
I’m not shocked Mundie is downplaying Siri’s relevance, I’m shocked I’d never heard of TellMe until now.
Smarterbits favorite, Patrick Rhone:
On a recent episode of the podcast, I discussed a plan for yet another crazy experiment: to reduce my Mac to the out-of-the-box install and limit myself to only five third-party apps and utilities (feel free to listen). The reason? Well, it is the same as any other crazy experiment I do…
Another month, another Minimal Mac experiment, another challenge for me to jump feet-first into. This one was particularly fun because of the brain-numbing second-guessing I had to endure while deciding which apps to keep and which to give the axe. Yes, that's my idea of fun. The key, as usual, was to stop pretending I'm a power user and concentrate instead on what I really use my MacBook for: long writing sessions, image editing and mixel-ing on the web.
(That’s not as bad of an analogy to blogging as it seems. Think about it.)
Unlike Rhone, I didn’t opt for a full reformatting, but here are the rules I did decide to follow:
- Any app that comes with a new, out of the box machine is fair game. Yay iLife! Nay iWork.
- I’m allowed to retain 5 third party apps. If the extension reads .DMG, it counts.
- System preference utilities are fair game. Perian, you're safe, as are you, App Trap!
- Everything else is trashed.
Making the cut.
Dropbox: If I could spread Dropbox on my toast, I would make Dropbox sandwiches. This synchronizing, automated backup utility is always the first thing I install on any device that can support it. Using it daily makes it a no-brainer.
1Password: 1Password is one of those things that are extremely hard to give up once you become invested in them. I'm hardly taxing its potential, but it would already be too much work (for me, anyway) to revert all my gazillion-character hex keys into ones I can actually remember. If I weren't so lazy I'd drop this one for something else, but as it turns out, I'm exactly that lazy.
Adobe Photoshop CS5: As much as I dislike Flash, and despite the availability of worthy alternatives, Photoshop is still the standard for image editing in my eyes. Unsurprisingly, it'll be my workbench for all of Smarterbits' graphics and whatever sparse photography editing I still do from time to time. It can also serve in a pinch for creating presentation slides. Eat your heart out, Keynote.
Google Chrome: Safari would be a fine choice for a web browser, except I don't install Flash on my machines, and Google Analytics, sadly, uses it lavishly. Besides, in my experience, Chrome is a much faster and more enjoyable browser, both in use and on the eyes. Why can't Safari move the search bar into the address bar already? Thanks to the power of web applications, Chrome will fill in for all those ancillary apps I would have used in the past for certain tasks: Twitter, image/file uploads, video chat, etc… Presumably, one could install Chrome apps from the Chrome Web Store and cheat the entire process.
Grandview: This spot could just as easily have been occupied by the usual Smarterbits go-to, iA Writer. Or anything else really, since TextEdit can already handle the editing of text perfectly (see what I did there?) and can be customized to mimic a full-screen writing application with whichever font pretension I'm subscribing to on any given week. However, as discussed previously, Grandview is a perfect tool for developing my writing chops and is surprisingly capable (it edits text!) when used outside its intended boundaries. I also enjoy being able to launch it from a keyboard shortcut at will, as if it weren't an app at all. I can jump into Grandview and be writing away nearly as fast as inspiration strikes.
Looking at the list, you may be asking yourself why I wasted so many slots on applications that already have equivalents on a bare-bones Mac. And, to be totally honest, the only thing I truly couldn't have done without is Photoshop, meaning I could have picked four other apps that used to fill out my dock. For instance, apps I previously would have considered essential that didn't make the cut:
- Twitter for Mac
- Pages, Numbers or Keynote
Except my goal wasn't to maximize the capabilities of my Mac with as few apps as possible, but rather to come up with the ideal experience for when, where, why and how I use my notebook. Most of my reading, tweeting and chatting is done on iOS, so while a lot of those aforementioned apps seem like no-brainers to most, they are actually excess for me. Especially when my iPhone is almost never out of reach. Effectively, my app diet is about turning my multi-purpose machine into a specialized precision tool. An extremely fine paintbrush for publishing this site, if you will. I even stripped my dock bare into a quick-access Finder for my most-used folders. I don't think I've ever had as sparse, yet goal-oriented a computer as this.
Earlier today, I tweeted my app selection and many responded with questions as to the pertinence of such an exercise. Patrick Rhone himself explained that people were focusing too much on the MacGuffin and not enough on what I think is the larger picture: that we should all be striving to reflect on our interactions with computers in order to grow and develop as users of technology. Rhone is an adherent of the idea of enough, and of finding the balance between wants and needs. From my perspective, intentionally adding constraints to an activity is an ideal way to improve our skill in that activity, whatever it may be. A basketball player may decide to dribble only with his left hand in order to improve his ball-handling skills. A photographer could decide to shoot exclusively on film rather than digital to train his eye and patience when framing a scene. Why not treat computer skills in the same way? Frequently, we consider adding to our skills, whether by learning a new coding language or taking a new application out for a spin. Yet refining the skills we already have is as important, if not more so. The difference between familiarity with an app and mastery of one is the degree to which we understand it. Imposing limitations that force us to think creatively with our current skill set is but one way to accomplish this. There are many others.
When Rhone presented the challenge on his podcast, I wasn't asking myself whether I could make it through a test of endurance; I was wondering whether I was making the best use of the apps I was currently using. I wanted to challenge myself to really learn the ins and outs of a piece of software so I could make the most of it. Those are the kinds of questions that guide us towards mastery. It may be that this particular experiment won't give you the answers you need. So be it. But so long as you are actively trying to find those answers, you're on the right track.