I hope our Twitter critique activity on Wednesday helped you think about how you can analyze your assigned application for Unit #2. Next week, we’ll take a brief break from our readings in Net Smart to focus on writing about technology. Even though the readings for Week 6 are all online, please read them deeply (and remember that a couple of them are quite long and will take some time to complete).
Here’s how we’ll spend our time in class:
- On Monday, we will wrap up our analysis of Twitter interfaces, so please bring your completed critique sheets to class and make any final changes to our shared Google Doc before you come to class. Your homework for the weekend is two-fold: First, you should begin analyzing your assigned application for Unit #2, taking copious notes and screenshots as you do so. Second, please read the following articles and leave a comment on this post (no later than Sunday night) that engages with at least one of the articles (and hopefully with your classmates, too):
- “Inside Photoshop,” by Lev Manovich
- “The Condescending UI,” by Paul Miller
- “The Very Model of a Modern Mountain Lion Document,” by Matt Neuburg
- On Wednesday, we will consider the history and evolution of word-processing applications, then experiment with Pen.io [Update: Due to some problems with Pen.io, we'll be using Google Sites instead.], which you will be using to submit your Unit #2 projects. Before you come to class, please read the following articles:
- “Goodbye, Cruel Word,” by Steven Poole
- “Word 2011: Working with the new Find/Replace feature,” by Pierre Igot
- “Pitfalls of WYSIWYG: Self Publishing Hell,” by Luke Maciak
If you have any questions about these plans, or if you want to discuss your Unit #2 project, please feel free to stop by my office (427 Shanks Hall) during office hours (Monday 8–11, Tuesday 1–4). Otherwise, happy interface interrogating!
“My problem with many modern UIs is that they never get past the telling phase. They’re always dressing up their various functions with glows and bevels and curves, and in the process they somehow become overbearing to my senses. “Did you know you can click this? Don’t forget there’s a save button over here! Let me walk you to your control panel.” Imagine a car that verbally explains all of its various knobs and levers the first time you get into the car. Wonderful, right? Now imagine that car explaining all of these various functions every single time you get in the car for the next five years, until you finally snap and drive it off a cliff.” (Paul Miller, The Condescending UI)
Now that he points it out, many of those built-in “helpers” do end up slowing the user down quite a bit. I’ve always considered them a necessary evil, but really, how hard would it be for the operating system or program to offer an initial tutorial phase, or to let you choose how much assistance you require? For instance, many video games let you choose a difficulty level: the lowest level points out all sorts of things (glowing here, popups there, arrows in this direction), and as you feel more comfortable you can increase the difficulty or turn off the tutorial-like features. This lets you focus solely on the game and your interface without anything slowing you down or getting in the way.
“While I’m in a complaining mood, I’d also like to mention my reservations about that second checkbox…”
IMO, this is a standalone statement in Matt’s article that conveys the overeager tone I observed in Paul Miller’s piece, “The Condescending UI.” The premise of Paul’s argument, that the modern interface “never [gets] past the telling phase,” is merely born of frustration with the constant prodding of “you can click this” and “control panel this way,” features that were integrated simply to make navigation easier. In my experience (with Windows), hints are so subtle that it’s hard to remember the last time something said “click this” without also trying to offer ever-lasting love with a girl from China (yes, let’s “Bring the World Together”), or to convince me to “Get In The Fight” and “Play Free! … To level 20.” When a pop-up or message appears on my screen, or a program changes and blinks orange on my task bar, I’m usually redirected with a nice, clear message, such as when User Account Control in Windows 7 blackens the background to draw attention to software trying to gain access to my computer. This feature was integrated so that you can block something potentially harmful or unfamiliar, and to help keep you from deciding incorrectly while multitasking. Paul, however, sees it as a threat to his intelligence:
“[T]here’s something deeper that bugs me, about the decorations themselves. Like the ubiquitous drop shadow. ‘Did you know that this window is on top of this window?’ it whispers to me, endlessly . . . Love of reflections and faux 3D subtly imply to me that I might be lost, needing landmarks and a sense of place to find my way.”
Of course, this is like ‘prose v. poetry’ and is a matter of preference above all else. His problem is aesthetic overload, a recent trend in media, since he has an eye for simplicity; he prefers ‘prose-esque’ right-angled windows to ‘poetic-esque’ round-edged ones.
“Soft edges, endless gradients, and rounded corners seem designed to keep me from hurting myself on an acute angle, as if the desktop is a choke-proof toy for babies, instead of a sharpened pencil.”
He exaggerates when he equates ‘helpful cues’ to having a car give you the run-down of the dashboard each time you drive it. I find that hard to believe, since the computer doesn’t actually have a voice-over, and since helpful features appear only as you need them rather than all at once each time the system boots. So, if Paul wants to live in the retro world of right-angled windows, then I say to him: “be like the millions, no, billions of QQers who came before you, and follow those who fantasized about ‘the good ol’ days’ and ‘what I’m used to and don’t have to learn again.’” God forbid we have to learn or cope with every new change that occurs; it isn’t called life or anything.
I also wanted to focus on that passage from Miller, but to point out the flip side of his argument. For someone who engages with technology constantly, uses it every day (whether a computer program, a video game, whatever) and climbs the learning curve quickly, like Miller, and like many of us, yes, the UIs can seem annoyingly slow and teachy. However, someone like my mom, who has taken several years to get to the point of being okay at things on the computer most of us in this class consider basic, like using Microsoft Office products, email, Facebook, etc., relies on the UI features that do help, teach, and remind her what the different functions are. This is mainly because she doesn’t use the computer or the specific programs every day, and she doesn’t use them for as many different things, so it’s harder for her to remember how they work.
While I agree with Miller that a user should be able to customize his or her own UI if they’re a very frequent, heavy user of a given piece of software and have a particular need or want, I don’t think the answer should be to make only “retro” UIs available, which is what he seems to want. I think instead that UIs should come as-is, with helping features for people like my mom as well as highly customizable options for users like Miller. For example, with most WYSIWYG blogging platforms like WordPress, you start out with a template that is pretty easy for virtually everyone to use, no matter their level of proficiency. But then, if you know more about coding, you can go ahead and hyper-customize your website using HTML, CSS, etc. I would like to see that kind of basic, but customizable, platform in many other UIs, and I think Miller would be okay with that, too.
“but I still can’t help but feel like they’re taking the operating systems I knew, in all their “ugly” 1990s glory, and dressing them up in Little Lord Fauntleroy suits” (Paul Miller, The Condescending UI)
I personally like that basic usage stays the same. With technology constantly changing and evolving, it is nice not to have to learn a completely new interface every couple of years. I can go to any computer and know how to use it effectively.
When designers are designing an interface, they have to think of the broad customer base, not a single individual. It’s simply not possible for every individual to have their own interface; just think how much of a headache that would cause for tech support or when transferring files from one computer to the next. Not to mention the cost for everyone to have their own would be outrageous. There has to be some kind of constant once you break things down to the basics.
(In response to The Condescending UI by Paul Miller)
I completely agree here. The fact that basic usability stays the same means that most people that are reasonably well-versed in the operating system can use it anywhere.
Even Mac and PC are basically the same. PCs have a Start menu and are a bit more “condescending” in that more how-to and user-directing is used, but a user of one of those OSes can probably use the other with very little difficulty.
In the same way that you could never be called a “car guy” if you never changed out a stock part on a car, you really can’t be called a computer nerd if you never change anything about the OS on your computer. You can use a computer perfectly effectively without ever getting rid of the simple “how to” information, but if you want to streamline your experience or your workflow, there are definitely ways to hack that.
I have to agree that I like seeing some consistency between updated user interfaces. Technology is changing all of the time and it’s nice to have familiarity. For example, I love being a Mac user. During a summer internship, I was shocked to find how difficult it was to switch from a Mac to a PC when it was required. Besides basic user interface changes, the keyboard is a different size and it’s frustrating to adjust to. I consider myself pretty good with computers, but I hate feeling computer illiterate searching through the tabs and “googling” how to find or do something.
And while I agree with most people that some of the features of UIs can get annoying, I think Liana makes an excellent point in bringing up our parents. Technology has grown up with our generation, so we’ve been able to adapt more rapidly than our parents. My dad still doesn’t quite know how to use his iPhone properly (mostly because he’s stubborn), but my parents and others who are not as comfortable with computers and technology really benefit from these tools. I’m definitely intrigued by Liana’s idea to have a customizable platform, though.
“To write using Microsoft Word is to use new media. To take pictures with a digital camera is to use new media. To apply the Photoshop Clouds filter (Filters > Render > Clouds) that uses a purely automatic algorithmic process to create a cloud-like texture is to use new media. To draw brushstrokes using the Photoshop brush tool is to use new media.” (Lev Manovich, “Inside Photoshop”)
Is “new media” really an appropriate name for authoring and editing software? Can we really say that using a word processor on the earliest Macintosh computer from 1984 or taking a picture with the earliest digital camera from 1975 or using Photoshop 1.0 from 1990 is “new media?” The concepts of Microsoft Word (a word-processing computer program), digital cameras and Photoshop are not really “new.” They were introduced decades ago and have simply evolved to become more sophisticated today. I think that “new media” should specifically refer to the latest versions of these programs.
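Setting the naming question aside, the Clouds filter Manovich points to is a nice concrete case of a “purely automatic algorithmic process”: the texture comes from nothing but a random seed and a bit of math, with no hand-drawing at all. As a rough, hypothetical sketch (not Adobe’s actual algorithm, just an illustration of the idea), summing random noise at several scales already produces a cloud-like pattern:

```python
import numpy as np

def cloud_texture(size=256, octaves=6, seed=0):
    """Sum layers ("octaves") of random noise at increasingly fine scales.
    A crude stand-in for a Clouds-style filter: coarse layers give the texture
    its large billows, fine layers add grain. Not Adobe's implementation,
    just an illustration of an automatic algorithmic process."""
    rng = np.random.default_rng(seed)
    result = np.zeros((size, size))
    for octave in range(octaves):
        cells = 2 ** (octave + 1)            # coarse grid gets finer each octave
        coarse = rng.random((cells, cells))  # random values on the coarse grid
        # blow the coarse grid up to full resolution (blocky nearest-neighbour upsample)
        layer = np.kron(coarse, np.ones((size // cells, size // cells)))
        result += layer / (2 ** octave)      # finer detail contributes less
    return result / result.max()             # normalise to the 0..1 range

texture = cloud_texture()
print(texture.shape, round(float(texture.min()), 3), round(float(texture.max()), 3))
```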
“In my personal quest to escape the condescension, I recently switched my Windows 7 install over to the “Classic Theme,” which is basically Windows 95 incarnate, just with all the under-the-hood improvements I’ve come to rely on.”
I feel like this negates most of Miller’s argument. Okay, so you find standard OS design condescending, but you have the ability to customize your experience. Designers have to cater to the base audience knowledge level. Of course that’s going to be frustrating to even a moderately advanced user, but it would be poor design to create an elitist OS that would only appeal to those with advanced knowledge. It’s far more practical and economical to create a basic system that all users can interact with easily, and include upgrades and tweaks for those users who want a different experience.
My dad used to have a Windows computer, but it turned out to be a little too much for him to handle because some of the systems would not work. Eventually, he switched over to Macintosh, and in particular to his Macintosh laptop (rather than the computer in our house), because he was more experienced with Macintosh.
If my dad’s Windows computer had a system that wasn’t so hard to use and didn’t constantly crash, he would probably have a more positive outlook on Windows computers. I’m not saying we don’t like Macintosh, but if Windows had been easier on my dad’s computer, the rest of us would probably have Windows computers, too.
“I’m particularly impressed with Apple’s willingness to provide users with options, something that in the past I’ve frequently called upon them to do.”
-The Very Model of a Modern Mountain Lion Document, by Matt Neuburg
I chose this quote because I think it really summarizes what user interfaces should be about: adapting to benefit the user. People are different, and as such their needs and methods of serving those needs differ. I believe what makes a UI successful is options; users can customize their settings to fit them personally, and the same task can be performed by various methods.
I agree with the article that Mountain Lion is in a way a “backtrack,” though I think that term sounds more negative than it deserves. The original upgrade to Lion was unsuccessful because it tried to make decisions for the users, thinking it was being considerate by saving you the trouble of remembering and sorting your own files. People want control! I agree with this article almost entirely and think that Mountain Lion is taking the right steps towards serving the users.
Also, the title is actually a reference to the Gilbert & Sullivan song “The Very Model of a Modern Major-General.” It’s a very useful tongue twister as well. Just some random theater knowledge for you.
Paul Miller has an interesting point in his article, coming from the perspective of someone who hates being instructed twice. Unless the process is ridiculously complicated, I strive to learn it in one lesson.
That said, I think personal computers are at a stage of their life where the merit of a particular UI is measured more by how it looks than by what it can do. At least on the average consumer end, we look for functionality in terms of ease of use rather than raw computing power. How much computer does it really take to update Twitter? Not much. Would I like an interface where everything is really easy to find? Absolutely. Application developers spend countless hours making their programs as accessible as possible for the common majority.
Miller has a point for those who want the absolute most utility out of their computers. Going to a standard no-frills layout for Windows 7 likely frees up system memory for more important tasks (at least more important for him). For the average computer user (read: practically everyone you know) and the tasks they perform, polished and intuitive interfaces seem preferable.
Want to tweet something? Here’s the button. The big, blue, isolated button. Because if you are on twitter, it’s likely that you want to tweet. Is that condescending by Miller’s definition? Probably. But it appeals to the average Twitter user.
In “The Condescending UI,” there is really no basis for an argument. Miller claims to be personally attacked and offended by the overbearing design of interfaces, but he isn’t really offering an alternative, just an exclusive preference.
He seems to be arguing about aesthetics, but disguises it as an argument over design philosophy. Maybe I just find the sarcasm of the article unbearable, but I don’t think he backs up any of his claims.
Occasionally when I’m using different sites or operating systems, I feel like the creator has dumbed everything down for the user. I recently upgraded my PC to Windows 7 (which is a MAJOR improvement over Windows Vista, which I had been using). Windows Vista attempted to make things easy for the user, but instead just ended up alienating people. I think that at one time this type of in-your-face direction was needed. People had no idea how to use computers, and they needed that guidance. Now many of these user interfaces are talking down to users. Paul Miller even goes so far as to say that the design of some interfaces is condescending.
I wouldn’t go that far in my argument, though. But sometimes I do feel overwhelmed with buttons and windows and directions. In Windows Vista, attempting to do anything related to the settings of the computer was nearly impossible. I was told that this was to prevent people who didn’t know what they were doing from deleting something important. I just wanted to change the screen saver. Things were made more difficult in an attempt to make them ‘easier’.
“Microsoft’s oddly-sized minimize / maximize / close buttons in Windows 7 are only there to help, but they also hint at some lack of eye-hand coordination on my part. Soft edges, endless gradients, and rounded corners seem designed to keep me from hurting myself on an acute angle, as if the desktop is a choke-proof toy for babies, instead of a sharpened pencil.” -Paul Miller, The Condescending UI
This quote shows that a lot of what Miller bases his argument on is nitpicking. Are the size, shape and rounded corners really that important to complain about?
“A ‘wizard’ is supposed to ask a pertinent question that eventually leads me to a specific control panel, but once I know the actual control panel I want, the friendly “wizard” is more like a guard at the gates. The Ribbon in Microsoft Office products (which is making its way to the file manager in Windows 8) is constantly talking down to me, assuming I don’t know how to use a menu, a key command, or an honest-to-goodness toolbar.”—The Condescending UI
Paul Miller argues that some UIs still use “telling phases,” such as a wizard or, what I personally remember from my first year of learning how to use a computer, the talking Microsoft paperclip. But I appreciate having that “telling phase” to this very day. While it may slow consistent users down or irritate frequent online consumers, I believe the “telling phase” should always remain as an automatic system. To me, it signals that the consumer is being cared for consistently, with a step-by-step procedure serving as a reminder, or possibly as an update, that the tools and navigation system can in fact change. While the appearance may look similar, upgrades are always being made, even for the most technologically savvy individual to take in.
“Like these filters, many of the ‘new’ techniques for media creation, editing, and analysis implemented in software applications were not developed specifically to work with media data. Rather, they were created for signal and information processing in general – and then were either directly carried over to, or adapted to work with media” (Manovich, Inside Photoshop).
This phrase just makes me wonder about all the applications we have and are using; what were they first created for?
But if the applications were meant for information processing in general, what would that information be processed into, and what would it be used for? I think it’s just common sense that it led to what we have today.
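One way to make Manovich’s point concrete: a sliding-window weighted sum (a convolution, when the kernel is symmetric) is a completely generic signal-processing operation, and it only becomes a “blur filter” once the signal it runs over happens to be a grid of pixels. The sketch below is my own illustration under that assumption, not anything drawn from the readings:

```python
import numpy as np

def filter2d(image, kernel):
    """Slide a kernel over a 2-D signal and take weighted sums at each position.
    The same math filters any sampled signal; applied to pixel data with an
    averaging kernel it acts as a blur filter."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)), mode="edge")  # repeat edge samples
    out = np.zeros(image.shape)
    for y in range(image.shape[0]):
        for x in range(image.shape[1]):
            out[y, x] = np.sum(padded[y:y + kh, x:x + kw] * kernel)
    return out

image = np.random.default_rng(1).random((8, 8))  # stand-in for a tiny grayscale photo
blur = np.ones((3, 3)) / 9.0                     # simple 3x3 averaging kernel
print(filter2d(image, blur).round(2))
```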
“But it’s not just functionality, there’s something deeper that bugs me, about the decorations themselves. Like the ubiquitous drop shadow. “Did you know that this window is on top of this window?” it whispers to me, endlessly. Apple’s love of reflections and faux 3D subtly imply to me that I might be lost, needing landmarks and a sense of place to find my way.” (Miller, The Condescending UI)
I completely agree with this criticism. Apple has an inherent obsession with presenting a 3D illusion to its users through the display of multiple interfaces, which can oftentimes make my head spin. This may also be due to my familiarity with Windows, but I appreciate how Windows in essence dumbs things down for the user when it comes to switching between programs. I have never had any issues opening and closing applications in Windows, but there have been countless instances with Macs in which I wasn’t sure if I had opened, closed, or minimized a program.
For users who are familiar with the software and shortcuts, Apple can make life a whole lot easier. However, I wish they weren’t so focused on overt 1:1 metaphors. I do hope that Apple eventually gets out of what the author calls the “telling” phase.
The Condescending UI -
“My problem with many modern UIs is that they never get past the telling phase. They’re always dressing up their various functions with glows and bevels and curves, and in the process they somehow become overbearing to my senses. “Did you know you can click this? Don’t forget there’s a save button over here! Let me walk you to your control panel.”
“Who hasn’t quit an application, supposedly without having altered a document, only to see the “Save changes?” dialog appear unexpectedly? It turns out that you did alter the document, but by mistake; you thought you were copying, perhaps, but you cut instead, or the cat prodded the keyboard while an important paragraph was selected. That dialog warned you, and rescued you, letting you cancel those unintended changes. Under Lion, there is no dialog and no warning; your accidental changes are saved without your knowledge and possibly contrary to your desires.”
(The Very Model of a Modern Mountain Lion Document)
I think this example blends the concerns of “The condescending UI” with a real problem of progressing technology. Everyone here has probably experienced that feeling of, “Wait, did I really just change something in that document?” when prompted by that “Would you like to save these changes?” screen. In Lion, that screen will not pop up due to the auto-save function.
As it seems the majority of my classmates agree, the user interface is most user-friendly when it’s easy to personalize. Does everyone have the same ring tone or vibration settings on their phone? No. Why? Because it’s simply their preference whether they want their phone to vibrate once, twice, or three times when they receive a text. If everyone were forced to have the same settings, whether on an iPhone, a computer, or a specific computer program, we wouldn’t be satisfied. My conclusion is that the value of a program comes from its ability to be manipulated.
“A “wizard” is supposed to ask a pertinent question that eventually leads me to a specific control panel, but once I know the actual control panel I want, the friendly “wizard” is more like a guard at the gates.” (The Condescending UI)
This article made me reconsider some of the things that I thought I had pretty much figured out regarding the user interface on my own computer, as well as on other computers that I have used. For instance, Adobe always takes me through the setup of the program, a neat little wizard, every time I want to use Photoshop or InDesign. At first this is good, because most of the programs are complex and a little reminder never hurt anyone. But when I got Microsoft Office on my Mac, it too felt the need to constantly remind me that the “save” button was at the top of the page, and that you could click on the large “A” if you wanted to display some form of “Word Art,” things that I learned in 5th grade and have yet to stumble over. These wizards, while supposedly helpful, may be a bit of overkill on certain programs; is there really someone out there who doesn’t know how to use Notepad? While I do like some last-minute pointers on certain things, there are other programs where I find the wizard a complete annoyance, and I generally shut it off after the first time I use any program.
I haven’t finished reading the Poole essay just yet, but I got to this quote and it literally made me lol.
“When pretty much everything you write has a word-limit attached, and you realise after long and tragic experience that exceeding that limit will not cause the editor to expand the space available to you in tribute to your genius but will instead cause the sub-editors unerringly to home in precisely on the bits that must not be cut if the article is still to make any sense and cut them, then you need to know at every stage how much you have written, and how much you have left to go.”
So true, gave me a smile and a laugh. Back to reading…