The Decline of Usability (2020)
Usability in UI design has declined due to non-standard elements influenced by mobile design, causing confusion in desktop applications. A return to coherent design principles is urged to improve user experience.
The article discusses the decline of usability in user interface (UI) design, particularly in desktop applications. It highlights a period from 1994 to 2012 when users could easily navigate various operating systems due to consistent design standards. However, in recent years, there has been a shift towards non-standard UI elements, especially influenced by mobile design paradigms. This has led to confusion and frustration among users, as traditional features like title bars, menu bars, and scroll bars have become less recognizable and functional. The article provides examples from popular applications by Microsoft and Google, illustrating how these changes have made it difficult for users to manage multiple windows and access essential features. The author argues that while technology evolves, it is crucial to maintain fundamental design principles that enhance usability rather than complicate it. The piece concludes with a call for a return to coherent and user-friendly design standards, emphasizing that change should not come at the expense of usability.
- Usability in UI design has declined significantly in recent years.
- Traditional design standards that aided user navigation are being abandoned.
- New mobile design paradigms are negatively impacting desktop application usability.
- Examples from popular applications illustrate the confusion caused by non-standard UI elements.
- A call for a return to coherent design principles is emphasized to enhance user experience.
Related
My Windows Computer Just Doesn't Feel Like Mine Anymore
The article discusses Windows 11's shift to a more commercial feel, with concerns about ads, updates, and lack of control. Users express frustration, preferring macOS or Linux for simplicity and customization.
A plea for the lost practice of information architecture
The article critiques the decline of information architecture in web design, likening it to the chaotic Winchester Mystery House, and advocates for renewed emphasis on structured planning to enhance usability.
Our Users Deserve a Bill of Rights
The author critiques the tech industry's neglect of end users, advocating for a "User Bill of Rights" to ensure accountability and balance between innovation and stability in software development.
Making Database Systems Usable
Database usability is essential for user satisfaction, often more important than technical performance. Despite advancements, challenges persist, including reliance on experts and complexities in querying, necessitating better interfaces and designs.
This behavior is by design
The article emphasizes that software design is driven by intentional human decisions, highlighting the impact of design choices, unintended glitches, and the ongoing human influence in AI development.
Modern apps are so obnoxious and distracting and unusable it's getting ridiculous. Tapping a verify link in an email would slide the email screen out, then slide in a browser, wait a second or two, then slide out the browser, then slide in the Play Store, then wait a moment, then slide in the app itself, which then had to update the screen after another second or two. Multiple screen changes flashing before my eyes made even me feel dizzy and confused. My mother raised her eyebrows and shook her head like she had just gotten lost.
Then once you get into the apps, there are so many distractions that make the UI unusable. "Hey did you know about this?" "Tap this to do this thing!" "Look over here!" Not to mention the various notification and permission prompts. The permission prompts are especially concerning because there are now so many of them just to start using, say, a messaging app, that you train yourself to click through them instead of scrutinizing them.
We've essentially gotten rid of text labels on icons, making it anyone's guess as to what the icon with the 3 colored circles does, or an icon with a shape, two lines, and another shape that I can't even describe succinctly. So many gestures make navigating the software a guessing game. My poor mother was like, "So to go to the last app, I do this," only to gesture "back" in the same app, then, "Oh, I meant this," only to bring up the app selector. Nothing is discoverable anymore. Those little "learn how to use your phone" tutorials do nothing for you in "the real world." My father, who used to teach Commodore computers to his community, could no more figure out how to use a phone than his students could a Commodore.
We were in much better times when computers came with 100 page manuals and you had to go to a class or ask a techie how to use them.
I'm not even going to talk about the privacy problems in all this software.
I had to keep apologizing to my parents for my industry making the experience so bad and downright confusing.
FlatUI/minimalism has wrecked web usability for a decade now, making buttons look like tags look like inputs look like boxes, everything is a box. Very few people have been able to stand up against this trend without being ignored or shouted down.
My theory is that this is due to the confluence of two things:
First, purity spiraling in a Western elite class, as happened with architecture (see "From Bauhaus to Our House" and the Yale Box; now we have the FlatUI Box).
Second, and perhaps more importantly, this trend is cheap: you don't need skilled laborers to produce a Yale Box and you don't need skilled designers to produce a passable FlatUI, whereas other options take time and skill (read: money). This dovetails with the corporate ethos of cost minimization.
These two trends in harmonic resonance with one another make minimalism/flat the default mode, which forces designers into increasingly bizarre UX patterns to express themselves.
This is not unlike the post-modern movement in architecture.
In the 80s and 90s, there was a concerted, industry-wide effort to make computers easier to use. There was a large research effort into how to make UIs with a gentle learning curve, and this relied on making UIs that were consistent and easy to understand at a glance. People my age (older millennial) and younger, even people in the industry, often don't know this. We may have played around in DOS as youngsters, but by and large we grew up with GUIs. We didn't really see what came before, we were too young to appreciate the usability leap forward, and unless we have a particular interest, we don't know computer history.
We never learned the lesson that GUIs are all about usability. We saw GUIs evolve over our lifetimes and collectively came to the conclusion that GUIs are about fashion. Mac OS X was a better OS because look how slick it looks. Ooo, Vista has basically the same UI, but everything is glossy. Everything needs to have a hamburger menu now because menu bars are so last decade, and we don't want applications to look old.
It's lucky that macOS chose to fix its per-app menu bar at the top of the screen. As a result most apps still have a (mostly) usable one. Windows users are not so lucky.
The author did a good job articulating something I have noticed as well.
My personal pet peeve, which is very much related to this, is readability. Typography and typesetting are not obscure crafts that are poorly understood. They have decades of research behind them and generally are not even that difficult to implement if you follow some ground rules: use sans-serif fonts* for larger blocks of text, keep line length to around 60 characters, balance line height against font size, and make sure the text contrasts well with the background.
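To make that concrete, here is a minimal sketch of those ground rules expressed as style defaults (TypeScript, browser context assumed). The specific values (a 60-character measure, a 1.5 line height, dark text on a light background) are common illustrative defaults I picked for the example, not anything prescribed by the article.

```typescript
// A sketch of the readability ground rules above, applied as defaults
// to body text. Values are illustrative, not prescriptive.
const readableText: Partial<CSSStyleDeclaration> = {
  fontFamily: "system-ui, 'Segoe UI', Helvetica, Arial, sans-serif", // sans-serif for body copy
  maxWidth: "60ch",    // roughly 60 characters per line
  lineHeight: "1.5",   // line height balanced against the font size
  color: "#1a1a1a",           // near-black text...
  backgroundColor: "#fafafa", // ...on an off-white background for contrast
};

// Apply the defaults to the page body.
Object.assign(document.body.style, readableText);
```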
Yet many people seem hell-bent on reinventing the wheel there as well, resulting in a plethora of blogs, knowledge documents, etc. with valuable information presented in a less than ideal format.
Which is why I am happy to see that the person writing about the decline of usability has done all of these things right, which made this a very pleasant reading experience :)
As a last note, I don't think readability was always better in the past. Certainly not when you take personal websites into consideration (I still remember the starry sky background Geocities pages). But for a while it did seem that most people were in agreement over readability and most blogs, news articles, etc did follow the basic principles I outlined.
I, too, might be old and angry.
* No, monospaced fonts are not better for readability. They might look cooler on your tech blog, and people who spend many hours in IDEs might not have an issue with them, but when you are targeting any audience slightly larger than developers you should reconsider. There is some nuance here: for people with dyslexia, monospaced fonts might actually work better. A serif font on a modern high-density display also does not hurt readability; however, even today, not everyone will read your text on a high-density display. So overall, sans-serif is still the best choice to offer readability to the widest range of people.
Edit: For anyone who wants to know a bit more than just my unsourced statements. For a modern deep dive into fonts and readability, I recently came across this medium post which does a pretty good job tackling the font type: https://medium.com/@pvermaer/down-the-font-legibility-rabbit...
To add a bookmark folder directly, like that's your only intended action, here is what you must do:
click a tiny hamburger menu at far top right of browser. Hover over "bookmarks and lists" which expands another menu. Then click on "bookmark manager." That takes you to a screen with all your bookmarks. Excellent! Now I'm looking for a tiny + sign or some kind of obvious button somewhere where I can edit my folders. Nope. Screw you. It's in another tiny hamburger menu that's hard to see, click on that, now you can finally select "add new folder."
Is all that really necessary? Like, why? I'm well aware that you can add a folder from the "star" icon, but I don't always want to do that. This is a very obvious and common action that should take obvious and common steps to perform. This kind of thing is everywhere. Whoever the modern UI/UX people and product managers driving these decisions are, they have completely lost the plot.
Even something that should be stupid easy, like buying tickets to an effing baseball game, is impossibly obnoxious. Go to StubHub to buy a ticket. Cool, I can do it as a guest. Where are my tickets now? Oh, I have to use the StubHub app for (reasons) to do that. Have to use some kind of Google or Facebook account to access the app - fine. Whatever. You have me this far already and I already spent the money. View my tickets - oh, OK, I need to go back to my email (the one I used when I bought the tickets on StubHub as a guest, not necessarily the one I logged into the StubHub app with) to find the code that allows me to view my ticket, which is typically a QR code scanned at the gate. Great! Almost done. Nope. This QR code cannot be screenshotted because it has a rotating one-time token associated with it, so you gotta do this whole song and dance in line at the gate with an impatient crowd behind you.
How is a computer illiterate person supposed to navigate all of this? Technology is supposed to make things easier.
Used to be, the operating system controlled the title bars. You could see which window was active, and the behavior for moving, resizing, etc was consistent.
Add in classic menus vs. ribbons vs. hamburger menus vs. whatever wet dream some UX expert has imagined.
The inconsistencies in UIs are infuriating.
However, I think usability as a whole is about to get very weird.
I've just done a workshop with a bunch of people. I recorded it, and want to write up some notes about what was discussed, perhaps make detailed notes about each person's perspective and contribution in order to understand how to best help them in the project we're discussing.
Expired: listen to a recording, write notes, maybe even transcribe some quotes, then look at my notes and make some summary notes.
Tired: Use some transcription software to automatically turn an audio recording into a diarised text output. Copy paste that into a document and then edit it down into summarised notes.
Wired: "Hey LLM, provide a diarised transcript of this audio file, labelling speakers based on the names they give in the introductions. Then provide me with a summary of key points discussed, and for each speaker provide a summary of each speaker's perspective and contributions. Provide me with a list of todos to address each action item, and also to plan a mitigation for each risk or concern raised throughout the meeting."
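In practice the "wired" version is mostly glue code. Here is a rough sketch using the OpenAI Node SDK; the file path and model names are placeholders, and plain whisper-1 transcription does not diarise speakers, so the per-speaker breakdown would need a separate diarisation step or a provider that supports it.

```typescript
import fs from "node:fs";
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Step 1: transcribe the workshop recording. (No speaker labels here;
// diarisation needs an extra tool or a provider that supports it.)
const transcription = await client.audio.transcriptions.create({
  model: "whisper-1",
  file: fs.createReadStream("workshop.mp3"), // placeholder path
});

// Step 2: ask a chat model for the summary, action items and risks.
const summary = await client.chat.completions.create({
  model: "gpt-4o", // placeholder model name
  messages: [
    {
      role: "user",
      content:
        "Summarise the key points discussed in this workshop transcript, " +
        "list a todo for each action item, and note a mitigation for each " +
        "risk or concern raised:\n\n" + transcription.text,
    },
  ],
});

console.log(summary.choices[0].message.content);
```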
In the new world, I don't care about applications. I care about jobs to be done. Yes, there's a lot of absolute junk Kool-Aid all over X/Twitter and we're definitely in a hype bubble, but the point remains that the interface with the computer is about to change dramatically, and worrying about what items appear in a menu and how to move files between applications is going to look pretty quaint in a few years' time.
I know people have been talking about the trend towards mobile for some time and how that's affected density. The other day I was browsing Blizzard's battle.net website to purchase the old school Diablo 1. I was expecting it to be the easiest thing in the world, but it actually took me a few minutes to figure out how to navigate their website. I'm not sure why...
The other example I was thinking of was Amazon. The home page is packed with product recommendations. I'm currently getting pet wellness product recommendations, even though I don't own any pets? If you accidentally hover your mouse over a menu bar, you get these giant drop-downs which cause the rest of the screen to darken. Some drop-downs are text (such as the one for your account in the top right). Others are just drop-downs covered with large icons. I think the drop-down for your account is egregious. 99% of the time I go to Amazon to buy a single product. Music Library? Start a Selling Account? Kindle Unlimited? Wtf? None of this is relevant to me.
In the late 90s/early 2000s, we did user testing to figure out how to make software easy to use. We learned that discoverability was king.
In the last 10 years, we've been throwing it all away in favor of UIs that look "sleek" and "clean".
Functionality is hidden and we got rid of discoverability. Everything is flat, so it's not obvious what you can interact with. What's simply a status icon in one program could be a button that leads you to more information in another.
I really REALLY don't understand the fascination with flatness. Flat is ugly, boring, plain, and hides interactability.
We reached a period where we have more graphics processing capability than ever before, and we made everything simple flat shapes.
Windows 7 with the Classic theme was peak UI and you'll never change my mind.