Lessons in UX: The Cambrian Explosion of UI Design (and where we are now)

“How do you use this?”

Whenever a new piece of technology is released, this is the core (usually unspoken) question that its consumers have. How does it work? What does it do? Better-designed devices and applications build in affordances and familiar controls that let users pick up on how to engage with and use them; the tech that doesn’t build in these helpful elements produces frustrated users and reduced adoption.

We have come a long way from the technology that I started using in the early 1980s: early computers such as the Commodore 64 and the Apple II had character-based UIs, and taking full advantage of the hardware meant becoming very familiar with the spiral-bound manual that came with these units. I love watching videos of the “8-Bit Guy” on YouTube as he takes us through some of this “ancient” tech.

Today, including a manual with a piece of hardware or an application is rare, not just because companies want to save money on packaging, but because computing devices and programs either have help built in or are designed with the aforementioned affordances in mind (or both).

Mobile applications were especially lacking in help or support documentation. With so many simple, “single-purpose” applications on early smartphones, their limited functionality meant there was little need for help options.

Except… for a long while, every app LOOKED DIFFERENT.

In the early days of mobile applications (kickstarted by the popularity of the iPhone), developers took full advantage of the platform to create games and apps that were wildly different from each other. Even though Apple provided design guidelines and UI element libraries, many apps had their own custom interaction models and controls… which meant users had to “relearn” how to use each app. When Google’s Android devices arrived a couple of years after the iPhone, a similar situation occurred, compounded by the fact that the Android OS had different “core” interaction patterns from iOS devices.

I consider this “early period” of mobile apps the “Cambrian Explosion of UI Design.”

Here’s the Wikipedia description of the Cambrian Explosion:

The Cambrian explosion, or less commonly Cambrian radiation, was the relatively short evolutionary event, beginning around 542 million years ago in the Cambrian period, during which most major animal phyla appeared, as indicated by the fossil record.

The Cambrian Explosion of UI design (at both the application and the mobile OS level) resulted in a lot of fantastic applications, elements, and interaction models, and also a lot of really unusable, confusing applications. What followed was a consolidation and “smoothing out” of the mobile experience: Apple and Google refined and started enforcing their standards, weeding out badly designed applications. App designers started following the conventions more, and eventually iOS and Android began looking more and more “alike” in their core interaction patterns. Usability and learnability have since improved across devices and applications, though the occasional “divergent” application still appears.

A similar situation has occurred on the web and with desktop operating systems, with the operating systems from Microsoft and Google (including Android) looking more alike than different in most respects.

That’s where we stand today. Everything is different, and yet everything looks pretty much the same. Is this good or bad? It depends on how you look at it. Having organically evolved standards that UI designers can follow allows for quicker adoption and understanding, but such standards can also constrict creativity and innovation. In my opinion, the overall result is more good than bad, as designers should focus more on user needs than on how calendar controls should render.

However, I’m getting pretty tired of seeing the hamburger menu.
