A Better Way to Fail Fast

Certain notions are so insightful that they spread quickly. As this happens, many such ideas are sapped of their meaning and turned into hollow platitudes. “Fail fast” is such a phrase, and it’s commonly repeated among those who work in, or with, startups.

The logic behind this phrase is sound. It acknowledges that making the right choices on undefined projects is difficult; therefore, you shouldn’t dwell on setbacks. Instead, you should learn what you can from your missteps and move on—quickly.

This idea is powerful because it encourages innovators to look forward. Or, as Thomas Edison said, “I didn’t fail. I just found 2,000 ways not to make a lightbulb; I only needed to find one way to make it work.”

Without such a mindset, one can get hung up on one’s stumbles. Worse yet, the resulting fear can keep the innovator from trying new approaches and entertaining somewhat divergent possibilities.

However, this phrase’s meaning has become perverted, affording many an excuse to give up and switch gears entirely. They use the words “fail fast” to avoid the hard tactical questions that need to be addressed in order to advance. Instead of biting down and dealing with the obstacles at hand, they quit and move on to something completely different—because they’re “failing fast.”

Let’s bring this discussion back to Edison. What if he had given up on the lightbulb and moved on to another challenge every time he hit an obstacle? Sure, he could claim that he’d “failed fast,” but he also wouldn’t have achieved his breakthrough.

I’m all for failing fast. For this device to work properly, though, I choose to fail fast on smaller points, while keeping my long-term goal consistent. As such, I propose a simple rework of the phrase. Let’s stop saying, “fail fast,” and instead try: “fail on the small stuff fast.”

Discuss this post in #startupchat on Chapp.

May You Design In Interesting Times

Lately, I’ve repeated the same phrase to a number of smashLAB clients: designing for the web has changed more in the last 3 years than in the 10 before it. I’m never sure whether they believe me or not. I fear that these statements sound a little like a sales pitch. That said, I’m convinced others in this industry have experienced the exact same upheaval.

The Web is Always Changing

The web has always been a tricky medium. Those of us who’ve been at it for a while remember building for 640 × 480 pixel displays, and setting type in graphic formats to gain some typographic control. We hacked tables and wonky framesets to push immature mark-up to do as we needed. We tested liquid layouts, knowing something interesting was there, even if not quite yet; and we integrated primitive CMSs and learned to deal with their idiosyncrasies. Some of you might even look back with amusement at tedious tasks like manually dithering the meagre 216 web-safe colors, to expand the reach of that limited palette.

And we got pretty good at it. In fact, as recently as four years ago, @shelkie and I bragged about how you could overlay wireframes and comps from a smashLAB project with the finished website, and typically see pixel-perfect accuracy. Some didn’t care about that level of detail, but we did—and we were even snobby about that capability and commitment to craft. That said, pixel perfection (for the most part) doesn’t matter any longer. Looked at coldly, one might even say it was a carry-over, or relic, from the print era.

I’m not implying that the web of 2010 was all about pixel-perfect execution; instead, I reference it as an example of our focus. We also implicitly understood web conventions, and had established solid processes for building for the web. I also think we were often ahead of the game, as we tested new technologies and (sometimes successfully) pushed new approaches. For example, we were playing with asynchronous scripting in IE6, and screwing around with in-browser content editing, before most others were. And we were noodling with mobile as early as 2002, but felt that technologies like WAP were just too flimsy to become a standard.
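For those who missed that era, here’s roughly what the pattern looked like (a sketch from memory, not our production code). Before XMLHttpRequest was standardized across browsers, IE5/6 exposed asynchronous requests through ActiveX:

```javascript
// A from-memory sketch of the IE-era asynchronous pattern.
// The URL and element ID are placeholders for illustration.
function createRequest() {
  if (window.XMLHttpRequest) {
    return new XMLHttpRequest();                  // Mozilla, Safari, IE7+
  }
  return new ActiveXObject('Microsoft.XMLHTTP');  // IE5/IE6
}

var req = createRequest();
req.onreadystatechange = function () {
  // readyState 4 means the response has fully arrived.
  if (req.readyState === 4 && req.status === 200) {
    document.getElementById('content').innerHTML = req.responseText;
  }
};
req.open('GET', '/fragment.html', true);  // true = asynchronous
req.send(null);
```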

Prognostication is a troublesome practice, almost always proven flawed as time passes. A notion that works in principle is no match for what people actually do. As such, Twitter has, to my surprise, proven a remarkable platform, whereas other good technologies like the MS Surface don’t seem like they’ll ever take off. So, we might have debated whether native apps or the mobile web would win the war (and other such matters), but those discussions meant little until we had lived with these technologies for a while.

After a few years in a multi-screen environment, most would likely agree that the notion of adaptable design is essential. Yes, there’s Responsive Web Design, which is suitable in a vast number of instances; however, there’s also the need for the conventions we apply in a browser context to adapt to native applications—in order to establish UX/brand consistency for visitors. (For example, users expect Facebook to act and operate in principally the same way within the browser and in its native iOS/Android applications.) As you’ve likely noticed, I haven’t even touched upon new back-end technologies, like NoSQL databases, node.js, and Meteor, as those aren’t my area of expertise; nevertheless, it’s all in a state of flux.
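For those who haven’t yet worked this way, the core mechanism behind Responsive Web Design is simple: the same mark-up re-flows as the viewport changes. Here’s a minimal sketch (the class name is invented for illustration):

```css
/* A minimal responsive sketch: articles stack in one column on
   small screens, and sit side by side at 600px and wider. */
.article { width: 100%; }

@media (min-width: 600px) {
  .article { width: 50%; float: left; }
}
```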

Left Behind by New Models

Establishing design systems that work well across a number of different implementations is no small challenge. Even seemingly simple questions like “does our app need a back button?” are tricky and have broad implications. For some, it’s too much. Recently, I had a conversation (that started on Twitter and moved to Chapp) with a designer, Stephen, who’d largely given up on design for the web. He explained that although he knew—and had promoted—the approaches championed in web design today, he found them difficult to implement. He even felt that the demands of the digital context had invalidated the fundamental design rules he had learned during his training and career.

Over the course of our discussion, I encouraged Stephen not to become overwhelmed by the changing nature of design. I noted that back in 2007, I was ready to write off social media, but knew that if I did, I would effectively relegate myself to the waste-bin of design: I’d be left behind, unable to adapt to change. With that concern in mind, I instead jumped in with both feet. For the next six months, I toyed with every social network I could, simply to understand them. As I did, I was reminded that no one is restricted from growth, unless they get frustrated and claim defeat—which everyone feels like doing at some point.

With that said, I feel lucky to have started my design career on the web, and acknowledge the advantage that comes from having done so. Those of us born around 1970 were among the first to have computers around us for most of our lives, and with this came a great deal of change, and a kind of acceptance of lifelong learning. At no point in my career have I felt as though I knew all there was to know about my vocation. Actually, I’ve typically felt a kind of chronic inferiority: a fear that I will never know enough. Perhaps this is why I write so many articles in which I question the nature of design, and the role of the designer: the gig changes so much that I no longer look upon this work as I once did.

In the mid-90s, I saw a copy of “Ray Gun” magazine, designed by David Carson, and it blew my mind. He’d brought an irreverence to a stale-feeling medium. In doing so, he seemed to knock down so many barriers: after that, everything seemed like fair game. Carson wasn’t the only one; designers ranging from Margo Chase to Neville Brody and Carlos Segura all brought a sense of personality to design that I wanted a part of. I was working as an artist at the time, and these designers had effectively brought art to design. Little did I realize that this was perhaps the swan song of personality-driven graphic design. (Although designers still produce personal and visually delightful work, such efforts seem less relevant in today’s landscape.)

Although aesthetics will always be important, a digital context simply involves a greater number of considerations than what something looks like, or how clever the designer behind it was. In fact, if there’s one characteristic that seems to hold back older designers, it’s an obsession with surface: an inability to appreciate that some design solutions might benefit from plain, or even unoriginal, implementation.

This sounds like a relatively obvious observation, which might best be highlighted by the popularity of flat design. Although I don’t think such approaches are always appropriate, I think it’s fair to say that flat design is in part a reaction to an overly indulgent focus on surface. I don’t need visuals that represent torn bits of paper and leather surfaces in order to check the date in my Calendar application. Although there’s nothing wrong with a beautiful presentation, if the visuals in no way improve the user experience, perhaps we need to ask whether such treatments deserve to be there in the first place.

Preventing Extinction

This isn’t a new idea, but it’s one that many designers who came of age during the print era are surprisingly resistant to. After releasing The Design Method, we sent copies to a number of designers, asking them to (candidly) review the book on either Amazon or their blogs. The feedback was overwhelmingly positive. Younger designers in particular dug into the content and found it useful—even if a little dense in areas.

The only negative feedback seemed to come from designers in their 50s and 60s. A couple of these individuals came back to me, asking why I hadn’t made the book more beautiful. I explained that such an approach seemed patently wrong, given the nature of the content presented in the book. When I asked for their thoughts on the content, I learned that not one of them had even read the book. They were looking at The Design Method as an object, as opposed to a delivery device for information.

It’s hard to be frank in such interactions, for fear of seeming reactionary or emotional; however, I was somewhat taken aback by their responses. Couldn’t they see that the things we design are about more than just surface? Didn’t they understand that my challenge wasn’t about creating a beautiful artifact, but rather taking a complex matter and codifying it into an applicable manual? Moreover, what kind of a designer believes playful visuals should take precedence over clarity, readability, and order? Eventually, I realized that this argument probably wasn’t worth feeding into. But I had to wonder if this small handful of responses was symptomatic of a hurdle that will hold back some designers—those unable to adapt to changing ideas about what design should do.

Regardless, these discussions strengthened my desire to evolve as a designer (and as a human). I am not a slave to past conventions any more than I need to be resistant to new possibilities. My brain isn’t incompatible with emerging approaches, unless I choose to shelter myself from such possibilities. This leaves me silently repeating the following mantra: I will not allow myself to become a designosaur.

There are No Experts

Going back to my conversation with Stephen, though, the question for many mid-to-late-career designers becomes whether to stick with design—and, if so, how to adapt to all of the change underway. Or, “if everything I know about grids doesn’t matter any more, what do I do now?” Well, first of all, there’s plenty of work that isn’t digital in nature that still needs to be done. Second, all that you know about grids (as well as the other design principles you’ve learned) is as important as it ever was. However, you can also learn a whole bunch of other really neat stuff.

In this article, I’ve already used the phrase “lifelong learning,” and I think this is an increasingly pivotal notion for designers to open themselves up to. There are so many great new approaches, technologies, and opportunities being unveiled every day. There’s no reason to keep yourself from learning these new things; similarly, there’s little possibility of becoming an expert in all of them.

Perhaps that’s the most important point for people like us to get comfortable with: we don’t need to be experts in all of these areas; instead, we need to build toolkits for ourselves that bend to different circumstances. Today’s designers have to possess an adaptability that allows them to shift from one situation to the next. (Odds are that over the course of your career, you’ll work on many projects that have little in common.)

For older designers, such notions probably seem at odds with their whole means of operating. They were more likely hired because of their expert knowledge, and to complete defined tasks. For example, a soda company could go to a package design studio and say, “we need you to come up with a design for some Tetra Paks, and we expect you to know exactly what marks/items we need present on them.” However, I rarely get requests that specific. Instead, prospective clients tend to come to us with questions like, “We’re thinking about building an app to help change people’s dietary habits. How do we know if this is the right approach?” or, “We have this innovative new product people would love if they tried it, but everyone’s scared to give it a chance. What should we do?”

The designer of the past could tell a prospective client that they knew exactly how to complete the task at hand, and had most likely done so successfully many times before. As design challenges become more ambiguous and undefined, though, such assuredness is difficult to maintain. As such, I’m more inclined to say something to our new clients along the lines of, “I’m not sure, but I have some ways to find out, and can propose some potential approaches.” This might not be enough for some, but it’s a responsible and realistic response. It’s also one that designers will probably have to grow more accustomed to using.

Interesting Times

Since starting smashLAB, I’ve seen new approaches, technologies, and mindsets come and go (and sometimes come again). Some I was reluctant to accept; others I leapt into, head-first. Increasingly, I’m choosing the latter approach. Although I won’t casually take risky approaches on a client project, I have found that we can’t afford to get locked into any one way of working. (Most times, we test more cutting-edge approaches/technologies with our internal projects, so we can trial-run such things safely.)

As we test varying approaches, we see certain patterns emerge. Most notably, we’ve seen how often the right decision is also the simplest one. Regardless of medium, technology, or deliverable, when you remove your own ego—and cut away extraneous elements—you tend to arrive at something good. Similarly, I find that our conversations increasingly gravitate to systems over novel approaches. We’re less concerned with how interesting or new a solution might be, and more with how well it will work in different situations. So, I’ll choose the adaptable logo over the smart one; similarly, I feel I’ve done my job well when a design approach requires fewer fonts, colors, treatments, instructions, and exceptions.

Looking at design from a systems standpoint requires a kind of big-picture focus on the part of the designer. In a print context, or when planning around fixed-size displays, a designer could be highly particular about accuracy of placement; however, the multiple-output scenarios we’re now more likely to design for don’t accommodate this level of specificity. Sure, we can establish relative rules/guidelines, but these don’t work when overly rigid.

I like to compare this challenge to having to make a one-size-fits-all t-shirt. Were you tasked with creating such a thing, you’d probably source a stretchy fabric, and cut the shirt to work for the greatest number of people. However, for the very smallest and largest people, this shirt probably wouldn’t fit very well. Similarly, an application like Todoist works surprisingly well in a number of different environments, but never feels quite as seamless as a native application.

So, yes, design today requires designers, clients, and users alike to accept some level of imperfection. In exchange, we gain the ability to serve a greater number of users, prototype rapidly, and iterate more frequently—leading to design solutions that can in time become excellent. Sure, new problems arise: nascent technologies are not always dependable and can be poorly documented and supported. Some of the approaches we implement aren’t great on older devices, meaning we’re effectively reinforcing barriers for the have-nots. Similarly, the rate of change is resulting in certain trends that aren’t always great for the end-user.

With all of that—the good and the bad—we find ourselves being nudged forward. Considered on the whole, this results in a net benefit, and we see good design elevate the human experience. For the designer prepared to face the challenge, this is an exciting time. The rate of change is forcing us to look past self-expression, and to consider design for what it can do for others. Similarly, it’s pushing us to abandon out-of-date approaches for which there’s no longer any room.

Some will say that this change is all too much, and decide that the time has come for them to do something altogether different. Me? I love it. At no other point in my career has design seemed as vital, relevant, and exciting.

It’s understandable that some feel a little trepidation when they learn that their known methods might no longer be the best ones. Similarly, I think we all understand feeling overwhelmed by the sheer volume of change and opportunity at hand. But, I assure you, there’s little you can’t learn if you are willing to give it a shot. You might not be perfect, but no one is. Besides, who wants to sit on the sidelines, just when the game starts picking up?

Discuss this post in #Design on Chapp.

Resurrecting IRC

Last fall, @shelkie wanted to learn more about cryptocurrency. So, he started building miners for fun. Along the way, he had some questions, and found IRC Bitcoin channels a good place to ask them. Later, he talked to me about how surprising he found the technology: in spite of it being cumbersome, antiquated, and forgotten by many, those using it had maintained a vital community.

If you’re unfamiliar with IRC (as I was), a brief history: in 1988, fellow Finn Jarkko Oikarinen created Internet Relay Chat (IRC), which uses a client/server model to facilitate the transfer of messages. It allowed for discussion forums (channels), private messaging, and file transfer. By 2003, there were around a million IRC users and half a million IRC channels; since then, those numbers have dropped by nearly half.
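For the curious, the protocol underneath is charmingly plain text. Here’s a minimal sketch of what a client actually sends over the wire, written for node.js; the server, nickname, and channel below are placeholders:

```javascript
// A minimal, illustrative IRC client. Server, nickname, and
// channel are placeholders; swap in real ones to try it.
var net = require('net');

var socket = net.createConnection(6667, 'irc.example.net', function () {
  // Registration: a client announces a nickname and user info.
  socket.write('NICK demo_user\r\n');
  socket.write('USER demo_user 0 * :Demo User\r\n');
});

socket.on('data', function (data) {
  var line = data.toString();
  console.log(line);

  // Servers periodically PING clients; reply, or get disconnected.
  // (A real client would buffer the stream and split on \r\n first.)
  if (line.indexOf('PING') === 0) {
    socket.write('PONG' + line.slice(4));
  }

  // Numeric 001 is the server's welcome: registration succeeded,
  // so join a channel and post a message.
  if (line.indexOf(' 001 ') !== -1) {
    socket.write('JOIN #demo\r\n');
    socket.write('PRIVMSG #demo :Hello, from a raw socket!\r\n');
  }
});
```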

After a few of our talks about IRC, I decided to give it a try for myself. Although I feel comfortable with most technologies, I found using IRC a little arduous. In fact, I eventually had to use a how-to guide, just to make sense of installing a client, finding a server, and configuring everything to gain access. Not a great experience; however, once I got in, I could see why @shelkie was interested in the general notion behind IRC—and why he thought it was primed for a sort of rebirth.

Lately, I’ve come to use Twitter as a broadcast/response tool. I post cheeky remarks and new articles when I have them; additionally, if someone asks a question through Twitter, I respond. That said, I find that the 140-character limit tends to stymie more involved conversations. Meanwhile, I don’t quite know what to do with Facebook any longer. I dislike the notion of clogging colleagues’ news feeds with kid/dog photos, and similarly suspect that family members have little interest in my work-related banter.

Sure, Google has tried to solve this problem with features like Circles (and Facebook with friend lists), which ask you to group your connections into certain buckets. The problem with this approach is that it effectively asks users to do a whole bunch of categorization (or information architecture), which is tedious. IRC, on the other hand, is interesting because it uses subject matter as a starting point, instead of connections. So, if I want to discuss UX or art with those who care about those topics, a service like IRC allows me to do so without having to bug my less-interested friends.

We felt that IRC was a great “in-betweeny” technology: although no longer fashionable, it solves a problem no other network currently does. We also noted that until someone removed the headaches associated with configuring a client and connecting to a server, new users probably wouldn’t bother giving it a try.

At smashLAB, we love building things for the web. So, we got to work on Chapp (the name is a portmanteau of chat and app), which we see as a kind of updated IRC. At this time, it is just an experiment, but we’re keen on its promise. Built on Meteor and node.js, Chapp is a network that allows anyone to browse channels (topics) and create new ones almost instantly.
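To give a sense of why Meteor suits this sort of app, here’s a hypothetical sketch of how channels might be modelled (invented names; not Chapp’s actual source). One insert on the server, and Meteor’s reactivity pushes the new channel to every subscribed client:

```javascript
// A hypothetical channel model in Meteor. All names are invented
// for illustration; this is not Chapp's actual source code.
var Channels = new Mongo.Collection('channels');

if (Meteor.isServer) {
  // Publish all public channels; subscribed clients see new
  // channels appear reactively, without polling.
  Meteor.publish('publicChannels', function () {
    return Channels.find({ isPrivate: false });
  });
}

Meteor.methods({
  // Creating a channel is a single insert.
  createChannel: function (name, isPrivate) {
    check(name, String);
    check(isPrivate, Boolean);
    return Channels.insert({
      name: name,
      isPrivate: isPrivate,
      createdBy: this.userId,
      createdAt: new Date()
    });
  }
});
```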

Most channels are public, and so far range from #AnimatedGIFs, to famous Canadian #RobFord, to even #Eyeglasses. However, users can also create private channels that allow them to connect their companies, teams, friends, and families in closed settings. (For example, I use Chapp as an email replacement for communicating with my parents and brother.)

We soft-launched Chapp a few weeks ago, and asked a few friends to give it a spin. We’re grateful for folks like @matt, @sthursby, and @juddc, who’ve essentially served as an impromptu user-testing team. During this process, we’ve identified some issues, and started modifying the app on the fly. Some changes happened fast, with a number of features being requested and deployed in less than an hour. The app is far from perfect—actually, it’s still rough around the edges. That said, we’re pushing out updates continually, and are working to build something that’s simple, efficient, and useful.

I suppose we could have refined Chapp more before releasing it, but we don’t like the idea of building this app in a vacuum. Instead, we want people like you to take part, use it, and tell us what you like/don’t. It’s through this sort of interaction that we learn how to make something that works for you. And, frankly, that’s what drives us to do this kind of work: the simple understanding that we’re building something good that people use.

Try Chapp and tell us what you think.

How Upworthy Killed Facebook

I have an unhealthy relationship with Facebook. This isn’t Facebook’s, or social media’s, fault; it’s mine. Whenever I’m unsure of what to do next, or putting off tasks I probably shouldn’t, I find myself visiting the site to browse, read, and explore.

In recent months, this has changed. I now hesitate before clicking on content in the News Feed because, in most instances, what’s posted is shit. Actually, not shit—worse than that. Most shared content is representative of a new, hyper-resilient kind of click bait that seems to be growing uncontrollably.

Take today, for example: my News Feed contains links to real articles like, “Capitalism Eating Its Children,” and “Cutting Back On Carbon,” which act as rare islands of credible content. However, the vast bulk of my feed consists of headlines like, “Silhouette Man Wonders WTF is Wrong With Americans,” “22 Adorable Photos of Wild Foxes Turned Into Pets. And When You Read Their Stories … AWESOME!” not to mention, “What Element Are You?” and “15 Things to Stop Doing to Yourself.”

From this latter batch, I have content “creators” like Tickld, ReShareWorthy, PlayBuzz, and Collective-Evolution to thank. Only a couple of years ago, these groups hardly existed; but, following the example that Upworthy (the ass-rot of the internet) set, the click bait infestation is now upon us in full force. Seemingly daily, another junk site pops up, with the sole intention of tricking us into clicking.

Maybe this is just a temporary blip, and this brand of pseudo-content will quickly fade from memory. My concern is that this material might be more like reality TV: something we’ve all started to accept as normal. If this is the case, our collective notion of what constitutes acceptable news, media, and editorial content has become completely perverted.

Although some might dismiss this as “always having been this way,” I worry that what we’re really seeing is symptomatic of a kind of cultural devolution. What we’re willing to accept as newsworthy is not just dressed-up gossip, like it often was before, but rather a weird kind of bait-and-switch we’re seemingly powerless against. If we’re to be judged by what we click, a film like Idiocracy looks less comical, and more like a probable future scenario. (It’s got electrolytes!)

Alas, I’ve gone off on a tangent. Allow me to bring this back to the News Feed.

In spite of my lack of focus, I appreciate Facebook. It’s an incredibly useful service, and I’m grateful for it. Similarly, I’m all for the company making money. Heck, I’m even fine (albeit not thrilled) with their data mining and advertising techniques. For me, though, Facebook is all about the content; and if they can’t get the click bait out of there, my visits will become less frequent. If this continues, I’ll probably just avoid the service altogether.

My hunch is that I’m not the only one who feels this way. You probably skim your News Feed and also ask questions like, “Wasn’t this once better?” and, “Why are my friends posting this junk?” My bet is that Facebook knows this sentiment exists. In fact, I’ve heard that Facebook has penalized Upworthy for how it games the News Feed; and, for that matter, I do find fewer Upworthy posts there than before. What’s surprising to me, though, is that Facebook hasn’t yet curbed all the other click bait that has surfaced through its network.

Although the notion seems improbable, I wonder if Facebook is simply unable to deal with this infestation. Is the public’s appetite for this kind of nonsense so great—and its creators so insidious—that it simply can’t be stopped? If so, perhaps we’ll look back in years to come, and reflect on how Google’s armies were unable to kill Facebook, but Upworthy (and its brethren) did, without even intending to.

Discuss #Facebook on Chapp.

Pace Yourself

Recently, after many years of not running, I completed a half-marathon. My time wasn’t great, but it wasn’t bad, either. In fact, I finished the course a little faster than I’d hoped (and if I had only skipped that pee break at kilometer 19, I would have shaved another three minutes off my time).

Typically, when I run a race, I fall prey to a kind of mental game, in which I overthink my actions. I start slow and then try to speed up at key points. Or, I set my sights on someone in front of me, and try to pick them off. Alternatively, I give myself “permission” to go slow for a while, knowing that I’ll be able to push much harder at a later point—given the energy reserves I will have stored. (These approaches rarely work.)

This time I did nothing of the sort. Instead, I turned on my playlist of fun songs, and resolved to go on a nice, steady run. At the starting point, I found a spot near the two-hour pace bunny. From that point on, I simply worked to hold a pace of roughly 5:30 per kilometer. (Again, I’m not out to break any world records.)

As I neared the completion mark, I felt good, and had more to give—definitely more than if I’d tried anything more complicated. I finished in 1:58, and reasoned that breaking 1:50 in a future run was completely viable. Next time, I’ll employ the same approach, but start closer to the 1:50 pace bunny.
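(For those checking my math: 5:30 per kilometer over the 21.1 kilometers of a half-marathon works out to about 116 minutes, or roughly 1:56. So a 1:58 finish, pee break included, means I held that pace almost exactly. Breaking 1:50 would mean holding about 5:13 per kilometer.)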

I think a lot about simplicity these days, as I find the best approaches are often the simplest ones. My half-marathon experience yet again validated this belief for me. Instead of thinking too much, why not just choose a reasonable pace, and then stick to it?

When I look at those who’ve achieved success, the same sort of approach is often present: no complex strategy or heroics; instead, just a clear goal, followed by consistent effort. Although this approach doesn’t ensure success, it’s probably more sensible than following fleeting impulses.

Another aspect of pacing that I like is that it’s cumulative. If you’re a writer, commit to writing a manageable number of words every day. If you’re a designer, perhaps set out to produce a project a week. If you’re working on a startup, you might pledge to on-board a new customer every day. Some days you’ll achieve more, and on others you’ll achieve less, but so long as you stay the course, those peaks and valleys should even out. (And within even six months, you’ll have created something significant: at 500 words a day, for example, you’d have a 90,000-word manuscript.)

Knowing what lies ahead, and determining the best path to take, can be difficult for those who do creative work. In fact, answers to these sorts of questions are often elusive, leading you into a state of paralysis by analysis—or leaving you feeling like you’re spinning in circles.

So, the next time you’re contemplating some divergent strategy or course, consider the following instead: set a recurring task for yourself, and stick to that production schedule for a few months. This length of time might seem daunting, but I assure you, it will pass in no time. My bet is that you’ll later look back with surprise at what you accomplished by just keeping a steady pace.

Have questions or feedback? Discuss this post on Chapp.