Earlier this summer, an internet gimmick surfaced on social media where people randomly posted their first seven jobs. Implicitly, this meant that you had to be old enough to have held seven jobs in the first place (a fact that eliminated pretty much anyone under forty). Nevertheless, referencing these forgotten anecdotes quickly became a kind of sport: what emerged was a combination of irreverence (famous people divulging they’d once been interns) and insouciance (less famous people declaring they’d only ever been interns), all of which produced a surprising result.

It leveled the playing field.

Conspiracy theorists wondered if the entire charade had been seeded by shrewd marketing strategists (it hadn’t), but to most of us, the entire exercise seemed harmless, even humorous. Part listicle, part haiku, there was something entertaining about it, and because it was so clearly contingent upon one’s own self-avowed narrative, the whole enterprise felt like a selfie: quirky and quick, a midsummer meme.

In the end, #firstsevenjobs was an exercise in self-generated inclusivity, reminding us that we are, as Aristotle himself might have observed, a great deal more than the sum of our parts. In a world defined by competing realities—social, digital, global, virtual—how do we share who we are, what we value, how it might matter to others?

We live in a world where seeing is no longer believing.

This fall the U.S. Supreme Court will hear arguments for a case—between technology behemoths Apple and Samsung—that hinges on what constitutes the design of a smartphone. As in all legal conflicts, the case will rely heavily on precedent, but what’s significant in this instance is the degree to which intellectual property manifests as undeniably visual. From an operational viewpoint, the phone itself links communication to organization, work to play, camera to calendar to the countless apps that have come to characterize our notions of both efficiency and independence. But from an aspirational perspective, its material value is profoundly tethered to its physical attributes, its ergonomic elegance, and its core appeal. With over six billion smartphones anticipated in worldwide circulation by 2020, what these things look like is inextricably linked to their perceived purpose and performance.

Today, form no longer follows function. It is function.

The Apple-Samsung case testifies to two primary issues: one, that visual perception is a ferociously powerful trigger for consumers, and two, that beauty remains in the eye of the phone-holder. (It’s equally true that living in a democracy means copping to the acceptance of high government: anybody can make an app, but only a chosen few get appointed to the bench.) Fair enough. But before you malign the doctrines of jurisprudence as an elitist practice, consider the sociocultural implications of free speech in the context of design and technology. Think about cross-disciplinary collaboration in the interests of social enterprise. Imagine a world where diversity is standard, where participatory making supplants proprietary bias. Increasingly, it’s becoming clear that when we invest in a collective agenda, we are fortified by a deeper purpose.

We pay it forward.

At a moment in history that’s likely to be remembered for its alarming divisiveness, technology beckons with a kind of solemn grace: we are all makers now, capable of unprecedented levels of engineering, artistry, and authorship. And here, the evidence is heartening. A post-Brexit design referendum promoting cultural inclusivity. “Universal” design conceits eliminating barriers for the disabled. Low-cost, high-impact interventions that produce smart, scalable solutions in developing countries. Today, the once-isolated disciplines of technology and design reassert themselves as the building blocks of a new generation.

Thirty years after William Gibson first coined the utopian term “cyberspace”—a prophecy of shared and highly inclusive social interactions—we’re just beginning to witness the evidence of true social connectivity, and technology may be the closest we’ve got to an international language. Can an understanding of code—and by conjecture, the value we place on computational skill—become the catalyst for change that coming generations will truly value? Can computational design become the connective tissue linking millions of people to tomorrow’s large information systems? And can inclusive practices become the rule, rather than the exception, breaking down barriers to entry, eliminating the fault lines between disciplines, the balkanized rituals that limit our progress? “If they don’t give you a seat at the table,” Shirley Chisholm once remarked, “bring a folding chair.” Today, we can do better: we can build those chairs ourselves. And we will.

❔ Whois

Jessica Helfand received both her BA in graphic design and architectural theory in 1982 and her MFA in graphic design in 1989 from Yale University, where, with Michael Bierut, she is Senior Critic in the School of Art and a member of the design faculty at the School of Management. A co-founder of Design Observer, she has written for many national publications, including the Los Angeles Times Book Review, Aperture, and The New Republic and is the author of numerous books on design and cultural criticism, including Paul Rand: American Modernist (1998), Screen: Essays on Graphic Design, New Media and Visual Culture (2001) and Reinventing the Wheel (2002), which formed the basis for an exhibit in 2004 at the Grolier Club in New York City. Her critically acclaimed Scrapbooks: An American History (Yale University Press, 2008) was named that year’s best visual book by the New York Times. Helfand’s most recent book, Design: The Invention of Desire, was published by Yale University Press in 2016. Jessica Helfand was appointed by the Postmaster General to the U.S. Citizens Stamp Advisory Committee in 2006, where she chaired the Design Subcommittee until 2012. She is a member of Alliance Graphique Internationale (AGI) and a 2011 laureate of the Art Director’s Hall of Fame. Named the first Henry Wolf Resident in Design at the American Academy in Rome, she received the AIGA Medal, the design profession’s highest distinction, in 2013.


Posted by Jessica Helfand

Artist, Writer, Educator. New Haven and Paris.


  1. “Implicitly, this meant that you had to be old enough to have held seven jobs in the first place (a fact that eliminated pretty much anyone under forty).”


    Implicitly, this tells me the author is ignorant of the current employment market (particularly the service industry, where many under 40 are working), underemployment, lack of benefits, wage theft, and other nefarious employer practices that have forced many a worker into different jobs (sometimes three at a time just to make 40 hours) – and all at or marginally above minimum wage.

    I’m 54. I live in a comfy socioeconomic range. And I am fully aware of what younger workers are having to deal with. How does the author not?

  2. A fascinating response, one that gets to the heart of that meme. How truthful were people about their jobs? What constitutes a so-called job? How much did people spin their employment histories into poetic variations on the “baby shoes never worn” flash fiction attributed to Hemingway? When is a job a labor of love, a test of endurance, an inventory of humanity? That is what’s interesting about all this: to my mind, not so much an expression of socioeconomics as an evocation of self-awareness.
