Samantha Hankins: ‘But Wait, Is Your Last Name Filipino?’


I’ve never met my dad, and to this day I’ve never seen a photo of him or even discussed him with my mom. From what I’ve gathered, my mom and dad were married and divorced before I was born. I know this because we both still have his last name, Hankins.

My mom was a single parent, and we lived with my aunt and my uncle in an affluent, predominantly white suburb of Chicago. She struggled with the cultural differences between the Philippines and America and therefore prioritized making sure I had a connection to our Filipino heritage. I grew up learning how to speak Visayan, spending summers in Cebu, and making friends and learning traditional Filipino dances with kids whose parents were also in the Chicago Filipino doctors’ association. Through and through, I was raised 100% Filipino.

Caroline Sinders on Ethical Product Design for Machine Learning


Photo by Alannah Farrell

For the past two years, I’ve worked as a machine learning design researcher. Machine learning is programming that learns from user inputs and adapts and improves over time. It’s my humble belief that machine learning and artificial intelligence are going to radically change product design: chat bots, natural language processing that studies users’ behaviors and conversational patterns, analytics APIs designed to study and predict behavior, computer vision software created to predict crimes and recognize human emotions. In fact, everything I just mentioned already exists. But implementing these algorithms is one thing; how do we design ethically with machine learning, and how do we create products that use all of its positive attributes without surveilling and harming our users? Can ethical product design exist for machine learning?

I believe firmly that it can. However, machine learning must not be treated as a new, out-of-the-box software implementation that has been QAed, tested, and is ready for deployment with few further changes or rollouts. It needs to be treated as highly experimental software.

Ash Huang: How Much Poison Is Acceptable in Our Technology?


Photo credit: Helena Price

For an industry that complains about the inconvenience of waiting for a cab, doing laundry, or picking up takeout, we sure build a lot of suffering into our apps.

Virtual reality initially caused motion sickness in women because the equipment was developed and tested primarily by men. Interracial couples try to take photos together and fail because their phone’s white balance can’t capture both dark and light skin tones. People struggling with mental health issues, violence, or other trauma try to get help from Siri and Alexa, and only recently have those needs been considered. All these stories and more, underscored by rampant and constant harassment of women, people of color, people with disabilities, those of Muslim and Jewish faiths, and LGBTQA people, and tech’s bewilderment about how to help.

John Palfrey on The Paradox of Tolerance


Photo by Dave White.

Co-Editor’s Note: Two weeks ago, I was lucky to be introduced to John Palfrey’s open note to his campus. It has stayed with me as the clearest piece of creative writing on the vital topic of inclusion. —JM

We teach more than just mathematics, science, writing and reading, languages, the arts, and other academic topics in our schools. We also teach character and moral development. Many schools do so explicitly, through the lessons that we choose; all schools do so implicitly, through the personal examples that teachers, coaches, and principals set for our students. Whether parents like it or not, there is no way for teachers to avoid teaching character to some extent; after all, our students are watching us as they learn.

Lena Groeger on Discrimination By Design


A few weeks ago, Snapchat released a new addition to the face-altering filters that have become a signature of the service. But instead of surrounding your face with flower petals or giving you a funny hat, the new photo filter added slanted eyes, puffed cheeks, and large front teeth. A number of Snapchat users decried the filter as racist, arguing it was the outcome of not having enough people of color building the product. In a tech world that hires mostly white men, the absence of diverse voices means that companies can be blind to design decisions that are discriminatory or hurtful to their customers.