Most of us start projects with good intentions—we want to make things welcoming, seamless, and maybe even fun to use. But too often, toxic cultures within tech result in products that have all sorts of biases embedded in them: “smart scales” that assume everyone wants to lose weight, form fields that fail for queer people, image-recognition software that doesn’t work for people of color. As tech becomes increasingly central to our users’ days—and intertwined with their most intimate lives—we have more responsibility than ever to consider who could be harmed or left out by our decisions.