Poetic irony: Teaching philosophy of technology in an all-virtual format

Gabbrielle Johnson, assistant professor of philosophy, is teaching “Intro to Philosophy: Science, Technology, and Human Values” and “Psychology of Bias” this spring

Across the CMC curriculum, members of our faculty are meeting the challenge of these unprecedented and historic times, delivering exceptional coursework in a fully online modality for the spring semester. In our Academic Innovations series of faculty Q&As, professors share their curricular highlights, best practices, and how students are helping to shape virtual learning for a memorable, collaborative academic experience.

What are you most eager for students to experience and understand in both of your classes this spring?

I’m most eager for students to learn that philosophy’s not dead. What I’ve noticed over my years of teaching philosophy is that many people (understandably) do not know what philosophy is or what philosophers do. As a first-generation student myself, it wasn’t until my junior year that I learned philosophy was something people still did and made careers out of, careers that went beyond sitting around musing over what dead white men had to say about things a hundred years ago. I believe students are surprised and excited to see philosophy courses on cutting-edge issues, like implicit bias and machine learning, and likewise I’m excited to share with them what philosophy can lend to our understanding of and engagement with current events.

Most heartening of all, I had three STEM students write to me after courses last semester to share a similar sentiment about the value of my approach for them. Each stated that they had entered the course squarely in STEM majors and were initially skeptical of the application of the humanities to their chosen fields of study; however, they left the course with a whole new perspective that the humanities are essential to a STEM education, and they were grateful for their newfound appreciation of philosophy’s application to their programs. Two of these students had even decided to do a philosophy sequence as part of their majors in applied sciences. Having gone through a similar philosophical awakening in my own education, I was ecstatic.

Have you had to reimagine how you teach your classes in a virtual format? Can you discuss what sorts of adaptations you have implemented?

There’s a poetic irony that comes with teaching courses in the philosophy of technology at a time like this. To convey the degree to which technology now permeates every aspect of our lives, it seems all I would need to do is show up and have students look around at our all-encompassing virtual environment. Joking aside, I do see the pivot to virtual education as something of an opportunity in disguise. Not only are the students living the subject matter, but they’re in a useful position to test out certain ideas in real time. For example, last semester, in a lesson about how human values and commercial interests might shape seemingly objective gateways to information, I had students simulate a Google search engine. They curated a list of responses they thought were ideal and compared those with real-time results provided by Google. This allowed them to see how each person’s list differed across both the simulated and the actual online platforms, and how those differences might affect the narrative of (mis)information each person is exposed to.

In coming up with fun activities like these, I’m able to use our reliance on technology as a resource rather than an impediment. However, at the end of the day, there’s a deeper, self-evident lesson: no matter how enriching technology is for our day-to-day activities, it isn’t—and can’t be—a replacement for real, human interaction.

What are you hoping to address in your "Psychology of Bias" course this spring? What sorts of outcomes are you seeking?

Over the last 70 years, our concept of social bias has changed dramatically. It used to be that to say someone had a social bias meant they overtly affirmed some problematic stereotype about people from marginalized groups. Thus, the best measure of whether someone had a bias was simply to ask them directly. Over time, the direct approach to measuring bias began to show a decline in negative racial bias. However, although overt expressions of racist ideology were curbed, the pervasive and destructive effects of racism were still painfully evident. Thus, research methods had to be developed to reveal psychological sources of discriminatory behavior that are not obvious to the individuals who harbor them and that grow out of seemingly innocuous patterns of thought and inference. Recognizing this new species of social bias demonstrates how insidious social bias really is, and how it creeps up everywhere from seemingly objective human inference to data-driven computational processes in machine learning programs.

This class will explore the philosophical implications of these sorts of covert, hidden biases. We’ll question how they might hinder our ability to reason objectively about our social environments, whether we are morally responsible for their existence, how they might contribute to institutional or structural injustices, and what, if anything, we can do to mitigate their effects on individuals and society.

The class itself is thoroughly interdisciplinary. I’m a firm believer that philosophical work such as this must be grounded in, and ultimately shaped by, the actual empirical work with which it attempts to engage. By reading contemporary works in philosophy, psychology, cognitive science, and computer science, students in this class will explore the topic from multiple angles.

How are you feeling about the year ahead of you? How has living through the pandemic affected your research?

I’m optimistic. So much of this past year was shrouded in uncertainty. This year has only just begun, but we already have so much to be hopeful about.

Regarding my own research, as I mentioned earlier, the pandemic has thrown into sharp relief how much our lives have come to depend on technology, and it has provided a healthy skepticism about the extent to which that technology can supplant human social interaction. This has given me a lot to think about in my own research projects on social bias, values in scientific practice, and the philosophical impact of technology.

You participated in the “Algorithmic Bias: The New Form of Discrimination” panel on Feb. 5 with CMC’s Office of Alumni and Parent Engagement. Can you give us a snapshot of what you shared?

The panel was hosted by CMC’s Financial Economics Institute in response to the College’s Presidential Initiative on Anti-Racism and the Black Experience in America.

At the intersection of the topics in my work and the courses I’ve discussed above, the panel aimed to explore how racism and discrimination have taken a new form in the age of big data and artificial intelligence. I shared a bit from my work about what social bias is, how our conception of it has changed over the last 70 years, and why we’re seeing it creep up in seemingly neutral technologies like machine learning. One key point I brought to the discussion was the danger that accompanies the misunderstanding that these technologies have ameliorated human bias. Not only do they harbor many of the human biases with which we’re familiar, but they do so under the guise of objectivity, making them even more menacing. Thus, my work on understanding how social bias can manifest in seemingly innocent patterns of social reasoning contributes vitally to identifying, evaluating, and eventually mitigating the pernicious biases we see in these intelligent systems.

—Anne Bergman
