Where are the Women Philosophers? An Unsolicited Book Review of Justin E. H. Smith’s The Philosopher: A History in 6 Types

Unfortunately, for a book that is supposed to be a bold, new, and more inclusive look at what a philosopher is, Justin E. H. Smith’s book The Philosopher: A History in 6 Types (Princeton, 2016) comes across as traditional and “old school.” While Smith’s pluralistic typology of philosophers and of the history of philosophy certainly has its merits—it shows nicely that poets, novelists, and scientists can also be philosophers—few contemporary philosophers are featured, and so the book provides little by way of knowing where we fit into this typology today. However, the main reason for my claim that Smith’s book is traditional lies in its paucity of women philosophers. The Philosopher verbally challenges the recognized canon, yet this laudable message is betrayed in its very execution.

Given that I teach 10 classes per year and publish somewhat regularly, I am always on the lookout for new, provocative philosophical ideas and arguments, or for philosophers whom I know little or nothing about. It was the latter that motivated me to purchase and read Smith’s The Philosopher. And so it was with interest that I read Smith’s arguments that Alexis Kagame and Walt Whitman are philosophers in their own right. Smith’s discussions of these thinkers, plus others, are fascinating. Nicely done! However, given that male philosophers dominate the halls and assigned texts of philosophy—whether western or eastern—I am not looking for more men. I have more than enough to talk about. I’m looking for more women philosophers to discuss in my classes.

One (solicited) book review of The Philosopher would suggest that I’ve come to the right place: “The array of thinkers cited is gender and culture inclusive …” (Choice). Culture, yes; gender, hardly. In total, 8 women philosophers are mentioned in a 250-page book: G. E. M. Anscombe, Hannah Arendt, Margaret Cavendish, Ariel Durant, Zora Neale Hurston, Katharine Kepler, Carolyn Merchant, and Alison Wylie. Most get one quick mention; none gets more than two. None is discussed. The front cover features Cavendish, yet she receives only two quick mentions in the text itself. She is called “brilliant” by Smith, but there’s no explanation of what makes her brilliant. By contrast, 172 male philosophers are mentioned. Many are discussed at length. Leibniz alone gets 40 mentions. Smith dedicates 11 rather gushing pages to Walt Whitman.

There is no mention of Mary Astell, Émilie Du Châtelet, Anna Julia Cooper, Joyce Mitchell Cook, Anne Conway, Elisabeth of Bohemia, Kathryn Gines, Marie de Gournay, Heloise, Hildegard of Bingen, Hypatia, Damaris Masham, Onora O’Neill, Martha Nussbaum, Teresa of Ávila, Judith Jarvis Thomson, Margaret Wilson, Mary Wollstonecraft, or Sophie Charlotte of Hanover. Some of these omissions are simply shocking. For instance, Leibniz himself wrote that his philosophy was influenced by Conway. Except for Hurston, no women of color are mentioned. You wish to understand the different types of philosophers, as is the explicit and avowed aim of Smith’s book? You’ve got to be more gender inclusive than this.

End of my unsolicited book review. Or maybe it’s closer to a rant? At any rate, for further reading on philosophy’s gender bias, see a fantastic article by Andrew Janiak and Christia Mercer. By the way, Mercer was one of my favorite professors at the University of Arizona.



The Fallacy Stick

A little philosophy can sometimes be worse than none. Critical thinking done correctly requires constraint. Be warned of what I call the fallacy stick.


One of the first things that new philosophy and critical thinking students learn is to recognize fallacies. Fallacies are common errors in reasoning or argument. Students learn to differentiate passages that merely raise a further question from those that actually “beg the question,” arguments that commit the “straw man” fallacy from those that commit “red herring,” and so forth. Learning how to do this improves their reasoning and better equips them to detect the fallacious reasoning of others.

Here are a couple of examples. The Daily Show stated in reference to Trump’s campaign slogan: “‘Make America Great Again’ begs the question: when exactly was America great?” From The Hill before the last presidential election: “The polls and online markets currently have Clinton running away with the contest, begging the question: If Clinton wins big, what’s it likely to mean for the country?” Both, however, are simply cases of raising a question; neither has anything to do with the fallacy of begging the question.

It is natural, when initially learning about fallacies, to start noticing them for the first time. I remember when I first learned of bingo halls. (The Drum and Bugle Corps I played in sometimes stayed overnight in bingo halls.) Before this piece of knowledge, I never noticed them. Now I notice bingo halls in every city I visit. But I don’t start seeing bingo halls where there are none, yet this is exactly what happens in the case of fallacies. Students often start to see fallacies everywhere, even where there are none. Take the not uncommon phenomenon of people changing their minds and taking positions they previously rejected. Students often succumb to the temptation to attribute the fallacy of inconsistency to them. Consider the following example:

I used to believe that reproductive human cloning was a terrible idea because it is unnatural and it requires advanced technology, but I’ve since changed my mind, because I have come to realize that cloning really isn’t that far removed from in vitro fertilization (IVF), which is also unnatural and requires a lot of human technology. Our reproductive liberty should extend to both IVF and cloning.

The author here should not be charged with the fallacy of inconsistency, however tempting it may be, since she has explained her change of mind. This is the first problem: using the fallacy stick even when there’s no fallacy.

Sometimes a fallacy is “seen” or “charged” because of a misunderstanding of the fallacy itself. This is from a popular website: “Argumentum Ad Populum is an argument by the majority stressing that they are right since they belong to the many” (Mortillero). No, an ad populum argument may occur even if only one person asserts it! Say I argue that the sky is blue because most people believe that the sky is blue. I’m appealing simply to the majority for my conclusion that the sky is blue, and my argument is therefore an “argumentum ad populum,” but it is certainly not an argument given by the majority.

Another problem I want to address is using the fallacy stick without explanation. Unfortunately, this affects even the most experienced fallacy-knower: charging others with committing fallacies without actually explaining the logical shortcoming.

Two things: First, it’s generally rude. Most people don’t truly understand fallacies. Sometimes not even the accuser! I can’t count how many times I’ve seen fallacy labels applied incorrectly. Second, labeling or naming a problem is merely a shortcut for showing why the argument is problematic, and once you have shown why the argument is problematic, why use the label at all? In a classroom setting, where students are learning fallacies, it makes sense to use labels. But in real life, where people are often emotionally invested in what they’ve just asserted and don’t fully understand philosophy and logic, generally avoid such terms. I never tell someone I don’t know, “Hey, you just committed the ad populum fallacy!”, even on Facebook or Quora. Rather, I explain that in arguing for their point, they referred only to the fact that many people agreed with their conclusion, but this alone doesn’t show that their conclusion is correct, for the majority may in fact be wrong, as history has shown us time and time again. Learning fallacies should not amount to learning how to cudgel people with labels; it should amount to learning how to recognize errors in reasoning, whether in your own arguments or in those of others.

Do you use the fallacy stick? I recommend against it. Teachers, specifically, should create a classroom atmosphere of argument repair and construction rather than argument subversion and destruction.

Out Of Simplicity, Find Clutter

Einstein once said, “Out of clutter, find simplicity”—it’s important to recognize order and patterns in the apparently complex. “In discord, find harmony,” he continued. Einstein utilized this maxim to guide his own investigation of the physical universe. Yet it is undoubtedly a valuable maxim for all disciplines, including philosophy, for it is an essential feature of logic. As a teacher of philosophy, then, part of my job is to simplify the clutter. But, as I will show, part of my job is also to “clutter up” what appears to be simple, to find discord in what appears harmonious. I begin with simplification.

Consider the following passage:

What is the highest human good, namely, the most valuable thing for humans, such as you and me? Let’s begin with commonplace examples of things that may appear to offer much value, but actually represent lesser human goods. Consider, for instance, lots of money or assets, i.e., wealth; it is not sought except for the sake of something else, of itself it brings us no good, but only when we use it, whether for the support of the body or for some similar purpose. Now the highest good is sought for its own, and not for another’s sake. In other words, the most valuable thing must be intrinsically valuable. Wealth is thus not a human’s highest good, for, as shown above, it is not intrinsically valuable.

Seemingly convoluted, the above passage offers a fairly straightforward argument:

(1) Wealth brings us good only when we use it.

(2) Wealth is not sought except for the sake of something else.

(3) The highest human good is sought only for its own sake.

Therefore, (4) Wealth is not the highest human good.

The above argument was extracted from the original passage by stripping away logically extraneous detail and then expressing what remains as clearly as possible. Now the argument is ready to be checked for soundness, using the tools of logic.
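To make the check for validity concrete, the reconstruction can be put in predicate-logic notation. This is a rough sketch of my own; the predicate letters are shorthand I am introducing, not anything found in the original passage:

```latex
% W(x): x is wealth;  S(x): x is sought for its own sake;
% H(x): x is the highest human good.

% (2) Wealth is sought only for the sake of something else:
\forall x\,\bigl(W(x) \rightarrow \neg S(x)\bigr)

% (3) The highest human good is sought only for its own sake:
\forall x\,\bigl(H(x) \rightarrow S(x)\bigr)

% Therefore, (4) wealth is not the highest human good
% (from (2) and (3) by contraposition):
\therefore\; \forall x\,\bigl(W(x) \rightarrow \neg H(x)\bigr)
```

Notice that premise (1) serves mainly to support (2) rather than to carry independent logical weight; validity rests on (2) and (3) alone. Whether the argument is also sound, that is, whether those premises are true, is the further question that logic alone cannot settle.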

In philosophy, however, the converse of Einstein’s maxim—“Out of simplicity, find clutter”—is equally valuable. For too often, what appears simple, straightforward, or consonant is not. Let me explain.

The philosopher Spinoza spoke of how there would be no disagreement among people if only they used terms univocally. Say that you and I disagree vehemently about the existence of God. Yet, maintained Spinoza, you and I would agree about God, provided that you and I actually meant the same thing by ‘God’. People may think that they are speaking of the same thing, because they are using the same words. But often they aren’t, and so confusion and disagreement reign. Spinoza himself was called both a man “intoxicated by God” and an atheist. That’s why he began his Ethics with a set of definitions, from which we can generate further truths and, of course, consensus on those truths. Spinoza may have been overly optimistic regarding the prospect of human agreement over controversial matters; nevertheless, for there to be any possibility of such agreement, clear communication is paramount.

To that end, defining terms very carefully is crucial in philosophical discourse. Integral to this task is the making of distinctions between terms; that is, disambiguating terms that are easily or often conflated. Here are some particularly salient distinctions that philosophers love to talk about:

  • appearance/reality
  • ambiguity/vagueness
  • analytic/synthetic
  • a priori/a posteriori
  • categorical/hypothetical
  • contrary/contradictory
  • determinism/fatalism
  • efficient/final/material/formal
  • endure/perdure
  • eternity/sempiternity
  • induction/deduction
  • intension/extension
  • necessary/contingent
  • necessarily/always
  • noumenal/phenomenal
  • transeunt/immanent
  • per se/per accidens
  • providence/praevidence
  • substantival/adjectival
  • tautologous/contingent/contradictory
  • type/token
  • use/mention

For the sake of time, I will discuss just one of these distinctions—the one between determinism and fatalism.


“Lives are rivers. We imagine we can direct their paths, though in the end there’s but one destination, and we end up being true to ourselves only because we have no choice.” (Richard Russo, Empire Falls)

In ancient Greek writings, the two distinct positions of determinism and fatalism were not often disambiguated, since the same term, moira (μοῖρα), seems to apply to each.  Moira comes from meros, “part or lot,” and moros, “fate or doom.” Even today, the English terms and phrases “fate,” “doom,” “destiny,” “one’s lot,” “what is determined to happen,” “what is predetermined,” and “what is predestined,” are often used interchangeably and loosely.

(Tapestry by Pat Taylor and Fiona Abercrombie, from the drawing Three Fates by Henry Moore)

Let’s begin by describing a brand of determinism offered by those ancient Greek and Roman philosophers called Stoics. Stoics believed that everything that is or comes to be in the universe has a cause: there is nothing that is uncaused. Everything is but a link in the infinite chain of causes. If this were not the case, the universe would be unpredictable, chaotic, and disunified. Indeed, the universe was thought by many Stoics to be an organic unity.

The Stoics distinguished between several kinds of cause: initiating, contributory, sustaining, and constitutive. But what is true of all of these causes is that it is impossible that, when all of the circumstances surrounding both the cause and that for which it is a cause are identical, the result would sometimes turn out in a particular way and sometimes would not. For, insisted the Stoics, if this were to happen, then there would have to exist some change without a cause. In other words, according to the Stoics: same initial conditions, same result.

None of this, however, points to the doctrine of fatalism, strictly speaking. Rather, the Stoics tended to defend what philosophers now call determinism.

So what is fatalism and how does it differ from determinism? To explain, consider the case of Oedipus, the mythical Theban king and subject of Sophocles’ tragedy. The oracle of Apollo said to King Laius, “If you beget a child, the one who is born will slay you, and all your house will wade in blood.” Eventually, this came to pass. His son, Oedipus, wound up killing Laius and marrying his mother Jocasta, not at all knowing that he had committed patricide and incest.

(King and Queen by Henry Moore)

Determinists say that if the oracle of Apollo had not made such a prophecy to Laius, none of the things that came about would have done so. If the oracle had not prophesied thusly, Laius would not have abandoned his son and his son would have known that Laius was his father and so would not have killed him nor married his mother. Everything is part of the “chain of causes,” including the prophecy. Yet the oracle did utter the prophecy, so Oedipus’ patricide and incest were both causally inevitable.

However, imagine a response to the case of Oedipus that went like this: Oedipus was going to murder his father and have sex with his mother whether or not the oracle uttered the prophecy. Such a response diverges greatly from the above deterministic one; it is the response of a fatalist.

Here’s the difference in a nutshell: determinism means that given the same initial conditions, the same results will occur; fatalism means that at least some events will occur, no matter the initial conditions. No wonder then that physicists tend to be determinists but not fatalists while Christians tend to be fatalists but not determinists. No matter what you or I do from here on out, Christians believe, Christ will return. Whereas many physicists believe that what you and I do does causally affect the future. Our actions do make a difference. The kicker, of course, is that our actions are themselves entirely causally dependent on and determined by past events.
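The nutshell contrast can even be put semi-formally. In the sketch below (my own shorthand, not anything from the Stoics or from physics), h and h′ range over possible complete histories of the world, h(t) is the total state of the world at time t, and E is an event:

```latex
% Determinism: histories that agree on the complete state of the
% world at some time t_0 agree at every later time as well.
\text{Determinism:}\quad h(t_0) = h'(t_0)\;\rightarrow\;\forall t > t_0,\ h(t) = h'(t)

% Fatalism: some event E occurs in every possible history,
% no matter what the initial conditions are.
\text{Fatalism:}\quad \exists E\,\forall h,\ E \text{ occurs in } h
```

On this rendering the two theses are logically independent: a deterministic world need not contain any condition-proof event, and a fated event could occur in a world whose other happenings are not fixed by prior states at all.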

So the next time you hear someone speak of or write about fate, ask yourself: What exactly is this person referring to? Or is she or he using it in a loose or ambiguous way?


Bullshitting With Parameters

Since I teach philosophy, it’s probably a good idea to describe what I take philosophy to be. It’s common knowledge that philosophy means “love of wisdom.” Yawn. But it starts to get a bit more interesting when we see that both love and wisdom can be understood in various ways. The Greek word philia suggests love of the friendship variety. Certainly most philosophers think of themselves as friends of wisdom; however, such a depiction is incomplete and potentially misleading. In Plato’s Socratic dialogue, The Symposium, or as I prefer it, The Drinking Party, love is depicted in several ways. Interestingly, the one Socrates himself prefers relates more to eros than to philia. His point is that philosophers are not merely friends (if they are friendly at all) to wisdom but that they are seekers of wisdom. Erotic love is the attempt to obtain something that one does not yet have; it is therefore an activity with an intended target. And philosophizing, like eros, is an unstable, even uncomfortable, state of being. (Put The Drinking Party on your reading list.)

What is this wisdom that philosophers seek? For one, it’s not mere information. I’m not a wiser person because I remember the past ten winners of the Great American Beer Festival. And it’s not because I know how to brew beer (though, as countless Benedictine monks can attest, that doesn’t hurt). And it’s not even finding out how yeast produces alcohol. Sorry, scientists. Seeking wisdom is not about remembering facts, or learning a craft, or mere empirical investigation. Wisdom gets at deeper issues of a metaphysical, epistemological, or ethical nature. Ethics, for instance, concerns how we should or ought to live with one another. Notice the should here. Philosophers are not really interested in investigating how we do live, or have lived, or will live. Let sociologists, historians, and anthropologists answer those questions. They’ll do a much better job anyway. Now, how we do live with each other is probably germane to the question of how we should live, but philosophers don’t take that for granted.

Some would point out that answers to these deep questions have already been given via religion, culture, and tradition. Sure, Christian morality, with reference to the Sermon on the Mount, the Golden Rule, and certain virtues, answers the question of how we should live with each other. So why does one need to investigate further, that is, to philosophize? A simple appeal to the answers provided by Christian morality is deeply unsatisfactory to a philosopher. For how do we know that such answers are correct? What makes an answer to our deepest questions correct, or at least better than other answers? Why should we believe that the Christian account of virtue is superior to Aristotle’s? Or Nietzsche’s, for that matter? Do not stay fixed on what others believe in order to resolve such issues. There’s an Akan proverb that registers in the same key: “A wise person, if we show them something above, looks on the ground.”

Consider the following Turkish folktale:

One day Nasreddin Khoja and a group of his neighbors were going somewhere together. They all rode upon their donkeys. When they came to a hill, Khoja noticed that his donkey was sweating. He got down from its back and whispered into its ear, “I am sorry that you are working so hard that you are sweating.” His neighbors noticed Khoja get down from his donkey’s back and whisper into its ear, and they were curious about this. “Khoja, what did you whisper to your donkey?” one of them asked. “I told my donkey I was sorry he had to work so hard that he sweated,” answered Khoja. All of his neighbors laughed, and one of them said, “Why did you do that? Donkeys do not understand human speech. They are not at all human.” Khoja replied, “What I have to do is what concerns me. I did what is expected of a human being, and I do not care whether or not he understood what I said.” (Quoted in Bobro, “Folktales and Philosophy for Children,” in Analytic Teaching 25, 2)

Most of us are able to identify with both parties, Khoja as well as his neighbors: the neighbors, since it is not normal to apologize to a labor animal or livestock, given the widely accepted idea that such animals are inferior to humans and are here only for our purposes; and Khoja, because he is doing what he thinks is right even though it contradicts common belief and practice. But to really take seriously what Khoja says, to begin to question for yourself the answers “given” to you by society, is to engage in philosophy. This can get uncomfortable and can even set you up for ridicule, as it did for Khoja. (I’ve had students who thought that questioning was a sin. Well then, on that score, the practice of philosophy is downright sinful!)

I don’t believe that philosophizing is for everyone, but I do believe that each of us should, at least once in our lives, place our beliefs, especially about ethics and religion, under the light of philosophical questioning. Sometimes I call it the skeptical microscope. This is precisely the point of Descartes’ Meditations on First Philosophy. He’s not asking the reader to “meditate” regularly on philosophical questions. It’s a call to place our most cherished beliefs under the skeptical microscope. At least once. If those cherished beliefs stand up to such scrutiny, fine, keep them. If they don’t, well then, you’d better be ready to abandon them. Or at least suspend your belief. (It’s a bit more complicated than this, but you get the idea.)

After approximately 10 years of solid training, I knew that I had become a “philosopher.” As discussed above, this meant asking and answering (or attempting to answer) the deepest questions related to metaphysics, epistemology, ethics, etc. Others would say that I had simply learned how to bullshit. I would correct them and point out that I had learned how to bullshit with parameters–albeit parameters that have been developed over millennia. (By parameters, I mean methods of investigation, techniques for evaluating arguments, conceptual distinctions, and also the terminology needed for clear expression and communication.) Don’t get me wrong. I have some admiration for bullshitters; bullshitting doesn’t come naturally to everyone, just as sitting down at a piano to play with no training and no understanding of piano-playing “rules” can be daunting, to say the least. In other words, to engage in an activity ignorant of, or deliberately in spite of, the rules normally associated with that activity is something that relatively few are naturally comfortable doing. Now, it can be extremely difficult to learn those rules, but as with piano-playing, bullshitting becomes more pleasurable when there are parameters. And another interesting phenomenon occurs: at some point, this “bullshitting” no longer feels like bullshitting. At some point in developing one’s capacity to bullshit–I mean philosophize–about the deep questions, this philosophizing becomes relevant and even useful.

To explain, here’s a history of my own relationship with bullshitting:

My first three years of college were basically a bullshit fest for me. In class, I talked a lot; outside of class, I wrote a lot (very little of which had anything to do with the assigned readings). And I basically had no clue that most of what I spouted was actually bullshit. Consequently, I don’t freak out if a student doesn’t do the assigned readings. This doesn’t mean that the student isn’t interested in the subject matter or isn’t engaged in class. Still, I wish that I had followed my professors’ instructions better. For when I went to graduate school, I had a lot of catching up to do. Bullshitting only gets you so far in such an environment. And if it does happen to succeed, you’re in a crappy graduate program. There are some; trust me.

But, fortunately, my bullshitting was “called out” in my senior year, and not even by a professor in philosophy. I went to an English professor’s office hours to talk about some paper or project that was due soon. I’m not sure how it came about, but in talking about how I was doing, she noted that I knew something about most subjects, but nothing well. I was a dilettante. That struck home, because immediately I knew it was true. In college, one can slide by with dilettantism, and I was the prince of the dilettantes.

Before this encounter, I never thought of my bullshit as such. After this encounter, I did. And so I vowed to focus on learning my chosen trade, philosophy. I needed discipline and direction; I could no longer just “wing it.” Of course, this is easier said than done and I still just wing it on occasion.

So, today, when I encounter a student who clearly likes to bullshit and has gotten away with it because they know enough to make it stick, I call them out too. I say words to this effect: You’re a bullshitter, which is cool. I can respect that. However, while it has worked up till now, at some point it won’t. Do the readings. Focus. Get some discipline. Learn the parameters of your discipline. And then your “bullshitting” will be even more pleasurable and, even better, might become relevant.


(Photo courtesy of Chino Chasakara 2015)


The Tusk Question

“What do I really think when a student asks a dumb question?”

Let’s get something out of the way first. The cliché “There’s no such thing as a dumb question”–taken literally–was coined by someone with an overly active imagination or an overly charitable character. Consider the following example from one of my Ethics classes. The topic was reproductive cloning. We were speaking of the process of somatic cell nuclear transfer, in which the DNA-containing nucleus of an ovum or egg cell is replaced with the DNA of a somatic or body cell of the animal to be cloned. After cell division is “jump-started,” the developing embryo–in this case, the “clone”–is transferred into the uterus of a surrogate mother. Now, some scientists, keen on bringing back extinct species, want to bring back the Woolly Mammoth using recently discovered Mammoth DNA. Provided the DNA is sufficiently intact, this can be done with reproductive cloning. (The movie Jurassic Park is not based entirely on fantasy.) One proviso, of course, is that both the egg cell provider and the surrogate mother would have to be from a living species, presumably the African elephant. A student immediately raised his hand. “Yes?” I asked. The student replied, “What about the tusks?” After a moment’s hesitation, other students started laughing. I may have as well. (For my students: if you were present in my class that day, what was my reaction?) Let’s call it the Tusk Question.

To be fair, there are contexts in which the Tusk Question would not be a dumb question. For instance, there was a period of time–a long period of time–when we knew little about reproduction and fetal development. (“Back in the day” many scientists used to think that human sperm were tiny humans. It’s called preformationism, if you care.) Or when a student isn’t realistically expected to know that tusks grow after birth, or that African elephants have tusks. But this was a college student, and a relatively intelligent one at that.

It’s easy to get frustrated by dumb questions. One of my favorite professors at the University of Washington, Robert Coburn, who normally was the most imperturbable guy in the world, would on occasion get perturbed when faced with a dumb question. Sometimes he would reply as follows: “Just think about it for two seconds; I’m sure you’ll figure out the answer.” Just so you know, that is verbatim. Other teachers simply gloss over or even ignore outright the dumb question, especially when there are questions from other students.

I was seriously tempted to use Coburn’s approach in answer to the Tusk Question. Instead I used a gentler variation, by asking the student some simple questions: “When do tusks appear?” and the like. It only took a minute or less for the student to figure out that his original question was pretty dumb. And then he laughed at himself. That’s the approach I typically take. Often, though, I will ask the student to repeat the question or to restate it. Perhaps I’ve misheard or misinterpreted the question.

Some say that there’s no such thing as a dumb question; there are only dumb people. The thought is that, in the right context, any question can be perceptive and on point. Indeed, as pointed out above, in some contexts even the Tusk Question makes sense. There is some truth to the claim that it’s people who are stupid, not questions themselves. Let me tell you a true story. Once Elie Wiesel visited my local high school and gave a lecture in the auditorium. Student attendance was mandatory. He spoke eloquently and powerfully of his experiences as a prisoner in several concentration camps during World War II, including Auschwitz, and also of the Holocaust in general. When he finished, Wiesel called for questions from the largely student audience. One student near the front on the right-hand side stood up and asked, “What is a Jew?” Wiesel became angry and proceeded to berate the student for what seemed like five minutes. He clearly presumed that the student was simply being an asshole. But I knew that student, and to this day I remember his name. He was a gentle soul. I knew, and many of my fellow students knew, that the question was sincere. He truly had no clue what a Jew was. Wiesel presumed that the student was intelligent. This was a faulty presumption, and it naturally led him to berate the poor student. I want to make another point as well. Asking what it means to be a Jew is not a bad question. My grandmother was Jewish by blood but Eastern Orthodox by religion. Others are followers of Judaism but have no Jewish blood in their veins. “What is a Jew?” is a good question, though surely the student could have and should have introduced it differently, since it’s also a sensitive question. He, however, simply wasn’t very intelligent.

I think we need to recognize that there are dumb questions as well as dumb people. But this doesn’t mean that as teachers our default approach is to treat students and their questions this way. What should the default approach be? I employ the Principle of Charity in class. If a student says something that can be interpreted in more than one way, interpret it in the way that presumes higher intelligence on the part of the student. I discuss this principle in class as something that we, as teachers and students, should employ when we engage with others. At the same time, charity only goes so far. Sometimes you’ve got to “call a spade a spade.”


20,000 Questions

Leibniz, the famous 17th-century German polymath, seemingly did everything under the sun. One thing he never did was teach. Sure, he “taught” in the sense of explaining his ideas to others, but he never tutored anyone in the proper sense and certainly never taught a class of students. Perhaps he would have made an excellent teacher, but we’ll never know. If Leibniz was a polymath, I am a dilettante. However, there is one thing that keeps me from complete dilettantism: teaching, especially the teaching of philosophy. The content of this blog will be determined largely by the questions my students, friends, colleagues, and strangers ask of me regarding teaching in college. Throughout the life of this blog, I will answer as many as I can. In no particular order, here are some I have gathered:

How is this applicable to my life? 

How can one teach a college student to think cogently on their own without professors infecting them with their bias and prejudice? 

Why did you decide to teach? Relatedly, how would you respond to the cliché: “those who can, do; those who can’t, teach”?

What am I really thinking when people ask dumb questions?

How do you engage students uninterested in your subject?

How does your ego handle that some people really don’t give a shit about your lifetime academic pursuit? 

Does teaching twenty-somethings (mostly) make you really miss–or really really not miss–being in your twenties?

How do you write a good lesson plan? 

Is this going to be on the test?

Are you single? 

Does your curriculum ever get boring?

Why is teaching so undervalued and therefore so mediocre at many “institutions of higher learning”? 

What is your advice for choosing universities? For high schoolers, transfer students, and those going on to graduate school.

What are reasons why you should study philosophy, and reasons why you shouldn’t? 

What did you learn teaching at a Catholic institution of learning? 

What improvements has philosophy made in the last century, and which ones should it make in the next? Same for pedagogy.