For me the best books make connections: connections to other good books, ideas, and people. The best of these books also make me want to take action of some sort; in some cases, just the simple act of writing a review like this one. Mónica Guzmán’s book I Never Thought of It That Way is just such a book. It’s at times like this that I wish I had a larger platform on which to share the ideas she discusses. Hers is the exact message we all need to hear right now, and she offers insights and concrete steps we can use to begin talking to each other instead of talking at each other.
Part of what makes talking to each other so difficult is that we have become isolated from the very people we’d most benefit from engaging with, through what she refers to as sorting, othering, and siloing. We have become very good at sorting ourselves into groups according to our beliefs, and social media makes this even easier. We view those who differ from us as some mysterious “other,” so different from us that there seems to be virtually no point in engaging with them. This sorting and othering lead us further into our own enclaves, where we get all our information from those who already agree with us. These silos are more than simple echo chambers: they suck us down into a hole that gets more and more difficult to climb out of.

For the remainder of this review, I’m going to share some of the quotes from the book I find most compelling, but really it is worth reading the entire book. What I provide here is just a snapshot.

“Harvard scholar Cass Sunstein found that on issues as diverse as gun control, the minimum wage, and environmental policy, people who talk to folks who share their positions both intensify those positions and want to take bigger, bolder steps to address them.” This very nicely explains why we continue to become more polarized. There’s a vicious cycle at work: our propensity to listen only to those who already agree with us makes our own views that much more extreme, which in turn makes it less likely that we will listen to alternatives or consider views that differ from our increasingly extreme positions.

“In her book The Happiness Hack my friend and neuroscience educator Ellen Petry Leanse explains what happens to your brain when you spend a lot of time with folks who reflect your own beliefs back to you. Basically, you stop thinking about those beliefs at all.” Once you stop thinking about your beliefs, you stop thinking that they need to be examined.
You no longer consider the possibility that those beliefs are not accurate representations of the world around you but simply filters through which you perceive reality. You no longer realize that any belief, no matter how well-founded, is going to distort reality in some ways.

“In one study of how federal judges make their calls in the courtroom, liberal judges’ opinions moved to the left when they worked with other liberal judges, and conservative judges’ opinions moved to the right when they worked with other conservative judges. But when liberal and conservative judges worked together, that dampened their ideological tendencies. In both cases, what made the difference wasn’t exposure to information or education, but people.” This relates to the Sunstein point made above. While many liberals and conservatives might not see a problem with this, in reality, when we don’t work with people who differ from us, our “solutions” become both more extreme and less viable. For most of our problems, the solutions will have to take input from all sides in order to really work.

Another key point here is what we can do to move closer to people who differ from us. Exposure to information and arguments won’t do it. Those memes you share on social media won’t either. What can make the difference is simply spending time with people and learning about them. When you are faced with a claim you disagree with, don’t rush in to refute it. Instead, ask questions like “What am I missing?”

“As the legendary broadcast journalist Edward R. Murrow once quipped, ‘Anyone who isn’t confused doesn’t really understand the situation.’” “The problem isn’t the partial answers we’re always collecting from a variety of sources in our busy lives. It’s the questions we stop asking because we think we’ve learned enough.” If you believe you already understand the issue and know what to do, it probably means there is something you’re missing.
It’s time to get curious and learn more than you think you need to know. What experiences have others lived through that brought them to a different perspective on this issue than mine?

“People are mysteries, not puzzles. This means we can never be sure about them. But we can always be curious.” She discusses Ian Leslie’s useful distinction between puzzles and mysteries. As Leslie points out in his book Curious, “Mysteries are murkier; less neat. They pose questions that can't be answered definitively because the answers often depend on a highly complex and interrelated set of factors, both known and unknown.” There’s more to learn about people and their lives, and learning it can not only help us connect with them but also help us understand where they are coming from on any given issue.

“It would take trust. It would take struggle. And more than anything, it would take rejecting the certainty that any one of them has all the information they need.” We need the people who are different from us to really get at the truth about these issues. There are experiences we haven’t had that inform their outlook, and we can benefit from learning about them.

More reasoning doesn’t work to resolve conflict. Researcher Jonathan Haidt and others have identified “a funny little quirk in how we approach different kinds of ideas. When we encounter ideas that line up with our existing beliefs, we silently ask ourselves Can I believe it? We look at the evidence presented to us, consider it on the merits, and see if the points add up to a belief we can feel good about. When we encounter ideas that challenge our beliefs, though, we ask ourselves something else: Must I believe it? And when we ask Must I believe it? it means our intuition is resisting.” We start to look for “one good reason–just one will do it–to dismiss the entire offending concept.”

“When you feel you’ve won online, you’ve rarely changed anyone’s mind.
Instead, you stand as the triumphant king of a lonely land smoldering with the ashes of people you’ve decimated with your words who are less likely than ever to listen to your side again.” This quote from Elizabeth Saunders illustrates an important point. As you prepare to post that perfect meme on social media, the one that will really slam the other side, ask yourself a question: will doing this build a bridge for communication, or will it chip away at yet another opportunity for listening?

Social psychologist Shalom Schwartz studies human values. His research indicates that “there aren’t an infinite number of human values out there. There are only ten: stimulation, hedonism, achievement, power, benevolence, universalism, security, conformity, tradition, and self-direction.” Conflicts are often cast in terms of one side accusing the other of having no values at all. But in reality, conflicts usually arise because different sides order these values differently. “Neither is likely to see that the other is motivated by a different set of top-ranked values. And that’s how each of them will miss truly seeing the other: they will mistake a different ordering of values for an absence of the ones that they think matter most.” This gives rise to a common but mistaken belief: “If you’re not motivated by this thing I consider good, you must be against it.”

“Every tough issue that divides and exhausts us–abortion, immigration, gun regulation, you name it–divides and exhausts us precisely because it puts some fundamentally good values into tension with one another. That tension reveals the most confounding thing of all: trade-offs.” What’s worse is what she calls “the sucky part,” which is that for most intractable issues, “we can’t neatly resolve this tension between values or the trade-offs they reveal.” The best we can do is find a balance that works for now. Inevitably, that balance will have to be re-examined many times in the future.
That’s just the nature of how trade-offs among values work. “We all want to solve our problems for good. I mean, who wouldn't? Seeing things from our own individual point of view, it can seem so darn possible. But as long as those problems sit on a fault line between fundamental values in tension with each other, and we live among people who (a) walk different paths, and (b) apply their values to problems in a different order (spoiler alert: both are always the case!), good solutions are going to be a lot more elusive than we think. We may each come up with a solution where the trade-offs seem perfectly acceptable…to us. But what about everyone else?”

Ultimately, we have to be willing to reach out and really communicate with people. We need to become more curious about individuals, not as representatives of a set of beliefs or political groups, but as people whose own life experiences have led them to the outlook they currently hold. Doing so means we will discover points of contact where we share values in common. We will also discover points where we differ in how we order those shared values. We need to remember that we share values even though we might order them differently. And we need to recognize that this difference will always be there, requiring us to make trade-offs if we’re ever going to address issues and problems effectively. This also means that we will never be done addressing these issues. We will always have to revisit them and continue making trade-offs to accommodate changes in the world we share and the values we share.

As Guzmán puts it, when it comes to communicating with others, we have to build bridges and maintain them even if we don’t always cross them. What counts is that the bridges are there and available when we’re ready to use them. This image is useful when it comes to posting on social media. Are those comments or memes you’re posting building bridges or tearing them down?
You might not see an immediate benefit to bridge building, but it is there and it can make a difference. We also have to stay curious, which includes rejecting easy answers to difficult questions and embracing complexity. When we do these things, we open ourselves to experiencing what she refers to as an INTOIT moment. These are the moments when we can bridge the gap between ourselves and people who differ from us. We can really listen to them and learn from them. And if we’re lucky, we can experience a revelation of sorts and say, “I never thought of it that way!”
Hello everyone and welcome to Improve Your Thinking. I’m your host, Kevin Browne, and today I want to talk about differences. I want to make two points that may seem to conflict with each other but are, I think, actually complementary. First, we are not as different from one another as we think we are. Second, the differences that do exist between us are beneficial, even necessary.
When I teach ethics, one of the questions we discuss is: Does everyone have different morals? Nearly every student answers yes. I find this an odd and inaccurate answer. But even when presented with evidence for the sameness of our moral principles, they remain unconvinced and continue to focus on the differences to the exclusion of any focus on our common moral foundation. Since this has some pretty important implications for our moral discourse, I’d like to consider some possible explanations for this focus on differences.

1. We confuse surface disagreements for deeper disagreements. It’s easy to see disagreements when we discuss issues such as gun control, abortion, euthanasia, and drug legalization. Could those who disagree about these issues really share common moral principles? I think that to have the conversation at all, they have to. Take the abortion debate. Both sides care deeply about the issue but are not as clear about their common moral agreement. But it is there. Both pro-life and pro-choice advocates believe in the moral principle that babies should not be killed for no good reason. I would go so far as to say that everyone agrees with this principle. The debate over abortion is not over a fundamental moral principle but over some important factual questions, such as whether the fetus should be classified as a baby. While this may be an oversimplification of the issue, I think it remains the case that both sides share much in common when it comes to their fundamental moral principles.

2. We assume that having common moral principles would eliminate all debate. The claim is often made that if we all had the same morals, there would be no debate about issues such as abortion, euthanasia, and so on. But is this true? It is put forward as if it were, but rarely is any evidence provided for the claim. I think it turns out to be false. To illustrate, consider an analogy to a simpler example: food.
It is a universal human requirement that we eat to survive. But even with this universal agreement, there is huge variance in what counts as food from one culture to the next. There are even debates about what should and should not be eaten. What explains these differences? The environment is an important factor, as are cultural traditions. But the point remains that even with a fundamental agreement, there are still debates. The same holds true for morality. As the philosopher James Rachels pointed out, there are several fundamental moral principles we all share concerning the care of the young, indiscriminate killing, and truth-telling. But while we all share these fundamental principles, we still debate such things as what counts as proper care for the young, who can and cannot be killed, and when it is permissible to lie.

3. We don’t want to dig deeper to see common underlying moral principles. It takes work and deep thinking to see the common moral principles that lie beneath the surface disagreements. It also takes a willingness to enter into thoughtful dialogue. The only way we can really discover the common values we share is to slow down, ask questions, and listen to the answers we get. Ultimately, this is a more valuable activity than trading insults and slogans.

One activity I encourage my students to engage in comes from a TED talk by Elizabeth Lesser titled Take the Other to Lunch. In this activity, you sit down with someone who holds a different view on some important issue and have a thoughtful conversation. As part of this conversation, you ask them to share some of their life experiences and ask them questions like: What issues deeply concern you? What have you always wanted to ask someone from the “other side”? The idea is not to persuade but to understand. Perhaps if more people tried this, they would see the common morality hidden underneath the surface arguments.
Another interesting perspective on the question of differences is offered by the research of social psychologist Shalom Schwartz, who has concluded that there are not an infinite number of values. In fact, there are only ten: stimulation, hedonism, achievement, power, benevolence, universalism, security, conformity, tradition, and self-direction. You can take the Human Values Test at IDRlabs.com to see how you rank these values. But how could there be only ten fundamental values? Aren’t there too many variations to believe the number is so low? Not really. Of course, there are variations in how researchers have identified and counted values. But the point is that the number is finite and fairly small. What accounts for the variation is how different people prioritize those values. One person might see security and conformity as their highest values, whereas someone else might rate achievement and self-direction highest. It’s not that they reject the other values entirely, but that they rate their importance lower. So when you are disagreeing with someone on a controversial issue, remember that you probably share many values in common and that your disagreement results from differences in how you rank those values. Of course, this won’t completely resolve the disagreement, but it does allow for the possibility of communicating and finding common ground. It’s a small step, but an important step in the right direction. It’s certainly better than thinking the people who disagree with you have no values at all and are evil. When you adopt that mindset, there is little hope of communicating and coming together. When we see that the differences in our values are differences of degree and not differences of kind, we can also recognize the value in those differences.
Without these differences, we succumb to what Matthew Syed calls “collective blindness.” In his book Rebel Ideas: The Power of Thinking Differently, he offers one of the most compelling arguments for the importance of diversity I have come across. He argues for cognitive diversity by showing that “most of the challenging work today is undertaken in groups for a simple reason: problems are too complex for any one person to tackle alone.” We need the diversity of perspective and thinking that groups offer. But not just any group will do. If a group is comprised of people with similar backgrounds, education, and experiences, it is not going to be cognitively diverse enough to generate the kinds of ideas and solutions needed.

In practical terms, think about the groups we often see trying to address social problems: groups comprised mainly of politicians. And not politicians from a wide range of backgrounds, but politicians grouped according to allegiance to specific ideas. What’s worse, these groups purposely exclude people who value different ideas. These groups sort themselves according to how similarly they rank-order values, and that virtually ensures they will lack the cognitive diversity required to handle any of the complex problems we face. Recognizing this means recognizing that working together is not just a nice ideal to aspire to. It is an essential requirement for any hope of success.

Of course, there's much more to say on this topic, so stay tuned for next week’s episode, where I discuss a very insightful new book by Mónica Guzmán that offers some practical tips for bringing people together. If you’ve enjoyed this episode, please subscribe to the podcast and visit me online at kevinjbrowne.com. Thank you for listening, and I’ll see you on the next episode.

Hello everyone and welcome to Improve Your Thinking. I’m your host, Kevin Browne, and today I want to talk a little bit more about influence.
Specifically, the invisible influences that shape many of our beliefs and behaviors.
Let me paint you a picture of how you think about any given issue. You search for information, do research, look at all the facts, examine your own values, and in light of all that decide your position. You don’t allow irrelevant factors to influence your position; what you think about an issue depends on the evidence. You’re an independent thinker, not easily swayed by the opinions of others. You certainly don’t let the fact that people you like or dislike express different views have any sway over your own deliberations. And, of course, whether you think about these things in the morning or the evening has no impact on your thinking. Neither does the fact that you haven’t had lunch yet, or dinner, or that you recently had a fight with your partner. Those things are clearly irrelevant to the issue at hand, and you recognize that. Therefore, they have no influence on your deliberations.

Does that sound about right? It sounds right for how I think about things! If you’ve been listening to my podcast for a while, you know what I’m about to say, don’t you? This view is almost entirely wrong. In reality, those quote-unquote irrelevant factors have more influence on you than you know. As Jonah Berger points out in his book Invisible Influence, “without our realizing it, others have a huge influence on almost every aspect of life. People vote because others are voting, eat more when others are eating, and buy a new car because their neighbors have recently done the same.” In fact, he points out that “99.9% of all decisions are shaped by others. In fact, looking across all domains of our lives, there is only one place we don’t seem to see social influence. Ourselves.” I want to examine this idea by discussing three examples of influence that he addresses in the book. Let’s start with one of the most famous studies of influence, conducted in the early 1950s by Solomon Asch.
In the study, he brought a group of students together and told them he was doing a vision test. In reality, he was testing the power of social influence and conformity. Only one student was actually a true subject; the others in the small group of 5-7 had met with Asch beforehand and been instructed to give wrong answers. Students were shown a card with three lines of clearly different lengths labeled A, B, and C. They were then shown a second card with a single line and asked to identify which of the three lines matched it. The clear answer was that line C matched. As each student was asked for their answer, the ones instructed to give the wrong answer did so. Finally, the true subject was asked for their answer. To Asch’s surprise, a large portion of test subjects conformed to the others in the group who had given the wrong answer; around 75% conformed at least once over the course of the trials.

Let’s be very clear here. In a case where the correct answer was crystal clear, some 75% of subjects voiced the wrong answer at least once simply because others had answered that way. The need to conform overcame the evidence of their own eyes. Of course, this raises a number of questions. If conformity is so high in a case like this, what about cases where the correct answer is not as clear? And is there anything that can be done to break this spell of conformity?

Before addressing the first question, let me say something reassuring about the second. In further trials of this experiment, one of the students was coached to dissent from the group and give a different wrong answer. In such cases, conformity from the test subject dropped dramatically; in some cases, the conformity rate dropped to zero. This demonstrates the positive power of influence, which I’ll come back to.

What about cases where the answer is less clear? In his book, Berger examines such cases. Consider this example: “Suppose you were asked to vote on a new welfare policy.
It offers $800 a month for families with one child and an extra $200 a month for each additional child. In addition, it provides full medical insurance, a job training program, $2,000 in food stamps, extra subsidies for housing and daycare, and two years of paid tuition at a community college. Benefits are limited to 8 years, but the program would guarantee a job after benefits ended and would reinstate aid if a family had another child. Would you be in favor or opposed to such a policy?”

Not surprisingly, when researchers posed this question, most people who identified as liberals favored the policy, and most who identified as conservatives were against it. But here’s the twist. When Stanford professor Geoffrey Cohen asked this question, he presented some conservatives with one additional piece of information: he told them that the policy was supported by 95% of House Republicans and that Republican lawmakers felt the policy “provides sufficient coverage without undermining a basic work ethic and sense of personal responsibility.” What happened when they were asked about their support? The conservatives loved the policy. Simply being told that other like-minded people supported it was enough to change their view.

Now, don’t think this influence affected only the conservatives! When the liberals were given a stringent welfare policy and told that other Democrats endorsed it, they favored it as well. And when people were asked whether the fact that other Democrats or Republicans favored the policy had influenced their judgment, they said it barely mattered at all. As Berger points out, they were wrong. People’s attitudes weren’t just slightly nudged by being told what other like-minded people thought about the policy; they were completely flipped! And if you’re like most people who learn about these studies, you are now saying to yourself something like this: Sure, those other people were influenced. But that doesn’t happen to me!
And, just like those other people, you are wrong about that. You are being influenced. So am I. But is this necessarily a bad thing? Perhaps not. Let’s look at some of the benefits of such influence.

In his book Ethics and the Limits of Philosophy, Bernard Williams points out the importance of setting priorities when addressing ethical values. I think this makes an important point about our implicit ability to recognize right and wrong. Some actions are immediately recognized as right or wrong before any moral deliberation, and this turns out to be a good thing. As Williams puts it, "an effective way for actions to be ruled out is that they never come into thought at all, and this is often the best way. One does not feel easy with the man who in the course of a discussion of how to deal with political or business rivals says, 'Of course, we could have them killed, but we should lay that aside right from the beginning.' It should never have come into his hands to be laid aside. It is characteristic of morality that it tends to overlook the possibility that some concerns are best embodied in this way, in deliberative silence." In many cases, that deliberative silence is the result of social influence.

There are a number of other examples of potentially positive influence as well. In fact, every example of invisible influence that seems negative can be turned into something positive. We are influenced by any number of factors in our environment, including our neighbors, friends, and acquaintances, so it makes sense to choose wisely where possible. We are influenced not only by the opinions of others but by their actions as well. So if you want to improve your health, hang out with people who are already healthy and practice good health habits, and let those positive habits influence you. Another good example of the potential for positive influence is discussed by Victoria Harrison in her book Happy by Design.
The homes we live in and the way we decorate them can influence our health, happiness, and well-being. This influence can be positive or negative, so it makes sense to learn how to shape our environment to realize the positive benefits. Another good resource on this topic is Ingrid Fetell Lee’s book Joyful: The Surprising Power of Ordinary Things to Create Extraordinary Happiness.

It’s funny how we all feel as if we are independent in our choices and beliefs. We believe that we arrive at our preferences, tastes, friends, partners, and opinions through conscious, deliberate choices. But we don’t. We are influenced in ways we can barely recognize. So it makes sense to learn about those influences and, where we can, shape them to be more positive. If you’ve enjoyed this episode, I’d like to influence you to subscribe and visit me online at kevinjbrowne.com. Thanks for listening, and I’ll see you on the next episode.

Hello everyone and welcome to Improve Your Thinking. I’m your host, Kevin Browne, and today I want to talk about influence.
I teach courses in logic on a regular basis. Almost any logic course presumes that reasoning is sometimes fallacious but that it is possible to improve your reasoning ability and, as a result, your decision making as well. In addition, by understanding how to evaluate arguments you can become better not only at spotting defective arguments but also at constructing them yourself. In other words, studying logic will improve your power to influence. As it turns out, that view is wrong. Perhaps a better way to put it is that it’s an incomplete picture of our thinking and of what you need to focus on to improve your influence. To learn what else you need to know, I highly recommend Zoe Chance’s new book Influence Is Your Superpower: The Science of Winning Hearts, Sparking Change, and Making Good Things Happen.

As a logic professor, there are several points in the book that resonated with me, particularly in the second chapter, titled Influence Doesn’t Work the Way You Think. She discusses Daniel Kahneman’s idea of two systems of thinking from his book Thinking, Fast and Slow, which he labels “System 1” and “System 2.” She helpfully relabels these as the “Gator” and the “Judge.” The Gator “is responsible for every cognitive process that’s quick and requires negligible attention,” including things like emotions, quick judgments, and pattern recognition, as well as any habitual behaviors. The label comes from the observation that much of a gator’s behavior is “habitual and relatively effortless.” That describes much of our thinking as well. The Judge “is responsible for every cognitive process requiring concentration and effort”: things like planning, calculating, and the work my students do to solve proofs in symbolic logic. This is how most of us like to view most of our thinking: we are sifting through facts and evidence and making judgments.
What’s interesting is that while we tend to think most of our decisions involve the deliberate, rational thinking of the Judge, in reality the Gator is responsible for much of our behavior and thinking, perhaps “up to 95 percent of our decisions and behaviors.” Understanding this point is key to understanding our own thinking and how to improve it. It’s also a crucial insight for improving your ability to influence others.

Why would you want to do that? Well, as Daniel Pink pointed out in his book To Sell Is Human, we are all in sales. Think about how much of what you do involves persuading and convincing others in some way. As a teacher, I am trying to persuade my students to value the class material and learn it. In other words, I’m trying to influence them.

So, if most of our thinking is done by the Gator, below the level of our “rational mind,” does that mean influence amounts to manipulation? Not at all. Of course, some people do try to influence through manipulation, and she discusses how to deal with these tactics in her chapter titled Defense Against the Dark Arts. If our thinking is done by gators and judges, then the manipulators are sharks, “willing to bully, cheat, manipulate, and deceive people to get what they want.” But that is not what she advocates in the book. Instead, she offers some simple, useful tools for improving your ability to influence while building relationships with people. A good example is her chapter on charisma, where she outlines two paradoxes of charisma: trying to be charismatic has the opposite effect, and the best way to attract other people’s attention is to give them yours. An important key to influence is not manipulating people but connecting with them. The same goes for negotiations, which most people see as an adversarial process where you’re trying to get something from someone else while keeping as much for yourself as possible.
But in reality, the best negotiations are collaborative and allow all parties to gain something. Doing this successfully involves asking the “Magic Question,” which she describes as her favorite influence strategy. It’s one of my favorites from the book too. The question? “What would it take…?” Starting your question this way naturally invites the other person to think in terms of solving a problem together. It fits nicely with a point she makes earlier in the book: if you want “to become more influential just ask. Ask more often, ask more directly, and ask for more.” The worst outcome is that the other person will say no. But you’ll be surprised how often people say yes, or at least meet you halfway. Sometimes their answer will even exceed your expectations. But you’ll never know until you ask.

Most logic textbooks define logic as the science of evaluating arguments. Implicit in this approach is a distinction between the logical and the psychological. In fact, most logic textbooks reference the psychological in a chapter on logical fallacies, the mistakes people make in their reasoning. The implication is that while these psychological tactics may be effective, they should not be used since they violate the principles of good critical thinking. Indeed, some psychological tactics do just that and should not be used. But Zoe Chance’s book shows us that not all psychological insights are manipulative. To be a good critical thinker, you need to understand how your own thinking works. To have influence, you need to understand how to connect with people in a genuine and authentic way. As a professor, I found many useful insights in her book that I will be integrating into my logic and critical thinking courses. As a musical artist, I think I can apply her insights to help my music reach a larger audience.
As a human being who wants to win hearts, spark change, and make good things happen, I believe this book will help me unlock my superpower to influence for the better. It can help you unlock your superpower to influence as well. An interesting aspect of influence is the willingness to change your own mind. Influencing others really amounts to wanting to get them to change their minds. It’s much easier to do this if you are open to changing your own. Unfortunately, changing one’s mind has a bad reputation. We ridicule politicians as “flip-floppers” for changing their minds. We overvalue the idea that one should stick to their beliefs regardless. I’ve had students tell me that there are certain opinions they will believe no matter what. Presumably, that includes evidence that clearly shows their opinion to be wrong. It occurs to me that this attitude really makes you less influential. I mean, how open would you be to listening to someone and being influenced by them if they told you that nothing could persuade them to change their mind? This relates to Zoe Chance’s point in the book that in order to attract other people’s attention you have to give them yours. If you want to influence others, you need to demonstrate that you can be influenced as well. Another interesting perspective on being open to changing your mind is offered by Adam Grant in his book Think Again: The Power of Knowing What You Don’t Know, where he says: “Who you are should be a question of what you value, not what you believe. Values are your core principles in life–they might be excellence and generosity, freedom and fairness, or security and integrity. Basing your identity on these kinds of principles enables you to remain open-minded about the best ways to advance them. You want the doctor whose identity is protecting health, the teacher whose identity is helping students learn, and the police chief whose identity is promoting safety and justice.
When they define themselves by values rather than opinion, they buy themselves the flexibility to update their practices in light of new evidence.” We should encourage people to update their views and practices in light of new evidence. If someone is not doing this, it means they are not learning anything new. It also means they are not going to be very good at influencing others. If you enjoyed this episode, I hope you will subscribe and join me online at kevinjbrowne.com. Thanks for listening and I’ll see you on the next episode. Hello everyone and welcome to Improve Your Thinking. I’m your host Kevin Browne, and today I want to talk about cognitive biases and fallacies.
Here’s an example of something you’ve probably done before. You’re shopping at Amazon.com and you have $30.00 worth of merchandise in your shopping cart. When you check out your total (with shipping) comes to $37.00. You can get free shipping if you spend at least $35.00 so you look for another item to add to your cart. You find one priced at $12.00 and add it. Now your total (with free shipping) comes to $42.00. So, have you saved any money? No! You’ve spent more merely to get free shipping. This is a perfect example of what economists call irrational behavior. You were trying to save money and in the process ended up spending more. The interesting thing about this example, and many others we could cite, is that they are predictable. That is, we all are susceptible to cognitive biases, but our responses to them are not random. We are, to use the title of Dan Ariely’s book, Predictably Irrational. A common viewpoint students express in the classes I teach is that everyone thinks differently. Within a narrow range of options, this is true. But, in the larger scheme of things, it is very definitely false. Ariely’s research shows that the kinds of mistakes we make in our reasoning process are not random and not all that variable. In fact, the entire idea of cognitive biases is based on the idea that our thinking patterns have more in common than they do differences. There should be nothing very surprising about this since we are all wired up the same as human beings. Our brains are configured pretty much the same and the variations occupy a relatively narrow range. So, if you’re like most people you don’t think you’re like most people. But, you’re wrong about this! Here are some other things you probably think about yourself that you are quite likely wrong about. Don’t feel bad about this because you’re not alone. We’re all wrong about many of these things and we’re all wrong in basically the same way. You probably think you’re an above-average driver. 
In fact, the majority of people think this. But, it can’t be the case since “above average” is a category the majority cannot fit into. You may also think you have above-average intelligence and are above average in how you look. Again, most people think this way and most cannot be right about it. If you’re like most people, you probably overestimate how well you can plan things, underestimate how much time you will need to get a given task done, and procrastinate more than you think you do. What behavioral economists like Dan Ariely do is study how we are predictably irrational and how to improve our decision-making given our propensity to succumb to cognitive biases. Finally, if you're like most people you easily fall prey to confirmation bias. This is one of the most important cognitive biases to understand and guard against. It is the propensity we all have to look only for evidence in favor of our beliefs and to ignore the evidence against them. This occurs in most areas of debate. If you follow political discussions you have seen this (and probably also fallen prey to it). You know the arguments for your belief and you think they are good arguments. You also "know" that the arguments for the other side of the debate are bad arguments and the people who hold these beliefs are ill-informed. But, this is what the confirmation bias leads us to think. In fact, most people don't have a clear understanding of both sides of whatever issue they are debating. They don't realize that there are good arguments for the other side and that people hold these positions for good reasons. That doesn't mean that all arguments are equal and that there aren't bad arguments. But, we tend to see the bad in ideas that other people hold, not our own. This can be very dangerous, especially when this bias leads us to make bad decisions in our own lives or our social policies. In light of this, let’s consider some common mistakes in reasoning: what we call fallacies.
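Before moving on to fallacies, the free-shipping example from the earlier shopping story can be checked with a few lines of arithmetic. This is just an illustrative sketch using the hypothetical prices from the example:

```python
# A minimal sketch of the free-shipping arithmetic from the shopping example.
# All prices are the hypothetical ones used in the example.

def total_cost(merchandise: float, shipping: float, free_ship_threshold: float) -> float:
    """Return the checkout total, waiving shipping at or above the threshold."""
    if merchandise >= free_ship_threshold:
        return merchandise
    return merchandise + shipping

# Original cart: $30 of merchandise plus $7 shipping.
original = total_cost(30.00, 7.00, 35.00)  # 37.00

# After adding a $12 item to qualify for free shipping.
with_extra_item = total_cost(30.00 + 12.00, 7.00, 35.00)  # 42.00

# "Free" shipping ended up costing an extra $5 at checkout.
print(original, with_extra_item, with_extra_item - original)
```

The point the numbers make is exactly the one in the text: chasing the free-shipping threshold raised the total from $37 to $42, so the "saving" cost $5.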
We should attend to these for two reasons. First, we want to be able to identify these mistakes if they occur in arguments we hear. Second, we should familiarize ourselves with fallacies to avoid committing them ourselves. I won’t address all of the fallacies usually covered in a logic course, but here are four of the most common. 1. Ad Hominem: (Sometimes referred to as argument against the person.) In this fallacy, the arguer attacks the person instead of attacking the person's argument. It is often very difficult to keep these two separate. Does the following seem like a good argument? Bill Gilmore has argued for increased funding for the disabled. But nobody should listen to that argument. Gilmore is a slob who cheats on his wife, beats his kids, and never pays his bills on time. In fact, it is not a good argument because instead of criticizing Gilmore's argument for funding the disabled, the criticism is about Gilmore himself. It is irrelevant to his argument whether he does any of those other things. While they are reprehensible, the fact that these things are true does not necessarily mean his argument for increased funding for the disabled is a bad argument. If you listen to political arguments, you often hear people criticize others for their motives, call them evil, or just insult them. All of these are examples of the argument against the person fallacy. While this can be an effective tactic for motivating people who already agree with you, it does nothing to address the real arguments on any given issue. 2. False Dichotomy: This fallacy occurs in arguments that present only two choices. For example: Either I get into law school or my life is over. This would be an example of a false either/or statement. Usually, it is obvious what conclusion should be drawn, so often the argument is not completely stated. Many political issues are framed as either/or choices. You’re either pro-life or pro-choice; you’re either for gun control or against it.
Framing arguments this way is very persuasive but also often misleading, as it hides the possibility of a third option that might be both more reasonable and more achievable as a solution to the problem. 3. Begging the Question: This fallacy occurs in two ways. First, it can occur when you leave out a questionable premise in your argument, a premise that is required for the argument to work. Here’s an example from a logic textbook: “It’s obvious that the poor in this country should be given handouts from the government. After all, these people earn less than the average citizen.” This argument is not answering a key question. In other words, it is begging the question. In this case the question is: Just because the poor earn less than the average citizen, does this imply that the government should give them handouts? Notice that the answer to this question could be yes or no. The problem with the argument is that the question is not addressed at all, and so no examination of the evidence either way has been considered. A second way this fallacy is committed is by arguing in a circle, having the conclusion of your argument serve as one of the premises as well. This is often referred to as circular reasoning. Here’s another textbook example: Capital punishment is justified for the crimes of murder and kidnapping because it is quite legitimate and appropriate that someone be put to death for having committed such hateful and inhuman acts. This argument might sound persuasive until you examine it closely. To say that something is justified is the same as to say it is legitimate and appropriate. Claiming that something is legitimate and appropriate is not proving that it is justified but merely restating it. 4. Red Herring: In a red herring, the critic simply drifts away from the original subject, hoping that the audience will forget what the original argument was!
Here's an example: Environmentalists argue that the use of pesticides on fruits and vegetables is dangerous to our health. But, fruits and vegetables contain many essential nutrients that can prevent disease and promote health. According to the FDA, one of the best sources of vitamin C is orange juice, and vegetables like broccoli contain a healthy dose of minerals such as iron. Clearly, fruits and vegetables are important to our health. Now, what was the original argument? It was about pesticides. But, then look what happens. The critic proceeds to change the subject to talk about vitamins and minerals. This has nothing to do with whether the environmentalists are correct about the dangers of pesticides! Classic red herring fallacy. There are many other fallacies, but these are some of the most common ones you’re likely to hear in political debates. Identifying them is an important step in the process of thinking like a philosopher, which means that you ask more questions and demand better answers. You also demand that arguers stop using these fallacies and address substantive arguments and evidence. If you’ve enjoyed this episode, I invite you to subscribe and visit me online at kevinjbrowne.com. Thanks for listening and I’ll see you on the next episode. Philosophy is sometimes seen as an abstract discipline that can be mastered only through reading obscure and difficult-to-understand texts. But, philosophy can provide us with useful practical tools to improve our lives and address problems as they arise. Here are some excellent examples of philosophical ideas that can improve your life by improving your thinking.
Questions: Philosophers ask questions. It’s what they do best. But, not every question is philosophical. A good philosophical question digs below the surface to examine underlying assumptions that are not often acknowledged. One of the best examples of such questioning was Socrates. He was fond of asking questions like, “What is beauty?” “What is justice?” His questions seem so easy to answer on the surface, but as a dialogue with him continued, the questioning would soon reveal hidden assumptions and contradictions. One of the best uses of this kind of questioning is where there seems to be complete agreement. This may seem counterintuitive, but in many such cases that agreement is masking an underlying lack of clarity which can be explored by some strategic Socratic questioning. Doubt: The 17th-century philosopher Rene Descartes famously began his philosophical investigation with doubt. He attempted to doubt everything to discover if there could be a foundation of knowledge that was indubitable. He found that foundation in the very act of doubting, recognizing that if he was thinking, he had to exist. That could not be doubted. This is what he meant by his famous pronouncement, “I think, therefore, I am.” The 18th-century British empiricist David Hume continued this tradition of doubt with his skeptical approach to knowledge. He advised that the wise person will “proportion his belief to the evidence.” If the evidence for a claim is weak, we cannot be justified in holding a strong belief about it. In a world of “fake news” and numerous sources of information, this advice can come in very handy. We don’t have to doubt everything, but a healthy dose of skepticism is often warranted when presented with claims that seem too good or too outlandish to be true. Categories: The 18th-century philosopher Immanuel Kant began his philosophical investigations by trying to address Hume’s skepticism. In doing so, he hit upon an insight that still influences psychology today.
His idea was that our minds act as a filter through which we perceive sense experience. These filters he called the “categories of the mind.” They provide the structure for our knowledge of sense experience and include things like space, time, and causality. We impose an order on sense experience that may not really be there. In practical terms, we each bring our own unique perspective to whatever problem or situation we are facing. Remembering this can be very useful. We too often assume that people see the world just like we do, and while there are some common elements in our perspectives as human beings, there are also important differences that must be addressed in order to ensure good communication. Wants and Needs: If you want something, does that mean you need it? For many people, the answer seems to be yes. What’s worse is the all-too-common mentality that if I want something, I deserve it. The ancient Greek philosopher Epicurus provided a useful antidote to this kind of thinking. For Epicurus, there are three categories of desires. The first is natural and necessary and includes the basics: food, clothing, and shelter, as well as friendship, freedom, and thinking. We require these to be happy. And, for Epicurus, these are all that we require to be happy. But, there are also natural and unnecessary desires (wants that become confused with needs). These include many of the luxury items in our lives: fancy food and clothing, big houses, fancy cars. All of these we desire but do not require to be happy. The problem occurs when we believe we need these things, purchase them, and then find that we have to work to afford them, and in the process discover we are no happier as a result of acquiring them than we were without them. Third, Epicurus identifies the unnatural and unnecessary desires for power and fame. These may be at the root of our belief that natural but unnecessary desires are really needs.
After all, if you strongly desire fame and power, you will deduce that you need those things that Epicurus classifies as natural but unnecessary. Attitude: The Greek and Roman Stoic philosophers were masters of cultivating a good attitude towards life, especially life’s problems. The Stoics recognized that it was important to distinguish between what you can control and what you cannot. Surprisingly few things are in your direct control; chief among them is your attitude. You cannot control what other people think or do, but you can control your attitude towards them and their actions. As Epictetus pointed out, “People are not disturbed by things, but by the views which they take of things.” The Roman Emperor and Stoic philosopher Marcus Aurelius echoed this sentiment, saying, “The happiness of your life depends upon the quality of your thoughts.” The Stoics also offered useful advice for dealing with adversity. We must recognize that problems are an inevitable part of life. But, we can cultivate an attitude that meets these problems as challenges. If we can see problems as opportunities to cultivate a proper attitude and as tests of our virtue, we can better cope with whatever difficulties arise in the course of our life. As you can see, philosophical ideas can be very insightful and practical. While some philosophers write abstract, difficult-to-understand texts, many others specifically wrote and taught to help people improve their lives. As Epicurus once said, “Vain is the word of a philosopher which does not heal any suffering of man. For just as there is no profit in medicine if it does not expel the diseases of the body, so there is no profit in philosophy either, if it does not expel the suffering of the mind.” Hello everyone and welcome to Improve Your Thinking. I’m your host Kevin Browne, and today I want to talk a little about problem solving.
Problem-solving is one of those skills that are in high demand. Many employers rate it as one of the most important traits they look for in prospective employees. It is also usually listed as an important learning outcome in many of the classes you take in college, including math classes and psychology classes. But, it is often taught poorly, if at all. One reason for this is confusion about what kinds of problems are being solved. It is often assumed that by practicing any kind of problem for which you don't have an answer, you will learn the skills needed to solve any problem. But, this is simply not true. Since there are different types of problems, learning how to solve one type won't necessarily help you build the skills you need to solve the other type. So, let's begin by distinguishing two types of problems: puzzles and mysteries. In his book Curious: The Desire to Know and Why Your Future Depends on It, Ian Leslie defines these two types very well: "Puzzles have definite answers. Puzzles are orderly; they have a beginning and an end. Once the missing information is found, it's not a puzzle anymore." Most of the problems you encounter in your college math courses are in reality puzzles. They have definite answers and are orderly. Also, the information you need to solve them is right there in the question! And, if you need extra help you can just turn back to the chapter in the text which discusses how to solve these particular types of problems. Many "logic puzzle" questions are (obviously) also just puzzles. While they seem more complex, in reality they are the same as your math problems. They have definite answers and are orderly. Here are a couple of examples: You have eight billiard balls. One of them is "defective," meaning that it weighs more than the others. How do you tell, using a balance, which ball is defective in two weighings? You have five jars of pills.
All the pills in one jar only are "contaminated." The only way to tell which pills are contaminated is by weight. A regular pill weighs 10 grams; a contaminated pill is 9 grams. You are given a scale and allowed to make just one measurement with it. How do you tell which jar is contaminated? Another important characteristic of puzzles is that you can quite often Google the answer for them! Go ahead, try to figure these two out, and if you have trouble just Google the answer. Now, here's the question. Are these kinds of problems "real"? In other words, are these the kinds of problems you're likely to run into in your everyday life? Are they the kinds of problems you will run into at work? I doubt it. Another question then: Is learning how to solve these kinds of problems a good preparation for learning to face real-world problems? Again, I doubt it. So, the second type of problem Leslie defines is mysteries. "Mysteries are murkier; less neat. They pose questions that can't be answered definitively because the answers often depend on a highly complex and interrelated set of factors, both known and unknown." In other words, the exact kind of problems you're likely to run into in real life and at work. So, how do you solve a mystery? Well, if you've understood the distinction between puzzles and mysteries you can probably guess what I'm about to say. There's no easy method that works for all mysteries. Their very ambiguity and lack of clarity ensure that this is so. But, there are things you can do to improve your ability to solve these kinds of problems. Here are a few steps to take. 1. Practice solving mysteries. Like any other skill, the more you practice solving these kinds of problems the better you will become at solving them. And, it should be easy to find such problems as they crop up in everyday life all of the time. Here are some possible examples: [A] How can I achieve a better balance between all my responsibilities including work, school, and family? 
[B] How can I save more money for retirement or other future expenses than I'm currently saving? [C] How can I find a career that will allow me to earn enough money to support myself and my family and also allow me to grow and build my skills and talents? I know what you're thinking. Sure, these are real-world problems and I've even faced some of them but how can I build the skills I need to solve these problems before taking these problems on? It does no good to practice these very difficult problems. So, to prepare for solving these kinds of problems here are some other potentially helpful steps. 2. Learn as much as you can about as many topics as you can. Solving mysteries often involves drawing on knowledge from a variety of sources. It is quite likely that many of the things you're now learning (even those you think are irrelevant) may turn out to be helpful at some point in solving a problem. So, take advantage of the opportunity to learn about psychology, biology, history, mathematics, and all the other subjects you're learning now. That time will pay off. 3. Make connections. A lot of problems are solved by finding interesting, creative connections between seemingly unrelated topics. A good example here is Steve Jobs' application of his knowledge of calligraphy to the Apple operating system. Another is the psychiatrist Jeffrey Schwartz's development of a treatment for OCD (obsessive-compulsive disorder) by combining Buddhism and Austrian economics. In both these cases, seemingly unrelated ideas were connected to provide new insights. 4. Read books about mysteries, problem-solving, and even mystery novels. Sherlock Holmes is a great place to start. Reading Conan Doyle's stories of the great detective reveals some basic principles of problem-solving which can apply to everyday life. Daniel Smith's book How to Think Like Sherlock is another good resource. 5. Solve simpler mysteries first. 
Rather than taking on the more complex type of real-world problems mentioned earlier, begin with simpler, but still real, problems. Never mind solving your work-life balance problem right off. Start with solving this problem: How can I keep from losing my phone every time I get ready to go somewhere? 6. Adopt a problem-solving mindset. Be observant of your surroundings and attuned to what you can do to improve how you live, work, or do simple things. Thomas Edison exemplifies this mindset very well. He often presented his lab assistants with the following challenge. He would hand them some ordinary item (like an iron or a fountain pen) and say "There's a better way. Find it." Adopt that mentality in your life. Whatever you're doing ask whether there is a better way: easier, more efficient, more effective. 7. Learn about design thinking. Lastly, you may want to learn about design thinking. I've made a few resources available that provide a brief introduction to the basic ideas of design. The important feature is to solve problems with the end-user in mind. When you're trying to solve a problem be sure you understand for whom you are solving it and what their needs are. That will allow you to focus on the best solutions given what they're facing. Another important feature of design thinking is prototyping and iteration. Don’t wait to do something until you have the perfect solution, try out ideas and use the results as useful feedback for further refinement of your solution. Recognize that you may have to go through several iterations before arriving at the ultimate effective solution. The process of problem solving in everyday life is similar to running experiments. Try something that might work and if it doesn’t, modify it and try again. Like in scientific experiments, you will make mistakes, but these mistakes are part of the process of learning and refining your solutions. 
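Incidentally, the pill-jar puzzle quoted earlier is a nice illustration of just how definite a puzzle's answer is. The classic solution is to take one pill from jar 1, two from jar 2, and so on, weigh them all at once, and read the contaminated jar straight off the missing grams. Here is a small sketch of that solution (the jar numbering and weights are the ones stated in the puzzle):

```python
# The classic one-weighing solution to the five-jar pill puzzle.
# A clean pill weighs 10 g; a contaminated pill weighs 9 g.

def find_contaminated_jar(contaminated: int, num_jars: int = 5) -> int:
    """Take i pills from jar i, weigh them together, and deduce the bad jar."""
    # Simulate the single measurement: jar i contributes i pills to the scale.
    measured = sum(i * (9 if i == contaminated else 10) for i in range(1, num_jars + 1))
    expected_if_all_clean = 10 * sum(range(1, num_jars + 1))  # 150 g for 5 jars
    # Each contaminated pill is 1 g light, and jar i contributed i pills,
    # so the total deficit in grams equals the contaminated jar's number.
    return expected_if_all_clean - measured

# The deduction works no matter which jar is the bad one:
print([find_contaminated_jar(j) for j in range(1, 6)])  # [1, 2, 3, 4, 5]
```

Notice how puzzle-like this is: one orderly trick, one definite answer, verifiable in a few lines. No mystery worth solving in real life collapses this neatly.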
Often people get stuck on the idea that a problem cannot be solved and so there’s no point in trying. But, in most cases, there is something you can do to make headway. Even if you can’t completely solve the problem, you can probably make things better in some way. Taking small steps can be very effective, and almost always more effective than doing nothing. If you’ve enjoyed this episode, I hope you will subscribe and visit me online at kevinjbrowne.com. Thanks for listening and I’ll see you on the next episode. Hello everyone and welcome to Improve Your Thinking. I’m your host Kevin Browne, and today I want to talk a little about how to evaluate your beliefs, opinions, and theories using something called the SEARCH formula. This formula comes from a book titled How to Think About Weird Things. It is a great guide to assessing any claim and possible explanations for it. SEARCH is an acronym that stands for:
State the claim. Examine the Evidence for the claim. Consider Alternative hypotheses. Rate, according to the Criteria of adequacy, each Hypothesis. As we’ll see, there are several parts of this process that are especially difficult. Among these, the most difficult for people seems to be considering alternative hypotheses. Another way of putting this is to consider the evidence against the claim you are evaluating. The reason this is so difficult is something called confirmation bias: the tendency we all have to look only for evidence that confirms our own beliefs. But, to really evaluate any claim, you need to look at all the evidence, not just the evidence in favor of the claim. So, by all means, look at all the evidence for the claim you are investigating. But, then do what may seem counterintuitive, even counterproductive: look at the evidence against your claim. Be sure to look at it with an eye toward fair evaluation. Don’t simply look at it to dismiss it. Above all, don’t look at a biased form of this evidence. You know what I mean. Don’t go to a source that is arguing for the position you agree with and simply take what they say about the evidence against the claim. Look to the best possible sources of evidence against your claim. That way you know you are really evaluating your claim, not simply endorsing what you already agree with. To ensure you are really doing this, rate each hypothesis you are considering according to five criteria of adequacy. These criteria are testability, fruitfulness, scope, simplicity, and conservatism. Let’s look at each of them. First, a theory should be testable. If you cannot even figure out how to go about determining whether your theory explains the evidence, you don't have a good theory. To be testable means your hypothesis "predicts something more than what is predicted by the background theory alone." 
In short, we need this criterion because if there's no way to tell whether a theory is true or false, it's really no good to us. Second, a theory should be fruitful. What this means is that a good theory should make novel predictions. It should not only account for the evidence at hand but be able to address evidence that comes in later and even predict such new evidence. Einstein's theory of relativity is a good example of a fruitful theory because it made the novel prediction that light from a star behind the sun would be visible on Earth, since the light would be bent by the gravitational field around the sun. And, concerning criterion number one, this was a testable claim. Once tested, it was verified. Third, a theory should have a wide scope. That is, a good theory explains a wide field of evidence. One of the differences between theories and hypotheses is their scope. Hypotheses address specific questions whereas theories attempt to provide a broad explanatory device. Theories that can explain a wide array of things are preferred, other things being equal, to more narrow theories. Fourth, a theory should be simple. This term should not be confused with simplistic. Many scientific theories are complex in terms of our ability to understand them but simple in the sense that they postulate fewer underlying entities or assumptions. A good example is the difference between Copernicus and Ptolemy. Ptolemy's geocentric theory could explain the orbits of the planets, but it was quite complex, whereas Copernicus' theory explained the same observable phenomena with less complexity. So, other things being equal, that theory was the better theory. Think of it this way. Suppose I come up with a theory to explain how the lights in my house work, but it involves little gremlins running inside the light bulbs. Someone else can explain the same phenomenon but without postulating gremlins. So, their theory is simpler than mine.
It should also be pointed out that my gremlin theory may fail on other criteria as well, such as being testable. Finally, a theory should be conservative. Not in the political sense of the word. Rather, it should fit in with other things we know. If we have an explanation for something that we think is fairly certain and accurate, then a new theory should fit in with that prior explanation. If it doesn't fit, that may indicate our prior knowledge is flawed. We have to be open to that possibility, but the burden of proof is on the new theory. An interesting examination of how this process works is offered by Thomas Kuhn's book The Structure of Scientific Revolutions. You may be inclined to lodge the following criticism at this point: But, that's "just a theory." The criticism here is supposed to be that any given theory is not an established fact. But, this misunderstands the relationship between fact and theory. No theory is a fact because facts and theories are two different things entirely. The facts are what we observe about the world around us. But, these facts need an explanation. This is what a theory is designed to do. It is a well-formulated attempt to explain the facts we observe. So, we observe the motion of the planets and the fact that an apple falls to the ground when we drop it. The theory of relativity attempts to explain these things. We observe different species and varieties of animals in the natural world, and the theory of evolution attempts to explain how these varieties arose. We observe the motion of subatomic particles, and the theory of quantum mechanics attempts to explain these observations. In each case, we begin with observations and construct an explanation to account for them. In each case, it makes little sense to criticize the theory by saying it's not a fact. Of course not! Theories are not facts and do not attempt to be. Theories can be correct or incorrect, and the criteria outlined above are the best way to determine this.
But you must also understand what a theory is attempting to do. Another important consideration in evaluating theories is the notion of proof. People often misunderstand the concept of proof by thinking that proving something demonstrates conclusively that it must be true. While this may be the case for deductive arguments, it cannot be so for inductive arguments, which are based on probability. Since most philosophical (and scientific) arguments are inductive, it is impossible to conclusively prove the conclusions of these arguments true. That is, we cannot preclude evidence that may arise in the future to count against our conclusion. This being the case, proofs in philosophy tend to aim at finding the most probable conclusion that fits the available evidence. As the scientist Arthur Stanley Eddington put it: "We cannot pretend to offer proofs. Proof is an idol before whom the pure mathematician tortures himself. In physics we are generally content to sacrifice before the lesser shrine of plausibility." Proof is a seriously misunderstood word. This probably accounts for its rare usage in the natural sciences. In one important sense, no one can "prove" the theory of evolution, or the big bang theory, or relativity, or string theory, or whatever theory you want to talk about. But that does not mean there is insufficient evidence to warrant thinking these are good theories. To use Eddington's words, we can say these theories are plausible. In some cases, very plausible. Another factor that often leads to misunderstandings about theories and proofs is that, in science, theories are the best explanations we have so far for the phenomena in question. But, as the philosopher of science Karl Popper pointed out, "the demand for scientific objectivity makes it inevitable that every scientific statement must remain tentative forever." It is always possible that new evidence arises that shows that our current theories are false.
We have to remain open to that possibility. We should remain open to that possibility for our beliefs as well. They may be true given what we know right now, but what we know can change, and if it does, our beliefs should change too. As the economist John Maynard Keynes is reputed to have said: "When the facts change, I change my mind. What do you do?" If you've enjoyed this podcast, I hope you'll subscribe and visit me online at kevinjbrowne.com. Thanks for listening and I'll see you on the next episode. Hello everyone and welcome to Improve Your Thinking. I'm your host Kevin Browne, and today I want to talk a little about logic and how you can use it. Logic is usually defined as the science of evaluating arguments. Logic gives us a set of tools for determining whether arguments are well-reasoned and whether the premises provide support for the conclusion. Logic can also help us identify flaws in our reasoning.
But some of the lessons logic can teach us are a little counter-intuitive. So, let me begin by addressing five of these counter-intuitive lessons. 1. An argument with true statements can be invalid. An argument is said to be valid if the premises provide necessary support for the conclusion. An argument can do this with either true or false statements, depending on how they are formulated. However, just because the premises of an argument are true does not mean the argument is valid. While the argument may sound persuasive, it could be the case that the premises do not provide support for the conclusion. A good example is the following argument, which contains all true statements but is, nevertheless, invalid: All banks are financial institutions. Chase is a financial institution. Therefore, Chase is a bank. 2. Statements can sound very different, yet mean exactly the same thing. One of the insights you can learn from categorical logic is that statements which sound entirely different are, in fact, equivalent in meaning. One of the purposes of studying categorical logic is to learn precisely this insight. Another purpose is to give you the power to simplify complex statements such as this one: Some employees who are not currently on the payroll are not ineligible for workers' benefits. Categorical logic can show that this rather unclear statement is really the same as this much simpler one: Some of those eligible for workers' benefits are not currently on the payroll. 3. There is a mathematical-like rigor to ordinary language. Certain words in ordinary language such as "and," "or," "if...then," and "if and only if" function somewhat like the mathematical operators for addition, subtraction, multiplication, and division. What this means is that you can determine whether statements are true or false without knowing everything about the statement's content.
For example, in the statement "Nixon resigned the presidency and Clinton wrote the Gettysburg Address," you can determine that the statement is false if all you know is that Clinton did not write the Gettysburg Address. Partial information can lead you to detect when statements are false (or true). 4. It is possible to evaluate an argument's merits without entirely understanding its content. This possibility exists in logic due to the previous point and the fact that we can build upon it a set of rules which allow anyone to deduce an argument's validity without reference to its content. Just as in math, where you can add numbers without worrying about what the numbers refer to (2+3=5, and you don't need to know what you're adding 2 of and 3 of to deduce that), you can also infer an argument's validity without worrying about the argument's reference. While this is one of the most difficult points to master in the study of logic, it turns out to be a very powerful tool for the evaluation of everyday arguments. 5. Fallacies of thinking are extremely common in ordinary discourse. With all the power of logical reasoning, it is still quite common for people to be persuaded by faulty arguments. What's worse, many of these fallacies are easy to recognize with only a little training in the very basic principles of logic. Certainly one of the reasons fallacies of thinking are so common is that they are so effective. These fallacies are effective in part because our brains are wired to be persuaded in ways that are not always rational, and because without some basic knowledge of logic it is easy to overlook them. In logic there are specific rules for determining whether statements like the Nixon example above, called conjunctions, are true or false. And the rules are mathematical in that you can plug the statement into a formula and determine whether it is true or false based on the rule, which is called a truth function.
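Two of these lessons lend themselves to a quick illustration in code. The sketch below is my own, not from the episode: it models lesson 1's argument with Python sets (the category members, including "Fidelity" as a stand-in non-bank financial institution, are illustrative assumptions), and it writes out the truth function for "and" to show how partial information settles the Nixon conjunction.

```python
# Lesson 1: an argument with all true statements can still be invalid.
# Model the categories as sets. A form is invalid if SOME case makes
# the premises true while the conclusion is false.
banks = {"Chase", "Wells Fargo"}
financial_institutions = {"Chase", "Wells Fargo", "Fidelity"}  # "Fidelity": illustrative non-bank

def premises_true(x):
    # "All banks are financial institutions" and "x is a financial institution"
    return banks <= financial_institutions and x in financial_institutions

def is_bank(x):
    # The conclusion: "x is a bank"
    return x in banks

# For Chase, premises and conclusion are all true...
assert premises_true("Chase") and is_bank("Chase")
# ...but the same premises are true of Fidelity while the conclusion is
# false, so the argument's FORM cannot be valid.
assert premises_true("Fidelity") and not is_bank("Fidelity")

# Lesson 3: "and" works like a mathematical operator (a truth function).
def conjunction(a, b):
    return a and b  # true only when both conjuncts are true

nixon_resigned = True   # "Nixon resigned the presidency"
clinton_wrote = False   # "Clinton wrote the Gettysburg Address"
# Knowing only that the second conjunct is false settles the whole statement:
print(conjunction(nixon_resigned, clinton_wrote))  # False
```

Disjunction and the material conditional can be written the same way, e.g. `a or b` and `(not a) or b`, each a fixed rule that needs only the truth values of the parts.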
The same holds for disjunctions, which are either/or statements, and for conditional statements, which have the form "If A then B." Each has a specific math-like rule for determining whether the statement is true or false. In essence, this gives you a sort of x-ray vision, so that you can see through the clutter of an argument to its underlying form. Once you can see the form of an argument, it becomes much easier to determine whether the premises are really supporting the conclusion or not. This is where knowing a little psychology comes in very handy. Logic tends to operate on the assumption that the best arguments are ones based on sound principles of reason and not psychological tactics. But the best arguments from a logical standpoint are not always the most persuasive, and the most persuasive arguments are not always based on sound logical principles. This raises an interesting question. Is it possible to construct an argument that is based on sound logical principles and is also psychologically persuasive? In fact it is. I'll be discussing some interesting insights about influence in a later episode, but it is entirely possible to use both sound logical principles and psychology in a way that is genuine and not manipulative. This is a point often overlooked in logic textbooks, where any reference to psychology usually involves its potential to be used as a tool for manipulating people into agreeing with your argument. Logic does not endorse manipulation. But you can be persuasive without being manipulative. What I've come to appreciate in my own study of psychology as it relates to thinking and persuasion is that it might not be possible to be persuasive if you appeal only to sound logical principles. You might get verbal agreement with the points you're making, but you probably won't get action. Getting people to act on your argument requires that you appeal not only to their head but also to their heart.
If you enjoyed this episode, I hope you'll subscribe to the podcast and visit me online at kevinjbrowne.com. Thank you for listening and I'll see you on the next episode. Hello everyone. Welcome to Improve Your Thinking. I'm your host Kevin Browne, and today I want to talk a little about insight. It's something we all think we have into our own thoughts and feelings. But, as Tasha Eurich argues in her book titled Insight, this is not necessarily the case.
In fact, only a few people are born with a natural disposition for self-awareness. Interestingly, these people are not able to offer many insights into how they maintain this ability, so researchers have tended to focus on discovering the elements of self-awareness by looking at people who became self-aware by overcoming their lack of it. Most people tend to think that as they grow older they gain more self-awareness. As with many of our intuitions about our own thinking, this too turns out to be false. As Dr. Eurich puts it, “In the absence of a committed effort to build self-awareness, the average person makes only meager gains as they grow older.” There are two factors involved in self-awareness: internal and external. To be truly self-aware, we need not only a deep understanding of our own thoughts, feelings, and behaviors but also of how people see us, which can often be quite different from how we see ourselves or how we think others see us. There are several impediments to becoming self-aware. Our brains tend to work by making thinking easier, allowing us to use a variety of shortcuts to streamline our thinking. While this often works well when dealing with complex situations or with situations which repeat and are predictable, it often creates impediments to becoming self-aware. Three major impediments to self-awareness are knowledge blindness, emotion blindness, and behavior blindness. Knowledge blindness is very similar to the idea of the knowledge illusion I discussed in a previous episode. As Dr. Eurich puts it, “the opinions we have about our abilities in specific situations are based less on how we perform and more on the general beliefs we have about ourselves and our underlying abilities.” In short, we think we know more than we do, and that belief colors our perception of how well we will do at a given task. What's worse, the more expertise we think we have, the more harmful knowledge blindness can be.
You may be familiar with this idea from the Dunning-Kruger effect, which states that the least competent people tend to be the most confident in their abilities. At heart, this is a problem of lack of self-awareness. In this case, the self-awareness to recognize what we don't know. It's no better when we turn our attention to emotions. In fact, we are often just as poor at evaluating our own emotions as we are our knowledge. Dr. Eurich calls this second impediment to self-awareness emotion blindness. The third impediment to self-awareness is behavior blindness, which refers to our inability to see our own behavior clearly and objectively. To combat these impediments, Dr. Eurich offers three suggestions. First, we need to identify the assumptions we make about ourselves and the world around us. Second, keep learning, especially in areas where we think we already know a lot. Third, seek feedback on our abilities and behaviors. Objective input from others can provide a correction to our biased and inaccurate self-assessments. As I read these points in her book, I couldn't help but be reminded of my own motto, which I shared with you in my opening episode. To think like a philosopher you need to: Ask more questions. Demand better answers. And learn more than you think you need to know. Most people seek explanations for their problems in externals: other people, situations, environment, etc. Self-awareness requires considering the possibility that part of the problem is the person. Another common roadblock to self-awareness is what Dr. Eurich calls the “cult of self.” This relentless focus on self-esteem leads to less self-awareness as well as less satisfaction overall. To combat this, she advises that you cultivate humility, a self-acceptance that entails “understanding our objective reality and choosing to like ourselves anyway,” and better monitoring of one's inner dialogue.
There are ways to increase self-awareness, but they are often obscured by several myths regarding introspection. “The assumption that introspection begets self-awareness is a myth. The problem with introspection, it turns out, isn't that it's categorically ineffective, but that many people are doing it completely wrong.” Using introspection to access our unconscious is ineffective, since our unconscious “is less like a padlocked door and more like a hermetically sealed vault.” A better approach focuses less on the process of introspection and more on the outcome of insight, concentrating on what we can learn and how to move forward. Much of introspection involves asking why, trying to find the causes of our thoughts, feelings, and behaviors. This does not often lead to accurate results and can instead lead to endless rumination. A better approach is to ask “What?” instead of “Why?” She provides an example of this in the book: “Let's say you're in a terrible mood after work one day. We already know that asking Why do I feel this way? should come with a warning label. It's likely to elicit such unhelpful answers as ‘because I hate Mondays!’ or ‘because I'm just a negative person.’ What if you instead asked What am I feeling right now? Perhaps you'd realize that you're overwhelmed at work, exhausted, and hungry. Rather than blindly reacting to these feelings, you take a step back, decide to fix yourself dinner, call a friend for some advice about how to manage your work stress, and commit to an early bedtime.” Asking “What” instead of “Why” forces us to name our emotions, which research shows is effective. Other benefits to this approach include the fact that Why questions draw us to our limitations, while What questions help us see our potential. Why questions stir up negative emotions; What questions keep us curious. Why questions trap us in our past; What questions help us create a better future.
So, we aren't as self-aware as we think we are, but there are steps we can take to improve our own insight. Some of the tools to increase self-awareness include mindfulness, reflecting on your life as a biography, and focusing on solutions, all of which she elaborates on in the book. An interesting philosophical example of introspection at work can be seen in the work of the 17th-century philosopher Rene Descartes. His Meditations on First Philosophy illustrates how he arrived at his philosophical insights by reflecting on his own mental processes. It's interesting to read his work in light of Dr. Eurich's findings on insight. One thing seems clear from both Descartes' work and Dr. Eurich's book. You can't really gain insight into how the world works if you rely only on your own thoughts and feelings. You need something more objective to appeal to. As the philosopher Ludwig Wittgenstein put it in his Philosophical Investigations, “an inner process stands in need of outward criteria.” To understand what's going on inside your own head, you've got to get outside your own head sometimes. If you've enjoyed this episode, I hope you'll subscribe to the podcast and visit me online at kevinjbrowne.com. Thank you for listening and I'll see you on the next episode.