Close-Minded Credulity

I’m finding it harder and harder to believe anything said on the internet by anyone, or put out by any news, blog, or journalism outlet—and not just because of the new fake-news phenomenon. Some might be saying ‘it’s about time; read something else.’ Others are probably thinking me an uninformed, radically shallow skeptic, finding it odd that I’ve swung to such an extreme. Both groups, however, are no sooner done objecting than they’re nudging me over to this site or that outlet that’s telling the Truth.

In truth, I am coming around to W.K. Clifford’s maxim: that “it is wrong always, everywhere, and for anyone, to believe anything on insufficient evidence,” as opposed to, say, William James’ more sympathetic twist, which essentially says that we can believe something if it makes us feel good. Admittedly, this is because I think the times call more for Clifford; I’m not coming around to his view of the matter because of some revealed universal Truth. The image of the political trimmer—one who aims to keep the boat on an even keel, steadying it against those who wish to capsize it—seems to me an apt metaphor in this case.

It is true, in any case, that James said much more than that we should believe things if they make us feel good, and what he said here was said in the context of religious belief—which was not, as some people hastily point out, a defense of holding willy-nilly beliefs. James hated that, as far as I can tell.

I find myself writing this reflection-exploration because I’m not sure I’ve seen anyone change their mind in real time about anything, much less whimper the words “that’s a good point.” I don’t think this is simply a matter of hanging around the wrong people or visiting the worst sites: the lowest denominators of human life often find their way to even the highest of peaks, with little exception.

People do change their minds, no doubt, but with regard to things they don’t care much about. There are whole swaths of people who can be convinced, persuaded, and have their minds changed on “important” topics within the context of a political discussion, much to the joy of the persuaders, but, alas, they aren’t even going to vote. The task, then, is much harder. It is to see the minds of the active, involved, supposedly informed citizenry open up. Mine included, to those who might catch a whiff of sanctimony.

Under Clifford, you might as well shut your laptop; with a particular misreading of James1 you can find yourself in the wonderfully Narnia-like land, where whatever you already think comes to be proven beyond a shadow of a doubt. As someone said in the wake of the nightclub shooting in Orlando, “I’m glad Orlando confirmed everything you suspected, whatever it was.”

It isn’t exactly an echo-chamber, because an echo-chamber allows some degree of variation even if it ultimately points at the same end. An echo-chamber’s fault isn’t found in this ultimate aim—for we all have an aim, no?—but in that it readily dismisses or ignores things to the contrary. What we are experiencing was no doubt birthed from the echo-chamber, but, having been bitten by a radioactive spider, it grew more powerful than its hollowed-out parents could have ever imagined.

The situation we find ourselves in seems more akin to the funhouse, where people take “contrary” facts happily. For it is found that, after putting them in front of the distorting mirrors, they aren’t actually contrary at all and, thus distorted, fit rather nicely with everything else. Some call this “ideology.” I call it “laziness.” James might have said that this was comfortable; perhaps wrong, but comfortable nonetheless.

__________

Susan Haack—a wonderfully relevant philosopher—has wrestled with this issue of credulity, belief, evidence, etc. for almost her entire career now, so I often find myself referring back to her work during these periodic crisis-of-knowledge meltdowns (taking a deep breath, ignoring her sideways and slightly off-the-mark attacks on Mr. Rorty for the moment). Credulity, Haack tells us, is “being too ready to believe, or being ready to believe more strongly than your evidence warrants.” Close-mindedness, on the other hand, is “being too ready to dismiss ideas, or being ready to disbelieve, or to believe only less strongly than your evidence warrants.”

Over and against these two definitions, Haack presents a third, if somewhat vague, option: circumspection, or “proportioning your degree of belief to the strength of your evidence.” So, something between gullibility and radical skepticism, the latter of course being the category I find myself in at the moment.

I’m not sure this helps us any. Even in the academic world, the second half of those definitions of credulity and close-mindedness are usually paid no more than lip service. “I’m believing more strongly than my evidence warrants? How so?” might be a reasonable retort. In other words, the person who wishes to accuse the average reader of internet news that he is not “proportioned” must know what a good proportion looks like and further, what the evidence is and how to interpret it in a proportional manner.2

Everyone already thinks their beliefs are “proportioned” to the evidence—you’d be hard pressed to find anyone who doesn’t think this. The only reason we are satisfied with any belief in the first place is because we feel as though it’s proportioned in our minds correctly; I’m not sure how one could ever know or live with the fact that their belief is disproportional without ditching or adjusting the belief. Hardly anyone goes around saying “I believe X, but I realize that the strength of that conviction is disproportionate to the evidence that makes me believe it,” but it would be a wonderful sight indeed. In fairness to Ms. Haack, though, this is nowhere near her last word on the subject, so perhaps exploring a bit more might get us out of the weeds. Nevertheless, I have little hope. If something can’t manage in theory, we can expect even less of it in practice.

Circumspection, Haack goes on to say, “simply requires that you use your head about whom to believe, when, on what subject, and to what degree,” and adds that too often people believe that “sincerity alone is sufficient for trustworthiness.” Hear, hear! Perhaps this comment alone—that sincerity is nowhere near enough—might jolt some out of their zombie-like state where anything said with force or emotion is thought to be nearer to the truth than things said or written dryly or with a calm disinterestedness. While I am a dreamer about some things, however, I am no dreamer about this. So our original question merely gets adjusted: from “what does proportioned belief look like?” to “how can we determine whether someone is telling the truth as he believes it to be?” In a beautiful yet damning passage, Haack says:

Perhaps we look to his demeanor: is he matter-of-fact, or is he hesitant, or defensive, or suspiciously emphatic and certain? (As Hume observed, we rightly distrust not only a witness who is hesitant, but also a witness who is too vehement: his protestations about how absolutely and completely certain he is may be a sign that he’s trying to convince himself as much as his hearers.) Or we may look into whether our informant might have a motive to deceive us: e.g., is he selling the product he praises, does he stand to gain from our believing what he tells us, or to lose from our not believing him, is the gossip he’s passing along deleterious to an enemy of his?—for, as we all know, when doing so is to their advantage people are given to prevaricating, turning a blind eye, fudging, hedging, evading, and lying.

…In other words, the entirety of the internet.

Does anyone escape this filter? Is it overkill? Of course, she goes on, “it should go without saying that such common-sense precautions against being taken in by liars, cheats, scammers, charlatans, the self-deceived, the self-promoting, and the simply confused or ill-informed can’t guarantee that we will never be misled.” I am inclined to agree, but that these are considered at all “common-sense precautions” is a wild—perhaps self-deceptive—fantasy. Indeed, as things stand now, having anyone run an analysis such as this whenever they read or are presented with something would be the most hopeful of scenarios.

Further, the line suggesting that we look to whether someone would have a “motive to deceive us” is precisely what conversations circle around: willy-nilly motive ascription is the hallmark of conversations these days. “She’s just a neo-con shill in the pockets of Wall Street who wants to preserve her own status.” “He’s a regressive Leftist who is trying to Trojan Horse Sharia law into Western liberal democracy.” Although extreme (is it though?), the rubric is both standard and deployed often.

__________

So in the aftermath of Trump’s election, waves of self-righteous approval were predictably met with waves of self-righteous indignation, reminding me of Mr. Butterfield’s apt if grandiose analysis:

The greatest menace to our civilization is the conflict between giant organized systems of self-righteousness – each only too delighted to find that the other is wicked – each only too glad that the sins of the other give it pretext for still deeper hatred.

This is not the time to engage with people in hopes of having a conversation about any given topic. As with the creation of laws, people’s heightened senses are never a good pretext for fruitful discussion or codification: the Patriot Act is a good parable in this sense.

Then again, by that standard, never is the only appropriate time for a conversation.

So I wandered, finding myself in a reasonably civil and contained discussion with a particular fellow about the merits of his claim that, in his words, “There are plenty of writings from founding fathers, journals, diaries, whatever you choose to call it, clearly depicting Christianity. This isn’t a debatable topic, it’s 100% true that these rights were religious from the core.” In other words, this country was, is, and should be a Christian nation. To be fair, the “was” and “is” were the only parts I had a problem with. I imagine the “ought” portion would have led to greener pastures.

I went on to discuss how historians seem to think otherwise: the only undebatable part, ironically, is that the topic has been debated steadily for some two hundred years now. I’m not sure anything qualifies as a debatable topic better than something that literally continues to have fervent debates and books written about it. So I wasn’t trying to “win” the discussion, putting this debate to bed once and for all, but merely to point out that it’s a bit more complicated than he would have it. As Haack would say, his belief was disproportional to his evidence.

To be clear—and this is my gripe about Haack’s musings—his belief was proportional to the evidence as he saw it. Which was, as he would point out later, the singular fact that it says “the year of our Lord” in Article VII of the Constitution. This was precisely where we derailed. On Haack’s view, it is my duty to attempt to show the disproportionate nature of his belief that this country “is and always was a Christian nation,” by presenting him with evidence suggesting not that he’s totally wrong, but that his belief is much too strong given what the evidence supports.

When people ask the “Christian nation” question, it should be answered with a half-hearted “meh, kind of.” There’s much to be said on the subject—a “yes” or “no” hardly suffices. Per Haack, “How much time do you have to talk about this?” might be the most proportional belief considering the evidence.

I’m also with Haack when she agrees with Clifford that the credulous man “is a danger to society,” and that a “credulous population creates the market for those con-men, crooks, fakers, cheats, charlatans, self-promoters and the self-deceived… And the more people are easily duped, the more likely it is that charismatic but crazy politicians will gain power.” Is my “Christian nation” interlocutor a dupe, likely to buy into something like a Herbalife scheme? Is he a danger to society? Are people like him—people who can confidently argue that this or that is “100% true”—the reason we’re in this mess, “mess” being whatever negative aspect of society I happen to be bemoaning?

It would, no doubt, be unfair to cast my friend in such a negative light. It’s obvious that his religious belief is motivating him—or shall we say allowing him—to see this somewhat distorted picture of history. A more interesting question, though, is how having either a “100% true” belief that every single founding father was an “orthodox Christian” (his words) or believing “it’s complicated” affects any of his current beliefs: the “ought” part of the equation.

It seems that once again we find ourselves in the justificatory weeds, the place where we believe we need incontrovertible, often historical, evidence to bolster any given view we hold in the present. Which more or less amounts to the strategy of either tailoring our evidence “appropriately” or ignoring everything to the contrary in the name of holding a proportional belief. Being a complexity-monger about the study of history, I am inclined to deride this justificatory strategy even more in light of the fact that this almost always becomes a matter of politicizing history.

To be sure, I wouldn’t have hated our conversation nearly as much if my friend wasn’t saying these things with such authority, though perhaps he was equally disgusted with me being authoritatively and “passionately moderate” about the whole affair. Again, this need for some authoritative ground on which to stand stems from the belief that whatever we happen to want in the present moment must then be backed up by some solid foundation. I’m almost positive one can be a successful Christian in the United States without resorting to a single shred of historical evidence, no? One can even want the U.S. to become a Christian-established nation without ever referencing the founding generation (which I pointed out was no specific group of people with anything near a consistent set of beliefs either singularly or as a group).

Yet I imagine some, including Haack, would want to go further, arguing along the lines that if only his distorted view could be corrected or adjusted some, his actual beliefs might be corrected or adjusted some. This seems to me—no doubt because it just happened to me—a fruitless endeavor. It goes back to my original point with Haack: the impossibility lies in the mere attempt to open someone else up to the fact that maybe, just maybe, their belief is disproportionate to the evidence. Not, as it were, in showing how this belief is in fact disproportionate.

__________

Haack, to once again join her side, has moderate suggestions as to what we can do or try. I say “moderate” because, as she rightly points out, credulity and other negative habits of the mind are only “partly correctable flaws.” Her suggestion is, first, that we acknowledge our—as in everyone’s—position as role models of behavior to other people. That we all get into these battles of “100% true” versus “100% not true” about the same object of inquiry is not, contrary to whoever is speaking, the other person’s fault. It’s both. Shall we not remove the plank from our own eye?

So perhaps, Haack says, we should pick our battles; “look carefully into the evidence with respect to those claims which, for personal or professional reasons, are important to [us].” And far more importantly, “with respect to claims I haven’t looked into carefully, make a habit of acknowledging freely that I don’t know, I’m not sure, I’m not really entitled to an opinion.” Some of the most powerful words in the English language, right alongside a genuine “I’m sorry.”

I wish I could get across to my interlocutor that “to criticize the flimsy evidence on which he believes, is not to say that he’s a bad person, and certainly not to demean him.” We need to find a sort of middle ground between the Hitchens-esque view that to criticize beliefs is not to criticize the person, and the view that people have their beliefs and that’s that, closed to provocations from the outside.

Put another way, people who agree with Hitchens refuse to acknowledge the inextricable link between belief and person, while others often over-emphasize the closed nature of the belief-person relationship and how it must, at all costs, be protected. Which is often rhetorically veiled under the idea that we all need to be a bit more civil to each other. Of course we do, but not at the cost of growth. Nor do we need to toss civility out the window motivated by some sense of moral urgency. Somehow we need to break down this terrible habit we have of being credulous, then becoming instantly close-minded with regard to what we were so credulous about—it makes for a terrible combination, if not terrible conversation.

__________

I would be remiss if I left without mentioning two aphorisms that have been doing no small amount of work in my head over the last few years, but are only obliquely related to my musings above. Both are from the kind and intelligent mind of Aaron Haspel.

People will cheerfully confess ignorance of a topic and reject indignantly the suggestion that it might debar them from an opinion.

* * *

Most people, on most matters, are not, in fact, entitled to an opinion.

 

  1. To which, admittedly, I succumbed at one point.
  2. To say nothing of the much rockier side of the mountain of trying to get the other person to merely listen to what you have to say about proportion and evidence. And to say even less about the fact that those conveying the original message we’re all sopping up aren’t conveying it with proper proportionality…


30 thoughts on “Close-Minded Credulity”

  1. I’d modify Clifford’s maxim: it is epistemically irrational always, everywhere, and for anyone to believe anything on insufficient evidence. I don’t know that it is necessarily wrong to be epistemically irrational.


      • There are a whole bunch of metaphysical questions which have basically zero practical impact, which it wouldn’t be wrong to be irrational about. I’m not saying it’s the morally best thing you could do, it just wouldn’t be morally wrong.

        For instance, as an atheist, you believe that agnostics are responding poorly to the evidence and hence are irrational. After all, they afford far more credence (According to you) to the proposition that God exists, than the evidence warrants. Yet, from the atheist perspective, it would seem strange to think that agnosticism about God is morally wrong.

        Here’s another case. Suppose I really don’t like you and think, irrationally (even though I know that you like money and that you could use it to benefit yourself in many ways), that the best way to harm you is to send you $1000. It is hard to see how my irrationality itself is morally bad. Surely my irrationality is at the very least neutral with respect to or even mitigates the badness of my intentions.

        So, we can clearly distinguish between cases in which there is practical (and thus moral) import and cases which lack such import, without having to be perfectly rational all the time.


  2. Deliberate poisoning of the rationality of the left and right ought not to be confused with “the internet is a mass of lies and deceit.” Particularly because you are then more likely to miss the blatant lies that occur on national television.

    Sadly, you are not a propagandist, and are thus not entitled to an opinion.

    **Neither am I, but I do know the guy responsible*** for SolarFuckingRoadways, which continues to get piles of cash from people who really really ought to know better.

    ***not the bagholder.


  3. William James’ more sympathetic twist which essentially says that we can believe something if it makes us feel good.

    That is not at all what William James says, about the sentiments of rationality or about religious belief. I mean, it’s not even a remotely reasonable gloss. Sure, he says that there are things that are beyond (the capabilities of human) reason (and, it should be noted, therefore not necessarily true), including moral judgment and religious faith (similarly, “Le cœur a ses raisons, que la raison ne connaît point. On le sent en mille choses. C’est le cœur qui sent Dieu, et non la raison.” — “The heart has its reasons, which reason does not know. We feel it in a thousand things. It is the heart that feels God, and not reason.”). However, at no point does he suggest we believe things simply because they feel good. This would be a distinctly un-Jamesian position.


    • Hence the very next line…

      It is true, in any case, that James said much more than that we should believe things if they make us feel good, and what he said here was said in the context of religious belief—which was not, as some people hastily point out, a defense of holding willy-nilly beliefs. James hated that, as far as I can tell.

      My only point was to show that James’ foundation for building a good amount of his philosophy is based on a very psychological, as opposed to epistemological, need. People who get James wrong often stay well clear of anything further he said on the subject… which was also my point.


    • I didn’t want to delve much into the Clifford-James debate, but only hoped to show the somewhat hazy dichotomy between Cliffordian rigidity and Jamesian fluidity… I realize now I fell short in that project!


  4. I remember reading a book called Mistakes Were Made, But Not By Me. Then there’s the whole body of literature called “behavioral finance” that describes how human beings aren’t rational even when it comes to simple economic decisions. This work got Robert Shiller a Nobel Prize.

    He got a Nobel Prize for describing rationally how people aren’t rational. This is our world.

    I feel that the philosophers are not as useful in this world I’m describing as the psychologists. And there is a model for how people change – change their behavior, not their opinions. To me, it is behavior that matters, not opinions. They are tied together in a mutually reinforcing system, to be sure. But changing behavior will change opinions probably more reliably than changing opinions will change behavior.

    I have a wealth of practical experience in changing behavior from learning and teaching martial arts. In my ryu, which is Danzanryu Jujitsu, we spend a lot of time teaching people to take good falls. This is foundational, and probably the most practically valuable thing we teach: many of us have had falls that would have been much, much worse if not for that training.

    I am capable of describing a correct fall in great detail. However, doing so is of little use to the student. It was of little use to me. I could absorb all the information and reproduce it verbally, and my behavior would not change at all. I would still do a crappy fall and it would hurt. That changed because of a process – a process that very strongly mirrors what psychologists call the transtheoretical model (of behavior change). In that model, a bunch of facts and arguments are not endorsed at early stages of change. What is needed instead is to highlight the need for change.

    That is, I would do a bad fall, and sensei would say, “that looks like that hurt”. Or perhaps, “If you were doing that on concrete, it would hurt a lot”.

    I feel we need to be doing that more – pointing at things and saying, “that looks like that hurt”. I just read a piece on Vox this morning about Kentucky Trump voters who don’t believe he will repeal Obamacare, and hope he won’t because they really need it and like it. So many of the Trump voters thought he was lying about stuff, but not about their stuff.

    We need to be saying “that looks like that hurt” a lot. We need to be saying “this will hurt me” a lot, too. Forget all the logical arguments. Humans are only rational because it makes them feel good to be rational.


    • We need to be saying “that looks like that hurt” a lot. We need to be saying “this will hurt me” a lot, too. Forget all the logical arguments. Humans are only rational because it makes them feel good to be rational.

      What do you do when you tell them that, and they say you’re lying? And when they do hurt themselves, blame you for it because you’re an elitist who thinks they know so much more and clearly feel superior? So they’re gonna do it again harder until it stops hurting?

      Just wait for them to hurt themselves until they can’t take it anymore? What happens when it’s not just themselves they’re hurting, but people around them?


      • If you say, in a direct way, “this hurts me” and they say you are lying, move on. Talk to someone else. That is a door slammed in your face. That hasn’t happened to me much.

        However, I don’t ever sea-lion into someone else’s threads with that complaint. If I’m a stranger to whomever I am speaking to, I might well be lying. I might well have some hidden agenda. I might well be a deceiver. You have to build the platform of “us” first.

        And that platform depends on identifying something that you and they have in common. If you can’t find that, walk away. It’s someone else’s work, not yours. But if you have hobbies and interests, those will work fine as a point of commonality, perhaps even as a shared identity.


        • I’m just pointing out, large-scale, there was a lot of “those plans will hurt you if enacted” to people who….chose not to believe that.

          Now some of them chose not to believe those plans would be enacted. Some felt the person telling them that was lying. Some probably couldn’t decide.

          I mean take repealing the ACA — there was a piece out today interviewing Trump voters about that, and the two that stuck with me were two people who were very surprised to find out that Trump seems to be wanting to do that. The thing he said he’d do.

          One actually thought the ACA couldn’t be repealed (“They can’t take away my insurance can they?”) and the other couldn’t believe the GOP would do that.

          It wasn’t because no one said “That plan will hurt you”. Lots of people did. They chose not to believe it, for one reason or another.

          That’s where we come back to “You’re entitled to your own opinions, but not your own facts”. Apparently you ARE entitled to your own facts. Doing so — disregarding facts you dislike and inventing ones you do — will eventually bite you when reality reasserts. (Assuming you ever connect those dots). But on things like the ACA, you’re taking 22 million people down with you.

          You can empathize all you want — but in a world where the very facts themselves bow to your opinions, how can empathy help?

          To use your fall analogy — you’re trying to teach people to fall correctly, except they keep tackling other students to the concrete floor because they’ve been told the mats are actually harder than concrete. Then they blame YOU for having such a hard floor, demand you fix it, and then tackle another student before you can respond.

          Eventually you’ll run out of students or they’ll be too hurt to do it, but in the meantime people are getting bloodied on your floor.


          • You can empathize all you want — but in a world where the very facts themselves bow to your opinions, how can empathy help?

            To me that’s exactly the argument for (tactical uses of) empathy. The facts by themselves apparently didn’t work with certain interlocutors. So try somehow starting where they are. Not sure I know how to do that, though.


            • We went over a lot of this last month, but the thing is — I do have empathy for those folks. I’ve got a lot of them in my family. I know a lot of them. I’m quite close to a lot of them.

              (It apparently shocked Jaybird to realize I knew actual Trump voters, people I interact with all the time. Blue collar ones. Rural ones even. Shocker, right?).

              These aren’t faceless people I don’t know, or people I hate. I’d be hating members of my own family.

              And I’m not some weird outlier of a Democrat, either. The Democrats push a ton of plans designed to help people they know won’t ever vote for them.

              They have empathy. They try to reach out, to solve the pressing problems (like the ACA).

              You know the ACA is incredibly popular with conservatives until you tell them the name? There are folks in Kentucky that hate the ACA but love Kynect.

              Which is the ACA. There’s gonna be a lot of unhappy people in Kentucky pretty soon.

              Empathy…never had a chance. You’ve got that “D” after your name? Mind is made up, you don’t even have to open your mouth.


          • morat20,
            sure, they can repeal the ACA. They might even be stupid enough to do it.
            But I can tell you with certainty who’d be stupid enough to keep the ACA as written, with as few changes as literally possible.
            Hillary Clinton.

            When the system’s broke, sometimes it takes an outsider to have a chance of fixing the problem.


    • As an example of what I’m not talking about, here’s Daily Kos telling us to be happy all those coal miners will lose their health insurance.

      Saying, “that looks like that hurt” has to come from a place of empathy, not spite. It won’t work if it’s from spite. Daily Kos is pretty much fueled by spite, which is why I don’t spend any time there, even though we have the same policy goals.


    • Doctor Jay,
      Two things about Midwestern voters:
      1) They didn’t believe trump. You didn’t believe trump. Not believing trump is the sound decision (because he reverses himself, a lot)
      2) They didn’t believe he was capable of half the stuff he was doing.


    • I’ve heard it said that habit begets virtue, not the inverse. We tend to think the inverse.

      “Tomorrow, I will become someone who works out. I will do this by working out tomorrow and every day after. This will happen because tomorrow, when I wake up, I will be Someone Who Works Out.”
      No.

      “Tomorrow, Barry will drag my ass to the gym. The day after that, Susie will. Then, Matt will. Before I know it, I’ll have gone to the gym 7 days in a row… despite not wanting to on one of those days. But then on the 8th day? It’ll be a little easier. By the 9th, I’ll actually want to go. Next thing I know, I’ll be someone who wants to go to the gym because I’ll have become that by actually being someone who goes to the gym.”
      Much more like that.

      We don’t like to admit this is how we work but it seems to be the case.

      We don’t say to little kids, “Be grateful!”
      We say, “Say ‘thank you’.” Even if they can’t really conceptualize gratitude. Done properly — with many other lessons along the way — they become grateful people. And even the ones who don’t are still halfway decent at saying ‘Thank you’.


      • I endorse what you are saying, and wish to take it a step further.

        Researchers have studied the question of “what can you do to make yourself happier over a reasonable period of time?” They call it an “intervention” and then they assess mood/happiness a day after, a week after, and a month after.

        They find that writing a thank-you note to someone else is one of the most powerful interventions in making the subject happier. Not getting the note, but writing it, makes people happier.

