    Entries in truth (3)

    Monday
    Oct 31, 2016

    Learn a new word: The Illusion of Truth

    Repeating something over and over and over again doesn't make it the truth.

    That seems like a pretty easy statement to understand, and with which to agree. I mean, we all get that, right? It doesn't matter what the statement is, or who is saying it again and again: the act of repeating it so many times doesn't change the fundamental nature of truth. I think we all learned that back in grade school.

    But here's the tricky part: even though we know, or think we know, that repeating something doesn't make it the truth, or at least closer to the truth, we are often easily deceived.

    And that brings us to today's 'Learn a new word' submission, especially interesting and relevant with Election Day in the USA bearing down upon us in the next week or so.

    Today's word is 'The Illusion of Truth'. Definition from our pals at Wikipedia:

    The illusory truth effect (also known as the truth effect or the illusion-of-truth effect) is the tendency to believe information to be correct after repeated exposure. This phenomenon was first discovered in 1977 at Villanova University and Temple University.

    This illusory truth effect, which has been known for a while, was recently replicated in a study titled 'Knowledge does not protect against illusory truth', published in 2015 in the Journal of Experimental Psychology.

    The study included two experiments. In the first, 40 participants were asked to rate how true a statement was on a six-point scale; in the second, a different set of 40 participants were asked simply to state whether a statement was true or false. In both cases, repetition made a statement more likely to be categorized as true. This was the case even for statements that contradict well-known facts, such as 'Barcelona is the capital of Spain' (when in fact Madrid is Spain's capital).

    Why were the participants in the study, and the rest of us too, more prone to believe a statement was true if we had heard it repeated over and over? According to the researchers, it is because trying to figure out whether new information is true is kind of hard, and requires more brain processing power than simply accepting it.

    From the above-mentioned study's summary:

    Research on the illusory truth effect demonstrates that repeated statements are easier to process, and subsequently perceived to be more truthful, than new statements. Contrary to prior suppositions, illusory truth effects occurred even when participants knew better. Participants demonstrated knowledge neglect, or the failure to rely on stored knowledge, in the face of fluent processing experiences.

    And this from the conclusion:

    Inferring truth from fluency often proves to be an accurate and cognitively inexpensive strategy, making it reasonable that people sometimes apply this heuristic without searching for knowledge. 

    Thinking about things is hard. It takes energy. Even doing simple fact-checking might be a bridge too far in many situations. 'Going along' with something largely because we have heard it many times before is always easier, and it is often a sound and harmless strategy.

    Except when it's not.

    So that's the trick then: knowing when to trust the process, if you will, and when to do your own research and draw your own conclusions. Gosh, that sounds like work.

    But be aware that we all are more susceptible to the illusion of truth effect than we may think.

    Happy Halloween!

    Tuesday
    Oct 2, 2012

    The Photographic Truthiness Effect

    Check out this recent finding, reported in a fascinating piece from the British Psychological Society blog, on the effect of combining images, even ones that offer the reader no specific information, with statements, and the resulting impact on people's perception of the 'truthiness' of said statements. (Image caption: Everything in this blog is true. Trust me. Here's an unrelated picture.)

    When we're making a snap judgement about a fact, the mere presence of an accompanying photograph makes us more likely to think it's true, even when the photo doesn't provide any evidence one way or the other. In the words of (researcher) Eryn Newman and her colleagues, uninformative photographs "inflate truthiness".

    The importance of imagery on the web is pretty much a given these days (witness the incredible growth and emerging impact of Pinterest and Instagram, both primarily visual platforms), as is the need for communicators and marketers of every kind to figure out how to use a combination of visual imagery, video, audio, text, etc. to help their messages stand out in a crowded market for attention. But this research takes the idea a step further, and possibly makes the communicator's job a little simpler: it really doesn't matter too much which images are included with the message, just that some kind of image is present.

    More from the BPS piece:

    Ninety-two students in New Zealand and a further 48 in Canada looked at dozens of statements about celebrities, some of whom they'd heard of and some they hadn't, such as "John Key is alive". As fast as they could, without compromising their accuracy, the students had to say whether each statement was true or not. Crucially, half the statements were accompanied by a photo of the relevant celebrity and half weren't. The take-home finding: the participants were more likely to say a statement was true if it was accompanied by a photo.

    Fantastic stuff - slap a semi-related image up alongside whatever statement you are trying to pass off as the 'truth', and bam - all of a sudden the 'truthiness' of whatever you are trying to sell is increased.

    Admittedly this is kind of a goofy story, but it raises an interesting question: how often are we being manipulated, even if subtly, by the mere existence of that well-placed image, or that perfectly Instagrammed and filtered shot that accompanies every other tweet, status update, or web page we see?

    How often are we being tricked into believing something that seems at first read to be wrong, or at least to be a little off, but we get distracted by a fancy image just long enough to lose focus and go along with whatever the savvy communicator wants us to think?

    How many blog posts have you read and thought, 'That is a clever picture. That writer really knows their stuff'?

     

    Friday
    Sep 24, 2010

    The ancillary benefit of being true

    As the US nears what is shaping up to be a contentious political campaign season, and the rhetoric, vitriol, and semantic arguments multiply (it depends on what you mean by 'is'), it can get next to impossible to know who is telling the truth and who is just pushing their agenda.

    And even the agenda pushers are not always easy to read.  They could be promoting themselves, some faceless political party, some corporate interests, or even a labor union.  In the current American political arena, the 'truth' is an elusive concept.  Last night I heard a pundit observe that his particular viewpoint on a hotly debated topic had 'the ancillary benefit of being true'. Cool, some ancillary truth to go along with the normal pile of drivel he will be shoving in your direction.

    The larger point is that every communication, in politics, at work, and even at home, has some kind of agenda behind it. We try to inform, persuade, educate, direct, etc. all the time. Sure, most of us (I hope) are not constantly trying to outmaneuver our rivals at work, or to promote some kind of worldview that may or may not be based in truth or what's 'right'.

    Aside - not everyone who disagrees with you is 'dangerous', 'radical', or some kind of threat to order and security.  Smart people can disagree.  Get over it.

    But still, I think it is a good reminder, and it was kind of refreshing of this pundit to so blatantly call out the fact that sorting out how much of what he spouts is actually 'true' is a challenging proposition. In the political arena, where side-taking dominates and colors perceptions of the truth more than anything else, it is perhaps not so difficult to come to a conclusion. At work, when managing and trying to lead teams and individuals, it is maybe not so simple.

    As an employee, it may not be easy to know if what's being fed to you is a lie, the truth, or just something in between that has 'the ancillary benefit of being true'.

    As a manager or leader how much 'ancillary truth' are you sharing today?