The answer to this question has three aspects:
- why people come to believe in the effectiveness of alternative medicine or in the claim that vaccines cause autism;
- the problems of myth fighting in general;
- why discussion as a format of communication is not very conducive to changing opinions.
The first point is brief. First, why not believe it: the brain is built so that any incoming information is initially accepted as true if it does not contradict existing knowledge; it is revised only when there is a reason to. So if claims about alternative medicine reach the brain first, they have an edge. Second, which evidence is trustworthy and which is not is far from obvious to someone who has never looked into the topic specifically. To understand why a double-blind experiment deserves more trust than personal experience, you need to know what accounts for the reliability of the former and the unreliability of the latter. And intuitively, personal experience feels much more real and weighty than the chatter of some academics who are out of touch with reality.
There is an excellent book about the causes of irrational beliefs:
Michael Shermer, "The Believing Brain: From Ghosts and Gods to Politics and Conspiracies - How We Construct Beliefs and Reinforce Them as Truths"
Now I will go into the second aspect in detail and give some tips. The third I will touch on only briefly: Socratic dialogue works better than an ordinary discussion in which people take turns arguing for their own point of view.
In science communication there is a term, the "information deficit model": the idea that people believe in myths simply because no one has told them how things really are. This model is wrong.
A straightforward attempt at enlightenment can inadvertently backfire and strengthen the interlocutor's wrong picture of the world. There are several psychological effects to keep in mind to prevent this from happening.
They are described in a guide for debunkers (John Cook, Stephan Lewandowsky, "The Debunking Handbook").
The Familiarity Backfire Effect
(The backfire effect due to familiarity)
The more often we encounter a statement, the more willingly we believe it. This happens automatically, even if the information is marked as false: "Acupuncture is not used for pain relief during operations." When the memory fades, what remains is something about acupuncture and pain relief during surgery. Without meaning to, we have played on the side of the myth.
To test this idea, in one experiment people were given a booklet refuting common myths about the flu vaccine. Immediately after reading it, people successfully distinguished the myths from the facts, but after about half an hour some (fortunately, not all) participants showed worse results than before reading. Most likely, a month later the effect would be even more pronounced.
What to do?
Don't mention the myth. Seriously. If that is not possible, focus on the facts: mention them first, highlight them, put them front and center, so that the familiarity effect works for the truthful statements. For example: "Only pharmacological anesthesia is effective for pain relief during surgery."
This is especially important for information a person will only glance at without really focusing: headlines, posters and booklets. In the middle of a book, when you have already captured the reader's attention, it is not as risky to dwell on the myth in more detail.
The Overkill Backfire Effect
(The backfire effect due to an overabundance of information)
People often believe that the more arguments, the more convincing the position, and so they shower the interlocutor with facts and links. The problem is that if the information seems too complex, the brain prefers not to engage with it at all and returns to the simple (and therefore cognitively attractive) myth.
What to do?
Keep it simple. Don't list ten arguments; leave three. Use simple phrases and pictures for clarity. When writing an article, it helps to offer several levels of detail so readers can choose how deep they want to dive.
The Worldview Backfire Effect
(The backfire effect due to the threat to the worldview)
People seek out, notice and interpret information in a biased way, so that it confirms their existing beliefs. This is especially true of beliefs that are emotionally significant and tied to identity. In principle, nothing can be done about this: it is a side effect of how our thinking is built.
So it turns out that if we give two convinced representatives of opposite camps the same balanced set of evidence, the disagreement can grow stronger (the "polarization effect"). Natural double standards are at work: the brain accepts arguments in favor of its own point of view freely, while picking apart the other side's arguments and finding fault with every detail ("motivated skepticism", "disconfirmation bias").
And if you give die-hard believers a set of facts that contradict their picture of the world, they will defend themselves by actively searching their memory for confirmation of their position. This can end with the conviction only growing stronger.
If we want to influence society as a whole, we should understand that unconditional adherents of any idea are not that numerous, and they can be left alone. It is much more effective to focus your efforts on the neutral and the doubting.
If it is important for you to convince a specific person, try Socratic dialogue: open questions instead of statements. Why do you think so? What evidence could change your opinion? If this particular argument turned out to be untrue, would you change your position, or are there more serious reasons behind the belief? What is the main reason you hold it? Assert nothing, and give no fodder to cognitive distortions and motivated skepticism. Find out how ready the interlocutor is, in principle, to change his mind (if he says outright that he is not, be glad you did not waste time on an argument). Encourage him to think his position over and reassess it, and to look for contradictions in his reasoning. Focus on what he considers most important: these are rarely the first arguments voiced, so you need to ask. Only move on to the facts after you have listened to the person and made sure he is willing to accept the evidence. There is a technique for this kind of Socratic dialogue about beliefs, and a movement of people who practice it - Street Epistemology.
Another important point: experiments show that when a participant is first asked to recall a source of pride (a situation in which they acted according to their values), immediately afterwards they evaluate information that threatens their worldview more objectively.
One can cautiously conclude from this that if a person feels good and strong, a discussion with them will be more productive. So, surprisingly enough, politeness and respect help.
The need to fill the gap
Let's say you've debunked a myth. Now there is a gap in the picture of the world: a vacuum where there used to be a coherent story. When consciousness needs to patch this hole, the patch will be the closest available explanation - the very myth that seemed to have been discarded.
In one experiment, participants read a description of a warehouse fire. The text mentioned oil paint and cans of gasoline. It was then clarified that these were not actually present during the fire. Even when people remembered and understood the correction, they still fell back on it: asked "Why do you think there was so much smoke?", they usually answered that it was the oil paint.
Fortunately, when they were offered an alternative explanation for the cause of the fire - lighter fluid - there were fewer answers about the paint.
What to do?
Provide a clear alternative explanation: why the myth is wrong, why people might spread it (if the motives are clear), and what is actually going on.
Links to sources can be found in the original booklet. Good luck in your fight!
Some people exaggerate the risk of real side effects and underestimate the benefits because they cannot operate with a mathematical concept like probability.
Others are driven by ignorance and an inner readiness to believe any unscientific nonsense, up to and including conspiracy theories.
Still others have heard about tragic consequences of vaccination and, even realizing that the percentage of such cases is negligible, do not want to end up among the tenths or hundredths of a percent of tragic outcomes.
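The point about innumeracy can be made concrete with a toy calculation. All the numbers below are invented purely for illustration (they are not real vaccine or disease statistics); the only thing the sketch shows is how multiplying probabilities changes the picture:

```python
# Toy comparison of two risks, with deliberately made-up numbers.
# None of these values are real statistics.
p_side_effect = 1e-6    # hypothetical: serious vaccine side effect
p_infection = 0.10      # hypothetical: chance of catching the disease
p_complication = 0.01   # hypothetical: serious complication if infected

risk_vaccinated = p_side_effect
risk_unvaccinated = p_infection * p_complication

# With these invented numbers, skipping the vaccine is about
# a thousand times riskier - yet the rare side effect is what
# sticks in the mind, because it makes for a vivid story.
print(risk_unvaccinated / risk_vaccinated)
```

The comparison only works if you multiply the conditional probabilities instead of reacting to the most vivid single outcome - which is exactly the step intuition tends to skip.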