Discussion Reference

Overview

I think it's important to argue online with folks that my experience tells me are incorrect. It serves two purposes: it allows me to see when and where I am wrong, and it allows readers who come across the argument to learn from the discussion.

Too often, I see arguments end with the party making the better argument abandoning the effort because they feel they aren't getting anywhere. This is unfortunate, because the party with the weaker argument appears to have won: the opponent seems to have conceded through silence.

In the cases where the party with the better argument remains engaged, the situation often devolves into ad hominem attacks. This is also unfortunate.

My goal is not to win over my opponent and trigger some kind of epiphany. Psychology shows us that the human mind is far too well-equipped at preventing exactly that. With that in mind, I argue for the audience, knowing full well that I will never win in my opponent's eyes. I only wish to address each point of each argument in turn, so that no one reading the exchange will be left without both sides to consider.

This page serves two purposes: first, as a memory aid, keeping my thoughts organized and my references up to date; and second, as a place I can quickly link to, so that I can reference my responses to common argument points without typing them out over and over.

Science is difficult. It takes one sentence to point out a "hole" in science to a layman, and several paragraphs to explain why that hole is not a hole at all (see the "Gish Gallop," below). This page is my solution to that difficulty.

Skeptic's Toolbox

Topics

Science's "Rights" and "Wrongs"

I hear over and over again about how science is always getting things wrong, and how that should be a lesson to us about whether or not we should listen to what it's saying now.

At its heart, I totally agree with what is being said here. We should always be critical of the information we receive, and take everything with a grain of salt. However, that grain of salt is usually best described as a grain. Not a block of salt the size of an iceberg.

In a nutshell, the vast majority of science is done well, adhering to strict procedures designed to respect the scientific method. It hasn't gotten things "wrong" in the past, as many of the folks I argue with like to suggest. What has happened is that, with the passage of time, as more information has been discovered, it has become "more right."

My favorite example is Newton's Laws of Motion. When Newton wrote them down, they described the nature of motion, as he could measure it, essentially perfectly. In fact, without some serious effort, we still can't detect the errors in Newton's Laws of Motion in everyday life. They weren't, and still aren't, wrong, inasmuch as they work perfectly well for the applications in which most people use them.

When Einstein came along and published Special and General Relativity, he added corrections to Newton's Laws of Motion. In everyday applications at human-scale speeds (including the fastest jets on the planet), these corrections are insignificant. But once you get up close to the speed of light, they totally rule the outcomes of the equations.
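
To put rough numbers on how small those corrections are at everyday speeds, here is a minimal sketch in Python (the speeds chosen are my own illustrative assumptions, nothing more) that computes the Lorentz factor, the multiplier that governs relativistic effects such as time dilation:

  import math

  C = 299_792_458.0  # speed of light, in m/s

  def lorentz_factor(v):
      """Return gamma = 1 / sqrt(1 - v^2/c^2); gamma = 1 means Newton is exact."""
      return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

  fast_jet = 1000.0       # ~1 km/s, faster than any jet in level flight (illustrative)
  near_light = 0.9 * C    # 90% of the speed of light

  print(lorentz_factor(fast_jet))    # ~1.000000000006 -- indistinguishable from Newton
  print(lorentz_factor(near_light))  # ~2.29 -- the relativistic correction dominates

At jet speeds the factor differs from 1 by a few parts per trillion, which is why Newton still "works"; at 90% of the speed of light it more than doubles, and Newton alone gets the answer badly wrong.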

Does this mean Newton was wrong? Only in the purest sense: he wasn't perfectly correct. And sure, we can wonder whether Einstein gave us the complete picture. Honestly, it seems likely to me that he didn't. But what is the point of science? If it's truth you're looking for, look elsewhere. What science gives us is understanding. The explanations science delivers help us understand the world, and do so more deeply every day.

To wrap it up: if, thanks to science, I come to understand something so much more deeply today that my understanding of yesterday could only be described as wrong, even then I have science to thank for my continually increasing understanding. I'll not dismiss science because it didn't give me perfect knowledge from the get-go; it would be facile to expect that. It's a process, and it works, slowly but surely leading us into understanding, and out of our mistakes.

Pluto

"...I mean, science can't even decide how many planets there are!"

An Appeal to Ridicule. The arguer is implying that science can't even count.

This is usually part of a Gish Gallop, and requires a lengthy discussion about the definition of what a planet is, and why:

In 2006, the International Astronomical Union (IAU) adopted this definition: a planet is a body that orbits the Sun, is massive enough for its own gravity to make it round, and has "cleared its neighborhood" of smaller objects around its orbit. It did this to rule out the many objects orbiting the Sun that would have to be called planets under any prior definition. It was a decision based upon a desire for simplicity: either adopt this definition, and subsequently redefine Pluto as a "dwarf planet," or define all the newly discovered objects out there as planets as well. Such as Eris, which is actually more massive than Pluto.

So, which makes more sense, go from 9 planets to 13 (and counting), or go from 9 planets to 8 and stay there? It should be noted the estimated number of undiscovered dwarf planets ranges from 200 to 10,000, depending on how far out you want to count as being part of the solar system.

Astronomers have now chosen a tidy and specific definition for planets that gives us 8 of them. In the unlikely event that astronomers discover a new object that fits the definition, I'm sure people will complain that science is stupid, but I hope I've shown that the issue here was not one of being unable to count. It's one of taking a poorly defined term ("planet") and defining it in a more exact way. If anything, it's a win for science, in that it is improving the quality of its ability to communicate, by increasing its use of strongly defined terms.

Life Doesn't Spontaneously Generate In A Vacuum

Creationists: Life was created by God.

Scientists: There's no evidence of that. We're going to look for a way it can happen using only the chemistry of the early Earth, as defined by what we can observe.

Creationists: You haven't found it. See? We're right.

Scientists: Seriously?

This is a most obvious Argument from Ignorance. Just because a claim hasn't been proven true doesn't mean it's false, and just because a claim hasn't been proven false doesn't mean it's true.

Creationists often cite science's inability to reproduce life in a lab using only the chemistry of the early Earth as yet another reason science doesn't have all the answers. In this, I agree. Science doesn't have all the answers. And it never will; there are plenty of questions science is ill-suited to answer. But the question of how this highly significant link between chemistry and biology initially came into being is not one of them. Science is very well suited to figuring this one out.

Just because we don't have a scientific answer to this question doesn't mean we never will. But in the meantime, there is no rational way to argue that science's lack of a strong Origin of Life (OOL) Theory means that either 1.) science won't find the answer, or, 2.) creationism is correct.

And it's not like science is dead in the water on this. Just this month, Bhavesh H. Patel, Claudia Percivalle, Dougal J. Ritson, Colm D. Duffy and John D. Sutherland published a paper that gets us a step closer to the answer: "Common origins of RNA, protein and lipid precursors in a cyanosulfidic protometabolism" (Nature Chemistry, 16 March 2015, DOI: 10.1038/NCHEM.2202). The findings in this paper alone are not enough to give rise to an OOL Theory, but it's damned close. I'm not qualified to say, "Hey, we're like XX% of the way to a step-by-step explanation of how it can happen," but the fact that the pieces of the puzzle keep coming together is very encouraging. And bad news for Creationists.

Scientists: Hey look, we figured it out. Follow these steps to create these early Earth conditions, and watch the chemistry turn into biology!

Creationists: Doesn't mean it happened. We weren't there.

Scientists: You're moving the goalposts. In any case, here's life spontaneously generating in the lab. Here's the evidence, the explanation, and the well-substantiated, plausible answer. Moving on.

The Flat Earth

We, including scientists, used to think the Earth was flat! How can we think we know anything when we get such an obvious thing wrong?

First off, this is Chronological Snobbery.

But the issue with this claim is this: What does it mean to know something? Epistemology (the study of knowledge) is a complicated subject, but a reasonable definition of knowledge is that it is a "justified, true belief."

When we "knew" the Earth was flat, we made observations that suggested this was a justified, true belief. It turns out our observations were shallow, and thus our justification was poor, but we didn't know that... and it didn't matter to us. When we thought the Earth was flat, we either couldn't measure its curvature, or, during the period after we had developed the tools to measure it but before we bothered to use them, we didn't, because treating the Earth as flat didn't cause us any problems.

We weren't backward dopes; we just didn't have any evidence that forced us to pursue a better answer. The same holds true today. Does that mean we could still be wrong? Sure, but again, it doesn't matter right now. The theory we currently have (that the Earth is an oblate spheroid) serves us well.

And that's a fundamental difference between scientists (who are looking for useful answers that make sense by observing the world) and folks who just want some truth to believe, and aren't concerned with determining if their belief is justified (or actually true).

"The Evidence In The Bible Does Not Change"

Can't Prove/Disprove/Reproduce "Historical Events"

Evolution

Macroevolution / Microevolution / 1980 Meeting on Macroevolution

"Only a Theory"

Claim: "It's the theory of evolution, it's just an unproven assumption."

The fallacy here is equivocation. This claim equivocates between the two differing meanings of the word "theory" (both definitions appear under Definitions, below). Saying the Theory of Evolution is "just" a theory and an unproven assumption contradicts the definition of the word theory that is actually used in the term "Theory of Evolution."

"No supporting Facts Intermediate Species, No Present Day Creatures In Any State of Evolving

Vaccine Adverse Event Reporting System (VAERS)

The Bible As An Accurate/Reliable Historical Document

Intelligent Design is not a Science

https://en.wikipedia.org/wiki/Kitzmiller_v._Dover_Area_School_District

Definitions

Theory

Scientific Theory

A scientific theory is a well-substantiated explanation of some aspect of the natural world that is acquired through the scientific method and repeatedly tested and confirmed through observation and experimentation.

Common Parlance Theory

A theory is an idea or thought that might explain something.

Like, "I have a theory that the Flying Spaghetti Monster doesn't like purple. I mean, why else would the Vikings not have a Super Bowl win under their belts yet?"

Evidence

Broadly, evidence is anything presented in support of an assertion. This support may be strong or weak. The strongest type of evidence is that which provides direct proof of the truth of an assertion. At the other extreme is evidence that is merely consistent with an assertion but does not rule out other, contradictory assertions, as in circumstantial evidence.

Scientific Evidence

Scientific evidence consists of observations and experimental results that serve to support, refute, or modify a scientific hypothesis or theory, when collected and interpreted in accordance with the scientific method.

Circumstantial Evidence

Evidence that relies on an inference to connect it to a conclusion of fact. For example, a fingerprint on a gun is circumstantial evidence in a shooting. It does not prove the suspect shot the victim. It only proves the suspect's finger was, at some point in time, placed on the gun.

Anecdotal Evidence

Everyone's favorite evidence. "I heard that chicken farm chickens lay 100 eggs an hour!" To be taken with a grain of salt.

More common is anecdotal evidence that arises from a specific event that aligns with the arguer's proposition and is then applied in a general manner. "My cousin got a flu vaccine and then went into anaphylactic shock. Flu vaccines are dangerous!" This is the fallacy of the lonely fact, and it flies in the face of the way rational risk decisions are made: by weighing the preponderance of evidence regarding risk and reward.

Argument Fallacies

Appeal to Ridicule

A Red Herring.

An argument is made by presenting the opponent's argument in a way that makes it appear ridiculous.

Argumentum Ad Hominem

A Red Herring.

The evasion of the actual topic by directing an attack at your opponent: responding to arguments by attacking a person's character rather than the content of their arguments.

Argument from Ignorance

Asserts that a proposition is true because it has not yet been proven false (or vice versa!). This represents a type of false dichotomy in that it excludes other options:

  • That there is insufficient investigation and therefore insufficient information to prove the proposition satisfactorily to be either true or false.
  • That the proposition is unprovable or unknowable.

Argument from Silence

Where the conclusion is based on the absence of evidence, rather than the existence of evidence. This differs from Argument from Ignorance in a subtle but important way: Whereas Argument from Ignorance is about making assumptions when evidence is absent, Argument from Silence is about making assumptions when evidence is omitted.

I.e., "So and so should have addressed (some salient proposition), but because so and so did not, it must be (whatever conclusion the arguer is making)."

E.g., Claim: "There are specific historical records of the plays that were performed at court during the reign of Henry VII. The lack of references to the performance of plays by Nicholas Udall (such as The Respublica) suggests that these plays were not performed."

Just because evidence is missing doesn't mean the proposition such evidence would support is false. It just means that the proposition hasn't been shown to be true. In this case, there are records of payment from the court to Udall for items related to his plays that suggest his plays were performed. The likely case is that the court records omitted those performances. A cursory search on Udall suggests why the court may have done this.

Argument to Moderation

Also known as false compromise, middle ground, fallacy of the mean, argumentum ad temperantiam, and balance fallacy.

The arguer assumes that the compromise between two positions is always correct. This fallacy's opposite is the false dilemma argument.

An individual operating within the false compromise fallacy believes that the positions being considered represent extremes of a continuum of opinions, and that such extremes are always wrong, and the middle ground is always correct.

Solving problems by making compromises readies us to make this mistake; however, there are situations in which, given two propositions, only one is acceptable, and no part of the other is.

For example, when given the choice between listening to a neurosurgeon's input on how to deal with a seizure disorder and listening to a trepanner's diatribe about cutting a hole in your head to let the demons out, there is no part of the trepanner's input that is valuable, and any compromise between the two practitioners in determining your treatment would only detract from listening to the neurosurgeon alone.

Begging The Question

Providing what is essentially the conclusion of the argument as a premise. The phrase is often used loosely to mean that some proposition obviously raises another question that should be asked first, but the fallacy of Begging the Question is more general: it is not limited to arguments in which an unstated premise is essential to the conclusion.

It's a form of circular reasoning, in which the conclusion that one is attempting to prove is a premise of the argument, often in an indirect way that conceals this fact.

E.g., it is not difficult to logically prove an omniscient and omnipotent God's existence. However, it requires that one first believe him to be omniscient and omnipotent, which requires one to first believe him to... well, be.

Chronological Snobbery

Where a thesis is deemed incorrect because it was commonly held when something else, clearly false, was also commonly held. A simple example: "Folks in the 17th century believed alchemy to be possible and saw no reason for surgeons to wash their hands, so they didn't know anything. So Nicholas Steno's proposal (made during this time of idiots) that fossils are organic remains embedded in layers of sediment (the basis of stratigraphy) must also be wrong."

Continuum Fallacy

Also known as: fallacy of the beard, line-drawing fallacy, sorites fallacy, fallacy of the heap, bald man fallacy. In this case, one improperly rejects a claim for being imprecise.

Put another way, just because a claim is not as precise as you would like it to be does not mean the claim is false. Vagueness alone does not imply invalidity.

A common example is in this paradox: Fred is clean-shaven now. If a person has no beard, one more day of growth will not cause them to suddenly have a beard. Therefore Fred can never grow a beard. The arguer will claim that since this is absurd, then the statement, "If a person has no beard, one more day of growth will not cause them to suddenly have a beard," must be false.

However, it is clearly true, though it is vague. The point the arguer is usually making is that changes in a continuous quantity (length of facial hair) cannot give rise to changes in quality (has beard/has no beard). However, they clearly can; assuming Fred is capable of growing facial hair, he can most certainly grow a beard.

The idea that a line cannot be drawn between qualities based upon a quantity of continuous measurement is fallacious.

Correlation Proves Causation

In this fallacy, the argument states that because two things are correlated, there must be a causal relationship between them. The fact is that there may be a causal relationship, but the correlation does not prove this. Correlation is no more than a statistical comparison between two data sets. No relationship needs to exist between them in order for them to be similar.

Visit this link if you want to see plenty of examples of how drawing conclusions from correlation alone will probably get you nowhere: Spurious Correlations. I just visited it, and for a taste, here's the first example: "US spending on science, space, and technology" correlates with "Suicides by hanging, strangulation and suffocation." Strength of correlation: 0.992082 (perfect correlation is represented by the number 1).
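
You can check this kind of arithmetic yourself. Here is a minimal sketch in Python (the two series below are made-up, illustrative numbers of my own, not the site's actual data) showing that two unrelated series that merely drift in the same direction will produce a correlation coefficient near 1:

  import numpy as np

  # Two made-up yearly series that both happen to trend upward.
  # Purely illustrative numbers -- not real data.
  spending = np.array([18.1, 18.6, 19.4, 20.4, 21.6, 23.0, 23.6, 25.0])
  suicides = np.array([5427, 5688, 6198, 6462, 6635, 7336, 7248, 7491])

  r = np.corrcoef(spending, suicides)[0, 1]
  print(f"Pearson correlation: {r:.3f}")  # ~0.97, yet no causal link exists

Quantities that simply trend together over time will correlate strongly no matter how unrelated they are, which is exactly why correlation alone tells us nothing about causation.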

For a less obvious example, consider this: In a widely studied case, numerous epidemiological studies showed that women who were taking combined hormone replacement therapy (HRT) also had a lower-than-average incidence of coronary heart disease (CHD), leading doctors to propose that HRT was protective against CHD. But randomized controlled trials showed that HRT caused a small but statistically significant increase in risk of CHD. Re-analysis of the data from the epidemiological studies showed that women undertaking HRT were more likely to be from higher socio-economic groups (ABC1), with better-than-average diet and exercise regimens. The use of HRT and decreased incidence of coronary heart disease were coincident effects of a common cause (i.e. the benefits associated with a higher socioeconomic status), rather than cause and effect, as had been supposed. ( Lawlor DA, Davey Smith G, Ebrahim S (June 2004). "Commentary: the hormone replacement-coronary heart disease conundrum: is this the death of observational epidemiology?". Int J Epidemiol 33 (3): 464–7.)

Also known as "cum hoc, ergo propter hoc," "with this, therefore because of this."

Not to be confused with "post hoc, ergo propter hoc" ("after this, therefore because of this"), which assumes that because one event occurred before another, the first must have caused the second. That fallacy is not about a false link between correlation and causation; it is about the error of inferring causation from mere temporal order.

Equivocation

Occurs when the arguer uses a word or phrase in two (or more) different senses within an argument while making it appear to have the same meaning throughout.

Simple example: Nothing is better than eternal happiness. A ham sandwich is better than nothing. So, a ham sandwich is better than eternal happiness.

Here, the arguer is using the word nothing in two very different ways. The second statement is omitting a very important word: "having." That omission creates the fallacy here.

Another example: A feather is light. What is light cannot be dark. Therefore, a feather cannot be dark.

Here the arguer is switching the meaning of light from light vs. heavy to light vs. dark.

Similar to hedging, in which the arguer, when shown an error in one of his propositions, claims a different meaning for one or more of his words, transforming his proposition into a new one that can be claimed to be true and thereby escaping the error that was shown. Hedging, when done well, is a form of equivocation. When done poorly, it's called backpedaling. Either way, it's a fallacy.

False Attribution

An advocate appeals to an irrelevant, unqualified, unidentified, biased or fabricated source in support of an argument.

False Dilemma

Also known as a false dichotomy, fallacy of bifurcation, or a black-or-white fallacy.

Used when the arguer presents two alternative statements as the only possible options, when in reality there are more.

Note that this is not just being given a set of options; it's the implication that the only choices are those in the set given.

Example: Either evolutionists have it right, or creationists do. Since evolutionists have made mistakes, creationists must be right.

The false dilemma is in that they could both be wrong.

Note that "since evolutionists have made mistakes" also uses the fallacy of hasty generalization: that since evolutionists have gotten some things wrong, they must be getting everything wrong.

False Equivalence

Describing a situation of logical and apparent equivalence, when in fact there is none.

Simply put, this is claiming that two things are the same because of details they share. The shared details may be real, but the conclusion does not necessarily follow.

"Cats and dogs both have fur and give live birth to offspring. So they're both mammals." The details are true, and the conclusion is true.

"Cats and dogs are both cuddly and friendly pets. So they're the same." The details are true, but the conclusion is only true for a terrifically ignorant prospective pet owner.

Fallacy of Composition / Fallacy of Division

Fallacy of composition – assuming that something true of part of a whole must also be true of the whole.

E.g., All cells are aquatic. Thus everything composed of cells is also aquatic.

Fallacy of division – assuming that something true of a thing must also be true of all or some of its parts.

E.g., Passenger jets can fly me over the ocean. Passenger jets have reclining seats. Thus reclining seats can fly me over the ocean.

These are both very specific forms of the Inductive Fallacy.

Fallacy of Many Questions

Also known as a complex question, fallacy of presupposition, loaded question, plurium interrogationum.

Someone asks a question that presupposes something that has not been proven or accepted by all the people involved. This fallacy is often used rhetorically, so that the question limits direct replies to those that serve the questioner's agenda.

The traditional example is, "Have you stopped beating your wife?" Whether answered with "yes" or "no," the responder has admitted to beating his wife.

This is often a rhetorical tool in which many questions are asked, each implying the arguer's desired answers or propositions, and where the responder is not afforded the opportunity to address each implication presented, only the last one, which is designed to trap the respondent into admitting one or more propositions he or she does not wish or intend to admit.

Fallacy of The Single Cause

It is assumed that there is one, simple cause of an outcome when in reality it may have been caused by a number of only jointly sufficient causes.

E.g., "Erik struggles in his math classes and he failed his last exam. This is because he is poor at math." The effects stated in the first sentence may easily be true with the conclusion being false if one considers that Erik is dyslexic. He may well *also* be poor at math, but his failure to perform doesn't necessarily show this.

Or, "The Earth is always in a cycle of ice ages and warm periods. This explains global warming." In fact, there are a multitude of contributing factors, of which global cycles are only one part. Whether or not anthropogenic global warming exists is not addressed by showing that another cause also exists, unless that cause is also shown to be sufficient to cause the effect on its own.

Historian's Fallacy

Occurs when one assumes that decision makers of the past viewed events from the same perspective, and had the same information, as those subsequently analyzing the decision.

Consider that Newton's Laws of Motion are useless at relativistic speeds (i.e., close to the speed of light). The error that Einstein's relativistic corrections account for exists at more normal speeds as well, though there it is infinitesimal. It is still there, however, and without that correction Newton's Laws alone don't give the "truest" answer we can currently attain by adding Einstein's terms. The result of this logic is that Newton's Laws of Motion are wrong, in the sense that they are incomplete. The Historian's Fallacy would blame this error on Newton, even though Newton had no perspective or information to suggest to him that he was incorrect.

Inductive Fallacy / Cherry Picking / Hasty Generalization

Also known as: fallacy of insufficient statistics, fallacy of insufficient sample, fallacy of the lonely fact, leaping to a conclusion, hasty induction, secundum quid, converse accident.

One of the most common fallacies used. Occurs when the arguer bases a broad conclusion on a small sample.

Examples:

  • In the USA, African Americans are not oppressed and racism is a thing of the past, because the USA has a black president. (Try telling that to the residents of Ferguson, Missouri, especially considering the Department of Justice's findings of a culture of racial discrimination in the PD.)
  • Vaccinations have caused horrible adverse reactions. So, vaccinations are unsafe and should be severely limited or banned. (The risk/reward in this scenario has been hijacked, with the risk massively inflated above, and the reward massively diminished below, the levels where Scientific Evidence shows them to be).

This fallacy is the opposite of Slothful Induction.

Inflation of Conflict

Where the arguer asserts that since the experts in a field of knowledge disagree on a certain point, the scholars must know nothing, and therefore the legitimacy of the entire field is called into question.

Example: There is a great deal of disagreement on which technology to pursue in the next generation of electrical batteries. Inflating this conflict would suggest that the field of battery technology is illegitimate. However, if that were the case, I'm pretty sure my phone, the laptop I'm using, and my wife's old Prius would all be useless. Put another way (and avoiding anecdotal evidence), there is a preponderance of evidence that batteries are quite useful.

Ignoratio Elenchi

Also known as: irrelevant conclusion, missing the point.

An argument that may in itself be valid, but does not address the issue in question.

Example: Vaccinations can hurt people. Hurting people is bad. Anyone who supports vaccination schedules or requirements is either intentionally or unwittingly supporting hurting people, and is thus intentionally or unwittingly a bad person.

Nothing in these statements fails to follow the rules of logic, but the argument misses the point, which is that the reward gained in preventing disease outweighs the very small risk that any given vaccination might hurt a person. Supporters of vaccinations accept that some may be harmed for the greater good, provided that the harm is minimal and constantly revisited to assure it remains minimal. Suggesting that supporters are supporting the harm done to the few who are harmed is the fallacy. In truth, supporters are supporting the health of those who are spared disease.

In contrast, it is not a fallacy of missing the point to suggest that anti-vaccination proponents are unwittingly supporting hurting people. The result of their success would be a rampant rise in some horrible diseases, and far more suffering than the few vaccination injuries we currently bear in order to avoid such an awful future.

Kettle Logic

Using multiple, jointly inconsistent arguments to defend a position.

The name derives from a story Sigmund Freud relates in his writing on dreams, in which a man, accused of having returned a borrowed kettle to a neighbor in a damaged state, defends himself with these arguments:

  1. That he had returned the kettle undamaged;
  2. That it was already damaged when he borrowed it;
  3. That he had never borrowed it in the first place.

An example from the real world:

  1. The bible is the perfect work of God, given to man;
  2. The bible is a collection of works written by men, from personal and eyewitness accounts;
  3. The bible is an accurate, reliable historical record;
  4. The historical inconsistencies of detail in the bible are irrelevant; the point is in its general facts, or in its moral and ethical lessons.

Moral High Ground

In which one assumes a "holier-than-thou" attitude in an attempt to make oneself look good to win an argument.

Moving The Goalposts

Also known as: shifting sands, raising the bar.

Where an arguer, upon being provided with contrary evidence addressing a specific claim, will dismiss the evidence and demand different (often greater) evidence in order to advance his position.

Example: A creationist writes a book that advances his position using, among other things, some highly questionable paleontology. An expert in the field of paleontology writes a scathing review of the book, addressing its errors in the field in which he is an expert. Respondents ignore the paleontology, and demand to know why he hasn't addressed the claims the book makes about early embryological development or epigenetics. He didn't address those issues because he is not an expert in the field, but the respondents have shifted the goalpost to make his "score" seem like a miss: "You didn't address those things in your review because they're right, and thus the book is right." (ref. Donald Prothero's review of Stephen Meyer's book, Darwin’s Doubt)

Non Sequiturs

Latin: "It does not follow." A fallacy that is an error in logic that can be seen in the argument's form. The conclusion could be either true or false, but the argument is fallacious because there is a disconnection between the premise and the conclusion.

It comes in many complex forms.

Post Hoc Ergo Propter Hoc

Latin: "After this, therefore because of this."

Where the arguer asserts a causal relationship between two events solely because one occurred before the other.

E.g., "I never go to sleep before plugging my phone into its charger. Therefore, charging my phone must make me sleepy."

Gish Gallop

Also known as: proof by verbosity, shotgun argumentation.

Where the arguer submits an argument too complex and/or verbose to reasonably deal with in all its intimate details.

Named for Young Earth Creationist Duane Gish, for his propensity to emit a barrage of erroneous arguments on a variety of topics in short order, and then ignore objections raised by opponents.

It is often this dynamic that frustrates proponents of science: you are confronted with someone who has a bag of ten weakly formed holes to poke in your arguments and can spew them in rapid succession. While you may be perfectly capable of addressing and debunking each hole individually, it is likely that anyone listening, including your opponent, will either lose interest in your long-winded explanations or use some technique to move on before you've had the chance to finish. It then appears that the eight or nine holes left unaddressed are valid inconsistencies in your position.

Effective counters include:

  1. Summarily dismiss the entire diatribe, and demand that only one topic be addressed at a time, or
  2. Demand that evidence be provided for the claims made

If you are in an online debate and choose to take the time to address each item in turn, do so with great succinctness, and format the response so that each argument is clearly distinct from the others.

It is also worth pointing out that your opponent has committed this fallacy, and that its use shows their desperation.

Red Herring

A speaker attempts to distract the audience from the topic at hand by introducing a separate argument the speaker believes is easier to speak to.

A most disturbing example is the Reductio ad Hitlerum, where the arguer compares his opponent or his opponent's position to Hitler or to Nazi positions.

When someone suggests that your support of vaccination schedules is akin to Nazi practices, this is a red herring. They can go on all day about Nazi this and Nazi that, but it has nothing to do with the discussion at hand. While it may be tempting to show them where their knowledge of what Nazis actually did is utterly laughable, it's better to refrain. Don't take the bait, and gently redirect the discussion back to reality.

Reductio ad Hitlerum

A Red Herring.

Reductio ad Hitlerum is a form of argumentum ad hominem, a fallacy of irrelevance, in which an opponent's position is rejected solely because Hitler or the Nazis also held it (or something like it), rather than on its current merits. The suggested rationale is one of guilt by association.

Reification

A fallacy of ambiguity, when an abstraction (abstract belief or hypothetical construct) is treated as if it were a concrete, real event or physical entity. In other words, it is the error of treating as a "real thing" something that is not a real thing, but merely an idea.

An example is the idea that electricity in a wire behaves like water in a pipe. It's a good metaphor for initial introductions into the study of electricity, but it fails badly soon after. It is (initially) useful to think of electricity behaving like water in a pipe, but it most certainly is not. To treat it so is reification.

Shifting The Burden of Proof

Shifting the burden of proof is a particular case of the Argument from Ignorance fallacy. Here, the arguer attempts to shift the "onus probandi" onto the person opposed to the original assertion.

Onus probandi is the "burden of proof." The burden of proving a claim lies with the party making the claim; demanding that the skeptic disprove it instead is a form of Argument from Ignorance.

Example: Jon claims X is true, Sally disagrees, and Jon says, "Prove it."

This can go two ways:

  1. Sally argues that the onus probandi is on him to show that X is true, saying "I do not believe X to be true. You must prove it to me." The onus probandi falls to the party making the claim.
  2. Sally instead says, "I believe X to be false." She has now made a positive claim, and the onus probandi is on her to show that X is false.

To hammer it home, I do not believe Earth was created 6,000 years ago. There is not sufficient evidence to suggest such a thing. I do not, however, believe this claim to be impossible; compelling evidence may arise (e.g., God returns to Earth and tells us he did it), and so I do not make the claim that this proposition is impossible. Thus the onus of proof remains with the party claiming that Earth was created 6,000 years ago.

Slothful Induction

Also known as: appeal to coincidence.

Where the arguer refuses to acknowledge a proposal that is supported by a preponderance of evidence, dismissing every piece of supporting evidence given as merely coincidence.

This fallacy is the opposite of Hasty Generalization.

Straw Man

A Red Herring.

An argument based on misrepresentation of an opponent's position. The arguer creates a "straw man" argument that appears to represent his opponent's position. However, it differs in a subtle but significant way: it has a weakness that the arguer then attacks.
