In many cases where the standard label of a fallacy is well known, I have included it in quotes in the definition. Some entries expand on things usually listed as a single fallacy, or combine several into one, depending on how common they are, and a few are rarely listed even in the best lists.
Fallacy of Fallacies – The assumption that because a particular argument used to reach a conclusion is unreliable, the conclusion is definitely false; the conclusion might be true for other reasons. This is an important caution to keep in mind when considering all other logical fallacies.
Taste Confusion – Mixes up subjective issues, such as differing personal preferences for which foods taste better, with objective issues, such as which foods people should or should not eat. Each person's unique (and often changing) personality, experiences, genetics, and choices give them their own “like or dislike” reactions to things, which can be validly different from those of others who are “wired differently.” Users of this fallacy essentially look at their own subjective tastes, assume by the Hasty Generalization Fallacy that everybody is wired the same as they are (or assume selfishly that only their own tastes are right), and then assume that only their tastes should be appealed to by others. This often leads to harsh and even hateful criticisms, often unwittingly insulting others. It is also an important caution when considering logical fallacies: often an argument will be invalid when it deals with objective issues, and yet be valid when it deals only with subjective preferences.
Voodoo Labels – Slapping a label onto an opponent's argument, especially a famous label (this includes the “definist's fallacy,” which mistakes an argument for a similar one), and believing that the act of labeling defeats the argument, as if the label were a voodoo doll. Sometimes the similar argument has been soundly defeated elsewhere; sometimes it has been argued against, but wrongly so, and the labeler does not realize this. In either case the labeler usually does not have a good understanding of why the argument is supposedly defeated.
Insults – “Ad Hominem”; insulting the debate opponent instead of showing any logical errors in their argument. Very common; typically the use of insults is a sign that users know they have been proven wrong and have nowhere else to turn, yet stubbornly hold to their position anyway. It is often a sign of ego problems that make it hard for them to admit when they are wrong. Caution must be used when judging this; sometimes criticisms that conclude with negative labels are not fallacious and are meant not as insults but as alerts to a mistake. The identification of fallacies itself, for example, is sometimes mistaken for insult.
Threats – Similar to insults, except that it implies or states that if a person does not agree with an idea they will be harmed, and therefore the idea is true. Pointing out the fallaciousness of this argument can itself be dangerous.
Popular Idea – “Ad Populum” – the appeal to the majority opinion as proof that the argument is true. This fallacy is unfortunately popular itself among any public that has not been well educated in logic. For example, in science it is sound logic that leads to a conclusion being judged true, false, or uncertain, yet many people throughout history have become dogmatically defensive of whatever happens to be the majority opinion in science at the time (a flat Earth, geocentrism, evolution, etc.) and scornful of any idea that disagrees with the majority (a round Earth, heliocentrism, an expanding universe, tectonics, creation, etc.). Caution must also be applied in judging this, however: in subjective matters of personal taste across groups, popular taste can validly be appealed to over less popular tastes, especially when a company is aiming for high sales, for example. This in turn requires caution, as objective moral issues are sometimes mistaken for subjective taste issues (such as violence).
Black Swan Blindness – Assumes that something statistically unlikely is impossible, or incapable of having significant impact, when in fact a thing or event can have low probability but high impact. It also often takes the form of comparing how often real contrasting possibilities occur, identifying one as occurring far more often, and then concluding that the minority event therefore does not occur at all. One of several fallacies unfortunately perpetuated by the famous David Hume, for example.
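The distinction between frequency and impact can be made concrete with a quick expected-value calculation. The numbers below are hypothetical, chosen only to illustrate the point:

```python
# Hypothetical illustration: a rare event can still dominate in expected impact.
rare_p, rare_impact = 0.001, 1_000_000       # unlikely, but catastrophic when it happens
common_p, common_impact = 0.5, 10            # happens often, but matters little each time

rare_expected = rare_p * rare_impact         # expected impact of the rare event
common_expected = common_p * common_impact   # expected impact of the common event

print(rare_expected, common_expected)        # 1000.0 5.0
```

Even though the rare event is 500 times less likely, its expected impact here is 200 times larger, so "it almost never happens" does not by itself justify ignoring it.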
Backwards If – “Affirming the consequent”; essentially assumes that there is only one way to get a result, when in fact multiple methods can sometimes produce that result (or sometimes the method being argued for cannot actually reach that result when studied more carefully). The argument runs like this: if my idea happened, then you would see this result; you see this result, therefore my idea did happen. The argument can be made valid only if the conclusion is softened to “therefore my idea might have happened.” Thus the backwards if can be useful in many investigative inquiries as a clue toward probabilities, but it cannot prove that one “if” was in fact the cause of the effects observed. For example, in crime dramas evidence often seems to point to a particular perpetrator, but later more evidence is found showing that in fact someone else was at fault.
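The invalidity of this form can be checked mechanically by enumerating truth values. This sketch (plain Python, no special libraries) searches for an assignment where both premises hold but the conclusion fails:

```python
from itertools import product

def implies(p, q):
    # Material conditional: "if p then q" is false only when p is true and q is false.
    return (not p) or q

# Affirming the consequent: premises "if P then Q" and "Q", conclusion "P".
# A counterexample is any assignment where both premises hold yet P is false.
counterexamples = [
    (p, q)
    for p, q in product([True, False], repeat=2)
    if implies(p, q) and q and not p
]
print(counterexamples)  # [(False, True)]: the result occurred without the proposed cause
```

The single counterexample (P false, Q true) is exactly the "some other cause produced the result" case described above.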
False If – “Denying the Antecedent”; similar to the Backwards If, this runs forward through an if-then clause, shows that the “if” is false, and then assumes that the “then” must also be false. Like the Backwards If, it assumes that there is only one route to the result, when in fact multiple routes are possible; the cause that is shown not to be true is not the only possible cause of that result, so the result might be true anyway. As with the Backwards If, this can be a useful clue in estimating probability: ruling out one cause does reduce the number of causes that may have occurred, making it somewhat more likely that the effect did not occur. But it cannot by itself prove that the result is false.
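The same truth-table enumeration exposes this form as well; here the premises are the conditional plus the falsity of the antecedent, and the (invalid) conclusion is the falsity of the consequent:

```python
from itertools import product

def implies(p, q):
    # Material conditional: "if p then q" is false only when p is true and q is false.
    return (not p) or q

# Denying the antecedent: premises "if P then Q" and "not P", conclusion "not Q".
# A counterexample is any assignment where the premises hold yet Q is still true.
counterexamples = [
    (p, q)
    for p, q in product([True, False], repeat=2)
    if implies(p, q) and not p and q
]
print(counterexamples)  # [(False, True)]: the result can still occur by some other route
```

Again the counterexample (P false, Q true) shows the effect occurring even though this particular cause has been ruled out.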
Wrong Or – “Affirming a disjunct.” Confuses the two versions of “or” in the English language. Specifically, a “this or that” statement may be inclusive, meaning “this and/or that,” but be mistaken for an exclusive one, meaning “either this or that, but not both.” The fallacy then identifies one of the ideas in the phrase as true and wrongly assumes this proves that the other idea must be false. The whole fallacy takes the form “this or that; this is true, so that is false.” Care must be taken in identifying this, because sometimes two ideas really are mutually exclusive.
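With an inclusive "or" the counterexample is immediate, as a short enumeration shows:

```python
from itertools import product

# Affirming a disjunct: premises "P or Q" (inclusive) and "P", conclusion "not Q".
# A counterexample is any assignment where the premises hold yet Q is also true.
counterexamples = [
    (p, q)
    for p, q in product([True, False], repeat=2)
    if (p or q) and p and q
]
print(counterexamples)  # [(True, True)]: with an inclusive "or", both may be true
```

The argument form is valid only if the disjunction is genuinely exclusive, which is exactly the caution the entry above ends with.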
Incomplete Or – “False Dilemma” and various other names. Assumes that only two alternatives are possible, when there are really other possibilities. Typically one of the two is then discredited, in an attempt to prove the other. For example, evolutionists typically lump all religion into one group of manmade religion, and contrast this with evolution. Since it is easy to show most religions false, this makes it appear that evolution has been proven, when the third option of biblical creation has not been considered. Caution must be used here, however, because different levels of analysis are possible; for example, if at the first level we consider all possible origins views, and conclude (as I have) that only evolution or biblical creation can be true, then a second level of analysis that tests just these two is valid and is not a false dilemma. Once evolution is shown to be false, this does validly help show that creation is (probably) true.
Careless Negative – Various formal and informal fallacies run off of this basic idea: that you can prove a negative before having done careful research, and/or that a claim is false in a sense in which you cannot actually test it. The most common example is arguing that something does not exist, or is not in a place, when that place (or all of existence) has not been searched. Even a thorough search can sometimes yield a false negative, such as when a search party looking for a lost child in the woods misses the child because the child has moved to an area already searched. In other cases a statement can be proven false, but only in certain senses; we can prove, for example, that you do not see God, but since God is claimed to be invisible (like gravity), we cannot thereby prove that he does not exist, merely that an always-visible version of God does not exist. (Furthermore, it can be argued that since sight involves particles bouncing off of an object, and so is indirect, and since God is said to have created everything, all vision may in a sense be seeing God, as everything in existence is like light particles bouncing off of an object.)
Careless Positive – Both this and the above are forms of “Argument from Ignorance”; this one assumes that because something has not been, or cannot yet be, proven false, it must be true. For example, some creationists think that the fact that science cannot disprove God proves that God exists. Rather, that fact alone merely proves that God MIGHT exist, as far as the scientists know.
Truth By Repetition – “Argument from Repetition” – assumes that because an idea has been repeated over and over, it must be true. Similar to Popular Idea; the two are often used together. Like that fallacy, however, it can be valid when restricted to subjective issues: we can 'reprogram' ourselves, for good or ill, by telling ourselves enough times that we like or dislike something until our tastes actually change, since tastes depend in part on our decisions and learned behaviors.
Truth By Silence – Assumes that because no convincing opposing argument (or none at all) has been presented against a claim, that claim must be true. In general the lesson to be learned from this fallacy is to remain aware that, no matter how sure you are of something, there is almost always a possibility, however slight it may seem to you, that you are wrong.
Circular Reasoning – An unfortunately very common fallacy that assumes the conclusion is true in order to argue for the conclusion. This line of thinking is useful in contradiction-analysis, in which you examine two or more competing views to determine whether each view is self-consistent; that is, you hypothetically take the conclusions to be true and see whether they fit with the other ideas in that view. However, this analysis is very difficult to pull off correctly, as it requires a fully thorough analysis of all details involved and, most importantly, requires this to be done fairly for ALL relevant views (it is easy to think you see contradictions because you have not looked closely enough or have not thought it through). Circular reasoning is a fallacy mainly when it is used merely to try to prove one view, without considering other views. Usually users of this fallacy think they are showing how a piece of evidence proves their conclusions, when in fact they are only showing how that evidence could fit with their view; it could also fit with another view, leaving it uncertain which view is true and which is false. In general, the only kind of argument that PROVES a conclusion is one that starts from premises that are undeniably true and are NOT the conclusion.
Hasty Generalization – Assumes that because something is true of one member of a group, it must be true of the entire group, when in fact grouping can be done of things (such as people) that each have multiple traits and can have multiple combinations of different traits. This includes stereotyping, which essentially assumes that all people who share one trait also share all their other traits, when in fact two people could share one trait but be different in other ways. It also includes more abstract generalizations and assumptions about objects.
Confused Definitions – “Equivocation”; one of the most common fallacies, as it is also one of the easiest to commit by accident; many words in many languages are spelled the same way but have multiple possible meanings. In most debates there are multiple words which both debaters are using but which they each define in different ways, so that often they think they are disagreeing when they are really agreeing, or agreeing when they are really disagreeing. This causes all manner of problems, especially needless antagonism and taking offense where none was intended. Sometimes it is used intentionally as well, knowingly redefining words so that less careful opponents or observers will think you gained the upper hand, but the advantage is emotional, not logical. For example, if one religion decided that it was okay to deceive, proponents could redefine the word “religion” so that that religion is excluded, and then work hard to enforce this as everybody's definition; a strong case can be made that this has been done with evolution so as to allow it to unfairly bypass legal restrictions on government-sponsored religion.
Part Confusion – Contains two fairly common fallacies: one (“composition”) assumes that something true of a part of a thing is true of the whole, and the other (“division”) assumes that something true of the whole is true of all or some of its parts. Essentially it fails to understand that various parts with different physics are put together the way they are precisely so that they will together produce a different overall physics (and even this is not always true; sometimes what is true of a part IS true of the whole). Accurate understanding of the exact nature and physics of parts and wholes is of course the key to unraveling this one.
Loaded Question – A famous fallacy often illustrated by the example of “have you stopped beating your wife yet?” – the wording is designed as an emotional trap. If the person answers “Yes,” he appears to admit that he has beaten his wife in the past. The clearest direct answer is of course, “No, I have not stopped beating my wife, because I have never beaten my wife,” but the questioner will usually attempt to interrupt such an answer and demand a simple yes or no. Caution again is needed; sometimes yes or no questions are valid, and demanding such an answer is not always wrong. It depends on the wording of the question.
Only Factor – Also “single cause” – the assumption, prior to research, that an effect had only one cause, or only one significant cause, when in fact multiple factors often combine to lead to an effect. This fallacy can cause a wide variety of problems, such as repairing just one problem and thinking the repair is thorough, or worse, thinking you only need to do one thing to accomplish something you need, and wasting time not doing the others until it is too late.
False Cause – Assumes that because one thing happened after another, the first thing caused the other. As with many fallacies this can help determine probabilities; since the first thing did happen first the possibility is opened that it MIGHT have been the cause of the second thing, but more careful analysis is needed before it can be proven so.
Overcitation – Assumes that someone who paraphrases and alludes to sources of ideas must be wrong, while only people who carefully cite their sources are right. But the source of an idea could itself be wrong, so citation has virtually nothing to do with truth or falsehood; it is required in situations such as school papers merely to avoid plagiarism and similar problems. Spending too much time hunting down citations can also detract from the time spent on analysis, though checking sources to ensure correct memory is often important as well. In general, in logical analysis it is best to rely on ideas that can stand on their own truth or falsehood, with no relation to who first thought of them, unless the source itself is being analyzed.
Middle Ground – Assumes that the compromise between two positions is true. Again, caution related to subjectivity must be used, as compromise is not always wrong, but in most issues it is. Unfortunately this idea is pervasive in emotionally charged issues such as politics; many people believe, for example, that the right course of action for a political body is whatever angers the fewest people on opposing sides, when in fact all public servants should be using careful logical analysis to find what is objectively best for their people, and any who are shown to be wrong should graciously admit so and change their minds. This confusion seems to arise because votes are required in such situations, but the requirement of majority votes to pass laws does not logically mean that those casting their votes are justified in disregarding logic for their own selfish gain.
Self-contradiction – As alluded to above, this is a fallacy that essentially runs on psychological compartmentalization and a person's inability to notice their own contradictions. Each part of the person's view is judged out of the context of other issues. Then each part is given an argument which, when that part alone is considered, appears convincing. In this step a person will often concoct or adopt standards, treat them as if they were universal standards, and act offended at the idea that these standards might NOT be applied. Yet in order to support another part of their view, they will concoct or adopt entirely different standards that contradict the first set, as long as these also appear convincing when only this other part is considered. Often users of this fallacy appear not to notice blatant contradictions, sometimes given one right after the other so that observers can clearly see the inconsistency. Other times the inconsistency is harder to spot and may never be caught, or is realized only after it is too late to point it out. One key to avoiding this is to always ask whether the standards applied are always applied, or if they have exceptions, why so. Another is to always be sure you have logical reasons to adopt a standard, and never to adopt it merely because it would lead to the conclusion you want to believe. Caution also must be used here, as some standards do logically have legitimate exceptions, and thus observers can sometimes think they see a contradiction only because the arguer has not gone into enough detail on their observations. It is also possible for two ideas not to logically contradict, but for observers to be illogical and try to see contradictions so that they will not “lose” the debate.
Misplaced Goalposts – This is very similar to the famous “moving the goalposts” fallacy, which I'll discuss next, but that fallacy considered alone can be very confusing and is easy to misidentify. The main problem with 'goalposts' – standards presented, often as a challenge to an opponent, with the implication that if the opponent meets the standards the challenger will concede the point – is that it is easy to place them where they should not logically go, as if someone ripped out the goalposts on a football field and placed them in the middle of the field. The challenge often misses the point and has little or nothing to do with whether the real issue at hand is true or not. The key to avoiding this fallacy, and its cousin below, is to always ask, before issuing such a challenge, “is it possible that the goal could be reached and the idea I disagree with still be false anyway?” Often the answer is yes.
Moving the Goalposts – This is usually called a fallacy, but often what is really going on is that a person has first made the above fallacy, and then realized belatedly (often to their embarrassment) that the meeting of that challenge does not prove the challenger wrong. In that case “moving” the goalpost back to where it really should be, or closer, is not actually fallacious. As long as you avoid illogical challenges this fallacy should never come up if you are honestly truthseeking. However, it is sometimes used as a fallacy when a legitimate test of soundness has been reached (in analogy, the football goalpost is where it's supposed to be), but you have the typical prideful fear of admitting when you are wrong, and thus pretend that those goalposts were actually misplaced. Again, the key is to ask the same question as with the above fallacy – if the meeting of the challenge logically proves the claim true, then moving the goalposts would be a fallacy.
Infallible Self – An unfortunately very common fallacy in which you fail to fairly consider that you might be wrong, often while demanding this of others. A good test is to notice whether an arguer uses emotionally loaded, especially condemning, terminology for others the arguer believes are wrong, yet does not readily admit when they themselves are wrong, or, when shown wrong, appears emotionally untroubled, easily forgiving in themselves mistakes (whether perceived or real) that they harshly condemn in others, or, worse, refusing to ever admit they are wrong. Again, caution must be used here, as anybody in a debate might be right or wrong, whether or not they exhibit this fallacy. It is a fallacy primarily when bias is used to strongly imply or state that the biased person must be right.
Distractions – “Red Herrings” or sleight of hand; changing the subject. Various forms of this exist, but all have one thing in common: a person engaged in a particular debate realizes or feels they are losing it, and, as is common, does not want to admit they are wrong, so they use whatever tactic they can think of to distract from what was being discussed. One very common distraction is to take offense at something someone said, without logically discussing why it is wrong; this often redirects replies toward explaining that no offense was intended, and the original issue at hand is simply forgotten. The best way to catch this is to always remind yourself, before you read a response, what questions were being discussed, and ask, “does their reply address the question at hand?” Again caution must be used, as debates often bring up countless side and supporting points which, to be treated fairly, would require exponentially expanding detail to analyze properly, and time constraints also make it difficult to answer all questions. This is especially true of origins debates, as they literally involve everything in existence, so the only limit on how widely the discussion can logically expand is existence itself (and even that is often what is being debated; whether God exists, etc.). They thus tend to be ever-hungry, time-eating monsters, but origins can be the most important question of all, as it usually relates directly to things such as eternal life, love vs. hate, morals, etc.
False Personification – Also called “reification”; a difficult fallacy to unravel which often shows up in debates on wide topics, including origins. It is often tied in with Incomplete Or / False Dilemma, Wrong Or, and Equivocation. Essentially it treats inanimate objects or even abstracts as if they were people or groups. The most common example in modern times has been to label the popular beliefs of evolutionary scientists as “Science,” to label the lumped-in variety of other worldviews as “Religion,” and then to equivocate by implying that the Bible is anti-science, when in fact it was biblical belief that inspired the founding of science, and unpopular views in science are still part of science. Essentially, in this example science is treated as if it were a person holding just one view, rather than an abstract system of study that contains multiple competing views along with logical analysis, experimentation, etc.
Mix and Match – This umbrella term can apply in many ways, but it primarily names a fallacy that has been called “your theory does not work under my theory, therefore your theory is wrong.” This is very common in origins debates, in which evolutionists pretend to be judging whether the biblical position makes sense (is internally consistent), but then act as if parts of evolution's competing belief are part of the biblical position even though they are not, and then show that the biblical position contradicts the imported evolutionary belief. This is especially unreliable because, as my chosen label implies, evolutionists will mix and match parts from either belief in whichever way they already know will produce a contradiction; they do not even use objective standards as to when an idea should be imported and when instead the biblical idea is to be judged. It is also not easy to catch this, as doing so requires a thorough knowledge of what the actual biblical ideas and evolutionary ideas are. For example, evolutionists have kept many people successfully ignorant of the fact that natural selection is biblical, and so they act as if creationists do not believe in change over time, and thus the showing of examples of proven change over time supposedly disproves creation, when in fact the Bible describes many changes over time, and selection is downward in terms of genetic information, not upward as evolution requires. This particular example shows up in almost every origins debate, no matter how many times it is seemingly debunked.
Effect to Cause – A seemingly obvious fallacy that is nonetheless difficult to spot sometimes, in which an effect is mistaken for a cause, and vice versa. For example,