Is there a word or phrase for one mistaken belief leading to a web of false ones?

If you filter information so that nothing threatens your view of the world, you may call it 'confirmation bias'. We all do it, actually.

https://en.wikipedia.org/wiki/Confirmation_bias

However, I think it would be more useful to explore the following defence mechanisms: rationalisation, wishful thinking, and denial. E.g., see here:

https://www.psychologistworld.com/freud/defence-mechanisms-list

EDIT: There's also the concept of 'sunk costs' that you might want to work with.

See here: https://en.wikipedia.org/wiki/Sunk_cost


'Building a house upon sand' may be the expression you need - a biblical metaphor, from one of Christ's parables, which has stood the test of time.


One word to describe it, from Lexico, is delusion:

delusion
NOUN

1 An idiosyncratic belief or impression maintained despite being contradicted by reality or rational argument, typically as a symptom of mental disorder.
'Is this for real, or just a delusion on my part?'

1.1 MASS NOUN
The action of deluding or the state of being deluded.
'The rest of us play along, but no one is fooled by this necessary delusion.'


Simply "false premise".

Similar to @ZachP's Latin phrase a falsis principiis proficisci.

Update (for all the down-voters who didn't care to comment):

A "false premise" is an incorrect proposition that forms the basis of an argument. Since the premise is not correct, the conclusion drawn may be in error. However, the logical validity of an argument is a function of its internal consistency, not the truth value of its premises.

False premises are the underpinning of cognitive biases, although not all false premises are used to construct a cognitive bias. As an example in the OP's context: "The group's choice to continually cling to this false premise led to the acceptance of an entire web of falsehoods."


https://en.wikipedia.org/wiki/Belief_perseverance

Belief perseverance (also known as conceptual conservatism[1]) is maintaining a belief despite new information that firmly contradicts it.[2] Such beliefs may even be strengthened when others attempt to present evidence debunking them, a phenomenon known as the backfire effect (compare boomerang effect).[3] For example, journalist Cari Romm, in a 2014 article in The Atlantic, describes a study in which a group of people, concerned about the side effects of flu shots, became less willing to receive them after being told that the vaccination was entirely safe.[4]

Since rationality involves conceptual flexibility,[5][6] belief perseverance is consistent with the view that human beings act at times in an irrational manner. Philosopher F.C.S. Schiller holds that belief perseverance "deserves to rank among the fundamental 'laws' of nature".[7]


Edit to Add:

https://www.scienceofrunning.com/2015/12/why-you-should-change-your-mind-power.html?v=47e5dceea252

In his new book Black Box Thinking, Matthew Syed outlines the extremes people go to in order to avoid adjusting their model. From doctors to lawyers to academics, we're all guilty of cognitive dissonance:

“When we are confronted with evidence that challenges our deeply held beliefs, we are more likely to reframe the evidence than we are to alter our beliefs. We simply invent new reasons, new justifications, new explanations. Sometimes we ignore the evidence altogether.”