Disclaimer: Note that in this essay I am not advocating, endorsing, or supporting assuming bad faith (I fully support assuming good faith all the time!); I am simply explaining (but not excusing) why assuming bad faith happens.
Most of us come to Wikipedia as good-faith, but naive, editors. Wikipedia was built on good faith. Over time, we realize the depth of wikipolitics and the ulterior motives of some editors, and we grow more cynical and assume less good faith. It's a sad story, but one that simply parallels real life: growing out of childhood and teenage idealism, and moving into the adult world of realpolitik.
How badly you'll be hurt by the wikiworld depends, just as in real life, on where you come from and where you live (edit). You'll be exposed to more radical views and bad faith if you live in the Israeli-Palestinian disputed areas than if you live on a peaceful farm in Canada. If you edit rarely visited, uncontroversial Wikipedia articles (about your local town, or about uncontroversial, obscure science), you'll have a more positive experience than if you deal with articles about abortion, global warming or the Holocaust.
The more one runs into highly POVed users ("true believers" are the worst), who tend to cluster in popular and/or controversial articles, the more likely one is to slowly radicalize against their POV (the "or" is important, as some controversial articles are very unpopular - little-known facts of Polish-Lithuanian history, usually kept alive by extreme nationalists of one side or another, for example). Even if you are the most kind-hearted peacemaker, after living for a few years in a conflict area you will come to despise the radicals on both sides, who cause you stress and who will target you, simply because in their mindset, if you are not with them, you are against them. And if you prefer one POV over another (which is completely legitimate and expected - NPOV does state: "all editors and all sources have a point of view"), you may slowly find yourself drifting more and more into extremism.
This includes:
- not editing/creating certain articles ("why help them?", "they can create it themselves");
- editing/creating certain articles ("how do you like this?");
- assuming more good faith about your side than about the other, leading to
- defending problematic editors on one's side (also referred to as "grooming pet trolls" - with the unspoken rationale "he may be disruptive, but we need him to combat the even more disruptive editors on the other side");
- supporting problematic editors in content disputes / discussions / eventually, even in harassment of others ("because they have the right POV");
- grouping "enemy" editors into "tag teams" ("users A and B share a similar POV and often work together"), and assuming that they have ulterior motives and are, at the very least, working against your side (WP:CABAL);
- and associating an entire group of editors with a given "tag team" ("users A and B belong to nationality M, so all users of nationality M are as disruptive as A and B").
Some of the above are acceptable, some are borderline, and others are outright bad. Sometimes one may be right (there "may" be a cabal out to get you - example); more often one is not (but one may be creating a self-fulfilling prophecy!).
Over time, this leads to more and more WP:BADFAITH on all sides. Good-faith editors will either leave the arena of conflict, finding it too impolite or stressful (thus feeding a vicious spiral that decreases the ratio of good to bad editors in a given topic), or will become radicalized themselves - they will lose good faith, become more and more cynical, bad-faith and radical, to an increasing extent supporting one side or at the very least advocating the use of the "ban hammer" with less and less thought ("let's pull the entire neighborhood down, it's impossible to save the ghetto"). With time, they'll find more and more examples to support bad faith (finding even one "true believer" a year may give one a decent sample of an "evil cabal" after a few years...). The end result? Certain topics become wiki-battlegrounds. And they spread, along with the radicalization (which is like a disease). See model of mass-radicalization of content areas for details.
See also: game theory analysis of wiki conflicts explaining why radicalization is a logical outcome of some conflicts
Solution: Forgive. Assume as much good faith as you can; be moderate, and even support restrictions/bans on disruptive editors (including "true believers") supporting your side; get mentorship, because you may not realize you are crossing the line yourself.