During the anniversary Federated Wiki discussion (2021-01-20), an interesting topic was brought up: how do we make wikis sustainable?
Here, a sustainable wiki means a wiki that can be left unattended for several months and still be fine when you return. Unlike classic open wikis, which tend to get vandalized, fedwikis are quite safe in that regard.
The idea is simple. You can't edit other people's pages. You can only fork them: copy a page to your own fedwiki and make your changes there. The software tracks that connection and displays pages in such a way that you can navigate to all versions of a page. Thus, no vandal can ruin your page.
Another effect of this approach is freedom of speech. On classic wikis you may have disagreements that never get resolved; edit wars happen. But on a fedwiki, no one can censor what you say.
The downside is the difficulty for readers. You can easily get lost in the wiki wiki web of conflicting pages, lost both in time and in space. That never happens on classic wikis: you know where you are (while fedwiki flags are hard to remember and distinguish), and you know when you are (on the most recent version, unless you're viewing the history).
Yet classic wikis can still be vandalized. Or can they? I think this problem has already been solved. There are captchas, human-oriented questions, things like that. There is also authorization: you can design the registration process so that no robot passes. For example, on Mycorrhiza all users are added manually (for the time being). Quite safe.
Or is it? Imagine that a wiki admin leaves the wiki for two months. Meanwhile, a malicious hater with a grudge against the admin destroys everything. What then? Of course, such a user should be banned. But what about the damage already done? There's nothing meaningful left.
But hey, isn't that what history is for? Just roll back to a good version! Of course, the wiki software should be sophisticated enough for that. As for MycorrhizaWiki, it has no buttons for that yet, but since it runs on git, you can still do it.
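For example, assuming the wiki's content is an ordinary git repository (the page name and layout below are made up for the demo), a vandalized page can be rolled back from the command line. The sketch builds a toy repository first so it is self-contained:

```shell
set -e
# Toy repository standing in for a wiki's git backend (hypothetical layout).
dir=$(mktemp -d)
cd "$dir"
git init -q -b main
git config user.email demo@example.com
git config user.name demo

# A page gets written, then vandalized.
echo 'Useful content' > MyPage.myco
git add MyPage.myco
git commit -qm 'Write MyPage'
good=$(git rev-parse HEAD)

echo 'VANDALIZED' > MyPage.myco
git commit -qam 'Vandalism'

# Restore the page from the known-good commit and record the
# restoration as a new commit, so the vandalism stays in history too.
git checkout "$good" -- MyPage.myco
git commit -qam 'Roll back MyPage after vandalism'

cat MyPage.myco
```

A wiki engine could wrap exactly this in a "revert" button; the history is already there, git just needs a friendlier face.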
But how is that sustainable? It is, if it is automated. Let's create an algorithm that analyzes edits and decides whether a user is malicious. It shouldn't be too difficult to implement, I think. I will look into that.
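A first approximation of such an algorithm could score each edit by how destructive it looks. This is only a sketch with made-up thresholds, not anything Mycorrhiza actually ships:

```python
import re

def looks_malicious(old_text: str, new_text: str) -> bool:
    """Heuristic check: does this edit look like vandalism?

    The thresholds are invented for illustration; a real system
    would tune them on the wiki's actual edit history.
    """
    old_len = max(len(old_text), 1)
    # Red flag 1: the edit deletes almost the whole page.
    deleted_fraction = max(0.0, (len(old_text) - len(new_text)) / old_len)
    if deleted_fraction > 0.9:
        return True
    # Red flag 2: the edit floods the page with external links (spam).
    if len(re.findall(r"https?://", new_text)) > 20:
        return True
    # Red flag 3: long runs of one repeated character ("AAAAAA...").
    if re.search(r"(.)\1{50,}", new_text):
        return True
    return False
```

Instead of rejecting such edits outright, the engine could hold them for review, so false positives cost nothing but a short delay.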
By the way, git. Why not use it to the fullest? Anyone can fork a git repository and then run their own wiki. Also, git repositories can be merged. Git can be relied on.
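A hedged sketch of what that forking and merging could look like with plain git (the repository and page names are made up):

```shell
set -e
base=$(mktemp -d)

# "upstream": the original wiki, as a plain git repository.
git init -q -b main "$base/upstream"
cd "$base/upstream"
git config user.email a@example.com; git config user.name a
echo 'Original page' > Home.myco
git add Home.myco && git commit -qm 'Start the wiki'

# "fork": someone clones the wiki and runs their own copy of it.
git clone -q "$base/upstream" "$base/fork"
cd "$base/fork"
git config user.email b@example.com; git config user.name b
echo 'A new page written in the fork' > Ideas.myco
git add Ideas.myco && git commit -qm 'Add Ideas'

# Later, the original wiki can merge the fork's work back in.
cd "$base/upstream"
git remote add fork "$base/fork"
git fetch -q fork
git merge -q --no-edit fork/main
ls
```

Nothing here is wiki-specific: the wiki engine only needs to store pages as files in a repository, and all of git's collaboration machinery comes for free.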
See idea/import for other ways of importing content.
So, what do you think? In summary, my opinion is that:
- Classic centralized wikis can be sustainable.
- We should let the machines help us make the wikis sustainable.
- Classic wikis can be forked and copied too. Also, consider idea/fediverse. Mycorrhiza will take integration to the fullest one day, while still being a classic wiki.