SmallWiki needs to support robots.txt. This might require reorganizing the URLs for actions.
John Brant says that spammers like certain pages, such as "Wiki Sandbox". He says we should tell robots not to index such a page or follow links on it. See www.robotstxt.org/wc/meta-user.html. This requires being able to set certain attributes of a page.
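A minimal sketch of that per-page attribute, assuming a hypothetical Page object with a no_robots flag that controls whether the robots meta tag described at robotstxt.org gets emitted. SmallWiki itself is Smalltalk; Python is used here only to illustrate the idea, and none of these names are SmallWiki's API.

    # Hypothetical sketch: a per-page flag that controls the robots meta tag.
    class Page:
        def __init__(self, title, no_robots=False):
            self.title = title
            self.no_robots = no_robots   # set True for pages like "Wiki Sandbox"

        def head_html(self):
            # Per www.robotstxt.org/wc/meta-user.html, this tag asks robots
            # not to index the page and not to follow its links.
            if self.no_robots:
                return '<meta name="robots" content="noindex,nofollow">'
            return ''

    sandbox = Page("Wiki Sandbox", no_robots=True)
    print(sandbox.head_html())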
In addition to locking out IP addresses, there are other ways to handle spam. Regular expressions could be used to search edits for specific spam links. Also, external addresses could be redirected through Google, so that their PageRank scores don't get inflated by spamming the wiki.
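A sketch of both ideas, purely illustrative and not SmallWiki code: a pattern list checked against submitted text, and a rewrite of external links through a redirector so that spammed links pass no PageRank. The pattern list and the Google-style redirector URL form are assumptions for illustration.

    import re
    from urllib.parse import quote

    # Example spam patterns only; a real list would be maintained by the wiki admins.
    SPAM_PATTERNS = [
        re.compile(r'viagra|casino', re.IGNORECASE),
        re.compile(r'https?://[^\s]*\.spam-domain\.example'),
    ]

    def looks_like_spam(text):
        # Reject or flag an edit if any pattern matches.
        return any(p.search(text) for p in SPAM_PATTERNS)

    def rewrite_external_links(html, redirector='http://www.google.com/url?q='):
        # Wrap every external http(s) href in the redirector (URL form assumed).
        def wrap(match):
            return 'href="%s%s"' % (redirector, quote(match.group(1), safe=''))
        return re.sub(r'href="(https?://[^"]+)"', wrap, html)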
Could some sort of Bayesian filter work here? Wiki contributions from non-regulars could be given to reviewers for training. -Colin Curtin
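A minimal naive Bayes sketch of that idea, assuming reviewers label contributions from non-regulars as spam or ham and the resulting word counts drive classification. Everything here is hypothetical for illustration; SmallWiki has no such component.

    import math
    from collections import Counter

    class BayesianSpamFilter:
        def __init__(self):
            self.word_counts = {'spam': Counter(), 'ham': Counter()}
            self.doc_counts = {'spam': 0, 'ham': 0}

        def train(self, text, label):
            # Reviewers call this with label 'spam' or 'ham'.
            self.doc_counts[label] += 1
            self.word_counts[label].update(text.lower().split())

        def spam_probability(self, text):
            # Log-space naive Bayes with add-one smoothing.
            scores = {}
            total_docs = sum(self.doc_counts.values())
            vocab = len(set(self.word_counts['spam']) | set(self.word_counts['ham']))
            for label in ('spam', 'ham'):
                prior = (self.doc_counts[label] + 1) / (total_docs + 2)
                score = math.log(prior)
                total_words = sum(self.word_counts[label].values())
                for word in text.lower().split():
                    count = self.word_counts[label][word]
                    score += math.log((count + 1) / (total_words + vocab + 1))
                scores[label] = score
            # Convert log scores back to a probability that the text is spam.
            m = max(scores.values())
            exp = {k: math.exp(v - m) for k, v in scores.items()}
            return exp['spam'] / (exp['spam'] + exp['ham'])

    f = BayesianSpamFilter()
    f.train("buy cheap pills now", 'spam')
    f.train("fixed the page on refactoring support", 'ham')
    print(f.spam_probability("cheap pills here"))

Contributions scoring above some threshold could be held for review rather than rejected outright, which fits the reviewer-training loop suggested above.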