Oh man, how well I understand you.

Isn't that spam added automatically by some robots?

What if you simply activated authorization? Use jspwiki/guest to log in, and publish that on the main site. Or, more dynamically, use the current date as the password.

That way it is still public, but not (that easily) accessible by robots.

Another idea could be to deny posts with more than n (e.g. 3) external links.

--imario, 06-Oct-2004


How about inserting a "graphical authentication" into Edit.jsp, like PayPal has on its signup page? (There's a generated JPG with some letters and graphical garbage, and a form field for the user to enter the text from the JPG.)

Turning on authorization doesn't work: it is quite trivial to teach the robot to register and log in first. Other means, like trail, referrer, or cookie inspection, can be trivially circumvented as well.
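The round trip behind the "graphical authentication" idea can be sketched in a few lines. This is a minimal model of the server side only, assuming the actual rendering of the text into a distorted JPG is done by an imaging library; the function names and challenge length are made up for illustration:

```python
import secrets
import string

def make_challenge(length: int = 6) -> str:
    """Generate the random text that would be drawn into the image."""
    alphabet = string.ascii_uppercase + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

def check_answer(expected: str, submitted: str) -> bool:
    """Compare the user's form-field input against the stored challenge."""
    return submitted.strip().upper() == expected.upper()

challenge = make_challenge()        # server stores this in the session
print(check_answer(challenge, challenge.lower()))  # a correct answer passes
print(check_answer(challenge, "nope"))
```

The key point is that the expected text lives only server-side (e.g. in the session), so a robot that cannot read the image has nothing to copy.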

--kauppi, 06-Oct-2004


This "graphical authentication" is even better, for sure.

And fairly easy to implement too.

--imario, 06-Oct-2004


I don't know, I'd love to know more about Chinese 'cluture', whatever that is...

Seriously though, graphical authentication could work unless the spammers are VERY determined and will do their spamming by hand. In China even that might be possible, though.

-Dragon


Here on the small blogs, too, you can see Chinese traffic growing. So far only as referrer spam, but still. I'd bet the problems are going to get worse. With that many people, there are bound to be more troublemakers in the mix.

--kallu, 06-Oct-2004


As far as I can tell, this is just one guy who has the spam text open in Notepad and cuts'n'pastes it onto my wiki. So robot-stoppers do not really work (unless, of course, you do the auth in Finnish, in which case it'll take a bit of guesswork to figure out).

--JanneJalkanen, 06-Oct-2004


Comment control by admins is one option, but it's very slow and annoying on high-traffic sites.

I've got it set up so that comments on articles more than five days old are automatically queued, and I need to review them before accepting. Accepting or denying is simply a matter of setting all the comments to either allow or deny and then submitting.
Plus IP banning, of course.
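The queueing rule above comes down to a simple age check on the article. A minimal sketch, where the five-day cutoff and the function name are just illustrative:

```python
from datetime import datetime, timedelta

MODERATION_AGE = timedelta(days=5)  # illustrative cutoff from the comment above

def route_comment(article_date: datetime, now: datetime) -> str:
    """Queue comments on stale articles for admin review; publish the rest."""
    if now - article_date > MODERATION_AGE:
        return "queue"    # held until the admin sets allow/deny
    return "publish"

now = datetime(2004, 10, 6)
print(route_comment(datetime(2004, 9, 1), now))   # old article -> queue
print(route_comment(datetime(2004, 10, 5), now))  # fresh article -> publish
```

The rationale is that spammers target old, forgotten entries, while legitimate discussion happens on fresh ones, so the moderation burden stays small.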

--Shrike, 06-Oct-2004


Just a quick note: I have no problems with comment spam. I have problems with people coming to a Wiki site and DELETING everything on a page, and replacing it with a commercial - then THREATENING us that we should not remove it.

--JanneJalkanen, 06-Oct-2004


You contemplate banning one fifth of the planet's population? Because of one bad apple? You are not serious.

--J-Ko, 06-Oct-2004


Yup. It's a Finnish-only website, and frankly, doing a blanket ban on *.cn is the easiest way to do it for me. I have better things to do with my life than babysit some people who should know better.

And it's not one bad apple. This situation has been happening for the past few months, and it's slowly getting intolerable.

--JanneJalkanen, 06-Oct-2004


I've been watching some of the spamming on jspwiki.org for a while now; how about a page filter?

The filter could reject edits that:

  • cause a _duplicate_ outlink on the page.
  • increase the outlink count on the page beyond a limit.
  • increase the incidence count of the outlink across the whole wiki beyond a limit (maybe use Lucene and do a search on the outlink to see the number of hits?)
  • link to a site listed on a blocker page (security would need to be turned on so only authorized persons can maintain the blocker page).

This might be a compromise solution? It's the best I can think of offhand.
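A rough sketch of such a filter, assuming edits arrive as plain text. The link limit and the blocker list are made-up placeholders, and the wiki-wide incidence check (the Lucene idea) is left out:

```python
import re
from typing import Optional
from urllib.parse import urlsplit

MAX_OUTLINKS_PER_PAGE = 10            # assumed limit, not a JSPWiki default
BLOCKED_SITES = {"spam.example.com"}  # stand-in for the blocker page

def reject_edit(new_text: str) -> Optional[str]:
    """Return a rejection reason, or None if the edit passes the filter."""
    links = re.findall(r"https?://\S+", new_text)
    if len(links) != len(set(links)):
        return "duplicate outlink on the page"
    if len(links) > MAX_OUTLINKS_PER_PAGE:
        return "too many outlinks on the page"
    if any(urlsplit(link).hostname in BLOCKED_SITES for link in links):
        return "outlink listed on the blocker page"
    return None

print(reject_edit("buy at http://spam.example.com/x http://spam.example.com/x"))
print(reject_edit("plain text, no links"))
```

In JSPWiki terms this logic would live in a page filter that runs before the save is committed, so a rejected edit never reaches the page.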

--JohnVolkar, 07-Oct-2004


Just blackhole the /16 he/she is coming from. A couple of rounds of that and he'll have to switch providers. Or just check the provider's assigned number blocks and go for those.

Then, of course, he could start using proxies. That makes the whole thing a lot messier.

--Sty, 08-Oct-2004



"Main_comments_061004_1" last changed on 15-Jan-2007 15:56:21 EET by JanneJalkanen.