Wednesday, March 25, 2009

profanalyzer version 0.2.1 has been released!

Profanalyzer has one purpose: analyze a block of text for profanity. It can also filter the profane words it finds.

What sets it slightly apart from other filters is that it classifies each blocked word as "profane", "racist", or "sexual" - although right now, every word is simply considered "profane". It also rates each word on a scale from 0-5, based on my subjective opinion of its severity and on whether the word commonly appears in non-profane contexts, such as "ass" inside "assess".

Profanalyzer defaults to a tolerance of 2, which lets the arguably non-profane words pass. It also tests against all categories of words, including racist and sexual words.
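The gem's real wordlist and API are more involved, but the tolerance idea can be sketched in a few lines of plain Ruby. Everything here - the `WORDLIST` hash, the `profane?` method, and the ratings - is illustrative, not the gem's actual internals:

```ruby
# Illustrative sketch only -- not Profanalyzer's actual implementation.
# Each entry carries a category and a 0-5 severity rating; a word is
# flagged only when its rating meets or exceeds the tolerance.
WORDLIST = {
  "ass"  => { category: :profane, rating: 1 }, # mild: also appears inside "assess"
  "fuck" => { category: :profane, rating: 5 },
}

def profane?(text, tolerance: 2)
  # Whole-word matching, so "assess" does not trip the "ass" entry.
  text.scan(/\b[a-z]+\b/i).any? do |word|
    entry = WORDLIST[word.downcase]
    entry && entry[:rating] >= tolerance
  end
end
```

With the default tolerance of 2, `profane?("Please assess the damage")` is false, while a rating-5 word is flagged; raising the tolerance above a word's rating lets it through.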

Lastly, it allows for custom substitutions! For example, the filter at the website turns the word "fuck" into "fark", and "shit" into "shiat". You can specify these if you want.
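The substitution behavior described above can be sketched the same way. The `SUBSTITUTIONS` table and `filter` method here are hypothetical stand-ins for the gem's own API, using the fark/shiat examples from the post:

```ruby
# Illustrative sketch, assuming a simple substitution table.
SUBSTITUTIONS = { "fuck" => "fark", "shit" => "shiat" }

def filter(text)
  # Replace each listed word (whole words only, case-insensitively)
  # with its configured substitute.
  SUBSTITUTIONS.reduce(text) do |t, (word, replacement)|
    t.gsub(/\b#{Regexp.escape(word)}\b/i, replacement)
  end
end

filter("Holy shit!")  # => "Holy shiat!"
```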


0.2.1 / 2009-03-25

  • Fixed some wordlist errors.


  1. This comment has been removed by the author.

  2. This is great! I've been looking for something like this. Thanks for your hard work.

    How would I go about adding localized profanity? I am working on a non-English site. I noticed that you have a list.yml file where I could add my list. Did you think about adding another filter for language? e.g. de or es

  3. FYI -- the repo is missing the config/file -- possibly removed by GitHub for *ahem* content?

    You could consider ROT13'ing the string or just zip'ing the file.