I have been running a Slash site for about 2 years. I get over 1,000 hits a day on it, but in total there are only about 500 comments and about 400 stories. Basically, a lot of the visitors come and download the various PDF files I have put up. What I ended up doing is changing the directories the files live in every other week to prevent people from hotlinking to them. I also have robots.txt and such set up to prevent search engines from spidering throughout the site.
What I want to do is figure out a way to have a Perl script or something serve up the file when people ask for it, but gated on their karma. Like, they need to be registered and have good karma to be able to download files.
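For what it's worth, a minimal sketch of that kind of gatekeeper as a plain CGI script might look like the following. Everything here is an assumption to be adapted: the `/var/www/pdfs` path, the database name, the `users`/`karma` table and column names (these are placeholders, not Slash's actual schema), the karma threshold, and the `uid` cookie for identifying the user.

```perl
#!/usr/bin/perl
# Hypothetical sketch: serve a PDF only if the requester has enough karma.
use strict;
use warnings;
use CGI;
use DBI;

my $q    = CGI->new;
my $file = $q->param('file');
my $uid  = $q->cookie('uid');    # however your site tracks the logged-in user

# Only serve files from one directory; refuse path tricks like "../"
my $docroot = '/var/www/pdfs';
if ( !defined $file || $file =~ m{[/\\]} ) {
    print $q->header( -status => '400 Bad Request' );
    exit;
}

# Look up the user's karma (placeholder schema, not Slash's real one)
my $dbh = DBI->connect( 'dbi:mysql:slash', 'dbuser', 'dbpass',
    { RaiseError => 1 } );
my ($karma) = $dbh->selectrow_array(
    'SELECT karma FROM users WHERE uid = ?', undef, $uid );

if ( !defined $karma || $karma < 10 ) {    # threshold is up to you
    print $q->header( -status => '403 Forbidden' );
    print "Sorry, you need to be registered with good karma to download.\n";
    exit;
}

# Stream the file ourselves so its real path never appears in a URL
open my $fh, '<', "$docroot/$file" or do {
    print $q->header( -status => '404 Not Found' );
    exit;
};
binmode $fh;
print $q->header( -type => 'application/pdf' );
print while <$fh>;
close $fh;
```

Since the script streams the bytes itself, the PDFs can sit outside the web root entirely, which would also remove the need to rotate directories every other week.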
Any ideas on how to go about this? Any Perl scripts I could look at?