- 1 reCAPTCHA
- 2 MediaWiki del.icio.us
- 3 How do I show PHP errors?
- 4 robots.txt
- 5 Shared Hosting
- 6 Gallery 2 Upload
- 7 Gallery 2 RSS
- 8 feed2js
- 9 Wikimedia Sitemap
- 10 Fixed MediaWiki Error
- 11 Under constant attack
- 12 Problems with mod_security
- 13 Regex testers
- 14 Increase upload max from 2M default
- 15 Show php source code
- 16 Displaying XML
Adding del.icio.us to MediaWiki.
How do I show PHP errors?
Add the following to the .htaccess file:
php_flag display_errors on
php_value error_reporting 7
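For reference, 7 is the bitmask E_ERROR | E_WARNING | E_PARSE (1 + 2 + 4). To surface everything while debugging, you could use the numeric value of E_ALL instead — a sketch (2047 is E_ALL on PHP 4.x; the exact value differs across PHP versions, so check your version's constants):

```apache
php_flag display_errors on
php_value error_reporting 2047
```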
Gallery 2 Upload
Putting this in my .htaccess file seems to have fixed my upload woes.
July 2007: Added the memory_limit option.
php_value post_max_size 20971520
php_value upload_max_filesize 20971520
php_value max_execution_time 600
php_value memory_limit 512M
I had to increase the memory limit to 512M in order to resize [this] picture. I wish LinkSky allowed exec(); that way PHP wouldn't reserve so much process memory. For most pictures, 64M is plenty of memory. If I get some time, I'll work on the Gallery Remote tool so it can do client-side image resizing for people who can't get their hosted site to resize the photos. It would also be nice for the Java upload tool to be able to upload the file in multiple smaller chunks, so people don't have to change their .htaccess file.
Gallery 2 RSS
After moving from Gallery 1 to Gallery 2, I started getting 404s for the missing rss.php. I put this page in its place:
<?php
header("HTTP/1.1 301 Moved Permanently");
header("Location: http://www.theeggeadventure.com/gallery/main.php?g2_view=rss.SimpleRender&g2_itemId=7");
exit();
?>
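An alternative sketch that avoids the PHP stub entirely, assuming mod_alias is available and the host allows it in .htaccess — a Redirect line in the gallery directory:

```apache
Redirect permanent /gallery/rss.php http://www.theeggeadventure.com/gallery/main.php?g2_view=rss.SimpleRender&g2_itemId=7
```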
feed2js
I use feed2js to read feeds and display them on this site's homepage. Today I found out that some spammer has figured out how to use this code to generate spam links. For example: http://www.google.com/search?hl=en&lr=&safe=off&q=baldness+theeggeadventure&btnG=Search. I've disallowed feed2js in my robots.txt file, so hopefully Google will not index it. If this doesn't work, I'll simply move feed2js so it can't be accessed directly.
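The robots.txt rule is a one-liner — a sketch, assuming feed2js lives in a /feed2js/ directory at the web root (adjust the path to match your install):

```
User-agent: *
Disallow: /feed2js/
```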
Wikimedia Sitemap
I found a sitemap generator for MediaWiki: http://www.thinklemon.com/wiki/MediaWiki:Google_Sitemaps . I installed it on my site, so hopefully Google will better index this wiki.
Fixed MediaWiki Error
I found this tip at http://www.kdevelop.org/mediawiki/index.php/KDevelop:Searching while trying to fix this MediaWiki error:
(SQL query hidden) (SQL query hidden) from within function "SearchUpdate::doUpdate". MySQL returned error "1016: Can't open file: 'mw_searchindex.MYD'. (errno: 145) (localhost)".
Sometimes the search code messes up the database.
DROP TABLE IF EXISTS `mw_searchindex`;
CREATE TABLE `mw_searchindex` (
  `si_page` int(8) unsigned NOT NULL default '0',
  `si_title` varchar(255) NOT NULL default '',
  `si_text` mediumtext NOT NULL,
  UNIQUE KEY `si_page` (`si_page`),
  FULLTEXT KEY `si_title` (`si_title`),
  FULLTEXT KEY `si_text` (`si_text`)
) TYPE=MyISAM PACK_KEYS=1;
Executing the above sql in phpMyAdmin seems to fix things.
Under constant attack
When I first set up this website, I thought it would be nice to add a guest book feature. Fantastico came with a PHP-based guest book which was easy to install. The guest book worked great for about the first year, and then I started getting spam. The first thing I did was to fix some of the security holes, preventing things like SQL injection attacks. This solved most of the problems for a while. Then I started seeing more spam, so I added a CAPTCHA check. Again, the spam was reduced by 95%. However, I found that some people were actually entering the CAPTCHA code. The interesting thing is that since I added the CAPTCHA to my site by hand, a normal script would not know it was there. This means that people are manually entering the CAPTCHA code in order to write spam. I suppose more of this will occur once the world gets its $100 laptops.

So today, I've gone on the offensive. I've renamed the addentry.php page to something new and replaced it with a dummy page. For now, everyone's script is going to the dummy page, which I've set to be 2 MB in size. I don't know how effective this will be in slowing down the spammers, but I figure every bit helps. Many scripts seem to give up after downloading about 1 MB; however, even that is quite a bit. As far as the ineffective scripts go, it seems I'm under constant attack from 195.225.176.x - 195.225.177.x. Apparently the IPs belong to some outfit in Ukraine.

In summary, here are my tips for avoiding attack:
- Add CAPTCHA or email verification checks.
- Make sure you receive an email, or a notification when a user posts their own content to your website.
- Rename pages from their default.
- Monitor your server logs to see what pages are frequently under attack, and then see if the referer makes sense.
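For a range like the one above, you could also block it outright in .htaccess — a sketch using Apache 2.2-style mod_access directives (195.225.176.0/23 covers 195.225.176.0 through 195.225.177.255; availability depends on the host's AllowOverride settings):

```apache
Order Allow,Deny
Allow from all
Deny from 195.225.176.0/23
```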
Problems with mod_security
Recently, I've been plagued with problems trying to add code examples to my wiki. It seems that if I include anything that remotely looks like a command, Apache's mod_security returns the error:
406 Not Acceptable: An appropriate representation of the requested resource wikimedia/index.php could not be found on this server
I found the solution on a message board, which is basically to add the following to the .htaccess file in my wikimedia folder:
SecFilterEngine Off
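Switching the filter engine off disables mod_security entirely for that directory. A narrower sketch, assuming the host runs mod_security 1.x and permits its directives in .htaccess, is to stop it from scanning POST payloads only, which is usually what trips on wiki edits:

```apache
SecFilterScanPOST Off
```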
Regex testers
Developing your regex online seems to be the easiest way. Here are a couple of online testers:
Increase upload max from 2M default
First, you'll want to see what your default settings are. I created a small file 'phpinfo.php' to show me the settings:
<?php phpinfo(); ?>
Next, I edited the .htaccess in the directory with the upload script to look like this:
<verbatim>
php_value post_max_size 33554432
php_value upload_max_filesize 33554432
</verbatim>
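33554432 bytes is 32M. PHP's ini parser also accepts the shorthand form, which is easier to read — same effect, assuming the host allows php_value in .htaccess:

```apache
php_value post_max_size 32M
php_value upload_max_filesize 32M
```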
Finally, I verified the change with the 'phpinfo.php' page.
Show php source code
This works on many Linux setups. Any file with a .phps extension will be shown as its PHP source, with HTML coloring. The easy way to add this is to create a symlink like this:
ln -s rss.php rss.phps
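On hosts where the .phps handler isn't preconfigured, mapping the extension yourself may work — a sketch for .htaccess (this is the standard mod_php source handler type; whether it takes effect depends on the host's AllowOverride settings):

```apache
AddType application/x-httpd-php-source .phps
```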
Displaying XML
Today I set out to create an XML RSS feed in PHP. I quickly found some examples, and without too much trouble I was getting my feed to validate at http://feedvalidator.org. The difficult part for me was getting the XML file to render as XML in Internet Explorer 6.0.
The page was rendering as plain text; I wanted Internet Explorer's formatted, collapsible XML tree view.
I knew I needed to include the Content-Type in the header:
<?php
header('Content-Type: text/xml');
echo '<'.'?xml version="1.0" encoding="UTF-8" ?'.'>'."\n";
?>
This didn't help; my XML looked the same. My site is hosted on Apache/Linux, so I started trying to figure out how .htaccess and MIME types work. I tried creating my own .htaccess file and putting AddType into it. This didn't work either. So I decided to copy the .htaccess file from my TikiWiki app, and after that my XML fixed itself. As it turns out, though, the fix had nothing to do with the .htaccess file: I had tried my query in a new window. When there were errors in my XML, it would get sent as plain text, and Internet Explorer was caching that content type instead of picking up the new one when I refreshed the page.
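To rule out this kind of caching while debugging, one option is to temporarily disable caching for the feed — a sketch assuming mod_headers is enabled on the host:

```apache
<Files "rss.php">
    Header set Cache-Control "no-cache, must-revalidate"
</Files>
```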