Posts Tagged PHP

WordPress, suPHP, and Ubuntu Server 10.04

If you have WordPress running under an unprivileged user account, you may have noticed that when you try to install or delete a plugin, it prompts you for FTP information. This is due to a rather unintuitive way that WordPress checks for file access:

The following code is from the get_filesystem_method() function in wp-admin/includes/file.php:

if( function_exists('getmyuid') && function_exists('fileowner') ){
    $temp_file = wp_tempnam();
    if ( getmyuid() == fileowner($temp_file) )
        $method = 'direct';
    unlink($temp_file);
}

This code creates a temporary file and confirms that the file just created is owned by the same user that owns the script currently being run. In the case of installing plugins, the script being run is wp-admin/plugin-install.php.

This may seem a little counter-intuitive, since the only thing WordPress really needs to be able to do is write to the wp-content/plugins directory.

If you’re on your own server (i.e. your own box or a VPS) and not worried about the security implications, you can simply make the files owned by your web server process (usually www-data or nobody). WordPress’ check will then succeed, and it will no longer ask for your FTP information.
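For example, a recursive chown accomplishes this. The path and the www-data user below are assumptions based on a typical Ubuntu setup; substitute your own:

```
sudo chown -R www-data:www-data /var/www/wordpress
```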

If you’re on your own server and running a shared hosting environment, or just care about the security implications, you should install suPHP.

What are the security implications? If all web files are owned by the web server process, it’s extremely easy for someone to introduce malicious PHP code that affects other sites on the server: since the web server process has access to every site’s files, malicious code would have no problem reaching other files and directories across the server.

suPHP, configured correctly, causes all php scripts under a defined directory (usually /home) to run as the user account they are owned by. It also enforces other security measures, such as requiring that directories and files do not have write permissions for anyone other than the user.

I could go on and on about what it does, but my biggest struggle has been getting it to work. Installation is easy, but it’s painfully clear it does not work out of the box. After dozens of searches I found several different ways of (supposedly) making it work, but some were drastic and neither clean nor easy, few didn’t require recompiling something (which I wasn’t going to do), and none of them seemed to work.

After more than a day of searching and testing, I finally came up with a simple, elegant, working solution. Note that this was written for and tested on Ubuntu Server 10.04 64-bit with libapache2-mod-suphp 0.7.1-1, and may or may not work on other platforms.

Install suPHP:

apt-get install suphp-common libapache2-mod-suphp

Next, edit the sites-enabled/xxxx.conf file for your VirtualHost.

Inside your VirtualHost directive, add:

php_admin_flag engine off
AddHandler application/x-httpd-php .php .php3 .php4 .php5 .phtml
suPHP_AddHandler application/x-httpd-php
suPHP_Engine on
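Put together, the resulting VirtualHost might look like this (the domain and paths are placeholders, not values from the original setup):

```apache
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /home/someuser/public_html

    <Directory /home/someuser/public_html>
        # Disable mod_php and hand .php files to suPHP instead
        php_admin_flag engine off
        AddHandler application/x-httpd-php .php .php3 .php4 .php5 .phtml
        suPHP_AddHandler application/x-httpd-php
        suPHP_Engine on
    </Directory>
</VirtualHost>
```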

Lastly, edit /etc/suphp/suphp.conf and under ;Handler for php-scripts (at the bottom) change:

application/x-httpd-suphp="php:/usr/bin/php-cgi"

to

application/x-httpd-php="php:/usr/bin/php-cgi"

Restart apache and all should be well.

/etc/init.d/apache2 restart

Note: You might get an error message like the following:

Syntax error on line 7 of /etc/apache2/sites-enabled/example.com.conf:
Invalid command 'php_admin_flag', perhaps misspelled or defined by a module not included in the server configuration

In this case, check that you actually have the Apache PHP module installed and enabled. It can occasionally get uninstalled or disabled when upgrading Apache. Here’s how to reinstall and re-enable it:

sudo apt-get install libapache2-mod-php5
sudo a2enmod php5

Checking that it’s working

Create a phpinfo.php file with the following contents:

<?php phpinfo(); ?>

Call it via your browser and check the Server API line near the top: CGI/FastCGI means suPHP is working; anything else means it’s not.
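If you’d rather check from a script, PHP’s built-in php_sapi_name() reports the same information. A small sketch (the wording of the messages is mine):

```php
<?php
// Report which Server API PHP is running under.
// Under suPHP this should be a CGI variant (e.g. "cgi-fcgi");
// "apache2handler" would mean mod_php is still serving the script.
$sapi = php_sapi_name();
if (strpos($sapi, 'cgi') !== false) {
    echo "CGI mode active (suPHP can work): $sapi\n";
} else {
    echo "Not running under CGI: $sapi\n";
}
```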

suPHP is slow!

Yes, unfortunately suPHP is slow: it runs PHP scripts in CGI mode, which reportedly makes them slower. I would argue that the security advantages outweigh the need for fast scripts, but every situation is unique; you have to decide for yourself.

500 Internal Server Error

If you’re getting a 500 Internal Server Error, suPHP is probably working, but for some reason it won’t allow the script to run.

Check that you don’t have any PHP opcode caching (APC, etc.) running; suPHP will never work with an opcode cache enabled, so you must disable it. If you’re using APC, you can disable it system-wide by editing /etc/php5/conf.d/apc.ini and commenting out the line with a semicolon, as follows:

;extension=apc.so

Another important element is file permissions. suPHP will fail (with a 500 Internal Server Error) any file whose permissions are not allowed, as defined in /etc/suphp/suphp.conf. For example:

; Security options
allow_file_group_writeable=false
allow_file_others_writeable=false
allow_directory_group_writeable=false
allow_directory_others_writeable=false

Any file or directory with an attribute set to false will fail: based on the configuration above, any file that is group- or world-writable is automatically rejected, and the same goes for directories. It’s best to leave these options alone (rather than loosening them) and fix the permissions on your scripts instead.
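A quick way to normalize permissions under a site root is a pair of find commands. This helper is a sketch of that idea (the function name is mine), giving directories 755 and files 644:

```shell
# Make a directory tree suPHP-friendly: no group/world write bits.
fix_suphp_perms() {
    find "$1" -type d -exec chmod 755 {} \;   # directories: rwxr-xr-x
    find "$1" -type f -exec chmod 644 {} \;   # files: rw-r--r--
}

# Example (path is a placeholder): fix_suphp_perms /home/someuser/public_html
```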

However, it is supposedly possible to disable these checks on a per-VirtualHost basis. I haven’t tested this.

Also check that your /var/log/suphp/suphp.log file isn’t over 2GB. If it is, rotate it or delete it.
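Rather than cleaning the log up by hand, you can let logrotate handle it. A minimal /etc/logrotate.d/suphp entry might look like this (the weekly schedule and four kept rotations are arbitrary choices):

```
/var/log/suphp/suphp.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
}
```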

If all else fails, check /var/log/suphp/suphp.log and /var/log/apache2/error.log for hints.

Many thanks to all of the blogs and articles that each held a piece of this puzzle. :)


Twitter Follower Count in PHP using Twitter REST API

Here’s an easy way to display the number of Twitter followers you (or another user) have in PHP using the Twitter REST API.

This was based on a post at NealGrosskopf.com, but was revised to use the Twitter API and to not require a logged-in session.

First, declare the necessary functions:

<?php
function curl($url) {
  $ch = curl_init($url);
  curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
  curl_setopt($ch, CURLOPT_HEADER, 0);
  curl_setopt($ch, CURLOPT_USERAGENT, "__YOUR_DOMAIN__");
  curl_setopt($ch, CURLOPT_TIMEOUT, 10);
  $data = curl_exec($ch);
  curl_close($ch);
  return $data;
}

function GetTwitterFollowerCount($username) {
  $twitter_followers = curl("http://api.twitter.com/1/statuses/user_timeline.xml?count=2&screen_name=".$username);
  $xml = new SimpleXmlElement($twitter_followers, LIBXML_NOCDATA);
  return $xml->status->user->followers_count;
}
?>

Now, a simple function displays the count:

<?php
echo GetTwitterFollowerCount("__USER_NAME__");
?>

Replace __YOUR_DOMAIN__ with the domain of the page making the API call, and __USER_NAME__ with the name of the user you want the information on.

Works for me.

UPDATE: For some reason, count=1 broke, but it works with count=2. Either change to count=2 (to reduce the download size) or omit the count= parameter completely (but it can inflate the download size if you have a lot of followers.)

UPDATE 2: If you are using this to fire on every page load, make sure you don’t exceed the Twitter API Rate Limit. If you think you may, you might want to cache the results, else risk being blocked by the Twitter API.
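One simple way to do that caching is to memoize the result on disk. The helper below is a sketch (the function name, file location, and TTL are my own choices, not from the original post):

```php
<?php
// Cache the result of $fetch() on disk for $ttl seconds so the
// Twitter API is hit at most once per TTL window, not on every
// page load.
function cached_value($key, $ttl, $fetch) {
    $file = sys_get_temp_dir() . '/cache_' . md5($key) . '.txt';
    if (file_exists($file) && (time() - filemtime($file)) < $ttl) {
        return file_get_contents($file);   // still fresh: skip the API call
    }
    $value = $fetch();                     // e.g. GetTwitterFollowerCount(...)
    file_put_contents($file, $value);
    return $value;
}
```

Usage would then be something like echo cached_value('followers', 900, function () { return GetTwitterFollowerCount("__USER_NAME__"); });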


Drupal and Yourls

Actually, this should work with any CMS that supports PHP code blocks: Drupal, Joomla, WordPress (with a plug-in), phpBB, etc. It has been tested with Drupal 6 and Joomla 1.5.

You’ll want to replace YOUR-YOURLS-DOMAIN-HERE below with the actual YOURLS domain, and API-SIGNATURE-HERE with the API key found at your YOURLSDOMAIN/admin/tools.php.

<?php
if ( isset($_REQUEST['url']) ) {
  $url = $_REQUEST['url'];
  $keyword = isset( $_REQUEST['keyword'] ) ? $_REQUEST['keyword'] : '';
  if ($keyword) { $keyword = '&keyword='.$keyword; }
  $return = file_get_contents('YOUR-YOURLS-DOMAIN-HERE/yourls-api.php?signature=API-SIGNATURE-HERE&action=shorturl&format=simple&url='.urlencode($url).$keyword);
  echo <<<RESULT
<h2>URL has been shortened</h2>
<p>Original URL: <code><a href="$url">$url</a></code></p>
<p>Short URL: <code><a href="$return">$return</a></code></p>
RESULT;
} else {
  echo <<<HTML
<h2>Enter a new URL to shorten</h2>
<form method="post" action="">
<p><label>URL: <input type="text" name="url" value="http://" size="50" /></label></p>
<p><label>Optional custom keyword: <input type="text" name="keyword" size="5" /></label></p>
<p><input type="submit" value="Shorten" /></p>
</form>
HTML;
}
?>

The idea was based on this post, which I could never get to work for me: it always produced a PHP error, and it depended on having Drupal and YOURLS installed on the same site. The code above works even if the YOURLS installation is remote (on a different server); it only requires that you’re able to get an API key.

Feedback is welcome.


OpenX: Fatal error: Class ‘DataObjects_Clients’ not found in zone-include.php

I've started working with a program called OpenX to handle my ad serving and rotation, and make it easier for me to handle multiple affiliates, advertisers, and the like. During my use, I ran into the following error message:

OpenX: Fatal error: Class 'DataObjects_Clients' not found in openx-2.8.4/www/admin/zone-include.php

I was able to reproduce the error consistently, assuming the following:

  • You have added a website and a zone to that website
  • You have added a user to that website with permissions
  • The added user attempts to link a banner to a zone

I found the following workaround after some intensive Googling, on a Google-cached copy of a post regarding OpenAds (the former name of OpenX):

Edit your zone-include.php file and add the following line:

require_once MAX_PATH . '/lib/max/Dal/Admin/Clients.php';

Works. I haven't noticed any issues since.


Optimizing WordPress

So after my little fiasco with plug-ins and CPU throttling, I’ve been looking for ways to make WordPress at least a little lighter and faster. I’m not going to cover disabling plug-ins, I’m going to go over a few other ways, starting with …

Disabling revisions:

Every time a post is edited and/or published, a new revision is created. These stick around in the database (never deleted) and can not only grow the database, but also lengthen query times. So, per MyDigitalLife and the WordPress Codex, here’s the quick-and-dirty:

…simply add the following line of code to the wp-config.php file located in the root or home directory of your WordPress blog.

define('WP_POST_REVISIONS', false);

If you would rather limit the number of revisions to something small, say 2 for example, just use that number instead of false:

define('WP_POST_REVISIONS', 2);

It should be added somewhere before the require_once(ABSPATH . 'wp-settings.php'); line. That’s it. Revisions will no longer be created. If you want to delete previously created revisions, read on…

Deleting revisions:

So now that you’ve disabled revisions, how do you delete all the old cruft laying around? MyDigitalLife has the answer on this one too.

…and then issue the following [SQL] command (it’s also recommended to back up the database before performing the deletion SQL statements):

DELETE FROM wp_posts WHERE post_type = 'revision';

All revisions should now be deleted from the database.

Caching:

Caching is a hot button for sites that could potentially see high amounts of traffic (and since we would all like to be in that category…). The caching plug-in that I use and recommend is WP Super Cache. The UI is easy enough to navigate, though it does require editing the .htaccess file.

Database queries:

Shared hosting providers get real upset when applications and scripts perform excessive and unoptimized database queries. Heavy themes, excessive numbers of widgets, and badly-written plug-ins all contribute to this. Fortunately, a post on CravingTech points to an easy method to check the number of queries happening on a single page load.

You can insert this snippet of code in your footer.php file (or anywhere) to display the number of queries:

<?php echo $wpdb->num_queries; ?> <?php _e('queries'); ?>

After looking at the number of queries occurring on a page load, try changing themes, disabling plug-ins, and/or reducing the number of widgets on a page to reduce the number of queries. SQL Monitor looks like a useful plug-in for further examining SQL queries, but I haven’t used it, so I can’t comment on its usefulness (or lack thereof).

Also…

I’ve stumbled on some additional information while researching: apparently the “WordPress should correct invalidly nested XHTML automatically” setting (under Settings > Writing) can not only increase the load when a post is saved, but can also break some plug-ins. If you’re familiar enough with (X)HTML to correctly close your own tags, you might actually be better off turning this off.

You can also find other settings for wp-config.php on the WordPress Codex page.


Captchas, Anti-spam services, and Bad Behavior

I run this WordPress blog as well as a Drupal-powered forum site, and one of the biggest challenges any webmaster faces is controlling spam, both in comments and user sign-ups.

I used to rely heavily on captchas, and I’ve gone through several captcha and non-captcha systems trying to find the “ideal” solution: one that cuts spam down to nearly nothing without putting so much of a burden on legitimate users that it deters them from participating in the site.

Here’s what I’ve tried, and what I’ve learned in the process:

reCAPTCHA (WordPress, Drupal): This service aims to stop bots and spammers by presenting two distorted words that the user must type back.

Pros: As a side benefit, the service also helps digitize books by using legitimate users to correctly identify one of the two mangled words presented. Also has a feature called “reCAPTCHA Mail Hide” to hide email addresses behind a captcha and keep them from being harvested by web bots.

Cons: reCAPTCHA has one distinct weakness: Only ONE word needs to be correctly entered to pass the captcha. Additionally, at least one implementation has a weakness making the captcha worthless.

Mollom (WordPress, Drupal, Joomla): Mollom is a text-analysis service with a captcha fallback.

Pros: Aims to be unobtrusive. Does not present the user a captcha unless textual analysis cannot be performed or appears to the service to be a spam submission. Captchas are “cleaner” looking than other services (less visual distortion). Audio captchas.

Cons: Limitations on the free service, and does not scale well. Free service only allows 1,000 legitimate posts per day, then it’s 30 EUR/mo/site. (Around $50 USD). No service uptime guarantee with the free service.

Akismet (WordPress, Drupal): Akismet is a non-captcha anti-spam service that performs textual analysis (similar to Mollom), but entirely without the aid of captchas.

Pros: Comes installed on all WordPress.COM blogs by default and needs no configuration. Powered by, and maintained by Automattic, the same team behind WordPress and Gravatar. Suspicious submissions are placed in a moderation queue for the administrator to manually approve, with the option to automatically expire (delete) them after 30 days or so. Easy setup via an API key.

Cons: Akismet weighs input the same across all Akismet-protected sites. This means that someone whose comment on an Akismet-protected blog gets flagged as spam gets the same treatment on an Akismet-protected forum (and every other Akismet-protected site, for that matter) until enough of their comments are marked as false positives for the system to re-learn that the user is not a spammer. I had a user get hit by this false-positive treatment the first day I implemented Akismet on another site, and it became a hassle: when I enabled Akismet on this WordPress site, his comments were still getting flagged as spam. That’s a serious issue for me. (Akismet FAQ)

Defensio (WordPress, Drupal, Facebook): Similar to Akismet, but weighs each source separately, and offers Facebook protection as well.

Pros: Defensio is a service similar to Akismet, but weighs content from each website (blog, forum, etc) separately to avoid mistakes. You register each web property you want protected and obtain an API key for each. Slow to learn at first, but avoids false-positive/negative and cross-property disasters like I mentioned above with Akismet. This service is a favorite of mine. Additionally offers profanity / file link protections, as well as customizable filters. (Link)

Cons: Slow to learn at first. Might require you to manually flag content until it learns. Currently free, though they mention possibly charging for the service in the future for commercial users.

Bad Behavior (WordPress, Drupal, Joomla): Not a captcha or textual-analysis service at all; it takes a completely different approach.

Pros: Filters access at the HTTP level by blocking proxies, historically abusive IP addresses, suspicious user-agents, and malformed requests. Cuts down on bandwidth, spammers, and users accessing site content through known proxies. Conserves server bandwidth and resources, since pages are not served at all when a block is performed. No training required.

Cons: It’s possible that a number of users whose ISPs force proxies may be blocked, but I have not seen evidence that this is happening on my sites.

So there you have it. Personally, I use a combination of Bad Behavior and Defensio on my sites, and I’ve seen a big drop in the amount of spam.

Have experience with one or more of the above? Please share it!
