I had a frustrating error for a while when using suPHP. It kept giving error 500 when enabling mod_suphp for .php files:

Directory /var/www is not owned by user

Of course, the directory was not owned by the user: the user only owned a sub-directory, which was the whole reason for using suPHP in the first place.

It turned out that, to solve this, the directory /var/www had to be owned by root. That fixed the error.
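A minimal sketch of checking and fixing the ownership (adjust the path if your web root differs):

```shell
# See which user currently owns the web root
stat -c '%U' /var/www

# suPHP expects this parent directory to be owned by root;
# the per-site sub-directories can stay owned by their users
sudo chown root:root /var/www
```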

I was configuring an Apache2 setup to work with suPHP and ran into various Internal Server Error 500s due to folders and files having overly generous permissions. suPHP doesn’t like that.

These two commands, when run from the directory where the web files reside, will recursively reset the permissions as required.

Find files and set permissions to 644:

sudo find . -type f -exec chmod 644 {} \;

Find folders and set permissions to 755:

sudo find . -type d -exec chmod 755 {} \;
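If you prefer, the same reset can be done with find’s `+` terminator instead of `\;`, which batches many paths into each chmod call (same effect, far fewer processes on a large tree):

```shell
# Batch files/dirs into as few chmod invocations as possible
sudo find . -type f -exec chmod 644 {} +
sudo find . -type d -exec chmod 755 {} +
```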

I don’t know why I’ve never got round to doing this before, but this handy guide will show you how to create an RSA key and stop having to authenticate with a password each time you try to SSH into your server.

https://help.ubuntu.com/community/SSH/OpenSSH/Keys

Surprisingly simple, and great when managing various servers. Now, why didn’t I get round to doing this when I had 150+ servers in the mid-2000s?
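The guide boils down to two commands; a minimal sketch (the key size is my choice, not from the guide):

```shell
# Generate an RSA key pair; accept the default location,
# and set a passphrase if you like
ssh-keygen -t rsa -b 4096

# Append the public key to ~/.ssh/authorized_keys on the server
ssh-copy-id user@host
```

After that, `ssh user@host` should log you in without a password prompt.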

Note:

When I ran into some problems along the way (due to not setting the permissions to 600 on the authorized_keys file), I used:

ssh -v user@host

To get a verbose output and find the problem.
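The fix in my case was tightening the server-side permissions (these are standard OpenSSH requirements, not specific to that guide):

```shell
# sshd will ignore the key if these are group/world accessible
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
```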

Aside from my belt-and-braces approach of Raid-5 storage, and regular backups to a separate drive, I wanted to make sure my numerous websites and blogs were also backed up off-site.

For WordPress blogs I have installed the following plugin:

WordPress Backup to Dropbox

It’s been running a month or so now and appears to work well.  I have it backing up regularly and will purchase more Dropbox storage if required in the future.  You may even be able to do this with Google Drive in the future if you can get more free space there.

There are some premium features which may save you some space and money in the long run: Zipping up backups before uploading in particular.

Aside from having RAID-5 storage I want to make sure I have incremental, regular and automated backups in case anything goes wrong.  I will always delete something I shouldn’t along the way, or run a command and mess up some data.

For the MySQL databases I run, the AutoMySQLBackup project looks to be the ticket.

My setup is as follows:

  • Mount a network share or create a folder in which to store the backups. I have chosen a separate storage array for the backups and mounted it with NFS.
  • Download and install AutoMySQLBackup.
  • After running and testing the results, add a crontab job (daily, hourly, or whatever suits you) to execute the process without your intervention.

I now have a regular, automated backup of my MySQL databases running on my Ubuntu server.
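For reference, the crontab entry might look something like this (the install path is an assumption; check where your package actually put the script):

```
# m h dom mon dow  command
0 2 * * * /usr/local/bin/automysqlbackup
```

That runs the backup every night at 02:00.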

Tired of having to prefix command-line tasks with ‘sudo’ on Ubuntu?

sudo make

sudo make install

rm -Rf *

After logging in with your normal user account, type:

sudo su - root

And type in the password.

Proceed at your own risk.

This problem keeps hitting me each time I install a fresh WordPress blog.

Not Found

The requested URL /home-network/ was not found on this server.

Every time I have to re-google until I find the solution.

For reference, I’m running Apache2 on an Ubuntu machine.

The content appears correct in the .htaccess file within the base directory:

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)$ /index.php/$1 [L,QSA]
</IfModule>

The issue lies within /etc/apache2/sites-available/nameoffile.conf

The following must be modified:

AllowOverride None

to

AllowOverride All
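In context, the directive sits inside a Directory block; a sketch assuming the default /var/www document root (the other directives mirror Ubuntu’s stock config, only AllowOverride changes):

```apache
<Directory /var/www/>
    Options Indexes FollowSymLinks
    AllowOverride All
</Directory>
```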

You will probably only run into this if you’re a systems administrator, or if your web host has not set this up correctly. Changing the setting and issuing:

service apache2 reload

Did the trick for me.  If it didn’t, you wouldn’t see this page.

The following was kindly provided by my friend, James.  You can find his setup at www.donut-tech.com

I wanted to echo (read: output) some system commands to webpages over time. James has done this as part of a wider project and provided the useful code for top. The top command displays the running processes and the CPU and memory usage, amongst other things.

This code is for a .php page and will execute and return the output of /usr/bin/top:

<html>
<head><title>Pi</title></head>
<body><h1>Pi</h1><br>
<pre><?php
system("/usr/bin/top -b -n 1");
?></pre>
</body>
</html>