Sometime on Monday, the database that our blog software runs on became corrupted to the point that accessing the blog wasn’t possible for hours, perhaps many, many hours.
I don’t know how long the problem existed, just that I posted a few new articles in the morning and didn’t check back until late afternoon, when I found the process eating 99.99% of the available CPU and not serving any pages.
Below is the first of 6 questions from a reader. I definitely don’t have all the answers, but I’m not short on opinion. ;)
Laurens Duijvesteijn asks:
I have a total of 5 quiet 5400RPM 1TB drives configured in a RAID5+1 array. I installed Ubuntu Server 10.04 onto LVM; inside the LVs, JFS is used as the filesystem. Is this good practice?
I have used Thunderbird for at least 8 years, and used Mozilla Mail built into Mozilla/Netscape before that. When the company started using Zimbra for email, IM, and calendaring, Lightning never quite worked correctly with it. With v5 of Thunderbird, the integration with Zimbra through Lightning is working well. After using it for about 2 months, I haven’t seen any failures – even on complex calendar settings.
Thunderbird v5 + Lightning Installation Steps
These instructions are for Ubuntu, but probably work with other distros too.
$ sudo add-apt-repository ppa:mozillateam/thunderbird-stable
$ sudo apt-get update
$ sudo apt-get install thunderbird xul-ext-lightning
Sometimes I lose track of all the devices on a network and need a reminder of everything that is there. Under IPv6, you won’t scan the entire subnet – it would take millions of years – but under IPv4, a scan still works. nmap is good for this, and running it with operating system fingerprinting goes quickly (relatively speaking).
nmap OS fingerprint command
$ sudo nmap -O 192.168.0.0/24
Bear with me here. This is a great technique. I think you’ll thank me later after doing what this article suggests.
Homes and businesses today have lots of network devices. Using DHCP is the easiest way to get them on the network, but if you ever want those devices to talk to each other, perhaps to transfer a file or to back up to a central server, then you are running a network. Running a network means you probably want to know which devices are on it, or maybe that is just me. Perhaps you want each device to be able to locate every other device too? Static IPs are possible under DHCP; this is sometimes called DHCP Reservations or Static Leases.
Make it easy for everyone in the house by using your router to assign static IPs to devices when they are at home, while still letting them connect to other DHCP networks easily when roaming. This works really well for portable WiFi devices like laptops and smartphones, and for home entertainment devices that easily support DHCP.
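If your router runs dnsmasq (as many home routers and OpenWrt builds do), a reservation is one line per device. A minimal sketch; the MAC addresses, IPs, and hostnames below are made up:

```
# /etc/dnsmasq.conf (hypothetical entries; substitute your own devices)
# dhcp-host=<MAC address>,<fixed IP>,<hostname>
dhcp-host=aa:bb:cc:dd:ee:01,192.168.0.10,laptop
dhcp-host=aa:bb:cc:dd:ee:02,192.168.0.11,mediabox
```

The device still asks for an address via plain DHCP, so it roams to other networks normally; your router simply hands it the same address every time it comes home.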
Linux/Ubuntu (maybe others) – ssh key-based authentication made easier.
You know that you shouldn’t be using passwords to remotely connect to another machine, but setting up key-based authentication has always seemed just a little too much hassle to bother with. It really is simple, and there’s a tool to make it even easier: ssh-copy-id, included with Ubuntu-based distros (and probably others), pushes the public key from your desktop to a server and appends it to the end of the server’s ~/.ssh/authorized_keys file.
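The whole workflow is two commands; the hostname below is hypothetical. The second half of the sketch simulates what ssh-copy-id does on the server, using a local temporary directory to stand in for the remote ~/.ssh:

```shell
# Generate a key pair if you don't already have one (accept the defaults):
#   ssh-keygen -t rsa
# Push the public key to the server (hypothetical host):
#   ssh-copy-id user@server.example.com

# What ssh-copy-id does on the server, simulated locally:
remote_ssh=$(mktemp -d)                            # stands in for the server's ~/.ssh
pubkey="ssh-rsa AAAAB3ExampleKeyData me@desktop"   # stand-in public key
chmod 700 "$remote_ssh"
echo "$pubkey" >> "$remote_ssh/authorized_keys"
chmod 600 "$remote_ssh/authorized_keys"
grep -c "me@desktop" "$remote_ssh/authorized_keys" # prints 1
```

After the real command runs, `ssh user@server.example.com` should log you in without a password prompt.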
FTP, the File Transfer Protocol, has been around since the beginning of the internet in the early 1970s. It transferred files when the internet was a safer, more trusting place. That isn’t the case anymore. Using FTP to host files is probably a bad idea for almost everyone. FTP is like Telnet: no encryption is used for anything, and these days we know that is bad.
In the mid-1990s, most organizations stopped using telnet and switched to ssh, the secure shell. FTP needs to be replaced for the same reasons. Below I’ll describe why very few people should use plain FTP anymore to remotely access files.
About a month ago, an editor at a large blog website followed one of my links in a comment there back here and offered to republish the story. I was already seeing increased traffic from that link on their site – like 10x more than my normal daily traffic – and it scared me. I don’t have the bandwidth to handle that sort of traffic and my Ruby on Rails blog software … er … pretty much sucks from a scalability perspective. What did I do?
I decided to write this entry after reading an article over at Lifehacker by Whitson Gordon titled What Kind of Maintenance Do I Need to Do on My Windows PC?
What kind of maintenance do I need to do on my Ubuntu/Debian/APT-based PC? Good question. It is pretty simple … for desktops. This article is for APT-based desktop system maintenance, NOT for Linux servers; servers need just a little more love to stay happy. I haven’t used RPM-based distros in many years, so I’m not comfortable providing commands for them, but the methods will be similar.
Let’s get started.
Install System and Application Patches/Updates
This will patch the OS and all your applications.
$ sudo apt-get update; sudo apt-get dist-upgrade
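If you’d rather not remember to run this by hand, Debian/Ubuntu can apply updates automatically through the unattended-upgrades package (`sudo apt-get install unattended-upgrades`). A minimal sketch; the file below is the conventional location and the values are the common ones, but check your release:

```
# /etc/apt/apt.conf.d/20auto-upgrades
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
```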
Read about more tips below.
There’s an old SSL/TLS security hole (disclosed 11/2009) that has had a fix available for over a year (since 2/2010), but it appears that many major websites haven’t bothered patching it: CVE-2009-3555.
The guys over at ssltls.de have a list. It seems that patching consistently is tough for many organizations. The list of who is and isn’t patched is pretty shocking. Take a look and be afraid. There are lots of big banks on the unpatched list. Scary. The list is not comprehensive, so just because your site or bank isn’t listed doesn’t mean it is patched.
- home.americanexpress.com is patched, but
- www.americanexpress.com cannot be confirmed as patched.
There are attacks in the wild that take advantage of this issue. I need to check whether my SSL sites are vulnerable too. Here’s an SSL checker
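One way to check a site yourself, assuming a reasonably recent OpenSSL is installed (the hostname below is a placeholder, substitute your own):

```shell
# Look for the renegotiation line in the s_client handshake summary.
# Servers patched for CVE-2009-3555 report
#   "Secure Renegotiation IS supported"
# while unpatched ones report "IS NOT supported".
openssl s_client -connect www.example.com:443 < /dev/null 2>/dev/null \
  | grep -i "secure renegotiation"
```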