You all know I use Ubuntu heavily. I love the application management and update scheme used – APT.
A new version of Ubuntu, 9.10, is scheduled for release this week. It isn’t an LTS (Long Term Support) release, so it isn’t suitable for production servers, but it does have enough new and useful things to be worth migration consideration.
I plan to migrate my main desktop to it, but I’ll really be waiting for xubuntu, the lighter weight GUI version.
While considering the new features that were added or made significantly easier, I came across an article comparing the so-called new features in Windows7 with those of other OSes.
Big new features in Ubuntu 9.10, from my point of view?
- EXT4 – this will be the default file system. Testing has shown it to be faster than most of the alternatives. I’ll be staying with JFS on physical hardware and ext3 on virtual hardware; IMHO, more time is needed before I’ll risk my data. I really want ZFS, but since its license isn’t compatible with the GPL, my best hope is btrfs at some future date. Linus is running btrfs on his daily workstation, so that’s a good sign.
- Cloud Computing infrastructure with Eucalyptus. Eucalyptus was introduced in an earlier release, but I didn’t use it. Ok, so it isn’t new, but it is new to me. This is both compute and storage infrastructure that is compatible with Amazon EC2 and S3. Ubuntu has decided to call this UEC going forward. Very nice.
- KVM – Ubuntu has decided to leave my beloved Xen for KVM. That means it will be easier for me to change from Xen to KVM for virtual servers with the next LTS server release. I’ll need a few months to get ready and test.
- Easy HOME directory encryption – Don’t know that I’ll use this, but I will encrypt a subdirectory. Yes, I know TrueCrypt has been available for a long time and is cross-platform, but Ubuntu adds automatic mounting and unmounting as you log in and out of your account. The encrypted directory can be moved, but it can only be read with the key provided at creation time.
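For the encrypt-a-subdirectory case, my understanding is that Ubuntu uses the ecryptfs-utils tooling under the hood. A sketch of what that looks like (commands from that package; I haven’t verified this exact sequence on 9.10):

```
$ sudo apt-get install ecryptfs-utils
$ ecryptfs-setup-private    # creates an encrypted ~/Private, auto-mounted at login
```

After logging out and back in, anything dropped into ~/Private is stored encrypted.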
- Bluetooth tethering with cell phones. This means you’ll be able to use a cell data connection from your laptop, fairly easily. Great if you travel much.
- New kernel – Every new kernel brings lots of new features that don’t matter to most of us and a few that matter a bunch. My kernels are so old that there are many, many new-to-me features, some security related and others performance related. KVM is built into the kernel now. That’s better than getting Xen updates that aren’t tested well enough and occasionally break.
Ubuntu is also excited about the free cloud storage they will offer. Of course, beyond a certain amount of storage you’ll have to pay, but the ease of use will make this useful to many people. Even if all you do is share your desktop settings or back them up to the cloud, you’ll be better off.
Anyway – look for bittorrent downloads on 10/29 and get yours running. Please seed for the entire weekend after your download completes to help your fellow users out.
I know a few people who are not really computer savvy who have gotten their computer so messed up that it is unusable. It boots, but can’t really do anything. These people think they need to:
- buy a new PC or
- pay $200 to a PC tech to get it fixed.
Both of these methods will work, but why? Chances are, their Windows computer has been hacked or is running spyware. In fact, the last website they visited for a fun game may have installed the spyware and then something known as a rootkit. Basically, it isn’t safe to use that PC anymore for any reason.
There is a FREE Option: Linux
So, the PC isn’t really broken, but anything on the hard drive shouldn’t be trusted. Many Linux distributions come as a Live CD – this means you just put the CD into the computer and boot up. Here’s a youtube video showing what this looks like. Most computers will load the OS from CD without ever touching the hard drive. Using one of these, you can use your computer for common tasks that don’t need a hard drive: Google, Google mail, Yahoo mail, Hotmail … anything online.
Using Linux this way poses no risk to your data or even your hard drive. If you don’t like it, don’t boot from the CD anymore and find another way to use your PC again – take it to a tech for $200 or buy a new one for $500-$1500. You risk nothing, provided you don’t tell it to install to the hard drive. It will not install to your hard drive unless you ask it to, AND there will be multiple screens and points where you have to clearly answer yes, wipe my disk.
Recently, the Washington Post Security Fix Guy has recommended everyone perform their online banking using an Ubuntu Live CD. The people with the greatest risk are those using the large USA banks, since hackers have created programs that hide in Windows and watch when you login to those banking web sites, then cause transactions from your account to their account(s).
Broken PC, How to get Linux?
Most of the people who will be helped by this method have one computer at home and don’t have access to another. There’s good news: the Ubuntu Foundation will snail-mail a CD anywhere – for free. Now you just need to get to a computer to request it. Go to your public library, ask a neighbor, or call me and I’ll enter the data so you’ll get the CD in a week or two.
Ubuntu is a full featured operating system like Windows7 or Vista. It is big and capable, and the load time reflects that. The Live CD should work with 95% of the PCs out there. Put the CD in and boot – that’s it. Ubuntu runs best when it is installed to a hard drive, but you can test drive it forever if you like without touching your old hard drive. Just know that CD-ROM drives are much slower than normal hard disk drives. Ubuntu will run nicely in 512MB of RAM.
There are smaller Linux distributions for when all you need is to get online. Smaller is better for speed, RAM use and simplicity. You can find many more Linux distributions, some highly specialized, at Distrowatch.
There’s a search tool that will help you select the best distro for your needs. Do yourself a favor: stay with the major distributions, and only those that are debian-based. Debian is a major distribution known for stability and program management ease via APT. APT rocks – see my prior article on why.
All of us have a broken PC from time to time. Be prepared: create (or get) a Live CD Linux distribution and use it once now, while your PC is working. It is really easy.
Fantastic is the only word I can use. Windows7 x86 memory use is FANTASTIC (meaning low). I’ve done a little optimization using Vista System Optimizer after installing Win7 on my laptop – here are the results:
Win7 = Host OS
Ubuntu = Client VM – 1224MB allocated
The total system memory used with VirtualBox, Ubuntu and Windows Media Player playing a TV show is 1.75GB. 1.2GB of that is allocated to the client VM. Under Vista-64, this same config would use 2.5GB.
Running another VM, WinXP, with 1GB of use, will bring the total memory used to 2.75GB.
Win7 = Host OS
Ubuntu = Client VM – 1224MB allocated
WinXP = Client VM – 1024MB allocated.
This would use almost 4GB in Vista-64.
Even with the 32-bit limitation of 3.5GB of RAM on my system, I actually gain more usable RAM with 32-bit Win7 than with 64-bit Vista – and isn’t giving more RAM to client VMs the goal?
We all keep every computer we have patched and current, right? Every application. Every web browser and every OS library – fully patched, right?
Well, … er … probably not.
Why not? Because it isn’t easy. Keeping just the OS patched is easy. Apple, Microsoft and Linux distributions have made it even easier the last 5 years. But the OS patches are only 10% of the problem. We need to patch the browser, plug-ins, email, office-apps, and every other application on the machine in a timely manner. Are you up to that task? I’m not.
There’s a simple solution: Linux patch management.
Ubuntu Linux distributions include thousands of free applications and make patching all of them trivial. They are updated just as easily as the OS patches. Let me explain.
Desktop Ubuntu Patching
In the upper right-hand corner of the desktop, Ubuntu places an Update Manager notifier. It is unobtrusive, but clear.
- The notifier is a red ! icon; package information is automatically updated daily.
- If any package installed on your system using the Synaptic program has an update, the notifier is displayed in red. Just click the icon to start the process (and get more information). If there are no updates, the notifier isn’t displayed.
- Applying patches usually doesn’t require you to stop doing whatever you are already doing.
- Applying patches usually doesn’t require a reboot, unless there is a kernel update.
- Applying patches usually doesn’t require any manual configuration file changes. Any changes are usually handled by the installation package.
- Any application installed using the GUI package manager (specifically, from any debian-based package depot) will be maintained and updated through the same interface.
- Checking whether an update to a package is available is automated.
Server Ubuntu Patching
I’m certain there’s a way to determine when patches are available, but I never check. I simply update the local package depot list and update all installed packages.
sudo apt-get update
sudo apt-get upgrade
Yes, it really is that simple. BTW, these commands work on desktop Ubuntu too. In fact, every Saturday morning, I run a script from a laptop that remotely connects to all the other Ubuntu machines and runs both of those commands.
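That Saturday script is roughly the following sketch. The host names here are made up, and the ssh call is commented out so the skeleton can be read on its own; the real list is whatever Ubuntu machines you manage:

```shell
#!/bin/sh
# Weekly patch run over ssh -- a hypothetical sketch, not the exact script.
# Host names below are examples; replace with your own machines.
HOSTS="server1 server2 laptop2"
CMD="sudo apt-get update && sudo apt-get -y upgrade"

for h in $HOSTS; do
    echo "== patching $h =="
    # Uncomment to actually run the update remotely:
    # ssh -t "$h" "$CMD"
done
```

The `-t` flag keeps a terminal attached so sudo can prompt for a password if needed; with passwordless sudo for apt-get, the whole run is hands-off.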
There are other ways to patch Linux, but 95% of the time, these two commands are all you need to know.
Wouldn’t it be nice if other computer vendors made software updates that easy?
I’m asking for help again with my Windows7 final installation. See, Microsoft gave me the 32-bit version, not the 64-bit version. This puts a wrinkle in my original plan to host Win7 on the laptop, because about 0.6GB of RAM cannot be used. On a system with only 4GB, 0.6GB is a bunch – perhaps too much to waste.
The current goal is:
JeOS/Linux-Host
 |____Win7-VM (MCE)
 |____xubuntu-VM
 |____WinXP-VM (Visio / MS-Office / Quicken)
RAM allocation plans
JeOS – 512MB
Win7 – 1GB
WinXP – 1GB
xubuntu – 1.5GB
If Media Center in Win7 doesn’t work well enough in a VM – safe to leave on 24/7, with USB support – this plan will be trashed. The QAM recording is nice. For me, it is about the recording, not the playback or other features.
There are other complications in using Win7 Media Center. The recorded file format, for example. That’s something for another story.
32-bit DVD – ouch.
So I opened the Windows7 Ultimate DVD and discovered that it only contains the 32-bit version. After swapping the old/Vista drive with the new/empty drive in the laptop, I elected to install Win7 even though I’d end up unable to use about 0.5GB of RAM. I wanted to give the new OS a fair chance and gain some experience. The setup was fairly easy, but dumbed down too much for my liking. I actually installed the OS to the wrong partition (280GB), wiped it, and reinstalled to the other partition (30GB) that was planned for OS and Apps. Then I proceeded to set up WMC – Media Center.
Windows Media Center – ClearQAM Supported!
I’d heard that ClearQAM was supported and looked forward to using it. My cable system switched almost all channels to QAM 2 months ago. I’d hoped there was an automatic translation between QAM channels and normal cable channels so guide data can be used. I haven’t found that, if it exists. I AM recording a movie as I write this. There’s no noticeable performance hit. Nice.
Overnight, I copied the data and virtual machines from the older drive to the new drive, about 150GB. I split the disk into 2 partitions – C: and D: . C is for the OS and programs. D is for data and virtual machines. This config should make data backup much easier.
I dislike the whole Library BullShite that this new OS forces. I also dislike the new Explorer look and feel. Is there a way to default all Explorer views to Detailed?
So, after getting the new OS installed, the very first program installed was Sun’s VirtualBox. Initial attempts to migrate all the settings and virtual disks didn’t work as well as I’d hoped. However, I did get 1 VM up and running with a small amount of effort. I’ll write up the actual steps and things that didn’t work in another post. There may be another way to migrate the settings; I did retain both XML files, for Vbox and for each VM, so seeing the specific differences should be easy.
The next trick is to migrate a vbox image that includes a snapshot image. I’m cautiously hopeful for a good outcome with that.
I need your help deciding how to use the free Windows7 Ultimate license Microsoft gave away yesterday. I want to use it on my laptop, but need some considered feedback on the best way to do that.
Current Laptop Config
- 4GB of RAM – may put 8GB in later
- 320GB disk
- Main OS is Vista-64bit Home Premium
- VirtualBox 3.0.6 for Virtual Machines
- WinXP Pro
My initial thoughts are to
- replace Vista with Win7-64
- eventually remove my WinXP-Pro VirtualBox
- use the built-in WinXP Compatibility layer
I spend 14 hrs a day in the xubuntu VM and only boot WinXP to run Quicken, a few MS apps and access TrueCrypt data. Perhaps 3 times a week.
- How good is the USB support in the WinXP VM?
- HDMI output?
- GigE networking – WiFi networking?
- How good is the driver compatibility for Win7-64? An All-in-One fax/printer/scanner, an old Creative Zen, and the built-in laptop camera are the only devices I see using, in addition to normal flash and external USB disk drives.
- Hauppauge 950Q ClearQAM TV tuner must work.
- Does Media Center work with this TV tuner and ClearQAM? The current MCE doesn’t.
- Can I consider Win7-32bit at all? Does it access the full 4GB of RAM? Is an upgrade to Win7 64-bit easy?
- TrueCrypt, MS-Visio, MS-Office 2007, and VideoRedoPlus are the only uses for Windows that I have. No gaming, er … very little gaming.
- Run Win7 in a VM, get used to it, decide later.
- Back up the data and VMs, repartition the disk for OS, Apps, and Data, and install Win7 ??-bit as the main OS – 32-bit or 64-bit?
Thoughts and suggestions? Did I miss an option?
Getting Syslog, Pound and Mongrel to work with Awstats
If you run the Ruby on Rails blog engine Typo, it is likely you are using Mongrel as a cluster server and not Apache. Mongrel is easy – really easy. If you need 5 backend ruby servers, change 1 entry in the mongrel_cluster.yml file and restart.
Pound is a very simple load balancer written in C. Many very busy websites use it; Slashdot, for example, reportedly gets 40M pages a day, all going through pound. Scalable? Check.
I’ll assume you already have syslog, awstats, pound, and typo/mongrel installed and simply want better logging. Explaining the nontrivial setup of these sometimes-complex systems is not something handled in a blog post. You’ll need to be root, or have root editing via sudo, to make this happen. A knowledge of manpages won’t hurt either.
So now you have two non-standard programs handling your web traffic, mongrel and pound. You’d like to get some normal website statistics about your users. By default, pound logs via syslog. We love syslog, but we don’t like that pound doesn’t use a separate file by default: all your web traffic logs get intermixed with logins, attempted hacks, disk failures and other system messages.
- set up syslog to trap pound messages and drop them into a new logfile – /var/log/pound.log
- set up the new logfile to be automatically created, should it disappear
- set up pound to write to the new pound.log file via syslog
- automatically perform log file rotation, in the normal way
- create a custom log_file_format so awstats gets all the data it can from the logs
- be certain that restarting pound gets syslog to bounce or restart too
Here are the changes specific to the logging changes. You should have other changes for your server/domain already in these files. Before starting, you probably want to run an
awstats.pl -config=domain.com -update
to capture the latest stats before you move them all into a new location.
LogFormat="%time3 - - %host_r %host - - %time1 %methodurl %code %bytesd %refererquot %uaquot"
See, local0 – local7 are reserved syslog facilities, meant to be used just like this. Syslog knows about them; we just need to be certain that pound will use local0 too. If your system is already using local0, then select local1, local2, … local7 – whichever isn’t already in use.
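As a sketch, the two relevant pieces look like this (file paths may differ slightly by release; on Ubuntu 8.04 the syslog config lives in /etc/syslog.conf and pound’s in /etc/pound/pound.cfg):

```
# /etc/syslog.conf -- send the local0 facility to its own file;
# the leading "-" means asynchronous (non-fsync'd) writes
local0.*        -/var/log/pound.log

# /etc/pound/pound.cfg -- tell pound to log via that facility
LogFacility     local0
```

Reload sysklogd after editing so the new rule takes effect.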
You’ll need to have your Service, Redirect, and BackEnd stanzas too.
In the pound init script, so the logfile is recreated if it ever disappears:

if [ ! -e /var/log/pound.log ] ; then
    log_warning_msg "Creating pound.log ..."
    touch /var/log/pound.log
    chmod 0644 /var/log/pound.log
    chown syslog.adm /var/log/pound.log
    /etc/init.d/sysklogd reload > /dev/null
else
    log_success_msg "pound.log was found"
fi
And in the logrotate config for pound (e.g. /etc/logrotate.d/pound), recreate the file with the right ownership and bounce syslog after each rotation:

/var/log/pound.log {
    create 640 syslog adm
    postrotate
        /etc/init.d/sysklogd reload > /dev/null
    endscript
}
Good enough? Now just restart pound with
sudo /etc/init.d/pound restart
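To sanity-check the syslog plumbing without waiting for real traffic, you can hand-write a test line to the facility (logger is standard on Ubuntu):

```
$ logger -p local0.info "pound.log test line"
$ tail -1 /var/log/pound.log
```

If the test line shows up in pound.log, the facility routing is working.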
The next time your awstats is updated, you’ll see more and better stats. Note that we didn’t touch any of the old rrd data that awstats may have been able to parse.
This worked on an Ubuntu server 8.04 LTS running in a Xen virtual machine. There are other ways to do this, and some settings can be changed without impacting whether this continues to work or not. Obviously, your situation will be a little different and you’ll need to figure out which differences matter and which don’t. Did I miss something important, or does anything need clarification? Use the comments or talkback to let me and other readers know, please.
For the last 10 years, I’ve been doing batch jobs on my server the hard way.
That’s a big confession. For 10 years, I’ve been doing it the hard way. You know: you have a bunch of things to get done, but don’t want them all to run at the same time. Hundreds of little jobs or perhaps 20 BIG jobs, it doesn’t matter. All this time, I’ve been using ‘at’ as a manual scheduler – basically, do something in 20 min, or 60 min, or 2 hours, or next Friday. ‘at’ is fairly powerful, but for batch jobs where the goal is to use the CPU to the fullest without overtaxing it, ‘at’ is less than ideal. There could be too many jobs running, or unused CPU time. Inefficient.
Then I finally followed up on a freshmeat.net announcement – ts. Task Spooler, ts, is just that: a queue of tasks. It is a queue where you submit batch jobs to be run. By default, there’s no configuration needed; at this point, I’m not using any configuration. Basically, you prepend ‘ts’ in front of your normal command and it adds each one to the queue. Installation was trivial – make install.
Command line options work as you’d expect – they are passed to the batch unmolested. The environment is also properly retained.
$ ts encode_video some_video_1.mpg
$ ts encode_video some_video_2.mpg
...
$ ts encode_video some_video_30.mpg
$ ts encode_video some_video_40.mpg
is all that is needed. To monitor your jobs, run ‘ts’ alone.
I’ve told my ts server to run 2 jobs, since I have a dual core processor. It will always ensure no more than 2 jobs are running. The output can be captured and logged or stored into files, or whatever.
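Setting the number of simultaneous jobs is a single stock ts flag, nothing custom:

```
$ ts -S 2     # allow 2 jobs to run at once, one per core
$ ts          # list the queue and each job's state
```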
You do have to clean up the list of finished jobs occasionally. That’s just ‘ts -C’.
It understands that you may want more than 1 queue – using environment variables, you can set up multiple queues with different settings. Then you can set your scripts to use whatever job queue you like. To set up different queues, just set the environment variable that controls the FIFO used. Here’s an example.
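A minimal sketch, using ts’s TS_SOCKET variable to select the queue socket (the socket paths and commands here are made-up examples):

```
$ TS_SOCKET=/tmp/ts-downloads ts wget http://example.com/big.iso
$ TS_SOCKET=/tmp/ts-backup    ts rsync -a /home /backup/home
$ TS_SOCKET=/tmp/ts-downloads ts    # list only the download queue
```

Each distinct socket path gets its own independent server, queue and slot count.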
Uses for different queues?
- a download queue
- a backup queue
- a CPU intensive use queue
Oh, source code is provided. I’m using it on Linux, but I’d guess it will work on any POSIX-compliant OS. Get it here.