Ubuntu Hardy Depots Missing?
Err http://ppa.launchpad.net hardy/main Packages 404 Not Found
Err http://ppa.launchpad.net hardy/universe Packages 404 Not Found
W: Failed to fetch http://ppa.launchpad.net/madman2k/ubuntu/dists/hardy/main/binary-i386/Packages.gz 404 Not Found
W: Failed to fetch http://ppa.launchpad.net/madman2k/ubuntu/dists/hardy/universe/binary-i386/Packages.gz 404 Not Found
E: Some index files failed to download, they have been ignored, or old ones used instead.
Ouch.
I knew hardy support would eventually go away, but not before the next LTS release, which isn’t scheduled for another 4 months.
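Until it gets sorted out, one workaround is to comment out the dead PPA so apt stops complaining; a quick sketch, assuming the entry lives in /etc/apt/sources.list (it may instead sit in a file under /etc/apt/sources.list.d/):
sudo sed -i 's|^deb http://ppa.launchpad.net/madman2k|# &|' /etc/apt/sources.list
sudo apt-get update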
Fixed, 3 Days later ….
What's Wrong with New Linux Users?
Simple. They aren’t willing to spend the same amount of time they’ve spent learning some other operating system to learn Linux.
I’m happy to help them learn Linux in general (not a specific distribution), provided they display a sincere interest and a burning desire to learn.
That doesn’t mean I’ll spoon-feed answers to every question they have (that is impossible), but I will help them learn how to find answers to their questions and teach them things that UNIX-like operating systems can do out of the box that most Windows-based systems cannot.
Before heading down the UNIX OS path, be aware that months of effort will probably be needed. Do you have the stomach for that commitment?
Any takers?
Solved - Adito 0.9.1 Installation Woes
So, the Adito VPN doesn’t really like Linux. The installation goes better than expected on 32-bit Linux, but if you change any of the configuration settings and then have the admin web page reset the server, the system won’t come back up. This happens on both 64-bit and 32-bit Ubuntu 8.04.x in my experience.
Initially, I installed Adito on a 64-bit Ubuntu server, but as far as I can tell, problems with the startup scripts and/or wrapper prevented it from working. I never, never, never got Adito to work after a reboot on 64-bit Linux. I ended up installing a 32-bit VM under ESXi 4 and loading Adito there. The comments below are for 32-bit Linux, since it was clear that the 64-bit startup scripts/wrapper were beyond my ability to fix.
Even on the 32-bit Adito install, the system has never come back up for me after changing settings that require a reboot. NEVER, without doing some extra work. After making any reboot-required changes, only the method outlined below has gotten me a running Adito system again.
The Fix
I’ve tried a few methods (UTF-8 settings, hacked startup files, hacked config files, swapped versions of the libwrapper crap), but only the following has actually worked (a command-line sketch follows the list). As root:
- cd /opt/adito (or whatever directory you used)
- Delete the conf/webserver.properties
- ant install
- Log in through the web interface at http://host:28080
- Step through the install process keeping all the defaults from before. Only the SSL certificate/key will need to be reinstalled.
- Reboot and all is well.
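The same steps as shell commands, a minimal sketch assuming the default /opt/adito location:
cd /opt/adito
rm conf/webserver.properties
ant install
# then browse to http://host:28080, step through the installer keeping
# the previous defaults, re-install the SSL certificate/key, and reboot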
I’ve not lost userIDs, policies, or tunnel settings doing this.
At some point, I’ll need to figure out how to back up just the settings and DB, without grabbing the entire server image.
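Something like the following is probably where I’d start; the paths are assumptions, so adjust to wherever your install keeps its config and embedded database:
tar czf adito-settings-$(date +%F).tar.gz /opt/adito/conf /opt/adito/db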
What’s the issue?
I can only guess, but I believe it is too much of a mix of Java and supporting Java tools. According to the blueprint going forward, they plan to swap Ant for some other tool and remove libwrapper. Ant is used by all Java development teams, so I doubt that is really the issue. libwrapper appears to be a lazy way to set configurations for Java web applications, and there seem to be incompatible versions shipped with Adito as well. The startup process is overly complex too.
/etc/init.d/adito points to /opt/adito/install/platforms/linux/adito, which is overly complex and reads a config file, /opt/adito/conf/wrapper.conf, to figure out which specific version of the wrapper to call in /opt/adito/install/platforms/linux/x86-64/wrapper with the appropriate options (more than 3 options for some reason). /opt/adito/conf/wrapper.conf is rewritten with every run of ant, which means any time you have to modify it to make things work (UTF-8 character sets, for example), any later use of ant will overwrite those settings. CRAP. There are ways to reset everything with ant, or just reset parts, but these resets aren’t explicit in the build.xml file – they are implicit. More crap.
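One way to cope with the overwriting, a hedged sketch rather than anything official: keep a copy of your hand-edited wrapper.conf and diff it back in after each ant run (the .local filename is just my own convention):
cp /opt/adito/conf/wrapper.conf /opt/adito/conf/wrapper.conf.local
# after any 'ant' run, see what got clobbered and re-apply by hand
diff -u /opt/adito/conf/wrapper.conf /opt/adito/conf/wrapper.conf.local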
Anyway, the levels of indirection are too complex and turn something that should be as simple as java /opt/adito/lib/adito.war into something really complex. Yes, I know it isn’t that simple, but it should be. That’s the point.
Update a few weeks later
So we’ve been using the VPN for a few weeks and everything is working well. Performance is good. There are a few things lacking that we’d really like to have, but they are beyond our expertise. Since I’m protecting internal web sites, we didn’t want to leverage LDAP authentication. We really want RADIUS authentication with x.509 certificate exchange. That isn’t a built-in option, so I’ve forced really, really long and complex passwords and don’t allow users to change their passwords for the VPN. Their LDAP passwords are still known only to them; it is just the VPN access that I demand use 60+ characters. I deliver the password to each of them in a KeePass database, further insisting they follow good password management and protection methods.
Deployment to the entire company happened a few months ago. We’ve had no issues and have placed all our applications behind this VPN.
Help Make This Data Better
As I review this post, I can see where it could be confusing. It isn’t particularly well written. Please let me know which parts are too confusing to be useful. Oddly, this is one of the most popular posts on this site.
Linux and the Maytag Repairman
Linux compared to the Maytag Repairman
In summary, IT consultants don’t deploy Linux because deploying Windows gets them more customers and repeat business. Windows requires more IT support than Linux.
If you are a Linux consultant, you must have many more clients to make a living, which means more effort acquiring the clients.
With Linux systems, once they are deployed, they just work and continue to work. For many systems, even patches are handled automatically with no need for onsite support. The only reason to call your Linux consultant is to upgrade a system to a new release OR for a hardware failure.
I don’t know whether any of these statements are completely true, but there is some truth in each of them.
Expiration of CrossOver Linux Professional Support
Last year, the owner of CodeWeavers (makers of CrossOver Office, a commercial Windows compatibility layer for Linux) was forced to back up his President Bush hate speak with a fairly large software giveaway. I don’t recall the exact bet he made, but it was something like “I’ll give my products away if any of these 3 things happen.” One of them was related to the price of gasoline. At least one of them did happen, and he manned up and gave away his products for a few days or weeks. WINE is the free version of this product, just a few months or years behind on compatibility.
Anyone could get a copy, install it on their Linux machine, and use it with support for a year. I did this, but only used it a little. Perhaps … er … twice. I never used it again. I don’t recall why I didn’t use it more. Perhaps it was that not every Windows program worked, or worked perfectly, under CrossOver Office, so I still needed to keep a Windows VM anyway. Regardless, it never crossed my mind to use CO.
This morning, an email arrived with a reminder that support was ending in about a week. I should renew my support if I want the new versions that are coming out soon. I suppose I should go download the current versions (it has been a year, after all), install them, and see if the improvements help with the Windows programs that I use and would like to use under Linux. Those are:
- Quicken 2009
- Investors Toolkit
- MS-Office 2007
- MS-Visio 2007
- then a bunch of Windows-only computer security tools and network scanning tools.
If you work in a structured environment with very specific tools that don’t change very often, you could and should install these tools to validate how well they work. There’s a real savings in using them across an enterprise, but note that patching may not be possible.
I’ll need a Windows VM for the other tools, so I probably won’t remember to use CO. Further, since there is no way to portably install MS-Office, it is a hassle to install it under multiple instances, and it could be in violation of the license agreement. I do own an MS-Office 2003 license and work provides an MS-Office 2007 license, so being legal isn’t a problem, provided I don’t install the same version in both places. Sadly, we’ve standardized on 2007 and 2003 won’t read the new file formats. OpenOffice, which runs everywhere, does a fairly good job with all the new formats, provided you aren’t collaborating and constantly going back and forth with others. It really would be easier to standardize on OpenOffice. Seriously.
A few links:
- http://www.openoffice.org/
- http://www.codeweavers.com/
- http://www.winehq.org/
If you got in on the deal a year ago, check your email for the 50% coupon code.
Private Computing Clouds
This guy gets it. He wrote an article on deploying cloud computing inside your company and why the new Ubuntu Server release needs to be considered.
There are other options, but Ubuntu has the most compelling thing in my mind – APT. APT makes using and deploying applications nearly trivial.
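For example, pulling a complete web server onto a fresh box is a couple of commands (the package name here is just illustrative):
sudo apt-get update
sudo apt-get install apache2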
I’ve been in RPM hell with the competition, so it really isn’t much of an option to me.
VMware has compelling solutions, if you have $4K to spend for every 2-3 servers. Uh…. no thanks.
Ubuntu 9.10 Launch on Oct 29!
You all know I use Ubuntu heavily. I love the application management and update scheme used – APT.
A new version of Ubuntu is scheduled for release this week – 9.10. This isn’t an LTS (Long Term Support) release, so it isn’t suitable for production servers, but it does have enough new and useful things to be worth migration consideration.
I plan to migrate my main desktop to it, but I’ll really be waiting for xubuntu, the lighter weight GUI version.
As I consider the features that are new or made significantly easier, I came across an article where someone compared the so-called new features in Windows 7 with other OSes.
Big new features in Ubuntu 9.10, from my point of view?
- EXT4 – this will be the default file system. Testing has shown it to be faster than most other alternatives. I’ll be staying with JFS on physical hardware and ext3 on virtual hardware; IMHO, more time is needed before I’ll risk my data. I really want ZFS, but since its license isn’t compatible with the GPL, my best hope is btrfs at some future date. Linus is running btrfs on his daily workstation, so that’s a good sign.
- Cloud Computing infrastructure with Eucalyptus. Eucalyptus was introduced in an earlier release, but I didn’t use it. Ok, so it isn’t new, but it is new to me. This is both compute and storage infrastructure that is compatible with Amazon EC2 and S3. Ubuntu has decided to call this UEC going forward. Very nice.
- KVM – Ubuntu has decided to leave my beloved Xen for KVM. That means it will be easier for me to change from Xen to KVM for virtual servers with the next LTS server release. I’ll need a few months to get ready and test.
- Easy HOME directory encryption – I don’t know that I’ll use this, but I will encrypt a subdirectory (a quick sketch follows this list). Yes, I know TrueCrypt has been available for a long time and is cross-platform. Ubuntu adds an auto-mount/unmount as you log in and out of your account. The resulting encrypted data can be moved, but only with the key provided at create time.
- Bluetooth tethering with cell phones. This means you’ll be able to use a cell data connection from your laptop, fairly easily. Great if you travel much.
- New kernel – In every new kernel, there are lots of new features that don’t matter to most of us and a few that matter a bunch. My kernels are so old that there are many, many new-to-me features. Some are security related and others are performance related. KVM is built into the kernel now. That’s better than getting Xen updates that don’t get tested well enough and occasionally break.
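For the encrypted-subdirectory case mentioned above, Ubuntu’s eCryptfs tooling can set up a ~/Private directory that mounts and unmounts with your login; a minimal sketch, not the installer’s whole-home encryption:
sudo apt-get install ecryptfs-utils
ecryptfs-setup-private
# log out and back in; anything placed in ~/Private is then stored encrypted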
Ubuntu is also excited about free cloud storage they will offer. Of course, there’s a storage amount that will force a payment, but ease of use will make this useful to many people. Even if all you do is share your desktop settings or back them up to the cloud, you’ll be better off.
Anyway – Look for bittorrent downloads on 10/29 and get yours running. Please seed for the entire weekend after your download completes to help your fellow users out.
Broken PC? Tried Linux?
Broken PC?
I know a few people who are not really computer savvy who have gotten their computers so messed up that they are unusable. The machine boots, but can’t really do anything. These people think they need to:
- buy a new PC or
- pay $200 to a PC tech to get it fixed.
Both of these methods will work, but why? Chances are, their Windows computer has been hacked or is running spyware. In fact, that last internet website they visited for a fun game may have installed the spyware and then something known as a rootkit. Basically, it isn’t safe to use that PC anymore for any reason.
There is a FREE Option, Linux
So, the PC isn’t really broken, but anything on the hard drive shouldn’t be trusted. Many Linux distributions come as a Live CD – this means you just put the CD into the computer and boot up. Here’s a youtube video showing what this looks like. Most computers will load the OS from CD without ever touching the hard drive. Using one of these, you can use your computer for common tasks that don’t need a hard drive. Using Google, Google Mail, Yahoo Mail, Hotmail … anything online.
Using Linux this way poses no risk to your data or even your hard drive. If you don’t like it, don’t boot from the CD anymore and find another way to use your PC again. Take it to a tech for $200 or buy a new one for $500-$1500. You risk nothing, provided you don’t tell it to install to the hard drive. It will not automatically install to your hard drive unless you ask it to, AND there will be multiple screens and points where you have to clearly answer yes, wipe my disk.
Online Banking
Recently, the Washington Post Security Fix guy recommended everyone perform their online banking using an Ubuntu Live CD. The people with the greatest risk are those using the large USA banks, since hackers have created programs that hide in Windows, watch when you log in to those banking web sites, then cause transactions from your account to their account(s).
Broken PC, How to get Linux?
Most of the people who will be helped by this method have 1 computer at home and don’t have access to another. There’s good news: the Ubuntu Foundation will snail-mail a CD anywhere -- for free. Now you just need to get to a computer to request it. Go to your public library, ask a neighbor, or call me and I’ll enter the data so you’ll get the CD in a week or two.
Ubuntu Linux
Ubuntu is a full-featured operating system like Windows 7 or Vista. It is big and capable, and the load time reflects that. The Live CD should work with 95% of the PCs out there. Put the CD in and boot. That’s it. Ubuntu runs best when it is installed to a hard drive, but you can test drive it forever if you like without touching your old hard drive. Just know that CD-ROM drives are much slower than normal hard disk drives. Ubuntu will run nicely in 512MB of RAM.
There are smaller Linux distributions for when all you need is to get online. Smaller is better for speed, RAM use and simplicity. You can find many more Linux distributions, some highly specialized, at Distrowatch.
There’s a search tool there that will help you select the best distro for your needs. Do yourself a favor and stay with the major distributions, and only those that are Debian-based. Debian is a major distribution known for stability and program-management ease via APT. APT rocks; see my prior article on why.
Be Prepared
All of us have a broken PC from time to time. Be prepared: create (or get) a Live CD Linux distribution and use it once now, while your PC is working. It is really easy.
Memory Use and Win7-x86
Fantastic is the only word I can use. Windows7 x86 memory use is FANTASTIC (meaning low). I’ve done a little optimization using Vista System Optimizer after installing Win7 on my laptop – here are the results:
Win7 = Host OS
Ubuntu = Client VM – 1224MB allocated
The total system memory used with VirtualBox, Ubuntu and Windows Media Player playing a TV show is 1.75GB. 1.2GB of that is allocated to the client VM. Under Vista-64, this same config would use 2.5GB.
Running another VM, WinXP, with 1GB of use, will bring the total memory used to 2.75GB.
Win7 = Host OS
Ubuntu = Client VM – 1224MB allocated
WinXP = Client VM – 1024MB allocated.
This would use almost 4GB in Vista-64.
Even with the 32-bit limitation of 3.5GB of RAM on my system, I actually gain more usable RAM with 32-bit Win7 than with 64-bit Vista, and isn’t giving more RAM to client VMs the goal?
Easy Software Updates and Patches
We all keep every computer we have patched and current, right? Every application. Every web browser and every OS library – fully patched, right?
Well, … er … probably not.
Why not? Because it isn’t easy. Keeping just the OS patched is easy. Apple, Microsoft and Linux distributions have made it even easier the last 5 years. But the OS patches are only 10% of the problem. We need to patch the browser, plug-ins, email, office-apps, and every other application on the machine in a timely manner. Are you up to that task? I’m not.
There’s a simple solution. Linux patch management.
Ubuntu Linux distributions include thousands of free applications and make patching all of them trivial. They are updated just as easily as the OS patches. Let me explain.
Desktop Ubuntu Patching
In the upper right-hand corner of the desktop, Ubuntu places an Update Manager notifier. It is unobtrusive, but clear.
- The notifier is a red !; package information is automatically refreshed daily.
- If any package installed on your system using the Synaptic program has an update, the red notifier is displayed. Just click the icon to start the process and get more information. If there are no updates, the notifier isn’t displayed.
- Applying patches usually doesn’t require you to stop doing whatever you are already doing.
- Applying patches usually doesn’t require a reboot, unless there is a kernel update.
- Applying patches usually doesn’t require any manual configuration file changes. Any changes are usually handled by the installation package.
- Any application installed using the GUI package manager (specifically, from any Debian-based package depot) will be maintained and updated through the same interface.
- Checking whether an update to a package is available is automated.
Server Ubuntu Patching
I’m certain there’s a way to determine when patches are available, but I never check. I simply update the local package depot list and upgrade all installed packages.
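(If you do want to check first, a simulated upgrade lists what is pending without installing anything; a quick sketch:)
sudo apt-get update
apt-get -s upgrade
# -s means simulate: it prints the packages that would be upgraded and changes nothing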
Old way:
sudo apt-get update
sudo apt-get upgrade
2014 update:
sudo aptitude update
sudo aptitude dist-upgrade
I prefer aptitude for a few reasons, but that isn’t critical. The dist-upgrade option will load newer kernels – don’t worry that too much new software will be installed; that isn’t what happens. I’ve been using the 2nd set of commands for a few years without any issues.
Yes, it really is that simple. BTW, these commands work on desktop Ubuntu too. In fact, every Saturday morning, I run a script from a laptop that remotely connects to all the other Ubuntu machines and runs both of those commands.
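That weekly script is nothing fancy; a minimal sketch, assuming key-based SSH and passwordless sudo on each box (the hostnames are placeholders):
#!/bin/sh
# run updates on each Ubuntu machine in turn
for host in server1 server2 htpc; do
    echo "=== $host ==="
    ssh "$host" "sudo aptitude update && sudo aptitude dist-upgrade -y"
done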
There are other options of patching Linux, but 95% of the time, these commands are all that you need to know.