Err http://ppa.launchpad.net hardy/main Packages 404 Not Found
Err http://ppa.launchpad.net hardy/universe Packages 404 Not Found
W: Failed to fetch http://ppa.launchpad.net/madman2k/ubuntu/dists/hardy/main/binary-i386/Packages.gz 404 Not Found
W: Failed to fetch http://ppa.launchpad.net/madman2k/ubuntu/dists/hardy/universe/binary-i386/Packages.gz 404 Not Found
E: Some index files failed to download, they have been ignored, or old ones used instead.
I knew hardy support would eventually go away, but not before the next LTS release, which is still 4 months away.
Fixed, 3 days later…
We’ve been using SysUsage to monitor general performance of our Linux servers for a few years. Version 3 was released recently with a new web GUI and simpler installation, but not quite the trivial apt-get install that we’d all love. View a demo.
Anyway, go grab a copy of the source tgz and follow along.
tar zxvf SysUsage-Sar-3.0.tar.gz
cd Sys*0
sudo apt-get install sysstat rrdtool librrds-perl
perl Makefile.PL
make
sudo make install
sudo crontab -e
Drop these lines into the root crontab.
*/1 * * * * /usr/local/sysusage/bin/sysusage > /dev/null 2>&1
*/5 * * * * /usr/local/sysusage/bin/sysusagegraph > /dev/null 2>&1
I performed these steps using Cluster SSH on almost all our Ubuntu 8.04.x servers; each installation worked.
If you have Apache running in the normal place, browse over to http://localhost/sysusage/. If you don't run a web server, try firefox /var/www/htdocs/sysusage/index.html to see the results.
Further, our simple rsync-over-ssh scripts that pull the SysUsage output back to a central performance server still work. Some of the old data from v2.12 of the program is still inside the RRD files; it isn't clear at this point whether that data will be used in the new graphs or not. It takes about a day for the graphs to become useful.
Saw an article today that someone has decided to sell a WPA passphrase cracking service for about $40. It takes up to about 40 minutes, but on average just 20. Seems he has a Beowulf compute cluster with idle time.
If this is what some guy can do, imagine what different governments can do.
Once again, consumer-grade WiFi is deemed non-secure. Go wired if you care at all. There is no secure wifi/radio networking, none.
We all get emails asking us to do something. Sometimes the email includes a link to a specific web site to help you complete the task. Well, unless you only view plain-text email messages (no RTF, no HTML), you can't trust that the URL you click is really where you are being taken. If there are any misspellings or simple grammar issues, ignore the email.
Don't click links provided in emails. Rather, manually go to the website. Use your password manager, like KeePass, to open the correct page and enter your login credentials. Or you could type the known URL; just don't click the URL in the email. Simple enough?
You know this stuff. You know not to click. But it takes just one small mistake to be pwned, and you may not realize it for a few days, if ever.
Hak5.org forums were hacked in 2009. When the maintainers realized it, they sent an email warning everyone and suggesting that you never use the same password on multiple websites. Good advice. If you use a password manager, like KeePass, you never need to worry about reusing the same password. Just use the Generate button to create a strong password for each website you visit.
Web site administrators are now being targeted by people who want more bot controllers in their bot networks, since compromising an administrator gives access to the servers. I've used cPanel before at a hosting provider, and it would be easy for that page to be cloned yet still appear to work: the cloner grabs the login credentials and passes them on to the real cPanel page. When you run a web site, its management often bounces between lots of unrelated web servers that you aren't used to seeing, adding to the confusion. Even Yahoo hosting was targeted, so I can't believe that some very popular, yet cheap, providers aren't also.
I get spam emails all the time, usually sent from small-business servers. These servers could be misconfigured to allow email relaying or compromised by some other method. Regardless, sending an email to the administrator never seems to help.
During holidays, we all get Holiday eCards asking us to click on a URL. I've gotten three so-called ICQ Greeting Cards this week. The link in all three of those emails led to some nasty software, for sure. Don't click. They even included safe links to the real ICQ site in the email to earn my trust.
Be careful out there, especially if you are an administrator for others or public facing internet services. I expect to be hacked at some point. I have been hacked – over 10 years ago. Hopefully, being hacked again won’t happen for some time.
Today I was going through my list of files to backup on my Linux laptop and removing temporary and cache files when I came across a directory that I didn't recognize. The files were listed as changed within the last 3 days.
changed .purple
changed .purple/accels
changed .purple/accounts.xml
changed .purple/blist.xml
changed .purple/prefs.xml
changed .purple/status.xml
It turns out they are for pidgin, the extremely popular Instant Messaging software. Ok, I use that – fine. But my interest got the best of me and I looked at the accounts.xml file. Obviously it is an XML file, but I was shocked to discover the following (modified for my protection):
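The entry looked roughly like this; the protocol, screen name, and password below are invented placeholders, but the real file lists yours in exactly this form:

```xml
<account>
  <protocol>prpl-aim</protocol>
  <name>myscreenname</name>
  <password>MySecretPassword</password>
</account>
```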
The password isn’t encrypted. Not at all!
This is unacceptable.
There is an encryption plugin for pidgin but it is for IMs, not the stupid passwords. This is just crazy. Heck, there are ROT13 methods and trivial 2-way password encrypt/decrypt methods which could be used if necessary.
The pidgin wiki has this to say. I have to admit, they do have a point, but I still disagree with it. At least they do set the directory permissions to 700 and file permissions to 600 (user only), but this doesn’t help with my backups placed on another system, does it?
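One workaround for the backup problem is to encrypt the directory before it ever leaves the machine. A minimal sketch, assuming gpg is available and using a symmetric passphrase; the paths and passphrase handling are illustrative, not a hardened backup scheme:

```shell
#!/bin/sh
# encrypt_dir DIR OUT PASSFILE -- tar a directory and symmetrically
# encrypt it with gpg before it goes to the backup host.
# PASSFILE should be mode 600 and must not ride along in the backup;
# all paths here are illustrative assumptions.
encrypt_dir() {
    dir=$1
    out=$2
    passfile=$3
    tar czf - -C "$(dirname "$dir")" "$(basename "$dir")" |
        gpg --batch --yes --pinentry-mode loopback \
            --passphrase-file "$passfile" --symmetric -o "$out"
}

# Example: encrypt_dir "$HOME/.purple" /backups/purple.tgz.gpg "$HOME/.backup-pass"
```

The backup server then only ever sees the .gpg blob, plaintext passwords included or not.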
So, Adito/VPN doesn’t really like Linux. The installation goes better than expected on 32-bit Linux, but if you change any of the configuration settings, then have the admin web page reset the server, the system won’t come back up. This happens on both 64-bit and 32-bit Ubuntu 8.04.x in my experience.
Initially, I installed Adito on a 64-bit Ubuntu Server, but problems with the startup scripts and/or wrapper prevented it from working as far as I can tell. I never, ever got Adito to work after a reboot on 64-bit Linux. I ended up installing a 32-bit VM under ESXi 4 and loading Adito there. The comments below are for 32-bit Linux, since it was clear that the 64-bit startup scripts/wrapper were beyond my ability to solve.
Even on the 32-bit Adito install, the system has never come back up for me after changing settings that require a reboot. NEVER, without doing some extra work. After making any reboot-required changes, only the method outlined below has gotten a running Adito system again.
I've tried a few methods (UTF-8 settings, hacked startup files, hacked config files, swapped versions of libwrapper crap); only the following has actually worked. As root:
- cd /opt/adito (or whatever directory you used)
- Delete conf/webserver.properties
- ant install
- Login through the web interface at http://host:28080
- Step through the install process keeping all the defaults from before. Only the SSL certificate/key will need to be reinstalled.
- Reboot and all is well.
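The steps above can be sketched as a small recovery function; this is a sketch only, with /opt/adito and port 28080 being the defaults from this install, so adjust for yours:

```shell
#!/bin/sh
# recover_adito [ADITO_HOME] -- force Adito's install wizard to rerun
# after a failed restart. /opt/adito is the default used above.
recover_adito() {
    adito_home=${1:-/opt/adito}
    if [ ! -d "$adito_home" ]; then
        echo "no Adito install found at $adito_home"
        return 1
    fi
    # Deleting webserver.properties makes 'ant install' rebuild the
    # web server config and re-enable the install wizard on :28080.
    rm -f "$adito_home/conf/webserver.properties"
    ( cd "$adito_home" && ant install )
    # Then log in at http://host:28080, step through the wizard keeping
    # the previous defaults, re-install the SSL certificate/key, reboot.
}
```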
I’ve not lost userIDs or policies or tunnel settings doing this.
At some point, I’ll need to figure out how to backup just the settings and DB, without grabbing the entire server image.
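Until then, here is a naive guess at what that backup might look like. I'm assuming the settings live in conf/ and the embedded database in db/ under the install directory; both names are assumptions to verify, not documented fact:

```shell
#!/bin/sh
# backup_adito ADITO_HOME OUT -- archive just the config and DB dirs.
# The conf/ and db/ subdirectory names are my assumptions and should
# be checked against a real install before relying on this.
backup_adito() {
    adito_home=$1
    out=$2
    tar czf "$out" -C "$adito_home" conf db
}

# Example: backup_adito /opt/adito /srv/backup/adito-settings.tgz
```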
What’s the issue?
I can only guess, but I believe it is too much of a mix of Java and supporting Java tools. According to the blueprint going forward, they plan to swap ant for some other tool and remove libwrapper. Ant is used by nearly all Java development teams, so I doubt that is really the issue. libwrapper appears to be a lazy way to set configurations for Java web applications, and there seem to be incompatible versions shipped with Adito as well. The startup process is overly complex too: /etc/init.d/adito points to /opt/adito/install/platforms/linux/adito, which is itself overly complex and reads a config file, /opt/adito/conf/wrapper.conf, to figure out which specific version of wrapper to call in /opt/adito/install/platforms/linux/x86-64/wrapper with the appropriate options (more than 3 options for some reason).
/opt/adito/conf/wrapper.conf is rewritten with every run of ant, which means any time you modify it to make things work (UTF-8 character sets, for example), the next use of ant will overwrite those settings. CRAP. There are ways to reset everything with ant, or just reset parts. The problem is these resets aren't explicit in the build.xml file; they are implicit. More crap.
Anyway, the levels of indirection turn something that should be as simple as java /opt/adito/lib/adito.war into something really complex. Yes, I know it isn't that simple, but it should be. That's the point.
Update a few weeks later
So we've been using the VPN for a few weeks and everything is working well. Performance is good. There are a few things lacking that we'd really like to have, but they are beyond our expertise. Since I'm protecting internal web sites, we didn't want to leverage LDAP authentication; we really want RADIUS authentication with X.509 certificate exchange. That isn't a built-in option, so I've forced really, really long and complex passwords and don't allow users to change their VPN passwords. Their LDAP passwords are still known only to them; it is just the VPN access that I demand use 60+ characters. I deliver the password to each of them in a KeePass database, further insisting they follow good password management and protection methods.
Deployment to the entire company happened a few months ago. We’ve had no issues and have placed all our applications behind this VPN.
Help Make This Data Better
As I review this post, I can see where it could be confusing. It isn’t particularly well written. Please let me know which parts are too confusing to be useful. Oddly, this is one of the most popular posts on this site.
You can manage your google data here: https://www.google.com/dashboard/. This is good. I don't really use all the google apps, but seeing all the searches I've made over the last few years and the trend data was eye opening. I elected to wipe my data, then pause all future capture of that data.
What other data did the dashboard show? I have removed anything personally identifying below.
There was a list of 3rd party sites with access to this data too. I didn’t recall authorizing any of them. Data removed and future 3rd party access prevented.
Major kudos to google for allowing us to manage our data and privacy settings.
I did leave some of the private data out there for use. It isn't important to me. Your internet use may tell others things that are better not shared. Suppose you search on a medical term because a friend tells you a story about his mother. That search term is saved and tied to your account. What happens if 5 years later you end up being medically diagnosed with that illness? Your insurance company may start legal discovery efforts, or just pay google for the data. Now they refuse to cover your treatment since it was a pre-existing condition. Even if you don't care about this, you know someone who does. What if you search for foods that are bad for you or visit weight loss web sites for 5 years? Expect your insurance company and the government to have access to this data. If it is stored, it will get out.
It should be noted that if you aren't logged into your google account, the data captured doesn't appear to be correlated with your account. That doesn't mean it isn't captured by your IP address or a google cookie, stored, and correlated. Further, you can't manage that data with the dashboard. Google writes about this other data.
Today, google is a little less evil. Until they let me remove my data from other peoples’ accounts (contacts, phone calls, email addresses), I’ll still avoid using google with an expectation of privacy.
11/8: The Washington Post Security Fix guy has an article on this now too.
If you have broadband internet service in the USA, chances are you already pay for commercial antivirus programs. AT&T, Verizon and Comcast all provide commercial AV with their broadband services. Here are the links:
Microsoft also has entered this market with a very competitive free download: http://www.microsoft.com/Security_Essentials/. The reviews so far are good. I can't recommend it myself; it needs time to prove it works. However, I have switched to using it since it came out of Beta. I don't use Windows much, so my risk is very low.
In short, since you’re already paying for these commercial antivirus programs, why spend money on them again? Use the free versions that your ISP provides.
You all know I use Ubuntu heavily. I love the application management and update scheme used – APT.
A new version of Ubuntu, 9.10, is scheduled for release this week. This isn't an LTS (Long Term Support) release, so it isn't suitable for production servers, but it has enough new and useful things to be worth considering for migration.
I plan to migrate my main desktop to it, but I’ll really be waiting for xubuntu, the lighter weight GUI version.
As I considered the new features included and made significantly easier, I came across an article where someone compared the so-called new features in Windows 7 with other OSes.
Big new features in Ubuntu 9.10, from my point of view?
- EXT4 – this will be the default file system. Testing has shown it to be faster than most alternatives. I'll be staying with JFS on physical hardware and ext3 on virtual hardware; IMHO, more time is needed before I'll risk my data. I really want ZFS, but since its license isn't compatible with the GPL, my best hope is btrfs at some future date. Linus is running btrfs on his daily workstation, so that's a good sign.
- Cloud Computing infrastructure with Eucalyptus. Eucalyptus was introduced in an earlier release, but I didn't use it. Ok, so it isn't new, but it is new to me. This is both compute and storage infrastructure that is compatible with Amazon EC2 and S3. Ubuntu has decided to call this UEC going forward. Very nice.
- KVM – Ubuntu has decided to leave my beloved Xen for KVM. That means it will be easier for me to change from Xen to KVM for virtual servers with the next LTS server release. I’ll need a few months to get ready and test.
- Easy HOME directory encryption – Don't know that I'll use this, but I will encrypt a subdirectory. Yes, I know TrueCrypt has been available for a long time and is cross-platform. Ubuntu adds auto-mount/unmount as you log in and out of your account. The resulting encrypted directory can be moved, but only with the key provided at creation time.
- Bluetooth tethering with cell phones. This means you’ll be able to use a cell data connection from your laptop, fairly easily. Great if you travel much.
- New kernel – In every new kernel there are lots of new features that don't matter to most of us and a few that matter a bunch. My kernels are so old that there are many, many new-to-me features; some are security related, others performance related. KVM is built into the kernel now. That's better than getting Xen updates that don't get tested well enough and occasionally break.
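The home directory encryption mentioned above is eCryptfs-based; a cautious sketch for checking whether the tooling is present (ecryptfs-setup-private, from the ecryptfs-utils package, creates an encrypted ~/Private that auto-mounts at login, but the setup itself is interactive, so this only probes for it):

```shell
#!/bin/sh
# check_ecryptfs -- report whether the eCryptfs private-dir tool exists.
# ecryptfs-setup-private prompts for your login and mount passphrases,
# so we only check for its presence here rather than running it.
check_ecryptfs() {
    if command -v ecryptfs-setup-private >/dev/null 2>&1; then
        echo "ready: run ecryptfs-setup-private to create an encrypted ~/Private"
    else
        echo "missing: sudo apt-get install ecryptfs-utils"
    fi
}

check_ecryptfs
```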
Ubuntu is also excited about the free cloud storage they will offer. Of course, there's a storage cap beyond which you pay, but ease of use will make this useful to many people. Even if all you do is share your desktop settings or back them up to the cloud, you'll be better off.
Anyway – look for bittorrent downloads on 10/29 and get yours running. Please seed for the entire weekend after your download completes to help your fellow users out.
It is really simple to end all the personal data leaks that we read about all the time: make the penalty of a leak so high that no company would ever allow it to happen. Further, make the fine payable directly to the impacted persons, so it isn't a class action lawyer or some neutral party being paid.
A few years ago, my college leaked 20,000 transcripts on the internet. Mine was not among those leaked, but if each instance carried a $2,000 fine, I suspect my University would have been more careful. That fine would have cost them $40M. Yep, they wouldn't leak anything, that's for certain.
There are a number of systems out now that are known to leak private data; MySpace, Facebook, and PayPal are constantly found to be deficient in security practices. If there were a $2,000 fine for each failure, I bet they'd fix it or refuse all private data. Or they'd go out of business, which would give them an opportunity to come back with better security after bankruptcy. Further, venture capital would demand excellent security processes to prevent any private data breaches.
How is any of this bad? I suppose the companies (slime?) who make money offering bogus privacy insurance would be harmed; they would convert into audit companies or fold. I suspect lawsuits against Microsoft for common program breaches would increase, forcing them to create a secure OS if they want to retain customers. I can get behind that. The people and companies certifying that private data won't be leaked will be held accountable if their systems fail too.
Is financial data the only private data, or is anything not found in either the telephone book or government documents to be considered private? Is there an expectation of privacy for all other information that should be protected?