Why You WANT a Nokia N900

Posted by JohnP 10/01/2009 at 08:58

If you are a smart phone user AND a Linux nerd, you WANT a Nokia N900.
Here’s a very detailed review, perhaps too detailed.

The highlights are:

  • WCDMA (tri band) and GSM (quad band) cellular phone with 3G data speeds
  • WiFi supported
  • Linux – full multitasking; listen to music, surf the web, download files, and run 5 other apps at the same time, no need to close apps to do something else – take that Apple lovers
  • GPS and GeoCache-ready apps
  • QWERTY Keyboard take that Apple lovers
  • BlueTooth
  • SDHC expansion memory, easily swapped, 32GB internal plus external slot
  • 800×480 screen take that Apple lovers
  • 3D graphic acceleration
  • 5Mpix Camera with near HD-quality video
  • User swappable battery take that Apple lovers
  • Plays almost any video or audio media take that Apple lovers
  • 1,000s of free Linux apps – lots of software is an understatement; xterm, PDF, RDP, VNC, games, Office/Productivity, IM, RSS
  • Excellent VoIP and Skype support (Ovi, Google Talk, Jabber, and SIP) take that Apple lovers
  • TV-Out
  • Connects to your MS-Exchange server including Calendaring
  • Mozilla-based browser with Flash 9.4 support and multiple window support (# only limited by memory). The reviewer didn’t find any web pages that didn’t work regardless of javascript, flash, or AJAX.
  • Oh, and all the things you expect from a PDA – contacts, calendars, email, and so on

The review compared the keyboard to that of another Nokia phone, but I’d like a comparison with a Blackberry QWERTY keyboard, which I consider FANTASTIC for thumb typing. I’m curious about built-in security features too, though a lock code is standard.

The only downsides to this device are:

  • Data plan needed (monthly cost)
  • Unclear that any subsidy will be provided by any cellular provider.
  • Unlocked price – $584 on Amazon. Ouch.
  • Screen size reduced from 4.1" to 3.5" so it is about the size of an iPhone.
  • No voice dialing?
  • Java was not shipped with the device, but it is definitely available.

Overview of LinuxFest Atlanta 2009

Posted by JohnP 09/21/2009 at 12:58


I attended LinuxFest Atlanta 2009 with 700 like-minded people. Lots of good information for the price – basically free.


There were about 42 sessions (I didn’t count them) organized for all levels, from beginners – Fixing Audio in Ubuntu/Linux – to multiple kernel hacker sessions (Debugging the Kernel, 4 driver writing sessions, etc.). There were more sessions offered than I could hope to attend. Due to many late sign-ups (about 300 extra), many of the sessions were standing room only and overflowed into the hallway. I was able to get a seat by going directly from session to session quickly.


We need to thank IBM (http://www.ibm.com) for providing facilities for this conference. There wasn’t any IBM advertising that I saw. A BIG THANK YOU, IBM, from me. There were other supporters too with tables in the common areas. Linux Journal, SuSE, LinuxPro, and Canonical are a few from memory. Many companies hosted extremely informative sessions.


My session attendance:



  • What Community Has to Offer – OpenSuSE

  • Linux, Hadoop, and Amazon Web Services: Crunching the Big Data in the Cloud

  • Free Software Development with Clouds

  • Securing Your Network with Open Source Technologies

  • Running an Open Source Business

  • The Weather Ahead: Clouds


There were other sessions I would have liked to attend, but conflicts prevented it.



What Community Has to Offer – OpenSuSE

Presenter: Chuck Payne – http://opensuseterrorpup.blogspot.com/

Slides: http://www.magidesign.com/download/alf.odp

The presenter is an OpenSuSE evangelist and works as a sysadmin in IT at the Travel Channel in Atlanta. He provided a survey of the different tools and distributions that OpenSuSE provides.

OpenSuSE Studio:

Using the OpenSuSE Studio tool, you can build a specialized distribution for your team, clients, family, or school. A concrete example: you could build server and desktop distributions for students to perform homework, with identical software available to all from a Live CD boot.

See the slides (http://www.magidesign.com/download/alf.odp) for much more.

Linux, Hadoop, and Amazon Web Services: Crunching the Big Data in the Cloud

Presenter: John Willis – http://www.johnmwillis.com/

Slides: not available.

Basically, this talk was a list of companies, FOSS tools, and techniques for dealing with huge data sets in parallel on cloud infrastructure. It started with the NIST definition of Cloud Computing and ended with how to monitor and merge data from hundreds of individual systems for an overview. My notes are just a list of tools that I found interesting during the talk.

  • Libvirt, OpenNebula, OpenQRM, Cobbler

  • RightScale.com

  • Nanite, Capistrano, ControlTier

  • Eucalyptus, Enomaly, Nimbus

  • OpenVPN, CloudNet

  • Splunk

  • Chef from Opscode, Puppet, Cfengine

  • CollectD, jCollectD

  • Big Data Frameworks: Pig, Hive, Cascading

It’s 2 days later and I’ve checked out RightScale and collectD. We use SysUsage for monitoring our small group of systems. I must have missed the main points of this talk. Lots of data, but nothing that made me want to change jobs.
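
If collectD pans out, the basic pattern seems simple enough – every box pushes its stats over the network plugin to one central collector. A minimal sketch (the hostname and the plugin list are my own guesses, not from the talk):

    # /etc/collectd/collectd.conf on each monitored system (sketch)
    LoadPlugin cpu
    LoadPlugin memory
    LoadPlugin disk
    LoadPlugin network
    <Plugin network>
        Server "collector.example.com" "25826"
    </Plugin>

    # and on the central collector, listen and write RRD files (sketch)
    LoadPlugin network
    LoadPlugin rrdtool
    <Plugin network>
        Listen "0.0.0.0" "25826"
    </Plugin>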


Free Software Development with Clouds

Presenter: Deryck Hodge (Canonical) – http://www.devurandom.org/

Launchpad (https://launchpad.net/) is a Canonical-backed software collaboration website. The goal is to provide everything except compilers for software development projects. Here’s a bullet list:



  • Blueprints – architecture diagrams

  • Version Control via Bazaar with branching and merging (a quick bzr sketch is below)

  • Bug Tracking

  • Threaded discussions

  • Release Management

  • Collaborative Translations – language files

  • Karma system

  • Code Reviews can be mandatory – PQM-based

  • Open Source, but getting it running inside your company isn’t easy and they won’t help you. They said it would require 15+ servers. Get the source here: https://dev.launchpad.net/

While the website has things for project management, it is tailored to software development projects. A comment from the crowd was that tracking server deployments with it is very possible. Free accounts let anyone have access to view your project details. Paid versions provide project privacy, if you like.
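
For reference, the basic Bazaar round trip against Launchpad goes roughly like this (a rough sketch; the project and branch names are made up):

    # grab the trunk of a project hosted on Launchpad
    bzr branch lp:someproject
    cd someproject
    # hack, then commit locally
    bzr commit -m "fix the widget frobber"
    # push the branch up to Launchpad for review/merging
    bzr push lp:~yourname/someproject/widget-fix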

Securing Your Network with Open Source Technologies

Presenter: Nick Owen – http://www.wikidsystems.com/ (lots of how-to guides there)

Lots of detailed information, a little too fast for me, about securing your network, applications, and users. Here’s a link to the presentation. Basically, use RADIUS and 2-factor authentication. RADIUS is supported by every vendor, and the standards were created before anyone wanted a niche. RADIUS works with Apache, PAM, Microsoft, and many routers (a minimal PAM sketch is below).

Admins are happiest when there are
no users.

Tell all your passwords to go to
hell.

I need to check on:

  • RADIUS support in pound (a load balancer)

  • Remote Desktop support for RADIUS

  • Using RADIUS in OpenVPN

  • Apache front ends – don’t allow anyone to reach our Apache services until they authenticate to the network via RADIUS

  • One Time Passwords – WikID, Opie, FreeToken, OTP Auth

  • FreeRADIUS – IAS (Microsoft)
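
For my notes, wiring a service to RADIUS through PAM looks roughly like this (a sketch assuming the pam_radius module is installed; the server name and secret are placeholders):

    # /etc/pam_radius_auth.conf – RADIUS server, shared secret, timeout (seconds)
    radius.example.com    sharedsecret    3

    # then add this to the service's PAM config (e.g. /etc/pam.d/sshd)
    auth    sufficient    pam_radius_auth.so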

This session provided the greatest value for me.

Running an Open Source Business

Presenter: Tarus Balog – http://www.opennms.com/

Basically, this was a talk on how to start a business with a slant on FOSS. Get a lawyer, CPA, insurance, and all the other things you need for a business. Give the software away and encourage a community to form that provides patches and modules back to you. He only knows how to make money selling services for tools, not applications. How much are you willing to pay for OpenOffice support and installation? $0. OTOH, how much are you willing to pay to monitor your servers with a great tool that is complex to install, but easy to run? $10,000/yr?

Main tips:

  • Don’t quit your day job

  • GET A TRADEMARK and copyright everything – $300 and a year of your life

  • Build an awesome app or tool

  • Start a foundation and get a company to fund it. IBM funds lots of foundations that Microsoft hates.

  • If you use GPL for your license, anyone that steals your code must release their code too. If you use BSD or Apache or other do-what-you-like licenses, they can be secret.

  • Copyrights

  • Owner can change the license at any time

  • Defend the code from license abuse

  • Sun started the Dual Copyright

  • Have a Contributions Agreement that gives you and the contributor both copyright ownership. This lets you change the license in the future without asking permission from everyone that contributed 15 years ago. Clone the Sun agreement.

  • Get ramen profitable – earn enough money to live.

  • Spend less than you earn

  • There’s a diagram in the book Crossing the Chasm – http://en.wikipedia.org/wiki/File:DiffusionOfInnovation.png – the difficulty is in getting enough customers to be #1 or #2 in your market and becoming an Early Majority solution.

  • Release code early and often – The Practice Effect

  • Create products that are easy to buy – not things that are easy to sell

  • Create a website

  • Separate work from life.

  • Create a blog – http://www.adventuresinoss.com/

  • Be results driven, not effort driven – my addition

  • Build CRM, Trouble Ticketing, and bug tracking BEFORE you need them

  • Create a mailing list and/or forums to let your community chat

  • Participate in the community – go to conferences and give talks

  • Twitter, Facebook, whatever for marketing

  • Get Paid:

  • Easy pricing – “bundle of knowledge consulting”

  • Get customers – don’t do free stuff

  • Net-30 – offer a discount, 2%, for paying early

  • Statements of Work – SoW – or do time and materials, T&M

  • Annual Renewals include consultations, upgrades, etc. If you charge $15k/year for support and have 100 customers, you have a business.

  • Value your employees – 401(k), Health Insurance, Payroll Service; people are your company

  • Use the Bowling Pin model; after you sell 1 pin, discover 9 other things each customer needs and offer them.

  • Grow or die

  • Fire a bad customer – life is too short for work you really hate to do.

  • How to get out?

  • IPO

  • Make a great lifestyle company

  • Sell to a big company – if someone offers $30M, do you take it?

Obviously from my notes, I liked this talk.

The Weather Ahead: Clouds

Presenter: John



Ubuntu Jaunty includes a cloud API identical to Amazon S3 and EC2 services. This means you can build and test internally, then deploy with binary compatibility to Amazon or other compatible cloud providers.
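
In practice, the same EC2-style commands should drive either the internal Ubuntu/Eucalyptus cloud or Amazon itself. A sketch with the euca2ools (the endpoint, image ID, and key name below are placeholders, not real values):

    # point the tools at the internal cloud (or at Amazon's endpoint)
    export EC2_URL=http://cloud.internal.example.com:8773/services/Eucalyptus
    # start an instance from a registered machine image
    euca-run-instances emi-12345678 -k mykey -t m1.small
    # list what is running
    euca-describe-instances
    # tear it down when done
    euca-terminate-instances i-12345678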

Today, cloud computing is like electricity; turn it on when you need it. Turn it off when you are done. No capital costs.

  • Ubuntu One – storage

  • Landscape – SaaS – stats, hw, sw, trending, patches

  • AMI – Amazon Machine Image




I need to research switching from Xen to KVM for our internal VM systems. Managing a cloud is not the same as managing a group of VMs.
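
As a starting point for that research, creating a KVM guest with libvirt’s virt-install looks roughly like this (a sketch; the name, sizes, and ISO path are just examples I made up):

    # create a new KVM guest managed by libvirt (run as root)
    virt-install --connect qemu:///system \
        --name testvm --ram 1024 --vcpus 1 \
        --disk path=/var/lib/libvirt/images/testvm.img,size=8 \
        --cdrom /isos/ubuntu-9.04-server-amd64.iso \
        --os-type linux --vnc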




Always migrate forward, never go back. If you have an issue, grab the next machine, migrate, and get it working. Later, you can go back to the non-working version and figure out what happened or destroy the VM.




GPS Data and Hiking

Posted by JD 09/20/2009 at 14:30

How to GPS Tag photos with your Nokia N800 and GPSbabel … The instructions here are not really specific to a Nokia N800, so other GPS units should use very similar steps. Only the GPSBabel part will probably change options based on your GPS device.

I’ve been taking my N800 and bluetooth GPS receiver on my hikes. Really just as a way to track approximate mileage. After doing that a few months, it seemed there had to be a way to put the GPS lat/lon into my photos. There is. A few other uses for GPS data, beyond the obvious:

  1. Retain your track data
  2. Estimate distance covered
  3. GPS tag your photos
  4. Share your track as a route for other hikers
  5. Post a track on Google Maps for others – nice visualization with all the zoom and pan that you expect from google.
  6. Mark the actual location of a landmark – waterfall, lookout point, or geocache

So far I’ve retained many of my tracks, but not been able to view them except on the N800. That’s useful, to a point. I’d really like to record them and create a database of visual tracks that is viewable on google maps for my friends to view. The real idea is to create a database of local hikes with trailheads, distances and difficulty ratings to help select future hikes.

Enter gpsbabel

Gpsbabel is a tool that converts GPS data between many, many different devices and formats, and it runs on any platform – win32, unix, linux, N800. It supports conversion between … I guess about 50 different formats. My need is to convert N800/Maemo-Mapper GPX data into something GoogleMaps can use, KML. Originally, I thought gmaps supported GPX too, but that never worked well enough and had limited waypoint support. Yes, KML is the best answer for this.

Conversion steps for turning maemo-mapper GPX files into KML files that Google Maps can display:

  1. Get the GPX file off your N800 … somehow (scp, ftp, pull the memory card and copy the data, whatever)
  2. Use gpsbabel to convert the file to KML.
    gpsbabel -t -i gpx -f "$1" -o kml,points=0 -F "$1.kml"
    The points=0 option drops some data, so the resulting track isn’t exact.
  3. Move the .KML file to a web server that googlemaps can access – anywhere really; your desktop probably isn’t gonna work.
  4. Have google maps display the data by fashioning a URL – a sample is my Laughing Falls, NC track. Basically, you use http://maps.google.com/maps?q={full-URL-to-file.kml} – the file can be waypoints, traces, or routes as far as I can tell.

The result isn’t a nice track until you uncheck the Points on the resulting page. Also, I’ve tried to get gpsbabel to reduce the track to a radius around the important locations, but that isn’t working. Loading gpsbabel was trivial on my Ubuntu laptop and desktop:

    sudo apt-get install gpsbabel

if memory serves.

No Google API key needed for this method either, which is nice.
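
Since the tracks pile up, a tiny wrapper to batch-convert every GPX file in a directory could look like this (a sketch; adjust paths to taste):

    #!/bin/sh
    # convert every maemo-mapper GPX track in the current directory to KML
    for f in *.gpx; do
        gpsbabel -t -i gpx -f "$f" -o kml,points=0 -F "${f%.gpx}.kml"
    done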

Another helpful tool for geocaching and the N800 is gpsview. It connects to the GPS receiver and performs bearing math for you. It also helps calm the GPS data and average it out so you know where you are with a higher degree of accuracy after a few minutes – GPS data floats about 50 feet, IME. This tool is very helpful with some geocache hints. So, you have a location and need a bearing for the next cache location, or you have a bearing and need a new lat/lon; gpsview does those calculations. I’d post a link, but I can’t find it now. Perhaps it was in the OS2008 depot and just loaded when I selected it.

Get out there and find some fun caches or just hike and know how close you are to roads and streams and where you’ve already been. There’s something fun about searching for a hidden location/waterfall, finding it, then taking an almost direct path back to your car.

Enter gpsPhoto.pl to tag your photos with GPS data

Tagging your photos with GPS coordinates:

    # camera & GPS clocks match, so no time offset
    # use the closest GPS point within 180 seconds
    gpsPhoto.pl --gpsfile HT-File.gpx \
        --timeoffset 0 \
        --maxtimediff 180 --dir ./

I came across a CSV list of waterfalls, converted it into KML, and here’s the resulting googlemaps link. I know it is missing many waterfalls. I’ve been to some that are fairly large and they aren’t in the list. I have no idea how accurate any of these GPS points are either. YMMV.

Now that we have placed our GPS data into the photos, many of the photo hosting sites will display that either on a map or as part of the extra data. I’ve hacked together some GPS code for MyPhotoGallery that will link to google map locations for any photos that contain GPS data. Here’s an example of the EXIF data and Google Maps link that is added to every image displayed in the gallery.

Embedded EXIF data
Camera: SONY DSC-W55
Exposure: 1/160 sec.
Aperture: f/7.1
Focal length: 6.3 mm
ISO: 100
Flash: No
Date taken: Feb 21, 2009 at 3:17:21 PM
GPS: 34.135167,-84.704180
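
The idea behind the hack boils down to something like this (a sketch using exiftool rather than my perl changes; the photo name is a placeholder): pull the decimal coordinates out of the EXIF data and drop them into a Google Maps URL.

    # print decimal GPS coordinates from a photo's EXIF data
    exiftool -n -p '$GPSLatitude,$GPSLongitude' photo.jpg
    # e.g. 34.135167,-84.704180  ->  http://maps.google.com/maps?q=34.135167,-84.704180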

I’ve also hacked search into the perl and provided the search updates back to the original developer. He elected to remove search from his code many years ago. If you are interested in my photo gallery changes – they are hacks – let me know. If there is enough interest, I’ll post them for all.

ClearQAM Hauppauge 950Q Recording 2

Posted by JD 09/19/2009 at 20:09

A few months ago, I purchased a Hauppauge 950Q ATSC/ClearQAM USB HiDef recording dohicky. The plan was to create a TiVo replacement.

Plans don’t always work out.

Hardware

The Hauppauge 950Q is a TV tuner for over the air HD broadcast TV and a ClearQAM digital cable recorder. It is like a big USB Flash drive in size. So far, I’ve only gotten it to work with either OTA or ClearQAM settings. Switching between these modes appears to require a complete re-configuration of the driver software. It must be connected to a computer to work.

For some unknown reason, the device doesn’t always work. It could be related to my VirtualBox USB settings. I think the trick is to start the Hauppauge software before any VirtualBox VM grabs the USB device. I dunno.

Software

Hauppauge includes Windows software. The provided software isn’t very good, but it does work. Scheduling future recordings is like using an old VCR interface, except it feels like there’s a 90% chance it won’t work. You need to set the start time a minute before a program or you will miss the first minute due to software startup time.

Windows Media Center (Vista-64 version) doesn’t work with ClearQAM devices. I’ve heard there is a fix for $200 from MS.

GB-PVR, another full PVR solution, doesn’t seem to be able to change channels or otherwise control the tuner. Perhaps someone else will solve the issue. Generally, I like this media player and just wish it worked as a media recorder for my device too.

Schedule Data

This is the main issue. Once you have a TiVo you are addicted to scheduling that just follows the TV show to whatever time and channel, and records it. Set it once and it just works. Last time I checked, there was no viable channel lineup manager for ClearQAM. BeyondTV and TitanTV may provide similar capabilities for PC-based PVRs. I dunno.

Recorded File Sizes

HiDef content takes a lot of storage in the recorded format. The 950Q records into MPEG2 format, so every hour is about 8GB of storage. 80GB is only 10 hours. Ouch.

Transcoding

I’ve standardized on xvid mpeg4 avi containers for my video collection. The main reasons for this are that xvid:

  1. is open source
  2. avi is a container that every playback device I own supports (N800, MediaGate, Linux and Windows PCs)
  3. I know it well, since I’ve been using it for the last 8+ years
  4. Supports HiDef content
  5. Can easily be transcoded to alternate video resolutions.

Transcoding hidef content into mpeg4/xvid should reduce the file from 8GB/hr to about 2-3GB/hr with little impact on quality. At DVD resolutions, 700MB/hr is common. This is good and will play back on a non-hidef player, like a MediaGate MG35.
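
For the curious, the transcode itself can be a one-liner with mencoder (a sketch; the bitrates are rough guesses for hidef content, tune to taste):

    # MPEG2 capture from the 950Q -> xvid video + mp3 audio in an avi container
    mencoder recording.mpg -o recording.avi \
        -ovc xvid -xvidencopts bitrate=5000 \
        -oac mp3lame -lameopts abr:br=128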

Next, what is the final solution for good, simple PVR and the 950Q? Watch here.

kmttg TiVo-to-Go Issues

Posted by JD 08/30/2009 at 08:29

KMTTG is a GUI that brings Tivo-2-Go, TTG, downloading to any platform. It simplifies downloading, decryption, commercial skipping and cutting and transcoding via mplayer into a format that is useful for you (PSP, Zune, iPod, iPhone, N800, and numerous other formats like xvid, mp4, wmv, whatever you like). But there’s a problem.

Please help with a solution!

Google Voice Transcripts 1

Posted by JD 08/25/2009 at 12:39

I’ve been using Google Voice, GV, and the prior GrandCentral for almost 3 years. The main thing that GV added was free transcriptions for calls and voice mail. This is great, when it works well, but not so great when the transcription is, shall we say, inaccurate.

Comcast Gone Digital Overnight 2

Posted by JD 08/11/2009 at 07:06

August 10, 2009

So this morning, I flipped on the TV to catch CNN and it was gone. In its place was a nice note from Comcast saying I needed to get a digital device from them. This was announced multiple times and multiple ways, so I believe that I’m prepared. The only problem is that Comcast isn’t publishing a ClearQAM channel list (yet?). And all the channels have been relocated and are no longer in order. Anyway, as of yesterday, below are the channels for the North Cobb headend with both analog and ClearQAM. Only the analog channels below 23 still work.
The Channel Lineup:

Technical Architect Design Goals

Posted by JD 07/08/2009 at 13:40

As a technical architect, I’m pulled in many different directions when working on solutions design. Some include:

  1. Solution meets the majority of must have requirements
  2. Solution provides exceptional value to the customer
  3. Solution needs to meet security requirements
  4. Solution advances long-term technical needs
  5. Solution can be leveraged for future, unknown, needs
  6. Solution is open whenever possible to avoid single-vendor lock-in
  7. Leverage existing available infrastructure, unless it has been proven a poor choice
  8. Costs – deployment and annual maintenance must be considered
  9. Solution must include the smallest number of components; the fewest moving parts. KISS, Keep It Simple Stupid, methodology.
  10. Minimize support complexity. Complex solutions that can’t be easily supported will eventually fail. They cause too much trouble to use.
  11. Ensure customer satisfaction by clear, concise communication with the customer and team on the expected outcomes, issues, and schedule. Provide public and private points for feedback from anyone on the team to say good and bad things. Sometimes failures can be avoided in this way.
  12. Solution must support future migration to a different, competing vendor. Be certain that customer data isn’t locked up inside a complex system that can never be moved.

So, when you begin working on a project that includes updates to the technical infrastructure, consider that your solutions designer will be trying to merge all these interactions into the best possible solution for your needs. Sometimes there is no clear best solution and a non-worst solution must be selected.

As a customer, I often find that our needs are ahead of the solution availability curve. No solution currently exists so we need to revisit the solution space in a year or two.

Wind Power for Planetary Exploration

Posted by JD 06/29/2009 at 15:27

As we venture off Earth, power generation becomes more and more important. We hear about solar and nuclear power generation in spacecraft.

The solar panels on some Mars experiments were partially covered by dust which prevented the batteries from charging. Dead batteries means no data transmitted, no roving, no science from the experimental package.

Adding a small wind generator to many rover power systems or as a charging station on a planet known for wind would be a good idea. The further away a planet is from the Sun, the more the planet tends towards strong winds. Neptune has the strongest winds of any planet in the solar system, measured above 1,200 mph and predicted above 2,000 mph.

Anyway, I haven’t seen much related to wind power for electrical generation mentioned in any Mars colonization plans. Hmm. Perhaps that would be a solution for power generation on Mars? Of course, the numbers would need to be carefully determined, since a wind generator can be really dangerous if it isn’t built to handle high winds and sand. Also, don’t forget that the Mars atmosphere is only about 1% the density of Earth’s, but the average wind speed over a 3.5 yr period was about 11mph as measured by Viking.

Where I live, the average wind speed is 3 mph and wind power generation isn’t considered cost effective.

The Earth is a Death Trap

Posted by JD 06/26/2009 at 09:16

This planet, the Earth, is a death trap for all life on it. At some point in the future, everything on this planet will be killed off. That is a fact, not some possible future vision, but FACT.

An asteroid hit is the least of our problems. Don’t get me wrong, we need to watch for them and have a plan of action to shift an inbound rock enough that it doesn’t hit us. We’ll need a backup plan should the first shift effort not work well enough. We also need to search for asteroids in the hard-to-find regions of our sky to prevent another 20-day-notice asteroid event like last year’s. That amount of warning isn’t enough time to do anything but a Hail Mary attempt.
You Are Here.

We have to get off this rock if we, as a species, want to survive. The further away from here, the better. Sadly, many of the things that will kill the Earth will also kill Mars and most of the solar system.

There is already a star pointed at us that will send high energy gamma rays AND will destroy all life here when it goes supernova. It is a matter of time and will probably happen before the Sun becomes a red giant and boils away all water on Earth, before expanding beyond Earth’s current orbit.

We need to take the first steps to get off this rock and find alternative travel methods beyond normal propulsion (throwing stuff out the back to move forward) to get to other star systems. There is no viable method of propulsion to get us (or anything else) to another star system currently. Ion, solar wind, etc are pure fantasy and CANNOT GET ANYTHING TO ANOTHER STAR SYSTEM in 1,000 years.

Steps to Get to the Stars

  1. Look for suitable extra-Solar planets to colonize with water, a strong enough magnetic field and appropriate temperatures. Telescopes.
  2. Research theoretical propulsion methods that don’t involve mass thrown out the back of the rocket. We can’t physically carry enough mass to another star. Solar wind is too weak for interstellar travel. Only travel faster than 0.5C or instantaneous travel is useful here. Generational ships traveling for 200+ year trips can’t carry enough mass to throw out the back.
  3. Perform colonization efforts inside our solar system as local laboratories to learn how to live off the land everywhere we go. Expect 75% death rates.
  4. Perform basic renewable farming research in completely closed environments until 50+ years of complete, perfect success. Determine the most efficient amount of space and stacked farming for long interstellar trips.
  5. Perform artificial-gravity trials at 1G that are sustainable for 50+ years. Humans cannot survive long-term in lower-G environments without exposure to 1G for hours every day to maintain bodily functions and prevent HUGE bone and muscle loss. The best answer is for most of the ship to have 0.8-1.2G to support normal life, plants, and animals, and for long-term storage to be placed in the lower-G internal areas of the ship. Gravity created by centripetal force seems like the only real answer here.
  6. Perform heavy research on low-G conception, birth, and growth into adulthood for as many species as possible. I suspect bad things will happen to most newborns created in this way.
  7. Test more efficient methods to get mass into orbit – probably aircraft-based launch systems, not rockets. Ground cannons and earth based energy pushing devices are also interesting. Space elevators are extremely dangerous. What happens when a 200 mile long 3 foot thick cable falls back to earth? It will be bad – earthquake or tsunami wave = BAD.
  8. Perform heavy research on protective living materials against solar radiation, in space, on planets and moons. Planets must have radiation protection similar to our magnetosphere unless we want to live underground forever. Learn to remotely locate planets with this trait. To learn more, search on exoplanet magnetic fields

We need to get off this rock. It will take generations to accomplish. Every long journey begins with the first step, followed by another and another.