raspberry pi

I finally found time this evening to pick up a few items and bring one of my two Raspberry Pi Model B boards to life. My wife and I went out for a simple supper. While we were out I stopped by a local Best Buy and picked up an inexpensive all-plastic 22″ Samsung 1920 x 1080 LCD display and an Apple USB keyboard. The Samsung comes with a convenient HDMI port, which the Raspberry Pi plugs into with the right cable. The Apple keyboard was surprisingly inexpensive as well, the cheapest on Best Buy’s shelves, yet it’s built around a nice machined aluminum frame.

I’d already set up the boot device, a SanDisk 8GB SDHC card, with last year’s Raspbian release. When I booted into the graphical desktop I was pleasantly surprised that the Logitech M305 wireless mouse worked with the system. Power was supplied by my Galaxy S4 charger, and the whole time I was running the system the charger never got warm. I tried to bring up wireless networking with a Cisco Valet USB WiFi dongle that’s normally plugged into the back of my Wii, but that didn’t work. I need to dig around and see if I can find an inexpensive way to add WiFi to the Pi.
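When I do dig into the WiFi problem, the first thing I’ll check is whether the dongle even shows up on the USB bus. Here’s a rough sketch of that check in Python, assuming lsusb (from the usbutils package that ships with Raspbian) is on the path; the “Cisco”, “Linksys”, and “Valet” match strings are my guesses at how the adapter identifies itself, so they may need adjusting for the actual hardware.

```python
# Rough sketch: see whether a USB WiFi dongle is enumerating on the USB bus.
# Assumes lsusb (usbutils) is installed, as it is on stock Raspbian.
import subprocess

def find_usb_device(keywords):
    """Return the lsusb lines that mention any of the given keywords."""
    output = subprocess.check_output(["lsusb"]).decode("utf-8", "replace")
    return [line for line in output.splitlines()
            if any(k.lower() in line.lower() for k in keywords)]

matches = find_usb_device(["Cisco", "Linksys", "Valet", "Wireless"])
if matches:
    print("Dongle is enumerating on the USB bus:")
    for line in matches:
        print("  " + line)
    print("If WiFi still fails, the missing piece is likely a driver or firmware package.")
else:
    print("No match -- the dongle may not be enumerating at all (power? cable?).")
```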

The graphical desktop for Raspbian is Xfce, a simple environment that has been a refuge for many fleeing Gnome 3. Xfce proved to be absolutely no problem; if you’re used to Gnome 2 in any incarnation, then Xfce is dead simple to operate. Besides, give me a shell to work in and I don’t really care what the visual desktop looks like. A quick check of resources shows it has GCC 4.7.2, Python 2.7, and a late version of Ruby, although I couldn’t start irb because not all of the Ruby gems were installed.
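For anyone curious, the “quick check of resources” amounted to nothing more than asking each tool for its version. A minimal sketch of that check, runnable on the Pi’s stock Python 2.7 and assuming the usual Raspbian command names:

```python
# Minimal sketch: ask each toolchain for its version string.
import subprocess

for cmd in (["gcc", "--version"], ["python", "--version"],
            ["ruby", "--version"], ["irb", "--version"]):
    try:
        # Older Pythons print their version to stderr, so fold stderr into stdout.
        out = subprocess.check_output(cmd, stderr=subprocess.STDOUT)
        print("%-6s %s" % (cmd[0], out.decode("utf-8", "replace").splitlines()[0]))
    except (OSError, subprocess.CalledProcessError):
        print("%-6s not found or failed to run" % cmd[0])
```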

Working with Raspbian on the Raspberry Pi demands patience. It’s slow. Even the Linux VMs on my Windows 8.1 machine are faster and smoother than the Pi. And yet one must always remember that the Raspberry Pi is, after all, a $35 bare-bones computer running a mid-to-low-end ARM processor (an ARM1176JZF-S at 700MHz) with 512MB of RAM. Although, come to think of it, my 32-bit AMD PC from 2003 was only a little bit faster and ran Windows Millennium Edition (ME) on 512MB of DRAM. I suspect that if I were to install the current release of Raspbian it might run a bit smoother than it does right now. Still, it’s a bit remarkable to see it running at all. I purchased this as an embedded system, not as some ground-pounding workstation. For what I have in mind the board will be running in character mode, avoiding the precious memory a graphical desktop consumes, and I think it’ll be just fine.
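When I do switch the board to character mode, the way I’ll judge the savings is by reading /proc/meminfo before and after. A quick sketch of that check, using only the standard Linux field names (the kernel on this older Raspbian release predates the MemAvailable field, so free-plus-reclaimable is estimated by hand):

```python
# Quick sketch: report total and roughly-free RAM from /proc/meminfo.
def meminfo_kb():
    """Parse /proc/meminfo into a dict of {field: kilobytes}."""
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key.strip()] = int(value.split()[0])  # values are reported in kB
    return info

mem = meminfo_kb()
total = mem["MemTotal"]
# Free plus reclaimable buffers/caches -- a rough stand-in for "available".
available = mem["MemFree"] + mem.get("Buffers", 0) + mem.get("Cached", 0)
print("Total RAM:    %6d kB" % total)
print("Roughly free: %6d kB (%.0f%%)" % (available, 100.0 * available / total))
```

Run it once from the graphical desktop and once from a console-only boot and the difference is the cost of the desktop.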

the arcane part of the science lab

I’m not a professional photographer. I am, at best, a sporadic enthusiast, with periods where my cameras lie fallow, as it were. I have many other interests, so many that a few of them see even less activity than my photography. One of them is the practical application of the ubiquitous computers that have washed over us like a cybernetic tsunami.

I purchased a pair of Raspberry Pi Model B’s around the time I was laid off in 2013. With all that was happening in my life at the time, I left them sitting in the box they arrived in while I was scrambling to re-establish myself at another job. I kept the box out in plain view so that my little Raspberries wouldn’t wind up hidden away and eventually forgotten.

I ordered the Raspberries with the express purpose of using them for tasks far different from what was originally envisioned: specifically, intelligent machines, or more romantically, robotics. I’ve been a robot hacker since the mid-1980s, when I used Intel 8051 single-chip microcontrollers with steppers, power FETs, and extremely simple photo sensors (diodes and transistors) to build rolling machines that could follow tracks and avoid obstacles. I had great plans to build up a small robotics business here in Orlando, but life and family got in the way, and in an odd turn of events I satisfied part of that itch when I worked for Birket Engineering as a sub to Universal Studios. When I left Birket I kept going further and further into more esoteric projects, such as Time Warner’s Full Service Network (FSN), the Theater Telemedicine Prototype Project, and finally modeling and simulation for training purposes using HLA federations. All along the way I kept an eye on robotics and embedded computing, waiting for that ‘tipping point’ moment when hardware, software, and serendipity would all combine.

No single moment really occurred, but a whole series of them did. That, combined with my never quite having enough time, stretched out into years until I finally hit 60 this past December. I’m no fool. Only by the wildest stroke of luck would I ever be able to contribute anything of significance to robotics, and it would be a wilder stroke still if I could make a decent living at it. Robotics, from hobby to full industrial, has exploded across the technological landscape. And it isn’t as if I tried to keep up across the years. Back when Radio Shack tried to sell VEX Robotics kits, they decided they couldn’t sell enough of them and so put all their remaining VEX stock on sale. I bought enough VEX gear for four complete bots and even picked up extra equipment for tank treads and special sensors that didn’t come with the starter kits. All this was back in 2007, while I was still working for what used to be SPARTA in Orlando. It eventually all went into special containers, with a promise to myself to find out how to externally communicate with, and ultimately control, the VEX controller. That turned out to be a fruitless task, as VEX Robotics had (and still has, from what I can tell) locked their controller down pretty tight.

It isn’t as if the Raspberries are the first embedded ARM-based boards I’ve ever bought. Before the Raspberries I purchased a complete Gumstix kit (that was going to be the brains riding on the VEX machines). I got the Gumstix up and running Linux, using my bigger (openSUSE-powered) Linux notebook as the console, and even upgraded the kernel on the Gumstix. I had built embedded Linux before; one of my tasks at SPARTA was to take small mini-ITX x86-based computers and turn them into very specific appliances hosting Linux. That’s when I really dug down into building a complete embedded Linux distribution (nothing graphical) from the kernel on up. Writing drivers, integrating hardware, building a WiFi network to mesh them all together: it was an enjoyable if intense part of my life. Unfortunately I couldn’t go much farther than getting the mini-ITX systems working as a reliable distributed platform. It had solid potential; it just needed some extra spark to give it a purpose and push it over the top. That spark never arrived. At the end of 2007 I put it away on a storage shelf back at the SPARTA office and left for yet another esoteric project at yet another company.

So here I am, with dirt-cheap computing hardware, some new wild ideas, and a lot of dusty notebooks full of old ideas. My problem is that I’m not into wearable tech like Google Glass and Samsung’s Gear. If I had my way I’d be building adaptive mesh sensor networks roaming through space or sitting on the surface of the moon, gathering data and shipping it back to Earth. It’s my very firm belief that we put too many resources into too few missions (Curiosity, a technological tour de force to be sure, cost $1 billion for the rover alone, compared to about a tenth of that for either Spirit (MER-A) or Opportunity (MER-B), which came before it). What I have in mind is a variation on micro-sats, an established technology that has gone as far as sending what are essentially re-purposed Android smartphones into space. I have a different vision for pervasive computing; mine looks outward, to tools that amplify our ability to learn more about the universe, rather than drive us inward to more trivial and irrelevant pursuits, such as how much of a Glasshole we can become to one another.

We shall see. Perhaps if I make more of this public it will become part of the overall motivation I need to finally get off my ass before I become so old I can’t do squat.

Photography Note

The photo of one of my two Raspberries was taken with the Samsung Galaxy S4 smartphone camera and post-processed with VSCO Cam on the phone. Lighting was off to the left from a simple Fotodiox LED panel. The backdrop was my Samsung Series 7 Chronos notebook. This is the last time I’ll deliberately choose to use this smartphone camera. I’m going back to one of my many µFourThirds cameras with an Eye-Fi card and the Android app on the smartphone if I need that kind of immediacy. I’ll only use the Galaxy S4 as a last resort.