Xbox 360 Controller With Win8

So something I noticed after upgrading to Windows 8 was that my Xbox 360 controller for the PC no longer worked. The driver wasn’t recognized. I looked at this post on support.xbox.com and thought I’d just need to re-install the software.

No dice. It would get partway through the progress bar and then error out. (I wish I’d screenshotted the error to include here, but I was a bit busy to think of it at the time.) This confused me, as I saw other folks on the internet doing the same thing and having it work, like in this post.

Something that I didn’t find mentioned anywhere… this doesn’t seem to work under Windows 8, but it does work after you upgrade to Windows 8.1. A nice little “gotcha”, so I figured I’d post here and hopefully help someone else when they go Google searching.

Git with an alternate SSH key…

So I use BitBucket.org both for my day job, and also for managing my private Git repos. (Since BB is free for personal private repo use, whereas GitHub charges for that…)

However, when I go to push to BitBucket for my personal use, I need to make sure that my SSH keys for work aren’t loaded. This has resulted in me doing things like “ssh-add -D” to wipe out all the keys in my ssh agent, then manually loading my personal key for git use. Then when I start work again, I have to reload my other keys. Rather annoying.
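
In concrete terms, that dance looked something like this (the key file names here are just placeholders for whatever yours are called):

ssh-add -D                       # wipe every key out of the agent
ssh-add ~/.ssh/id_rsa_personal   # load just the personal key for the push
# ... push to BitBucket ...
ssh-add ~/.ssh/id_rsa_work       # then reload the work key afterwards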

I came across a solution here: git admin: An alias for running git commands as a privileged SSH identity

However, it didn’t work for me. It took a bit to figure out why, but it came full circle back to the use of ssh-agent: even though I was properly specifying my SSH identity file, the keys from my ssh-agent were being seen first. All I had to do was disable the use of ssh-agent inside the ssh-as.sh script, like so:

#!/bin/bash
set -e
set -u

unset SSH_AUTH_SOCK
ssh -i $SSH_KEYFILE [email protected]
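
(In case it helps: the linked post hooks the wrapper into git via an alias, but the short version is that it gets used as GIT_SSH. The paths and key name below are made up, and this assumes the wrapper passes its arguments through to ssh.)

SSH_KEYFILE=~/.ssh/id_rsa_personal GIT_SSH=~/bin/ssh-as.sh git push origin master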

That did the trick for me. Hope that helps someone else out there as well!

Running a system RabbitMQ on a server with Chef Server

One of the things that I’ve been working on getting set up here at home is Logstash to analyze all the various log files from the home network and the servers that I admin. As far as I can tell, that seems to be the Open Source equivalent of Splunk (which is a great tool, but expensive, and the free version is missing some features that I’d be interested in).

However, I recently migrated my systems from Opscode’s Hosted Chef to the Open Source Chef server running on the box that I had been setting logstash up on. Logstash uses RabbitMQ for messaging, as does Chef. I thought that things would be relatively easy to get working together, but I don’t know much about RabbitMQ. While the ultimate solution was actually trivial, I didn’t find it easy to figure out what to do.

I tried many things that ultimately didn’t pan out. I’m still not entirely sure why, since RabbitMQ is a bit of a black box to me. I tried various combinations of the following settings in /etc/chef-server/chef-server.rb, trying to configure the two systems differently enough so that they wouldn’t conflict with each other:

rabbitmq['consumer_id'] = 'curry'            # Chef default: hotsauce
rabbitmq['nodename'] = '[email protected]'         # Chef default: rabbit@localhost
rabbitmq['node_ip_address'] = '192.168.0.4'  # Chef default: 127.0.0.1
rabbitmq['node_port'] = 5673                 # RabbitMQ default: 5672
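
(For anyone following along at home: changes to /etc/chef-server/chef-server.rb don’t take effect until you re-run the reconfigure step.)

chef-server-ctl reconfigure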

None of those did the trick. Even after applying all those settings, I still got this back when I did rabbitmqctl status:

root@rain:/var/log/rabbitmq# rabbitmqctl status
Status of node rabbit@rain ...
Error: unable to connect to node rabbit@rain: nodedown

DIAGNOSTICS
===========

nodes in question: [rabbit@rain]

hosts, their running nodes and ports:
- rain: [{bookshelf,56170},
         {rabbit,47084},
         {erchef,53705},
         {rabbitmqctl13530,57207}]

current node details:
- node name: rabbitmqctl13530@rain
- home dir: /var/lib/rabbitmq
- cookie hash: vkNSjkgIyXIKaNLguSEV7A==

root@rain:/var/log/rabbitmq#

Eventually I came across the solution here: http://www.rabbitmq.com/man/rabbitmq-env.conf.5.man.html. All I had to do was create /etc/rabbitmq/rabbitmq-env.conf and add the following lines:

# I am a complete /etc/rabbitmq/rabbitmq-env.conf file.
# Comment lines start with a hash character.
# This is a /bin/sh script file - use ordinary envt var syntax
NODENAME=hare

This ends up with RabbitMQ operating as a different messaging node on the system and co-existing peacefully with the RabbitMQ setup that Chef is running.
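
If you need to poke at the renamed system broker directly, rabbitmqctl takes a -n flag to target a specific node; giving it just the short name should work, since it fills in the local hostname for you:

rabbitmqctl -n hare status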

I am still curious as to how rabbitmqctl was able to see the config that the Chef server was using, even when it was running on a different port and the databases are stored in two completely different directories (/var/opt/chef-server/rabbitmq and /var/lib/rabbitmq). If anyone knows the answer to that, I’d love to find out!

Running the cacti cookbook under Ubuntu

So something I’ve been loving lately as I dive into the world of DevOps is the large community that Opscode has built up around Chef. While Puppet and Chef aim to solve the same problem, and share many similarities in how they approach solutions (and many differences, of course), one of the swaying factors for many people like myself is the community. Puppet mostly gave me the tools to reinvent the wheel for my infrastructure; Chef gives me both the tools to make a wheel and a shop full of free wheels already made. Sometimes you need to do a bit of work to make one fit, but sometimes you can just hook it up and go. That’s an invaluable thing in today’s fast-paced IT world.

My “itch” to scratch today was Cacti. I’ve been having some problems with the local Comcast connection, and the temperature has been rising here in the PNW, so my mind has returned to getting my local network monitoring set back up. It’s also a great chance to set up a local testbed for Chef work unrelated to my day job. So I got Nagios and rsyslog bootstrapped with Chef here at home yesterday, and worked on Cacti today.

(I got a little derailed when I found out that for some reason my Linux box’s swap partition had an incorrect entry in /etc/fstab. After fixing that, I got an error when trying to turn swap back on: “swapon: /dev/sda5: read swap header failed: Invalid argument”. This article pointed me to the solution: “mkswap /dev/sda5; swapon -a”. That recreated the swap header, and then I was able to enable swap and have a stable system again.)
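
For reference, the recovery boiled down to:

mkswap /dev/sda5   # rewrite the swap header
swapon -a          # then enable everything listed in /etc/fstab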

There was already a cookbook for Cacti, but it looks like it was designed for Red Hat package names and file paths. I spent some time stepping through things and making it work with Ubuntu. For the most part, it was a matter of taking some hard-coded settings, replacing them with attributes, and setting the default values for those attributes to be the same as the old hard-coded values. This allows me to override them locally, while anyone else already using the cookbook will see no change. I did add a few platform-specific checks, for things like the Ubuntu package names.

In all likelihood, anyone running a Debian system can take the spots where I added Ubuntu support and extend them to cover Debian as well (since most of the core Ubuntu packages either come from or get merged upstream into Debian). However, I don’t have a Debian test environment yet, so I didn’t want to make assumptions. It’s on the list of things to get up and running in a VM… CentOS, Oracle Linux, and Debian.

Here’s the role that I ended up with, when all was said and done. In my case, this server is only available over the internal network, so I didn’t need SSL support.

{
  "name": "cacti-server",
  "description": "Role to configure Cacti server.",
  "json_class": "Chef::Role",
  "chef_type": "role",
  "default_attributes": {
  },
  "override_attributes": {
    "cacti": {
      "user": "www-data",
      "group": "www-data",
      "cron_minute": "*",
      "apache2": {
        "conf_dir": "/etc/apache2/conf.d",
        "doc_root": "/var/www",
        "ssl": {
          "force": false,
          "enabled": false
        }
      }
    }
  },
  "run_list": [
    "recipe[cacti::server]",
    "recipe[cacti::spine]"
  ],
  "env_run_lists": {
  }
}

I sent over a pull request to get the changes merged in, but until then feel free to grab the cookbook from github (note that you’ll want the ubuntu branch). If you’re using Berkshelf, you can add this to your Berksfile:
cookbook 'cacti', github: 'stormerider/chef-cacti', branch: 'ubuntu'
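
Then a quick berks install pulls it down along with the rest of your cookbooks:

berks install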

I hope this helps someone else!

Using my fork of htop-osx…

I’ve been futzing with htop-osx in my spare time to add support for CPU temperature monitoring and fan speed… these are things I like to know when I’m using a laptop, and I figured other folks here might as well. If you use homebrew, just do: brew edit htop-osx and paste in the values from https://gist.github.com/stormerider/5804653 and then brew install htop-osx (or if you already have it installed, brew upgrade htop-osx).
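
In other words, the Homebrew route is just:

brew edit htop-osx      # paste in the formula from the gist linked above
brew install htop-osx   # or "brew upgrade htop-osx" if it's already installed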

Otherwise, you can clone the fork from https://github.com/stormerider/htop-osx.git and build it manually. Once you’ve done so, run htop and hit F2 to enter setup, navigate over to Available Meters, and add the new meters to whichever column you want (left or right). I normally make htop suid anyway so it can get full process details, so I’m not sure whether that’s also required to probe the SMC keys for temperature/fan speed; it may well be.

(Most folks will only have one fan; the newer MacBook Pros and the 27″ iMacs only have one, and I believe the Airs do as well. Older MBPs have two, like the loaner I used when getting my MBP repaired. Some Mac Pros (the desktops, pre-iMac integration) have up to 4 fans. The code currently only displays 3 of them, the 4th being the PSU fan.)

Screenshot:

General Tech Update

I’ve been really bad about keeping this blog up to date, but I’m going to try to post a little bit more frequently. Originally I started it as just a place for me to post things that I was coding, but I’d like to expand that to various different tech-related things that I’m working on or working against.

One of the things that I’ve been playing with lately is the Roku. I picked up one of these (an XD model) on sale at Amazon for around $60, and it’s been great. See, a while back I looked at how much I was paying Comcast and TiVo and the headaches involved, and determined that it wasn’t worth keeping the cable subscription. With the advent of iTunes and Amazon Unbox, I don’t see any need to pay for regular cable service when I can just buy the episodes I want instead. Especially with Netflix to fill in the gaps. The Roku works perfectly with Unbox and Netflix, which are the two primary ways I get content now. (I occasionally buy something via iTunes, but in general I try to avoid them: their policy of not allowing redownloads is something I don’t like. Video takes up a lot of space, and I’d much rather stream it than have to deal with storing it and backing it up myself. Word on the street is that Apple is looking to change that policy, but they need buy-in from the industry, so we’ll see what happens there.)

However, I do have some local content that I like to play on the TV, and I’ve been looking at the best ways to do that. Before, I used to hook up my laptop and play files that way; it worked, but now the HDMI port is used by the Roku and I don’t want to deal with plugging and unplugging equipment all the time. Enter two solutions: Gabby and Plex. Both have you set up a local media server and add a channel on the Roku which streams content from that media server. Both Gabby and Plex do transcoding (I believe both use ffmpeg behind the scenes; I know Plex does) so that you can play more than the few media formats the Roku directly supports.

This has been an interesting experience, as neither is really stable yet. The Plex Media Server for Windows is pretty new, but seems pretty stable; Gabby’s media server has had more than a few glitches and crashes (and I can’t get it to reliably start at boot time due to the way it’s implemented in .NET). The Gabby devs are also the devs behind the Gabby Roku channel, since that’s the prime focus for them, whereas the Plex channel is actually developed by someone outside of the core Plex dev team.

So I’ve been using both, and liking both, but so far I’m leaning a little bit more towards the Plex camp. Especially since they just announced the availability of the Plex Media Server for Linux. I got it up and running on my dual-boot box (Win Vista / Ubuntu 11.04), but it’s not working properly with the Roku channel yet. Not sure if that’s a Linux server issue or a Roku channel issue, but I’m sure it’ll get sorted out in a little bit. That gives me one less reason to boot into Windows. 🙂

Speaking of that, I’ve been spending a bit more time under Linux in general of late. I find it’s a lot easier on the days that I’m working from home to have a full Linux environment at my fingertips than to run countless PuTTY sessions. Maybe it’s just in my head, but that’s the way it feels to me. I upgraded to Ubuntu Natty Narwhal a while back, which has the upside of Vim 7.3. The downside is that VPN connections seem to make my entire networking stack act weird. I opened a bug on it during the beta, but it’s lingering in limbo at this point. I definitely notice a difference between that system and my laptop, which is a Win Vista / Ubuntu 10.10 dual-boot.

After seeing a presentation on it at LinuxFest NorthWest, I finally buckled down and configured BackupPC on my NAS box (which runs Ubuntu 10.10). I just used the default Ubuntu packages for it, and spent some time configuring all my various machines to work with it. On the Linux side I just use rsync over ssh, and for the Windows boxes I use rsync via DeltaCopy. (The two things I’ll mention about the latter: you need to specifically allow port 873, aka rsyncd, in Windows Firewall. You also need to enable pings in Win Vista/Win7, which can be done via the command “netsh firewall set icmpsetting 8 enable”. Otherwise, even if DeltaCopy is working and the firewall allows it through, BackupPC won’t connect to the host because it’ll think it’s offline.) I’m still fine-tuning things, but it’s backing up around 350GB or so of data for me across like 8-10 machines. I thought for a bit about how to handle my dual-booting computers, and decided to just give them different names and identities between the different OSes. So my main machine is storm under Win Vista and lightning under Ubuntu; my laptop is typhoon under Vista and whirlwind under Ubuntu. I’ll have to figure out something for my netbook once I get the partitions straightened out on that so I can dual-boot that between 7 and Ubuntu.
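
For the Windows/DeltaCopy machines, the firewall tweaks boil down to something like this (the portopening line is from memory, so double-check the syntax on your version of Windows):

netsh firewall add portopening TCP 873 "rsyncd"
netsh firewall set icmpsetting 8 enable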

I’ve also been doing some coding lately, specifically on StormChat. I’ve been working on it for years, but I’ve finally gotten to the point where I have someone else running it on their servers, instead of the only people running it being the people that I maintain it for. Granted, I did install it for Ad Astra, but still… it’s the first install not sitting on my servers. I’m working through the bug list, trying to triage it a bit and get things rolling again. There’s a lot left to be done; right now I wouldn’t really recommend it for anyone that’s not very adventurous or without direct access to me to troubleshoot the things about it that only I know how to fix. I need to work on getting rid of the need for the server, fix/revamp the installer, and fix/revamp the admin console. Those are the three big-ticket items on that project.

I know that my WordPress plugins are really out of date at this point. Some people have reported that some of them still work, which is great to hear. I hope to find the time to sit down in the future and revisit each of them and determine if they need updates or if they can be retired.

I’ve also been playing around with RIFT lately; WoW just hasn’t been grabbing me lately. I tried getting it to run under Wine, but haven’t had much luck yet. If I do get it running, I’ll post how I did so.

WinXP MCE Credential Manager…

Continuing the tale of BartPE…

So, I’ve been using WinXP Pro for quite some time. The wife’s new computer and the new laptop both came with WinXP MCE 2005 edition loaded on them, however. Seemed pretty nice… the laptop looks beautiful when I go to play a DVD and browse through pics/videos/etc. Really happy with that.

But what is this crap with having to type my username and password to access a network share every time? Sure, it’s not an issue if I’m accessing a share on my XP Pro box, but if I’m accessing a Samba share on my fileserver, or the print server, I have to type the password every time? Because somehow being able to save a password is considered a professional feature and not a consumer feature?

Yeah, right.

I did some googling around. Apparently it’s because the password saving is tied into the credential manager, which is tied into joining a domain. Now, I don’t bother with a domain; a regular workgroup is just fine as far as I’m concerned. But further googling led to some even more interesting notes: not only did previous versions of MCE allow you to do this, but the support was still built into MCE, just disabled. All you have to do is change one registry value and you’re set.

The one problem? XP won’t let you change that registry key. Now, you can go about this the long way… or dig out that BartPE disk and do things quickly.

  1. Shut system down and boot from BartPE
  2. Open a command prompt (Go > Run > cmd)
  3. copy system32\config\system c:\system
  4. Open the Registry editor (Go > Run > regedit)
  5. Click on ‘HKEY_LOCAL_MACHINE’.
  6. File > Load Hive…
  7. Browse and select C:\SYSTEM
  8. Specify key name ‘BANANA’ and click OK
  9. Expand: HKEY_LOCAL_MACHINE > BANANA > WPA > MedCtrUpg
  10. On the right-hand side, double-click IsLegacyMCE value
  11. Change the value to 1 and click OK. (That’s the digit one, not a lowercase L!)
  12. Click on BANANA subkey (under HKEY_LOCAL_MACHINE).
  13. File > Unload Hive. Confirm.
  14. Close the Registry Editor and go back to the Command Prompt.
  15. copy c:\system system32\config\system (say yes to overwrite it)
  16. Reboot from your main hard drive and you’re done.

I’m not sure what Microsoft was thinking, but I hope this level of brain-deadness doesn’t continue into Vista (though I’ll not be surprised if it does).

BartPE saves the day…

A while back we got a laptop, which has since become my primary station. It was a cheap Gateway MX6958 that we got with some money that my in-laws gave us as a wedding present.

Imagine my confusion when, after getting it home and setting it up, I found that the 120GB hard drive had an empty 60GB partition. The first partition, mind you, not the second. Foolishly, I decided to just format it and skip mucking with things… all was good until I shut the computer down and it did not want to boot… since the boot loader was on the 2nd partition.

Cue a large d’oh sound.

I did some research and ultimately said “screw this”, popped in the rescue CD, and reinstalled. I was hoping that the XP CD would give me some options, but the rescue CD is exactly that… no ifs, ands, or buts. It did let me keep the existing files and only overwrote the Windows installation, which was nice, but that’s the only choice I got.

So, then afterwards, I cleaned things up and moved the files over to the primary partition. Looking around, I found that there was a command to resize a partition, but you could not run it on the primary partition while Windows was running.

Insert BartPE. BartPE is a Windows version of a Linux LiveCD. It’s meant as a rescue disk that you build yourself (I’m assuming to avoid legal issues with redistributing Microsoft’s files) and burn to a CD. It took me a bit to get my ISO right (after a coaster or two I dug out my one CD-RW), as the MX has an internal SATA drive and the XP CD I was using to build the PE disk didn’t have the SATA driver (I ran into an issue building it from the Gateway rescue disk and just grabbed my other XP CD to save time). But once I got that up and running, I was able to boot from BartPE and resize the partition easily.

I recommend BartPE to any sysadmin who might end up tinkering with their Windows box at some point. You never know when a rescue disk will come in handy, and it’s a lot easier to deal with than trying to work with the Recovery Console on the XP cd… assuming your XP CD even has that.