Mechanical Keyboard

I received the mechanical keyboard from my wishlist for Christmas. It is the Rosewill RK-9000BRI. I hadn’t been able to play with the keyboard in person, so it was pretty exciting to open it up and try it.

The first few keyboards that I used were all mechanical keyboards. The Apple II computers at school and my first Tandy had mechanical keyboards. I remember the Apple II keyboard as having a particularly satisfying feel. My first modern computer was a 386 and it had a mechanical keyboard too. All of these machines were relatively expensive compared to a modern computer, and a nice keyboard was part of that cost.

My next couple of computers came from IBuyPower (a discount computer assembler) and then I started assembling my own. All my keyboards since the 386 have been membrane style. There is nothing really wrong with a membrane keyboard; they work pretty well and are very inexpensive. The main disadvantage of a membrane keyboard is that you have to fully depress (bottom out) the key on each press. This requires extra force to type and increases strain.

Mechanical keyboards have seen a resurgence in popularity and there are quite a few options. The biggest decision to make is which type of switch to get (there is a switch under each key of a mechanical keyboard). The switch type determines how much force is required to activate the key, how much noise a key press makes, and whether there is a tactile bump when the key is activated. There is a really good post summarizing the switch types at http://www.overclock.net/t/491752/mechanical-keyboard-guide

I picked the Cherry MX Brown switch. I wanted a distinct tactile bump prior to activation, and I didn’t want an audible click. That left the brown and the clear switches. The brown is easier to find and requires a little less force than the clear. The other big decisions to make are what layout you want and whether you want any backlighting.

I am really enjoying my keyboard (this blog post is really just an excuse to use it more). I am still getting used to the idea that I don’t have to fully depress the keys. I have used membrane keyboards for so long that it is hard to adjust. I would say that I am already faster with the new keyboard, but I am still making a few more errors than I used to.

Torque 2D goes open source

One question that people often ask when they see our games is “How did you make this?” It is a question asked both by people who have very little computer experience and by other professional programmers. When another programmer asks us how we made a game, what they are really asking is: “What set of libraries and code did you start with?” Writing a game (or really any software) is a little like cooking: There are lots of different levels of “from scratch”. Did you buy a pre-made pie and put on the whipped cream? Or did you buy the can of fruit and a crust and bake it? Or did you make your own crust and filling? Or did you harvest the fruit and grind the flour and refine the sugar? Continue reading “Torque 2D goes open source”

Take back the data – Part 5

I have decided to stop using cloud services and move all my data back to my own computers. Part 1 listed all the cloud services that I use. Part 2 described how I plan to replace my cloud services with my own web server. Part 3 covered the process of setting up the web server hardware and software in more detail. Part 4 covered SSL, setting up my own email server, and the backup system.

This conversion has been an interesting experience: I have learned a lot of details about web servers and Linux that I only knew in the abstract; I knew that there was a lot of good open-source software out there, but this project has really brought home how much is out there and how good some of it is; I have been reminded how desktop applications can be so much better than web applications, and of how polished and easy to use modern web applications are.

I love having a local website that is reachable from the internet. My DSL upload rate is pretty slow, so the difference between dropping some files on a local network drive and uploading them is huge. I am looking forward to moving my main website to my local web server. (We will have to improve our upload speed to support chadweisshar.com and wsims.com.)

I am enjoying having ownership of all of my data, but I am also feeling the burden of being responsible for keeping the data safe and the server running.

In the rest of this post, I will describe how I am using the web server and OwnCloud installation to replace all my remaining cloud services.

Continue reading “Take back the data – Part 5”

Take back the data – Part 4

I have decided to stop using cloud services and move all my data back to my own computers. Part 1 listed all the cloud services that I use. Part 2 described how I plan to replace my cloud services with my own web server. Part 3 covered the process of setting up the web server hardware and software in more detail.

In this post I’ll describe securing my web server with SSL, setting up my own email server, and the backup system.

Continue reading “Take back the data – Part 4”

Take back the data – Part 3

I have decided to stop using cloud services and move all my data back to my own computers. In Take back the data – Part 1, I listed all the services that I use. In Take back the data – Part 2, I described how I plan to replace my cloud services with my own web server. In this post I’ll describe the process of setting up the web server hardware and software in more detail.

Continue reading “Take back the data – Part 3”

Trip to New York

We took a trip to New York City to visit my brother and his new baby. We had a great time seeing them and touring the city. They have a nice apartment in Astoria and they were kind enough to let us stay with them for the week.

In addition to meeting my new niece, we also took the opportunity to get legally married. Colorado has separate-but-equal civil unions, but with DOMA struck down, we needed to get married in a state where it is truly legal for the federal government to treat us as married.

We also wanted to go to New York to visit a haunted house called Blackout. Unlike most haunted houses, at Blackout you go through the experience alone, the actors can touch you, you have to sign a waiver, and there is even a safe word. They call themselves a “haunted experience,” and I wasn’t so much scared as disturbed.

I’ve posted pictures of our sightseeing here: Gallery.

Continue reading “Trip to New York”

Take back the data – Part 2

I’ve decided to stop using cloud services and move all my data back to my own computers. In Take back the data – Part 1, I listed all the services that I use. The next step is to figure out how to replace them and move my data back. To replace the services that the cloud provides, I need to:

  1. Store my data locally
  2. Back up my data (ideally an offsite backup)
  3. Provide remote access to my data

The first part is easy. The other two are much harder because of how home internet service works.

Internet providers use dynamic IP addresses. Each customer gets a new address every few days or weeks. This is like having a phone but getting a new phone number every week. You could make phone calls to other people, but you couldn’t really receive calls because no one would know your number. With dynamic IP addresses, you can talk to other computers, but you have to start the conversation. The consequence of this arrangement is that you can’t really run a website from your home computer.
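One common workaround (not the route I took, which is a static IP) is a dynamic-DNS client that notices when the address changes and updates a DNS record. The core of that check is small; here is a minimal Python sketch. The “what is my IP” echo-service URL is an assumption (any plain-text echo service works the same way), and the fetcher is injectable so the logic can be exercised without a network connection:

```python
import urllib.request

def current_public_ip(fetch=None):
    """Return this machine's public IP as seen from the internet.

    By default, asks a public "what is my IP" echo service; the
    exact service URL here is an assumption, not an endorsement.
    """
    if fetch is None:
        def fetch():
            with urllib.request.urlopen("https://api.ipify.org") as resp:
                return resp.read().decode("ascii")
    return fetch().strip()

def ip_changed(cached_ip, fetch=None):
    """True when the ISP has rotated the dynamic address since we
    last recorded it -- the moment a DNS record would need updating."""
    return current_public_ip(fetch) != cached_ip
```

A full dynamic-DNS client is just this check on a timer plus one API call to the DNS provider when it returns true.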

Broadband companies will sell you a static IP (like a permanent phone number) for a small monthly fee, but since very few people have static IP addresses, software companies haven’t been motivated to make it easy to set up a home website. So, for most people, if you want to share photos or start a blog, you have to involve a third party like wordpress.com or Facebook.

This is really too bad. The promise of the internet was that anyone could publish content that could be seen by anyone else. Now we have a few large companies that are in the business of publishing other people’s content and making money off it. Just like record companies and book publishers before them, many internet companies (Facebook, Twitter, Flickr, YouTube, etc.) make money by publishing the content created by other people.

But it didn’t need to be that way. There was really no reason that we couldn’t each have our own IP address and our own personal website. Windows could have made it easy to publish your own content to your own site. Finding and connecting to other people could have been as easy as looking up or sharing a phone number. But that isn’t how things turned out, and now it is quite a bit of hassle to set up your own website. I’ll have another post with a lot more detail about hardware and software setup, but here is the quick summary:

  1. Buy a static IP. We have DSL from CenturyLink and they charge $5 per month to have a static IP.
  2. Register your domain name and point it to that IP address. That costs about $10/year.
  3. Set up a machine as the web server. This machine should be left on all the time. Almost any computer will do for a personal website.
  4. Install Apache or some other web server software on the machine.
  5. Keep the OS and web server software up to date, take regular backups, and make sure the machine stays on and working.
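After steps 1 and 2, it is worth a quick sanity check that the domain actually resolves to the static IP before blaming the web server for anything. A minimal Python sketch of that check (the domain and address below are placeholders, and the resolver is injectable so the function can be tested without live DNS):

```python
import socket

def dns_points_at(domain, static_ip, resolve=socket.gethostbyname):
    """Check that the registrar's A record (step 2) resolves the
    domain to the static IP bought in step 1.

    `resolve` defaults to a real DNS lookup but can be swapped out
    for offline testing.
    """
    try:
        return resolve(domain) == static_ip
    except socket.gaierror:
        # The domain doesn't resolve at all yet (e.g. the record
        # hasn't propagated).
        return False

# Example with placeholder values:
# dns_points_at("example.com", "203.0.113.7")
```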

Once you have a website, it is possible to replicate most of the services that are provided by the cloud. I am going to use software called OwnCloud. OwnCloud is a webapp (a program that runs on a web server) that provides a way to store files, contacts, pictures, music, and calendars on your website and share them with only the people you choose. Since OwnCloud is running on my own web server, no third party has access to my data.
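A nice consequence is that the files are scriptable: OwnCloud exposes them over WebDAV (under remote.php/webdav/), so an ordinary HTTP client can read and write them. As a sketch, here is Python that builds, but does not send, an upload request; the server URL, path, and credentials are placeholders:

```python
import base64
import urllib.request

def webdav_put(base_url, path, data, user, password):
    """Build (but don't send) an HTTP PUT that uploads `data` to a
    file on an OwnCloud server. OwnCloud serves files over WebDAV
    under remote.php/webdav/; credentials go in a Basic auth header.
    Pass the result to urllib.request.urlopen() to actually send it.
    """
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/remote.php/webdav/{path}",
        data=data,
        method="PUT",
    )
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req

# Example against a hypothetical home server (placeholder values):
req = webdav_put("https://example.com/owncloud", "notes/todo.txt",
                 b"hello", "chad", "secret")
```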

Continue reading “Take back the data – Part 2”

Take back the data – Part 1

The recent closures of Google Reader and Catch have reminded me of the quote: “If you are not paying for it, you’re not the customer; you’re the product being sold.” Free services aren’t really free; we are just paying for them with our data instead of our money. If the “free” service can’t find a way to make money with our data, it turns into a pay service or disappears.

I was fortunate that both Google Reader and Catch allowed me to download my feeds and notes before they closed. I just had to spend the time to find replacements and transfer my data. Between these two closures and the revelations that the world’s spy agencies are working really hard to monitor and record my data, I have decided that I would like to take back ownership of my data. Continue reading “Take back the data – Part 1”

GenCon 2013

We attended GenCon in Indianapolis with Mesa Mundi. Mesa Mundi had a booth set up in the vendor room where they had two Monolith touch screens and a Microsoft PixelSense screen. They also had a table in the exhibit hall with another touch screen where attendees could come for hour-long games on the table. We spent most of our time in the exhibit hall running games of Hansa Teutonica, Bio Infiltrators, and the rest of our touch games. I enjoyed the convention and really liked watching people playing Hansa.

Continue reading “GenCon 2013”

Scoring the Nutrition of Foods

Part of the Food Cost Calculator project is to determine the nutritional value of different foods. The program can be used to calculate the cost of a food per calorie or per unit of weight. But I also want it to calculate the overall nutritional value of the food and the cost per nutrient.

There are certainly a lot of different opinions about what makes a food healthy. The program will have flexibility for a user to set up the scoring system based on the nutrients available in the USDA database.

One way to score a food is to calculate how much of the recommended vitamins/minerals the food provides. Another factor to consider is the amount of fat/carbohydrate/protein in the food.

This post will describe the scoring system in the food cost program and how I set up the scoring system for myself.

Continue reading “Scoring the Nutrition of Foods”