- Gulliver's Travels (Classical)
- jQuery in Action (Technical)
- Warrior Heir (Recreational)
- Pride and Prejudice (Classical)
- Web Marketing for Dummies (Business)
- Wizard Heir (Recreational)
- Arguing with Idiots (Recreational)
- Oliver Twist (Classical)
- More Eric Meyer on CSS (Business)
- 4 Hour Work Week (Business)
- Bloodcurdling Tales of Horror and the Macabre: The Best of H. P. Lovecraft (Recreational)
- Latitude (Recreational)
- Time Machine (Recreational)
- Fahrenheit 451 (Recreational)
- Frankenstein (Recreational)
- Jane Eyre (Classical)
- Silverlight 4 in Easy Steps (Technical)
- ADO.NET 2.0 Core Reference (Technical)
- Dragon Heir (Recreational)
- StrengthsFinder 2.0 (Business)
- The Scarlet Letter (Classical)
- Hackers and Painters (Business)
- Atlas Shrugged (Recreational)
Friday, December 30, 2011
A year of Reading
Saturday, December 3, 2011
Final Update on book
- An idea is not enough. You need to spend more than a few minutes planning things out all the way through.
- Writing is a lot like programming, in that both crafts require you to keep many things in your head simultaneously to ensure that all angles are accounted for.
- Along the same lines, distractions can kill your train of thought. My wife spent many evenings upstairs with me watching TV while I wrote. Unfortunately, I can't just tune out the TV, and during an hour with it on, I often found myself writing just a few words or sentences, far short of where I needed to be.
- The stories in my head, and my style of writing, are better suited to short stories.
Wednesday, November 16, 2011
Book Update
Sunday, November 6, 2011
Labels in Football and Software Business
But when you break down the Indianapolis Colts, in hindsight, it becomes obvious that without their linchpin player, the system falls apart. Over the last decade, so much of the Colts' emphasis has been placed on Manning. Manning is, of course, a stellar player, but he's just one player on the team. From a morale and recruitment standpoint, building around a single stellar player can inhibit recruitment in other areas. If you describe your team as a power offense, it will be tough to recruit the best defensive players in the game, the best special teams players, and the best running backs.
I've seen this happen in businesses, too. A company that makes and sells software or hardware might change its language and begin to call itself a 'Services' company. This has the effect of preventing recruiters from bringing in the best and brightest developers or hardware engineers, as it's clear from the language that the company's focus and emphasis is on the services department. Likewise, it's tough to hold onto the best and brightest developers and engineers once they are no longer appreciated to the level they once were.
In business, as in sports, it's dangerous to become labeled. Any labeling has to be done consciously and with the realization that there will be negative consequences for the other groups in the organization. In the case of the Colts, this decision seems intentional. Over the years, it's been discussed that the Colts' defense is built to be light and play with a lead, with the hope that Manning and the offense can build a quick lead and force the opponents to pass the ball. This has worked well for the Colts over the years, but remove the key player, Manning, and the system collapses in on itself. The same thing can happen to a business. If the emphasis is placed on one department and that department implodes due to any number of reasons (such as internal politics, turnover, a failed project, etc.), the other departments may be too weak to carry the additional burden.
Monday, October 31, 2011
National Novel Writing Month
Thursday, October 20, 2011
Strengths vs. Weaknesses
Tuesday, September 27, 2011
User Impersonation in .NET
using System.Diagnostics;
using System.Security;

string userName;
SecureString password;
// Code to get and set userName and password.
ProcessStartInfo startInfo = new ProcessStartInfo("cmd.exe");
startInfo.UserName = userName;
startInfo.Password = password;
startInfo.UseShellExecute = false; // Credentials are only honored when UseShellExecute is false.
// Add other parameters to startInfo as needed.
Process.Start(startInfo);
using System.Management;
using System.Security;

string userName;
SecureString password;
// Code to get and set userName and password.
ConnectionOptions connectOptions = new ConnectionOptions();
connectOptions.Username = userName;
// ConnectionOptions.Password expects a plain string; SecurePassword accepts a SecureString.
connectOptions.SecurePassword = password;
ManagementPath path = new ManagementPath("path to wmi object to query");
ManagementScope scope = new ManagementScope(path, connectOptions);
scope.Connect();
// Once we have a connected scope, we can either create an ObjectQuery object and query that way...
ObjectQuery query = new ObjectQuery("query to execute");
ManagementObjectSearcher searcher = new ManagementObjectSearcher(scope, query);
ManagementObjectCollection collection = searcher.Get();
// ...or we can create an ObjectGetOptions and a ManagementObject.
ObjectGetOptions getOptions = new ObjectGetOptions();
ManagementObject mgmtObject = new ManagementObject(scope, path, getOptions);
mgmtObject.Get();
// We can now retrieve values from the ManagementObject like a Dictionary.
string value = mgmtObject["key name"].ToString();
Saturday, September 24, 2011
Be careful what you say to customers
Tuesday, September 6, 2011
.NET Web Deploy options
So what options are there? At my previous job, code deployments were a manual process: the IT group would run an MSI file we generated during the build and use it to deploy the .NET code to the web and application servers. As the code was being deployed, SQL scripts would be executed via a custom tool designed to run the scripts against multiple databases at once, which eased this step of the deployment and better recorded any errors.
But is a manual process the only way to go? Certainly not. One option is to utilize Windows Group Policy to remotely install an MSI containing the updated code. While developers may not be familiar with this functionality, IT staff members will likely have used this before to manage the software on desktops or servers remotely.
Another option is to utilize Microsoft's Web Deploy tool to create deployment packages to push code out to other servers. This tool can take a snapshot of a website on one server and replicate it to another, or identify changes on one server and replicate them to another. Deploying these changes is still a manual process, but Web Deploy is a powerful tool that can make the deployment process easier; a sample command is shown below.
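For illustration only (the site path and target server name here are placeholders, and the right provider depends on what you are replicating), a Web Deploy sync between two IIS servers looks something like this:
msdeploy.exe -verb:sync -source:contentPath="Default Web Site/MyApp" -dest:contentPath="Default Web Site/MyApp",computerName=WEBSERVER02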
Perhaps the most intriguing option out there is Microsoft's Web Farm Framework. This tool can work with the Web Deploy tool or an MSI to automatically handle the deployment to a farm of servers. According to Scott Guthrie's article on the tool, the Web Farm Framework is an IIS extension that is installed on each server in the farm. One server is identified as the 'Primary' server, and any changes made to it are replicated to the other servers in the farm, one at a time, in order to keep the entire farm operational.
The Web Farm Framework makes deployments easy, but as with many tools, there are tradeoffs. Deployments take longer because only one server is pulled from the farm at a time. With this strategy, all but one server can respond to requests during the deployment, but the servers will be running different versions of the code until it finishes. And if the farm relies on a central database that must also be updated, when should that update happen?
What's the best option for deploying .NET code to a Server Farm? It depends on your situation.
If you have a strong IT staff, perhaps they can administer the Web Farm using Windows Group Policy and MSI's that are created during the build process.
If the IT staff is not familiar with Windows Group Policy, perhaps the Web Farm Framework is a worthwhile option, especially if the system cannot be taken down during an upgrade.
If you have the spare development staff, your best option might be to build your own tools. This requires a group of developers with an understanding of how the system works as a whole, but with the effort, you can build yourself a tool-set geared directly towards your server farm.
Saturday, July 9, 2011
Inspecting the Memory dump of a .NET Application
The Windows Debugger (WinDbg) is the tool for investigating Windows memory dump files. Microsoft provides this tool to query memory dump files, which have the .dmp extension. On its own, the tool cannot show the stack trace for .NET 2.0 applications, but with the Psscor2 extension, .NET stack traces can be viewed and a great level of detail gleaned from them. Psscor2 is an extended version of the older SOS extension. Most of the resources I've found refer to SOS instead of Psscor2, though the commands are, for the most part, the same between the two.
Once both of these tools have been installed, open the memory dump file with the Windows Debugger. After the memory dump file has been loaded, issue the command:
.load clr10\psscor2
You may have to replace clr10 with the folder where you placed Psscor2. If you place the psscor2 DLL in the same folder as the Windows Debugger, the command is simply:
.load psscor2
But how do you use the Windows Debugger once it's been installed? This article details the basic steps for loading the older SOS extension, but it still contains some good references (just replace references to 'sos' with 'psscor2'). This blog post from MSDN outlines some of the most useful commands and how to use them. Also, this blog post provides a real-world use case for the WinDbg tool. Finally, this blog post contains numerous links and a few random tidbits about the tool.
This covers most of the useful pages I've found on the web for inspecting the memory dump of a .NET application. The command "!help" will list the commands available in the Psscor2 extension. In general, commands that start with "!" are provided by the loaded extension, while those starting with "." are available natively in the Windows Debugger.
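To give a flavor of what's available, here are a few extension commands I reach for most often (these are standard SOS/Psscor2 commands, so the output depends entirely on the dump being examined):
!threads - list the managed threads in the process
!clrstack - show the managed call stack for the current thread
!dumpheap -stat - summarize the objects on the managed heap, grouped by type
!printexception - display the most recent managed exception on the current thread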
Wednesday, June 8, 2011
Is the Dream of Courier about to be fulfilled?
Monday, June 6, 2011
Samsung Intercept
Well, it turns out, there are a few options out there. In the fall of last year, Virgin Mobile started offering the Samsung Intercept, a phone running Android 2.1. The initial price was steep ($250!), but it was a fully loaded smartphone on a prepaid plan! Of course, if you want data, you're better off with a monthly plan, and these plans are ridiculously cheap: for $25 a month, you get unlimited data, texting, and 300 minutes.
It all sounds good on paper, especially if you compare phone and data plans with other carriers. The reduced monthly price (at the expense of a higher hardware cost) pays for itself in about a year.
Needless to say, I eventually bought the phone (on sale after Christmas for $180) and have been using it for 5 months. I can honestly say I don't know how I operated without this device. Prior to this, I had a cheap phone for calls, an MP3 player for listening to music at work, an old Pocket PC for reading and writing, and a TomTom GPS in the car for directions. This device has replaced them all and then some.
I use the sliding QWERTY keyboard all the time to write notes, search the web, or even write blog posts. The on-screen keyboard is a little cramped, but is handy with the auto-sense.
I have better success with navigation on the phone than with the TomTom. This is largely because the GPS device's maps are a few years out of date, and map updates cost so much that I might as well buy a new device.
I have a lot more storage on the phone as compared to my MP3 player, so I can keep a much larger music collection on hand. And if I get bored of that, there's always YouTube and Pandora.
With Amazon's Kindle application, reading is much easier on my phone compared to my old Pocket PC. I've finished a few books on the device, plus read hundreds of articles using the InstaFetch application that synchronizes with my InstaPaper account.
Finally, I've taken many more pictures with the phone's camera thanks to it always being in my pocket. Uploading to Facebook can be a bit difficult (as it doesn't always recognize orientation), but I've installed the application PicSay to handle this when the native application fails.
All told, I'm still loving my Android phone. It was upgraded to Android 2.2 about a month ago. The only downside is that during the upgrade, the note application, in which I had saved a dozen notes, was removed and I lost all of the notes. This warning was clearly stated, but unfortunately, I did not heed it.
Friday, June 3, 2011
.NET Decompile Tools
For many needing a tool to decompile a .NET assembly, the tool of choice has been Red Gate's Reflector. It is one of the fastest tools out there, and the decompiled code it displays is often somewhat intelligible (of course, THAT depends on the underlying quality of the code). But recent changes to the licensing of Reflector have left many .NET developers scrambling for their wallets or for an alternative. Here are a few alternatives to .NET Reflector.
JetBrains DotPeek
This is my favorite replacement for .NET Reflector. It feels a little slower than Reflector, but the code it decompiles is very readable. For each variable the decompiler has to name, it attempts to find a somewhat sensible name.
This tool will decompile the AssemblyInfo.cs as well, so any assembly properties will be available.
DotPeek requires .NET 4.0 but does not require registration.
Telerik JustDecompile
JustDecompile is comparable to DotPeek in many ways, but in the early build I tried, it did not decompile the AssemblyInfo.cs. While the variable names were also reasonably named, since it can't decompile the AssemblyInfo.cs file, JustDecompile is an incomplete replacement for Reflector.
JustDecompile does require free registration.
Conclusion
There are still other tools to decompile .NET code, but Reflector, DotPeek, and JustDecompile are all powerful tools backed by major vendors in the .NET ecosystem. As such, they bring a lot of clout to these solutions and have the support teams in place to maintain them. At the end of the day, JustDecompile lacks the ability to decompile the AssemblyInfo.cs. If this is as important to you as it is to me, you will want to look at DotPeek instead.
Monday, May 9, 2011
How rake db:migrate works
When I first started developing Rails applications, the database migrations with ActiveRecord seemed so much more advanced than anything I'd worked with in .NET (and I've used ADO.NET, LINQ to SQL, and NHibernate). The migrations seemed so intuitive, powerful, and magical. Unfortunately, when the magic doesn't work, you need an understanding of the underlying architecture to fix it.
The Problem
The other day I rolled out a Production update for Kanban for Developers. In it were 2 migrations that added a new integer column to 2 different tables and a bit column to a third table. The migration passed local validation, but when I went to deploy it to production, I issued the command:
rake db:migrate RAILS_ENV=production
But running this command gave me an error stating that it couldn't add a duplicate index. The migration it named had been created in the middle of the previous summer. There had been numerous migrations since then, so for some reason the system thought this migration had been skipped.
Now, there are a couple of ways to fix the problem. The first, and easiest, would have been to remove the index from the database and re-run the command. However, I wanted to learn how db:migrate operated, so I began to look around for whatever it is that records which migrations have been executed against the database.
Schema.rb
The first place I looked was within the folder structure of the web application itself. The db folder would be the logical place to store this kind of information. In this folder is the schema.rb file, and as one might guess, it contains the schema information for the database.
The file in production had all of the latest additions, though the database itself didn't. Why? Because in my rush to move code to production, I had copied the entire db folder instead of just the migration folder or just the latest migration file. With the power of source control, I reverted the file and re-ran rake db:migrate. At this point, I received the same error message.
schema_migrations
And the reason for this is that the migration information is stored within the database, not in a file, which of course makes much more sense. Databases are generally locked down much more than the file system and provide a much more logical place to store this information. For Rails, this information is stored in a simple table named 'schema_migrations', which contains a single column, version, holding the timestamped version of each migration that has been applied.
Resolution
In the end, after ensuring that the index and all of the database changes from the missing migration had indeed been applied, I simply wrote an insert statement, along the lines of the one below, to add a record for the missing migration. After this, I was able to proceed with the new migrations and complete the build.
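For reference, the fix was a one-liner of roughly this shape (the version value here is a placeholder; use the timestamp prefix from the file name of the missing migration):
INSERT INTO schema_migrations (version) VALUES ('20100715123456');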
And the schema.rb file? It actually was updated to include the latest changes after I ran rake db:migrate. So while this file isn't how rake db:migrate determines whether a migration has been executed, it does document what database changes have been made in a location separate from the database.
Thursday, April 14, 2011
Geek Reading List
In an article on TechRepublic, an author listed what he (or she) considered to be required reading for a geek. While the list is a decent start, there are many excellent reads for geeks that I felt were left off of it.
The Lord of the Rings trilogy: Any book that defines a new language is bound to be popular with geeks. Add the fact that this work is the granddaddy of all modern fantasy, and you have a book that is all but required reading for every geek.
Myst: This book is the prequel to the story told in the popular mid-90s video game. I've always thought of the clues and gameplay of Myst as particularly geeky, and the prequel in book format does not disappoint.
Flatland: The story of an imagined two-dimensional world and what happens when it encounters beings from a three-dimensional one. The topic and the science involved are certainly in the realm of the geeky.
Foundation series by Isaac Asimov: This book is based on the premise that, by studying enough data, one could predict future events. Set that story in space, and you have the holy grail of geek sci-fi. The book has several sequels, but I feel the first was by far the best.
This, of course, is not an exhaustive list, but it does contain several good books that should be included in, or at least considered for, a "geek" reading list.
Monday, April 11, 2011
Saturday, April 9, 2011
jQuery UI Draggable and Scrollbars
While working on a recent sprint of my task management software, Kanban for Developers, I ran across an issue that took some time to track down. On the main screen of the application is a Kanban board with yellow boxes representing tasks written on sticky notes on a whiteboard. There are 5 different categories, and tasks can be dragged and dropped onto any of them. By default, categories are limited to only a small number of tasks to prevent users from being overwhelmed and so that the tasks will be displayed nicely on the board. However, users can edit the size of these categories, and in the latest sprint, they can double-click a task to expand it to fit its text. This can greatly increase the size of the task category and look unsightly. Something needed to be done to rein in the height of the task categories.
The simplest solution would be to add scrollbars via the CSS overflow property, something along the lines of the snippet below, but the appearance is less than desirable.
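For illustration only (the class name and height here are made up), the straightforward version of that idea is just:
.task-category {
  max-height: 200px;  /* cap the height of the category */
  overflow-y: auto;   /* show a scrollbar when the tasks overflow */
}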
Then I found a site with a promising list of 10 different scrollbar widgets. While the list looked promising, there were some plugins, like jScrollPane, that required too much setup and broke other parts of the application. Most of the other plugins would not work with the draggable plugin from jQuery UI: once the draggable element reached the edge of its parent container, the element disappeared behind the adjacent container. I encountered this behavior with jQuery Scroll and Tiny Scrollbar. After trying a few different plugins, I was finally able to get the expected behavior out of the ShortScroll plugin.
Is ShortScroll the only choice for getting the jQuery UI draggable and scrollbar plugins to play nicely with each other? No, but if you encounter this issue with a jQuery scrollbar plugin and draggable elements, you will probably have to switch your scrollbar plugin, as there does not appear to be a workaround for many of these plugins.
Of course, this could change in a few months.