The following are two things you can try when Windows Update fails with a mysterious error such as 0x80240004.

Fix #1 – Rename the SoftwareDistribution folder

  1. Go to the MoGo or the Start Menu, find cmd, right-click on cmd, choose Run as administrator, and run the following command:
  2. net stop wuauserv

  3. Click Run, type %windir%, and press Enter.
  4. In the window that opens, rename the folder SoftwareDistribution to SoftwareDistribution.old, then close the window.
  5. Go back to the cmd window and run the following command:
  6. net start wuauserv

  7. Now run Windows Update again and check whether the issue is resolved. If it persists, please follow Fix #2 below.
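If you prefer, the steps in Fix #1 can be condensed into a small batch file. This is just a sketch of the steps above; run it from an elevated command prompt, and note that a fresh SoftwareDistribution folder is created automatically when the service restarts:

```bat
@echo off
rem Stop the Windows Update service so its cache folder can be renamed.
net stop wuauserv
rem Rename the update cache; the new name is arbitrary.
ren %windir%\SoftwareDistribution SoftwareDistribution.old
rem Restart the service; Windows recreates the SoftwareDistribution folder.
net start wuauserv
```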

Fix #2 – Hack the registry

  1. Go to the MoGo or the Start Menu, find regedit.exe, and run it.
  2. Navigate to HKEY_LOCAL_MACHINE\Software\Policies\Microsoft\Windows\WindowsUpdate\AU
  3. Change the value of UseWUServer from 1 to 0.
  4. Now run Windows Update again and hopefully it will work.
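The same registry change can be scripted from an elevated command prompt. This is a sketch using the built-in reg tool; setting UseWUServer to 0 tells the update client to use Microsoft's public update servers rather than an internal WSUS server, and restarting the service makes the change take effect:

```bat
rem Turn off the "use WSUS server" policy flag, then restart the service.
reg add "HKLM\Software\Policies\Microsoft\Windows\WindowsUpdate\AU" /v UseWUServer /t REG_DWORD /d 0 /f
net stop wuauserv
net start wuauserv
```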

Two great videos from Microsoft that demonstrate a future where information is at your fingertips.

And just in case you are inclined to be skeptical, consider a similar video from 1993.

Ahhhhh, I spent thirty minutes scratching my head over this before a coworker pointed out an easy solution to my problem. Since most of my time was spent searching the web for a possible solution without finding any, I decided to post this entry in hopes that a search engine or two might pick it up and help someone else who doesn’t have awesome coworkers walk past their office on a regular basis.

I just used Visual Studio 2010 to create an ATL project, added an object to it, and added a method to the object. This is all the stuff you see on stage at every PDC. CodeProject has a tutorial very similar to what I did. Anyway, I wrote a quick 2-line test.js file that uses ActiveXObject to create the object and call the method, and I got the following error (0x800A01AD).

test.js(1, 1) Microsoft JScript runtime error: Automation server can’t create object

So what did I do wrong? Turns out, nothing except trying to run the test.js file from a 64-bit command console. All I had to do to get it working the way I expected was launch the 32-bit command console (commonly found at C:\Windows\SysWOW64\cmd.exe) and run it from there.
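For reference, the two-line test script looked something like this. The ProgID and method name below are placeholders, not the actual names from my project:

```javascript
// Hypothetical test.js: "MyAtlProject.MyObject" and DoSomething() are
// placeholder names for the ATL coclass and the method added in the wizard.
var obj = new ActiveXObject("MyAtlProject.MyObject");
WScript.Echo(obj.DoSomething());
```

The underlying cause: an ATL project builds a 32-bit DLL by default, so the COM object is registered only in the 32-bit view of the registry. The 64-bit script host can’t see that registration, hence the 0x800A01AD error; the 32-bit host launched from SysWOW64 can.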

As Mr. Crockford is removing JSLint’s support for WSH, citing other community improvements, I thought I would post one such improvement. To create a WSH version of JSLint, follow these easy steps:

1. Paste the following code into Notepad or your favorite editor and save as wsh.js.

(function () {
  var i, j, e, filename, file, source,
    fso = new ActiveXObject("Scripting.FileSystemObject");
  if (WScript.Arguments.length > 0) {
    for (i = 0; i < WScript.Arguments.length; i += 1) {
      filename = WScript.Arguments(i);
      if (fso.FileExists(filename)) {
        // 1 = ForReading, -2 = open using the system default text format.
        file = fso.OpenTextFile(filename, 1, false, -2);
        source = file.ReadAll()
          .replace(/\r\n/g, '\n')
          .replace(/\r/g, '\n');
        file.Close();
        if (!JSLINT(source, {})) {
          for (j = 0; j < JSLINT.errors.length; j += 1) {
            e = JSLINT.errors[j];
            if (e && e.line) {
              WScript.StdErr.WriteLine(filename + '(' + e.line + ') : ' + e.reason);
            }
          }
        }
      } else {
        WScript.StdErr.WriteLine(filename + ' : File Not Found');
      }
    }
  } else {
    WScript.StdOut.WriteLine('jslint - Windows Scripting front end for JSLint');
    WScript.StdOut.WriteLine('Usage: jslint filename [filename] ...');
  }
}());

Update: The code above includes a fix suggested by Marina Schiff on the JSLint Yahoo group after Mr. Crockford removed a similar workaround from JSLint.

2. Download the latest version of fulljslint.js from Mr. Crockford's GitHub.

3. Concatenate the two files into one file named jslint.js. (You can do this any way you want, but the following DOS command will work if you run it from the directory that contains both files.)

copy fulljslint.js+wsh.js jslint.js

4. To ease typing, paste the following command into a file named jslint.cmd:

@cscript "C:\PATH TO YOUR TOOLS DIRECTORY\jslint.js" //Nologo %*

Now, to check foobar.js, just run the following command:

jslint foobar.js

Since I am an avid Source Insight user, I hooked the above up to a “compile” (F5) key and use Source Insight’s output parser to quickly take me to the lines containing lint.

April 5, 2008

Because of a weird twist of fate, my new computer did not come with any software to burn CDs or DVDs even though it has a burner. Last night, at 12:30am, I really needed to burn a CD before going to bed. I tried Microsoft’s built-in CD burner, but it failed. Actually, it never started. I couldn’t seem to convince it that I had actually inserted a blank CD. So I hit the internet trail and came across an outstanding open source CD/DVD burning application called InfraRecorder. In the span of 15 minutes this application has earned a place in my short list of must-install applications for Windows PCs.

Just the other day I read an article by Joel Spolsky titled, VBA for Macintosh goes away. In it, he pointed out the fact that Microsoft has lost its “backwards compatibility religion.” He indicated that this had the effect of “making it very hard for many Mac Office 2004 users to upgrade to Office 2008, forcing a lot of their customers to reevaluate which desktop applications to use.” (I couldn’t agree more, developers should really read the whole article.)

In fact, I remember when Microsoft first articulated their backwards compatibility religion to me in a way that resonated with my developer mind. It was 1993 and I was working at Intuit on TurboTax. C++ was brand new, Object Oriented Frameworks and smart pointers were all the talk around the water cooler, and we were trying to re-architect our whole development process from the ground up. One of the big decisions was whether we should use an Object Oriented Framework and, if so, which one. Borland had released the first OOF a couple of years prior to this, and I had bought a personal copy of Borland C++ to use at home and had been teaching myself the ins and outs of their Object Windows Library (OWL 1.0). It was a strange time in the development world. I can remember that almost everyone used Borland’s IDE and compiler at work to quickly build and unit test their individual features. Then finally, when we needed a good build for the testing team, we would rebuild with MSVC. Back then Borland’s compiler was much faster, but Microsoft’s produced very optimized code, which meant smaller binaries, which meant fewer disks in the package. This was before the days of ubiquitous CD-ROM drives. Reducing the number of disks was the holy grail within the development department. Management constantly reminded us that each extra disk in the package added millions to our production costs. The thing that hindered us from adopting OWL 1.0 was the fact that it only worked with the Borland compiler because of a few custom extensions. Microsoft had just introduced MFC the year before at their NT developers conference, but most Windows developers I knew had already learned OWL and figured that the Borland compiler would only get better over time. As we went round and round looking at other OOFs (there were plenty around), postulating that the amount of code saved using an OOF would outweigh any loss from forgoing peephole optimizations, Borland did a very dumb thing.
They released their latest C++ compiler with OWL 2.0! The new improved OOF was a complete rewrite and broke all backwards compatibility. It didn’t rely on custom extensions in the compiler any more, but every OWL 1.0 developer now had to face completely rewriting their applications to move to the new OOF. It was here that Microsoft did one of the smartest things I can remember. They promised that MFC would never do this! If a developer wrote an application based on MFC 1.0, Microsoft guaranteed that it would recompile without errors on all future versions of MFC. To the best of my knowledge, this is still true today. Developers flocked to Microsoft in droves. Our discussion at Intuit was done. MFC was it, no questions. Later the next year, Microsoft offered me a job to come work on the MFC team and I excitedly accepted.

The message of backwards compatibility is still as important today as it was then. Unlike then, however, now we have applications and services that live on the internet. Just dealing with the limitations of some online services is hard enough and pushes customers to choose carefully. Take Dave Winer, for instance. Recently, in a post titled, Best online bank?, he asked people in the blogosphere to recommend a bank with a useful online banking service. He had the following complaints about his current bank’s online banking service.

“I have to use two browsers, one set up to pay bills from my personal account and the other to pay from my business account. I haven’t been able to figure out how to choose an account any other way. I’ve tried repeatedly to convince the bank that I don’t live in Massachusetts, but there are all these replicated copies of my address in their system, and they keep presenting the wrong address as the default.”

Add to this a lack of backwards compatibility and the problems grow rapidly. Here is an actual example from my personal experience. As regular readers will know, my son is a diabetic and last year he got his first insulin pump. These things are fantastic and no type-1 diabetic should be without one. When we chose our pump I specifically wanted one that included software, so I could download the usage logs. I’d been downloading the logs from his blood glucose meter for several years and using this data we have been able to fine tune Micah’s diabetes care.

When Micah’s pump came, the software wouldn’t work on my computer. Calling their tech support, I was told that the software didn’t work on some computers and that they didn’t know why. They encouraged me to use their new web service instead, since they were no longer developing their stand-alone application. Having access to 4 computers at home and 5 more at work, I was able to determine that their software only worked with USB 1.0 hardware. Only my oldest computer had USB 1.0. I tried informing them to see if they would issue a patch, but they said they would not and that I should try their web service. If that didn’t work, they would fix that. Regretfully, I went ahead and tried their online service. It did work, but not really the way I wanted it to. Had my son’s health not depended on it, I may have given up right there, but I worked with it.

The old software had been able to export the logs to CSV files that I could load into Excel and analyze in nearly infinite ways. The online service doesn’t have that feature. The reports only come in PDF files with security permissions set to prohibit copying text. (I have no idea why.) So the next best thing I could do was start generating reports on a monthly basis and saving the resulting PDF files until I had time to figure out how to hack the data out of them. Well, I logged in just recently to generate the next monthly data table, only to discover that the web service had been updated and now you were only able to generate data table reports 14 days at a time. Generating a month-long data table report was now impossible.

Here we are presented with the single biggest problem with web services: I am forced to upgrade. I did not get to consider whether or not I could stick with the older version. Granted, this was just a personal process I had developed for dealing with my situation. Had this been a business process, disrupted by a forced upgrade that broke backwards compatibility, imagine how bad that would be. In discussing this post with a colleague at work, he pointed out other potential forced-upgrade problems. What if a forced upgrade requires a newer version of your browser or runtime? What if you use two different web services and one forces an upgrade requiring a new browser version that the other web service is not compatible with? And hence the title of my post. Things are only going to get worse before they get better.

Getting the word out.

I think this is something very important that the developer community at large should address and consider carefully. Jeff Jarvis wrote a great post on the obsolete interview, where he explained that the reporter’s currency is going away.

“Reporters think that they are the ones doing the subjects the favor and, indeed, that used to be the case and, to a lesser and lesser extent, for some, it still is.”

He quoted an email from Jason Calacanis stating, “Besides, I have 10,000 people come to my blog every day, I don’t need Wired to talk to the tech industry.” And Dave Winer saying, “Like Jason, I don’t have any trouble getting my ideas out on my own.” Well, I am not Jason or Dave, and I doubt I have 100 people visiting my blog every day, let alone 10,000. So I am going to try some self promotion to try and get this idea out. Who knows, maybe Cory Doctorow will write another short story.

March 16, 2007

Microsoft is very good at being #2. When are we going to take advantage of our raw problem solving talent and apply it to the right problems, rather than re-solving the same problems our competitors are solving?

July 17, 2006

Microsoft announced this morning that it is pulling support for private folders, which it had just recently released on the web. This seemed like a good time to mention a free software product that I have been using for a while that I believe is many times better.


Free Open-Source On-The-Fly Encryption


TrueCrypt is really simple to use. After installing (which is optional, for all you no-install fans), you merely create a new TrueCrypt volume. The first decision you need to make is the encryption algorithm. There are several to choose from (AES, Blowfish, CAST5, Serpent, Triple DES, Twofish, AES-Twofish, AES-Twofish-Serpent, Serpent-AES, Serpent-Twofish-AES, Twofish-Serpent). Luckily, TrueCrypt provides a detailed description of each so you can pick one that’s right for you. Next, you need to choose the size. Finally, you need to specify a password. TrueCrypt will then format your volume, filling it with random-looking encrypted data.

After you have created the volume, you can mount it to any available drive letter. From there you just drag and drop the files you want to encrypt onto the drive. That’s all there is to it. When the volume is unmounted there is no way to get to the data until you mount it again (which requires the password).

One of the neatest features of TrueCrypt is “Plausible Deniability.” I know that sounds kind of cloak and dagger, but practically it means this: you can name your volume anything you want, with any extension you want. So call it setup.exe or something common like that. If someone tries to run it, it won’t work. If someone more savvy looks at a hex dump, there is nothing in the file that would clue them in to the fact that it’s an encrypted volume. Suppose I kept sensitive customer data on my laptop in a TrueCrypt volume and my laptop was stolen. A hacker might figure out my administrator password (there are plenty of cracking programs available to do that), but they probably won’t notice a temporary file, or a bad setup.exe, and think, hey, there is probably secret data in there!

Moon from Celestia

I saw a post on Make today of a collection of high-resolution (2750×2300) image scans of the 30 color plates from Alexander Jamieson’s 1822 (almost 200 years old!) star atlas, courtesy of the United States Naval Observatory Library. I’m sure many can think of lots of ways to use this beautiful antique art.

Anyway, while perusing antique star charts it reminded me that I had come across a really cool little application recently that I haven’t heard much about. Celestia, written by Chris Laurel, is like Google Earth for space. Even if you are not an astronomer, this little graphics application offers the opportunity to see the galaxy in a totally different way.