Ahhhhh, I spent thirty minutes scratching my head over this before a coworker pointed out an easy solution to my problem. Since most of my time was spent searching the web for a possible solution without finding any, I decided to post this entry in hopes that a search engine or two might pick it up and help someone else who doesn’t have awesome coworkers walking past their office on a regular basis.

I just used Visual Studio 2010 to create an ATL project, added an object to it, and added a method to that object. This is all the stuff you see on stage at every PDC. CodeProject has a tutorial very similar to what I did. Anyway, I wrote a quick two-line test.js file that uses ActiveXObject to create the object and call the method, and I got the following error (0x800A01AD).

test.js(1, 1) Microsoft JScript runtime error: Automation server can’t create object

So what did I do wrong? Turns out, nothing except trying to run the test.js file from a 64-bit command console. All I had to do to get it working the way I expected was launch the 32-bit command console (commonly found at C:\Windows\SysWOW64\cmd.exe) and run it from there.
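
If you don’t want to open a separate console window, you can also point directly at the 32-bit script host (the path below assumes a default 64-bit Windows install, where the 32-bit binaries live under SysWOW64):

C:\Windows\SysWOW64\cscript.exe test.js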

As Mr. Crockford is removing JSLint’s support for WSH, citing other community improvements, I thought I would post one such improvement. To create a WSH version of JSLint, follow these easy steps:

1. Paste the following code into Notepad or your favorite editor and save as wsh.js.

(function () {
  var i, j, e, filename, file, source, exitCode = 0,
    fso = new ActiveXObject("Scripting.FileSystemObject");
  if (WScript.Arguments.length > 0) {
    for (i = 0; i < WScript.Arguments.length; i += 1) {
      filename = WScript.Arguments(i);
      if (fso.FileExists(filename)) {
        // Open for reading (1) using the system default format (-2), then
        // normalize line endings before handing the source to JSLINT.
        file = fso.OpenTextFile(filename, 1, false, -2);
        source = file.ReadAll()
          .replace(/\r\n/g, '\n')
          .replace(/\r/g, '\n')
          .split('\n');
        file.Close();
        if (!JSLINT(source, {})) {
          for (j = 0; j < JSLINT.errors.length; j += 1) {
            e = JSLINT.errors[j];
            if (e && e.line) {
              // "file(line) : reason" is the output format most editors and
              // build tools already know how to parse.
              WScript.StdErr.WriteLine(filename + '(' + e.line + ') : ' + e.reason);
            }
          }
          exitCode = 1;
        } else {
          WScript.StdOut.WriteLine(filename);
        }
      } else {
        WScript.StdErr.WriteLine(filename + ' : File Not Found');
        exitCode = 1;
      }
    }
    // Quit after every file has been checked so multiple filenames work as advertised.
    WScript.Quit(exitCode);
  } else {
    WScript.StdOut.WriteLine('jslint - Windows Scripting front end for JSLint');
    WScript.StdOut.WriteLine('Usage: jslint filename [filename] ...');
  }
}());

Update: The code above includes a fix suggested by Marina Schiff on the JSLint Yahoo group after Mr. Crockford removed a similar workaround from JSLint.

2. Download the latest version of fulljslint.js from Mr. Crockford's GitHub.

3. Concatenate the two files into one file named jslint.js. (You can do this any way you want, but the following DOS command will work if you run it from the directory that contains both files.)

copy fulljslint.js+wsh.js jslint.js

4. To ease typing, paste the following command into a file named jslint.cmd:

@cscript "C:\PATH TO YOUR TOOLS DIRECTORY\jslint.js" //Nologo %*

Now, to check foobar.js, just run the following command:

jslint foobar.js

Since I am an avid Source Insight user, I hooked the above up to a "compile" (F5) key and use its output parser to jump straight to the lines containing lint.

December 4, 2007

Coding styles can be an almost religious issue among programmers, so I begin this post with a little trepidation. But since I have been asked to document my particular coding style for my former team before I move on to a new team next week, I thought it would be best to do it as a post on my blog. In the over 20 years that I have been programming, my style has only changed a few times. The first programming language I was ever formally taught was Pascal, so the issue of where to put curly braces has always been settled in my mind and I won’t be discussing that here. What I do want to discuss is code flow. All through high school, college and my first several jobs in the computing industry, I used a code flow like the following sample.

HRESULT DoBazThroughFooAndBar()
{
    IFooThings* pFooThings = NULL;
    HRESULT hr = GetFooThings(&pFooThings);
    if (SUCCEEDED(hr))
    {
        IBarThings* pBarThings = NULL;
        hr = pFooThings->GetBarThings(&pBarThings);
        if (SUCCEEDED(hr))
        {
            IBazThings* pBazThings = NULL;
            hr = pBarThings->GetBazThings(&pBazThings);
            if (SUCCEEDED(hr))
            {
                hr = pBazThings->DoSomethingProfound();
                pBazThings->Release();
            }
            pBarThings->Release();
        }
        pFooThings->Release();
    }
    return hr;
}

This type of code flow should be very familiar to every developer out there. The only disadvantage to using this type of style is that complex functions will typically nest to absurd levels, and that can make the code very difficult to read. The first change to my coding style happened when I joined Microsoft back in 1994. I came to work on the MFC team. Unlike most development teams at the time, MFC had the distinction of shipping not only binaries, but the source as well. Consequently, the team had developed very rigorous style guidelines well before I arrived. These guidelines abhorred the kind of nesting demonstrated above and instead opted for a code flow like the following sample.

HRESULT DoBazThroughFooAndBar()
{
    CComPtr<IFooThings> pFooThings;
    HRESULT hr = GetFooThings(&pFooThings);
    if (FAILED(hr))
        return hr;

    CComPtr<IBarThings> pBarThings;
    hr = pFooThings->GetBarThings(&pBarThings);
    if (FAILED(hr))
        return hr;

    CComPtr<IBazThings> pBazThings;
    hr = pBarThings->GetBazThings(&pBazThings);
    if (FAILED(hr))
        return hr;

    hr = pBazThings->DoSomethingProfound();
    return hr;
}

The thinking here was that since MFC was a class library, we could assume that the programmer was using C++. So any type that needs to be cleaned up after being used should be wrapped in a class so its destructor can do the cleanup. This made the code much more readable. However, it had a few disadvantages. First, you often needed to implement one-off classes to wrap types that needed to be cleaned up, or alternatively spend time looking for pre-existing wrappers for HANDLEs, HWNDs, HBITMAPs, not to mention memory that needed to be freed using delete, free, HeapFree or CoTaskMemFree. Because this isn’t always practical, you would often end up with one or two types that were not wrapped and needed to be cleaned up inside of each FAILED case. This became a problem when code analysis tools hit the scene. Around 2002, it became vogue at Microsoft for the test team to measure the success of their automated tests by using code coverage tools to analyze the percentage of code that was exercised. I remember one tester being particularly frustrated with me because he couldn’t get better than 40% of my code covered because of all the error handling code for errors that almost never happened. The other disadvantage came when you needed to debug your application or instrument it with tracing, since it was virtually guaranteed that you would have way more than one exit point within the function. So if you had a macro for tracing that needed to be placed at the beginning of every function and another one at the end, you would need to invoke that macro at every return. Consequently, this style was not popular outside of MFC. Instead most teams opted for a code flow like the following sample.

HRESULT DoBazThroughFooAndBar()
{
    IFooThings* pFooThings = NULL;
    IBarThings* pBarThings = NULL;
    IBazThings* pBazThings = NULL;

    HRESULT hr = GetFooThings(&pFooThings);
    if (FAILED(hr))
        goto Cleanup;

    hr = pFooThings->GetBarThings(&pBarThings);
    if (FAILED(hr))
        goto Cleanup;

    hr = pBarThings->GetBazThings(&pBazThings);
    if (FAILED(hr))
        goto Cleanup;

    hr = pBazThings->DoSomethingProfound();

Cleanup:
    if (pBazThings != NULL)
        pBazThings->Release();

    if (pBarThings != NULL)
        pBarThings->Release();

    if (pFooThings != NULL)
        pFooThings->Release();

    return hr;
}

Those who practice the art of programming and can trace their methodological lineage back to the school of the great Edsger W. Dijkstra will immediately chafe at the thought of using gotos in their code. Consequently, many have sought other code flows that solve the same problems without the gotos. One in particular looks like the following sample.

HRESULT DoBazThroughFooAndBar()
{
    IFooThings* pFooThings = NULL;
    IBarThings* pBarThings = NULL;
    IBazThings* pBazThings = NULL;
    HRESULT hr = S_OK;
    do
    {
        hr = GetFooThings(&pFooThings);
        if (FAILED(hr))
            break;

        hr = pFooThings->GetBarThings(&pBarThings);
        if (FAILED(hr))
            break;

        hr = pBarThings->GetBazThings(&pBazThings);
        if (FAILED(hr))
            break;

        hr = pBazThings->DoSomethingProfound();
    }
    while (false);

    if (pBazThings != NULL)
        pBazThings->Release();

    if (pBarThings != NULL)
        pBarThings->Release();

    if (pFooThings != NULL)
        pFooThings->Release();

    return hr;
}

While it very cleverly coaxes a goto out of a loop, I dislike it for exactly that reason. A loop should be used for looping, not faking a goto semantic. Since code is read many more times than it is written, I believe a programmer should strive to make the code as obvious and readable as possible. Using the above trick may confuse the casual observer and risks making it hard to maintain even for the seasoned developer. Notice particularly that variables that require cleanup must be declared outside the loop.

I finally converted (see, I told you it was religious) over to the style I use today after I realized that it addresses all the issues above. Not only that, but using my style will get you near 100% code coverage, as the successful execution path hits every line of code. Here is a sample using my current style.


HRESULT DoBazThroughFooAndBar()
{
    IFooThings* pFooThings = NULL;
    HRESULT hr = GetFooThings(&pFooThings);

    IBarThings* pBarThings = NULL;
    if (SUCCEEDED(hr))
        hr = pFooThings->GetBarThings(&pBarThings);

    IBazThings* pBazThings = NULL;
    if (SUCCEEDED(hr))
        hr = pBarThings->GetBazThings(&pBazThings);

    if (SUCCEEDED(hr))
        hr = pBazThings->DoSomethingProfound();

    if (pBazThings != NULL)
        pBazThings->Release();

    if (pBarThings != NULL)
        pBarThings->Release();

    if (pFooThings != NULL)
        pFooThings->Release();

    return hr;
}

September 25, 2007

It’s been a while since a comic has made me laugh out loud, but this one did it for me.

Just the other day I read an article by Joel Spolsky titled “VBA for Macintosh goes away.” In it, he pointed out the fact that Microsoft has lost its “backwards compatibility religion.” He indicated that this had the effect of “making it very hard for many Mac Office 2004 users to upgrade to Office 2008, forcing a lot of their customers to reevaluate which desktop applications to use.” (I couldn’t agree more; developers should really read the whole article.)

In fact, I remember when Microsoft first articulated their backwards compatibility religion to me in a way that resonated with my developer mind. It was 1993 and I was working at Intuit on TurboTax. C++ was brand new, Object Oriented Frameworks and smart pointers were all the talk around the water cooler, and we were trying to re-architect our whole development process from the ground up. One of the big decisions was whether we should use an Object Oriented Framework and, if so, which one. Borland had released the first OOF a couple of years prior to this, and I had bought a personal copy of Borland C++ to use at home and had been teaching myself the ins and outs of their Object Windows Library (OWL 1.0). It was a strange time in the development world. I can remember that almost everyone used Borland’s IDE and compiler at work to quickly build and unit test their individual features. Then finally, when we needed a good build for the testing team, we would rebuild with MSVC. Back then Borland’s compiler was much faster, but Microsoft’s produced very optimized code, which meant smaller binaries, which meant fewer disks in the package. This was before the days of ubiquitous CD-ROM drives. Reducing the number of disks was the holy grail within the development department. Management had constantly reminded us that each extra disk in the package added millions to our production costs.

The thing that hindered us from adopting OWL 1.0 was the fact that it only worked with the Borland compiler because of a few custom extensions. Microsoft had just introduced MFC the year before at their NT developers conference, but most Windows developers I knew had already learned OWL and figured that the Borland compiler would only get better over time. As we went round and round looking at other OOFs (there were plenty around), postulating that the amount of code saved using an OOF would outweigh any loss from forgoing peephole optimizations, Borland did a very dumb thing. They released their latest C++ compiler with OWL 2.0! The new improved OOF was a complete rewrite and broke all backwards compatibility. It didn’t rely on custom extensions in the compiler any more, but every OWL 1.0 developer now had to face completely rewriting their applications to move to the new OOF.

It was here that Microsoft did one of the smartest things I can remember. They promised that MFC would never do this! If a developer wrote an application based on MFC 1.0, Microsoft guaranteed that it would recompile without errors on all future versions of MFC. To the best of my knowledge, this is still true today. Developers flocked to Microsoft in droves. Our discussion at Intuit was done. MFC was it, no questions. Later the next year, Microsoft offered me a job to come work on the MFC team and I excitedly accepted.

The message of backwards compatibility is still as important today as it was then. Unlike then, however, now we have applications and services that live on the internet. Just dealing with the limitations of some online services is hard enough and pushes customers to choose carefully. Take Dave Winer, for instance. Recently, in a post titled “Best online bank?”, he asked people in the blogosphere to recommend a bank with a useful online banking service. He had the following complaints about his current bank’s online banking service.

“I have to use two browsers, one set up to pay bills from my personal account and the other to pay from my business account. I haven’t been able to figure out how to choose an account any other way. I’ve tried repeatedly to convince the bank that I don’t live in Massachusetts, but there are all these replicated copies of my address in their system, and they keep presenting the wrong address as the default.”

Add to this a lack of backwards compatibility and the problems grow rapidly. Here is an actual example from my personal experience. As regular readers will know, my son is a diabetic, and last year he got his first insulin pump. These things are fantastic and no type-1 diabetic should be without one. When we chose our pump, I specifically wanted one that included software so I could download the usage logs. I’d been downloading the logs from his blood glucose meter for several years, and using this data we had been able to fine-tune Micah’s diabetes care.

When Micah’s pump came, the software wouldn’t work on my computer. Calling their tech support, I was told that the software didn’t work on some computers and that they didn’t know why. They encouraged me to use their new web service instead, since they were no longer developing their stand-alone application. Having access to 4 computers at home and 5 more at work, I was able to determine that their software only worked with USB 1.0 hardware. Only my oldest computer had USB 1.0. I tried informing them to see if they would issue a patch, but they said they would not and that I should try their web service. If that didn’t work, they would fix that. Regretfully, I went ahead and tried their online service. It did work, but not really the way I wanted it to. Had my son’s health not depended on it, I may have given up right there, but I worked with it.

The old software had been able to export the logs to CSV files that I could load into Excel and analyze in nearly infinite ways. The online service doesn’t have that feature. The reports only come in PDF files with security permissions set to prohibit copying text. (I have no idea why.) So the next best thing I could do was start generating reports on a monthly basis and saving the resulting PDF files until I had time to figure out how to hack the data out of them. Well, I logged in just recently to generate the next monthly data table, only to discover that the web service had been updated and now you can only generate data table reports 14 days at a time. Generating a month-long data table report was now impossible.

Here we are presented with the single biggest problem with web services: I am forced to upgrade. I did not get to consider whether or not I could choose to stick with the older version. Granted, this was just a personal process I had developed for dealing with my situation. Had this been a business process, disrupted by a forced upgrade that broke backward compatibility, imagine how bad that would be. In discussing this post with a colleague at work, he pointed out other potential forced-upgrade problems. What if a forced upgrade requires a newer version of your browser or runtime? What if you use two different web services and one forces an upgrade requiring a new browser version that the other web service is not compatible with? And hence the title of my post. Things are only going to get worse before they get better.

Getting the word out.

I think this is something very important that the developer community at large should address and consider carefully. Jeff Jarvis wrote a great post on the obsolete interview, where he explained that the reporter’s currency is going away.

“Reporters think that they are the ones doing the subjects the favor and, indeed, that used to be the case and to a lesser and lesser extent, for some, it still is”

He quoted an email from Jason Calacanis stating, “Besides I have 10,000 people come to my blog every day, I don’t need wired to talk to the tech industry.” And Dave Winer saying, “Like Jason, I don’t have any trouble getting my ideas out on my own.” Well, I am not Jason or Dave, and I doubt I have 100 people visiting my blog every day, let alone 10,000. So I am going to try some self-promotion to get this idea out. Who knows, maybe Cory Doctorow will write another short story.

January 26, 2007

I can remember when I got to a certain point in school where I thought Math was easy. After all, at that point it was just addition, subtraction, multiplication and division, and I had done it over and over for so many years that I figured I had Mathematics mastered. As I continued to grow, I learned about Algebra, then Geometry, then Trigonometry, then Linear Algebra, then Calculus. In college I continued to learn about Differential Equations, then Rings and Fields, then Combinatorics, Numerical Analysis and Probability, and it wasn’t until this year that I heard of Quaternions. I’ve now come to the realization that Math is a gigantnormus subject and I am fooling myself if I think I could ever master it.

The same thing happened today with programming when I read the introduction to template metaprogramming on Wikipedia. Programming has become an equally enormous field where no one can expect to know everything. What will they think of next?

I’ve been promising to write a post about scripting in Xbox Media Center, and so here it is. In case you haven’t heard, Xbox Media Center is the single biggest reason to go out and buy an old Xbox (not a 360). It can access media from just about any source, be it local, FTP, SMB or UPnP, so you can play movies, or play MP3s while viewing a slideshow of your photos, or just watch the awesome Milkdrop visualizations. It will also launch your Xbox games and even display the weather for up to 3 customizable locations. Both my brother and I are heavy XBMC fans. He uses a Plextor PVR to record TV shows and then saves them to his 2TB server, then he accesses them from one of his three Xboxes (he has one for each TV in the house). I obtain my content from a variety of sources (Windows Media Center, TiVo, podcasts and videocasts, etc.), store it on my Buffalo NAS, and then access it from my Xbox in the living room or the PC in the kitchen upstairs.

One of the coolest geek features of XBMC is its integrated support for Python scripts. Most copies of XBMC come with a few scripts already installed. One of the quick favorites is called XBMCScripts. This script will help you download and install other scripts. Like most open source projects, the documentation and support are kind of light, but if you are not afraid to dig in and try to figure things out, there is enough there to get you started. What follows is a narrative of my adventure discovering how to write my first script for XBMC.

The Goal

Being really into podcasts and videocasts, I wanted to write a script that would list the podcasts or videocasts in an RSS feed and let you download and play them directly on XBMC. I realized that the MultiRSS Reader would be a great place to start, as that was almost exactly what I wanted to do. So I connected to my Xbox using FileZilla and downloaded the MultiRSS.py file from the XBMC/scripts directory. Next I opened it up and gazed at the code. Now, I’ve written programs in BASIC, 6502, Pascal, Fortran, iSETL, Redcode, C, 68000, C++, Visual Basic, VBScript, JScript, Perl, C#, PHP and a few more I don’t remember, so a lot of it looked familiar, but after trying to make a few obvious modifications, it became clear I would need to read some documentation. Thus began my first hunt.

Information Gathering

I started by combing over the XBOXMediaCenter site until I found the wiki page titled Building Scripts. This gave some great background information as well as a link to XBMCScripts. From here I was able to download lots of scripts to use as samples. I also found the most useful link of all, to Alexpoet’s XBMC Scripts. Alex shares a few scripts, an XBMCEmulator script and a great tutorial! I also found links to what looked like automatically generated documentation for the xbmc and xbmcgui imports.

After reading the tutorial, I wanted to try to install the emulator, so I read through Alex’s installation instructions and got a desktop installation of Python up and working. The online documentation that I could find for Python stinks compared to, say, PHP, but if you take the time to download and unzip the complete documentation, it is very good and very thorough.

Synthesis

At this point I realized that Python was more bolted onto XBMC than integrated into it. This is good and bad. Good, because Python has a rich set of support libraries that makes it very easy to do very powerful things, and also good because it meant I could do most of my development on the desktop, which greatly sped things up. However, it was bad because it meant that accessing the features unique to XBMC could be spotty (which they were). As I was working with MultiRSS.py, I quickly realized that it was designed to handle RSS feeds that were significantly smaller than the ones I wanted to handle. So I would need to use a different parsing method. I also wanted to handle OPML directories of RSS feeds, and some of these could be huge ( > 2MB ). I eventually discovered and settled on using the xml.parsers.expat library, as it is lightweight enough to handle monstrous files and could easily be made very robust and recoverable (something I came to appreciate in Python scripts).
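
To give a feel for the approach, here is a rough sketch of the idea rather than the script itself (the feed path and handler name are just placeholders): an expat-based parser only has to watch for enclosure elements and collect their url attributes, so even a multi-megabyte feed never has to be held in memory as a DOM.

import xml.parsers.expat

enclosures = []

def start_element(name, attrs):
    # RSS enclosures carry the media URL in their 'url' attribute
    if name == 'enclosure' and 'url' in attrs:
        enclosures.append(attrs['url'])

parser = xml.parsers.expat.ParserCreate()
parser.StartElementHandler = start_element

feed = open('feed.xml', 'rb')       # hypothetical local copy of the feed
try:
    parser.ParseFile(feed)          # streams the file; no DOM is built
finally:
    feed.close()

for url in enclosures:
    print url                       # Python 2 syntax, matching XBMC's interpreter of the era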

Providence

When the student is ready the teacher will emerge….

It was here that I had a most happy accident. I had been playing around with MC360, a very cool skin that almost exactly mimics an Xbox 360 (I know because I have an Xbox 360, though it is hardly ever turned on), and noticed some .py files in the extras directory. Here I realized that some of the functionality of this skin was coded in Python. I opened up one of these .py files and saw code like the following:

self.addControl(xbmcgui.ControlImage(0,0, 720,576, 'background-green.png'))
self.addControl(xbmcgui.ControlImage(70,0, 16,64, 'bkgd-whitewash-glass-top-left.png'))
self.addControl(xbmcgui.ControlImage(86,0, 667,64, 'bkgd-whitewash-glass-top-middle.png'))
self.addControl(xbmcgui.ControlImage(753,0, 16,64, 'bkgd-whitewash-glass-top-right.png'))
self.addControl(xbmcgui.ControlImage(86,427, 667,64, 'bkgd-whitewash-glass-bottom-middle.png'))
self.addControl(xbmcgui.ControlImage(70,427, 16,64, 'bkgd-whitewash-glass-bottom-left.png'))
self.addControl(xbmcgui.ControlImage(753,427, 667,64, 'bkgd-whitewash-glass-bottom-right.png'))
self.addControl(xbmcgui.ControlImage(60,0, 32,576, 'background-overlay-whitewash-left.png'))
self.addControl(xbmcgui.ControlImage(92,0, 628,576, 'background-overlay-whitewash-centertile.png'))

A quick grep in the skin directory revealed that these files were actually contained in the monstrous Textures.xpr file. So I opened Textures.xpr in my favorite hex editor (Notepad) and, lo and behold, there at the beginning of the file was a list of all the resource files contained therein. I quickly copied and pasted and cleaned this up into a nice list of resources for the MC360 skin, and then I opened the main Textures.xpr and created another nice list of resources for the Mayhem skin. Next I wrote a quick script called ResourceBrowser that merely displayed each Mayhem resource (full-screen) one at a time. With this newfound list of resources it was a simple task to create a harness that used xbmc.getSkinDir to determine which resources to try and load.
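
The harness itself can be tiny. The sketch below is a reconstruction of the idea rather than the actual ResourceBrowser code, and the two short resource lists (and the 'MC360' skin-directory check) are stand-ins for the much longer lists pulled out of each skin's Textures.xpr: pick the list that matches the active skin, then show each texture full-screen for a couple of seconds.

import xbmc, xbmcgui

# Stand-in resource lists; the real ones were copied out of each skin's Textures.xpr
MC360_RESOURCES = ['background-green.png', 'bkgd-whitewash-glass-top-left.png']
MAYHEM_RESOURCES = ['background.png', 'panel1.png']

# Pick the list that matches the skin currently in use
if xbmc.getSkinDir().find('MC360') >= 0:
    resources = MC360_RESOURCES
else:
    resources = MAYHEM_RESOURCES

window = xbmcgui.Window()
for name in resources:
    # 720x576 covers the whole (PAL) screen, the same coordinates the skin itself uses
    image = xbmcgui.ControlImage(0, 0, 720, 576, name)
    window.addControl(image)
    window.show()
    xbmc.sleep(2000)
    window.removeControl(image)
window.close()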

Hack and Slash

At this point I had a desktop Python script that would read RSS and OPML files and parse out the enclosures, and an XBMC Python script that would display the right kind of UI. All I had to do was merge the two together. Here is where I ran into my first limitation. Since I was working on this at work (long story, don’t ask), I discovered that I couldn’t access the internet, because I was behind a corporate firewall. I had configured XBMC to access the internet through our proxy server, but those settings are not accessible from script, so scripts have no way of knowing what proxy settings you have configured in XBMC.

Before I just added proxy server info to my own settings, I decided to do some poking around. Being fairly confident that I had found all the documentation there was to find on the internet for XBMC, it was time to look at the code. I found the link on XBOXMediaCenter to the SourceForge project and quickly discovered that I could browse the CVS directly from the web. I eventually determined that the code behind the xbmc and xbmcgui imports was in /XBMC/xbmc/lib/libPython/xbmcmodule/. Browsing the code in xbmcmodule.cpp, I also discovered where all the functions that can be called from xbmc.executebuiltin are listed, and where all the functions that can be called from xbmc.executehttpapi are listed. It was here I noticed the FileDownloadFromInternet function. So I ripped out urllib and urllib2 and all the code that went with them and replaced them with a single call that merely downloads the feed to a temporary file on the Z: drive. I also noticed that the built-in PlayMedia command works a lot more consistently than downloading to a temporary file and calling the xbmc.Player().play() function. Why, I am not sure.
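
The net effect looks roughly like the snippet below. The feed URL and file name are placeholders, and the exact argument format expected by the HTTP-API command is my reading of xbmcmodule.cpp rather than something lifted from the script, so treat it as a sketch; the point is that XBMC does the download itself and therefore honors whatever proxy is configured in its own settings.

import xbmc

FEED_URL = 'http://example.com/podcast.xml'   # placeholder feed URL
TEMP_FILE = 'Z:\\feed.xml'                    # scratch file on the Xbox's temporary Z: drive

# Ask XBMC (not urllib) to fetch the feed, so XBMC's own proxy settings are used
xbmc.executehttpapi('FileDownloadFromInternet(%s,%s)' % (FEED_URL, TEMP_FILE))

# ...parse TEMP_FILE for an enclosure URL, then let the built-in player handle playback
xbmc.executebuiltin('PlayMedia(%s)' % 'http://example.com/episode.mp3')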

Release

So without further ado, here is my first XBMC Python script. (I’ve also tried to post it on xbmcscripts, but just in case it gets rejected for some reason, I’ll leave it up here as well.)

Media Feed Browser – Allows the user to browse a configurable set of RSS and OPML feeds and play any content contained therein.

Aside from the work that I have to do for a living, which most would say is considerable, I’ve been buried in too many cool things to do. So to bring you (notice the singular, I don’t expect I have a huge following) up to speed, here is what I’ve been up to.

First, I’ve installed Office 2007 Beta 2 and have been trying to use it as my RSS reader. I’m sure with all the eye candy the new version of Outlook will appeal to many, but I was looking for something that would make my life easier. I originally sought out the beta after hearing it mentioned on an episode of The TabletPC Show podcast. The feature mentioned that sounded so cool was that the new Outlook would not only scan your RSS feeds, but download the actual webpage as an attachment so you could read it offline. I thought if that was possible, I could consider taking public transportation to work. The reason I don’t now is that it takes me 15 minutes to drive to work, but for some reason our public transportation system requires 45-75 minutes and two bus changes to accomplish this feat. If I could use that time to read my feeds, then I wouldn’t be wasting the time. Anyway, after days of wrestling, the beta version of Outlook will indeed download the webpage and save it as an attachment, but there is no way to display that attachment in Outlook. It’s just like Microsoft to go 95% of the way to an awesome solution and leave the last 5% hanging. I guess if I were an entrepreneur I could get rich doing the last 5%, I just don’t have the time.

Within all this, one of my favorite little companies goes and acquires the US portion of a company that owns a graphics program I’ve been dying to try out. DAZ Productions is one of the few purely internet companies that I have ever been a repeat and delighted customer of. I’ve been a Poser user since version 3, back when DAZ was part of Zygote and Poser was made by MetaCreations. Anyway, MetaCreations imploded after Kai Krause (an awesome software architect, best remembered for Kai’s Power Tools) left in 1999, and his legacy of fine graphics packages was scattered to the four corners of the internet. But that is a matter worthy of a much larger post altogether. Since then I have continued to use Poser as it passed from hand to hand, and along the way became a loyal and repeat customer of DAZ Productions. I have an Outlook reminder that fires every Monday morning to remind me to go out and download the weekly free model there, and in the past I have joined the Platinum Club for a year so I could buy a bunch of models at fantastic discounts. Anyway, the first thing DAZ did after acquiring Eovia was to celebrate by selling Hexagon for $1.99 to all existing Platinum Club members. After downloading the trial version, I reinstated my Platinum Club membership for another year ($99.95) and got my copy of Hexagon. For those used to doing surface modeling à la Wings3D, Hexagon rocks! Then DAZ announced that they were selling Carrara for $99 (after the Platinum Club discount), and I was in heaven. Now, with Hexagon, Poser and Carrara, I have an arsenal of 3D modeling tools, not to mention a slew of free models that DAZ gave me for renewing my membership, plus more that I bought with their $35 voucher.

To top that all off, I’m becoming a Second Life addict. I read a blog post from my favorite co-host of my favorite podcast where he said he had bought land in Second Life. So I have been hanging out at his place, building and scripting. At one point I started working on a skyscraper, but I quickly hit his prim limit and was stopped at the first floor. I don’t yet understand what that limit is, but I’m learning. Anyway, I left him an orb with a greeting card, but as near as I can figure he hasn’t seen it yet. It’s still there, so he hasn’t deleted it. I don’t want to make it so big that you can see it from space or anything, so I guess I’ll just have to be patient. Aside from that I’ve been wandering around (you can cover a lot of ground with a teleporter) like a pack rat, collecting virtual copies of every free model, texture and animation there is and then spending hours organizing and sorting and culling duplicates. I’m sure this behavior is OCAR related but I can’t help it.

So this turned out longer than I thought, but there you go. That’s why I haven’t been blogging recently.

April 19, 2006

I’ve been waiting a long time for this. Microsoft has announced that Visual Studio 2005 Express Edition will be free permanently. Finally I have a chance to proclaim that Microsoft has done something very, very good. Not since I owned an Apple //e have I been able to write programs of the same quality as professional programmers (remember the Beagle Bros?). For the last couple of years I have used #Develop for all my C# development. The biggest reason was that it was free, and if anyone asked me how they could write programs like mine, I could point them down a successful path. Before that I was a big fan of HTML Applications, for the simple reason that they were free to develop. I’m convinced that free access drives innovation.

When I was in high school, I worked for a contractor who had taught himself Basic and had written a program to help him bid jobs more quickly and accurately. Next thing you know he’s running a business and hiring programmers (me and my friends) to rewrite his app in C. Later, in college, I worked for Price Club and was surprised (then) to learn that when they had first started out, they had a position in the warehouse called the EDP operator. This person would run the evening reports on the cash register system, which usually printed out on cash register receipt tape, and then they would enter it, by hand, into an AS400 terminal. One of the early EDP operators taught himself Basic and started writing all kinds of apps to speed up the process as well as enable him to do more in his position. Eventually, his programs were adopted across the company and he was promoted up to corporate headquarters, where he designed new systems that professional programmers (me and my coworkers) would implement and deploy.

I firmly believe this is a big reason that Linux has really taken off. It’s not that it’s particularly better (I’m a geek, remember, I understand the religious arguments on both sides), but rather the fact that it’s particularly accessible to students and hobbyists without deep pockets. I know people running ISPs that spend tons of time running their business on Linux boxes. They’d love to run Windows Server 2003 and get regular updates on Windows Update rather than scour the internet for patches that they then have to recompile. But the fact is they can’t afford to do it any other way. Frankly, it boggles my mind how my webhost can do it and make money.

April 12, 2006

Darpan Gogia has an article on .NET Decompilation and Source Code Protection. I’ve used ildasm a few times before to look into the details of other people’s programs, but it never occurred to me to use it to aid in porting a project from one managed language to another.

I was kind of disappointed that he mentioned a rather poor product in the last section of his article that purports to provide protection for .NET assemblies by merely precompiling your IL to native code, thus losing any optimizations you might gain from JITing it on its intended platform. I’m hoping Darpan was just trying to be thorough in his article and doesn’t seriously endorse this stuff.