March 26, 2007
Another Engineering Excellence Blogger
James Waletzky, who works on the Development Engineering Excellence team with me, has started blogging. The blog is called Progressive Development and details the (and I quote) "Zany Adventures in Software Engineering with Maven and Motley". Maven and Motley, it seems, are two developers at a software company who disagree on what the $#%@* they are supposed to be doing. One of them is a slightly grumpy throwback and the other is a well-read progressive. I'll leave it to you to decide which one is less annoying. James is quite wise in the ways of software and there should be some good tips in there. So far he has blogged about vision, money, and debuggers.
March 25, 2007
The Great Lawnmower Caper
Last week I tried a small experiment to determine the monetizability (which probably isn't a word, but you get the idea) of the Mini-Microsoft site. DISCLAIMER: I have no ability (or interest) in actually trying to turn the site into piles of cash, nor do I have any knowledge that Mr. da'Punk intends to. I was just CURIOUS about it.
Anyway, I had a lawnmower I wanted to sell. As I have mentioned, the extremely useful Micronews classified ads were transitioned to an online version which nobody reads. What people at Microsoft DO read is Mini-Microsoft. Someone had even made a comment (which I can't find right now) about how the site would be a great place to look for disgruntled Microsofties to hire. So, it seemed an obvious lateral thought to advertise the lawnmower on Mini-Microsoft.
I was pretty confident that Mini would accept the post, being an enlightened individual. Besides, it was bound to generate some discussion. Naturally it was all about the size of my brainpan, but I was actually hoping to sell the lawnmower. This is what being "agile" is about...put something in front of the customers, see how they respond, and iterate. Unfortunately, the ad didn't work...I did get some amused comments from inside the company, but no buyers (it is a nice lawnmower, really).
I'm not sure what conclusions to draw from this. I'm pretty sure a lot of people at Microsoft read Mini-Microsoft, and I doubt anybody would ignore the lawnmower just because the advertising method was tacky. So I guess nobody wants a lawnmower right now...oh well (thinking about it, the reason I am selling it is that I hire somebody to mow my lawn, which might be true of many other employees). If you want a nice battery-operated lawnmower, you know where to find me.
I did accomplish one thing: encouraging KeeperPlanet to post on Mini-Microsoft asking for angel funding for his startup. You can't stop progress.
March 24, 2007
Cool Pictures by Satoshi Kuribayashi
I saw this in Natural History magazine. Kuribayashi is a photographer who modifies his camera so he can take pictures where both an extremely close object and a distant background are in focus. The resulting effect can make insects look giant, like in the photo I show here. If you go to his site you can see more examples (click on Enter, then choose Gallery 1, "Micro Landscape"; the one I saw in the magazine was image #3). He also took the pictures for a book called In Front of the Ant, but I can't tell how good those are (the picture on Amazon, ironically, is out of focus).
March 22, 2007
The DST Fixup
As you probably noticed, the US government decided to pull a fast one and change the start of Daylight Saving Time, moving it from the first Sunday in April to the second Sunday in March. This caused problems, particularly for calendar appointments scheduled between those two dates. So why was that?
I was discussing this in a meeting of my team the other day and people were sort of harrumphing about how silly it was that Outlook had these problems, and didn't developers know that you should just store things in UTC (aka, more or less, Greenwich Mean Time) which doesn't change to DST and etc. In fact, the Wikipedia article on Daylight Savings Time states "Some computer-based systems may require downtime or restarting when clocks shift...These problems are compounded when the DST rules themselves change, as in the Y2K7 problem" and then goes on to claim "These problems can be avoided by adopting Coordinated Universal Time (UTC), which is unaffected by DST."
Right, so the people who wrote the software must be stupid? I mean, if most of the Developer Excellence Team agrees with Wikipedia, it's not going to get much more truthy than that, is it?
Except, of course, that they were wrong (as I eventually convinced them). This is an unusual case where storing appointment times in UTC actually winds up hurting you. I'm not saying storing in UTC is bad; you want a conference call between New York and Seattle to show up at the correct local time in each place, and computers move around and all that, so storing it in UTC on the server makes sense.
For example: Consider a recurring appointment that you set up every Friday at 10 am, for 4 weeks beginning March 23 and ending April 13. Outside of DST Seattle is 8 hours behind UTC, and during DST it is 7 hours behind. So if we are a computer that thinks that the cutover is April 1, that innocuous 10 am meeting actually occurs as follows, with times expressed in UTC:
- 6 pm on March 23
- 6 pm on March 30
- 5 pm on April 6
- 5 pm on April 13
Now, if you suddenly change things so that DST starts in early March, the two meetings scheduled for March 23 and March 30 actually shift; not in local time (which is still 10 am), but in actual UTC time. They are now at 5 pm also. So while most fixes about DST simply involve modifying the mapping between UTC and local time, in this case the meeting, as viewed by an alien and/or a Londoner, has shifted its start time.
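The arithmetic is simple enough to sketch in a few lines of Python (a toy model only: the 2007 dates and Seattle offsets are hardcoded, and everything Exchange actually does is ignored):

```python
from datetime import date

# Seattle is UTC-8 in standard time and UTC-7 during DST.
def meeting_utc_hour(meeting_date, dst_start, local_hour=10):
    """UTC hour of a Seattle meeting at local_hour, given the date DST starts."""
    offset = 7 if meeting_date >= dst_start else 8
    return local_hour + offset

old_rule = date(2007, 4, 1)   # old cutover: first Sunday in April
new_rule = date(2007, 3, 11)  # new cutover: second Sunday in March

for d in (date(2007, 3, 23), date(2007, 3, 30),
          date(2007, 4, 6), date(2007, 4, 13)):
    print(d, meeting_utc_hour(d, old_rule), meeting_utc_hour(d, new_rule))
```

Under the old rule the first two meetings sit at 6 pm UTC; under the new rule they move to 5 pm, while the two April meetings stay put. Same local time, different UTC time.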
So, when the cutover to DST changed, any meeting scheduled between March 11 and March 31 would need to actually change to be an hour earlier than previously. If not, it would show up an hour later when converted back to local time on a computer that had been updated (which is actually happening; people are showing up for our classes an hour late because that is what their calendar shows). And what if you have two meetings scheduled on March 26, one by a computer that has been patched and one by a computer that has not been patched? I doubt that Exchange Server has a way for a client machine to say "I really know what I am talking about when I tell you a UTC time to schedule an appointment on." And what if the Exchange Server itself hasn't been patched?
I suppose this all could have been designed for ahead of time, if computers somehow presented evidence of how up-to-date their DST cutoff rules were, with the server arbitrating, letting the most up-to-date machine win, and knowing how to patch things up between different rules. But my point is that this is really a very tricky problem, which may actually be impossible to solve perfectly without a much more complicated patch than we likely had time to do.
March 17, 2007
Engineering for Software Performance
If you dig around Microsoft for guidance on how to make your software have good performance ("performance" in this case meaning not "what it's supposed to do" but rather "it runs fast"), you will probably find some advice along the lines of "performance is not something you can leave until the end; performance is something you need to work on all throughout the life of your product." This seems like self-evident advice: if you delay worrying about performance until (say) you are integrating and preparing to ship, it may have drifted out of control and you won't have enough time to improve it before you need to ship. The comparison is made to security, where a design mistake can be just as painful as a coding mistake, and it took Microsoft a while to appreciate the (correct) assertion that "security is not something you can leave until the end; security is something you need to work on all throughout the life of your product."
The problem is that when I think about my programming past, this really hasn't been the way it has been; I have pretty much ignored performance until the end, or more particularly I've ignored performance until a user noticed it was so bad that we had to fix it. The user could have been me, or someone else on the team, or someone in a beta...but the point is I didn't have to dig into the details of design to see ahead of time where the potential performance problems were going to be, the way you have to do for security. If something was so bad a user noticed it, then I fixed it, and the fix was always some local thing where I had to fine-tune the particular piece of code that was slower than it needed to be--nothing that destabilized all my work and made me wish I had analyzed the design in enough detail to perceive the performance problem before I started writing code.
Performance is not like security, where your system is only as secure as the weakest link in the chain; if some code that rarely executes is slow, it probably doesn't matter. Someone was telling me that at some point during the early work on Word 2007, they discovered they were writing a value to the registry on every keystroke. That sounds really bad, and it was, which is why they noticed...but the fix was nothing too fancy, just stop doing it.
On the other hand, I have seen LOTS of bugs in code that was attempting to optimize the speed of something that didn't need to be optimized, because it wasn't something a user was ever going to care about. Implementing a cache of some sort seems to be the most common mistake, I suppose because caches are the kind of thing you learn about in college and rush out to implement when you get a job. Overly complicated algorithms, like trying to get your sort down from O(n^2) to O(n log n) when all you ever sort is 10 items, are another, for similar reasons. This has led some people to say that premature optimization is the root of all evil, a sentiment that I agree with more than I disagree with.
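To pick a contrived example of the cache variety (hypothetical names, obviously, not any real bug I'm thinking of): a hand-rolled cache that quietly keeps serving stale answers after the underlying data changes.

```python
prices = {"widget": 10}
_cache = {}

def shipping_cost(item):
    # The "optimization": remember the answer so we never recompute it.
    if item not in _cache:
        _cache[item] = prices[item] + 1
    return _cache[item]

print(shipping_cost("widget"))  # 11
prices["widget"] = 20           # the underlying data changes...
print(shipping_cost("widget"))  # still 11 -- the cache is now stale
```

The recomputation being avoided here is a dictionary lookup and an addition; the bug you get in exchange is the kind that only shows up when the data changes, which is exactly when you least want it.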
Microsoft has now realized that performance tuning needs to be scenario-based: that is, you have to actually think of something a user would actually do and then decide if the performance is going to be good enough. The problem, of course, is that you can't REALLY tell how long something is going to take until you run it, so this perforce does lead to optimization being done late in the game. And I really don't mind that; I have seen more bugs due to premature optimization than I have seen...well, I can't quite figure out how to parallel-and-reverse that previous sentence fragment, but you get the idea.
There have been times (particularly when I was working on Windows NT networking and trying to win performance tests against Novell's Netware) where we knew we had to absolutely be as fast as possible, and before we had any test data we would sit down with the assembly language that the compiler had produced and try to save clock ticks by having our if statements fall through more often than they jumped. But that is a pretty unusual case where we basically knew the exact user scenario that was going to matter, and also the code that was going to be executed the most.
There is something else interesting that is going on with computers, especially big servers. It turns out that memory usage is actually much more of a performance issue, especially as you scale out, than raw speed is. A server with thousands of connections is probably only doing active processing on a few of them, but it is holding the memory for all of them. And if you use up too much memory you wind up paging to disk, which typically dwarfs all of the goodness or badness in your algorithm. I once heard a story that the Microsoft C compiler used to have switches for "optimize for speed" (which would do things like unroll short loops) or "optimize for size" (which would try to squeeze the generated code into fewer bytes) but it was discovered that optimizing for size actually made things run faster also, because they used less memory.

And, thinking about it, memory usage is a) something you probably can get a better handle on in design and have a bigger effect on during design, b) something where if you have to make changes late in the game they might be very disruptive, and c) something, like security, where every little bit helps. So if I had to give someone guidance on how to optimize for quote-unquote performance during the design phase, I would tell them to worry about optimizing their memory usage during the design phase; they could worry about execution speed once the thing was running and they could put it in front of a customer (the one case where you would want to think about speed during design would be if you had a public API that had baked in some bad speed characteristics; the old DOS way of reading files in a directory, with its "find first/find next" type API, seems like one such case).
The other big piece of advice is that you do want to be able to figure out how your code is doing, so make sure your design allows you to measure the performance, whatever it is. This may just be baking in some stuff that can count how many times a method runs, and how much memory is used...which are probably things the compiler/IDE can help you with later no matter what you do in upfront design. But just in case, keep it in mind.
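In a language like Python, "baking in some stuff that can count" might be as little as a counting wrapper like this (a sketch with made-up function names; a real profiler does all of this better, but the point is you can get crude numbers almost for free):

```python
import functools
import time

def instrumented(func):
    """Count calls to func and accumulate wall-clock time spent in it."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            wrapper.calls += 1
            wrapper.seconds += time.perf_counter() - start
    wrapper.calls = 0
    wrapper.seconds = 0.0
    return wrapper

@instrumented
def parse_line(line):
    return line.split(",")

for line in ("a,b", "c,d,e"):
    parse_line(line)

print(parse_line.calls)  # 2
```

Afterwards `parse_line.calls` and `parse_line.seconds` tell you how often the function ran and how long it took, which is usually enough to tell you whether it's worth worrying about at all.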
I also once heard someone say that when you design an algorithm, you should have in mind a way to improve it IF it turns out to be the performance bottleneck. So I'm fine with that, as long as it's just a quick thought and the IF is understood. Yes, if my sort is the problem, we can replace it with a better sort...but chances are it won't be, and I'll have saved time. As the saying goes, never put off until tomorrow what you can put off until the day after tomorrow.
March 13, 2007
Workplace Advantage: We've Got Swatches!
After some delays due to permitting and planning, the Workplace Advantage remodel of the Engineering Excellence space is on track. I know some of you are rolling your eyes, but I personally am excited about this. We've got plans of the new space, and an actual date where we move out for construction (mid-May).
There will be a period where we are a bit nomadic; some groups in EE will have assigned temporary offices, but the dev team, at least, decided to not move into temporary space; we'll have a spot to put our desktop machines, but otherwise we will work from home, book conference rooms, squat in lounges, etc. I expect there will be a period of discovery as we seek out the most comfortable spots. Of course when we are teaching, we'll be in our regular classrooms.
The best sign this was really happening was that we were asked to choose among fabric samples for our couches. Since we are giving up some personal space, we get it back in the form of a mini-lounge in our team area, complete with coffee table and laptop stands. So, we need to choose the colors for all this. We had some slightly interesting fabrics to choose from. I was leaning towards a red-with-green-palm-tree and grey-with-dots combo, but the consensus in our team was for an exciting combination of beige, tan, ecru and sand. Oh well. You can take the developer out of the office, but you can't take the office out of the developer.
March 12, 2007
U-Dub Snub
March Madness is upon us, the time of year when all thoughts turn to the NCAA college basketball tournament. The selections for the tournament were announced yesterday, including surprising upstart the Washington State Cougars.
For teams that don't make the NCAA, there is the NIT, essentially a consolation tournament which takes the best teams that didn't get chosen for the NCAA. This ESPN article about the tournament describes it as "Once the NCAA picked the 65-team field for its tournament, the NIT had the pick of the leftovers, which included the four No. 1s and others such as Syracuse, Drexel, Kansas State and Washington."
The only problem is that Washington didn't get chosen for the NIT. The top 6 teams in the Pac-10 got into the NCAA and UW was 7th in the conference. It was basically a foregone conclusion that Washington would get into the NIT, given that they beat UCLA, USC, and Oregon and played in a power conference, but for whatever reason they didn't. This Seattle PI article describes the scene: the players got all ready for practice, then gathered around the TV to watch the NIT selections, to find out where they would be playing. Surprise! The answer was "nowhere". Clunk.
The ESPN article raises the possibility that people on the East Coast (including the NIT selection committee) simply forgot the Huskies existed, and the whole thing was just a big oversight.
March 10, 2007
Startup Urge
I recently heard of two people who are leaving Microsoft to start their own companies. One of them is Christopher Payne, VP of Search; the other is someone you have never heard of (yet!).
This may be a fatal character flaw, but I personally have no urge to start my own company or "do a startup" with some other people. I think the main reason is because my parents were/are of an academic bent. I know people whose parents impressed upon them, from an early age, the importance of being your own boss (and presumably nudged them in that direction, whatever their career success). But not mine, for which I am glad. I don't have any concerns that I am letting my parents down by drawing a salary, and I don't chafe under any built-in revulsion at working for "the man".
Understand, of course, that I'm not out there breaking rocks every day; the software industry in general, and my job in particular, allows me to operate with an impressive degree of independence. Still I have no sense that things would be any better if I hung out my own shingle. I view it more as a negative; having to worry about payroll and rent and equipment, prostrate myself to venture capitalists for money, argue with my partners over the couches on the corporate jet...who needs it! Yes, there would be some plusses, like being able to order up a Mariners suite for the season, but I don't view the fundamental "I am working for myself" as a particular benefit.
I was once talking to a Microsoftie about his dream of doing a startup; when I asked him what he wanted the company to do, he shrugged and explained that there were lots of ideas floating around; they just needed someone to run with them. I'm not saying never: I may one day start a small company. But it would be to pursue a specific idea, not because I was living the dream.
March 08, 2007
C#: Not Just for Breakfast Anymore
We are in the process of signing our oldest son up for junior high school, and one of the electives he could take is a C# programming class. This is a new class next year and I guess they weren't sure who to offer it to, so it's open to anybody (that being grades 7 through 9 at this school).
The description states "Students will learn the basis of C# programing language to create small programs and games. These programs will include how to author, compile, debug, and run Windows applications, console applications, and internet applications." The topics they cover include Writing Code to Handle Events and Set Properties, Using branching and Recursion, Getting to Know the .NET Framework, Obtaining Data From a SQL Server 2005 Express Edition Database, and Data binding Data to User Interface Controls. That's a pretty impressive list. I don't recall worrying about binding data to controls until at least 8th grade (all kidding aside, I did actually begin programming around 8th grade, and we got our first IBM PC towards the end of 9th grade. Then again, I was a geek).
It's not that the school is some tech mecca; this course is basically the alpha and omega of their programming curriculum. Since they take the students all the way to a working database-backed application, there may not be much left to teach beyond how to choose a VC for your mezz round.
The school does offer a class in graphics (mostly web-focused, it appears). There is also one on using applications such as Word and PowerPoint. Our son could probably teach the PowerPoint section: topics would include How To Rip Off Animations from YouTube for Fun and Profit, Looping Sounds to Annoy Your Parents, The Functional Efficacity of Purple-on-Green as an Eyesight Test, and Slide Transitions: For Great Victory.
March 05, 2007
Vancouver, Land of the SPUI
We went down to Portland last weekend. Actually we stayed in Vancouver, Washington, which is just across the (Columbia) river from Portland.
In Vancouver we had the pleasure (or annoyance, if it's nighttime and the lane markings are unclear) of driving through several SPUIs. What is a SPUI, you ask? Ahhh, it's a Single Point Urban Interchange. In short, it's a highway exit where instead of two sets of traffic lights, one on either side of the highway, you have a single traffic light in the middle. The diagram on the Wikipedia page hopefully makes this clear.
Vancouver has several SPUIs. The ones we visited were on S.R. 500 at the Thurston and Andresen exits (the first two exits west of Interstate 205) and on Interstate 5 at the NE 99th Ave exit.
The Kurumi page on SPUIs promises a link to a complete list of known ones, but the link is dead, so I can't check if the Vancouver ones are listed. I also saw a SPUI in Fairbanks, Alaska about 10 years ago--presumably holding the title of "world's northernmost SPUI".