Wednesday, September 23, 2009

Tech Ed 2009

So as I mentioned a few posts back, I went to Tech Ed New Zealand 2009 last week. Ever since, I’ve been struggling with whether to blog about it at all. I’ve had many great Tech Ed experiences in the past and was looking forward to going this year, but it turned out to be almost a complete waste of time, and I don’t really want to post a whole bunch of negative comments about it.

To be honest I’m not even sure why it was so bad, and I suspect that’s because there was more than one problem. One speaker admitted that content on .Net 4.0 and VS 2010 was reasonably light because beta 1 had been out long enough that most people who were interested had already seen it, and also because he believed beta 1 was terrible. The same speaker said that most years the Tech Ed event coincided with new releases, betas or otherwise, of products, so there was a lot of new material to talk about, but that wasn’t the case this year. The lack of interesting content meant I actually had three sessions where there weren’t any presentations I wanted to go to, and I ended up doing work instead.

Lack of new content was a problem too, but perhaps that’s a symptom of the times we live in rather than a fault of Tech Ed. Most of the ‘good’ sessions I went to, such as the one on parallelisation in .Net 4.0, and some of the TFS sessions, weren’t useful because I’d already seen everything that was presented… online. In fact, one of the presentations on TFS branching and merging I’d seen the week before in an MSDN video online, and it was much better done in the online version. The Tech Ed version of the demo was functionally fine but too simple (not enough data in the setup) and so didn’t really indicate why or when the feature was useful. In fact, my boss, who was in the presentation with me, was quite confused about why anybody would think the feature as demoed would be any good.

A second problem was that many of the sessions were insultingly simple. To use an ‘Australasian’ phrase, it was like being taught to suck eggs… when you’re already a professional egg sucker. One session spent around ten minutes explaining what code reuse was and different ways of implementing code reuse to a room of developers. Another session on Team Test and Lab Manager spent several minutes on a slide explaining why we should care about testing. Samples and examples were overly trivial (they always are, but seemed worse this year) and the rest of the time the speakers kept belabouring the same point over and over again. Often they spent two minutes or more on a concept or bullet point, when we’d got the message in about ten seconds.

Frustratingly, several of the sessions could have been really good, but failed to deliver. One session on DSLs and ‘declarative programming’ would have been excellent if the guys doing the presentation had gone into detail about the application they’d built, the pitfalls they’d run into, their solutions and so on. Instead we got little more than the same DSL/code generation information that has been covered again and again on the internet in web pages, videos and podcasts (like Dot Net Rocks). They gave us a tantalising taste of a really cool framework they’d built to run some cool looking applications they’d also built, and then showed us how we could build a three step workflow that added two numbers together and returned the result… there just wasn’t any link in the presentation between the tools they were talking about and the result they’d achieved. If they hadn’t shown us the tiny bits we saw of the applications they’d built, the true power of the tools they’d used would have been completely lost. What’s more, even though their system was heavily workflow based, this wasn’t the presentation where we found out the workflow engine/library in .Net 4.0 has been completely rebuilt in a new namespace with the old one left in for compatibility. I wonder if they even know!

Another talk on building applications with WPF started out promising but then completely missed the point and was inaccurate to a degree. The talk started out saying it was going to explain why Windows Forms programmers working in WPF are so unproductive (at least at first), and cover the experiences the presenter had while building his first few commercial applications using WPF. The presenter then went on to explain MVC/MVP a little bit and the problems inherent with them, then showed how to properly bind controls and adjust their appearance etc. at runtime in WPF to avoid these problems. The issue here is that he implied this couldn’t be, or wasn’t, done with Windows Forms… but it’s exactly what we do at work! There was nothing that required WPF in this part of the presentation even though that was implied. It was just good architecture, and again we were being taught to suck eggs. The only useful piece of advice was to design a ‘lookless’ user interface, and that if you didn’t you’d run into problems. Good advice, but not exactly new. He spent most of the session saying everything should be lookless, then he almost (but not quite) said the only benefit to designing lookless controls/user interface is to be able to change the look and feel later, and then he said that even if you never customised the look of anything you should still use WPF… but he didn’t explain why.

My last negative comment is that all the evaluations had been moved to being ‘online’, which is perhaps not a bad thing in itself, but there was no ‘mobile’ version of the site at which you posted feedback. If you didn’t have a laptop or weren’t prepared to connect your laptop to the open wireless network, then you were pretty much out of luck for posting feedback during the day. Of course, with my internet connection still not working at home, that meant I pretty much couldn’t post feedback at all.

So with that off my chest, here are a few positive/useful things:

The talk on parallelisation was very well presented. It would have been informative if I hadn’t seen it all online already, and the presenter was funny and entertaining… “There’s only three ways to use all the CPU power on a 16 core machine. 1. Write a very bad program. 2. Write a very good program. 3. Run Outlook”…

“Did you hear about the 486 talking to the Pentium?

486: What’s 2 + 2?

Pentium: 5.

486: That’s wrong!

Pentium: I know, but I was fast!”

Ok, it was funny in the presentation, I promise. Perhaps it was his accent.

I did however learn three things in this session.

First, it seems the TPL hasn’t just been added to the .Net 4.0 BCL as is, with everyone calling it a day… there have actually been updates inside the CLR to improve the threading abilities provided. This is apparently important for the second thing I learned: there are new debugging tools. Specifically, there is a new ‘Tasks’ window which shows running ‘Tasks’ (equivalent to the Task class in the TPL) and can indicate tasks that are deadlocked. Apparently the CLR had to be modified so that shared resources could be monitored in order to provide the deadlock detection. As well as the ‘Tasks’ window, the existing Threads window has also had some updates, so it now shows the stack trace for each thread right there as part of the thread’s details; no more having to change the active thread and leap over to the call stack window.
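For readers who haven’t seen the TPL’s task model, the basic idea is independent units of work handed to a scheduler, with results collected when they complete. The TPL itself is .Net, so the sketch below is only a rough Python analogy of the same idea (using the standard library’s thread pool), not the actual API the session demonstrated:

```python
# Rough analogy to the TPL's task model: submit independent units of
# work to a scheduler and collect the results. This is illustrative
# Python, not the .Net Task class the Tech Ed session covered.
from concurrent.futures import ThreadPoolExecutor

def busy_sum(n):
    """A small unit of work standing in for a 'Task' body."""
    return sum(range(n))

with ThreadPoolExecutor(max_workers=4) as pool:
    # Each submit() returns a future, much like starting a Task.
    futures = [pool.submit(busy_sum, n) for n in (10, 100, 1000)]
    # result() blocks until that task has finished, like Task.Wait.
    results = [f.result() for f in futures]

print(results)  # [45, 4950, 499500]
```

The deadlock detection mentioned above matters precisely because, once work is split into tasks like this, two tasks waiting on each other’s locks can hang silently; a debugger window that flags deadlocked tasks makes that failure visible.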

Finally, there is a really exciting feature built out of black magic, called ‘Debug History’. The Debug History is shown in a dockable tool window within the IDE, and effectively tracks the runtime history of your application, allowing you to revert execution to any prior point. I only saw about four minutes of this in the entirety of Tech Ed, and it was split over two different sessions so they demoed pretty much the same thing, but it was rumoured that you could put execution back at a prior point and all your memory would reset as it was at the selected point in time, expired threads would be restarted in the correct state and so on. I can’t imagine how many goats had to be slaughtered to make that sort of voodoo magic work. Only time will tell if it lives up to its promises.

In a talk on what was new in the .Net 4.0 framework we discovered that WF (originally WWF, referring to ‘Workflow Foundation’) has been completely rewritten. Oh no, you might cry, will our existing applications work on .Net 4.0? Is it backwards compatible? Yes, and sort of. It turns out the rewrite was done in a new namespace; all the existing 3.5 WF classes and types remain and work as they used to, but you’re probably not going to get much benefit out of that, so in reality you actually do need to rewrite your workflow applications to use the new types in the new namespace. Luckily for us, we don’t use .Net workflows.

Last, but not least, the Microsoft Test and Lab Manager product, which makes testers first class citizens of the SDLC and the TFS world, looks awesome. Not only can testers easily view, edit and create work items, there are new work item types for test cases etc. There is also a fantastically robust looking user-interface action recorder which can be used by testers and developers to play back the complicated sequences of actions required to reproduce problems, and with just a little coding these can be turned into automated user interface tests. When creating work items, videos of the testing, memory dumps for debugging, system information (such as CPU, RAM, available disk space, Windows & service pack versions etc.) and action recordings can all be attached (sometimes automatically) to the created tasks. This product, when released, will definitely be worth looking into for anyone who takes testing seriously.

And sadly, that’s about it. So that’s my last word on Tech Ed for this year. I hope others got more out of it than my boss and I did, although I heard a few other people grumbling so it sounds like we weren’t the only unhappy ones.


