Wednesday, November 14, 2007

Things that make me happy

I have a feeling that one of the tests of "real happiness" is to be aware that you are happy, rather than just "feeling OK". For the last few weeks I have been particularly aware that I am happy, and that things are going well for me (as opposed to last year, which was quite nasty). I am getting good feedback from customers and colleagues (they like my work), and I'm past that point in my new job where I didn't feel properly engaged. I've finally found floor covering for my kitchen that accords with the vision I had in my head, and I should be able to get it installed soon. My cabinet maker's sketches for my new sideboard are just what I wanted, and he thinks he can have it finished before Christmas. All good.

And Stephen Fry has started to write a blog and someone has had the happy thought of getting him to write a column for The Guardian (and if you don't know who Stephen Fry is, I can do no better than to refer you to his performances in productions such as Jeeves and Wooster and Cold Comfort Farm). To find that he is a geek in the very best sense, and verily a Mac person as well - what can I say? It's like a very large, gift-wrapped, completely undeserved present!

Closer to home, the lovely folks at Zengobi have issued forth version 4.1 of Curio, which includes the plugin necessary to make Curio documents visible in Leopard's CoverFlow and QuickLook, and numerous other enhancements. Love your work, people!

On a slightly downward note, I've been told that the LookSmart people are selling some of their assets to CNET, and that appears to include Furl. As my friend Alec Muffet says: "Back up your Furl databases, ASAP!" (his blog has the instructions). Unfortunately, Furl appears to be broken, and will not let me download the zip file of my saved content. Furl is aware of the problem, and one can only hope that there is someone there who can fix the system - they've laid off 25% of their staff in the last quarter, and in my experience engineers always get shafted long before managers.

Tuesday, November 13, 2007

On the slow corruption of the English language

I've been meaning to have a little whinge about this subject for a while, and since I have 10 spare minutes, here it comes. First let me state that I am English by birth and Australian by naturalization. I lived in England until I was nearly 12, when my family emigrated to Australia, so I did part of my education in each country. I've always loved the written word in almost any form, which is why I originally became a librarian. However, the years have passed, I somehow drifted into IT, and now I seem to read more words on computer screens than I do on paper.

The general standard of written English is not improving, it is getting steadily worse, and I do not understand this. Computers are particularly intolerant of errors in both spelling and grammar. Anyone who has ever mistyped a variable name, or made a syntax error in a bit of code, will know that this is true, and not likely to change any time soon. I was once called out to a site to diagnose a problem with a Unix system. Some piece of software that the system administration folks were using to automate account creation was refusing to load on this machine, and things had reached the finger pointing stage between the operating system vendor and the application vendor. I poked about for a few minutes, and realized that some fool had edited the /etc/passwd file and changed a colon (:), the normal field delimiter for that file, to a semi-colon (;). The application software was reading the file, and choking on the unexpected character. These things matter to computers.
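That incident can be sketched in a few lines. This is an illustrative reconstruction, not the actual software involved: the parser, field count check, and sample entries are my own assumptions about what a strict /etc/passwd reader would do.

```python
# Illustrative sketch: /etc/passwd lines are colon-delimited, seven fields.
# A parser that insists on exactly seven fields chokes the moment a single
# delimiter is wrong - which is roughly what happened at that site.

def parse_passwd_line(line):
    """Split one /etc/passwd entry into its seven colon-delimited fields."""
    fields = line.rstrip("\n").split(":")
    if len(fields) != 7:
        raise ValueError("malformed passwd entry: %r" % line)
    name, pw, uid, gid, gecos, home, shell = fields
    return {"name": name, "uid": int(uid), "gid": int(gid),
            "home": home, "shell": shell}

good = "alice:x:1001:1001:Alice:/home/alice:/bin/sh"
bad = "alice:x:1001;1001:Alice:/home/alice:/bin/sh"  # one colon edited to ;

print(parse_passwd_line(good)["uid"])
try:
    parse_passwd_line(bad)
except ValueError as e:
    print("choked:", e)
```

One mis-edited character turns a seven-field line into a six-field one, and the strict parser rejects the whole entry - exactly the kind of protocol error computers will not forgive.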

So if we all work with these machines, strict enforcers of specific rules for spelling and grammar, why is the increasing sloppiness of the written word as we use it to communicate with one another tolerated? Written language matters, people: it is a communications protocol that we use to make sense of one another, and heaven knows that can be hard enough without having to cope with protocol errors.

And don't tell me you use a spell checker! A spell checker is worthless unless it is used intelligently, and by someone who understands their own language well enough to know when the program is wrong. I used to work for a senior manager who had the unfortunate habit of spell checking his written communications, and just accepting whatever the program suggested: this frequently resulted in hilariously nonsensical sentences, and the need for clarifications.

My current pet hate is the confusion that seems to have arisen around the words troll and trawl. Try this: go to Google, and search this phrase "troll through my archives". About 60 hits on this combination, and there are variations on this theme: try "trolling through my archives", for another 77.

Let us get this straight once and for all: if you are hunting through an archive, you are searching. You could say dredging, or hunting or trawling. These are all fine and appropriate words, which convey the sense of sifting through a collection of objects. What you are not doing is trolling. Trolling (which seems to have come down as a corruption from either Middle English or High German) means either to fish with a baited line - which is hardly comparable to a thorough search - or to stroll about. Trawling is a fishing activity, typically involving large nets which collect everything in their path, and the word has acquired the additional meaning of thorough searching.

Unfortunately the word "troll" already has a well established meaning across the Internet - those pathetic souls who derive personal gratification from stirring up flame wars by posting deliberately inflammatory comments online. And since most of the planet has apparently seen Peter Jackson's version of the Lord of the Rings, you would think that most people would have a mental image of a troll that did not include a capacity for patient searching as part of the feature set. This usage of the word "troll" derives from Norse mythology, and refers to something ugly that lives in a cave, and seems quite apt for the sort of person whose most effective method of gaining personal attention is to annoy strangers.

I realize that all living languages evolve and mutate: one has only to look at the new words that have entered the dictionary in the last few years to understand that (spyware, ringtone, biodiesel - all recent additions). But the very richness and flexibility of the English language can only be maintained if we keep plenty of distinctive words in use. If we manage to collapse "troll" and "trawl", a little bit of colour and vitality drains away.

So let's keep our cave dwellers and our fishing nets separated, shall we?

Sunday, November 11, 2007

VMware Fusion: OS virtualization for Mac OS X

One of the biggest buzzes in IT at the moment is around virtualization. Server virtualization works like this: most computers in data centers are not really very busy: many of them are actually less than 15% utilized most of the time. There are many reasons for this. In the past, as a company started a new project, it was common to buy a complete set of new equipment for that project: a new web server, new application server, new database server, and in some cases two or more of each for redundancy. Plus firewalls and load balancers and such. I've built lots of these setups and they are sold like this for three main reasons:

1. because it allows the purchaser to attribute all blame for anything that goes wrong to the vendor who sold "the solution"; this is known as "the one-throat-to-choke principle".

2. because whoever is funding the project wants to "own" the kit, and not share with anyone else; this is common in environments that aren't mature enough to have figured out how to bill for compute time and facilities as a service. We call this phenomenon "server hugging".

3. often no one has a clear idea of how the new environment will perform, what sort of load it will carry, or even if the service it provides will be popular enough to pay for the deployment. There is a desire to protect the known, stable part of the infrastructure from the new project, in case something goes wrong during the course of the build. This is warranted - I recall several new deployments that had unforeseen and regrettable impacts on the existing systems in the data center. If you don't have a good test environment, the only mitigation of this risk is physical separation.

So new systems go into production, and sit there at perhaps 15% utilization, often much less, consuming electricity and generating heat 24 hours a day, 365 days a year. This is a huge waste of money, and environmentally irresponsible. It's been common in the past because in the majority of companies, the IT department does not pay for electricity: power and cooling come out of another budget, so IT has no incentive to do anything about it. It's not even terribly obvious if your servers are collocated in someone else's data center: there, you are typically charged for rack space, and sometimes per CPU, and the data center owners will have factored power and cooling into their charges. But business owners are beginning to be aware of these details, and the accountants are sharpening their knives. If green computing can save money, they are all for it.

So if you take 4 machines that are only 15% utilized, and merge their workloads together onto one machine, that one machine will be a lot busier, but it will occupy less rack space, consume less electricity and produce less heat. When IT vendors wax lyrical about "green computing", this is generally part of what they are talking about.
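The consolidation arithmetic above is easy to sketch. The 15% utilization and the four-into-one merge come from the text; the per-server wattage is a round number I have assumed purely for illustration.

```python
# Back-of-the-envelope consolidation arithmetic. The 15% utilization figure
# is from the text; the 400 W per-server draw is an assumed illustrative value.

servers = 4
utilization = 0.15           # average busy fraction per physical machine
watts_per_server = 400       # ASSUMPTION: steady draw per beige box
hours_per_year = 24 * 365

# Merge four 15%-utilized workloads onto one host:
combined_utilization = servers * utilization     # one much busier machine

kwh_before = servers * watts_per_server * hours_per_year / 1000
kwh_after = 1 * watts_per_server * hours_per_year / 1000

print("combined utilization: %.0f%%" % (combined_utilization * 100))
print("kWh saved per year: %.0f" % (kwh_before - kwh_after))
```

Even with these toy numbers, three boxes' worth of around-the-clock power and cooling disappears, which is the saving the "green computing" pitch is really about.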

Of course not all applications are suitable for virtualization: anything with very high I/O requirements should be tested carefully. Some applications just do not play well in a shared environment, and you need to consider what your security policies say about the separation of various services and components before you start consolidating things. Consider also that if you are going to put a lot of eggs into a single basket, it should be a very good quality basket, not some nameless beige box that you got cheap from a friend of a friend. But with sensible planning, testing and resourcing, server virtualization can save a great deal of money.

The leading vendor in this field is VMware, who produce the VMware Server products. There are other players, including Xen, IBM, Red Hat and Sun, but VMware (to the annoyance of their competitors) was first to market with a robust solution, and they have both market and mind share that others can only dream about. Their products are good, their support excellent, and as a vendor they are a joy to deal with. So when I found that their VMware Fusion product, VMware for Mac OS X, was about to be released, I was curious to see what it had to offer. I'd been dimly aware of the beta, but never looked at it, and the final release is an Intel binary, so when my new MacBook arrived in August I downloaded the demo and gave it a whirl.

Now I must confess to having followed standard engineering practice for installing this software:

1. Download software
2. Glance at the first page of the vendor's doco, conclude that it looks simple, toss doco aside.
3. Install software by following the prompts and doing what seems obvious.
4. Reach the point where it prompts for a Windows installation CD.
5. Go round to the IT department and borrow media and software keys from those enormous folders of CDs that Microsoft ships people who take its products seriously.
6. Stick Windows XP disk into MacBook's DVD drive while thinking "what am I doing?"
7. Go through the usual messy and disgusting procedure of installing XP, Office, Project and Visio, and applying enough patches to choke a horse. I had to refer to the doco briefly at this point, to work out how to have the Mac send CTRL+ALT+DEL to the Windows virtual machine.
8. Watch XP boot in a window on my Mac desktop and realize with horror that I have loaded a petri dish onto my beautiful Mac.
9. Scrabble hastily for a distro of our preferred anti virus software, and get it going before something bad happens.
10. Tinker for a few minutes, to get the virtual machine to authenticate to Active Directory, and get a print queue and some file shares.
11. Build another virtual machine (VM), this one running Solaris 10 (you'll need the x86 media, not SPARC).
12. Download a pre-built VM of Ubuntu Linux from the VMware appliance site.
13. Start all three VMs simultaneously in different windows.
14. Try to engage the enthusiasm of the person at the next desk in how utterly cool this is.
15. Conclude that I shouldn't have tried step 14 on a business development manager.

Now you may well ask "why would I do this?", or "why not use Parallels or Boot Camp or whatever?" Taking those questions in reverse order:

Boot Camp gives you a dual boot machine: you have to stop one operating system to start the other one. Life is too short.

Parallels only supports 32 bit guest operating systems, it's a notorious resource hog, and I have heard horror stories about its stability, or lack thereof. Fusion is highly configurable, doesn't bog my machine down, and is stable. Also, Parallels doesn't have very many pre-built virtual appliances available. VMware has hundreds, many of them free, including VMs of all sorts of operating systems and networking kit. This gives me the opportunity to tinker with things that I might otherwise never get to try.

Furthermore, VMware is a large and successful company: they have recently completed their IPO, and are going from strength to strength. The other players in this space have interesting products, but big businesses like to buy software from other big businesses: it gives them a cosy feeling that there will be ongoing support, if they need it. And since I have a lot of customers who use VMware in their production environments, it's handy for me to use some of the software myself: helps me stay current on the products.

For me the ability to run multiple operating systems on the Mac is wonderful. For example, I'm in the throes of updating a training course that I wrote years ago, 'The General Introduction to Unix', which I am generalising (it was originally written for SCO Unix and Solaris in the late '90s) to cover Solaris, Linux and Mac OS X. I need to check every bit of syntax, update screen shots, and in some cases remind myself how things work in different operating systems. Not having to power up the old Ultra 10 in my study to look at Solaris saves time and electricity.

Then there are the cases where you need to run some piece of software that will only run on Windows. To be honest, the only thing I really need on Windows is Visio. Anything else either runs natively on a Mac, or there is a better alternative on a Mac. Visio however I do use, and it is pretty much the standard diagramming package used across the industry - you even see it as a required skill set in job descriptions. I use OmniGraffle on my Mac, but until OmniGraffle can give me a Size and Position dialog like Visio's, I will never be able to get the level of precise control that I need. Also, customers frequently send me Visio diagrams and just assume that everyone can read them, because Visio is "the standard". And it is a good piece of software (I note that Microsoft bought the product, they did not create it).

Furthermore, running Windows in a VM on my Mac allows me to get past a problem that has annoyed me since the day I started this job: the local IT folk refuse to add my Mac to Active Directory, so I have never been able to map a file share or print (and yes, I have tried direct IP printing; we have a particularly nasty brand of printers that do not support direct IP printing from Macs; the vendor's published "workaround" is "do not print from Mac". I kid you not). But as soon as the VM of Windows booted, AD recognized it, and allowed me to map drives and set up a print queue. The VM looks just like a PC to AD, and AD treats it as such.

And it just works. A few weeks back, I took my Mac to a training course, and when I got it out, the instructor's face fell. The software we were supposed to be learning had a client interface that ran on Windows, and he didn't think it would work on a Mac. So I started my Windows VM, loaded the client, and it worked perfectly (and the other people on the course were jealous, because they had the usual Windows based laptops: slow, ugly and flaky). I had to allocate a bit more memory to the Windows VM for the duration of the course (the client software was a real hog), but once I had done that, my virtual Windows box could easily keep up with their physical ones.

Fusion allows you to emulate all sorts of things, which is handy if you need to do demonstrations. I recently attended a presales demo by an antivirus vendor. The customer wanted to buy antivirus software for their fleet of Microsoft machines. The antivirus presales guy turned up to the demo with a Mac, started up a horde of Microsoft VMs, and demonstrated his products using them. It worked brilliantly - he had a demo lab in a box, something that it would have taken days to build physically (and he closed the sale, against stiff competition).

The arrival of Leopard has made using Fusion even easier, because I can park the VMs in a Spaces pane until I need them, and they don't even clutter up my desktop. If you haven't tried Fusion, there is a 30 day evaluation version available from the VMware website: give it a try, I think you'll like it.
