I've decided to up and move this blog to wordpress.com
For all future posts please visit my new blog at techblog.wordpress.com
Friday, 1 June 2007
Blog or not blog?
Personal Web sites have been around for a very long time. Running commentary on subjects such as movies and sport isn’t a good enough reason to call your Web site a blog, and I'm not sure this site qualifies as a blog either, regardless of the fact that I'm using blog software!
As Wikipedia states
A blog (a portmanteau of web log) is a website where entries are written in chronological order and commonly displayed in reverse chronological order. “Blog” can also be used as a verb, meaning to maintain or add content to a blog.
Blogs provide commentary or news on a particular subject such as food, politics, or local news; some function as more personal online diaries. A typical blog combines text, images, and links to other blogs, web pages, and other media related to its topic. The ability for readers to leave comments in an interactive format is an important part of many blogs. Most blogs are primarily textual, although some focus on art (artlog), photographs (photoblog), sketchblog, videos (vlog), music (MP3 blog), audio (podcasting) or sexual topics (Adult blog), and are part of a wider network of social media.
The bit that interests me most is, “The ability for readers to leave comments in an interactive format is an important part of many blogs.”
Calling a Web site which allows two-way conversations a blog is a way of benchmarking time, technology and/or techniques. The very same can be said for Web 2.0. Although Web 2.0 principles have been around longer than the term itself, the term lets us talk about the same stuff, well, sort of. Most of the confusion, in my opinion, appears to be about whether people are talking about technology or marketing.
One could argue that if your comment is worthy enough, it should in fact warrant its own post on your own blog, thereby removing the need for comments in the first place. However, this is a cop-out and doesn’t counter my argument for using the term ‘blog’ when referring to Web sites that enable comments from readers.
Seth Godin, Dave Winer and Russell Beattie are just three people who call their personal Web sites blogs. Perhaps people like Winer can get away with it, as readers are very likely to write posts on their own blogs and then link back to his original article.
Perhaps a few people should be given a ‘get out of jail free’ card? Can they get away with being an exception to the rule?
Friday, 25 May 2007
Why "free" isn't important
In one of the best posts yet from one of my favorite ZDNet bloggers, Adrian Kingsley-Hughes offers Five crucial things the Linux community doesn’t understand about the average computer user. It’s a great primer on the significant difference in mindset between hobbyist and uber-geek computer users and the “average user”.
Here are the five things. Please do read the post and the unbelievably long comment thread (or at least some of it – it does end up getting kind of repetitive).
- On the whole, users aren’t all that dissatisfied with Windows
- Too many distros
- People want certainty that hardware and software will work
- As far as most people are concerned, the command line has gone the way of the dinosaur
- Linux is still too geeky
The PC market is extremely cut-throat. It has to be, because consumers will go to great lengths to save a few bucks when buying their latest system. But it seems that this thriftiness hasn’t resulted in hordes of users choosing to buy PCs without Windows installed and putting Linux on them instead. In fact, there are plenty of users who would rather break the law and install pirated copies of Windows than go the legal route and install a Linux distro. On the whole, most people would rather spend the money on Windows (or Mac) than take the time to experiment with Linux.
A followup to the post was just published. Point number three is the essence of what Platform Agnostic is all about:
Why?
It seems that a lot of people are wondering this. Since starting to dabble in the world of Linux I’ve seen this question posed on innumerable websites, forums and blogs. Why is it that, when consumer satisfaction with Windows is at a low (at least according to many in the pro-Linux community), Linux’s market share is so low? It’s pretty sad, but beyond a certain small segment of computer users, you can’t give Linux away.
Chill out. It’s just an operating system.
I’m not sure whether it’s just that there’s a small subset of the Linux community which is both aggressive and vocal or whether the problem is much broader, but this is a major turn-off for people considering making the transition to a Linux OS. Even back when Mac communities were considered by many to be pretty hostile and unfriendly places, Steve Jobs was clever enough to make sure that this kind of fanatical nonsense didn’t make it onto the Apple site and sales literature (although Apple is perfectly capable of coming up with its own fanatical nonsense, at least it’s not that aggressive). Negative campaigning seems to work for political parties but it doesn’t work for Linux - and the numbers prove this.
Seriously, given the passion behind some of the comments I come across from some Linux users, you’d have thought I was talking about something with life-or-death importance like a heart machine and not an OS.
Brilliant job Adrian!
Sunday, 25 February 2007
Gmail as a personal hub
I was lucky enough to get in on the Gmail beta when it launched and I haven't looked back since. Even though I've had an account for almost three years and I get over 100 emails a day, I have chewed up only 18% of the generous 2.8 gigabytes of storage.
However, in recent weeks I have started using Gmail as much more than an email host. With its gobs of storage, speed and tremendous search/tagging capabilities, you can transform it into a personal nerve center that's available from any computer or mobile device. When you tap into this power and combine Gmail with some other tools, it is perhaps the most essential site ever developed. Most of the following life hacks have not been documented.
This series has several parts...
- How to turn Gmail into a massive personal database (Gmail + the Google Toolbar)
- How to get real-time news updates in Gmail (Gmail + Google Talk + Twitter)
- How to automatically store your bookmarks in Gmail (Gmail + del.icio.us + Yahoo Alerts)
- How to manage Calendar and To-Dos in Gmail (Gmail + Backpack + GCal + GTalk + iMified)
- How to blog from Gmail (Gmail + Wordpress/TypePad/Blogger + IMified)
Using Gmail as a Massive Database
I revel in information. Can't get enough of it. I like that I get a lot of email. I scan 275 RSS feeds in Google Reader and I use dozens of bookmarklets and shortcuts to help me manage it all.
Every day I come across something on the web that I want to save for future reference. While previously I used Yojimbo to manage all of this information, I found that solution wanting, since I travel a lot and need to access my bits from a mobile device. Google Notebook also doesn't work on a mobile device, and its search functions are rather lacking. Enter Gmail and the Google Toolbar.
The latest version of the Google Toolbar has a send-to-Gmail function. Select some text or graphics, right-click on it and send it to Gmail. The Toolbar then automatically feeds it into a new message.
Now, when I find something I want to save I use this feature and send it to a secret contact in my address book. This is basically a steverubel+[secretphrase]@gmail.com email address (Lifehacker explains the value of these here).
Once the article arrives in my Gmail inbox, a filter whisks it away into the archive and tags it with an @Database label. Further, I am toying with having the same filter also forward these to a premium Google Apps account that has 10 gigs of space. Now all I need to do to call it up later is enter label:@Database and a keyword. Whammo - an instant personal database.
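The plus-address trick that makes the "secret contact" work is easy to see in miniature. Gmail delivers anything sent to user+whatever@gmail.com into user@gmail.com's inbox, but the full To: address survives, so a filter can match on it. A tiny sketch (the user name and secret phrase here are placeholders, not real addresses):

```python
def database_address(user: str, secret: str) -> str:
    """Build a Gmail plus-address. Gmail ignores everything after the
    '+' for delivery, but a filter can still match the full address."""
    return f"{user}+{secret}@gmail.com"

# Mail sent here lands in someuser's normal inbox, where a filter
# matching to:(someuser+clippings@gmail.com) archives and labels it.
print(database_address("someuser", "clippings"))
```

The same idea lets you hand out distinct addresses per purpose (clippings, receipts, newsletters) and route each one to its own label with its own filter.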
Monday, 4 December 2006
Love in Action - I don't think so!
I just heard that the pastor of a church I once attended, the Wirral Christian Centre, libelled me in a book he wrote way back in the late '80s! The book, called 'Love in Action', was written by Pastor Paul Epton of the Wirral Christian Centre.
I haven't decided yet what to do about this legally; I may just let sleeping dogs lie. The man who wrote the book is a quite horrible and manipulative person who disgusted me with his egomaniacal behaviour back when we were starting the Wirral Christian Centre. A quick web search shows that the man is still abusing people at his church in Birkenhead. I found several websites, including this one about the Wirral Christian Centre.
On page 10 of the book Paul Epton calls me "an inveterate womanizer." He goes on to write, "He wasn't the most troublesome member of the church by any means, but he pestered the women constantly." Later, on page 18, Epton describes me as "paying too much attention to women." To say I am shocked to see myself described like this is an understatement. But to be honest, I doubt any kind of cruelty by Pastor Paul Epton would surprise me; the man is a fraud and a manipulative empire builder. He was supposed to have a 5,000-seat church by now - whatever happened to that "promise from God", I wonder?
The book is out of print these days and the publishers have long since closed down. A friend of mine sent me a PDF of the book, but I was also able to find a second hand copy for sale on Amazon.co.uk. I suspect this was a self-financed book by Epton, distributed in very small numbers to the faithful few many moons ago. Even so, I am extremely angry that the man who is supposed to be a Christian would write such a wicked and vile work of fiction presented as fact.
I'm glad I left the Wirral Christian Centre. Being reminded that Paul Epton is still out there abusing his position of influence and trust makes me very, very sad indeed.
You can read the book here.
Sunday, 6 August 2006
Bye bye Visual Basic
Well, it’s been less than two days since the MacBU announced that Visual Basic is being removed from the next version of Mac Office. The news has created quite a firestorm on many Mac forums (I’ve been scanning MacNN, Ars Technica, and a few others) and I received some very strongly expressed opinions about it in comments on yesterday’s post. I’d like to take some time to express my own views and experiences on the removal of Mac VB.
I should clear up one misconception, which has been making the blog and comment rounds, about how the VB removal affects existing macros. The removal of VB means that existing macros in Office documents will be round-tripped across file open and save, but you will not be able to edit them and you will not be able to run them on the Mac. Even previously compiled macros will not execute, because they were compiled to PowerPC code that conforms to an older binary interface.
I want to say right up front that the MacBU is very aware of the pain this decision will cause for users, consultants, and enterprise organizations. I’ve personally seen the phrases “apoplectic with rage” and “absolutely livid” in two emails that crossed my inbox. Some people made comments on my post yesterday that were expressly clear about how this decision would drive them to one of the free Open Office variants instead of buying Mac Office 12, and other posts in other forums made similar statements. I’m sure some people will indeed decide that lack of VB is an absolute deal-breaker and they will plan to use other software. I’m truly sorry if that is the case for you.
The MacBU did not make this decision lightly. I personally spent several weeks digging into the VB source code to identify and plan what work would have to be done to move it to Xcode and make it universal, and I had several long discussions with our product planning folks to help our group leadership weigh the costs of doing the VB port vs. the costs of not doing it. I’ll try to lead you through some of the analysis here.
From my perspective, Mac Office has two primary driving requirements:
- it must be as Mac-like as possible, use Mac features, and take advantage of the Mac operating system, and
- it must be as compatible with Win Office as possible, and share as many features and commonalities as it can.
(We’ve got other requirements and product visions, but as I see it, they really act to refine these two basic needs.) As you may imagine, these two goals are many times not perfectly aligned. In the worst cases, they may actually be diametrically opposed, and we have to wrestle with making the best decision we can, knowing full well that whichever way we go it will help some users and hurt others. This VB decision is one where we’re truly caught between the Mac rock and the Win Office hard place.
VB on the Mac exists for cross-platform compatibility. There is no other software on the Mac that also uses VB, so it doesn’t help Mac Office integrate with other workflows based purely on Apple solutions. Thus, any work we do on VB only serves to satisfy one of the two major requirements. Doing that work then means we have less developer time to continue to improve Mac Office’s use of Apple-specific technologies (or tools, such as Xcode.)
Let me describe for you some of the technical challenges that would be involved were we to try to port VB to Xcode and to the Intel platform. For those of you reading who are not developers, bear with me for a little bit. Hopefully you’ll at least get a sense of the scope of work even if you don’t quite follow the nitty-gritty details.
VB on the Mac is really three parts: VBE (the editor), VBA (the execution engine) and Forms (the buildable windows and controls you edit in VBE and see when running a macro.)
VBE is pretty standard C++ code. However, the code is generally very old — it was originally designed and written several years before I came to Microsoft in 1996. VBE contains the top-level parser that converts the text of a macro into a series of mostly machine-independent opcodes (kind of like Java bytecodes, but not exactly the same). Thus you can’t just hook an external text editor up to VBA, because of the upper-level dependency. The VBE code actually isn’t too hard to port to Intel, but it is tricky to port to Xcode/GCC because of the age of the code. As I mentioned in an earlier post, GCC is very picky about code meeting the current standards and the VBE code most certainly does not. That’s not to say the code is ‘bad,’ it was just designed and written long before current modern C++ standards.
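To make the "macro text to machine-independent opcodes" split concrete, here is a toy sketch of my own in Python. It is nothing like VBA's real opcode format, but it has the same broad shape: a parser turns an expression into stack-machine opcodes, and a separate engine executes them, mirroring the VBE/VBA division of labor.

```python
import re

def compile_expr(src):
    """Compile an infix expression over lowercase names, integers,
    '+' and '*' into stack-machine opcodes (shunting-yard)."""
    prec = {"+": 1, "*": 2}
    out, ops = [], []
    for tok in re.findall(r"\d+|[a-z]+|[+*]", src):
        if tok in prec:
            # pop higher-or-equal precedence operators first
            while ops and prec[ops[-1]] >= prec[tok]:
                out.append(("OP", ops.pop()))
            ops.append(tok)
        elif tok.isdigit():
            out.append(("PUSH", int(tok)))
        else:
            out.append(("LOAD", tok))
    while ops:
        out.append(("OP", ops.pop()))
    return out

def run(opcodes, env):
    """Execute the opcodes against a dict of variable values."""
    stack = []
    for kind, arg in opcodes:
        if kind == "PUSH":
            stack.append(arg)
        elif kind == "LOAD":
            stack.append(env[arg])
        else:  # "OP"
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if arg == "+" else a * b)
    return stack[0]

print(run(compile_expr("a + b * 2"), {"a": 1, "b": 3}))  # 7
```

The `run` function here simply interprets the opcodes; as the next paragraph explains, the hard part of the real VBA engine is that it does not interpret them at all.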
VBA, on the other hand, is incredibly difficult to port to Intel. The execution engine basically runs through the previously mentioned opcodes and, well, executes them. The hard part is that ‘executing’ them doesn’t mean interpreting them; it means converting one or more at a time into a block of assembly generated at runtime that looks and behaves like a regular function that can be called directly by other normally compiled code. This is in essence ’self-creating’ code, and VBA is constantly flushing the CPU’s code cache in order to mark these chunks of data as executable.
VBA’s generated code must adhere to the Application Binary Interface of the host platform (historically PowerPC and the Code Fragment Manager). This means register allocation, stack alignment, parameter passing locations, etc. VBA is basically a compiler that emits code at runtime. It does so by running a large state machine that tracks PPC register usage, stack location, the mapping between PPC registers and VB variables, etc., and then concatenates large blocks of pre-generated assembly together. VBA subsequently tweaks the assembly bit-field by bit-field to do things like assign registers to each opcode, set branch addresses, and create transition vectors for all function calls.
The templates are very PPC- and CFM-specific, and the state machine is designed for architectures that allocate static stack frames and pass parameters in registers, unlike Intel, which has dynamic stack frames (you can push and pop data to/from the stack any time you want) and passes parameters on the stack. So, for us to port this to Intel we’d have to rewrite the entire state machine and create brand-new templates of IA-32 code. That’s basically writing a rudimentary compiler almost from scratch (we’d at least have the initial parsing and machine-independent opcodes already done.)
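The "self-creating code" idea - writing bytes into memory, marking them executable, and calling them like a normal function - can be demonstrated in a few lines. This is only an illustrative sketch for x86-64 on a POSIX system (the bytes encode `lea eax, [rdi+1]; ret` under the System V ABI), nothing resembling VBA's PPC/CFM templates:

```python
import ctypes
import mmap

# x86-64 machine code for: lea eax, [rdi + 1]; ret
# i.e. "return first argument + 1" under the System V calling convention.
CODE = bytes([0x8D, 0x47, 0x01, 0xC3])

# Allocate a page we can write to and then execute from.
buf = mmap.mmap(-1, mmap.PAGESIZE,
                prot=mmap.PROT_READ | mmap.PROT_WRITE | mmap.PROT_EXEC)
buf.write(CODE)

# Wrap the raw address in a C function pointer and call it directly,
# just as normally compiled code calls VBA's generated blocks.
addr = ctypes.addressof(ctypes.c_char.from_buffer(buf))
add_one = ctypes.CFUNCTYPE(ctypes.c_int, ctypes.c_int)(addr)
print(add_one(41))  # 42
```

Even this four-byte stub is ABI-specific: the argument arrives in `rdi` and the result leaves in `eax` only because the System V ABI says so, which is exactly why VBA's PPC/CFM-shaped templates cannot simply be reused on Intel.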
Again, this is all a design that long predates me or most of my peers in Mac Office, and is code that we inherited when we created the MacBU (i.e, none of us wrote it in the first place.) There’s nothing inherently bad about the code, it was just designed for the constraints of the day and that design simply doesn’t lend itself to being architecture-independent.
Some folks might ask why we don’t just port the Win Office VBA over to the Mac. Well, VBA circa Win Office 97 (which is the closest Windows VBA to what we have on the Mac) doesn’t implement its execution engine this way at all. Instead, it has tens of thousands of lines of IA-32 assembly that directly implements all of the opcodes. That assembly conforms to the Windows Intel ABI, which is different from the Mac ABI in several important ways (the specifics of which are described here.) Also, the assembly is in MASM format, which is close to but not the same as the syntax accepted by GCC’s assembler. So, we’d have to edit the source to be compilable by GCC, and scrub it line-by-line to find and adjust the parts that aren’t compliant with the Apple Intel ABI. We’d also end up with two completely different implementations of VBA (PPC state machine and Intel straight assembly) that we’d have to maintain and keep in sync. That would be horribly bug-prone.
Lastly, we have Forms. Forms is also C++, but is backed by several thousand lines of gnarly custom assembly. This assembly ‘allows’ the C++ code to swap object virtual function tables and individual member function pointers between objects on the fly, to essentially do very fast object morphing. To do so, the assembly has to have specific knowledge of aspects of the C++ compiler (vtable layout, implementation of ptrs-to-member-functions, etc) and has to work in lockstep with the compiler. I spent almost two weeks massaging this code to try to make it compatible with just the PPC Mach ABI, which is only slightly different from the PPC CFM ABI. Even after all that work, I still didn’t get it completely right and internal builds had some really bad stability problems. We also don’t even have the Win Office 97 Forms source code, so I was not able to compare our code to how it was implemented for Windows.
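For readers who want a feel for what "object morphing" buys you without the assembly, here is a safe analogue of my own in Python (not how Forms actually works): reassigning an instance's `__class__` swaps its entire method-dispatch table in one step, much as the Forms assembly swapped C++ vtables on the fly.

```python
class Placeholder:
    def draw(self):
        return "placeholder"

class Checkbox:
    def draw(self):
        return "checkbox"

w = Placeholder()
print(w.draw())          # placeholder
w.__class__ = Checkbox   # one assignment re-points all method dispatch
print(w.draw())          # checkbox
```

Python can offer this safely because dispatch goes through a documented class pointer; doing the equivalent in C++ means poking at vtable layouts the compiler never promised to keep stable, which is why the Forms code had to track the compiler so closely.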
I just noted that the assembly has to work hand-in-hand with the normal C/C++ compiler. That wasn’t too much of a problem when we were using CodeWarrior, as the C++ compiler only changed in small ways every few years or so. With Xcode and GCC, my understanding is that Apple has to merge in all the changes that external developers commit to GCC, so we run the risk of GCC changing much more frequently. That might not be a problem in reality, but the risk is non-zero and we have to take it into account.
One final problem is that all of this custom assembly is currently PPC 32-bit, and even the corresponding Windows assembly is Intel 32-bit. If we ever want to make a 64-bit native version of Office, any work we might do to solve all of the above problems would have to be done all over again.
So, in short: VB has lots of code and assembly that specifically assumes it is running on a PPC with the Code Fragment Manager, and to re-do it for Intel would involve writing a rudimentary compiler and relying on private compiler implementations that are subject to change at any time.
Whew, that’s a lot of technical stuff. I hope it provides some idea of the scope of work we were facing. We estimated that it would take roughly two years of development time to move it all over to Xcode and to Intel. That would mean two more years before the next version of Mac Office made its way to consumers. In the meantime, Leopard will ship and Mac Office 2004 would still be running in Rosetta. Win Office 2007 and the new XML file formats will be ever more common. All Mac Office users would still be stuck with the old formats, unable to share in or use the great expansion of capabilities these new file formats bring. During that time, we’d also not be adding any other items our users have asked for.
Beyond that, if we were to port VB over to Intel in those two years, what you’d end up with is VB for Mac just as it is today. It still wouldn’t be feature-comparable to VB in Win Office, and the object model in Mac Office would still not be the same as the one in Win Office. That means that your macros would still be restricted to the same set of compatible items as you have today. Over the last 10 years, the Win Office programming model has become very different from that of Mac Office. We’ve tried to keep the object models in sync for the features that we have ported from Win Office, but we haven’t ported everything.
So, given that the developer cost was huge, that the consumer cost due to the delay while we did the work was quite large, and that the end result would be no better than what we have today, we made the very difficult decision to invest our time and resources in the other pillar of Mac Office, namely taking advantage of Apple tools and technologies to be more ‘Mac-like’. We’ve continued to improve the AppleScriptability of our apps (many many bug fixes post-Office-2004) and as announced are looking into adding some Automator actions to the suite. We’ve completed the rest of our transition to Xcode and to Intel and are forging ahead with the rest of the product.
I think a common question might be ‘if the cost is so huge, why doesn’t Microsoft just devote more resources to the problem? They’ve got a ton of cash, right?’ Well, the real question is ‘what resources do you throw at the problem?’ We’ve been working very hard to hire a bunch of developers, but it has turned out to be quite difficult to fill our existing open headcount positions. As an example, I’ve had an open position on my own team for 9 of the last 12 months (it took 8 months to fill the slot when one developer moved from my team to another one in MacBU, and only last week did we hire someone to fill the slot vacated recently when another developer moved to a different team at Microsoft.) The question of how Microsoft allocates developer headcount and funding to MacBU is a separate topic of its own which hopefully I or some other MacBU blogger will tackle later. In any case, there’s no point in adding new headcount to the MacBU when we haven’t yet filled the positions we already have open.
I know that explaining all this doesn’t make the fact of VB’s death any easier for those users who currently depend on it. As I said at the beginning, we in the MacBU really are aware of the difficulties you face. Our product planners, program managers, developers, and testers are working to alleviate some of that pain. Many people have only a few simple macros they use, and I do want to point out that those macros will translate very easily into AppleScript. Even large macros can be rewritten in AppleScript, although that takes some time and definitely some knowledge of scripting on the Mac. The AppleScript object model and the old VB object model for our apps are roughly equivalent, so apart from the syntactical differences, if you could do it in VB you can do it in AppleScript. While I can’t comment on any more specific feature work for Office 12, I’m sure we will be working closely with enterprise customers to help them address their concerns. We’ll be saying more about our scripting plans as we get closer to the product release for Office 12.
For those of you contemplating a switch to Open Office, I don’t know whether Open Office has any support for VB macros or other OLE Automation technologies, so I don’t know if you’ll be any better off from a cross-platform perspective. You probably can’t be worse off, except that Open Office certainly doesn’t support any of the Mac OS scripting technologies that Mac Office does support and in which we will continue to invest, nor will it (at least for a while yet) support the new XML-based file formats. If you do switch, we’ll miss you.
Many people have viewed this announcement by MacBU as a sign that we are out to screw the Mac community, or that we’re just looking for an exit strategy. We’re not. Most emphatically, we’re not. This decision was agonizing. My manager even said he felt ’sick about the impact on those who really rely on xplat [cross-platform] VB support, particularly in Excel where we see it the most.’ In my post yesterday, I said that I wasn’t so sad to see VB go. I said that from the perspective of a developer who’s worked to maintain the code for many years. However, there’s nothing good about removing a feature that many people rely on, except that it frees up resources for us to invest more heavily in other important areas. Due to the age of the code, VB has been a very large drain on our resources for a long time with relatively little return. A couple of months ago I wrote that I hoped my blog would help people trust the MacBU a little more. I can see that many of you are very mad about this decision; I do hope that my post today helps you see some of the issues behind the press release. We had to make a hard decision one way or the other, and this is how it turned out.
VB on the Mac is really three parts: VBE (the editor), VBA (the execution engine) and Forms (the buildable windows and controls you edit in VBE and see when running a macro.)
VBE is pretty standard C++ code. However, the code is generally very old — it was originally designed and written several years before I came to Microsoft in 1996. VBE contains the top-level parser that converts the text of a macro into a series of mostly machine-independent opcodes (kind of like Java bytecodes, but not exactly the same). Thus you can’t just hook an external text editor up to VBA, because of the upper-level dependency. The VBE code actually isn’t too hard to port to Intel, but it is tricky to port to Xcode/GCC because of the age of the code. As I mentioned in an earlier post, GCC is very picky about code meeting the current standards and the VBE code most certainly does not. That’s not to say the code is ‘bad,’ it was just designed and written long before current modern C++ standards.
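To make the parser/engine split concrete, here is a toy sketch of what "lowering macro text into machine-independent opcodes" means. The opcode names, the instruction layout, and the one-statement grammar are all my own invention for illustration; the real VBE opcode set is proprietary and far richer.

```cpp
#include <cassert>
#include <cstdint>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical opcode set -- the real VBE opcodes are not public.
enum Op : uint8_t { OP_PUSH_CONST, OP_ADD, OP_STORE };

struct Insn { Op op; int32_t arg; };

// Toy "parser" for statements of the form "x = 1 + 2": it emits a
// machine-independent instruction stream, analogous in spirit to the
// way VBE lowers macro text before the execution engine ever sees it.
std::vector<Insn> compile(const std::string& src) {
    std::istringstream in(src);
    std::string var, eq, plus;
    int a = 0, b = 0;
    in >> var >> eq >> a >> plus >> b;   // e.g. "x = 1 + 2"
    return { {OP_PUSH_CONST, a}, {OP_PUSH_CONST, b},
             {OP_ADD, 0}, {OP_STORE, 0} };
}
```

Nothing in that instruction stream mentions PPC or Intel, which is why the editor/parser half was the comparatively easy part of the port: the architecture dependence lives entirely downstream, in the engine that consumes the opcodes.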
VBA, on the other hand, is incredibly difficult to port to Intel. The execution engine basically runs through the previously mentioned opcodes and, well, executes them. The hard part is that ‘executing’ them doesn’t mean interpreting them; it means converting one or more at a time into a block of assembly generated at runtime that looks and behaves like a regular function that can be called directly by other normally compiled code. This is in essence ‘self-creating’ code, and VBA is constantly flushing the CPU’s code cache in order to mark these chunks of data as executable. VBA’s generated code must adhere to the Application Binary Interface of the host platform (historically PowerPC and the Code Fragment Manager). This means register allocation, stack alignment, parameter passing locations, etc. VBA is basically a compiler that emits code at runtime. It does so by running a large state machine that tracks PPC register usage, stack location, the mapping between PPC registers and VB variables, etc., and then concatenates large blocks of pre-generated assembly together. VBA subsequently tweaks the assembly bit-field by bit-field to do things like assign registers to each opcode, set branch addresses, and create transition vectors for all function calls. The templates are very PPC- and CFM-specific, and the state machine is designed for architectures that allocate static stack frames and pass parameters in registers, unlike Intel, where stack frames are dynamic (you can push and pop data to/from the stack any time you want) and parameters are passed on the stack. So, for us to port this to Intel we’d have to rewrite the entire state machine and create brand-new templates of IA-32 code. That’s basically writing a rudimentary compiler almost from scratch (we’d at least have the initial parsing and machine-independent opcodes already done.)
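The distinction between "interpreting" opcodes and "concatenating pre-generated assembly templates" is the crux of why the engine is architecture-bound, so here is a minimal sketch of both approaches side by side. This is purely illustrative: the opcode set is hypothetical, and the generator emits templates as text (PPC-flavored mnemonics I made up), whereas the real engine wrote binary PPC instructions into executable memory and flushed the code cache.

```cpp
#include <cassert>
#include <string>
#include <vector>

enum Op { OP_PUSH, OP_ADD };
struct Insn { Op op; int arg; };

// Straight interpretation: walk the opcodes with a virtual stack.
// This is portable -- nothing here cares what CPU it runs on.
int interpret(const std::vector<Insn>& code) {
    std::vector<int> stack;
    for (const Insn& i : code) {
        if (i.op == OP_PUSH) {
            stack.push_back(i.arg);
        } else {                       // OP_ADD
            int b = stack.back(); stack.pop_back();
            stack.back() += b;
        }
    }
    return stack.back();
}

// Template concatenation in the spirit of the VBA engine: each opcode
// has a pre-written block of "assembly", and the generator glues the
// blocks together while patching in operands.  Every template below is
// tied to one specific architecture and calling convention, which is
// exactly why retargeting means rewriting all the templates.
std::string generate(const std::vector<Insn>& code) {
    std::string out;
    for (const Insn& i : code) {
        if (i.op == OP_PUSH)
            out += "li r3," + std::to_string(i.arg) + "\npush r3\n";
        else
            out += "pop r3\npop r4\nadd r3,r3,r4\npush r3\n";
    }
    return out;
}
```

The interpreter is slow but retargets for free; the template generator is fast but welds the engine to one ABI. VBA chose the second design in an era when that trade-off made sense.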
Again, this is all a design that long predates me or most of my peers in Mac Office, and is code that we inherited when we created the MacBU (i.e, none of us wrote it in the first place.) There’s nothing inherently bad about the code, it was just designed for the constraints of the day and that design simply doesn’t lend itself to being architecture-independent.
Some folks might ask: why not just port the Win Office VBA over to the Mac? Well, VBA circa Win Office 97 (which is the closest Windows VBA to what we have on the Mac) doesn’t implement its execution engine this way at all. Instead, it has tens of thousands of lines of IA-32 assembly that directly implements all of the opcodes. That assembly conforms to the Windows Intel ABI, which differs from the Mac ABI in several important ways (the specifics of which are described here.) Also, the assembly is in MASM format, which is close to but not the same as the GAS syntax supported by GCC. So, we’d have to edit the source to be compilable by GCC, and scrub it line by line to find and adjust the parts that aren’t compliant with the Apple Intel ABI. We’d also end up with two completely different implementations of VBA (PPC state machine and Intel straight assembly) that we’d have to maintain and keep in sync. That would be horribly bug-prone.
Lastly, we have Forms. Forms is also C++, but is backed by several thousand lines of gnarly custom assembly. This assembly ‘allows’ the C++ code to swap object virtual function tables and individual member function pointers between objects on the fly, to essentially do very fast object morphing. To do so, the assembly has to have specific knowledge of aspects of the C++ compiler (vtable layout, implementation of ptrs-to-member-functions, etc) and has to work in lockstep with the compiler. I spent almost two weeks massaging this code to try to make it compatible with just the PPC Mach ABI, which is only slightly different from the PPC CFM ABI. Even after all that work, I still didn’t get it completely right and internal builds had some really bad stability problems. We also don’t even have the Win Office 97 Forms source code, so I was not able to compare our code to how it was implemented for Windows.
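The "object morphing" trick is easier to see with a portable approximation. The real Forms assembly swapped compiler-generated vtables in place, which is what required exact knowledge of the C++ ABI; the sketch below gets the same effect by keeping an explicit dispatch table per object and swapping the pointer. The control types and the `width` operation are invented for illustration.

```cpp
#include <cassert>

// An explicit function-pointer table standing in for a compiler
// vtable.  Swapping which table an object points at changes its
// behavior on the fly -- "morphing" -- without touching the object's
// other state.  No ABI knowledge needed, at the cost of one extra
// indirection compared to patching real vtables.
struct ControlVTable {
    int (*width)(void);
};

static int button_width(void)   { return 80; }
static int checkbox_width(void) { return 16; }

static const ControlVTable kButton   = { button_width };
static const ControlVTable kCheckbox = { checkbox_width };

struct Control {
    const ControlVTable* vt;
    int width() const { return vt->width(); }
    // "Morph" this object into a different control kind at runtime.
    void morph(const ControlVTable* to) { vt = to; }
};
```

Doing this with real vtables instead of an explicit table buys speed, but it means the assembly must track every compiler change to vtable layout and pointer-to-member representation, which is exactly the lockstep problem described above.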
I just noted that the assembly has to work hand-in-hand with the normal C/C++ compiler. That wasn’t too much of a problem when we were using CodeWarrior, as the C++ compiler only changed in small ways every few years or so. With Xcode and GCC, my understanding is that Apple has to merge in all the changes that external developers commit to GCC, and we run the risk of GCC changing much more frequently. That might not be a problem in reality, but the risk is non-zero and we have to take that into account.
One final problem is that all of this custom assembly is currently PPC 32-bit, and even the corresponding Windows assembly is Intel 32-bit. If we ever want to make a 64-bit native version of Office, any work we might do to solve all of the above problems would have to be done all over again.
So, in short: VB has lots of code and assembly that specifically assumes it is running on a PPC with the Code Fragment Manager, and to re-do it for Intel would involve writing a rudimentary compiler and relying on private compiler implementations that are subject to change at any time.
Whew, that’s a lot of technical stuff. I hope it provides some idea of the scope of work we were facing. We estimated that it would take roughly two years of development time to move it all over to Xcode and to Intel. That would mean two more years before the next version of Mac Office made its way to consumers. In the meantime, Leopard will ship and Mac Office 2004 would still be running in Rosetta. Win Office 2007 and the new XML file formats will be ever more common. All Mac Office users would still be stuck with the old formats, unable to share in or use the great expansion of capabilities these new file formats bring. During that time, we’d also not be adding any other items our users have asked for.
Beyond that, if we were to port VB over to Intel in those two years, what you’d end up with is VB for Mac just as it is today. It still wouldn’t be feature-comparable to VB in Win Office, and the object model in Mac Office would still not be the same as the one in Win Office. That means that your macros would still be restricted to the same set of compatible items as you have today. Over the last 10 years, the Win Office programming model has become very different from that of Mac Office. We’ve tried to keep the object models in sync for the features that we have ported from Win Office, but we haven’t ported everything.
So, given that the developer cost was huge, that the consumer cost due to the delay while we did the work was quite large, and that the end result would be no better than what we have today, we made the very difficult decision to invest our time and resources in the other pillar of Mac Office, namely taking advantage of Apple tools and technologies to be more ‘Mac-like’. We’ve continued to improve the AppleScriptability of our apps (many many bug fixes post-Office-2004) and as announced are looking into adding some Automator actions to the suite. We’ve completed the rest of our transition to Xcode and to Intel and are forging ahead with the rest of the product.
I think a common question might be ‘if the cost is so huge, why doesn’t Microsoft just devote more resources to the problem? They’ve got a ton of cash, right?’ Well, the real question is ‘what resources do you throw at the problem?’ We’ve been working very hard to hire a bunch of developers, but it has turned out to be quite difficult to fill our existing open headcount positions. As an example, I’ve had an open position on my own team for 9 of the last 12 months (it took 8 months to fill the slot when one developer moved from my team to another one in MacBU, and only last week did we hire someone to fill the slot vacated recently when another developer moved to a different team at Microsoft.) The question of how Microsoft allocates developer headcount and funding to MacBU is a separate topic of its own which hopefully I or some other MacBU blogger will tackle later. In any case, there’s no point in adding new headcount to the MacBU when we haven’t yet filled the positions we already have open.
I know that explaining all this doesn’t make the fact of VB’s death any easier for those users who currently depend on it. As I said at the beginning, we in the MacBU really are aware of the difficulties you face. Our product planners, program managers, developers, and testers are working to alleviate some of that pain. Many people have only a few simple macros they use, and I do want to point out that those macros will translate very easily into AppleScript. Even large macros can be rewritten in AppleScript, although that takes some time and definitely some knowledge of scripting on the Mac. The AppleScript object model and the old VB object model for our apps are roughly equivalent, so apart from the syntactical differences, if you could do it in VB you can do it in AppleScript. While I can’t comment on any more specific feature work for Office 12, I’m sure we will be working closely with enterprise customers to help them address their concerns. We’ll be saying more about our scripting plans as we get closer to the product release for Office 12.
For those of you contemplating a switch to Open Office, I don’t know if Open Office has any support for VB macros or other OLE Automation technologies so I don’t know if you’ll be any better off from a cross-platform perspective. You probably can’t be worse-off except that Open Office certainly doesn’t support any of the Mac OS scripting technologies that Mac Office does support and in which we will continue to invest, nor will it (at least for a while yet) support the new XML-based file formats. If you do switch, we’ll miss you.
Many people have viewed this announcement by MacBU as a sign that we are out to screw the Mac community, or that we’re just looking for an exit strategy. We’re not. Most emphatically, we’re not. This decision was agonizing. My manager even said he felt ‘sick about the impact on those who really rely on xplat [cross-platform] VB support, particularly in Excel where we see it the most.’ In my post yesterday, I said that I wasn’t so sad to see VB go. I said that from the perspective of a developer who’s worked to maintain the code for many years. However, there’s nothing good about removing a feature that many people rely on, except that it frees up resources for us to invest more heavily in other important areas. Due to the age of the code, VB has been a very large drain on our resources for a long time with relatively little return. A couple of months ago I wrote that I hoped my blog would help people trust the MacBU a little more. I can see that many of you are very mad about this decision; I do hope that my post today helps you see some of the issues behind the press release. We had to make a hard decision one way or the other, and this is how it turned out.
Tuesday, 7 February 2006
Blog stalkers
‘Stalker’ is such a harsh word and one not to be used lightly but in December of last year I realized that I had one.
I’ve hinted at this once or twice in this blog and in my email newsletter, and some ProBlogger readers did see a few of the comments that he left on this blog (he was the one that called himself ‘blogkiller’) - but I’ve avoided talking about it up until now for reasons of security and not wanting to agitate the situation further.
It’s been almost two months now since the situation was resolved and I believe it is now safe to talk about it without inflaming things (but hope you’ll forgive me for not going into too many specifics).
What I will say is that the situation arose when someone who lives in my city read a number of posts written on another blog about me. Before he read them I was unknown to him, but the posts attacked me, made allegations about me which were untrue, and were written (in my opinion) without fact checking, in quite an aggressive tone. Who wrote them and which posts they were is irrelevant (in fact I’ve made peace with the blogger and resolved it) - the fact is the person who read them was at a place in his life where he was under extreme pressure and mentally unstable.
The posts were enough to trigger some extreme thought processes and obsessions in this person that led to a chain of escalating events that went from what I initially considered to be a harmless comment troll, to a cyber-nuisance, to a concerning threat maker, to what unfortunately became a situation where there was a physical attack made upon my property.
This process was very unsettling and in the end shook me up quite a bit.
As I’ve written above the situation is now resolved. I do not feel under threat - but in the process I’ve learned a lot and have a somewhat different view of blogging.
I wanted to share this story for a couple of reasons.
Firstly I think it’s important for us all to remember that the words we write might be written with one intention - but that they can be read and interpreted in a very different way. The blogger who wrote the posts that triggered this chain reaction did not intend for this to happen and could not have foreseen it. I don’t hold them responsible for it and as I say we’ve resolved our differences. However it has made me think twice before posting about other people since.
Secondly I wanted to remind bloggers to consider their personal safety.
While I’ve seen a number of articles in recent times about how to keep your blog safe and secure from hacking and accidental loss - it’s also worth considering how to keep yourself (and those you live with) safe and secure. When you write in a public forum you are doing so with the hope that people will read you. The unfortunate side of this is that you have little control over the perceptions of others towards you, and that from time to time people will disagree with you and even become agitated towards you. This is one of blogging’s biggest strengths (ie the conversation, diversity and dialogue) but also one of its biggest weaknesses when it goes too far.
Lessons in Blogger Security
While I’m no expert on personal online safety I would STRONGLY advise you consider what information you do and don’t reveal online about where you live and your family. While this person tracked me down through offline sources it’s a good reminder that the things you write can often be used to track you down. Here are a few reflections on the experience:
Decide up front how much personal information you will share on your blog - I’ve always been reasonably careful about this. I don’t post my address (I have a PO Box), I don’t post the name of my spouse and I never post her picture or those of other family or friends. If I do post photos I generally ask for permission or make them anonymous.
Consider your offline security - Ask yourself - ‘How easy would it be for someone to find you in real life?’ As I say I was not found directly through information on my blog (although I’m sure knowing my name and city which I reveal in my about pages helped) but through offline sources. I’m sure there are different ways to add layers of security in different parts of the world but consider silent numbers, PO Boxes for mail, being silent on the electoral roll etc.
Consider the way you are perceived online - I work hard at presenting myself online as a fairly easy-going, polite and well-mannered person (I’m often made fun of for this even). While at times I don’t feel like being this way it is an intentional thing. This is partly because it’s my character and personality (I’m a shy guy who was brought up to always consider the feelings of others) and partly a security consideration - I don’t want to inflame the wrong person. Obviously it doesn’t always work - but I do worry about some bloggers who seem intent on promoting themselves through anger and personal attacks. Consider the costs of your actions and words, both on yourself and others, before you post. You may still choose to take the attacking approach - but do so at your own risk and knowing the full extent of what it could one day lead to.
Have a plan of action - I would strongly recommend giving some thought to how you will deal with escalating situations that could lead to personal safety problems. This is part of the reason I wrote the What to do when your blog is Attacked post a week or so back. In the vast majority of cases things do not escalate to the stalker stage and can be resolved by using some of the strategies I mentioned there. However what if they escalate? At what point will you involve the police? How is the security of your home? etc
Don’t face it Alone - If things do escalate - I would strongly advise that you do not face the situation alone. The resolution of my situation came with the involvement of others. I don’t wish to go into details of this but ‘others’ could mean the help of other bloggers, others who live near you and others with some official and legal ability to help.
Online stalkers are not a common thing to my knowledge, and I don’t share my story to scare anyone - but I do think it’s something to be aware of as you blog. Don’t let this kill your blogging, but let it be something you give a little thought to, in the hope that the tiny chances of this happening to you are lessened even further.