LongURL Mobile Expander slows me down

A reminder that add-ons, extensions, and other bolt-on software capabilities aren’t free:

It was a maddening bug. On my machine, and mine alone, our web-based application slowed to a crawl when I chose a particular option. No one else could reproduce it.

As I was showing the bug to the developer, we had a hunch, checked my add-ons, and turned off about half of them. The problem went away, and now I had a suspect. I turned all the add-ons back on except LongURL Mobile Expander; the web application worked properly again, and I had my culprit.

I’m not a JavaScript developer, so even after looking at the source code I’m not sure why there was a problem. I wonder whether the issue was the fetch of the list of supported services, which seems to happen on every onload() event; perhaps on our Ajaxy web app the lookup was firing more than once per page. (Update: no; see below.) All I know is that it’s turned off for good for me.

It’s kind of a shame, because LongURL performed a useful function: with it installed, when you hover over a link to tinyurl.com, bit.ly, or one of the other URL-shortening services, it looks up the link and shows you the destination in a tooltip, so you can tell if you’re about to get Rickrolled. Useful, but not at that cost.
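
For the curious, this is roughly the kind of lookup such an expander has to perform on hover. It’s only an illustrative sketch in modern JavaScript, not LongURL’s actual code (the sketch resolves the short link by following its redirect instead of asking LongURL’s service), but the end result is the same: the tooltip shows where you’d really land.

// Illustrative sketch only, not LongURL's code.
async function expandShortUrl(shortUrl) {
  // fetch() follows redirects by default, so response.url is the final destination.
  // (A plain page script needs the shortener to allow cross-origin requests;
  // a browser add-on runs with broader privileges.)
  const response = await fetch(shortUrl, { method: "HEAD" });
  return response.url;
}

document.addEventListener("mouseover", async (event) => {
  if (!(event.target instanceof Element)) return;
  const link = event.target.closest("a[href]");
  if (!link) return;
  link.title = await expandShortUrl(link.href); // show the real target as a plain tooltip
});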

Update: the developer who looked at the issue with me does speak JavaScript, and he says the problem is not the fetching of the supported-services list (that happens once and is then cached). The real issue is that the script re-parses the web page’s document object model every time a new node is added. Adding new nodes is what just about every AJAX app does all the time, which explains why the problem only shows up on apps like ours, or on Facebook, as one reviewer of the add-on points out.
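
To make the cost concrete, here’s a sketch of the difference between the two approaches. It isn’t the add-on’s actual source (which listened for the DOM mutation events of the day); it’s written against the modern MutationObserver API, but the cost pattern is the same:

// Illustrative sketch, not LongURL's code.
function annotateShortLinks(root) {
  // Stand-in for the real work: walk the links under `root` and check each
  // hostname against the list of supported shortening services.
  root.querySelectorAll("a[href]").forEach((link) => {
    // ...look up link.hostname, attach the destination tooltip...
  });
}

// Expensive pattern: re-parse the whole document every time any node is added.
// An Ajaxy app (or Facebook) adds nodes constantly, and every run costs as much
// as scanning the entire page.
const rescanEverything = new MutationObserver(() => {
  annotateShortLinks(document.body);
});

// Cheaper pattern: only look at the nodes that were actually added.
const rescanAddedNodes = new MutationObserver((mutations) => {
  for (const mutation of mutations) {
    for (const node of mutation.addedNodes) {
      if (node.nodeType === Node.ELEMENT_NODE) {
        annotateShortLinks(node);
      }
    }
  }
});

// Only one observer would be wired up in practice; both are shown for contrast.
rescanAddedNodes.observe(document.body, { childList: true, subtree: true });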

Google Chrome 1.0 (.154.36)

Well, that was fast. Google Chrome went from new to 1.0 in about 100 days:

[screenshot: Chrome 1.0]

But is it ready? And why so soon?

[screenshot: the WordPress 2.7 login box rendered in Chrome, with black corners]

I expected Google to add more features over time, since the merely architectural improvements in the browser didn’t seem like enough of a differentiator to justify launching a new browser. But that didn’t really happen, and in fact Google seems to be launching Chrome with some rough edges intact. Check out this snippet of the WordPress 2.7 login screen (right). See those black edges around the box? That’s a rendering bug in Chrome’s version of WebKit. (The black corners aren’t there in Safari.)

So: Google is rushing a new browser that they “accidentally” leaked just 100 days ago, a browser with significant speed but demonstrable rendering flaws, into an already crowded market. Why? And why launch two days after previewing the Google Native Client plug-in, a web technology that seems a far more significant leap forward?

My guess: they’re scared of having their thunder stolen, maybe by Firefox. Mozilla’s new JavaScript engine, TraceMonkey, appears to be running neck and neck with Google’s V8, and when the major feature of your browser is speed, you don’t want to risk being merely as good as your better-established competitor. So maybe releasing Chrome ahead of Firefox 3.1 (which still has no release date and at least one more beta to go) was simply a defensive move to make sure Chrome isn’t competitively dead on arrival.

Ubiquity memory issues on Firefox

I may have to stop using Ubiquity for a while. I’ve used it exclusively because it, plus the share-on-delicious script, provides a great keyboard-only way to tag web pages for Delicious: hit ctrl-space and type “share Delicious bookmark description tagged delicious tags entitled title”.

Alas, there are definite memory issues with Ubiquity or with the script. I currently have three tabs open in Firefox and the memory is more or less stable at 112,988K. If I invoke Ubiquity and start typing:

share This is a sample Delicious post that's not too different from one I would normally do, except a bit shorter and more fictional. tagged ubiquity entitled foo ubiquity test.

then memory usage suddenly spikes to 571,028K! It gradually falls back down afterward, but it climbs steadily and steeply while I’m typing, and there’s a point beyond which Firefox becomes unusable. Maybe I’m a canary user because I’m a touch typist, typing faster than Firefox can garbage-collect? I still can’t believe that Ubiquity could be consuming so much, though.
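
If that guess is anywhere near right, the usual shape of the fix is to stop running the expensive per-keystroke work on every single keystroke. I don’t know what Ubiquity actually does internally, so everything below, including the commandInput and heavyPreview names, is hypothetical; it just shows the standard debounce pattern:

// Hypothetical sketch, not Ubiquity's code. commandInput and heavyPreview are
// made-up stand-ins for the command-entry box and the per-keystroke work.
const commandInput = document.querySelector("input");

function heavyPreview(text) {
  // stand-in for parsing the command, building preview markup, and so on
}

// Run fn only after the user pauses for delayMs; calls made in between are dropped.
function debounce(fn, delayMs) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), delayMs);
  };
}

const previewWhenTypingPauses = debounce(heavyPreview, 200);
commandInput.addEventListener("input", (event) => {
  previewWhenTypingPauses(event.target.value);
});

// Without the debounce, a fast typist fires heavyPreview() on every keystroke,
// and allocations can pile up faster than the garbage collector reclaims them.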

(Update: apparently I’m not alone.)

Get a jump on Download Day

Courtesy of a little bird, it’s possible to download Firefox 3.0 already, even though it hasn’t been announced yet.

The latest public download is RC3:

http://download.mozilla.org/?product=firefox-3.0rc3&os=win&lang=en-US

but if you remove rc3 from the URL, you get:

http://download.mozilla.org/?product=firefox-3.0&os=win&lang=en-US

which is a valid URL. (So much for security by obscurity.) Enjoy your early start on Download Day! (Tip o’ the hat to Dil.)

Update: Or not. The version string in the -3.0 download’s about box is the same as the one in RC3. Oh well.