I’ve been playing with the Firebug 1.0 private beta for the last few days and it’s quite a tool. v0.4, the last release of Firebug, had already combined the most popular features of Venkman, Console2, and the DOM Inspector. 1.0 adds an entire new set of features. JS profiling and net request tracing similar to Tamper Data are IMHO the most powerful. Joe has been hard at work, fixing issues as quickly as we find them. Firebug now has enough functionality that it could be used to implement and examine all the tips in my entire presentation at OSCON.
The new website Get Firebug gives you a glimpse of what is coming shortly:
JSONRequest has been proposed by Douglas Crockford as a new native method for web browsers to exchange data with servers. The most notable difference from XMLHttpRequest is that requests aren’t limited to the origin server: JSONRequest can make requests to any domain. Web applications today that need to make cross-domain requests must resort to a proxy or a dynamic script tag to reach non-origin domains.
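For anyone who hasn’t seen it, the dynamic script tag workaround looks roughly like this. This is a minimal sketch: the endpoint, the `callback` parameter name, and the helper names are all made up, and the remote service must agree to wrap its JSON response in the named function.

```javascript
// Pure helper: build the script URL with a callback parameter appended.
function buildCallbackUrl(base, params, callbackName) {
  var pairs = [];
  for (var key in params) {
    pairs.push(encodeURIComponent(key) + "=" + encodeURIComponent(params[key]));
  }
  pairs.push("callback=" + encodeURIComponent(callbackName));
  return base + "?" + pairs.join("&");
}

// Browser-only part: inject a <script> tag so the cross-domain response
// executes as JavaScript, invoking our temporary global callback.
function fetchJson(base, params, handler) {
  var cbName = "jsonCallback" + new Date().getTime();
  window[cbName] = function (data) {
    handler(data);
    delete window[cbName];
  };
  var script = document.createElement("script");
  script.src = buildCallbackUrl(base, params, cbName);
  document.getElementsByTagName("head")[0].appendChild(script);
}
```

The obvious downside, and part of Crockford’s motivation, is that the injected script runs with full access to your page, so you are trusting the remote domain completely.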
JSONRequest also has provisions for long-lasting, server-initiated live push of data. Crockford calls this “Duplex”.
“JSONRequest is designed to support duplex connections. This permits applications in which the server can asynchronously initiate transmissions. This is done by using two simultaneous requests: one to send and the other to receive. By using the timeout parameter, a POST request can be left pending until the server determines that it has timely data to send.
Duplex connections can be used in real time notification applications such as process management and finance. It can also be used in collaborative applications such as instant messaging, instant email, chat, games, presentation, and shared applications.”
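JSONRequest doesn’t exist in any browser yet, but the “receive” half of the pattern can be approximated today with a long-lived XMLHttpRequest: hold a request open until the server has something to say, handle it, and immediately open the next one. A rough sketch, with a hypothetical `/events` URL:

```javascript
// Hold a POST open against the server; when it completes (data arrived,
// or the server timed out the request), process any payload and
// immediately re-issue, so a request is always pending.
function longPoll(url, onMessage) {
  var xhr = new XMLHttpRequest();
  xhr.open("POST", url, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) {
      if (xhr.status === 200 && xhr.responseText) {
        onMessage(xhr.responseText);
      }
      // Re-issue whether or not data arrived, keeping the channel open.
      longPoll(url, onMessage);
    }
  };
  xhr.send(null);
}
```

Pair this with ordinary short requests for the “send” direction and you have the two simultaneous connections the proposal describes.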
Alex Russell calls this same server-push technique Comet. I actually like Comet a bit more than Duplex. Duplex is already overloaded with a few other uses in tech, like half/full duplex when configuring network cards, or terminal echo. Not that I have a vote (just like AJAX sorta stuck), but let’s hope Comet wins.
Ever need to create a favicon image? Not something you do every day, but when you do it sure would be nice to have a tool. You could always use Photoshop, Microsoft Paint, etc. I wanted an easier way. After a couple web searches I found the perfect answer. Upload any image and a favicon is created for you. The Favicon Generator will even create an animated favicon! I’d never wish this annoyance on anyone, but the idea is quite novel. There is also a tool to validate your favicon and the link tags needed to make favicons work with a wide variety of browsers. Here are the link tags I’ve been using for some time now; they seem to cover the major browsers.
<link rel="ICON" type="image/gif" href="/favicon.gif"/>
<link rel="SHORTCUT ICON" href="/favicon.ico"/>
Nate posted an article, which coincided with Yahoo’s launch of their UI Library, titled "Graded Browser Support". It appears he first coined the phrase in 2004; subsequent Google hits all point to Nate. I must admit it’s quite novel. He also posted a matrix of browsers that Yahoo domain applications support. It’s quite refreshing to see an age-old problem explored in a new way, with a catchy new phrase to help corral the thinking going forward.

It reminds me of some work in a previous life around WAP browser capabilities. The first generation of toolkits and support for WAP devices included huge switch statements and hacky User-Agent regular expressions. The second generation, which I believe is still in use today, was a framework that detected capabilities. Rather than look for browser X and apply hacks X, the code would detect capabilities: for example, the ability to support pages greater than 15k (remember, we were in the mobile world), or the ability to support a password input field. The capability matrix kept the code much clearer. Instead of complex decision trees built on User-Agents, the code was clean, with simple if/else statements for a particular capability. The hard part was contained in a single matrix that mapped capabilities to various browsers. With a quick update to the matrix, a new handset could be added with little or no code change.
The same problem has existed in web browsers for many years. Most applications simply had isIE() or isNetscape() type checks. For the most part this worked pretty well, as the capabilities being detected were generally split between the two dominant browsers of the time: Internet Explorer and Netscape. Today the landscape is very different. No longer is the browser war a two-horse race; for most applications on today’s web it’s at least a three- or four-horse race. In Nate and Yahoo’s case there are 6 different browsers with a total of 10 browser/version combinations, and that doesn’t even take the OS side of the equation into consideration.

Web application complexity continues to increase. As applications take advantage of the latest features of new releases, supporting the capabilities of earlier versions becomes harder to represent. A simple example is the new native XMLHttpRequest object in IE7. Assuming it performs better than the ActiveX object in current releases, web developers will want to take advantage of the new native object. In an isIE() type decision tree there would need to be a special case to handle IE7. In a capabilities-based application, however, IE7 would simply be defined as having a native XMLHttpRequest object, and the code would function without any special cases.
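The XMLHttpRequest case makes a nice concrete illustration of the difference. A capability-based sketch might look like the following; the `detectCapabilities` helper and its property names are my own invention, not Yahoo’s actual matrix.

```javascript
// Ask whether features exist rather than which browser we are in.
function detectCapabilities() {
  return {
    nativeXHR: typeof XMLHttpRequest !== "undefined",
    activeXXHR: typeof ActiveXObject !== "undefined"
  };
}

// Because code branches on capabilities, IE7's new native object needs
// no special case: it simply reports nativeXHR = true like Firefox does.
function createXHR() {
  var caps = detectCapabilities();
  if (caps.nativeXHR) {
    return new XMLHttpRequest();
  }
  if (caps.activeXXHR) {
    return new ActiveXObject("Microsoft.XMLHTTP");
  }
  return null; // environment has no XHR support at all
}
```

Contrast that with an isIE() tree, where shipping IE7 support means finding and patching every branch that assumed IE implies ActiveX.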