Diggbar Feedback (and how not to collect feedback)

Posted by – April 6, 2009

Digg recently introduced a feature that frames the content posted to Digg, reminding me of the internet in the ’90s, when framing other people’s content to put ads on top of it (remember about.com’s ubiquitous frames?) was commonplace, and when one of the first bits of JavaScript I used was a “frame buster” to keep those obnoxious banners off my web pages.

But I’m not writing this to rip on Digg. Their “Diggbar” is a good illustration both of why the old “banner in a frame” tactic died off and of why something like the Diggbar can still succeed.

As in any case where you break the fundamental way people expect the internet to work (here, by framing the destination URL), the tradeoff between usefulness and annoyance is what will make or break the feature, and Digg has included some features that are useful (for some), like:

  • URL shortening. Twitter’s character limit has sparked a renaissance in short URL services, and Digg makes it easy to create one by appending any URL to digg.com/ (e.g. digg.com/http://agilewebmasters.com/robert/diggbar-feedback/).
  • Access to comments and related submissions. In my experience the top-ranked comments on Digg often provide useful mirrors, context or even refutation, and some users will find having them in the Diggbar handy. As for related submissions, once Digg manages to make them useful (they are surprisingly bad at term extraction, but I’ll write about that some other day) these too can serve as an additional exploration point for users.
  • Digg/Bury. Letting users digg and bury articles while reading the actual article in a frame is a good thing. They no longer need to vote from an article grid or go to the comments page; a single destination serves both the content and the core digg functions.
  • Random. This opens up an entirely new use pattern for digg, bringing the serendipity of StumbleUpon and allowing even more casual content discovery. It is by far the most game-changing part of the Diggbar, and in my opinion that should be reflected in the UI by giving it more prominence on the toolbar. Personally, I’d put it where the logo sits right now and incorporate the logo into the random button (call it “digg shuffle” and you are just a lower-case “i” away from the hearts and minds of a sizable cult following among their user base).

But I most likely won’t be using this feature (call me old fashioned, but I like the traditionalist notion that links to another site are best served by an actual link to that site, not a link to your own site with a frame, and yes, I am inordinately parenthetical). To their credit, digg had the good sense to include an easy way to opt out permanently instead of only providing the traditional “remove this frame” option of the ’90s; without that option I’d likely not use digg anymore. Hell, the fact that reddit provides a direct link to the submission URL in its RSS feeds is one of the biggest reasons my time on reddit has increased while my time on digg has decreased. For many reasons, I like less, not more, between me and the content I’m trying to see.

But now that I’ve got that convoluted rambling out of the way, I’ll get to what prompted me to write this post in the first place: something digg got fundamentally wrong.

  1. Despite my reservations, I wanted to try the casual content discovery of random browsing. I had clicked the Diggbar’s “random” button a few times when a solicitation for feedback appeared, pushed the buttons over, and caused my next click to launch a popup to Twitter.

    That brought my foray down digg’s random path to an abrupt end. If they are introducing a “toolbar” they hope will drive more such use, they should take greater care to keep UI elements predictably positioned, especially a button whose typical use pattern involves frequent clicks (I note with great humility that my brilliant “digg shuffle” idea would have precluded this).

  2. They are soliciting feedback through SurveyMonkey.com, and I had already provided some. So far so good, but now I discover that the feedback link leads me to a page saying “Thanks for Taking the Survey. Looking to Create Your Own?” and does not allow me to provide anything further.

    This is daft for several reasons. The obvious one is continuing to solicit feedback when I can no longer provide it, but that doesn’t even begin to address the more fundamental mistake of using a third-party provider that can’t handle the simple use case of a user giving feedback and then, as in my example of noticing additional behavior after the first round, having more. It’s not much more difficult to roll their own simple and scalable feedback solution on Google App Engine or Amazon SimpleDB than it is to use a third party that limits their users’ experience like this, and that is the solution I’d elect in their shoes. Even if they wanted to eat someone else’s dog food, they could at least have provided a more useful forum such as Google Moderator or Get Satisfaction, which would have allowed more than one round of feedback (though the open communication those entail might be more transparent than digg prefers for initial feedback).

    As it stands, I had feedback their medium could not collect, and they’ll have to happen upon this blog post to read it (the 8-ball says “Outlook not so good” and tossed in “nobody reads this blog anyway”). There are more elegant solutions to prevent sock puppetry than such a limitation, and if I were running a site digg’s size I’d want to get the basics right. To be fair, if they were in my shoes they’d probably be headed to bed right now instead of waxing loquacious on such usability inanities at this hour.

Video: Nicole Sullivan – Design Fast Websites

Posted by – March 3, 2009

Amazon’s sales dropped 1% with 100ms of additional load time. Site performance is money, and this is a useful video on the basics of making a fast website.

Percentage rounding and sub pixel perfection

Posted by – February 22, 2009

I have been working on a grid design framework for able2know, and have been wrestling with the inconsistencies between browsers when it comes to percentage rounding. There are good static grid CSS frameworks out there (e.g. Blueprint, 960gs), but the attempts to convert them into liquid grids (e.g. Liquid Blueprint, fluid960gs) share the same bugs I was running into: widths are not calculated correctly, and in IE6 and IE7 the effect was often dramatic, with columns jumping around as the browser is resized. At some widths everything would be perfect, but at others columns would wrap into the next row, breaking the layout. A couple of pixels off might not sound like much, but have a look at this video to see the effect those few pixels can have on the layout.

I set out to fix this problem and began writing my own grid CSS framework, trying all sorts of different mathematical approaches (I was working with three columns, so my initial suspicion was that the browsers were having a hard time dealing with fractions like 1/3). I also tried every IE CSS hack to coax the floats the grid uses into compliance, but no dice. After more investigation, I learned that this is a rendering problem with no perfect solution for the browser manufacturers. The limitation is your monitor.

To make a very long story short, your monitor needs your browser to draw objects in whole pixels. Browsers have no easy way around this, and there is no perfect solution. That, of course, doesn’t mean the various browsers handle it consistently, and as per usual IE managed to settle on the least optimal approach of the bunch.

First of all, let’s explain the problem. Say you want to divide your layout into four columns. The number of pixels that area occupies may not be divisible by four. Say you are dividing 50 pixels (no matter what you choose, the browser may end up with an indivisible number, so I’m just using the easy example John Resig used in his comparison of how each browser handles the rounding): with your CSS telling the browser to make each column 25% wide, each column should be 12.5 pixels. The problem is that your browser can’t do this and needs to round to an integer. No matter what it does at this point, there will be imperfections: if it rounds down there may be gaps, and if it rounds up there can be an overflow.
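To make the arithmetic concrete, here is a quick sketch (in PHP, purely for illustration) using the same 50-pixel, four-column numbers from the example above:

    <?php
    // The 50px container divided into four 25% columns from the example above.
    $container = 50;                      // pixels available
    $columns   = 4;
    $ideal     = $container * 0.25;       // 12.5px per column -- not a whole pixel

    $floored = floor($ideal) * $columns;  // 4 x 12px = 48px -> a 2px gap is left over
    $ceiled  = ceil($ideal)  * $columns;  // 4 x 13px = 52px -> 2px too wide, so a column wraps

    printf("ideal %.1fpx each; rounding down leaves a %dpx gap, rounding up overflows by %dpx\n",
        $ideal, $container - $floored, $ceiled - $container);

Whichever way the browser rounds, those two leftover pixels have to go somewhere, and that is exactly the gap or the wrap you see while resizing.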

Here is a test you can use to see this error in effect in your browser. It should display a solid black box, but resize your browser window and at some sizes you should see gaps (in Firefox 3 you may not be able to reproduce them, due to an innovative way of handling the rounding: the layout is computed to 1/60th of a CSS pixel).

Now, I understand the limitations each browser faces and that they can’t possibly get it all perfect, but what IE does that breaks layouts as you resize the browser, making your divs jump around, is round up. By rounding up it can push your columns past the width of their container, causing them to wrap. There’s no great way around this, and the only easy solution is to not let your columns add up to 100%, leaving room for the rounding up without breaking your layout. It’s not a great solution, and it ruins the pixel-perfect grid I was trying to implement on able2know (leaving more space on the right than on the left), but I thought I’d write up the problem in case others need it to fix their floats.
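For what it’s worth, here is a hypothetical little helper (the 1% of slack is just an illustrative figure, not a recommendation) showing the arithmetic behind that workaround: split slightly less than 100% across the columns and round down, so IE’s rounding up still has room to spare.

    <?php
    // Hypothetical helper: spread (100 - slack)% across the columns, rounding
    // down to two decimals so the declared widths can never total 100% or more.
    function columnWidths($columns, $slack = 1.0)
    {
        $width = floor((100 - $slack) / $columns * 100) / 100;
        return array_fill(0, $columns, $width);
    }

    print_r(columnWidths(3));   // three columns of 33%    -> totals 99%
    print_r(columnWidths(4));   // four columns of 24.75%  -> totals 99%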

Other sources:

https://bugzilla.mozilla.org/show_bug.cgi?id=177805

https://bugzilla.mozilla.org/show_bug.cgi?id=63336

https://connect.microsoft.com/IE/feedback/ViewFeedback.aspx?FeedbackID=334118

http://www.satzansatz.de/cssd/geckogaps.html

http://www.ojctech.com/content/css-jumping-columns-and-ies-percentage-rounding-algorithm

What is the best PHP accelerator to use?

Posted by – February 1, 2009

Well, let me go ahead and tell you. Of course, this is all just my opinion and your mileage may vary.

First, I will briefly discuss the role of PHP accelerators (opcode caches) in server tuning and scalability. In most scenarios these tools will not enable your server to handle much more traffic. In some cases the additional overhead of the PHP caching will even add load; in others you gain a marginal improvement because requests are served a bit more quickly, leaving fewer concurrent connections open. But they will not significantly raise your concurrent-user limits, because in a LAMP stack your bottleneck is usually the database.

The way these PHP accelerators work is by caching the compiled bytecode of your human-readable PHP. Normally your PHP code is compiled and then executed on every request; these tools cache the compiled code, saving the expense of recompiling it and thus generally saving you a bit of CPU at the cost of some increased memory usage.

So what PHP acceleration can do is make your PHP execute more quickly, often in roughly half the time. But it’s important to understand just what is being accelerated, because PHP execution is typically not what most influences the user’s perception of speed. To the user, speed is a combination of page generation time, network latency, and page rendering time. These PHP caching tools can improve page generation time, but as I’ve already said the database is usually the key there: it is both the bottleneck for concurrent users and the bulk of the page generation time in typical setups. To make the biggest difference you need good database design and data-object caching with something like memcached (see the sketch below), but here we’ll go over the options for improving your PHP execution times.
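As an aside, data-object caching of that sort is only a few lines of PHP. Here is a minimal sketch with the pecl Memcache extension, assuming a memcached daemon on localhost:11211 (the key name and the loadArticle() stub are made up for the example):

    <?php
    // Stand-in for the expensive database query we want to avoid repeating.
    function loadArticle($id)
    {
        return array('id' => $id, 'title' => 'Example article');
    }

    $cache = new Memcache();
    $cache->connect('localhost', 11211);

    $key     = 'article_123';
    $article = $cache->get($key);

    if ($article === false) {                 // cache miss: hit the database once...
        $article = loadArticle(123);
        $cache->set($key, $article, 0, 600);  // ...and keep it for 10 minutes (no compression flag)
    }

    print_r($article);

With the object cached, repeated page views skip the database entirely, which is where the real page generation time goes.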

Alternative PHP Cache – http://pecl.php.net/package/APC

eAccelerator – http://eaccelerator.net/

XCache – http://xcache.lighttpd.net/

Zend Platform – http://www.zend.com/products/platform

ionCube PHP Accelerator – http://www.php-accelerator.co.uk/

Turck MMCache – http://turck-mmcache.sourceforge.net/index_old.html

On the able2know Q&A site we have used Zend and Turck MMCache in the past with favorable results, but I am on a scalability and performance crusade here and wanted to pick out the best of the current crop. Turck MMCache has not been actively developed for a while now and is not a viable option for us to use in production, so that one’s out. I did the research into a variety of benchmarks (see chart), and with few exceptions the main competitors perform close enough to each other that raw performance is not the deal-breaker in choosing between them. Simply put, the marginal performance gains you may achieve by selecting one over the other may be outweighed by differences in things like price, or how actively the code is being developed.

So I narrowed the selection down to XCache, APC and Zend. Zend does much more than PHP caching and may be the right choice for others, but it is a proprietary option that you must pay for, and as a PHP accelerator alone the performance gain doesn’t justify the cost. XCache and APC are developed by well-known programmers in the open source world, the developers of lighttpd and of PHP itself respectively. Between the two I am opting to use APC on able2know, as it is a PECL extension maintained by the maintainers of PHP itself (including the creator of PHP) and is reportedly going to become a core part of PHP 6.

So at least for me:

The best PHP accelerator to use is APC.
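Once the extension is installed and enabled in php.ini (typically extension=apc.so), the opcode cache kicks in with no code changes at all. As a bonus, APC also exposes a small user cache you can lean on for single-server setups; a minimal sketch, with the key name and TTL made up for the example:

    <?php
    // Sketch assuming the APC extension is loaded; nothing here is required
    // for the opcode cache itself, which works transparently.
    if (!extension_loaded('apc')) {
        die("APC is not loaded, so neither the opcode cache nor the user cache is available.\n");
    }

    $key  = 'homepage_stats';          // hypothetical cache key
    $data = apc_fetch($key, $success);

    if (!$success) {
        $data = array('users' => 42);  // stand-in for an expensive query
        apc_store($key, $data, 300);   // cache it for five minutes
    }

    print_r($data);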

Able2know launches

Posted by – August 25, 2008

Able2know has long run an experts exchange on phpBB, but has recently launched new custom software to replace the old forums. This software was developed by two of us Agile Webmasters (Nick Ashley and myself) over the course of the last 10 months or so, and while the launch was quite rough, it has been evolving quickly in the days since; we are committed to making it a very rich and usable web application.

Over the coming weeks we’ll talk about our development process and decisions to give advice to others facing similar challenges, but you are welcome to ask us questions about web development on able2know anytime.

The essential Firefox Plugins for the Web Developer

Posted by – August 15, 2008

Nick has already posted about Firebug and Firebug Extensions, and in my own experience setting up my workstation I too found it to be an essential Firefox add-on. In addition to Firebug, here are the other Firefox plugins I installed in my first few days of work on this new computer.

  • Delicious – An agile webmaster might know thousands of online resources, and having them tied to one computer in the form of browser bookmarks is not very agile, is it? Using Delicious you can take your bookmarks with you, and this was one of the first things I missed: all my bookmarks.
  • SearchStatus – This plugin is essential for the SEO professional. It gives you Google PageRank, Alexa rank and Compete rank information for the websites you visit, and also has a useful keyword density analyzer and quick access to searches for indexed pages, backlinks and cached copies. It’s one of the tools I use the most when investigating a website.
  • Web Developer Toolbar – While this was originally a must-have tool for its live CSS editing, Firebug has become much more useful for that purpose. But this tool is still very useful for a variety of other information, and I also tend to use it a lot to quickly disable cache, cookies, or JavaScript.
  • ColorZilla – This gives you an easy color picker and an eyedropper that can be used to quickly get the color of anything on any webpage.
  • And of course, Firebug and YSlow as Nick has already blogged about.

Yahoo launches Fire Eagle

Posted by – August 13, 2008

Yahoo launched Fire Eagle yesterday, a web service that lets users input their location for use by other web applications. Web developers can use its API to create applications that then use this information to provide location-based services to the user.

The user can input their location through a variety of methods, the most antiquated being typing it in on the Fire Eagle website or even, in this hyper Web 2.0 world, sending it by SMS. Fire Eagle also allows phone-based applications to broadcast the user’s location to the web service, which allows for real-time uses of local data and opens up a lot more application possibilities.

The user has control over what location details are broadcast, but privacy advocates are sure to cringe at the encroachment of a smarter cloud that knows ever more about you, as the initial uses for this are largely related to commercial opportunities in your proximity.

Are you imagining an ad network that serves ads relevant to where your laptop or phone currently is? I am, and it’s a frickin’ “Starbucks on the right” banner that I think could wring a few more bucks a day out of those caffeine junkies.

Comscore announces explosive growth in social networking websites

Posted by – August 12, 2008

Not to be outdone by the other web metrics companies releasing their traffic analysis articles in the last few days, Comscore announced today that “Social Networking Explodes Worldwide”, citing 25% growth worldwide over the last year.

Naturally, the growth is largely coming from outside North America, whose 9% growth rate for the year is the lowest globally; emerging internet markets were responsible for most of it.

Additionally, they released information about specific sites showing MySpace nearly stalled at 3% growth, with Facebook and Hi5 the fastest-growing social networks at the moment. Also of interest to the social media marketer: they show Facebook as having overtaken MySpace in monthly uniques. But then again, only a blind social media marketer wouldn’t have noticed Facebook’s meteoric rise.

Google crosses 70% search market share in US

Posted by – August 12, 2008

Research firms Nielsen Online and Hitwise have released traffic information that shows strong Google growth in the US market. Google’s audience across its various web properties grew by a million from June to July, to 129 million (Google has the largest audience in the US through properties like Google.com and YouTube.com). But even more damning for its rivals (namely Microsoft and Yahoo), and arguably for the webmaster community, is the information Hitwise released today about the most coveted traffic of all: search traffic. Hitwise announced that Google has crossed the 70% mark (they cite 70.77%), which is up 10% from July of last year and 2% from June of this year. Here’s the rest of the search landscape that Google dominates:

  • Google – 70.77%
  • Yahoo – 18.65%
  • MSN/Live – 5.36%
  • Ask – 3.53%

Hitwise also published a graph illustrating the trend.

Google releases Keyczar open source cryptographic toolkit

Posted by – August 11, 2008

Google has announced the release of Keyczar, their open source cryptographic toolkit, which Google claims (I have no experience with it yet) will make it easier to do cryptography right.

Keyczar’s key versioning system makes it easy to rotate and revoke keys, without worrying about backward compatibility or making any changes to source code.