What We Have Brewing In The Background
Lately, a lot has been going on with us. Besides watching our rankings climb in several cities, we've been hard at work testing and perfecting our craft, and that's something we take very seriously.
At the end of the day, we need to deliver results, and testing is how we make sure those results actually happen.
One such test tracked exactly which of our Web 2.0 properties were most likely to rank, and we found conclusive evidence about which properties are the surefire ones that will top the SERPs.
One of the more important things we consider, and it's a big one, is called the local maximum. The local maximum is a phenomenon that happens when you follow purely data-driven results.
The local maximum is described here:
Do you ever feel that your design has become stale, and that despite making lots of little changes to it over time without any big overhaul, there is just no way to drastically improve it?
If so, you’ve probably hit what Andrew Chen calls the “Local Maximum”. The local maximum is the point at which you’ve hit the limit of the current design…it is as effective as it’s ever going to be in its current incarnation.
Even if you make 100 tweaks, you can only get so much improvement; the design is as effective as it’s ever going to be on its current structural foundation.
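To see why small tweaks can't escape a local maximum, here's a minimal sketch (not from the article; the landscape function and numbers are made up for illustration). A greedy optimizer that only accepts changes that score higher climbs the nearest peak and stays there, even when a much taller peak exists elsewhere:

```python
import random

def hill_climb(f, x, step=0.1, iterations=1000):
    """Greedy hill climbing: accept a tweak only if it scores higher.
    Small tweaks can never cross the valley between two peaks."""
    for _ in range(iterations):
        candidate = x + random.uniform(-step, step)
        if f(candidate) > f(x):
            x = candidate
    return x

# A made-up landscape with two peaks: a local maximum near x = -0.9
# (height about 1.3) and the global maximum near x = 2 (height about 3.1).
def landscape(x):
    return 1 / (1 + (x + 1) ** 2) + 3 / (1 + (x - 2) ** 2)

random.seed(0)
best = hill_climb(landscape, x=-1.5)  # start near the smaller peak
# The climber settles near x = -0.9: every small step toward the taller
# peak at x = 2 first goes downhill, so it is always rejected.
```

This is the "100 tweaks" situation in miniature: each tweak is locally an improvement or it's discarded, so the design can never take the temporary hit required to reach a fundamentally better structure.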
The local maximum occurs frequently when UX practitioners rely too heavily on A/B testing or other testing approaches to make improvements. This type of design is typified by Google and Amazon…they do lots and lots of testing, but rarely make large changes.
(Except, of course, Google’s homepage background change this week, which was quickly reverted)
While a cycle of smaller improvements is better than the dysfunctional design processes most of us are stuck with, one of the criticisms of this type of extreme optimization is that it’s always and only incremental: you can only make a few small changes at a time and therefore your design evolves slowly.
And if you’re doing rigorous testing, by only changing one variable at a time, then you’re only changing one small part of your application in each iteration. This work cycle becomes dependent on how fast you can run tests.
For Google and Amazon, who are blessed with millions of visitors per day, this is no problem because they can run tests extremely quickly. For most people building web sites, low traffic volume can be a huge hurdle because it means that tests have to run longer, which slows down the rate of iteration.
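The traffic point is just arithmetic, and a quick back-of-envelope sketch makes it concrete. The sample size below is an assumed round number for illustration, not a real power calculation:

```python
# Rough back-of-envelope: how long must an A/B test run?
# All figures are assumptions for illustration only.
visitors_needed_per_variant = 25_000  # assumed sample to detect the effect we care about
variants = 2                          # control + one change (one variable at a time)

def days_to_run(daily_visitors):
    """Days needed to collect enough visitors across all variants."""
    total_needed = visitors_needed_per_variant * variants
    return total_needed / daily_visitors

big_site = days_to_run(1_000_000)  # Google/Amazon scale: done in about an hour
small_site = days_to_run(500)      # small local site: 100 days for ONE test
```

At a million visitors a day, the same test that takes a small site more than three months finishes in a fraction of a day, which is why the big players can iterate through tweaks so quickly while everyone else waits.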
Read more here.
This is a great solution for businesses in Bridgeport, CT. Make sure you read up on our other blog posts here.