What We Have Brewing In The Background
Lately, a lot has been going on at Neovora. Besides watching our rankings climb in several cities, we've been hard at work: testing and perfecting our craft is something we take very seriously.
At the end of the day, we need to deliver results, and testing is how we ensure those results actually materialize.
One such test tracked exactly which of our Web 2.0 properties were most likely to rank, and we found conclusive evidence about which properties are the surefire ones that top the SERPs.
One of the more important things we consider, and it's a big one, is called the local maximum. The local maximum is a phenomenon that arises when you chase data-driven results through purely incremental improvement.
The local maximum is described here:
Do you ever feel that your design has become stale, and that despite making lots of little changes to it over time without any big overhaul, there is just no way to drastically improve it?
If so, you've probably hit what Andrew Chen calls the "Local Maximum". The local maximum is a point at which you've hit the limit of the current design…it is as effective as it's ever going to be in its current incarnation.
Even if you make 100 tweaks, you can only get so much improvement; it is as effective as it's ever going to be on its current structural foundation.
The local maximum occurs frequently when UX practitioners rely too much on A/B testing or other testing approaches to make improvements. This type of design is typified by Google and Amazon…they do lots and lots of testing, but rarely make large changes.
(Except, of course, Google’s homepage background change this week, which was quickly reverted)
While a cycle of smaller improvements is better than the dysfunctional design processes most of us are stuck with, one of the criticisms of this type of extreme optimization is that it’s always and only incremental: you can only make a few small changes at a time and therefore your design evolves slowly.
And if you’re doing rigorous testing, by only changing one variable at a time, then you’re only changing one small part of your application in each iteration. This work cycle becomes dependent on how fast you can run tests.
For Google and Amazon, who are blessed with millions of visitors per day, this is no problem because they can run tests extremely quickly. For most people building web sites, low traffic volume can be a huge hurdle because it means that tests have to run longer, which slows down the rate of iteration.
Read more here.
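The traffic-and-test-duration tradeoff described above can be sketched with a quick back-of-the-envelope calculation. This is a rough illustration only, not our actual tooling: the function names and traffic numbers are made up, and it uses Lehr's common rule of thumb (roughly 80% power at 5% significance) to approximate the required sample size for detecting a conversion-rate lift.

```python
# A rough sketch (not Neovora's actual tooling) of why low traffic slows
# iteration: estimate how many days a 50/50 A/B test must run to detect a
# given lift, using Lehr's rule of thumb (~80% power, 5% significance).

def days_to_significance(daily_visitors, baseline_rate, lift):
    """Approximate days needed to test one change on one variable."""
    p = baseline_rate
    delta = baseline_rate * lift                     # absolute improvement to detect
    n_per_variant = 16 * p * (1 - p) / delta ** 2    # Lehr's approximation
    visitors_per_variant_per_day = daily_visitors / 2  # even traffic split
    return n_per_variant / visitors_per_variant_per_day

# Detecting a 5% relative lift on a 5% baseline conversion rate:
big_site = days_to_significance(1_000_000, 0.05, 0.05)  # high-traffic site
small_site = days_to_significance(500, 0.05, 0.05)      # small local business
```

With these (hypothetical) numbers, the high-traffic site gets an answer in under a day, while the small site would need over a year for the same test, which is exactly why its iteration rate stalls.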
This is a great solution for businesses in Bridgeport, CT. Make sure you read up on our other blog posts here.
Bridgeport SEO Expert and Testing Methods
Finding a pinnacle is crucial, but making sure it's not merely the local maximum, and is in fact the best possible design, is tough. To do this, we often need to completely re-evaluate every little decision we've made and every control we've used.
As of now, we seem to be ranking fairly well with a diversified SEO strategy. We are always looking for a better design rather than settling on a local maximum.
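To make the "stuck on a local maximum" idea concrete, here is a toy sketch (purely illustrative, not any real SEO tool; the quality curve is invented): greedy one-tweak-at-a-time optimization climbs the nearest hill and stops there, even when a taller hill exists elsewhere.

```python
# Toy illustration of a local maximum: incremental tweaking gets stuck
# on the nearest peak of a made-up "design quality" curve.

def score(x):
    """Invented quality curve with a small peak near x=2 and a taller one near x=8."""
    return max(0.0, 1 - (x - 2) ** 2 / 4) + max(0.0, 2 - (x - 8) ** 2 / 2)

def hill_climb(x, step=0.1, iters=1000):
    """Accept any small tweak that improves the score; stop when none does."""
    for _ in range(iters):
        best = max((x - step, x, x + step), key=score)
        if best == x:
            break  # no neighboring tweak helps: a (possibly local) maximum
        x = best
    return x

local = hill_climb(0.0)  # starting near the small hill, we get stuck on it
```

Starting at 0, the climber settles near x = 2 with a score of about 1, never discovering the peak near x = 8 whose score is about 2: no sequence of small tweaks crosses the valley between them, which is why escaping a local maximum requires re-evaluating the design wholesale.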
Testing At The Neovora Offices
This is how we remain at the top of our game and target the keywords in the markets we do. Currently we have a few interesting projects going on: a regional dental branch across many markets and a realtor in Central Texas, to name a couple.
We'll keep testing to identify whether we are stuck in the cul-de-sac of results: a local maximum. Hopefully we can climb out and reach our true potential; there is nothing more frustrating or scary than failing to do so. As a Bridgeport SEO company, we understand the need to reach the pinnacle.
If you are wondering about our rankings, we've been careful to stay ahead of the curve with the recent Penguin updates. We'll see if our tests reveal anything.