SEO and Issues Relating to Page Load Speed

Is Your Site Speed Adequate?

It seems that web design has come full circle. Not too long ago, web pages were designed to be compatible with slow dial-up modems and needed to load quickly. A good design therefore kept images and code to a minimum so a user could access the page easily over a modem connection. With the advent of broadband connections, page speed became less of a concern, and developers were more at liberty to create pages with fancy animations, graphics, transitions, and effects, all of which required substantial bandwidth due to the image-heavy and code-heavy requirements of this type of design. But then came mobile, with bandwidth limitations often similar to those of the dial-up connections of yesteryear. Google now includes page load speed as a factor in its search rankings, and that has received a lot of attention, with many SEOs saying that most sites do not have to worry about it. However, this advice ignores usability factors such as bounce rate and rate of page abandonment, and little attention has been given to the negative consequences in search rankings for pages with high rates of abandonment.

Google now factors in page load speed as one of its many considerations in determining the quality of a site to be listed in its search results. Google is not too picky about this, however, as only 5% of pages are claimed to be affected by this consideration. So as long as your site is in the top 95% of pages on the web with regard to page load time, the page speed factor will likely not be of consequence to you. A Google Page Speed tool is available online so that you may see how your page stacks up against the rest of the web. The results are given as a score from 0 to 100, with any score over 5 (representing the slowest 5%) considered acceptable.

The fact that only 5% of web pages are affected by the page speed ranking factor has caused many SEO professionals, while acknowledging the existence of page speed as a factor in rankings, to then advise clients that it is not something with which the client should be concerned. This approach, however, ignores other factors relevant to both users and Google itself, and is demonstrative of a tendency within much of the SEO community to focus on only one or a few factors when advising clients as opposed to considering the big picture.

The problem is twofold. First, a page that loads slowly will discourage users from using the page. A large number of users leaving the page and returning to the search results is known as the bounce rate or rate of abandonment (the opposite is known as stickiness, when a user “sticks” on a site and clicks through multiple pages). Second, Google considers bounce rates and rates of abandonment in its search rankings.

So let’s take the example of a page that scores a 15 on the page speed tool. An untrained SEO provider might state that this is acceptable. However, what this means is that the page in question, while acceptably exceeding Google’s minimum expectations, is still slower than 85% of the web pages out there. The page is frustratingly slow, Google’s opinion notwithstanding. Now consider the fact that by some estimates over 50% of all search traffic now comes from mobile devices. What do you do when you are on a mobile device and it takes five, ten, or twenty seconds to load a page? Most people return to the search results and try again.

The effect of this is that the site owner is losing a large portion of her potential customers. That is a problem. Focusing on rankings only without respect to usability and conversions is a topic for another article. Suffice it to say, however, that if somewhere around 50% of your users are on mobile and are never making it to your site because of your slow page load speed you have a problem.

So the site owner takes an immediate hit in the bottom line as a result of a high abandonment or bounce rate.

Now this is where the snowball effect comes into play. Google also tracks user behavior after the user leaves the search results and lands on a page. If the user bounces off that page, or quickly abandons it by clicking the “Back” button on the browser, then Google knows that. How would you interpret this if you were Google? If a user finds a search result, goes to a page, and then quickly returns, the most logical interpretation is that the page did not offer information relevant to the user’s search query. So Google takes that into account, as it is in the business of providing relevant results. The slow-to-load page is given a sort of relevance demerit, and its search rankings suffer.

Also consider the fact that Google looks out for nobody but Google. It has obligations to its shareholders. If Google got a reputation for providing results full of slow-to-load pages, Google users would defect and find an alternative that provided them with speed. Google can’t allow that to happen. To argue that page speed is irrelevant to search, or a minor factor in search, is to argue that user behavior is irrelevant to Google’s business model.

So the snowball effect of a page with a low but otherwise acceptable page load speed is this: a slow page load speed will result in higher bounce and abandonment rates. Higher bounce and abandonment rates are interpreted by Google as both (1) a sign that the page is not relevant to the given search query; and (2) a threat to its business model. Therefore, the slow page suffers in the rankings.

By: Matt Foster. Mr. Foster is an SEO consultant and the CEO of ArteWorks SEO in Austin, Texas. Mr. Foster can be found on Twitter @ArteWorks_SEO or on LinkedIn at /arteworks.

SEO for Parallax

Unimpressed with Parallax SEO?

Providing SEO for a site utilizing the parallax scrolling effect may at first seem a bit challenging, given the fact that on its face a parallax site does not offer the opportunity for deep links or individually optimized page content. However, there is no need to be unimpressed – there is a solution! Here’s a hint: treat it like a Flash site.

What is a Parallax Site?

The parallax effect is typically achieved using JavaScript or jQuery to create a scrolling or 3D effect when jumping between named anchors on a single web page. You can see some examples of sites created using the parallax effect here: http://webdesignledger.com/inspiration/21-examples-of-parallax-scrolling-in-web-design

The navigational scheme is such that when navigating about the site, instead of going from one individual URL to another for each page, one long web page is created, which contains the entire contents of the site. Named anchors are used to jump or scroll between “pages.” When the user navigates about the site, instead of a new page loading on the screen, the new page scrolls, slides, or “whooshes” in from above, below, or the side, depending upon where the user is within the one-page site’s content and which named anchor the user is navigating to.
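Under the hood, the effect is usually just named anchors plus scripted scrolling. Here is a minimal sketch of the navigation skeleton (the section names and markup are illustrative, not taken from any particular site; in practice a jQuery plugin or parallax library would also animate the layered background motion):

```html
<!-- One long page; each "page" is a section with its own anchor id -->
<nav>
  <a href="#home">Home</a>
  <a href="#events">Events</a>
  <a href="#contact">Contact</a>
</nav>

<section id="home">…home content…</section>
<section id="events">…events content…</section>
<section id="contact">…contact content…</section>

<script>
// Instead of loading a new URL, intercept each click and
// animate the scroll to the target section.
document.querySelectorAll('nav a').forEach(function (link) {
  link.addEventListener('click', function (e) {
    e.preventDefault();
    var target = document.querySelector(link.getAttribute('href'));
    target.scrollIntoView({ behavior: 'smooth' });
  });
});
</script>
```

Because every “page” lives at the same URL, the search engines see one document, which is the root of the SEO problem described below.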

So a site created using the parallax effect is an entire site written as one big, fat, giant web page file, graphics and all. The negative SEO implications of this are obvious, and mainly center around (a) the inability to deep link to individual pages of a site using a unique root-level URL; and (b) the difficulty in optimizing basic on-page elements such as title tags for individual page content.

Background

We were contacted by a potential client who owned a children’s entertainment center. Think of something along the lines of laser tag, or a bowling alley, or a pizza arcade, or a go cart track. The client had determined to build the site using the parallax effect, and wanted to rank well in the search engines for a variety of keyphrases pertaining to specific events that might be searched for on Google. For example, the client wanted to rank well for such things as birthday parties, bar mitzvahs, quinceaneras and the like.

Normally this would be accomplished by creating separately optimized pages for each of the specific events (title tags and such), describing the packages available for each event, and deep linking using appropriate anchor text to each of the pages. With the parallax site, this was simply not possible.

The Solution

I began to think of solutions to the problem and I found myself treating the parallax page as akin to a Flash site. There’s not much one can typically do with a Flash site – except one thing. Typically I would advise the owner of a Flash site to create a machine-readable HTML version, both for the search engines and for purposes of accessibility, as well as for those users who may not have the Flash plugin. This would not be considered duplicate content and would not infringe upon any of Google’s guidelines.

In the case of a parallax site though, there would be a duplicate content problem if we created separate “landing” pages for each of the children’s events, as the landing pages would be duplicative of the event content on the parallax page.

So the solution that I came up with was to organize the site content into two separate types of content: (1) Content important to the user but not likely to be searched for; and (2) Content important to the user and highly likely to be searched for.

Examples of content in the first category, that a user would be unlikely to search for yet which would be important information for the user would be such things as a contact form, maps, photo galleries, catering menu, testimonials, and the like. It is highly unlikely that someone is going to search for “laser tag testimonials”.

Examples of content in the second category, that would be both highly likely to be searched for as well as information useful to the user would be the specific events, like birthdays, bar mitzvahs, and such. People search for things like “laser tag birthdays” or even the more general “Austin birthday parties”.

By drawing the distinction between the two types of content, we can then begin to organize our SEO strategy and incorporate it into the site build. We decided to use the parallax effect on the home page and on all general information-type pages, such as contact, gallery, menu, testimonials, map, etcetera. We included in the parallax page a section called “Events”.

When the user eventually scrolled his or her way through to the Events page, we presented the user with a list of events typically hosted by the facility. Examples would be those given above, such as birthday parties, team parties, church parties, bar mitzvahs, quinceaneras, and such. Each item on the list was a hard link to an individual URL which contained content specific to that event (keyphrase), including optimized on page elements such as title tags, and customized content appropriate to that event.
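In outline, the Events section looked something like this (the URLs and markup here are hypothetical, for illustration only):

```html
<!-- Events section of the parallax page: each item is a normal
     ("hard") link to its own crawlable URL, not a named anchor -->
<section id="events">
  <h2>Events</h2>
  <ul>
    <li><a href="/birthday-parties">Birthday Parties</a></li>
    <li><a href="/team-parties">Team Parties</a></li>
    <li><a href="/bar-mitzvahs">Bar Mitzvahs</a></li>
    <li><a href="/quinceaneras">Quinceaneras</a></li>
  </ul>
</section>
```

The key design choice is that these links use real `href` URLs rather than `#` anchors, so each event page is individually crawlable, linkable, and optimizable.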

The net result is that the user starts off on the parallax site, clicks through to a particular event, and lands on a static, or “hard” page, for that event. When clicking back from the event, onto any of the navigation buttons (Home, Contact, etc.), the user is returned to the parallax page and the scrolling effect can begin anew.

The creation of the “hard” event pages allows for individualized SEO for all necessary on page elements, customized content, and deep linking to the site.

Problem solved!

By: Matt Foster

Matt Foster is the CEO of ArteWorks SEO Austin, an Internet marketing and search engine optimization firm located in beautiful Central Texas. For more information, please visit www.arteworks.biz.

New Year’s Resolution – Lose That (Website) Fat!

It’s a new year and a great time to embark on a quest to lose all that extra fat that we have put on over the years. I’m talking about website fat, of course. All of that bloated code, deprecated HTML tags, tables, JavaScript, jQuery, Flash, images and such. The new trend in both web design and search is to get lean – less is more! Not only does leaner code (a faster-loading website) improve search engine rankings (the importance of speed in search engine rankings is no longer merely a topic of debate, it is accepted fact among knowledgeable SEO providers), but it also dramatically improves the user experience (in other words, increases conversions, stickiness, and return visits) for the vast numbers of mobile users. If you aren’t willing to cut the fat from your site to create a speedy and enjoyable experience for mobile users, you could be missing out on over half (yes, upwards of 50% of all web searches are now conducted on mobile devices) of your customers.

(As an aside, at the time of writing in January of 2013 I am well aware that our own company’s website is in dire need of a weight loss regimen in accordance with the recommendations in this article. Fear not, as I am not only willing to prescribe the medicine, but I am willing to take it too! Our new website is currently under construction and is anticipated to be launched next month.)

Modern user and search engine trends require that most websites over two years old be reworked to use HTML5 and CSS3, an approach known as responsive web design. The use of these new tools eliminates fat by dispensing with the need for JavaScript, jQuery, Flash, and many images, as well as creating leaner HTML code and reducing HTTP requests to the server.

Through the use of what is called a “media query”, a website can adapt (or respond, hence the term “responsive” web design) to any number of screen resolutions or screen sizes, from the smallest mobile device to the largest flat-screen display. Media queries detect characteristics of the device, most importantly the viewport size (basically, the screen size), and then tell the site how to display itself to automatically fit the device it is being viewed on. Media queries can also be utilized to serve different file sizes of the same image, from low-resolution (small file size) images for smaller viewports on handheld devices with low bandwidth, up to high-resolution, larger files for HD or “retina” display devices with large screens and a broadband connection. This ensures that the user will get the best possible viewing experience regardless of the device utilized to view the site.
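A media query is simply a conditional block in the stylesheet. A minimal sketch, with illustrative class names, breakpoints, and image paths:

```css
/* Base styles: small screens get a fluid, full-width layout */
.container { width: 100%; }

/* Wider viewports get a fixed-width, centered layout instead */
@media screen and (min-width: 768px) {
  .container { width: 750px; margin: 0 auto; }
}

/* Serve a larger background image only to high-resolution
   ("retina") displays, saving bandwidth everywhere else */
@media (-webkit-min-device-pixel-ratio: 2), (min-resolution: 192dpi) {
  .hero { background-image: url("hero@2x.jpg"); }
}
```

The browser evaluates each query itself, so no scripting is needed to switch layouts as the viewport changes.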

HTML5 and CSS3 also eliminate the need for many images (and therefore HTTP requests to the server and image file downloads) by enabling the site to display gradients, navigation buttons, backgrounds, rounded corners, text effects such as drop shadows, divider lines and the like through the use of a few lines of code. In other words, there is no need to create and download actual images; the code and stylesheet can now tell the browser to draw the image or effect itself! Also, the need for Flash or JavaScript/jQuery transitions and animations can largely be eliminated via this same technique. This cuts tons of website fat and improves download speed and response time. Forms, too, are no longer dependent on JavaScript/jQuery or other scripting, as HTML5 supports much form functionality natively.
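For example, a navigation button that once required several image files can be drawn entirely in CSS3 (the selector and values below are illustrative):

```css
/* A button drawn by the browser itself: no image downloads,
   no extra HTTP requests */
.button {
  background: linear-gradient(#4a90d9, #2a6db5); /* gradient fill    */
  border-radius: 8px;                            /* rounded corners  */
  box-shadow: 0 2px 4px rgba(0, 0, 0, 0.3);      /* drop shadow      */
  text-shadow: 0 1px 1px rgba(0, 0, 0, 0.4);     /* text effect      */
  color: #fff;
  padding: 10px 20px;
}
```

Each of those effects would previously have meant a sliced image or a script; here they are a handful of stylesheet lines.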

Also, through the addition of a few lines of code, all of these new features can be made backward compatible for older browsers that do not support HTML5 or CSS3.
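One widely used technique at the time of writing is the html5shiv script, loaded inside an Internet Explorer conditional comment so that pre-IE9 browsers can recognize and style the new HTML5 elements (the script path here is illustrative):

```html
<!-- Older IE (before version 9) ignores unknown elements such as
     <section> and <nav>; html5shiv teaches it to style them.
     Modern browsers skip this conditional comment entirely. -->
<!--[if lt IE 9]>
  <script src="html5shiv.js"></script>
<![endif]-->
```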

With the increasing use of limited bandwidth mobile devices, the above techniques greatly reduce site load times. This, in turn, improves a site’s performance on the search engines as well. Therefore, I would strongly recommend getting a website checkup today so as to maximize your site’s customer reach and SEO performance. Happy new year, and good luck with your new website diet!

About the Author: Matt Foster is the founder of ArteWorks SEO, a web design and SEO firm based in Austin, Texas. For more information, please visit http://www.arteworks.biz.

Will Google Glasses Forever Change How We Look at Computers?

Google is planning to release a high-tech set of display glasses by the end of 2012. These glasses are a revolutionary concept in computers, and will take the idea of mobile computing one step further. The glasses are rumored to have an outward-facing camera, whose job is to record your movements and gestures. Supposedly, the technology will also allow your hand to reach out like a computer mouse and control the glasses’ navigation. The best part about these glasses is that the lenses are clear, which allows you to stay in touch with the rest of the world, unlike traditional tablets, which can become all-consuming. The glasses will be Android-based, and will have Flash and voice input capabilities.

Sources say the glasses will be priced similar to a smartphone, and could cost anywhere from $150 to $600, with some sources saying closer to $250 for a pair. The glasses may look a little geeky in their first versions, but will supposedly look similar to a pair of Oakley’s Thump MP3 player glasses. While the lenses are transparent, the heads-up display is only for one eye, and will supposedly not even take up the whole field of view.

Google is still unsure if it will find a market for this product. The glasses are considered a pet project, to see if the world is ready for a new form of computing. The idea was thanks in part to Google’s ‘Google X’ laboratory, where they trial experimental, “out there” ideas. Google has always allowed its engineers to devote 20 percent of their time to these experimental projects, and has tested out products as wacky as talking refrigerators and robot workers. There is even a rumored self-driving vehicle in the works that could change the future of transportation. Google proudly admits that it has invested quite a bit of money in very speculative research and development projects, some of which have been more successful than others.

If Google can find a market for their Google Glasses, they could potentially change everything we know, from marketing and advertising, to the way we do business and operate in our personal lives. Adding a fully functioning computer to our already expanding human capacity could elevate our society to a completely new level. Buyers, however, will be the ones to accept or reject this technology, and it may prove to be only a first step in introducing a revolutionary product from Google. As a top 5 search engine optimization company, ArteWorks is excited to see what kind of response this kind of technology will receive.

Matt Foster is the CEO of ArteWorks SEO, a full service Internet marketing firm, who has been active in the SEO industry since 1995. Mr. Foster can be found on Twitter @ArteWorks_SEO. ArteWorks SEO can be found at www.arteworks.biz.

Tricking Google is a No No

Black Hat SEO

While some have tried to figure out or manipulate Google’s search algorithm, others have referenced Google itself and have used the many tools available to optimize and enhance their sites. Free tools such as Google Analytics or Webmaster Tools are invaluable and should not be looked down on because they are F-R-to the double E.

People think they have SEO figured out because they took a one-hour webinar or read an article about how to become an SEO guru. I’m sorry friends, but you just don’t have it. There is more to it than listening to a guru. Do your research, practice, live and breathe SEO. But always refer back to the search engine itself. Google tells us what it is looking for and what it frowns upon. Don’t waste your time trying to pull one over on Google because it is just too smart for that. Tricking Google is a no no, and it’s time you learned a lesson.

Here are some of the quality guidelines, and things that you should not do, that Google has specifically outlined for you to improve your site’s ranking on its search engine. For more detailed information from Google’s Webmaster Guidelines please go here: http://www.google.com/support/webmasters/bin/answer.py?answer=35769#3.

Here is a condensed version:

The first basic principle that has been said over and over is to make your site for your users and not the search engines. “Cloaking” – presenting information to the search engines that is different from what your users see – is frowned upon. If your users can interpret and easily navigate your pages, so can the search engines. Keep in mind that you will need to fully optimize title tags and meta descriptions, as well as other basic “on-page” search engine optimization elements, for the search engines to see.

The second basic principle is to not trick Google. Trust me, it is a matter of time before you will get caught and you will pay for your trickery. Some tricks that people actually think help them are: creating domains with duplicate content, using hidden or invisible text, using hidden links, and/or throwing up a bunch of keywords that have no relevancy to your site. These tricks won’t cut it.

This should go without saying, but too many people just don’t want to listen…do not participate in link schemes! Link exchange programs or link farming is not respected by Google and it is not what it wants to see. Links should be built naturally and you shouldn’t have to build or obtain links by doing something sneaky. If you are participating in a link scheme you will likely be linking to bad neighborhoods that will hurt your position on the SERPs. If you have good content on your site and/or blog posts, people will link to you. The links will come!

Finally, do not fall for the myth that SEO is a one-time thing. On-page changes or modifications may only need to happen once, but ongoing blogging and social media optimization are what matter most in the long run.

About the Author: Krystle Green is the Vice President of ArteWorks SEO, a full service search engine optimization firm located in Austin, TX. For more information about SEO, SEM, or social media please visit http://www.arteworks.biz.