In online marketing, a lot of terms are thrown around. Some are easier to understand than others, and some are used interchangeably when they don’t mean the same thing. In search engine optimization (SEO), two common phrases – search engine friendly and search engine optimized – are often used in place of each other, but they have similar, yet distinct, meanings.
Whether you’re hiring us here at Sachs Marketing Group or looking to handle your SEO efforts on your own, you must understand what these terms mean and how to make both of them work as part of your strategy.
Overview
What is Search Engine Friendly?
Search engine friendly, or SEF, refers to building your website on a solid foundation that makes it easy for the search engines to read the code and crawl it for optimal indexing. It focuses on:
Website construction: Today, many content management systems (CMS), including WordPress, are built as search engine friendly solutions. They keep your most important elements in HTML format and avoid Flash, Java applets, and other non-text elements, which are generally devalued or, worse, ignored completely by the robot crawlers in charge of discovering and indexing your website.
Each page having unique content: This focuses on keeping all pages uniquely named and structured to avoid confusing the search engines. Each page should have unique content, whether it’s just a paragraph or two or thousands of words, depending on the needs and context of the page.
Title and description tags: SEF ensures that each page has unique title and description tags that are relevant to the content on that page.
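As a minimal sketch of what that looks like in a page’s markup (the domain, page topic, and wording here are hypothetical placeholders, not values from this article):

```html
<!-- Hypothetical example of a unique, page-specific title and description. -->
<head>
  <title>Blue Widgets for Small Gardens | Example Widget Co.</title>
  <meta name="description" content="Shop durable blue widgets sized for small gardens, with free shipping on orders over $50.">
</head>
```

Every page on the site would get its own version of these two tags, written to describe that page alone.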
Readable URLs: The search engines clearly need to see the content to list the pages in their database, but they must also be able to see the links to find the content in the first place. That’s why a crawlable link structure is vital. Many websites structure their navigation in a way the bots cannot access, making it harder for those pages to be listed in the index.
What kinds of things can stop the search engines from being able to read a link?
- Submission-required forms: If users must complete an online form before getting access to certain content, the search engines will likely never see those pages. For some purposes, this is okay, because you don’t necessarily want those pages indexed.
- Unparsable JavaScript: If you’re using JavaScript for your links, the search engines will either give them little weight or ignore them altogether. If you want them to be crawled, replace the JavaScript links with standard HTML anchors (see the first example after this list).
- Robots.txt: This file lets you restrict which files on your website the crawlers can access. If pages on your website link to pages that are blocked by either robots.txt or a meta robots tag, those links won’t be counted, since the bots stop their crawl when they hit the block (see the second example after this list).
- Relying on search boxes: While you should have a search box on your website to make it easier for your users to find information they’re looking for, you cannot assume the search engine bots will use it to find everything on your site.
- Links embedded in plug-ins: The bots focus on text, so any links embedded within Java, Flash, or other plug-ins won’t be seen, crawled, or indexed, which means users will never find those pages through a query.
- Links on pages with hundreds (or more) links: The search engine bots will only crawl so many links on a page. They do this to avoid spam and keep rankings as user-friendly as possible. If you have a page with hundreds or thousands of links, you likely won’t see all of them crawled and indexed.
- Frames/iFrames: Links in both are crawlable, but the structure makes it difficult for the bots to organize and follow them. Unless you’re a highly skilled developer who understands how the bots index and follow links embedded in frames, avoid them.
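To illustrate the JavaScript point above, here is a hedged, hypothetical sketch of the difference between a script-driven “link” and a plain HTML anchor (the /services path is made up for the example):

```html
<!-- A JavaScript-driven "link" like this is hard for crawlers to discover and follow: -->
<span onclick="window.location='/services';">Our Services</span>

<!-- A standard HTML anchor gives the bots a crawlable path to the same page: -->
<a href="/services">Our Services</a>
```

Both look the same to a visitor, but only the second one hands the search engines a URL they can reliably queue for crawling.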
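And for the robots point above, a minimal sketch of the two blocking mechanisms mentioned, with a hypothetical /private/ directory as the blocked area:

```html
<!-- A meta robots tag placed in a page's <head> asks crawlers not to index
     the page or follow its links: -->
<meta name="robots" content="noindex, nofollow">

<!-- The robots.txt counterpart blocks crawling of a whole directory:
     User-agent: *
     Disallow: /private/ -->
```

Links pointing into pages blocked this way are effectively dead ends for the bots, which is exactly why they don’t pass along any value.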
Canonical tags: Canonical tags are similar to a 301 redirect, but rather than actually redirecting visitors to a new URL, you’re telling the search engines that multiple pages should be treated as one. The 301 redirect sends all traffic, humans and bots alike, to the single URL, works across domains, and offers a much stronger signal that multiple pages share one source.
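A minimal sketch of a canonical tag, assuming a hypothetical case where the same page is reachable at a tracking-parameter URL and a clean URL (the URLs follow the placeholder domain used later in this article):

```html
<!-- Placed in the <head> of the duplicate version,
     e.g. http://yourdomain.com/page?ref=newsletter -->
<link rel="canonical" href="http://yourdomain.com/page">
```

Visitors who land on the parameter URL still see that page; the tag only tells the search engines which version to treat as the original.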
Current algorithm guidelines: SEF platforms should adhere to most of the currently published search engine guidelines. Many of the guidelines in use today were established years ago and are easy to implement at the code level.
SEF is largely a one-time process, handled when a website is first set up. There are hundreds of elements involved, but once you’ve built a search engine friendly website, there isn’t much more for you to do. You may have to revisit it later if there’s a forced change to the system you’re using, but otherwise, once it’s done, it’s done.
If you want to get an idea of how Google’s indexing bots see your website, look at the cached version of your website using a tool like Cached View. Compare that view to how your website looks in the browser, and you’ll see what is indexable. The Google bots crawl the web and take snapshots of each page to store as a backup, should a page become unavailable. If your site ever goes down temporarily, visitors can still reach it through the cache. The cache may take a few days to update, depending on how often your website is updated.
What is Search Engine Optimized?
Search engine optimized (SEO) is an ongoing process, because more content gets added to your site on a regular basis. There will always be more keywords to rank for, additional content to create, more links to go after, better rankings to achieve, more traffic to bring in, and more conversions to win. It will never be done. It focuses on:
- Site messaging: Rather than treating text as a placeholder on each page, optimization focuses on using keywords appropriately to signal to the search engines that the page is on topic, while also delivering the right message to the audience.
- Optimizing content for rankings and conversions: This includes the use of keywords, but also calls to action and, ultimately, creating relevant content that site visitors find useful and helpful.
- Optimizing titles and descriptions to drive clickthroughs: This process uses keywords appropriately to encourage users to click through to the actual page. These are written to match the user’s need and intent.
- URLs built to follow the best navigation paths for usability: URLs should mimic the navigation and breadcrumb paths on the site. Rather than http://yourdomain.com/page, it should look more like http://yourdomain.com/navigation/subnavigation/page. Keeping the URLs in line with the navigation makes it clear to users and search engines alike where each page sits within the site.
- Eliminating issues with duplicate content: Optimized sites go beyond band-aid fixes that simply point the search engines to the correct content. As far as the CMS will allow, they eliminate duplicate content issues altogether, rather than sending signals you have to hope the bots will follow.
- Future algorithm guidelines: An optimized site, on the other hand, considers more than what the search engines are looking at today. By focusing on the future, staying ahead of things like spam filters, and providing genuinely valuable content for visitors, optimized sites go beyond quick-fix loophole tactics designed to earn rankings. Any time there is a major algorithm change, a number of sites get hit hard and spend months, if not years, rebuilding what they lost. If your site survives a major algorithm change like Panda with little to no negative change in rankings, you know you’re on the right track.
Because your website is never fully and finally optimized, there are a number of tools available to help you see how you’re doing in terms of keyword rankings and the number of backlinks you have.
Using Both to Create a Stellar Web Experience
You can’t have SEO without SEF. If the foundation of your website isn’t built with the search engines in mind, there’s no sense in optimizing your content for them. Start with a basic structure built to be SEF, like what you’d find in WordPress. Then move on to the various stages of optimization, according to the needs of your website.
If you’ve already established a website and you’re not sure how well it fits the definition of friendly or optimized, it’s time to do an audit. It will show you the changes you need to make to improve your website and guide you through the process.
Begin by defining goals. Check your Google Analytics account and look at what the data has to offer. If you haven’t already, sign up for Webmaster Tools to get additional data that Analytics doesn’t offer.
After this – it’s time to start the audit process:
- Website crawl: Check in Google to see what it sees compared to what’s actually on your website. In the Google search box, type: “site:yourdomain.com”. You can use a tool like Screaming Frog to get a deeper crawl and export everything into a spreadsheet so you can analyze your website’s current state. From there, you’ll see more information about page errors, links, and more.
- Site Speed: Use the Google PageSpeed Insights tool to see how quickly your website loads on mobile and desktop. Pingdom can also help you see where the issues with site speed are, and how to fix them.
- Domain: Checking both the www and non-www versions of the domain in tools like Archive.org and whois.com can give you an idea of what the site used to look like, whether or not there were subdomains, and more, to help you understand the kind of domain authority you’ve built up based on what it was used for in the past.
- Website information: Check the site with BuiltWith to learn more about what the site was built with, if you don’t already know for sure. This lets you know whether you’re on the right track with the platform you’ve built the site on.
- Site structure/architecture: How many clicks does it take the user to get to where they need to go? Is there anything you can do to improve the process, thus improving usability? Is everything logically organized? If not, take steps to organize it accordingly.
- File and URL names: Is everything readable for the visitor? Google ignores everything after the # in a URL, so keep that in mind.
- Key performance indicators (KPIs): Are there KPIs in place? Goals, engagement, sales, ranking, domain authority – whatever they may be, these are vital to know so you can work on making improvements in the order of priority.
- Keywords: What keywords are you targeting? Which keywords are you ranking for? Use tools like SEMrush to get some keyword insight. You may discover keywords that are easier to rank for because there are fewer overall results and less competition. With the keyword information in mind, move on to content adjustments.
- Content: Adjust if necessary to be sure the keywords are used appropriately in the tags and on-page content. If the content was written for the search engines rather than visitors, look for ways to edit it so it provides real value to your readers.
- Duplicate Content: Copyscape is a great tool for finding variations of your website’s content. Though you can also search for your content with quotes around it in Google, it’s a bit more of a painstaking process. When you find it, use canonical tags or 301 redirects to fix the issue as necessary.
- Meta: Check title tags and meta descriptions for character length and proper wording. Keep titles limited to 70 characters and descriptions limited to 160.
- Images: Check to make sure there are no broken links to the actual image files. Check that all images are properly compressed; if not, use tools like PicResize and TinyJPG to compress them for faster load times. Check ALT tags for optimal descriptions. Check image links – WordPress automatically links to the image file – and you may or may not want this, depending on whether the images will be useful in the search results (see the sample markup after this checklist).
- Forms: Are your forms properly set up and operational? These are often essential for conversions.
- Links: Check all links on the site, internal and external. Check for optimal structure, and make sure none are broken.
- Social Signals: Do you have social profiles attached to the website? If not, get to work. Social signals play a role in overall ranking, so it’s critical to develop a social presence in the places where your audience is active.
- Citations: If you’re a local business, citations in reviews and directories are important. Check for and claim listings on Google My Business, Yelp, TripAdvisor, and more.
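For the image checks above, here is a small, hypothetical sketch of what an audit-friendly image reference looks like (the file name, dimensions, and ALT text are invented for illustration):

```html
<!-- A compressed file, a descriptive ALT attribute, and no automatic link
     wrapping the image unless you want the image file itself surfacing in search results. -->
<img src="/images/blue-widget-800px.jpg"
     alt="Blue garden widget installed along a fence"
     width="800" height="533">
```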
SEF and SEO Create Magic
When you start with an SEF website structure, then build and optimize everything else, you’ll be well on your way to great rankings.
Have something else to add? Share it in the comments.
Photo credit: Adobe Stock