Better SEO at JTCG (3 of 3… for now)

We wanted to share a little more about our general and specific processes for search engine optimization, particularly because Google itself advises against trusting any search engine optimizer who won’t explain their methods.  (Not that we don’t have a few trade secrets!)

Stage one analysis includes (but isn’t limited to):

Meta Data

  • Picking the right keywords for every page, with a deep statistical analysis of search volume, competition, and each word’s relevance to your page.
  • The same for your page’s title, including its uniqueness within your site.
  • Use of “alt” attributes for all images and “title” attributes for anchor links
  • The length of your meta description, its appearance in search results, its phrasing, and its inclusion of keywords that also appear on the page.
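To make this concrete, here is a minimal sketch of what these elements look like in a page’s markup (the business name, file names, and description text are placeholders, not from a real site):

```html
<head>
  <!-- Unique, keyword-relevant title for this page -->
  <title>Custom Web Design in San Diego | Example Co.</title>
  <!-- Meta description: concise, phrased for searchers, echoing on-page keywords -->
  <meta name="description" content="Example Co. builds custom, standards-compliant websites for San Diego businesses.">
</head>
<body>
  <!-- Descriptive alt text on images, title text on anchor links -->
  <img src="portfolio-thumb.jpg" alt="Screenshot of a responsive portfolio website">
  <a href="/services" title="Our web design services">Services</a>
</body>
```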

Content Analysis – Some of this is already handled during the design phase if you have Jacob Tyler create your site.

  • Internal linking structure
      • Where are links placed on a page?
      • What are they linking to – other pages on your site, or external links to helpful sites and authority sources?
      • What is the link text?
  • Spelling
  • Grammar
  • Four clicks or fewer to get to anything on the site
  • Call to action
  • Presentation and layout
  • Use of heading text, structure, and page placement
  • Use of bold, underlined, and italicized text and their placement on a page

Our in-depth Search Engine Optimization service also covers many other areas, including (but not limited to) the following:


  • Page Doctype specification
  • W3C standards compliance using the newest available technologies
  • Error-free JavaScript
  • Error-free CSS
  • Error-free XHTML
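The doctype, for instance, is a single declaration on the very first line of every page; it tells browsers and validators which rules the markup follows. Compare the modern HTML5 form with an older XHTML one:

```html
<!-- HTML5 -->
<!DOCTYPE html>

<!-- XHTML 1.0 Strict -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
```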

Domain Setup

  • URL structure
  • Is your URL even right for you?
  • Should you use a personalized nameserver?

Architecture & Server Setup

  • Where JavaScript and CSS files are included in a page
  • Various kinds of page and file compression methods
  • Apache Configuration
  • Use of 404 error pages
  • Internal redirection
  • Server requests and lookups
  • Number of documents on a site
  • Number of files on a page
  • Page load times
  • Use of a Content Delivery Network service to reduce network hops
  • A few other tricks up our sleeve…
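As a sketch of what some of this tuning looks like in practice, here is a hypothetical Apache configuration fragment (the file paths are illustrative, and each directive depends on the matching Apache module being loaded):

```apache
# Compress text-based responses before sending them (requires mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Serve a friendly custom 404 page instead of the server default
ErrorDocument 404 /404.html

# Permanently redirect an old URL to its new home (internal redirection)
Redirect 301 /old-page.html /new-page/
```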

Machine-Readable Information

  • Usage of robots.txt for search engines, dynamic where applicable
  • Dynamic (where applicable) XML Sitemaps
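A typical static robots.txt looks like the sketch below: it tells well-behaved crawlers which paths to skip and where to find the XML sitemap (the domain and paths here are placeholders):

```text
User-agent: *
Disallow: /admin/
Disallow: /tmp/

Sitemap: http://www.example.com/sitemap.xml
```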

Security – Because we’re digging into the nitty-gritty anyway…

  • Hiding the site’s underlying technology – techniques such as removing the “.html” or “.php” extension from your URLs
  • Hiding the CMS platform (when one is used) by removing identifying headers and meta tags
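One common way to drop a “.php” extension from public URLs on an Apache server is a rewrite rule along these lines (a sketch only; it assumes mod_rewrite for the rewriting and mod_headers for the header line):

```apache
RewriteEngine On
# If the requested path plus ".php" is a real file, serve that file
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^([^.]+)$ $1.php [L]

# Stop advertising the scripting language in responses (requires mod_headers)
Header unset X-Powered-By
```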
