Wow oh wow, the SEO revolution has just begun!
Before we get to our revolutionary SEO tool – Sage, to keep things simple here is a quick explanation of how SEO works and why corporations pay thousands every month to get their website to the top of Google and other search engines.
Using SEO to Get Better Results in the SERPs
Being at the top of the Google search engine results page (SERP) for keyword searches linked to your website can generate up to around 80% of all search engine traffic to your domain, which is why good SEO is important.
It is down to your web designer to create a good customer experience, and down to you to close the deal and provide the product or service the client requires.
With the latest algorithm updates from Google, such as Panda and Penguin, your website now needs to be compatible with mobile, tablet and PC, as Google recognizes that even when a computer is available, people prefer to search for what they are looking for on a phone or other mobile device, so an SEO-optimized, mobile-friendly website is essential.
SEO-Friendly URLs
Working alongside the web designer, an SEO consultant provides detailed information on how the URL permalink structure should be set out.
URLs should be search engine friendly: create permalinks that search engines can crawl and read with ease, and that contain the keywords for that page's topic – for example your-domain.com/mens-fashion/mens-yellow-polo-shirt/ rather than your-domain.com/?p=123.
The website should use canonical URLs and a 301 redirect to a single domain. For example, Google can read both 'www.your-domain.com' and 'your-domain.com', treat them as two websites, and assume the content is duplicated, which is bad for SEO.
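As a minimal sketch (assuming the non-www version is the one you want indexed; the domain and path here are placeholders), the canonical tag placed in each page's head looks like this:

```html
<!-- Tells search engines which URL is the master copy of this page -->
<link rel="canonical" href="https://your-domain.com/mens-fashion/mens-yellow-polo-shirt/">
```

The 301 redirect itself is set at the server level (for example in Apache's .htaccess or your host's control panel) so that every request to www.your-domain.com is permanently forwarded to your-domain.com.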
The web designer's main concern is making the website work on all platforms while creating a great user experience for the customer by making it look good and easy to navigate.
When you are creating a business site or blog and you want readers to be able to find it, it is also very important to take other things into consideration, such as media.
Using Descriptive Titles For Your SEO
When you upload an image to your website, make sure you give it a good descriptive title. Uploading images named 001, 002 and so on is not descriptive at all, and the URL associated with the image will show this too.
Another good SEO practice is to include the keyword for the page in the image title, and it is also important to add alt text (alternative text) to your images. Doing this manually, the link and image markup would read something like this, depending on the category slug:
<a href="/mens-fashion/mens-yellow-polo-shirt/"><img src="mens-yellow-polo-shirt.jpg" alt="latest mens fashion yellow polo shirt" title="Latest Mens Yellow Polo Shirt in Yellow"></a>
As well as being search engine friendly, the way the URL is written (as a relative path) also tells the search engine that this is an internal link. We need to create more internal links than external links to help pass what is called link juice; we will create a blog post explaining more on that later.
Using Schema Markup for SEO
Another thing we SEO gurus need to consider is schema markup, which is added to each page to tell Google what the page is, what it is about, and what it contains.
The schema markup tells the search engine everything it needs to know: is it a local business, what services and products does it provide and to which markets, is it an article for a news publication, and so on.
You can test your schema markup with Google's Structured Data Testing Tool. Other search engines such as Yandex and Bing have their own structured data testing tools.
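As a hedged sketch of what such markup can look like (the business name, phone number and address below are made-up placeholders), a minimal LocalBusiness block is added to the page as JSON-LD inside a script tag:

```html
<!-- JSON-LD structured data: tells search engines this page describes a local business -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Energy Healing Centre",
  "url": "https://your-domain.com/",
  "telephone": "+61-0-0000-0000",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Sunshine Coast",
    "addressRegion": "QLD",
    "addressCountry": "AU"
  }
}
</script>
```

Paste the page URL or the markup itself into the testing tool to confirm the search engine reads it the way you intended.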
This blog post from SEO Skeptic provides quite an extensive list of structured data testing tools for other search engines. Having your blog or eCommerce site optimized for all search engines for maximum exposure is a must for any serious business.
The biggest thing to consider alongside SEO is the server where your website is hosted. Servers need to be constantly monitored, kept up to date with the latest patches, and optimized for speed and reliability, with security as a very high priority.
If you are using shared hosting, it is also imperative that there are no adult websites hosted on the same server, as this can result in the server being blacklisted from the search results too.
Monitoring SEO Campaigns
Using analytics and webmaster tools helps us monitor how traffic flows into the site, whether from organic searches, from AdWords or from social media. These tools also tell us the landing pages and the keywords that brought potential customers to the website.
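To feed these reports, the site itself carries a small tracking snippet. As a sketch of Google's standard analytics tag (the measurement ID G-XXXXXXXXXX is a placeholder you replace with your own from your Analytics account):

```html
<!-- Google Analytics tag, placed just before </head> on every page -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>
```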
Not long ago Google removed the ability to see which keywords were used in organic searches and now shows them only for its AdWords campaigns.
Yes, there is a way around this so that you can see the keywords used even in organic search results, and we will cover it another time.
Along with counting visitors to the site, Google collects data such as how many pages are viewed and how long a user stays on the site.
This tells Google how relevant the keyword is to your website and whether people searching for that keyword found your content useful, or found what they were looking for.
Each IP is collected, and Google can tell whether you are a returning visitor or a unique visitor and how you got there. The percentage of visitors who leave after viewing only a single page is what is known as the bounce rate.
If a user comes to your website and leaves immediately, that counts towards a high bounce rate. A 100% bounce rate reflects on your website and can also affect how Google ranks you for that keyword.
The ideal bounce rate I would recommend any website aim for, to show relevance, is around 30-50%, although anything below 80% is not bad; there is just room for improvement. If your website is showing 0%, I suggest you have it looked at, as something is definitely not working.
So now you have an initial idea of how to set your standards when starting your website or blog so that people are able to find you. We have been researching different ways to create that perfect bounce rate and help increase rankings, and our initial tests started with Google.
Our Revolutionary SEO Tool – SAGE
While doing our research we came across a website and piece of software called CrowdSearch, whose creators gave the software away and paid users to run it. It works in a very similar way to tools from many years ago that paid people a very low rate to keep software running for a pay-per-click service.
I am not sure how many of these are still about, and we have not researched it further, but the problem we see with tools like this is actually getting people to run them: they slow down many users' computers if the users are not sure how to keep them running smoothly, so after a few weeks people tend to stop using them or remove them completely.
We looked further into the CrowdSearch tool, noticed posts from Microsoft and other companies who were actually blogging about this software, and soon realized that it was not classed as a black-hat tactic and is perfectly acceptable, so our team set to work on creating a very similar, though not identical, effect.
One of the things we wanted to do differently was to avoid having to distribute the software and persuade people to use it, which can take time, a lot of advertising and money.
This gave us the idea of creating our own standalone tool that could generate hits to a website from different unique IP addresses. First we experimented with Tor and the Tor Browser, and all seemed to be working well.
We worked out a way to use Google's redirect URLs with a text file, so we could create a list of pages and websites for the tool to crawl, and we managed to set a time schedule for how long each page stays open before moving on to the next link, either internal or external.
When we checked the results, though, it seemed we had failed. Although Google can recognize Tor traffic, it wasn't picking up our hits. We realized one of the problems with the Tor Browser is that it asks for a captcha very often, which is also very annoying; as the program was automated, it more than likely refused the connection, left, changed IP and repeated the process over and over again.
We were not about to give up: even though we had a 100% bounce rate, there had to be another way. The answer was proxies, and this turned out even better than we had hoped. Whereas Tor bounces around different countries, with proxies we are able to set country-specific exit points for where we would like the traffic to come from.
Over the last few days we ran some tests during the evening and checked the results, and they speak for themselves. The tool was run on a newly created website we built last month, and as you can see from the previous days in the image, no hits had been registered on the site.
The search string used was the Google query 'bio energy healing sunshine coast', the topic and the area the service is provided in. Not only did we create the perfect bounce rate, but in just two days the website is now number 1 and 2 on Google for that search term.
At first we wanted to build something we could market and sell to the public to use for themselves, but we now realize that in the wrong hands it could cause a great deal of damage, so we will not be releasing it as it is.
Not only can this tool be used to increase hits to your own website, it could also be used against the competition: setting the exit at around 5 seconds and re-entering again and again repeatedly would increase page hits but also produce a 100% bounce rate.
The tool could also be aimed at AdWords clients using pay-per-click (PPC) and could cause chaos in a matter of months. The opportunities are endless, so we have decided against releasing it as a white-label product.
Once we are 100% happy with the whole mechanism behind it, we will provide a token service where clients can purchase tokens and send us a list of URLs and keywords.
For the latest news on this subject please follow us either via RSS feed or our Facebook Page to see when we will be releasing this service to the public.