Monday, July 29, 2019

Analyse Your Website Using Google Analytics

All You Want To Know About Google Analytics

Google Analytics is used to analyse a website and gather information about its traffic. Traffic covers all the interactions users perform on the website. It also supports real-time analysis, which makes it highly useful for webmasters who want to keep track of how the website is performing.


Google Analytics


  A website can have different traffic sources. Some of them are:

1. Direct Traffic

If a visitor reaches the website directly by typing its URL, it is counted as direct traffic.

Direct Traffic
2. Organic Traffic

It is the way of reaching our website through a search engine: the user searches for a keyword and clicks through to our site from the search results.


Organic Traffic


3. Referral Traffic

It is the way of reaching our website through links given on other websites. Because the visitor is referred by a link, it is called referral traffic.

4. Social Media Traffic

It is the way of reaching our website via social media.

5. Ad Traffic

It is the way of reaching our website by clicking on advertisements placed for our website.

Steps to Add Google Analytics to Blogger

  1. Log in to Blogger.
  2. Go to google.com/analytics.
  3. Sign in to Analytics and click Sign up.
  4. To track a website, select the Website option rather than Mobile app.
  5. Give the account name and website URL.
  6. Select a category.
  7. Select the reporting time zone. Always choose the right time zone, because a wrong time zone causes confusion in the real-time analysis.
  8. Get the tracking ID.
  9. Select the country and agree to the terms of service agreement.
  10. To add Analytics using the tracking ID, copy the ID, go to Blogger, click Settings -> Other -> Google Analytics -> Web Property ID, and paste the ID there.
  11. To add it using the code instead, copy the code, go to Blogger, select the theme, and edit the HTML by pasting the code within the head section.
  12. Go to Analytics and click Home.
Now you can analyse the traffic by clicking Overview under the Realtime option.
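The tracking code mentioned in step 11 is generated by Analytics itself; as a sketch, the gtag.js snippet issued at the time looked like the following, where UA-XXXXXXXXX-X is a placeholder for your own tracking ID. It goes inside the head section of the theme HTML:

```html
<!-- Global Site Tag (gtag.js) - Google Analytics -->
<!-- UA-XXXXXXXXX-X is a placeholder; use the tracking ID from your own property -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXXXX-X"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'UA-XXXXXXXXX-X');
</script>
```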


You can also refer to my previous post on An Overview About Google Search Console 
 
               






An Overview About Google Search Console

Usage and Tips of Search Console


It is the basic tool used in SEO. It introduces our website to Google and gets it verified by Google; its old name is Webmaster Tools. It is considered the major communication point between the webmaster and Google. To implement SEO in a controlled form, we need this method of verification using Search Console; the controlled form is about customizing the visibility of our website to users in particular areas. Through Search Console, Google informs us about the structure of our website and about errors such as title duplication and irrelevant content.


Google Search Console


Crawl Rate


It is the rate at which Google crawls our website. The crawl rate of a new website will be low. It increases when:
  • Visits to our website increase.
  • Updates are made frequently.
  • The website is reached through links on other sites.


A new cache is taken only if a big change is found during crawling. To make the website easier to crawl, we can generate a sitemap and submit it to Google.

Sitemap


It is an XML document submitted to Google that lists the URLs of our site; it is easier for Google to crawl than other methods. With its help, Google finds out about changes made to the pages behind those URLs, as well as the structure and content of the website.
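A minimal sitemap following the sitemaps.org format might look like this (the example.com URLs and the date are placeholders for your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2019-07-29</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
  </url>
</urlset>
```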

Different Verification Options in Search Console


Different verification methods are available to claim ownership of our website before doing further actions on it. There is one recommended method and four alternate methods.

Verification Methods

1. Using HTML File (Recommended Method)


A small HTML file is given to us by Google. We log in to our cPanel, go to the root directory, upload the file, and click Verify.



html file method

2. Using HTML Tag


Copy the meta tag given and paste it into the head section of the HTML code.
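As a sketch, the tag Google gives looks like the following, where the content token is a placeholder for the real value generated for your site:

```html
<head>
  <!-- the content token below is a placeholder; Google generates the real one -->
  <meta name="google-site-verification" content="YOUR_VERIFICATION_TOKEN" />
</head>
```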




html tag method


3. Using Google Analytics


If you are using Google Analytics to track your website's traffic, you can also verify the website with its help.


google analytics method


4. Using Google Tag Manager


If you have a Google Tag Manager account, you can verify your website with it. It is a dynamic process in which verification can be done easily. Note that Analytics cannot be added if you are using this method for verification.


google tag manager method

5. Using DNS Record


If we are using a domain from a provider such as GoDaddy, we can verify the website by copying the given TXT record into the provider's DNS configuration.
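In zone-file form, such a record might look like this (the token is a placeholder issued by Search Console, and example.com stands for your own domain):

```
example.com.   3600   IN   TXT   "google-site-verification=YOUR_VERIFICATION_TOKEN"
```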


DNS verification method

Among the above five methods, HTML file verification is considered the recommended one: the other methods depend on changes made to the HTML code of the website, so if we later change the website, for example by changing its design, the connection with Search Console will be lost.



You can also refer to my previous post on On-Page Optimization Techniques.







Wednesday, July 24, 2019

On Page SEO Techniques

                                         On-Page SEO


On-page optimization is the process of enhancing our web page in all aspects, so that it becomes easier for Google to crawl and more user-friendly and understandable. It is done through various steps and procedures.
                               On Page SEO

Snippet

It is the portion shown for a website when we search for it on Google.
A normal snippet includes the title, URL, and meta description.
Snippet
The title is the heading of the web page, and the meta description is a short description of the page given inside it. All of these need to be optimized to be effective. A rich snippet can also include thumbnails and events.

On-Page Optimisation Techniques

Title Optimization

The title should be 3 or more words. A short title has a good appearance, but it should not be too short, because then Google cannot crawl anything from the title to learn about the website. The title should not be written in all capital letters, which increases the pixel width.
SEO title
The pixel width limit is 512 pixels, and the title should be in the range of 55 to 60 characters. It should not contain any spelling or grammar errors, which are considered among the worst mistakes in SEO. The SEO title must be unique; otherwise a title-duplication error is generated, and there is a chance of cannibalisation, the situation where two pages of the same website compete with the same title.
If no SEO title is present, the h1 of the body is displayed as the title; if there is no h1 either, Google displays the h2. And if the website is reached from a link on a powerful website, that link text may be displayed as the title.
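The length rules above can be sketched as a small helper. The function name is illustrative, and the thresholds (55-60 characters, no all-caps) come from this post, not from any official tool:

```javascript
// Sketch: rough checks for an SEO title based on the limits described above.
function checkSeoTitle(title) {
  const problems = [];
  if (title.length < 55 || title.length > 60) {
    problems.push("length should be 55-60 characters");
  }
  // Flag titles written entirely in capital letters (only if letters exist).
  if (title === title.toUpperCase() && /[A-Z]/.test(title)) {
    problems.push("avoid writing the title in all capital letters");
  }
  return problems; // an empty array means the title passes these checks
}
```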

Meta Description Optimization

The meta description is the portion shown under the title of the website, describing the content briefly.
meta tag optimisation
It is good to include the focus keyword once in the meta description to make it more effective. The size limit is 155 to 160 characters for a page and less than 155 for an article or blog post, because a post shows the date along with the description. The pixel width limit is 1,024. The meta description must be unique; it may be drawn from the content of the same page, but not from another website. If no meta description is given, or it is not good enough, Google takes some part of the content and shows it as the description.

h1 and h2 Optimization

h1 is the major visible portion, given as the caption of the page. It is good for it to be up to 8 or 9 words, and it is better to include the keyword in it. More than one h1 on a page creates confusion, and Google may move the site to the sandbox, a temporary store where confusing pages that get crawled are kept; a website in the sandbox will not be ranked.
It is better to use a single h2; if the content is divided into sections, more than one h2 can be used. The content given under these headings must match the heading.

Content Optimisation

It is better to use simple sentences in the content, which helps the user understand it faster. A readability test measures the ease of reading our content; a higher score indicates more effective content.
keyword density
Keyword Density: it is the percentage of the content made up of the focus keyword. Old-school SEO articles spread the myth that content should be filled with the maximum number of focus keywords. Later, Google itself declared that it does not consider keyword density, but it also said that using the focus keyword fewer than about two times will not help the website progress in ranking.
Using bold for the keyword in a large paragraph is highly recommended: a bold portion stands out almost like a special area of concentration, so important lines in a big paragraph are not missed, and it also makes the content more attractive. Placing the focus keyword just before a full stop or comma helps Google crawl and understand the content more easily, because Google reads like an individual: it pauses at a full stop, and that portion gets noticed more.
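As a rough sketch, keyword density as described above can be computed like this; the function and its word-splitting rule are illustrative, not an official formula:

```javascript
// Sketch: keyword density = percentage of words in the content
// that match the focus keyword (case-insensitive).
function keywordDensity(content, keyword) {
  const words = content.toLowerCase().match(/[a-z0-9']+/g) || [];
  if (words.length === 0) return 0;
  const hits = words.filter((w) => w === keyword.toLowerCase()).length;
  return (hits / words.length) * 100;
}

console.log(keywordDensity("SEO tips: learn SEO basics and more", "seo"));
```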

Anchor Text Optimization

It is the optimization of the links given inside the content, done using the anchor tag. The visible portion inside the tag is the anchor text.
Anchor Tag
Giving some anchor text inside the content boosts the crawling process. Do not hyperlink the focus keyword to another website, as that decreases the credit of our own website.
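For example, a simple anchor tag inside the content might look like this (the URL is a placeholder):

```html
<!-- The visible words inside the <a> tag are the anchor text -->
<p>Learn more in our <a href="https://www.example.com/on-page-seo">on-page SEO guide</a>.</p>
```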

Keyword Optimization

It is good to use keyword phrases in some portions of the content; this makes the content more relatable to the focus keyword and more authentic with respect to it.

Image Optimization

An image is added using the img tag, with its path given as the src attribute. The alt attribute describes the image and makes it understandable to Google. The filename of the image should also describe the image.
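For example (the filename and alt text here are illustrative):

```html
<!-- A descriptive filename and alt text make the image understandable to Google -->
<img src="/images/google-analytics-dashboard.png"
     alt="Google Analytics real-time overview dashboard" />
```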
You can refer to my previous blog about History and Evolution of SEO.


Monday, July 22, 2019

History and Evolution of SEO | Beno SEO Analyst

History and Evolution of SEO

During the initial days of Google, it was not a quality-assured service. It was noticed during the World Trade Center attack that people could not get the required information about the incident from Google. This came under discussion, and they found that Google's algorithm was unable to crawl the appropriate websites and store them in the cache to serve to users through indexing whenever requested. To make websites crawlable, they found it could only be done by the webmasters. This method of performing modifications on the site is called on-page optimisation. Initially there was a discussion on whether the guidelines should be published, and finally a starter guide was published for webmasters.
                  
History and Evolution of SEO

To rank websites, Google initially introduced the content-specific approach, in which the website with the most keywords was ranked highest. People then resorted to keyword stuffing, a black-hat SEO technique, to make their websites rank; webmasters focused only on keywords, so the quality of websites was not considered.
The next method was the link-specific approach, in which websites were ranked by the number of times the site was referred to from other websites through links. People took advantage of this by paying other websites to carry their links, and quality was still not considered.
So Google introduced the quality-rank-specific approach, in which a site ranks higher only if it is referred by a website with a higher rank. They introduced PageRank, which varies from 1 to 10, where 1 is the lowest and 10 the highest rank. Only a few websites, such as Twitter and the U.S. government website, had a ranking of 10 out of 10.
One strategy here was "passing the juice": if a website with a specific PageRank gives numerous links to other websites, its rank decreases. To provide a link without losing this equity, rel="nofollow" can be added to the anchor tag.
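As a sketch, such a link looks like this (the URL is a placeholder):

```html
<!-- rel="nofollow" tells Google not to pass PageRank through this link -->
<a href="https://www.example.com/" rel="nofollow">Example site</a>
```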
In those days the number of users grew and Google became popular. They then introduced Google AdWords, in which people advertise their products on Google by paying, and it became Google's main source of income. AdWords was customisable with respect to the time interval and to people in a specific location, and ads were displayed either on the Display Network or on the Search Network, where the user sees the ad while searching. The payment method was based on CPC (cost per click): Google gets paid when the link of a particular ad is clicked.
Later they introduced AdSense, in which we display ads served by Google on our own website and Google pays us.
                       Google AdWords

In 2009, many changes were made to maintain the quality of Google. The search engine became more interactive, and search assistance (suggestions) was introduced. If a website is interactive, its ranking is increased; the interaction is analysed using data centres. Bounce rate is the percentage of people who close a site immediately; if it is high, the site is not a good one. Pogo-sticking was also considered: if the interaction on one site is higher than on another site using the same keyword, Google promotes the one with more interaction. These calculations were implemented at that time. Personalised search results were also introduced: if we are signed in and search for something, and later perform similar searches, Google suggests the earlier keywords and sites, mentioning the time of the visit too.
In 2010, social media signals were introduced. If a website is shared across social media, its ranking is increased. They also considered a social media authority value based on interaction, such as likes: if two similar posts have the same number of likes, the influence of the people who liked them is checked.

Updates

1. Panda Update

It was introduced against content spam and implemented in 2011.

                           Google panda update


Content spam included different types:
  • Content duplication, the copying of content from other sites
  • Publishing low-quality content with spelling mistakes and grammatical errors
  • Content spinning, the reuse of the same content in different places
  • Publishing thin pages with little content
  • Keyword stuffing
The Panda update was rolled out in various stages; an important one was Panda 4.0 in 2014, in which content duplication was prohibited and the update was implemented as a permanent filter.

2. Penguin Update

In 2012, it was introduced against link spam, which included link exchanges, paid links, low-quality links, link farming (links exchanged inside a group), comment spamming (our links placed in comments), and wiki spamming (links placed in Wikipedia content). All of these were controlled by this update. Guest blogging, where we write a blog on someone else's website and place our links there, became punishable when taken beyond the limit.



A further Penguin update, 4.0, was done in 2016, when punishments began to be applied in real time.

3. Pigeon Update

It was an important update related to local SEO, which focuses on marketing in a particular locality. Its procedures and steps were laid out in this update.

4. Hummingbird Update

It focused on giving deeper information by analysing the feedback given by users: if we search for a particular thing, detailed information about it is given.

5. RankBrain

Introduced in 2015 as an upgraded version of Hummingbird, it used AI to guess what the user is searching for even if the exact keywords are not entered.
6. Mobilegeddon

It was implemented in 2015, making a mobile-friendly site important for ranking. Its second update, in 2016, enforced this strictly.

Some of the smaller updates: the Parked Domain update in 2012, in which blocking such websites came under discussion and further action was taken; the Pirate update, based on the Digital Millennium Copyright Act, which made using someone else's media on our website punishable; the Exact Match Domain update in 2012, which took action against websites named exactly after a keyword that stayed idle for a long time; the Medic update in 2018, a value-based update concerning health- and wellness-related websites; and the Bracket update in 2017, done to control review manipulation by webmasters.
