David Carralon – Helping you succeed with SEO

Penguin 4.0 is real-time, more granular and forms part of the core algorithm
Sat, 08 Oct 2016

Are you trying to find out what the Penguin 4 algo update was about? Well, my post headline gives it to you in a single sentence: real-time, granular and integrated into the core algorithm.

After about a year of wrong guesses and repeated hints to the SEO community that the new Penguin update was about to arrive, Google has finally launched Penguin 4.0 officially.

Here are the main characteristics of Penguin 4:

  • Penguin is now part of the core algorithm, same as Panda. So Google will no longer confirm future Penguin updates.
  • Penguin 4 is supposed to roll out simultaneously in all languages, but Jennifer Slegg from The SEM Post confirms that this is not the case: some specific countries/languages will see the update first. Take this with a pinch of salt for now, until you can draw your own conclusions or more information is released.
  • The devaluation of sites hit by Penguin 4 takes place at page level, as opposed to the whole site or subdomain being hit. This means you will potentially only lose rankings for the pages that offend or are spammed, although Google’s definition of “granular” is still not clear.
  • As Penguin now runs in real time, recovery will, in theory, be faster. That is, IF you are able to act fast too and quickly remove every sign of over-optimisation, toxic inbound links or spam. At the next crawl, Google will reassess your link portfolio and decide whether you stay in the filter or come out.
  • Low-quality, unnatural and spammy links are now automatically discounted by the algorithm. This means that, in theory (and only in theory), the need to disavow links is no longer there.

Despite the large amount of information given out by Google and by independent SEO sources, SEO and marketing professionals are still asking questions like these in forums, at events and in other communities:

“How long should I wait for Google to re-crawl my links?” or “How do I diagnose whether a website is under an algorithmic action if Google is applying penalisations on a page or site-section basis?”

Here is some practical advice in answer to the above questions:

  • Conduct a backlink audit of your entire link portfolio with an eye open for offending toxic links. You may want to start with SEMrush’s toxic link feature, part of their link analysis reports. Other, more sophisticated tools are Link Detox from LRT and Kerboo. If the scale of the link toxicity looks significant, I would recommend employing as many link analysis tools as you can.
  • Keep an eye on Google Search Console for any messages from Google, in case you happen to have received a manual penalty notification.
  • Remove every sign of spam on your site manually, and request removal of toxic links from the sites linking to you. This can be a huge amount of work, depending on the scale of the problem.
  • Build your disavow file carefully and keep it up to date (see the sketch after this list).
  • File a reconsideration request only if manual action has been taken against your website.
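To make the disavow step concrete, here is a minimal sketch in Python, with hypothetical domains and URLs, of writing a disavow.txt in the plain-text format Google Search Console accepts: one “domain:” entry or full URL per line, plus “#” comments. It only illustrates the file format; it is not a substitute for a careful audit.

```python
# Minimal sketch: write a Search Console disavow file from a vetted list
# of toxic domains/URLs produced by your backlink audit.
toxic_domains = ["spammy-directory.example", "paid-links.example"]  # hypothetical
toxic_urls = ["http://blog.example.com/comment-spam-page.html"]     # hypothetical

with open("disavow.txt", "w") as f:
    f.write("# Disavow file generated after the backlink audit\n")
    for domain in sorted(set(toxic_domains)):
        f.write(f"domain:{domain}\n")   # disavow everything from this domain
    for url in sorted(set(toxic_urls)):
        f.write(f"{url}\n")             # disavow a single URL
```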

Interesting note: in the specific niche where I operate (job boards), I have been able to see spikes of traffic triggered by Penguin 4.0. See below the spikes occurring on 20 September in the UK, with Sweden following just a couple of days later. For the most part, the impact of Penguin 4.0 on my sites and on those we track as genuine competitors is positive:

Spikes observed on a set of sites in the job board niche in the UK (fluctuations triggered by Penguin 4)
Spikes observed on a set of sites in the job board niche in Sweden (fluctuations triggered by Penguin 4)

Summary

This Penguin update has been significant in the sense that Google has integrated it into the core algorithm, but it doesn’t seem to have wreaked havoc like previous ones.

The update eliminates the long wait for site owners to recover if they have been penalised. With Penguin 4.0, the devaluation of sites with spammy link portfolios or over-optimisation will happen faster too.

On a positive note, the devaluation of sites will take place at page level as opposed to site-wide, as had been the case before. It is not clear yet whether this will make the algorithmic filter hit harder or be easier to detect.

I foresee a lot of confusion at site-audit level, with ongoing speculation on whether a site may have been partially hit by Penguin. This will, in turn, open the door to a more sophisticated SEO industry where only real experts will be able to conclude whether a site has been hit or is simply losing rankings naturally.
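For anyone trying to diagnose a partial, page-level hit, a simple first pass is to compare organic traffic per site section before and after the rollout. Below is a minimal sketch, assuming a hypothetical CSV export from your analytics tool with date, landing_page (as a path such as /jobs/london) and organic_sessions columns; it is an illustration, not a definitive diagnosis method.

```python
# Minimal sketch: average organic sessions per site section before vs after
# the Penguin 4.0 rollout, to spot sections with page-level drops.
import pandas as pd

ROLLOUT = "2016-09-23"  # date of Google's announcement; adjust to when you saw movement

df = pd.read_csv("organic_landing_pages.csv", parse_dates=["date"])
df["section"] = df["landing_page"].str.split("/").str[1].replace("", "home")

before = df[df["date"] < ROLLOUT].groupby("section")["organic_sessions"].mean()
after = df[df["date"] >= ROLLOUT].groupby("section")["organic_sessions"].mean()

change = ((after - before) / before * 100).round(1).sort_values()
print(change.head(10))  # sections with the steepest average drop first
```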

I hope you enjoyed the summary. If you feel you still need to learn more about Penguin 4.0, you should watch the latest Whiteboard Friday video from the Fishkin man:

Searchmetrics’ Search engine ranking factors 2014 is out and rocks!
Fri, 12 Sep 2014

What are the search engine ranking factors that have high rank correlations with organic search results?

The answer is not straightforward, and the search engines are not going to let us peek into their algorithms. The short story is: great content, accessible sites with great architecture, and many natural backlinks. But those three areas break down into so many different SEO tasks and processes, each weighted so differently by Google, that it is often not easy to know which direction to take to improve your SEO.

This is where independent studies looking at correlation data come in very useful. There are well-established companies in the SEO industry building their own tools to help us get as close as possible to how search engines think and operate. One of them is Searchmetrics: an international enterprise SEO platform and one of the most reliable sources of SEO intelligence and information. For two years in a row they have published their very own ‘search engine ranking factors’ study.

Their second one, the Search Engine Ranking Factors for 2014 (link removed as the study is no longer available), is out now. It is Google US-centred, as they indicate, but it gives a very good idea of the direction Google is taking in general. The report also explains the scope and how the research was performed, and of course they make it clear that factor correlation does not imply causation. It is a very interesting piece and, in my view, very easy to digest, particularly if you are a visual learner, just by looking at their infographic.
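To make clear what a ‘rank correlation’ in a study like this actually measures, here is a minimal, self-contained sketch using toy numbers (not Searchmetrics data): Spearman’s rho between a factor value and the ranking position of each result.

```python
# Minimal sketch: Spearman rank correlation between a factor and SERP position.
from scipy.stats import spearmanr

positions = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]                     # Google positions
internal_links = [160, 140, 150, 90, 110, 70, 60, 80, 40, 30]   # toy factor values

rho, p_value = spearmanr(internal_links, positions)
print(f"Spearman rho: {rho:.2f} (p = {p_value:.3f})")
# A strongly negative rho means: the higher the factor value, the closer to
# position 1 the page ranks -- correlation, not proof of causation.
```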

They rate the factors in order of importance by categorising them into these 5 groups, clearly distinguishable by different colours within their infographic:

  • User signals (UX)
  • Social signals
  • Backlinks
  • Technical optimisation
  • Onpage Content

I have summarised the whole report in a few lines below:

User signals

The highest-ranking URLs had a click-through rate (CTR) of 32%. ‘Click-through rate’ sticks out as the top search engine ranking factor on Google.com in 2014! I am assuming this means the number of clicks a URL gets on the SERPs (search engine result pages) for one specific search query.

This is interesting, as it can be directly related to many other factors: branding, good content, but also page tagging. If title tags are well written, explicit and branded, they are more likely to attract attention in the search results. And if the page boasts great content, it is also likely to have a low bounce rate (as in ‘click back’ to the SERPs). Lastly, this is a call to improve the descriptions of your products and test how they show up on the SERPs.

‘Time on site’ and ‘bounce rate’ are also user-level signals tracked in the study but, although important, I was surprised to see them much farther down the chart. In my view those three signals go hand in hand. The takeaway would then be: do as much as you can to improve your content’s presence on the SERPs, and ensure the content on your landing pages is high quality and engaging, so that visitors do not bounce back to the SERPs once they click through to your webpage.
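If you want to sanity-check your own title tags along those lines, here is a minimal sketch with hypothetical URLs and a hypothetical brand string; it simply flags titles that are missing, overly long or unbranded.

```python
# Minimal sketch: fetch pages and flag weak <title> tags.
import requests
from bs4 import BeautifulSoup

BRAND = "Acme"  # hypothetical brand string
urls = ["https://www.example.com/", "https://www.example.com/jobs/london"]  # hypothetical

for url in urls:
    html = requests.get(url, timeout=10).text
    title_tag = BeautifulSoup(html, "html.parser").title
    title = title_tag.get_text(strip=True) if title_tag else ""
    issues = []
    if not title:
        issues.append("missing title")
    elif len(title) > 60:
        issues.append(f"long ({len(title)} chars, may be truncated)")
    if BRAND.lower() not in title.lower():
        issues.append("no brand")
    print(url, "->", title or "(none)", "|", ", ".join(issues) or "ok")
```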

Chart with top ranking factors, from Searchmetrics 2014.

OnPage (Content)

‘Relevant terms’ is one of the most important SEO factors in the Searchmetrics 2014 study and, unsurprisingly, stands at the top of the ranking factors chart. This is probably one that both in-house teams and agencies find handy to have on the infographic, as it helps tremendously to be able to quote references like this when evangelising the importance of on-page optimisation.

It is clear that every piece of editorial should be written naturally and by authors who know and understand their subject (content is king). However, from an SEO standpoint, failing to mention the underlying target terms or keyphrases in the content leads to poor search engine visibility. This happens quite often when writers or content owners aim for a strongly academic style, using very select, subject-specific terms that are ‘unpopular’ with their online audiences. Hummingbird has gone some way towards counteracting this issue, but it cannot always be relied upon to work miracles.

The second most important factor that stands out in this section is internal links. The top ten ranking URLs seem to share one characteristic: a count of internal links averaging 130. This is quite a remarkable tip for everyone, in my view, as there is often a perception that if every page in the architecture is linked to from the top navigation structure, no additional internal links are needed.
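As a quick way to see where one of your own pages stands against that ~130 average, here is a minimal sketch (hypothetical URL) that counts the unique internal links on a page.

```python
# Minimal sketch: count unique internal links on a single page.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

page = "https://www.example.com/jobs/london"  # hypothetical
host = urlparse(page).netloc

soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
internal = {
    urljoin(page, a["href"])
    for a in soup.find_all("a", href=True)
    if urlparse(urljoin(page, a["href"])).netloc == host
}
print(f"{len(internal)} unique internal links on {page}")
```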

Lastly, the study also mentions the importance of embedding images and videos, mentioning keywords related to other relevant terms and, in short, focusing on topics instead of single keywords.

Social Media Signals

There is nothing much new in this section since last year’s study.

Social signals correlate highly with rankings in Searchmetrics’ ranking factors study. The definitive social-driven ranking factor winner, according to Searchmetrics, is Google+. Although authorship is now dismantled, there is still a lot of value in engaging with audiences on Google+.

The interesting thing is that social signals still sit right at the top of the ranking factors chart as top correlation factors, despite the fact that Google has made it clear, through Matt Cutts, that social signals do not influence rankings:

But despite this revelation from earlier in the year (2014), the research data from Searchmetrics indicates that there is a correlation. Matt may be right that as your content is shared more, you are more likely to get more backlinks, and therefore better traction in the search results. But it could also be that Google, and other search engines, use social signals ‘indirectly’ to determine quality, authority or reputation, and those may in turn lead to better rankings. The important thing to know is that the more people vote for your content on Google+, the more likely you are to rank higher.

The second most important social signal correlated with high rankings seems to be Facebook shares, comments and likes. I wonder if this is going to trigger an increase in blogs implementing Facebook comments via WordPress plugins. Many already agree that this increases engagement and encourages more comments on blog posts. Lastly, Pinterest comes up higher than Twitter! That actually raised my eyebrows.

‘Quality’ Backlinks

It is no surprise that, in their ranking factors study, links come up right at the top of the chart, just after the social media signals. No less important is the fact that, on average, 29% of the overall backlink portfolio of high-ranking websites pointed to the homepage, with the rest being deep links.

It is also worth noting that the ‘nofollow links’ signal sits higher up in the chart, demonstrating a stronger correlation with high rankings than other signals generally deemed more important, such as ‘new backlinks’ or ‘links from news domains’.

And lastly, it is worth noting that ‘anchor text links’ as a correlation factor keeps decreasing, now at a 13% ratio. This effectively gives you an idea of the percentage of keyword anchor text links you should have pointing to your site versus branded links.

Site Architecture (Technical optimisation)

And last, but not least, amongst the top technical SEO factors we find:

  • ‘Site speed’, not surprisingly, at the top. This ranking factor has clearly been moving up the scale in recent years.
  • ‘Position of the (target) keyword in the title tag’. This is an obvious one, although I didn’t think it would still stand out as this important.
  • ‘Length of the URL’. Another obvious one that still seems to be ignored out in the wild, particularly on large sites. The shorter the URL, the greater the ability to rank higher. The average length of the top ten ranking URLs in the study was 36 characters (see the quick check sketched below).
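Here is a minimal sketch (hypothetical URLs) for flagging URLs that are well above that 36-character average; exactly how Searchmetrics measured length is not specified, so treat the threshold as a rough guide only.

```python
# Minimal sketch: flag URLs longer than the 36-character average from the study.
urls = [
    "https://www.example.com/jobs/london",
    "https://www.example.com/jobs/london-city-of-london-greater-london?ref=nav&sort=date",
]  # hypothetical

for url in urls:
    flag = "review" if len(url) > 36 else "ok"
    print(f"{len(url):>4} chars  {flag}  {url}")
```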

In addition, I can also see a few lines mentioning heading mark-up (H1 and H2) as important signals. It is interesting to see this one coming back, as in previous Moz ranking factor studies (2011 and 2013) the results showed that keywords in H tags did not correlate much with rankings.

Lastly, it is worth noting the ‘video integration’ feature as a ranking factor.

Bottom part of the ‘ranking factors’ chart, from Searchmetrics 2014.

In summary, it is quite an interesting study. I spent a good hour looking at the infographic in detail, trying to draw conclusions and getting ideas for shaping internal strategies in-house, but also to put to use on my own humble web projects. The full infographic (link removed) delves into great visual detail for each of the above groups. The information, in my view, is easy to digest both for SEO professionals and for non-SEOs.

The main takeaways from this study are: produce the best content possible in your niche, get an on-page optimisation plan in place for your site (keeping your URLs short, clean and factual), get involved with the Google+ social network without forgetting Facebook, do not stop your link building plans as you constantly need new links, and work on your site speed until it is as fast as possible.

I’m looking forward to a similar 2014 study produced by Searchmetrics, but ideally with a scope outside the US, e.g. France or Spain.

I hope you found my summary useful. Is there anything you found particularly remarkable about this ‘search engine ranking factors’ study? Feel free to comment.

Googlebot now crawls the web from Korea and Russia too
Thu, 12 Jun 2014

Last month, at the International Search Summit conference in London, Googler Gary Illyes, who was presenting on the controversial topic of website geotargeting and hreflang, revealed an interesting piece of information: Googlebot is now crawling the web from three different geographical locations:

  • Mountain View: most SEOs know that this is the de facto starting point for Googlebot to crawl the web.
  • Russia: although he did not specify where in Russia, I am not surprised, as Google has been struggling to increase market share in this region for some time due to stiff competition from a stronger local opponent.
  • South Korea: not a surprise either. The two main local search engines, Naver (72%) and Daum (18%), account for 90% of all searches in the Korean market. What could Google possibly want here?
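Whatever location a crawl request comes from, the standard way to confirm it really is Googlebot is the reverse-then-forward DNS check described in Google’s ‘How to verify Googlebot’ post (listed in the references at the end of this article). Here is a minimal sketch; the IP is hypothetical and should come from your own server logs.

```python
# Minimal sketch: verify a crawler IP via reverse DNS, then forward-confirm it.
import socket

ip = "66.249.66.1"  # hypothetical: take an IP from your own server logs

try:
    host = socket.gethostbyaddr(ip)[0]             # reverse DNS lookup
    forward_ok = socket.gethostbyname(host) == ip  # forward-confirm the hostname
    is_googlebot = forward_ok and host.endswith((".googlebot.com", ".google.com"))
except (socket.herror, socket.gaierror):
    is_googlebot = False

print("Verified Googlebot" if is_googlebot else "Not verified as Googlebot")
```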

So why South Korea and Russia?

Gary mentioned that they were not keen to give out any more information on why Google may now be crawling the web from those two new geographical locations. So with this post, I am just throwing in my two cents on the matter.

Google is clearly not hiding its plans, as its engineers go out to conferences and use this information in their decks. But what are the reasons for, and Google’s interest in, running its crawler from those two additional geographical locations?

It’s very clear:

Korea, Russia and China are home to Google’s three main competitors in the international search engine market. Google fears that some of these search engines will, at some point, challenge its almighty presence in other territories.

Google wants to dominate in every single market. It is trying to prevent big competitors like Baidu, Yandex and Naver from becoming bigger and gaining more market share. The three main markets where it does not have the lion’s share of search are China, Russia and Korea.

We also have Japan and Taiwan, where Yahoo boasts over 50% of search share, but their results are served up by Google, at least in Japan. So, effectively, Google controls most of the search results in those two countries too. This leaves just three, China, Russia and Korea, as the main markets that are difficult for Google to penetrate and dominate.

So my take is that Google has brought the crawler to these locations in order to begin drilling deeper into the search landscape of these markets.

 

Non-Google search engines

source: Preston Carey’s slideshare

We are going to leave China out of the equation, as Google is restricted by China’s internal censorship wall and it is not an easy market to penetrate. Years ago Google abandoned its attempts to operate inside the Chinese wall. So, leaving China aside, the two others in the spotlight are Russia and Korea.

Google wants to learn and understand more about those two local-market search engines, their linguistic nuances and the search results that make people want to use the local players as opposed to Google’s own engine. Google wants to be able to provide equal, similar or better search results than those offered by Korea’s Naver and Russia’s Yandex.


Some Stats about South Korea and Russia

Google has a very humble piece of the South Korean search engine market (5%), with Naver holding the lion’s share of search. The leading search engine in this market is Naver, by a distance, followed by Daum, and then Google in last position. This is, potentially, the most likely reason for Google to base its crawler there and strengthen its presence by opening new offices. Otherwise, why not Japan, which is also a strong economy with a booming internet market?

In Russia, Google has a larger market share than in Korea: about 25%, with Yandex obviously enjoying the larger chunk at around 65%. What is bad news for Google is that Yandex has been making plans to expand in recent years and has already begun moving into emerging markets in Latin America, Europe and Turkey. Yandex is nowadays the fastest-growing search engine.

So, as Yandex enjoys the bulk of the share, Google may want to gather better and more insightful intel about the Russian market by establishing a base there.

Sure, but what about Baidu? They are also expanding and penetrating other markets.

Yes, but as mentioned before, since Google is not allowed to set up data centres in China unless it complies with censorship, which currently it doesn’t, it is leaving China and Baidu aside for the time being. However, my opinion is that Google will, in the end, bite the bullet and comply with Chinese censorship and government rules on internet use.

I wouldn’t be surprised if, within a year or two, Google takes a completely different steer and sets up base in Beijing or another key location in China. It could then start crawling China’s web at ease, gathering data, insights and signals that help it streamline its results in Chinese, compete better with Baidu and stand a greater chance of increasing market share in this huge market.

 

And what about Bing and Yahoo?

Of course Bing and Yahoo are competitors too, but they are US companies, so it is easier for Google to get key information about how they operate. Plus, they are search engines that also struggle with international penetration in certain markets.

Yahoo, for example, is big in Japan, Taiwan and Hong Kong, but it does not present a massive problem to Google’s monopolistic ambitions.

Yahoo’s UVP in Japan, for example, is not just search but a range of localised mobile and social services provided to the local market, which makes them unique and distinguishable in those countries. But are they truly competitors that present a block to Google worth worrying about? I’d say Yahoo and Bing may be serious competitors to Google, but on a different level.

What do you think? Can you speculate on the reasons why Google may now be crawling the web from South Korea and Russia and not from anywhere else, like Spain, Malaysia or Wallis & Futuna?

 

Articles I have read as reference for this post:

  • http://googlewebmastercentral.blogspot.fr/2006/09/how-to-verify-googlebot.html
  • http://www.link-assistant.com/blog/google-vs-naver-why-cant-google-dominate-search-in-korea/
  • http://returnonnow.com/internet-marketing-resources/2013-search-engine-market-share-by-country/
  • http://searchengineland.com/matt-cutts-in-south-korea-109861

 

Matt Cutts says ok to chain some 301 redirects together
Sat, 03 Sep 2011

Matt Cutts is back on track with a new set of instructional ‘technical SEO’ videos, whiteboard and all, Rand Fishkin style. I found this video (embedded below) particularly helpful, as it covers a question that most in-house SEOs dealing with large enterprise-level websites will have asked themselves at some point: ‘Is it an issue to have 301 redirects chained together?’ I am sure most of us agree that chaining one 301 redirect to another is not good practice, but what if it accidentally happened?

The video is actually not just about how many 301 redirects you can chain together. It covers some other basic and useful typical questions:

  • linking URLs by topicality instead of just linking to the homepage
  • the total number of 301 redirects you can have on a site: is there a limit?

but the chaining of 301 redirects and the perceived risk of penalisation is what I had been wondering about.

If you have gone around asking this question of other SEO industry peers, in SEO forums or at conferences, you will most likely have got the same answer I did, typically along the lines of: ‘…ummm, I wouldn’t overdo it and, if possible, I would stick to just one redirect and avoid double redirects or more, in case you get penalised…’. Well, watch the video to hear Matt Cutts’s Googly opinion:

In the video, Matt confirms that Googlebot starts getting suspicious about multiple redirect levels when it bumps into a 4th or 5th sequential redirect. If you watch the video, be patient, as this is covered at the end.

“We are willing to follow multiple hops, multiple levels of URLs… at the same time, if you get too many, if you’re getting up to four or five hops, then that’s starting to get a bit dangerous, in the sense that Google may decide not to follow all those redirects.”
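If you want to check how many hops your own redirects take before hitting that danger zone, here is a minimal sketch using the Python requests library; the URL is hypothetical.

```python
# Minimal sketch: follow a URL and count the redirect hops to the final page.
import requests

start = "http://www.example.com/old-page"  # hypothetical URL
resp = requests.get(start, allow_redirects=True, timeout=10)

for i, hop in enumerate(resp.history, 1):   # earlier responses in the chain
    print(f"hop {i}: {hop.status_code} {hop.url}")
print(f"final:  {resp.status_code} {resp.url} ({len(resp.history)} hop(s))")

if len(resp.history) >= 4:
    print("Warning: 4+ chained redirects; Google may not follow them all.")
```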

I conclude that if the unavoidable situation of a double redirect happens, if we ever slip, we are on the safe side with Google. The question that remains for international SEOs responsible for multiregional sites is: ‘What about the other search engines? How do Bing, Naver, Yandex, Baidu, Seznam and other local or regional search engines react to that?’ If you have an insight to share in this respect, please feel free to comment.

Meta-tags fuss
Sun, 18 Oct 2009

Despite the fact that the keywords meta tag lost credibility as a search engine ranking factor quite some time ago, there has been quite a bit of buzz on the subject lately. Matt Cutts, head of web spam at Google, has publicly reiterated once again that Googlebot ignores keywords embedded in the meta keywords element of web pages, and that Google does not use that meta area at all, at least as a ranking factor.

This official post confirms that Googlebot ignores the keywords meta tag, but it is interesting to see that everyone has suddenly started a conversation about it on the various blogs.

So the clear message from Google is that keywords meta tags are useless. However, there are other search engines, like Ask or Bing, that may still be using the keywords meta tag to ascertain relevance. As far as Yahoo is concerned, a couple of weeks later they announced that Yahoo Search no longer uses the meta keywords tag either.

The interesting part of this story is that a few days later Danny Sullivan, editor of Search Engine Land, posted an article stating that, following a homepage test, he had discovered that Yahoo still indexes the meta keywords tag.

The experiment consisted of placing a unique word, ‘xcvteuflsowkldlslkslklsk’, in the meta keywords tag on the homepage of Search Engine Land. He then searched for this word on Yahoo Search, and Search Engine Land clearly ranked at the top.

What’s the bottom line of this post? Should you use the meta keywords tag or not? The consensus amongst SEO experts is:

  • If your meta keywords tags are already populated, leave them as they are, as long as they are relevant to the page topic (the sketch below shows a quick way to review what is in them).
  • If you are starting a new page, you could jot a keyword or two into the keywords meta tag, just to keep your chances open with other search engines, in case things change in future, and because they can be useful for things other than search engine rankings.
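If you want to review what your pages currently declare before deciding, here is a minimal sketch (hypothetical URLs) that lists the meta keywords tag, if any, on each page.

```python
# Minimal sketch: report the meta keywords tag of a few pages.
import requests
from bs4 import BeautifulSoup

urls = ["https://www.example.com/", "https://www.example.com/about"]  # hypothetical

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "keywords"})
    content = (tag.get("content") or "").strip() if tag else ""
    print(url, "->", content or "(no meta keywords tag)")
```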

My personal advice is not to obsess over on-page and meta optimisation, and not to spend more than a few seconds on that tag when creating content.

Instead, focus on creating good content, link assets or even link building.
