" "

Over the last two decades, I have unsuccessfully engaged SEO professionals to lift my site on the web, being too scared to attempt anything like that myself (even though, as a professional sailor, I am comfortable in situations of personal peril during storms at sea). Your guidance and tools have given me the confidence to incorporate my knowledge & expertise into my website content. I originally doubted my ability, thinking professional writers were required, BUT just like our guests soak up my wife's knowledge of the underwater world, I look forward to rewriting my website and sharing our specialised knowledge and experience of our “back yard”, which is about half the size of Texas!
“Shareability” – Not every single piece of content on your site will be linked to and shared hundreds of times. But in the same way you want to be careful of not rolling out large quantities of pages that have thin content, you want to consider who would be likely to share and link to new pages you’re creating on your site before you roll them out. Having large quantities of pages that aren’t likely to be shared or linked to doesn’t position those pages to rank well in search results, and doesn’t help to create a good picture of your site as a whole for search engines, either.
Breaking it down, Traffic Cost is SEMrush’s way of showing the hypothetical value of a page. Traffic Cost estimates the traffic a page is getting by applying an estimated clickthrough rate (CTR) to each of the positions the page ranks for, then summing the results. From there, it looks at what others would be willing to pay for that same traffic using Google AdWords’ CPC.
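To make the arithmetic concrete, here is a minimal Python sketch of that idea. The keyword data, CTR curve, and CPC figures are all invented, and SEMrush's actual model and data are proprietary.

```python
# A minimal sketch of the Traffic Cost idea described above. All numbers
# here are invented; SEMrush's actual CTR model and data are proprietary.
keywords = [
    # (monthly search volume, organic position, AdWords CPC in USD)
    (12000, 1, 2.50),
    (4500, 3, 1.80),
    (900, 7, 4.20),
]

# Assumed average CTR by organic position (hypothetical figures).
ctr_by_position = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

estimated_traffic = 0.0
traffic_cost = 0.0
for volume, position, cpc in keywords:
    clicks = volume * ctr_by_position.get(position, 0.01)
    estimated_traffic += clicks
    traffic_cost += clicks * cpc  # what this traffic would cost to buy as ads

print(f"Estimated monthly traffic: {estimated_traffic:.0f} visits")
print(f"Hypothetical Traffic Cost: ${traffic_cost:,.2f}")
```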
The SEO starter guide describes much of what your SEO will do for you. Although you don't need to know this guide well yourself if you're hiring a professional to do the work for you, it is useful to be familiar with these techniques, so that you can be aware if an SEO wants to use a technique that is not recommended or, worse, strongly discouraged.
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[39] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links[40] in addition to their URL submission console.[41] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[42] however, this practice was discontinued in 2009.
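For reference, the XML Sitemap feed mentioned above is just an XML file listing a site's URLs. Here is a rough, hypothetical Python sketch that writes a minimal one; the URLs are placeholders, and real sitemaps also support optional tags such as lastmod and changefreq.

```python
# A minimal, hypothetical sketch of generating an XML Sitemap feed.
# The URLs are placeholders; real sitemaps also support optional tags
# such as <lastmod> and <changefreq>.
urls = [
    "https://example.com/",
    "https://example.com/services/",
    "https://example.com/blog/first-post/",
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url in urls:
    lines.append(f"  <url><loc>{url}</loc></url>")
lines.append("</urlset>")

# Write the file, then submit it via Google Search Console.
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write("\n".join(lines))
```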
“In conclusion, this research illuminates how content characteristics shape whether it becomes viral. When attempting to generate word of mouth, marketers often try targeting “influentials,” or opinion leaders (i.e., some small set of special people who, whether through having more social ties or being more persuasive, theoretically have more influence than others). Although this approach is pervasive, recent research has cast doubt on its value (Bakshy et al. 2011; Watts 2007) and suggests that it is far from cost effective. Rather than targeting “special” people, the current research suggests that it may be more beneficial to focus on crafting contagious content. By considering how psychological processes shape social transmission, it is possible to gain deeper insight into collective outcomes, such as what becomes viral.”
I would also advise continuing to do what works. If something you have rolled out generates great traffic and links, bring out a new version of the content: for example, if the 2012 version worked effectively, bring out the 2013 version. Another effective strategy is to make the piece of content into an evergreen article which you add to over time, so it is always up to date.
Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don't require an internal "search" functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.

“To give you an example, our domain authority is currently a mediocre 41 due to not putting a lot of emphasis on it in the past. For that reason, we want to (almost) automatically scratch off any keyword with a difficulty higher than 70%—we just can’t rank today. Even the 60% range as a starting point is gutsy, but it’s achievable if the content is good enough.”
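To make that concrete, here is a toy Python filter expressing the quote's heuristics. The keyword list and difficulty scores below are invented, and the 70% cutoff and 60% "gutsy" zone are the author's own rules of thumb, not anything from a SEMrush or Moz API.

```python
# Hypothetical keyword candidates: (keyword, difficulty %).
candidates = [
    ("seo tools", 82),
    ("keyword difficulty explained", 63),
    ("how to estimate organic traffic", 41),
]

MAX_DIFFICULTY = 70  # "(almost) automatically scratch off" anything above this

for keyword, difficulty in candidates:
    if difficulty > MAX_DIFFICULTY:
        continue  # "we just can't rank today"
    if difficulty >= 60:
        note = "gutsy, but achievable if the content is good enough"
    else:
        note = "realistic starting point"
    print(f"{keyword}: difficulty {difficulty}% ({note})")
```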

By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
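For context, the term density the paragraph refers to is a trivially computable signal (occurrences of a keyword divided by total words), which is exactly why it was so easy to game. A minimal sketch, with an invented spammy example:

```python
# "Keyword density" is occurrences of a term divided by total words,
# an easily manipulated signal early engines over-relied on.
import re

def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

# An invented, stuffed example page:
page = "cheap flights cheap flights book cheap flights today"
print(f"Density of 'cheap': {keyword_density(page, 'cheap'):.0%}")  # 38%
```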


Get a FREE website traffic report for any domain, with unlimited access 24/7. Traffic Estimate has been providing a website traffic estimator, site rankings, and analytics since 2004. We strive to provide useful information for website owners, domain buyers, and SEO gurus. You can use our traffic estimator, check statistics, and monitor data on just about any domain. See how to drive more site traffic, track competitors, compare websites, view related keywords, and increase your web traffic footprint by using our SEO tools, stats, and partners.
In addition to the Cowboys and the Mavericks, Dallas is also home to the Dallas Stars of the National Hockey League, the Texas Rangers of Major League Baseball, who are perennial contenders for the American League title, and FC Dallas of Major League Soccer. The Rangers also play in Arlington at Rangers Ballpark, while the Dallas Stars share the American Airlines Center with the Mavericks, and FC Dallas has its own purpose-built soccer stadium in Frisco.

Thick & Unique Content – There is no magic number in terms of word count, and if you have a few pages of content on your site with a handful to a couple hundred words each, you won’t be falling out of Google’s good graces. In general, though, recent Panda updates in particular favor longer, unique content. If you have a large number (think thousands) of extremely short pages (50–200 words of content), or lots of duplicated content where nothing changes but the page’s title tag and, say, a line of text, that could get you in trouble. Look at the entirety of your site: are a large percentage of your pages thin, duplicated, and low value? If so, try to identify a way to “thicken” those pages; or check your analytics to see how much traffic they’re getting and simply exclude them from search results (using a noindex meta tag), to keep from having it appear to Google that you’re trying to flood their index with lots of low-value pages in an attempt to have them rank.
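As a rough sketch of that audit, assuming you can export per-page word counts and traffic from your CMS or analytics (the URLs, field names, and thresholds below are all hypothetical):

```python
# Flag thin, low-traffic pages as candidates to thicken, consolidate,
# or noindex. `pages` stands in for a CMS/analytics export; the field
# names and thresholds are hypothetical.
pages = [
    {"url": "/guide/anchoring", "word_count": 1800, "monthly_visits": 950},
    {"url": "/tag/rope-3",      "word_count": 120,  "monthly_visits": 4},
    {"url": "/tag/rope-4",      "word_count": 130,  "monthly_visits": 2},
]

THIN_WORDS = 200    # "extremely short" per the paragraph above
LOW_TRAFFIC = 10    # barely any visits, so little is lost by excluding them

noindex_candidates = [
    p["url"] for p in pages
    if p["word_count"] < THIN_WORDS and p["monthly_visits"] < LOW_TRAFFIC
]
print(noindex_candidates)  # ['/tag/rope-3', '/tag/rope-4']
```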


5. Link building. In some respects, guest posting – one popular tactic to build links, among many other benefits – is just content marketing applied to external publishers. The goal is to create content on external websites, building your personal brand and company brand at the same time, and creating opportunities to link back to your site. There are only a handful of strategies to build quality links, which you should learn and understand as well.
Not all web traffic is welcomed. Some companies offer advertising schemes that, in return for increased web traffic (visitors), pay for screen space on the site. There is also "fake traffic", which is bot traffic generated by a third party. This type of traffic can damage a website's reputation, its visibility on Google, and overall domain authority.
The other form is just social media. So, this is just social. This is using the platforms, such as Facebook, Facebook Groups, Instagram, and YouTube, using social media in order to promote your business, your links, or whatever. So, to me, it depends: the quality of the traffic that you get from social really depends on the platform. Honestly, I’d say something like Twitter would not get you very high-quality traffic. In fact, traffic from Twitter, in my experience, is usually very low quality, which means low conversions and no sales, basically. So, you know, this isn’t a hard-and-fast rule, but something like YouTube, on the other hand, is very high converting. So, basically, you’ve got social and you’ve got SEO; this is organic traffic. It’s free.
Well, as noted in the post, it is not just about the links; that was only one key part of a wider strategy. The website in question has deep levels of content. So it is not just about a blog section; they have numerous high-quality content sections we have developed over time. It would never be advisable to attack competitors' sites with low-quality links.
Brian, I recently found your blog by following OKDork.com. Just want to say you’re really amazing with the content you put out here. It’s so helpful, especially for someone like me who is just starting out. I’m currently writing posts for a blog I plan to launch later this year. I think my niche is a little too broad and I have to figure out how to narrow it down. I essentially want to write about my current journey of overcoming my fears to start accomplishing the dreams I have for blogging, business, and travel. In doing so, I will share the best tips, tools, and tactics I can find, as well as what worked, what didn’t, and why.
Another tip you can use is to reach out to the prior agency and say something like the following: “We realise you were using link networks for our website, which has resulted in a Google penalty and a loss in business. Can you please remove our website from any link network you have built?” If the prior agency is decent, they will remove the links from the network.
In my experience, a lot of people are more open about sharing traffic stats than you would think. You see this not just in interviews; if you peruse the archived articles on a blog, there’s a good chance you’ll stumble upon a “blog in review” or “traffic report” post. With those stats, you can start to figure out how much traffic the site is getting today.
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated that Google ranks sites using more than 200 different signals.[25] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization, and have shared their personal opinions.[26] Patents related to search engines can provide information to better understand search engines.[27] In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users.[28]

Use social media. Build a presence on social media networks like LinkedIn, Twitter, Facebook, Google+, etc. All of these activities help to get your name and website address out on the internet. Read about how we doubled our social media audience in a week. Add share buttons to your site to make it easy for people to share your content. And write content worthy of sharing.
Hey James - congrats on your success here. Just a question about removing crummy links: for my own website, there are hundreds of thousands of backlinks in Webmaster Tools pointing to my site. The site has no penalties or anything; the traffic seems to be growing every week. Would you recommend hiring someone to go through the link profile anyway to remove crummy links that just occur naturally?

Thank you Brian for this great article! I enjoyed reading it, even though it took quite some time of slow reading to let all the concepts sink in and try to remember them. For future reference, I also shared your article in my Facebook post so I can refer to it and share it with those who work with me. I like the way you presented the details, and it’s easy to read and understand. :)

Great guide. One thing I would like to mention (if I may) is that the importance of having a secure domain (SSL) can’t be overstated. A recent SEMrush survey revealed that over 65% of websites ranking in the top 3 organically had HTTPS domains. If RankBrain is going to look at bounce rate as a signal, then I can’t see any bigger factor than this in terms of having an effect once a user lands on a website, particularly as Google is going to make it crystal clear whether a domain is secure or not.
Would it be easier to set up two separate Gmail accounts with two separate Analytics accounts for two different websites? Or is it OK to use one Gmail account to manage two sites under one Analytics account and just have two properties inside of it? Take into consideration that it’s a local business doing services (no storefront) and might need AdWords, etc. Also take Search Console into consideration; I’m not sure how it influences Analytics / site verifications.

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]

Simple navigation reigns and quality content is king – A user-friendly website, with interesting and easy-to-find information, is what will boost your traffic. Each page needs to be built around keyword themes, with unique content, so search engines can easily index your site and rank you higher. Positive behavior from site visitors is your best bet for a better ranking, so keep the content natural and focused; avoid jargon and keyword stuffing to keep users from leaving the site unhappy and hurting its ranking.
Hi, the post is really nice, and it made me think about whether our current strategy is OK or not. Two things are important: a “high-quality content strategy” and “good-quality links”. Joining those correctly can pose some real challenges. Say we have n content writers writing for a couple of websites; to keep it generic, let’s consider one writer per website. We have to make one content strategy for the website’s in-house blog, to drive authentic traffic to it, and a separate content strategy for earning links from authentic, high-PR websites; i.e., the content strategy should work two ways: in-house and out-of-house.
This is the number of views that you can test each month on your website. It's up to you how you choose to use them, either by allocating all the views to one test or to multiple tests, on one page or on multiple pages. If you have selected the 10,000 tested-views plan and you run an experiment on your category page, which is viewed 7,000 times per month, then at the end of the month 7,000 views will be counted against your tested-views quota.
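A quick illustration of that accounting, using the plan size and page-view numbers from the example above:

```python
# A tiny illustration of the tested-views quota described above, using
# the example's own numbers: a 10,000-view plan and a category page
# that served 7,000 views into a test this month.
monthly_quota = 10_000
views_tested = {"/category": 7_000}  # page -> views served into a test

used = sum(views_tested.values())
remaining = monthly_quota - used
print(f"Used {used:,} of {monthly_quota:,} tested views; {remaining:,} remaining")
```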