SEO - Express Writers - Page 7

Why Google Killed Doorway Pages & How To Make Sure You’re In the Clear

If you use the web to surf, purchase, play, or read, it’s likely you’ve come across a “doorway” page. They’re bad news all around. How so? Doorway pages are web pages designed for the sole purpose of helping websites improve their traffic. They’re built to rank well for particular phrases or keywords and often feature spammy, keyword-stuffed content with little to no user value. In addition to clogging up the Internet, doorway pages are obnoxious and frustrating for users. Imagine coming across a piece you thought was “a thorough DIY guide on grooming your dog,” only to find it was a landing page selling you dog food, with a tiny, unhelpful blurb titled “how to groom your dog.” Yeah, bad news. Fortunately, Google is trying to fix the issue. Last year, the search engine released an algorithm update meant specifically to kill doorway pages across the web. Here’s what you need to know – and keep reading for five ways to make sure you aren’t hosting a so-called “doorway page” unbeknownst to you.

Why Doorway Pages Are Bad

There are dozens of SEO tactics designed to boost search results, but doorway pages are a particularly frustrating one. Doorway pages often masquerade as a single page on a site or as a selection of distinct domains. They’re a problem because, when a Google user enters a search query, it’s possible that he or she will receive a complete list of results that all point to the same site. This creates a negative experience for Google users and makes it difficult for them to find the information they need.

Short Answer: If You Are Investing in High-Quality Content, You Won’t Have to Worry

Remember, in 2015, Google released a massive Search Quality Evaluator guide. Google has real people who evaluate sites based on these standards. We talked about these standards here, in a massive 3,000-word blog. And this ties into doorway pages.
I know this sounds simple, but it really matters: overall, things boil down to the quality of the content you have. If it’s high, you’re in the clear. If it’s low, you’re not. Is your site messy, in both design and content? Poorly worded? An empty page or two? Then you probably have doorway pages, and Google isn’t going to like you. Or is your site, and the written content on it, genuinely good? The kind of quality you’d be proud to read aloud to your newest client? Per Google’s guidelines, that’s what things boil down to.

Key Ways to Instantly Recognize Telltale Doorway Pages

If you’re still unsure whether you might host a doorway page, Google luckily offers several pretty clear guidelines on how to identify doorway pages, so you’ll never create them or let someone else create them for you. Website owners, beware of the following key telltale signs of a doorway page:

If the purpose of a page is to rank for a search term or funnel visitors to a particular portion of a site, it’s probably a doorway page.
If the page targets generic search terms but is filled with specific body content, it’s likely a doorway page.
If the page collects and aggregates things that can already be found on the website, such as locations and product descriptions, it’s likely a doorway page.
If the page exists for the sole purpose of funneling users or harvesting affiliate traffic without offering valuable content or functional design, it’s likely a doorway page.
If the page makes it impossible to navigate to other portions of the site, it’s likely a doorway page.
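The telltale signs above can be thought of as a simple checklist. Here is a minimal, illustrative Python sketch of that idea; the page attributes, field names, and the 200-word threshold are hypothetical examples for this post, not part of any Google tool or API.

```python
# Illustrative checklist based on Google's telltale doorway-page signs.
# All field names and thresholds are made up for illustration.

def doorway_warning_signs(page):
    """Return the list of telltale signs a page exhibits.

    `page` is a dict describing the page, e.g.
    {"word_count": 120, "internal_links": 0, "targets_generic_terms": True}.
    """
    signs = []
    if page.get("word_count", 0) < 200:
        signs.append("thin content with little unique value")
    if page.get("internal_links", 0) == 0:
        signs.append("no way to navigate to other parts of the site")
    if page.get("targets_generic_terms"):
        signs.append("targets generic search terms")
    if page.get("aggregates_existing_content"):
        signs.append("aggregates content already on the site")
    if page.get("affiliate_funnel_only"):
        signs.append("exists only to funnel or harvest affiliate traffic")
    return signs

# A hypothetical thin landing page that trips four of the five checks.
suspect = {"word_count": 120, "internal_links": 0,
           "targets_generic_terms": True,
           "aggregates_existing_content": True,
           "affiliate_funnel_only": False}
print(doorway_warning_signs(suspect))
```

The point of the sketch is the audit mindset, not the code itself: walk each page through the signs one by one, and treat any page that trips several of them as a candidate for removal or rework.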
What Google’s Anti-Doorway Update Actually Does

The update Google put out, which currently doesn’t have a name, characterizes doorway pages as possessing three traits:

Doorway pages often have multiple domain names or funnel users to a specific page.
Doorway pages seek to funnel visitors to the relevant or usable portions of a site.
Doorway pages are similar pages that run parallel to search results rather than forming a “browse-able hierarchy.”

In its press release on the update, Google states that “doorway pages that are created solely for search engines can harm the quality of the user’s search experience” and goes on to say that large sites that use doorway campaigns are likely to be impacted by the impending algorithm change. In addition to helping Google users get a feel for what doorway pages are, these guidelines can also help webmasters determine whether they have doorway pages on their sites and remove them in order to avoid being penalized by Google.

5 Ways to Protect Your Site from The Worst of Google Doorway Penalties

If you’ve been using doorway campaigns on your site, there are several things you can do to avoid Google penalties.

1. Destroy empty pages

If you’ve got empty pages sitting around on your site, it’s time to take them down. Because these pages don’t add what Google calls “clear, unique value,” they’re likely to be interpreted as doorway pages and can harm your site’s overall ranking. With this in mind, only publish new pages when you have content available for them. This ties into the overall theme: is your content good? Is it high quality? Does it help people? Empty pages obviously don’t!

2. Improve your site navigation

This plays into design. Work on both the appearance and the navigability of your site. One trait of doorway pages is that they make it difficult or virtually impossible to navigate to other portions of the website.
If you’ve got a page that isn’t integrated into the navigation of your site, consider improving your navigation so that the page in question is easier for users to find and to navigate away from.

3. Pay attention to customer-generated content pages

Pages that get most of their content from customers or staff are at a high risk of sitting idle and damaging your site’s ranking. These pages include things like open-to-contributors review pages and galleries. To avoid getting dinged by Google’s recent update, be …

RIP: Google Says Farewell to Right-Hand Sponsored Ads & What It Means for Marketers

If you’ve been paying attention to SEO news lately, you’ve noticed headlines like “Four Ads on Top: the Wait is Over.” These headlines refer to Google’s recent decision to alter its SERPs to display four ads at the top of the page and remove sponsored ads from the space adjacent to the search results on the right-hand side. While it may not seem like big news in the world of SEO, it is. The new SERP layout is much more inbound-friendly and much less welcoming for traditional, cold advertisements. This creates a better user experience and produces SERPs that are more useful, relevant, and easy to navigate than they’ve ever been before! To see exactly where the ads used to be, refer to the below screenshot, with the question mark marking the empty space where sponsored ads were:

Google Says Goodbye to Right-Hand Sponsored Ads: What This Means for Marketers

So what’s this “Huzzah!” attitude all about? How does this change affect marketers and their rankings in the SERPs? How will it affect you and your content? Here’s what you need to know about this new change.

What the New SERPs Look Like

If you’ve searched for a high-volume keyword lately, you’ll notice that the SERPs look a bit different. Before this recent Google update, ads appeared at the top, bottom, and right side of organic search results. Now, for competitive search terms, they look like this, with a clean slate on the right side. Cheers, Google – you look so tidy! While the change didn’t affect the ads that appear below the search results, it did do away with the sponsored ads to the right of the search results. This leaves more room for organic search rankings and throws inbound marketing into a whole new world of importance.

Why Google Made the Change

As far as most content marketing experts can guess, the primary reason behind this change is user convenience.
Displaying ads in two places rather than three creates a simplified user experience and makes mobile search easier than ever, which is something Google now sees as a major priority (as evidenced by recent algorithm changes). Additionally, top-placed ads seem to simply perform better in terms of clicks and traffic than side-placed ads. In light of this, Google’s new change streamlines the page and focuses traffic while also enhancing user experience.

How The New Ad Placement Will Affect Organic Search

The introduction of four ads above SERPs will have an effect on search, but primarily for marketers who are paying for sponsored ads rather than creating valuable content. Because the new ad structure places more importance on organic search, it stands to reason that marketers who focus their attention on creating quality content and optimizing their material for organic search success will perform better in the new environment. While this change has shocked many content marketers, it’s clear that it is simply one in a series of Google changes (including the Knowledge Graph and Featured Snippets) that are meant to provide users with relevant, quality results without intrusive advertising.

4 Ways Content Marketers Can Cope With Google’s Sponsored Ad Changes

Again, Google’s most recent changes really only present a challenge to people who have been relying heavily on sponsored ads. For everyone else, it’s more of a pivot than a change. In light of Google’s new ad placements, organic, quality content is going to be more important than ever. Here are a few steps you can take to ensure that you appear prominently in the new SERP layout.

1. Review your sponsored ads now

If you’re a content marketer who has invested in Google’s sponsored ads and you’ve got live campaigns going on, check on their position and appearance from your dashboard. If your ads aren’t where you want them to be, consider how you can alter them to perform better.
While you can do this a few different ways (increasing your bid, boosting your position for the keywords you’re targeting, or increasing your quality score), the best way to go about it is to take a comprehensive approach to increasing your ad’s performance. Don’t jump to increase your bid first thing. Instead, seek to make changes that will improve your quality score. This means making your text more relevant and helpful and ensuring that you’re targeting keywords properly.

2. Shift your focus to content

Google has made it painfully clear over the last several years that content is the king of the game, and that’s especially true with this recent change. While sponsored ads used to rank on the same level as organic results, the tables have turned and right-hand sponsored ads have left the building. In light of this, it’s important to shift your company’s focus to how you can create quality content Google will want to rank well. Content marketers already focusing on just that – content marketing – will do well to keep boosting and improving its quality.

A Little Case Study From Moi

Yours truly, Julia from Express Writers, speaking here: we’ve been in business for five years and we’ve never used a sponsored ad. Instead, we’ve taken all of the money we could have poured into purchasing ad space and allocated it to creating content. By hiring quality writers, focusing on publishing relevant, long-form content on our blog, and creating guides that genuinely answer our readers’ questions, we’ve gained more than 300 positions in Google rankings through organic content alone. We’re proof that it is possible, and you can get there, too! If you’re a little unsettled by Google’s new SERP structure, consider how you can re-allocate your resources. If you’ve been spending the majority of your marketing budget on sponsored ads, focus on creating content instead.
In-depth guides, long-form how-to content, and relevant blog posts are all sure-fire ways to draw more readers to your content and gain SERP prominence without spending money on sponsored ads. Because the new layout of Google’s SERPs is much more …

A Guide to Google Posts, Or Podium: It’s About to Be A Brand New World

If you’ve been following the presidential race online, you’ve probably already noticed what today’s blog topic is all about. There’s been a slight change in how Google displays its search results. Search a presidential candidate on Google right now, and you’ll find a distinctly Twitter-like feed that gives Google users direct access to live-feed information from the Google platform. In addition to helping Google users locate the information they want more quickly, this feature has the potential to eventually expand to celebrities and businesses, allowing them to have their own personal content feeds embedded within the Google platform. Welcome to a new world: Google Posts, or Podium, as the feature is currently being called by news sources.

Google’s Circle of Life: Google Posts (or Podium) Opening in Search Results

Here’s what you need to know about this huge change that’s in the works in the Google-sphere.

What Do The New Search Results Look Like?

If you head to Google right now and type in the name of any presidential candidate in the race, as I did with Bernie Sanders’s name, you’ll find results that look like this:

The display features a small blue checkmark indicating that Sanders is a verified presence. In addition to allowing users to scroll through a live feed of recent news, it’s easy to share the content directly to Twitter, Facebook, Google+, or email by simply clicking the social sharing icon at the bottom.

Who Can Use It? Google’s Wait List for Their Twitter Lookalike is Open

While the new SERP layout is available only to two parties – select businesses and the current US presidential candidates – many experts expect it to open up significantly in the future. In fact, Google is currently offering a waitlist where you can request the feature for your business.
While Google has been relatively quiet about this new change, the search engine giant has called it an “experimental new podium” that allows users to “hear directly from the candidates they’re searching for in real time…” The change allows the presidential candidates to share images, videos, and text directly to the Google platform, where they appear instantaneously in the search results. Plus, since Google has made it so easy for users to share the content directly from the platform, it’s clear that there’s a certain viral intention present in the change.

How Google Posts in Search Results Works

Google began experimenting with this new feature last month, releasing it as a way to give political candidates a way to post lengthy responses to one another and expound on their stances on issues such as education, gun control, and immigration. In combination with other recent innovations like Google’s mobile “Cards,” this new scrolling feed is a fantastic way for Google users to gain access to the information they need and want quickly and easily. While many people are calling the new feature “Posts” or “Podium,” Google hasn’t actually endorsed either name, and the feature doesn’t technically have a moniker just yet. While the rollout of the new feature is still quite limited, the content it provides appears prominently in Google’s SERPs, which could be a fantastic platform for marketers and businesses down the road. The current format of the posts looks quite a lot like a Google+ post, although Google users are unable to comment directly on the page or follow the content. As it stands now, the feature is ideal for sharing and rapid-fire updates on important social and political issues.

Why the New Google Posts SERP Matters

In addition to the political candidates currently using the feature, there are several small businesses, including Andrew Jewellers and A Healthy Choice, that have surfaced with the new search feature.
For these businesses, the new search feature has the potential to revolutionize their search results by providing more traffic and easier social sharing. Because Google has such a huge reach (much larger than any social network), having content featured directly on the search platform could provide any business using the feature with a huge SEO boost. Because of this, the platform has the potential to provide more reach than any other social media platform, which could make it the perfect place to deliver news, updates, and offers.

The Future of Google’s New Search Feature

According to Google’s current statements, there are no plans to charge for this search feature in the future, although there’s a good possibility that it could eventually become a feature that competes with Facebook’s Sponsored Posts. If that happened, it would allow brands or individuals taking advantage of the feature to sponsor posts on the basis of keyword searches. While the future of this innovative new search layout remains uncertain, it’s clear that Google is committed to offering clear, relevant, and up-to-date content to users without too many hoops to jump through.

Yes, There Are Pros: The Top 5 Advantages of Google’s New Search Look

While Google’s most recent feature is still in a very limited rollout stage, we’re excited to see where it goes. Here are the top five advantages I expect the platform to bring to content marketers who can get their hands on it:

1. Increased visibility

Getting to the top of Google’s SERPs has always been a battle and, if this new feature rolls out on a widespread basis, it could change the way we relate to ranking. While there’s no doubt that ranking well in Google through things like SEO, content, links, and referrals will always be important, the new feature could help small businesses and individuals display content prominently to their customers directly from a simple, easy-to-use platform.
This, in turn, could help even small businesses build their online brands quickly and easily, without the expense so often associated with high-level SEO and content marketing.

2. More relevant content

Since Google’s new feature operates on a live-feed basis, it’s constantly shuffling the most recent …

A Guide to Boosting Your Site Authority in 2016

Site authority (also called domain authority or page authority) refers to the quality of a website and the amount of authority, trust, or pull it has in the rankings. In other words, it’s a big deal for marketers looking to boost their web presence. Think: higher site authority = more traffic + higher rank in Google’s SERPs. Site authority itself is based on several different factors. As a general rule, the stronger those factors are, the stronger your site is likely to be. And guess what ties in? Fundamentally, good content. We’re about to discuss all the major tips and tricks you need to know to maintain, boost, and grow your overall site authority, so keep reading.

A Guide To Boosting Your Site Authority

If you’re looking to boost your site authority in 2016, here’s what you need to know. Let’s delve in!

Defining Site Authority

Site authority is a ranking metric that determines how authoritative and trustworthy your site is. It’s based on a variety of factors, including link profiles (how many links point to your site and how well-known and respected those linking sites are), content strength, page design, and other trust factors. And site authority can be improved or damaged as a result of changes to these factors. The main term for this, domain authority, was actually developed by Moz as part of their MozBar DA algorithm; it’s a 1-100 metric, calculated by their algorithm, that says how strong your website authority is. Here’s an example of the Moz DA in action, calculating the DA of a YouTube video for “boxer training tips.” Incidentally, it calculates YouTube itself, so the DA is a whopping 100. Have you seen Google’s 160-page document, released in November of 2015, that defines search guidelines? I broke it down in our major blog post here, as well.

6 Key Steps to Improving Your Site Authority

As I mentioned earlier, site authority is not something that is rigid and unmovable.
Google is constantly evaluating a site’s authority levels, so it’s not uncommon for these rankings to fluctuate based upon improvements or declines in a site. Like a brain that never rests, Google is continually updating and working, always keeping an eye on your site and its overall quality. So there are many things you can and should do to bring up your site authority. Follow our six key steps for better site authority rankings now and in the future.

Step #1: Publish fresh content

Fresh content is a powerful tool in the world of SEO. This is true for several reasons. For one, every fresh page you publish is one more that Google can index. This gives you more opportunities to get your content in front of viewers and rank well in Google’s SERPs. Additionally, Google loves fresh content and has displayed a blatant bias toward it for years now. This is because Google knows that its users crave fresh content and, as such, the search engine giant is happy to rank sites that publish ample amounts of fresh content higher than those that go without updates for months. In fact, HubSpot has found that companies that publish more than 16 blog posts each month earn an average of 3.5x as much traffic as companies that publish between zero and four blog posts each month. (I mean, I could have told you that. I’ve blogged over 600 times on our site and see a huge amount of growing traffic.) Those results hold true for both B2B and B2C companies. To freshen up your content, you can do a few things. First (and most obviously), you can publish new content. To do this effectively, you’ll want to focus on targeting keywords that provide fresh results or writing about hyper-current events in your industry. The second option is to update older content. This is a common go-to for many SEOs and, when done correctly, can help you get the most out of your existing material.
Maybe your old content used to rank in the top two or three SERP spots but is now stuck in the lower half of the page. In cases like this, a simple update will commonly solve the problem. There are many guides on how to update old content, but as a general rule, you’ll want to add new details, update anything that’s no longer factual, and seek to make the piece relevant to today’s readers. Finally, you may consider publishing a regularly updated series if your content focuses on industries that change constantly, like sports. Because sports news changes on a daily or weekly basis, a page that isn’t being updated continually is nearly useless. For this reason, sites that focus on these types of topics can easily increase their site authority by offering regular, real-time updates on the industry. While fresh content is all well and good, there are some things you’ll have to remember. The first is that publishing fresh content does not mean simply publishing the same content (or even very similar content) over and over again. That places you at huge risk of being down-ranked by Google’s algorithms and failing to truly meet your users’ needs. With that in mind, focus on providing valuable information and helping users answer questions or solve problems every time you publish fresh content. You should also keep in mind that creating and publishing expert content (as per Google’s recently released Search Quality Evaluator Guidelines) can help you gain increased site authority. While it seems simple, this step will go a long way toward increasing your site authority.

Step #2: Create landing pages

Landing pages do exactly what they promise: they give readers a place to land. While landing pages are used for a variety of purposes, you’ll commonly see them used to offer specific information, simplify complex concepts (a promotion, for example), or provide a one-stop shop that allows audiences to gain the information they need to convert.
Landing pages are especially important for marketers who use PPC ads, affiliate links, or a variety of channels to produce and …

Is Keyword Density Essentially Dead?

For many years, “keyword density” was the holy grail of SEO content. There have been dozens, if not hundreds, of theories on the optimal keyword density, from published formulas to strict guidelines on how writers should use keywords in SEO content daily. No one knows better than me. I came from the old Google days (pre-2012), when online writers were sometimes treated as the minions of SEO black hats. It was rough: we had to stuff in those keywords like nobody’s business, no matter how awkwardly they read. According to many industry experts, however, keyword density is now completely dead. Is that true? Let’s chat about it.

What Really is Keyword Density?

Keyword density is the measurement that indicates how many times a keyword appears in a piece of content (i.e., a blog post or web page) versus the total number of words in the piece. Keywords were counted within content, headings, meta descriptions, image names, and alt tags to provide what many experts believed, for many years, was a better user experience. Keyword density was calculated as the number of times a specific keyword was included in the content, divided by the total number of words in the analyzed text, multiplied by 100. For example: 10 keywords in 500 words = 2% keyword density. This formula, while it may seem meaningless today, was very popular in and around 2011 and was widely believed to be the “right” way to do SEO content. As I mentioned earlier, I remember the days of stuffing my SEO content with keywords and counting said density, and it was rough. Marketers in 2011 thought creating 50 articles on the exact keyword “payday loan Atlanta Georgia” was a good idea, and those stuffed-in keywords made the content read like the most overstuffed Thanksgiving turkey you ever saw. Now, as you and any good content marketer know, today’s content is more about people than keywords. And that’s the real reason we don’t count our keywords anymore.
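For the curious, the old formula is simple enough to sketch in a few lines of Python. This is a minimal illustration of the calculation described above; the sample text is made up, and the function only handles single-word keywords.

```python
# Keyword density = (keyword occurrences / total words) * 100.
# A minimal sketch of the classic formula; not a recommendation to use it.

def keyword_density(text, keyword):
    words = text.lower().split()
    total = len(words)
    # Count single-word keyword occurrences; a multi-word phrase would
    # need a sliding-window count instead.
    hits = words.count(keyword.lower())
    return round(hits / total * 100, 2) if total else 0.0

# Build a 500-word sample containing the word "seo" exactly 10 times:
# 10 occurrences in 500 words -> 2% density, matching the example above.
sample = ("seo " + "filler " * 49) * 10
print(keyword_density(sample, "seo"))
```

Which is exactly the point of the rest of this post: the number is trivial to compute, and just as trivial for a modern search engine to see past.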
Let’s explore further why keyword density is officially dead.

The Keyword Density Booby Trap

According to Moz, companies that focus too much on attaining a certain keyword density often run the risk of ruining content, slaughtering credibility, annoying readers, and earning themselves all too many “back” clicks. Strong words, right? Unfortunately, Moz is right. It stands to reason that keyword density may be one of the great SEO myths of our day. While keyword density is meant to create a more readable document, more often than not it simply destroys the legibility and readability of content, creating low conversion rates and a poor user experience. Why, then, have we been taught that keyword density is the end-all, be-all of SEO and high-quality content? For one, many people were taught to believe that keyword density is how search engines determine the relevance of a given page. This couldn’t be further from the truth. According to Moz, if search engines focused solely on keyword density to rank pages, all content creators would need to do is repeat the keyword phrase of choice over and over again in order to rank well in Google. And, trust us, Google is not that stupid. In fact, it’s likely that Google evolved beyond that in the late ’90s. That said, it’s unwise to use density as a reliable metric in today’s search climate. Most reputable keyword tools have already kicked it to the curb. In order to rank pages, Google does take keywords into account, but the actual density doesn’t matter nearly as much as we’ve always believed it does.

The Pitfalls of Keyword Density

When you really think about it, keyword density is a fluid term. To have a certain number of keywords in a piece of content is one thing, but to attain the correct relative position and dispersion of keywords throughout the document is entirely another.
The traditional measure of keyword density fails to take into account things like how many documents are relevant for a given keyword, or how the piece of content targeting the keyword uses things like internal linking, webpage structure, user experience (including how long users interact with a page and what the page’s bounce rate is), domain age, and backlinks.

Yes: Keyword Density is Now Confirmed as a Useless Metric

Because of this, keyword density is essentially a useless metric, one many industry leaders believe is a complete waste of time. According to Moz, “people who chase some mystical on-page keyword density are probably doing more harm than good.”

The Rise of Semantic Search

In 2013, Google released the Hummingbird update. This update allowed Google to process search results based on semantic search, which evaluates results on their ability to match user intent rather than ranking them by keyword density and other Boolean measurements. As soon as Hummingbird came out, marketers began to re-evaluate their relationships with keyword density. Specifically, many marketers began wondering if keyword density mattered as much as they had always thought it did. Overwhelmingly, the answer was “no.” Before semantic search, Google used metrics like keywords and linking architecture to determine which pages were the best match for a reader’s intent and query. Once the search engine had evaluated these things, it returned rank-ordered results based largely upon how well the page’s keywords matched the query and the links within the site as a whole. More keywords generally meant a more positive evaluation. This led to the rise of keyword density, and to many marketers beginning to see it as one of the best ways to rank well in Google. This, in turn, led not only to a craze over keyword density but also to black-hat SEO tactics like keyword stuffing and predatory linking strategies.
Semantic Search Devalues Keyword “Overstuffing”

Fortunately, the introduction of Hummingbird altered the playing field in a big way. One of the main things Hummingbird did was use semantic search principles to make keyword stuffing and misleading linking strategies too difficult and expensive to pursue. Because Hummingbird evaluates content based on its …

Does Social Media Really Matter to Your SEO Rankings?

Trying to find out how social media affects SEO is a little bit like staring through a really dirty window: it’s unclear and often frustrating. While business owners know that social media is important for many things, including building and maintaining visibility on the web and driving customer engagement, it can be tough to tell just how much (if at all) social media affects SEO rankings. The confusion is justified, of course: in 2010, Google’s Matt Cutts released a video saying that social signals were an SEO ranking factor. In 2014, however, Cutts released another video saying that social signals weren’t actually a ranking factor. (This was addressed in late 2015 in a Stone Temple article, “Does Social Media Effect SEO? Matt Cutts answers”. I’d definitely recommend this read: it addresses the topic well from an SEO standpoint.) In the new version of the video, Cutts clarifies some of the ambiguity that’s plagued SEO for years and gives us a final (if not drastically clearer) answer to the question: does social matter for SEO rankings? I’m going to take a look. Keep reading.

How Social Factors Play into SEO: 3 Key Points

There’s been hot debate over whether or not social signals matter for SEO rankings. Here’s what we know:

1) Social signals on Facebook and Twitter are treated like web pages in SEO

When a search engine crawler looks through Facebook and Twitter feeds, it can pull out individual pieces of content and select the ones that are valuable. This does two things. First, it helps Google cut down on the unimportant social signals out there, of which there are many, considering that Twitter users generate more than 500 million tweets per day. Second, it helps Google be more selective in what it indexes and where that content is indexed, which can be a good or a bad thing for marketers depending upon the strength of the social factors that do matter (more on this later).
2) Google’s crawlers can’t cover the entire social universe While they may be studious, even Google’s crawler bots aren’t capable of evaluating every social page on the web. Part of this is because the bots sometimes get blocked from doing this work, as Barry Schwartz claims they were from Twitter. While Google’s crawler bots can still see every Tweet that is posted on the platform, they are inherently limited in their ability to index every Tweet. According to a study released earlier this year, Google actually indexes less than 4% of all Tweets issued. Google doesn’t trust social signals it can’t fully index and, because of this, it’s possible that signals like individual Tweets don’t influence SEO as much as one may think. 3) Drumroll please… Google says it does not use Facebook or Twitter followers to rank pages This has been a long-debated topic and one that has been fraught with misunderstanding and confusion. According to Matt Cutts, though, Google doesn’t use signals like Facebook or Twitter fans or followers simply because they don’t have “a high confidence in the meaning” of those signals. This is because Google’s bots are simply incapable of crawling all of Facebook and Twitter, which means that there is no way they can glean all of the connections and signals they would need to make a complete and well-rounded assessment of the content on those sites. Additionally, Facebook and Twitter followers are fluid and ever-changing, and it’s possible that a bot that visits a page in the early stages of its existence, for example, may not visit it again for a very long time, which may inappropriately skew the page’s rankings. 
Similarly, a page that has a huge number of followers when a crawler bot visits may then lose a huge number of followers (this is far from impossible; it actually happened to many high-profile Instagram users when the site destroyed millions of spammy profiles in what is now known as “The Instagram Rapture”), but since the crawler bot will not reflect that loss, it’s impossible for the bot to make these social signals an accurate representation of the site itself. Cutts also put to bed the Moz and SearchMetrics studies that claimed Facebook likes were one of the most important factors for sites that rank well in Google’s SERPs. While many people believed that these studies proved that social signals in fact caused high-level search rankings, Cutts explained that this was simply due to the fact that the sites which produce high levels of fan engagement are also the sites that attract a high number of inbound links and other authority metrics that do have a direct effect on SEO ranking. What this Means for Social and SEO: 3 Things Marketers Need to Know So – what’s the answer? Does social matter for your SEO? The answer is complicated and two-fold: it does and it doesn’t. We’ve just addressed the reasons it doesn’t – some social signals are too unstable to be reliable ranking metrics and others are simply impossible for Google to attain. But what about the ways in which social signals do affect your SEO? Right now, one of the most important things in the world of great SEO rankings is content that meets search user needs. Despite the confusing back-and-forth of “SEO or no SEO?” some social signals are indeed important for SEO and, moreover, sites that pay attention to the following things are more likely to rank higher in SERPs. 1) Social links matter While we’ve established that social signals don’t matter for SEO, social links might. 
Consider the following: when a post goes viral on Facebook or Twitter (as Mark Manson’s “7 Strange Questions That Help You Find Your Life Purpose” blog post did, for example) and thousands of people are sharing and searching for it, does that affect search rankings? Many marketers believe that it does. We know that Bing pays attention to viral posts and, according to the 2014 Cutts video, Google treats individual … Read more

Google RankBrain Launches, 15% of New Keyword Searches Come to Light


Big news in the Interwebz! On October 26, 2015, Google officially released news that it has begun using an artificial intelligence page ranking system called “RankBrain.” This AI (Artificial Intelligence) system is designed to help Google organize and categorize all of its search results and news of it is currently breaking the Internet. For those of you who are unfamiliar, here’s the skinny on the new RankBrain technology. What is Google RankBrain? Google RankBrain is an AI system that Google designed to assist in processing search results. The system operates by teaching itself how to complete a task and is currently being used to search the billions of pages in Google’s ranking indexes in order to find the ones that are most relevant and most valuable for a given search query. Because the release is so new, it’s still a little unclear whether or not RankBrain is a part of the entire Google Algorithm known as Hummingbird, but sources like Search Engine Land believe that it is. There are dozens of components that make up Hummingbird and many SEOs believe that RankBrain is simply the latest. This is fueled by the fact that Bloomberg Business reported that Google RankBrain won’t handle all searches as the algorithm would, and is only responsible for a portion of them. According to Google, RankBrain has been live since early in the year and has been fully rolled out for several months now. RankBrain will affect a huge number of queries and, as queries continue to roll in, the AI system will continue to become more advanced and learn to make predictions about certain search patterns. In fact, RankBrain is already beginning to get better at predicting a page’s rank than its human counterparts: according to recent information, engineers involved in developing the software were asked to guess where various pages would be ranked according to Google’s ranking signals. While the engineers guessed correctly 70% of the time, RankBrain got it right 80% of the time. 
(Better than human?) People who want to learn more about exactly how the AI properties of RankBrain function can consult this blog post (although the technology is not called RankBrain in the post). How Does Google RankBrain Work? The details on this are still foggy but right now the best guess is that RankBrain is used to interpret searches that are submitted to Google and to match them with pages that may not feature the exact keyword phrase that was searched for, but which are relevant nonetheless. This is an expansion on previous Google technologies that allowed the search engine to present pages that didn’t feature the exact search terms entered – so that people searching for “running shoes” would also see pages that targeted the keyword “sneakers” and so on and so forth. Right now, Google receives over 3 billion searches on a daily basis and, in 2007, the search engine giant reported that 20-25% of those search terms were totally unfamiliar. In 2013, that number scaled down to 15%, which was still significant for such a huge machine (it amounts to about 450 million search terms each day that Google has never seen). The 15% estimation holds true today and, presumably, RankBrain is a way to refine and categorize those queries in order to deliver better results for Google users. How RankBrain Is Involved in Google’s Ranking Signals When it comes time to rank a webpage, Google uses a wide variety of so-called “signals” to determine how to rank the page in the index. Things like bolded words, mobile-friendly pages, and local listings are all signals that Google uses to rank a page. These signals are processed by various parts of the algorithm in order to determine which pages show up in SERPs and which do not. According to Google, there are more than 200 big-time ranking signals used when ranking each page. Many people believe that these 200 signals then give way to up to 10,000 sub-signals. 
This is important to know because, seemingly out of the blue, Google is now saying that RankBrain is the third-most important ranking signal in existence right now. Although we know that this is a huge development, since we don’t know exactly how RankBrain will look in the coming months, it’s hard to tailor content to this development as it stands now. The Future of Google RankBrain Right now, many SEO experts believe that the presence of RankBrain may indicate a future trend toward voice searches. Because people don’t issue voice searches the same way they issue text-based searches, search engines and marketers alike need to start adapting now. For example, a person who wants to issue a voice search may ask “Where can I get a bagel in NYC?” while a text search may look more like “bagels NYC.” The fact that RankBrain is capable of learning, adapting to, and predicting a variety of new search queries indicates that the Google engineers may be predicting an AI system that can eventually answer basic questions and even complete easy puzzles. This, ultimately, is an extension of a program called The Knowledge Graph, which Google released in 2012. The Knowledge Graph was Google’s way to reach toward becoming more intelligent about the connections between words. With the inception of this program, Google moved toward doing what it called searching for “things not strings.” This meant that Google went beyond searching only for information that matched a string of letters and began, instead, searching for pages that provided answers to the questions a person was probably asking through their search queries. For example, The Knowledge Graph allows searchers to enter a term like “when was Nixon born?” and get a direct answer without ever specifying that they mean President Nixon. 
As RankBrain becomes more established, it seems evident that the service will combine with other Google technologies, such as rich answers (more on these in a moment) to create an intuitive search experience that allows the search engine to see … Read more

Is Google Authorship Coming Back?


I remember those days… the good ol’ days of G+ Authorship. Your picture would show up next to your website/blog, if you set it up correctly, like mine did:   A Little Google Authorship Background Introduced in 2011, Google Authorship was a service that allowed for the connection of multiple pieces of content with a single author. The idea behind it was to provide a sort of scoring system by which authors could be ranked based on their authority and trust signals. This, in turn, would allow Google users to find content that had been written by the same writer and would help that writer establish legitimacy and credibility. Although it was introduced as a shining star that would allow writers to stake claim to their own content, it was a short-lived affair. After an extensive series of changes, Google pulled Authorship support from its services in August 2014, although it threw audiences for a loop by telling them to keep Authorship source code alive. That left many SEOs wondering if Authorship was coming back and, if so, when? To answer that question, let’s take a look back at the past. Why Google Authorship Died When Google pulled support for Authorship, Google’s John Mueller stated that there were two main reasons that Authorship was chopped. Those reasons were as follows: Low adoption rates To put this simply, people simply weren’t using Authorship. Google caught on to this the first year Authorship was launched and, by 2012, Google had made attempts at auto-attribution that would allow content to be attributed to its rightful author even if that author didn’t participate in the Authorship platform. Immediately thereafter, however, it became clear that mis-attribution had become a problem. It was such a problem, in fact, that the service listed Truman Capote (then dead for 28 years) as the author of a New York Times article. Whoops! 
Minimal value to users In its original form, Google Authorship didn’t perform, and the Google team noticed that the service was producing little difference in click behavior on Authorship and non-Authorship pages. This, combined with the service’s mis-attribution problems, was enough to bury it in a shallow grave in 2014. Is Google Authorship Coming Back? Despite its original failings, Google seemed to have a soft spot for Authorship and the team provoked much curiosity when they killed Authorship but told audiences to leave the authorship source code live. Some people, when asked if Google Authorship is coming back, would argue that Authorship never actually went away. Sure, the author images disappeared from the SERPs, but Google has never stopped its mission to interconnect information. Since Google seems increasingly hesitant to confirm updates, however, it seems unlikely that they’re going to say anything definitive about Google Authorship until it’s here, or not. Conclusion The one thing we can say is that Google Authorship seemed like a promising service. Although it ran into its fair share of trouble in the beginning, it’s not impossible to imagine that the Google search team may choose to resuscitate the platform and use a renewed version of Authorship to do everything from determining author rank to displaying in-depth articles in SERPs. Until we receive further clarification from Google, though, all we can do is wait and wonder.  
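As a historical footnote, the “authorship source code” Google told site owners to leave live was the rel="author" link markup. A minimal sketch of the form it commonly took (the Google+ profile URL below is a placeholder, not a real ID):

```html
<!-- Typical Google Authorship markup, circa 2011-2014 (placeholder profile URL) -->
<link rel="author" href="https://plus.google.com/110000000000000000000" />

<!-- Inline alternative: a visible byline link carrying the ?rel=author parameter -->
<a href="https://plus.google.com/110000000000000000000?rel=author">Jane Author</a>
```

Either form connected a page to its author’s Google+ profile; the profile, in turn, listed the site under “Contributor to,” closing the verification loop.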

Why Google’s Newest Panda Might Not Hit Till 2016


Anyone who is familiar with Google’s algorithm updates knows that they are extensive, frequent and often vague in terms of detail. Although Google’s most recent update, Google Panda 4.2, was released in July of 2015, many SEOs believe that the newest major Panda update isn’t actually done rolling out. Read on to learn more. Looking At the Current Panda 4.2 Update When Panda 4.2 was introduced in July of this year, its express purpose was to reward quality content and down-rank scraped, duplicate or low-quality content. From the get-go, it was clear that this Panda was a little slower moving than the others, but, even as the weeks began to wear on, very few SEOs expected so much time to pass before admins started to notice changes in their sites. Sites that were hit by Panda 4.1 have had to wait 10 months to redeem themselves and, due to the fact that 4.2 is also painfully slow to roll out, it’s unclear whether the changes made by those previously affected sites have actually been effective. Needless to say, this is a source of frustration in the SEO community. In addition to being amazingly slow to implement, Panda 4.2 is also impressively extensive. When Google released Panda 4.2, it stated that 2-3% of Google’s search queries would be affected but, when you take into account that Google gets billions of daily searches, 2-3% equates to roughly 36 million affected queries. This makes it the broadest update in quite some time due to the fact that, between 2011 and May of 2014, no individual Google update affected more than 2.4% of search queries. Even though SEOs know that Panda 4.2 will affect millions of searches, the entire process has been so slow to roll out that nobody knows how, exactly, those millions of searches will eventually be affected. In light of all of that ambiguity, it’s tough to make any solid statements about Panda 4.2. 
The one thing that SEOs do know for sure about this Panda update is that, as usual, quality content is the only safe place to be. This means avoiding things like keyword stuffing, ugly sites or exploitative SEO practices. Because Google has dedicated itself to rewarding high-class content and down-ranking sub-par content, sites that publish well-written, original content are less likely to be negatively affected by the mysterious new Panda updates. While nobody quite knows just yet what Panda 4.2 will reward or punish, most SEOs believe that creating great content and avoiding bad SEO practices is a safe place to sit and wait for the changes to begin showing themselves. Questions Regarding Panda 4.2’s Status The main question people have about Panda 4.2 is “is it here?” The answer is “Yes – but only kind of.” Panda 4.2 was instated around July 18th, 2015, at which point Google said the changes would take place over “several months.” Obviously, that’s vague at best and absolutely unclear at worst. What we do know is that Google waited 10 months after Panda 4.1 to release its next update and that a large portion of that was due to technical glitches and complicated system issues. Additionally, while it’s not uncommon for a Google algorithm change to take a number of months to go into full effect, it’s clear that this one is going extra-slowly. This is actually the slowest Panda roll-out to date and, as such, many SEOs are concerned that the new changes are so complex that they can’t easily be instated in a matter of days. Google, however, says this isn’t true. During numerous interviews, the company has stated that the new Panda roll-out isn’t purposely going slowly to confuse SEOs or to make life more difficult. Instead, the company says that this slow Panda roll-out is a preview of coming attractions: a time when Panda will be one, large, continuously rolling and changing system that is incorporated into the company’s most fundamental algorithms. 
While the company acknowledges that they’re not there yet, it seems as if this ultra-slow Panda roll-out is the first step in that direction. Additionally, the company has assured SEOs that, while Google Panda 4.2 is a site-wide action, it is unlikely to affect each of a site’s pages in the same way, which is why many SEOs have yet to see any real or definitive changes. Sites that are bound to be dinged by Panda 4.2 may not actually know it until the entire roll-out is complete – some time in the distant future. At that point, affected sites will need to wait until Google releases its next update in order to revamp their pages. The Verdict: The Newest Panda In the Works May Not Hit Us Before 2016 Considering the fact that there were 10 long months between Panda 4.1 and Panda 4.2, it seems unlikely that the next update will happen any time before 2016. For now, the best Google has to offer is that site owners should keep an eye on their analytics. By isolating organic traffic driven by Google and closely analyzing any large dips or boosts, site owners can begin to get an idea about whether the big Panda is here and, furthermore, whether it has begun to affect their sites or not. What You Can do to Stay Sane During the Slow Roll-Out Although many SEOs feel a bit helpless during this time of Google ambiguity, there are a few things you can do to mitigate the uncertainty and ascertain whether Panda 4.2 has affected your site: 1. Frequent Site Audits. The first is that you can keep checking your site for any positive or negative impacts that seem different from the norm. This is likely to be a strong indicator that Panda 4.2 was there. Unfortunately, however, once the changes are instated, it’s too late to do anything about your site and, if you have content that is going to be punished by Google, it will be punished no matter what you do.  2. Stay Informed. With that in mind, it’s … Read more

How to Syndicate Without Being Duplicate: The 101 On Content Syndication


For those who don’t know, “content syndication” is the act of publishing a piece of writing multiple times in multiple locations. Keep in mind, however, that good content syndication does not mean copying and pasting the same article for use over and over again without any attribution or precautions. Google recognizes the latter as duplicate content and will happily ding your site for this. Instead, syndicating your own work is essentially the same as creating re-runs of your greatest hits and, for writers who publish a lot of blog posts or articles, it can be a great way to get the most bang for your metaphorical buck. Additionally, good syndication practices have the potential to earn you more shares and afford your content a much wider reach. When syndication is done well, it allows a variety of online sources to find and feature your original work, which is a win-win for everyone involved. “But how,” you might ask, “do I syndicate correctly?” While there are many myths about syndicated content flying around, making sure that you are syndicating correctly is an important piece of the syndication process because, as we mentioned earlier, duplicate content is a big no-no in the world of SEO. Let’s find out more about this.   What is Content Syndication? As we’ve established, content syndication is when a publisher or writer re-purposes an already-published piece of writing for use on a different platform. It’s a tricky business though, because Google hates duplicate content (as it has made explicitly clear with its recent updates) and will happily ding sites that copy a blog from one platform to another. The reason Google is so tough on duplicate content is easy to understand: the Internet is an information-delivering machine and nobody wants to encounter the same post on every site they visit. 
Duplicate content doesn’t benefit readers nearly as much as high-quality, original content and, when Google users enter search queries, they expect to see a few million similar but different results pop up, which is impossible in a world of copy-and-paste content. Syndicating content without ticking off Google is a difficult ball game and it is important that writers and publishers take it very seriously. 5 Ways to Syndicate Content Safely Although it’s not fair to say that content syndication is so risky and ill-advised that it shouldn’t be attempted at all, it is fair to say that it should be undertaken cautiously and with a broad knowledge of how to syndicate safely. These tactics will keep you in Google’s good graces while also allowing you to reap the benefits of re-purposed content: 1. Write a Recap One of the easiest ways to syndicate content correctly is to write a recap. Start a blog post by introducing the post you’ve already written as well as the platform on which it was published. Add a few more words, a nice image, a compelling call-to-action, a link to the piece and you’re done. This form of syndication is great because it’s simple, amazingly time-efficient, and easy to do. Additionally, linking to the site that features your blog has the potential to boost your SEO ranking and also provides your readers with the opportunity to visit a blog they’re unfamiliar with, thus expanding their horizons as well. 2. Add A rel-canonical Tag If you’re syndicating your own content to a different portal, consider adding a useful little rel-canonical tag to the page that will feature your new article. Keep in mind that the tag in question should always point back to your site’s original article. By doing this, you give Google a way to interpret your syndicated content correctly and help search engines realize that the article is a copy and that you are its original publisher. 
This prevents you from slipping into the dark world of duplicate content and saves your site from costly SEO dings by the Google Gods. Additionally, all of the subsequent links to your syndicated copy will point back to your original copy, which is good for your site and your visibility. 3. Opt for NoIndex If the rel-canonical tag isn’t up your alley, try the noindex option. When you syndicate your own content, simply have a noindex tag added to the page that hosts the syndicated copy. Doing this tells Google that it needs to exclude the syndicated copy from its index but still allows linking between the two articles. Keep in mind that this is not the same as nofollow, which is an entirely different directive. 4. Ensure Balance Once your content marketing strategy begins to tip heavily toward all syndicated content, all the time, you’re in trouble. When Google’s Panda 4.0 was introduced, many sites that favor syndicated content saw a 60% decrease in their organic traffic. That said, it’s wise to ensure that if you are using syndicated content, you’re balancing it well with high-quality, original content. Opt for a 60/40 split, with 60% of your content being original pieces that are updated often. Ensure you’re utilizing proper linking techniques and that your content is garnering good shares. This, combined with other preventative syndication measures, should be enough to keep you in Google’s good graces. 5. Beef Up Syndicated Pieces One of the worst things you can do in pursuit of syndication is copy and paste third-party pieces without adding value to the content. Generally, good writers know that constructing a blog built entirely around third-party pieces that are taken from other sources is a dangerous game. There is, however, a way to do it correctly. By utilizing third party pieces to quote from or to draw fragments from, writers can build authority and synthesize new content. 
To do this, ensure that the pieces you choose to syndicate were published by a high-quality site and that they are written with flow and comprehension in mind. Additionally, be sure that you are adding some quality to syndicated third-party pieces. Insert your own commentary or pull pieces of the third-party article to beef up your … Read more
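For reference, the two tag-based safeguards above (points 2 and 3) each boil down to a single line of markup. A minimal sketch, with placeholder URLs, of what would go in the head of the page hosting the syndicated copy:

```html
<!-- Point 2: canonical tag on the syndicated copy, pointing back to the original article -->
<link rel="canonical" href="https://www.example.com/original-article" />

<!-- Point 3 (alternative): keep the syndicated copy out of Google's index
     while still allowing its links to be crawled and followed -->
<meta name="robots" content="noindex, follow" />
```

Use one approach or the other on a given page: the canonical tag consolidates ranking signals onto the original article, while noindex simply removes the copy from the index altogether.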