You may or may not know that I haven’t always been Copyblogger’s editor. For many years, I was a Copyblogger reader. I didn’t know Brian. I didn’t know Sonia. But I pretended that I did. Of course I didn’t tell anyone that … I just received so much guidance from Copyblogger that helped me position … The post 30 Tips that Help You Become an In-Demand Freelance Writer appeared first on Copyblogger. via Copyblogger http://ift.tt/2iabWT0
418: Developing the Mental Toughness and Self-Discipline that Yields Success by Luana Sicari
11/26/2017

When you have a goal in mind, you do everything you can to make it happen. Luana Sicari shows us that when you don’t have a plan B, you have no choice but to be successful. Also, everyone has two choices in life: you either question everything or you simply start doing.

Who is Luana Sicari?
Luana Sicari has over 15 years of MLM experience. She’s been a top producer for many years and is currently the CEO of a new MLM company that launched this year. Luana credits her success to her upbringing. She lost both her parents at the age of 22, and that gave her the mental toughness and self-discipline to get through life. She’s also fond of martial arts and has a black belt in Krav Maga. One of the things that drives her is the constant reminder that life is a nonstop lesson.

Favorite Quote
“Through difficulties to the stars”

Must Read Book
Failing Forward by John C. Maxwell

Recommended Online App
Apps about motivation

Recommended Prospecting Tool
Webinar

Contact Info
Luanasicari.blogspot.it

What Did You Learn?
Thanks for joining me on the show. So what did you learn? If you enjoyed this episode, please share it on social media and send it to someone that needs extra motivation in their MLM business. Do you have any thoughts or comments? Please take 60 seconds to leave an HONEST review for the MLM Nation Podcast on iTunes. Ratings and reviews are extremely important for me to make this show better. Finally, don’t forget to subscribe to the show on iTunes so that you get updates and new episodes downloaded to your phone automatically. 
Click Here to Subscribe via iTunes Click Here to Subscribe via Stitcher Click Here to Subscribe via RSS (non-iTunes feed) The post 418: Developing the Mental Toughness and Self-Discipline that Yields Success by Luana Sicari appeared first on MLM Nation: Network Marketing Training | Prospecting | Lead Generation | Leadership | Duplication | Motivation. via MLM Nation: Network Marketing Training | Prospecting | Lead Generation | Leadership | Duplication | Motivation http://ift.tt/2BeXywt

In this week’s Search In Pictures, here are the latest images culled from the web, showing what people eat at the search engine companies, how they play, who they meet, where they speak, what toys they have and more.

Happy Thanksgiving cupcakes:
Android candy dispenser:
Google custom Jiu Jitsu shirt:
Bing select partner plaque:
The post Search in Pics: Android candy dispenser, Google Jiu Jitsu shirt & Bing partner plaque appeared first on Search Engine Land. via Search Engine Land http://ift.tt/2A3SYEz

The most important thing I’ve learned from my 15 years of PPC experience is that sooner or later, account performance will take a downturn. When that day comes, we must be prepared to deal with the consequences of performance not meeting expectations. These consequences could range from stakeholders losing trust in your abilities to receiving ultimatums to “fix performance or else,” and, worst-case scenario, someone else being brought in to take over the paid search program you’ve spent so much time and energy building. Performance downturns can be very stressful and put you on the defensive. However, having a solid methodology for responding when performance is bad can help instill confidence that you have what it takes to turn a negative performance situation into a positive one. This article discusses a two-step methodology for confronting underperformance in a way that helps you build trust with your stakeholders and instill confidence in your ability to manage PPC accounts through the tough times.

Step #1: Diagnosing the problem
Clients and stakeholders need to have confidence in those managing their paid search program. When performance takes a downturn, they depend on their account manager to tell them what the problem is. If an account manager cannot demonstrate they understand what the problem is, then why would the client/stakeholder have any confidence that the account manager can solve their performance issues? How do we go about diagnosing the root cause of a problem? Diagnosing a problem requires diligent research to pinpoint its root cause.
Putting the methodology into practice
I’m currently dealing with an account performance issue that is causing this month’s performance to lag in terms of lead volume. I ultimately identified the issue as a drop-off in brand keyword traffic. How did I discover that brand traffic was the source of this issue? I did it by analyzing key metrics including conversion volume, click-through rate (CTR) and cost per click (CPC).
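A drop-off like this can often be caught early with a simple week-over-week comparison of those metrics. Here is a minimal sketch in Python; the metric names, figures and 25 percent threshold are hypothetical illustrations, not data from the account described:

```python
# Flag week-over-week swings in core PPC metrics that exceed a threshold.
# The figures below are made up; real values would come from an
# Ads or Analytics export.

def pct_change(old, new):
    """Relative change from old to new, as a fraction."""
    return (new - old) / old

def flag_anomalies(last_week, this_week, threshold=0.25):
    """Return metrics whose week-over-week change exceeds the threshold."""
    flags = {}
    for metric in last_week:
        change = pct_change(last_week[metric], this_week[metric])
        if abs(change) > threshold:
            flags[metric] = round(change, 2)
    return flags

last_week = {"clicks": 4200, "ctr": 0.051, "avg_cpc": 1.10, "conversions": 310}
this_week = {"clicks": 2600, "ctr": 0.032, "avg_cpc": 1.65, "conversions": 150}

# Every metric here moved by more than 25 percent, so all four are flagged.
print(flag_anomalies(last_week, this_week))
```

Run daily or weekly against exported account data, a check like this points you at the right campaign level (brand vs. non-brand) before stakeholders notice the dip.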
The sudden drop-off in conversion volume and CTR, along with a spike in CPCs, led me directly to consider recent brand traffic performance. Typically, this account I manage has very steady traffic patterns with steady CPCs and conversion volume. As I dove further into brand campaign performance, I saw that branded impressions and clicks dropped dramatically, which caused CPCs to spike and volume to drop. Because of the brand traffic performance drop-off, cost per conversion increased dramatically due to the account’s over-dependence on non-branded traffic. Digging further into the account, I discovered that branded traffic dropped suddenly at the end of October. This information allowed me to focus on specific changes made to the account during that period. I ultimately discovered several high-traffic branded keywords had been paused in error as part of an overall optimization. These keywords were unpaused and bids readjusted. Traffic and conversion volume are now recovering. As you can see from the example above, it took quite a bit of research to arrive at the problem’s root cause. Once a problem has been identified, it’s time to move on to the next step.

Step 2: Communicate what the performance problem is and recommend solutions
Throughout the course of my career, I’ve seen a lack of understanding and communication be the downfall of many business relationships. I’ve witnessed PPC account managers fundamentally fail to understand the performance problems they’re facing, ignore the fact that a problem even exists and fail to address problems head-on with their stakeholders. Allowing any of these things to happen quickly erodes trust. To maintain your credibility as a PPC expert, when there’s an underperformance issue it’s imperative that you acknowledge the problem, explain its root cause and recommend concrete solutions.
Clear communication and context help remove fear. Oftentimes, stakeholders become emotional and lash out because they feel their account manager doesn’t grasp the gravity and urgency of a situation. It’s our job as account managers to take the lead in eliminating fear of the unknown by providing as much background information and context as possible regarding an underperformance problem’s root cause, and to propose sufficient courses of action for remedying the situation.

Final thoughts
No matter how hard we try, we’ll never be able to avoid the inevitable performance downturn. However, what we can do is be prepared for how we’ll respond when that time arrives. Fully understanding the root causes of performance issues, developing the appropriate solution and decisively communicating all of this to our stakeholders is crucial to successfully surviving performance downturns. It’s easy to be liked and respected when everything is going well. The real test comes when things are not going according to plan. Going through the fire of underperformance and successfully coming out on the right side of it will help build your credibility and your stakeholders’ trust in you. The post 2-step methodology for dealing with PPC performance downturns appeared first on Search Engine Land. via Search Engine Land http://ift.tt/2mYKlWD

Consumers overwhelmingly expect the reviews they peruse on Amazon, Yelp, Google and other review sites to be trustworthy, neutral and objective. But this reasonable expectation is frequently thwarted by the efforts of aggressive marketers who pay third parties to create phony reviews in exchange for compensation, or who incentivize existing good customers to leave reviews with discounts or free products or services. These deceptive practices — termed “opinion spam” or “sock puppetry” — are a form of information pollution with multiple victims. Opinion spam blinds the consumer to the truth and poisons the reputation of the review site where the fake review appears.
When detected, it may subject the marketer and/or opinion spammer to criminal and civil penalties. Unfortunately, opinion spam — despite the best efforts of review sites to control it — appears to be a permanent, intractable feature of the e-commerce and local business information ecosystem. Not that review sites aren’t trying. In 2015, Amazon filed a lawsuit against a company that offered false four- and five-star reviews for product pages. Later that year, Amazon sued over 1,100 sellers of fake reviews who allegedly posted reviews in exchange for money. And in early 2016, it sued three of its own vendors for the same practice. Later that year, Amazon changed its Community Guidelines to prohibit reviews solicited by marketers in exchange for incentives such as discounts, freebies or other rewards. Such “incentivized reviews” had hitherto been permitted by the service. These steps — plus the development of algorithms targeting fake reviews — are all intended to clean up its messy marketplace. Google — which has long warred against SEO spammers — in 2016 warned product review bloggers to disclose any compensation-based relationships with vendors and to nofollow any links to sites with which such a relationship exists. These steps were taken to better align the company’s practices with US law, particularly Section 5 of the Federal Trade Commission Act, which outlaws deceptive advertising. And Yelp — which estimated in 2013 that between 20 percent and 25 percent of its 70 million reviews were “suspicious” — has been forced to resort to undercover sting operations involving “decoy accounts” to keep its legions of opinion spammers at bay.

The siren song of sock puppetry
As a marketer, you’re fully aware of how critical online reviews are to your business. You may have already been tempted by a consultant or social media agency to “go over to the dark side” and start posting some complimentary reviews about your own stuff on Amazon, Google or Yelp.
Here are some typical arguments you might hear — along with reasons you need to resist the temptation:

‘Everybody (including your competition) is doing it.’ Hypercompetitive consumer categories appear to attract opinion spam in volume. According to an anonymous former Amazon opinion spammer interviewed by Digiday earlier this year, review requests for “mobile phone accessories, Bluetooth devices, and sometimes baby products” are especially frequent, but any popular product category will likely attract opinion spam because spam “follows the money.” It’s very difficult to counsel marketers who sincerely believe that they are the target of opinion spam (either positive or negative) that they should not resort to the same tactics used by their competitors, especially those who’ve already listened to the second argument, which is:

‘If you’re smart about it, you’ll never get caught.’ Unfortunately, there’s more truth to this statement than there should be, because every move made by the review sites to clean things up is met by increasingly sophisticated workarounds from opinion spammers. For example, when Amazon forbade compensation-based reviews in 2016, opinion spammers reportedly shifted their tactics to those less easily detectable (e.g., by offering to recompense the reviewer after the purchase of the reviewed item, or by telling the reviewer to simply buy the product and return it within Amazon’s 30-day return window). The truth is that if one carefully studies the weak points of each review site, is exceptionally stealthy and takes care to hide one’s tracks, it’s possible to insert opinion spam into the conversational chain at will without fear of suffering any short-term negative consequences. The long-term scenario, however, is much less friendly to the prospective opinion spammer.
Algorithms — some proprietary to the review sites, others available as third-party tools — are getting better at flagging opinion spam, using a mix of linguistic, behavioral and relational signals to zero in on the offender. Additionally, federal and state public entities are growing increasingly active in policing bad actors. So while the odds of being caught in 2017 remain low, they’re bound to increase as technology and law enforcement begin to catch up to opinion spammers. Remember, soliciting fake reviews is illegal — are a couple of five-star reviews really worth breaking the law?

What’s a marketer to do?
Opinion spam is a part of life. But as discussed above, it’s foolish and risky to fight it directly by resorting to the same tactics as your opponents. So what can a marketer do?

Make full use of white-hat methods to encourage reviews. Review quantity and review recency count, and there are many legitimate ways to increase your flow of reviews from actual customers without ever getting into trouble.
Emphasize the undeniably valid reviews you’ve got. One unintended byproduct of the current state of online review unreliability is the increased importance of vetted review sites such as Consumer Reports. Many competitive CPG verticals (e.g., cars, cameras, computers, software and so on) have bona fide review sites associated with them whose reputations are high because they are not open — and will never be open — to outside evaluators. So don’t leave your product’s reputation in the exclusive (and anonymous) hands of internet reviewers; get it into the hands of pro reviewers whose publications have integrity.

Inform on bad actors. If you see something, say something. Sure, nobody likes to be thought of as a “snitch,” a “fink” or a “rat,” but if you have reason to believe that your competition is resorting to opinion spam, report them to the review site (believe me, they’d do the same to you at the drop of a hat). Remember, you and the review site — be it Amazon, Google or Yelp — are allies in the battle against opinion spam. If you’re going to go this route, it’s ideal to provide the best forensic evidence you can obtain, so make use of sites such as Fakespot.com and ReviewSkeptic.com, which let you paste in a URL from a review site (or the text of the review itself) in order to evaluate the probability that it’s real or fake.

Adopt a responsive consumer relations posture that responds quickly to reviews — both good and bad. While some online consumers will credulously accept any review they stumble across, many are sophisticated and discerning enough to realize that no business is perfect and not every customer is perfectly happy. If negative reviews appear on your page, acknowledge them and open up a channel to the complainant in order to make things right. Being active and responsive in the review space will be seen as proof that you’re paying attention, you care, and above all, you’re human.
These qualities are important — arguably more important than ever in an information ecosystem in which “real” and “fake” are, even with the best available anti-spam algorithms, often maddeningly difficult to tell apart. The post Coming to terms with fake reviews appeared first on Search Engine Land. via Search Engine Land http://ift.tt/2iLGQhA

Allow yourself to be poured into. Edwin Haynes shows us that most people create their own headaches because they refuse to learn different ways to do things. Also, why you need to find out what it is that makes you get up in the morning and motivates you to perform like a leader.

Who is Edwin Haynes?
Edwin Haynes is a 20-year veteran of the direct selling profession and a seven-figure-a-year earner in his current company. He’s considered one of the top income earners worldwide. Edwin has multiple streams of income and used his network marketing income to purchase and own multiple traditional businesses. He’s the CEO of Haynes Global Services, a manufacturing services company, and also the founder of Athon Systems, a software manufacturing company. Edwin is also the author of the Amazon best seller “You Have the Permission to Succeed.”

Favorite Quote
“If it is to be it is up to me”

Must Read Book
The 21 Irrefutable Laws of Leadership by John C. Maxwell

Recommended Online App
None

Recommended Prospecting Tool
Himself

Contact Info
Edwinhaynes.com

What Did You Learn?
Thanks for joining me on the show. So what did you learn? If you enjoyed this episode, please share it on social media and send it to someone that needs extra motivation in their MLM business. Do you have any thoughts or comments? Please take 60 seconds to leave an HONEST review for the MLM Nation Podcast on iTunes. Ratings and reviews are extremely important for me to make this show better. Finally, don’t forget to subscribe to the show on iTunes so that you get updates and new episodes downloaded to your phone automatically.
Click Here to Subscribe via iTunes Click Here to Subscribe via Stitcher Click Here to Subscribe via RSS (non-iTunes feed) The post 417: How to Properly Use Your MLM Business to Create Multiple Streams by Edwin Haynes appeared first on MLM Nation: Network Marketing Training | Prospecting | Lead Generation | Leadership | Duplication | Motivation. via MLM Nation: Network Marketing Training | Prospecting | Lead Generation | Leadership | Duplication | Motivation http://ift.tt/2i2dSgm

Thanksgiving Google doodle turkey pardons itself & takes leave from any dinner traditions
11/23/2017

Today’s Thanksgiving Google doodle is an animated image of a turkey taking leave for the holidays. “Unlike his domesticated brethren, the Turkey in today’s Doodle is taking flight…from the Thanksgiving table,” writes Google on the Google Doodle blog. Google offered a quick bit of history around Thanksgiving, noting how the first meal nearly 400 years ago was a feast celebrated between the Pilgrims and the Native American tribe of Wampanoag — and how Abe Lincoln made Thanksgiving a national holiday in 1863. It also referenced the presidential tradition of pardoning turkeys over the last 28 years: “Though the pardoning of turkeys has been a presidential privilege since 1989, the Turkey in this Doodle has decided to pardon itself.” According to http://ift.tt/1jOmjo1, the first president to pardon a turkey was George H.W. Bush. Here’s the fully animated image of today’s Google doodle that leads to a search for “Thanksgiving”: Search Engine Land wishes all its readers a happy Thanksgiving filled with love and good food. The post Thanksgiving Google doodle turkey pardons itself & takes leave from any dinner traditions appeared first on Search Engine Land. via Search Engine Land http://ift.tt/2jiBWIt

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.
The post SearchCap: Google printed materials, SEO migrations & development servers appeared first on Search Engine Land. via Search Engine Land http://ift.tt/2B3lz9U

Few things can destroy a brand’s performance in the search results faster than a poorly implemented site migration. Changing your domain name or implementing HTTPS can be a great business move, but if you fail to consider how search engines will react to this move, you are almost certain to take a major hit in organic search traffic. Use the following SEO checklist to prepare yourself as you develop a migration game plan for your website.

1. Carefully consider if migration is the right choice
A site migration will almost always result in a temporary loss of traffic — Google needs time to process the change and update its index accordingly. A carefully executed site migration can minimize traffic fluctuations, and in a best-case scenario, Google will ultimately treat the new site as if it were the original. Still, that is only the best-case scenario. The reality is that site migrations, in and of themselves, typically offer little to no SEO benefit and do not eliminate search engine penalties. (That is why SEOs often use site migrations as an opportunity to make SEO improvements, like streamlining the site structure, fixing broken links, consolidating redundant pages and making content improvements.) With all of that in mind, make sure the business case for migrating is strong enough to justify the risk before you commit.
2. Use a sandbox
Never do a site migration without first testing everything on a test server. Verify that the redirects work properly, and do all of the checks that follow in private before going public. Trying to do it all in one go without testing is bound to lead to errors, and if the mistakes are bad enough, they can set your site back by weeks.

3. Plan to migrate during a slow period
A well-planned and monitored migration shouldn’t permanently affect your traffic, but you should plan for a temporary dip. For that reason, it’s best to perform the migration during a slow part of the year, assuming that there is some seasonality to your site’s performance. A site migration during or shortly before the holidays is always a bad idea. While the goal should always be to avoid losing any traffic, it’s important to make sure that if you do lose traffic, you lose it when business is already slow.

4. Crawl your site before the migration
Crawl your site with a tool like Screaming Frog, and be sure to save the crawl for later. You need to make sure you have a complete list of the URLs on your old site so that nothing ends up getting lost because of the transition. Use this as an opportunity to identify any crawl errors and redirects that exist on the old site. These have a tendency to creep in over time; I rarely come across a site that doesn’t have at least some broken or redirected links. You should absolutely remove or replace any links that point to 404 pages during the migration process. In addition, I highly recommend updating any links that point to redirected pages so that they point to the final page. You do not want to end up with redirect chains after the migration. Remember that a site crawl may not be able to identify every single page on your site. For example, if you have pages that aren’t linked from other pages on your site, they won’t show up in a crawl.
You can use your own records and databases to find these pages, of course, but if this isn’t possible, you can find these pages in your Google Analytics data, as well as through a link explorer like Ahrefs. If you find any orphan pages, make sure to update the site and link to them during the migration. These pages are much less likely to pick up search engine traffic if they aren’t linked to from the rest of your site.

5. Benchmark your analytics
Make a copy of your Google Analytics data; you will need this information so that you can quickly identify whether any traffic is lost after the migration. If any traffic is lost, export the Analytics data from your new site and run a side-by-side comparison with the data from your old site, so that you can identify precisely which pages lost the traffic. In many cases, a loss of traffic will be isolated to individual pages, rather than taking place across the entire site. You may also want to identify and take note of your top linked-to pages using a tool like Ahrefs. After the migration, you will want to pay special attention to these pages and monitor them closely. If these lose traffic, it is a sign that authority isn’t being properly transferred from your old site to the new one. These pages contribute the most to your authority, so losses here may affect the overall performance of your site.

6. Map all changed URLs from old to new
You should have a spreadsheet that lists every old URL and every new URL. Ideally, during a site migration, all of the old pages will exist on the new site. Obviously, removing a page removes its ability to capture search engine traffic. On top of that, dropping too many pages during the migration may lead Google to conclude that the new site isn’t the same as the old site, causing you to lose your rankings. Also, ideally, the new URL architecture should be identical to the old one unless you have very strong reasons to change it.
If you do plan on changing it, a site migration may seem like the ideal time to do it, but you should be aware that doing so may cause Google to see it as an entirely different site. If you do both at the same time, you will not be able to determine whether any losses in traffic were the result of changing the architecture or of migrating the site. Another reason to keep the architecture the same is that it allows you to use regex in your .htaccess file to easily redirect from your old pages to the new ones. This puts less load on your server than naming the redirects one by one, and it makes the process of setting up the redirects much less painful.

7. Update all internal links
The HTML links on your new site should point to the new site, not the old one. This might sound obvious, but as you go through the process, you will quickly realize how tempting it might be to leave the links unchanged, since they will redirect to the new URL anyway. Do not succumb to this temptation. Apart from the server load, which slows down site performance, the redirects may dampen your PageRank. The ideal way to rewrite the links is by performing a search and replace operation on your database. The operation should be performed so that it updates the domain name without changing the folder structure (assuming you’re keeping your site structure the same). Write your search and replace operations carefully so that only text containing a URL is updated. You generally want to avoid updating your brand name and your URLs with the same search and replace operation.

8. Self-canonicalize all new pages
Verify that canonicalization on the new site references the new site and not the old. Canonicalizing to the old site can be disastrous, as it may prevent the new site from being indexed. I recommend self-canonicalizing all of your pages on the new site (except, of course, for pages that should canonicalize to another page).
In combination with the redirects, this tells Google that the new site is, in fact, the new location of the old site. Sitewide self-canonicalization is recommended anyway, since URL parameters create duplicate content that should always canonicalize to the parameter-free URL.

9. Resolve duplicate content issues
Various missteps during the migration process can result in duplicate content issues — for example, the HTTP and HTTPS versions (or www and non-www versions) of a page both remaining accessible. Be aware of these issues, and take steps to avoid them.
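During the sandbox phase, the self-canonicalization described in step 8 can be spot-checked programmatically. A minimal sketch using only Python’s standard library; the page markup and domain names are placeholders, not real sites:

```python
# Spot-check that a page's rel="canonical" tag points at the new domain.
# The HTML and domains below are placeholders; in practice you would feed
# in pages fetched from the staging copy of the new site.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

def canonical_ok(page_html, new_domain):
    """True if the page has a canonical tag referencing the new domain."""
    finder = CanonicalFinder()
    finder.feed(page_html)
    return finder.canonical is not None and new_domain in finder.canonical

page = '<html><head><link rel="canonical" href="https://www.new-site.com/widgets/"></head></html>'
print(canonical_ok(page, "new-site.com"))  # True
print(canonical_ok(page, "old-site.com"))  # False
```

Running a check like this over every URL in the migration spreadsheet catches pages that still canonicalize to the old domain, or that have no canonical tag at all, before the site goes public.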
10. Identify and address any removed pages
I mentioned above that you should generally avoid removing any pages during the migration. If some pages simply must be removed for branding purposes, remove them deliberately and account for them when you set up and test your redirects, so they don’t surface later as unexpected 404s.
11. Ensure that a custom 404 page is in place
A custom 404 page allows users to easily navigate your site and find something useful if they land on a page that no longer exists.

12. Manage and submit sitemaps
Keep your old sitemap in the Google Search Console, and add the sitemap for the new site as well. Asking Google to crawl the old sitemap so that it discovers the redirects is a good way to accelerate the process.

13. Keep analytics in place at all times
Install Google Analytics on the new domain and get it up and running well before you launch the site to the public. You do not want to have any missing data during the transition, and it’s important to watch for any changes in traffic during the migration.

14. Redirect all changed links
As mentioned above, the ideal way to set up your redirects is with a regex expression in the .htaccess file of your old site. The regex expression should simply swap out your domain name, or swap out HTTP for HTTPS if you are doing an SSL migration. For any pages where this isn’t possible, you will need to set up an individual redirect. Make sure this doesn’t create any conflicts with your regex and that it doesn’t produce any redirect chains. Test your redirects on a test server and verify that they don’t produce any 404 errors. I recommend doing this before the redirects go live on your public site. Keep in mind that once the redirects go live, your site has effectively been migrated. The new site should be in pristine condition before you set up the redirects.

15. Keep control of the old domain
Unless the purpose of the migration was to sell the original domain, I would strongly advise against giving up control of the old domain. Ideally, the old domain should redirect to the new one, on a page-by-page basis, indefinitely. If those redirects are lost, all of the inbound links earned by the old site will also be lost.
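The page-by-page redirect rule behind steps 14 and 15 boils down to swapping the host while leaving the path and query string untouched. A sketch of that mapping logic in Python; the domain names are placeholders, and the real rule would of course live in your .htaccess rather than a script:

```python
# Map an old URL to its new location by swapping the host and keeping the
# path and query intact -- the same transformation a regex-based .htaccess
# redirect applies. Both domains are placeholders.
import re

OLD_HOST = "http://www.old-site.com"
NEW_HOST = "https://www.new-site.com"

def map_url(old_url):
    """Return the new-site URL an old-site URL should 301 to."""
    return re.sub(re.escape(OLD_HOST), NEW_HOST, old_url, count=1)

old_urls = [
    "http://www.old-site.com/",
    "http://www.old-site.com/blog/post-1/?utm_source=feed",
]
for url in old_urls:
    print(url, "->", map_url(url))
```

Applying a function like this to the pre-migration crawl list gives you the old-to-new spreadsheet from step 6 for free, and lets you verify the redirect targets before any rule goes live.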
Some industry professionals claim that you can give up control of the old domain once Google stops indexing it, but I would never advise doing this. While it’s possible that Google will attribute links pointed at the old site to the new one, even without the redirect, this is placing far more faith in the search engine than I would ever recommend.

16. Monitor traffic, performance and rankings
Keep a close eye on your search and referral traffic, checking it daily for at least a week after the migration. If there are any shifts in traffic, dive down to the page level and compare traffic on the old site to traffic on the new site to identify which pages have lost traffic. Those pages, in particular, should be inspected for crawl errors and linking issues. You may want to pursue getting any external links pointing at the old version of the page changed to the new one, if possible. It is equally important to keep a close eye on your most linked pages, both by authority and by external link count. These pages play the biggest role in your site’s overall ability to rank, so changes in performance here are indicative of your site’s overall performance. Use a tool like SEMrush to monitor your rankings for your target keywords. In some cases, this will tell you if something is up before a change in traffic is noticeable. It will also help you identify how quickly Google is indexing the new site and whether it is dropping the old site from the index.

17. Mark dates in Google Analytics
Use Google Analytics annotations to mark critical dates during the migration. This will help you to identify the cause of any issues you may come across during the process.

18. Ensure Google Search Console is properly set up
You will need to set up a new property in Google Search Console for the new domain. Verify that it is set up for the proper version, accounting for HTTP vs. HTTPS and www vs. non-www.
Submit both the old and new sitemaps to solidify the message that the old site has been redirected to the new one. Submit a change of address in the Google Search Console, request that Google crawl the new sitemap, and use “fetch as Google” to submit your new site to be indexed. It is incredibly important to verify that all of your redirects, canonicalizations and links are error-free before doing this.

19. Properly manage PPC
Update your PPC campaigns so that they point to the correct site. If your PPC campaigns are pointing to the old site, attribution will be lost in Analytics because of the redirect.

20. Update all other platforms
Update all of your social media profiles, bios you use as a guest publisher, other websites you own, forum signatures you use, and any other platforms you take advantage of, so that the links point to the new site and not the old.

21. Reach out for your most prominent links
Contact the most authoritative sites that link to you in order to let them know about the migration, and suggest that they update the link to point to the new website. Not all of them will do this, but those that do will help accelerate the process of Google recognizing that a site migration has occurred. I wouldn’t recommend doing this with every single link, since this would be extremely time-consuming for most sites, but it is worth doing for your top links.

22. Monitor your indexed page count
Google will not index all of the pages on your new site immediately, but if the indexed page count is not up to the same value as the old site after a month has passed, something has definitely gone wrong.

23. Check for 404s and redirects
Crawl the new site to verify that there are no 404s or 301s (or any other 3xx, 4xx or 5xx codes). All of the links on the new site should point directly to a functioning page. The 404s and 5xx errors are the biggest offenders and should be taken care of first.
If there is a suitable replacement for a 404 page, change the link itself to point to the replacement, and verify that a 301 is in place for anybody who arrives at the missing page through other means.

The second-worst offenders are links to 301 pages that still exist on the old site. Even though these redirect to the new site, the extra redirect hop hurts performance, and linking back to the old site may cause confusion over whether a site migration has taken place. While all of the other steps should make the migration clear to Google and the other search engines, these things are best never left to chance. Any other 301s can be taken care of after this. Always update your internal links to point directly to the correct page, never through a redirect.

24. Crawl your old URLs

Use Screaming Frog or a similar tool to crawl all of your old URLs. Be sure to crawl a list of URLs that you collected before the migration, and make sure the list includes any URLs that were not discoverable by crawling. Do not attempt to crawl the old site directly; the 301 on the home page means the crawler will only ever see that first page before being sent to the new site.

Verify that all of the old URLs redirect to the new site. There should not be any 404s unless you removed the page during the migration process. If there are any 404s, verify that nothing still links to them, and if they are not intended, set up a proper redirect.

Check the external URLs to verify that all of the redirects are functional. None of the destination URLs should return a 301 or a 404. A 301 here indicates a redirect chain, which is bad for performance, and a redirect to a 404 makes for a very frustrating user experience and may hurt your SEO in other ways.

Conclusion

If a site migration is carried out without taking SEO into account, you can almost bet on losing search engine traffic in the process.
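The audit in step 24 boils down to classifying each old URL by its status code and redirect target. A minimal sketch over precomputed results (e.g. gathered with HEAD requests or exported from a crawler); the labels and function name are my own:

```python
from urllib.parse import urlparse

def classify_old_url(status, location, new_host):
    """Classify an old-site URL from its HTTP status and Location header.

    'ok'     - a single 301 pointing directly at the new domain
    'chain'  - a 301 pointing somewhere other than the new domain,
               often the first hop of a redirect chain
    'lost'   - a 404: the page was dropped without a redirect
    'review' - anything else deserves a manual look
    """
    if status == 301:
        return "ok" if urlparse(location or "").netloc == new_host else "chain"
    if status == 404:
        return "lost"
    return "review"

print(classify_old_url(301, "https://new-example.com/page", "new-example.com"))  # ok
print(classify_old_url(301, "https://old-example.com/other", "new-example.com"))  # chain
```

Running every URL from your pre-migration list through a check like this surfaces the 'chain' and 'lost' buckets, which are exactly the cases the article says to fix first.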
Other than clients who have approached me after being penalized by Google, the worst SEO predicaments I’ve come across were the ones caused during a site migration by professionals who didn’t consider how search engines would react to the process. Keep all of the above in mind if you are planning to migrate your site, and it should go off without a hitch.

The post A site migration SEO checklist: Don’t lose traffic appeared first on Search Engine Land. via Search Engine Land http://ift.tt/2jPmA20

One of the most common technical SEO issues I come across is the inadvertent indexing of development servers, staging sites, preproduction servers, or whatever other name you use. There are a number of reasons this happens, ranging from people assuming no one would ever link to these areas to technical misunderstandings. These parts of the website are usually sensitive in nature, and having them in a search engine’s index risks exposing planned campaigns, business intelligence or private data.

How to tell if your dev server is being indexed

You can use Google search to determine whether your staging site is being indexed. For instance, to locate a staging site, you might search Google for site:domain.com and look through the results, or add operators like -inurl:www to remove any www.domain.com URLs. You can also use third-party tools like SimilarWeb or SEMrush to find the subdomains.

There may be other sensitive areas that contain login portals or information not meant for public consumption. In addition to various Google search operators (a practice also known as Google dorking), websites tend to block these areas in their robots.txt files, telling you exactly where you shouldn’t look. What could go wrong with telling people where to find the information you don’t want them to see?

There are many actions you can take to keep visitors and search engines off dev servers and other sensitive areas of the site.
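The point about robots.txt advertising sensitive areas is easy to demonstrate: the Disallow lines are plain text anyone can read. A minimal sketch of pulling them out, assuming you have already fetched the robots.txt body as a string; the example paths are invented:

```python
def disallowed_paths(robots_txt):
    """Pull the Disallow paths out of a robots.txt body.

    Ironically, these lines often point straight at the 'hidden' areas
    (staging, admin, etc.) that the site owner wanted to keep quiet.
    """
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow means "allow everything"
                paths.append(path)
    return paths

robots = """User-agent: *
Disallow: /staging/
Disallow: /admin/
Disallow:
"""
print(disallowed_paths(robots))  # ['/staging/', '/admin/']
```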
Here are the options:

Good: HTTP authentication

Anything you want to keep out of the index should sit behind server-side authentication. Requiring authentication for access is the preferred method of keeping out users and search engines.

Good: IP whitelisting

Allowing only known IP addresses (such as those belonging to your network, clients and so on) is another great step in securing your website and ensuring that only the users who need to see that area of the website will see it.

Maybe: Noindex in robots.txt

Noindex in robots.txt is not officially supported, but it may work to remove pages from the index. The problem I have with this method is that it still tells people where they shouldn’t look, and it may not work forever or with all search engines. The reason I say this is a “maybe” is that it can work, and it can actually be combined with a disallow in robots.txt, unlike some other methods which don’t work if you disallow crawling (which I will talk about later in this article).

Maybe: Noindex tags

A noindex directive, either in the robots meta tag or in an X-Robots-Tag HTTP header, can help keep your pages out of the search results. One issue I see with this is that it means more pages have to be crawled by the search engines, which eats into your crawl budget. I typically see this tag used when there is also a disallow in the robots.txt file, but if you’re telling Google not to crawl the page, they can’t respect the noindex tag because they can’t see it. Another common issue is that these tags are applied on the staging site and then left on the page when it goes live, effectively removing that page from the index.

Maybe: Canonical

If you set a canonical on your staging server that points to your main website, essentially all of the signals should be consolidated correctly. There may be mismatches in content that could cause some issues, and as with noindex tags, Google will have to crawl the additional pages.
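Because the noindex directive can live in either the meta tag or the X-Robots-Tag header, an audit needs to check both places. A minimal standard-library sketch, with names of my own invention:

```python
from html.parser import HTMLParser

class NoindexMetaParser(HTMLParser):
    """Detect a <meta name="robots" content="...noindex..."> tag."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "meta" and a.get("name", "").lower() == "robots"
                and "noindex" in a.get("content", "").lower()):
            self.noindex = True

def is_noindexed(html, headers):
    """True if a page is noindexed via either the HTTP header or the meta tag."""
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    parser = NoindexMetaParser()
    parser.feed(html)
    return parser.noindex
```

Running this over staging URLs before launch (and over live URLs after launch) catches both failure modes the article describes: a staging page that is missing the tag, and a live page where the tag was accidentally left in place.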
Webmasters also tend to add a disallow in the robots.txt file, so once again Google can’t crawl the page and can’t respect the canonical because they can’t see it. You also risk these tags not being changed when migrating from the staging server to live, which may leave the version you don’t want to show as the canonical.

Bad: Not doing anything

Not doing anything to prevent indexing of staging sites usually comes down to someone assuming no one will ever link to that area, so there’s no need to do anything. I’ve also heard that Google will just “figure it out” — but I wouldn’t typically trust them with my duplicate content issues. Would you?

Bad: Disallow in robots.txt

This is probably the most common way people try to keep a staging site from being indexed. With the disallow directive in robots.txt, you’re telling search engines not to crawl the page — but that doesn’t keep them from indexing it. They know a page exists at that location and will still show it in the search results, even without knowing exactly what is there; they have hints from links, for instance, about the type of information on the page. When Google indexes a page that’s blocked from crawling, you’ll typically see the following message in the search results: “A description for this result is not available because of this site’s robots.txt.”

As mentioned earlier, this directive also prevents Google from seeing other tags on the page, such as noindex and canonical tags, because it prevents them from seeing anything on the page at all. You also risk forgetting to remove this disallow when taking the website live, which could prevent crawling upon launch.

What if you got something indexed by accident?

Crawling can take time, depending on the importance of a URL (likely low in the case of a staging site). It may be months before a URL is re-crawled, so any block or fix may not be processed for quite a while.
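The contradiction described above (a disallow that hides the very noindex or canonical tag you are relying on) can be caught mechanically with the standard library’s robots.txt parser. A minimal sketch; the staging hostname is a placeholder:

```python
from urllib.robotparser import RobotFileParser

def blocked_from_crawling(robots_txt, url, agent="Googlebot"):
    """True if robots.txt forbids crawling `url` for the given agent.

    If this returns True, any noindex or canonical tag on the page is
    invisible to the search engine, so those tags cannot take effect.
    """
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch(agent, url)

robots = "User-agent: *\nDisallow: /"
# The page is blocked, so a noindex tag on it would never be seen.
print(blocked_from_crawling(robots, "https://staging.example.com/secret"))  # True
```

A pre-launch check that flags any URL which is both disallowed here and carrying a noindex or canonical tag will surface exactly the self-defeating setups this section warns against.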
If you got something indexed that shouldn’t be, your best bet is to submit a URL removal request in Google Search Console. This should remove it for around 90 days, giving you time to take corrective action.

The post How to keep your staging or development site out of the index appeared first on Search Engine Land. via Search Engine Land http://ift.tt/2ztkko0