Crypto still makes me nervous. Maybe it’s the fact that I came of age between 9/11 and the 2008 financial crisis; I graduated from college in the aftermath of that crisis. And then I see the people who are already rich because of cryptocurrency investments, and I wonder if my feelings are wrong. Besides, I’m no Luke Skywalker here. I can’t really trust my feelings, can I? There is one bitcoin investor out there who became a millionaire after seven years, and he’s saying everybody should ignore their “feelings” about bitcoin and invest. His name is Erik Finman.
via ShoeMoney https://ift.tt/2r5DXMV
Google has announced new job posting guidelines for job schema. The new guidelines can be read here. They include the requirement to remove expired job listings, as we covered 10 days ago. In addition to removing expired jobs, Google also requires webmasters to place structured data on the job’s detail page and to ensure that all job details are present in the job description. Google has also published steps for removing a job posting that is no longer available.
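The structured data Google is referring to is JobPosting markup placed on the job’s detail page, typically as JSON-LD. Below is a minimal, hypothetical sketch in Python that builds and prints such markup; the company, role, dates, location and salary are placeholder values rather than anything from Google’s announcement, though the property names come from the schema.org JobPosting type.

```python
# A minimal, hypothetical JobPosting JSON-LD sketch (placeholder values).
import json

job_posting = {
    "@context": "https://schema.org/",
    "@type": "JobPosting",
    "title": "Software Engineer",                      # placeholder role
    "description": "<p>The full job details, exactly as shown on the page.</p>",
    "datePosted": "2018-05-01",
    "validThrough": "2018-06-01T00:00",                # expired listings must be removed
    "hiringOrganization": {"@type": "Organization", "name": "Example Co"},
    "jobLocation": {
        "@type": "Place",
        "address": {
            "@type": "PostalAddress",
            "addressLocality": "Anytown",
            "addressRegion": "CA",
            "addressCountry": "US",
        },
    },
    # Per the guidelines, salary in the markup should match the salary
    # shown in the visible job posting.
    "baseSalary": {
        "@type": "MonetaryAmount",
        "currency": "USD",
        "value": {"@type": "QuantitativeValue", "value": 85000, "unitText": "YEAR"},
    },
}

print('<script type="application/ld+json">')
print(json.dumps(job_posting, indent=2))
print("</script>")
```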
Google does not want to show job listings to applicants when the job listing is not available. Google said, “[I]t can be very discouraging to discover that the job that they wanted is no longer available.”

The other two requirements are standard schema and structured data requirements. Webmasters often place schema and structured data on the wrong page; Google wants that markup on the most detailed landing page for the job listing, not on a page that lists all the jobs. You also want to make sure that everything in your schema and structured data appears on the job listing web page itself. Google said, “[I]f you add salary information to the structured data, then also add it to the job posting. Both salary figures should match.”

Google says if you violate these guidelines, it “may take manual action against your site and it may not be eligible for display in the jobs experience on Google Search.” If you do get a manual action, Google says you “can submit a reconsideration request to let us know that you have fixed the problem(s) identified in the manual action notification. … If your request is approved, the manual action will be removed from your site or page.”

The post Google announces new job posting guidelines & requirements appeared first on Search Engine Land. via Search Engine Land https://ift.tt/2r4yVPS

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. From Search Engine Land:
Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:
Search News From Around The Web:
The post SearchCap: Voice assistant study, SEO audits & PPC budgets appeared first on Search Engine Land. via Search Engine Land https://ift.tt/2r694XC

When campaign budgets are capped, it is more important than ever to get the best performance from your campaigns. Since we can’t spend money on everything, we have to figure out how to get the most return out of the money we have.

Know what drives value

First things first: we have to understand what is truly driving value. With e-commerce sites, it’s a little easier because return on ad spend (ROAS) is easily tracked and tied back to campaigns. To take it one step further, it is best to understand which campaigns (or, more specifically, keywords) are driving the most lifetime value. For lead generation, it can be a little harder. The cheapest leads aren’t always the best leads, and the best cost per action (CPA) doesn’t always equate to the best performance. Setting up tracking to determine close rates is really important to understanding what is working. As with e-commerce, understanding lifetime value is even better, so that we can make sure money is allocated to the best-performing campaigns.

Trim wasted spend

Once I know what’s working, I start to cut out things that aren’t. I dig into all of the details of the account where wasted spend could go unnoticed. The Dimensions reports (or, in the new AdWords interface, the Predefined Reports) are a good place to start. I look for anything I can cut that won’t have a proportional impact on results:
All of these things may seem small, but they add up. I then dig into keywords and ad groups, looking at different time frames, including recent performance as well as long-term performance. When auditing accounts, I’ve found that a lot of keywords go unnoticed because they aren’t huge spenders, but over a longer time frame they may have spent a significant amount without producing results. Likewise, I look at ad groups through the same lens: none of the keywords within an ad group may be spending at a significant rate, but in aggregate they may have spent a substantial amount without performing. In these scenarios, I label the ad group and keywords so that I can reactivate them for testing in the future. It may seem premature to cut them, but the short-term goal is a laser focus on top performers.

Max out the right things first

On the flip side, I look for what is working well. As I’m digging through all facets of the campaigns for what isn’t working, I also take note of what is delivering the best ROAS at a sustainable volume. Most importantly, I ask:
The idea is to find ways to give those situations more runway. Whether it’s certain locations driving a high amount of volume at a low cost, a device, a select number of keywords, or some combination of factors, I like to separate the high performers into their own campaign so that budget can be opened up (as much as possible) while maintaining tighter caps on lesser-performing auctions.

Know when to use (and not to use) shared budgets

Shared budgets can be a godsend when it comes to budget-capped campaigns, or they can be crippling. I typically look to shared budgets in situations where the max cost per click (CPC) within the campaigns is close to or above the campaign budget. (Yes, it happens!) Sometimes, as budgets get pulled back and then pulled back again, keyword performance is unknowingly hampered because the bids are so close to the campaign budget. Granted, now that AdWords can spend up to double a campaign’s daily budget, that alleviates some of the strain, but it doesn’t entirely resolve it. When implementing shared budgets, I group campaigns with like performance together. One of the pitfalls of shared budgets is that you can’t control how the budget is prioritized, and I never want a poor performer to suck up all of the budget. I set up multiple shared budgets if needed to ensure top performers aren’t competing against poor performers for budget. If performance is inconsistent across all campaigns, then shared budgets probably aren’t the best fit. Last but not least, I never lump search and display campaigns into the same shared budget. Display campaigns, if allowed, can absorb a lot of budget, so I always keep those separate.

Bidding to maximize budget

There are also some bidding opportunities that can help maximize impact on a capped budget. A few options that I like to try include:
As with any test, sometimes these improve performance and sometimes they don’t, so I keep track and roll out whichever performs best.

Increase your conversion rate

Another surefire way to improve return, even on a limited budget, is to improve conversion rates. Increasing your conversion rate ensures that you get the most sales out of the traffic you drive to the site. Best of all, increasing your conversion rate can have an impact across multiple channels, not just search, so the effect can be disproportionately positive. If you can’t increase budget but need to scale, the best way to increase production is to focus on conversion rate.

The post Surefire tactics to get the most value out of budget-limited campaigns appeared first on Search Engine Land. via Search Engine Land https://ift.tt/2vRCXRk

Indexing is really the first step in any SEO audit. Why? If your site is not being indexed, it is essentially unread by Google and Bing. And if the search engines can’t find and “read” it, no amount of magic or search engine optimization (SEO) will improve the ranking of your web pages. In order to be ranked, a site must first be indexed.

Is your site being indexed?

There are many tools available to help you determine if a site is being indexed. Indexing is, at its core, a page-level process. In other words, search engines read pages and treat them individually. A quick way to check if a page is being indexed by Google is to use the site: operator with a Google search. Entering just the domain will show you all of the pages Google has indexed for that domain. You can also enter a specific page URL to see if that individual page has been indexed.

When a page is not indexed

If your site or page is not being indexed, the most common culprit is the meta robots tag on a page or the improper use of disallow in the robots.txt file. Both the meta tag, which operates at the page level, and the robots.txt file give search engine indexing robots instructions on how to treat content on your page or website. The difference is that the robots meta tag appears on an individual page, while the robots.txt file provides instructions for the site as a whole. Within the robots.txt file, however, you can single out pages or directories and specify how robots should treat those areas while indexing. Let’s examine how to use each.

Robots.txt

If you’re not sure whether your site uses a robots.txt file, there’s an easy way to check: simply enter your domain in a browser followed by /robots.txt. Here is an example using Amazon (https://ift.tt/2FlZrZS); the list of “disallows” for Amazon goes on for quite a while! Google Search Console also has a convenient robots.txt Tester tool to help you identify errors in your robots file. You can also test a page on the site using the bar at the bottom of the tool to see whether your robots file, in its current form, is blocking Googlebot.
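If you prefer to check programmatically, Python’s standard library includes a robots.txt parser that answers the same question as the Search Console tester: would this robots.txt block Googlebot from fetching a given URL? The sketch below is illustrative only; the domain and paths are placeholders to swap for your own site.

```python
# Minimal sketch: check whether a site's robots.txt blocks Googlebot.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://www.example.com/robots.txt")  # placeholder domain
robots.read()  # fetches and parses the live robots.txt file

for url in ("https://www.example.com/", "https://www.example.com/lp/spring-offer"):
    allowed = robots.can_fetch("Googlebot", url)
    print(f"{url} -> {'allowed' if allowed else 'blocked'} for Googlebot")
```

Note that this only evaluates robots.txt rules; it says nothing about robots meta tags, which are covered next.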
There are many cool and complex ways you can employ the robots.txt file, and Google’s Developers site has a great rundown of all of them.

Robots meta tag

The robots meta tag is placed in the header of a page. Typically, there is no need to use both the robots meta tag and robots.txt to disallow indexing of a particular page. For example, I don’t need to add the robots meta tag to every landing page in my landing page folder (/lp/) to prevent Google from indexing them, since I have already disallowed that folder from indexing in the robots.txt file. However, the robots meta tag has other functions as well. For example, you can tell search engines that none of the links on a page should be followed for search engine optimization purposes. That can come in handy in certain situations, like on press release pages.
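The tag itself is a single meta element named “robots” in the page’s head, with a content value such as “noindex” or “nofollow.” As a rough illustration (not an official tool), the sketch below fetches a page and reports any robots meta directives it finds; the URL is a placeholder for whatever page you want to check, such as a press release page.

```python
# Minimal sketch: report any robots meta directives on a page (placeholder URL).
from html.parser import HTMLParser
from urllib.request import urlopen

class RobotsMetaFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append(attrs.get("content") or "")

url = "https://www.example.com/"  # placeholder; substitute the page to check
html = urlopen(url).read().decode("utf-8", errors="replace")

finder = RobotsMetaFinder()
finder.feed(html)
print(finder.directives or ["no robots meta tag found; page is indexable by default"])
```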
Google’s Developers site also has a thorough explanation of the uses of the robots meta tag.

XML sitemaps

When you have a new page on your site, ideally you want search engines to find and index it quickly. One way to aid in that effort is to use an eXtensible Markup Language (XML) sitemap and register it with the search engines. XML sitemaps provide search engines with a listing of the pages on your website. This is especially helpful when you have new content that likely doesn’t have many inbound links pointing to it yet, making it tougher for search engine robots to find by following a link. Many content management systems now have XML sitemap capability built in or available via a plugin, like the Yoast SEO plugin for WordPress. Make sure you have an XML sitemap and that it is registered with Google Search Console and Bing Webmaster Tools. This ensures that Google and Bing know where the sitemap is located and can continually come back and index it. How quickly can new content be indexed using this method? I once did a test and found my new content had been indexed by Google in only eight seconds, which was the time it took me to change browser tabs and perform the site: operator command. So it’s very quick!

JavaScript

In 2011, Google announced it was able to execute JavaScript and index certain dynamic elements. However, Google isn’t always able to execute and index all JavaScript. In Google Search Console, the Fetch and Render tool can help you determine whether Google’s robot, Googlebot, is actually able to see your content in JavaScript. In one example, the Fetch and Render tool showed that Googlebot was unable to see the content and links the same way a human visitor would, which meant Googlebot could not follow the JavaScript links to the deeper course pages on that site.

Conclusion

Always keep in mind that your site has to be indexed in order to be ranked. If search engines can’t find or read your content, how can they evaluate and rank it? So be sure to prioritize checking your site’s indexability when you’re performing an SEO audit.

The post The first steps of your SEO audit: Indexing issues appeared first on Search Engine Land. via Search Engine Land https://ift.tt/2Fn7aab

In this week’s Search in Pictures, here are the latest images culled from the web, showing what people eat at the search engine companies, how they play, who they meet, where they speak, what toys they have and more. Google uses these scary client masks:
Girl Scouts meet at Google:
Parrots visit the GooglePlex:
VHS tapes can be found at the Google London office:
The post Search in Pics: Parrots at Google, VHS tape library & scary client masks appeared first on Search Engine Land. via Search Engine Land https://ift.tt/2HUTaK8

Two more studies analyzing interactions with voice search and virtual assistants were released this week: one from Stone Temple Consulting and the other from digital agency ROAST. The latter focuses on Google and explores voice search across 22 verticals. Stone Temple’s report compares virtual assistants to one another in terms of accuracy and answer volume. It is a follow-up to the firm’s 2017 virtual assistant study and, therefore, can provide insight into how voice search results have changed and improved in the past year. The 2018 study involved nearly 5,000 queries, compared across Alexa, Cortana, Google Assistant (Home and smartphone) and Siri.

Source: Stone Temple, “Rating the Smarts of the Digital Personal Assistants” (2018)

What the company found was that Google Assistant was again the strongest performer, with the highest answer volume and percentage of correct answers. Cortana came in second, and Alexa saw the most dramatic improvement in terms of answer volume but also had the highest number of incorrect responses. Siri also made improvements but was last across most measures in the test.

The ROAST report looked exclusively at Google Assistant results and determined the sources for the answers provided. It is also a follow-up to an earlier report released in January. This new report examined more than 10,000 queries across 22 verticals, including hotels, restaurants, automotive, travel, education and real estate. In contrast to the Stone Temple results above, only 45 percent of queries were answered in the ROAST study. One of the most interesting findings of the ROAST study is that the Google Featured Snippet is often not the go-to source for Google Assistant; in a number of cases, which varied by category, web search and Google Assistant results differed for the same query.
Results by vertical. Source: ROAST, “Voice search vertical comparison overview” (2018)

In ROAST’s chart of results by vertical, the red bars represent instances where the query was met with no response. Restaurants was the category with the smallest no-response percentage, while “transport” had the highest percentage of queries that failed to yield an answer.

The post Study: Google Assistant most accurate, Alexa most improved virtual assistant appeared first on Search Engine Land. via Search Engine Land https://ift.tt/2HtEJNG

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. From Search Engine Land:
Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:
Search News From Around The Web:
The post SearchCap: Local search ads, Intent keywords & The Flying Housewife appeared first on Search Engine Land. via Search Engine Land https://ift.tt/2KheZ59
Before you jump on the latest big digital marketing bandwagon, ask yourself these questions: How complete is our customer data? How much of our customer data sits in silos? Can we scale what we know about our customers? Join our experts as we discuss data best practices that will solidify your customer data foundation. We’ll explore how new techniques in identity resolution can connect the data fragments that exist across your organization and fuel more relevant customer relationship marketing strategies. Register today for “Customer Data Strategies & Identity Resolution: Best Practices,” produced by Digital Marketing Depot and sponsored by FullContact.

The post New date! Customer data strategies & identity resolution webinar appeared first on Search Engine Land. via Search Engine Land https://ift.tt/2HPwlaV

Some brick-and-mortar advertisers logged into their accounts in late February to find new campaigns named “Local Search Ads Experiment Campaign” populated in AdWords. Google has confirmed that these campaigns are currently running for select advertisers only and promote verified business locations in local search results in both Google Search and Maps without using keywords. Instead of keywords, Google uses Google My Business (GMB) information such as location address and location category to trigger relevant results. Advertisers cannot request to take part in the experiment at this time. While these campaigns are still in very early days, how should advertisers think about this recent development?

Locally focused optimizations

Right now, most brick-and-mortar advertisers have only one lever available to control when ads show up in Google Maps searches or in local ad displays directly on Google.com: the activation or deactivation of location extensions. Local Services is another paid local format that is managed through a dedicated local services portal, but it is currently limited to a handful of home service industries in select US cities. Thus, location extensions are driving local ad results for most brick-and-mortar brands in most locations at this point.

Location extensions have existed for years and are traditionally added to AdWords campaigns for all brick-and-mortar advertisers as a best practice for gaining additional search engine results page (SERP) real estate and giving searchers useful information on store locations and hours. Over the past couple of years, Google has used these extensions and active keywords to trigger ad units for local searches in Maps and the local pack. While there’s no clean way to segment Maps traffic from other AdWords traffic at this time, click type reports feature “get location details” clicks, which essentially all come from Maps, according to Google. The share of traffic coming from this click type increased meaningfully through mid-2017, particularly for brand keywords, but has remained roughly steady over the last three quarters.

Maps and other locally focused units, like ads in local packs, are grouped in with Google.com rather than the search partner network, and thus all campaigns with location extensions are automatically opted into showing Maps ads. Advertisers have no ability to manage this traffic beyond turning it on or off by adding or removing location extensions on campaigns. The possibility of new local-only campaigns gives hope for being able to manage these searches separately from traditional Google searches.
While the lack of keywords does cause some concern over lack of control, I’d hope advertisers will at least be able to set negative keywords for these campaigns to trim out irrelevant queries if or when Google moves past the experimental phase. Performance doesn’t currently populate in the user interface (UI), and Google plans to communicate results directly to advertisers in time, so it’s unclear what kinds of queries are triggering traffic and how performance looks at this stage. However, Google does appear to be equipping Google My Business (GMB) with new features that might help it trigger relevant ad results through these campaigns, as judged by two recent updates.

GMB gets new bells and whistles

In the month following the release of these experiment campaigns, Google made two notable changes to Google My Business. The first was the introduction of business descriptions within GMB, allowing 750 characters with which to briefly inform customers about what a particular business offers and any other defining details. The second update gave businesses the ability to specify any services they provide, creating a structured combination of service name, item name within that service, item price and item description. While GMB menus have existed for restaurants since February 2018, they are now available for businesses in a host of other industries as well. With both of these changes, Google now has additional signals across a host of industries to use in triggering ads based on GMB information. Further, in the case of the services update, it now has structured pricing information for services, which might be incorporated into both paid and organic listings.

Alternative motives at play?

The lack of keywords in local campaigns offers advertisers the possibility of targeting local searchers without relying on existing campaigns and active keywords. This could be a nice advancement for brands looking to monitor and control this traffic better than the current setup allows. The recent updates to Google My Business give business owners the ability to provide important information to Google, which should help Google find relevant local results for searchers both in Google Maps and on Google.com. It should also give users better reference points for determining which local business to use. Looking at the local experiment campaigns, however, it seems like at least part of the motivation might be to better equip AdWords with the information it needs to match ads with relevant searches sans keywords. Maybe these local experiment campaigns will fall by the wayside, as plenty of Google tests and experiments do, and the GMB updates are really just focused on giving users more information and better results. But it wouldn’t surprise me if there are alternative motives at play.

Regardless of the implications for monetization, optimizing these new details in GMB is likely to be important for ranking for local searches. Much like proper product feed management gives Google the product information to help a brand rank well in Google Shopping results, the structured data associated with services in GMB is likely to require similar optimization. Imagine typing in “oil change near me” and getting back a local pack complete with oil change pricing information for nearby automotive repair shops. This is already happening with product-specific searches through local inventory ads, a variation of Google Shopping ads that includes information about whether a product is available at a nearby store.
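To make that services update concrete, here is a rough, hypothetical sketch of the kind of structured record described above: a service name, an item name within that service, an item price and an item description. This is purely illustrative; it is not the actual Google My Business data format or API, and the oil-change values are made up.

```python
# Hypothetical illustration only -- not the real GMB schema or API.
# It simply mirrors the fields described above: service name, item name,
# item price and item description.
gmb_services = [
    {
        "service": "Oil change",
        "items": [
            {
                "name": "Synthetic oil change",
                "price_usd": 49.99,  # structured pricing Google could surface in results
                "description": "Full synthetic oil, new filter and multi-point inspection.",
            }
        ],
    }
]

for service in gmb_services:
    for item in service["items"]:
        print(f'{service["service"]}: {item["name"]} for ${item["price_usd"]:.2f}')
```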
Conclusion

Whether such listings are paid or organic, I think it makes total sense for Google to start packing this kind of information into the Local Pack for more searches than just those specifying products. The additional information Google is now set to glean from GMB for local businesses in a variety of industries is likely to help it do just that.

The post Look Ma, no keywords! Phrase-free AdWords campaigns are here appeared first on Search Engine Land. via Search Engine Land https://ift.tt/2KhNjgo