Monday, March 30, 2020

Operating During COVID-19: Helpful Tips for Local Businesses

Posted by MiriamEllis

Local businesses know better than any other model what it means to fully participate in community life. You are the good neighbors who are there to serve, inspire, and sustain the people and traditions that make your town a unique and enjoyable place to call home.

As we explore this topic of what local businesses can do during the COVID-19 pandemic, I want to honor all that you have always done to take care of your community as a local business owner or marketer. Thank you.

In this article, you will find local SEO tips that could make a difference for your business in the coming weeks, innovative resources for support, advice from my own tight-knit community of some of the world’s best local SEOs, and some serious thinking about building a better local future.

Adhere to all regulations

First and foremost, start each day with a review of both local and national news to be sure you are complying with the evolving regulations for your city, county, and country. Policies designed to mitigate the harm of COVID-19 vary widely from region to region, and your business must keep informed of which forms of service you are allowed to offer in this dynamic scenario.

And, while social media can be a great connector within your community at any time, beware of misinformation and, sadly, scams in the days ahead. Get your news from sources you trust, and if you are not certain about interpreting a guideline, directly contact local authorities. This article does not take the place of laws and regulations specific to your community.

Communicate abundantly

The most helpful thing any local business can do right now, whether it’s deemed an essential or non-essential service, is to provide accurate information to its community. There are three key places to do this:

Google My Business

“More than ever, your Google Business Profile is a critical communication nexus with your customers.” —Mike Blumenthal, GatherUp

Local businesses know just how big a role Google plays as intermediary between brands and the public. This remains true during this difficult time; however, Google’s local product is not running at full strength. Joy Hawkins’ article for Local University on March 23 details the limited support for or complete discontinuation of Google Q&As, posts, descriptions, reviews, and owner responses. It’s an evolving scenario, with local SEOs reporting different outcomes each day. For example, some practitioners have been able to get some, but not all, Google posts to publish.

As of this writing, there are four fields you can use to communicate current information to customers via GMB, but please be aware that some edits may take several days to go into effect:

Name

Google is allowing businesses to edit their business name field to reflect that they are offering curbside service, takeout, and delivery. For example, if your current name is “John’s Grill”, you are allowed to temporarily change your name to “John’s Grill — Delivery Available”.

Phone number

If regulations are keeping you at home but you still want customers to be able to reach you on your home or cell phone for information, update your work answering machine to reflect the changes and edit your GMB phone number to the appropriate new number.

Hours of operation

The discussion on how best to show that your business either has no hours or limited new hours is ongoing. I believe the best route for the present is to use Google’s method of setting special hours. This option should be especially useful for multi-location enterprises, which can set special hours via the API.

Be advised, however, that in some instances agencies have set special hours for clients, and those clients have then received alarming emails from Google asking whether the business has closed. To date, it appears that when Google receives a response confirming that yes, the business is closed, it simply adds a message about this to the listing rather than removing the listing entirely.
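For multi-location brands working through the API, the sketch below shows one way to build the request body for a special-hours update. Note the assumptions: the field names (`specialHours`, `specialHourPeriods`, `isClosed`) and the "HH:MM" time format reflect my reading of the Google My Business v4 API at the time of writing, and should be verified against the current API documentation before use.

```python
import json

def special_hours_payload(periods):
    """Build a specialHours body for a GMB locations.patch request.

    Each period is (year, month, day, open_time, close_time); pass
    open_time=None to mark that whole day as closed.
    """
    special_periods = []
    for year, month, day, open_time, close_time in periods:
        period = {"startDate": {"year": year, "month": month, "day": day}}
        if open_time is None:
            # A closed day carries isClosed instead of open/close times.
            period["isClosed"] = True
        else:
            period["openTime"] = open_time
            period["closeTime"] = close_time
        special_periods.append(period)
    return {"specialHours": {"specialHourPeriods": special_periods}}

# Curbside pickup 9-5 on March 30; fully closed March 31.
body = special_hours_payload([
    (2020, 3, 30, "09:00", "17:00"),
    (2020, 3, 31, None, None),
])
print(json.dumps(body, indent=2))
```

A PATCH to the location resource with `updateMask=specialHours` would then apply this body; authentication and the exact endpoint are omitted here since they depend on your account setup.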

On March 25, Google implemented a “temporarily closed” button inside the “Info” tab of the GMB dashboard, as reported by Joy Hawkins. Utilizing this button may temporarily decrease your rankings, but you will be able to remove the label in the future and I strongly hope (but cannot guarantee) that this will remove any effects of suppression. I recommend using this button if it applies to your business because we must put safety first over any other consideration.

COVID-19 update posts

Google has created a new Google Posts type that you’ll see as an option in your GMB dashboard. While other post types have been publishing sporadically, I am seeing examples of the COVID-19 Update posts going live. Try to fit as much information as you can about the changed status of your business into one of these posts.

In addition to the edits you make to your GMB listing, update your most visible local business listings on other platforms to the best of your ability, including on:

  • Bing: A “Temporarily closed” business status is available in the Bing Places dashboard. This is currently not available in the API.
  • Yelp: Yelp has introduced a new field called “temporarily closed”. This is meant to be used by businesses which are or will be closed (but not on a permanent basis) due to the COVID-19 outbreak. Businesses need to indicate the “end date” for when this business status will end. Given the uncertainty surrounding timelines, Yelp is allowing users to provide an “estimate” for the end date which they can always update later. Special opening hours can be added on Yelp itself, too. Neither field is available in the API.

Website

Google My Business may be experiencing support issues right now, but thank goodness you still have full control of your website as a home base for conveying important information to the public. Here’s a quick checklist of suggested items to update on your site as soon as you can:

  • Put a site-wide banner on all pages of the website with key information, such as “temporarily closed”, “drive-up service available 9-5, Monday through Friday”, or “storefront closed but we can still ship to you.”
  • Provide the most complete information about how your business has been affected by COVID-19, and detail any services that remain available to customers.
  • Edit location landing pages in bulk or individually to reflect closures, new hours, and new temporary offers.
  • Be sure hours of operation are accurate everywhere they are mentioned on the website, including the homepage, contact page, about page, and landing pages.
  • If your main contact phone number has changed due to the situation, update that number everywhere it exists on the website. Don’t overlook headers, footers, or sidebars as places your contact info may be.
  • If you have a blog, use it to keep the public updated about the availability of products and services.
  • Be sure your website contains highly visible links to any social media platforms you are using to provide updated information.
  • It would be a worthy public service right now to create new content about local resources in your community for all kinds of basic needs.

Social media and email

“Make it clear what you’re doing, such as things like home delivery or curbside pickup. And mention it EVERYWHERE. The companies that are being successful with this are telling people non-stop how they can still support them. Additionally, don’t be afraid to reach out to people who have supported you via social media in the past and ask them to mention what you’re doing.” —Dana DiTomaso, Kick Point

Whether your customers’ social community is Facebook, Twitter, Instagram, YouTube, or another platform, there has never been a more vital time to make use of the instant communication these sites provide. It was Fred Rogers who famously said that in times of crisis, we should “look for the helpers.” People will be looking to your brand for help and, also, seeking ways that they can help, too.

If you can make the time to utilize social media to highlight not just your own services, but the services you discover are being provided by other businesses in your city, you will be strengthening your community. Ask your followers and customers to amplify information that can make life safer or better right now.

And, of course, email is one of the best tools presently at your disposal to message your entire base about changed conditions and special offers. My best practice advice for the present is to be sure you’re only communicating what is truly necessary. I’ve seen some examples of brands (which shall remain nameless) exploiting COVID-19 for senseless self-promotion instead of putting customers’ concerns and needs first. Don’t go that route. Be a helper!

Beyond your local business listing, websites, social media platforms, and email, don’t overlook offline media for making further, helpful informational contributions. Call into local radio shows and get in touch with local newspapers if you have facts or offers that can help the public.

Operate as fully as you can

“Find out what support is being made available for you at [the] government level, tap into this as soon as you can — it’s likely there will be a lot of paperwork and many hoops through which you’ll need to jump.” —Claire Carlile, Claire Carlile Marketing

While the social safety net differs widely from country to country, research any offers of support being made to your business and make use of them to remain as operational as possible for the duration of this pandemic. Here are six adjustments your business should carefully consider to determine whether implementation is possible:

1. Fulfill essentials

If your business meets local, state, or federal regulations that enable it to continue operating because it’s deemed “essential”, here are the ways different business models are adapting to current conditions:

  • Some healthcare appointments can be handled via phone or virtual meetings, and some medical facilities are offering drive-up testing.
  • Drivethrough, delivery, and curbside pickup are enabling some brands to offer takeout meals, groceries, prescriptions, and other necessary goods to customers.
  • Supermarkets and grocery stores without built-in delivery fleets are contracting with third parties for this service.
  • Farms and ranches can offer honor system roadside stands to allow customers to access fresh produce, dairy products, and meats with proper social distancing.
  • Companies that care for vulnerable populations, banking, laundry, and fuel can implement and communicate the extra steps they are taking to adhere to sanitation guidelines for the safety of customers and staff.
  • Brands and organizations that donate goods and services to fulfill essential needs are taking an active role in community support, too.

2. Evaluate e-commerce

If your local business already has an e-commerce component on its website, you’re many steps ahead in being well set up to keep selling via delivery. If you’ve not yet implemented any form of online selling, investigate the following options:

  • If you have a credit card processing machine, the most basic solution is to take orders over the phone and then ship them, allow curbside pickup, or deliver them.
  • If you lack a credit card processing service, PayPal invoicing can work in a pinch.
  • If your site is built on WordPress and you’re quite comfortable with that platform, Moz’s own Sha Menz highly recommends the WooCommerce plugin for how easily it gets online shopping set up, with PayPal as a built-in payment option. It allows easy setup of flat-rate or free shipping and local pickup options. WooCommerce automatically sends order confirmation emails to both owner and customer, and even supports creation of discount coupons.
  • Pointy is a simple device that lets you scan product barcodes and have them catalogued online. Read my 2019 interview with the company’s CEO and determine whether Pointy plus shipping could be a solution to keep you in business in the coming months.
  • If you’ve determined that robust investing in e-commerce is a wise move for the present and future, I found this 2020 overview of options from Shopify to Volusion to Magento very useful. Don’t overlook the Moz blog’s e-commerce category for free, expert advice.

3. Connect virtually

In my very large family, one relative has transitioned her yoga studio to online classes, another is offering secure online psychotherapy appointments, and another is instructing his orchestra on the web. While nothing can replace in-person relationships, virtual meetings are the next best thing and could keep many business models operating at a significant level, despite the pandemic. Check out these resources:

4. Use downtime for education

If COVID-19 has somewhat or completely paused your business, it’s my strong hope that there will be better days ahead for you. If, like so many people, you find yourself with much more time on your hands than usual, consider using it to come out of this period of crisis with new business knowledge. Please make use of this list of resources, and I want to give special thanks to my friend, Claire Carlile, for contributing several of these suggestions:

Begin working towards a stronger local future

“I would say generally it’s critical for business owners to connect with one another. To the extent they can join or form groups for support or to share ideas, they should. This is a terrible and scary time but there are also potential opportunities that may emerge with creative thinking. The ‘silver lining’, if there is one here, is the opportunity to reexamine business processes, try new things and think — out of necessity — very creatively about how to move forward. Employees are also a great source of ideas and inspiration.” —Greg Sterling, Search Engine Land

I’d like to close with some positive thinking. Local SEO isn’t just a career for me — it’s a personal belief system that well-resourced communities are the strongest. Every community, town, and city shares roughly the same needs, which we might depict like this:

In this simple chart, we see the framework of a functional, prepared, and healthy society. We see a plan for covering the basic needs of human existence, the cooperation required to run a stable community, contributive roles everyone can play to support life and culture, and relief from inevitable disasters. We see regenerative land and water stewardship, an abundance of skilled educators, medical professionals, artisans, and a peaceful platform for full human expression.

COVID-19 marks the third major disaster my community has lived through in three years. The pandemic and California’s wildfires have taught me to think about the areas in which my county is self-sustaining, and areas in which we are unprepared to take care of one another in both good times and bad. While state and national governments bear a serious responsibility for the well-being of citizens, my genuine belief as a local SEO is that local communities should be doing all they can to self-fulfill as many data points on the chart above as possible.

While it’s said that necessity is the mother of invention, and it certainly makes sense that the present moment would be driving us to invent new solutions to keep our communities safe and well, I find models for sane growth in the work others have already contributed. For me, these are sources of serious inspiration:

  • Learn from indigenous cultures around the world about stewardship and community. Here is just one example of how knowledge is being applied by tribes in the Pacific Northwest during the pandemic. In my own state of California, a number of tribes are leading the way in mitigating wildfires via cultural burning, addressing what has become an annual disaster where I live.
  • Look at the policies of other countries with a higher index of human happiness than my own. For example, I am a great admirer of Norway’s law of allemannsrett which permits all residents to responsibly roam and camp in most of the country, and more importantly, to harvest natural foods like mushrooms and berries. In my community, most land is behind fences, and even though I know which plants are edible, I can’t access most of them. Given current grocery store shortages, this concept deserves local re-thinking.
  • Study the Economic Bill of Rights US President Franklin Delano Roosevelt introduced but didn’t live to see passed. Had this been implemented, my local community would not now be suffering from a shortage of medical providers and denial of medical care, a shortage of nearby farms for complete nutrition, homelessness and unaffordable housing, and a widespread lack of education and essential skills. From a purely commercial standpoint, FDR’s bill could also have prevented the collapse of “Main St.”, which local search marketers have been fighting every day to reverse.
  • Join organizations like the American Independent Business Alliance (AMIBA), which exists to build more resilient local communities via methods like the Buy Local movement and community education. I strongly encourage you to check in with AMIBA for guidance in these times.

Other models and examples may personally inspire you, but I share my friend Greg Sterling’s opinion: now is the time to bring creativity to bear, to connect with fellow local business owners and community members, and to begin planning a more realistic and livable future.

For now, you will have to make those connections virtually, but the goal is to come out of this time of crisis with a determination to make local living more sustainable for everyone. You can start with asking very basic questions like: Where is the nearest farm, and how many people can it feed? What do we need to do to attract more doctors and nurses to this town? Which facilities could be converted here to produce soap, or bathroom tissue, or medical supplies?

I don’t want to downplay the challenge of forward-thinking in a time of disruption, but this I know from being a gardener: new seeds sprout best where the earth is disturbed. You have only to visit the margins of new roads being laid to see how digging is quickly followed by verdant crops of fresh seedlings. Humanity needs to dig deep right now for its best solutions to serious challenges, and this can begin right where you are, locally.

Please allow me to wish many better days ahead to you, your business, and your community, and to work by your side to build a stronger local future.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Operating During COVID-19: Helpful Tips for Local Businesses published first on http://goproski.com/

Friday, March 27, 2020

Generating Local Content at Scale - Whiteboard Friday

Posted by rjonesx.

Building local pages in any amount can be a painful task. It’s hard to strike the right mix of on-topic content, expertise, and location, and the temptation to take shortcuts has always been tempered by the fact that good, unique content is almost impossible to scale.

In this week’s edition of Whiteboard Friday, Russ Jones shares his favorite white-hat technique using natural language generation to create local pages to your heart’s content.


Video Transcription

Hey, folks, this is Russ Jones here with Moz again to talk to you about important search engine optimization issues. Today I’m going to talk about one of my favorite techniques, something that I invented several years ago for a particular client and that has become more and more important over the years.

Using natural language generation to create hyper-local content

I call this using natural language generation to create hyper-local content. Now I know that there’s a bunch of long words in there. Some of you are familiar with them, some of you are not. 


So let me just kind of give you the scenario, which is probably one you’ve been familiar with at some point or another. Imagine you have a new client and that client has something like 18,000 locations across the United States.


Then you’re told by Google you need to make unique content. Now, of course, it doesn’t have to be 18,000. Even 100 locations can be difficult, not just to create unique content but to create uniquely valuable content that has some sort of relevance to that particular location. 


So what I want to do today is talk through one particular methodology that uses natural language generation in order to create these types of pages at scale.

What is natural language generation?

Now there might be a couple of questions that we need to just go ahead and get off of our plates at the beginning. So first, what is natural language generation? Well, natural language generation actually originated for the purpose of generating weather warnings. You’ve probably seen this 100,000 times.

Whenever there’s like a thunderstorm or let’s say high wind warning or something, you’ve seen on the bottom of a television, if you’re older like me, or you’ve gotten one on your cellphone and it says the National Weather Service has issued some sort of warning about some sort of weather alert that’s dangerous and you need to take cover.

Well, the language that you see there is generated by a machine. It takes into account all of the data that they’ve arrived at regarding the weather, and then they put it into sentences that humans automatically understand. It’s sort of like Mad Libs, but a lot more technical in the sense that what comes out of it, instead of being funny or silly, is actually really useful information.
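In that Mad Libs spirit, a toy version of the weather pipeline is just structured data poured into a sentence template. The wording below is invented for illustration, not the National Weather Service's actual phrasing:

```python
# A toy weather-alert generator: structured data in, a
# human-readable sentence out.
def weather_alert(event, county, until):
    return (f"The National Weather Service has issued a {event} warning "
            f"for {county} County until {until}. Take cover if outdoors.")

alert = weather_alert("high wind", "King", "6:00 PM")
print(alert)
```

Real NLG systems add grammar handling, variation, and data-driven insight selection on top of this template idea, but the input-data-to-sentence shape is the same.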

That’s our goal here. We want to use natural language generation to produce local pages for a business that has information that is very useful. 

Isn’t that black hat?

Now the question we almost always get or I at least almost always get is: Is this black hat? One of the things that we’re not supposed to do is just auto-generate content.

So I’m going to take a moment towards the end to discuss exactly how we differentiate this type of content creation from just the standard, Mad Libs-style, plugging in different city words into content generation and what we’re doing here. What we’re doing here is providing uniquely valuable content to our customers, and because of that it passes the test of being quality content.

Let’s look at an example

So let’s do this. Let’s talk about probably what I believe to be the easiest methodology, and I call this the Google Trends method. 

1. Choose items to compare

So let’s step back for a second and talk about this business that has 18,000 locations. Now what do we know about this business? Well, businesses have a couple of things that are in common regardless of what industry they’re in.

They either have like products or services, and those products and services might have styles or flavors or toppings, just all sorts of things that you can compare about the different items and services that they offer. Therein lies our opportunity to produce unique content across almost any region in the United States.

The tool we’re going to use to accomplish that is Google Trends. So the first step that you’re going to do is you’re going to take this client, and in this case I’m going to just say it’s a pizza chain, for example, and we’re going to identify the items that we might want to compare. In this case, I would probably choose toppings for example.

So we would be interested in pepperoni and sausage and anchovies and God forbid pineapple, just all sorts of different types of toppings that might differ from region to region, from city to city, and from location to location in terms of demand. So then what we’ll do is we’ll go straight to Google Trends.

The best part about Google Trends is that they’re not just providing information at a national level. You can narrow it down to city level, state level, or even in some cases to ZIP Code level, and because of this it allows us to collect hyper-local information about this particular category of services or products.

So, for example, this is actually a comparison of the demand for pepperoni versus mushroom versus sausage toppings in Seattle right now. When people in Seattle are Googling for pizza, most would be searching for pepperoni.

2. Collect data by location

So what you would do is you would take all of the different locations and you would collect this type of information about them. So you would know that, for example, here there is probably about 2.5 times more interest in pepperoni than there is in sausage pizza. Well, that’s not going to be the same in every city and in every state. In fact, if you choose a lot of different toppings, you’ll find all sorts of things, not just the comparison of how much people order them or want them, but perhaps how things have changed over time.



For example, perhaps pepperoni has become less popular. If you were to look in certain cities, that probably is the case as vegetarian and veganism has increased. Well, the cool thing about natural language generation is that we can automatically extract out those kinds of unique relationships and then use that as data to inform the content that we end up putting on the pages on our site.

So, for example, let’s say we took Seattle. The system would automatically be able to identify these different types of relationships. Let’s say we know that pepperoni is the most popular. It might also be able to identify that let’s say anchovies have gone out of fashion on pizzas. Almost nobody wants them anymore.

Something of that sort. But what’s happening is we’re slowly but surely coming up with these trends and data points that are interesting and useful for people who are about to order pizza. For example, if you’re going to throw a party for 50 people and you don’t know what they want, you can either do what everybody does pretty much, which is let’s say one-third pepperoni, one-third plain, and one-third veggie, which is kind of the standard if you’re like throwing a birthday party or something.

But if you landed on the Pizza Hut page or the Domino’s page and it told you that in the city where you live people actually really like this particular topping, then you might actually make a better decision about what you’re going to order. So we’re actually providing useful information. 
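To make the data-collection step concrete, here is a minimal sketch of the kind of per-city comparison you would run once the numbers are gathered. The scores are hypothetical stand-ins for the 0-100 interest values the Trends UI reports (or that an unofficial client such as pytrends returns):

```python
# Hypothetical Google Trends-style "interest by region" scores (0-100)
# for three pizza toppings; real values would come from the Trends UI
# or a client library.
interest = {
    "Seattle":  {"pepperoni": 75, "sausage": 30, "mushroom": 45},
    "Chicago":  {"pepperoni": 80, "sausage": 65, "mushroom": 25},
    "New York": {"pepperoni": 90, "sausage": 40, "mushroom": 20},
}

def topping_ratios(city_scores):
    """Express each topping's interest relative to the city's top topping."""
    top = max(city_scores.values())
    return {t: round(s / top, 2) for t, s in city_scores.items()}

ratios = {city: topping_ratios(scores) for city, scores in interest.items()}
print(ratios["Seattle"])
```

With these invented numbers, Seattle shows 2.5 times more interest in pepperoni than sausage (75 vs. 30), exactly the sort of relationship the transcript describes pulling out of Trends.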

3. Generate text

So this is where we’re talking about generating the text from the trends and the data that we’ve grabbed from all of the locales.

Find local trends

Now the first step, of course, is just looking at local trends. But local trends aren’t the only place we can look. We can go beyond that. For example, we can compare it to other locations. So it might be just as interesting that in Seattle people really like mushroom as a topping or something of that sort.

Compare to other locations

But it would also be really interesting to see if the toppings that are preferred, for example, in Chicago, where Chicago-style pizza rules, versus New York are different. That’s something that could be automatically drawn out by natural language generation. Then finally, another thing that people tend to miss in trying to implement this solution is they think that they have to compare everything at once.

Choose subset of items

That’s not the way you would do it. What you would do is choose the most interesting insights in each situation. Now we could get technical about how that might be accomplished. For example, we might say, okay, we can look at trends. Well, if all of the trends are flat, then we’re probably not going to choose that information. But if we see that the relationship between one topping and another in this city is exceptionally different compared to other cities, well, that might be what gets selected.
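The insight-selection and sentence-generation steps can be sketched together in a few lines. Everything here is illustrative: the topping names, the scores, and the "deviation from national share" heuristic are assumptions standing in for whatever interestingness criteria a real system would use.

```python
# Hypothetical interest scores (0-100) standing in for Trends data.
seattle  = {"pepperoni": 75, "sausage": 30, "mushroom": 45}
national = {"pepperoni": 80, "sausage": 45, "mushroom": 25}

def share(scores, topping):
    """A topping's share of total topping interest in one region."""
    return scores[topping] / sum(scores.values())

def most_distinctive(city_scores, national_scores):
    """Pick the insight worth writing about: the topping whose local
    share deviates most from its national share."""
    return max(city_scores,
               key=lambda t: abs(share(city_scores, t) -
                                 share(national_scores, t)))

def generate_sentence(city, city_scores, national_scores):
    """Turn the selected insight into a templated, readable sentence."""
    topping = most_distinctive(city_scores, national_scores)
    direction = ("more" if share(city_scores, topping) >
                 share(national_scores, topping) else "less")
    return (f"Compared to the rest of the country, {city} pizza fans are "
            f"noticeably {direction} interested in {topping}.")

sentence = generate_sentence("Seattle", seattle, national)
print(sentence)
```

With these numbers the system skips the unremarkable pepperoni lead and surfaces mushroom as the locally distinctive topping, which is the "choose the most interesting insight" behavior described above. The human-review step then checks such sentences before they ship.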

4. Human review

Now here’s where the question comes in about white hat versus black hat. So we’ve got this local page, and now we’ve generated all of this textual content about what people want on a pizza in that particular town or city. We need to make sure that this content is actually quality. That’s where the final step comes in, which is just human review.

In my opinion, auto-generated content, as long as it is useful and valuable and has gone through the hands of a human editor who has identified that that’s true, is every bit as good as if that human editor had just looked up that same data point and wrote the same sentences.

So I think in this case, especially when we’re talking about providing data to such a diverse set of locales across the country, that it makes sense to take advantage of technology in a way that allows us to generate content and also allows us to serve the user the best possible and the most relevant content that we can.

So I hope that you will take this, spend some time looking up natural language generation, and ultimately be able to build much better local pages than you ever have before. Thanks.

Video transcription by Speechpad.com





Thursday, March 26, 2020

Help Your Community from Six Feet Away: Non-Marketing Tips from Mozzers

Posted by morgan.mcmurray

For the last few weeks, you’ve probably experienced an influx of emails from companies detailing how COVID-19 is affecting them and thus you, their customer. It’s… a lot, isn’t it? So today, we want to take a departure from the world of “how this affects us” and focus instead on actionable things we can all do to make things brighter for ourselves and our communities. This won’t be your regularly scheduled programming — we won’t be discussing SEO or marketing. Instead, we’re sharing ideas and advice from the folks at Moz who’ve been finding ways to be helpers as we all navigate this new normal.

Donate and shop

For those who have steady income during this time of economic uncertainty, it’s more important than ever to support local businesses and charitable organizations. Many employers, Moz included, offer charitable donation matching to make use of as well.

Food banks, shelters, and charities

You can donate money or call local organizations (like homeless shelters, food banks, and animal rescues) to see what items they most need. Mozzers have found several creative ways to contribute, including a super helpful spreadsheet of all the food banks in our area shared by Britney Muller. A few of us have volunteered to be pet foster parents, and Skye Stewart has even seen neighbors turn their “little free libraries” into pantries for those in need! 

Skye has seen little free libraries stocked as pantries throughout the Wallingford and Fremont neighborhoods of Seattle. This one belongs to Clay and Elli Stricklin.

Blood banks

If you’re healthy and able, consider signing up to donate blood. The blood banks in our area have received so many volunteers that they’re scheduling appointments weeks in advance — what a fantastic show of community support!

Buy gift cards or shop online

All of our favorite local salons, restaurants, bars, or home goods stores are likely suffering from recent closures. Gift cards give them support now and give you the option to shop later (or have your holiday shopping done a little early). Many local businesses also have online shops for you to browse from home. Shipping times are likely impacted, though, so be understanding!

Order take-out

Local restaurants are shifting to take-out and to-go order business models. If you can’t go pick up food, apps like DoorDash and Grubhub are offering no-contact delivery options.


Grocery shop

Stock up on only what you need for two or three weeks. You can also volunteer, like Mozzer Hayley Sherman, to make grocery runs for at-risk friends or family.

Stay healthy

This sounds like a no-brainer — of course we’re all trying to stay healthy! But it has to be said, as now we have to be a bit more creative to keep up our healthy habits.

Online workouts

With recent closures, local gyms and studios are offering online classes. Have you ever wondered what a yoga or dance class is like via Zoom? A few of us at Moz have found out, and it’s definitely different — but also surprisingly fun — to connect with all the other students in this new way.

Walk or run

We’ve been enjoying some unseasonable sunshine in the Pacific Northwest, making it the perfect time to fight cabin fever with a walk or run outside. Weather permitting, you can do the same! Just make sure to maintain social distance from other walkers and runners (even if they have a cute puppy with them — tough, we know).

Meditate

Meditation can help calm the anxiety many of us might be feeling right now. Dr. Pete recommends the Ten Percent Happier app for assistance, and apps like Insight Timer and Calm have dozens of free meditation options for you to choose from, too.

Keep eating fresh fruits and veggies

While it’s tempting to only stock up on non-perishable food like mac and cheese (I’m guilty of having several boxes stored in my pantry) and rely on supplements or Emergen-C, fresh produce is still one of the best options to get necessary vitamins and boost your immunity.

Go offline

Several of us at Moz have found it helpful to disconnect from the news cycle for a while every day, and we try to only pay attention to news from reputable sources. With so many voices in the conversation, this can be hard, which is why going offline can be so helpful.

Stay connected

Human connection remains important for maintaining morale and good humor, even if we can’t share the same physical space.

Check in

Call people you would normally see regularly, and reach out to those you haven’t seen in a while. Mozzers are staying connected by calling into morning coffee hangouts and virtual team lunches — it’s been great to see everyone’s smiling faces!

You might start a weekly virtual happy hour or book club using free video conferencing software like Google Hangouts or Skype, or schedule some time to watch movies together with the new Netflix Party extension.

Join online communities

Social media groups or apps like Nextdoor allow you to meet your neighbors, share memes, and check to see if anyone needs anything like a grocery run, medicine, or just a virtual hug.

We’ve created channels in our company Slack for topics like parenting, wellness, gardening, and just general fun. These groups have really helped bring light and friendship to our shared situation. In the parenting channel, specifically, Moz parents have banded together to share resources and suggestions to help support each other in this new world of homeschooling.

Lean into empathy

We’re living through an unprecedented time, and one of the best things we can do is understand that sometimes, humans just need to be human. If you’re leading a team that’s working from home, you might find your employees keeping unorthodox working hours with school closures, disrupted schedules, and technical difficulties. Flex your empathy muscle, and consider enacting flexible policies that will reduce stress on your employees while making sure the work still gets done.

Let everyone know it’s okay to sign off during normal working hours to prioritize family time and child care. You can also schedule non-work-related check-ins, or build relaxation time into your schedules. Moz CEO Sarah Bird gave all employees a “Take a Breather” day to give everyone time to relax, make “quarantinis”, and adjust to our current reality. We all really appreciated that time!

This list of ways to help is by no means exhaustive, and we’d love to hear your ideas! Leave a comment or send us a tweet. We’re in this together.


What we’re doing

We’re committed to keeping as much normalcy in the routines of our community as possible, and that includes minimizing the impact of this crisis on our customers and employees. There will be no interruptions to our tool functionality or to our support team’s ability to serve our customers. We will also continue to publish helpful, actionable content — even if that means you see a few Whiteboard Fridays from the living rooms of our experts!

Employees at Moz have already been trained as a distributed team, which has prepared us well for a life of working from home — now a mandatory policy. We’re also given paid time off, including sick leave, and are encouraged to sign off from work when we’re feeling under the weather to rest and recuperate.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Help Your Community from Six Feet Away: Non-Marketing Tips from Mozzers published first on http://goproski.com/

Wednesday, March 25, 2020

How to Handle Temporarily Out-of-Stock Product Pages

Posted by Dr-Pete

The next few months are going to be uncharted territory for all of us, with serious challenges for both brick-and-mortar and online businesses. Many e-commerce sites are already facing a unique situation right now, and it looks something like this:

These are hand sanitizer results from Staples.com, and this screenshot is just a portion of the first page. I’m not picking on Staples — this page is representative of a problem across every major e-retailer right now. While there are many ways to handle out-of-stock and discontinued items under normal conditions, this situation is very specific:

  1. Multiple similar items are out-of-stock at the same time
  2. Retailers may not know when they’ll be back in stock
  3. These products may not stay back in stock for long
  4. Demand is high and continuing to rank is critical

From an SEO standpoint, it’s essential that these pages continue to rank, both for consumers and retailers, but in the short-term, the experience is also frustrating for consumers and can drive them to other sites.

Is this a technical SEO problem?

The short answer is: not really. We want these pages to continue to rank — they’re just not very useful in the short-term. Let’s take a quick look at the usual toolbox to see what applies.

Option #1: 404 (Not Found)

This one’s easy. Do not 404 these pages. These products are coming back and you want to sell them. What’s more, you want to be able to act quickly when they’re back in stock. If you remove the page and then put it back (and then, most likely, remove it again and put it back again), it can take Google a lot of time to reconcile those signals, to the point where the page is out of sync with reality. In other words, by the time the page starts ranking again, the product might already be out of stock again.

Option #2: 301 (Permanent Redirect)

As tools go, 301s still have a special place in our tool belts, but they’re not a good bet here. First, the product still exists. We don’t really want to move it in any permanent sense. Second, reversing a 301 can be a time-consuming process. So, just like with 404s, we’re likely to shoot ourselves in the foot. The only exception would be if a product went out of stock and that prompted the manufacturer to permanently replace it with a similar product. Let’s say Acme Essentials ran out of the 10-ounce Mountain Fresh hand sanitizer and decided to do away with that product entirely, replacing it with the 12-ounce option. In that case, by all means 301-redirect, but that’s going to be a fairly rare situation.
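For that rare permanent-replacement case, the redirect itself is a one-liner. Here’s a sketch for an Apache server — the product URLs are invented for illustration, and yours will follow your own site’s structure:

```apache
# Permanently redirect the discontinued 10-ounce product page to its
# 12-ounce replacement. Only do this if the old product is truly gone.
Redirect 301 /products/acme-mountain-fresh-10oz /products/acme-mountain-fresh-12oz
```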

Option #3: 302 (Temporary Redirect)

This has got to be the one, right? Unfortunately, we’re still stuck with the timing problem if this product comes back in stock for a short period of time. Let’s say you’re out of the Acme Essentials 10-ounce Mountain Fresh, but you’ve got the Trapper Moe’s 10-ounce Spring Breeze in stock. Could you temporarily swap in the latter product from a search perspective? Maybe, if you could get the timing right, but now imagine the visitor experience. People would potentially still be able to search (on-site) for the Acme Essentials product, but then would be redirected to the Trapper Moe’s product, which could seem deceptive and is likely to harm conversion.

Option #4: ItemAvailability Schema

You can use the availability property in your product’s Offer schema to set options including: InStock, InStoreOnly, OutOfStock, and SoldOut. Google may choose to display this information as part of your organic result, such as this one (thanks to Claire Carlisle for this great example):

Good news — sloths are still in stock. Unfortunately, there are two challenges to this approach. First, while searchers may appreciate your honesty, you may not be keen to display “Out of stock” on your search result when everyone else is displaying nothing at all. Second, we’ve still got the timing issue. You can automate flipping from “In stock” to “Out of stock” in real time, but Google still has to crawl and update that information, and that takes time.
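If you’re not already using this markup, here’s a minimal sketch of a Product with the availability property set on its Offer (the product details are invented for illustration):

```html
<!-- Minimal product schema with availability; details are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Acme Essentials Hand Sanitizer, 10 oz",
  "offers": {
    "@type": "Offer",
    "price": "4.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/OutOfStock"
  }
}
</script>
```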

So, it’s basically hopeless?

If it seems like I’ve just ruled out all of the options, it’s because fundamentally I don’t believe this specific case is an SEO problem. Removing or redirecting pages in a volatile situation where products may go out of stock and come back into stock on a daily basis requires timing Google’s processes in a way that’s extremely risky.

So, if we’re going to keep these pages indexed and (hopefully) ranking, the key is to make sure that they continue to give value to your search visitors, and this is primarily a user experience problem.

Here’s an example of what not to do (sorry, unnamed big-box retailer):

Shipping is unavailable, but at least I can pick this up in the store, right? Nope, and for some reason they’ve auto-selected this non-option for me. If I accept the pre-selected unavailable option, I’m taken to a new screen telling me that yes, it is in fact unavailable. There’s absolutely no value here for a search visitor.

Here’s another example that might not seem so different, but is much more useful. Please note, while all of these elements are taken from real e-commerce sites, I’ve simplified the pages quite a bit:



The product is out of stock at my local store and not available for delivery, but it is available at a nearby store. That’s not ideal, and under normal circumstances I’d probably go somewhere else, but in the current environment it’s at least a viable option. A viable option is a potential sale.

Here’s an approach that gives search visitors another viable option:

It’s not the most visually-appealing layout, but that [Notify Me] button expands into a quick, single-field email form that gives visitors an immediate alternative. Even if they don’t buy from this store today, they might still enter their email and end up ordering later, especially at a time when supplies are low everywhere and people want alternatives.
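A feature like that doesn’t require much markup, either. Here’s a bare-bones sketch of a back-in-stock email form — the endpoint and field names are placeholders, not any particular platform’s API:

```html
<!-- Minimal back-in-stock notification form; the action URL and
     field name are hypothetical, not a real endpoint. -->
<form action="/api/stock-alerts" method="post">
  <label for="alert-email">Notify me when this item is back in stock:</label>
  <input type="email" id="alert-email" name="email" required
         placeholder="you@example.com">
  <button type="submit">Notify Me</button>
</form>
```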

This same page had another option I really like, an “Also available in” pull-down:

Unfortunately, these other options were also out of stock, but if this feature could be tuned up to only reflect similar, in-stock products, it could present an immediate purchase option. In this unique scenario, where demand massively outpaces supply, consumers are going to be much more amenable to similar products.

Obviously, these features represent a lot more work than a few 301 redirects, but we’re looking at a situation that could last for weeks or months. A few enhancements that give visitors viable options could be worth many thousands of dollars and could also help maintain search rankings.

What about internal search?

Obviously, the experience at the top of this post is less than ideal for internal search users, but should you remove those products from being displayed temporarily? From an SEO perspective, this is a bit tricky. If you block those products from being shown, then you’re also blocking the internal link equity temporarily, which could impact your rankings. In addition, you may end up with a blank page that doesn’t accurately represent your usual inventory. I think there are two options that are worth considering (both of which will require investment):

1. Let people filter out-of-stock products

I know that e-commerce sites are reluctant to hide products and want to maintain the perception of having a lot of available items, but they’re useless if none of those items are actually available. If you allow customers to easily filter out out-of-stock products, you address both problems above. First, visitors will get to see the full list initially and know which products you normally carry. Second, you can make the filter unavailable to search bots so that they continue to pass link equity to all products.

2. De-prioritize out-of-stock products

I’m not usually a fan of overriding search filters, as it can be confusing to visitors, but another option would be to push out-of-stock products to the bottom of internal search results, maintaining filters and sorts within the stocked and out-of-stock groups. This lets people see the entire list and also gives search bots access, but brings available products to the forefront. Visitors aren’t going to wade through pages of out-of-stock inventory to find the one available item.
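As a sketch of how simple the re-ordering logic can be: a stable sort on stock status pushes unavailable items down while preserving the existing order within each group. The product data below is invented for illustration:

```python
# Sketch of de-prioritizing out-of-stock items in internal search results.
# Python's sort is stable, so the existing ranking order is preserved
# within the in-stock and out-of-stock groups. Product data is made up.
results = [
    {"name": "Acme Mountain Fresh 10oz", "in_stock": False},
    {"name": "Trapper Moe's Spring Breeze 10oz", "in_stock": True},
    {"name": "Acme Mountain Fresh 12oz", "in_stock": False},
    {"name": "Generic Sanitizer 8oz", "in_stock": True},
]

# False sorts before True, so keying on `not in_stock` puts in-stock first.
reordered = sorted(results, key=lambda p: not p["in_stock"])

for product in reordered:
    print(product["name"], "(in stock)" if product["in_stock"] else "(out of stock)")
```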

No, really, what’s the secret?

I wish I could give you the magic HTML tag or line of .htaccess that would solve this problem, but when the situation is changing day-by-day or even hour-by-hour, many of our best practices fall apart. We can’t apply ordinary solutions to extraordinary problems.

In this unique case, I think the most important thing, from an SEO standpoint, is to maintain the ranking power of the page, and that probably means leaving it alone. Any technical wizardry we can perform ends at the point that search bots take over, and the process of re-crawling and re-caching a page takes time. Our best bet is to provide an experience that gives search visitors options and maintains the page’s value. While this will require investment in the short-term, these changes could equate to thousands of dollars in revenue and will continue to produce benefits even when life returns to normal.

What challenges are you facing?

As a Seattle-based company, Moz is painfully aware of the disruptions so many businesses and individuals are facing right now. How can we help you during this difficult period? Are there unique SEO challenges that you’ve never faced before? In the spirit of we’re-all-in-this-together, we’d like to help and commit content resources toward addressing the immediate problems our customers and readers are facing. Please tell us about your current challenges in the comments.



Tuesday, March 24, 2020

You Can Now Take Moz Academy Courses for Free

Posted by Roger-MozBot

The well-being of our community — from our customers to our readers to our team members — is of the utmost importance to us here at Moz. The ongoing situation around the spread of COVID-19 is ever-changing. Many of you are experiencing the impact of this pandemic, and we want to address the difficulties you’re facing and acknowledge how you might be feeling.

The state of the world and current events bring significant, often crushing, impact to businesses large and small. While it can be really hard to focus on work and on what is happening in the SEO industry during this difficult time, we also know that your work can’t stop.

Whether you’re reading this as a small business owner concerned about your traffic, or an agency with clients who are hurting financially — we’re here to support you.

Today through May 31, you’ll be able to access the courses in Moz Academy for free. Hopefully you can use this resource to level up your skills, learn a new discipline, or simply channel your energy into a productive distraction.

There’s something for everyone:

  • SEO Fundamentals
  • Local SEO Fundamentals
  • Keyword Research
  • Page Optimization
  • Backlink Basics
  • Reporting on SEO
  • Technical SEO Site Audit
  • Backlink Audit & Removal
  • The Fundamentals of SEO Client Prospecting
  • Finding Potential SEO Clients
  • Prepare for the SEO Client Pitch
  • Selling the Value of SEO
  • Client Onboarding
  • How to Use Moz Pro

If you’re already a Moz customer or community member, you can head straight to academy.moz.com. As long as you’re logged in, you’ll be good to go. Just pick the courses you want to take part in and apply promo code “wegotthis” at checkout.

If you’re not a Moz customer or community member, simply create a free account with us to get started.

We love you, we’re here for you, and we’re in this together.



Friday, March 20, 2020

Getting Smarter with SERPs - Whiteboard Friday

Posted by rjonesx.

Modern SERPs require modern understanding. National SERPs are a myth — these days, everything is local. And when we’re basing important decisions on SERPs and ranking, using the highest quality data is key. Russ Jones explores the problem with SERPs, data quality, and existing solutions in this edition of Whiteboard Friday.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hey, folks, this is Russ Jones here again with another exciting edition of Whiteboard Friday. Exciting might be an exaggeration, but it really is important to me because today we’re going to talk about data quality. I know I harp on this a whole lot.

It’s just, as a data scientist, quality is really important to me. Here at Moz, we’ve made it a priority over the last several years, from improving the quality of our Domain Authority score, to improving Spam Score, to completely changing the way we identify the search volume of particular keywords. Quality is just part of our culture here.

Today I want to talk about a quality issue affecting probably the most important metric in search engine optimization: search rankings. Now I know there’s a contingent of SEOs who say you shouldn’t look at your search rankings. You should just focus on building better content and doing better outreach and let it happen.

But for the vast majority of us, we look at our rankings for the purposes of determining how we’re performing, and we make decisions based on those rankings. If a site stops performing as well for a very important keyword, well, then we might spend some money to improve the content on that page or to do more outreach for it.

We make important decisions, budgetary decisions on what the SERPs say. But we’ve known for a while that there’s a pretty big problem with the SERPs, and that’s personalization. There just is no national search anymore, and there hasn’t been for a long time. We’ve known this, and we’ve tried different ways to fix it.

Today I want to talk about a way that Moz is going about this that I think is really exceptional and is frankly going to revolutionize the way in which all SERPs are collected in the future. 

What’s wrong with SERPs?

1. Geography is king

Let’s just take a step back and talk a little bit about what’s wrong with SERPs. Several years back I was a consultant and I was helping out a nonprofit organization that wanted to rank for the keyword “entrepreneurship.”

They offered grants and training and all sorts of stuff. They really deserved to rank for the term. Then one day I searched for the term, as SEOs do. Even though we rank track, we still check for ourselves. I noticed that several universities local to where I live, the University of North Carolina at Chapel Hill and Duke, had popped up into the search results because they were now offering entrepreneurship programs and Google had geolocated me to the Durham area.

Well, this wasn’t represented at all in the rank tracking that we were doing. You see, the nationalized search at that time was not picking up any kind of local signals because there weren’t any colleges or universities around the data center which we were using to collect the search results.

That was a big problem, because one day Google rolled out some sort of update that improved geolocation and ultimately took away a lot of traffic for that primary keyword, since local sites were starting to rank all across the country. So as SEOs we decided to fight back, and the strategy we used was what I call centroid search.

2. Centroid search sucks

The idea is pretty simple. You take a town, a city, a state, or even a country. You find the latitude and longitude of the dead center of that location, and then you feed that to Google in the UULE parameter so that you get a search result from what would happen if you were standing right there in that specific latitude and longitude and perform the search.

Well, we know that that’s not really a good idea. The reason is pretty clear. Let me give an example. This would be a local example for a business that’s trying to perform well inside of a small city, a medium town or so. This is actually, despite the fact that it’s drawn poorly, the locations of several Italian restaurants in South Bend, Indiana.

So as you can see, each little red one identifies a different Italian restaurant, and the centroid of the city is right here, this little green star. Well, there’s a problem. If you were to collect a SERP this way, you would be influenced dramatically by this handful of Italian restaurants right there in the center of the city.

But the problem with that is that these blue circles that I’ve drawn actually represent areas of increased population density. You see most cities, they have a populous downtown, but they also have around the outside suburban areas which are just as population dense or close to as population dense.

At the same time, they don’t get represented because they’re not in the middle of the city. So what do we do? How do we get a better representation of what the average person in that city would see? 

3. Sampled search succeeds

Well, the answer is what we call sampled search. There are lots of ways to go about it.

Right now, the way we’re doing it in particular is looking at the centroids of clusters of zip codes that are overlapping inside a particular city. 

As an example, although not exactly what would happen inside of Local Market Analytics, each one of these purple stars would represent different latitudes and longitudes that we would select in order to grab a search engine result and then blend them together in a way based on things like population density or proximity issues, and give us back a result that is much more like the average searcher would see than what the one person standing in the center part of the city would see.

We know that this works better because it correlates more with local search traffic than does the centroid search. Of course, there are other ways we could go about this. For example, instead of using geography, we could use population density specifically, and we can do a lot better job in identifying exactly what the average searcher would see.
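To make the blending idea concrete, here’s a hedged sketch in Python — not Moz’s actual implementation — that averages each URL’s position across sample points, weighted by population. The sample points and weights are invented:

```python
from collections import defaultdict

# Hypothetical sample points: each has a population weight and the
# rankings observed from that latitude/longitude (URL -> position).
samples = [
    {"weight": 120_000, "ranks": {"a.com": 1, "b.com": 2, "c.com": 3}},
    {"weight": 80_000,  "ranks": {"b.com": 1, "a.com": 2, "c.com": 3}},
    {"weight": 50_000,  "ranks": {"c.com": 1, "b.com": 2, "a.com": 3}},
]

# Population-weighted average position per URL.
totals = defaultdict(float)
total_weight = sum(s["weight"] for s in samples)
for s in samples:
    for url, pos in s["ranks"].items():
        totals[url] += s["weight"] * pos

# Sort URLs by blended (weighted average) position, best first.
blended = sorted((total / total_weight, url) for url, total in totals.items())
for avg_pos, url in blended:
    print(f"{url}: blended position {avg_pos:.2f}")
```

Notice that b.com edges out a.com in the blend even though a.com ranks #1 at the single most populous sample point — exactly the kind of nuance a single centroid query can’t capture.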

But this isn’t just a local problem. It isn’t just for companies that are in cities. It’s for any website that wants to rank anywhere in the United States, including those that just want to rank generically across the entire country. You see, right now, the way that national SERPs tend to be collected is by adding a UULE of the dead center of the United States of America.

Now I think pretty much everybody here can understand why that’s a very poor representation of what the average person in the United States would see. But if we must get into it, as you can imagine, the center part of the United States is not population-dense.

We find population areas throughout the coastlines for the most part that have a lot more people in them. It would make a lot better sense to sample search results from all sorts of different locations, both rural and urban, in order to identify what the average person in the United States would see.



Centroid search delivers a myopic view of one very specific area, whereas sampled search can give you a blended model that is much more like what the average person in any country, county, city, or even neighborhood would see. So I actually think this is the model that SERP collection in general will be moving to in the future.

The future of SERPs

If we continue to rely on this centroid method, we’re going to continue to deliver results to our customers that just aren’t accurate and simply aren’t valuable. But by using the sampled model, we’ll be able to deliver our customers a much higher-quality experience: a SERP blended in a way that represents the traffic they’re actually going to get. In doing so, we’ll finally solve, at least to a certain degree, this problem of personalization.

Now I look forward to Moz implementing this across the board. Right now you can get it in Local Market Analytics. I hope that other organizations follow suit, because this kind of quality improvement in SERP collection is the type of quality that’s demanded of an industry that uses technology to improve businesses’ performance. Without quality, we might as well not be doing it at all.

Thanks for hearing me out. I’d like to hear what you have to say in the comments, and in the SERPs as well, and hopefully we’ll be able to talk through some more ideas on quality. Looking forward to it. Thanks again.

Video transcription by Speechpad.com



Wednesday, March 18, 2020

How to Query the Google Search Console API

Posted by briangormanh

If you’ve been an SEO for even a short time, you’re likely familiar with Google Search Console (GSC). It’s a valuable tool for getting information about your website and its performance in organic search. That said, it does have its limitations.

In this article, you’ll learn how to get better-connected data out of Google Search Console as well as increase the size of your exports by 400%.

Google Search Console limitations

While GSC has a number of sections, we’ll be focusing on the “Performance” report. From the GSC dashboard, there are two ways you can access this report:

Once inside the “Performance” report, data for queries and pages can be accessed:

This reveals one of the issues with GSC: Query and page data is separated.

In other words, if I want to see the queries a specific page is ranking for, I have to first click “Pages,” select the page, and then click “back” to “Queries.” It’s a very cumbersome experience.

The other (two-part) issue is with exporting:

  • Performance data for queries and pages must be exported separately.
  • Exports are limited to 1,000 rows.

We’ll look to solve these issues by utilizing the GSC API.

What is the Google Search Console API?

Now we know the GSC user interface does have limitations: Connecting query data with page data is tricky, and exports are limited.

If the GSC UI represents the factory default, the GSC API represents our custom settings. It takes a bit more effort, but gives us more control and opens up more possibilities (at least in the realm of query and page data).

The GSC API is a way for us to connect to the data within our account, make more customized requests, and get more customized output. We can even bypass those factory default settings like exports limited to 1,000 rows, for instance.

Why use it?

Remember how I said earlier that query and page data is separated in the “vanilla” GSC UI? Well, with the API, we can connect query data with the page that query ranks for, so no more clicking back and forth and waiting for things to load.

Additionally, we saw that exports are limited to 1,000 rows. With the API, we can request up to 5,000 rows, an increase of 400%!

So let’s hook in, make our request, and get back a more robust and meaningful data set.

Setup

Log in to the appropriate GSC account on this page (upper right corner). For instance, if my website is example.com and I can view that Search Console account under admin@email.com, that’s the account I’ll sign into.

Enter the URL of the appropriate GSC account:

Set up your request:

  1. Set startDate. This should be formatted as: YYYY-MM-DD.
  2. Set endDate.
  3. Set dimensions. A dimension can be:
    • query
    • page
    • device
    • and/or country
  4. Set filters (optional). A filter must include:
    • dimension (a dimension can be: query, page, device, or country)
    • operator (an operator can be: contains, notContains, equals, notEquals)
    • expression (an expression can be any value associated with the dimension)
  5. Set the rowLimit. With the GSC API, you can request up to 5,000 rows!

The page shared in step one makes all of this setup pretty easy, but it can be tedious and even confusing for some. I’ve done all the fussing for you and have created JSON you can edit quickly and easily to get the API return you’d like.

Unfiltered request

The following request will be unfiltered. We’ll set our preferred dates, dimensions, and a row limit, and then make our request.

The order in which you place your dimensions is the order in which they’ll be returned.

The API will return data for desktop, mobile, and tablet, separated out. The numbers you see in the GSC user interface — clicks, for instance — are an aggregate of all three (unless you apply device filtering).

Remember, your dimensions can also include “country” if you’d like.

{
  "startDate": "2019-11-01",
  "endDate": "2020-01-31",
  "dimensions": ["query", "page", "device"],
  "rowLimit": 3000
}

Filtered request

This version of our request will include filters in order to be more specific about what is returned.

Filters are stated as dimension/operator/expression. Here are some examples to show what’s possible:

It looks like you can only apply one filter per dimension, just like in the normal GSC user interface, but if you know differently, let us know in the comments!

{
  "startDate": "2019-11-01",
  "endDate": "2020-01-31",
  "dimensions": ["query", "page", "device"],
  "dimensionFilterGroups": [
    {
      "filters": [
        {
          "dimension": "device",
          "operator": "notContains",
          "expression": "tablet"
        }
      ]
    }
  ],
  "rowLimit": 3000
}

    Choose a template, unfiltered or filtered, and fill in your custom values (anything after a colon should be updated as your own value, unless you like my presets).

    Execute the request

    So there you have it! Two request templates for you to choose from and edit to your liking. Now it’s time to make the request. Click into the “Request body”, select all, and paste in your custom JSON:

    This is where you could manually set up your request keys and values, but as I stated earlier, this can be tedious and a little confusing, so I’ve done that work for you.

    Scroll down and click “Execute.” You may be prompted to sign in here as well.

    If everything was entered correctly and the request could be satisfied, the API will return your data. If you get an error, audit your request first, then any other steps and inputs if necessary.

    Click into the box in the lower right (this is the response from the API), select all, and copy the information.
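    If you’d rather script this step than paste JSON into the explorer page, the request body is just a dictionary. A small helper like the one below (the function name is my own, not an official API name) assembles the same body, which you would then pass to the official google-api-python-client:

    ```python
    import json

    def build_gsc_request(start_date, end_date, dimensions, row_limit=3000, filters=None):
        """Assemble a Search Analytics request body (hypothetical helper name)."""
        body = {
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": list(dimensions),
            "rowLimit": row_limit,
        }
        if filters:
            # The API expects filters wrapped inside dimensionFilterGroups.
            body["dimensionFilterGroups"] = [{"filters": list(filters)}]
        return body

    body = build_gsc_request(
        "2019-11-01", "2020-01-31", ["query", "page", "device"],
        filters=[{"dimension": "device", "operator": "notContains", "expression": "tablet"}],
    )
    print(json.dumps(body, indent=2))

    # With the official Python client, the call would then look something like:
    # service.searchanalytics().query(siteUrl="https://example.com/", body=body).execute()
    ```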

    Convert from JSON to CSV

    Excel or Sheets will be a much better way to work with the data, so let’s convert our JSON output to CSV.

    Use a converter like this one and paste in your JSON output. You can now export a CSV. Update your column headers as desired.
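    If you prefer to skip the online converter, the conversion is straightforward to script, since every row in the response carries its dimension values in a “keys” list followed by the four metrics. A minimal sketch (the response data here is invented for illustration):

    ```python
    import csv
    import io

    # A trimmed sample of the API's response shape (the numbers are invented).
    response = {
        "rows": [
            {"keys": ["seo tips", "https://example.com/blog/", "DESKTOP"],
             "clicks": 12, "impressions": 340, "ctr": 0.035, "position": 8.2},
            {"keys": ["seo tips", "https://example.com/blog/", "MOBILE"],
             "clicks": 7, "impressions": 150, "ctr": 0.047, "position": 9.1},
        ]
    }

    def rows_to_csv(rows, dimension_names):
        """Flatten GSC API rows into CSV text: one column per dimension and metric."""
        out = io.StringIO()
        writer = csv.writer(out)
        writer.writerow(dimension_names + ["clicks", "impressions", "ctr", "position"])
        for row in rows:
            writer.writerow(row["keys"] + [row["clicks"], row["impressions"],
                                           row["ctr"], row["position"]])
        return out.getvalue()

    csv_text = rows_to_csv(response["rows"], ["query", "page", "device"])
    print(csv_text)
    ```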

    Query your own data

    Most SEOs are pretty comfortable in Excel, so you can now query your request output any way you’d like.

    One of the most common tasks performed is looking for data associated with a specific set of pages. This is done by adding a sheet with your page set and using VLOOKUP to indicate a match.

    The API output being in a spreadsheet also allows for the most common actions in Excel like sorting, filtering, and chart creation.
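    The page-set lookup described above (VLOOKUP in Excel or Sheets) can also be done in plain Python if you’d rather stay scripted end to end. The page list and row data below are hypothetical:

    ```python
    # Find all API rows whose page belongs to a target set of pages.
    target_pages = {"https://example.com/blog/", "https://example.com/pricing/"}

    rows = [
        {"keys": ["seo tips", "https://example.com/blog/", "DESKTOP"], "clicks": 12},
        {"keys": ["pricing plans", "https://example.com/pricing/", "MOBILE"], "clicks": 5},
        {"keys": ["about us", "https://example.com/about/", "DESKTOP"], "clicks": 2},
    ]

    # keys[1] is the page, because dimensions were requested in query/page/device order.
    matched = [row for row in rows if row["keys"][1] in target_pages]
    total_clicks = sum(row["clicks"] for row in matched)
    print(len(matched), total_clicks)
    ```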

    Get more out of Google Search Console

    GSC offers important data for SEOs, and the GSC API output offers not only more of that data, but in a format that is far less cumbersome and more cohesive.

    Today, we overcame two obstacles we often face in the standard GSC user interface: the query/page connection and limited exports. My hope is that utilizing the Google Search Console API will take your analyses and insights to the next level.

    While my JSON templates will cover the most common scenarios in terms of what you’ll be interested in requesting, Google does offer documentation that covers a bit more ground if you’re interested.

    Do you have another way of using the GSC API? Is there another API you commonly use as an SEO? Let me know in the comments!


    Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


    How to Query the Google Search Console API published first on http://goproski.com/

    Wednesday, March 11, 2020

    We Need to Talk About Google's “People Also Ask”: A Finance Case Study

    Posted by barryloughran

    For a while now, I’ve been disappointed with the People Also Ask (PAAs) feature in Google’s search results. My disappointment is not due to the vast amount of space they take up on the SERPs (that’s another post entirely), but more that the quality is never where I expect it to be.

    Google has been running PAAs since April 2015 and they are a pretty big deal. MozCast is currently tracking PAAs (Related Questions) across 90% of all searches, which is more than any other SERP feature.

    The quality issue I keep running into is that I still find obscure PAA questions, along with results or content from other countries.

    When I run searches that have a universal answer, such as “can you eat raw chicken?”, the answer is universally correct so there is no issue with the results. But when I run a search that should return local (UK) content, such as “car insurance”, I’m finding a heavy influence from the US — especially around YMYL queries. 


    I wanted to find out how much of an issue this actually is, so my team and I analyzed over 1,000 of the most-searched-for keywords in the finance industry, where we would expect UK PAA results.

    Before we dig in, my fundamental question going into this research was: “Should a financial query originating in the UK, whose products are governed within UK regulations, return related questions that contain UK content?”

    I believe that they should and I hope that by the end of this post, you agree, too.

    Our methodology

    To conduct our analysis, we followed these steps:

    1. Tag keywords by category and sub-category:

    2. Remove keywords where you would expect a universal result, e.g. “insurance definition”.

    3. Extract PAAs and the respective ranking URLs using STAT.

    4. Identify country origin through manual review: are we seeing correct results?

    Our findings

    55.1% of the 4,507 available financial PAAs returned non-UK content. US content was served 50.5% of the time, while the remaining 4.6% was made up of sites from India, Australia, Canada, Ireland, South Africa, Spain, and Singapore.

    Results by category

    Breaking it down by category, we see that personal finance keywords bring back a UK PAA 33.72% of the time, insurance keywords 52.10%, utilities keywords 64.89%, and business keywords 38.76%.

    Personal finance

    Digging into the most competitive products in the UK, personal finance, we found that a significant percentage of PAAs brought back US or Indian content in the results.

    Out of the 558 personal finance keywords, 186 keywords didn’t bring back a single UK PAA result, including:

    • financial advisor
    • first credit card
    • best car loans
    • balance transfer cards
    • how to buy a house
    • best payday loans
    • cheap car finance
    • loan calculator

    Credit cards

    Only 17.41% of credit card PAA results were UK-specific, with the US taking just over four out of every five. That’s huge.

    Another surprising find is that 61 out of 104 credit card keywords didn’t bring back a single UK PAA. I find this remarkable given the fact that the credit card queries originated in the UK.

    Loans

    Only 15.8% of searches returned a UK PAA result with over 75% coming from the US. We also saw highly-competitive and scrutinized searches for keywords like “payday loans” generate several non-UK results.

    Mortgages

    While the UK holds the majority of PAA results for mortgage-related keywords at 53.53%, there are still some major keywords (like “mortgages”) that only bring back a single UK result. If you’re searching for “mortgages” in the UK, then you want to see information about UK mortgages, but instead Google serves up mainly US results.

    Insurance

    Insurance results weren’t as bad as personal finance. However, there was still a big swing towards the US for some products, such as life insurance.

    Out of the 350 insurance keywords tested, there were 64 keywords that didn’t bring back a single UK PAA result, including:

    • pet insurance
    • cheap home insurance
    • life insurance comparison
    • car insurance for teens
    • cheap dog insurance
    • types of car insurance

    Car insurance

    60.54% of car insurance PAA results were UK-specific, with the US taking 36.97%. Out of the 132 keywords in this sub-category, UK sites were present for 118, which is better than the personal finance sub-categories.

    Home insurance

    As one of the most competitive spaces in the finance sector, it was really surprising to see that only 56.25% of results for home insurance queries returned a UK PAA. There are nuances to policies across different markets, so this is a frustrating and potentially harmful experience for searchers.

    Utilities

    Although we see a majority of PAAs in this keyword category return UK results, there are quite a few more specific searches for which you would absolutely be looking for a UK result (e.g. “unlimited data phone contracts”) but that bring back only one UK result.

    One interesting find is that this UKPower page has captured 35 PAAs for the 49 keywords it ranks for. That’s an impressive 71.43% — the highest rating we’ve seen across our analysis.

    Business

    At the time of our analysis, we found that 36.7% of business-related PAAs were from the UK. One of the keywords with the lowest representation in this category was "business loans", which generated only 6.25% UK results. While the volume of keywords is smaller in this category, there is more potential for harm in serving international content for queries relating to UK businesses.

    What pages generate the most PAA results?

    To make this post a little more actionable, I aggregated which URLs generated the most PAAs across some of the most competitive financial products in the UK. 

    Ironically, four out of the top 10 were US-based (cars.news.com manages to generate 32 PAAs across one of the most competitive industries in UK financial searches: car insurance). A hat tip to ukpower.co.uk, which ranked #1 in our list, generating 35 results in the energy space.

    To summarize the above analysis, it’s clear that there is too much dominance from non-UK sites in finance searches. While there are a handful of UK sites doing well, there are UK queries being searched for that are bringing back clearly irrelevant information.

    As an industry, we have been pushed to improve quality — whether it’s increasing our relevancy or the expertise of our content — so findings like these show that Google could be doing more themselves.

    What does this mean for your SEO strategy?

    For the purpose of this research, we only looked at financial terms, so whilst we can’t categorically say this is the same for all industries, if Google is missing this much across financial YMYL terms then it doesn’t look good for other categories.

    My advice would be that if you are investing any time optimizing for PAAs, then you should spend your time elsewhere, for now, since the cards in finance niches are stacked against you.

    Featured Snippets are still the prime real estate for SEOs and (anecdotally, anyway) don’t seem to suffer from this geo-skew like PAAs do, so go for Featured Snippets instead.

    Have you got any thoughts on the quality of PAAs across your SERPs? Let me know in the comments below!





    Monday, March 9, 2020

    Crawled — Currently Not Indexed: A Coverage Status Guide

    Posted by cml63

    Google’s Index Coverage report is absolutely fantastic because it gives SEOs clearer insights into Google’s crawling and indexing decisions. Since its roll-out, we use it almost daily at Go Fish Digital to diagnose technical issues at scale for our clients.

    Within the report, there are many different “statuses” that provide webmasters with information about how Google is handling their site content. While many of the statuses provide some context around Google’s crawling and indexation decisions, one remains unclear: “Crawled — currently not indexed”.

    Since seeing the “Crawled — currently not indexed” status reported, we’ve heard from several site owners inquiring about its meaning. One of the benefits of working at an agency is being able to get in front of a lot of data, and because we’ve seen this message across multiple accounts, we’ve begun to pick up on trends from reported URLs.

    Google’s definition

    Let’s start with the official definition. According to Google’s official documentation, this status means: “The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling.”

    So, essentially what we know is that:

    1. Google is able to access the page
    2. Google took time to crawl the page
    3. After crawling, Google decided not to include it in the index

    The key to understanding this status is to think of reasons why Google would “consciously” decide against indexation. We know that Google isn’t having trouble finding the page, but for some reason it feels users wouldn’t benefit from finding it.

    This can be quite frustrating, as you might not know why your content isn’t getting indexed. Below I’ll detail some of the most common reasons our team has seen to explain why this mysterious status might be affecting your website.

    1. False positives

    Priority: Low

    Our first step is to always perform a few spot checks of URLs flagged in the “Crawled — currently not indexed” section for indexation. It’s not uncommon to find URLs that are getting reported as excluded but turn out to be in Google’s index after all.

    For example, here’s a URL that’s getting flagged in the report for our website: https://gofishdigital.com/meetup/

    However, when using a site search operator, we can see that the URL is actually included in Google’s index. You can do this by prefixing the URL with “site:”.

    If you’re seeing URLs reported under this status, I recommend starting by using the site search operator to determine whether the URL is indexed or not. Sometimes, these turn out to be false positives.

    Solution: Do nothing! You’re good.

    2. RSS feed URLs

    Priority: Low

    This is one of the most common examples that we see. If your site utilizes an RSS feed, you might be finding URLs appearing in Google’s “Crawled — currently not indexed” report. Many times these URLs will have the “/feed/” string appended to the end. They can appear in the report like this:

    Google finds these RSS feed URLs linked from the primary page, often via a “rel=alternate” element. WordPress plugins such as Yoast can automatically generate these URLs.

    Solution: Do nothing! You’re good.

    Google is likely selectively choosing not to index these URLs, and for good reason. If you navigate to an RSS feed URL, you’ll see an XML document like the one below:

    While this XML document is useful for RSS feeds, there’s no need for Google to include it in the index. This would provide a very poor experience as the content is not meant for users.

    3. Paginated URLs

    Priority: Low

    Another extremely common reason for the “Crawled — currently not indexed” exclusion is pagination. We will often see a good number of paginated URLs appear in this report. Here we can see some paginated URLs appearing from a very large e-commerce site:

    Solution: Do nothing! You’re good.

    Google will need to crawl through paginated URLs to get a complete crawl of the site. This is its pathway to content such as deeper category pages or product description pages. However, while Google uses the pagination as a pathway to access the content, it doesn’t necessarily need to index the paginated URLs themselves.

    If anything, make sure that you don’t do anything to impact the crawling of the individual pagination. Ensure that all of your pagination contains a self-referential canonical tag and is free of any “nofollow” tags. This pagination acts as an avenue for Google to crawl other key pages on your site so you’ll definitely want Google to continue crawling it.

    4. Expired products

    Priority: Medium

    When spot-checking individual pages that are listed in the report, a common problem we see across clients is URLs that contain text noting “expired” or “out of stock” products. Especially on e-commerce sites, it appears that Google checks to see the availability of a particular product. If it determines that a product is not available, it proceeds to exclude that product from the index.

    This makes sense from a UX perspective as Google might not want to include content in the index that users aren’t able to purchase.

    However, if these products are actually available on your site, this could result in a lot of missed SEO opportunity. By excluding the pages from the index, your content isn’t given a chance to rank at all.

    In addition, Google doesn’t just check the visible content on the page. There have been instances where we’ve found no indication within the visible content that the product is not available. However, when checking the structured data, we can see that the “availability” property is set to “OutOfStock”.

    It appears that Google is taking clues from both the visible content and structured data about a particular product’s availability. Thus, it’s important that you check both the content and schema.

    Solution: Check your inventory availability.

    If you’re finding products that are actually available getting listed in this report, you’ll want to check all of your products that may be incorrectly listed as unavailable. Perform a crawl of your site and use a custom extraction tool like Screaming Frog’s to scrape data from your product pages.

    For instance, if you want to see at scale all of your URLs with schema set to “OutOfStock”, you can set the “Regex” to: "availability":"http://schema.org/OutOfStock". This should automatically scrape all of the URLs with this property:

    You can export this list and cross-reference with inventory data using Excel or business intelligence tools. This should quickly allow you to find discrepancies between the structured data on your site and products that are actually available. The same process can be repeated to find instances where your visible content indicates that products are expired.
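    The same check can be sketched in Python if you already have the page HTML, for example, pulled by your own crawler. This is a minimal, assumption-laden sketch (the sample HTML is hypothetical, and a real JSON-LD parser would need to handle graphs and arrays):

    ```python
    import json
    import re

    # Hypothetical product page containing schema.org structured data.
    html = '''
    <script type="application/ld+json">
    {"@type": "Product", "name": "Widget",
     "offers": {"@type": "Offer", "availability": "http://schema.org/OutOfStock"}}
    </script>
    '''

    def availability_from_html(html_text):
        """Pull the schema.org availability value out of a JSON-LD block, if any."""
        match = re.search(r'<script type="application/ld\+json">(.*?)</script>',
                          html_text, re.S)
        if not match:
            return None
        data = json.loads(match.group(1))
        return data.get("offers", {}).get("availability")

    print(availability_from_html(html))
    ```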

    5. 301 redirects

    Priority: Medium

    One interesting example we’ve seen appear under this status is destination URLs of redirected pages. Often, we’ll see that Google is crawling the destination URL but not including it in the index. However, upon looking at the SERP, we find that Google is indexing a redirecting URL. Since the redirecting URL is the one indexed, the destination URL is thrown into the “Crawled — currently not indexed” report.

    The issue here is that Google may not be recognizing the redirect yet. As a result, it sees the destination URL as a “duplicate” because it is still indexing the redirecting URL.

    Solution: Create a temporary sitemap.xml.

    If this is occurring on a large number of URLs, it is worth taking steps to send stronger consolidation signals to Google. This issue could indicate that Google isn’t recognizing your redirects in a timely manner, leading to unconsolidated content signals.

    One option might be setting up a “temporary sitemap”. This is a sitemap that you can create to expedite the crawling of these redirected URLs. This is a strategy that John Mueller has previously recommended.

    To create one, you will need to reverse-engineer redirects that you have created in the past:

    1. Export all of the URLs from the “Crawled — currently not indexed” report.
    2. Match them up in Excel with redirects that have been previously set up.
    3. Find all of the redirects that have a destination URL in the “Crawled — currently not indexed” bucket.
    4. Create a static sitemap.xml of these URLs with Screaming Frog. 
    5. Upload the sitemap and monitor the “Crawled — currently not indexed” report in Search Console.

    The goal here is for Google to crawl the URLs in the temporary sitemap.xml more frequently than it otherwise would have. This will lead to faster consolidation of these redirects.
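    If Screaming Frog isn’t handy for step 4, a static sitemap.xml is also simple to generate yourself once you have the destination URL list. A minimal sketch (the URLs are hypothetical):

    ```python
    import xml.etree.ElementTree as ET

    # Destination URLs matched up in steps 1-3 (hypothetical examples).
    urls = [
        "https://example.com/new-page-a/",
        "https://example.com/new-page-b/",
    ]

    def build_sitemap(url_list):
        """Build a minimal sitemap.xml document from a list of URLs."""
        urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
        for u in url_list:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
        return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(urlset, encoding="unicode")

    sitemap_xml = build_sitemap(urls)
    print(sitemap_xml)
    ```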

    6. Thin content

    Priority: Medium

    Sometimes we see URLs included in this report that are extremely thin on content. These pages may have all of the technical elements set up correctly and may even be properly internally linked to. However, when Google runs into these URLs, there is very little actual content on the page. Below is an example of a product category page with very little unique text:

    This product listing page was flagged as “Crawled — Currently Not Indexed”. This may be due to very thin content on the page.


    This page is likely either too thin for Google to think it’s useful or there is so little content that Google considers it to be a duplicate of another page. The result is Google removing the content from the index.

    Here is another example: Google was able to crawl a testimonial component page on the Go Fish Digital site (shown above). While this content is unique to our site, Google probably doesn’t believe that the single sentence testimonial should stand alone as an indexable page.

    Once again, Google has made the executive decision to exclude the page from the index due to a lack of quality.

    Solution: Add more content or adjust indexation signals.

    Next steps will depend on how important it is for you to index these pages.

    If you believe that the page should definitely be included in the index, consider adding additional content. This will help Google see the page as providing a better experience to users. 

    If indexation is unnecessary for the content you’re finding, the bigger question becomes whether or not you should take additional steps to strongly signal that this content shouldn’t be indexed. The “Crawled — currently not indexed” report is indicating that the content is eligible to appear in Google’s index, but Google is electing not to include it.

    There also could be other low quality pages to which Google is not applying this logic. You can perform a general “site:” search to find indexed content that meets the same criteria as the examples above. If you’re finding that a large number of these pages are appearing in the index, you might want to consider stronger initiatives to ensure these pages are removed from the index such as a “noindex” tag, 404 error, or removing them from your internal linking structure completely.

    7. Duplicate content

    Priority: High

    When evaluating this exclusion across a large number of clients, this is the highest priority we’ve seen. If Google sees your content as duplicate, it may crawl the content but elect not to include it in the index. This is one of the ways that Google avoids SERP duplication. By removing duplicate content from the index, Google ensures that users have a larger variety of unique pages to interact with. Sometimes the report will label these URLs with a “Duplicate” status (“Duplicate, Google chose different canonical than user”). However, this is not always the case.

    This is a high priority issue, especially on a lot of e-commerce sites. Key pages such as product description pages often include the same or similar product descriptions as many other results across the Web. If Google recognizes these as too similar to other pages internally or externally, it might exclude them from the index altogether.

    Solution: Add unique elements to the duplicate content.

    If you think that this situation applies to your site, here’s how you test for it:

    1. Take a snippet of the potential duplicate text and paste it into Google.
    2. In the SERP URL, append the following string to the end: “&num=100”. This will show you the top 100 results.
    3. Use your browser’s “Find” function to see if your result appears in the top 100 results. If it doesn’t, your result might be getting filtered out of the index.
    4. Go back to the SERP URL and append the following string to the end: “&filter=0”. This should show you Google’s unfiltered result (thanks, Patrick Stox, for the tip).
    5. Use the “Find” function to search for your URL. If you see your page now appearing, this is a good indication that your content is getting filtered out of the index.
    6. Repeat this process for a few URLs with potential duplicate or very similar content you’re seeing in the “Crawled — currently not indexed” report.
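    The URL surgery in steps 2 and 4 can be scripted if you’re checking many snippets. This sketch only builds the two SERP URLs to open in a browser (the snippet text is hypothetical):

    ```python
    from urllib.parse import quote_plus

    # A snippet from a suspected duplicate page (hypothetical text).
    snippet = 'some potentially duplicate product description text'

    serp_url = "https://www.google.com/search?q=" + quote_plus('"%s"' % snippet)
    top_100_url = serp_url + "&num=100"         # step 2: show the top 100 results
    unfiltered_url = top_100_url + "&filter=0"  # step 4: show unfiltered results
    print(unfiltered_url)
    ```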

    If you’re consistently seeing your URLs getting filtered out of the index, you’ll need to take steps to make your content more unique.

    While there is no one-size-fits-all standard for achieving this, here are some options:

    1. Rewrite the content to be more unique on high-priority pages.
    2. Use dynamic properties to automatically inject unique content onto the page.
    3. Remove large amounts of unnecessary boilerplate content. Pages with more templated text than unique text might be getting read as duplicate.
    4. If your site is dependent on user-generated content, inform contributors that all provided content should be unique. This may help prevent instances where contributors use the same content across multiple pages or domains.

    8. Private-facing content

    Priority: High

    There are some instances where Google’s crawlers gain access to content that they shouldn’t have access to. If Google is finding dev environments, it could include those URLs in this report. We’ve even seen examples of Google crawling a particular client’s subdomain that is set up for JIRA tickets. This caused an explosive crawl of the site, which focused on URLs that shouldn’t ever be considered for indexation.

    The issue here is that Google’s crawl of the site isn’t focused, and it’s spending time crawling (and potentially indexing) URLs that aren’t meant for searchers. This can have massive ramifications for a site’s crawl budget.

    Solution: Adjust your crawling and indexing initiatives.

    This solution is going to be entirely dependent on the situation and what Google is able to access. Typically, the first thing you want to do is determine how Google is able to discover these private-facing URLs, especially if it’s via your internal linking structure.

    Start a crawl from the home page of your primary subdomain and see if any undesirable subdomains are able to be accessed by Screaming Frog through a standard crawl. If so, it’s safe to say that Googlebot might be finding those exact same pathways. You’ll want to remove any internal links to this content to cut Google’s access.

    The next step is to check the indexation status of the URLs that should be excluded. Is Google sufficiently keeping all of them out of the index, or were some caught in the index? If Google isn’t indexing a large amount of this content, you might consider adjusting your robots.txt file to block crawling immediately. If not, “noindex” tags, canonicals, and password protected pages are all on the table.

    Case study: duplicate user-generated content

    For a real-world example, this is an instance where we diagnosed the issue on a client site. This client is similar to an e-commerce site as a lot of their content is made up of product description pages. However, these product description pages are all user-generated content.

    Essentially, third parties are allowed to create listings on this site. However, the third parties were often adding very short descriptions to their pages, resulting in thin content. The issue occurring frequently was that these user-generated product description pages were getting caught in the “Crawled — currently not indexed” report. This resulted in missed SEO opportunity as pages that were capable of generating organic traffic were completely excluded from the index.

    When going through the process above, we found that the client’s product description pages were quite thin in terms of unique content. The pages that were getting excluded only appeared to have a paragraph or less of unique text. In addition, the bulk of on-page content was templated text that existed across all of these page types. Since there was very little unique content on the page, the templated content might have caused Google to view these pages as duplicates. The result was that Google excluded these pages from the index, citing the “Crawled — currently not indexed” status.

    To solve for these issues, we worked with the client to determine which of the templated content didn’t need to exist on each product description page. We were able to remove the unnecessary templated content from thousands of URLs. This resulted in a significant decrease in “Crawled — currently not indexed” pages as Google began to see each page as more unique.

    Conclusion

    Hopefully, this helps search marketers better understand the mysterious “Crawled — currently not indexed” status in the Index Coverage report. Of course, there are likely many other reasons that Google would choose to categorize URLs like this, but these are the most common instances we’ve seen with our clients to date.

    Overall, the Index Coverage report is one of the most powerful tools in Search Console. I would highly encourage search marketers to get familiar with the data and reports as we routinely find suboptimal crawling and indexing behavior, especially on larger sites. If you’ve seen other examples of URLs in the “Crawled — currently not indexed” report, let me know in the comments!



