Friday, November 13, 2020

The Energy Update – Week of November 9, 2020

This week the team highlights recent articles focusing on Governor Wolf’s attempt to push the Regional Greenhouse Gas Initiative into law in Pennsylvania despite pushback from the state legislature, as well as the hidden impacts of renewable energy. Plus, get a preview of the latest episode of the Plugged In podcast featuring Nick DeIuliis of CNX Resources.

Links

• ARTICLE Pennsylvania Governor Orders RGGI Implementation despite the Legislature

• ARTICLE The Environmental Impact of Lithium Batteries

• ARTICLE America’s Only Offshore Wind Farm Will Go Offline for Expensive Repair

• PODCAST Plugged In Podcast #63: Nick DeIuliis on Energy, Business, and Politics

• BOOK The Leech: An Indictment of the Evil Sapping America, Depleting Free Enterprise, and Bleeding Producers

 

The post The Energy Update – Week of November 9, 2020 appeared first on IER.

How Do Sessions Work in Google Analytics? — Best of Whiteboard Friday

Posted by Tom.Capper

Google Analytics data is used to support tons of important work, ranging from our everyday marketing reporting, all the way to investment decisions. To that end, it's integral that we're aware of just how that data works. In this Best of Whiteboard Friday edition, Tom Capper explains how the sessions metric in Google Analytics works, several ways that it can have unexpected results, and as a bonus, how sessions affect the time on page metric (and why you should rethink using time on page for reporting).

Editor’s note: Tom Capper is now an independent SEO consultant. This video is from 2018, but the same principles hold up today. There is only one minor caveat: the words "user" and "browser" are used interchangeably early in the video, which still holds mostly true. Google is trying to further push multi-device users as a concept with Google Analytics 4, but that still relies on users being logged in, as well as on extra tracking setup. For most sites most of the time, neither of these conditions holds.

How do sessions work in Google Analytics?


Video Transcription

Hello, Moz fans, and welcome to another edition of Whiteboard Friday. I am Tom Capper. I am a consultant at Distilled, and today I'm going to be talking to you about how sessions work in Google Analytics. Obviously, all of us use Google Analytics. Pretty much all of us use Google Analytics in our day-to-day work.

Data from the platform is used these days in everything from investment decisions to press reporting to the actual marketing that we use it for. So it's important to understand the basic building blocks of these platforms. Up here I've got the absolute basics. So in the blue squares I've got hits being sent to Google Analytics.

So when you first put Google Analytics on your site, you get that bit of tracking code, you put it on every page, and what that means is when someone loads the page, it sends a page view. So those are the ones I've marked P. So we've got page view and page view and so on as you're going around the site. I've also got events with an E and transactions with a T. Those are two other hit types that you might have added.

The job of Google Analytics is to take all this hit data that you're sending it and try and bring it together into something that actually makes sense as sessions. So they're grouped into sessions that I've put in black, and then if you have multiple sessions from the same browser, then that would be a user that I've marked in pink. The issue here is it's kind of arbitrary how you divide these up.

These eight hits could be one long session. They could be eight tiny ones or anything in between. So I want to talk today about the different ways that Google Analytics will actually split up those hit types into sessions. So over here I've got some examples I'm going to go through. But first I'm going to go through a real-world example of a brick-and-mortar store, because I think that's what they're trying to emulate, and it kind of makes more sense with that context.

Brick-and-mortar example

So in this example, say a supermarket, we enter as passing trade. That's going to be our source. Then our entrance is the lobby of the supermarket when we walk in. We pass from there to the beer aisle to the cashier, or at least I do. So that's one big, long session with the source passing trade. That makes sense.

In the case of a brick-and-mortar store, it's not too difficult to divide that up and try and decide how many sessions are going on here. There's not really any ambiguity. In the case of websites, when you have people leaving their keyboard for a while or leaving the computer on while they go on holiday or just having the same computer over a period of time, it becomes harder to divide things up, because you don't know when people are actually coming and going.

So what they've tried to do is in the very basic case something quite similar: arrive by Google, category page, product page, checkout. Great. We've got one long session, and the source is Google. Okay, so what are the different ways that that might go wrong or that that might get divided up?

Several things that can change the meaning of a session

1. Time zone

The first and possibly most annoying one, although it doesn't tend to be a huge issue for some sites, is that midnight in whatever time zone you've set in your Google Analytics settings can break up a session. So say we've got midnight here. This is 12:00 at night, and we happen to be browsing. We're doing some shopping quite late.

Because Google Analytics won't allow a session to have two dates, this is going to be one session with the source Google, and this is going to be one session and the source will be this page. So this is a self-referral unless you've chosen to exclude that in your settings. So not necessarily hugely helpful.

2. Half-hour cutoff for "coffee breaks"

Another thing that can happen is you might go and make a cup of coffee. So ideally, if you went and had a cup of coffee while you're in Tesco or a supermarket that's popular in whatever country you're from, you might want to consider that one long session. Google has made the executive decision that we're actually going to have a cutoff of half an hour by default.

If you leave for half an hour, then again you've got two sessions: one where the category page is the landing page and the source is Google, and one in this case where the blog is the landing page, and this would be another self-referral, because when you come back after your coffee break, you're going to click through from here to here. This time period, the 30 minutes, is actually adjustable in your settings, but most people do just leave it as it is, and there isn't really an obvious number that would make this always correct either. It's kind of, like I said earlier, an arbitrary distinction.

3. Leaving the site and coming back

The next issue I want to talk about is if you leave the site and come back. So obviously it makes sense that if you enter the site from Google, browse for a bit, and then enter again from Bing, you might want to count that as two different sessions with two different sources. However, where this gets a little murky is with things like external payment providers.

If you had to click through from the category page to PayPal to the checkout, then unless PayPal is excluded from your referral list, this would be one session with an entrance from Google, and a second session entering at the checkout with PayPal as the source. The last issue I want to talk about is not necessarily a way that sessions are divided, but a quirk of how they are.

4. Return direct sessions

If you were to enter by Google to the category page, go on holiday and then use a bookmark or something or just type in the URL to come back, then obviously this is going to be two different sessions. You would hope that it would be one session from Google and one session from direct. That would make sense, right?

But instead, what actually happens is that, because Google Analytics and most of its reports use last non-direct click attribution, that source gets carried all the way through, so you've got two sessions from Google. Again, you can change this timeout period. So those are some ways that sessions work that you might not expect.
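To make the splitting rules above a bit more concrete, here is a minimal, hypothetical Python sketch of how hits could be grouped into sessions using two of the timing rules just discussed: the 30-minute inactivity timeout and the midnight cutoff. It only illustrates the logic; it is not how Google Analytics is actually implemented, and the function and variable names are invented for the example.

```python
from datetime import datetime, timedelta

# Hypothetical sessionizer illustrating two of the rules discussed above:
# a new session starts after 30 minutes of inactivity, or when the
# calendar date changes (the "midnight" split). A sketch only, not
# Google Analytics' actual implementation.
SESSION_TIMEOUT = timedelta(minutes=30)

def group_hits_into_sessions(hits):
    """hits: list of (timestamp, hit_type) tuples, sorted by timestamp."""
    sessions = []
    current = []
    for timestamp, hit_type in hits:
        if current:
            last_time = current[-1][0]
            inactivity = timestamp - last_time
            crossed_midnight = timestamp.date() != last_time.date()
            if inactivity > SESSION_TIMEOUT or crossed_midnight:
                sessions.append(current)
                current = []
        current.append((timestamp, hit_type))
    if current:
        sessions.append(current)
    return sessions

# Example: a pageview, an event 10 minutes later, then another pageview
# 40 minutes after that -> two sessions under the 30-minute rule.
hits = [
    (datetime(2020, 11, 9, 23, 0), "pageview"),
    (datetime(2020, 11, 9, 23, 10), "event"),
    (datetime(2020, 11, 9, 23, 50), "pageview"),
]
print(len(group_hits_into_sessions(hits)))  # 2
```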

As a bonus, I want to give you some extra information about how this affects a certain metric, mainly because I want to persuade you to stop using it, and that metric is time on page.

Bonus: Three scenarios where this affects time on page

So I've got three different scenarios here that I want to talk you through, and we'll see how the time on page metric works out.

I want you to bear in mind that, basically, because Google Analytics really has very little data to work with typically, they only know that you've landed on a page, and that sent a page view and then potentially nothing else. If you were to have a single page visit to a site, or a bounce in other words, then they don't know whether you were on that page for 10 seconds or the rest of your life.

They've got no further data to work with. So what they do is they say, "Okay, we're not going to include that in our average time on page metrics." So we've got the formula of time divided by views minus exits. However, this fudge has some really unfortunate consequences. So let's talk through these scenarios.

Example 1: Intuitive time on page = actual time on page

In the first scenario, I arrive on the page. It sends a page view. Great. Ten seconds later I trigger some kind of event that the site has added. Twenty seconds later I click through to the next page on the site. In this case, everything is working as intended in a sense, because there's a next page on the site, so Google Analytics has that extra data of another page view 20 seconds after the first one. So they know that I was on here for 20 seconds.

In this case, the intuitive time on page is 20 seconds, and the actual time on page is also 20 seconds. Great.

Example 2: Intuitive time on page is higher than measured time on page

However, let's think about this next example. We've got a page view, event 10 seconds later, except this time instead of clicking somewhere else on the site, I'm going to just leave altogether. So there's no data available, but Google Analytics knows we're here for 10 seconds.

So the intuitive time on page here is still 20 seconds. That's how long I actually spent looking at the page. But the measured time or the reported time is going to be 10 seconds.

Example 3: Measured time on page is zero

The last example, I browse for 20 seconds. I leave. I haven't triggered an event. So we've got an intuitive time on page of 20 seconds and an actual time on page or a measured time on page of 0.

The interesting bit is when we then come to calculate the average time on page for this page that appeared here, here, and here, you would initially hope it would be 20 seconds, because that's how long we actually spent. But your next guess, when you look at the reported or the available data that Google Analytics has in terms of how long we're on these pages, the average of these three numbers would be 10 seconds.

So that would make some sense. What they actually do, because of this formula, is they end up with 30 seconds. So you've got the total time here, which is 30, divided by the number of views, we've got 3 views, minus 2 exits. Thirty divided by (3 minus 2), 30 divided by 1, so we've got 30 seconds as the average across these 3 sessions.

Well, the average across these three page views, sorry, for the amount of time we're spending, is longer than any of them, and it doesn't make any sense given the constituent data. So that's just one final tip: please don't use average time on page as a reporting metric.
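To make that arithmetic explicit, here is the same calculation written out as a small Python sketch, using the formula described above (total time divided by pageviews minus exits) with the numbers from these three examples. The function name is just for illustration.

```python
def average_time_on_page(total_time_seconds, pageviews, exits):
    # Formula described above: exits (including bounces) are removed
    # from the denominator because there is no timing data for them.
    return total_time_seconds / (pageviews - exits)

# The three example page views: 20s, 10s, and 0s of measured time,
# two of which were exits (no further hit was ever received for them).
total_measured = 20 + 10 + 0
print(average_time_on_page(total_measured, pageviews=3, exits=2))  # 30.0
```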

I hope that's all been useful to you. I'd love to hear what you think in the comments below. Thanks.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Thursday, November 12, 2020

Plugged In Podcast #63: Nick DeIuliis on Energy, Business, and Politics

Nick DeIuliis, President and Chief Executive Officer of CNX Resources Corporation, joins the show to discuss what the hydraulic fracturing revolution has meant for Pennsylvania, the rest of the country, and for developing economies around the world. Nick also weighs in on how energy is discussed in the public and political arenas.

Links:

Get a copy of The Leech: An Indictment of the Evil Sapping America, Depleting Free Enterprise, and Bleeding Producers

Learn more about CNX Resources

Follow Nick on Twitter

More from IER on Pennsylvania’s energy landscape

More from IER on the value of hydraulic fracturing

 

The post Plugged In Podcast #63: Nick DeIuliis on Energy, Business, and Politics appeared first on IER.

The Environmental Impact of Lithium Batteries

During the Obama-Biden administration, hydraulic fracturing was accused of causing a number of environmental problems—faucets on fire, contamination of drinking water, etc.—but the administration’s own Environmental Protection Agency could not validate those accusations. Now Biden is planning to transition the transportation sector to electric vehicles that are powered by lithium batteries and require other critical metals for which China dominates the market. Mining and processing lithium, however, is far more environmentally harmful than the alleged problems with fracking that turned out to be unfounded.

In May 2016, dead fish were found in the waters of the Liqi River, where a toxic chemical leaked from the Ganzizhou Rongda Lithium mine. Cow and yak carcasses were also found floating downstream, dead from drinking contaminated water. It was the third incident in seven years amid a sharp increase in mining activity, including operations run by China’s BYD, one of the world’s biggest suppliers of lithium-ion batteries. After the second incident in 2013, officials closed the mine, but fish started dying again when it reopened in April 2016.

Lithium prices doubled between 2016 and 2018 due to exponentially increasing demand. The lithium ion battery industry is expected to grow from 100 gigawatt hours of annual production in 2017 to almost 800 gigawatt hours in 2027. Part of that phenomenal demand increase dates back to 2015 when the Chinese government announced a huge push towards electric vehicles in its 13th Five Year Plan. The battery of a Tesla Model S, for example, has about 12 kilograms of lithium in it; grid storage needed to help balance renewable energy would need a lot more lithium given the size of the battery required.

Processing of Lithium Ore

The lithium extraction process uses a lot of water—approximately 500,000 gallons per metric ton of lithium. To extract lithium, miners drill a hole in salt flats and pump salty, mineral-rich brine to the surface. After several months the water evaporates, leaving a mixture of manganese, potassium, borax and lithium salts which is then filtered and placed into another evaporation pool. After between 12 and 18 months of this process, the mixture is filtered sufficiently that lithium carbonate can be extracted.

South America’s Lithium Triangle, which covers parts of Argentina, Bolivia and Chile, holds more than half the world’s supply of the metal beneath its salt flats. But it is also one of the driest places on earth. In Chile’s Salar de Atacama, mining activities consumed 65 percent of the region’s water, which has had a large impact on local farmers, to the point that some communities have to get water elsewhere.

As in Tibet, there is the potential for toxic chemicals to leak from the evaporation pools into the water supply including hydrochloric acid, which is used in the processing of lithium, and waste products that are filtered out of the brine. In Australia and North America, lithium is mined from rock using chemicals to extract it into a useful form. In Nevada, researchers found impacts on fish as far as 150 miles downstream from a lithium processing operation.

Lithium extraction harms the soil and causes air contamination. In Argentina’s Salar de Hombre Muerto, residents believe that lithium operations contaminated streams used by humans and livestock and for crop irrigation. In Chile, the landscape is marred by mountains of discarded salt and canals filled with contaminated water with an unnatural blue hue. According to Guillermo Gonzalez, a lithium battery expert from the University of Chile, “This isn’t a green solution – it’s not a solution at all.”

China is among the five top countries with the most lithium resources and it has been buying stakes in mining operations in Australia and South America where most of the world’s lithium reserves are found. China’s Tianqi Lithium owns 51 percent of the world’s largest lithium reserve in Australia, giving it a controlling interest. In 2018, the company became the second-largest shareholder in Sociedad Química y Minera—the largest lithium producer in Chile. Another Chinese company, Ganfeng Lithium, has a long-term agreement to underwrite all lithium raw materials produced by Australia’s Mount Marion mine—the world’s second-biggest, high-grade lithium reserve.

Recycling Lithium-Ion

In Australia, only two percent of the country’s 3,300 metric tons of lithium-ion waste is recycled. Unwanted MP3 players and laptops often end up in landfills, where metals from the electrodes and ionic fluids from the electrolyte can leak into the environment.

Because lithium cathodes degrade over time, they cannot be placed into new batteries. Researchers are using robotics technology developed for nuclear power plants to find ways to remove and dismantle lithium-ion cells from electric vehicles. There have been a number of fires at recycling plants where lithium-ion batteries have been stored improperly, or disguised as lead-acid batteries and put through a crusher. Not only have these batteries burned at recycling plants, but auto makers are seeing battery-related fires leading to vehicle recalls and safety probes. In October, U.S. safety regulators opened a probe into more than 77,000 electric Chevy Bolts after two owners complained of fires that appeared to have begun under the back seat where the battery is located.

Because manufacturers are secretive about what goes into their batteries, it is harder to recycle them properly. Currently, recovered cells are usually shredded, creating a mixture of metal that can then be separated using pyrometallurgical techniques—burning—which wastes a lot of the lithium. Alternative techniques are being investigated, including biological recycling, where bacteria are used to process the materials, and hydrometallurgical techniques, which use solutions of chemicals in a similar way to how lithium is extracted from brine.

It is estimated that between 2021 and 2030, about 12.85 million tons of EV lithium ion batteries will go offline worldwide, and over 10 million tons of lithium, cobalt, nickel and manganese will be mined for new batteries. China is being pushed to increase battery recycling since repurposed batteries could be used as backup power systems for China’s 5G stations or reused in shared e-bikes, which would save 63 million tons of carbon emissions from new battery manufacturing.

Cobalt Extraction Also Poses Environmental Problems

Cobalt is found in huge quantities in the Democratic Republic of Congo and central Africa where it is extracted from the ground by hand, using child labor, without protective equipment. China owns eight of the 14 largest cobalt mines in the Democratic Republic of Congo and they account for about half of the country’s output. While China has only 1 percent of the world’s cobalt reserves, it dominates in the processing of raw cobalt. The Democratic Republic of Congo is the source of over two-thirds of global cobalt production, but China has over 80 percent control of the cobalt refining industry, where raw material is turned into commercial-grade cobalt metal.

As with lithium, the price of cobalt has risen sharply, quadrupling in the last two years.

Conclusion

Environmentalists expressed unfounded concerns about fracking, but they should be worried about replacing fossil fuels in the transportation and electric generation sectors with electric vehicles and renewable energy, because producing those technologies requires lithium, cobalt and other critical metals. Mining, processing, and disposing of these metals can contaminate drinking water, land and the environment if done improperly, as the examples above show. And, since China dominates the global market, the shift simply exchanges what was once U.S. reliance on the Middle East for U.S. reliance on the People’s Republic.

The post The Environmental Impact of Lithium Batteries appeared first on IER.

Wednesday, November 11, 2020

Location Data + Reviews: The 1–2 Punch of Local SEO (Updated for 2020)

Posted by MiriamEllis


Get found. Get chosen.

It’s the local SEO two-step at the heart of every campaign. It’s the 1-2 punch combo that hinges on a balance of visible, accurate contact data, and a volunteer salesforce of consumer reviewers who are supporting your rise to local prominence.

But here’s the thing: while managed location data and reviews may be of equal and complementary power, they shouldn’t require an equal share of your time.

Automation of basic business data distribution is the key to freeing you up to focus on the elements of listings that require human ingenuity — namely, reviews and other listings-based content like posts and Q&A.

It’s my hope that sharing this article with your team or your boss will help you get the financial allocations you need for automated listings management, plus generous resources for creative reputation management.

Location data + reviews = the big picture

When Google lists a business, it gives good space to the business name, and a varying degree of space to the address and phone number. But look at the real estate occupied by the various aspects associated with reputation:

If Google cares this much about ratings, review text, responses, and emerging elements like place topics and attributes, any local brand you’re marketing should see these factors as a priority. In this article, I’ll strive to codify your actionable perspective on managing both location data and the many aspects of reviews.

Ratings: The most powerful local filter of them all

In the local SEO industry, we talk a lot about Google’s filters, like the Possum filter that’s supposed to strain local businesses through a sort of sieve so that a greater diversity of mapped results is shown to the searcher. But searchers have an even more powerful filter than this — the human-driven filter of ratings that helps people intuitively sort local brands by perceived quality.

Whether they’re stars or circles, the majority of rating icons send a 1–5 point signal to consumers that can be instantly understood. This symbol system has been around since at least the 1820s; it’s deeply ingrained in all our brains as a judgement of value.

This useful, rapid form of shorthand lets a searcher needing to do something like grab a quick taco see that the food truck with five Yelp stars is likely a better bet than the one with only two. Meanwhile, searchers with more complex needs can comb through the ratings of many listings at leisure, carefully weighing one option against another for major purchases. In Google’s local results, ratings are the most powerful human-created filter that influences the major goal of being chosen.

But before a local brand can be chosen on the basis of its high ratings, it has to rank well enough to be found. The good news is that, over the past three years, expert local SEOs have become increasingly convinced of the impact of Google ratings on Google local pack rankings. In 2017, when I wrote the original version of this post, contributors to the Local Search Ranking Factors survey placed Google star ratings down at #24 in terms of local rankings influence. In 2020, this metric has jumped up to spot #8 — a leap of 16 spots in just three years.

In the interim, Google has been experimenting with different ratings-related displays. In 2017, they were testing the application of a “highly rated” snippet on hotel rankings in the local packs. Today, their complex hotel results let the user opt to see only 4+ star results. Meanwhile, local SEOs have noticed patterns over the years like searches with the format of “best X in city” (e.g. best burrito in Dallas) appearing to default to local results made up of businesses that have earned a minimum average of four stars. Doubtless, observations like these have strengthened experts’ convictions that Google cares a lot about ratings and allows them to influence rank.

Heading into 2021, any local brand with goals of being found and chosen must view low ratings as an impediment to reaching full growth potential.

Consumer sentiment: The local business story your customers are writing for you

Here’s a randomly chosen Google 3-pack result when searching just for “tacos” in a small city in the San Francisco Bay Area:

[Image: Google local 3-pack results for a "tacos" search]

We’ve just covered the topic of ratings, and you can look at a result like this to get that instant gut feeling about the 4-star-rated eateries vs. the 2-star place. Now, let’s open the book on business #3 and see precisely what kind of brand story its consumers are writing, as you would in conducting a professional review audit for a local business, excerpting dominant sentiment:

[Image: Review audit excerpting dominant sentiment from business #3's reviews]

It’s easy to ding fast food chains. Their business model isn’t commonly associated with fine dining or the kind of high wages that tend to promote employee excellence. In some ways, I think of them as extreme examples. Yet, they serve as good teaching models for how even the most modest-quality offerings create certain expectations in the minds of consumers, and when those basic expectations aren’t met, it’s enough of a story for consumers to share in the form of reviews.

This particular restaurant location has an obvious problem with slow service, orders being filled incorrectly, and employees who have been denied the training they need to represent the brand in a knowledgeable, friendly, or accessible manner. If you audited a different business, its pain points might surround outdated fixtures or low standards of cleanliness.

Whatever the case, when the incoming consumer turns to the review world, their eyes scan the story as it scrolls down their screen. Repeat mentions of a particular negative issue can create enough of a theme to turn the potential customer away. One survey says only up to 11% of consumers will do business with a brand that’s wound up with a 2-star rating based on poor reviews. Who can afford to let the other 89% of consumers go elsewhere?

The central goal of being chosen hinges on recognizing that your reviewer base is a massive, unpaid salesforce that tells your brand story. Survey after survey consistently finds that people trust reviews — in fact, they may trust them more than any claim your brand can make about itself.

Going into 2021, the writing is on the wall that Google cares a great deal about themes surfacing in your reviews. The ongoing development and display of place topics and attributes signifies Google’s increasing interest in parsing sentiment, and doubtless, using such data to determine relevance.

Fully embracing review management and the total local customer service ecosystem is key to giving customers a positive tale to tell, enabling the business you’re marketing to be trusted and chosen for the maximum number of transactions.

Velocity/recency/count: Just enough of a timely good thing to be competitive

This is one of the easiest aspects of review management to convey. You can sum it up in one sentence: don’t get too many reviews at once on any given platform but do get enough reviews on an ongoing basis to avoid looking like you’ve gone out of business.

For a little more background on the first part of that statement, watch Mary Bowling describing in this LocalU video how she audited a law firm that went from zero to thirty 5-star reviews within a single month. Sudden gluts of reviews like this not only look odd to alert customers, but they can trip review platform filters, resulting in removal. Remember, reviews are a business lifetime effort, not a race. Get a few this month, a few next month, and a few the month after that. Keep going.

The second half of the review timing paradigm relates to not running out of steam in your acquisition campaigns. Multiple surveys indicate that the largest percentage of review readers consider content from the past month to be most relevant. Despite this, Google’s index is filled with local brands that haven’t been reviewed in over a year, leaving searchers to wonder if a place is still in business, or if it’s so unimpressive that no one is bothering to review it.

While I’d argue that review recency may be more important in review-oriented industries (like restaurants) vs. those that aren’t quite as actively reviewed (like septic system servicing), the idea here is similar to that of velocity, in that you want to keep things going. Don’t run a big review acquisition campaign in January and then forget about outreach for the rest of the year. A moderate, steady pace of acquisition is ideal.

And finally, a local SEO FAQ comes from business owners who want to know how many reviews they need to earn. There’s no magic number, but the rule of thumb is that you need to earn more reviews than the top competitor you are trying to outrank for each of your search terms. This varies from keyword phrase to keyword phrase, from city to city, and from vertical to vertical. The best approach is steady growth of reviews to surpass whatever number the top competitor has earned.

Authenticity: Honesty is the only honest policy

For me, this is one of the most prickly and interesting aspects of the review world. Three opposing forces meet on this playing field: business ethics, business education, and the temptations engendered by the obvious limitations of review platforms to police themselves.

I often recall a basic review audit I did for a family-owned restaurant belonging to a friend of a friend. Within minutes, I realized that the family had been reviewing their own restaurant on Yelp (a glaring violation of Yelp’s policy). I felt sorry to see this, but being acquainted with the people involved (and knowing them to be quite nice!), I highly doubted they had done this out of some dark impulse to deceive the public.

Rather, my guess was that they may have thought they were “getting the ball rolling” for their new business, hoping to inspire real reviews. My gut feeling was that they simply lacked the necessary education to understand that they were being dishonest with their community and how this could lead to them being publicly shamed by Yelp, or even subjected to a lawsuit, if caught.

In such a scenario, there’s definitely an opportunity for the marketer to offer the necessary education to describe the risks involved in tying a brand to misleading practices, highlighting how vital it is to build trust within the local community. Fake positive reviews aren’t building anything real on which a company can stake its future. Ethical business owners will catch on when you explain this in honest terms and can then begin marketing themselves in smarter ways.

But then there's the other side. Mike Blumenthal’s reporting on this has set a high bar in the industry, with coverage of developments like the largest review spam network he’d ever encountered. There's simply no way to confuse organized, global review spam with a busy small business making a wrong, novice move. Real temptation resides in this scenario, because, as Blumenthal states:

“Review spam at this scale, unencumbered by any Google enforcement, calls into question every review that Google has. Fake business listings are bad, but businesses with 20, or 50, or 150 fake reviews are worse. They deceive the searcher and the buying public and they stain every real review, every honest business, and Google.”

When a platform like Google makes it easy to “get away with” deception, companies lacking ethics will take advantage of the opportunity. Beyond reporting review spam, one of the best things we can do as marketers is to offer ethical clients the education that helps them make honest choices. We can simply pose the question:

Is it better to fake your business’ success or to actually achieve success?

Local brands that choose to take the high road must avoid:

  • Any form of review incentives or spam
  • Review gating that filters consumers so that only happy ones leave reviews
  • Violations of the review guidelines specific to each review platform

Owner responses: Creatively turning reviews into two-way conversations

Over the years, I’ve devoted abundant space in my column here at Moz to the fascinating topic of owner responses. I’ve highlighted the five types of Google My Business reviews and how to respond to them, I’ve diagrammed a real-world example of how a terrible owner response can make a bad situation even worse, and I’ve studied basic reputation management for better customer service and how to get unhappy customers to edit their negative reviews.

My key learnings from nearly two decades of examining reviews and responses are these:

  • Review responses are a critical form of customer service that can’t be ignored any more than business staff should ignore in-person customers asking for face-to-face help. Many reviewers expect responses.
  • The number of local business listings in every industry with zero owner responses on them is totally shocking.
  • Negative reviews, when fairly given, are a priceless form of free quality control for the brand. Customers directly tell the brand which problems need to be fixed to make them happy.
  • Many reviewers think of their reviews as living documents, and update them to reflect subsequent experiences.
  • Many reviewers are more than happy to give brands a second chance when a problem is resolved.
  • Positive reviews are conversation starters, warmly inviting a response that further engages the customer and can convince them that the brand deserves repeat business.

Local brands and agencies can use software to automate updating a phone number or hours of operation. Software like Moz Local can be of real help in alerting you to new, incoming reviews across multiple platforms, or surfacing the top sentiment themes within your review corpus.

Tools free up resources to manage what can’t be automated: human creativity. It takes serious creative resources to spend time with review sentiment and respond to customers in a way that makes a brand stand out as responsive and worthy. It takes time to fully utilize the opportunities owner responses represent to impact goals all the way from the top to the bottom of the sales funnel.

I’ve never forgotten a piece Florian Huebner wrote for StreetFight documenting the neglected reviews of a major fast food chain and its subsequent increase in location closures and decrease in profits. No one was taking the time to sit down with the reviews, listen, fix problems customers were citing, or offer proofs of caring resolution via owner responses.

And all too often, when brands large and small do respond to reviews, they take a corporate-speak stance equivalent to “whistling past the graveyard” when addressing complaints. To keep the customer and to signal to the public that the brand deserves to be chosen, creative resources must be allocated to providing gutsy, honest owner responses. It’s easy to spot the difference:

[Image: Two sample owner responses compared: a dismissive corporate-style reply (yellow) and an invested, honest reply (blue)]

The response in yellow signals that the brand simply isn’t invested in customer retention. By contrast, the response in blue is a sample of what it takes to have a real conversation with a real person on the other side of the review text, in hopes of transforming one bad initial experience into a second chance, and hopefully, a lifetime of loyalty.

NAP and reviews: The 1–2 punch combo every local business must practice

Right now, there’s an employee at a local business or a staffer at an agency who is looking at the review corpus of a brand that’s struggling for rankings and profits. The set of reviews contains mixed sentiment, and no one is responding to either positive or negative customer experiences.

Maybe this is an issue that’s been brought up from time to time in company meetings, but it’s never made it to priority status. Decision-makers have felt that time and budget are better spent elsewhere.

Meanwhile, customers are quietly trickling away for lack of attention, leads are being missed, structural issues are being ignored…

If the employee or staffer I’m describing is you, my best advice is to make 2021 the year you make your strongest case for automating listing distribution and management with software so that creative resources can be dedicated to full reputation management.

Local SEO experts, your customers and clients, and Google, itself, are all indicating that location data + reviews are highly impactful and here to stay. In fact, history proves that this combination is deeply embedded in our entire approach to local commerce.

When traveling salesman Duncan Hines first published his 1935 review guide Adventures in Good Eating, he was developing what we think of today as local SEO. Here is my color-coded version of his review of the business that would one day become KFC. It should look strangely familiar to anyone who has ever tackled local business listings management:

[Image: Color-coded excerpt of Duncan Hines' 1935 review of the business that would become KFC]

No phone number on this “citation,” of course, but telephones were quite a luxury in 1935. Barring that element, this simple and historic review has the core earmarks of a modern local business listing. It has location data and review data; it’s the 1–2 punch combo every local business still needs to get right today. Without the NAP, the business can’t be found. Without the sentiment, the business gives little reason to be chosen.

From Duncan Hines to the digital age, there may be nothing new under the sun in marketing, but striking the right pose between listings and reputation management may be new news to your CEO, your teammates, or clients. So go for it — communicate this stuff, and good luck at your next big meeting!

Check out the new Moz Local plans that let you take care of location data distribution in seconds so that the balance of your focus can be on creatively caring for the customer.

New Moz Local Plans


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Tuesday, November 10, 2020

Behind the SEO: Launching Our New Guide — How to Rank

Posted by Cyrus-Shepard

Seven years ago, we published a post on the Moz Blog titled "How to Rank: 25 Step Master SEO Blueprint."

From an SEO perspective, the post did extremely well.

Over time, the "How to Rank" post accumulated:

  • 400k pageviews
  • 200k organic visits
  • 100s of linking root domains

Despite its success, seven years is a long time in SEO. The chart below shows what often happens when you don't update your content.

Predictably, both rankings and traffic declined significantly. By the summer of 2020, the post was only seeing a few hundred visits per month.

Time to update

We decided to update the content. We did this not only for a ranking/traffic boost, but also because SEO has changed a lot since 2013.

The old post simply didn't cut it anymore.

To regain our lost traffic, we also wanted to leverage Google's freshness signals for ranking content.

Many SEOs mistakenly believe that freshness signals are simply about updating the content itself (or, even lazier, putting a new timestamp on it). In actuality, the freshness signals Google may look at can take many different forms:

  1. Content freshness.
  2. Rate of content change: More frequent changes to the content can indicate more relevant content.
  3. User engagement signals: Declining engagement over time can indicate stale content.
  4. Link freshness: The rate of link growth over time can indicate relevancy.

To be fair, the post had slipped significantly in all of these categories. It hadn't been updated in years, engagement metrics had dropped, and hardly anyone new linked to it anymore.

To put it simply, Google had no good reason to rank the post highly.

This time when publishing, we also decided to launch the post as a stand-alone guide — instead of a blog post — which would be easier to maintain as evergreen content.

Finally, as I wrote in the guide itself, we simply wanted a cool guide to help people rank. One of the biggest questions we get from new folks after they read the Beginner's Guide to SEO is: "What do I read next? How do I actually rank a page?"

This is exactly that SEO guide.

Below, we'll discuss the SEO goals that we hope to achieve with the guide (the SEO behind the SEO), but if you haven't checked it out yet, here's a link to the new guide:

How to Rank On Google


SEO goals

Rarely do SEO blogs talk about their own SEO goals when publishing content, but we wanted to share some of our strategies for publishing this guide.

1. Keywords

First of all, we wanted to improve on the keywords we already rank for (poorly). These are keywords like:

  • How to rank
  • SEO blueprint
  • SEO step-by-step

Our keyword research process showed that the phrase "SEO checklist" has more search volume and variations than "SEO blueprint", so we decided to go with "checklist" as a keyword.

Finally, when doing a competitor keyword gap analysis, we discovered some choice keywords that our competitors are ranking for with similar posts.

Based on this, we knew we should include the word "Google" in the title and try to rank for terms about "ranking on Google."

2. Featured snippets

Before publishing the guide, our friend Brian Dean (aka Backlinko) owned the featured snippet for "how to rank on Google."

It's a big, beautiful search feature. And highly deserved!

We want it.

There are no guarantees that we'll win this featured snippet (or others), but by applying a few featured snippets best practices—along with ranking on the first page—we may get there.

3. Links

We believe the guide is great content, so we hope it attracts links.

Links are important because while the guide itself may generate search traffic, the links it earns could help with the rankings across our entire site. As Rand Fishkin once famously wrote about the impact of links in SEO, "a rising tide lifts all ships."

Previously, the old post had a few hundred linking root domains pointing at it, including links from high-authority sites like Salesforce.

Obviously, we are now 301 redirecting these links to the new guide.

We'll also update internal links throughout the site, as well as adding links to posts and pages where appropriate.

To help build links in the short-term, we'll continue promoting the guide through social and email channels.

Long-term, we could also do outreach to help build links.

To be honest, we think the best and easiest way to build links naturally is simply to present a great resource that ranks highly, and also that we promote prominently on our site.

Will we succeed?

Time will tell. In 3-6 months we'll do an internal follow-up to track our SEO progress and see how we measured up against our goals.

To make things more complicated, SEO is far more competitive than it was 7 years ago, which makes things harder. Additionally, we're transparently publishing our SEO strategy out in the open for our competitors to read, so they may adjust their tactics.

Want to help out? You can help us win this challenge by reading and sharing the guide, and even linking to it if you'd like. We'd very much appreciate it :)

To your success in SEO.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Monday, November 9, 2020

America’s Only Offshore Wind Farm Will Go Offline for Expensive Repair

The Block Island offshore wind farm, a 30-megawatt facility off the coast of Rhode Island, will be taken offline this spring for expensive repairs. It is the only operating offshore wind farm in the United States. The residents of Block Island will have their electricity produced by diesel generators on floating docks while the wind farm is offline. The Block Island wind farm consists of five turbines and has been operating for only about four years, since December 12, 2016.

The developers of the Block Island wind farm did not bury the high-voltage cables that carry electricity to land deep enough. The cables are being exposed as the seabed is worn away by tides and storms; the exposed cables are dangerous to swimmers. One part of the repair will cost $30 million and the cost of the other part is currently undisclosed. Ørsted A/S—the Danish power company that now owns Block Island—is bearing the cost of reburying the Block Island cable connecting the turbines to the island grid. The original cable was installed by the previous owner—Deepwater Wind. Ørsted is replacing the cable at the greater depth that state regulators had originally wanted, before they were overruled by a state board. National Grid, which owns the cable that connects the island to the mainland, will charge customers of the Narragansett Electric Company to fix the problem. It is unknown what the cost to ratepayers will be.

Over the next few months, the companies will dig up the ocean floor to install new portions of cable at 20 to 50 feet below the seabed, compared to the current 4- to 6-foot depths. In the spring, when the wind farm is offline, they will splice the cables together.

Offshore Wind Energy Transmission an Ongoing Challenge

High-voltage lines that carry power beneath the sea from wind farms to the onshore electrical grid remain a challenge for the offshore wind industry. The current U.S. power and transmission system was not designed for a large offshore wind industry. Offshore wind farms are currently connecting their cables to the grid at locations that previously housed retired power plants, but that is not a comprehensive solution to avoid overloading and congesting the onshore grid system.

About a decade ago, a “transmission first” planning process for offshore wind was suggested. The Atlantic Wind Connection proposed a transmission backbone to run from Virginia to New Jersey, which was estimated to represent cost savings of $1.1 billion in the PJM Interconnection. The proposal was dropped when it ran into significant regulatory and financial hurdles.

A recent white paper by the Business Network for Offshore Wind pressed for a planned approach to the offshore cable issue, indicating that “comprehensive and coordinated transmission planning will best position the U.S. offshore wind industry to achieve sustained success.”

Like other developments in technology, it will take time and investment to produce coordinated approaches, which will involve several states having different perspectives, power balancing authorities and federal oversight from the Interior Department and the Federal Energy Regulatory Commission (FERC).

State Offshore Wind Plans

Many Northeastern and Mid-Atlantic states, including Maine, Massachusetts, Rhode Island, Connecticut, New York, New Jersey, Maryland and Virginia, have made offshore wind procurements, with a total of 6,460 megawatts selected and a future goal of 28,530 megawatts. The Mid-Atlantic states of New York, New Jersey and Virginia are in the process of procuring another 7,540 megawatts.

These states are offering subsidies and contracts for offshore wind power to quickly reach their decarbonization goals. For example, New York has made a pledge to be carbon-free by 2050 beginning with a 100 percent carbon-free power sector by 2040. In 2016, Massachusetts made a commitment to procure 1,600 megawatts of offshore wind by 2027 and an additional 1,600 megawatts by 2035.

The Bureau of Ocean Energy Management estimates about 2,000 turbines could be constructed offshore within a 10-year period. Updating the onshore grid to accommodate the 15,000 to 20,000 megawatts of renewable resources northern states already require through existing policies could cost as much as $10 billion.

Conclusion

The first-hand experience with the Block Island wind farm is indicative of the future pitfalls that the U.S. offshore wind industry can expect as it continues toward replacing existing traditional generating technologies with offshore wind. As with Block Island, transmission poses a continuing challenge for the industry. Currently, the interconnection of cables from offshore wind farms to the onshore electrical grid is happening at points where retired generating units were located. Eventually, all of those sites will be exhausted and the industry will need to look at other areas to tie into the grid. The U.S. electrical grid was designed for traditional generating technologies, and it is unclear whether it can smoothly handle power coming from offshore wind sites.

The post America’s Only Offshore Wind Farm Will Go Offline for Expensive Repair appeared first on IER.