Friday, November 30, 2018

Energy Sustainability: Giving Free Market Thanks

Depletion…pollution…security…climate change. These flashpoints of energy sustainability have been invoked time and again to advocate forced (government) transformation away from fossil fuels. But each complaint has proved far too exaggerated to demote the primary role of mineral energies (natural gas, coal, and petroleum) in modern living.

The congruence of private gain and social good in energy markets is reason to give thanks this holiday season. Consumers in good conscience can stay warm with natural gas and fuel oil, as well as travel on gasoline and diesel. Electricity, too, can be generated with the cheapest and most versatile carbon-based energy without regret.

Background

Energy sustainability is an offshoot of sustainable development, classically defined in a 1987 report by the World Commission on Environment and Development as “development that meets the needs of the present without compromising the ability of future generations to meet their own needs.”

The so-called Brundtland Report led to the 1992 United Nations conference in Rio de Janeiro and Agenda 21, a 350-page action plan by the United Nations for global sustainable development. The U.S. was one of 178 nations to sign the (nonbinding) Agenda 21. For implementation ideas, the Clinton/Gore Administration created the President’s Council on Sustainable Development (1993–99), which defined sustainability as “economic growth that will benefit present and future generations without detrimentally affecting the resources or biological systems of the planet.”

The “Vision Statement” of PCSD’s Sustainable America: A New Consensus for Prosperity, Opportunity, and A Healthy Environment for the Future (1996), read:

“Our vision is of a life-sustaining Earth…A sustainable United States will have a growing economy that provides equitable opportunities for satisfying livelihoods and a safe, healthy, high quality life for current and future generations. Our nation will protect its environment, its natural resource base, and the functions and viability of natural systems on which all life depends (p. iv).”

Given this definition, are mineral energies “sustainable”? The answer is a resounding yes under a free-market interpretation of sustainable development:

A sustainable energy market is one in which the quantity, quality, and utility of energy improve over time. Sustainable energy becomes more available, more affordable, more usable and reliable, and cleaner. Energy consumers do not borrow from the future; they subsidize the future by continually improving today’s energy economy, which the future inherits (Bradley, Capitalism at Work: Business, Government, and Energy, 2008: p. 187).

Countering Complaints

The energy sustainability triad has concerned depletion, pollution, and climate change. A fourth area, energy security, primarily relating to unstable oil imports from Middle Eastern countries, arose in the 1970s and peaked with the Gulf War in 1990/91.

Depletionism concerns resource exhaustion, better known as Peak Oil (and Peak Natural Gas), in which demand outraces supply, resulting in rising prices. Pollution has centered on the criteria air pollutants: carbon monoxide (CO), sulfur dioxide (SO2), particulate matter (PM), nitrogen oxides (NOx), lead (Pb), and volatile organic compounds (VOC). Climate change has shifted from a brief worry about anthropogenic global cooling to an ongoing concern about anthropogenic global warming.

Peak-supply fears have been quelled by new-generation oil and gas extraction technology that, yet again, has turned high-cost, inaccessible supply into economically minable resources. In response, fossil-fuel foes have turned to a keep-it-in-the-ground strategy, conceding that many decades, if not centuries, of oil and gas inventory await. And with the US becoming the oil and gas center of the world, prior concern over energy security has faded.

Regarding the once vexing problem of urban air pollution, the Environmental Protection Agency has documented a 73-percent decline in criteria emissions since 1970, with further improvement expected. Technology, developed under achievable regulatory rules, has made fossil fuels and clean air a success story that industry critics did not think possible early on.

Climate change? This is a separate issue entirely from the above, but the direct benefits of carbon-dioxide fertilization and moderate warming have made the debate over costs versus benefits of anthropogenic climate change ambiguous. The public policy takeaway is not to regulate CO2, but to embrace free markets at home and abroad to capitalize on the positives and ameliorate the negatives of weather and climate change, natural or anthropogenic.

Free Market Environmentalism

The energy sustainability debate relates to the larger intellectual tradition of free market environmentalism. The private property/voluntary exchange model was codified by authors Terry Anderson and Donald Leal as follows:

Free market environmentalism emphasizes the importance of market processes in determining optimal amounts of resource use. Only when rights are well-defined, enforced, and transferable will self-interested individuals confront the trade-offs inherent in a world of scarcity (Free Market Environmentalism, 1991: p. 22).

Private entrepreneurship seeking gains from trade is key to overcoming negative externalities:

As entrepreneurs move to fill profit niches, prices will reflect the values we place on resources and the environment. Mistakes will be made, but in the process a niche will be opened and profit opportunities will attract resource managers with a better idea …

“In cases where definition and enforcement costs are insurmountable, political solutions may be called for,” Anderson and Leal add, while warning that “those kinds of solutions often become entrenched and stand in the way of innovative market processes that promote fiscal responsibility, efficient resource use, and individual freedom” (Ibid., pp. 22–23).

In a 1993 essay, “Sustainable Development—A Free-Market Perspective,” Fred Smith applied the Anderson/Leal framework as an alternative to sustainable development. Free market environmentalism, Smith states (p. 297), “recognizes that the greatest hope for protecting environmental values lies in the empowerment of individuals to protect those environmental resources that they value (via a creative extension of property rights).” He explains (p. 298–99):

Sustainable development is not an artifact of the physical world but of human arrangements. Environmental resources will be protected or endangered depending upon the type of institutional framework we create, or allow to evolve, to address these concerns.

After going through examples of self-interested solutions to economic and environmental progress, Smith concludes: “The empirical evidence is clear: resources integrated into a private property system do, in fact, achieve ‘sustainability’” (p. 301).

Smith insists that “government failure” be assessed alongside alleged market failure, noting how “individuals who make resource-use decisions in a bureaucracy are rarely those who bear the costs or receive the benefits of such decisions” (p. 304). He contrasts the politicization of drilling in the Arctic National Wildlife Refuge (ANWR) versus drilling in the Audubon Society’s Rainey wildlife sanctuary in Louisiana in this regard (ibid.).

Conclusion

In a 1999 policy analysis for the Cato Institute, “The Increasing Sustainability of Conventional Energy,” I concluded:

[T]he technology of fossil-fuel extraction, combustion, and consumption continues to rapidly improve. Fossil fuels continue to have a global market share of approximately 85 percent, and all economic and environmental indicators are positive. Numerous technological advances have made coal, natural gas, and petroleum more abundant, more versatile, more reliable, and less polluting than ever before, and the technologies are being transferred from developed to emerging markets. These positive trends can be expected to continue in the 21st century.

Almost twenty years later, production and consumption trends for mineral energies remain robust, despite determined, costly government policies to force wind and solar in electrical generation and ethanol in transportation markets. The global market share for fossil fuels remains more than 80 percent, with the most recent year registering growth rates of 3 percent, 1 percent, and 1.6 percent for natural gas, coal, and oil, respectively.

It is not doom-and-gloom in the energy market but quite the opposite. New generations of technology have made our ever-increasing quantities of oil, coal, and natural gas environmental products, not just energy products. The sustainability threat is not free markets but government ownership and direction of resources in the name of energy sustainability. That supreme irony must be the subject for another day.

The post Energy Sustainability: Giving Free Market Thanks appeared first on IER.

Local Search Ranking Factors 2018: Local Today, Key Takeaways, and the Future

Posted by Whitespark

In the past year, local SEO has run at a startling and near-constant pace of change. From an explosion of new Google My Business features to an ever-increasing emphasis on the importance of reviews, it's almost too much to keep up with. In today's Whiteboard Friday, we welcome our friend Darren Shaw to explain what local is like today, dive into the key takeaways from his 2018 Local Search Ranking Factors survey, and offer us a glimpse into the future according to the local SEO experts.


Video Transcription

Howdy, Moz fans. I'm Darren Shaw from Whitespark, and today I want to talk to you about the local search ranking factors. So this is a survey that David Mihm has run for the past like 10 years. Last year, I took it over, and it's a survey of the top local search practitioners, about 40 of them. They all contribute their answers, and I aggregate the data and say what's driving local search. So this is what the opinion of the local search practitioners is, and I'll kind of break it down for you.

Local search today

So these are the results of this year's survey. We had Google My Business factors at about 25%. That was the biggest piece of the pie. We have review factors at 15%, links at 16%, on-page factors at 14%, behavioral at 10%, citations at 11%, personalization and social at 6% and 3%. So that's basically the makeup of the local search algorithm today, based on the opinions of the people that participated in the survey.
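For quick reference, the survey weights quoted above can be tabulated in a short sketch (the category names are shorthand for the survey's factor groups, and the figures are the rounded percentages from the transcript):

```python
# Rounded factor weights from the 2018 Local Search Ranking Factors
# survey, as quoted in the transcript above.
weights = {
    "Google My Business": 25,
    "Links": 16,
    "Reviews": 15,
    "On-page": 14,
    "Citations": 11,
    "Behavioral": 10,
    "Personalization": 6,
    "Social": 3,
}

# The rounded shares cover the whole algorithm as surveyed.
total = sum(weights.values())  # 100
biggest = max(weights, key=weights.get)  # "Google My Business"
```

As the tabulation makes plain, Google My Business alone carries a quarter of the weight, and the top three factor groups together account for more than half.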

The big story this year is Google My Business. Google My Business factors are way up, compared to last year, a 32% increase in Google My Business signals. I'll talk about that a little bit more over in the takeaways. Review signals are also up, so more emphasis on reviews this year from the practitioners. Citation signals are down again, and that makes sense. They continue to decline I think for a number of reasons. They used to be the go-to factor for local search. You just built out as many citations as you could. Now the local search algorithm is so much more complicated and there's so much more to it that it's being diluted by all of the other factors. Plus it used to be a real competitive difference-maker. Now it's not, because everyone is pretty much getting citations. They're considered table stakes now. By seeing a drop here, it doesn't mean you should stop doing them. They're just not the competitive difference-maker they used to be. You still need to get listed on all of the important sites.

Key takeaways

All right, so let's talk about the key takeaways.

1. Google My Business

The real story this year was Google My Business, Google My Business, Google My Business. Everyone in the comments was talking about the benefits they're seeing from investing in a lot of these new features that Google has been adding.

Google has been adding a ton of new features lately — services, descriptions, Google Posts, Google Q&A. There's a ton of stuff going on in Google My Business now that allows you to populate Google My Business with a ton of extra data. So this was a big one.

✓ Take advantage of Google Posts

Everyone talked about Google Posts, how they're seeing Google Posts driving rankings. There are a couple of things there. One is the semantic content that you're providing Google in a Google post is definitely helping Google associate those keywords with your business. Engagement with Google Posts as well could be driving rankings up, and maybe just being an active business user continuing to post stuff and logging in to your account is also helping to lift your business entity and improve your rankings. So definitely, if you're not on Google Posts, get on it now.

If you search for your category, you'll see a ton of businesses are not doing it. So it's also a great competitive difference-maker right now.

✓ Seed your Google Q&A

Google Q&A, a lot of businesses are not even aware this exists. There's a Q&A section now. Your customers are often asking questions, and they're being answered by not you. So it's valuable for you to get in there and make sure you're answering your questions and also seed the Q&A with your own questions. So add all of your own content. If you have a frequently asked questions section on your website, take that content and put it into Google Q&A. So now you're giving lots more content to Google.

✓ Post photos and videos

Photos and videos, continually post photos and videos, maybe even encourage your customers to do that. All of that activity is helpful. A lot of people don't know that you can now post videos to Google My Business. So get on that if you have any videos for your business.

✓ Fill out every field

There are so many new fields in Google My Business. If you haven't edited your listing in a couple of years, there's a lot more stuff in there that you can now populate and give Google more data about your business. All of that really leads to engagement. All of these extra engagement signals that you're now feeding Google, from being a business owner that's engaged with your listing and adding stuff and from users, you're giving them more stuff to look at, click on, and dwell on your listing for a longer time, all that helps with your rankings.

2. Reviews

✓ Get more Google reviews

Reviews continue to increase in importance in local search, so, obviously, getting more Google reviews. It used to be a bit more of a competitive difference-maker. It's becoming more and more table stakes, because everybody seems to be having lots of reviews. So you definitely want to make sure that you are competing with your competition on review count and lots of high-quality reviews.

✓ Keywords in reviews

Getting keywords in reviews, so rather than just asking for a review, it's useful to ask your customers to mention what service they had provided or whatever so you can get those keywords in your reviews.

✓ Respond to reviews (users get notified now!)

Responding to reviews. Google recently started notifying users that if the owner has responded to you, you'll get an email. So all of that is really great, and those responses, it's another signal to Google that you're an engaged business.

✓ Diversify beyond Google My Business for reviews

Diversify. Don't just focus on Google My Business. Look at other sites in your industry that are prominent review sites. You can find them if you just look for your own business name plus reviews, if you search that in Google, you're going to see the sites that Google is saying are important for your particular business.

You can also find out like what are the sites that your competitors are getting reviews on. Then if you just do a search like keyword plus city, like "lawyers + Denver," you might find sites that are important for your industry as well that you should be listed on. So check out a couple of your keywords and make sure you're getting reviews on more sites than just Google.

3. Links

Then links, of course, links continue to drive local search. A lot of people in the comments talked about how a handful of local links have been really valuable. This is a great competitive difference-maker, because a lot of businesses don't have any links other than citations. So when you get a few of these, it can really have an impact.

✓ From local industry sites and sponsorships

They really talk about focusing on local-specific sites and industry-specific sites. So you can get a lot of those from sponsorships. They're kind of the go-to tactic. If you do a search for intitle:sponsors plus your city name, you're going to find a lot of sites that are listing their sponsors, and those are opportunities for you, in your city, that you could sponsor that event as well or that organization and get a link.

The future!

All right. So I also asked in the survey: Where do you see Google going in the future? We got a lot of great responses, and I tried to summarize that into three main themes here for you.

1. Keeping users on Google

This is a really big one. Google does not want to send its users to your website to get the answer. Google wants to have the answer right on Google so that they don't have to click. It's this zero-click search result. So you see Rand Fishkin talking about this. This has been happening in local for a long time, and it's really amplified with all of these new features Google has been adding. They want to have all of your data so that they don't have to send users to find it somewhere else. Then that means in the future less traffic to your website.

So Mike Blumenthal and David Mihm also talk about Google as your new homepage, and this concept is like branded search.

  • What does your branded search look like?
  • So what sites are you getting reviews on?
  • What does your knowledge panel look like?

Make that all look really good, because Google doesn't want to send people to your website.

2. More emphasis on behavioral signals

David Mihm is a strong voice in this. He talks about how Google is trying to diversify how they rank businesses based on what's happening in the real world. They're looking for real-world signals that actual humans care about this business and they're engaging with this business.

So there's a number of things that they can do to track that -- so branded search, how many people are searching for your brand name, how many people are clicking to call your business, driving directions. This stuff is all kind of hard to manipulate, whereas you can add more links, you can get more reviews. But this stuff, this is a great signal for Google to rely on.

Engagement with your listing, engagement with your website, and actual humans in your business. If you've seen the knowledge panel for a brick-and-mortar business, sometimes it will show busy times. They know when people are actually at your business. They have counts of how many people are going into your business. So that's a great signal for them to use to understand the prominence of your business. Is this a busy business compared to all the other ones in the city?

3. Google will monetize everything

Then, of course, a trend to monetize as much as they can. Google is a publicly traded company. They want to make as much money as possible. They're on a constant growth path. So there are a few things that we see coming down the pipeline.

Local service ads are expanding across the country and globally and in different industries. So this is like a paid program. You have to apply to get into it, and then Google takes a cut of leads. So if you are a member of this, then Google will send leads to you. But you have to be verified to be in there, and you have to pay to be in there.

Then taking a cut from bookings, you can now book directly on Google for a lot of different businesses. If you think about Google Flights and Google Hotels, Google is looking for a way to monetize all of this local search opportunity. That's why they're investing heavily in local search so they can make money from it. So seeing more of these kinds of features rolling out in the future is definitely coming. Transactions from other things. So if I did book something, then Google will take a cut for it.

So that's the future. That's sort of the news of the local search ranking factors this year. I hope it's been helpful. If you have any questions, just leave some comments and I'll make sure to respond to them all. Thanks, everybody.

Video transcription by Speechpad.com


If you missed our recent webinar on the Local Search Ranking Factors survey with Darren Shaw and Dr. Pete, don't worry! You can still catch the recording here:

Check out the webinar

You'll be in for a jam-packed hour of deeper insights and takeaways from the survey, as well as some great audience-contributed Q&A.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Wednesday, November 28, 2018

The State of Local SEO: Industry Insights for a Successful 2019

Posted by MiriamEllis

A thousand thanks to the 1,411 respondents who gave of their time and knowledge in contributing to this major survey! You’ve created a vivid image of what real-life, everyday local search marketers and local business owners are observing on a day-to-day basis, what strategies are working for them right now, and where some frankly stunning opportunities for improvement reside. Now, we’re ready to share your insights into:

  • Google Updates
  • Citations
  • Reviews
  • Company infrastructure
  • Tool usage
  • And a great deal more...

This survey pooled the observations of everyone from people working to market a single small business, to agency marketers with large local business clients:

Respondents who self-selected as not marketing a local business were filtered from further survey results.

Thanks to you, this free report is a window into the industry. Bring these statistics to teammates and clients to earn the buy-in you need to effectively reach local consumers in 2019.

Get the full report

There are so many stories here worthy of your time

Let’s pick just one, to give a sense of the industry intelligence you’ll access in this report. Likely you’ve now seen the Local Search Ranking Factors 2018 Survey, undertaken by Whitespark in conjunction with Moz. In that poll of experts, we saw Google My Business signals being cited as the most influential local ranking component. But what was #2? Link building.

You might come away from that excellent survey believing that, since link building is so important, all local businesses must be doing it. But not so. The State of the Local SEO Industry Report reveals that:

When asked what’s working best for them as a method for earning links, 35% of local businesses and their marketers admitted to having no link building strategy in place at all.

And that, Moz friends, is what opportunity looks like. Get your meaningful local link building strategy in place in the new year, and prepare to leave ⅓ of your competitors behind, wondering how you surpassed them in the local and organic results.

The full report contains 30+ findings like this one. Rivet the attention of decision-makers at your agency, quote persuasive statistics to hesitant clients, and share this report with teammates who need to be brought up to industry speed. When read in tandem with the Local Search Ranking Factors survey, this report will help your business or agency understand both what experts are saying and what practitioners are experiencing.

Sometimes, local search marketing can be a lonely road to travel. You may find yourself wondering, “Does anyone understand what I do? Is anyone else struggling with this task? How do I benchmark myself?” You’ll find both confirmation and affirmation today, and Moz’s best hope is that you’ll come away a better, bolder, more effective local marketer. Let’s begin!

Download the report



SEI selected as the Leading Training Organization at Solar Power Mexico 2019

Solar Energy International (SEI) will be offering a total of 11 solar energy workshops during Solar Power Mexico 2019. The event will take place at the Citibanamex Center, Mexico City, Mexico, from March 19–21.

Solar Power Mexico is organized by three of the leading trade show companies in the world: Deutsche Messe, SNEC PV Power Expo, and Solar Power International (SPI) have joined forces to help develop the Mexican solar energy market.

Three world-renowned trade show organizers – each with a global reach – are throwing their weight behind making Solar Power Mexico 2019 a success. Their tremendous foothold in their home bases of Germany, China, the U.S., and Mexico, combined with their extensive international networks, will help you connect with a highly qualified audience of solar industry professionals, facilitators, and investors.

This is the first exhibition and congress specializing in the energy and solar technology segment, a business with growth rates of over 25% and an expected investment of over USD $100 billion in renewable energy by 2031.

The event will feature a seminar programme and exhibition at Centro Citibanamex. Stay tuned for more information about SEI special discounts and registration!

 

  • Solar Power Mexico 2019 - English version

The post SEI selected as the Leading Training Organization at Solar Power Mexico 2019 appeared first on Solar Energy International (SEI).

The Lame-Duck Carbon Tax Lob

The Deutch Plan

Congressman Ted Deutch (D-Fla.) has announced his intention to introduce a new carbon tax proposal in the House of Representatives. The Deutch proposal, officially “The Energy Innovation and Carbon Dividend Act,” would establish a carbon tax of $15 per ton of carbon dioxide equivalent in 2019, set to increase by $10 every year thereafter. The tax would be imposed on any covered entity’s use, or sale or transfer for use, of any covered fuel. Covered fuels include crude oil, natural gas, coal, and derived products. Covered entities include, but are not limited to, refineries, petroleum importers, coal mining operations, coal importers, and companies entering pipeline quality natural gas into the transmission system. Reps. Francis Rooney (R-Fla.), Brian Fitzpatrick (R-Pa.), John Delaney (D-Md.), and Charlie Crist (D-Fla.) are co-sponsors.

The Deutch tax plan’s $15 starting point masks its severity. The relatively low $15 is not where our attention should focus, considering the increase of $10 annually. Such a structure means that within five years the tax would be over $50 per ton (in inflation-adjusted terms). According to the handy Resources for the Future carbon tax calculator, a tax at that level would mean a hike in the price of gasoline of more than 22 percent. Other fuel price hikes would be even larger. A tax that climbs to over $100 per ton by 2030 is a harrowing prospect indeed.
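The escalation arithmetic is simple enough to sketch. The function below is illustrative (its name and nominal dollar figures are mine, not the bill's text); it applies the proposal's stated schedule of $15 per ton in 2019 rising by $10 each year, in nominal terms, so it slightly overshoots the inflation-adjusted figures cited above:

```python
# Illustrative sketch of the Deutch plan's escalation schedule:
# a $15/ton carbon tax in 2019, rising by $10 every year thereafter.
# Figures are nominal dollars; the post's "$50 per ton within five
# years" is the inflation-adjusted counterpart.
def deutch_tax(year, start_year=2019, base=15, step=10):
    """Nominal tax in dollars per ton of CO2-equivalent in a given year."""
    return base + step * (year - start_year)

fifth_year = deutch_tax(2023)  # $55/ton in the fifth year of the schedule
by_2030 = deutch_tax(2030)     # $125/ton by 2030
```

Even this rough sketch shows why the $15 headline number is misleading: the annual $10 step, not the starting point, determines where the tax ends up within a decade.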

Revenue-Recycling Strategy

The plan was hatched with the guidance of the Climate Leadership Council and, accordingly, allocates its revenue in the form of lump-sum rebate checks delivered to households. According to the proposal’s draft text, the Treasury would be responsible for redistributing tax revenue via monthly so-called dividend payments to American families. Interestingly, the plan contains an anti-natalist element—capping families’ eligibility for pro-rata shares at two children. Climate Leadership Council Vice President Greg Bertelsen, who joined Congressman Deutch on a conference call with the press, has described the lump-sum rebate approach as having more political viability than other revenue-recycling approaches thanks to its focus on minimizing the costly effects of the carbon tax on low-income families, who allocate a more significant proportion of their budgets to covering energy expenses.

The lump-sum rebate approach is one of the numerous strategies that IER’s recently published study conducted by Capital Alpha Partners, LLC, “The Carbon Tax: An Analysis of Six Potential Scenarios,” evaluates. The findings do not bode well for Congressman Deutch’s proposal.

While the Capital Alpha study does not, of course, model the exact parameters of this newly-announced plan, an evaluation of the revenue-recycling strategy is nevertheless informative. When taking heed of the Joint Committee on Taxation’s standard 25-percent offset, recycling revenue via lump-sum rebates in all six analyzed carbon tax scenarios results in persistent economic underperformance. The study indicates that such an approach would reduce GDP by as much as 1.67 percent relative to the reference case at the beginning of the forecast period before the effects taper off. But lost production in the interim is never recovered. With a lump-sum rebate as the revenue-recycling strategy, the modeled scenarios result in lost GDP equal to between $1.88 trillion and $2.86 trillion in constant 2015 dollars over a standard 10-year budget period and between $3.76 trillion and $5.92 trillion over the study’s entire 22-year forecast period. In fact, the lump-sum rebate approach is uniquely damaging among revenue-recycling strategies.

Even carbon tax advocates such as Noah Kaufman, of the Columbia Center on Global Energy Policy, agree on this point. According to Kaufman’s evaluation:

Using revenues for rebates under the Deutch plan would sacrifice opportunities for better macroeconomic outcomes or government services. The Whitehouse proposal returns revenues to Americans primarily by cutting the payroll taxes paid by workers, which would boost the economy by encouraging work. The Curbelo proposal allocates the revenue to government programs to support transportation infrastructure, energy innovation, climate change adaptation, and assistance for displaced workers.

The Capital Alpha study indicates that the only way to construct a pro-growth carbon tax would be to allocate the entirety of the available revenue to corporate tax reductions. And even that is tenuous.

Timing

Another factor to consider with this bill is its timing and motivation. No one (sponsors included) expects this bill to ever become law. The political realities in a certain sense remove constraints on tax drafters, enabling Congressman Deutch and his colleagues to establish in the minds of the public the aggressive increase structure and, I fear, anchor future negotiations. In comparison, a tax that increases by 2 percent annually (such as July’s proposal by soon-to-be-former Congressman Curbelo) will look more reasonable and tempt compromise.

Regulatory Relief and Political Compromise

As my colleague Robert Murphy warns, a carbon tax is unlikely to satisfy the demands of the political left, despite the assurance of libertarian and conservative carbon tax advocates. Carbon-mitigation plans emerging from the political left sometimes include a carbon tax, but only as one aspect in the suite of coercive measures it deems necessary for climate action. “Every aspect of our lives, from our cars to our meals to our family sizes, affects global emissions—and therefore,” Murphy concludes, “the interventionists want every tool at their disposal to control others.” Responses to the Deutch proposal have confirmed Murphy’s appraisal. Take, for example, the statement published by Wenonah Hauter, executive director of Food and Water Watch:

The Deutch proposal amends the Clean Air Act so that the same sources of greenhouse gas emissions covered by the carbon tax are not subject to separate regulations by the Environmental Protection Agency (EPA). For example, it would suspend regulations of CO2 emissions from power plants, such as the Trump administration’s proposed Affordable Clean Energy Plan that would replace the Obama administration’s Clean Power Plan. (The carbon tax would reduce power plant CO2 emissions by far more than either of these regulations.) It would also suspend regulations of CO2 from energy use by industrial sources—EPA has had the authority to regulate these emissions since 2009, but it has not done so. Under the Deutch proposal, if actual emissions exceed the emissions targets by 2030, EPA is instructed to impose regulations to fill this emissions gap.

A carbon tax with a realistic possibility of being signed into law would not be the revenue-neutral, regulation-busting, efficient solution that libertarian and conservative tax advocates desire. The political forces on the left want no part of even a nominally market-based solution.

A related aspect of this discussion is that any carbon tax bill will be subject to unsavory horse-trading as the political process plays out in Washington. And, indeed, the Deutch proposal itself includes a massive exemption for a powerful interest group: agriculture. According to the draft text, “If any covered fuel or its derivative is used on a farm for a farming purpose, the Secretary shall pay (without interest) to the ultimate purchaser of such covered fuel or its derivative, the total amount of carbon fees previously paid upon that covered fuel, as specified by rule of the Secretary.” This exemption blatantly undermines the tax’s claimed purpose of reducing emissions.

If this is the bill we see when drafters have no expectation of passage, what will a bill that aims for true viability look like?

Conclusion

The Deutch plan fails to respect the economics literature, which warns of the perils of the lump-sum rebate. With its rapidly escalating price structure, this lame-duck lob is highly unlikely to make a dent on Capitol Hill. Tax opponents should be wary, however, of this plan serving as an anchor in discussions of climate policy when the 116th Congress convenes next year.

The post The Lame-Duck Carbon Tax Lob appeared first on IER.

Climate Interventionists Won’t Stop With a Carbon Tax

Say what you will about the climate policy discussions at Vox, but they don’t mince words. They come right out and tell you how much they want to micromanage every last detail of your life. Vox’s resident expert, David Roberts, recently interviewed policy wonk and author Hal Harvey, to discuss which areas of society government should regulate in the name of slowing climate change. Everything was on the table—ranging from building codes to auto fuel efficiency to diet to family size—with the only debate being over the relative results from the various interventions.

Among other results, this peek into the interventionist mentality should serve as a wake-up call for the few writers who keep charmingly calling on libertarians and conservatives to strike a carbon tax deal with progressive leftists. As the Roberts/Harvey discussion says quite plainly, a carbon tax is just one arrow in the quiver of those championing aggressive government intervention to slow climate change.

A Carbon Tax Is Not Enough

Let me validate the carbon tax claim first. Here’s the key exchange from the Vox interview:

David Roberts

These days, people across the political spectrum are talking about carbon pricing. How does it fit into the larger effort?

Hal Harvey

The thing about carbon pricing is, it’s helpful, but it’s not dispositive. There are a number of sectors that are impervious to a carbon price, or close to impervious.

A carbon price works when it’s part of a package that includes R&D and performance standards. It does not work in isolation. It helps, but it doesn’t do nearly as much as is required.

Harvey elsewhere in the interview explicitly criticizes the standard “market solution” rhetoric behind a carbon tax when he says:

[Government-mandated performance standards] have a bad rep from an age-old and completely upside-down debate about “command-and-control” policy. But we use performance standards all the time, and they work really well. Our buildings don’t burn down very much; they used to burn down all the time. Our meat’s not poisoned; it used to be poisoned, or you couldn’t tell. And so forth. If you just tell somebody, this is the minimum performance required, guess what? Engineers are really good at meeting it cost-effectively.

In addition to their (naïve) promises of revenue neutrality, those pushing for a carbon tax swap deal also promise conservatives and libertarians that a “price on carbon” would allow for the dismantling of the existing top-down regulations. Yet we now have several lines of evidence to show just how naïve this hope is: (1) Harvey in the quotation above throws them under the bus. (2) The recent Curbelo carbon tax bill contained no *meaningful* regulatory relief. (3) Economist Paul Krugman is fine with outright bans on new (and existing?) coal-fired power plants. (4) The people at Vox have said for years that a carbon tax would only work in conjunction with other anti-emission government policies. Notice that I am not scouring obscure subreddit threads to find Marxists posting from a hipster café; I am quoting from quite mainstream sources who are openly declaring that putting “a price on carbon” will not do enough to reduce emissions.

The Interventionist Mentality

The reader should also realize that Roberts and Harvey don’t merely consider fuel economy standards and building efficiency codes when it comes to “command and control” regulations. Everything is on the table, and the only reason to refrain from pursuing certain strategies is the dilution of political capital. The following excerpt illustrates:

David Roberts

The book also has nothing about behavior change — no turning off lights or going vegetarian. Do you find that lever unrealistic? 

Hal Harvey

It’s a policy design book, and there aren’t many policies that have people change their diet. Michael Bloomberg taxed sugar, so there’s one. But we’re not gonna have the tons-of-barbecue-per-capita tax in North Carolina…

We have limited political bandwidth. If you’re serious about change, you have to identify the decision makers that can innovate the most tons the fastest….There are 7.5 billion decision makers on diet. There are 250 utility commissioners in America — and utility commissioners control half the carbon in America.

Trying to invoke behavior change on something as personal as eating en masse is morally sound, it’s ecologically a good idea, but as a carbon strategy, it doesn’t scratch the surface.

Indeed, even when they give a nod to basic human rights, Roberts and Harvey sound creepy. Consider this exchange:

David Roberts

Paul Hawken’s Drawdown Project looked at options for reducing greenhouse gases and found that educating girls and family planning were the two most potent. 

Hal Harvey

When I was at the Hewlett Foundation, we sponsored a study by the National Center for Atmospheric Research that asked the question: Globally, if you met unmet need for contraceptives — that is to say, no coercion whatsoever — what would it cost and what would the carbon impact be?

We found large-scale abatement at less than a dollar a ton. So I’m completely in favor of that. [Bold added.]

It’s the part in bold that is chilling. For starters, I point out that this is the one area of life—the decision on how many children to have—where Harvey apparently feels that the government should not be using coercion to alter people’s behavior; coercive interventions in every other arena—from building design to diet to urban transit—were only tempered by pragmatic considerations.

Beyond that, the reason Harvey had to stress that his approach would be voluntary is that historically, this wasn’t taken for granted. There is a long and sordid history of wiser-than-thou social planners forcibly restricting how many children others could have, all in the name of some higher social good.

Indeed, Vox’s founder—Ezra Klein—recently got himself into an awkward spot over the tweet the website originally used to promote his discussion with Bill Gates.

Vox quickly deleted the tweet after outrage and advertised the interview in a more sensitive manner, but the whole episode offered another peek into the interventionist mindset.

Conclusion

On these pages I have tirelessly pushed back against the small but vocal group of pundits and scholars who call on conservatives and libertarians to accept a carbon “tax swap deal” with leftist progressives. Beyond their failure to appreciate some of the technical nuances in the tax literature, these pleas overlook the simple fact that the mainstream thought leaders among the wonkish progressives have long since moved beyond the idea of a simple “price on carbon.” Every aspect of our lives, from our cars to our meals to our family sizes, affects global emissions—and therefore the interventionists want every tool at their disposal to control others.

The post Climate Interventionists Won’t Stop With a Carbon Tax appeared first on IER.

Using a New Correlation Model to Predict Future Rankings with Page Authority

Posted by rjonesx.

Correlation studies have been a staple of the search engine optimization community for many years. Each time a new study is released, a chorus of naysayers seems to come magically out of the woodwork to remind us of the one thing they remember from high school statistics — that "correlation doesn't mean causation." They are, of course, right in their protestations, and an unfortunate number of times it seems that those conducting the correlation studies have forgotten this simple aphorism.

We collect a search result. We then order the results based on different metrics like the number of links. Finally, we compare the orders of the original search results with those produced by the different metrics. The closer they are, the higher the correlation between the two.
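As a concrete illustration of that comparison, here is a minimal Python sketch that computes Spearman's rank correlation between a SERP's observed order and the order a metric would predict. The link counts and helper names are hypothetical, invented for this example:

```python
def rank(values):
    # Rank each value (1 = largest), assuming no ties for simplicity
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    ranks = [0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman(xs, ys):
    # Spearman's rho via the no-ties formula: 1 - 6*sum(d^2) / (n*(n^2 - 1))
    rx, ry = rank(xs), rank(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical five-result SERP: positions 1..5 and each URL's link count
serp_positions = [1, 2, 3, 4, 5]
link_counts = [120, 90, 95, 40, 10]  # note: position 2 has fewer links than position 3

# Position 1 is "best," so negate positions before ranking them,
# then correlate against the metric's implied order
rho = spearman([-p for p in serp_positions], link_counts)
print(rho)  # 0.9 — high, but not perfect, because one pair is out of order
```

The closer rho is to 1, the more closely the metric's order matches the search results, which is exactly the "correlation" these studies report.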

That being said, correlation studies are not altogether fruitless simply because they don't necessarily uncover causal relationships (ie: actual ranking factors). What correlation studies discover or confirm are correlates.

Correlates are simply measurements that share some relationship with the independent variable (in this case, the order of search results on a page). For example, we know that backlink counts are correlates of rank order. We also know that social shares are correlates of rank order.

Correlation studies also provide us with direction of the relationship. For example, ice cream sales are positive correlates with temperature and winter jackets are negative correlates with temperature — that is to say, when the temperature goes up, ice cream sales go up but winter jacket sales go down.

Finally, correlation studies can help us rule out proposed ranking factors. This is often overlooked, but it is an incredibly important part of correlation studies. Research that provides a negative result is often just as valuable as research that yields a positive result. We've been able to rule out many types of potential factors — like keyword density and the meta keywords tag — using correlation studies.

Unfortunately, the value of correlation studies tends to end there. In particular, we still want to know whether a correlate causes the rankings or is spurious. Spurious is just a fancy-sounding word for "false" or "fake." A good example of a spurious relationship would be that ice cream sales cause an increase in drownings. In reality, summer heat increases both ice cream sales and the number of people who go for a swim, and some of that swimming ends in drownings. So while ice cream sales are a correlate of drownings, the relationship is *spurious*: ice cream does not cause the drowning.

How might we go about teasing out the difference between causal and spurious relationships? One thing we know is that a cause happens before its effect, which means that a causal variable should predict a future change.

An alternative model for correlation studies

I propose an alternate methodology for conducting correlation studies. Rather than measure the correlation between a factor (like links or shares) and a SERP, we can measure the correlation between a factor and changes in the SERP over time.

The process works like this:

  1. Collect a SERP on day 1
  2. Collect the link counts for each of the URLs in that SERP
  3. Look for any URLs that are out of order with respect to links; for example, if position 2 has fewer links than position 3
  4. Record that anomaly
  5. Collect the same SERP in 14 days
  6. Record if the anomaly has been corrected (ie: position 3 now out-ranks position 2)
  7. Repeat across ten thousand keywords and test a variety of factors (backlinks, social shares, etc.)
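The pair-flipping steps above can be sketched in a few lines of Python. The SERPs, link counts, and function names here are invented for illustration; the actual study ran this across 10,000 keywords:

```python
def find_anomalies(serp, metric):
    # Steps 3-4: record adjacent pairs where the lower-ranked URL
    # has the higher metric value (e.g., more links)
    anomalies = []
    for i in range(len(serp) - 1):
        upper, lower = serp[i], serp[i + 1]
        if metric[lower] > metric[upper]:
            anomalies.append((upper, lower))
    return anomalies

def corrected(anomaly, later_serp):
    # Step 6: the anomaly is "corrected" if the formerly lower URL
    # now out-ranks the formerly upper one
    upper, lower = anomaly
    return later_serp.index(lower) < later_serp.index(upper)

# Hypothetical day-1 SERP (URLs in rank order) and their link counts
day1 = ["a.com", "b.com", "c.com", "d.com"]
links = {"a.com": 50, "b.com": 10, "c.com": 30, "d.com": 5}

pairs = find_anomalies(day1, links)           # [("b.com", "c.com")]
day15 = ["a.com", "c.com", "b.com", "d.com"]  # the SERP collected 14 days later
rate = sum(corrected(p, day15) for p in pairs) / len(pairs)
print(pairs, rate)
```

Aggregating that correction rate across many keywords, and comparing it to the rate for randomly chosen pairs, is what produces the percentages reported below.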

So what are the benefits of this methodology? By looking at change over time, we can see whether the ranking factor (correlate) is a leading or lagging feature. A lagging feature can automatically be ruled out as causal. A leading factor has the potential to be a causal factor.

We collect a search result. We record where the search result differs from the expected predictions of a particular variable (like links or social shares). We then collect the same search result 2 weeks later to see if the search engine has corrected the out-of-order results.

Following this methodology, we tested 3 different common correlates produced by ranking factors studies: Facebook shares, number of root linking domains, and Page Authority. The first step involved collecting 10,000 SERPs from randomly selected keywords in our Keyword Explorer corpus. We then recorded Facebook Shares, Root Linking Domains, and Page Authority for every URL. We noted every example where 2 adjacent URLs (like positions 2 and 3 or 7 and 8) were flipped with respect to the expected order predicted by the correlating factor. For example, if the #2 position had 30 shares while the #3 position had 50 shares, we noted that pair. Finally, 2 weeks later, we captured the same SERPs and identified the percent of times that Google rearranged the pair of URLs to match the expected correlation. We also randomly selected pairs of URLs to get a baseline percent likelihood that any 2 adjacent URLs would switch positions. Here were the results...

The outcome

It's important to note that it is incredibly rare to expect a leading factor to show up strongly in an analysis like this. While the experimental method is sound, it's not as simple as a factor predicting the future — it assumes that in some cases we will know about a factor before Google does. The underlying assumption is that in some cases we have seen a ranking factor (like an increase in links or social shares) before Googlebot has, and that in the 2-week period Google will catch up and correct the incorrectly ordered results. As you can imagine, this is a rare occurrence. However, with a sufficient number of observations, we should be able to see a statistically significant difference between lagging and leading results. Bear in mind that the methodology only detects cases where a factor is leading *and* Moz Link Explorer discovered the relevant change before Google did.

| Factor | Percent Corrected | P-Value | 95% Min | 95% Max |
|---|---|---|---|---|
| Control | 18.93% | 0 | | |
| Facebook Shares Controlled for PA | 18.31% | 0.00001 | -0.6849 | -0.5551 |
| Root Linking Domains | 20.58% | 0.00001 | 0.016268 | 0.016732 |
| Page Authority | 20.98% | 0.00001 | 0.026202 | 0.026398 |

Control:

In order to create a control, we randomly selected adjacent URL pairs in the first SERP collection and determined the likelihood that the second will outrank the first in the final SERP collection. Approximately 18.93% of the time the worse ranking URL would overtake the better ranking URL. By setting this control, we can determine if any of the potential correlates are leading factors - that is to say that they are potential causes of improved rankings.
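The post reports correction rates but not the underlying counts of anomalous pairs, so as a rough sketch of how one might test whether a factor beats this control, here is a two-proportion z-test in Python. The 50,000-pair sample size is an assumption for illustration, not a number from the study:

```python
from math import sqrt, erf

def two_proportion_z(p1, n1, p2, n2):
    # Two-proportion z-test: is the corrected rate for a factor
    # significantly different from the control rate?
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Page Authority's 20.98% vs. the 18.93% control, with assumed pair counts
z, p = two_proportion_z(0.2098, 50_000, 0.1893, 50_000)
print(z > 0, p < 0.001)  # a large positive z and tiny p-value
```

With samples this large, even a roughly two-percentage-point gap over the control is overwhelmingly significant, which is consistent with the tiny p-values in the table.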

Facebook Shares:

Facebook Shares performed the worst of the three tested variables. Facebook Shares actually performed worse than random (18.31% vs. 18.93%), meaning that randomly selected pairs were more likely to switch than pairs where the second URL had more shares than the first. This is not altogether surprising, as the general industry consensus is that social signals are lagging factors — that is to say, traffic from higher rankings drives higher social shares, not the other way around. Consequently, we would expect to see the ranking change first, before the increase in social shares.

RLDs

Raw root linking domain counts performed substantially better than shares at ~20.5%. As I indicated before, this type of analysis is incredibly subtle because it only detects when a factor is both leading and Moz Link Explorer discovered the relevant factor before Google. Nevertheless, this result was statistically significant with a P value <0.0001 and a 95% confidence interval that RLDs will predict future ranking changes around 1.5% greater than random.

Page Authority

By far, the highest-performing factor was Page Authority. At 21.5%, PA correctly predicted changes in SERPs 2.6% better than random. This is a strong indication of a leading factor, greatly outperforming social shares and outperforming the best predictive raw metric, root linking domains. This is not surprising: Page Authority is built to predict rankings, so we should expect it to outperform raw metrics in identifying when a shift in rankings might occur. Now, this is not to say that Google uses Moz Page Authority to rank sites, but rather that Moz Page Authority is a relatively good approximation of whatever link metrics Google is using to rank sites.

Concluding thoughts

There are so many different experimental designs we can use to help improve our research industry-wide, and this is just one of the methods that can help us tease out the differences between causal ranking factors and lagging correlates. Experimental design does not need to be elaborate and the statistics to determine reliability do not need to be cutting edge. While machine learning offers much promise for improving our predictive models, simple statistics can do the trick when we're establishing the fundamentals.

Now, get out there and do some great research!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Tuesday, November 27, 2018

SEI selected as the Leading Training Organization at Solar Power Mexico 2019

Solar Energy International (SEI) will be offering a total of 11 solar energy workshops during Solar Power Mexico 2019. The event will take place at the Citibanamex Center in Mexico City, Mexico, from March 19-21.

Solar Power Mexico is organized by three of the leading trade show companies in the world: Deutsche Messe, SNEC PV Power Expo, and Solar Power International (SPI), which have joined forces to help develop the Mexican solar energy market.

SEI has partnered with Solar Power International since 2012 to offer world-class solar training at its yearly events. SPI was designed to serve and advance the solar energy industry by bringing together people, products, and professional development opportunities, and it has become the perfect ally to help spread SEI’s mission to empower people, businesses, and communities worldwide with solar education.

“SEI’s Programa Hispano has been growing rapidly since 2013, reaching over 9,000 Spanish-speaking alumni. Solar Power Mexico is the perfect opportunity to bring SEI’s leading training curriculum closer to Mexico and Latin America,” SEI’s Latam Business Development Director said.

As the Mexican market is booming, supported by aggressive renewable energy policies and low solar prices, Solar Power Mexico will help visitors navigate the market opportunities, and SEI will help the industry grow strong and healthy with qualified solar education.

Using the SEI registration code, visitors receive 15% off the full conference ticket. Request your coupon at programahispano@solarenergy.org.

Registration will start December 2 for the three paid trainings:

  1. Point of Interconnection Tuesday March 19, 9am – 12pm
  2. PV Installation Best Practices Wednesday March 20, 9am – 12pm
  3. Ground Faults (GF) Troubleshooting and GF Protection Retrofit Thursday March 21, 9am – 12pm

Below are the free on-the-floor workshops that will be offered (the final list, dates, and times will be confirmed closer to the event):

  1. PV System Types and Components
  2. Site Analysis and the Solar Resource
  3. System Sizing – How Many Watts?
  4. Verifying PV System Performance
  5. Array-to-Inverter Configuration
  6. IV Curves Demystified
  7. PV Systems Maintenance Tools and Techniques
  8. OCPDs and Disconnects: which, where, and why

Learn more about SEI training opportunities at https://www.solarenergy.org/training-schedule/

The post SEI selected as the Leading Training Organization at Solar Power Mexico 2019 appeared first on Solar Training - Solar Installer Training - Solar PV Installation Training - Solar Energy Courses - Renewable Energy Education - NABCEP - Solar Energy International (SEI).

Monday, November 26, 2018

Oil and Gas Production on Federal Land Falls Far Below Historic Norms

What to Do with Your Old Blog Posts

Posted by -LaurelTaylor-

Around 2005 or so, corporate blogs became the thing to do. Big players in the business world touted that such platforms could “drive swarms of traffic to your main website, generate more product sales” and even “create an additional stream of advertising income” (Entrepreneur Magazine circa 2006). With promises like that, what marketer or exec wouldn’t jump on the blog bandwagon?

Unfortunately, initial forays into branded content did not always dwell on minor issues like “quality” or “entertainment,” instead focusing on sheer bulk and, of course, all the keywords. Now we have learned better, and many corporate blogs are less prolific and offer more value. But on some sites, behind many, many “next page” clicks, this old content can still be found lurking in the background.

This active company blog still features over 900 pages of posts dating back to 2006

This situation leaves current SEOs and content teams in a bit of a pickle. What should you do if your site has excessive quantities of old blog posts? Are they okay just sitting there? Do you need to do something about them?

Why bother addressing old blog posts?

On many sites, the sheer number of pages is the biggest reason to consider improving or scaling back old content. If past content managers chose quantity over quality, heaps of old posts eventually get buried, every evergreen topic has already been written about, and it becomes increasingly hard to keep inventory of your content.

From a technical perspective, depending on the scale of the old content you're dealing with, pruning back the number of pages that you put forward can help increase your crawl efficiency. If Google has to crawl 1,000 URLs to find 100 good pieces of content, they are going to take note and not spend as much time combing through your content in the future.

From a marketing perspective, your content represents your brand, and improving the set of content that you put forward helps shape the way customers see you as an authority in your space. Optimizing and curating your existing content can give your collection a clearer topical focus, make it more easily discoverable, and ensure that it provides value for users and the business.

Zooming out for a second to look at this from a higher level: If you've already decided that it's worth investing in blog content for your company, it’s worth getting the most from your existing resources and ensuring that they aren’t holding you back.

Decide what to keep: Inventory and assessment

Inventory

The first thing to do before assessing your blog posts is to make sure you know what you have. A full list of URLs and corresponding metadata is incredibly helpful in both evaluating and documenting.

Depending on the content management system that you use, obtaining this list can be as simple as exporting a database field. Alternatively, URLs can be gleaned from a combination of Google Analytics data, Webmaster Tools, and a comprehensive crawl with a tool such as Screaming Frog. This post gives a good outline of how to get the data you need from these sources.

Regardless of whether you have a list of URLs yet, it’s also good to do a full crawl of your blog to see what the linking structure looks like at this point, and how that may differ from what you see in the CMS.
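As a rough sketch of the merge step, here is one way to combine URL lists from multiple exports into a single deduplicated inventory. The column names and sample rows below are hypothetical stand-ins for real analytics, Search Console, and crawler exports, which vary by tool:

```python
import csv
from io import StringIO

def urls_from_report(report, column):
    # Pull and normalize the URL column from an exported CSV report
    # (trailing slashes stripped so /post/ and /post dedupe together)
    return {row[column].strip().rstrip("/")
            for row in csv.DictReader(StringIO(report))}

# Stand-ins for real export files; in practice you would open() each file
analytics = "Landing Page,Sessions\n/blog/post-a/,40\n/blog/post-b,12\n"
crawl = "Address,Status\n/blog/post-b/,200\n/blog/post-c/,200\n"

inventory = (urls_from_report(analytics, "Landing Page")
             | urls_from_report(crawl, "Address"))
print(sorted(inventory))  # ['/blog/post-a', '/blog/post-b', '/blog/post-c']
```

Comparing this merged set against what the CMS reports is also a quick way to spot orphaned posts that analytics sees but the crawl doesn't reach.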

Assessment

Once you know what you have, it’s time to assess the content and decide if it's worth holding on to. When I do this, I like to ask these 5 questions:

1. Is it beneficial for users?

Content that's beneficial for users is helpful, informative, or entertaining. It answers questions, helps them solve problems, or keeps them interested. This could be anything from a walkthrough for troubleshooting to a collection of inspirational photos.

Screenshots of old real estate articles: one is about how to select a location, and the other is about how to deal with the undead

These 5-year-old blog posts from different real estate blogs illustrate past content that still offers value to current users, and past content that may be less beneficial for a user

2. Is it beneficial for us?

Content that is beneficial to us is earning organic rankings, traffic, or backlinks, or is providing business value by helping drive conversions. Additionally, content that can help establish branding or effectively build topical authority is great to have on any site.

3. Is it good?

While this may be a bit of a subjective question to ask about any content, it’s obvious when you read content that isn’t good. This is about fundamentals: whether the content makes sense, is free of grammatical errors, is organized well, and actually has a point to it.

4. Is it relevant?

If content isn’t at least tangentially relevant to your site, industry, or customers, you should have a really good reason to keep it. If it doesn’t meet any of the former qualifications already, it probably isn’t worth holding on to.

These musings from a blog of a major hotel brand may not be the most relevant to their industry

5. Is it causing any issues?

Problematic content may include duplicate content, duplicate targeting, plagiarized text, content that is a legal liability, or any other number of issues that you probably don’t want to deal with on your site. I find that the assessment phase is a particularly good opportunity to identify posts that target the same topic, so that you can consolidate them.

Using these criteria, you can divide your old blog posts into buckets of “keep” and “don’t keep.” The “don’t keep” can be 301 redirected to either the most relevant related post or the blog homepage. Then it’s time to further address the others.
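One lightweight way to handle those redirects at scale is to generate the server rules from your audit spreadsheet. This Python sketch emits Apache mod_alias rules; the URLs are invented examples, and your site may instead use nginx rules or CMS-level redirects:

```python
# Audit results: URLs marked "don't keep," each mapped to its best
# replacement (the most relevant related post, or the blog homepage)
redirects = {
    "/blog/2006/keyword-stuffed-post": "/blog/evergreen-guide",
    "/blog/2007/off-topic-musing": "/blog/",
}

def htaccess_rules(mapping):
    # Emit one Apache mod_alias 301 rule per retired post
    return [f"Redirect 301 {old} {new}" for old, new in mapping.items()]

for rule in htaccess_rules(redirects):
    print(rule)
```

Keeping the mapping in one place also gives you a record of what was pruned, which is handy if traffic to a redirected URL suggests the old post deserved to stay.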

What to do with the posts you keep

So now you have a pile of “keep” posts to sort out! All the posts that made it this far have already been established to have value of some kind. Now we want to make the most of that value by improving, expanding, updating, and promoting the content.

Improve

When setting out to improve an old post that has good bones, it can be good to start with improvements on targeting and general writing and grammar. You want to make sure that your blog post has a clear point, is targeting a specific topic and terms, and is doing so in proper English (or whatever language your blog may be in).

Once the content itself is in good shape, make sure to add any technical improvements that the piece may need, such as relevant interlinking, alt text, or schema markup.

Then it’s time to make sure it’s pretty. Visual improvements such as adding line breaks, pull quotes, and imagery impact user experience and can keep people on the page longer.

Expand or update

Not all old blog posts are necessarily in poor shape, which can offer a great opportunity. Another way to get more value out of them is to repurpose or update the information that they contain to make old content fresh again. Data says that this is well worth the effort, with business bloggers who update older posts being 74% more likely to report strong results.

A few ways to expand or update a post are to explore a different take on the initial thesis, add newer data, or integrate more recent developments or changed opinions. Alternatively, you could expand on a piece of content by reinterpreting it in another medium, such as new imagery, engaging video, or even as audio content.

Promote

If you’ve invested resources in content creation and optimization, it only makes sense to try to get as many eyes as possible on the finished product. This can be done in a few different ways, such as sharing and re-sharing on branded social channels, resurfacing posts to the front page of your blog, or even a bit of external promotion through outreach.

The follow-up

Once your blog has been pruned and you’re working on getting the most value out of your existing content, an important final step is to keep tabs on the effect these changes are having.

The most significant measure of success is organic traffic; even if your blog is designed for lead generation or other specific goals, the number of eyes on the page should have a strong correlation to the content’s success by other measures as well. For the best representation of traffic totals, I monitor organic sessions by landing page in Google Analytics.

I also like to keep an eye on organic rankings, as you can get an early glimpse of whether a piece is gaining traction around a particular topic before it's successful enough to earn organic traffic with those terms.

Remember that regardless of what changes you’ve made, it will usually take Google a few months to sort out the relevance and rankings of the updated content. So be patient, monitor, and keep expanding, updating, and promoting!


#11: Colin Grabow, of the Cato Institute, on the Jones Act and energy

Alex sits down with Colin Grabow, a policy analyst at the Cato Institute’s Herbert A. Stiefel Center for Trade Policy Studies, to discuss problems with the Jones Act and the law’s impact on American energy.

Links:

The post #11: Colin Grabow, of the Cato Institute, on the Jones Act and energy appeared first on IER.

Friday, November 23, 2018

What SEOs Can Learn from AdWords - Whiteboard Friday

Posted by DiTomaso

Organic and paid search aren't always at odds; there are times when there's benefit in knowing how they work together. Taking the time to know the ins and outs of AdWords can improve your rankings and on-site experience. In today's edition of Whiteboard Friday, our fabulous guest host Dana DiTomaso explains how SEOs can improve their game by taking cues from paid search.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, my name is Dana DiTomaso. I'm President and Partner at Kick Point, and one of the things that we do at Kick Point is we do both SEO and paid. One of the things that's really useful is when SEO and paid work together. But what's even better is when SEOs can learn from paid to make their stuff better.

One of the things that is great about AdWords or Google Ads — whenever you're watching this, it may be called one thing or the other — is that you can learn a lot from what has a high click-through rate, what performs well in paid, and paid is way faster than waiting for Google to catch up to the awesome title tags you've written or the new link building that you've done to see how it's going to perform. So I'm going to talk about four things today that you can learn from AdWords, and really these are easy things to get into in AdWords.

Don't be intimidated by the interface. You can probably just get in there and look at it yourself, or talk to your AdWords person. I bet they'd be really excited that you know what a callout extension is. So we're going to start up here.

1. Negative keywords

The first thing is negative keywords. Negative keywords, obviously really important. You don't want to show up for things that you shouldn't be showing up for.

Often when we need to take over an AdWords account, there aren't a lot of negative keywords. But if it's a well-managed account, there are probably lots of negatives that have been added there over time. What you want to look at is if there's poor word association. So in your industry, cheap, free, jobs, and then things like reviews and coupons, if these are really popular search phrases, then maybe this is something you need to create content for or you need to think about how your service is presented in your industry.

Then what you can do to change that is to see if there's something different that you can do to present this kind of information. What are the kinds of things your business doesn't want? Are you definitely not saying these things in the content of your website? Or is there a way that you can present the opposite opinion to what people might be searching for, for example? So think about that from a content perspective.

2. Title tags and meta descriptions

The next things are title tags and meta descriptions. Title tags and meta descriptions should never be a write-it-once-and-forget-it kind of thing. If you're an on-it sort of SEO, you probably go in every once in a while and tweak them. But the problem is that sometimes there are just some that aren't performing. So go into Google Search Console, find the title tags that have a low click-through rate and high rankings, and then think about what you can do to test out new ones.
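That Search Console step can be sketched in a few lines of Python. This is a minimal sketch, assuming a performance-report export with `page`, `position`, and `ctr` columns; the column names and thresholds are assumptions, not a fixed Search Console format.

```python
import csv

def low_ctr_high_rank(rows, max_position=5.0, max_ctr=0.02):
    """Return pages that rank well but attract few clicks:
    the best candidates for title tag / meta description tests."""
    return [row for row in rows
            if float(row["position"]) <= max_position
            and float(row["ctr"]) <= max_ctr]

# Example: rows loaded from a Search Console performance export.
# with open("gsc_export.csv") as f:
#     candidates = low_ctr_high_rank(csv.DictReader(f))
```

The thresholds (top-5 position, under 2% CTR) are starting points to tune for your own site, not industry standards.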

Then run an AdWords campaign and test out those title tags in the title of the ad. Test out new ad copy — that would be your meta descriptions — and see what actually brings a higher click-through rate. Then whichever one does, ta-da, that's your new title tags and your meta descriptions. Then add those in and then watch your click-through rate increase or decrease.

Make sure to watch those rankings, because obviously title tag changes can have an impact on your rankings. But if it's something that's keyword rich, that's great. I personally like playing with meta descriptions, because I feel like meta descriptions have a bigger impact on click-through rate than title tags do, and it's really important to think about how we're making this unique so people want to click on us. The very best meta description I've ever seen in my life was for an SEO company, and they were ranking number one.

They were obviously very confident in this ranking, because it said, "The people above me paid. The people below me aren't as good as me. Hire me for your SEO." I'm like, "That's a good meta description." So what can you do to bring in especially that brand voice and your personality into those titles, into those meta descriptions and test it out with ads first and see what's going to resonate with your audience. Don't just think about click-through rate for these ads.

Make sure that you're thinking about conversion rate. If you have a really long sales cycle, make sure those leads that you're getting are good, because what you don't want to have happen is have an ad that people click on like crazy, they convert like crazy, and then the customers are just a total trash fire. You really want to make sure you're driving valuable business through this kind of testing. So this might be a bit more of a longer-term piece for you.

3. Word combinations

The third thing you can look at is word combinations.

So if you're not super familiar with AdWords, you may not be familiar with the idea of broad match modifier. In AdWords you can target broad match keywords, "recipes," for example, and then anything related to the word "recipe" will show up. But you could also put a phrase in quotes. You could say "chili recipes." Then if someone searches, "I would like a chili recipe," your ad would come up.

If they search "chili crockpot recipes," it would not come up, because the phrase is interrupted. Now if you had +chili +recipes, then any query containing both "chili" and "recipes," in any order, would come up, which can be really useful. If you have a lot of different keyword combinations and you don't have time for that, you can use broad match modifier to capture a lot of them. But then you have to have a good negative keyword list, speaking as an AdWords person for a second.
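The difference between phrase match and broad match modifier can be illustrated with a toy sketch. This ignores close variants, plurals, and everything else Google's matcher actually does; it only shows the ordering rule that separates the two match types.

```python
def phrase_match(keyword: str, query: str) -> bool:
    """Toy phrase match: the keyword's words must appear in the
    query contiguously and in the same order."""
    kw, q = keyword.lower().split(), query.lower().split()
    return any(q[i:i + len(kw)] == kw
               for i in range(len(q) - len(kw) + 1))

def broad_match_modifier(terms, query: str) -> bool:
    """Toy broad match modifier: every +term must appear somewhere
    in the query, in any order."""
    words = set(query.lower().split())
    return all(t.lstrip("+").lower() in words for t in terms)

# "chili crockpot recipes" fails phrase match (the phrase is broken up)
# but passes broad match modifier (both words are present).
```

The interesting case is exactly the one from the transcript: a word wedged into the middle of the phrase defeats phrase match but not BMM.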

Now one of the things that can really come out of broad match modifier is a lot of great new content ideas. If you look at the keywords that earned impressions or clicks as a result of these broad match modifier keywords, you can find the strangest phrasing that people come up with. There are lots of crazy things that people type into Google. We all know this, and it's especially true of voice search, where it's often obvious the query was spoken.

One of the fun things to do is look and see if anybody has "okay Google" and then the search phrase, because they said "okay Google" twice and then Google searched "okay Google" plus the phrase. That's always fun to pick up. But you can also pick up lots of different content ideas, and this can help you modify poorly performing content for example. Maybe you're just not saying the thing in the way in which your audience is saying it.

AdWords gives you real data on what your customers are actually thinking, saying, and searching. So why not use that kind of data? So definitely check out the broad match modifier stuff and see what you can do to make it better.

4. Extensions

Then the fourth thing is extensions. So extensions are those little snippets that can show up under an ad.

You should always have all of the extensions loaded in, and then maybe Google picks some, maybe they won't, but at least they're there as an option. One thing that's great is sitelink extensions: those are the little links under the ad, like "free trial," that people click on to find out more information, see a menu, or whatever it might be. (Callout extensions are the similar short snippets beneath an ad that aren't clickable.) Testing language in those extensions can help you with your call-to-action buttons.

Especially if you're thinking about things like people want to download a white paper, well, what's the best way to phrase that? What do you want to say for things like a submit button for your newsletter or for a contact form? Those little, tiny pieces, that are called micro-copy, what can you do by taking your highest performing callout extensions and then using those as your call-to-action copy on your website?

This is really going to improve your lead click-through rate. You're going to improve the way people feel about you, and you're going to have that really nice consistency between the language in your advertising and the language on your website. One thing you really want to avoid as an SEO is the silo where this is SEO and this is AdWords, the two of you aren't talking to each other at all, and the copy feels completely disjointed between the paid side and the organic side.

It should all be working together. So by taking the time to understand AdWords a little bit, getting to know it, getting to know what you can do with it, and then using some of that information in your SEO work, you can improve your on-site experience as well as rankings, and your paid person is probably going to appreciate that you talked to them for a little bit.

Thanks.

Video transcription by Speechpad.com



Wednesday, November 21, 2018

Massive Solar Farm Proposed in Virginia

A 500-megawatt solar power generating facility with about 1.8 million solar panels, erected on land stretching 10 square miles and covering over 6,000 acres—about half the size of Manhattan—is being proposed in Spotsylvania, Virginia. If constructed, it will be the largest solar project east of the Rocky Mountains and one of the largest in the world. The Virginia State Corporation Commission has given conditional approval for the facility. The builder, Sustainable Power Group, plans to have the first phase of the solar-generating plant operating by next June. But the proposal must meet several requirements beforehand, one of which is to obtain a special use permit from the county.

The Spotsylvania Board of Supervisors has the final say and can require the company to meet certain stipulations, such as lessening the environmental impact. Because the solar plant requires large amounts of water to clean the panels, concerned citizens worry that the aquifer that supplies water to homes and farms could be damaged or depleted. Run-off into wetlands and nearby streams is another concern. Also of concern is that the solar panels will contain approximately 100,000 pounds of either cadmium—a carcinogen that can cause health problems—or cadmium telluride, a compound the builder says is insoluble and chemically distinct from elemental cadmium. The worry is that these chemicals could leach into the soil if the panels break.

The project has a projected lifetime of 35 years, which is about half of the time that a typical fossil fuel or nuclear plant could operate. Because solar is an intermittent technology, which only provides power when the sun is shining, it will need back-up power, typically from gas-fired generators, to provide electricity around the clock. It also has an estimated $31 million decommissioning cost, but recycling could reduce some of that cost.

The Project

The 500-megawatt solar facility would be built in four phases, with the panels taking up about 3,500 acres of the 6,000-plus-acre site owned by the builder. The sizes of the four components, which total 500 megawatts, are as follows:

  • Pleinmont Solar 1 – 75 megawatts
  • Pleinmont Solar II – 240 megawatts
  • Highlander Solar Energy Station 1 – 165 megawatts
  • Richmond Spider Solar – 20 megawatts

The company plans to use existing and replanted trees, berms, and other buffers to keep the facility out of sight of neighboring properties. The panels are to be built to withstand 130 mile-per-hour winds. When completed, the solar plant will have 10 to 15 full-time employees.

Virginia’s largest solar farm to date is 100 megawatts, so if this farm is built, it will be five times larger. Other U.S. solar projects of this size have been built in unpopulated desert areas in the Southwest, far from residential areas. This facility would be near some 7,000 residents.

Microsoft plans to purchase 315 megawatts, with the rest available for other entities via the PJM interconnection. More recent agreements are with the University of Richmond and a group that includes Apple. According to Microsoft, it would be the largest corporate purchase of solar power in the United States. Amazon, Microsoft, Google and Facebook—some of the nation’s biggest technology companies—are driving the growth of renewable power. These technology companies are moving into Virginia and want renewable energy to power their facilities.

In giving its conditional approval, the commissioners stated that the company bears the risk and must abide by regulations, which include paying for system upgrades to avoid potential impacts to rate-payers and adhering to environmental oversight.

But other citizens’ concerns include the solar farm’s effect on ratepayers through increased electricity rates, potential bankruptcies of the limited liability corporations established to build and operate it once the current federal and state subsidies expire, potential loss in property values for homeowners, and the cost of cleaning up the site. Because there is no fuel cost associated with solar power, some people believe it is cost-free, but that is not the case: ratepayers must pay for the solar panels, inverters, and associated industrial facilities, along with their installation and ongoing maintenance.

The 30-percent federal investment tax credit for commercial solar facilities is being phased down after 2019 to a permanent 10-percent tax credit in 2022 so there is a big incentive for the builder to start construction in 2019. According to the Energy Information Administration, a 500-megawatt solar photovoltaic plant could cost around $1 billion, and the investment tax credit would provide a $300 million taxpayer subsidy to the owner if construction is started before the subsidies are phased down.
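The subsidy figure works as a back-of-envelope calculation. The roughly $2,000-per-kilowatt installed cost below is an assumed round number, chosen only because it reproduces the article's approximately $1 billion estimate, not a quoted EIA value:

```python
# Back-of-envelope check of the investment tax credit figures.
capacity_kw = 500 * 1_000                 # 500 MW expressed in kW
cost_per_kw = 2_000                       # assumed installed cost, $/kW
capital_cost = capacity_kw * cost_per_kw  # roughly $1.0 billion

itc_before_phasedown = 0.30 * capital_cost  # 30% credit if built in time
itc_after_2022 = 0.10 * capital_cost        # permanent 10% credit level

print(f"capital cost: ${capital_cost:,.0f}")
print(f"30% ITC:      ${itc_before_phasedown:,.0f}")
print(f"10% ITC:      ${itc_after_2022:,.0f}")
```

At a $1 billion capital cost, the difference between starting construction before and after the phase-down is roughly $200 million, which is the incentive the article describes.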

Virginia’s new energy law that took effect July 1 stipulates that adding 5,000 megawatts of wind and solar to the state’s generation portfolio by 2028 is in the public interest.

Conclusion

Virginia may soon be home to the largest solar power plant east of the Rocky Mountains. While concerned citizens are fighting the proposed plant over potential environmental and cost issues, large technology firms have indicated their interest in purchasing the power. Microsoft has already agreed to purchase over 300 megawatts.

The land area required for the proposed 500-megawatt plant is enormous.

The post Massive Solar Farm Proposed in Virginia appeared first on IER.

Tuesday, November 20, 2018

Announcing the 2018 Local Search Ranking Factors Survey

Posted by Whitespark

It has been another year (and a half) since the last publication of the Local Search Ranking Factors, and local search continues to see significant growth and change. The biggest shift this year is happening in Google My Business signals, but we’re also seeing an increase in the importance of reviews and continued decreases in the importance of citations.

Check out the full survey!

Huge growth in Google My Business

Google has been adding features to GMB at an accelerated rate. They see the revenue potential in local, and now that they have properly divorced Google My Business from Google+, they have a clear runway to develop (and monetize) local. Here are just some of the major GMB features that have been released since the publication of the 2017 Local Search Ranking Factors:

  • Google Posts available to all GMB users
  • Google Q&A
  • Website builder
  • Services
  • Messaging
  • Videos
  • Videos in Google Posts

These features are creating shifts in the importance of factors that are driving local search today. This year has seen the most explosive growth in GMB-specific factors in the history of the survey. GMB signals now make up 25% of the local pack/finder pie chart.

GMB-specific features like Google Posts, Google Q&A, and image/video uploads are frequently mentioned as ranking drivers in the commentary. Many businesses are not yet investing in these aspects of local search, so these features are currently a competitive advantage. You should get on these before everyone is doing it.

Here’s your to-do list:

  1. Start using Google Posts NOW. At least once per week, but preferably a few times per week. Are you already pushing out posts to Facebook, Instagram, or Twitter? Just use the same, lightly edited content on Google Posts. Also, use calls to action in your posts to drive direct conversions.
  2. Seed the Google Q&A with your own questions and answers. Feed that hyper-relevant, semantically rich content to Google. Relevance FTW.
  3. Regularly upload photos and videos. (Did you know that you can upload videos to GMB now?)
  4. Make sure your profile is 100% complete. If there is an empty field in GMB, fill it. If you haven’t logged into your GMB account in a while, you might be surprised to see all the new data points you can add to your listing.

Why spend your time on these activities? Besides the potential relevance boost you’ll get from the additional content, you’re also sending valuable engagement signals. Regularly logging into your listing and providing content shows Google that you’re an active and engaged business owner that cares about your listing, and the local search experts are speculating that this is also providing ranking benefits. There’s another engagement angle here too: user engagement. Provide more content for users to engage with and they’ll spend more time on your listing clicking around and sending those helpful behavioral signals to Google.

Reviews on the rise

Review signals have also seen continued growth in importance over last year.

Review signals were 10.8% in 2015, so over the past three years we’ve seen a 43% increase in the importance of review signals.

Many practitioners talked about the benefits they’re seeing from investing in reviews. I found David Mihm’s comments on reviews particularly noteworthy. When asked “What are some strategies/tactics that are working particularly well for you at the moment?”, he responded with:

“In the search results I look at regularly, I continue to see reviews playing a larger and larger role. Much as citations became table stakes over the last couple of years, reviews now appear to be on their way to becoming table stakes as well. In mid-to-large metro areas, even industries where ranking in the 3-pack used to be possible with a handful of reviews or no reviews, now feature businesses with dozens of reviews at a minimum — and many within the last few months, which speaks to the importance of a steady stream of feedback.
Whether the increased ranking is due to review volume, keywords in review content, or the increased clickthrough rate those gold stars yield, I doubt we'll ever know for sure. I just know that for most businesses, it's the area of local SEO I'd invest the most time and effort into getting right -- and done well, should also have a much more important flywheel effect of helping you build a better business, as the guys at GatherUp have been talking about for years.”

Getting keywords in your reviews is a factor that has also risen. In the 2017 survey, this factor ranked #26 in the local pack/finder factors. It is now coming in at #14.

I know this is the Local Search Ranking Factors, and we’re talking about what drives rankings, but you know what’s better than rankings? Conversions. Yes, reviews will boost your rankings, but reviews are so much more valuable than that because a ton of positive reviews will get people to pick up the phone and call your business, and really, that’s the goal. So, if you’re not making the most of reviews yet, get on it!

A quick to-do list for reviews would be:

  1. Work on getting more Google reviews (obviously). Ask every customer.
  2. Encourage keywords in the reviews by asking customers to mention the specific service or product in their review.
  3. Respond to every review. (Did you know that Google now notifies the reviewer when the owner responds?)
  4. Don’t only focus on reviews. Actively solicit direct customer feedback as well so you can mark it up in schema/JSON and get stars in the search results.
  5. Once you’re killing it on Google, diversify and get reviews on the other important review sites for your industry (but also continue to send customers to Google).
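Item 4's schema markup can be sketched as JSON-LD generated from first-party review data. The business name and numbers below are made up for illustration, and note that Google's guidelines expect you to mark up reviews collected on your own site, not reviews copied from Google:

```python
import json

# Hypothetical first-party review data (values are made up).
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Plumbing Co.",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "reviewCount": "132",
    },
}

# Embed the structured data as a JSON-LD script tag in the page template.
jsonld_tag = ('<script type="application/ld+json">'
              + json.dumps(markup, indent=2)
              + "</script>")
```

`LocalBusiness` and `AggregateRating` are real schema.org types; valid markup like this is what makes stars eligible to appear in organic results.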

For a more in-depth discussion of review strategy, please see the blog post version of my 2018 MozCon presentation, “How to Convert Local Searchers Into Customers with Reviews.”

Meh, links

To quote Gyi Tsakalakis: “Meh, links.” All other things being equal, links continue to be a key differentiator in local search. It makes sense. Once you have a complete and active GMB listing, your citations squared away, a steady stream of reviews coming in, and solid content on your website, the next step is links. The trouble is, links are hard, but that’s also what makes them such a valuable competitive differentiator. They ARE hard, so when you get quality links they can really help to move the needle.

When asked, “What are some strategies/tactics that are working particularly well for you at the moment?” Gyi responded with:

“Meh, links. In other words, topically and locally relevant links continue to work particularly well. Not only do these links tend to improve visibility in both local packs and traditional results, they're also particularly effective for improving targeted traffic, leads, and customers. Find ways to earn links on the sites your local audience uses. These typically include local news, community, and blog sites.”

Citations?

Let’s make something clear: citations are still very valuable and very important.

Ok, with that out of the way, let’s look at what’s been happening with citations over the past few surveys.

I think this decline is related to two things:

  1. As local search gets more complex, additional signals are being factored into the algorithm and this dilutes the value that citations used to provide. There are just more things to optimize for in local search these days.
  2. As local search gains more widespread adoption, more businesses are getting their citations consistent and built out, and so citations become less of a competitive difference maker than they were in the past.

Yes, we are seeing citations dropping in significance year after year, but that doesn’t mean you don’t need them. Quite the opposite, really. If you don’t get them, you’re going to have a bad time. Google looks to your citations to help understand how prominent your business is. A well established and popular business should be present on the most important business directories in their industry, and if it’s not, that can be a signal of lower prominence to Google.

The good news is that citations are one of the easiest items to check off your local search to do list. There are dozens of services and tools out there to help you get your business listed and accurate for only a few hundred dollars. Here’s what I recommend:

  1. Ensure your business is listed, accurate, complete, and duplicate-free on the top 10-15 most important sites in your industry (including the primary data aggregators and industry/city-specific sites).
  2. Build citations (but don’t worry about duplicates and inconsistencies) on the next top 30 to 50 sites.

Google has gotten much smarter about citation consistency than it used to be. People worry about it much more than they need to. An incorrect or duplicate listing on an insignificant business listing site is not going to negatively impact your ability to rank.

You could keep building more citations beyond the top 50, and it won’t hurt, but the law of diminishing returns applies here. As you get deeper into the available pool of citation sites, the quality of these sites decreases, and the impact they have on your local search decreases with it. That said, I have heard from dozens of agencies that swear that “maxing out” all available citation opportunities seems to have a positive impact on their local search, so your mileage may vary. ¯\_(ツ)_/¯

The future of local search

One of my favorite questions in the commentary section is “Comments about where you see Google is headed in the future?” The answers here, from some of the best minds in local search, are illuminating. The three common themes I pulled from the responses are:

  1. Google will continue providing features and content so that they can answer most queries right in the search results and send fewer clicks to websites. Expect traffic from local results to your website to decline, but don’t fret. You want those calls, messages, and driving directions more than you want website traffic anyway.
  2. Google will increase their focus on behavioral signals for rankings. What better way is there to assess the real-world popularity of a business than by using signals sent by people in the real world? We can speculate that Google is using some of the following signals right now, and will continue to emphasize and evolve behavioral ranking methods:
    1. Searches for your brand name.
    2. Clicks to call your business.
    3. Requests for driving directions.
    4. Engagement with your listing.
    5. Engagement with your website.
    6. Credit card transactions.
    7. Actual human foot traffic in brick-and-mortar businesses.
  3. Google will continue monetizing local in new ways. Local Services Ads are rolling out to more and more industries and cities, ads are appearing right in local panels, and you can book appointments right from local packs. Google isn’t investing so many resources into local out of the goodness of their hearts. They want to build the ultimate resource for instant information on local services and products, and they want to use their dominant market position to take a cut of the sales.

And that does it for my summary of the survey results. A huge thank you to each of the brilliant contributors for giving their time and sharing their knowledge. Our understanding of local search is what it is because of your excellent work and contributions to our industry.

There is much more to read and learn in the actual resource itself, especially in all the comments from the contributors, so go dig into it:

Click here for the full results!

