Friday, August 31, 2018

Trump Undoes Obama’s Corporate Average Fuel Economy Standards

In 2012, the Obama administration issued rules ordering carmakers to improve the average miles per gallon of their car fleets (the Corporate Average Fuel Economy mandate, known as CAFE) from about 27 miles per gallon in 2012 to 54.5 miles per gallon by 2025. The new standards were meant to reduce fuel consumption and carbon dioxide emissions, helping achieve Obama's pledge to cut U.S. carbon dioxide emissions under the Paris Climate Agreement. Automakers agreed to the rule because of the Obama administration bailout during the 2008-2009 financial crisis, hoping to later convince politicians that the new standard was not feasible. Automakers do not have the technology to meet the standards other than by forcing low- or no-profit electric vehicles and other small cars upon consumers, who prefer SUVs, pick-up trucks, and crossover vehicles that are far more profitable. The Obama standards assumed that two thirds of vehicles sold would be cars and a third light trucks, but lately the ratio has been just the reverse.

To meet Obama's CAFE standards, auto manufacturers would have to produce vehicles that are at least 30 percent electric over the next seven years—far more electric vehicles than consumers are likely to want. Electric vehicles make up only about 1.5 percent of new vehicle sales. Further, nearly half of consumers who purchase an electric car do not buy another because of low vehicle range and long recharging times. Thus, the Obama standard would make automakers build and sell electric vehicles at a loss, and sell other vehicles at higher prices to make up the difference. Since electric vehicles are mostly purchased by more well-to-do buyers, this amounts to a regressive tax on lower-end auto buyers, who are forced to pay more for the vehicles that fit their needs.

Realizing the automakers' predicament and the public's desire to select their own vehicles, the Trump administration is proposing to freeze the CAFE standards through 2026 at about 37 miles per gallon, the level the Obama plan requires by 2020. The proposal is estimated to save consumers about $2,300 per new car and to save about 1,000 lives per year, because consumers will be able to afford safer cars. A Heritage Foundation study found that Obama's fuel regulations had already cost consumers at least $3,800 per car for the 2016 model year.

Higher production costs for cars meeting the more stringent CAFE emissions standards pushed the price of new cars to above $30,000—beyond the range at which many American households can afford to purchase new vehicles. According to the Trump administration, the best way to reduce emissions is to help put newer cars on the road by reducing prices. Changing the CAFE standards will make cars safer by discouraging the production of very light vehicles, which may not be as resilient to crash impacts, and by helping to put new cars on the road. The average American car is about 12 years old and does not have the features of many of the newer vehicles that have increased vehicle safety.

The Trump proposal also revokes California's waiver to set its own fuel economy standards because Congress intended for the federal government—not any single state—to set the standards for the entire nation. Carmakers do not want to make two lines of cars, one for California and another for the other states. When Congress created the CAFE program in 1975, it forbade states from adopting their own rules because doing so would increase manufacturers' compliance costs. Despite that, the Obama administration decided to continue an exemption that California originally received because of unique weather and geographic conditions around Los Angeles that made it especially susceptible to smog. Since smog-forming pollutants and greenhouse gases are very different emissions, there is no reason for California to continue to receive a waiver from the single national standard. The argument is that carbon dioxide is a "pollutant" affecting global warming, not local air conditions. In addition, if California is able to have its own unique standards, consumers throughout the United States would end up paying more for the vehicles they want in order to fulfill California's desire to make a statement about electric vehicles. California would, in effect, be imposing the costs of its political decisions upon citizens in other states, many of whom cannot afford and do not want an electric vehicle.

Further, CAFE was originally created in response to an oil shortage and high prices that no longer exist, thanks to the shale oil renaissance and the technology to reap its benefits through hydraulic fracturing and directional drilling. The Energy Information Administration estimated that in 2019 the United States will be the world's leading producer of oil, topping its 1970 peak of 9.6 million barrels per day by 2.1 million barrels.

Electric Vehicles Do Not Reduce Carbon Dioxide Emissions

Most electricity in the United States is generated by burning fossil fuels, mainly coal and natural gas, despite the Obama administration's push toward wind and solar. Due to losses in generation and transmission, it takes the combustion of three British thermal units (BTUs) of coal, natural gas, or other fuel to deliver one BTU of electricity. Thus, the savings in carbon dioxide emissions are far smaller than the Obama CAFE standards suggest. The Obama administration wanted an all-renewable future and was pushing toward it through its Clean Power Plan, despite the impossibility of an all-renewable electricity sector. The Clean Power Plan is also being overturned by the Trump administration.

Conclusion

The Trump administration has proposed a change to the Obama administration’s CAFE standards to freeze them at 2020 levels (37 miles per gallon) for six years, thereby bringing down the cost of new vehicles so that more American families can afford to purchase them and obtain their increased safety features, saving lives. The Trump administration is also proposing to revoke the waiver that the Obama administration approved that allows California to set its own standard. The legislation that created CAFE had intended only one national standard because of the cost to manufacturers to comply with additional standards. The Trump administration proposal is returning to the single standard originally required by legislation.

The post Trump Undoes Obama’s Corporate Average Fuel Economy Standards appeared first on IER.

Building Better Customer Experiences - Whiteboard Friday

Posted by DiTomaso

Are you mindful of your customer's experience after they become a lead? It's easy to fall into the same old rut of newsletters, invoices, and sales emails, but for a truly exceptional customer experience that improves retention and love for your brand, you need to go above and beyond. In this week's episode of Whiteboard Friday, the ever-insightful Dana DiTomaso shares three big things you can start doing today that will immensely improve your customer experience and make earning those leads worthwhile.


Video Transcription

Hi, Moz fans. My name is Dana DiTomaso. I'm the President and partner of Kick Point, and today I'm going to talk to you about building better customer experiences. I know that in marketing a lot of our jobs revolve around getting leads and more leads and why can't we have all of the leads.

The typical customer experience:

But in reality, the other half of our job should be making sure that those leads are taken care of when they become customers. This is especially important if you don't have, say, a customer care department. If you do have a customer care department, really you should be interlocking with what they do, because typically what happens, when you're working with a customer, is that after the sale, they usually get surveys.

- Surveys

"How did we do? Please rate us on a scale of 1 to 10," which is an enormous scale and kind of useless. You're a 4, or you're an 8, or you're a 6. Like what actually differentiates that, and how are people choosing that?

- Invoices

Then invoices, like obviously important because you have to bill people, particularly if you have a big, expensive product or you're a SaaS business. But those invoices are sometimes kind of impersonal, weird, and maybe not great.

- Newsletters

Maybe you have a newsletter. That's awesome. But is the newsletter focused on sales? One of the things that we see a lot is, for example, if somebody clicks a link in the newsletter to get to your website, maybe you've written a blog post, and then they see a great big popup to sign up for our product. Well, you're already a customer, so you shouldn't be seeing that popup anymore.

What we've seen on other sites, like Help Scout actually does a great job of this, is that they have a parameter of newsletter at the end of any URLs they put in their newsletter, and then the popups are suppressed because you're already in the newsletter so you shouldn't see a popup encouraging you to sign up or join the newsletter, which is kind of a crappy experience.
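The suppression logic described here can be sketched in a few lines. This is a hypothetical illustration, not Help Scout's actual implementation; the parameter name (`utm_source=newsletter`) and function names are assumptions for the sketch.

```python
from urllib.parse import parse_qs, urlparse


def is_newsletter_visit(url: str) -> bool:
    """Return True if the URL carries the newsletter tracking parameter.

    The parameter name used here is an assumption; substitute whatever
    tag your email tool appends to links in the newsletter.
    """
    params = parse_qs(urlparse(url).query)
    return "newsletter" in params.get("utm_source", [])


def should_show_signup_popup(url: str) -> bool:
    """Suppress the signup popup for visitors arriving from the newsletter."""
    return not is_newsletter_visit(url)
```

For example, `should_show_signup_popup("https://example.com/blog/post?utm_source=newsletter")` would return `False`, so an existing subscriber never sees the "join our newsletter" popup.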

- Sales emails

Then the last thing are sales emails. This is my personal favorite, and this can really be avoided if you go into account-based marketing automation instead of personal-based marketing automation.

We had a situation where I was the customer of a hosting company. The account was in my name, and we'd signed up all of our clients under it. Then one of our developers created a new account because she needed to access something, and immediately the sales emails started—they didn't realize we're at the same domain and already a customer. They probably shouldn't have been doing the hard sell on her. We've had this happen again and again.

So just really make sure that you're not sending your customers or people who work at the same company as your customers sales emails. That's a really cruddy customer experience. It makes it look like you don't know what's going on. It really can destroy trust.

Tips for an improved customer experience

So instead, here are some extra things that you can do. I mean fix some of these things if maybe they're not working well. But here are some other things you can do to really make sure your customers know that you love them and you would like them to keep paying you money forever.

1. Follow them on social media

So the first thing is following them on social. So what I really like to do is use a tool such as FullContact. You can take everyone's email addresses, run them through FullContact, and it will come back to you and say, "Here are the social accounts that this person has." Then you go on Twitter and you follow all of these people for example. Or if you don't want to follow them, you can make a list, a hidden list with all of their social accounts in there.

Then you can see what they share. A tool like Nuzzel, N-U-Z-Z for Americans, zed zed for Canadians, N-U-Z-Z-E-L, is a great tool where you can say, "Show me what the people I follow on social, or the people on this particular list, are sharing and engaged with." Then you can see what your customers are really interested in, which can give you a good sense of what kinds of things you should be talking about.

A company that does this really well is InVision, which is the app that allows you to share prototypes with clients, particularly design prototypes. So they have a blog, and a lot of that blog content is incredibly useful. They're clearly paying attention to their customers and the kinds of things they're sharing based on how they build their blog content. So then find out if you can help and really think about how I can help these customers through the things that they share, through the questions that they're asking.

Then make sure to watch unbranded mentions too. It's not particularly hard to monitor a specific list of people and see if they tweet things like, "I really hate my (insert what you are) right now," for example. Then you can head that off at the pass maybe because you know that this was this customer. "Oh, they just had a bad experience. Let's see what we can do to fix it," without being like, "Hey, we were watching your every move on Twitter. Here's something we can do to fix it."

Maybe not quite that creepy, but the idea is to follow these people and watch for those unbranded mentions so you can head off at the pass a potentially angry customer, or one who is about to leave. It's way cheaper to keep an existing customer than to get a new one.

2. Post-sale monitoring

So the next thing is post-sale monitoring. What I would like you to do is create a fake customer. If you have lots of sales personas, create a fake customer for each of those personas, and then that customer should get all the emails, invoices, and everything else that a regular customer fitting that persona group would get.

Then take a look at those accounts. Are you awesome, or are you super annoying? Do you hear nothing for a year, except for invoices, and then, "Hey, do you want to renew?" How is that conversation going between you and that customer? So really try to pay attention to that. It depends on your organization if you want to tell people that this is what's happening, but you really want to make sure that that customer isn't receiving preferential treatment.

So you want to make sure that it's kind of not obvious to people that this is the fake customer so they're like, "Oh, well, we're going to be extra nice to the fake customer." They should be getting exactly the same stuff that any of your other customers get. This is extremely useful for you.

3. Better content

Then the third thing is better content. I think, in general, any organization should reward content differently than we do currently.

Right now, we have a huge focus on new content, new content, new content all the time, when in reality, some of your best-performing posts might be old content and maybe you should go back and update them. So what we like to tell people about is the Microsoft model of rewarding. They've used this to reward their employees, and part of it isn't just new stuff. It's old stuff too. So the way that it works is 33% is what they personally have produced.

So this would be new content, for example. Then 33% is what they've shared. So think about for example on Slack if somebody shares something really useful, that's great. They would be rewarded for that. But think about, for example, what you can share with your customers and how that can be rewarding, even if you didn't write it, or you can create a roundup, or you can put it in your newsletter.

Like what can you do to bring value to those customers? Then the last 33% is what they shared that others produced. So is there a way that you can amplify other voices in your organization and make sure that that content is getting out there? Certainly in marketing, and especially if you're in a large organization, maybe you're really siloed, maybe you're an SEO and you don't even talk to the paid people, there's cool stuff happening across the entire organization.

A lot of what you can bring is taking that stuff that others have produced, maybe you need to turn it into something that is easy to share on social media, or you need to turn it into a blog post or a video, like Whiteboard Friday, whatever is going to work for you, and think about how you can amplify that and get it out to your customers, because it isn't just marketing messages that customers should be seeing.

They should be seeing all kinds of messages across your organization, because when a customer gives you money, it isn't just because your marketing message was great. It's because they believe in the thing that you are giving them. So by reinforcing that belief through the types of content that you create, that you share, that you find that other people share, that you share out to your customers (a lot of sharing), you can certainly improve that relationship and really turn your average, run-of-the-mill customer into an actual raving fan. Not only will they stay longer (it's so much cheaper to keep an existing customer than to get a new one), but they'll refer people to you, which is also a lot easier than buying a lot of ads or spending a ton of money and effort on SEO.

Thanks!

Video transcription by Speechpad.com



Wednesday, August 29, 2018

SEPA report predicts the rapid growth of solar+storage: SEI classes can help you get an edge on this emerging market

This week, the Smart Electric Power Alliance (SEPA) released its 2018 Utility Energy Storage Market Snapshot. According to SEPA, the report delivers "unbiased analysis and figures based on verified interconnection data from over 130 US utilities." This year's snapshot covers the expanding applications of energy storage and key market trends.

Here are some of the key take-aways from the report, based on reporting done by Solar Power World:

  • U.S. utilities interconnected 216.7 MW, 523.9 MWh of energy storage to the grid across a total of 2,588 systems in 2017. By the end of the year, cumulative deployed energy storage had reached 922.8 MW, 1,293.6 MWh across 5,167 systems nationwide.
  • In 2017, residential energy storage accounted for 13.3 MW, 29.3 MWh; non-residential added 59 MW, 139.7 MWh; and utility supply reported 144.4 MW, 354.9 MWh.
  • The Advancing Commonwealth Energy Storage (ACES) initiative in Massachusetts has provided $20 million in funding for 26 storage pilot projects.
  • Storage technologies are being deployed in demonstration projects across a wide range of applications including aggregated behind-the-meter batteries, stand-alone deployments for ancillary services and load shifting, traditional-battery hybrid power plants, non-wires alternatives and as the key asset of a microgrid.
  • Behind-the-meter battery storage customer offerings are of key interest to utilities, with 64% interested in, planning, or actively implementing an offering. Green Mountain Power is leading the charge with two pilots: a Tesla Powerwall program and a Bring Your Own Battery program.
  • Solar+storage projects are rapidly emerging across the United States as the costs decline and utilities leverage the capabilities these systems can offer. Salt River Project is testing a solar+storage project for smoothing intermittent renewable generation, while the Kauai Island Utility Cooperative now has a solar+storage system that provides fully dispatchable solar power.
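As a quick sanity check, the per-segment 2017 figures in the snapshot do add up to the reported annual totals. A few lines of Python make the arithmetic explicit (figures exactly as reported above):

```python
# 2017 U.S. utility energy storage interconnections by segment,
# as reported in the SEPA snapshot: (MW, MWh) per segment.
segments = {
    "residential": (13.3, 29.3),
    "non-residential": (59.0, 139.7),
    "utility supply": (144.4, 354.9),
}

# Sum each column; round to one decimal to avoid float noise.
total_mw = round(sum(mw for mw, _ in segments.values()), 1)
total_mwh = round(sum(mwh for _, mwh in segments.values()), 1)

# The segments sum to the reported 216.7 MW / 523.9 MWh annual totals.
print(total_mw, total_mwh)  # → 216.7 523.9
```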

So what's the best way to keep up with this new market trend? SEI offers several storage-centered classes, ranging from battery basics to advanced microgrid design.

PVOL203: Solar Training - PV System Fundamentals (Battery-Based) is a great place to start. In PVOL203 the focus is on the fundamentals of battery-based PV systems. The applications and configurations are many, and their complexity far exceeds that of grid-direct PV systems. Components such as batteries, charge controllers, and battery-based inverters are covered in detail, along with safety and maintenance considerations unique to battery-based systems. Load analysis is critical to system design and will also be addressed, along with other design criteria such as battery bank configuration and the electrical integration of the system.

In more advanced battery training, we also offer PVOL303: Solar Training - Advanced PV Multimode and Microgrid Design (Battery-Based) and PVOL304: Solar Training - Advanced PV Standalone System Design (Battery-Based).

Our most intensive offering of battery-based training is our Battery-Based Photovoltaic Systems Certificate track. This track has 6 required courses. The recommended training progression is: PV101 or PVOL101 > PV203 or PVOL203 > PV201L > PV301L > PV303 or PVOL303 > PV304 or PVOL304.

To learn more about the trainings we offer, or to inquire further about the Solar Professionals Certificate program, contact the Student Services Team at sei@solarenergy.org or call 970-527-7657 x1.

The post SEPA report predicts the rapid growth of solar+storage: SEI classes can help you get an edge on this emerging market appeared first on Solar Energy International (SEI).

Massachusetts Offshore Wind Farm Forecasts Incredibly Low Rates

A proposed wind farm off the coast of Martha's Vineyard in Massachusetts has indicated that it can produce electricity starting at 7.4 cents per kilowatt hour. This will purportedly save Massachusetts customers $1.4 billion over 20 years and reduce monthly bills by between 0.1 percent and 1.5 percent. Whether the developer can fulfill that pledge is questionable, but even attempting to do so requires huge subsidies from the government, paid for by the American taxpayer. Federal tax credits and a long-term power-purchase agreement are what make the project feasible. The costs of installing turbines in the ocean are very high, but the investment tax credit can significantly cut the capital costs for developers—by 30 percent. Thus, a $2 billion project would receive $600 million from the federal government in the form of tax credits.

According to the Massachusetts Department of Energy Resources, the Vineyard Wind project will generate an increased renewable energy credit supply at 6.5 cents per kilowatt hour and thereby lower energy market prices. The project is estimated to save the Massachusetts ratepayer approximately 3.5 cents per kilowatt hour over standard energy sources with a total net benefit of $1.4 billion over 20 years. Massachusetts has the highest residential electricity prices in the lower 48 states.

Massachusetts has fought the offshore wind battle before with a proposed wind farm off of Cape Cod that was canceled after a prolonged fight with local residents. The only offshore wind farm in operation in the United States is off of Block Island—a 30-megawatt facility off the coast of Rhode Island that cost $10,000 per kilowatt to construct—over 10 times more than the cost of a new natural gas combined cycle unit. According to the Energy Information Administration, the cost of unsubsidized offshore wind is 13.8 cents per kilowatt hour—about double what Vineyard Wind is projected to cost.

Vineyard Wind

The 800-megawatt proposed wind farm will be located 15 miles (24 kilometers) south of Martha's Vineyard. Construction is expected to begin in 2019 because the investment tax credit expires for developments that begin construction after 2019. The Vineyard Wind farm is proposed to be built in two stages. The first 400 megawatts, slated to go into commercial operation in 2022, would start at 7.4 cents per kilowatt hour; the second 400 megawatts, going into operation in 2023, would start at 6.5 cents per kilowatt hour. The price of each contract is expected to increase by 2.5 percent annually over the contract's 20 years (the wind turbines' lifetime), resulting in the two deals having an average price of 8.4 cents per kilowatt hour. Each phase will be covered by 20-year power purchase agreements with three utilities operating in Massachusetts: National Grid, Eversource Energy, and Unitil.

The sharp drop in price for this large U.S. offshore wind farm is surprising, since in Europe, where the bulk of the world's offshore wind industry is located, offshore wind prices dropped below 10 cents per kilowatt hour only recently. And Europe has been able to install offshore wind turbines in much shallower waters than those proposed for U.S. offshore areas. Vineyard Wind is owned by two Europe-based companies—Copenhagen Infrastructure Partners and Avangrid Renewables—each with a 50-percent interest.

The large decline in the price of offshore wind is attributed to several factors. Vineyard Wind plans on using turbines capable of generating at least 8 megawatts and perhaps as much as 10 megawatts, compared to turbines that previously generated 2 or 3 megawatts each. Because installation accounts for much of the project’s cost, increased size helps offshore wind achieve an economy of scale.

Besides technology improvements, increased competition and the lower cost of capital have helped to lower prices. The cost of borrowing money to finance offshore wind projects has declined as those investments are seen as less risky.

Also helping economy of scale is the Massachusetts Department of Energy Resources' selection of an 800-megawatt proposal for Vineyard Wind over a 400-megawatt one. The Department indicated the 800-megawatt proposal was "superior to other proposals" and was likely to produce significantly more economic benefits for ratepayers. Economy of scale arises because the project's fixed costs are spread over a greater amount of turbine capacity.

Along with the subsidies and purchase power agreements are the renewable energy credits that are a result of the Regional Greenhouse Gas Initiative (RGGI), of which Massachusetts is a member. Massachusetts has an initial goal of installing 1,600 megawatts of offshore wind by 2027, but the state’s lawmakers recently approved legislation to double that figure.

Other State Initiatives

New York, New Jersey, and Maryland are targeting a combined addition of over 6 gigawatts of offshore wind capacity by 2030. Maryland’s Public Service Commission has committed to subsidizing two wind farms off Ocean City with a combined cost of over $2 billion. But local officials are leery about the projects due to fear of the town losing its commercial fishing industry or having tourism decline. New Jersey has a goal of building 3,500 megawatts of offshore wind by 2030. New York’s plan is for 2,400 megawatts. Connecticut is proceeding with a solicitation for 200 megawatts of offshore wind capacity, while Rhode Island is focused on 400 megawatts.

Conclusion

Whether Vineyard Wind can live up to its price forecast remains to be seen. In its favor are subsidies and mandates that include the federal investment tax credit, the state's renewable energy mandate, power purchase agreements, and RGGI, among others. But as noted above, subsidies are just a transfer from the taxpayer to the consumer, not a free good. As more and more projects are added, the costs to taxpayers for these sources grow rapidly.

The post Massachusetts Offshore Wind Farm Forecasts Incredibly Low Rates appeared first on IER.

The Long-Term Link Acquisition Value of Content Marketing

Posted by KristinTynski

Recently, new internal analysis of our work here at Fractl has yielded a fascinating finding:

Content marketing that generates mainstream press is likely 2X as effective as originally thought. Additionally, the long-term ROI is potentially many times higher than previously reported.

I’ll caveat that by saying this applies only to content that can generate mainstream press attention. At Fractl, this is our primary focus as a content marketing agency. Our team, our process, and our research are all structured around figuring out ways to maximize the newsworthiness and promotional success of the content we create on behalf of our clients.

Though data-driven content marketing paired with digital PR is on the rise, there is still a general lack of understanding around the long-term value of any individual content execution. In this exploration, we sought to answer the question: What link value does a successful campaign drive over the long term? What we found was surprising and strongly reiterated our conviction that this style of data-driven content and digital PR yields some of the highest possible ROI for link building and SEO.

To better understand this full value, we wanted to look at the long-term accumulation of the two types of links on which we report:

  1. Direct links from publishers to our client’s content on their domain
  2. Secondary links that link to the story the publisher wrote about our client’s content

While direct links are most important, secondary links often provide significant additional pass-through authority and can often be reclaimed through additional outreach and converted into direct do-follow links (something we have a team dedicated to doing at Fractl).

Below is a visualization of the way our content promotion process works:

So how exactly do direct links and secondary links accumulate over time?

To understand this, we did a full audit of four successful campaigns from 2015 and 2016 through today. Having a few years of aggregation gave us an initial benchmark for how links accumulate over time for general interest content that is relatively evergreen.

We profiled four campaigns:

The first view we looked at was direct links, or links pointing directly to the client blog posts hosting the content we’ve created on their behalf.

There is a good deal of variability between campaigns, but we see a few interesting general trends that show up in all of the examples in the rest of this article:

  1. Both direct and secondary links will accumulate in a few predictable ways:
    1. A large initial spike with a smooth decline
    2. A buildup to a large spike with a smooth decline
    3. Multiple spikes of varying size
  2. Roughly 50% of the total volume of links that will be built will accumulate in the first 30 days. The other 50% will accumulate over the following two years and beyond.
  3. A small subset of direct links will generate their own large spikes of secondary links.
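The 30-day split in point 2 is straightforward to measure if you keep dated link records. A minimal sketch, assuming a list of acquisition dates per link (the campaign data below is made up purely for illustration):

```python
from datetime import date


def share_within(link_dates, launch, days=30):
    """Fraction of links acquired within `days` of the campaign launch."""
    early = sum(1 for d in link_dates if (d - launch).days < days)
    return early / len(link_dates)


# Hypothetical campaign: four links land in the first month,
# four more trickle in over the following two years.
launch = date(2016, 1, 1)
links = [
    date(2016, 1, 2), date(2016, 1, 5), date(2016, 1, 14), date(2016, 1, 28),
    date(2016, 4, 3), date(2016, 9, 17), date(2017, 2, 8), date(2017, 11, 30),
]

print(share_within(links, launch))  # → 0.5, matching the ~50% pattern
```

Running the same function with a larger `days` window shows how the remaining links accumulate over the long tail.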

We'll now take a look at some specific results. Let’s start by looking at direct links (pickups that link directly back to our client’s site or landing page).

The typical result: A large initial spike with consistent accumulation over time

This campaign, featuring artistic imaginings of what bodies in video games might look like with normal BMI/body sizes, shows the most typical pattern we witnessed, with a very large initial spike and a relatively smooth decline in link acquisition over the first month.

After the first month, long-term new direct link acquisition continued for more than two years (and is still going today!).

The less common result: Slow draw up to a major spike

In this example, you can see that sometimes it takes a few days or even weeks to see the initial pickup spike and subsequent primary syndication. In the case of this campaign, we saw a slow buildup to the pinnacle at about a week from the first pickup (exclusive), with a gradual decline over the following two weeks.

"These initial stories were then used as fodder or inspiration for stories written months later by other publications."

Zooming out to a month-over-month view, we can see resurgences in pickups happening at unpredictable intervals every few months or so. These spikes continued up until today with relative consistency. This happened as some of the stories written during the initial spike began to rank well in Google. These initial stories were then used as fodder or inspiration for stories written months later by other publications. For evergreen topics such as body image (as was the case in this campaign), you will also see writers and editors cycle in and out of writing about these topics as they trend in the public zeitgeist, leading to these unpredictable yet very welcomed resurgences in new links.

Least common result: Multiple spikes in the first few weeks

The third pattern we observed was seen on a campaign we executed examining hate speech on Twitter. In this case, we saw multiple spikes during this early period, corresponding to syndications on other mainstream publications that then sparked their own downstream syndications and individual virality.

Zooming out, we saw a similar result as the other examples, with multiple smaller spikes, more frequent within the first year and less frequent in the following two years. Each of these bumps is associated with the story resurfacing organically on new publications (usually a writer stumbling on coverage of the content during the initial phase of popularity).

Long-term resurgences

Finally, in our fourth example that looked at germs on water bottles, we saw a fascinating phenomenon happen beyond the first month where there was a very significant secondary spike.

This spike represents syndication across all (or most) of the iHeartRadio network. As this example demonstrates, it isn’t wholly unusual to see large-scale networks pick up content even a year or more later, producing spikes that rival or even exceed the initial month’s results.

Aggregate trends

"50% of the total links acquired happened in the first month, and the other 50% were acquired in the following two to three years."

When we looked at direct links back to all four campaigns together, we saw the common progression of link acquisition over time. The chart below shows the distribution of new links acquired over two years. We saw a pretty classic long tail distribution here, where 50% of the total links acquired happened in the first month, and the other 50% were acquired in the following two to three years.

"If direct links are the cake, secondary links are the icing, and both accumulate substantially over time."

Links generated directly to the blog posts/landing pages of the content we’ve created on our clients’ behalf are only really a part of the story. When a campaign garners mainstream press attention, the press stories can often go mildly viral, generating large numbers of syndications and links to these stories themselves. We track these secondary links and reach out to the writers of these stories to try and get link attributions to the primary source (our clients’ blog posts or landing pages where the story/study/content lives).

These types of links also follow a similar pattern over time to direct links. Below are the publishing dates of these secondary links as they were found over time. Their over-time distribution follows the same pattern, with 50% of the links realized within the first month and the remaining 50% coming over the next two to three years.

The value in the long tail

By looking at multi-year direct and secondary links built to successful content marketing campaigns, it becomes apparent that the total number of links acquired during the first month is really only about half the story.

For campaigns that garner initial mainstream pickups, there is often a multi-year long tail of links that are built organically without any additional or future promotions work beyond the first month. While this long-term value is not something we report on or charge our clients for explicitly, it is extremely important to understand as a part of a larger calculus when trying to decide if doing content marketing with the goal of press acquisition is right for your needs.

Cost-per-link (a typical way to measure ROI of such campaigns) will halve if links built are measured over these longer periods — moving a project you perhaps considered a marginal success at one month to a major success at one year.
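The arithmetic behind that halving is simple to sketch. All the numbers below are hypothetical, assuming the roughly 50/50 split described above between first-month links and long-tail links:

```python
# Hypothetical illustration of how cost-per-link halves once the
# long tail is counted; the spend and link counts here are invented.
campaign_cost = 20_000    # total spend on the campaign (USD)
links_month_one = 100     # direct links acquired in the first month
links_long_tail = 100     # links acquired over the next 2-3 years (~50/50 split)

cpl_at_one_month = campaign_cost / links_month_one
cpl_at_three_years = campaign_cost / (links_month_one + links_long_tail)

print(cpl_at_one_month)    # 200.0 per link if judged at one month
print(cpl_at_three_years)  # 100.0 per link over the full window
```

Under these assumptions, a campaign that looked like a $200-per-link marginal success at one month becomes a $100-per-link win once the long tail accrues.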


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Tuesday, August 28, 2018

A Quarter-Million Reasons to Use Moz's Link Intersect Tool

Posted by rjonesx.

Let me tell you a story.

It begins with me in a hotel room halfway across the country, trying to figure out how I'm going to land a contract from a fantastic new lead, worth $250,000 annually. We weren't in over our heads by any measure, but the potential client was definitely looking at what most would call "enterprise" solutions and we weren't exactly "enterprise."

Could we meet their needs? Hell yes we could — better than our enterprise competitors — but there's a saying that "no one ever got fired for hiring IBM"; in other words, it's always safe to go with the big guys. We weren't an IBM, so I knew that by reputation alone we were in trouble. The RFP was dense, but like most SEO gigs, there wasn't much in the way of opportunity to really differentiate ourselves from our competitors. It would be another "anything they can do, we can do better" meeting where we grasp for reasons why we were better. In an industry where so many of our best clients require NDAs that prevent us from producing really good case studies, how could I prove we were up to the task?

In less than 12 hours we would be meeting with the potential client and I needed to prove to them that we could do something that our competitors couldn't. In the world of SEO, link building is street cred. Nothing gets the attention of a client faster than a great link. I knew what I needed to do. I needed to land a killer backlink, completely white-hat, with no new content strategy, no budget, and no time. I needed to walk in the door with more than just a proposal — I needed to walk in the door with proof.

I've been around the block a few times when it comes to link building, so I wasn't at a loss when it came to ideas or strategies we could pitch, but what strategy might actually land a link in the next few hours? I started running prospecting software left and right — all the tools of the trade I had at my disposal — but imagine my surprise when the perfect opportunity popped up right in little old Moz's Open Site Explorer Link Intersect tool. To be honest, I hadn't used the tool in ages. We had built our own prospecting software on APIs, but the perfect link just popped up after adding in a few of their competitors on the off chance that there might be an opportunity or two.

There it was:

  1. 3,800 root linking domains to the page itself
  2. The page was soliciting submissions
  3. Took pull requests for submissions on GitHub!

I immediately submitted a request and began the refresh game, hoping the repo was being actively monitored. By the next morning, we had ourselves a link! Not just any link, but despite the client having over 50,000 root linking domains, this was now the 15th best link to their site. You can imagine me anxiously awaiting the part of the meeting where we discussed the various reasons why our services were superior to that of our competitors, and then proceeded to demonstrate that superiority with an amazing white-hat backlink acquired just hours before.

The quarter-million-dollar contract was ours.

Link Intersect: An undervalued link building technique

Backlink intersect is one of the oldest link building techniques in our industry. The methodology is simple. Take a list of your competitors and identify the backlinks pointing to their sites. Compare those lists to find pages that overlap. Pages which link to two or more of your competitors are potentially resource pages that would be interested in linking to your site as well. You then examine these sites and do outreach to determine which ones are worth contacting to try and get a backlink.
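At its core, the technique is a set intersection over competitor backlink data. Here's a minimal sketch of the idea; the competitor domains and linking pages are invented, and a real implementation would pull these sets from a link index API rather than hardcoding them:

```python
from collections import Counter

# Map each competitor to the set of pages linking to it (dummy data;
# in practice these sets come from a backlink index).
backlinks = {
    "competitor-a.com": {"uni.edu/resources", "blog.example/roundup", "assoc.org/links"},
    "competitor-b.com": {"uni.edu/resources", "assoc.org/links", "news.example/story"},
    "competitor-c.com": {"uni.edu/resources", "directory.example/listing"},
}

# Count how many competitors each linking page covers.
coverage = Counter(page for pages in backlinks.values() for page in pages)

# Pages linking to two or more competitors are likely resource pages,
# ordered by how many competitors they cover.
prospects = sorted(
    (page for page, n in coverage.items() if n >= 2),
    key=lambda page: -coverage[page],
)
print(prospects)  # ['uni.edu/resources', 'assoc.org/links']
```

The pages covering the most competitors surface first, which is essentially the prioritization the tool performs for you at index scale.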

Let's walk through a simple example using Moz's Link Intersect tool.

Getting started

We start on the Link Intersect page of Moz's new Link Explorer. While we had Link Intersect in the old Open Site Explorer, you're going to want to use our new Link Intersect, which is built from our giant index of 30 trillion links and is far more powerful.

For our example here, I've chosen a random gardening company in Durham, North Carolina called Garden Environments. The website has a Domain Authority of 17 with 38 root linking domains.

We can go ahead and copy-paste the domain into "Discover Link Opportunities for this URL" at the top of the Link Intersect page. Notice that we have the choice of "Root Domain, Subdomain, or Exact Page":

Choose between domain, subdomain or page

I almost always choose "root domain" because I tend to be promoting a site as a whole and am not interested in acquiring links to pages on the site from other sites that already link somewhere else on the site. That is to say, by choosing "root domain," any site that links to any page on your site will be excluded from the prospecting list. Of course, this might not be right for your situation. If you have a hosted blog on a subdomain or a hosted page on a site, you will want to choose subdomain or exact page to make sure you rule out the right backlinks.

You also have the ability to choose whether we report back to you root linking domains or backlinks. This is really important and I'll explain why.

choose between page or domain

Depending on your link building campaign, you'll want to vary your choice here. Let's say you're looking for resource pages that you can list your website on. If that's the case, you will want to choose "pages." The Link Intersect tool will then prioritize pages that have links to multiple competitors on them, which are likely to be resource pages you can target for your campaign. Now, let's say you would rather find publishers that talk about your competitors and are less concerned about them linking from the same page. You want to find sites that have linked to multiple competitors, not pages. In that case, you would choose "domains." The system will then return the domains that have links to multiple competitors and give you example pages, but you won't be limited only to pages with multiple competitors on them.

In this example, I'm looking for resource pages, so I chose "pages" rather than domains.

Choosing your competitor sites

A common mistake made at this point is to choose exact competitors. Link builders will often copy and paste a list of their biggest competitors and cross their fingers for decent results. What you really want are the best link pages and domains in your industry — not necessarily your competitors.

In this example I chose the gardening page on a local university, a few North Carolina gardening and wildflower associations, and a popular page that lists nurseries. Notice that you can choose subdomain, domain, or exact page as well for each of these competitor URLs. I recommend choosing the broadest category (domain being broadest, exact page being narrowest) that is relevant to your industry. If the whole site is relevant, go ahead and choose "domain."

Analyzing your results

The results returned will prioritize pages that link to multiple competitors and have a high Domain Authority. Unlike some of our competitors' tools, if you put in a competitor that doesn't have many backlinks, it won't cause the whole report to fail. We list all the intersections of links, starting with the most and narrowing down to the fewest. Even though the nurseries website doesn't provide any intersections, we still get back great results!

analyze link results

Now we have some really great opportunities, but at this point you have two choices. If you really prefer, you can just export the opportunities to CSV like any other tool on the market, but I prefer to go ahead and move everything over into a Link Tracking List.

add to link list

By moving everything into a link list, we're going to be able to track link acquisition over time (once we begin reaching out to these sites for backlinks) and we can also sort by other metrics, leave notes, and easily remove opportunities that don't look fruitful.

What did we find?

Remember, we started off with a site that has barely any links, but we turned up dozens of easy opportunities for link acquisition. We turned up a simple resources page on forest resources, a potential backlink which could easily be earned via a piece of content on forest stewardship.

example opportunity

We turned up a great resource page on how to maintain healthy soil and yards on a town government website. A simple guide covering the same topics here could easily earn a link from this resource page on an important website.

example opportunity 2

These were just two examples of easy link targets. From community gardening pages, websites dedicated to local creek, pond, and stream restoration, and general enthusiast sites, the Link Intersect tool turned up simple backlink gold. What is most interesting to me, though, was that these resource pages never included the words "resources" or "links" in the URLs. Common prospecting techniques would have just missed these opportunities altogether.

While it wasn't the focus of this particular campaign, I also tried the alternative of showing "domains" rather than "pages" that link to the competitors. We found similarly useful results using this methodology.

example list of domains opportunity

For example, we found CarolinaCountry.com had linked to several of the competitor sites and, as it turns out, would be a perfect publication to pitch for a story as part of a PR campaign promoting the gardening site.

Takeaways

The new Link Intersect tool in Moz's Link Explorer combines the power of our new incredible link index with the complete features of a link prospecting tool. Competitor link intersect remains one of the most straightforward methods for finding link opportunities and landing great backlinks, and Moz's new tool coupled with Link Lists makes it easier than ever. Go ahead and give it a run yourself — you might just find the exact link you need right when you need it.

Find link opportunities now!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

The Social Cost of Carbon: Considerations and Disagreements in Climate Economics

This June, the Environmental Protection Agency (EPA) issued an Advance Notice of Proposed Rulemaking to announce its request for public input on whether and how to change the way it considers costs and benefits in making regulatory decisions. Of particular interest to EPA and the public is the figure known as the social cost of carbon (SCC).

The SCC is the estimated marginal external cost of a unit emission of carbon dioxide, based upon the future damages (such as reduced agricultural productivity, increased flood damage, or worsened health and mortality) that that unit will inflict through its contribution to the greenhouse effect and the global warming that results. The metric, trenchantly described by Obama economic advisor Michael Greenstone as “the most important number you’ve never heard of,” is the lynchpin of myriad climate-related regulations and carbon tax proposals. Given the Supreme Court’s 2007 directive to EPA to evaluate carbon dioxide and EPA’s subsequent finding that it qualifies as a pollutant, calculations of the SCC will have far-reaching consequences. As a point of entry to the stark disagreements on the topic, compare the Trump administration’s current estimate of the SCC of $5 per ton to its predecessor’s estimate of over $40. Given the United States’ annual output of around 6 billion metric tons of carbon dioxide equivalent, incorporating one estimate instead of the other into the calculation of a regulatory proposal’s costs and benefits all but determines the likelihood of the proposal’s adoption.

Given this significance, it is unfortunate that Greenstone’s observation about the SCC’s obscurity rings true. To make matters worse, even when the concept is known, it is all too frequently misrepresented. The central misunderstanding of the social cost of carbon is that it is a figure existing in nature independent of human judgment, akin to the atomic mass of nickel or the effective gravity at Earth’s equator. On the contrary, the social cost of carbon is an estimate that rests upon normative judgments and assumptions supplied by the modeler. The public, however, is often led to think otherwise.

Consider, for example, Huffington Post’s Alexander C. Kaufman, who describes the social cost of carbon as “a calculation of the damages to property, human health, economic growth and agriculture as a result of climate change.” The SCC utilizes damage projections, yes, but damages and an SCC are not synonymous. Rather, the SCC is an estimate of future damages on a selected timescale translated at a selected rate into present dollar terms. And while Kaufman acknowledges that there is a wide range of estimates for the metric, he still misleads readers by implying that we will soon have the advantage of a scientifically-valid figure. “Under Obama, the EPA estimated the social cost of carbon to be between $11 and $105 per ton of carbon dioxide pollution. But the real cost,” Kaufman writes, “could be much higher, according to a study Purdue University published last year, which found that existing models relied on decades-old agricultural data.” Kaufman’s treatment is typical of mainstream coverage of this issue and does our public discourse a disservice.

With this article I do not intend to endorse any particular view of the SCC nor describe it in full, but rather to bring to the reader’s attention a trio of factors—and the complex normative considerations therein—that, along with others, determine an SCC estimate.

Integrated Assessment Models

Integrated Assessment Models (IAMs) are the tools with which we analyze the potential effects—positive and negative—of increased carbon dioxide emissions. IAMs link climate projections with projections of economic activity to predict and monetize welfare impacts. The two most prominent IAMs are FUND, developed by Richard Tol, and DICE, developed by William Nordhaus.

Using the IAMs, modelers can provide an estimate of the marginal cost to the economic system as a whole of emitting a ton of carbon dioxide. But, crucially, even if we assent to the climate projections utilized and trust the economic variables selected by the developers to adequately capture economic costs and benefits in the future, in order to understand the resulting SCC we need to grasp two additional instrumental variables: the discount rate and the time horizon.

Discount Rate

The discount rate is a concept that seeks to translate future costs and benefits into present dollar terms. As described by Nordhaus, “discounting is a factor in climate-change policy—indeed in all investment decisions—that involves the relative weight of future and present payoffs.” The discount rate is in essence the relative importance one places on costs and benefits that will arise in the future as compared with costs and benefits today. Regarding global warming, the most adverse effects will occur far in the future, but the cost of regulating emissions occurs in the present. The discount rate addresses the question, how much benefit (or averted damage) do we require in the future to prompt us to take on a cost now?
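That question is answered by the standard present-value formula: a damage of D dollars avoided t years from now is worth D/(1+r)^t today at discount rate r. A quick sketch, with purely illustrative figures (these are not outputs of any IAM), shows how sensitive the answer is to r:

```python
# Present value today of a future avoided damage, at different discount rates.
def present_value(damage, rate, years):
    """Discount a damage occurring `years` from now back to present dollars."""
    return damage / (1 + rate) ** years

damage = 1_000_000  # illustrative: $1M of damage avoided 100 years from now
for rate in (0.025, 0.03, 0.07):
    # Higher rates shrink the present value of far-future damages dramatically.
    print(rate, round(present_value(damage, rate, 100), 2))
```

At 2.5 percent, the $1 million of avoided damage a century out is worth tens of thousands of dollars today; at 7 percent, it is worth only on the order of a thousand. Nothing about the damage itself changed, only the weight given to the future.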

The first element of a discount rate with which to reckon is what is known as pure rate of time preference. Pure rate of time preference is the weight one places on present consumption versus future consumption as such. Some climate analyses, such as the “Stern Review on the Economics of Climate Change,” performed for the British government last decade, operate on a premise of intergenerational neutrality—essentially holding that we ought to value ourselves today no more than ourselves in the future nor more than the generations of humans that will follow us. This time-preference orientation can result in a near-zero discount rate. And though it may sound sensible at first blush, when we consider this approach from the perspective of investment the logic crumbles. In contemplating the appropriate relationship between the future and the present we ought to take into account the broader considerations that govern intertemporal decision-making, beginning with the rate of growth and the opportunity cost of capital. As Nordhaus explains:

“(The discount rate) is a positive concept that measures the relative price of goods at different points of time. This is also called the real return on capital, the real interest rate, the opportunity cost of capital, and the real return. The real return measures the yield on investments corrected by the change in the overall price level. In principle, this is observable in the marketplace.”

This approach to the discount rate is sometimes referred to as the financial-equivalent discount rate and tracks with the rate of return one can expect from the market, which averages about 7 percent annually. The upside this presents, to draw the reader’s attention back to the climate issues at hand, is that it prioritizes wealth accumulation—our best buffer against climate risk. The downside is that it may heighten that very risk for our future selves and our offspring.

It is thus between these two extremes that the discount debate transpires: from intergenerational neutrality on the one hand to the opportunity cost of capital on the other. The Obama Interagency Working Group on the Social Cost of Carbon chose to hover in the middle, presenting an SCC at 2.5-, 3- and 5-percent discount rates, but ignoring longstanding Office of Management and Budget guidance to present the SCC at a 7-percent discount rate.

To observe the magnitude of divergence created by simply changing the discount rate, examine the table below. According to calculations based on simulation results using Nordhaus’s DICE model (one of the three IAMs used by the Obama administration), the SCC in the year 2020 fluctuates by nearly a factor of ten when comparing the 2.5-percent discount rate result of $56.92 with the 7-percent rate result of $5.87.

Average SCC Baseline, End Year 2300

As if this pursuit were not yet fraught enough with uncertainty, theorists also incorporate factors such as the rate of risk aversion and the income elasticity of the value of climate change impacts. Others seek to augment these factors with distributional considerations.

In a recent New York Times article, Brad Plumer offered one of the more thorough treatments of the SCC we have seen from the mainstream media, including even a discussion of the discount rate. “The federal government,” Plumer writes, “has long recommended discount rates of 3 percent and 7 percent for valuing costs and benefits across a single generation. But some economists have argued that higher rates are inappropriate for thinking about long-range problems like global warming, where today’s emissions can have impacts, like melting ice sheets, that reverberate for centuries.”

Plumer’s choice to introduce his readers to the discount rate is praiseworthy, but his commentary, like that of Huffington Post’s Kaufman, still lacks appreciation for the subjectivity of pure rate of time preference and, ergo, the discount rate.

For a final word on the discount rate, I will turn to Lawrence Goulder and Roberton Williams’ consideration of the dilemma in their 2012 discussion paper, “The Choice of Discount Rate for Climate Change Policy Evaluation.” In response to the aforementioned Stern Review, Goulder and Williams argue that “the disagreements about the discount rate are not merely arguments about empirical matters; there are major debates about conceptual issues as well.” 

Time Horizon

The second concept, closely related to that of the discount rate, with which we must grapple is the time horizon. This concept is simpler than that of the discount rate in many respects, but its impact is just as profound. IAMs generate multi-century simulations with some stretching to the next millennium.

When taking note of the time horizon, I find that the discounting issue—particularly the danger of a low one—is brought into sharper relief. People tend to value their children’s and grandchildren’s lives to an equal or greater extent than their own, but upon some branch on one’s tree of descendants the capacity for rational concern necessarily wanes. Again, Nordhaus:

This approach is more difficult to interpret when it involves different generations living many years from now, and it arises with particular force when the current generation’s great(n)-grandchildren consume goods and services that are largely unimagined today. These will almost certainly involve unrecognizably different health-care technologies, with supercomputers cheap enough and small enough to fit under the skin, and future generations that grow up and adapt to a world that is vastly different from that of today.

It thus becomes quite plain when we contemplate the long reaches of time that intergenerational neutrality is a non-starter. Human beings alive today deserve more consideration in forming public policy than hypothetical future members of our species. Otherwise, the benefits and damages we expect to see from climate change in the near term get utterly overwhelmed by damages centuries into the future. As Nordhaus describes in his assessment of the Stern Review:

“The effect of low discounting can be illustrated with a ‘wrinkle experiment.’ Suppose that scientists discover a wrinkle in the climate system that will cause damages equal to 0.1 percent of net consumption starting in 2200 and continuing at that rate forever after. How large a one-time investment would be justified today to remove the wrinkle that starts only after two centuries? Using the methodology of the Review, the answer is that we should pay up to 56 percent of one year’s world consumption today to remove the wrinkle. In other words, it is worth a one-time consumption hit of approximately $30,000 billion today to fix a tiny problem that begins in 2200. It is illuminating to put this point in terms of average consumption levels. Using the Review’s growth projections, the Review would justify reducing per capita consumption for one year today from $10,000 to $4,400 in order to prevent a reduction of consumption from $130,000 to $129,870 starting two centuries hence and continuing at that rate forever after.”
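The mechanics of the wrinkle experiment can be sketched with the standard formula for a delayed perpetuity: a flow of d dollars per year beginning T years from now has a present value of roughly (d/r)/(1+r)^T at discount rate r. The sketch below is deliberately simplified—it holds the damage flow constant and ignores the consumption growth in the Review's actual model—so the dollar figures are illustrative only:

```python
# Present value of a perpetual annual damage that begins T years from now.
# Simplified constant-flow version: illustrative only, not the Stern Review's model.
def pv_delayed_perpetuity(annual_damage, rate, delay_years):
    return (annual_damage / rate) / (1 + rate) ** delay_years

annual_damage = 50e9  # hypothetical: 0.1% of a $50 trillion consumption level

low = pv_delayed_perpetuity(annual_damage, 0.001, 200)   # near-zero, Stern-style rate
high = pv_delayed_perpetuity(annual_damage, 0.07, 200)   # market-return-style rate

# A near-zero rate values the "wrinkle" orders of magnitude more than a
# market rate does, which is the engine of Nordhaus's result.
print(low / high)
```

With a near-zero rate, the two-century delay barely dents the perpetuity's value, so a tiny permanent damage justifies an enormous expenditure today; at 7 percent, the same damage stream is worth comparatively little.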

The time horizon of IAM simulations, when paired with a suspiciously low discount rate, pushes the concept of the social cost of carbon to the very precipice of arbitrariness and capriciousness.

Global vs. Domestic Costs and Benefits

The third and final contributor to SCC figures that I will address is the exclusion or inclusion of foreign benefits and damages. An element of cost-benefit analysis that has been woefully opaque is the question of costs and benefits to whom. The costs that a warming planet would entail would not be distributed evenly within countries, let alone between them. Indeed, some regions of the globe would stand to gain from the greening effect of increased carbon dioxide concentrations and the longer growing season promoted by warmer temperatures. A fundamental rift between the respective approaches of the Obama and Trump administrations is that the former opted for a global estimate, while the latter prefers to focus on domestic costs and benefits. The Obama estimate, largely as a result of this difference in approach, is much higher.

As Plumer described in his New York Times article:

“First, the E.P.A. took the Obama-era models and focused solely on damages that occurred within the borders of the United States, rather than looking at harm to other countries as well. That change alone reduced the social cost of carbon estimate to around $7 per ton.

The reasoning was simple: If Americans are paying the cost of these rules to mitigate climate change, then only benefits that accrue to Americans themselves should be counted.”

But this issue is not as simple as a wealthy global north enriching itself at the expense of a poor global south. Some of the developing countries that face the gravest risks from future climate damages also reap reward in the near term from direct emissions benefits. As described by David Anthoff, Richard Tol, and Gary Yohe in “Discounting for Climate Change (2009)”:

“One of the basic results of the climate change impact literature is that poor countries tend to be more vulnerable to climate change, and the results in Figure 3 certainly reflect this. With zero income elasticities, the SCC tends to be higher – at least, if the discount rate is low. If the discount rate is high, the SCC is negative (that is, climate change is a net benefit); and with zero income elasticities, the SCC is even more negative (that is, climate change is an even greater benefit). The reason can be found in the positive impacts of carbon dioxide fertilization on agriculture in poor countries.”

So while it is the case that the poorer countries of, say, south and southeast Asia will bear consequences of a warmer, wetter world in the future, they also stand to make some gains in the near term from the direct positive impact of carbon dioxide fertilization on agricultural yields. At a high discount rate models can even generate a negative social cost of carbon, implying that carbon emission should not be taxed, but subsidized. For a more comprehensive discussion of the federal government’s approach to global and domestic costs and benefits with respect to climate change, see the work of Ted Gayer and W. Kip Viscusi.

Ultimately, as with the discount rate, one’s answer to the global vs. domestic question relies on his normative premises. Even if, as I do, one opts to include global costs and benefits in an evaluation of the issue as a matter of justice, it is nevertheless appropriate to make that choice transparent to the American public, who would bear the immediate costs of any federal regulatory action that follows.

Conclusion

The purpose of this article is not to endorse any specific calculation of a social cost of carbon nor offer an exhaustive evaluation of methodology. Rather it is to highlight the degree of individual discretion involved in crafting an estimate. The social cost of carbon is not a metaphysically-given figure; it is a figure contingent upon the particular normative perspective one chooses to favor. Depending upon the discount rate—itself contingent upon estimations of rate of time preferences, growth rates, and income elasticity—the time horizon, and the choice to exclude or include global effects, one can generate virtually any SCC he wants.

In performing cost-benefit analyses our government has a responsibility to present the fullest view to the public that is possible. In the context of climate change, that means exploring the social cost of carbon at a wide range of discount rates, on a diversity of time horizons, and showing both the domestic and the global consequences.

The post The Social Cost of Carbon: Considerations and Disagreements in Climate Economics appeared first on IER.

Friday, August 24, 2018

EPA Proposes Rule to Replace Obama’s Clean Power Plan

SEO Negotiation: How to Ace the Business Side of SEO - Whiteboard Friday

Posted by BritneyMuller

SEO isn't all meta tags and content. A huge part of the success you'll see is tied up in the inevitable business negotiations. In this week's Whiteboard Friday, our resident expert Britney Muller walks us through a bevy of smart tips and considerations that will strengthen your SEO negotiation skills, whether you're a seasoned pro or a newbie to the practice.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans. Welcome to another edition of Whiteboard Friday. So today we are going over all things SEO negotiation, so starting to get into some of the business side of SEO. As most of you know, negotiation is all about leverage.

It's what you have to offer and what the other side is looking to gain and leveraging that throughout the process. So something that you can go in and confidently talk about as SEOs is the fact that SEO has around 20% more opportunity than both mobile and desktop PPC combined.

This is a really, really big deal. It's something that you can showcase. These are the stats to back it up. We will also link to the research to this down below. Good to kind of have that in your back pocket. Aside from this, you will obviously have your audit. So potential client, you're looking to get this deal.

Get the most out of the SEO audit

☑ Highlight the opportunities, not the screw-ups

You're going to do an audit, and something that I have always suggested is that instead of highlighting the things that the potential client is doing wrong, or screwed up, is to really highlight those opportunities. Start to get them excited about what it is that their site is capable of and that you could help them with. I think that sheds a really positive light and moves you in the right direction.

☑ Explain their competitive advantage

I think this is really interesting in many spaces where you can sort of say, "Okay, your competitors are here, and you're currently here, and this is why," and show them proof. That makes them feel as though you have a strong understanding of the landscape and can sort of help them get there.

☑ Emphasize quick wins

I almost didn't put this in here because I think quick wins is sort of a sketchy term. Essentially, you really do want to showcase what it is you can do quickly, but you want to...

☑ Under-promise, over-deliver

You don't want to lose trust or credibility with a potential client by overpromising something that you can't deliver. Get off to the right start. Under-promise, over-deliver.

Smart negotiation tactics

☑ Do your research

Know everything you can about this client. Perhaps what deals they've done in the past, what agencies they've worked with. You can get all sorts of knowledge about that before going into negotiation that will really help you.

☑ Prioritize your terms

So all too often, people go into a negotiation thinking me, me, me, me, when really you also need to be thinking about, "Well, what am I willing to lose? What can I give up to reach a point that we can both agree on?" Really important to think about as you go in.

☑ Flinch!

This is a very old, funny negotiation tactic where when the other side counters, you flinch. You do this like flinch, and you go, "Oh, is that the best you can do?" It's super silly. It might be used against you, in which case you can just say, "Nice flinch." But it does tend to help you get better deals.

So take that with a grain of salt. But I look forward to your feedback down below. It's so funny.

☑ Use the words "fair" and "comfortable"

The words "fair" and "comfortable" do really well in negotiations. These words are inarguable. You can't argue with fair. "I want to do what is comfortable for us both. I want us both to reach terms that are fair."

You want to use these terms to put the other side at ease and to also help bridge that gap where you can come out with a win-win situation.

☑ Never be the key decision maker

I see this all too often when people go off on their own, and instantly on their business cards and in their head and email they're the CEO.

They are this. You don't have to be that, and you sort of lose leverage when you are. When I owned my agency for six years, I enjoyed not being CEO. I liked having a board of directors that I could reach out to during a negotiation and not being the sole decision maker. Even if you feel that you are the sole decision maker, I know that there are people that care about you and that are looking out for your business that you could contact as sort of a business mentor, and you could use that in negotiation. You can use that to help you. Something to think about.

Tips for negotiation newbies

So for the newbies, a lot of you are probably like, "I can never go on my own. I can never do these things." I'm from northern Minnesota. I have been super awkward about discussing money my whole life for any sort of business deal. If I could do it, I promise any one of you watching this can do it.

☑ Power pose!

I'm not kidding, promise. One tip that I learned when I had my agency was to power pose before negotiations. So there's a great TED talk on this that we can link to down below. I do this before most of my big speaking gigs, thanks to my gramsy who told me to do this at SMX Advanced like three years ago.

Go ahead and power pose. Feel good. Feel confident. Amp yourself up.

☑ Walk the walk

You've got to walk the walk when it comes to some of these things and just feel comfortable in that space.

☑ Good > perfect

Know that good is better than perfect. A lot of us are perfectionists, and we just have to execute good. Trying to be perfect will kill us all.

☑ Screw imposter syndrome

Many of the speakers that I go on different conference circuits with all struggle with this. It's totally normal, but it's good to acknowledge that it's so silly. So to try to take that silly voice out of your head and start to feel good about the things that you are able to offer.

Take inspiration where you can find it

I highly suggest you check out Brian Tracy's old-school negotiation podcasts. He has some old videos. They're so good. But he talks about leverage all the time and has two really great examples that I love so much. One being jade merchants. So these jade merchants that would take out pieces of jade and they would watch people's reactions piece by piece that they brought out.

So they knew what piece interested this person the most, and that would be the higher price. It was brilliant. Then, on time constraints, he has an example of people doing business deals in China. When they landed, the Chinese would greet them and say, "Oh, can I see your return flight ticket? I just want to know when you're leaving."

They would not make a deal until that last second. The more you know about some of these leverage tactics, the more you can be aware of them if they were to be used against you or if you were to leverage something like that. Super interesting stuff.

Take the time to get to know their business

☑ Tie in ROI

Lastly, just really take the time to get to know someone's business. It just shows that you care, and you're able to prioritize what it is that you can deliver based on where they make the most money off of the products or services that they offer. That helps you tie in the ROI of the things that you can accomplish.

☑ Know the order of products/services that make them the most money

One real quick example was my previous company. We worked with plastic surgeons, and we really worked hard to understand that funnel of how people decide to get any sort of elective procedure. It came down to two things.

It was before and after photos and price. So we knew that we could optimize for those two things and do very well in their space. So showing that you care, going the extra mile, sort of tying all of these things together, I really hope this helps. I look forward to the feedback down below. I know this was a little bit different Whiteboard Friday, but I thought it would be a fun topic to cover.

So thank you so much for joining me on this edition of Whiteboard Friday. I will see you all soon. Bye.

Video transcription by Speechpad.com


Thursday, August 23, 2018

#4: IER staff on NEPA reforms (8-23-18)

IER’s policy team discusses problems with the National Environmental Policy Act (NEPA) and the potential for reform.

Learn more about NEPA here.

The post #4: IER staff on NEPA reforms (8-23-18) appeared first on IER.

#3: Dr. Robert Murphy on carbon taxes (8-21-18)

Alex and Jordan sit down with IER’s Senior Economist Dr. Robert Murphy to discuss carbon taxes.

Links
Robert Murphy on IPCC data

The Case Against a U.S. Carbon Tax

Robert Murphy on climate insurance

The Benefits of Procrastination: The Economics of Geo-engineering

Carbon Taxes and the “Tax Interaction Effect”

More from Robert Murphy on the tax interaction effect

Rolling the DICE – William Nordhaus’s Dubious Case for a Carbon Tax

Dr. Robert Murphy testifies before the US Senate Committee on Environment and Public Works – 2013

The post #3: Dr. Robert Murphy on carbon taxes (8-21-18) appeared first on IER.

#2: IER staff on CAFE changes (8-6-18)

IER’s policy team discusses possible impacts of recently proposed changes to the Corporate Average Fuel Economy (CAFE) standards.

The post #2: IER staff on CAFE changes (8-6-18) appeared first on IER.

#1: Kenny Stein on Kavanaugh (8-2-18)

IER’s director of policy, Kenny Stein, discusses key rulings issued by Brett Kavanaugh and what his appointment to the Supreme Court would mean for the energy sector.

The post #1: Kenny Stein on Kavanaugh (8-2-18) appeared first on IER.

Jack Mintz on the Politics of a Carbon Tax

Jack Mintz of the University of Calgary has an interesting article in the Financial Post explaining that carbon taxes don’t “pass the smell test” among voters. Although Mintz is speaking to a Canadian audience, some of his observations dovetail nicely with points I have been stressing for years here at IER. In particular, voters don’t want a carbon tax—despite its theoretical superiority to other regulatory approaches—because (among other reasons) they fear the pain of higher energy prices will not correspond to meaningful emission reductions, and because they don’t trust the government to honor its pledge of revenue neutrality. Mintz backs up his arguments with survey data and real-world examples of (failed) carbon taxes in action.

At the start of his article, Mintz cites a World Bank report documenting that only 23 countries have an explicit carbon tax, whereas 176 countries have “targets for renewable energy and/or energy efficiency.” On the face of it, this is surprising, because (at least in theory) a properly calibrated carbon tax can achieve the same objectives as top-down energy regulations, at substantially lower economic cost.

For example, Mintz notes that “studies have shown that electric vehicle subsidies have a carbon-tax-equivalent cost of over $600,” and further observes that in Canada “[t]he federal proposed fuel carbon standard may have a carbon price ranging from $450 to $650 per tonne.” These implicit tax rates on carbon dioxide emissions are an order of magnitude higher than the typical proposals for an explicit carbon tax. Moreover, these regulations don’t bring in tax revenue, which (again, in principle at least) could be used to reduce other taxes.
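The carbon-tax-equivalent figures Mintz cites come from dividing a policy's cost by the tons of emissions it actually avoids. A minimal sketch of that arithmetic, using hypothetical round numbers rather than Mintz's actual data:

```python
# Carbon-tax-equivalent cost of a regulation or subsidy:
# dollars spent per ton of CO2 actually avoided.
# All inputs below are hypothetical round numbers for illustration.

def implicit_carbon_price(cost_dollars, tons_co2_avoided):
    """Return the implicit price paid per ton of CO2 avoided."""
    return cost_dollars / tons_co2_avoided

# Suppose a $7,500 electric-vehicle subsidy, and suppose the EV avoids
# 12 tons of CO2 over its life versus a comparable gasoline car.
price = implicit_carbon_price(7_500, 12)
print(f"Implicit carbon price: ${price:,.0f} per ton")  # $625 per ton
```

Under these assumed numbers the subsidy works out to an implicit carbon price of $625 per ton, consistent with the order of magnitude Mintz cites for EV subsidies and fuel standards.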

So if governments are currently enacting policies with (implicit) high carbon tax rates, why not implement the real thing? In other words, why are voters so reluctant to agree to carbon taxes, if certain technocratic economists think they’re just what the doctor ordered?

Mintz argues in his article that voter skepticism is well-founded. Based on a report of surveys from around the world, he relays five reasons for the political unpopularity of carbon taxes:

(i) the levies are regressive, hitting lower income households most, (ii) voters are worried about competitiveness and employment effects, (iii) voters view the personal costs as too high, (iv) carbon taxes are believed to be ineffective in reducing emissions, and (v) governments can’t be trusted.

These reasons are self-explanatory—and the interested reader can check out Mintz’s full discussion—but let me amplify his points that overlap with ones I have been making here at IER over the years.

Proponents of a carbon tax admit that it will raise energy prices—that’s the point, after all, to get households and businesses to change their behavior and reduce greenhouse gas emissions. Yet proponents claim that the revenue from a carbon tax can be used to cushion the impact on poorer households. However, that claim is misleading if it’s construed as families “making money” from the tax.

Also, the promises of compensation for the poor conflict with the other promises made for the receipts—such as funding “green” technology, giving tax rate reduction to corporations and high-income individuals (to promote GDP growth), and giving a fair lump-sum rebate to everybody. Oren Cass rightfully refers to this rhetoric as the carbon tax shell game.

In the United States, a small but vocal group of intellectuals and former policymakers have been beating the drum for a supposedly conservative/libertarian carbon tax. Chief among the claims is that a modest carbon tax could be implemented as part of a package deal, in which the government simultaneously phases out top-down energy regulations and strictly abides by a revenue-neutral use of the new money.

Mintz’s article shows just how naïve such claims are. He points out that governments around the world prefer top-down regulations because of their hidden cost, and he also explains that no jurisdiction has obeyed its promises for a revenue-neutral carbon tax. He cites two recent examples from Canada:

[V]oters don’t trust governments who so often break their promises. A government might argue that carbon revenues will be used to reduce taxes or provide a certain subsidy. But with no ironclad way to make sure the revenues are earmarked for a purpose, a government can easily use the money to expand their budgets and increase the size of the bureaucracy. When B.C. and Alberta brought in their carbon taxes, they promised that revenues would not be used for the general budget.

Both provinces have now backtracked, increasing cynicism among voters. These arguments underlying the political fragility of carbon taxes leave out another important dimension: non-transparency.

Elsewhere, I have explained that British Columbia—which used to be the poster child among technocratic economists for a carbon tax “done right”—exhibits the familiar consequences of a tax hike. Not only did the provincial government eventually renege on its promise of revenue neutrality, but the claims that the BC economy was unharmed by its carbon tax also fall apart under scrutiny. It turns out that deliberately making electricity and transportation more expensive is not good for the economy.

Conclusion

Jack Mintz is a Canadian expert on the economics of energy. His recent article explains the very sensible reasons that voters around the world are skeptical of a carbon tax. Despite the assurances of some academics and policymakers, the general public is wary of promises that new tax revenue will be refunded back to them to shield them from a policy that deliberately raises energy prices.

The post Jack Mintz on the Politics of a Carbon Tax appeared first on IER.