Friday, September 29, 2017

Paid Social for Content Marketing Launches - Whiteboard Friday

Posted by KaneJamison

Stuck in a content marketing rut? Relying on your existing newsletter, social followers, or email outreach won't do your launches justice. Boosting your signal with paid social both introduces your brand to new audiences and improves your launch's traffic and results. In today's Whiteboard Friday, we're welcoming back our good friend Kane Jamison to highlight four straightforward, actionable tactics you can start using ASAP.

Paid social for content marketing launches




Video Transcription

Howdy, Moz fans. My name is Kane. I'm the founder of a content marketing agency here in Seattle called Content Harmony, and we do a lot of content marketing projects where we use paid social to launch them and get better traffic and results.

So I spoke about this, this past year at MozCon, and what I want to do today is share some of those tactics with you and help you get started with launching your content with some paid traction and not just relying on your email outreach or maybe your own existing email newsletter and social followers.

Especially for a lot of companies that are just getting started with content marketing, that audience development component is really important. A lot of people just don't have a significant market share of their industry subscribed to their newsletter. So it's great to use paid social in order to reach new people, get them over to your most important content projects, or even just get them over to your week-to-week blog content.

Social teaser content

So the first thing I want to start with is expanding a little bit beyond just your average image ad. A lot of social networks, especially Facebook, are promoting video heavily nowadays. You can use that to get a lot cheaper engagement than you can from a typical image ad. If you've logged in to your Facebook feed lately, you've probably noticed that aside from birth announcements, there's a lot of videos filling up the feed. So as an advertiser, if you want to blend in well with that, using video as a teaser or a sampler for the content that you're producing is a great way to kind of look natural and look like you belong in the user's feed.

So different things you can do include:

  • Short animated videos explaining what the project is and why you did it.
  • Talking-head videos with some of your executives, staff, or marketing team speaking on screen about the project you created, drumming up interest to actually get people over to the site.
So that can be really great for team recognition if you're trying to build thought leadership in your space. It's a great way to introduce the face of your team members that might be speaking at industry conferences and events. It's a great way to just get people recognizing their name or maybe just help them feel closer to your company because they recognize their voice and face.


So everybody's instant reaction, of course, is, "I don't have the budget for video." That's okay. You don't need to be a videography expert to create decent social ads. There's a lot of great tools out there.

  • Soapbox by Wistia is a great one, that's been released recently, that allows you to do kind of a webcam combined with your browser type of video. There are also tools like...
  • Bigvu.tv
  • Shakr
  • Promo, which is a tool by a company called Slidely, I think.

All of those tools are great ways to create short, 20-second, 60-second types of videos. They let you create captions. So if you're scrolling through a social feed and you see an autoplay video, there's a good chance that the audio on that is turned off, so you can create captions to let people know what the video is about if it's not instantly obvious from the video itself. So that's a great way to get cheaper distribution than you might get from your typical image ad, and it's really going to stick out to users because most other companies aren't spending the time to do that.

Lookalike audiences

Another really valuable tactic is to create lookalike audiences from your best customers. Now, you can track your best customers in a couple of ways:
  • You could have a pixel, a Facebook pixel or another network pixel on your website that just tracks the people that have been to the site a number of times or that have been through the shopping cart at a certain dollar value.
  • We can take our email list and use the emails of customers that have ordered from us or just the emails of customers that are on our newsletter that seem like they open up every newsletter and they really like our content.

We can upload those into a custom audience in the social network of our choice and then create what's called a lookalike audience. In this case, I'd recommend what's called a "one percent lookalike audience." If you're targeting people in the US, that means the one percent of people in the US who appear most like your audience. So if your audience is men ages 35 to 45 who are interested in a specific topic, the lookalike audience will probably be a lot of other men in a similar age group who like similar topics.

So Facebook is making that choice, which means you may or may not get the perfect audience right from the start. So it's great to test additional filters on top of the default lookalike audience. So, for example, you could target people by household income. You could target people by additional interests that may or may not be obvious from the custom audience, just to make sure you're only reaching the users that are interested in your topic. Whatever it might be, if this is going to end up being three or four million people at one percent of the country, it's probably good to go ahead and filter that down to a smaller audience that's a little bit closer to your exact target that you want to reach. So excellent way to create brand awareness with that target audience.
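The audience sizes mentioned above are simple arithmetic to sanity-check. A minimal sketch, assuming a rough 2017 US population figure for illustration:

```python
# Rough sizing of a "one percent lookalike audience" (illustrative figures only).
US_POPULATION = 323_000_000  # approximate 2017 US population (assumption)

lookalike_pct = 0.01  # a 1% lookalike: the 1% of the country most like your seed audience
audience_size = int(US_POPULATION * lookalike_pct)

print(f"1% lookalike audience: ~{audience_size:,} people")
```

That lands at roughly three million people, which matches the "three or four million" ballpark cited above and shows why layering extra filters on top is usually worthwhile.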

Influencers

The next thing I'd like you to test is getting your ads and your content in front of influencers in your space. That could mean...
  • Bloggers
  • Journalists
  • Or it could just mean people like page managers in Facebook, people that have access to a Facebook page that can share updates. Those could be social media managers. That could be bloggers. That could even be somebody running the page for the local church or a PTA group. Regardless, those people are probably going to have a lot of contacts, be likely to share things with friends and family or followers on social media.

Higher cost but embedded value

When you start running ads to this type of group, you're going to find that it costs a little bit more per click. If you're used to paying $0.50 to $1.00 per click, you might end up paying $1.00 or $2.00 per click to reach this audience. That's okay. There's a lot more embedded value with this audience than the typical user, because they're likely, on average, to have more reach, more followers, more influence.

Test share-focused CTAs

It's worth testing share-focused calls to action, meaning you encourage people to share the content with people they know who might be interested, or even to post it to their page. It may not work every time, but it's certainly worth trying.

Filters

So the way we recommend reaching most of these users is through something like a job title filter. If somebody says they're a blogger or an editor-in-chief, that's the clearest way to reach them. They may not always list that as their job title, though, so you could also target by employer. That's another good option.

I recommend combining that with broad interests. So if I am targeting journalists because I have a new research piece out, it's great for us to attach interests that are relevant to our space. If we're in health care, we might target people interested in health care and the FDA and other big companies in the space that they'd likely be following for updates. If we're in fashion, we might just be selecting people that are fans of big brands, Nordstrom and others like that. Whatever it is, you can take this audience of a few hundred thousand or whatever it might be down to just a few thousand and really focus on the people that are most likely to be writing about or influential in your space.

Retarget non-subscribers

The fourth thing you can test is retargeting non-subscribers. So a big goal of content marketing is having those pop-ups or calls to action on the site to get people to download a bigger piece of content, download a checklist, whatever it might be, so that we can get them on our email newsletter. There are a lot of people that are going to click out of that. 90% to 95% or more of the people that visit your site probably aren't going to take that call to action.


So what we can do is convert this into more of a social ad unit and just show the same messaging to the people that didn't sign up on the site. Maybe they just hate pop-ups by default. They will never sign up for them. That's okay. They might be more receptive to a lead ad in Facebook that says "subscribe" or "download" instead of something that pops up on their screen.

Keep testing new messaging

The other thing we can do is start testing new messages and new content. Maybe this offer wasn't interesting to them because they don't need that guide, but maybe they need your checklist instead, or maybe they'd just like your email drip series that has an educational component to it. So keep testing different types of messaging. Just because this one wasn't valuable doesn't mean your other content isn't interesting to them, and it doesn't mean they're not interested in your email list.

Redo split tests from your site

We can keep testing messaging. So if we are testing messaging on our site, we might take the top two or three and test that messaging on ads. We might find that different messaging works better on social than it does on pop-ups or banners on the site. So it's worth redoing split tests that seemed conclusive on your site because things might be different on the social media network.


So that's it for today. What I'd love for you guys to do is if you have some great examples of targeting that's worked for you, messaging that's worked for you, or just other paid social tactics that have worked really well for your content marketing campaigns, I'd love to hear examples of that in the comments on the post, and we'd be happy to answer questions you guys have on how to actually get some of this stuff done. Whether it's targeting questions, how to set up lookalike audiences, anything like that, we'd be happy to answer questions there as well.

So that's it for me today. Thanks, Moz fans. We'll see you next time.


Video transcription by Speechpad.com



SEI Alumni Highlight: Daniel Saldarriaga – Vulcano Operations Director in Colombia Shares Advice and Experience

“We can’t live on this planet leaving a CO2 mark – there has to be a way to change this destructive behavior that humans have. I think solar is a way to change that.”
– Daniel Saldarriaga, Vulcano’s Operations Director

“On the other hand, we have to help the economy. Not the macro economy, which people in general don’t understand, but the day-to-day economy. Solar helps the average person save money, and that is what we aim to highlight at the company.”

Vulcano was born in MedellĂ­n, Colombia, and has been in the energy industry for about 90 years. Just four years ago, the company started innovating with different technologies, and that is how it got into solar. Daniel is a Business Administrator with a passion for energy who started working for Vulcano 10 years ago. Experiencing his company’s transition to solar energy sparked a deep curiosity in Daniel about PV technologies.

With no previous solar energy knowledge, he started searching for information until he found SEI’s free Spanish online class, ER100. He realized then that the expertise and curriculum SEI offered were industry-leading and decided to pay for the first course in the Professional Solar Training Program track. Once he finished, he realized he needed a place to practice what he had learned, so he took SEI’s lab classes in Paonia, Colorado. He went on to complete the entire program of study, and SEI is proud to announce that Daniel is graduating this month. Daniel shared with us what SEI has meant for his career:

“Our solar installation sales have grown 200% in the last year, while other solar companies in Colombia have had no growth at all. By placing myself in the customer’s shoes, I learned what they need: someone who knows what they are doing. They need someone who can provide guidance on the best option. And I learned all of that from SEI: how to understand customer needs, how to explain in simple words how solar works, and what the best option is for each of them. One other thing I learned is that you might not always find everything you need on the market, especially in Colombia, so you’ll have to find a way to work with what you have and be creative, without losing efficiency or safety.”

Daniel told us that a big portion of the company’s sales growth came from fixing errors that other, less experienced solar installers had made. The company is making great efforts to change the mindset of people who had a bad experience, helping them understand that it is not that solar doesn’t work, but that their system was not properly installed. Those failures created a negative impact on Colombia’s solar industry, and in order to overcome that barrier, Vulcano has installed two solar PV systems at its headquarters so people can experience how a properly installed solar system is supposed to work.

“To fully understand how a solar PV system works, you’ll need more than a week. SEI courses are great; they give you an enormous amount of knowledge and perspective, but it is important to go out there and experience for yourself how solar works in your local region. You leave SEI understanding all the concepts, but it is important to practice them in your own country to understand the differences,” Daniel says, adding, “SEI instructors are amazing because of that: they will teach you what the books say, what the regulations say, and what you will actually find in the field.”

Daniel’s advice to anyone interested in working in the solar industry is to keep studying, always: take continuing education training, read solar magazines and books, search the internet, and above all, STAY CONNECTED WITH SEI.

“The market is moving fast; if you go one week without learning something new, you will get left behind.”

We want to thank Daniel for his time and for generously sharing his experience and good practices, helping create a better-educated industry. We hope other solar professionals will share Daniel’s curiosity and be motivated to continuously learn how to become better solar technicians and professionals, so we can all achieve a world powered by solar energy.

 

The post SEI Alumni Highlight: Daniel Saldarriaga – Vulcano Operations Director in Colombia Shares Advice and Experience appeared first on Solar Energy International (SEI).

No Need for a Carbon Tax in Any Tax Reform Plan

Although there is no mention of a carbon tax in the recently released GOP blueprint for tax reform, there had been the familiar chatter of a “grand bargain” wherein Democrats get a carbon tax and Republicans get corporate income tax relief. For example, Edward Kleinbard wrote such an article for the Wall Street Journal earlier in the week.

Because this issue will no doubt continue resurfacing, it’s important to expose the flaws in Kleinbard’s case. He simply ignores the political impossibility of his proposal: why would Democrats agree to a massive new tax falling on poor people, in order to fund tax cuts for corporations? Furthermore, why do we need a “revenue neutral” tax reform plan? As I’ll show, federal spending and taxation are both at relatively high levels, historically speaking. If policymakers want to reduce the deficit, they should trim their budgets, not enact a massive new tax on energy and transportation.

Kleinbard Proposes a False Bargain

Early in his article Kleinbard promises his readers that “there is a powerful bipartisan grand bargain in corporate tax policy waiting to be struck.” Kleinbard goes on to explain:

Republicans are between a rock and a hard place. Growth comes from a permanent low corporate tax rate, not one that expires in 10 years. The GOP should embrace a new revenue-raiser that can attract moderate Democrats without undercutting the economic benefits of reform. The answer? A carbon tax, which raises revenue, satisfies long-term economic efficiency and environmental goals, and is as important to Democrats as corporate tax rate reduction is to Republicans.

Most Republican politicians hate the idea of a carbon tax…And progressive Democrats would never agree to revenue-losing corporate tax reform. Both sides should hold their noses and work toward major corporate tax reform financed in part with a carbon tax. [Kleinbard, emphasis added.]

Look at the way Kleinbard stacks the deck: He simply rules out a net tax cut, by claiming that “progressive Democrats would never agree to revenue-losing corporate tax reform.” Maybe he’s right and maybe he’s wrong, but does Kleinbard really think progressive Democrats would be okay with major corporate tax cuts if it were financed by a big tax hike falling most heavily on poor people?

We don’t need to speculate on this question. We can survey the reaction from progressive Democrats to the current GOP proposal, which involves net tax cuts (and no carbon tax). For example, Senate Minority Leader Chuck Schumer (D-NY) says, “It seems that President Trump and Republicans have designed their plan to be cheered in the country clubs and the corporate boardrooms,” and claimed that Republicans are “going to be in for a rude awakening as the American people are going to rise up against this,” because “It’s little more than an across-the-board tax cut for America’s millionaires and billionaires.”

So does Kleinbard really think the way to get Chuck Schumer on board would be to couple the “across-the-board tax cut for America’s millionaires and billionaires” with a regressive tax that makes electricity, gasoline, and heating oil more expensive?

Can’t Use Carbon Tax Receipts for Multiple Purposes

In the block quotation above, Kleinbard opens up a can of worms when he says that both sides should work for corporate tax reform financed in part by a new carbon tax. As I’ve stressed countless times here at IER, you don’t get a “win-win” by doing a buffet approach. If you levy (say) a trillion dollar carbon tax, and then only devote (say) a third of the money to net corporate tax cuts while allocating the rest of the revenue to provide assistance to low-income households and to fund “green energy” investments, then even on paper you don’t get a boost to conventional economic growth. I’ve already argued that a literal swap—where 100% of all net corporate tax cuts are paid for by a carbon tax—would be worse politically than just the corporate tax cuts.

It’s not clear from his article if Kleinbard means that other, more politically popular tax changes (such as increasing the standard deduction to help regular households) would also be thrown into the “grand bargain” to achieve revenue neutrality, but if so, then he should admit to his readers, “Even if my proposal were carried out perfectly, standard models say it would reduce GDP growth.” If Kleinbard wants to justify such an outcome on environmental grounds, fair enough, but that’s not the impression he left the WSJ readers with his actual article.

Why Insist on Revenue-Neutral Tax Reform?

Proponents of a carbon tax act as if any corporate (or other) tax relief must be “paid for” with a levy on greenhouse gas emissions. But why? The following chart—based on OMB historical data—shows that both federal spending and tax receipts are at relatively high levels, historically speaking:

Figure 1. Federal Government Outlays and Receipts, as % of GDP, 1930-2016 (annual)

Source: OMB Historical tables

As Figure 1 indicates, federal government receipts in 2016 stood at 17.8 percent of the entire economy. Look at the dashed black line, where I’ve drawn the current level of the federal tax take back through time. For most of U.S. history, the federal government has taken far less in tax receipts. In other words, for most of U.S. history, the blue line is below the current dashed black line.

Now when it comes to spending, the situation is even clearer: the 2016 level of federal spending—at 20.9 percent of the economy—was much higher than the historical average, at least if we set aside spending during World War II.

To get a more formal result rather than simply eyeballing the chart, consider: From 1950 through 2016, the average figure for federal outlays was 19.5 percent of GDP, compared to the 2016 value of 20.9, meaning we are currently spending 1.4 percentage points more of the economy. On the receipts side, the average from 1950 through 2016 was 17.3 percent of GDP, coming in 0.5 percentage points below the 2016 tax take of 17.8 percent.
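The percentage-point comparisons above are straightforward to verify; a quick sketch using only the figures quoted in the text:

```python
# Federal outlays and receipts as a % of GDP (figures quoted in the text).
outlays_2016 = 20.9          # 2016 federal outlays, % of GDP
outlays_avg_1950_2016 = 19.5 # 1950-2016 average outlays, % of GDP
receipts_2016 = 17.8         # 2016 federal receipts, % of GDP
receipts_avg_1950_2016 = 17.3  # 1950-2016 average receipts, % of GDP

outlay_gap = outlays_2016 - outlays_avg_1950_2016    # percentage points above the postwar norm
receipt_gap = receipts_2016 - receipts_avg_1950_2016

print(f"Outlays: +{outlay_gap:.1f} pts vs. postwar average")
print(f"Receipts: +{receipt_gap:.1f} pts vs. postwar average")
```

This reproduces the 1.4-point spending gap and 0.5-point receipts gap cited in the paragraph above.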

In summary, the federal government is currently spending and taxing more than the postwar norm, even when adjusting for the growth in the economy (let alone in terms of dollar amounts). There is no reason to rule out net tax cuts a priori; the U.S. seemed to get along just fine in earlier decades with a smaller tax burden.

If policymakers and pundits are worried about the growing federal debt—as they should be—they should look at spending cuts, not more taxes. Indeed, Kleinbard himself linked to a 2016 CBO study that assessed a carbon tax, but it also listed 54 options for cutting spending.

Conclusion

It is simply not true that a carbon tax is necessary to achieve tax reform. The federal government currently takes in half a percentage point of GDP more in receipts than the postwar average, while spending is 1.4 percentage points higher. If we insist that any tax reform be “deficit-neutral,” then it can be matched with corresponding spending cuts that the CBO has studied, and elsewhere I have itemized more than a trillion dollars in liquid assets that the federal government could privatize.

The fundamental flaw in claims for a “carbon tax swap” involving major corporate tax relief is that Democrats would never agree to such a move, and indeed it’s not clear that even conservatives could in good conscience support a measure that would tie a giant tax cut for the wealthy to a regressive carbon tax that hits the poor the hardest.

Kleinbard concedes in his WSJ article that Republicans and Democrats agree that the current tax code is broken. There’s no need to “fix it” by adding in a giant new tax on energy and transportation, which is what a carbon tax means in practice. If the political process has produced a crazy tax code with stultifying barriers to savings and investment, why would we trust that same political process to do the “optimal” thing with a giant new carbon tax?

The post No Need for a Carbon Tax in Any Tax Reform Plan appeared first on IER.

The Jones Act: Distorting American Energy Markets Since 1920

One notable consequence of this hurricane season has been the renewed interest in the controversial Jones Act. Enacted in 1920, it mandates that only vessels that are built, owned, crewed, and flagged in the United States can participate in maritime shipping between domestic ports. In the wake of Hurricane Harvey and Hurricane Irma, the Department of Homeland Security temporarily suspended the controversial law in order to increase the supply of refined fuel to areas affected by the storms. Yesterday, President Trump also waived Jones Act shipping restrictions to Puerto Rico as they were holding back response efforts to Hurricane Maria. The temporary suspension of the legislation is noteworthy because one of the stated purposes of the Jones Act is to better prepare the country for natural disasters. Waiving these provisions in the wake of these recent hurricanes is an admission that the legislation actually hinders disaster relief by limiting the supply of ships that can legally be used to transport goods between American ports.

National Security or Protectionism?

The alleged purpose of the Jones Act is to improve national defense by protecting the country’s ability to build and maintain a fleet of ships for use in foreign military conflict and respond to natural disasters. Several federal agencies enforce the legislation, including U.S. Customs and Border Protection, the Coast Guard, and the Federal Maritime Commission. These agencies have interpreted the Jones Act and subsequent legislation to include nearly every kind of commercial vessel. Although advocates of the Jones Act are quick to cite national security as its primary purpose, the Mercatus Center’s Thomas Grennes points out there is little evidence the legislation actually contributes to national defense. He writes:

Whatever the act might have contributed to US military operations abroad in the past, the potential contribution must be diminishing. The Jones Act–eligible fleet continues to get smaller and older. The number of large Jones Act commercial ships was 193 in 2000, but by 2014 there were only 90. The total Jones Act fleet of all sizes contains more ships, but a large percentage of the vessels are ferries or tugboats, which would contribute little to distant military actions. The earlier contributions of Jones Act ships to US military operations in Afghanistan and Iraq were judged to be minimal by prominent analysts. Rob Quartel, former US federal maritime commissioner and maritime security analyst, has written unfavorably about the contributions of Jones Act ships during the Gulf War: Of the “armada” of 460 ships that transported military materials into Saudi ports, “no Jones Act vessels participated… The success of the military sealift—a brilliant feat of logistics—occurred despite (rather than because of)” the Jones Act. The Jones Act had to be suspended to provide for fueling of ships.

It’s clear that proponents of the Jones Act have cleverly framed the need for this legislation on the basis of national security in order to distract from its actual purpose: the Jones Act is economic protectionism in its rankest form. Like all protectionist legislation, the Jones Act can deliver concentrated benefits to a select few because those benefits are highly visible to their recipients, while the costs are dispersed across everyone else. By prohibiting foreign ships from entering the domestic maritime shipping market in the United States, special interests have effectively shielded the American shipping industry from 90 percent of its competition in the world shipping market. This protection comes at great cost to the American consumer.

A quick look at who exactly supports the Jones Act reveals the cronyism and economic protectionism at play. The act currently has support from members of both parties in Congress and was supported previously by the Obama administration. One of its strongest supporters is Representative Duncan Hunter of California, whose district is home to the NASSCO shipyard—the largest U.S. shipyard employer as of 2014. Not surprisingly, the groups that benefit most from the protection from foreign competition also largely support the act; they include shipbuilding companies and labor unions that represent the merchant marine. The fact that special interests have expanded their lobbying efforts to influence the enforcement of the law reveals the degree of influence these groups have on the legislation. Recently, a group called the Offshore Marine Services Association successfully pushed U.S. Customs and Border Protection to dedicate resources to an enforcement unit called the Jones Act Division of Enforcement.

Costs of the Jones Act 

The obvious economic effect of the Jones Act is that it excludes foreign ships from the domestic maritime shipping market. Limiting the supply of domestic shippers increases the cost of shipping goods between domestic ports relative to what it would be in a more competitive market; these costs are then passed on to consumers. The higher costs stem from a combination of factors, including the increased cost of producing a ship in an American shipyard (four to five times higher than the cost of an imported ship) as well as the increased operating costs of employing an American crew. The Congressional Research Service has shown that the daily operating costs of American vessels bound by the Jones Act can be more than twice those of comparable foreign ships. In 1999, the U.S. International Trade Commission reported that the Jones Act costs American consumers $1.32 billion annually. Areas like Alaska, Hawaii, Puerto Rico, and Guam feel these costs disproportionately because their geographic locations limit their ability to use alternative forms of transportation, such as rail or trucking, to move goods.
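The dispersed-cost dynamic is easy to make concrete with the 1999 ITC estimate. A minimal sketch, assuming a rough 1999 US population figure for illustration:

```python
# Spread the estimated annual Jones Act cost across the US population.
annual_cost_usd = 1.32e9        # USITC 1999 estimate of annual consumer cost
us_population = 270_000_000     # approximate 1999 US population (assumption)

per_capita_cost = annual_cost_usd / us_population
print(f"~${per_capita_cost:.2f} per person per year")
```

A few dollars per person per year is too small for any individual consumer to notice or fight, while the same total is large enough to motivate intense lobbying by the concentrated beneficiaries.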

Excluding foreign competition from the domestic maritime shipping market also reduces competition for services and grants domestic firms increased monopoly power. This allows domestic companies to charge higher prices and reduces their incentive to adapt to better meet consumer demand.

The Jones Act and Energy Markets

The economic effects of the Jones Act have had a substantial impact on the U.S. energy market due to the scope of the industry’s supply chain. These effects have been amplified in recent years as the shale revolution has increased the demand for domestic transportation of goods by the energy industry. The new supply of domestic crude oil made accessible by the shale revolution is replacing our former demand for imports, raising demand for domestic transportation of crude oil in the process. In the past, pipelines have predominantly been the preferred method for transporting oil within the United States, but the existing pipeline network is not designed to access the new sources of domestic crude. When you combine this with the fact that there has been strong opposition to the construction of new pipelines—particularly the Keystone XL and Dakota Access—it’s clear that maritime shipping could play a larger role in our domestic energy supply chain. Unfortunately, the Jones Act continues to prevent foreign ships from participating in the transportation of crude oil between domestic ports, raising the cost of domestic shipping in the process and preventing maritime shipping from taking a larger role in the energy supply chain.

Higher costs on domestic shipping have caused two notable distortions in the energy industry’s supply chain. First, the limited supply of Jones Act–approved tankers and the higher costs associated with the legislation have caused a sharp increase in costlier rail shipments of crude oil. These increased costs are then passed on to American energy consumers, which is to say practically everyone in the United States. Second, the Jones Act has also affected where products are shipped, because the costs of domestic transportation largely determine the pattern of energy trade. To avoid the added costs of the Jones Act, some companies have opted to hire foreign ships to export crude oil to Canada instead of shipping it to a domestic refinery. This raises costs to energy consumers relative to what they would be absent the Jones Act and also diverts the economic activity created by the refining process away from the United States. In other words, the protection of American maritime jobs by the Jones Act comes at the cost of higher energy prices for everyone and the diversion of economic growth to other countries.

Repeal the Jones Act

Although many groups are quick to defend the Jones Act on the basis that it protects American jobs, it’s clear that this outdated law is doing more harm than good—this has become especially apparent following this recent string of natural disasters. By placing strict limitations on the types of ships that may be used to move goods between domestic ports, the Jones Act forces American consumers to pay higher prices for goods simply to protect the American shipping industry from foreign competition. The impact of the legislation is especially apparent in the American energy industry, as the added costs of domestic shipping have severely distorted the industry’s supply chain. Worse than that, during natural disasters when an efficient supply chain can be the difference between life and death, the Jones Act limits the supply of ships that can legally transport goods to areas affected by the disasters, driving up prices and adding another obstacle to the recovery process.

Defenders of outdated, protectionist legislation like the Jones Act often claim they put American jobs and the American people first, yet their policies do no such thing. Sure, these people can point to a slew of jobs that are protected by this legislation, but those jobs come at a cost to all of us. Going forward, the administration should follow the line of thinking that led to the recent temporary suspension of the Jones Act, as it is consistent with a fairer approach to American business. A push by the administration to repeal the Jones Act—or at the very least, a push to reform it to make it less restrictive—would lower domestic shipping costs substantially, savings that would then be passed on to the American people in the form of lower prices. In the process, the administration would be sending a strong signal that the American economy should not operate based on the protection of entrenched interests in Washington, but instead by the competition and cooperation that define a dynamic market economy.

The post The Jones Act: Distorting American Energy Markets Since 1920 appeared first on IER.

Wednesday, September 27, 2017

Alumni Spotlight: Tim Chester

A confluence of passions led Tim to spend 8 weeks pursuing solar training at SEI’s training facility in Paonia, CO. A veteran and an educator, Tim recently completed SEI’s Solar Professionals Trainer Certificate Program before beginning his new job as a solar instructor at the newly developed solar training program at the Rural Institute for Veterans Education and Research (RIVER). RIVER is a special program for veterans who need additional resources and support to pursue educational opportunities after returning to civilian life.

The new position is the perfect progression for Tim’s career. Tim came into solar as he began his career with the US Coast Guard in 1991, servicing solar-powered aids to navigation buoys. However, his career was unexpectedly cemented in solar after his retirement from the Coast Guard in 2012, when he realized his love for teaching. This led him to the Missoula College Energy Technology Program at the University of Montana. His three years of teaching at the university connected him with SEI Instructor Orion Thornton. As the renewable energy program was ending, Tim found himself wanting to focus more on PV. Orion told him, “You gotta go to SEI, there’s no way around it.”

This is how Tim found himself pursuing SEI’s Solar Professionals Trainer Certificate Program as a way to strengthen his background in solar. He said of the experience, “Even having been in the field and seeing solar every day at school for the past 5 years, the program here at SEI tied it all together. It made me understand [solar] and be able to teach it better. The combination of the classroom and the lab ties the concepts together; you can learn about it, then see it and experience it.”

Comparing his SEI experience to his time at a university, he added, “The biggest difference for me, absolutely, was the instructors… In a university system, typically those people have all the education and the theory. There’s no question about their knowledge of the concepts they’re teaching, but they don’t have any real world experience to share with you to tie it together. So to talk to people who are doing the instructing as a secondary job, but their first real job is out in the field doing what I want to do every day and what we’re talking about every day, that experience is invaluable.”

Tim is now able to build on his solar training experience. Not only was Tim able to fund his in-person classes through his Veterans Education Benefits, but he’s also greatly expanding the impact of these courses by teaching fellow veterans. Tim said his ultimate motivation for completing the training was to “continue and expand my teaching, reaching out to more students, especially those in need of technical training that can’t or won’t attend a traditional 2 or 4 year school for training. I measure my success in life by my ability to help others and improve their lives. I believe this training will allow me to be even more successful in this endeavor to help others.”


The post Alumni Spotlight: Tim Chester appeared first on Solar Training - Solar Installer Training - Solar PV Installation Training - Solar Energy Courses - Renewable Energy Education - NABCEP - Solar Energy International (SEI).

How to Track Your Local SEO & SEM

Posted by nickpierno

If you asked me, I’d tell you that proper tracking is the single most important element in your local business digital marketing stack. I’d also tell you that even if you didn’t ask, apparently.

A decent tracking setup allows you to answer the most important questions about your marketing efforts. What’s working and what isn’t?

Many digital marketing strategies today still focus on traffic. Lots of agencies/developers/marketers will slap an Analytics tracking code on your site and call it a day. For most local businesses, though, traffic isn’t all that meaningful of a metric. And in many cases (e.g. Adwords & Facebook), more traffic just means more spending, without any real relationship to results.

What you really need your tracking setup to tell you is how many leads (AKA conversions) you’re getting, and from where. It also needs to do so quickly and easily, without you having to log into multiple accounts to piece everything together.

If you’re spending money or energy on SEO, Adwords, Facebook, or any other kind of digital traffic stream and you’re not measuring how many leads you get from each source, stop what you’re doing right now and make setting up a solid tracking plan your next priority.

This guide is intended to fill you in on all the basic elements you’ll need to assemble a simple, yet flexible and robust tracking setup.

Google Analytics

Google Analytics is at the center of virtually every good web tracking setup. There are other supplemental ways to collect web analytics (like Heap, Hotjar, Facebook Pixels, etc), but Google Analytics is the free, powerful, and omnipresent tool that virtually every website should use. It will be the foundation of our approach in this guide.

Analytics setup tips

Analytics is super easy to set up. Create (or sign into) a Google account, add your Account and Property (website), and install the tracking code in your website’s template.

Whatever happens, don’t let your agency or developer set up your Analytics property on their own Account. Agencies and developers: STOP DOING THIS! Create a separate Google/Gmail account and let this be the "owner" of a new Analytics Account, then share permission with the agency/developer’s account, the client’s personal Google account, and so on.

The “All Website Data” view will be created by default for a new property. If you’re going to add filters or make any other advanced changes, be sure to create and use a separate View, keeping the default view clean and pure.

Also be sure to set the appropriate currency and time zone in the “View Settings.” If you ever use Adwords, using the wrong currency setting will result in a major disagreement between Adwords and Analytics.

Goals

Once your basic Analytics setup is in place, you should add some goals. This is where the magic happens. Ideally, every business objective your website can achieve should be represented as a goal conversion. Conversions can come in many forms, but here are some of the most common ones:

  • Contact form submission
  • Quote request form submission
  • Phone call
  • Text message
  • Chat
  • Appointment booking
  • Newsletter signup
  • E-commerce purchase

How you slice up your goals will vary with your needs, but I generally try to group similar “types” of conversions into a single goal. If I have several different contact forms on a site (like a quick contact form in the sidebar, and a heftier one on the contact page), I might group those as a single goal. You can always dig deeper to see the specific breakdown, but it’s nice to keep goals as neat and tidy as possible.

To create a goal in Analytics:

  1. Navigate to the Admin screen.
  2. Under the appropriate View, select Goals and then + New Goal.
  3. You can choose between a goal Template or Custom. Most goals are easiest to set up by choosing Custom.
  4. Give your goal a name (ex. Contact Form Submission) and choose a type. Most goals for local businesses will either be a Destination or an Event.

Pro tip: Analytics allows you to associate a dollar value to your goal conversions. If you can tie your goals to their actual value, it can be a powerful metric to measure performance with. A common way to determine the value of a goal is to take the average value of a sale and multiply it by the average closing rate of Internet leads. For example, if your average sale is worth $1,000, and you typically close 1/10 of leads, your goal value would be $100.
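The goal-value arithmetic above is easy to sanity-check in code. This is only an illustration of the formula; the $1,000 average sale and 1-in-10 close rate are the hypothetical numbers from the example, not benchmarks.

```javascript
// Goal value = average sale value × average close rate of internet leads.
function goalValue(avgSaleValue, closeRate) {
  // Round to whole cents so floating-point noise doesn't leak into reports.
  return Math.round(avgSaleValue * closeRate * 100) / 100;
}

console.log(goalValue(1000, 0.10)); // a $1,000 sale closed 1 in 10 times → 100
```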

Form tracking

The simplest way to track form fills is to have the form redirect to a "Thank You" page upon submission. This is usually my preferred setup; it’s easy to configure, and I can use the Thank You page to recommend other services, articles, etc. on the site and potentially keep the user around. I also find a dedicated Thank You page to provide the best affirmation that the form submission actually went through.

Different forms can all use the same Thank You page, and pass along variables in the URL to distinguish themselves from each other so you don’t have to create a hundred different Thank You pages to track different forms or goals. Most decent form plugins for Wordpress are capable of this. My favorite is Gravityforms. Contact Form 7 and Ninja Forms are also very popular (and free).

Another option is using event tracking. Event tracking allows you to track the click of a button or link (the submit button, in the case of a web form). This would circumvent the need for a thank you page if you don’t want to (or can’t) send the user elsewhere when they submit a form. It’s also handy for other, more advanced forms of tracking.
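As a sketch of the event-tracking approach, here's roughly what wiring a form's submit button to a Google Analytics event might look like. The payload builder below is a plain function so the event's shape is easy to see; the form ID, category, and label values are placeholders, and the commented-out `ga(...)` call assumes the classic analytics.js snippet is already on the page.

```javascript
// Build the event hit that analytics.js expects for a form submission.
// Category/action/label values here are illustrative, not required names.
function formEventPayload(formId) {
  return {
    hitType: 'event',
    eventCategory: 'Form',
    eventAction: 'submit',
    eventLabel: formId,
  };
}

// In the browser, you'd send the event from the form's submit handler:
// document.querySelector('#quick-contact').addEventListener('submit', function () {
//   ga('send', formEventPayload('quick-contact'));
// });
```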

Here’s a handy plugin for Gravityforms that makes setting up event tracking a snap.

Once you’ve got your form redirecting to a Thank You page or generating an event, you just need to create a goal in Analytics with the corresponding value.

You can use Thank You pages or events in a similar manner to track appointment booking, web chats, newsletter signups, etc.

Call tracking

Many businesses and marketers have adopted form tracking, since it’s easy and free. That’s great. But for most businesses, it leaves a huge volume of web conversions untracked.

If you’re spending cash to generate traffic to your site, you could be hemorrhaging budget if you’re not collecting and attributing the phone call conversions from your website.

There are several solutions and approaches to call tracking. I use and recommend CallRail, which also seems to have emerged as the darling of the digital marketing community over the past few years thanks to its ease of use, great support, fair pricing, and focus on integration. Another option (so I don’t come across as completely biased) is CallTrackingMetrics.

You’ll want to make sure your call tracking platform allows for integration with Google Analytics and offers something called "dynamic number insertion."

Dynamic number insertion uses JavaScript to detect your actual local phone number on your website and replace it with a tracking number when a user loads your page.

Dynamic insertion is especially important in the context of local SEO, since it allows you to keep your real, local number on your site, and maintain NAP consistency with the rest of your business’s citations. Assuming it’s implemented properly, Google will still see your real number when it crawls your site, but users will get a tracked number.

Basically, magic.
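Stripped of the platform-specific details, the swap that dynamic number insertion performs boils down to a find-and-replace on the rendered page. This is only a sketch of the idea — real platforms like CallRail ship their own script and also match alternate number formats — and the phone numbers below are made up.

```javascript
// Replace every occurrence of the real local number with the tracking number.
// A production DNI script would also match variants like 206-555-0100.
function insertTrackingNumber(html, realNumber, trackingNumber) {
  return html.split(realNumber).join(trackingNumber);
}

const page = '<p>Call us: (206) 555-0100</p>';
console.log(insertTrackingNumber(page, '(206) 555-0100', '(206) 555-0199'));
// <p>Call us: (206) 555-0199</p>
```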

There are a few ways to implement dynamic number insertion. For most businesses, one of these two approaches should fit the bill.

Number per source

With this approach, you'll create a tracking number for each source you wish to track calls for. These sources might be:

  • Organic search traffic
  • Paid search traffic
  • Facebook referral traffic
  • Yelp referral traffic
  • Direct traffic
  • Vanity URL traffic (for visitors coming from an offline TV or radio ad, for example)

When someone arrives at your website from one of these predefined sources, the corresponding number will show in place of your real number, wherever it’s visible. If someone calls that number, an event will be passed to Analytics along with the source.

This approach isn’t perfect, but it’s a solid solution if your site gets large amounts of traffic (5k+ visits/day) and you want to keep call tracking costs low. It will do a solid job of answering the basic questions of how many calls your site generates and where they came from, but it comes with a few minor caveats:

  • Calls originating from sources you didn’t predefine will be missed.
  • Events sent to Analytics will create artificial sessions not tied to actual user sessions.
  • Call conversions coming from Adwords clicks won’t be attached to campaigns, ad groups, or keywords.

Some of these issues have more advanced workarounds. None of them are deal breakers… but you can avoid them completely with number pools — the awesomest call tracking method.

Number pools

“Keyword Pools,” as CallRail refers to them, are the killer app for call tracking. As long as your traffic doesn’t make this option prohibitively expensive (which won’t be a problem for most local business websites), this is the way to go.

In this approach, you create a pool with several numbers (8+ with CallRail). Each concurrent visitor on your site is assigned a different number, and if they call it, the conversion is attached to their session in Analytics, as well as their click in Adwords (if applicable). No more artificial sessions or disconnected conversions, and as long as you have enough numbers in your pool to cover your site’s traffic, you’ll capture all calls from your site, regardless of source. It’s also much quicker to set up than a number per source, and will even make you more attractive and better at sports!

You generally have to pay your call tracking provider for additional numbers, and you’ll need a number for each concurrent visitor to keep things running smoothly, so this is where massive amounts of traffic can start to get expensive. CallRail recommends you look at your average hourly traffic during peak times and include ¼ the tally as numbers in your pool. So if you have 30 visitors per hour on average, you might want ~8 numbers.
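CallRail's sizing rule of thumb is simple enough to express directly. This just restates the quarter-of-peak-hourly-traffic guideline above; follow your own provider's recommendation if it differs.

```javascript
// Pool size ≈ one quarter of average hourly visitors during peak times,
// rounded up so concurrent visitors don't outnumber available numbers.
function poolSize(peakHourlyVisitors) {
  return Math.ceil(peakHourlyVisitors / 4);
}

console.log(poolSize(30)); // 8 — matches the ~8 numbers suggested above
```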

Implementation

Once you’ve got your call tracking platform configured, you’ll need to implement some code on your site to allow the dynamic number insertion to work its magic. Most platforms will provide you with a code snippet and instructions for installation. If you use CallRail and Wordpress, there’s a handy plugin to make things even simpler. Just install, connect, and go.

To get your calls recorded in Analytics, you’ll just need to enable that option from your call tracking service. With CallRail you simply enable the integration, add your domain, and calls will be sent to your Analytics account as Events. Just like with your form submissions, you can add these events as a goal. Usually it makes sense to add a single goal called “Phone Calls” and set your event conditions according to the output from your call tracking service.

Google Search Console

It’s easy to forget to set up Search Console (formerly Webmaster Tools), because most of the time it plays a backseat role in your digital marketing measurement. But miss it, and you’ll forego some fundamental technical SEO basics (country setting, XML sitemaps, robots.txt verification, crawl reports, etc.), and you’ll miss out on some handy keyword click data in the Search Analytics section. Search Console data can also be indispensable for diagnosing penalties and other problems down the road, should they ever pop up.

Make sure to connect your Search Console with your Analytics property, as well as your Adwords account.

With all the basics of your tracking setup in place, the next step is to bring your paid advertising data into the mix.

Google Adwords

Adwords is probably the single most convincing reason to get proper tracking in place. Without it, you can spend a lot of money on clicks without really knowing what you get out of it. Conversion data in Adwords is also absolutely critical in making informed optimizations to your campaign settings, ad text, keywords, and so on.

If you’d like some more of my rantings on conversions in Adwords and some other ways to get more out of your campaigns, check out this recent article :)

Getting your data flowing in all the right directions is simple, but often overlooked.

Linking with Analytics

First, make sure your Adwords and Analytics accounts are linked. Always make sure you have auto-tagging enabled on your Adwords account. Now all your Adwords data will show up in the Acquisition > Adwords area of Analytics. This is a good time to double-check that you have the currency correctly set in Analytics (Admin > View Settings); otherwise, your Adwords spend will be converted to the currency set in Analytics and record the wrong dollar values (and you can’t change data that’s already been imported).

Next, you’ll want to get those call and form conversions from Analytics into Adwords.

Importing conversions in Adwords

Some Adwords management companies/consultants might disagree, but I strongly advocate an Analytics-first approach to conversion tracking. You can get call and form conversions pulled directly into Adwords by installing a tracking code on your site. But don’t.

Instead, make sure all your conversions are set up as goals in Analytics, and then import them into Adwords. This allows Analytics to act as your one-stop-shop for reviewing your conversion data, while providing all the same access to that data inside Adwords.

Call extensions & call-only ads

This can throw some folks off. You will want to track call extensions natively within Adwords. These conversions are set up automatically when you create a call extension in Adwords and elect to use a Google call forwarding number with the default settings.

Don’t worry though, you can still get these conversions tracked in Analytics if you want to (I could make an argument either for or against). Simply create a single “offline” tracking number in your call tracking platform, and use that number as the destination for the Google forwarding number.

This also helps counteract one of the oddities of Google’s call forwarding system. Google will actually only start showing the forwarding number on desktop ads after they have received a certain (seemingly arbitrary) minimum number of clicks per week. As a result, some calls are tracked and some aren’t — especially on smaller campaigns. With this little trick, Analytics will show all the calls originating from your ads — not just ones that take place once you’ve paid Google enough each week.

Adwords might give you a hard time for using a number in your call extensions that isn’t on your website. If you encounter issues with getting your number verified for use as a call extension, just make sure you have linked your Search Console to your Adwords account (as indicated above).

Now you’ve got Analytics and Adwords all synced up, and your tracking regimen is looking pretty gnarly! There are a few other cool tools you can use to take full advantage of your sweet setup.

Google Tag Manager

If you’re finding yourself putting a lot of code snippets on your site (web chat, Analytics, call tracking, Adwords, Facebook Pixels, etc), Google Tag Manager is a fantastic tool for managing them all from one spot. You can also do all sorts of advanced slicing and dicing.

GTM is basically a container that you put all your snippets in, and then you put a single GTM snippet on your site. Once installed, you never need to go back to your site’s code to make changes to your snippets. You can manage them all from the GTM interface in a user-friendly, version-controlled environment.

Don’t bother if you just need Analytics on your site (and are using the CallRail plugin). But for more robust needs, it’s well worth considering for its sheer power and simplicity.

Here’s a great primer on making use of Google Tag Manager.

UTM tracking URLs & Google Campaign URL Builder

Once you’ve got conversion data occupying all your waking thoughts, you might want to take things a step further. Perhaps you want to track traffic and leads that come from an offline advertisement, a business card, an email signature, etc. You can build tracking URLs that include UTM parameters (campaign, source, and medium), so that when visitors come to your site from a certain place, you can tell where that place was!

Once you know how to build these URLs, you don’t really need a tool, but Google’s Campaign URL Builder makes quick enough work of it that it’s bound to earn a spot in your browser’s bookmarks bar.

Pro tip: Use a tracking URL on your Google My Business listing to help distinguish traffic/conversions coming in from your listing vs traffic coming in from the organic search results. I’d recommend using:

Source: google
Medium: organic
Campaign name: gmb-listing (or something)

This way your GMB traffic still shows up in Analytics as normal organic traffic, but you can drill down to the gmb-listing campaign to see its specific performance.
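Building these URLs by hand (or in code) is straightforward; a minimal helper might look like the following. The `https://example.com/` base URL is a placeholder, and the parameter values are the GMB suggestions from above.

```javascript
// Append utm_source / utm_medium / utm_campaign to a landing page URL.
function buildUtmUrl(baseUrl, source, medium, campaign) {
  const params = new URLSearchParams({
    utm_source: source,
    utm_medium: medium,
    utm_campaign: campaign,
  });
  return baseUrl + '?' + params.toString();
}

console.log(buildUtmUrl('https://example.com/', 'google', 'organic', 'gmb-listing'));
// https://example.com/?utm_source=google&utm_medium=organic&utm_campaign=gmb-listing
```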

Bonus pro tip: Use a vanity domain or a short URL on print materials or offline ads, and point it to a tracking URL to measure their performance in Analytics.

Rank tracking

Whaaat? Rank tracking is a dirty word to conversion tracking purists, isn’t it?

Nah. It’s true that rank tracking is a poor primary metric for your digital marketing efforts, but it can be very helpful as a supplemental metric and for helping to diagnose changes in traffic, as Darren Shaw explored here.

For local businesses, we think our Local Rank Tracker is a pretty darn good tool for the job.

Google My Business Insights

Your GMB listing is a foundational piece of your local SEO infrastructure, and GMB Insights offer some meaningful data (impressions and clicks for your listing, mostly). It also tries to tell you how many calls your listing generates for you, but it comes up a bit short since it relies on "tel:" links instead of tracking numbers. It will tell you how many people clicked on your phone number, but not how many actually made the call. It also won’t give you any insights into calls coming from desktop users.

There’s a great workaround though! It just might freak you out a bit…

Fire up your call tracking platform once more, create an “offline” number, and use it as your “primary number” on your GMB listing. Don’t panic. You can preserve your NAP consistency by demoting your real local number to an “additional number” slot on your GMB listing.

I don’t consider this a necessary step, because you’re probably not pointing your paid clicks to your GMB listing. However, combined with a tracking URL pointing to your website, you can now fully measure the performance of Google My Business for your business!

Disclaimer: I believe that this method is totally safe, and I’m using it myself in several instances, but I can’t say with absolute certainty that it won’t impact your rankings. Whitespark is currently testing this out on a larger scale, and we’ll share our findings once they’re assembled!

Taking it all in

So now you’ve assembled a lean, mean tracking machine. You’re already feeling 10 years younger, and everyone pays attention when you enter the room. But what can you do with all this power?

Here are a few ways I like to soak up this beautiful data.

Pop into Analytics

Since we’ve centralized all our tracking in Analytics, we can answer pretty much any performance questions we have within a few simple clicks.

  • How many calls and form fills did we get last month from our organic rankings?
  • How does that compare to the month before? Last year?
  • How many paid conversions are we getting? How much are we paying on average for them?
  • Are we doing anything expensive that isn’t generating many leads?
  • Does our Facebook page generate any leads on our website?

There are a billion and seven ways to look at your Analytics data, but I do most of my ogling from Acquisition > All Traffic > Channels. Here you get a great overview of your traffic and conversions sliced up by channels (Organic Search, Paid Search, Direct, Referral, etc). You can obviously adjust date ranges, compare to past date ranges, and view conversion metrics individually or as a whole. For me, this is Analytics home base.

Acquisition > All Traffic > Source/Medium can be equally interesting, especially if you’ve made good use of tracking URLs.

Make some sweet SEO reports

I can populate almost my entire standard SEO client report from the Acquisition section of Analytics. Making conversions the star of the show really helps to keep clients engaged in their monthly reporting.

Google Analytics dashboards

Google’s Dashboards inside Analytics provide a great way to put the most important metrics together on a single screen. They’re easy to use, but I’ve always found them a bit limiting. Fortunately for data junkies, Google has recently released its next generation data visualization product...

Google Data Studio

This is pretty awesome. It’s very flexible, powerful, and user-friendly. I’d recommend skipping the Analytics Dashboards and going straight to Data Studio.

It will allow you to beautifully dashboard-ify your data from Analytics, Adwords, YouTube, DoubleClick, and even custom databases or spreadsheets. All the data is “live” and dynamic. Users can even change data sources and date ranges on the fly! Bosses love it, clients love it, and marketers love it… provided everything is performing really well ;)

Supermetrics

If you want to get really fancy, and build your own fully custom dashboard, develop some truly bespoke analysis tools, or automate your reporting regimen, check out Supermetrics. It allows you to pull data from just about any source into Google Sheets or Excel. From there, your only limitation is your mastery of spreadsheet-fu and your imagination.

TL;DR

So that’s a lot of stuff. If you’d like to skip the more nuanced explanations, pro tips, and bad jokes, here’s the gist in point form:

  • Tracking your digital marketing is super important.
  • Don’t just track traffic. Tracking conversions is critical.
  • Use Google Analytics. Don’t let your agency use their own account.
  • Set up goals for every type of lead (forms, calls, chats, bookings, etc).
  • Track forms with destinations (thank you pages) or events.
  • Track your calls, probably using CallRail.
  • Use "number per source" if you have a huge volume of traffic; otherwise, use number pools (AKA keyword pools). Pools are better.
  • Set up Search Console and link it to your Analytics and Adwords accounts.
  • Link Adwords with Analytics.
  • Import Analytics conversions into Adwords instead of using Adwords’ native conversion tracking snippet...
  • ...except for call extensions. Track those in Adwords AND in Analytics (if you want to) by using an “offline” tracking number as the destination for your Google forwarding numbers.
  • Use Google Tag Manager if you have more than a couple third-party scripts to run on your site (web chat, Analytics, call tracking, Facebook Pixels etc).
  • Use Google Campaign URL Builder to create tracked URLs for tracking visitors from various sources like offline advertising, email signatures, etc.
  • Use a tracked URL on your GMB listing.
  • Use a tracked number as your “primary” GMB listing number (if you do this, make sure you put your real local number as a “secondary” number). Note: We think this is safe, but we don’t have quite enough data to say so unequivocally. YMMV.
  • Use vanity domains or short URLs that point to your tracking URLs to put on print materials, TV spots, etc.
  • Track your rankings like a boss.
  • Acquisition > All Traffic > Channels is your new Analytics home base.
  • Consider making some Google Analytics Dashboards… and then don’t, because Google Data Studio is way better. So use that.
  • Check out Supermetrics if you want to get really hardcore.
  • Don’t let your dreams be dreams.

If you’re new to tracking your digital marketing, I hope this provides a helpful starting point, and helps cut through some of the confusion and uncertainty about how to best get set up.

If you’re a conversion veteran, I hope there are a few new or alternative ideas here that you can use to improve your setup.

If you’ve got anything to add, correct, or ask, leave a comment!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Utility-Scale Solar, Part 2: Failed Promise

Last week, Part I of this post challenged the declaration by DOE’s Office of Energy Efficiency and Renewable Energy (EERE) that the levelized cost of central-station (utility-scale) solar had fallen to six cents per kWh.

Specifically, EERE does not account for important costs that would roughly double its estimate, to between 11 and 14 cents per kWh.[1] These hidden costs concern solar’s (corrected) intermittency, lower operational life, higher transmission costs, and special tax benefits.

Relative competitiveness, not press-release claims, is controlling because solar is not an infant industry, and conventional technologies are not standing still. Natural-gas-fired generation, in particular, is one-fourth cheaper than the cost of central-station solar, according to DOE’s nonadvocacy arm, the Energy Information Administration. Gas-fired electricity is also a reliable, not intermittent, product.

Ancient, Not Infant

Solar is the mother and father of all energy. It is also one of the oldest, in terms of technical applications to concentrate solar rays to generate steam and electricity.

Solar collection dates from the 17th century, with numerous (market-based, off-grid) applications emerging in the twentieth century.[2] Central-station solar dates from at least 1911, when the Sun Power Company built more than 10,000 feet of solar collectors to generate steam for machine power.[3]

Observing a solar plant, the Arizona Republican concluded in 1904: “[I]t is safe to say that solar power will be one of the great influences of the new century and will make the arid regions of the West and other parts of the world the theater of the greatest industrial revolution of the future.”[4]

Past Solar Hyperbole

Excitement and exaggeration about the prospects of central-station solar took off in the 1970s, and they have not receded since. Some examples follow.

1970

“In 1970, two University of Arizona scientists attracted national attention with an ambitious proposal to turn more than 5,000 square miles of the southwestern United States desert into what they called a ‘national solar power farm’ capable of supplying all U.S. electricity needs in the twenty-first century.”[5]

1976

“Mixed solar/conventional installations could become the most economical alternative in most parts of the United States within the next few years.”[6]

1984

“The private sector can be expected to develop improved solar and wind technologies which will begin to become competitive and self-supporting on a national level by the end of the decade [1990] if assisted by tax credits and augmented by federally sponsored R&D.”[7]

1987

“I think … the consensus … is after the year 2000, somewhere between 10 and 20 percent of our energy could come from solar technologies, quite easily.”[8]

1990

“Within a few decades, a geographically diverse country such as the United States might get 30 percent of its electricity from sunshine, 20 percent from hydropower, 20 percent from windpower, 10 percent from biomass, 10 percent from geothermal energy, and 10 percent from natural-gas-fired cogeneration.”[9]

1994

“The Enron Corporation plans to build a plant in the southern Nevada desert that would be the largest operation in the country making electricity directly from sunlight, producing enough to power a city of 100,000 people. It is expected to begin operating in late 1996.”[10] [It would not be built.]

“[T]he cost of solar power generation has quietly declined by two-thirds. Far from depending on some wondrous breakthrough, the experts say, Enron can offer commercially competitive solar power by inexpensively mass-producing solar panels, and then employing thousands of them in the Nevada desert.”[11]

“Enron is pledging to deliver the electricity at 5.5 cents a kilowatt-hour in about two years. That would beat the average cost of 5.8 cents currently paid by the Government for the electricity it uses. The national average retail price is 8 cents.”[12]

1996

“Solar and wind energy technologies appear to be entering a ‘takeoff’ phase of the kind that personal computers experienced in the early 1980s.”[13]

2011

“Before maybe the end of this decade, I see wind and solar being cost-competitive without subsidy with new fossil fuel.”[14]

Conclusion

In addition to claiming victory on its 2020 target three years early, DOE’s EERE optimistically referenced these 2030 cost-reduction goals: central station, $0.03/kWh; commercial solar, $0.03/kWh; and residential solar, $0.05/kWh. New rounds of funding are centered around these targets.

Why taxpayers and government direction? Solar is not an infant industry. Past claims about competitiveness and impending market penetration ring hollow. Other technologies are still far cheaper, while being dispatchable (unlike solar).

Level-playing-field competitiveness, not press releases, must decide winners and losers. Off-grid solar, in fact, has a free-market niche unlike grid-connected solar. Such remote electricity does not require government involvement. This is reason enough to scale back, if not eliminate, government largesse in this area.


[1] Excluding the hidden costs of solar, the Energy Information Administration estimates central-station solar at 7.4 cents per kWh, 23 percent higher than EERE's figure.

[2] Cynthia Shea, “Renewable Energy: Today’s Contribution, Tomorrow’s Promise,” Worldwatch Paper 81 (January 1988), p. 27.

[3] Wilson Clark, Energy for Survival: The Alternative to Extinction (Garden City, NY: Anchor Books, 1974), p. 365.

[4] Ibid., p. 367.

[5] Ibid., p. 408.

[6] Barry Commoner, The Poverty of Power (New York: Alfred A. Knopf, 1976), p. 151.

[7] Booz, Allen & Hamilton (Study for the Solar Energy Industries Association, American Wind Energy Association, and Renewable Energy Institute). Renewable Energy Industry, Joint Hearing before the Subcommittees of the Committee on Energy and Commerce et al., House of Representatives, 98th Cong., 1st sess. (Washington, D.C.: Government Printing Office, 1983), p. 52.

[8] Statement of Scott Sklar, Solar Energy Industries Association. Quoted in Solar Power, Hearing before the Subcommittee on Energy and Power of the Committee on Energy and Commerce, House of Representatives, 100th Cong., 1st sess. (Washington, D.C.: Government Printing Office, 1987), p. 12.

[9] Christopher Flavin and Nicholas Lenssen, Beyond the Petroleum Age: Designing a Solar Economy (Washington: Worldwatch Institute, 1990), p. 47.

[10] Allen Myerson, “Solar Power, for Earthly Prices,” New York Times, November 15, 1994.

[11] Ibid.

[12] Ibid.

[13] Christopher Flavin and Odil Tunali, Climate of Hope: New Strategies for Stabilizing the World’s Atmosphere (Worldwatch Institute, 1996), p. 48.

[14] DOE Secretary Steven Chu (March 23, 2011). Quoted here.

The post Utility-Scale Solar, Part 2: Failed Promise appeared first on IER.

Tuesday, September 26, 2017

How and Why to Do a Mobile/Desktop Parity Audit

Posted by Everett

Google still ranks webpages based on the content, code, and links they find with a desktop crawler. They’re working to update this old-school approach in favor of what their mobile crawlers find instead. Although the rollout will probably happen in phases over time, I’m calling the day this change goes live worldwide “D-day” in the post below. Mobilegeddon was already taken.

You don’t want to be in a situation on D-day where your mobile site has broken meta tags, unoptimized titles and headers, missing content, or is serving the wrong HTTP status code. This post will help you prepare so you can sleep well between now and then.

What is a mobile parity audit?

When two or more versions of a website are available on the same URL, a "parity audit" will crawl each version, compare the differences, and look for errors.

When do you need one?

You should do a parity audit if content is added, removed, hidden, or changed between devices without sending the user to a new URL.

This type of analysis is also useful for mobile sites on a separate URL, but that's another post.

What will it tell you? How will it help?

Is the mobile version of the website "optimized" and crawlable? Are all of the header response codes and tags set up properly, and in the same way, on both versions? Is important textual content missing from, or hidden, on the mobile version?

Why parity audits could save your butt

The last thing you want to do is scramble to diagnose a major traffic drop on D-day when things go mobile-first. Even if you don’t change anything now, cataloging the differences between site versions will help diagnose issues if/when the time comes.

It may also help you improve rankings right now.

I know an excellent team of SEOs for a major brand who, for several months, had missed the fact that the entire mobile site (millions of pages) had title tags that all read the same: "BrandName - Mobile Site." They found this error and contacted us to take a more complete look at the differences between the two sites. Here are some other things we found:

  1. One page type on the mobile site had an error at the template level that was causing rel=canonical tags to break, but only on mobile, and in a way that gave Google conflicting instructions, depending on whether they rendered the page as mobile or desktop. The same thing could have happened with any tag on the page, including robots meta directives. It could also happen with HTTP header responses.
  2. The mobile site had fewer than half the number of navigation links in the footer. How will this affect the flow of PageRank to key pages in a mobile-first world?
  3. The mobile site has far more related products on product detail pages. Again, how will this affect the flow of PageRank, or even crawl depth, when Google goes mobile-first?
  4. Important content was hidden on the mobile version. Google says this is OK as long as the user can drop down or tab over to read the content. But in this case, there was no way to do that. The content was in the code but hidden to mobile viewers, and there was no way of making it visible.

How to get started with a mobile/desktop parity audit

It sounds complicated, but really it boils down to a few simple steps:

  1. Crawl the site as a desktop user.
  2. Crawl the site as a mobile user.
  3. Combine the outputs (e.g. Mobile Title1, Desktop Title1, Mobile Canonical1, Desktop Canonical1)
  4. Look for errors and differences.

Screaming Frog provides the option to crawl the site as the Googlebot Mobile user-agent with a smartphone device. You may or may not need to render JavaScript.

You can run two crawls (mobile and desktop) with DeepCrawl as well. However, reports like "Mobile Word Count Mismatch" do not currently work on dynamic sites, even after two crawls.

The hack to get at the data you want is the same as with Screaming Frog: namely, running two crawls, exporting two reports, and using Vlookups in Excel to compare the columns side-by-side with URL being the unique identifier.
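If you'd rather script the join than wrangle Vlookups, pandas can do the same merge on the URL column. The column names below are hypothetical; match them to your crawler's actual CSV headers.

```python
import pandas as pd

# VLOOKUP-style join in pandas: merge the two exports on the URL
# column, then flag rows where the titles disagree.
desktop = pd.DataFrame({"Address": ["/", "/blog/"],
                        "Title": ["Home", "Blog"]})
mobile = pd.DataFrame({"Address": ["/", "/blog/"],
                       "Title": ["Home", "BrandName - Mobile Site"]})

merged = desktop.merge(mobile, on="Address", how="outer",
                       suffixes=("_desktop", "_mobile"))
merged["title_mismatch"] = merged["Title_desktop"] != merged["Title_mobile"]
print(merged.loc[merged["title_mismatch"], "Address"].tolist())  # ['/blog/']
```

The `how="outer"` join also surfaces URLs found by only one of the two crawls, which a plain Vlookup silently drops.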

Here's a simplified example using an export from DeepCrawl:

As you can see in the screenshot above, blog category pages, like /category/cro/, are bigly different between device types, not just in how they appear, but also in what code and content gets delivered and rendered as source code. The bigliest difference is that post teasers disappear on mobile, which accounts for the word count disparity.

Word count is only one data point. You would want to look at many different things, discussed below, when performing a mobile/desktop parity audit.

For now, there does NOT appear to be an SEO tool on the market that crawls a dynamic site as both a desktop and mobile crawler, and then generates helpful reports about the differences between them.

But there's hope!

Our industry toolmakers are hot on the trail, and at this point I'd expect features to be released in time for D-day.

Deep Crawl

We are working on Changed Metrics reports, which will automatically show you pages where the titles and descriptions have changed between crawls. This would serve to identify differences on dynamic sites when the user agent is changed. But for now, this can be done manually by downloading and merging the data from the two crawls and calculating the differences.

Moz Pro

Dr. Pete says they've talked about comparing desktop and mobile rankings to look for warning signs so Moz could alert customers of any potential issues. This would be a very helpful feature to augment the other analysis of on-page differences.

Sitebulb

When you select "mobile-friendly," Sitebulb is already crawling the whole site first, then choosing a sample of (up to) 100 pages, and then recrawling these with the JavaScript rendering crawler. This is what produces their "mobile-friendly" report.

They're thinking about doing the same to run these parity audit reports (mobile/desktop difference checker), which would be a big step forward for us SEOs. Because most of these disparity issues happen at the template/page type level, taking URLs from different crawl depths and sections of the site should allow this tool to alert SEOs to potential mismatches between content and page elements on the two versions of the same URL.

Screaming Frog

Aside from the oversensitive hash values, SF has no major advantage over DeepCrawl at the moment. In fact, DeepCrawl has some mobile difference finding features that, if they were to work on dynamic sites, would be leaps and bounds ahead of SF.

That said, the process shared below uses Screaming Frog because it's what I'm most familiar with.

Customizing the diff finders

One of my SEO heroes, David Sottimano, whipped out a customization of John Resig's Javascript Diff Algorithm to help automate some of the hard work involved in these desktop/mobile parity audits.

You can make a copy of it here. Follow the instructions in the Readme tab. Note: This is a work in progress and is an experimental tool, so have fun!

On using the hash values to quickly find disparities between crawls

As Lunametrics puts it in their excellent guide to Screaming Frog Tab Definitions, the hash value "is a count of the number of URLs that potentially contain duplicate content. This count filters for all duplicate pages found via the hash value. If two hash values match, the pages are exactly the same in content."

I tried doing this, but found it didn't work very well for my needs, for two reasons: I was unable to adjust the sensitivity, and if even one minor client-side JavaScript element changed, the page would get a new hash value.

When I asked DeepCrawl about it, I found out why:

The problem with using a hash to flag different content is that a lot of pages would be flagged as different, when they are essentially the same. A hash will be completely different if a single character changes.
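You can see the brittleness with a few lines of standard-library Python. The page snippets are made up for illustration: a one-word change produces a completely different digest, while a similarity ratio (like difflib's) degrades gracefully.

```python
import difflib
import hashlib

# Why exact hashes are too brittle for content comparison: a tiny,
# client-side-sized change flips the hash entirely.
page_a = "<html><body><h1>Widgets</h1><p>Buy our widgets today.</p></body></html>"
page_b = page_a.replace("Buy", "Try")  # one-word difference

hash_a = hashlib.md5(page_a.encode()).hexdigest()
hash_b = hashlib.md5(page_b.encode()).hexdigest()
print(hash_a == hash_b)  # False: hashes give no partial credit

# A similarity ratio reflects that the pages are nearly identical
ratio = difflib.SequenceMatcher(None, page_a, page_b).ratio()
print(ratio > 0.9)  # True
```

A thresholded similarity score would let you tune the sensitivity that the all-or-nothing hash comparison lacks.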

Mobile parity audit process using Screaming Frog and Excel

Run two crawls

First, run two separate crawls. Settings for each are below. If you don't see a window or setting option, assume it was set to default.

1. Crawl 1: Desktop settings

Configurations ---> Spider

Your settings may vary (no pun intended), but here I was just looking for very basic things and wanted a fast crawl.

Configurations ---> HTTP Header ---> User-Agent

2. Start the first crawl

3. Save the crawl and run the exports

When finished, save it as desktop-crawl.seospider and run the Export All URLs report (big Export button, top left). Save the export as desktop-internal_all.csv.

4. Update user-agent settings for the second crawl

Hit the "Clear" button in Screaming Frog and change the User-Agent configuration to the following:

5. Start the second crawl

6. Save the crawl and run the exports

When finished, save it as mobile-crawl.seospider and run the Export All URLs report. Save the export as mobile-internal_all.csv.

Combine the exports in Excel

Import each CSV into a separate tab within a new Excel spreadsheet.

Create another tab and bring in the URLs from the Address column of each crawl tab. De-duplicate them.

Use Vlookups or other methods to pull in the respective data from each of the other tabs.

You'll end up with something like this:

A tab with a single row per URL, but with mobile and desktop columns for each datapoint. It helps with analysis if you can conditionally format/highlight instances where the desktop and mobile data does not match.

Errors & differences to look out for

Does the mobile site offer similar navigation options?

Believe it or not, when done right, you can usually fit the same number of navigation links onto a mobile site without ruining the user experience. Here are a ton of examples of major retail brands approaching it in different ways, from mega navs to sliders and hamburger menus (side note: now I’m craving White Castle).

HTTP Vary User-Agent response headers

This is one of those things that seems like it could produce more caching problems and headaches than solutions, but Google says to use it in cases where the content changes significantly between mobile and desktop versions on the same URL. My advice is to avoid using Vary User-Agent if the variations between versions of the site are minimal (e.g. simplified navigation, optimized images, streamlined layout, a few bells and whistles hidden). Only use it if entire paragraphs of content and other important elements are removed.
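Checking for the header during your audit is straightforward to script. A small sketch in Python, where the header dicts are stand-ins for whatever your HTTP client returns:

```python
def serves_vary_user_agent(headers):
    """True if the response's Vary header lists User-Agent
    (case-insensitive, per the comma-separated header syntax)."""
    vary = headers.get("Vary", "")
    return "user-agent" in [token.strip().lower() for token in vary.split(",")]

# Example response headers (invented for illustration)
print(serves_vary_user_agent({"Vary": "Accept-Encoding, User-Agent"}))  # True
print(serves_vary_user_agent({"Vary": "Accept-Encoding"}))              # False
```

Run this against your key templates so you know whether the site is already signaling UA-based content variation to caches and crawlers.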

Internal linking disparities

If your desktop site has twenty footer links to top-selling products and categories using optimized anchor text, and your mobile site has five links going to pages like “Contact Us” and “About,” it would be good to document this so you know what to test should rankings drop after a mobile-first ranking algorithm shift.

Meta tags and directives

Do things like title tags, meta descriptions, robots meta directives, rel=canonical tags, and rel=next/prev tags match on both versions of the URL? Discovering this stuff now could avert disaster down the line.

Content length

There is no magic formula to how much content you should provide to each type of device, just as there is no magic formula for how much content you need to rank highly on Google (because all other things are never equal).

Imagine it's eight months from now and you're trying to diagnose what specific reasons are behind a post-mobile-first algorithm update traffic drop. Do the pages with less content on mobile correlate with lower rankings? Maybe. Maybe not, but I'd want to check on it.

Speed

Chances are, your mobile site will load faster. However, if this is not the case, you definitely need to look into the issue. Lots of big client-side JavaScript changes could be the culprit.

Rendering

Sometimes JavaScript and other files necessary for the mobile render may be different from those needed for the desktop render. Thus, it's possible that one set of resources may be blocked in the robots.txt file while another is not. Make sure both versions fully render without any blocked resources.
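Python's standard library can check a list of render-critical resources against robots.txt rules. The rules and asset URLs below are invented for illustration; feed in your site's real robots.txt and the resources each version of the page loads.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks a mobile-only asset directory
robots_txt = """\
User-agent: *
Disallow: /mobile-assets/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Resources needed to render each version of the page (illustrative)
resources = ["https://example.com/css/site.css",
             "https://example.com/mobile-assets/menu.js"]
for url in resources:
    if not parser.can_fetch("Googlebot", url):
        print("Blocked from rendering:", url)
```

Any resource flagged here would be invisible to Google's renderer, so the mobile page it evaluates may not match what users see.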

Here’s what you need to do to be ready for a mobile-first world:

  1. Know IF there are major content, tag, and linking differences between the mobile and desktop versions of the site.
  2. If so, know WHAT those differences are, and spend time thinking about how that might affect rankings if mobile was the only version Google ever looked at.
  3. Fix any differences that need to be fixed immediately, such as broken or missing rel=canonicals, robots meta, or title tags.
  4. Keep everything else in mind for things to test after mobile-first arrives. If rankings drop, at least you’ll be prepared.

And here are some tools & links to help you get there:

I suspect it won't be long before this type of audit is made unnecessary because we'll ONLY be worried about the mobile site. Until then, please comment below to share which differences you found, and how you chose to address them so we can all learn from each other.

