Friday, August 30, 2019

Labor Day Is Also Thank Energy Day

“Energy will do anything that can be done in the world; and no talents, no circumstances, no opportunities will make a two-legged animal a man without it.”

Johann Wolfgang von Goethe (1749–1832)

“Technology and change follow the liberation of energy. The lifestyle of contemporary America was destined by the development of fossil fuels in this seminal era.”

—Wilson Clark, Energy for Survival (1974), p. 45.

Labor Day commemorates productive America. Begun in the late 19th century by the U.S. labor movement, the first Monday in September is celebrated annually as a public holiday.

The unofficial end of summer also is a peak travel time. This Labor Day weekend, a record 17.5 million passengers are expected to fly to hundreds of U.S. destinations and around the world, according to the trade group Airlines for America. And that means a high demand for energy, specifically petroleum to fuel the planes.

Labor Day could also be called Thank Energy Day. Human productivity is enabled by the appliances and machines that run on mineral (dense) energies, either directly or indirectly via electrical generation.

To capture the essence of human-freed labor, Buckminster Fuller coined the term “energy slave” in 1940 to provide a rough translation of how modern energy did the work of countless would-be laborers in the industrial economy. “Bucky saw that coal, oil and gas were batteries for ancient sunshine that allowed civilization to, for the first time, live beyond its [direct daily] solar income,” Stuart McMillen wrote in a comic illustrating Fuller’s insight.

Energy experts took up the analogy. “Mineral energy provides a greater concentration of power than could the most ingenious and efficient use of untold human and animal labor,” wrote Gloria Waldron and J. Frederic Dewhurst in Power, Machines, and Plenty (1948: 11). Added Erich Zimmermann:

The shift to machine power changed America from a rural agricultural nation to an industrial giant. It also made men’s lives easier and richer. In 1850, the average American worked seventy hours a week. Today he works forty-three. In 1850, our average American produced about 27 cents’ worth of goods an hour. Today he produces about $1.40 worth in dollars of the same purchasing power. (World Resources and Industries, 1951: 58)

Today, the average workweek for an American (non-farm; 16 years or older) is just under 35 hours, according to the U.S. Bureau of Labor Statistics. That person, in the office and at home, employs hundreds of “energy slaves” without even realizing it. That productivity permits time off and an unprecedented quality of leisure.

Critics Dispute the Good

The energy-slave metaphor has been badly polluted by Fuller’s disciples. In The Energy of Slaves: Oil and the New Servitude (2012: xi), Andrew Nikiforuk wrote:

Both Aristotle and Plato described slavery as necessary and expedient. We regard our new hydrocarbon servants with the same pragmatism. To many of us, our current spending of fossil fuels appears as morally correct as did human slavery to the Romans or the Atlantic slave trade to seventeenth-century British businessmen.

Nikiforuk then mentioned yours truly as a trafficker of such “pseudoscientific absurdities” as expanding depletable resources. He states (pp. 144–145):

Bradley … claims that the world’s material progress is “the result of advances in energy technology made by people living in freedom” and so will continue unerringly…. Bradley does, however, acknowledge the importance of inanimate slaves. Thanks to hydrocarbons, the proportion of industrial work performed by human hands in the United States has fallen over the last hundred years from 90 percent to 8 percent. This blessed emancipation has given each American the fossil-fuel equivalent of about three hundred slaves, and Bradley predicts that the number of virtual slaves will only grow.

“It is hard to overstate the significance of this trend. It means not just more creature comforts but a fundamental change in the human condition,” he writes. “If we take the current population of the United States as being about 280 million people then the country as a whole has an equivalent of 8.4 billion energy slaves.”

I plead guilty as charged. Thanks to dense, mineral energies running countless machines and appliances, Americans have a leisure side to their work personae. So, this Labor Day, take a well-deserved day off for a long weekend—or even more—in good conscience.


Wednesday, August 28, 2019

Asian Countries Are Growing Their Economies With Coal

As the United States is reducing its coal-fired electricity, Asian countries are increasing their generation from coal. In 2018, the United States generated just 27 percent of its electricity from coal—a large reduction from coal’s 50 percent share in 2005. According to BP’s Statistical Review of World Energy, during that same period, India increased its share of electricity generated from coal from 68 percent to 75 percent, South Korea from 38 percent to 44 percent, and Vietnam from 21 percent to 41 percent. While China’s share of coal-fired generation went down during those 13 years—from 79 percent to 67 percent—it has more than doubled its coal-fired generation in absolute terms. As a result of these changes, U.S. carbon dioxide emissions declined by 12 percent over that period, while the carbon dioxide emissions in the Asia Pacific region increased by 50 percent—led by China with a 56 percent regional share. This has made environmentalists and forecasters worry that these countries will not meet their Paris commitments.

China’s economic rise was based on coal—the country’s most abundant energy resource—resulting in China becoming the world’s largest emitter of greenhouse gases. India seems to be ready to follow in China’s footsteps, having surpassed China as the world’s fastest-growing economy. India’s economy is expected to grow by 7 percent this year, ahead of the 6.3 percent growth expected for China. The economies of all of Developing Asia are expected to grow by about 5.7 percent, with the majority of the economic expansion located in Southeast Asia.

The Asian Development Bank, International Renewable Energy Agency, and United Nations are pushing governments in South and Southeast Asia to phase out coal in power generation, improve industrial energy efficiency, and develop cleaner transportation alternatives. Through a memorandum of understanding with the International Renewable Energy Agency, Southeast Asian countries are looking to obtain 23 percent of their energy from renewable sources by 2025.

However, the tension between expanding their economies (and thereby improving the lifestyles of their citizens) and reducing their greenhouse gas emissions may put their commitments to the Paris accord in jeopardy.

China

Despite China’s commitment to the Paris accord, approvals for new coal mine construction in China increased over five-fold in 2019, with the expectation that coal consumption will increase in the future. China’s energy regulator approved the construction of 141 million metric tons of new coal production capacity from January 2019 to June 2019, compared to the 25 million metric tons it approved last year. The projects include new mines in the regions of Inner Mongolia, Xinjiang, Shanxi, and Shaanxi that are part of a national strategy to consolidate output at dedicated coal production “bases,” as well as expansions of existing collieries.

While large cities such as Beijing have cut coal use and shuttered many small mines and power plants, China is still allowing for significant increases in coal production and coal-fired power generation. Chinese coal output increased 2.6 percent in the first half of 2019 to 1.76 billion metric tons. Further, the research unit of the China State Grid Corporation recently projected that total coal-fired capacity would peak at 1,230 to 1,350 gigawatts, an increase of about 200 to 300 gigawatts over today’s level.

In addition, the BBC reported last year that China may have as much as 259 gigawatts of new coal capacity under construction—an amount equal to the entire U.S. coal fleet.

China believes it can continue to increase coal production and consumption and still reduce emissions. It has made “ultra-low emissions” (or “ultra-supercritical”) technology mandatory in all new coal power plants and is improving mine zoning regulations to minimize pollution. By the end of last year, 80 percent of its coal-fired power capacity, amounting to 810 gigawatts, had installed “ultra-low emissions” equipment.

China’s investments in nuclear and renewables are still insufficient to cover rising energy demand. China plans to increase the share of non-fossil fuels in its overall energy mix to 15 percent by the end of next year from around 14.3 percent currently, and to 20 percent by 2030.

Conclusion

Asian economies are growing and need energy to fuel them. Expanding economic growth and reducing greenhouse gas emissions to meet the Paris accord appear to be competing goals. The largest greenhouse gas emitter in the region (and the world), China, seems set to maintain current levels of coal output for the next several decades while increasing its oil and gas consumption, which would put its commitment to the Paris accord in jeopardy. Its words say one thing, but its actions say quite another.


SEI Alumni in the Field: Commercial PV Installer for Namaste Solar

Ben Vanderbliet first found out about Solar Energy International (SEI) classes from his brother, who was part of the SEI Solar Ready Vets program, and now he’s a full-time commercial PV installer with Namaste Solar based out of Fort Collins, Colorado. SEI can help launch your career in the solar industry, too! Ben shared details with us on how SEI classes helped him secure his current role in solar.

According to Ben, after he left his last job, his brother gave him all of his old SEI materials to look through. “After reading the entire Solar Electric handbook, I knew it was the right direction and promptly applied to SEI’s Commercial and Residential Solar Professional Certificate Programs. If I was to be successful in this field, I knew that I needed the best quality training available from a reputable organization. Solar Energy International was unequivocally the best choice,” Ben said. 

Since that point, Ben has progressed through five classes with SEI: PV101, 201L, 202, 203, and 301L. “My classroom experiences were very positive, due to the quality of instructors, staff, and students,” Ben said. “In hindsight I wish my college experience had been nearly as good. The combination of diverse student backgrounds, friendliness, and razor-sharp focus of everyone involved made for an engaging learning experience that I fondly look back on. The hands-on labs were especially enjoyable, as they gave me more confidence in my understanding of PV theory and electrical systems.”

After taking these five courses, Ben took the NABCEP exam and then landed his role with Namaste, all within six weeks’ time.

Ben explained that through his position with Namaste, his work involves the installation of larger-scale PV for commercial applications, specifically ballast-mount and ground-mount arrays. His duties include setting up the racking systems, prepping for the various stages of installation, understanding design plans, being proficient with various hand tools, and working well with his team.

Ben attributes his success, in part, to his SEI experience. “SEI has played a direct role in launching my solar career. During my time in Paonia, I took full advantage of employment services and landed a great job soon after training. This likely would not have been the case without SEI’s outstanding reputation and dedicated staff who genuinely want the best for their students. I am super grateful for people I have met at SEI and the positive trajectory it has set for my career in PV.”

Learn more about training with SEI, and check out our full training schedule.



Monday, August 26, 2019

Berkeley Bans Natural Gas in New Residential Buildings

While natural gas is setting consumption records nationwide, Berkeley, California, is banning its use in new residential buildings. Beginning January 1, 2020, the city will not allow developers to build homes, townhouses, or small apartment buildings with natural gas hook-ups for cooking, heating, or hot water. Berkeley intends to expand the ban to bigger apartment buildings and commercial structures as it implements its goal of “fossil-free new buildings.” Those wishing to have a gourmet gas stove in their kitchen will not be allowed one in a new house.

Berkeley’s ordinance applies to buildings that the California Energy Commission has reviewed and determined meet state requirements and regulations as electric-only designs. Electric-only buildings are equipped with heat pumps and induction cooking. The city’s regulations will automatically update as the state commission approves more building models, without having to return to the City Council for a vote. The ordinance does not apply to existing buildings.

In 2009, Berkeley adopted a Climate Action Plan to reduce greenhouse gas emissions by 33 percent by 2020 and by 80 percent by 2050. As part of the plan, the city is to use 100 percent renewable electricity by 2035.

Berkeley Is Expanding

Berkeley’s population has increased by 18 percent since 2000. With increasing numbers of people, more housing was built. From 2014 to 2017, the Planning Department approved building permits for 525 residential units and occupancy permits for 925 units. More housing is expected—with the construction of 1,400 units due to the Adeline Corridor Plan alone.

The City Council found that the consumption of natural gas within city buildings accounted for 27 percent of Berkeley’s greenhouse gas emissions in 2016, which is equivalent to the consumption of 20 million gallons of gasoline a year.

The ordinance allocates $273,341 per year for a two-year staff position that will have the responsibility for implementing the ban, among other duties.

Economics

Berkeley claims that all-electric heating technologies are cost-competitive with their natural gas counterparts. Nationally, that is not the case. While electric heaters are generally less expensive to purchase than natural gas furnaces, natural gas is a less expensive fuel than electricity. In California, residential natural gas prices have increased 16 percent over the last five years, while the national average has fallen 8 percent. Over the same period, California’s electricity prices have increased 20 percent, due partly to increased reliance on solar and wind power, and they are expected to increase further as the state reduces its consumption of fossil fuels.

Likewise, water heating is generally much cheaper with natural gas than with electricity, although gas units face initially higher costs.
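To see why the fuel-price gap matters, here is a rough back-of-the-envelope comparison of annual space-heating costs. The heating load, prices, and efficiencies below are illustrative assumptions, not Berkeley-specific figures:

```typescript
// Rough annual space-heating cost comparison (all numbers are illustrative assumptions).
// One therm of natural gas contains roughly 29.3 kWh of energy.
const heatNeededKWh = 10_000;        // assumed annual heating load, in kWh of delivered heat
const gasPricePerTherm = 1.5;        // assumed residential gas price, $/therm
const elecPricePerKWh = 0.2;         // assumed residential electricity price, $/kWh

const gasFurnaceEfficiency = 0.95;   // modern condensing furnace
const resistanceEfficiency = 1.0;    // electric resistance heating
const heatPumpCOP = 3.0;             // heat pump delivers roughly 3 units of heat per unit of electricity

const gasCost = (heatNeededKWh / 29.3 / gasFurnaceEfficiency) * gasPricePerTherm;
const resistanceCost = (heatNeededKWh / resistanceEfficiency) * elecPricePerKWh;
const heatPumpCost = (heatNeededKWh / heatPumpCOP) * elecPricePerKWh;

console.log({ gasCost, resistanceCost, heatPumpCost });
// ≈ { gasCost: 539, resistanceCost: 2000, heatPumpCost: 667 }
```

Under these assumed prices a heat pump narrows the gap considerably while electric resistance heating does not, which is why the comparison hinges on local gas and electricity rates and the equipment being mandated, not on electrification in the abstract.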

Natural Gas Use in the U.S.

Natural gas is a versatile fuel used in all sectors of the economy. It is efficient because it does not suffer the line losses associated with electricity, acting more like a distributed energy source. With the use of hydraulic fracturing and horizontal drilling, natural gas prices have plummeted and its consumption has grown. It now supplies 31 percent of our energy and 35 percent of our electricity, having surpassed coal as the leading fuel for the generation sector in 2016.

The U.S. power sector set a record for natural gas consumption in July, according to the Energy Information Administration (EIA). The electricity industry consumed 44.5 billion cubic feet of gas on July 19, slightly more than the old daily record of 43.1 billion cubic feet set on July 16, 2018. Higher electricity demand for air conditioning during a heat wave from July 15 through July 22 drove the increased power generation, especially from natural gas-fired generators. Low natural gas prices encouraged companies to buy and dispatch more gas-fired power at the expense of other forms of generation. Natural gas prices throughout June and July averaged $2.31 per thousand cubic feet at the Henry Hub, which is the U.S. benchmark. They dipped even lower at some delivery points in the Midwest and Northeast.

Electric companies are projected to increase their natural gas use by 5 percent this year compared with 2018, according to EIA’s Short-Term Energy Outlook.

Conclusion

Although residential homeowners like using natural gas in their homes for heating, cooking, and hot water, the Berkeley City Council has banned its use in new residential buildings, mandating electricity instead. Its approach is fossil-free buildings, as the city expects its electricity to be 100 percent renewable by 2035. While natural gas prices have declined nationwide, they have increased in California due to the state’s climate regulations, making electric heating competitive with natural gas heating, according to the Berkeley City Council. Gourmet cooks will find their lives changed under Berkeley’s new all-electric mandate. We await their verdict.


Lead Volume vs. Lead Quality

Posted by RuthBurrReedy

Ruth Burr Reedy is an SEO and online marketing consultant and speaker and the Vice President of Strategy at UpBuild, a technical marketing agency specializing in SEO, web analytics, and conversion rate optimization. This is the first post in a recurring monthly series and we're excited! 


When you’re onboarding a new SEO client who works with a lead generation model, what do you do?

Among the many discovery questions you ask as you try to better understand your client’s business, you probably ask them, “What makes a lead a good lead?” That is, what are the qualities that make a potential customer more likely to convert to sale?

A business that’s given some thought to their ideal customer might send over some audience personas; they might talk about their target audience in more general terms. A product or service offering might be a better fit for companies of a certain size or budget, or be at a price point that requires someone at a senior level (such as a Director, VP, or C-level employee) to sign off, and your client will likely pass that information on to you if they know it. However, it’s not uncommon for these sorts of onboarding conversations to end with the client assuring you: “Just get us the leads. We’ll make the sales.”

Since SEO agencies often don’t have access to our clients’ CRM systems, we’re often using conversion to lead as a core KPI when measuring the success of our campaigns. We know enough to know that it’s not enough to drive traffic to a site; that traffic has to convert to become valuable. Armed with our clients’ assurances that what they really need is more leads, we dive into understanding the types of problems that our client’s product is designed to solve, the types of people who might have those problems, and the types of resources they might search for as they try to solve those problems. Pretty soon, we’ve fixed the technical problems on our client’s site, helped them create and promote robust resources around their customers’ problems, and are watching the traffic and conversions pour in. Feels pretty good, right?

Unfortunately, this is often the point in a B2B engagement where the wheels start to come off the bus. Looking at the client’s analytics, everything seems great — traffic is up, conversions are also up, the site is rocking and rolling. Talk to the client, though, and you’ll often find that they’re not happy.

“Leads are up, but sales aren’t,” they might say, or “yes, we’re getting more leads, but they’re the wrong leads.” You might even hear that the sales team hates getting leads from SEO, because they don’t convert to sale, or if they do, only for small-dollar deals.

What happened?

At this point, nobody could blame you for becoming frustrated with your client. After all, they specifically said that all they cared about was getting more leads — so why aren’t they happy? Especially when you’re making the phone ring off the hook?

A key to client retention at this stage is to understand things from your client’s perspective — and particularly, from their sales team’s perspective. The important thing to remember is that when your client told you they wanted to focus on lead volume, they weren’t lying to you; it’s just that their needs have changed since having that conversation.

Chances are, your new B2B client didn’t seek out your services because everything was going great for them. When a lead gen company seeks out a new marketing partner, it’s typically because they don’t have enough leads in their pipeline. “Hungry for leads” isn’t a situation any sales team wants to be in: every minute they spend sitting around, waiting for leads to come in is a minute they’re not spending meeting their sales and revenue targets. It’s really stressful, and could even mean their jobs are at stake. So, when they brought you on, is it any wonder their first order of business was “just get us the leads?” Any lead is better than no lead at all.

Now, however, you’ve got a nice little flywheel running, bringing new leads to the sales team’s inbox all the livelong day, and the team has a whole new problem: talking to leads that they perceive as a waste of their time. 

A different kind of lead

Lead-gen SEO is often a top-of-funnel play. Up to the point when the client brought you on, the leads coming in were likely mostly from branded and direct traffic — they’re people who already know something about the business, and are closer to being ready to buy. They’re already toward the middle of the sales funnel before they even talk to a salesperson.

SEO, especially for a business with any kind of established brand, is often about driving awareness and discovery. The people who already know about the business know how to get in touch when they’re ready to buy; SEO is designed to get the business in front of people who may not already know that this solution to their problems exists, and hopefully sell it to them.

A fledgling SEO campaign should generate more leads, but it also often means a lower percentage of good leads. It’s common to see conversion rates, both from session to lead and from lead to sale, go down during awareness-building marketing. The bet you’re making here is that you’re driving enough qualified traffic that even as conversion rates go down, your total number of conversions (again, both to lead and to sale) is still going up, as is your total revenue.
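To make that bet concrete, here is the arithmetic with purely hypothetical numbers:

```typescript
// Hypothetical illustration: traffic grows faster than the conversion rate falls,
// so total leads still rise even though the rate per session drops.
const before = { sessions: 10_000, leadRate: 0.02 }; // mostly branded/direct traffic
const after = { sessions: 30_000, leadRate: 0.01 };  // after top-of-funnel SEO growth

console.log(before.sessions * before.leadRate); // 200 leads
console.log(after.sessions * after.leadRate);   // 300 leads, despite the lower conversion rate
```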

So, now you’ve brought in the lead volume that was your initial mandate, but the leads are at a different point in their customer journey, and some of them may not be in a position to buy at all. This can lead to the perception that the sales team is wasting all of their time talking to people who will never buy. Since it takes longer to close a sale than it does to disqualify a lead, the increase in less-qualified leads will become apparent long before a corresponding uptick in sales — and since these leads are earlier in their customer journey, they may take longer to convert to sale than the sales team is used to.

At this stage, you might ask for reports from the client’s CRM, or direct access, so you can better understand what their sales team is seeing. To complicate matters further, though, attribution in most CRMs is kind of terrible. It’s often very rigid; the CRM’s definitions of channels may not match those of Google Analytics, leading to discrepancies in channel numbers; it may not have been set up correctly in the first place; it’s opaque, often relying on “secret sauce” to attribute sales per channel; and it still tends to encourage salespeople to focus on the first or last touch. So, if SEO is driving a lot of traffic that later converts to lead as Direct, the client may not even be aware that SEO is driving those leads.

None of this matters, of course, if the client fires you before you have a chance to show the revenue that SEO is really driving. You need to show that you can drive lead quality from the get-go, so that by the time the client realizes that lead volume alone isn’t what they want, you’re prepared to have that conversation.

Resist the temptation to qualify at the keyword level

When a client is first distressed about lead quality, it’s tempting to do a second round of keyword research and targeting to try to dial in their ideal decision-maker; in fact, they may specifically ask you to do so. Unfortunately, there’s not a great way to do that at the query level. Sure, enterprise-level leads might be searching “enterprise blue widget software,” but it’s difficult to target that term without also targeting “blue widget software,” and there’s no guarantee that your target customers are going to add the “enterprise” qualifier. Instead, use your ideal users’ behaviors on the site to determine which topics, messages, and calls to action resonate with them best — then update site content to better appeal to that target user.

Change the onboarding conversation

We’ve already talked about asking clients, “What makes a lead a good lead?” I would argue, though, that a better question is “How do you qualify leads?”

Sit down with as many members of the sales team as you can (since you’re doing this at the beginning of the engagement — before you’re crushing it driving leads, they should have a bit more time to talk to you) and ask how they decide which leads to focus on. If you can, ask to listen in on a sales call or watch over their shoulder as they go through their new leads. 

At first, they may talk about how lead qualification depends on a complicated combination of factors. Often, though, the sales team is really making decisions about who’s worth their time based on just one or two factors (usually budget or title, although it might also be something like company size). Try to nail them down on their most important one.

Implement a lead scoring model

There are a bunch of different ways to do this in Google Analytics or Google Tag Manager (Alex from UpBuild has a writeup of our method, here). Essentially, when a prospect submits a lead conversion form, you’ll want to:

  • Look for the value of your “most important” lead qualification factor in the form,
  • And then fire an Event “scoring” the conversion in Google Analytics as e.g. Hot, Warm, or Cold.

This might look like detecting the value put into an “Annual Revenue” field or drop-down and assigning a score accordingly, or using RegEx to detect when the “Title” field contains Director, Vice President, or CMO and scoring higher. I like to use the same Event Category for all conversions from the same form, so they can all roll up into one Goal in Google Analytics, then use the Action or Label field to track the scoring data. For example, I might have an Event Category of “Lead Form Submit” for all lead form submission Events, then break out the Actions into “Hot Lead — $5000+,” “Warm Lead — $1000–$5000,” etc.
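A minimal sketch of this idea, written here as TypeScript for clarity, might look like the following. The form selectors, title patterns, and event names are hypothetical placeholders, and the scoring logic should be adapted to whatever qualification factor the client’s sales team actually uses:

```typescript
// Sketch: score a lead form submission by job title before sending it to Google Analytics
// via the Google Tag Manager dataLayer. All selectors and names are assumptions.
function scoreLeadByTitle(title: string): "Hot Lead" | "Warm Lead" | "Cold Lead" {
  // Senior decision-makers score highest.
  if (/director|vice president|\bvp\b|cmo|chief/i.test(title)) {
    return "Hot Lead";
  }
  // Mid-level managers are worth a follow-up.
  if (/manager|head of/i.test(title)) {
    return "Warm Lead";
  }
  return "Cold Lead";
}

document
  .querySelector<HTMLFormElement>("#lead-form") // hypothetical form selector
  ?.addEventListener("submit", () => {
    const title =
      document.querySelector<HTMLInputElement>("#job-title")?.value ?? "";
    const w = window as any;
    w.dataLayer = w.dataLayer || [];
    w.dataLayer.push({
      event: "leadFormSubmit",              // GTM trigger name (assumed)
      eventCategory: "Lead Form Submit",    // one Category per form, so all submissions roll up into one Goal
      eventAction: scoreLeadByTitle(title), // only the score bucket is sent; the raw title never leaves the page
    });
  });
```

Because the Action carries only the score bucket and never the raw form value, this sketch also stays clear of the PII concern in the note below.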

Note: Don’t use this methodology to pass individual lead information back into Google Analytics. Even something like Job Title could be construed as Personally Identifiable Information, a big no-no where Google Analytics is concerned. We’re not trying to track individual leads’ behaviors, here; we’re trying to group conversions into ranges.

How to use scored leads

Drive the conversation around sales lifecycle. The bigger the company and the higher the budget, the more time and touches it will take before they’re ready to even talk to you. This means that with a new campaign, you’ll typically see Cold leads coming in first, with Hot and Warm leads trickling in over time. Capturing this data allows you to set an agreed-upon time in the future when you and the client can discuss whether this is working, instead of cutting off campaigns or strategies before they have a chance to perform (it will also allow you to correctly set the Campaign timeout in GA to reflect the full customer journey).

Allocate spend. How do your sales team’s favorite leads tend to get to the site? Does a well-timed PPC or display ad after their initial visit drive them back to make a purchase? Understanding the channels your best leads use to find and return to the site will help your client spend smarter.

Create better-targeted content. Many businesses with successful blogs will have a post or two that drives a great deal of traffic, but almost no qualified leads. Understanding where your traffic goals don’t align with your conversion goals will keep you from wasting time creating content that ranks, but won’t make money.

Build better links. The best links don’t just drive “link equity,” whatever that even means anymore — they drive referral traffic. What kinds of websites drive lots of high-scoring leads, and where else can you get those high-quality referrals?

Optimize for on-page conversion. How do your best-scoring leads use the site? Where are the points in the customer journey where they drop off, and how can you best remove friction and add nurturing? Looking at how your Cold leads use the site will also be valuable — where are the points on-site where you can give them information to let them know they’re not a fit before they convert?

The earlier in the engagement you start collecting this information, the better equipped you’ll be to have the conversation about lead quality when it rears its ugly head.



Friday, August 23, 2019

Fossil Fuels: Still Winning

After decades of climate-related debate and policy, it’s still a fossil-fuel world. And judging from domestic and global politics, this will continue.

A base year to judge energy trends is 1988, the year climate scientist James Hansen sounded the alarm about the enhanced greenhouse effect, the result of manmade emissions of carbon dioxide and other warming gases.

In 1988, the global market share of carbon-based energies was 88 percent; three decades later, it is little diminished at 85 percent. Total usage of natural gas, coal, and oil in this period increased by two-thirds, with CO2 emissions rising 61 percent.

Regarding market share, natural gas has been the global gainer, capturing the electricity and industrial markets from oil. Coal has held steady at 28 percent of the world’s primary energy, versus 34 percent for oil and 23 percent for gas.

Electricity from wind, solar, and other non-hydro renewables increased to 3.5 percent from a fraction of one percent, while nuclear fell one-fourth to 4 percent. With hydropower growing globally, the market share of non-fossil-fuels reached 15 percent (from 11 percent in 1988).

Coal remains a workhorse globally in power generation. China, for example, which used coal for 59 percent of its energy last year, plans to rely upon this indigenous source for at least the next several decades to power its growing economy. In Asia overall, 1,200 coal plants are under construction or planned. Just last year, King Coal increased its global output 4 percent. U.S. coal exports, primarily to Asia, were up 90 percent last year versus 2016 and are a big part of the global coal boom.

Oil and gas have a robust future thanks to a U.S.-led extraction boom and increasing exports, as well as favorable policies from the Trump administration to drill, pipe, refine, and consume. A public policy of energy exceptionalism, combined with human ingenuity to find so-called depletable resources faster than they can be consumed (resourceship), is a powerful combination.

Horizontal drilling and hydraulic fracturing have reinvigorated both primary energies. Whereas 15 years ago four of five wells in the U.S. were vertically drilled, today 19 of 20 are horizontal. In 2018, America recorded a “unique double first” of being the world leader in both oil and natural gas output. And much more is to come from a still-young industry.

Domestic consumption is meeting record supply, negating the notion of Peak Oil Demand. An estimated 43 million Americans traveled in cars, trucks, planes, trains, and boats to open this year’s driving season—virtually all petroleum-powered.

On the natural gas side, record production, consumption, and exports last year will be exceeded this year, according to the Energy Information Administration.

Climate politics is following, not leading, this energy reality. The international quest to reduce CO2 emissions remains stalled. Global emissions rose 2 percent last year, reflecting an overall increase in energy demand of nearly 3 percent. Three-fourths of this increase was met by fossil fuels.

Seen another way, not one of the top ten emitters of greenhouse gases is on track to meet its climate goals under the Paris climate accord. The U.S., which intends to withdraw from the agreement, is rated “critically insufficient” by the Climate Action Tracker. Even the EU, which aims to be carbon-neutral by 2050 (distant targets are a sign of compliance issues), is off to a bad start. Greenpeace recently called the EU’s climate plan “a collection of buzzwords.”

Why are fossil fuels so dominant despite political headwinds? The economic answer is that consumers prefer the most affordable, reliable, and convenient energies. The technical answer is that mineral energies are dense and have built-in storage, while wind and solar are dilute and intermittent. This is why renewables, which had a 100-percent market share for almost all of human history (primitive biomass, falling water, etc.), were displaced to enable the Industrial Revolution.

It’s still a fossil fuel world. And if subsidy fatigue sets in for wind power, solar power, ethanol, and electric vehicles, and legacy nuclear plants continue to retire, carbon-based energies could reach and even exceed 90 percent of global primary energy consumption in future decades.



Tuesday, August 20, 2019


Native Americans Embrace the Sun

A New Way to Honor the Old Ways

Pine Ridge, South Dakota (August 15, 2019) – Red Cloud Renewable (RCR) and Solar Energy International (SEI) are pleased to announce the selection of the first class of Native Americans to receive a full scholarship as part of the professional-level Tribal Train the Trainer (T4) Program for Solar Certification.

Seven Native Americans, four men and three women from four tribes, have accepted positions in the program, each receiving a scholarship that covers their classes, travel, lodging, food, workbooks, and testing costs. Through intensive classroom theory and hands-on applications, the cohort will complete the four primary SEI foundational classes in solar photovoltaics:

  1. PV101: Solar Electric Design and Installation
  2. PV201L: Solar Electric Lab Week – Grid-Direct
  3. PV203: PV Fundamentals – Battery-Based
  4. PV301L: Solar Electric Lab Week – Battery-Based

Students selected to participate are Marie Kills Warrior (Oglala), Lance Daniels (Muscogee Creek), Leo Bear (Shoshoni-Bannock), Leo Campbell (Rosebud), Cassandra Valandra (Rosebud), Gloria Red Cloud (Oglala) and Henry Red Cloud (Oglala).

Upon satisfactory completion of these four classes, trainees will test for their national NABCEP certification, an industry indicator of solar skills, knowledge, and job readiness. The new Trainers will then co-teach, with mentors, a PV101 class to a new cohort of Native Americans at the end of October.

Trainees will learn about solar components, power flows, energy storage, applications in various settings, as well as installing and commissioning with a range of modules and inverters. Ongoing mentorship, resume and skill development, and culturally adapted support will also provide a strong foundation, with the goal to serve many more Trainees in the coming years.

Our shared vision is to provide energy access and independence for all Native Americans, with skilled and certified tribal members who want to start solar businesses and complete Native community projects. T4 Trainee Leo Campbell from the Rosebud Tribe sums it up: “This is about building the core of solar awareness and leadership that will create a new generation of energy efficient native communities.”

The first two classes will take place at the Red Cloud Renewable Energy Center at Pine Ridge, SD, with the final two classes being conducted at SEI’s lab facility in Paonia, CO. These professional-level courses will be taught by Henry Red Cloud, Carol Weis, and Brad Burkhartzmeyer, with help from other SEI trainers. The non-profit In Our Hands is providing funding for a follow-up spring workshop, “How to Start a Solar Business,” which will be taught by one of SEI’s founders, Johnny Weiss, at no cost to the new Trainers and other RCR students.

Red Cloud Renewable is a 501(c)(3) federally approved non-profit organization headquartered on the Pine Ridge Reservation in South Dakota. Led by Lakota renewable energy leader Henry Red Cloud, it has provided renewable energy training for thousands of tribal members from more than 50 tribes.

SEI has been the premier solar training organization in the U.S. since 1991, with more than 70,000 students having taken its courses. More than 25 percent of all North American Board of Certified Energy Practitioners (NABCEP) certificate holders have received their training through SEI. SEI has been working on solar projects with tribes for more than 15 years.

###

This program has been made possible through the generous support of All Points North Foundation, The Turner Foundation, In Our Hands, and many individual contributors to Red Cloud Renewable.

For more information about this program and to learn about upcoming opportunities, please contact: Richard Fox via email – richard@redcloudrenewable.org, or phone 970-391-0148. www.redcloudrenewable.org

 

 

Contact: Deirdre Morrison

Solar Energy International

Phone: (970) 527-5046 x 106

Email: deirdre@solarenergy.org


Wind Power Is Collapsing In Germany

New onshore wind energy is in a steep decline in Germany. The expansion of onshore wind power in the first half of this year is at its lowest level since the introduction of Germany’s Renewable Energy Act in 2000. There were only 231 megawatts of new onshore wind capacity installed in Germany during the first half of 2019—an 82 percent decline compared to the first half of 2018. Wind Europe, the region’s wind energy trade association, is expecting Germany to install between 1 gigawatt and 2 gigawatts of wind energy in 2019, which is significantly below the 4.3 gigawatts per year installed on average over the past 5 years.

It is also well below what Germany needs to meet its target of 65 percent renewable electricity by 2030 as part of its energy transition, the Energiewende. Germany gets 40 percent of its electricity from renewables today. The European Union is counting on Germany to attain its target so that the EU can meet its own 32 percent renewable energy target. Furthermore, offshore wind cannot fill the gap, as Germany is scheduled to build less than 15 percent of what is required—about 730 megawatts per year of offshore wind through 2030.

The Issues

Once people began to see the scale of industrial wind development, local residents protested in many state parliaments, demanding that wind farms be kept a minimum distance from residential developments. As a result, the licensing authorities have been acting much more cautiously. Permitting has become slower and more complex, and the offices processing wind applications are under-staffed. Only 400 megawatts of new wind farm permits were awarded in the first quarter of 2019, well below historic levels. The German permitting process used to take 10 months; it now takes over two years for a new project to proceed through the civil service infrastructure. About 11,000 megawatts of wind energy capacity are currently in the permit backlog.

According to Wind Europe, Germany’s public authorities are not applying deadlines and many wind farm projects are getting stuck in legal disputes, with at least 750 megawatts of onshore wind projects currently in legal proceedings. Over 70 percent of the legal objections are based on species conservation, especially the threat to endangered bird species and bats, a growing concern around the world. Besides species protection, 17 percent of legal cases deal with noise protection. Another 6 percent of lawsuits deal with monument protection.

The German government also forced the wind power industry to cut costs, introducing a market-based tendering model. The economic and legal risks have made investors leery about investing in wind projects without the massive government-mandated subsidies that the wind industry enjoyed.

Since the federal government also removed some privileges it had provided for community wind farms, there are no longer enough participants for the public auctions. Of the more than 1,350 megawatts of wind power capacity tendered this year, only 746 megawatts attracted bids. At least three German onshore wind auctions were significantly undersubscribed. Also, in technology-neutral or combined auctions, onshore wind is continually outperformed. Recently, the country’s solar industry was awarded 210 megawatts in a 200-megawatt auction that was heavily oversubscribed, with 719.5 megawatts of solar projects bidding for contracts.

In order to meet its 2030 target of 65 percent renewables, Germany needs to build 5 gigawatts of wind power each year. In the first quarter of 2019, no turbine orders were recorded. To fix this, proponents indicate that Germany needs to look at alternative site locations such as industrial sites similar to what the Dutch are doing or motorway sites similar to what France and Belgium are doing.

In 2021, thousands of wind turbines are expected to come to the end of their 20-year subsidy period as defined by Germany’s Renewable Energy Act. The wind industry is afraid that more wind turbines will be demolished than new ones will be built. Alternatives would be to repower these early wind farms or replace their turbines with modern ones that would double the capacity using one third fewer turbines.

Military concerns and FM radio beacons are also contributing to the lack of wind turbine orders. About 4,790 megawatts of wind power are blocked due to these concerns, of which 2,370 megawatts are blocked due to distance issues. Wind power projects are required to keep a distance of 10 to 15 kilometers away from stations that are used for navigation in aviation.

Conclusion

The Germans are finding out that there are many issues with wind farms that include noise pollution, wildlife and monument protection, military and radio conflicts, and NIMBY—not in my back yard—issues. These are on top of the economic fact that their current system is built upon government subsidies and mandates about which the public seems to be growing weary. German residential consumers pay about three times what U.S. residential consumers do for electricity.

These issues are also concerns in the United States, but the governments here have not changed the economic model and wind farms are still being built due to state mandates, state and federal subsidies, and wholesale pricing practices. If these were to go away or be less lucrative for wind power, wind auctions would be undersubscribed in the United States as Germany is seeing this year.


Friday, August 16, 2019

Greenland’s Ice Sheet and Climate Change Policy, Part 1 of 2

As I have been detailing for years here on the pages of IER, the actual peer-reviewed research on climate change doesn’t come anywhere close to supporting the apocalyptic proclamations so often coming out of the corporate media and activist camps. Because I’m an economist, I tend to focus on the economics literature, but the pattern also holds for the physical sciences.

I’ll illustrate with the case of the Greenland ice sheet (GIS), which—we are told—will inevitably melt if market forces are left unchecked, leading to utter ruin for our descendants. The collapse of the GIS is one of a handful of frequently cited “tipping points” that ostensibly shows just how irresponsible it would be to delay aggressive government intervention to severely restrict greenhouse gas emissions.

Specifically, the reaction to a new paper by (Nobel laureate) William Nordhaus studying carbon tax policy and the GIS shows how unmoored many in the scientific community are from their own research. In this first post on the topic (Part 1 of 2), I’ll summarize the paper and its critics, showing that their dire warnings of disaster don’t hold up. Then I’ll explain that even the “moderate” Nordhaus is himself far too eager to reserve an important role for government intervention. Then in my next post (Part 2), using Nordhaus’ own framework, I’ll show that a policy of laissez-faire leaves humanity in an excellent position to deal with temperature change and Greenland’s ice sheet.

Nordhaus’ New Paper

Nordhaus’ article, “Economics of the disintegration of the Greenland ice sheet,” was published in June in the prestigious journal PNAS, which stands for Proceedings of the National Academy of Sciences. (The PNAS article is available here, though as of this writing it has an important line missing from Figure 5. An earlier, longer version of Nordhaus’ article is available here, where the diagrams are correct. When I quote or give page citations in this post, I will be referring to the shorter PNAS version of the article, unless otherwise stated.)

In the article, Nordhaus integrates the current scientific understanding of the disintegration of the GIS into his industry-standard model of climate change policy (called DICE). He argues that this type of extension is important, because one of the major criticisms of the standard economic models of climate change (including his earlier versions of DICE) is that they do not include “tipping points,” such that their recommendation of a relatively modest initial carbon tax, ramping up gently over time, would lull the public into complacency.

The whole episode is quite ironic. In this new paper, Nordhaus for his part concludes that explicitly including the dynamics of the melting/rebuilding of the Greenland ice sheet into his DICE model does not significantly affect his estimates of the “social cost of carbon.” Furthermore, if governments enact the “optimal carbon tax” trajectory that his model has been recommending, then the melting of Greenland’s ice sheet will be reversed well before it approaches a tipping point.

However, Nordhaus does warn against complacency, because (in his words) “a baseline or no-policy path will lead to the gradual melting of the GIS over the coming millennium.” What’s worse, even if humans neglect action in the near term but then try geo-engineering to fix the situation down the road, it won’t work, because (Nordhaus claims) “the rate of rebuilding is so slow that the damage cannot be undone within the time perspective of climate policy and human settlements.” (As we will see, this is very misleading if it is construed as a strike against geo-engineering.)

Melting With Outrage

In reaction to Nordhaus’ article, some in the climate community were aghast, thinking that it demonstrated just how useless economics was for something as critical as greenhouse gas regulation. The reader must keep in mind that these are the same type of folks who hate the very idea of balancing costs and benefits for an “optimal” amount of (what they would call) carbon pollution, and certainly would not buy the economist argument that climate damages to future generations should be “discounted” by any percentage at all. So when William Nordhaus comes along and says that even if we explicitly include a collapsing ice sheet in the analysis, then that barely moves the dial on his recommended carbon tax, you can understand why tempers would reach unprecedented levels.

The reason I say this whole episode is ironic is that my take is the exact opposite of the interventionist critics. Their hysterical attempts to point out the alleged absurdity of Nordhaus’ findings actually show why the alarmists are standing on thin ice. (Pun definitely intended.) Moreover, although I applaud the technical detail of Nordhaus’ efforts, I think his “test” of geo-engineering is rather obtuse. As I’ll show in Part 2 (my next post), I can come up with a much more relevant scenario sketching a free-market approach that is entirely consistent with the scientific literature, and yet (in the long run) involves less melting of the GIS than occurs under Nordhaus’ “optimal” trajectory.

Nordhaus’ Main Findings

After including the temperature/melting dynamics of the Greenland ice sheet into his DICE model, and showing how his (simplified) approach approximates the scientific literature’s handling of the topic, Nordhaus concludes that “strong climate policy in the optimal run can stop the GIS decline well short of complete disintegration or critical tipping points.”

Crucially, Nordhaus finds that his original DICE model was giving good recommendations, in the sense that its estimated “social cost of carbon” figure doesn’t increase much, even when explicitly including the dynamics of the melting Greenland ice sheet:

This study demonstrates that, under a very wide range of assumptions, the risk of GIS disintegration—although a major change in the earth system—would make a small further contribution to the overall SCC [social cost of carbon—RPM] or to the overall cost of climate change. The increment to the SCC is near zero at moderate discount rates and as high as 5% of the total SCC at very low discount rates and high melt rates. At the discount rate used by the US government, the addition of GIS damages to the SCC is essentially zero. The intuition behind this result is that the timescale of damages associated with GIS melting is much longer than those associated with non-GIS damages. For example, impacts on agriculture are determined largely by contemporaneous climate changes, while GIS melting is determined by climate only with a long lag.
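The discounting arithmetic behind that intuition is easy to check: a dollar of damage that arrives centuries from now is worth very little today at any moderate discount rate. The damage amounts and rates below are illustrative, not figures from the paper:

```typescript
// Present value of a damage D arriving t years from now at constant discount rate r:
// PV = D / (1 + r)^t. Illustrative inputs only, not numbers taken from Nordhaus's paper.
function presentValue(damage: number, years: number, rate: number): number {
  return damage / Math.pow(1 + rate, years);
}

// $1 trillion of sea-level damage 400 years out, discounted at 3 percent per year:
console.log(presentValue(1e12, 400, 0.03)); // ≈ $7 million today

// The same damage arriving in 50 years carries far more weight in present-value terms:
console.log(presentValue(1e12, 50, 0.03));  // ≈ $228 billion today
```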

As mentioned above, Nordhaus also includes what he calls a geo-engineering “experiment” where he assumes humanity does not enact a carbon tax, but down the road implements emergency measures to quickly reduce the temperature and arrest the GIS melting. (To repeat, I will cover Nordhaus’ geo-engineering discussion in my next post, Part 2.)

The Critics Attack Nordhaus

Some in the academic community were appalled at Nordhaus’ article, thinking it demonstrated what was so wrongheaded about the entire “economics of climate change” approach. Here is one specific complaint:

[Embedded tweet criticizing the sea-level-rise damage assumptions in Nordhaus’ Greenland analysis]

Wow, the critic seems to have a point! In his model, Nordhaus assumes that the damage from a melting ice sheet comes from rising sea levels. And, as the critic points out, Nordhaus’ calibration assumes that at a 7-meter sea level rise (SLR), global GDP would only be reduced by 7 percent from what it otherwise would have been. Is that really plausible, since many coastal cities would be underwater?

Yet hold on a second. If you look carefully at the sea level diagram (you can click through the twitter links to get a large view), you’ll see that a sea level rise of 7 meters doesn’t occur for at least another 500 or so years.

Is it really so obvious that in the year 2500, humans would be devastated by certain major cities today being underwater? Over the course of centuries, wouldn’t humans move out of the way? There are already plans for developing floating artificial cities (called “seasteading”), and even with our current technology, portions of the Netherlands can survive being almost 7 meters below sea level right now.

Try this: The fictional character Jean-Luc Picard, captain of the Enterprise in Star Trek: The Next Generation and played by Patrick Stewart, was born in the year 2305. So there could have been an episode of the show where Starfleet command contacts Picard and warns that south London is threatened by sea level rise. Would that have been very compelling for the viewers of the show? Or would they have thought, “Humans saw this coming for centuries ahead of time and didn’t do something about it—if only to move inland?!”

Conclusion

I have written this relatively long post in order to demonstrate that those skeptical of government intervention do not need to rely on skepticism of the UN-endorsed physical science. As William Nordhaus’ new paper shows, the standard estimates of the “social cost of carbon” do not rise significantly, even when explicitly modeling the Greenland ice sheet. The extreme activists trying to throw out traditional cost-benefit analysis are shown, once again, to be bluffing.

In my next post, I’ll show moreover that Nordhaus himself is underestimating the potency of a private-sector solution. Even stipulating Nordhaus’ basic framework and assumptions about melting dynamics, I’ll sketch a scenario in which there is no government intervention and yet humanity easily avoids any direct problems from the melting Greenland ice sheet.



Wednesday, August 14, 2019


#31: Professor David Dismukes on his recent research regarding PURPA

Professor David Dismukes joins the show to discuss his recent findings highlighting the need for reforming outdated provisions of the Public Utility Regulatory Policies Act (PURPA) passed in 1978.

Full text of Dismukes’ paper (The Urgency of PURPA Reform to Assure Ratepayer Protection).

Summary of the key findings by IER’s Kenny Stein.

Learn more about the Center for Energy Studies at Louisiana State University.

 


How to Get Started Building Links for SEO

Posted by KameronJenkins

Search for information about SEO, and you’ll quickly discover three big themes: content, user experience, and links. If you’re just getting started with SEO, that last theme will likely seem a lot more confusing and challenging than the others. That’s because, while content and user experience are under the realm of our control, links aren’t… at least not completely.

Think of this post as a quick-and-dirty version of The Beginner’s Guide to SEO’s chapter on link building. We definitely recommend you read through that as well, but if you’re short on time, this condensed version gives you a quick overview of the basics as well as actionable tips that can help you get started.

Let’s get to it!

What does “building links” mean?

Link building is a term used in SEO to describe the process of increasing the quantity of good links from other websites to your own.

Why are links so important? They’re one of the main (although not the only!) criteria Google uses to determine the quality and trustworthiness of a page. You want links from reputable, relevant websites to bolster your own site’s authority in search engines.

For more information on different types of links, check out Cyrus Shepard’s post All Links are Not Created Equal: 20 New Graphics on Google's Valuation of Links.

“Building links” is common SEO vernacular, but it deserves unpacking or else you may get the wrong idea about this practice. Google wants people to link to pages out of their own volition, because they value the content on that page. Google does not want people to link to pages because they were paid or incentivized to do so, or create links to their websites themselves — those types of links should use the “nofollow” attribute. You can read more about what Google thinks about links in their webmaster guidelines.

The main thing to remember is that links to your pages are an important part of SEO, but Google doesn’t want you paying or self-creating them, so the practice of “building links” is really more a process of “earning links” — let’s dive in.

How do I build links?

If Google doesn’t want you creating links yourself or paying for them, how do you go about getting them? There are a lot of different methods, but we’ll explore some of the basics.

Link gap analysis

One popular method for getting started with link building is to look at the links your competitors have but you don’t. This is often referred to as a competitor backlink analysis or a link gap analysis. You can perform one of these using Moz Link Explorer’s Link Intersect tool.

Link Intersect gives you a glimpse into your competitor’s link strategy. My pal Miriam and I wrote a guide that explains how to use Link Explorer and what to do with the links you find. It’s specifically geared toward local businesses, but it’s helpful for anyone just getting started with link building.
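Conceptually, a link gap analysis boils down to a set difference: the referring domains that point to one or more of your competitors but not to you. Here's a minimal sketch of that idea in Python using made-up domains; this is not the Moz API, just a way to picture what Link Intersect is computing for you.

```python
# Conceptual sketch of a link gap analysis with made-up data.
# (Not the Moz API; Link Intersect does the real lookup for you.)

my_links = {"citychamber.org", "localnews.com", "supplierdirectory.net"}

competitor_links = {
    "competitor-a.com": {"citychamber.org", "industryweekly.com", "bestoflists.com"},
    "competitor-b.com": {"industryweekly.com", "localnews.com", "tradeassociation.org"},
}

# Domains that link to at least one competitor but not to us: the "gap".
all_competitor_referrers = set().union(*competitor_links.values())
link_gap = sorted(all_competitor_referrers - my_links)

print(link_gap)
# ['bestoflists.com', 'industryweekly.com', 'tradeassociation.org']
```

Each domain in that gap is a prospect worth evaluating: if a site already links to a competitor's comparable content, it may be open to linking to yours.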

Email outreach

A skill you’ll definitely need for link building is email outreach. Remember, links to your site should be created by others, so to get them to link to your content, you need to tell them about it! Cold outreach is always going to be hit-or-miss, but here are a few things that can help:

  • Make a genuine connection: People are much more inclined to help you out if they know you. Consider connecting with them on social media and building a relationship before you ask them for a link.
  • Offer something of value: Don’t just ask someone to link to you — tell them how they’ll benefit! Example: offering a guest post to a content-desperate publisher.
  • Be someone people would want to link to: Before you ask anyone to link to your content, ask yourself questions like, “Would I find this valuable enough to link to?” and “Is this the type of content this person likes to link to?”

There are plenty more articles on the Moz Blog to check out if you're looking to learn more about making your email outreach effective.

Contribute your expertise using services like HARO

When you’re just getting started, services like Help a Reporter Out (HARO) are great. When you sign up as a source, you’ll start getting requests from journalists who need quotes for their articles. Not all requests will be relevant to you, but be on the lookout for those that are. If the journalist likes your pitch, they may feature your quote in their article with a link back to your website.

Where do I go from here?

I hope this was a helpful crash-course into the world of link building! If you want to keep learning, we recommend checking out this free video course from HubSpot Academy that walks you through finding the right SEO strategy, including how to use Moz Link Explorer for link building.


Remember, link building certainly isn’t easy, but it is worth it!



Monday, August 12, 2019

Tax Credits Are Expiring for Tesla and GM

The Plug-In Electric Drive Vehicle Credit (EV tax credit), established by the Energy Improvement and Extension Act of 2008 and amended by the American Recovery and Reinvestment Act of 2009, provides a tax credit of up to $7,500 to U.S. purchasers of qualified plug-in electric vehicles. The credit is worth its full value until a manufacturer sells more than 200,000 qualifying vehicles. After this threshold is met, a phase-out period begins. Starting the second quarter following the quarter in which the 200,000th vehicle is sold, the credit halves to $3,750; after two quarters at that rate it halves again to $1,875, where it remains for another two quarters before going away entirely.

At present, the only two EV manufacturers to hit this sales threshold are Tesla and General Motors. Tesla began its phase-out period January 1st of this year after hitting the threshold in the 3rd quarter of 2018. On June 30th, the tax credit for Tesla buyers halved again to $1,875, and it will go away entirely in January of 2020. GM is not far behind; its phase-out started on April 1st when the credit for GM buyers was halved. It will halve again to $1,875 at the beginning of October and expire on March 31st of next year.
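To make the step-down concrete, here is a minimal sketch in Python of the per-manufacturer schedule described above. The quarter indexing (quarter 0 being the first phase-out quarter) is an illustrative assumption for readability, not tax guidance.

```python
# Illustrative sketch of the per-manufacturer EV credit phase-out.
# Assumes the statutory schedule described above: full credit until the
# phase-out begins, then 50% for two quarters, 25% for two quarters, then zero.

FULL_CREDIT = 7_500

def credit_value(quarters_into_phaseout: int) -> float:
    """Credit available to buyers, indexed by calendar quarters elapsed
    since the phase-out began (negative = phase-out not yet started)."""
    if quarters_into_phaseout < 0:
        return FULL_CREDIT          # threshold not yet triggered: $7,500
    if quarters_into_phaseout < 2:
        return FULL_CREDIT * 0.5    # $3,750 for two quarters
    if quarters_into_phaseout < 4:
        return FULL_CREDIT * 0.25   # $1,875 for two quarters
    return 0.0                      # credit fully expired

# Example: Tesla's phase-out began in Q1 2019, so Q3 2019 is two quarters in.
print(credit_value(2))  # 1875.0
```

Running the same function for GM, whose phase-out began in Q2 2019, shows its credit reaching $1,875 in October 2019 and zero after March 2020, matching the dates above.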

The expiration of the credits will require companies whose sales have been augmented by the subsidy to make their EVs more cost-competitive with internal combustion engine vehicles. The beginnings of this can already be seen: after its most recent credit reduction, Tesla cut $1,000 off the price of the Model 3 and simplified its lineup by removing the base model option and adjusting prices for the Model X and Model S. Electric vehicles are growing in popularity, but it's time to see whether they can survive on the market unsubsidized.

Predictably, some automakers are loath to see the credits phase out. Tesla and GM, along with Nissan (whose sales are projected to reach the threshold by 2021), have been lobbying to extend the credit. They've partnered with charging-station manufacturers and advocacy groups in the "EV Drive Coalition," whose goal is legislation to reform and extend the EV tax credit, although even they acknowledge that the credit must eventually sunset.

In April, Senator Debbie Stabenow (D-MI) introduced S.1094, the "Driving America Forward Act," to the Senate Finance Committee. The bill, co-sponsored by seven other senators, including two Republicans, Susan Collins (R-ME) and Lamar Alexander (R-TN), would add an additional 400,000 units to each company's cap, with the value of the additional credits lowered to $7,000. Raising the cap would only further entrench the market distortion created by the per-manufacturer phase-out.

Since the EV tax credit began in 2008, the vehicles produced by all major electric vehicle companies have been eligible for the subsidy. Because the most popular producers will now have to learn to live without it, the EV market will likely start to change. Tesla and GM will need to innovate more aggressively, cut prices to make up for the tax credit their customers lose, and find other ways to be profitable that they did not have to worry about while the subsidy was artificially lowering the real price consumers paid for their vehicles.

The automakers whose tax credit eligibility is phasing out complain that it’s unfair that they, the early adopters who have successfully sold EVs, will no longer have their vehicles eligible for the credit, while other companies’ vehicles will still be able to receive it.

This disparity arises from the 2009 amendment of the law, which established the per-company cap and phase-out rather than one overall number available on a first-come, first-served basis regardless of manufacturer. Although this structure creates a serious market distortion, plans to extend the credit would not remedy either the overall distortion caused by the credit or the particular distortion that arises when some companies still receive the subsidy while others see their credits phased out after reaching the cap.

Although all government subsidies distort markets, the effect would be far less negative were the program based on a total pool of money or a total number of cars irrespective of manufacturer (the original 2008 law established a total cap of 250,000), after which point the law would sunset. That structure would have given companies an incentive to compete against one another for the subsidies. As things stand, though, the 2009 iteration of the law was structured in a way that leads to the current interim period in which some companies have phased out while others have not.

As is so often the case with government programs, the reality is a far cry from the ideal hypotheticals drawn up by economists and policy wonks. Suboptimal structures make their way into legislation all the time, and in this case, the structure of the law leaves plenty to be desired.

Now that the distortion has occurred, though, the best path forward is to allow companies to use up their allotments and then enter the real, unsubsidized market. After all, each company gets the same total number of allotments, so although the benefit is going to the latecomers now, while Tesla and GM phase out, their arguments of unfairness unravel once it is acknowledged that the newcomers will hit the threshold at some point as well. They too will be capped at 200,000.

But new and less effective manufacturers may continue to crop up, their customers still benefiting from the tax credit, while the companies with more established production methods no longer do. This could distort the EV market in a way that undermines its long-term growth by subsidizing the less productive companies.

It may also cause companies to splinter in otherwise economically impractical ways to regain tax credit eligibility for their customers. Because the law's phase-out applies only per manufacturer, it could conceivably continue propping up inefficient companies and causing uneconomical allocations for a long time. Overall, the EV tax credit is an example of convoluted policy getting in the way of both the free market and its own stated goal.


"Study Finds:" How Data-Driven Content Marketing Builds Links and Earns Press Mentions

Friday, August 9, 2019

James Hansen on Climate Policy: The Latest

We have at most ten years—not ten years to decide upon action, but ten years to alter fundamentally the trajectory of global greenhouse emissions.

-James Hansen, July 2006

Saving Earth is a century-time-scale problem. There will be significant overshoot of global temperature as well as overshoot of atmospheric greenhouse gas amounts. We are already into overshoot territory, but not very far as yet. This is no time to give up.

-James Hansen, June 2019

James Hansen sparked the climate alarm in 1988 and remains a leading policy activist today. But as the above two quotations attest, he shoots from the hip, exaggerating both the problem and the time frame to address it.

Hansen is quite unlike the hyper-arrogant John Holdren on one side and vitriolic Joe Romm on the other. He is a real scientist, although far too confident about his high-sensitivity estimate of the enhanced greenhouse effect on global climate.

But Hansen speaks truth to power on the futility of renewables as a substitute for mineral energies. He has even wondered whether the environmental movement is net CO2-positive, its anti-nuclear activism more than offsetting its renewables push.

Hansen sees climate politics as “alligator shoe” lobbyists bribing the system. Cap-and-trade schemes are “a hidden regressive tax, benefiting the select few.”

The Kyoto Protocol was “doomed from the start.” The Paris climate accord is “a fraud really, a fake.” Hansen recently opined on both:

The ‘cap’ approach of the Kyoto and Paris agreements is doomed to failure. We cannot successfully beg each of 200 nations to reduce their emissions.

How about the so-called Green New Deal? Hansen in a debate rejected it as “nonsense.”

His straight talk has disrupted the Left environmentalists’ build-it-and-they-will-come (really, force-it-and-they-will-cope) narrative. “I am unhappy to publicize Hansen’s bleeding-edge climate policy analysis,” Joe Romm once complained, which “is mostly providing aid-and-comfort to the deniers and delayers.”

The Latest

Hansen’s “Saving Earth” (June 27) updates his thoughts about the problem and the solution. He remains a full-scale climate alarmist, seeing only red and not green from increasing atmospheric concentration of carbon dioxide and other GHGs. (His earlier views were more nuanced.)

To deep ecologists like Hansen, a group that includes the large majority of natural scientists in and outside of climatology, the natural environment is optimal and fragile. The human influence cannot be good, and government must solve the problem at whatever scale is required. Yet they are surprised again and again at the resiliency of man and ecosystems as time marches on.

Adaptation, in fact, is the wealth-is-health, free-market alternative to government-forced CO2 mitigation. A century of progress, enabled significantly by fossil fuels, is responsible for reducing climate-related deaths by 95 percent.

Here is Hansen’s latest, mixing alarmist certainty with the reality of a losing war against pro-consumer, pro-taxpayer carbon-based energy.

  1. Alarmism

On the other hand, delayed [climate] response [to human forcing] also allows the possibility of [policy] actions to avert a globally catastrophic outcome. By ‘globally catastrophic outcome’ I refer to the threat that the planet could become ungovernable over the next several decades, if we do not fundamentally alter our energy systems.

  2. Mitigation Fail

The really bad news is that the annual growth of greenhouse gas climate forcing is not declining, it is accelerating! Accelerating growth is mainly from CO2, but methane (CH4) also contributes…. The real world is rapidly diverging from the RCP2.6 scenario … that keeps global warming at approximately 1.5°C.

  3. Geoengineering Fail

Let us look at the cost of CO2 extraction [from the atmosphere] …. In rounder numbers the annual cost of extracting CO2 is now about 2-4 trillion dollars, and rising. That is the annual cost. So we won’t do the extraction.

  4. Political Fail

A temperature ‘target’ approach is ineffectual. It has practically no impact on global emissions. A target approach is also used for emissions. Yes, a nation should track its emissions accurately, but targets cannot substitute for policy. Global emissions accelerated after the 1997 Kyoto Protocol.

Political leaders are perpetrating a hoax. Faced with realization that we could hand young people a climate system running out of their control, political leaders took the easy way out. With the Paris Agreement in 2015 they changed the target for maximum global warming from 2°C to 1.5°C.

The public … voted in Barack (‘Planet in Peril’) Obama and Albert (‘Earth in the Balance’) Gore. The accomplishments by those Administrations in addressing climate change, to use a favorite phrase of my mother, ‘did not amount to a hill of beans’.

Emission targets will never overrule the desire of nations to raise their standards of living.

Conclusion

James Hansen is wed to catastrophic warming, wholly rejecting a benign, if not beneficial, lukewarming. This said, he is on point about the futility of wind power and solar power making a dent in CO2 emissions. And to his credit, he sees almost all of climate politics as grotesque.

Naively, Hansen sees the climate solution as “honest pricing of fossil fuels” (via a carbon tax) and “government support of breakthrough technologies, including clean energy research, development, demonstration and deployment programs.”

Assume perfect knowledge about the problem. Assume perfect government in the solution. Assume an unimaginable energy fix. Assume that self-sacrificial democracies will go carbon negative as energy reality marches on.

James Hansen is a serious scientist. But is he open to lower climate sensitivity estimates? Can he recognize the risks of climate policy to adaptation? He has 30 years invested in his climate views, but the impossible climate math cannot be held at bay much longer.


Supercharge Your Link Building Outreach! 5 Tips for Success - Whiteboard Friday

Wednesday, August 7, 2019

New Paper: The Urgency of PURPA Reform to Assure Ratepayer Protection

A new paper released by Professor David Dismukes at the Louisiana State University Center for Energy Studies highlights the need for reforming outdated provisions of the Public Utility Regulatory Policies Act (PURPA) passed in 1978. The paper demonstrates how PURPA ultimately forces ratepayers to pay inflated prices for new renewable generation regardless of whether the electricity is needed and examines state level efforts to protect ratepayers from these negative impacts.

The paper focuses on the provisions of PURPA intended to promote independent energy generation. At the time of passage, utilities were all vertically integrated; all generation, transmission and sales were handled by the same monopoly in a given area. PURPA sought to inject competition into this system. It did this by requiring utilities to buy any electricity produced by a “qualifying facility.” A QF is a smaller, independent generation facility. These purchases are made at an administratively determined price called “avoided cost.” This is meant to approximate the cost for a utility to produce that unit of electricity.

Whatever the merits of that competition effort in the '70s and '80s, the paper shows that these provisions are hopelessly outdated now. Numerous states have varying levels of competition in the electricity market. FERC and the regional interconnection grids also promote and regulate interstate competitive markets pursuant to federal law. While some states remain largely monopoly systems, in general there are outlets for competition to varying degrees nationwide.

PURPA, though, continues to require utilities to purchase electricity from any QF, regardless of whether that electricity is even needed. And since electricity demand has been essentially flat for more than a decade, most new PURPA development is in fact not needed. On top of that mandate, the system for setting "avoided cost" is a hopeless mess, mainly because the rate is set administratively by regulators rather than by the market. The paper notes several examples of a common situation in which the "avoided cost" rate is far higher than the actual cost of the generation. While this is a great deal for small wind and solar developers, because utilities are required to buy their product at artificially high prices, it leaves ratepayers holding the bag, paying excessive rates for unneeded electricity generation. This extra PURPA generation also wrecks the economics of existing generation, with those costs passed on to ratepayers as well.
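To see how an inflated "avoided cost" rate turns into ratepayer dollars, here is a back-of-the-envelope sketch. Every number in it (the rates, the plant's output, the contract length) is a hypothetical assumption chosen for illustration, not a figure from Dismukes' paper.

```python
# Hypothetical back-of-the-envelope: cost of an inflated "avoided cost" rate.
# All numbers are illustrative assumptions, not figures from the paper.

avoided_cost_rate = 60.0      # $/MWh, administratively set PURPA rate
market_rate = 35.0            # $/MWh, what comparable power actually costs
annual_output_mwh = 250_000   # output of a hypothetical ~100 MW wind QF
contract_years = 20           # typical long-term PURPA contract length

annual_overpayment = (avoided_cost_rate - market_rate) * annual_output_mwh
total_overpayment = annual_overpayment * contract_years

print(f"Ratepayer overpayment: ${annual_overpayment:,.0f}/year, "
      f"${total_overpayment:,.0f} over the contract")
# Ratepayer overpayment: $6,250,000/year, $125,000,000 over the contract
```

A single mid-sized facility locked in at an above-market administrative rate can thus cost ratepayers tens of millions of dollars over the life of one contract, which is how the aggregate figures below reach into the billions.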

Among the main conclusions of the paper are:

  • An estimated $108 billion in PURPA renewables payments have been charged to ratepayers in the last 10 years.
  • While it is impossible to estimate exactly how much of that new generation was unnecessary, given that electricity demand in the US has not been growing over that decade, it is likely that a large proportion of that was not needed.
  • Various efforts at state-level reforms in states such as Montana, North Carolina, and Idaho have run into regulatory and legal hurdles largely due to the antiquated structure of the federal PURPA laws and regulations.

The paper further calls for specific reforms to PURPA to address some of the law’s worst failures:

  1. Eliminating the requirement for utilities to purchase unneeded electricity
  2. Eliminating requirements for long-term contracts unless there is a demonstrated need for such a contract.
  3. Eliminating loopholes like the "one-mile" rule, which allows large generation developments to masquerade, through careful spacing, as a series of small PURPA qualifying facilities.

What this paper ultimately highlights is that, although "PURPA reform" sounds like an arcane fight between utilities and independent generators, ratepayers ultimately pay all the costs of the rent-seeking this broken regulatory system invites. Inflated "avoided cost" payments, which have nothing to do with the actual cost of electricity generation, are locked into long-term contracts that leave ratepayers paying far more than the wind or solar generation actually costs. Furthermore, these over-market payments are for new electricity supplies that are not needed. The only way to halt this fleecing of electricity consumers is through urgently needed reforms to the underlying law. The full paper is available here.


Physicist in New York Times Admits Climate Change Might Be “Mere Annoyance”

For years here at IER I have been warning Americans that the “consensus science” in the peer-reviewed economics literature doesn’t at all support aggressive political action in the name of fighting climate change. For example, back in 2014 I used the latest climate report from the UN in order to show that the UN’s (then) preferred climate target of 2° Celsius didn’t make sense. Similarly, last fall when William Nordhaus won the Nobel Prize for his pioneering work on climate economics, I mentioned the awkward fact that his life’s work quite clearly repudiated the UN report that came out the same weekend as Nordhaus’ award.

Now because I’m an economist, I understandably focus on the economics literature. But anybody with eyes can see that the claims of immediate climate crisis have been oversold in the natural sciences, as well.

This was obvious in a recent article in the New York Times, written by PhD physicist Sabine Hossenfelder. The title of the piece is, “Is Climate Change Inconvenient or Existential? Only Supercomputers Can Do the Math.” Already this is surprising: could it really be true that climate change is merely inconvenient?! I mean, the BBC is telling us that we don’t have 12 years to act—as Alexandria Ocasio-Cortez (falsely) claimed the UN reports tell us—but in fact we only have 18 months to save the planet. So did the NYT editors screw up with that title?

Nope, the NYT headline is fine. Here is a longer excerpt from the piece by Hossenfelder:

But we don’t know how to solve [the equations modeling the climate system]. The many factors that affect the climate interact with one another and give rise to interconnected feedback cycles. The mathematics is so complex, the only way scientists know to handle it is by feeding the problem into computers, which then approximately solve the equations.

The Intergovernmental Panel on Climate Change based its latest full report, in 2014, on predictions from about two dozen such computer models….While similar in methodology, the models arrive at somewhat different long-term predictions. They all agree that Earth will continue to warm, but disagree on how much and how quickly.

In this situation, the best we can do is improve computer models to obtain more accurate, approximate solutions. It is knowledge we urgently need: As Earth continues to warm, we face a future of drought, rising seas and extreme weather events. But for all we currently know, this situation could be anywhere between a mere annoyance and an existential threat. [Bold added.]

The above excerpt—and remember, this is coming from a physicist whose article was vetted by the NYT—is absolutely shocking, in the context of U.S. political debates over climate change. If President Trump or Senator Inhofe had made equivalent remarks, the internet Thought Police would quickly pounce on such “unscientific” denialism.

And yet, the so-called "lukewarmer" position is actually the most defensible in terms of mainstream climate science. Do not fall for rhetorical tricks when alarmists claim that "we've known for over a century that carbon dioxide-induced climate change is real"; the basic chemistry and physics is not in dispute, but by itself it does not imply catastrophe. As scientists like Judith Curry and my Cato co-authors Pat Michaels and Chip Knappenberger have been telling the public, the immediate effect of carbon dioxide will not lead to disaster. The predictions of catastrophe are derived from controversial "feedback effects" that arise in certain models but not in others. The various models are all consistent with the laws of physics, but, as the NYT piece explains, our computers currently have to cut corners because the climate system is so complex.

Conclusion

I’m not a (physical) scientist, and I don’t even play one on TV. But as a professional economist, I can quite confidently report that the economic models of climate change do not come in the same ZIP code as supporting the aggressive policies touted by recent UN reports. And, as a recent NYT article from a PhD physicist admits, even the physical scientists aren’t agreed that disaster looms. Indeed, climate change might very well just be a mere inconvenience. This type of revelation doesn’t by itself tell us the proper government policy response, but it does tell Americans which vocal activists they should immediately tune out, because they’ve been caught bluffing.
