Saturday, November 30, 2019

Solar Sunday – get $100 OFF Solar Training!

Illuminate the upcoming new year with reduced solar training tuition from SEI! Between December 1-2, get $100 off SEI’s online PVOL101 or FVOL101 January 13 – February 23 sessions. A great gift for yourself or someone else this holiday season, these classes cover the fundamentals of solar energy, building a solid understanding of the various components, system architectures, and applications for PV systems. No prerequisites or previous solar/electrical experience required. This deal expires Monday, December 2nd at midnight. Enroll now; classes fill up quickly!

Use code SOLARSUNDAY to receive $100 OFF!


Friday, November 29, 2019

All About Fraggles (Fragment + Handle) - Whiteboard Friday

Posted by Suzzicks

What are "fraggles" in SEO and how do they relate to mobile-first indexing, entities, the Knowledge Graph, and your day-to-day work? In this glimpse into her 2019 MozCon talk, Cindy Krum explains everything you need to understand about fraggles in this edition of Whiteboard Friday.


Video Transcription

Hi, Moz fans. My name is Cindy Krum, and I'm the CEO of MobileMoxie, based in Denver, Colorado. We do mobile SEO and ASO consulting. I'm here in Seattle, speaking at MozCon, but also recording this Whiteboard Friday for you today, and we are talking about fraggles.

So fraggles are obviously a name that I'm borrowing from Jim Henson, who created "Fraggle Rock." But it's a combination of words. It's a combination of fragment and handle. I talk about fraggles as a new way or a new element or thing that Google is indexing.

Fraggles and mobile-first indexing

Let's start with the idea of mobile-first indexing, because you have to kind of understand that before you can go on to understand fraggles. So I believe mobile-first indexing is about a little bit more than what Google says. Google says that mobile-first indexing was just a change of the crawler.

They had a desktop crawler that was primarily crawling and indexing, and now they have a mobile crawler that's doing the heavy lifting for crawling and indexing. While I think that's true, I think there's more going on behind the scenes that they're not talking about, and we've seen a lot of evidence of this. So what I believe is that mobile-first indexing was also about indexing, hence the name.

Knowledge Graph and entities

So I think that Google has reorganized their index around entities or around specifically entities in the Knowledge Graph. So this is kind of my rough diagram of a very simplified Knowledge Graph. But Knowledge Graph is all about person, place, thing, or idea.

Nouns are entities. The Knowledge Graph has nodes for all of the major person, place, thing, or idea entities out there. But it also organizes the relationships of this idea to that idea or this thing to that thing. What's useful to Google is that these concepts and relationships stay true in all languages, and that's how entities work, because entities happen before keywords.

This can be a hard concept for SEOs to wrap their brain around because we're so used to dealing with keywords. But if you think about an entity as something that's described by a keyword and can be language agnostic, that's how Google thinks about entities, because entities in the Knowledge Graph aren't written up per se; their unique identifier isn't a word, it's a number, and numbers are language agnostic.

But if we think about an entity like mother, mother is a concept that exists in all languages, but we have different words to describe it. Regardless of what language you're speaking, mother is related to father, is related to daughter, is related to grandfather, all in the same ways, even if we're speaking different languages. So if Google can use what they call the "topic layer" and entities as a way to filter in information and understand the world, then they can do it in languages where they're strong and say, "We know that this is true absolutely 100% all of the time."

Then they can apply that understanding to languages that they have a harder time indexing or understanding: languages where they're just not as strong, or where the algorithm isn't built to handle complexities like German's very long compound words, or languages that use lots of short words to mean different things or to modify other words.

Languages all work differently. But if they can use their translation API and their natural language APIs to build out the Knowledge Graph in places where they're strong, then they can use it with machine learning to also build it and do a better job of answering questions in places or languages where they're weak. So when you understand that, then it's easy to think about mobile-first indexing as a massive Knowledge Graph build-out.

We've seen this happening statistically. There are more Knowledge Graph results and more other things that seem to be related to Knowledge Graph results, like people also ask, people also search for, related searches. Those are all describing different elements or different nodes on the Knowledge Graph. So when you see those things in the search, I want you to think, hey, this is the Knowledge Graph showing me how this topic is related to other topics.

So when Google launched mobile-first indexing, I think the reason it took two and a half years is that they were reindexing the entire web and organizing it around the Knowledge Graph. If you think back to the AMA that John Mueller did right around the time mobile-first indexing was launching, he answered a lot of questions that were about JavaScript and hreflang.

When you put this in that context, it makes more sense. He wants the entity understanding; he knows that entity understanding is really important, so hreflang is also really important. So that's enough of that. Now let's talk about fraggles.

Fraggles = fragment + handle

So fraggles, as I said, are a fragment plus a handle. It's important to know that fraggles — let me go over here — fraggles and fragments: there are lots of things out there that have fragments. So you can think of native apps, databases, websites, podcasts, and videos. Those can all be fragmented.

Even though they don't have a URL, they might be useful content, because Google says its goal is to organize the world's information, not to organize the world's websites. I think that, historically, Google has kind of been locked into this crawling and indexing of websites and that that's bothered it, that it wants to be able to show other stuff, but it couldn't do that because they all needed URLs.

But with fragments, potentially they don't have to have a URL. So keep these things in mind — apps, databases and stuff like that — and then look at this. 


So this is a traditional page. If you think about a page, Google has kind of been forced, historically by their infrastructure, to surface pages and to rank pages. But pages sometimes struggle to rank if they have too many topics on them.

So for instance, what I've shown you here is a page about vegetables. This page may be the best page about vegetables, and it may have the best information about lettuce, celery, and radishes. But because it's got those topics and maybe more topics on it, they all kind of dilute each other, and this great page may struggle to rank because it's not focused on the one topic, on one thing at a time.

Google wants to rank the best things. But historically they've kind of pushed us to put the best things on one page at a time and to break them out. So what that's created is this "content is king, I need more content, build more pages" mentality in SEO. The problem is everyone can be building more and more pages for every keyword that they want to rank for or every keyword group that they want to rank for, but only one is going to rank number one.

Google still has to crawl all of those pages that it told us to build, and that creates this character over here, I think, Marjory the Trash Heap, which if you remember the Fraggles, Marjory the Trash Heap was the all-knowing oracle. But when we're all creating kind of low- to mid-quality content just to have a separate page for every topic, then that makes Google's life harder, and that of course makes our life harder.

So why are we doing all of this work? The answer is because Google can only index pages, and if the page is too long or covers too many topics, Google gets confused. So we've been enabling Google to do this. But let's pretend, go with me on this, because this is a theory, I can't prove it. If Google didn't have to index a full page, or wasn't locked into that and could just index a piece of a page, then that makes it easier for Google to understand the relationships of different topics to one page, but also to organize the bits of the page against different pieces of the Knowledge Graph.

So this page about vegetables could be indexed and organized under the vegetable node of the Knowledge Graph. But that doesn't mean that the lettuce part of the page couldn't be indexed separately under the lettuce portion of the Knowledge Graph and so on, celery to celery and radish to radish. Now I know this is novel, and it's hard to think about if you've been doing SEO for a long time.

But let's think about why Google would want to do this. Google has been moving towards all of these new kinds of search experiences where we have voice search, we have the Google Home Hub kind of situation with a screen, or we have mobile searches. If you think about what Google has been doing, we've seen the increase in people also ask, and we've seen the increase in featured snippets.

They've actually been kind of, sort of making fragments for a long time or indexing fragments and showing them in featured snippets. The difference between that and fraggles is that when you click through on a fraggle, when it ranks in a search result, Google scrolls to that portion of the page automatically. That's the handle portion.

So handles you may have heard of before. They're kind of old-school web building. We call them bookmarks, anchor links, anchor jump links, stuff like that. It's when it automatically scrolls to the right portion of the page. But what we've seen with fraggles is Google is lifting bits of text, and when you click on it, they're scrolling directly to that piece of text on a page.
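To make that concrete, here is roughly what the two styles of link look like. The second line uses Chrome's Text Fragments syntax purely as an illustration of the scroll-to-text behavior; Google generates these links on its own, so there is nothing for you to program on the page:

```
https://example.com/vegetables#lettuce           classic jump link to an author-defined anchor
https://example.com/vegetables#:~:text=lettuce   scroll-to-text link; the browser jumps to and highlights "lettuce"
```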

So we see this already happening in some results. What's interesting is Google is overlaying the link. You don't have to program the jump link in there; Google actually finds it and puts it there for you. So Google is already doing this, especially with AMP featured snippets. If you have an AMP featured snippet, that is, a featured snippet that's lifted from an AMP page, when you click through Google actually scrolls to and highlights the featured snippet so that you can read it in context on the page.

But it's also happening in other kind of more nuanced situations, especially with forums and conversations where they can pick a best answer. The difference between a fraggle and something like a jump link is that Google is overlaying the scrolling portion. The difference between a fraggle and a site link is site links link to other pages, and fraggles, they're linking to multiple pieces of the same long page.

So we want to avoid continuing to build up low-quality or mid-quality pages that might go to Marjory the Trash Heap. We want to start thinking in terms of: can Google find and identify the right portion of the page about a specific topic, and are these topics related enough that they'll be understood when Google indexes them against the Knowledge Graph?

Knowledge Graph build-out into different areas

So I personally think that we're seeing the build-out of the Knowledge Graph in a lot of different things. I think featured snippets are kind of facts or ideas that are looking for a home or validation in the Knowledge Graph. People also ask seem to be the related nodes. People also search for, same thing. Related searches, same thing. Featured snippets, oh, they're on there twice, two featured snippets. Found on the web, which is another way where Google is putting expanders by topic and then giving you a carousel of featured snippets to click through on.



So we're seeing all of those things, and some SEOs are getting kind of upset that Google is lifting so much content and putting it in the search results and that you're not getting the click. We know that 61% of mobile searches don't get a click anymore, and it's because people are finding the information that they want directly in a SERP.

That's tough for SEOs, but great for Google because it means Google is providing exactly what the user wants. So they're probably going to continue to do this. I think SEOs are going to change their minds and want to be in that windowed, lifted content. When Google starts doing this kind of thing for native apps, databases, websites, podcasts, and other content, those are new competitors you didn't have to deal with when it was only websites ranking. But they're also more engaging kinds of content that Google can lift and show in a SERP even if they don't have URLs, because Google can just window them and show them.

So you'd rather be lifted than not shown at all. So that's it for me and featured snippets. I'd love to answer your questions in the comments, and thanks very much. I hope you like the theory about fraggles.

Video transcription by Speechpad.com



Wednesday, November 27, 2019

Oil Rig Count Down; Oil Production Up

According to the Energy Information Administration (EIA), U.S. oil production for the first nine months of 2019 averaged over 12 million barrels per day. That level of production is 1.4 million barrels per day higher than the same period last year, despite the oil rig count declining steadily in 2019. EIA sees increasing oil production as the trend for the near future. In EIA’s Short Term Energy Outlook, oil production is expected to end 2019 1.3 million barrels per day higher than last year and to rise another 1 million barrels per day next year, ending 2020 at 13.29 million barrels per day.

Despite the increase in oil production, the oil rig count has declined over a record 11 months as independent exploration and production companies cut spending on new drilling, focusing on earnings growth instead of increased output. U.S. financial services firm Cowen & Co indicated that projections from the exploration and production companies it tracks point to a 5 percent decline in capital expenditures for drilling and completions in 2019 versus 2018. The exploration and production companies Cowen tracks reported plans to spend about $80.5 billion in 2019 versus $84.6 billion in 2018. But that decline differs, in both magnitude and direction, between major oil and gas producers and independent producers. According to Cowen, independent producers expect to spend about 11 percent less in 2019, while major oil companies plan to spend about 16 percent more.

Source: The Wall Street Journal

Shale Producers

Many independent shale oil companies are expected to lower investment spending, which would likely slow growth in future oil production. The companies hope that slower growth may result in higher commodity prices. These small and midsize shale producers are emphasizing profitability, making it more challenging for them to drill new wells as rapidly as they did previously. A pullback by these producers would likely cause U.S. oil production growth, which slowed a bit this year, to slow further in 2020.

The slower rate of new wells means companies will need to obtain greater production from each well to sustain current production levels. Some data indicate that the production gains from technological advances are leveling off. Gains in oil production from U.S. onshore drilling rigs are declining: in December, drilling rigs were extracting 25 percent more oil than a year earlier, but by August the gain was only about 14 percent. Further, oil production in the first 90 days of an average shale well—the time when it is most productive—declined by 10 percent in the first half of the year compared to the 2018 average.

Data show that across North Dakota’s Bakken Shale region, well productivity has not improved since late 2017. In other mature shale regions, such as the Eagle Ford in South Texas, many operators have seen productivity per horizontal foot decline as they have supersized their wells—drilling bigger and often more expensive wells to recover a similar amount of oil.

However, as the exploration and production data above noted, major oil companies such as Exxon Mobil and Chevron that are increasing their shale footprint, particularly in the Permian Basin of Texas and New Mexico, are investing heavily in factory-style shale production. It is unclear how much the increased investment and production from the major producers would outweigh the slower growth in production from the independent producers.

Conclusion

Oil drilling rig counts are down, but U.S. oil production is still increasing, just at a slightly slower rate. According to EIA, the trend in oil production growth is expected to continue with oil production next year reaching 13.29 million barrels per day—1 million barrels a day above what is expected for this year’s oil production. Many independents are investing less, focusing on profitability, but major oil companies are expected to continue to invest and increase production.


Tuesday, November 26, 2019

The Practical Guide to Finding Anyone's Email Address

Posted by David_Farkas

In link building, few things are more frustrating than finding the perfect link opportunity but being completely unable to find a contact email address.

It’s probably happened to you — if you’re trying to build links or do any sort of outreach, it almost always entails sending out a fairly significant amount of emails. There are plenty of good articles out there about building relationships within the context of link building, but it’s hard to build relationships when you can’t even find a contact email address.

So, for today, I want to focus on how you can become better at finding those important email addresses.

Link builders spend a lot of time just trying to find contact info, and it’s often a frustrating process, just because sussing out email addresses can indeed be quite difficult. The site you’re targeting might not even have a contact page in the first place. Or, if the site does have a contact page, it might only display a generic email address. And, sometimes, the site may list too many email addresses. There are eight different people with similar-sounding job titles — should you reach out to the PR person, the marketing director, or the webmaster? It’s not clear.

Whatever the case may be, finding the right email address is absolutely imperative to any successful outreach campaign. In our industry, the numbers around outreach and replies aren’t great. Frankly, it’s shocking to hear the industry standard — only 8.5% of outreach emails receive a response.

I can’t help but wonder how many mistakes are made along the way to such a low response rate.

While there are certainly instances where there is simply no clear and obvious contact method, that should be the exception — not the rule! An experienced link builder understands that finding relevant contact information is essential to their success.

That’s why I’ve put together a quick list of tips and tools that will help you to find the email addresses and contact information you need when you’re building links.

And, if you follow my advice, here is a glimpse of the results you could expect:

[Screenshot: high open and reply rates on an outreach email]

We don’t track clicks, in case you were wondering ;)

ALWAYS start by looking around!

First, let’s start with my golden rule: Before you fire up any tool, you should always manually look for the correct contact email yourself.

Based on my experience, tools and automation are a last resort. If you rely solely upon tools and automated solutions, you’ll end up with many more misfired emails than if you were to go the manual route. There’s a simple reason for this: the email address listed on your target website may well belong to the very person you should contact!

Now, if you are using a tool, it may generate dozens of email addresses, and you’ll never end up actually emailing the correct individual. Another reason I advocate manually looking for emails is that many email-finding tools are limited and can only find email addresses associated with a domain name. So, if a webmaster happens to have an @gmail.com email address, the email-finding tool will not find it.

It’s also important to only reach out to people you strongly believe will have an interest in your email in order to stay GDPR compliant.

So, always start your manual search by looking around the site. Usually, there will be a link to the contact page in the header, footer, or sidebar. If there’s not a page explicitly named “contact,” or if the contact page only has generic email addresses, that’s when I would recommend jumping to an “About Us” page, should there be one. 

You always want to find a personal email, not a generic one or a contact form. Outreach is more effective when you can address a specific individual, not whoever is checking info@domain.com that day.

If you encounter too many emails and aren’t sure who the best person to contact is, I suggest sending an email to your best hunch that goes something like this:

And who knows, you may even get a reply like this:

[Screenshot: a reply telling you to contact someone else]

If you weren’t able to locate an email address at this point, I’d move on to the next section.

Ask search engines for help

Perhaps the contact page you were looking for was well-hidden; maybe they don’t want to be contacted that much or they're in desperate need of a new UX person.

You can turn to search engines for help.

My go-to search engine lately is Startpage. Dubbed the world's most private search engine, it displays Google SERPs in a way that doesn’t make you feel like you just stepped into Times Square. It also has a cool option to browse the search results anonymously with "Anonymous View."

For our purposes, I would use the site: search operator just like this:
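With example.com standing in for your target site, the query looks something like this:

```
site:example.com contact OR email
```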

If there is in fact a contact page or email somewhere on their website that you were not able to find, any competent search engine will find it for you. If the above site query doesn't return any results, then I’d start expanding my search to other corners of the web.

Use the search bar and type:
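Something along these lines, quoting the domain, usually does the trick:

```
"example.com" email
```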

If you’re looking for the email of a specific person, type their name before or after the quotation marks.

With this query you can find non-domain email addresses:
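For example, pairing a person's name with common mail providers (the name here is just a placeholder):

```
"John Smith" "@gmail.com" OR "@outlook.com"
```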

If that person’s email address is publicly available somewhere, you will likely be able to find it within the search results.

Email-finding tools

There are many, many excellent email finding tools to choose from. The first one I want to talk about is Hunter.

Hunter has a Chrome extension that’s really easy to use. After you’ve downloaded the extension, there’s not much more that needs to be done.

Go to the site you’re thinking about emailing, click on the extension in the top right corner of your screen, and Hunter, well, hunts.

It returns every email address it can find associated with that domain, and it also allows you to filter the results based on categories.

Did I say “email address?” I meant to say email address, name, job title, etc. Essentially, it’s a one-click fix to get everything you need to send outreach.

Because I use Hunter regularly (and for good reason, as you can see), it’s the one I’m most familiar with. You can also use Hunter’s online app to look up emails in bulk.
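If you’d rather script the bulk lookups, here is a minimal sketch against Hunter’s v2 domain-search API. The endpoint and response fields below reflect Hunter’s public API documentation, but treat the exact response shape as something to verify before you rely on it:

```python
import requests  # third-party: pip install requests

API_KEY = "your-hunter-api-key"  # placeholder; use your own key

def find_emails(domain: str) -> list[dict]:
    """Return the email records Hunter associates with a domain."""
    resp = requests.get(
        "https://api.hunter.io/v2/domain-search",
        params={"domain": domain, "api_key": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    emails = resp.json()["data"].get("emails", [])
    # Each record carries name/position metadata alongside the address,
    # which helps you pick the right person instead of guessing.
    return [
        {
            "email": e.get("value"),
            "name": f"{e.get('first_name')} {e.get('last_name')}",
            "position": e.get("position"),
        }
        for e in emails
    ]

for record in find_emails("example.com"):
    print(record)
```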

The major downside of working in bulk is coming up with an effective formula to sift through all the emails. Hunter may generate dozens of emails for one site, leaving you to essentially guess which email address is best for outreach. And if you’re relying on guess-work, chances are pretty high you’re leaving perfectly good prospects on the table.

There are several other email-finding tools to pick from, and it’s worth keeping an alternative or two on hand.

Even though I personally try not to be too dependent on tools, the fact of the matter is that they provide the easiest, most convenient route in many cases.

The guessing game

I know there's no word in the digital marketing world that produces more shudders than “guessing.” However, there are times when guessing is easier.

Let’s be real: there aren’t too many different ways that companies both large and small format their email addresses. It’s usually going to be something like:
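The usual suspects, with a placeholder name and domain:

```
jane.doe@company.com      (first.last@)
janedoe@company.com       (firstlast@)
jane@company.com          (first@)
jdoe@company.com          (flast@)
doe.jane@company.com      (last.first@)
jane_doe@company.com      (first_last@)
```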

If you’ve ever worked for a living, you know most of the variations. But, in case you need some help, there’s a tool for that.

Now, I’m not suggesting that you just pick any one of these random addresses, send your email, cross your fingers, and hope for the best. Far from it. There are actually tools that you can use that will indicate when you’ve selected the right one.

Sales Navigator is one such tool: a Gmail extension that is easy to use. Simply enter the name of the person you’re looking for, and it will return all of the standard variations they may use for their email address. Then, you can actually test the address from your Gmail account. When you type the address into the recipient line, a sidebar will appear on your screen. If there is no information in that sidebar, you have the wrong address. If, however, you get a return that looks like this:

Congratulations! You’ve found the right email address.

Obviously, this method only works if you know the name of the person you want to email, but just don’t have their email address. Still, in those scenarios, Sales Navigator works like a charm.

Trust, but verify

There’s nothing more annoying than when you think you’ve finally struck gold, but the gold turns out to be pyrite. Getting an email that bounces back because it wasn’t the correct address is frustrating. Even worse, if it happens too often, your address can end up on email blacklists, which will destroy your email deliverability.

There are ways to verify, however. At my company, we use Neverbounce. It’s effective and incredibly easy to use. With Neverbounce, you can enter either individual email addresses or bulk lists, and voila!

It will let you know whether that email address is currently Valid, Invalid, or Unknown. It’s that easy. Several other email verifiers offer the same kind of check.
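If you’re curious what these verifiers do under the hood, here is a bare-bones, do-it-yourself version of the check in Python: an MX lookup followed by an SMTP handshake. This is only a rough sketch (it needs the third-party dnspython package, many mail servers accept or reject everything at this stage, and probing at volume can hurt your sender reputation), so a dedicated verifier is still the production answer:

```python
import smtplib

import dns.resolver  # third-party: pip install dnspython

def smtp_check(email: str, helo_domain: str = "example.com") -> str:
    """Rough single-address check: find the domain's mail server, then ask it
    whether it would accept mail for this address."""
    domain = email.rsplit("@", 1)[-1]
    try:
        mx_records = dns.resolver.resolve(domain, "MX")
    except Exception:
        return "Invalid"  # domain has no mail server at all
    # Pick the highest-priority (lowest preference number) MX host
    mx_host = str(min(mx_records, key=lambda r: r.preference).exchange).rstrip(".")

    try:
        with smtplib.SMTP(mx_host, 25, timeout=10) as server:
            server.helo(helo_domain)
            server.mail(f"probe@{helo_domain}")
            code, _ = server.rcpt(email)
    except Exception:
        return "Unknown"  # server unreachable, or it blocked the probe
    # 250 means accepted; 550 usually means the mailbox doesn't exist
    return "Valid" if code == 250 else ("Invalid" if code == 550 else "Unknown")

print(smtp_check("someone@example.com"))
```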

Subscribe to their newsletter

Here’s one final out-of-the-box approach. This approach works more often with sites where one person clearly does most, if not all, of the work. A site where someone’s name is the domain name, for example.

If you come across a site like davidfarkas.com and you see a newsletter that can be subscribed to, hit that subscribe button. Once that’s done, you can simply reply to one iteration of the newsletter.

This method has an added benefit. An effective way of building links is building relationships, just like I said in the opening. When you can demonstrate that you're already subscribing to a webmaster’s newsletter, you'll be currying favor with that webmaster.

Conclusion

When you send a link building outreach email, you want to make sure it’s going to a real person and, even more importantly, ending up in the right hands. Sending an email to an incorrect contact periodically may seem like a negligible waste of time, but when you send emails at the volume a link builder should, the waste adds up very quickly. In fact, enough waste can kill everything else that you’re trying to accomplish.

It’s well worth your time to make sure you’re getting it right by putting in the effort to find the right email address. Be a picky link builder. Don’t just choose the first email that comes your way, and never rely solely on tools. If you email the wrong person, it will look like you didn’t care enough to spend time on their site, and in return, they will ignore you and your pitch.

With the tips outlined above, you'll avoid these issues and be on your way to more successful outreach.



California’s Aggressive Renewable Mandates Are Not Having the Desired Effect

California has some of the most aggressive mandates for renewable energy production in the country. The state has a 100 percent “clean” energy mandate by 2045, with 60 percent of the state’s electricity mandated to come from renewable energy by 2030. All new houses built in the Golden State must have solar panels on the roof, and several cities (e.g., Berkeley) have banned the use of natural gas in new residential construction. Increasing amounts of wind and solar on the California grid have caused reliability problems: millions of people have been forced to endure days without power, both to alleviate wildfire risk believed to be caused by wind damage to electrical wires and because back-up power is insufficient when wind and solar are not producing.

Because of potential power shortages, California utility regulators have ordered electricity providers in the state to secure an extra 3.3 gigawatts of low-emission generation resources by 2023, with half of those resources in place by 2021. No new fossil-fueled resources at sites without previous electricity generation facilities are allowed to count toward the procurement obligations. Along with insufficient resources, the regulators cited uncertainty created by historically high reliance on imports, increasing amounts of wind and solar power that are often unavailable during peak demand, and the retirement of natural gas plants as factors in the decision. According to the regulators, the actions are needed to support grid reliability “while keeping the electricity sector on a path to meeting the state’s clean energy goals.”

The graph below shows that California’s peak electricity demand occurs after solar power diminishes, as well as the state’s heavy reliance on natural gas generation and on imported electricity generated from fossil fuels (natural gas and coal) and nuclear power plants in Arizona and Nevada. The graph depicts the generation profile in California on October 3, 2019: the orange area is solar generation, the light blue barely visible area is wind generation, the deeper blue area is hydropower, the pinkish area is natural gas, and the brown area is imported electricity.

Source: Center of the American Experiment

Because the increasing wind and solar generation in California is causing reliability issues, grid operators issue contracts that pay natural gas plants to stay open, attempting to ensure that there is enough reliable electricity when renewables are not producing energy. Paying for the wind, solar, natural gas, and imported power is expensive: California electricity prices were, on average, almost 60 percent higher than the national average in 2018, and they are continuing to increase. Prices averaged 16.7 cents per kilowatt-hour in 2018 and 16.93 cents per kilowatt-hour through August of 2019—about 2 percent higher than over the same period in 2018.

The power shortage will probably get worse in California as the Diablo Canyon nuclear facility—a non-carbon-emitting power source—is phased out in the coming years and more renewables are mandated onto its grid. Diablo Canyon currently meets the needs of almost 3 million California residents. Pacific Gas and Electric (PG&E), the owner of those nuclear units, has been cited as the cause of recent wildfires; the resulting wildfire liabilities forced it to file for bankruptcy, and it has ordered temporary power outages to avoid further fines. California’s Supreme Court ruled that utilities must cover the costs associated with electricity-caused wildfires regardless of whether they were accidental or the result of negligence. As a result, PG&E has resolved to preventively cut its service when hot and dry Santa Ana winds could bring on brushfires.

The power outages have resulted in extreme hardships to many. “Perishable foods have been lost, costing individual households and restaurants tens of thousands of dollars. Hundreds of schools have been closed with little or no warning. The estimated 30,000 Californians who are sick or infirm and do not have access to gasoline-powered backup generators were left without functioning medical equipment.” Further, the rush to buy generators powered by gasoline or diesel has resulted in more carbon dioxide emissions in total.

With the push for renewable energy and its associated costs, PG&E and other power providers in California lack the resources to build extensive networks that could pinpoint fire-prone areas or insulate transmission lines. Even so, most wildfires are caused not by faulty electrical equipment but by natural factors and human error. California’s utilities are required by law to provide service to housing in high-risk areas, which happen to be overburdened with trees and brush that contribute to the danger.

Because of the state’s acute housing shortage, more and more residences are built inside danger zones. Further, responsibility for forestry management is divided among many interests—federal, state, local, tribal, and private—all of which must navigate the state’s onerous regulatory environment. A 2018 study found that “overcrowded forests” have contributed to the destruction of vegetation, leaving forest floors covered in combustible fuels. The report blames current forest management practices for imposing “limitations on timber harvesting” and “environmental permitting requirements” that “constrain the amount of trees and other growth removed from the forest.”

Conclusion

California is home to massive, rolling power outages that are not the result of resource deficiency but rather the result of California’s policies and mismanagement of its forests. Other states should take notice of what is transpiring in California and ensure that they do not follow suit.


#36: Bill desRosiers of Cabot Oil & Gas on developing the Marcellus Shale

Bill desRosiers, the external affairs manager of Cabot Oil & Gas, joined the show to discuss how technological innovation has unleashed American energy and the impacts developing the Marcellus Shale has had on the surrounding areas.

Links:
Shale Gas News radio show.
Learn more about Cabot Oil & Gas Corporation.
Learn more about Cabot’s community outreach initiatives. 


Solar Sunday is coming…

Get ready to illuminate the upcoming new year with reduced solar training tuition from SEI! Between Dec 1-2, get $100 off SEI’s online PVOL101 or FVOL101 sessions. Learn the fundamentals of solar energy and gain a solid understanding of the various components, system architectures, and applications for PV systems. These classes don’t require any prerequisites or previous solar/electrical experience, so sign up before your spot is taken – classes fill up fast! This deal expires Monday, December 2nd at midnight.

Use code SOLARSUNDAY between December 1-2 to receive $100 OFF!


Monday, November 25, 2019

Trump Withdraws from Paris Accord; China Continues to Increase Its Emissions

On November 4, President Trump sent a letter to the United Nations beginning the formal process to withdraw the United States from the Paris Climate Agreement. The process, which takes a year to complete, will end on November 4, 2020—a day after the 2020 presidential election. According to President Trump, the Paris Climate Agreement is a bad deal for America because it allows China to continue increasing its greenhouse gas emissions, while the United States would have to cut its emissions, thereby hurting the bottom lines of U.S. businesses that need affordable and reliable energy to prosper.

China, which accounts for almost 25 percent of the world’s energy consumption, knows how important energy is to its economy. It is the world’s largest consumer of coal and one of the world’s largest consumers of oil and natural gas. Fossil fuels accounted for 85 percent of the world’s energy supply last year, and China depends on those fuels for that same share of its own supply. China is also the world’s largest importer of coal, oil, and natural gas, since it does not have the resources to produce all that it needs. As a result, China has an ever-extending global reach that has procuring energy supplies at its core, as these fuels will remain indispensable for decades. It is active in oil- and gas-rich Africa, the Middle East, Canada, and South America.

Oil

Since 2015, China’s oil production has declined by 12 percent, while its oil demand has been increasing by more than 5 percent per year over the last decade, reaching 13.5 million barrels per day in 2018. In 2018, China was responsible for over half of global oil demand growth. It relies on imports for around 75 percent of its total oil consumption. The country also buys crude when prices are low to stockpile its inventories, which cover about 80 days of imports.

Natural Gas

China’s natural gas production has been increasing by about 8 percent per year over the last decade, but its demand has been increasing by over 13 percent per year over that time period. Thus, its growth rate for imports has been huge—over 30 percent per year. Turkmenistan supplied 70 percent of China’s natural gas pipeline imports in 2018, but that will change as the Power of Siberia project with Russia will provide natural gas by pipeline beginning this December 1. China’s LNG imports increased by almost 40 percent in 2018. It is expected that China will surpass Japan before 2022 to become the largest LNG buyer in the world.

 

Source: Forbes

Coal

China is the world’s largest consumer of coal, consuming more coal than the rest of the world combined; it consumed over five times as much coal as the United States in 2018. It is also the world’s largest producer of coal, producing almost half of the world’s output in 2018. And it is the world’s largest coal importer, followed by India, accounting for 17 percent of the world’s coal imports.

China has restarted construction on more than 50 gigawatts of suspended coal-fired power plants—the equivalent capacity of the entire coal fleet of Africa and the Middle East. A recent study by Global Energy Monitor, Greenpeace, and the Sierra Club finds that China could add 290 gigawatts of new coal-fired plants, which would exceed the 261 gigawatts of the entire U.S. coal-power fleet in 2018. That increased capacity would bring China’s total coal capacity to 1,300 gigawatts, the cap for 2030 proposed by the China Electricity Council. Global climate goals, however, will not be met without a full halt to new coal plants and the retirement of existing coal plants in China, as the graph below shows.

Source: Engineering & Technology

 

The Paris Accord Commitments

In 2015, then-President Obama committed the United States to reduce its greenhouse gas emissions from 2005 levels by 26 to 28 percent by 2025. China, however, did not agree to begin reducing its greenhouse gas emissions until after 2030, while it continues to grow its economy. It pledged that non-fossil fuels would make up 20 percent of its energy mix in 2030, reducing fossil fuels’ share by 5 percentage points, and that it would lower its carbon intensity.

Since 2005, the United States has reduced its carbon dioxide emissions, the largest component of greenhouse gas emissions, by 12 percent, making it one of the few countries to achieve a sizable percentage reduction. China, on the other hand, has increased its carbon dioxide emissions by 55 percent over that time frame. In 2018, China accounted for 28 percent of the world’s carbon dioxide emissions, while the United States accounted for 18 percent.

Conclusion

President Trump and China both realize that energy is needed for economic growth and that fossil fuels provide the most affordable, available, and reliable energy. As such, an energy-rich country makes a strong country. The United States is the world’s largest oil and natural gas producer and has abundant fossil fuel resources to keep its economy growing. China, however, despite having large resources, must also import these fuels and has its tentacles in many areas of the world to accomplish its mission of having readily available coal, oil, and natural gas. As President Trump is keenly aware, if the United States remains in the Paris Accord, it would have to institute dire policies to achieve the Obama targets, while China can continue to increase emissions of greenhouse gases and will likely do so until at least 2030.


App Store SEO: How to Diagnose a Drop in Traffic & Win It Back

Posted by Joel.Mesherghi

For some organizations, mobile apps can be an important means to capturing new leads and customers, so it can be alarming when you notice your app visits are declining.

However, while there is content on how to optimize your app, otherwise known as ASO (App Store Optimization), there is little information out there on the steps required to diagnose a drop in app visits.

Although there are overlaps with traditional search, there are unique factors that play a role in app store visibility.

The aim of this blog is to give you a solid foundation for investigating a drop in app store visits, and then we’ll go through some quick-fire opportunities to win that traffic back.

We’ll go through the process of investigating why your app traffic declined, including:

  1. Identifying potential external factors
  2. Identifying the type of keywords that dropped in visits
  3. Analyzing app user engagement metrics

And we’ll go through some ways to help you win traffic back, including:

  1. Spying on your competitors
  2. Optimizing your store listing
  3. Investing in localization

Investigating why your app traffic declined

Step 1. Identify potential external factors

Some industries/businesses will have certain periods of the year where traffic may drop due to external factors, such as seasonality.

Before you begin investigating a traffic drop further:

  • Talk to your point of contact and ask whether seasonality impacts their business, or whether there are general industry trends at play. For example, aggregator sites like SkyScanner may see a drop in app visits after the busy period at the start of the year.
  • Identify whether app installs actually dropped. If they didn’t, then you probably don’t need to worry about a drop in traffic too much and it could be Google’s and Apple’s algorithms better aligning the intent of search terms.

Step 2. Identify the type of keywords that dropped in visits

Like traditional search, identifying the type of keywords (branded and non-branded), as well as the individual keywords that saw the biggest drop in app store visits, will provide much needed context and help shape the direction of your investigation. For instance:

If branded terms saw the biggest drop-off in visits this could suggest:

  1. There has been a decrease in the amount of advertising spend that builds brand/product awareness
  2. Competitors are bidding on your branded terms
  3. The app name/brand has changed and hasn’t been able to mop up all previous branded traffic

If non-branded terms saw the biggest drop off in visits this could suggest:

  1. You’ve made recent optimization changes that have had a negative impact
  2. User engagement signals, such as app crashes, or app reviews have changed for the worse
  3. Your competitors have better optimized their apps and/or provide a better user experience (particularly relevant if an app receives a majority of its traffic from a small set of keywords)
  4. Your app has been hit by an algorithm update

If both branded and non-branded terms saw the biggest drop off in visits this could suggest:

  1. You’ve violated Google’s policies on promoting your app.
  2. There are external factors at play

To get data for your Android app

To get data for your Android app, sign into your Google Play Console account.

Google Play Console provides a wealth of data on the performance of your android app, with particularly useful insights on user engagement metrics that influence app store ranking (more on these later).

However, keyword specific data will be limited. Google Play Console will show you the individual keywords that delivered the most downloads for your app, but the majority of keyword visits will likely be unclassified: mid to long-tail keywords that generate downloads, but don’t generate enough downloads to appear as isolated keywords. These keywords will be classified as “other”.

Your chart might look like the below. Repeat the same process for branded terms.

Above: Graph of a client’s non-branded Google Play Store app visits. The visit numbers are factual, but the keywords driving them have been changed to preserve anonymity.

To get data for your iOS app

To get data on the performance of your iOS app, Apple has App Store Connect. Like Google Play Console, you’ll be able to get your hands on user engagement metrics that can influence the ranking of your app.

However, keyword data is even scarcer than in Google Play Console. You’ll only be able to see the total number of impressions your app’s icon has received on the App Store. If you’ve seen a drop in visits for both your Android and iOS apps, then you could use Google Play Console data as a proxy for keyword performance.

If you use an app rank tracking tool, such as TheTool, you can somewhat plug gaps in knowledge for the keywords that are potentially driving visits to your app.

Step 3. Analyze app user engagement metrics

User engagement metrics that underpin a good user experience have a strong influence on how your app ranks, and both Apple and Google are open about this.

Google states that user engagement metrics like app crashes, ANR rates (application not responding) and poor reviews can limit exposure opportunities on Google Play.

While Apple isn't quite as forthcoming as Google when it comes to providing information on engagement metrics, they do state that app ratings and reviews can influence app store visibility.

Ultimately, Apple wants to ensure iOS apps provide a good user experience, so it’s likely they use a range of additional user engagement metrics to rank an app in the App Store.

As part of your investigation, you should look into how the below user engagement metrics may have changed around the time period you saw a drop in visits to your app.

  • App rating
  • Number of ratings (newer/fresh ratings will be weighted more for Google)
  • Number of downloads
  • Installs vs uninstalls
  • App crashes and application not responding

You’ll be able to get data for the above metrics in Google Play Console and App Store Connect, or you may have access to this data internally.

Even if your analysis doesn’t reveal insights, metrics like app rating influence conversion and where your app ranks in the app pack SERP feature, so it’s well worth investing time in developing a strategy to improve these metrics.

One simple tactic could be to ensure you respond to negative reviews and reviews with questions. In fact, users increase their rating by +0.7 stars on average after receiving a reply.

Apple offers a few tips on asking for ratings and reviews for iOS apps.

Help win your app traffic back

Step 1. Spy on your competitors

Find out who’s ranking

When trying to identify opportunities to improve app store visibility, I always like to compare the top 5 ranking competitor apps for some priority non-branded keywords.

All you need to do is search for these keywords in Google Play and the App Store and grab the publicly available ranking factors from each app listing. You should have something like the below.

| Brand | Title | Title character length | Rating | Number of reviews | Number of installs | Description character length |
| --- | --- | --- | --- | --- | --- | --- |
| COMPETITOR 1 | [Competitor title] | 50 | 4.8 | 2,848 | 50,000+ | 3,953 |
| COMPETITOR 2 | [Competitor title] | 28 | 4.0 | 3,080 | 500,000+ | 2,441 |
| COMPETITOR 3 | [Competitor title] | 16 | 4.0 | 2,566 | 100,000+ | 2,059 |
| YOUR BRAND | [Your brand's title] | 37 | 4.3 | 2,367 | 100,000+ | 3,951 |
| COMPETITOR 4 | [Competitor title] | 7 | 4.1 | 1,140 | 100,000+ | 1,142 |
| COMPETITOR 5 | [Competitor title] | 24 | 4.5 | 567 | 50,000+ | 2,647 |

Above: anonymized table of a client's Google Play competitors

From this, you may get some indications as to why an app ranks above you. For instance, we see “Competitor 1” not only has the best app rating, but has the longest title and description. Perhaps they better optimized their title and description?

We can also see that competitors that rank above us generally have a larger number of total reviews and installs, which aligns with both Google’s and Apple’s statements about the importance of user engagement metrics.
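If you're assembling this table for more than a couple of competitors, the App Store side can be pulled programmatically. Here's a minimal sketch using Apple's public iTunes Lookup API; the endpoint and field names are the publicly documented ones, but verify the response shape before relying on it, and note the app IDs below are placeholders:

```python
import requests  # third-party: pip install requests

def app_store_snapshot(app_id: int, country: str = "us") -> dict:
    """Grab the public listing fields for one iOS app via the iTunes Lookup API."""
    resp = requests.get(
        "https://itunes.apple.com/lookup",
        params={"id": app_id, "country": country},
        timeout=10,
    )
    resp.raise_for_status()
    app = resp.json()["results"][0]
    return {
        "title": app["trackName"],
        "title_length": len(app["trackName"]),
        "rating": app.get("averageUserRating"),
        "num_ratings": app.get("userRatingCount"),
        "description_length": len(app.get("description", "")),
    }

# Placeholder IDs; swap in your app's and your competitors' numeric App Store IDs
for app_id in [123456789, 987654321]:
    print(app_store_snapshot(app_id))
```

Google Play has no equivalent public endpoint, so the Play-side numbers still come from the listing pages themselves.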

With the above comparison information, you can dig a little deeper, which leads us on nicely to the next section.

Optimize your app text fields

Keywords you add to text fields can have a significant impact on app store discoverability.

As part of your analysis, you should look into how your keyword optimization differs from competitors and identify any opportunities.

For Google Play, adding keywords to the below text fields can influence rankings:

  • Keywords in the app title (50 characters)
  • Keywords in the app description (4,000 characters)
  • Keywords in short description (80 characters)
  • Keywords in URL
  • Keywords in your app name

When it comes to the App Store, adding keywords to the below text fields can influence rankings:

  • Keywords in the app title (30 characters)
  • Using the 100 character keywords field (a dedicated 100-character field to place keywords you want to rank for)
  • Keywords in your app name
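Since the limits differ by store, a quick sanity check against them can save you from a truncated title or a rejected listing. A minimal sketch, using the limits listed above:

```python
# Character limits for key text fields, as listed above
LIMITS = {
    ("google_play", "title"): 50,
    ("google_play", "short_description"): 80,
    ("google_play", "description"): 4000,
    ("app_store", "title"): 30,
    ("app_store", "keywords"): 100,
}

def check_field(store: str, field: str, text: str) -> str:
    """Report whether a text field fits within the store's character limit."""
    limit = LIMITS[(store, field)]
    used = len(text)
    status = "OK" if used <= limit else f"OVER by {used - limit}"
    return f"{store}/{field}: {used}/{limit} characters ({status})"

print(check_field("google_play", "title", "Job Search & Employment Tracker"))
```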

To better understand how your optimization tactics hold up, I recommend comparing your app’s text fields to competitors’.

For example, if I want to know how frequently keywords are mentioned in competitors’ app descriptions on Google Play (keywords in the description field are a ranking factor), then I’d create a table like the one below.

| Keyword | COMPETITOR 1 | COMPETITOR 2 | COMPETITOR 3 | YOUR BRAND | COMPETITOR 4 | COMPETITOR 5 |
| --- | --- | --- | --- | --- | --- | --- |
| job | 32 | 9 | 5 | 40 | 3 | 2 |
| job search | 12 | 4 | 10 | 9 | 10 | 8 |
| employment | 2 | 0 | 0 | 5 | 0 | 3 |
| job tracking | 2 | 0 | 0 | 4 | 0 | 0 |
| employment app | 7 | 2 | 0 | 4 | 2 | 1 |
| employment search | 4 | 1 | 1 | 5 | 0 | 0 |
| job tracker | 3 | 0 | 0 | 1 | 0 | 0 |
| recruiter | 2 | 0 | 0 | 1 | 0 | 0 |

Above: anonymized table of a client's Google Play competitors

From the above table, I can see that the number 1 ranking competitor (competitor 1) has more mentions of “job search” and “employment app” than I do.

Whilst there are many factors that decide the position at which an app ranks, I could deduce that I need to increase the frequency of said keywords in my Google Play app description to help improve ranking.

Be careful though: writing unnatural, keyword stuffed descriptions and titles will likely have an adverse effect.

Remember, as well as being optimized for machines, text fields like your app title and description are meant to be a compelling “advertisement” of your app for users.

I’d repeat this process for other text fields to uncover other keyword insights.
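If you're checking more than a handful of keywords, counting by hand gets tedious. Here's a small sketch of the frequency count behind a table like the one above; the descriptions have to be pasted in by hand, since Google Play offers no public API for listings:

```python
import re

# Paste each app's full Google Play description here (placeholders shown)
descriptions = {
    "COMPETITOR 1": "PASTE DESCRIPTION HERE",
    "YOUR BRAND": "PASTE DESCRIPTION HERE",
}

keywords = ["job", "job search", "employment", "employment app"]

def phrase_count(text: str, phrase: str) -> int:
    """Count whole-phrase occurrences, case-insensitively."""
    return len(re.findall(r"\b" + re.escape(phrase) + r"\b", text, flags=re.IGNORECASE))

# One row per keyword, one column per app, mirroring the table above
for keyword in keywords:
    row = {brand: phrase_count(text, keyword) for brand, text in descriptions.items()}
    print(f"{keyword}: {row}")
```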

Step 2. Optimize your store listing

Your store listing is the home of your app on Google Play. It’s where users can learn about your app, read reviews, and more. And surprisingly, not all apps take full advantage of developing an immersive store listing experience.

Whilst Google doesn't seem to state directly that fully utilizing the majority of store listing features impacts your app's discoverability, it’s fair to speculate that there may be some ranking consideration behind this.

At the very least, investing in your store listing could improve conversion and you can even run A/B tests to measure the impact of your changes.

You can improve the overall user experience and content found in the store listing by adding video trailers of your app, quality creative assets, your app’s icon (you’ll want to make your icon stand out amongst a sea of other app icons), and more.

You can read Google’s best practice guide on creating a compelling Google Play store listing to learn more.

Step 3. Invest in localization

The saying goes “think global, act local” and this is certainly true of apps.

Previous studies have revealed that 72.4% of global consumers preferred to use their native language when shopping online and that 56.2% of consumers said that the ability to obtain information in their own language is more important than price.

It makes logical sense. The better you can personalize your product for your audience, the better your results will be, so go the extra mile and localize your Google Play and App Store listings.

Google has a handy checklist for localization on Google Play and Apple has a comprehensive resource on internationalizing your app on the App Store.

Wrap up

A drop in visits of any kind causes alarm and panic. Hopefully this blog gives you a good starting point if you ever need to investigate why an app’s traffic has dropped, as well as some quick-fire opportunities to win it back.

If you’re interested in further reading on ASO, I recommend reading App Radar’s and TheTool’s guides to ASO, as well as app search discoverability tips from Google and Apple themselves.



Friday, November 22, 2019

Thanksgiving for Freedom and Energy

“More often than not, we tend to overlook our truly spectacular rise from grinding poverty to previously unimaginable abundance. And so, during this Thanksgiving holiday, let us give thanks for accountable government, market economy, and scientific progress that make a king out of each of us.”

– Marian Tupy, “Some Perspective on What We Have to Be Thankful For,” Los Angeles Times, November 26, 2014.

The celebration of a good harvest by 53 Pilgrims and 90 Indians at Plymouth Plantation in 1621 is a story for the ages. Three days of feasting were a respite from fear and poverty. Only about half of the original Pilgrims had survived since landing. Their air and water were pristine, the food organic and free-range, the energy renewable. But abject poverty and death were the norms.

Today, Thanksgiving celebrates the compound benefits of progress. But it should also be a warning against the threat of Venezuela-style, almost unimaginable retrogression.

Capitalist institutions, prominently including incentives from private property, which the early settlers stumbled upon, would institutionalize good harvests for millions of Americans. But another factor literally fueled progress: dense mineral energies to better warm, cook, and light and to run the machines of the Industrial Revolution.

Primitive to Modern Energy

In the 17th century, inanimate energy came from burning plants and woody matter. Water wheels added an energy element to primitive biomass, while human labor and animals did the rest, not unlike the century to come, except for the use of whale oil for lighting.

The richest and most stately man in the world had about the same technology as those at Plymouth for the first Thanksgiving. Louis XIV, the King of France, had more energy at his disposal than any other person in the world. The Palace of Versailles had 2,000 windows for solar, 1,250 chimneys for biomass, and hundreds of horses for transportation. There was no energy storage or portability; food sent from the kitchens to the King's dining room arrived cold because of the travel time.

This would change in the 19th century. Coal, joined by petroleum and then natural gas, the fossil fuels, did the work previously done by slaves and animals—or work that could not be done at all.

Illuminating oil was a quantum leap. “Kerosene has, in one sense, increased the length of life among the agricultural population,” an observer stated at the time. “Those who, on account of the dearness or inefficiency of whale oil, were accustomed to go to bed soon after sunset and spend almost half their time in sleep, now occupy a portion of the night in reading and other amusements; and this is more particularly true of the winter seasons.”

Erich Zimmermann, writing at mid-20th century, filled in the energy history:

When James Watt patented his steam engine in 1776 . . . it marked the beginning of a long series of inventions, including the steam turbine, the gasoline explosion engine, the Diesel engine, the gas combustion turbine, the different jets … the water turbine, and the host of other inventions which have made electricity one of the most widely used forms of energy.

“These inventions so raised the productivity of man,” the resource economist added, “that he, at last, found the leisure and surplus which made possible the systematic pursuit of scientific research.”

Amory Lovins added twenty-five years later: “As medical science, by deferring death, has allowed many more people to live on the earth, so the energy of fossil fuels, by deferring physical scarcity, has kept those people alive.”

Neo-Malthusian John Holdren dared not dispute the utility of plentiful, reliable energy. "Affordable energy in ample quantities is the lifeblood of the industrial societies and a prerequisite for the economic development of the others," stated the man who would be Obama's energy and science advisor for eight years.

The work of fossil fuels and electricity can never end. “I am ashamed at the number of things around my house and shops that are done by … human beings,” Thomas Edison stated a century ago. “Hereafter a motor must do all the chores.” Prometheus Unbound still has many chores to do, freeing time for other pursuits.

Conclusion

Most recently, Chelsea Follett has documented how our celebration has become ever more inexpensive. “Thanksgiving dinner is now the most affordable that it has been in more than a decade and is 26 percent lower than in 1986, when the AFBF began its annual survey on the matter,” she wrote.

She concludes:

It seems that, on average, each child born today eventually grows up to make resources less scarce by contributing to innovation and the global economy. And as resources become more abundant, they come down in price, allowing each of us to spend less time working to afford goods like Thanksgiving dinner. As economist Mark Perry put it, thanks to the plummeting time cost, “The average worker would earn enough money before their lunch break on just one day to be able to afford the cost of a traditional Thanksgiving meal.” Now that’s something to be thankful for.

Amen.

The post Thanksgiving for Freedom and Energy appeared first on IER.

Intersolar North America 2020 is coming – get FREE access to the Expo Hall before Dec. 31, 2019

Intersolar North America is the premier solar event that connects innovators and decision makers in the solar + energy storage industry. With a dynamic exhibition floor and robust conference program, Intersolar North America gives business-to-business professionals a platform to advance business, expand education, and drive networking. Happening February 4-6 in San Diego, California, the event immerses you in three days of face-to-face business and networking, cutting-edge technical training and workshops, and lively conference sessions that keep you up to speed on the latest trends and tech. With access to industry-leading exhibitors and innovators, you'll leave with the knowledge you need to accelerate your business.

Get your FREE Expo Hall Pass

For a limited time, Solar Energy International (SEI) is offering FREE access to the Intersolar North America 2020 expo hall. Register now and save – this special one-time offer is only available until December 31, 2019 (after that, the price is $130+).

The post Intersolar North America 2020 is coming – get FREE access to the Expo Hall before Dec. 31, 2019 appeared first on Solar Training - Solar Installer Training - Solar PV Installation Training - Solar Energy Courses - Renewable Energy Education - NABCEP - Solar Energy International (SEI).

SEI participates in first three-day Solar Industry Readiness Training pilot to Oregon Tradeswomen’s Environmental Workers Training program

From October 21-23, Oregon Solar Energy Industries Association (OSEIA), Solar Energy International (SEI), Energy Trust of Oregon, and Remote Energy presented the first three-day Solar Industry Readiness Training pilot to Oregon Tradeswomen's Environmental Workers Training program students. Clackamas Community College's Renewable Energy Program donated its new outdoor lab training facilities so students could experience both classroom and hands-on training. Sunbridge Solar volunteered to host students at an installation site, where the install team walked students through what their work days may look like and showed off their tool kits and van setup.

The pilot curriculum included the basics of grid-tied PV systems, racking and safety fundamentals, module and MLPE installation, and more. The group spent their mornings in the classroom and afternoons in the lab, with the goal of preparing them for a career in the solar industry. Eight students, many with little to no construction or electrical experience, attended the class. Carol Weis, co-founder of Remote Energy and an instructor at SEI, believes that it is important to have all-women's classes, as they introduce a 'whole new dynamic and energy' that is often lacking in a co-ed training context. Sarah Wilder, an SEI instructor with over 10 years of solar installation experience, noted that there are very few women in the solar industry, particularly in solar installation, yet it is an excellent and rewarding career path. Carol and Sarah, both passionate and knowledgeable instructors, are empowering other women to enter the solar industry through this program.

The trainees' reactions to the new course? One student said that "this has been the best training in this whole program because we're actually hands on – actually doing it. Sitting in a classroom is not for me, I'd rather be in it and know exactly how to do it – the day to day of the job." This is exactly what the pilot program is meant to accomplish. The training concluded with sharing personal solar stories, discussing Oregon's solar and job market, and pointing students to resources for connecting with Oregon's solar industry. A big thanks to our sponsors, the Spirit Mountain Community Fund and the Energy Trust of Oregon, and to all who contributed. Stay tuned as we look forward to brainstorming how to incorporate more trainings like this in Oregon and throughout the world.

Learn more about how to launch your career in the solar industry with SEI’s PV101 course either in-person or online. Check out our full training schedule: https://www.solarenergy.org/training-schedule/

The post SEI participates in first three-day Solar Industry Readiness Training pilot to Oregon Tradeswomen’s Environmental Workers Training program appeared first on Solar Training - Solar Installer Training - Solar PV Installation Training - Solar Energy Courses - Renewable Energy Education - NABCEP - Solar Energy International (SEI).

Better Content Through NLP (Natural Language Processing) - Whiteboard Friday

Posted by RuthBurrReedy

Gone are the days of optimizing content solely for search engines. For modern SEO, your content needs to please both robots and humans. But how do you know that what you're writing can check the boxes for both man and machine?

In today's Whiteboard Friday, Ruth Burr Reedy focuses on part of her recent MozCon 2019 talk and teaches us all about how Google uses NLP (natural language processing) to truly understand content, plus how you can harness that knowledge to better optimize what you write for people and bots alike.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans. I'm Ruth Burr Reedy, and I am the Vice President of Strategy at UpBuild, a boutique technical marketing agency specializing in technical SEO and advanced web analytics. I recently spoke at MozCon on a basic framework for SEO and for approaching changes to our industry, one that thinks about SEO in the light of: we are humans marketing to humans, but using a machine as the intermediary.

Those videos will be available online at some point. [Editor's note: that point is now!] But today I wanted to talk about one point from my talk that I found really interesting and that has kind of changed the way that I approach content creation, and that is the idea that writing content that is easier for Google, a robot, to understand can actually make you a better writer and help you write better content for humans. It is a win-win. 

The relationships between entities, words, and how people search

To understand how Google is currently approaching parsing content and understanding what content is about, know that Google is spending a lot of time, a lot of energy, and a lot of money on things like neural matching and natural language processing, which seek to understand, basically: when people talk, what are they talking about?

This goes along with the evolution of search to be more conversational. But there are a lot of times when someone is searching, but they don't totally know what they want, and Google still wants them to get what they want because that's how Google makes money. They are spending a lot of time trying to understand the relationships between entities and between words and how people use words to search.

The example that Danny Sullivan gave online, which I think is a really great one, is someone experiencing the soap opera effect on their TV. If you've ever seen a soap opera, you've noticed that they look kind of weird. Someone might be experiencing that effect, but not knowing what it's called, they can't Google "soap opera effect."

They might search something like, "Why does my TV look funny?" Neural matching helps Google understand that when somebody is searching "Why does my TV look funny?" one possible answer might be the soap opera effect. So they can serve up that result, and people are happy. 

Understanding salience

As we're thinking about natural language processing, a core component of natural language processing is understanding salience.

Salience, content, and entities

Salience is a one-word way to sum up the question: to what extent is this piece of content about this specific entity? At this point Google is really good at extracting entities from a piece of content. Entities are basically nouns: people, places, things, proper nouns, regular nouns.

Google is really good at pulling those out and saying, "Okay, here are all of the entities that are contained within this piece of content." Salience attempts to understand how they're related to each other, because what Google is really trying to understand when it crawls a page is: What is this page about, and is this a good example of a page about this topic?

Salience really goes into the second piece. To what extent is any given entity the topic of a piece of content? It's often amazing the degree to which a piece of content that a person has created is not actually about anything. I think we've all experienced that.

You're searching and you come to a page and you're like, "This was too vague. This was too broad. This said that it was about one thing, but it was actually about something else. I didn't find what I needed. This wasn't good information for me." As marketers, we're often on the other side of that, trying to get our clients to say what their product actually does on their website or say, "I know you think that you created a guide to Instagram for the holidays. But you actually wrote one paragraph about the holidays and then seven paragraphs about your new Instagram tool. This is not actually a blog post about Instagram for the holidays. It's a piece of content about your tool." These are the kinds of battles that we fight as marketers. 

Natural Language Processing (NLP) APIs

Fortunately, there are now a number of different APIs that you can use to experiment with natural language processing, including Google's own Cloud Natural Language API.

Is it as sophisticated as what they're using on their own stuff? Probably not. But you can test it out. Put in a piece of content and see (a) what entities Google is able to extract from it, and (b) how salient Google feels each of these entities is to the piece of content as a whole. Again, to what degree is this piece of content about this thing?

So this natural language processing API, which you can try for free (and which is actually not that expensive if you want to build a tool with it), will assign each entity that it can extract a salience score between 0 and 1, saying, "Okay, how sure are we that this piece of content is about this thing versus just containing it?"

So the higher the score, the closer you get to 1, the more confident the tool is that this piece of content is about this thing. 0.9 would be really, really good. 0.01 means the entity is there, but the tool isn't sure how closely it's related.
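If you want to try this yourself, here is a minimal sketch using the Python client for Google's Cloud Natural Language API. The sample text and the sorting are my own illustration; you'll need the google-cloud-language package installed and API credentials configured.

# Minimal sketch: send text to Google's Cloud Natural Language API and
# print each extracted entity with its type and salience score.
from google.cloud import language_v1

def entity_salience(text):
    client = language_v1.LanguageServiceClient()
    document = language_v1.Document(
        content=text, type_=language_v1.Document.Type.PLAIN_TEXT
    )
    response = client.analyze_entities(request={"document": document})
    # Each entity carries a name, a type, and a salience score in [0, 1].
    results = [
        (e.name, language_v1.Entity.Type(e.type_).name, e.salience)
        for e in response.entities
    ]
    # Most salient entities first.
    return sorted(results, key=lambda r: r[2], reverse=True)

sample = (
    "Preheat the oven to 350 degrees. Cream the butter and sugar, "
    "fold in the chocolate chips, and bake the cookies for 12 minutes."
)
for name, type_name, salience in entity_salience(sample):
    print(f"{name:25} {type_name:15} {salience:.2f}")

Run it against one of your own pages and the top of that sorted list tells you what the API thinks your content is most about.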

A delicious example of how salience and entities work


The example I have here, and this is not taken from a real piece of content — these numbers are made up, it's just an example — is if you had a chocolate chip cookie recipe, you would want something like chocolate chip cookies or chocolate chip cookie recipe to be the number one entity, the most salient entity, and you would want it to have a pretty high salience score.

You would want the tool to feel pretty confident that, yes, this piece of content is about this topic. But what you can also see is the other entities it's extracting and to what degree they are also salient to the topic. So if you have a chocolate chip cookie recipe, you would expect to see things like cookie, butter, sugar, and 350 (the temperature you heat your oven to), all of the different things that come together to make a chocolate chip cookie recipe.

But I think that it's really, really important for us as SEOs to understand that salience is the future of related keywords. We're beyond the time when to optimize for chocolate chip cookie recipe, we would also be looking for things like chocolate recipe, chocolate chips, chocolate cookie recipe, things like that. Stems, variants, TF-IDF, these are all older methodologies for understanding what a piece of content is about.

Instead, what we need to understand is where Google, using its vast body of knowledge (things like Freebase and large portions of the internet), sees these entities co-occur at such a rate that it feels reasonably confident that a piece of content about one entity, in order to be salient to that entity, would also include these other entities.

Using an expert is the best way to create content that's salient to a topic

So for chocolate chip cookie recipe, we're now also making sure we're adding things like butter, flour, and sugar. This is actually really easy to do if you actually have a chocolate chip cookie recipe to put up there. I think this is what we're going to start seeing as a content trend in SEO: the best way to create content that is salient to a topic is to have an actual expert in that topic create that content.

Somebody with deep knowledge of a topic is naturally going to include co-occurring terms, because they know how to create something that's about what it's supposed to be about. I think what we're going to start seeing is that people are going to have to start paying more for content marketing, frankly. Unfortunately, a lot of companies seem to think that content marketing is and should be cheap.

Content marketers, I feel you on that. It sucks, and it's no longer the case. We need to start investing in content and investing in experts to create that content so that they can create that deep, rich, salient content that everybody really needs. 

How can you use this API to improve your own SEO? 

One of the things that I like to do with this kind of information (and this is something that I've done for years, just not in this context) is look at a prime optimization target in general: pages that rank for a topic, but rank on page 2.

What this often means is that Google understands that that keyword is a topic of the page, but it doesn't necessarily understand that it is a good piece of content on that topic, that the page is actually solely about that content, that it's a good resource. In other words, the signal is there, but it's weak.

What you can do is take content that ranks, but not well, run it through this natural language API or another natural language processing tool, and look at how the entities are extracted and how Google is determining that they're related to each other. Sometimes it might be that you need to do some disambiguation. So in this example, you'll notice that while chocolate cookies is classified as a "work of art" (and I agree), cookie here is actually classified as "other."

This is because cookie means more than one thing. There's cookies, the baked good, but then there's also cookies, the packet of data. Both of those are legitimate uses of the word "cookie." Words have multiple meanings. If you notice that Google, that this natural language processing API is having trouble correctly classifying your entities, that's a good time to go in and do some disambiguation.

Make sure that the terms surrounding that term are clearly saying, "No, I mean the baked good, not the software piece of data." That's a really great way to kind of bump up your salience. Look at whether or not you have a strong salience score for your primary entity. You'd be amazed at how many pieces of content you can plug into this tool where the top, most salient entity is still only like a 0.01 or a 0.14.
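As a rough illustration of that kind of audit, here is a sketch building on the entity_salience function above. The 0.2 threshold and the expected entity type are assumptions I'm making for the example, not guidance from Google.

# Rough sketch: flag a page whose primary entity has a weak salience
# score or an unexpected entity type. MIN_SALIENCE and expected_type
# are illustrative assumptions, not values from Google.
MIN_SALIENCE = 0.2

def audit_page(text, target_entity, expected_type="CONSUMER_GOOD"):
    for name, type_name, salience in entity_salience(text):
        if target_entity.lower() in name.lower():
            if salience < MIN_SALIENCE:
                print(f"Weak salience ({salience:.2f}) for '{name}': "
                      "the page may not read as being about this topic.")
            if type_name != expected_type:
                print(f"'{name}' classified as {type_name}, not "
                      f"{expected_type}: consider disambiguating nearby copy.")
            return
    print(f"'{target_entity}' was not extracted at all.")

If the audit flags your cookie page's main entity as "other" rather than a food-related type, that's your cue to do the disambiguation work described above.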

A lot of times the API is like "I think this is what it's about," but it's not sure. This is a great time to go in and bump up that content, make it more robust, and look at ways that you can make those entities easier to both extract and to relate to each other. This brings me to my second point, which is my new favorite thing in the world.

Writing for humans and writing for machines: you can now do both at the same time. You no longer have to choose, and you really haven't had to in a long time; the idea that you might keyword stuff or otherwise create content for Google that your users might not see or care about is way, way, way over.

Now you can create content for Google that also is better for users, because the tenets of machine readability and human readability are moving closer and closer together. 

Tips for writing for human and machine readability:

Reduce semantic distances!

What I've done here is some research not on natural language processing, but on writing for human readability, that is, advice from writers and writing experts on how to write better, clearer, easier-to-read, easier-to-understand content. Then I pulled out the pieces of advice that also work as advice for writing for natural language processing. So natural language processing, again, is the process by which Google (or really anything that might be processing language) tries to understand how entities are related to each other within a given body of content.

Short, simple sentences

Short, simple sentences. Write simply. Don't use a lot of flowery language. Short sentences and try to keep it to one idea per sentence. 

One idea per sentence

If your sentences are running on, if you've got a lot of different clauses, if you're using a lot of pronouns and it's becoming confusing what you're talking about, that's not great for readers.

It also makes it harder for machines to parse your content. 

Connect questions to answers

Then closely connecting questions to answers. So don't say, "What is the best temperature to bake cookies? Well, let me tell you a story about my grandmother and my childhood," and 500 words later here's the answer. Connect questions to answers. 

What all three of those readability tips have in common is they boil down to reducing the semantic distance between entities.

If you want natural language processing to understand that two entities in your content are closely related, move them closer together in the sentence. Move the words closer together. Reduce the clutter, reduce the fluff, reduce the number of semantic hops that a robot might have to take between one entity and another to understand the relationship, and you've now created content that is more readable because it's shorter and easier to skim, but also easier for a robot to parse and understand.
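To make "semantic distance" concrete in a toy way (this is purely my own simplification, not how Google's NLP actually measures relationships), you could count the words separating two entity mentions in different drafts of a sentence:

# Toy illustration only: count the words between two entity mentions.
# Real NLP systems use parse trees and learned models, not word counts.
def token_distance(text, a, b):
    words = [w.strip(".,!?").lower() for w in text.split()]
    return abs(words.index(a) - words.index(b)) - 1

fluffy = ("Cookies, which my grandmother always insisted on making "
          "from scratch every winter, bake best at 350 degrees.")
tight = "Bake cookies at 350 degrees."

print(token_distance(fluffy, "cookies", "350"))  # 14
print(token_distance(tight, "cookies", "350"))   # 1

The tighter draft keeps the question ("cookies") and the answer ("350") close together, which is the same property that makes it easier for a human to skim.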

Be specific first, then explain nuance

Going back to the example of "What is the best temperature to bake chocolate chip cookies at?" Now the real answer to what the best temperature to bake chocolate chip cookies is? It depends. Hello. Hi, I'm an SEO, and I just answered a question with "it depends." It does depend.

That is true, and that is real, but it is not a good answer. It is also not the kind of thing that a robot could extract and reproduce in, for example, voice search or a featured snippet. If somebody says, "Okay, Google, what is a good temperature to bake cookies at?" and Google says, "It depends," that helps nobody even though it's true. So in order to write for both machine and human readability, be specific first and then you can explain nuance.

Then you can go into the details. So a better, just as correct answer to "What is the temperature to bake chocolate chip cookies?" is the best temperature to bake chocolate chip cookies is usually between 325 and 425 degrees, depending on your altitude and how crisp you like your cookie. That is just as true as it depends and, in fact, means the same thing as it depends, but it's a lot more specific.

It's a lot more precise. It uses real numbers. It provides a real answer. I've shortened the distance between the question and the answer. I didn't say it depends first. I said it depends at the end. That's the kind of thing that you can do to improve readability and understanding for both humans and machines.

Get to the point (don't bury the lede)

Get to the point. Don't bury the lede. All of you journalists who tried to become content marketers, and then everybody in content marketing said, "Oh, you need to wait till the end to get to your point or they won't read the whole thing," and you were like, "Don't bury the lede," you are correct. For those of you who aren't familiar with journalism speak, not burying the lede basically means getting to the point upfront, at the top.

Include all the information that somebody would really need to get from that piece of content. If they don't read anything else, they read that one paragraph and they've gotten the gist. Then people who want to go deep can go deep. That's how people actually like to consume content, and surprisingly it doesn't mean they won't read the content. It just means they don't have to read it if they don't have time, if they need a quick answer.

The same is true with machines. Get to the point upfront. Make it clear right away what the primary entity, the primary topic, the primary focus of your content is and then get into the details. You'll have a much better structured piece of content that's easier to parse on all sides. 

Avoid jargon and "marketing speak"

Avoid jargon. Avoid marketing speak. Not only is it terrible, it's very hard to understand. You see this a lot. I'm going back again to the example of getting your clients to say what their products do. If you work with a lot of B2B companies, you will often run into this. Yes, but what does it do? "It provides solutions to streamline the workflow and blah, blah." Okay, but what does it do? This is the kind of thing that can be really, really hard for companies to get out of their own heads about, but it's so important for users and for machines.

Avoid jargon. Avoid marketing speak. Not to get too tautological, but the more esoteric a word is, the less commonly it's used. That's actually what esoteric means. What that means is the less commonly a word is used, the less likely it is that Google is going to understand its semantic relationships to other entities.

Keep it simple. Be specific. Say what you mean. Wipe out all of the jargon. By wiping out jargon and kind of marketing speak and kind of the fluff that can happen in your content, you're also, once again, reducing the semantic distances between entities, making them easier to parse. 

Organize your information to match the user journey

Organize it and map it out to the user journey. Think about the information somebody might need and the order in which they might need it. 

Break out subtopics with headings

Then break it out with subheadings. This is like very, very basic writing advice, and yet you all aren't doing it. So if you're not going to do it for your users, do it for machines. 

Format lists with bullets or numbers

You can also really impact skimmability for users by breaking out lists with bullets or numbers.

The great thing about that is that breaking out a list with bullets or numbers also makes information easier for a robot to parse and extract. If a lot of these tips seem like they're the same tips that you would use to get featured snippets, they are, because featured snippets are actually a pretty good indicator that you're creating content that a robot can find, parse, understand, and extract, and that's what you want.

So if you're targeting featured snippets, you're probably already doing a lot of these things, good job. 

Grammar and spelling count!

The last thing, which I shouldn't have to say, but I'm going to say is that grammar and spelling and punctuation and things like that absolutely do count. They count to users. They don't count to all users, but they count to users. They also count to search engines.

Things like grammar, spelling, and punctuation are very, very easy signals for a machine to find and parse. Google has been specific, in things like the "Quality Rater Guidelines," that a well-written, well-structured, well-spelled, grammatically correct document is a sign of authoritativeness. I'm not saying that having a greatly spelled document is going to mean that you immediately rocket to the top of the results.

I am saying that if you're not on that stuff, it's probably going to hurt you. So take the time to make sure everything is nice and tidy. You can use vernacular English. You don't have to be perfect "AP Style Guide" all the time. But make sure that you are formatting things properly from a grammatical standpoint as well as a technical standpoint. What I love about all of this, this is just good writing.

This is good writing. It's easy to understand. It's easy to parse. It's still so hard, especially in the marketing world, to get out of that world of jargon, to get to the point, to stop writing 2,000 words because we think we need 2,000 words, to really think about are we creating content that's about what we think it's about.

Use these tools to understand how readable, parsable, and understandable your content is

So my hope for the SEO world and for you is that you can use these tools not just to think about how to dial in the perfect keyword density or whatever to get an almost perfect score on the salience in the natural language processing API. What I'm hoping is that you will use these tools to help yourself understand how readable, how parsable, and how understandable your content is, how much your content is about what you say it's about and what you think it's about so you can create better stuff for users.

It makes the internet a better place, and it will probably make you some money as well. So these are my thoughts. I'd love to hear in the comments if you're using the natural language processing API now, if you've built a tool with it, if you want to build a tool with it, what do you think about this, how do you use this, how has it gone. Tell me all about it. Holla atcha girl.

Have a great Friday.

Video transcription by Speechpad.com

