Thursday, August 31, 2017

What's Your AMP Traffic Really Doing? Set Up Reporting in 10 Minutes

Posted by Jeremy_Gottlieb

The other day, my colleague Tom Capper wrote a post about getting more traffic when you can’t rank any higher. I was really pleased that he wrote it, because it tackles a challenge I think about all the time. As SEOs, our hands are tied: we’re often not able to make product-level decisions that could create new markets, and we’re not Google’s algorithms — we can’t force a particular page to rank higher. What’s an SEO to do?

What if we shifted focus from transactional queries (for e-commerce, B2C, or B2B sites) toward the informational queries that are one, two, three, or even four or more interactions away from actually yielding a conversion? These types of queries are often quite conversational (e.g. "what are the best bodyweight workouts?") and could very well lead to conversions down the road if you’re trying to sell something (like fitness-related products or supplements).

If we shift our focus to queries like the question I just posed, could we potentially enter more niches for search and open up more traffic? I’d hypothesize yes — and for some, driving this additional traffic is all one needs; whatever happens with that traffic is irrelevant. Personally, I’d rather drive qualified, relevant traffic to a client and then figure out how we can monetize that traffic down the road.

To accomplish this, over the past year I’ve been thinking a lot about Accelerated Mobile Pages (AMP).


What are Accelerated Mobile Pages?

According to Google,

"The AMP Project is an open-source initiative aiming to make the web better for all. The project enables the creation of websites and ads that are consistently fast, beautiful, and high-performing across devices and distribution platforms."

What this really means is that Google wants to make the web faster, and probably doesn’t trust the majority of sites to adequately speed up their pages, or to do so on a reasonable timeframe. Thus, AMP was created to allow pages to load extremely fast (by cutting out the fat from your original source code) and provide an awesome user experience. Users can follow some basic instructions, use WordPress or other plugins, and in practically no time have mobile variants of their web content that load super fast.

Why use AMP?

While AMP is not yet a ranking factor (and possibly never will be), the fact that it loads fast certainly helps in the eyes of almighty Google and can contribute to higher rankings and clicks.

Let’s take a look at the query "Raekwon McMillan," the Miami Dolphins’ second-round pick in the 2017 NFL Draft out of Ohio State University:

Screenshot of mobile SERP for query "Raekwon McMillan"

Notice how two of these cards on mobile contain a little lightning bolt and the word "AMP"? AMP results are becoming more and more prevalent in the SERPs. It’s reasonable to think that while the majority of people who use Google are not currently familiar with AMP, over time and through experience they will realize that pages with that little icon load much faster than regular web pages, and will gravitate toward AMP results through a kind of subconscious Pavlovian training.

Should I use AMP?

There are rarely any absolutes in this world, and this is no exception. Only you will know, based upon your particular needs at this time. AMP is typically used by news publishers like the New York Times, Washington Post, Fox News, and many others, but it’s important to note that it's not limited to this type of entity. While there is an AMP news carousel that frequently appears on mobile and is almost exclusively the domain of large publishing sites, AMP results are increasingly appearing in the regular results, like with the Raekwon McMillan example.

I'm a fan of leveraging blog content on AMP to generate as many eyeballs as possible on our pages, but I'm still a bit leery about putting product pages on AMP (though this is now possible). My end goal is to drive traffic and brand familiarity through the blog content, and then ultimately drive more sales as people are either retargeted via paid channels or come back from other sources, direct, organic, or otherwise, to actually complete the purchase. If your blog has strong, authoritative content, deploying AMP could potentially be a great way to generate more visibility and clicks for your site.

I must point out, however, that AMP doesn’t come without potential drawbacks. There are strict guidelines around what you can and can’t do with it, such as not having email popups, possible reduction in ad revenue, analytics complications, and requiring maintenance of a new set of pages. If you do decide that the potential gain in organic traffic is worth the tradeoffs, we can get into how to best measure the success of AMP for your site.


Now you have AMP traffic — so what?

If your goal is to drive more organic traffic, you need to be prepared for the questions that will come if that traffic does not yield revenue in Google Analytics. First, we need to keep in mind how GA attributes conversions: its standard reports use a last non-direct click model, while the Multi-Channel Funnel reports default to last interaction, and either can be compared against other models. This means that if you have a visitor who searches something organically, enters via the blog, and doesn't purchase anything, yet 3 days later comes back via direct and purchases a product, a last-interaction model would assign no credit to the organic visit, giving all of the conversion credit to the direct visit.

But this is misleading. Would that conversion have happened if not for the first visit from organic search? Probably not.

By going into the Conversions section of GA and clicking on Attribution > Model Comparison Tool, you’ll be able to see a side-by-side comparison of different conversion models, such as:

  • First touch (all credit goes to first point-of-entry to site)
  • Last touch (all credit goes to the point-of-entry of the session where the conversion took place)
  • Position-based (credit is primarily shared between the first and last points-of-entry, with less credit being shared amongst the intermediary steps)

There are also a few others, but I find them to be less interesting. For more information, read here. You can also click on Multi-Channel Funnels > Assisted Conversions to see, by channel, the number of conversions that each channel assisted along the way without being the final converting channel.

AMP tracking complications

Somewhat surprisingly, tracking from AMP is not as easy or as logical as one might expect. To begin with, AMP uses a separate analytics snippet from your standard GA tracking code, so if you already have GA installed on your site and you decide to roll out AMP, you will need to set up the specific AMP analytics. (For more information on AMP analytics, please read Accelerated Mobile Pages Via Google Tag Manager and Adding Analytics to Your AMP Pages).
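For reference, the AMP-specific snippet looks quite different from the usual analytics.js tag: it's an `amp-analytics` component configured with JSON. A minimal sketch (swap your own property ID in for the placeholder UA-XXXXX-Y):

```html
<!-- In the <head>: load the amp-analytics component -->
<script async custom-element="amp-analytics"
  src="https://cdn.ampproject.org/v0/amp-analytics-0.1.js"></script>

<!-- In the <body>: configure Google Analytics via JSON -->
<amp-analytics type="googleanalytics">
<script type="application/json">
{
  "vars": {
    "account": "UA-XXXXX-Y"
  },
  "triggers": {
    "trackPageview": {
      "on": "visible",
      "request": "pageview"
    }
  }
}
</script>
</amp-analytics>
```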

In a nutshell, the client ID (which tracks a specific user’s engagement with a site over time in GA) is not shared by default between AMP analytics and the regular tracking code, though there are some hack-y ways to get around this (WARNING: this gets very technically in-depth). I think there are two very important questions when it comes to AMP measurement:
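One of the less hack-y options Google documents (at the time of writing) is the AMP Client ID API, which lets Google share a client ID between AMP pages served from Google's cache and your own domain. You opt in on both sides; a sketch, with UA-XXXXX-Y as a placeholder property ID:

```html
<!-- On the AMP page, before any amp-analytics config:
     opt in to the AMP Client ID API -->
<meta name="amp-google-client-id-api" content="googleanalytics">

<!-- On regular (non-AMP) pages: opt in when creating
     the analytics.js tracker -->
<script>
  ga('create', 'UA-XXXXX-Y', 'auto', {'useAmpClientId': true});
</script>
```

Even with this in place, test carefully: sessions that started before the opt-in won't be stitched retroactively.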

  1. How much revenue are these pages responsible for?
  2. How much engagement are we driving from AMP pages?

In the GA property that receives your AMP analytics data, it's simple to see how many sessions there are and what the bounce and exit rates are. From my own experience, bounce and exit rates are usually pretty high (depending on UX), but the number of sessions increases overall. So, if we’re driving more and more users, how can we track and improve engagement beyond the standard bounce and exit rates? Where do we look?

How to measure real value from AMP in Google Analytics

Acquisition > Referrals

I propose looking into our standard GA property and navigating to our referring sources within Acquisition, where we’ll select the AMP source, highlighted below.

Once we click there, we’ll see the full referring URLs, the number of sessions each URL drove to the non-AMP version of the site, the number of transactions associated with each URL, the amount of revenue associated per URL, and more.

Important note here: These sessions are not the total number of sessions on each AMP page; rather, these are the number of sessions that originated on an AMP URL and were referred to the non-AMP property.
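If you'd rather pull this report programmatically than through the UI, the same numbers are available from the Analytics Reporting API v4: sessions, transactions, and revenue by full referrer, filtered down to AMP sources. A sketch of the request body (the view ID and the `ampproject.org` source filter are assumptions; adjust the filter if your AMP pages are served from your own subdomain):

```python
def build_amp_referral_request(view_id, start_date="30daysAgo", end_date="today"):
    """Build an Analytics Reporting API v4 request body for sessions,
    transactions, and revenue referred by AMP URLs."""
    return {
        "reportRequests": [{
            "viewId": view_id,
            "dateRanges": [{"startDate": start_date, "endDate": end_date}],
            "dimensions": [{"name": "ga:fullReferrer"}],
            "metrics": [
                {"expression": "ga:sessions"},
                {"expression": "ga:transactions"},
                {"expression": "ga:transactionRevenue"},
            ],
            # "=@" means "contains": restrict to sessions referred
            # from the Google AMP cache
            "filtersExpression": "ga:source=@ampproject.org",
            "orderBys": [{
                "fieldName": "ga:sessions",
                "sortOrder": "DESCENDING",
            }],
        }]
    }

# The body is then passed to an authorized service object, e.g.:
# analytics.reports().batchGet(body=build_amp_referral_request("123456")).execute()
```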

Why is this particular report interesting?

  1. It allows us to see which specific AMP URLs are referring the most traffic to the non-AMP version of the site
  2. It allows us to see how many transactions and how much revenue comes from a session initiated by a specific AMP URL
    1. From here, we can analyze why certain pages refer more traffic or end up with more conversions, then apply any findings to other AMP URLs

Why is this particular report incomplete?

  • It only shows us conversions and revenue that happened during one session (last-touch attribution)
    • It is very likely that most of your blog traffic will be higher-funnel and informational, not transactional, so conversions are more likely to happen at later touch points than the first one

Conversions > Multi-Channel Funnels > Assisted Conversions

If we really want to have the best understanding of how much revenue and conversions happen from visits to AMP URLs, we need to analyze the assisted conversions report. While you can certainly find value from analyzing the model comparison tool (also found within the conversions tab of GA), if we want to answer the question, "How many conversions and how much revenue are we driving from AMP URLs?", it’s best answered in the Assisted Conversions section.

One of the first things that we’ll need to do is create a custom channel grouping within the Assisted Conversions section of Conversions.

In here, we need to:

  1. Click "Channel Groupings," select "Create a custom channel grouping"
  2. Name the channel "AMP"
  3. Set a rule where the source contains your AMP property (type “amp” into the form and it will begin to auto-populate; just select the one you need)
  4. Click "Save"

Why is this particular report interesting?

  1. We’re able to see how many assisted as well as last click/direct conversions there were by channel
  2. We’re able to change the look-back window on a conversion to anywhere from 1–90 days to see how it affects the sales cycle
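The assisted-conversions numbers can likewise be pulled via the Multi-Channel Funnels Reporting API (v3), so you can track the AMP channel's assisted vs. last-interaction conversions over time without clicking through the UI each week. A sketch of the query parameters (the view ID, date range, and `ampproject.org` source filter are assumptions):

```python
def build_amp_assisted_conversions_query(view_id,
                                         start_date="2017-08-01",
                                         end_date="2017-08-31"):
    """Build Multi-Channel Funnels Reporting API (v3) parameters for
    assisted vs. last-interaction conversions from AMP sources."""
    return {
        "ids": "ga:%s" % view_id,
        "start_date": start_date,
        "end_date": end_date,
        "metrics": ("mcf:assistedConversions,mcf:assistedValue,"
                    "mcf:lastInteractionConversions"),
        "dimensions": "mcf:source",
        # Keep only sources that came through the Google AMP cache
        "filters": "mcf:source=@ampproject.org",
    }

# Passed to an authorized service object, e.g.:
# service.data().mcf().get(**build_amp_assisted_conversions_query("123456")).execute()
```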

Why is this particular report incomplete?

  • We’re unable to see which particular pages are most responsible for driving traffic, revenue, and conversions

Conclusion

As both of these reports are incomplete on their own, I recommend that any digital marketer measuring the effect of AMP URLs use the two reports in conjunction for their own reporting. Doing so will provide the value of:

  1. Informing us which AMP URLs refer the most traffic to our non-AMP pages, providing us a jumping-off point for analysis of what type of content and CTAs are most effective for moving visitors from AMP deeper into the site
  2. Informing us how many conversions happen with different attribution models

It’s possible that a quick glance at your reports will show very low conversion numbers, especially when compared with other channels. That does not necessarily mean AMP should be abandoned; rather, those pages should receive further investment and optimization to drive deeper engagement in the same session and retargeting for future engagement. Google actually does allow you to set up your AMP pages to retarget with Google products so users can see products related to the content they visited.

You can also add in email capture forms to your AMP URLs to re-engage with people at a later time, which is useful because AMP does not currently allow for interstitials or popups to capture a user’s information.

What do you do next with the information collected?

  1. Identify why certain pages refer more traffic than others to non-AMP URLs. Is there a common factor amongst pages that refer more traffic and others that don’t?
  2. Identify why certain pages are responsible for more revenue than other pages. Do all of your AMP pages contain buttons or designated CTAs?
  3. Can you possibly capture more emails? What would need to be done?

Ultimately, this reporting is just the first step in benchmarking your data. From here you can pull insights, make recommendations, and monitor how your KPIs progress. Many people have been concerned or confused as to whether AMP is valuable or the right thing for them. It may or may not be, but if you’re not measuring it effectively, there’s no way to really know. There's a strong likelihood that AMP will only increase in prominence over the coming months, so if you’re not sure how to attribute that traffic and revenue, perhaps this can help get you set up for continued success.

Did I miss anything? How do you measure the success (or failure) of your AMP URLs? Did I miss any KPIs that could be potentially more useful for your organization? Please let me know in the comments below.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

SEI Family Mourns the Tragic Death of Instructor Mike Sullivan

We are deeply saddened and stunned by the unexpected death of SEI Instructor Mike Sullivan. Since 2009, Mike taught over 40 classes and over 1,000 students through both online and in-person trainings. Mike died on Monday in a climbing accident in the Steeple Peak area of the Wind River Range in Wyoming.

Mike was a living legend at SEI because of how fully he lived his 54 years of life. When he wasn’t teaching solar classes, he travelled around the world working on solar projects for people who did not have electricity. Or he was climbing and adventuring in some of the most remote and beautiful places on Earth and then sharing his amazing photos with us on social media.

When he introduced himself in classes, he said he was from “Tacoma, the truck not Washington.” We were excited when he opened a PO Box and got a storage unit in Paonia, SEI’s headquarters, because it meant we were his homebase. He was an engineer by training, and even worked for a few years behind the desk before he found solar and his life’s passion. Meticulous and singular in his focus, it didn’t surprise us when we recently learned that he had read the entire dictionary by the time he was 5 years old.

There are so many stories flowing in from instructors and other solar friends: stripping down to his underwear to cross a river in Bolivia while carrying PV equipment, strapping a hot water heater to his back and skiing up a steep, steep mountain to deliver it to a friend's off-grid cabin, waking each morning at 4am before teaching a class so that he could find a tree to climb or a trail to hike. His motto was work 24/7 (meaning 24 hours a week for 7 months a year).

“We met on a rainy, rest day in a hostel in El Chalten, Argentina in the heart of Patagonia in 2014. I had never met someone who so seamlessly wove their passions and their profession into such an adventurous and colorful lifestyle. (He was teaching SEI online courses in between climbing.) It quickly became evident that he was more than a climbing mentor; he was the impetus I needed to pursue my passion for renewable energy systems as a career. Mike redefined possibility. He invited me to assist with a solar project in Ladakh, India last summer, where I witnessed how solar technology can be used as an incredible tool to positively impact a community and its enormous potential to further influence the course of the world.

Mike’s gentle ways and vibrant enthusiasm were infectious, and his spirit full of compassion.

I always looked forward to our next adventure as he gravitated to the most beautiful places above and around. He’ll continue to inspire with his sparky twinkle of irradiance and that big alleycat grin.”

Toby Swimmer, OnSite Energy in Bozeman, Montana

“Mike spent a lot of time working with technicians in Haiti in 2011-2012.  Each time I go to Haiti and see individual technicians they go through and diligently ask about the different people we know in common… How is Christopher? How is Brad? How is Jeff?… but when they get to Mike, each person gets this big goofy smile and asks “How is Mike? You know- he is craazzy!!” and laughs…

Recently, I asked why everyone thinks Mike was so crazy… and everyone had some story about turning their back to do something… and then when going back to find Mike… hunting around, calling his name… and then someone finding him up in the tallest part of a tree. And of course, there is no purpose of being in the tree… the tree is not a fruiting tree, etc…and this part is explained in detail as they are trying to make sense as to why he was up there. Mike was just hanging out there. Making people laugh without knowing the language. And there is always a lot of laughter in recounting these stories. 

~Carol Weis, SEI Instructor and Co-Founder of Remote Energy

Mike, we miss you. Thank you for always reminding us to go live your adventure, live your passions. We will send out more details as the memorial is finalized.

 

The post SEI Family Mourns the Tragic Death of Instructor Mike Sullivan appeared first on Solar Training - Solar Installer Training - Solar PV Installation Training - Solar Energy Courses - Renewable Energy Education - NABCEP - Solar Energy International (SEI).

Wednesday, August 30, 2017

Building a Community of Advocates Through Smart Content

Posted by Michelle_LeBlanc

From gentle criticism to full-on trolls, every brand social media page or community sometimes faces pushback. Maybe you’ve seen it happen. Perhaps you’ve even laughed along as a corporation makes a condescending misstep or a local business publishes a glaring typo. It’s the type of thing that keeps social media and community managers up at night. Will I be by my phone to respond if someone needs customer service help? Will I know what to write if our brand comes under fire? Do we have a plan for dealing with this?

Advocates are a brand’s best friend

In my years of experience developing communities and creating social media content, I’ve certainly been there. I won’t try to sell you a magic elixir that makes that anxiety go away, but I've witnessed a phenomenon that can take the pressure off. Before you can even begin to frame a response as the brand, someone comes out of the woodwork and does it for you. Defending, opening up a conversation, or perhaps deflecting with humor, these individuals bring an authenticity to the response that no brand could hope to capture. They are true advocates, and they are perhaps the most valuable assets a company could have.

But how do you get them?

Having strong brand advocates can help insulate your brand from crisis, lead to referring links and positive media coverage, AND help you create sustainable, authentic content for your brand. In this blog post, I’ll explore a few case studies and strategies for developing these advocates, building user-generated content programs around them, and turning negative community perceptions into open dialogue.

Case study 1: Employee advocates can counter negative perceptions

To start, let’s talk about negative community perceptions. Almost every company deals with this to one degree or another.

In the trucking industry, companies deal with negative perceptions not just of their individual company, but also of the industry as a whole. You may not be aware of this, but our country needs approximately 3.5 million truck drivers to continue shipping daily supplies like food, medicine, deals from Amazon, and everything else you’ve come to expect in your local stores and on your doorstep. The industry regularly struggles to find enough drivers. Older drivers are retiring from the field, while younger individuals may be put off by a job that requires weeks away from home. Drivers that are committed to the industry may change jobs frequently, chasing the next hiring bonus or better pay rate.

How does a company counter these industry-wide challenges and also stand out as an employer from every other firm in the field?

Using video content, Facebook groups, and podcasts to create employee advocates

For one such company, we looked to current employees to become brand advocates in marketing materials and on social media. The HR and internal communications team had identified areas of potential for recruitment — e.g. separating military, women — and we worked with them to identify individuals that represented these niche characteristics, as well as the values that the company wanted to align themselves with: safety, long-term tenure with the company, affinity for the profession, etc. We then looked for opportunities to tell these individuals' stories in a way that was authentic, reflected current organic social media trends, and provided opportunities for dialogue.

In one instance, we developed a GoPro-shot, vlog-style video program around two female drivers that featured real-life stories and advice from the road. By working behind the scenes with these drivers, we were able to coach them into being role models for our brand advocate program, modeling company values in media/PR coverage and at live company events.

One driver participated in an industry-media live video chat where she took questions from the audience, and later she participated in a Facebook Q&A on behalf of the brand as well. It was our most well-attended and most engaged Q&A to date. Other existing and potential drivers saw these individuals becoming the heroes of the brand’s stories and, feeling welcomed to the dialogue by one of their own, became more engaged with other marketing activities as a result. These activities included:

  • A monthly call-in/podcast show where drivers could ask questions directly of senior management. We found that once a driver had participated in this forum, they were much more likely to stay with the company — with a 90% retention rate!
  • A private Facebook group where very vocal and very socially active employees could have a direct line to the company’s driver advocate to express opinions and ask questions. In addition to giving these individuals a dedicated space to communicate, this often helped us identify trends and issues before they became larger problems.
  • A contest to nominate military veterans within the company to become a brand spokesperson in charge of driving a military-themed honorary truck. By allowing anyone to submit a nomination for a driver, this contest helped us discover and engage members of the audience that were perhaps less likely to put themselves forward out of modesty or lack of esteem for their own accomplishments. We also grew our email list, gained valuable insights about the individuals involved, and were able to better communicate with more of this “lurker” group.

By combining these social media activities with traditional PR pitching around the same themes, we continued to grow brand awareness as a whole and build an array of positive links back to the company.

When it comes to brand advocates, sometimes existing employees simply need to be invited in and engaged in a way that appeals to their own intrinsic motivations — perhaps a sense of belonging or achievement. For many employee-based audiences, social media engagement with company news or industry trends is already happening and simply needs to be harnessed and directed by the brand for better effect.

But what about when it comes to individuals that have no financial motivation to promote a brand? At the other end of the brand advocate spectrum from employees are those who affiliate themselves with a cause. They may donate money or volunteer for a specific organization, but when it comes down to it, they don’t have inherent loyalty to one group and can easily go from engaged to enraged.

Case study 2: UGC can turn volunteers into advocates

One nonprofit client that we have the privilege of working with dealt with this issue on a regular basis. Beyond misunderstandings about their funding sources or operations, they occasionally faced backlash about their core mission on social media. After all, for any nonprofit or cause out there, it's easy to point to two or ten others that may be seen as "more worthy," depending on your views. In addition, the nature of their cause tended to attract a lot of attention in the holiday giving period, with times of low engagement through the rest of the year.

Crowdsourcing user-generated content for better engagement

To counter this and better engage the audience year-round, we again looked for opportunities to put individual faces and stories at the forefront of marketing materials.

In this case, we began crowdsourcing user-generated content through monthly contesting programs during the organization's "off" months. Photos submitted during the contests could be used as individual posts on social media or remixed across videos, blog posts, or as a starting point for further conversation and promotion development with the individuals. As Facebook was the primary promotion point for these contests, they attracted those who were already highly engaged with the organization and its page. During the initial two-month program, the Facebook page gained 16,660 new fans with no associated paid promotion, accounting for 55% of total page Likes in the first half of 2016.

Perhaps even more importantly, the organization was able to save on internal labor in responding to complaints or negative commentary on posts as even more individuals began adding their own positive comments. The organization’s community manager was able to institute a policy of waiting to respond after any negative post, allowing the brand advocates time to chime in with a more authentic, volunteer-driven voice.

By inviting their most passionate supporters more deeply into the fold and giving them the space and trust to communicate, the organization may have lost some measure of control over the details of the message, but they gained support and understanding on a deeper level. These individuals not only influenced others within the social media pages of the organization, but also frequently shared content and tagged friends, acting as influencers and bringing others into the fold.

How you can make it work for your audience

As you can see, regardless of industry, building a brand advocate program often starts with identifying your most passionate supporters and finding a way to appeal to their existing habits, interests, and motivations — then building content programs that put those goals at the forefront. Marketing campaigns featuring paid influencers can be fun and can certainly achieve rapid awareness and reach, but they will never be able to counter the lasting value of an authentic advocate, particularly when it comes to countering criticism or improving the perceived status of your brand or industry.

To get started, you can follow a few quick tips:

  • Understand your existing community.
    • Take a long look at your active social audience and try to understand who those people are: Employees? Customers?
    • Ask yourself what motivates them to participate in dialogue and how can you provide more of that.
  • Work behind the scenes.
    • Send private messages and emails, or pick up the phone and speak with a few audience members.
    • Getting a few one-on-one insights can be incredibly helpful in content planning and inspiring your strategy.
    • By reaching out individually, you really make people feel special. That’s a great step towards earning their advocacy.
  • Think: Where else can I use this?
    • Your advocates and their contributions are valuable. Make sure you take advantage of that value!
    • Reuse content in multiple formats or invite them to participate in new ways.
    • Someone who provides a testimonial might be able to act as a source for your PR team, as well.


The Wrong Way to Save Nuclear Power

Earlier this month, Jeremy Carl and David Fedor of Stanford University’s Hoover Institution released a book showcasing the dire state of America’s nuclear energy industry. Keeping the Lights on at America’s Nuclear Power Plants highlights the problems facing the beleaguered power source and offers a range of proposals to save America’s nuclear reactors. And while some of their proposals would make meaningful headway toward transforming nuclear power into a viable power source, others would merely make the nuclear energy industry dependent on government largesse and raise costs on consumers in the process.

As I discussed in my previous article, the authors support reforming the federal government’s expensive licensing restrictions which make it harder for newer and cheaper reactors to reach the market. In particular, they call for ending the Nuclear Regulatory Commission’s requirement that nuclear developers complete a decade-long application before any approvals are made. In its place, they support shifting the NRC’s licensing process towards a “test-then-license” system in which the commission would grant companies faster step-by-step approval as they wade through the process.

Streamlining the NRC’s process would undoubtedly make it easier for nuclear developers to bring their reactors online while lowering costs for consumers. Unfortunately, Carl and Fedor’s other recommendations appear to be geared less towards delivering cheaper energy to consumers and more towards erecting artificial protections for the nuclear industry. In their section on policy and regulatory options, the authors encourage state government agencies to use their monopoly utility regimes to force residents to use nuclear power:

“State regulatory commissions could choose to encourage nuclear power generation by developing various mechanisms to direct more ratepayer money towards it. In most regulated states with monopoly utilities, such bodies already have broad discretion to do so,” the authors said.

In addition to regulatory preferences, Carl and Fedor also call on the federal government to explicitly subsidize nuclear power plants. Specifically, they suggest the federal government establish public-private “partnerships” with nuclear companies and use taxpayer dollars to underwrite long-term contracts with utilities.

Experience shows taxpayer subsidies don’t spur development of new nuclear plants. Beginning with the Price-Anderson Nuclear Industries Indemnity Act of 1957, supporting nuclear power became a priority for government planners. The act requires every nuclear power plant to purchase $325 million in commercial liability insurance, as well as to contribute to an insurance pool covering serious accidents and damage. If the costs of a nuclear accident ever exceed the value of these insurance funds, Price-Anderson obligates taxpayers to pay the remaining costs of cleanup. This artificially reduces the costs nuclear reactor owners must pay to insure their facilities.

Since then, federal support for nuclear has only increased. The 2005 Energy Policy Act established tax credits to subsidize nuclear power plants, providing these companies $18 for every megawatt-hour of energy they produce for the first eight years of operation. Then in 2008, the federal government began offering generous loan guarantees for nuclear developers to build new plants. These loan guarantees can cover 80 percent of a project’s costs and up to $18.5 billion in loans.

Yet, despite these enormous regulatory advantages and taxpayer-funded subsidies, nuclear energy still struggles to survive. Since 2012, nuclear plant owners have closed or announced the closure of 14 reactors at 11 plant sites. South Carolina Electric & Gas and the state-owned power company Santee Cooper recently ceased construction on a $9 billion project to build two new reactors in Fairfield County.

In a press release announcing the closures, Santee Cooper explained why nuclear reactors close. When the company filed its initial application to begin construction in 2008, natural gas prices were three times higher and didn’t pose a viable threat to nuclear. Those days are over. Innovations in hydraulic fracturing technologies have unlocked millions of cubic feet of previously unrecoverable natural gas reserves and increased production by 50 percent. As a result, the cost of generating electricity from natural gas has fallen to $2.34 per kilowatt, far below the cost of nuclear power.

As Santee Cooper’s closure demonstrates, expanding corporate welfare won’t save the nuclear industry from inexpensive natural gas and certainly won’t lower costs for consumers. Instead, policymakers should follow recommendations laid out in the Department of Energy’s recently released review of America’s power grid. The report proposes government agencies streamline the licensing and permitting process in order to accelerate the development of lower cost nuclear power plants.

Nuclear can indeed thrive, but advocates should focus on making nuclear energy more competitive by unwinding burdensome regulations, rather than forcing taxpayers to subsidize high cost nuclear reactors drowning in government mandates.

The post The Wrong Way to Save Nuclear Power appeared first on IER.

Tuesday, August 29, 2017

Going Beyond Google: Are Search Engines Ready for JavaScript Crawling & Indexation?

Posted by goralewicz

I recently published the results of my JavaScript SEO experiment where I checked which JavaScript frameworks are properly crawled and indexed by Google. The results were shocking; it turns out Google has a number of problems when crawling and indexing JavaScript-rich websites.

Google managed to index only a few out of multiple JavaScript frameworks tested. And as I proved, indexing content doesn’t always mean crawling JavaScript-generated links.

This got me thinking. If Google is having problems with JavaScript crawling and indexation, how are Google’s smaller competitors dealing with this problem? Is JavaScript going to lead you to full de-indexation in most search engines?

If you decide to deploy a client-rendered website (meaning a browser or Googlebot needs to process the JavaScript before seeing the HTML), you're not only risking problems with your Google rankings — you may completely kill your chances at ranking in all the other search engines out there.

Google + JavaScript SEO experiment

To see how search engines other than Google deal with JavaScript crawling and indexing, we used our experiment website, http://jsseo.expert, to check how different search engines' crawlers handle JavaScript (and JavaScript frameworks’) generated content.

The experiment was quite simple: http://jsseo.expert has subpages with content rendered by different JavaScript frameworks. If you disable JavaScript, the content isn’t visible — i.e. if you go to http://jsseo.expert/angular2/, all the content within the red box is generated by Angular 2. If the content isn’t indexed in Yahoo, for example, we know that Yahoo’s indexer didn’t process the JavaScript.
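To make the mechanics concrete, here is a minimal sketch (with made-up HTML, not the actual experiment pages) of the difference between a server-rendered page and a client-rendered page from the point of view of a crawler that does not execute JavaScript:

```python
from html.parser import HTMLParser

class VisibleTextExtractor(HTMLParser):
    """Collects only the text present in the static HTML source --
    roughly what a crawler that doesn't execute JavaScript would see."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.in_script = True
    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.in_script = False
    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.chunks.append(data.strip())

def visible_text(html):
    parser = VisibleTextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

# Server-rendered page: the content is in the HTML itself.
server_rendered = "<html><body><div id='app'>Hello from the server</div></body></html>"

# Client-rendered page: the HTML ships an empty shell plus a script.
client_rendered = """<html><body><div id='app'></div>
<script>document.getElementById('app').textContent = 'Hello from a framework';</script>
</body></html>"""

print(visible_text(server_rendered))   # the content is crawlable as-is
print(visible_text(client_rendered))   # empty: a JS-blind crawler sees nothing
```

A JavaScript-blind indexer effectively runs something like the extractor above: if the text never appears in the raw HTML, it never makes it into the index.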

Here are the results:

As you can see, Google and Ask are the only search engines to properly index JavaScript-generated content. Bing, Yahoo, AOL, DuckDuckGo, and Yandex are completely JavaScript-blind and won’t see your content if it isn’t HTML.

The next step: Can other search engines index JavaScript?

Most SEOs only cover JavaScript crawling and indexing issues when talking about Google. As you can see, the problem is much more complex. When you launch a client-rendered JavaScript-rich website (JavaScript is processed by the browser/crawler to “build” HTML), you can be 100% sure that it’s only going to be indexed and ranked in Google and Ask. Unfortunately, Google and Ask cover only ~64% of the whole search engine market, according to statista.com.

This means that your new, shiny, JavaScript-rich website can cost you ~36% of your website’s visibility on all search engines.

Let’s start with Yahoo, Bing, and AOL, which are responsible for 35% of search queries in the US.

Yahoo, Bing, and AOL

Even though Yahoo and AOL were here long before Google, they’ve obviously fallen behind its powerful algorithm and don’t invest in crawling and indexing as much as Google does. One likely reason is the high cost of crawling and indexing the web relative to the share of search traffic those engines actually receive.

Google can freely invest millions of dollars in growing their computing power without worrying as much about return on investment, whereas Bing, AOL, and Ask only have a small percentage of the search market.

However, Microsoft-owned Bing isn't out of the running. Their growth has been quite aggressive over the last 8 years:

Unfortunately, we can’t say the same about one of the market pioneers: AOL. Do you remember the days before Google? This video will surely bring back some memories from a simpler time.

If you want to learn more about search engine history, I highly recommend watching Marcus Tandler’s spectacular TEDx talk.

Ask.com

What about Ask.com? How is it possible that Ask, with less than 1% of the market, can invest in crawling and indexing JavaScript? It made me question whether the Ask network is powered by Google’s algorithm and crawlers, which is even more interesting given Ask’s apparent aversion towards Google. There was already speculation about Ask’s relationship with Google after Google Penguin in 2012, but we can now confirm that Ask’s crawling uses Google’s technology.

DuckDuckGo and Yandex

Both DuckDuckGo and Yandex had no problem indexing all the URLs within http://jsseo.expert, but unfortunately, the only content that was indexed properly was the 100% HTML page (http://jsseo.expert/html/).

Baidu

Despite my best efforts, I didn’t manage to get http://jsseo.expert indexed in Baidu.com. It turns out you need a mainland China phone number to do that. I don’t have any previous experience with Baidu, so any and all help with indexing our experimental website would be appreciated. As soon as I succeed, I will update this article with Baidu.com results.

Going beyond the search engines

What if you don’t really care about search engines other than Google? Even if your target market is heavily dominated by Google, JavaScript crawling and indexing is still in an early stage, as my JavaScript SEO experiment documented.

Additionally, even when JavaScript content is crawled and indexed properly, there is proof that relying on it can affect your rankings. Will Critchlow saw a significant traffic improvement after shifting from JavaScript-driven pages to pages that don’t rely on JavaScript.

Is there a JavaScript SEO silver bullet?

There is no search engine that can understand and process JavaScript at the level our modern browsers can. Even so, JavaScript isn’t inherently bad for SEO. JavaScript is awesome, but just like SEO, it requires experience and close attention to best practices.

If you want to enjoy all the perks of JavaScript without worrying about problems like Hulu.com’s JavaScript SEO issues, look into isomorphic JavaScript. It allows you to enjoy dynamic and beautiful websites without worrying about SEO.

If you've already developed a client-rendered website and can’t go back to the drawing board, you can always use pre-rendering services or enable server-side rendering. They often aren’t ideal solutions, but can definitely help you solve the JavaScript crawling and indexing problem until you come up with a better solution.
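As a rough sketch of how pre-rendering services and "dynamic rendering" setups typically work (the user-agent tokens and HTML snippets below are illustrative assumptions, not any vendor's actual implementation): known crawler user-agents are served a pre-rendered HTML snapshot, while regular browsers still get the client-rendered app.

```python
# Illustrative crawler tokens -- not an exhaustive or authoritative list.
CRAWLER_TOKENS = ("googlebot", "bingbot", "yandex", "duckduckbot", "slurp")

def pick_response(user_agent, prerendered_html, client_app_html):
    """Serve the static snapshot to crawlers, the JS app to everyone else."""
    ua = user_agent.lower()
    if any(token in ua for token in CRAWLER_TOKENS):
        return prerendered_html   # static snapshot: crawlable without JS
    return client_app_html        # normal visitors still get the JS app

snapshot = "<html><body><h1>Product page</h1></body></html>"
app_shell = "<html><body><div id='root'></div><script src='app.js'></script></body></html>"

print(pick_response("Mozilla/5.0 (compatible; bingbot/2.0)", snapshot, app_shell))
```

The trade-off is operational: you now maintain two render paths and must keep the snapshot in sync with the live app, which is why it tends to be a stopgap rather than a permanent fix.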

Regardless of the search engine, yet again we come back to testing and experimenting as a core component of technical SEO.

The future of JavaScript SEO

I highly recommend you follow along with how http://jsseo.expert/ is indexed in Google and other search engines. Even if some of the other search engines are a little behind Google, they'll need to improve how they deal with JavaScript-rich websites to meet the exponentially growing demand for what JavaScript frameworks offer, both to developers and end users.

For now, stick to HTML & CSS on your front-end. :)


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Father/Son Duo Launch Solar Careers with SEI!

Sometimes solar is a family affair. This is true for Rob and Jake Dushek, a father/son team who came to Solar Energy International (SEI) for solar training earlier this month. Rob and Jake are already business partners who see their future in the solar industry, but for different reasons. Rob, Jake’s father, is looking to expand the contracting and energy consulting business he’s been growing for the last fifteen years. He sees the growth of the business as a way to empower young people like his son Jake. Of his reason to pursue solar training at SEI and become a certified solar professional, he says, “It is a financial opportunity to help my small business grow, which will in turn lead young professionals to paths and opportunities for business ownership.”

Jake is excited to make the shift to solar because he is technically inclined and shares his father’s passion for renewable energy and optimism about the industry’s potential for growth. “I understand in order to continue life as we know it, we will need to incorporate renewable energy and make it a priority. I want to become certified in renewable energy because it seems like a very lucrative field to get into at the time and I also want to save the environment.” When asked about coming to SEI with his father, Jake admitted it’s a “rare opportunity to get to work with a parent,” especially one who shares such a similar passion. “It’s really nice to work together; we already have a great relationship and we’re good at troubleshooting and collaborating on solutions.”

The two are continuing their hands-on solar training with SEI’s online training courses. Proving once again that they make a great team, they are each enrolled in complementary tracks of SEI’s Solar Professionals Certificate Program. Jake is working towards completing his Solar Business and Technical Sales Certificate to focus on the customer-facing aspects of the business, while Rob works on the Residential and Commercial Photovoltaic Systems Certificate to build on his experience and passion for technical solutions. Together, they’ll have a strong technical background in solar to launch the next phase of their business.

 

The post Father/Son Duo Launch Solar Careers with SEI! appeared first on Solar Training - Solar Installer Training - Solar PV Installation Training - Solar Energy Courses - Renewable Energy Education - NABCEP - Solar Energy International (SEI).

Monday, August 28, 2017

Relive MozCon with the 2017 Video Bundle

Posted by Danielle_Launders

MozCon may be over, but we just can’t get enough of it — and that's why our team has worked hard to bring the magic back to you with our MozCon 2017 Video Bundle. You'll have 26 sessions at your fingertips to watch over and over again — that’s over 14 hours of future-focused sessions aiming to level up your SEO and online marketing skills. Get ahead of Google and its biggest changes to organic search with Dr. Pete Meyers, prepare for the future of mobile-first indexing with Cindy Krum, and increase leads through strategic data-driven design with Oli Gardner.

Ready to dive into all of the excitement? Feel free to jump ahead:

Buy the MozCon 2017 Video Bundle

For our friends who attended MozCon 2017, check your inbox: you should find an email from us that will navigate you to your videos. The same perk applies for next year — your ticket to MozCon 2018 includes the full video bundle. We do have a limited number of super early bird tickets (our best deal!) still available.

This year's MozCon was truly special. We were honored to host some of the brightest minds in the industry and to witness the passion and insights they brought to the stage. We know you'll enjoy all the new tactics and innovative topics just as much as we did.

But don’t just take our word for it...

Here’s a recap of one attendee's experience:

“Attending MozCon is like a master's course in digital marketing. With so many knowledgeable speakers sharing their insights, their methods, and their tools all in the hopes of making me a better digital marketer, it seems like a waste not to take advantage of it.”
– Sean D. Francis, Director of SEO at Blue Magnet Interactive

The video bundle

You’ll have access to 26 full video presentations from MozCon.

For $299, the MozCon 2017 video bundle gives you instant access to:

  • 26 videos (that’s over 14 hours of content)
  • Stream or download the videos to your computer, tablet, or phone. The videos are iOS, Windows, and Android-compatible
  • Downloadable slide decks for presentations

Buy the MozCon 2017 Video Bundle

Want a free preview?

If you haven’t been to a MozCon before, you might be a little confused by all of the buzz and excitement. To convince you that we're seriously excited, we're sharing one of our highly-rated sessions with you for free! Check out "How to Get Big Links" with Lisa Myers in the full session straight from MozCon 2017. Lisa shares how her and her team were able to earn links and coverage from big sites such as New York Times, the Wall Street Journal, and BBC.

I want to thank the team behind the videos for all the hours of editing, designing, coding, processing, and more. We love being able to share this knowledge and couldn’t do it without the crew's efforts. And to the community, we wish you happy learning and hope to see you at MozCon 2018 in July!



Friday, August 25, 2017

How to Determine if a Page is "Low Quality" in Google's Eyes - Whiteboard Friday

Posted by randfish

What are the factors Google considers when weighing whether a page is high or low quality, and how can you identify those pages yourself? There's a laundry list of things to examine to determine which pages make the grade and which don't, from searcher behavior to page load times to spelling mistakes. Rand covers it all in this episode of Whiteboard Friday.

How to identify low quality pages

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about how to figure out if Google thinks a page on a website is potentially low quality and if that could lead us to some optimization options.

So as we've talked about previously here on Whiteboard Friday, and I'm sure many of you have been following along with experiments that Britney Muller from Moz has been conducting about removing low-quality pages, you saw Roy Hinkis from SimilarWeb talk about how they had removed low-quality pages from their site and seen an increase in rankings on a bunch of stuff. So many people have been trying this tactic. The challenge is figuring out which pages are actually low quality. What does that constitute?

What constitutes "quality" for Google?

So Google has some ideas about what's high quality versus low quality, and a few of those are pretty obvious and we're familiar with, and some of them may be more intriguing. So...
  • Google wants unique content.
  • They want to make sure that the value to searchers from that content is actually unique, not that it's just different words and phrases on the page, but the value provided is actually different. You can check out the Whiteboard Friday on unique value if you have more questions on that.
  • They like to see lots of external sources linking editorially to a page. That tells them that the page is probably high quality because it's reference-worthy.
  • They also like to see high-quality pages, not just high-quality sources or domains, linking to this page. That can be internal and external links. So it tends to be the case that if high-quality pages on your website link to another page on your site, Google often interprets it that way.
  • The page successfully answers the searcher's query.

This is an intriguing one. So if someone performs a search, let's say here I type in a search on Google for "pressure washing." I'll just write "pressure wash." This page comes up. Someone clicks on that page, and they stay here and maybe they do go back to Google, but then they perform a completely different search, or they go to a different task, they visit a different website, they go back to their email, whatever it is. That tells Google, great, this page solved the query.

If instead someone searches for this and they go, they perform the search, they click on a link, and they get a low-quality mumbo-jumbo page and they click back and they choose a different result instead, that tells Google that page did not successfully answer that searcher's query. If this happens a lot, Google calls this activity pogo-sticking, where you visit this one, it didn't answer your query, so you go visit another one that does. It's very likely that this result will be moved down and be perceived as low quality in Google.

  • The page has got to load fast on any connection.
  • They want to see high-quality accessibility with intuitive user experience and design on any device, so mobile, desktop, tablet, laptop.
  • They want to see actually grammatically correct and well-spelled content. I know this may come as a surprise, but we've actually done some tests and seen that by having poor spelling or bad grammar, we can get featured snippets removed from Google. So you can have a featured snippet, it's doing great in the SERPs, you change something in there, you mess it up, and Google says, "Wait, no, that no longer qualifies. You are no longer a high-quality answer." So that tells us that they are analyzing pages for that type of information.
  • Non-text content needs to have text alternatives. This is why Google encourages use of the alt attribute. This is why on videos they like transcripts. Here on Whiteboard Friday, as I'm speaking, there's a transcript down below this video that you can read and get all the content without having to listen to me if you don't want to or if you don't have the ability to for whatever technical or accessibility, handicapped reasons.
  • They also like to see content that is well-organized and easy to consume and understand. They interpret that through a bunch of different things, but some of their machine learning systems can certainly pick that up.
  • Then they like to see content that points to additional sources for more information or for follow-up on tasks or to cite sources. So links externally from a page will do that.

This is not an exhaustive list. But these are some of the things that can tell Google high quality versus low quality and start to get them filtering things.

How can SEOs & marketers filter pages on sites to ID high vs. low quality?

As a marketer, as an SEO, there's a process that we can use. We don't have access to every single one of these components that Google can measure, but we can look at some things that will help us determine this is high quality, this is low quality, maybe I should try deleting or removing this from my site or recreating it if it is low quality.

In general, I'm going to urge you NOT to use things like:

A. Time on site, raw time on site

B. Raw bounce rate

C. Organic visits

D. Assisted conversions

Why not? Because by themselves, all of these can be misleading signals.

So a long time on your website could be because someone's very engaged with your content. It could also be because someone is immensely frustrated and they cannot find what they need. So they're going to return to the search result and click something else that quickly answers their query in an accessible fashion. Maybe you have lots of pop-ups and they have to click close on them and it's hard to find the x-button and they have to scroll down far in your content. So they're very unhappy with your result.

Bounce rate works similarly. A high bounce rate could be a fine thing if you're answering a very simple query or if the next step is to go somewhere else or if there is no next step. If I'm just trying to get, "Hey, I need some pressure washing tips for this kind of treated wood, and I need to know whether I'll remove the treatment if I pressure wash the wood at this level of pressure," and it turns out no, I'm good. Great. Thank you. I'm all done. I don't need to visit your website anymore. My bounce rate was very, very high. Maybe you have a bounce rate in the 80s or 90s percent, but you've answered the searcher's query. You've done what Google wants. So bounce rate by itself, bad metric.

Same with organic visits. You could have a page that is relatively low quality that receives a good amount of organic traffic for one reason or another, and that could be because it's still ranking for something or because it ranks for a bunch of long tail stuff, but it is disappointing searchers. This one is a little bit better in the longer term. If you look at this over the course of weeks or months as opposed to just days, you can generally get a better sense, but still, by itself, I don't love it.

Assisted conversions is a great example. This page might not convert anyone. It may be an opportunity to drop cookies. It might be an opportunity to remarket or retarget to someone or get them to sign up for an email list, but it may not convert directly into whatever goal conversions you've got. That doesn't mean it's low-quality content.

THESE can be a good start:

So what I'm going to urge you to do is think of these as a combination of metrics. Any time you're analyzing for low versus high quality, have a combination of metrics approach that you're applying.

1. That could be a combination of engagement metrics. I'm going to look at...

  • Total visits (external and internal)
  • I'm going to look at the pages per visit after landing. So if someone gets to the page and then they browse through other pages on the site, that is a good sign. If they browse through very few, not as good a sign, but not to be taken by itself. It needs to be combined with things like time on site and bounce rate and total visits and external visits.

2. You can combine some offsite metrics. So things like...

  • External links
  • Number of linking root domains
  • Page Authority (PA), and your social shares like Facebook, Twitter, and LinkedIn share counts can also be applicable here. If you see something that's getting social shares, well, maybe it doesn't match up with searchers' needs, but it could still be high-quality content.

3. Search engine metrics. You can look at...

  • Indexation by typing a URL directly into the search bar or the browser bar and seeing whether the page is indexed.
  • You can also look at things that rank for their own title.
  • You can look in Google Search Console and see click-through rates.
  • You can look at unique versus duplicate content. So if I type in a URL here and I see multiple pages come back from my site, or if I type in the title of a page that I've created and I see multiple URLs come back from my own website, I know that there's some uniqueness problems there.

4. You are almost definitely going to want to do an actual hand review of a handful of pages.

  • Pages from subsections or subfolders or subdomains, if you have them, and say, "Oh, hang on. Does this actually help searchers? Is this content current and up to date? Is it meeting our organization's standards?"
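The unique-versus-duplicate check described in the search engine metrics above can be automated against a crawl export. A minimal sketch, using made-up URLs and titles rather than real crawl data:

```python
from collections import defaultdict

# Hypothetical rows from a crawl export (e.g. Screaming Frog or Moz's crawler):
# each page has a URL and the title the crawler found on it.
crawl_export = [
    {"url": "/pressure-washing-tips", "title": "Pressure Washing Tips"},
    {"url": "/pressure-washing-tips?page=2", "title": "Pressure Washing Tips"},
    {"url": "/deck-treatment-guide", "title": "Deck Treatment Guide"},
]

# Group URLs by title; any title shared by 2+ URLs signals a uniqueness problem.
by_title = defaultdict(list)
for page in crawl_export:
    by_title[page["title"]].append(page["url"])

duplicates = {title: urls for title, urls in by_title.items() if len(urls) > 1}
print(duplicates)
```

Titles that map to multiple URLs are exactly the pages where typing the title into Google would return several of your own URLs, the symptom described above.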

Make 3 buckets:

Using these combinations of metrics, you can build some buckets. You can do this in a pretty easy way by exporting all your URLs. You could use something like Screaming Frog or Moz's crawler or DeepCrawl, and you can export all your pages into a spreadsheet with metrics like these, and then you can start to sort and filter. You can create some sort of algorithm, some combination of the metrics that you determine is pretty good at ID'ing things, and you double-check that with your hand review. I'm going to urge you to put them into three kinds of buckets.

I. High importance. So high importance, high-quality content, you're going to keep that stuff.

II. Needs work. The second bucket is stuff that actually needs work but is still good enough to stay in the search engines. It's not awful. It's not harming your brand, and it's certainly not what search engines would call low quality and be penalizing you for. It's just not living up to your expectations or your hopes. That means you can republish it or work on it and improve it.

III. Low quality. It really doesn't meet the standards that you've got here, but don't just delete them outright. Do some testing. Take a sample set of the worst junk that you put in the low bucket, remove it from your site, make sure you keep a copy, and see if by removing a few hundred or a few thousand of those pages, you see an increase in crawl budget and indexation and rankings and search traffic. If so, you can start to be more or less judicious and more liberal with what you're cutting out of that low-quality bucket and a lot of times see some great results from Google.
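As one illustration of the combination-of-metrics approach, here is a hedged sketch of a scoring function that sorts pages into the three buckets. The metric names, thresholds, and weights are illustrative assumptions, not a formula from the video; you would tune them against your own hand review.

```python
# Illustrative only: thresholds and weights are assumptions, not Moz's formula.
# The point is combining several signals -- engagement, links, indexation, and
# a human check -- rather than judging any page on one metric alone.
def quality_bucket(page):
    score = 0
    if page["monthly_visits"] >= 100:
        score += 1
    if page["pages_per_visit"] >= 1.5:
        score += 1
    if page["linking_root_domains"] >= 1:
        score += 1
    if page["is_indexed"]:
        score += 1
    if page["hand_review_passed"]:
        score += 2  # weight the human review more heavily

    if score >= 5:
        return "I. High importance"   # keep as-is
    if score >= 3:
        return "II. Needs work"       # republish or improve
    return "III. Low quality"         # candidate for removal testing

pages = [
    {"monthly_visits": 900, "pages_per_visit": 2.4, "linking_root_domains": 12,
     "is_indexed": True, "hand_review_passed": True},
    {"monthly_visits": 3, "pages_per_visit": 1.0, "linking_root_domains": 0,
     "is_indexed": False, "hand_review_passed": False},
]
for p in pages:
    print(quality_bucket(p))
```

Exported from a crawler into a spreadsheet or script like this, every URL gets a bucket, and the "III. Low quality" set becomes your sample for the removal test described above.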

All right, everyone. Hope you've enjoyed this edition of Whiteboard Friday, and we'll see you again next week. Take care.

Video transcription by Speechpad.com



Al Gore’s Energy Problems

Climate alarmism was launched almost 30 years ago when the featured scientist before Al Gore’s Senate Committee on Energy and Natural Resources testified that he was “99 percent certain” human activity was behind that year’s unusually hot summer.

For maximum effect, that hearing was scheduled during a Washington, D.C. heat wave. Rumors flew that the hearing room’s windows had been left open so that visible beads of perspiration would accompany the very words of NASA scientist James Hansen.

That was June 1988. The theatrics continue with this summer’s release of Gore’s “An Inconvenient Sequel: Truth to Power,” the follow-up to “An Inconvenient Truth” (2006).

Despite a plea to climate activists to come en masse, the documentary has all but bombed. The general public, much less moviegoers, are just not buying Gore’s latest serial exaggerations.

It’s little wonder why the Left and Right ask whether Pastor Gore is more of a hindrance than a help to the cause. “While Gore’s heart is in the right place, his hyperbole can hurt him,” one reviewer politely stated. “Al Gore is the best friend climate skeptics ever had,” noted Steve Hayward. “Here’s to hoping Gore makes many more sequels to An Inconvenient Truth. With enemies like him, who needs enemies?”

Palatial Energy

Just in time for his movie, the story broke that Gore’s 10,070-square-foot Nashville residence consumed twenty times more energy than the average U.S. home. His swimming pool alone accounted for six times more.

“With an average consumption of 22.9 kWh per square foot over the past year, Gore’s home classifies as an ‘energy hog’ under standards developed by Energy Vanguard—a company specializing in energy efficiency methods,” one writer noted.

Indeed. While Gore’s mansion is about four times larger than the average American house of 2,700-square feet, it uses as much as 34 times more energy (for example, in September of last year).

Hypocrisy and irony turn into mystery with another fact: the Gore mansion is certified energy efficient, and it is partly powered by renewable energy. Appliance retrofits, an array of solar panels, and a geothermal system (all at an estimated cost of a quarter-million dollars) were installed in 2007, when Gore’s energy bill became a national issue. The U.S. Green Building Council, in fact, gave the property a Gold LEED certification after the renovation.

Offset “Monkey Business”

In damage control, Gore’s spokesperson Betsy McManus stated this month that her boss “leads a carbon-neutral life by purchasing green energy, reducing carbon impacts, and offsetting any emissions that can’t be avoided.” But carbon neutral is not the same as carbon free—and in this case, it’s quite the opposite.

McManus refused to provide data on Gore’s alleged offsets, much less the reasons that Gore’s (Gold LEED) electricity usage is off the charts. (Gore’s other residences in San Francisco and Carthage, Tennessee are at issue here too.) But even assuming substantive purchases, Gore is supporting a fossil-fueled present and future according to Gore’s go-to climate scientist, James Hansen.

“A successful new policy cannot include any offsets,” Hansen stated in his global warming manifesto, Storms of My Grandchildren (p. 206):

The public must be firm and unwavering in demanding “no offsets,” because this sort of monkey business is exactly the type of thing that politicians love and will try to keep. Offsets are like the indulgences that were sold by the church in the Middle Ages. People of means loved indulgences, because they could practice any hanky-panky or worse, then simply purchase an indulgence to avoid punishment for their sins.

Bishops loved them too, because they brought in lots of moola. Anybody who argues for offsets today is either a sinner who wants to pretend he or she has done adequate penance or a bishop collecting moola.

As government mitigation policy, the Gore approach should be rejected. Hansen continues (ibid.):

A successful new policy cannot include any offsets. We specified the carbon limit based on the geophysics. The physics does not compromise—it is what it is. And planting additional trees cannot be factored into the fossil fuel limitations. The plan for getting back to 350 ppm assumes major reforestation, but that is in addition to the fossil fuel limit, not instead of. Forest preservation and reforestation should be handled separately from fossil fuels in a sound approach to solve the climate problem.

Climate stabilization requires no less than “a global phaseout of fossil fuel carbon dioxide emissions,” Hansen insists (p. 205). Yet the majority of energy molecules used at Gore’s Belle Meade residence are fossil-fuel generated, as much as the former Vice President would like to claim carbon neutrality.[1]

Gore Misspeaks

Al Gore will not dare debate climate change issues—the very ones he cares about the most. Joseph Bast at the Heartland Institute tried a decade ago with a national advertising campaign—to no avail. Alex Epstein last year offered $100,000 for Gore to publicly debate—the very amount that Gore charges for his speaking engagements.

While Gore dare not put his own knowledge and convictions to the test, sometimes things can go awry. When a reporter brought up a mainstream climate scientist’s caution about Gore’s (exaggerated) sea-level rise claim in An Inconvenient Sequel, Gore snapped.

As recounted by reporter Ross Clark:

As soon as I mention Professor Wdowinski’s name, he counters: “Never heard of him — is he a denier?” Then, as I continue to make the point, he starts to answer before directing it at me: “Are you a denier?” When I say I am sure that climate change is a problem, but how big a one I don’t know, he jumps in: “You are a denier.”

Professor Shimon Wdowinski, associate professor of marine geology and geophysics at Florida International University, specializes in the study of flooding in Miami. He is, states Clark, “exactly the sort of expert, one might think, with whom Gore or his team of researchers might have been in touch before making a documentary film involving the issue of flooding in Miami.”

Politics First

One can go on and on about the tensions and contradictions of Albert Arnold Gore Jr., including when the presidential candidate conveniently forgot his end-of-the-world rhetoric in the heat of political battle.

“I think we need to bring gasoline prices down,” Gore intoned in the summer of 2000. “I have made it clear in this campaign that I am not calling for any tax increase on gasoline, on oil, on natural gas, or anything else.”

A climate skeptic or “denier,” and the current President of the United States, could not have said it better.

Conclusion

A quarter century ago, in Earth in the Balance, Al Gore offered a stern diagnosis and gloomy prognosis of the natural state of things. “I believe that our civilization is, in effect, addicted to the consumption of the earth itself,” he complained. The ensuing environmental crisis, he added, was a very difficult “war with ourselves” (pp. 220, 275).

Al Gore is at war with himself. Little wonder that his hypocritical, hyperbolic message goes backwards with his every push.

It is all political theater, as Jerry Taylor posited in “Global Warming: The Anatomy of a Debate.” And in this show, actor Al is “the gift that keeps on giving.”


[1] According to the US Energy Information Administration, Tennessee gets about 40 percent of its electricity from coal-fired generation, with natural gas providing 14 percent. Renewables provide about 15 percent, with non-hydro renewables accounting for less than two percent. Nuclear provides the balance (about one-third).

The post Al Gore’s Energy Problems appeared first on IER.

SEI’s Solar Ready Colorado Initiative Wins IREC 3iAward!

Solar Energy International’s (SEI) Solar Ready Colorado Initiative received the 2017 IREC State/Local Government Achievement of the Year award. Solar Ready Colorado is a statewide training and career outreach program led by SEI and industry partners and funded through the Colorado Department of Labor and Employment’s WORK Act Grant. The program offers SEI’s technical training to Coloradans who can fill the solar industry’s predicted skilled-workforce shortage, and helps ensure the state is prepared for the changing landscape of energy production and distribution. The awards, given out by the Interstate Renewable Energy Council (IREC), honored the 2017 3iAward recipients, celebrating the nation’s best innovation, ingenuity, and inspiration in renewable energy and energy efficiency. Winners are selected through a prestigious annual national search.

The Solar Ready Colorado program was honored for its innovative approaches to outreach, especially the Solar Ready Colorado Career Expo and the Envision360 Virtual Reality project. The Solar Ready Colorado Career Expo was co-sponsored by project partners Colorado Solar Energy Industries Association (COSEIA) and GRID Alternatives Colorado at COSEIA’s annual Solar Power Colorado conference. More than 150 job seekers attended the expo and engaged with Colorado solar companies about the jobs available in this growing industry. The partners will once again put on the Career Expo event at the newly expanded Solar Power Mountain West conference from March 12-14, 2018. coseia.org/conference/

“This event, which provided a forum where some of Colorado’s top solar companies could meet individually with job seekers, was a unique way to showcase the opportunities in this fast-growing part of our economy, which already employs more than 6,000 Coloradans,” said Chris Turek, Marketing Director of SEI. “We look forward to an even bigger career expo next year.”

The event also marked the launch of the SEI Envision360 App and the SEI Google Cardboard Project. Envision360 is a solar recruitment technology tool that gives students a virtual experience of what it’s like to work in the solar industry, using Google Cardboard Viewers and virtual reality (VR) technologies. Solar companies across Colorado are featured in the app through interviews with key staff about their experience in solar and through 360° experiences of their warehouses, offices, and job sites. The app and viewers are being used by Coloradans new to the solar industry as well as in schools to educate students about career opportunities in solar.

The post SEI’s Solar Ready Colorado Initiative Wins IREC 3iAward! appeared first on Solar Energy International (SEI).

Thursday, August 24, 2017

The Voice Playbook – Building a Marketing Plan for the Next Era in Computing

Posted by SimonPenson

Preface

This post serves a dual purpose: it's a practical guide to the realities of preparing for voice right now, but equally it's a rallying call to ensure our industry has a full understanding of just how big, disruptive, and transformational it will be — and that, as a result, we need to stand ready.

My view is that voice is not just an add-on, but an entirely new way of interacting with the machines that add value to our lives. It is the next big era of computing.

Brands and agencies alike need to be at the forefront of that revolution. For my part, that begins with investing in the creation of a voice team.

Let me explain just how we plan to do that, and why it’s being actioned earlier than many will think necessary…

Jump to a section:

Why is voice so important?
When is it coming in a big way?
Who are the big players?
Where do voice assistants get their data from?
How do I shape my strategy and tactics to get involved?
What skill sets do I need in a "voice team?"

Introduction

"The times they are a-changin'."
– Bob Dylan

Back in 1964, that revered folk-and-blues singer could never have imagined just what that would mean in the 21st century.

As we head into 2018, we're nearing a voice interface-inspired inflection point the likes of which we haven't seen before. And if the world’s most respected futurist is to be believed, it’s only just beginning.

Talk to Ray Kurzweil, a Director of Engineering at Google and the man Bill Gates calls the "best person to predict the future," and he’ll tell you that we are entering a period of huge technological change.

For those working across search and many other areas of digital marketing, change is not uncommon. Seismic events, such as the initial rollout of Panda and Penguin, reminded those inside the industry just how painful it is to be unprepared for the future.

At best, it tips everything upside down. At worst, it kills those agencies or businesses stuck behind the curve.

It’s for exactly this reason that I felt compelled to write a post all about why I'm building a voice team at Zazzle Media, the agency I founded here in the UK, as stats from BrightEdge reveal that 62% of marketers still have no plans whatsoever to prepare for the coming age of voice.

I’m also here to argue that while the growth traditional search agencies saw through the early 2000s is over, similar levels of expansion are up for grabs again for those able to seamlessly integrate voice strategies into an offering focused on the client or customer.

Winter is coming!

Based on our current understanding of technological progress, it's easy to rest on our laurels. Voice interface adoption is still in its very early stages, and Moore’s Law seems to draw a steady, predictable line through technological advancement, giving us time to take our positions. But that era is now behind us.

According to Kurzweil’s thesis on the growth of technology (the Law of Accelerating Returns),

"we won’t experience 100 years of progress in the 21st century – it will be more like 20,000 years."

Put another way, he explains that technology does not progress in a linear way. Instead, it progresses exponentially.

"30 steps linearly get you to 30. One, two, three, four, step 30 you're at 30. With exponential growth, it's one, two, four, eight. Step 30, you're at a billion," he explained in a recent Financial Times interview.
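Kurzweil's linear-versus-exponential contrast is easy to sanity-check in a few lines of code (a toy illustration, not from the original interview):

```python
# Toy illustration of linear steps vs. repeated doubling.

def after_linear_steps(n):
    # Advance one unit per step: step 30 puts you at 30.
    return n

def after_doublings(n):
    # Double each step starting from 1: step 30 puts you past a billion.
    return 2 ** n

print(after_linear_steps(30))  # 30
print(after_doublings(30))     # 1073741824
```

Thirty doublings lands at just over a billion, which is exactly the order-of-magnitude gap Kurzweil is describing.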

In other words, we're going to see new tech landing and gaining traction faster than we ever thought possible, as this chart shows:

Above, Kurzweil illustrates how we’ll be able to produce a computer with the computational power of a human brain by 2023, and by 2037 to do so for less than one cent. Just 15 years later, computers will be more powerful than the entire human race combined. Powerful stuff, and proof of the need for action as voice and the wider AI paradigm take hold.

Voice

So, what does that mean right now? While many believe voice is still a long way off, one point of view says it's already here — and those fast enough to grab the opportunity will grow exponentially with it. Indeed, Google itself says more than 20% of its mobile searches are already voice-led, and some predict voice will account for 50% of all searches by 2020.

Let’s first deal with understanding the processes required before then moving onto the expertise to make it happen.

What do we need to know?

We’ll start with some assumptions. If you are reading this post, you already have a good understanding of the basics of voice technology. Competitors are joining the race every day, but right now the key players are:

  • Microsoft Cortana – Available on Windows, iOS, and Android.
  • Amazon Alexa – Voice-activated assistant that lives on Amazon audio gear (Echo, Echo Dot, Tap) and Fire TV.
  • Google Assistant – Google’s voice assistant powers Google Home as well as sitting across its mobile and voice search capabilities.
  • Apple Siri – Native voice assistant for all Apple products.

All of these exist to give consumers the ability to retrieve information without having to touch a screen or type anything.

That has major ramifications for those who rely on traditional typed search and a plethora of other arenas, such as the fast-growing Internet of Things (IoT).

In short, voice allows us to access everything from our personal diaries and shopping lists to answers to our latest questions and even to switch our lights off.

Why now?

Apart from the tidal wave of tech now supporting voice, there is another key reason for investing in voice now — and it's all to do with the pace at which voice is actually improving.

In a recent Internet trends study by KPCB, Andrew Ng, chief scientist at Chinese search engine Baidu, was asked what it would take to push voice out of the shadows and into its place as the primary interface for computing.

His point was that at present, voice is "only 90% accurate" and therefore the results are sometimes a little disappointing. This slows uptake.

But he sees that changing soon, explaining that "As speech recognition accuracy goes from, say, 95% to 99%, all of us in the room will go from barely using it today to using it all the time. Most people underestimate the difference between 95% and 99% accuracy — 99% is a game changer."

When will that happen? In the chart below we see Google’s view on this question, predicting we will be there in 2018!

Is this the end for search?

It is also important to point out that voice is an additional interface and will not replace any of those that have gone before it. We only need to look back at history to see how print, radio, and TV continue to play a part in our lives alongside the latest information interfaces.

Moz founder Rand Fishkin made this point in a recent Whiteboard Friday, explaining that while voice search volumes may well overtake typed terms, demand for traditional SERP and typed results will continue to grow too, simply because overall use of search keeps growing.

The key will be creating a channel strategy as well as a method for researching both voice and typed opportunity as part of your overall process.

What’s different?

The key difference when considering voice opportunity is to think about the conversational nature that the interface allows. For years we've been used to having to type more succinctly in order to get answers quickly, but voice does away with that requirement.

Instead, we are presented with an opportunity to ask, find, and discover the things we want and need using natural language.

This means that we will naturally lengthen the phrases we use to find the stuff we want — and early studies support this assumption.

In a study by Microsoft and covered by the brilliant Purna Virji in this Moz post from last year, we can see a clear distinction between typed and voice search phrase length, even at this early stage of conversational search. Expect this to grow as we get used to interacting with voice.

The evidence suggests that will happen fast too. Google’s own data shows us that 55% of teens and 40% of adults use voice search daily. Below is what they use it for:

While it is easy to believe that voice only extends to search, it's important to remember that the opportunity is actually much wider. Below we can see results from a major 2016 Internet usage study into how voice is being used:

Clearly, the lion's share is related to search and information retrieval, with more than 50% of actions relating to finding something local to go/see/do (usually on mobile) or using voice as an interface to search.

But an area sure to grow is the leisure/entertainment sector. More on that later.

The key question remains: How exactly do you tap into this growing demand? How do you become the choice answer above all those you compete with?

With such a vast array of devices, the answer is a multi-faceted one.

Where is the data coming from?

To answer the questions above, we must first understand where the information is being accessed from and the answer, predictably, is not a simple one. Understanding it, however, is critical if you are to build a world-class voice marketing strategy.

To make life a little easier, I’ve created an at-a-glance cheat sheet to guide you through the process. You can download it by clicking on the banner below.

In it, you'll find an easy-to-follow table explaining where each of the major voice assistants (Siri, Cortana, Google Assistant, and Alexa) retrieve their data from so you can devise a plan to cover them all.

The key takeaway from that research? Interestingly, Bing has every opportunity to steal a big chunk of market share from Google and, at least at present, is the key search engine to optimize for if voice "visibility" is the objective.

Bing is more important now.

Of the Big Four in voice, three (Cortana, Siri, and Alexa) default to Bing for general information retrieval. Given that Facebook (a former Bing search partner) is also joining the fray, Google could soon find itself in a place it's not entirely used to being: alone.

Now, the search giant usually finds a way to pull back market share, but for now a marketer's focus should be on Microsoft’s search engine first, with Google as a secondary player.

Irrespective of which engine you prioritize, there are two key areas to focus on: featured snippets and local listings.

Featured snippets

The search world has been awash with posts and talks on this area of optimization in recent months as Google continues to push ahead with the rollout of this feature-rich SERP real estate.

For those who don’t know what a "snippet" is, there’s an example below, shown for the search "how do I get to sleep":

Not only is this incredibly valuable traditional search real estate (as I’ve discussed in an earlier blog post), but it's a huge asset in the fight for voice visibility.

Initial research by experts such as Moz's Dr. Pete Meyers tells us, clearly, that Google Assistant pulls its answers from featured snippet content for anything with any level of complexity.

Simple answers — such as those for searches about sports results, the weather, and so forth — are answered directly. But for queries that require expertise, it defaults to site content, explaining where that information came from.
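One widely observed (but unofficial) content pattern for winning snippets is a question phrased as a heading followed by a concise, direct answer and then supporting detail. Here is one way to template that structure; the function name, word-count threshold, and example copy are all illustrative assumptions, not Google requirements:

```python
# Sketch: render a "question heading + concise answer + detail" block,
# a common pattern in snippet-optimized pages. Thresholds are guesses.
from html import escape

def snippet_block(question, answer, detail):
    words = len(answer.split())
    if words > 60:
        # Long, rambling answers rarely get lifted into snippets.
        raise ValueError(f"answer is {words} words; keep the summary under ~60")
    return (
        f"<h2>{escape(question)}</h2>\n"
        f"<p>{escape(answer)}</p>\n"
        f"<p>{escape(detail)}</p>"
    )

html = snippet_block(
    "How do I get to sleep?",
    "Keep a regular bedtime, avoid screens for an hour beforehand, "
    "and keep the bedroom cool and dark.",
    "Sleep researchers also recommend limiting caffeine after midday "
    "and keeping a consistent wake-up time.",
)
print(html)
```

The point is less the HTML itself than the editorial discipline: one question, one short answer a machine can read aloud, then the depth.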

At present, it's unclear how Google plans to help us understand and attribute these kinds of visits. But according to insider Gary Illyes, it is imminent within Search Console.

Measurement will clearly be an important step, both in selling any voice strategy proposal upwards and in providing site- or brand-level evidence that the medium is growing and deserves investment.

User intent and purchase

Such data will also help us understand how voice alters such things as the traditional conversion funnel and the propensity to purchase.

We know how important content is in the traditional user journey, but how will it differ in the voice world? There's sure to be a rewrite of many rules we've come to know well from the "typed Internet."

Applying some logic to the challenge, it's clear that there's greater value in searches showing some level of immediacy, i.e. people searching through home assistants or mobiles for where something is, or when it's open or happening.

Whereas with typed search we see greater value in simple phrases that we call "head terms," the world is much more complex in voice. Below we see a breakdown of words that will trigger searches in voice:

To better understand this, let’s examine a potential search "conversation."

If we take a product search example for, let’s say, buying a new lawn mower, the conversation could go a little like this:

[me] What’s the best rotary lawn mower for under £500?
[voice assistant] According to Lawn Mower Hut there are six choices [reads out choices]

Initially, voice will struggle to understand how to move to the next logical question, such as:

[voice assistant] Would you like a rotary or cylinder lawn mower?

Or, better still…

[voice assistant] Is your lawn perfectly flat?
[me] No.
[voice assistant] OK, may I suggest a rotary mower? If so then you have two choices, the McCulloch M46-125WR or the BMC Lawn Racer.

In this scenario, our voice assistant has connected the dots and asks the next relevant question to help narrow the search in a natural way.

Natural language processing

To do this, however, requires a step up in computer processing, a challenge being worked on as we speak in a bid to provide the next level of voice search.

To solve the challenge requires the use of so-called Deep Neural Networks (DNNs), interconnected layers of processing units designed to mimic the neural networks in the brain.

DNNs can work across everything from speech, images, sequences of words, and even location before then classifying them into categories.

It relies on the input of truckloads of data so it can learn how best to bucket those things. That data pile will grow exponentially as the adoption of voice accelerates.
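To make the "layers of processing units" idea concrete, here is a minimal forward pass through two stacked layers in plain Python. The weights are hard-coded toys; a real DNN would learn them from the piles of data described above:

```python
# Minimal sketch of stacked "layers of processing units": each layer is a
# weighted sum of its inputs plus a bias, passed through a non-linearity.

def relu(x):
    # A common non-linearity: pass positives through, clamp negatives to 0.
    return max(0.0, x)

def layer(inputs, weights, biases):
    """One fully connected layer: each unit is relu(w . x + b)."""
    return [
        relu(sum(w * x for w, x in zip(unit_w, inputs)) + b)
        for unit_w, b in zip(weights, biases)
    ]

# Two stacked layers: 3 inputs -> 2 hidden units -> 1 output score.
hidden = layer([1.0, 0.5, -0.2],
               weights=[[0.4, -0.1, 0.3], [0.2, 0.6, -0.5]],
               biases=[0.1, 0.0])
score = layer(hidden, weights=[[0.7, -0.3]], biases=[0.05])[0]
print(score)
```

Training is just the process of nudging those weight numbers, over millions of examples, until the output scores reliably put inputs in the right buckets.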

What that will mean is that voice assistants can converse with us in the same way as a clued-up shop assistant, further negating the need for in-store visits in the future and a much more streamlined research process.

In this world, we start to paint a very different view of the "keywords" we should be targeting, with deeper and more exacting phrases winning the battle for eyeballs.

As a result, the long tail’s rise in prominence continues at pace, and data-driven content strategies really do move to the center of the marketing plan as the reward for creating really specific content increases.

We also see a greater emphasis placed on keywords that may not be on top of the priority list currently. If we continue to work through our examples, we can start to paint a picture of how this plays out…

In our lawnmower purchase example, we're at a stage where two options have been presented to us (the McCulloch and the BMC Racer). In a voice 1.0 scenario, where we have yet to see DNNs develop enough to know the next relevant question and answer, we might ask:

[me] Which has the best reviews?

And the answer may be tied to a 3rd party review conclusion, such as…

[voice assistant] According to Trustpilot, the McCulloch has a 4.5-star rating versus a 3.5-star rating for the BMC lawn mower.

Suddenly, 3rd party reviews become more valuable than ever as a conversion optimization opportunity, or a strategy that includes creating content to own the SERP for a keyword phrase that includes "review" or "top rated."

And where would we naturally go from here? The options are either directly to conversion, via some kind of value-led search (think "cheapest McCulloch M46-125WR"), or to a location-based one ("nearest shop with a McCulloch M46-125WR") to allow me to give it a "test drive."

Keyword prioritization

This single journey gives us some insight into how the interface could shape our thinking on keyword prioritization and content creation.

Pieces that help a user either make a decision or perform an action around the following trigger words and phrases will attract greater interest and traffic from voice. Examples could include:

  • buy
  • get
  • find
  • top rated
  • closest
  • nearest
  • cheapest
  • best deal

Many are not dissimilar to typed search, but clearly intent priorities change. The aforementioned Microsoft study also looked at how this may work, suggesting the following order of question types and their association with purchase/action:

Local opportunity

This also pushes the requirement for serious location-based marketing investment much higher up the pecking order.

We can clearly see how important such searches become from a "propensity to buy/take action" perspective.

It pays to invest more in ensuring the basics are covered, for which the Moz Local Search Ranking Factors study can be a huge help, but also in putting some weight behind efforts across Bing Places. If you are not yet set up fully over there, this simple guide can help.

Local doesn’t start and end with set up, of course. To maximize visibility there must be an ongoing local marketing plan that covers not just the technical elements of search but also wider marketing actions that will be picked up by voice assistants.

We already know, for instance, that engagement factors are playing a larger part of the algorithmic mix for local, but our understanding of what that really means may be limited.

Engagement is not just a social metric but a real-world one. Google, for instance, knows not just what you search for but where you go (via location tracking and beacon data), what you watch (via YouTube), the things you are interested in, and where you plan to travel (via Flights and Maps data). We need to leverage each of these data points for maximum effect.

As a good example of this in action, we mentioned review importance earlier; here it plays a significant part in the local plan. A proactive review acquisition strategy is really important, so build it into your everyday activity by encouraging visitors to leave reviews. That means actively monitoring all the key review sites, not just your favorite!

Use your email strategy to drive this behavior as well by ensuring that newsletters and offer emails support the overall local plan.

And a local social strategy is also important. Get to know your best customers and most local visitors and turn them into evangelists.

Doing it is easier than you might think; you can use Twitter mention monitoring not only to search for key terms, but also to find mentions within a specific latitude/longitude radius.

Advanced search also allows you to discover tweets by location or mentioning location. This can be helpful as research to discover the local questions being asked.
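For those who prefer the API route, Twitter's search syntax supports a geocode value of the form latitude,longitude,radius for scoping results to an area. A minimal sketch of building such a query; the term and coordinates (roughly Peterborough, UK) are placeholder examples:

```python
# Build a Twitter search query scoped to a lat/long radius using the
# "geocode" operator. Term, coordinates, and radius are placeholders.

def local_mention_query(term, lat, lon, radius="10mi"):
    return f'"{term}" geocode:{lat},{lon},{radius}'

query = local_mention_query("lawn mower", 52.5736, -0.2478)
print(query)
```

Feed the resulting string to your monitoring tool of choice (or the search API) to surface the local questions and mentions being asked around you.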

The awesome team at Zapier covered this topic in lots of detail recently, so for those who want to action this particular point I highly recommend reading this post.

Let’s go deeper

There is new thinking needed if the opportunity is to be maximized. To understand this, we need to go back to our user journey thought process.

For starters, there's the Yelp/Alexa integration. While the initial reaction may be simply to optimize listings for the site, the point is actually a wider one.

Knowing that many of the key vertical search engines (think Skyscanner [travel], Yelp [local], etc.) will spend big to ensure they have the lion’s share of voice market, it will pay to spend time improving your content on these sites.

Which is most important will depend entirely on the niche you work in. Many will offer only limited scope for optimization, but being there and ensuring your profile is as complete as possible will be key. It may even pay to take sponsored opportunities within them for the added visibility they may give you in the future.

There's also the really interesting intellectual challenge of attempting to map out as many potential user journeys as possible to and from your business.

Let's take our lawnmower analogy again, but this time from the perspective of a retailer situated within 20 miles of the searcher. In this scenario, we need to think about how we might be able to get front and center before anyone else if we stock the McCulloch model they are looking for.

If we take it as a given that we’ve covered the essentials, then we need to think more laterally.

It's natural not only to look for a local outlet that stocks the right model, but also to ask when it's open. We might also ask more specific questions, like whether it has parking, whether it's busy at specific times, or whether it offers appointments.

The latter would be a logical step, especially for businesses that work in this way; think dentists, doctors, beauty salons, and even trades. The opportunity to book a plumber at a specific time via voice would be a game changer for those set up to offer it.

Know your locality

As a local business, it is also imperative that you know the surrounding areas well and can show you've thought about them. This includes looking at how people talk about key landmarks from a voice perspective.

We often use slang or shortened versions of landmark naming conventions, for instance. In a natural, conversational setting, you may find that you miss out if you don’t use those idiosyncrasies within the content you produce and feature on your site or within your app.

Fun and entertainment

Then, of course, comes the "fun." Think of it as the games section of the App Store — it makes little logical sense, but in it lies a whole industry of epic proportions.

Voice will give birth to the next era in entertainment. While some of you may be thinking about how to profit from such an active audience, the majority of brands would be smart to see it as an engagement and brand awareness world.

Game makers will clamor to create hit mind games and quizzes, but those that play around the edges may well be the monarchs of this opportunity. Think about how voice could change the dynamic for educators, play the part of an unbiased referee in games, or teach birdsong and the birds each call belongs to. The opportunity is endless — and it will claim 25% of the overall pie, according to current usage research.

The monetization methods are yet to be uncovered, but the advertising opportunity is significant, as is the potential for technologies like blockchain to enable frictionless payments and more.

User journey mapping

So how do you tie all of this together into a seamless plan, given the complexity and number of touch points available? The answer starts and ends with user journey mapping.

This is something I find myself doing more and more now as part of the wider marketing challenge. Fragmented audiences and a plethora of devices and technology mean it's more difficult than ever to build an integrated strategy. Taking a user-centric approach is the only way to make sense of the chaos.

Voice is no different, and the key differentiator here is the fact that in this new world a journey is actually a conversation (or a series of them).

Conversation journey mapping

While the tech may not yet be there to support full conversations in voice, given the point at the beginning of this piece about the Law of Accelerating Returns, it's clear that it's coming — and faster than we realize.

In some respects, the timing of that advancement is irrelevant, however, as the process of working through a series of conversations that a potential client or customer may have around your product or service is invaluable as research for your plan.

To go back to our lawnmower example, a conversation mapping exercise may look a little like this:

[me] What’s the best lawnmower for under £500?
[voice assistant] How large is your lawn?
[me] It’s not very big. I don’t need a ride-on.
[voice assistant] OK so would you prefer a cylinder or rotary version?
[me] I don’t know. How do I choose?
[voice assistant] If you want stripes and your lawn is very flat, a cylinder gives a better finish. If not, a rotary is better.
[me] OK, definitely a rotary then!
[voice assistant] Good choice. In that case, your best options are either the McCulloch M46-125WR or the BMC Lawn Racer.
[me] Which is best?
[voice assistant] According to Trustpilot, the McCulloch has 4.5 stars from 36 reviews versus 3.5 stars for the BMC. The McCulloch is also cheaper. Do you want me to find the best deal or somewhere nearby that stocks it?
[me] I’d like to see it before buying if possible.
[voice assistant] OK, ABC Lawn Products is 12 miles away and has an appointment at 11am. Do you want to book it?
[me] Perfect.
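One lightweight way to capture a conversation-mapping exercise like this is as a decision tree, where each node holds the assistant's question and each user reply selects the next branch. A sketch, with structure and wording as illustrative assumptions:

```python
# Model a mapped conversation as a nested decision tree: each node asks a
# question, each answer key leads to the next node or a recommendation.

conversation = {
    "question": "How large is your lawn?",
    "answers": {
        "small": {
            "question": "Would you prefer a cylinder or rotary version?",
            "answers": {
                "rotary": {"recommend": ["McCulloch M46-125WR", "BMC Lawn Racer"]},
                "cylinder": {"recommend": ["(cylinder models here)"]},
            },
        },
    },
}

def walk(tree, replies):
    """Follow a list of user replies down the tree to a recommendation."""
    node = tree
    for reply in replies:
        node = node["answers"][reply]
    return node.get("recommend")

print(walk(conversation, ["small", "rotary"]))
```

Every branch in such a tree is a prompt for content: each question is a guide topic, and each leaf is a product or local-listing opportunity.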

Where are the content or optimization opportunities?

Look carefully above and you’ll see that there are huge swathes of the conversation that lend themselves to opportunity, either through content creation or some other kind of optimization.

To spell that out, here's a possible list:

  • Guide – Best lawnmower for £500
  • Guide – Rotary versus cylinder lawnmowers
  • Review strategy – Create a plan to collect more reviews
  • Optimization – Evergreen guide optimization strategy to enhance featured snippet opportunities
  • Local search – Optimize business listing to include reviews, opening times, and more
  • Appointments – Open up an online appointment system and optimize for voice

In developing such a roadmap, it's also important to consider the context within which the conversation is happening.

Few of us will ever feel entirely comfortable using voice in a crowded, public setting, for instance. We’re not going to try using voice on a bus or train, or at a festival, anytime soon.

Instead, voice interfaces will be used in private, most likely in places such as homes and cars and places where it's useful to be able to do multiple things at once.

Setting the scene in this way will help as you define your conversation possibilities and the optimization opportunities from it.

Who do we need to create all this?

The one missing piece of the jigsaw as we prepare for the shift to voice? People.

All of the above require a great deal of work to perfect and implement, and while the dust still needs to clear on the specifics of voice marketing, there are certain skill sets that will need to pull together to deliver a cohesive strategy.

For the majority, this will simply mean creating project groups from existing team members. But for those with the biggest opportunities (think recipe sites, large vertical search plays, and so on), it may be that a standalone team is necessary.

Here’s my take on what that team will require:

  • Developer – with specific skills in creating Google Home Actions, Alexa Skills, and so on.
  • Researcher – to work with customer groups to understand how voice is being used and capture further opportunities for development.
  • SEO – to help prioritize content creation and how it's structured and optimized.
  • Writer – to build out the long-tail content and guides necessary.
  • Voice UX expert – a specialist in running conversation-mapping sessions and turning them into brilliant user journeys for the different content and platforms your brand utilizes.

Conclusion

If you’ve read to this point, you at least have an active interest in this fast-moving area of tech. We know from the minds of the most informed experts that voice is developing quickly and that it clearly offers significant benefits to its users.

When those two key things combine with the falling cost of the technology needed to access it, we reach a tipping point that only ends one way: the birth of a new era for computing.

Such a shift has massive implications for both digital and wider marketing, and it will pay to have first-mover advantage.

That means educating upwards and beginning the conversation around how voice interfaces may change your own industry in the future. Once you have that running, who knows where it might lead you?

For some it changes little; for others, everything. The good news for search marketers is that many existing tactics and skill sets will have an even bigger part to play.

Existing skills

  • The ability to claim featured snippets and answer boxes becomes even more rewarding as they supply the answers to millions of voice searches.
  • Keyword research has a wider role in forming strategies to reach into voice and outside traditional search, as marketers become more interested in the natural language their audiences are using.
  • Local SEO wins extend beyond simply appearing in a search engine.
  • Micro-moments become more numerous and even more specific than ever before. Research to uncover these becomes even more pivotal.
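One concrete way to flag answer-ready content to assistants is schema.org's `speakable` structured data, which points crawlers at the sections of a page best suited to being read aloud. A hedged sketch follows: `speakable` is still a pending schema.org property with limited platform support, and the URL, page name, and CSS selectors below are placeholders you would swap for your own.

```json
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "name": "Rotary versus cylinder lawnmowers",
  "url": "https://example.com/guides/rotary-vs-cylinder-lawnmowers",
  "speakable": {
    "@type": "SpeakableSpecification",
    "cssSelector": [".summary", ".key-answer"]
  }
}
```

Embedded in a `<script type="application/ld+json">` tag, this tells an assistant which parts of your evergreen guide make good spoken answers, the same content you'd optimize for featured snippets.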

New opportunities to consider

  • Voice assistants will drive more content consumption as they integrate further into daily life, so think about what other kinds of content you can deliver to capture those moments.
  • Think about Internet of Things integration and how your brand might provide content for those devices or help people use their connected homes.
  • Look at what Skills/Actions you can create to play in the "leisure and entertainment" sector of voice. This may be as much about an engagement/awareness play as pure conversion or sales, but it's going to be a huge market. Think quick games, amazing facts, jokes, and more…
  • Conversation journey mapping is a powerful new skill to be learned and implemented to tie all content together.
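To make the Skills/Actions idea above less abstract, here is a simplified, platform-agnostic sketch of the routing step at the heart of an "amazing facts" skill. Real Alexa Skills use the ASK SDK and real Google Actions use the Actions SDK, both of which hand your code a parsed intent; the intent names and canned responses below are hypothetical stand-ins for illustration only.

```python
import random

# Hypothetical canned responses for an "amazing facts" voice skill.
# A real platform SDK would deliver a parsed intent object; this
# sketch models only the intent-to-response routing step.
FACTS = [
    "Honey never spoils.",
    "Octopuses have three hearts.",
]

def handle_intent(intent_name: str) -> str:
    """Route a recognized intent name to a spoken response string."""
    if intent_name == "GetFactIntent":
        return random.choice(FACTS)
    if intent_name == "AMAZON.HelpIntent":
        return "Ask me for an amazing fact."
    # A fallback keeps the conversation open instead of erroring out.
    return "Sorry, I didn't catch that. Try asking for a fact."

print(handle_intent("AMAZON.HelpIntent"))
```

The design point is that every turn of the conversation, including the failure case, returns something speakable, which is exactly what conversation journey mapping is meant to plan for.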

Here’s to the next 50 years of voice interface progress!
