California Governor Gavin Newsom signed an executive order to end the sale of new gasoline and diesel-powered passenger cars in the state by 2035. He says climate change is the cause of the state's wildfires and that banning new sales of internal combustion engine vehicles will somehow fix what his mismanagement of the state's forests has caused. He did not address the fact that carbon dioxide emissions result from the manufacture of all vehicles, including electric vehicles, or that most electricity in this country (62 percent) is produced from fossil fuels, largely coal and natural gas. He also neglects the fact that most Americans (98 percent) prefer internal combustion engine vehicles, which let them choose the price, size, safety features, range, storage capacity, refueling options, and other attributes they need for their lifestyle. Only 1.6 percent of the nation currently owns an electric vehicle, and many of those owners also have a gasoline or diesel vehicle. For many of its owners, the electric car is a statement of ostensible concern for the environment.
According to Newsom, California residents would still be able to own gasoline or diesel vehicles and sell them on the used market after 2035. But the governor's executive order raises questions about the logistics and equity of the transition from internal combustion engines to electric vehicles. Car-dependent California has tried this transition before, with little luck. A 1990 mandate required that 2 percent of new cars sold in California by 1998 be zero-emission vehicles. At the time, few models complied, and those that did were small and had limited range, so the mandate failed.
Because of the higher cost of electric vehicles and the lack of a vibrant second-hand market for them, Newsom is putting Californians into a financial dilemma: being unable to afford a vehicle that is needed to commute to jobs, take children to after-school activities, shop for groceries and other essentials, and shuttle the old, disabled, and sick to doctors' offices and hospitals. Those who require a sturdy and capable vehicle for their jobs, such as construction workers and tradespeople, will face additional burdens.
What Authority Does Newsom Have for the Ban?
California has special authority under the Clean Air Act of 1970 to regulate tailpipe emissions, owing to its unique air quality problems and geography. The Trump administration revoked this authority, citing the much cleaner cars and fuels of today. A lawsuit challenging the revocation is currently before a federal appeals court in Washington, D.C. Governor Newsom's authority to enforce the new mandate would come from that same law.
President Trump revoked California's waiver and instituted a single federal emissions standard, which would lower auto prices for consumers, create jobs, and make vehicles safer, while still lowering emissions.
Newsom's executive order also states that, "where feasible," medium- and heavy-duty vehicles such as trucks and construction equipment should be zero-emission by 2045. The order also called for agencies to craft "an integrated, statewide rail and transit network," despite Newsom scaling back the state's high-speed rail project a year or so ago. The estimated cost of the initial high-speed rail line has ballooned to $80.3 billion. Newsom also outlined plans to support more bicycle and pedestrian infrastructure, especially in low-income and disadvantaged communities.
Governor Newsom directed agencies to develop a zero-emission vehicle market development strategy by the end of January and to update it every three years. He also asked them to accelerate existing efforts to establish charging ports.
If California wins the court battle, which may end up in the Supreme Court, the California Air Resources Board will be charged with developing the specific regulations needed to implement the state mandate for passenger cars and trucks. According to Air Resources Board Chair Mary Nichols, California wants to phase out hybrid vehicles over the next 15 years and have Californians purchase only fully electric cars.
Industry Reaction
The Alliance for Automotive Innovation, a trade group that represents major U.S. and overseas-based auto manufacturers, indicated that mandates are not the best way to implement change. Successful markets need widespread engagement among stakeholders: governments, automakers, utilities, infrastructure providers, and others. While California has the highest electric vehicle sales share in the nation (6.2 percent), due mainly to state financial incentives and subsidies, it still needs expanded infrastructure, more incentives, fleet requirements, building codes, and much more to accomplish Newsom's directive.
The California New Car Dealers Association expressed concerns over the need to greatly expand public charging infrastructure and to drive down the costs of zero-emission vehicles so they are not available only to the wealthy, as they are today. Also, enacting a major piece of policy through an executive order, without legislative approval, deprives Californians of a direct voice on the issue.
California business groups, from the California Chamber of Commerce to the California Manufacturers & Technology Association, criticized the executive order as unrealistic. According to the California Business Roundtable, the radical step to ban internal combustion engines makes no sense and is a rushed decision, with no guarantee of affordability for many who live in an already-expensive state.
System-Wide Changes Are Needed
A shift from gasoline and diesel to electric vehicles will require a new charging infrastructure. The state's electric grid, which recently experienced two days of rolling blackouts due to a heat wave, will need to be significantly upgraded to handle the new demand. The state's largest utilities have also shut off large sections of their grids on windy days to stop their power lines from sparking wildfires.
The state's electric grid has proven unreliable as the state pursues a law, adopted in 2017, to reduce its greenhouse gas emissions by at least 40 percent by 2030 and 80 percent by 2050 from 1990 levels. Further, California's renewable portfolio standard mandates that 60 percent of its electricity come from renewable energy by 2030. Renewables account for about a third of the state's electricity, and much of that is intermittent solar and wind power, which stresses the system more as its share of total generation increases.
Conclusion
Governor Newsom is taking choice away from California residents by banning the sale of internal combustion vehicles by 2035. As industry executives have indicated, mandates are not the way to implement such a change: a transition of this magnitude requires a public-private partnership to establish the infrastructure and to make electric vehicles affordable to the general public. Further, it will require massive and expensive changes to the grid, which is already heavily stressed by its increasing share of intermittent electricity sources.
Newsom's timing is awkward: the state has barely reopened from the coronavirus lockdown, and California residents are already hurting from the lack of jobs and normalcy while saddled with a grid that cannot handle existing demand for electricity. Yet the Governor wants to put more restrictions on them while increasing demand for electricity.
One of the most difficult decisions to make in any field is to consciously choose to miss a deadline. Over the last several months, a team of some of the brightest engineers, data scientists, project managers, editors, and marketers has worked toward a release date for the new Page Authority (PA) of September 30, 2020. The new model is superior in nearly every way to the current PA, but our last quality control measure revealed an anomaly that we could not ignore.
As a result, we’ve made the tough decision to delay the launch of Page Authority 2.0. So, let me take a moment to retrace our steps as to how we got here, where that leaves us, and how we intend to proceed.
Seeing an old problem with fresh eyes
Historically, Moz has used the same method over and over again to build a Page Authority model (as well as Domain Authority). This model's advantage was its simplicity, but it left much to be desired.
Previous Page Authority models trained against SERPs, trying to predict whether one URL would rank over another, based on a set of link metrics calculated from the Link Explorer backlink index. A key issue with this type of model was that it couldn’t meaningfully address the maximum strength of a particular set of link metrics.
For example, imagine the most powerful URLs on the Internet in terms of links: the homepages of Google, YouTube, Facebook, or the share URLs of followed social network buttons. There are no SERPs that pit these URLs against one another. Instead, these extremely powerful URLs often rank #1 followed by pages with dramatically lower metrics. Imagine if Michael Jordan, Kobe Bryant, and LeBron James each scrimmaged one-on-one against high school players. Each would win every time. But we would have great difficulty extrapolating from those results whether Michael Jordan, Kobe Bryant, or LeBron James would win in one-on-one contests against each other.
When tasked with revisiting Domain Authority, we ultimately chose a model with which we had a great deal of experience: the original SERPs training method (although with a number of tweaks). With Page Authority, we decided to go with a different training method altogether by predicting which page would have more total organic traffic. This model presented several promising qualities, like being able to compare URLs that don't occur on the same SERP, but also presented other difficulties, like a page having high link equity but simply being in an infrequently-searched topic area. We addressed many of these concerns, such as enhancing the training set to account for competitiveness using a non-link metric.
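If you're curious what this kind of pairwise SERP training looks like in practice, here is a minimal sketch on synthetic data. It illustrates the general technique only; it is not Moz's actual code, features, or model:

```python
# Sketch of pairwise SERP training: given link metrics for two URLs that
# appeared on the same SERP, predict which one ranked higher.
# All data and feature names here are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_pairs = 1000

# Pretend link metrics (e.g. logged root domains, total links, spam signal)
url_a = rng.normal(size=(n_pairs, 3))
url_b = rng.normal(size=(n_pairs, 3))

# Synthesize labels from a hidden preference vector for the demo:
# 1 means URL A outranked URL B on their shared SERP.
hidden_w = np.array([1.5, 0.7, -0.9])
noise = rng.normal(scale=0.5, size=n_pairs)
labels = ((url_a - url_b) @ hidden_w + noise > 0).astype(int)

# The classic pairwise trick: train on the difference of feature vectors.
model = LogisticRegression().fit(url_a - url_b, labels)
print("learned weights:", model.coef_)
```

The traffic-based approach described above swaps the pairwise SERP label for a comparison of total organic traffic, which is what lets such a model score URLs that never meet on the same SERP.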
Measuring the quality of the new Page Authority
The results were — and are — very promising.
First, the new model obviously predicted the likelihood that one page would have more valuable organic traffic than another. This was expected, because the new model was directed at this particular goal, while the current Page Authority merely attempted to predict whether one page would rank over another.
Second, we found that the new model predicted whether one page would rank over another better than the previous Page Authority. This was especially pleasing, as it laid to rest many of our concerns that the new model would underperform on old quality controls due to the new training model.
How much better is the new model at predicting SERPs than the current PA? At every interval — all the way down to position 4 vs 5 — the new model tied or outperformed the current model. It never lost.
Everything was looking great. We then started analyzing outliers. I like to call this the "does anything look stupid?" test. Machine learning makes mistakes, just as humans do, but humans tend to make mistakes in a particular manner: when a human makes a mistake, we often understand exactly why it was made. That isn't the case for ML, especially neural nets. So we pulled URLs that had high Page Authorities under the new model but zero organic traffic, and included them in the training set so the model could learn from those errors. We quickly saw bizarre 90+ PAs drop down to much more reasonable 60s and 70s… another win.
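In practice, the first pass of an audit like this can be as simple as filtering a score table for contradictions, along these lines (hypothetical data and column names, not Moz's internal tooling):

```python
# Flag URLs the new model scores highly despite having zero organic
# traffic, so they can be fed back into training as counterexamples.
import pandas as pd

scores = pd.DataFrame({
    "url": ["a.example/", "b.example/x", "c.example/y"],
    "new_pa": [94, 61, 88],
    "organic_traffic": [0, 1200, 0],
})

suspicious = scores[(scores["new_pa"] >= 90) & (scores["organic_traffic"] == 0)]
print(suspicious)
```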
We were down to one last test.
The problem with branded search
Some of the most popular keywords on the web are navigational. People search Google for Facebook, YouTube, and even Google itself. These keywords are searched an astronomical number of times relative to other keywords. Consequently, a handful of highly powerful brands can have an enormous impact on a model that looks at total search volume as part of its core training target.
The last test involves comparing the current Page Authority to the new Page Authority, in order to determine if there are any bizarre outliers (where PA shifted dramatically and without obvious reason). First, let’s look at a simple comparison of the LOG of Linking Root Domains compared to the Page Authority.
Not too shabby. We see a generally positive correlation between Linking Root Domains and Page Authority. But can you spot the oddities? Go ahead and take a minute…
There are two anomalies that stand out in this chart:
There is a curious gap separating the main distribution of URLs and the outliers above and below.
The largest variance for a single score is at PA 99. There are an awful lot of PA 99s with a wide range of Linking Root Domains.
Here is a visualization that will help draw out these anomalies:
The gray spaces between the green and red represent this odd gap between the bulk of the distribution and the outliers. The outliers (in red) tend to clump together, especially above the main distribution. And, of course, we can see the poor distribution at the top of PA 99s.
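If you want to run a similar diagnostic on your own metric exports, a minimal sketch might look like the following (the CSV layout is an assumption, not a Moz export format):

```python
# Scatter log(Linking Root Domains) against PA and flag outliers by their
# distance from a simple linear trend. Assumes a CSV with columns:
# url, pa, linking_root_domains.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("pa_audit.csv")
df["log_lrd"] = np.log1p(df["linking_root_domains"])

slope, intercept = np.polyfit(df["log_lrd"], df["pa"], 1)
residual = df["pa"] - (slope * df["log_lrd"] + intercept)
df["outlier"] = residual.abs() > 2 * residual.std()

main, out = df[~df["outlier"]], df[df["outlier"]]
plt.scatter(main["log_lrd"], main["pa"], s=8, color="green", label="main distribution")
plt.scatter(out["log_lrd"], out["pa"], s=8, color="red", label="outliers")
plt.xlabel("log(1 + Linking Root Domains)")
plt.ylabel("Page Authority")
plt.legend()
plt.show()
```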
Bear in mind that these issues are not sufficient to make the new Page Authority model less accurate than the current model. However, upon further examination, we found that the errors the model did produce were significant enough that they could adversely influence the decisions of our customers. It's better to have a model that is off by a little everywhere (because the adjustments SEOs make are not incredibly fine-tuned) than to have a model that is mostly right everywhere but bizarrely wrong in a limited number of cases.
Luckily, we’re fairly confident as to what the problem is. It seems that homepage PAs are disproportionately inflated, and that the likely culprit is the training set. We can’t be certain this is the cause until we complete retraining, but it is a strong lead.
The good news and the bad news
We are in good shape insofar as we have multiple candidate models that outperform the existing Page Authority. We're at the point of bug squashing, not model building. However, we are not going to ship a new score until we are confident that it will steer our customers in the right direction. We are highly conscious of the decisions our customers make based on our metrics, not just of whether the metrics meet some statistical criteria.
Given all of this, we have decided to delay the launch of Page Authority 2.0. This will give us the necessary time to address these primary concerns and produce a stellar metric. Frustrating? Yes, but also necessary.
As always, we thank you for your patience, and we look forward to producing the best Page Authority metric we have ever released.
“Biden will also transform the energy sources that power the transportation sector, making it easier for mobility to be powered by electricity and clean fuels, including commuter trains, school and transit buses, ferries, and passenger vehicles.”
Many U.S. cities and states have experimented with electric (battery-powered) buses, including Los Angeles and San Francisco, California; Albuquerque, New Mexico; Columbus, Ohio; Virginia; the District of Columbia; and Chicago, Illinois. All found the buses not ready to replace gasoline, diesel, and natural gas-powered ones: the electric buses took too long to charge and did not live up to their mileage specifications, particularly in cold or hot weather, in hilly terrain, or even depending on how the drivers braked. San Francisco worried that the buses might not hold up with a full passenger load, particularly on its hilly terrain. Albuquerque cancelled its contract over safety concerns with the vehicles' batteries and chargers.
Most recently, the Massachusetts Bay Transportation Authority (MBTA) tested battery buses and found them not ready for prime time. The MBTA purchased five battery-powered, 60-foot buses in 2019 and ran them over the past year. The manufacturer promised the buses would run 100 to 120 miles on a single charge, but actual mileage ranged from 60 to 110 miles, with the lower figures coming on colder days. According to the Chief Engineer, "They don't have enough battery power to deliver a full day's service." The buses would run out of power in the afternoon, and recharging the batteries took eight hours.
Further, the MBTA worried that the performance could actually be worse than the testing indicated because the past winter (2019) was so mild. The mileage dropped to 60 miles when the temperature was 20 degrees, but the mileage could drop even more with colder temperatures.
Despite lawmakers and transportation advocates pressing to convert to all-electric buses as quickly as possible, the MBTA found that the technology was not ready for a large procurement.
The vast majority (99 percent) of the world's 425,000 electric buses are in China, where a national mandate promotes electric vehicles. China also subsidizes its electric bus manufacturing nationally, and is shipping electric buses en masse to other countries, including the United States.
As mentioned above, some U.S. cities have bought a few electric buses and run limited pilots to test the concept in their areas. Despite California cities finding problems, the state has mandated that by 2029 all buses purchased by its mass transit agencies be zero-emission so that the state will have a total zero-emission fleet by 2040. There are just 650 electric buses in the United States currently, with over 200 of them in California.
Other Issues
Buying an electric bus is just the start, because an entire electric bus system is needed. Charging stations are expensive—about $50,000 for a standard depot-based one. Longer bus routes would also need on-route charging stations, which could cost two or three times that amount, not including construction costs or the cost of land. Charging infrastructure will cost a major city millions of dollars. An electric bus in the United States today costs around $750,000, compared to a diesel bus that sells for $550,000.
In most urban centers, bus depots are tightly packed to accommodate parking and fueling. The limited space would be an especially acute problem during the transition between diesel and electric buses, because two sets of fueling infrastructure would be required. Since charging a bus can take eight hours, more charging stations and space would be required than for existing natural gas or diesel buses, which refuel relatively quickly.
Companies must also get electricity to their charging stations, which involves grid upgrades, possible rewiring of systems, building new substations, and determining cost-effective rates with utility companies. This is especially expensive in urban areas, given already-congested underground utility systems. One estimate is that it would take 150 megawatt-hours of electricity to keep a 300-bus depot charged throughout the day; a typical American household, by comparison, consumes 7 percent of that amount in a year. Another comparison: a single charge for a fleet of 100 battery electric buses—roughly one-tenth of the MBTA's current fleet size—would take 60 to 80 percent of the energy that AT&T Stadium uses on a Dallas Cowboys game day.
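A quick sanity check on that household figure (assuming the EIA's estimate of roughly 10,500 kWh of annual electricity use for an average U.S. home, a figure not taken from the article itself) shows the comparison holds up:

$$0.07 \times 150\,\text{MWh} = 10.5\,\text{MWh} = 10{,}500\,\text{kWh} \approx \text{one household's annual consumption}$$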
Some transit agencies have run into high demand pricing from their local electric utility. In King County, Washington, for example, the county's electric buses had higher per-mile fuel costs than its diesel fleet due, in part, to high electricity demand charges. The Denver area's transit agency worked out an agreement with Xcel Energy after similar problems with demand charges on one of its routes. Electricity is metered on volume as well as peak demand times, which can make the pricing of charging an electric bus more complicated than refueling a diesel one.
Conclusion
If the United States transitions to an all-electric fleet of buses, the country is likely to become reliant on China, where the majority of manufacturing is taking place. The United States went from being dependent on foreign oil from the Middle East to achieving energy independence under President Donald Trump, but if Joseph Biden is elected, the United States will end up relying on China for its electric buses and the batteries needed to run them. Electric buses have performed much worse than advertised, leading many localities to question the switch.
“No place like home for the holidays.” This will be the refrain for the majority of your customers as we reach 2020’s peak shopping season. I can’t think of another year in which it’s been more important for local businesses to plan and implement a seasonal marketing strategy extra early, to connect up with customers who will be traveling less and seeking ways to celebrate at home.
Recently, it’s become trendy in multiple countries to try to capture the old Danish spirit of hygge, which the OED defines as: A quality of coziness and comfortable conviviality that engenders a feeling of contentment or well-being.
While this sometimes-elusive state of being isn’t something you can buy direct from a store, and while some shoppers are still unfamiliar with hygge by name, many will be trying to create it at home this year. Denmark buys more candles than any other nation, and across Scandinavia, fondness for flowers, warming foods, cozy drinks, and time with loved ones characterizes the work of weaving a gentle web of happiness into even the darkest of winters.
Whatever your business can offer to support local shoppers’ aspirations for a safe, comfortable, happy holiday season at home is commendable at the end of a very challenging 2020. I hope these eight local search marketing tips will help you make good connections that serve your customers — and your business — well into the new year.
1) Survey customers now and provide what they want
Reasonably-priced survey software is worth every penny in 2020. For as little as $20/month, your local business can understand exactly how much your customers’ needs have changed this past year by surveying:
Which products locals are having trouble locating
Which products/services they most want for the holidays
Which method of shopping/delivery would be most convenient for them
Which hours of operation would be most helpful
Which safety measures are must-haves for them to transact with a business
Which payment methods are current top choices
Doubtless, you can think of many questions like these to help you glean the most possible insight into local needs. Poll your customer email/text database and keep your surveys on the short side to avoid abandonment.
Don't have the necessary tools at the ready to poll people? Check out Zapier's roundup of the 10 Best Online Survey Apps in 2020 and craft a concise survey geared to deliver insights into customers' wishes.
2) Put your company’s whole heart into affinity
If I could gift every local business owner with a mantra to carry them through not just the 2020 holiday shopping season, but into 2021, it would be this:
It’s not enough to have customers discover my brand — I need them to like my brand.
Chances are, you can call to mind some brands of which you’re highly aware but would never shop with because they don’t meet your personal or business standards in some way. You’ve discovered these brands, but you don’t like them. In 2020, you may even have silently or overtly boycotted them.
On the opposite side of this scenario are the local brands you love. I can wax poetic about my local independent grocery store, stocking its shelves with sustainable products from local farmers, flying its Black Lives Matter and LGBTQ+ flags with pride from its storefront, and treating every customer like a cherished neighbor.
For many years, our SEO industry has put great effort into and emphasis on the discovery phase of the consumer journey, but my little country-town grocer has gone leaps and bounds beyond this by demonstrating affinity with the things my household cares about. The owners can consider us lifetime loyal customers for the ways they are going above-and-beyond in terms of empathy, diversity, and care for our community.
I vigorously encourage your business to put customer-brand affinity at the heart of its holiday strategy. Brainstorm how you can make meaningful changes that declare your company’s commitment to being part of the work of positive social change.
3) Be as accessible and communicative as possible
Once you’ve accomplished the above two goals, open the lines of communication about what your brand offers and the people-friendly aspects of how you operate across as many of the following as possible:
In my 17 years as a local SEO, I can confidently say that local business listings have never been a greater potential asset than they will be this holiday season. Google My Business listings, in particular, are an interface that can answer almost any customer who-what-where-when-why — if your business is managing these properly, whether manually or via software like Moz Local.
Anywhere a customer might be looking for what you offer, be there with accurate and abundant information about identity, location, hours of operation, policies, culture, and offerings. From setting special hours for each of your locations, to embracing Google Posts to microblog holiday content, to ensuring your website and social profiles are publicizing your USP, make your biggest communications effort ever this year.
With the pandemic necessitating social distancing, make the Internet your workhorse for connecting up with and provisioning your community as much as you can.
4) Embrace local e-commerce and product listings
Digital Commerce 360 has done a good job charting the 30%+ rise in online sales in the first half of 2020, largely resulting from the pandemic. The same publication summarizes the collective 19% leap in traffic to North America's largest retailers. At the local business level, implementing even basic e-commerce functionality in advance of the holiday season could make a major difference, if you can offer the most-desired methods of delivery. These could include:
Buy-online, pick up in-store (BOPIS)
Buy-online, pick up curbside
Buy online for postal delivery
Buy online for direct home delivery by in-house or third-party drivers
Put your products everywhere you can. Don’t forget that this past April, Google surprised everybody by offering free product listings, and that they also recently acquired the Pointy device, which lets you transform scanned barcodes into online inventory pages.
Additionally, in mid-September, Google took their next big product-related step by adding a “nearby” filter to Google Shopping, taking us closer and closer to the search engine becoming a source for real-time local inventory, as I’ve been predicting here in my column for several years.
Implement the public safety protocols that review research from GatherUp shows consumers are demanding, get your inventory onto the web, identify the most convenient ways to get purchases from your storefront into the customer’s hands, and your efforts could pave the way for increased Q4 profits.
5) Reinvent window shopping with QR codes
“How can I do what I want to do?” asked Jennifer Bolin, owner of Clover Toys in Seattle.
What she wanted to do was use her storefront window to sell merchandise to patrons who were no longer able to walk into her store. When a staff member mentioned that you could use a QR code generator like this one to load inventory onto pedestrians’ cell phones, she decided to give it a try.
Just a generation or two ago, many Americans cherished the tradition of going to town or heading downtown to enjoy the lavish holiday window displays crafted by local retailers. The mercantile goal of this form of entertainment was to entice passersby indoors for a shopping spree. It’s time to bring this back in 2020, with the twist of labeling products with QR codes and pairing them with desirable methods of delivery, whether through a drive-up window, curbside, or delivery.
“We’ve even gotten late night sales,” Bolin told me when I spoke with her after my colleague Rob Ousbey pointed out this charming and smart independent retail shop to me.
If your business locations are in good areas for foot traffic, think of how a 24/7 asset like an actionable, goodie-packed window display could boost your sales.
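If you want to prototype this idea, generating the codes themselves is the easy part. Here is a minimal sketch using the open-source Python qrcode package; the product URL and filename are placeholders:

```python
# Generate a printable QR code linking a window-display item to its
# online purchase page. Requires: pip install "qrcode[pil]"
import qrcode

product_url = "https://example-store.com/products/wooden-train-set"  # placeholder
img = qrcode.make(product_url)
img.save("wooden_train_set_qr.png")  # print it and tape it next to the item
```

Pair each printed code with a short call to action in the display (for example, "Scan to buy; curbside pickup until 8pm") so pedestrians know what scanning will get them.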
6) Tie in with DIY, and consider kits
With so many customers housebound, anything your business can do to support activities and deliver supplies for domestic merrymaking is worth considering. Can your business tie in with decorating, baking, cooking, crafting, handmade gift-giving, home entertainment, or related themes? If so, create video tutorials, blog posts, GMB posts, social media tips, or other content to engage a local audience.
One complaint I am encountering frequently is that shoppers are tired of trying to piece together components from the internet for something they want to make or do. Unsurprisingly, many people are longing for the days when they could leisurely browse local businesses in person, taking inspiration from hands-on interaction with merchandise. I think kits could offer a stopgap solution in some cases. If relevant to your business, consider bundling items that could provide everything a household needs to:
Prepare a special holiday meal
Bake treats
Outfit a yard for winter play
Trim a tree or decorate a home
Build a fire
Create a night of fun for children of various age groups
Dress appropriately for warmth and safety, based on region
Create a handmade gift, craft, or garment
Winter prep a home or vehicle
Create a complete home spa/health/beauty experience
Plant a spring garden
Kits could be a welcome all-in-one resource for many shoppers. Determine whether your brand has the components to offer one.
7) Manage reviews meticulously
Free, near-real-time quality control data from your holiday efforts can most easily be found in your review profiles. Use software like Moz Local to keep a running tally of your incoming new reviews, or assign a staff member at each location of your business to monitor your local business profiles daily for any complaints or questions.
If you can quickly solve problems people cite in their reviews, your chances are good of retaining the customer and demonstrating responsiveness to all your profiles’ visitors. You may even find that reviews turn up additional, unmet local needs your formal survey missed. Acting quickly to fulfill these requests could win you additional business in Q4 and beyond.
8) Highly publicize one extra reason to shop local this year
“72% of respondents...are likely or very likely to continue to shop at independent stores, either locally or online, above larger retailers such as Amazon.” — Bazaarvoice
I highly recommend reading the entire survey of 12,000 global respondents by Bazaarvoice, quantifying how substantially shopping behaviors have changed in 2020. It’s very good news for local business owners that so many customers want to keep transacting with nearby independents, but the Amazon dilemma remains.
Above, we discussed the fatigue that can result from trying to cobble together a bunch of different resources to check everything off a shopping list. This can drive people to online “everything stores”, in the same way that department stores, supermarkets, and malls have historically drawn in shoppers with the promise of convenience.
A question every local brand should do their best to ask and answer in the runup to the holidays is: What’s to prevent my community from simply taking their whole holiday shopping list to Amazon, or Walmart, or Target this year?
My completely personal answer to this question is that I want my town’s local business district, with its local flavor and diversity of shops, to still be there after a vaccine is hopefully developed for COVID-19. But that’s just me. Inspiring your customers’ allegiance to keeping your business going might be best supported by publicizing some of the following:
The economic, societal, and mental health benefits proven to stem from the presence of small, local businesses in a community.
Your philanthropic tie-ins, such as donating a percentage of sales to worthy local causes — there are so many ways to contribute this year.
The historic role your business has played in making your community a good place to live, particularly if your brand is an older, well-established one. I hear nostalgia is a strong influencer in 2020, and old images of your community and company through the years could be engaging content.
Any recent improvements you’ve made to ensure fast home delivery, whether by postal mail or via local drivers who can get gifts right to people’s doors.
Uplifting content that simply makes the day a bit brighter for a shopper. We’re all looking for a little extra support these days to keep our spirits bright.
Be intentional about maximizing local publicity of your “extra reason” to shop with you. Your local newspaper is doubtless running a stream of commentary about the economic picture in your city, and if your special efforts are newsworthy, a few mentions could do you a lot of good.
Don’t underestimate just how reliant people have become on the recommendations of friends, family, and online platforms for sourcing even the basics of life these days. In my own circle, everyone is now regularly telling everyone else where to find items from hand sanitizer to decent potatoes. Networking will be happening around gifts, too, so anything you get noticed for could support extensive word-of-mouth information sharing.
I want to close by thanking you for being in or marketing businesses that will help us all celebrate the many upcoming holidays in our own ways. Your efforts are appreciated, and I’m wishing you a peaceful, profitable, and hyggelig finish to 2020.
In February 2009, Ontario, Canada passed its Green Energy Act. The act promised increased integration of wind and solar energy into Ontario's electricity grid, shutting down coal plants and creating 50,000 green jobs in the first three years; letting First Nations communities manage their own electricity supply and distribution (the "decolonization" of energy), empowering Canada's indigenous communities; and reducing costs for poorer citizens through clean and sustainable renewable energy. That last part of the act received an endorsement from Ontario's Low Income Energy Network, a group that campaigns for universal access to affordable energy.
On January 1, 2019, Ontario repealed the act, one month before its 10th anniversary. The 50,000 guaranteed jobs never materialized. The "decolonization" of energy did not work out: one-third of indigenous Ontarians now live in energy poverty, and their electricity bills more than doubled during the life of the act, making their electricity costs among the highest in North America. The act's promises turned out to be false, and its actual results made people's lives worse.
What Went Wrong?
Ontario's contracts with renewable suppliers guaranteed that they "will be paid for each kilowatt hour of electricity generated from the renewable energy project," regardless of whether the electricity is consumed. While that does not make sense, these contracts were an improvement over earlier ones that guaranteed payments on close to 100 percent of the supplier's capacity rather than on electricity actually generated. So if a producer supplied only 33 percent of its capacity in a given year, it would still be paid as if it had produced almost 100 percent. About 97 percent of the applicants to the program were wind or solar projects: intermittent sources that operate only when the wind is blowing or the sun is shining and that require other sources for backup. They are inherently intermittent by design and nature, so their capacity factors over a year are always well below their maximum capacity. Wind and solar providers cannot replace consistently reliable power plants like natural gas, coal, or nuclear; they can only supplement the grid.
The Council for Clean and Reliable Energy found that "in 2015, Ontario's wind farms operated at less than one-third capacity more than half (58 percent) of the time." Regardless, Ontarians paid those contracts as if the wind farms had operated at full capacity all year. Ontario's contracts also guaranteed exorbitant prices for renewable energy, often up to 40 times the cost of conventional power, for 20 years of operation. By 2015, Ontario's auditor general concluded that citizens had paid $37 billion above the market rate for energy. Since these plants will continue operating, ratepayers will pay another $133 billion above market valuations from 2015 to 2032. (One steelmaker has taken the Ontario government to court over these exorbitant energy costs.)
Despite the act's repeal, Ontarians continue to pay exorbitant rates. In April 2020, the market value of all wind-generated electricity in Ontario was $4.3 million, but Ontario paid $184.5 million in wind contracts—almost 43 times the value of the electricity delivered.
Between 2011 and 2015, electricity demand in Ontario declined, and it has continued to decline. Yet, Ontarians were forced to pay higher prices for new electricity capacity that they did not need since their consumption was going down. They had sufficient electricity capacity—just not the right kind.
Biden’s Promises
If elected, U.S. Democratic Party presidential nominee Joe Biden promises to forge a carbon-free power sector by 2035. That means shutting down perfectly good coal and natural gas plants and replacing them with wind and solar power. He claims he will create new jobs in the "clean-energy" arena, much like the promises made in Ontario. Of course, it also means that coal and natural gas jobs will be killed and replaced with jobs that, according to the Bureau of Labor Statistics, pay only about half the previous salary.
It also means that U.S. ratepayers will pay higher prices for their electricity, because perfectly good coal and natural gas capacity will be replaced by wind and solar farms. When Germany went down the path of pursuing renewable energy for its electricity generation, residential rates skyrocketed. Germans now pay about three times as much for their residential electricity, on average, as U.S. residents.
It also means that taxpayers will be subsidizing intermittent renewable technologies to make them economic. Or Biden might go down the path of the half of our states that instituted Renewable Portfolio Standards, only to find they resulted in much higher electricity prices. One of those states, California, has been rewarded with rolling blackouts: high temperatures produced a shortage of generation as solar panels powered down toward sunset, just as families arrived home from work and turned on their air conditioning and other appliances.
Conclusion
Americans need to make sure that they do not fall into the same trap as Ontarians, allowing changes to the electrical grid that will only result in higher energy prices, premature capacity retirements, and a loss of jobs to be replaced by lower paying ones. Americans have abundant energy at very affordable prices. All that could be lost with policies such as those Joe Biden is cheerleading.
Content, content, and more content! That's what SEO is all about nowadays, right? Compared to when I started working in SEO (2014), content is now consistently one of the most popular topics covered at digital marketing conferences, there are far more tools that focus on content analysis and optimization, and overall it seems to dominate most SEO news.
Don’t believe me? Here’s a nice Google Trends graph that may change your mind:
Google Trends screenshot for “content marketing” as a topic, set for worldwide interest.
But why is it that content is now dominating the SEO scene? How vital is content for your SEO strategy, actually? And most importantly: how can you be content with your site’s content? Puns aside, this post aims to help you figure out potential causes of your underperforming content and how to improve it.
Why content is key in SEO in 2020
Content is one of the most important factors in SEO. Just by paying close attention to what Google has been communicating to webmasters in the last few years, it’s clear that they’ve put a strong emphasis on “content” as a decisive ranking factor.
For instance, let’s have a look at this post, from August 2019, which talks about Google’s regular updates and what webmasters should focus on:
“Focus on content: pages that drop after a core update don’t have anything wrong to fix. We suggest focusing on ensuring you’re offering the best content you can. That’s what our algorithms seek to reward.”
The article goes on, listing a series of questions that may help webmasters when self-assessing their own content (I strongly recommend reading the entire post).
That said, content alone cannot and should not be enough for a website to rank well, but it is a pretty great starting point!
Underperforming content: theory first
What is underperforming content?
When I say “underperforming content”, I’m referring to content, either on transactional/commercial pages or editorial ones, that does not perform up to its potential. This could be content that either used to attract a good level of organic traffic and now doesn’t, or content that never did generate any organic traffic despite the efforts you might have put in.
There could be many reasons why your content is not doing well, but the brutal truth is often simple: in most cases, your content is simply not good enough and does not deserve to rank in the top organic positions.
Having said that, here are the most common reasons why your content may be underperforming. They're in no particular order, and I'll highlight the ones that, in my opinion, are most important.
Your content does not match the user intent
Based on my experience, this is a very important thing that even experienced marketers still get wrong. It may be the case that your content is good and relevant to your users, but does not match the intent that Google is showcasing in the SERP for the keywords of focus.
As SEOs, our aim should be to match user intent, which means we first need to understand the what and the who before defining the how. Whose intent we are targeting and what is represented in the SERP will define the strategy we use to get there.
Example: webmasters who hope to rank for a navigational or informational keyword with a transactional page, or vice versa.
Your content isn’t in the ideal format Google is prioritizing
Google may be favoring a certain type of format which your content doesn’t conform to, hence it isn’t receiving the expected visibility.
Example: you hope to rank with a text-heavy blog post for a “how to” keyword where Google is prioritizing video content.
Your content is way too “thin” compared to what is ranking
It isn't necessarily a matter of content length (there is no proven content-length formula out there, trust me) but rather of relevance and comprehensiveness. It may be that your content is simply not as compelling as that of other sites, hence Google prioritizing those over you.
Example: you hope to rank for heavily competitive informational keywords with a 200-word blog post.
Your content isn’t as up-to-date
If your content is very topical, and such a topic heavily depends on information which may change with time, then Google will reward sites that put effort into keeping the content fresh and up-to-date. Apart from search engines themselves, users really care about fresh content — no one wants to read an “SEO guide to improve underperforming content” that was created in 2015!
Example: certain subjects/verticals tend to be more prone to this issue, but generally anything related to regulations/laws/guidelines which tend to change often.
Your content is heavily seasonal or tied to a past event/experience
Self-explanatory: if your content is about something that occurred in the past, generally the interest for that particular subject will gradually decrease over time. There are exceptions, of course (god save the 90s and my fav Netflix show “The Last Dance”), but you get the gist.
Example: topics such as dated events or experiences (Olympics 2016, past editions of Black Friday, and so on) or newsworthy content (2016 US election, Kanye running for president — no wait that is still happening...).
Your tech directives have changed the page’s indexation status
Something may have happened to your page that made it fall out of Google's index. The most common issues are an unexpected no-index tag, a canonical tag pointing elsewhere, incorrect hreflang tags, a page status change, the page being removed with Google Search Console's removal tool, and so on.
Example: after some SEO recommendations, your devs mistakenly put a no-index tag on your page without you realizing.
Your page is a victim of duplication or cannibalization
If you happen to cover the same or similar keyword topic with multiple pages, this may trigger duplication and/or cannibalization, which ultimately will result in a loss of organic visibility.
Example: you launch a new service page alongside your current offerings, but the on-page focus (metadata, content, linking structure) isn’t different or unique enough and it ends up cannibalizing your existing visibility.
Your page has been subject to JavaScript changes that make the content hard to index for Google
Let’s not go into a JavaScript (JS) rabbit hole and keep it simple: if some JS stuff is happening on your page and it’s dynamically changing some on-page SEO elements, this may impact how Google indexes your content.
Example: fictitious case where your site goes through a redesign, heavy JS is now happening on your browser and changing a key part of your content that now Google cannot render easily — that is a problem!
Your page has lost visibility following drastic SERP changes
The SERP has changed extensively in the last few years, which means many features that are present now weren't there before. This may disrupt previous rankings (and hence your previous CTR), or make your pages fall out of Google's precious page one.
Also, don’t forget to consider that the competition might have gotten stronger with time, so that could be another reason why you lose significant visibility.
Example: some verticals have been impacted more than others (jobs, flights, and hotels, for instance), where Google's own snippets and tools now occupy the top of the SERP. If you are as obsessed with SERP changes, and in particular PAA, as I am and want more details, have a read here.
Your content doesn’t have any backlinks
Without going into too much detail on this point — it could be a separate blog post — for very competitive commercial terms, not having any/too few backlinks (and what backlinks represent for your site in Google’s eyes) can hold you back, even if your page content is compelling on its own. This is particularly true for new websites operating in a competitive environment.
Example: for a challenging vertical like fashion, for instance, it is extremely difficult to rank for key head terms without a good amount of quality (and naturally gained) backlinks to support your transactional pages.
How to find the issues affecting your content
We've covered the why above; let's now address the how: how to determine which issues affect your page/content. This part is especially dedicated to a less savvy SEO audience (skip it and go straight to the next section if you're after the how-to recommendations).
I’ll go through a list of checks that can help you detect the issues listed above.
Technical checks
Google Search Console
Use the URL inspection tool to analyze the status of the page: it can help you answer questions such as:
Has my page been crawled? Are we even allowing Google to crawl the page?
Has my page been indexed? Are we even allowing Google to index the page?
The Coverage feature shares Google's information about the crawlability and indexability of the page.
Pay particular attention to the Indexing section, where it mentions the user-declared canonical vs. the Google-selected canonical. If the two differ, it's definitely worth investigating the reason, as this means Google isn't respecting the canonical directives placed on the page — check official resources to learn more about this.
Chrome extensions
I love Chrome extensions and I objectively have way too many on my browser…
Some Chrome extensions can give you lots of info on the indexability status of the page with a simple click, checking things like canonical tags and meta robots tags.
JavaScript checks
I'll keep it simple: JavaScript is key in today's environment because it adds interactivity to a page. In doing so, it may alter key HTML elements that are very important for SEO. You can easily check how a page would look without JS by using WWJD, a convenient tool by Onely.
Realistically speaking, you need only one of the following tools in order to check whether JavaScript might be a problem for your on-page SEO:
All the above tools are very useful for any type of troubleshooting, as they showcase the rendered-DOM resources in real time (as opposed to what the "view source" of a page shows).
Once you’ve run the test, click to see the rendered HTML and try and do the following checks:
Is the core part of my content visible?
Quick way to do so: pick a sentence from your content, press CTRL + F (or use the search function), and check whether it appears in the rendered version of the page. (A scripted variant of this check appears after this list.)
Are internal links visible to Google?
Quick way to do so: pick an internal link's anchor text from the page, press CTRL + F (or use the search function), and check whether it appears in the rendered version of the page.
Can Google access other key elements of the page?
Check for things such as headers (example below with a Brainlabs article), products, pagination, reviews, comments, etc.
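As a complement to these manual checks on the rendered HTML, you can script a quick look at the raw, pre-JavaScript HTML that crawlers receive first. A minimal sketch (the URL and test sentence are placeholders):

```python
# Does a key sentence exist in the raw HTML, before any JavaScript runs?
# If not, the content depends on client-side rendering and deserves a
# closer look with the rendering tools above.
import requests

url = "https://example.com/blog/my-post"           # placeholder
key_sentence = "evergreen content stays relevant"  # placeholder

response = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10)
if key_sentence.lower() in response.text.lower():
    print("Found in raw HTML: likely indexable without JS rendering.")
else:
    print("Missing from raw HTML: verify it appears in the rendered DOM.")
```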
Intent and SERP analysis
By analyzing the SERP for key terms of focus, you’ll be able to identify a series of questions that relate to your content in relation to intent, competition, and relevance. All major SEO tools nowadays provide you with tons of great information about what the SERP looks like for whatever keyword you’re analyzing.
For the sake of our example, let’s use Ahrefs and the sample keyword below is “evergreen content”:
Based on this example, these are a few things I can notice:
This keyword triggers a lot of interesting SERP features (Featured Snippet, Top Stories, People also ask)
The top organic spots are owned by very established and authoritative sources (the Ahrefs blog, HubSpot, WordStream, etc.), which makes this keyword quite difficult to compete for
Here are quick suggestions on what types of checks I recommend:
Understand and classify the keyword of analysis, based on the type of results Google is showing in the SERP: any ads showing, or organic snippets? Are the competing pages mainly transactional or informational?
Check the quality of the sites ranking on page one: indicative metrics that can help you gauge the quality of each domain (such as DA/DR), the number of keywords those pages are visible for, the estimated traffic per page, and so on.
Do a quick crawl of these pages to bulk check the comprehensiveness of their content and metadata, or manually check some if you prefer that way.
By doing most of these checks, you’ll be able to see if your content is underperforming for any of the reasons previously mentioned:
Content not compelling enough compared to what is ranking on page one
Content in the wrong format compared to what Google is prioritizing
Duplication and cannibalization checks
Use your SEO tools of choice to check the following:
Whether, for tracked keywords of interest, two or more ranking URLs have been flip-flopping. That is a clear sign that search engines are confused and cannot "easily decide" which URL to rank for a certain keyword (a scripted version of this check follows the list).
Whether, for tracked keywords of interest, two or more ranking URLs are appearing at the same time (not necessarily on page one of the SERP). That is a clear signal of duplication/cannibalization.
Your SEO visibility by landing page: if different URLs rank for very similar keyword permutations, chances are there is a cannibalization risk.
Last but not least: do a simple site: search for keywords of interest to get an initial idea of how many pages covering a certain topic have been indexed by Google. This is an insightful preliminary exercise and also useful for validating your worries.
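As an illustration of the flip-flop check, here is a minimal sketch that assumes you can export rank-tracking data as a CSV with date, keyword, and ranking URL columns (an assumed layout, not any specific tool's format):

```python
# Find keywords where more than one URL has held the ranking over time,
# a common symptom of duplication or cannibalization.
import pandas as pd

ranks = pd.read_csv("rank_tracking.csv")  # columns: date, keyword, ranking_url

urls_per_keyword = ranks.groupby("keyword")["ranking_url"].nunique()
flip_floppers = urls_per_keyword[urls_per_keyword > 1].sort_values(ascending=False)

print("Keywords with multiple ranking URLs over time:")
print(flip_floppers.head(20))
```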
How to fix underperforming content
We’ve covered the most common cases of underperforming content and how to detect such issues — now let’s talk about ways to fix them.
Below is a list of suggested actions to take when improving your underperforming content, with some very valuable links to other resources (mostly from Moz or Google) that can help you expand on individual concepts.
Make sure your page can be crawled and indexed “properly”
Ensure that your page does not fall under any path of blocked resources in Robots.txt
Ensure your page does not have a no-index meta robots tag or a canonical tag pointing elsewhere (a self-referencing canonical tag is something you may want to consider, but it's not compulsory at all). See the scripted first pass after this list.
Check whether other pages have a canonical tag pointing to your URL of focus. Irrelevant or poorly-done canonical tags tend to get ignored by Google — you can check if that is the case in the URL Inspection tool.
Ensure your site (not just your page) is free from any non-SEO friendly JavaScript that can alter key on-page elements (such as headers, body content, internal links, etc.).
Ensure your page is linked internally on the site and present in your XML sitemap.
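A scripted first pass over the robots.txt, meta robots, and canonical checks might look like this sketch (the URL is a placeholder, and it only inspects the served HTML; it cannot tell you what Google actually indexed, which is what the URL Inspection tool is for):

```python
# First-pass crawlability/indexability audit for a single URL.
# Requires: pip install requests beautifulsoup4
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

import requests
from bs4 import BeautifulSoup

url = "https://example.com/services/widget-repair"  # placeholder

# 1) Is the path blocked by robots.txt?
root = "{0.scheme}://{0.netloc}".format(urlparse(url))
robots = RobotFileParser(root + "/robots.txt")
robots.read()
print("Crawlable per robots.txt:", robots.can_fetch("*", url))

# 2) Meta robots and canonical tags in the served HTML.
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

meta_robots = soup.find("meta", attrs={"name": "robots"})
print("Meta robots:", meta_robots.get("content") if meta_robots else "none (indexable by default)")

canonical = soup.find("link", rel="canonical")
print("Canonical:", canonical.get("href") if canonical else "none declared")
```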
Audit the SERP to match user intent
Put simply, you should always research what the SERP looks like for the topic of interest: by analyzing the SERP and all its features (organic and non-organic), you can get a much better understanding of what search engines are looking for in order to match intent.
By auditing the SERP, you should be able to answer the following questions:
What type of content is Google favoring here: transactional, navigational, informational?
How competitive are the keywords of focus and how authoritative are those competitors ranking highly for them?
What content format is Google showcasing in the SERP?
How comprehensive should my content be to get a chance to rank in page one?
What keywords are used in the competitor’s metadata?
What organic features should I consider addressing with my content (things like featured snippets, people also ask, top images, etc.)?
Hopefully, all the questions above will also give you a realistic view of your chances of ranking on Google's first page. Don't be afraid to switch your focus to PPC for some very competitive keywords where your realistic chance of ranking organically is slim.
Map your pages against the right keywords
This is a necessary step to make sure you have a clear understanding of not only what keywords you want to rank for, but also what keywords you are eligible to rank for.
Don't overdo it, and be realistic about your ranking possibilities: mapping your page against several keyword variations, all of which show very different SERPs and intents, is not realistic.
My suggestion is to pick two or three primary keyword variations and focus on getting your content as relevant as possible to those terms.
Write great metadata
Title tags are still an incredibly important on-page ranking factor, so dedicate the right time when writing unique and keyword-rich titles.
Meta descriptions are not a ranking factor anymore, but they still play a part in enticing the user to click on a search result. So from a CTR perspective, they still matter.
SEO keyword research is the obvious starting point for writing compelling metadata, but don't forget about PPC ad copy: check which ad copy performs best for your site and take learnings from it.
Don't change metadata too often, though: do your homework first, and once new metadata is implemented, give it enough time to be properly tested.
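If you manage many pages, a short script can also flag metadata that is missing or far beyond typical display lengths (titles are generally truncated somewhere around 50 to 60 characters). A minimal sketch, again assuming requests and BeautifulSoup and hypothetical URLs:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical list of URLs to audit
urls = ["https://www.example.com/page-1", "https://www.example.com/page-2"]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = desc_tag.get("content", "") if desc_tag else ""
    # A length of zero means a missing element; very long titles risk SERP truncation
    print(f"{url}: title {len(title)} chars, description {len(desc)} chars")
```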
Make the right content amends
Based on the intent audit and keyword mapping insights, you’re now ready to work on your actual page content.
By now, you’ve done your homework, so you just need to focus on writing great content for the user (and not for Google).
Readability is a very important part of a page. Here are some tricks I've learned from colleagues over the years:
Read the content out loud and try to objectively assess how interesting it is for your target audience.
Make sure to use enough spacing between lines and paragraphs. People’s attention span these days is very short, and chances are people will skim through your content rather than dedicating 100% of their attention to it (I’m sure some of YOU readers are doing it right now!).
Make sure your tone of voice and language match your target audience (if you can write things in plain English vs. highly technical jargon, do so and don’t over-complicate your life).
Make sure you’ve thought about all internal linking possibilities across the site. Not only for the same type of page (transactional page to transactional page, for instance) but also across different types (transactional page to video/blog post, if that helps people make a decision, for example).
Optional step: once everything is ready, request indexing of your page in Google Search Console with the URL inspection tool.
Final thoughts
Underperforming content is a very common issue and should not take you by surprise, especially considering that content is among the most important (if not the most important) ranking factors in 2020. With the right tools and process in place, solving this issue is something everyone can learn: SEO is not black magic, and the answer tends to be logical.
First, understand the cause(s) of your underperforming content. Once you're certain you're compliant with Google's technical guidelines, move on to determining what intent you're trying to satisfy. Your research on intent should be comprehensive: it is what will decide the changes you need to make to your content. At that point, you'll be ready to make the SEO and content changes that best match your findings.
I hope this article is useful! Feel free to chat about any questions you may have in the comments or via Twitter or LinkedIn.
On May 21, the Ohio Power Siting Board recommended conditional approval of the $126 million, six-turbine Icebreaker wind project proposed in Lake Erie, but attached 34 conditions that were expected to doom the project. The major restriction prohibited the wind farm from operating at night between March 1 and November 1, a requirement intended to limit the risk to birds and bats. That requirement made the wind farm financially infeasible, particularly because the best wind resources generally occur at night. The ruling was appealed, and, facing political pressure from dozens of northeastern Ohio elected officials, the board reversed course and eliminated the condition. The developer, however, must still produce a radar monitoring program for birds and bats, and an avian and bat impact mitigation plan, before beginning construction, neither of which is required for land-based wind projects.
If built, the 20.7-megawatt Icebreaker wind farm would be the nation's first freshwater offshore wind farm, opening the Great Lakes to industrial wind development. Under the plan, Cleveland Public Power (CPP) would purchase two-thirds of Icebreaker's output. CPP is owned by the city of Cleveland and already has rates averaging 13 percent higher than those of the local private utility. As of last fall, the developer was looking for large energy users to purchase the remaining output.
CPP ratepayers and their representatives need to consider the cost of offshore wind power, one of the most expensive generating technologies being considered in the United States; only one small offshore wind farm actually operates today, off Block Island, Rhode Island. According to the Energy Information Administration, the levelized cost of offshore wind is more than three times that of natural gas combined cycle or onshore wind technologies. In the case of Block Island, the wind farm was cost competitive only against the inefficient diesel generators the island had previously used to generate electricity, fueled by diesel from tankers ferried across 18 miles of water.
In 2016, the U.S. Department of Energy awarded $40 million to the Lake Erie Energy Development Corporation to build a six-turbine pilot wind farm in Lake Erie by the end of 2018. The funds were to be delivered in three $13.3 million grants, provided the company continued to meet engineering, permitting, and construction goals set by the Department of Energy. The company had previously received three DOE grants totaling $10.7 million, so the $40 million award would bring the federal share to over $50 million.
This is yet another instance of taxpayer funds, committed under the Obama/Biden administration, going to a controversial renewable energy project, and it supports the notion that the government should not pick winners and losers in energy markets. Candidate Biden has more projects of this type planned in his environment and "clean energy" plans.
New Yorkers Need to Be Aware
New York State is considering allowing massive industrial wind turbines to be installed within a few miles of the shorelines of Lake Erie and Lake Ontario. Two New York State documents released in June discuss industrial wind turbines in Lakes Erie and Ontario, which would be needed to meet Governor Cuomo's goal of 70 percent renewable energy by 2030. The papers, however, present several technical issues with installing wind turbines in the Great Lakes.
Offshore industrial wind turbines need to be massive in order to be cost competitive because they are extremely expensive to install. Both Lake Erie and Lake Ontario are less than 60 miles wide, so the turbines would necessarily sit closer to the shoreline than ocean-based turbines do. Lakes Erie and Ontario are the smallest of the five Great Lakes and already the most stressed, owing to industrial runoff and other uses along their shores. They are also heavily used for recreation of all types.
Because of limits on the size of commercial ships that can safely navigate the locks and waterways in and leading to the Great Lakes, only turbines smaller than 4 megawatts could be transported and installed unless "development of a new or adapted fleet of construction vessels" is achieved. A four-megawatt cap on turbine size may make development in the Great Lakes economically unfeasible, as larger turbines are typically more efficient than smaller ones.
Another issue is ice. While floating foundations are being developed for ocean-based turbines, freshwater ice presents a problem because of the lateral forces it imparts and the freezing of the substructure.
According to the White Paper, these projects would interconnect in the region of the state with the greatest proportion of renewable energy development relative to native load. According to New York's grid operator, new renewable energy will displace older renewable projects upstate unless transmission upgrades allow the power to be moved downstate. Upstate New York already gets 88 percent of its electricity from zero-emission sources, including nuclear and hydropower, so the new capacity needs to reach New York City and Long Island, where 70 percent of electricity is generated from fossil fuels.
Other issues of concern include recreational boating, fishing, tourism, commercial shipping, and wildlife, especially seasonal bird and bat migration, the very issue behind the Ohio Power Siting Board's original restriction on night operation.
Conclusion
The Ohio Power Siting Board caved to political pressure on its proposed restriction on night operation of the Lake Erie wind project during migratory season, and ratepayers will incur very high electricity rates if the Icebreaker wind farm becomes operational. Clearly, more cost-effective options are available to the city. New Yorkers likewise need to carefully evaluate the construction of wind farms in Lake Erie and Lake Ontario to ensure that the stunning panoramic views, ecology, and economies of the Great Lakes are not put at risk.
And Americans need to assess Joe Biden's energy policies and recognize that more taxpayer funds will be spent on expensive renewable energy projects as he implements his climate and "clean energy" plans, which go well beyond the Obama/Biden renewable projects of a decade ago.
Machine learning — a branch of artificial intelligence that studies the automatic improvement of computer algorithms — might seem far outside the scope of your SEO work. MozCon speaker (and all-around SEO genius) Britney Muller is here with a special edition of Whiteboard Friday to tell you why that's not true, and to go through a few steps to get you started.
Video Transcription
Hey, Moz fans. Welcome to this special edition of Whiteboard Friday. Today we are taking a sneak peek at what I spoke about at MozCon 2020, where I made machine learning accessible to SEOs everywhere.
This is so, so exciting because it is readily at your fingertips today, and I'm going to show you exactly how to get started.
So to kick things off: I learned about this weird concept called brood parasites this summer, and it's fascinating. It's basically where one animal tricks another animal, often of a different species, into raising its young.
It's fascinating, and the more I learned about it, the more I realized: oh my gosh, I'm sort of like a brood parasite when it comes to programming and machine learning! I latch on and find these great models that do all the work — all of the raising — and I put in my data and my ideas, and it does things for me.
So we are going to use this concept to our advantage. In fact, I have been able to teach my dad most of these models that, again, are readily available to you today within a tool called Colab. Let me just walk you through what that looks like.
Models to get you started
So to get started, if you want to start warming up right now, just start practicing pressing "Shift" and then "Enter" to run a notebook cell.
Just start practicing that right now. It's half the battle. You're about to be firing up some really cool models.
All right. What are some examples of that? What does that look like? So some of the models you can play with today are things like DeOldify, which is where you repair and colorize old photos. It's really, really fun.
Another one is a text generator. I created one with GPT-2: a super silly excuse generator. You can manipulate it and make it do different things for you.
There's also a really, really great forecasting model, where you basically put in a chunk of time series data and it predicts what the future might have in store. It's really, really powerful and fun.
You can summarize text, which is really valuable. Think about meta descriptions, all that good stuff.
You can also automate keyword research grouping, which I'll show you here in a second.
You can do really powerful internal link analysis, set up a notebook for that.
Perhaps one of the most powerful things is that you can extract entities and categories as Google perceives them, through Google's Natural Language API, one of my favorite APIs. I pull it into a notebook, you put in the URLs (or text) you want to extract this information from, and you can compare how your URL stacks up against competitors. Here's a rough idea of what that call looks like:
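This is a minimal sketch, assuming the google-cloud-language client library is installed and Google Cloud credentials are configured in the notebook; the sample text is a hypothetical stand-in for your page copy:

```python
from google.cloud import language_v1

# Assumes Google Cloud credentials are already configured in the environment
client = language_v1.LanguageServiceClient()

text = "Moz builds SEO software and educational resources."  # hypothetical sample
document = language_v1.Document(
    content=text, type_=language_v1.Document.Type.PLAIN_TEXT
)

# Entities as Google perceives them, with salience (importance) scores
response = client.analyze_entities(request={"document": document})
for entity in response.entities:
    entity_type = language_v1.Entity.Type(entity.type_).name
    print(entity.name, entity_type, round(entity.salience, 3))
```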
It's really, really valuable, fun stuff. So most importantly, you cannot break any of this. Do not be intimidated by any of the code whatsoever. Lots of seasoned developers don't know what's happening in some of those code blocks. It's okay.
Using Colab
We get to play in this environment. It's hosted in Google Drive, and so there's no fear of this breaking anything on your computer or with your data or anything. So just get ready to dive in with me. Please, it's going to be so much fun. Okay, so like I said, this is through a free tool called Colab. So you know how Google basically took Excel and made Google Sheets?
They did the same thing with what's known as Jupyter notebooks, one of the most popular notebook environments, which traditionally ran locally on your computer. Local notebooks require some setup, though, and can be somewhat clunky, getting confused between different versions and so on. Google put all of that into the cloud and calls it Colab. It's unbelievably powerful.
So, again, it's free. It's available to you right now if you want to open it up in a new tab. There is zero setup. Google also gives you access to free GPU and TPU computing, which is great. It has a 12-hour runtime.
One con is that you can hit usage limits. I hit the limits, and now I'm paying $9.99 a month for the Pro version, and I've had no problems.
Again, I'm not affiliated with Colab whatsoever; I'm just super passionate about it, and the fact that they offer a free version is so exciting. I've already seen a lot of people get started with it. It's also worth noting that it's probably not as secure or robust as Google's enterprise solutions, so if you're doing this for a large company or getting really serious about it, you should probably check out other options. But if you're just dabbling and want to explore and have fun, let's keep this party going.
Using pandas
All right. So again, this is basically a cloud hosted notebook environment. So one thing that I want to really focus on here, because I think it's the most valuable for SEOs, is this library known as "pandas".
Pandas is a data frame library: you basically run one or two lines of code, then choose a file from your local computer (I usually just upload CSVs). This silly example is one that I really did run with Google Search Console data.
So you run this in a notebook. Again, I'm sharing this entire notebook with you today. So if you just go to it and you do this, it brings you through the cells. It's not as intimidating as it looks. So if you just click into that first cell, even if it's just that text cell, "Shift + Enter", it will bring you through the notebook.
So once you fire up this chunk of code right here, upload your CSV. Then, once it's uploaded, you're going to name your data frame.
These are the only two cells you really need to change or do anything with.
So we are uploading your file, and then we are grabbing that file name. In this case, mine was just "gsc-example.csv". Again, once you upload it, you will see the name in that output here. So you just put that within this code block, run this, and then you can do some really easy lines of code to check to make sure that your data is in there.
So one of the first ones that most people do is "df". This is your data frame that you named with your file right here. So you just do "df.head()". This shows you the first five rows of your data frame. You can also do "df.tail()", and it shows you the last five rows of your data frame.
You can even put a number in there to modify how many rows you want to explore: maybe you do "df.head(30)" and see the first 30 rows. It's that easy to get your data in and take a look at it.
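Pulled together, the loading step looks roughly like this. It's a sketch, assuming you're working in Colab and your file is named "gsc-example.csv" as in the example above:

```python
import pandas as pd
from google.colab import files  # Colab-only upload helper

uploaded = files.upload()            # opens a file picker in the notebook
df = pd.read_csv("gsc-example.csv")  # name your data frame "df"

df.head()    # first five rows
df.tail()    # last five rows
df.head(30)  # first 30 rows
```

Now comes the really fun stuff, and this is just the tip of the iceberg.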
So you can run this really, really cool code cell here to create a filterable table. What's powerful about this, especially with your Google Search Console data, is you can easily extract and explore keywords that have high click-through rate and a low ranking in search. It's one of my favorite ways to explore keyword opportunities for clients, and it couldn't be easier.
So check that out. This is kind of the money part right here.
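One way to get that filterable view is Colab's built-in data_table module. This is a sketch, assuming your Search Console export's "CTR" and "Position" columns have already been converted to plain numbers (the notebook may use its own cell for this):

```python
from google.colab import data_table

# Render data frames as interactive, sortable, filterable tables in Colab
data_table.enable_dataframe_formatter()

# Example filter: queries that earn clicks despite a weak average position,
# i.e. likely keyword opportunities (the thresholds here are arbitrary)
opportunities = df[(df["CTR"] > 0.05) & (df["Position"] > 10)]
opportunities.sort_values("Position")
```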
If you're doing keyword research, which can take a lot, right, you're trying to bucket keywords, you're trying to organize topics and all that good stuff, you can instantly create a new column with pandas with branded keyword terms.
So just to walk you through this: we're creating "df["Branded"]", the name of the new column. We then check whether the "Query" string contains, via a little regex, any of the terms ("moz|rand|ose"). Any keyword that contains one of those words gets a "True" in the "Branded" column. In code, that looks roughly like this:
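(A sketch, assuming your Search Console query column is named "Query".)

```python
# Flag branded queries with a regex; case-insensitive, and NaN-safe via na=False
df["Branded"] = df["Query"].str.contains(r"moz|rand|ose", case=False, na=False)

branded = df[df["Branded"]]        # branded bucket
non_branded = df[~df["Branded"]]   # everything else

# Export a bucket to CSV if you want to work with it elsewhere
branded.to_csv("branded-keywords.csv", index=False)
```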
So now that makes filtering and exploring that so much faster. You can even do this in ways where you can create an entirely different data frame table. So sometimes if you have lots and lots of data, you can use the other cell in that example. All of these examples will be in the notebook.
You can use that and export your keywords into buckets like that, and there's no stall time. Things don't freeze up like Excel. You can account for misspellings and all sorts of good stuff so, so easily with regular expressions. So super, super cool.
Conclusion
Again, this is just the tip of the iceberg, my friends. I am most excited to plant this seed with all of you so that you can come back and teach me what you've been able to accomplish. I think we have so much more to explore in this space. It is going to be so much fun. If you get a kick out of this and you want to continue exploring different models and programs within Colab, I highly suggest you download the Colab Chrome extension.
It just makes opening up the notebook so much easier. You can save a copy to your drive and play with it all you want. It's so much fun. I hope this kind of sparked some inspiration in some of you, and I am so excited to hear what all of you think and create. I really appreciate you watching.
So thank you so much. I will see you all next time. Bye.