Google destroys AI-generated content rankings

Welcome to The Cusp: cutting-edge AI news (and its implications) explained in simple English.

In this week's issue:

  • The results are in: Google officially penalizes AI-generated content in search. But by how much, and is AI still profitable to use?
  • New AI developments set to decimate the interior design industry in just a few years.
  • Are you in 3D? Companies like Quixel are in for a major shock thanks to a new model.

Let's dive in.

1. Google Penalizes AI-Generated Content By At Least 20%, Probably More

Large language models like GPT-3 made headlines last year for their ability to generate eerily human-like text.

Affiliate marketers and content writers (myself included) were quick to try and game the system.

But Google fought back.

Today, Neil Patel, a popular internet marketing blogger, shared the results of the first real test of Google's new stance on AI-generated writing. And they're definitive: generating content with AI destroys your search traffic.

Neil took 50 websites and split them into two groups. One group ran purely AI-generated content; for the other, humans wrote the content from scratch, edited AI drafts, or used a mix of both approaches.

Then he waited for Google to index them. The results? Purely AI-generated content was hit with a 20% penalty right off the bat, and that's just the average. In some cases, the penalty was as high as 60%.

Many of you will be thinking: 20%... that's it? Why is this a big deal?

Because there are massive, asymmetrical gains the higher up you are on Google search. Traffic increases exponentially the closer you are to the top.

If you're in first place, you get double what you'd get in second. If you're in second, you get almost six times more clicks than if you're in eighth.

So getting dinged just 20% for using AI is almost always enough to shift you down at least a couple of places, which means you're losing 75% or more of your traffic.
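To make that concrete, here's a quick back-of-the-envelope calculation. The click-through rates below are assumptions for illustration only (real CTR curves vary by study), but every study agrees on the shape: steeply top-heavy.

```python
# Back-of-the-envelope: what a ranking slide costs in traffic.
# These click-through rates are ASSUMED for illustration; real CTR
# studies differ on the exact values, but not on the top-heavy shape.
ctr_by_position = {1: 0.32, 2: 0.16, 3: 0.09, 4: 0.06, 5: 0.05}

def traffic_loss(from_pos: int, to_pos: int) -> float:
    """Fraction of clicks lost when sliding from one position to another."""
    return 1 - ctr_by_position[to_pos] / ctr_by_position[from_pos]

for to_pos in (2, 3, 4):
    print(f"#1 -> #{to_pos}: {traffic_loss(1, to_pos):.0%} of clicks gone")
# Under these assumed CTRs, a two- or three-spot slide costs roughly
# 72-81% of your clicks -- the same ballpark as the 75%+ figure above.
```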

There is a silver lining, though: content that was generated by AI but then edited by humans dropped by an average of only 6%.

Since AI writing routinely provides speedups of 4x-5x, this is probably still a very worthwhile investment—especially for low-competition keywords.

How can we take advantage of this?

Run a website and want to rank better?

  • Don't try to automatically generate content—that's a race to the bottom.
  • Instead, augment your current, capable writers with AI like GPT-3 or Jurassic-1 (see the sketch below).
  • Make sure they edit the result for clarity, style, and tone of voice.

Dollar for dollar, you'll increase output by ~400% with a loss in rank potential of just 6%. If you're targeting longer-tail keywords without much competition, that's more than enough to offset the decrease plus give you a big leg up in terms of volume.
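If you want to wire that workflow into your content process, here's a minimal sketch of the "AI drafts, human edits" loop using OpenAI's GPT-3 completion API. The model name, prompt, and settings are illustrative placeholders, not a recommendation:

```python
# Minimal sketch: GPT-3 writes a rough first draft, a human editor finishes it.
# Model name, prompt, and parameters are placeholders -- adapt to your stack.
import openai

openai.api_key = "YOUR_API_KEY"  # in practice, load this from an env variable

def draft_section(keyword: str, outline_point: str) -> str:
    """Ask GPT-3 for a first draft that a human writer will edit afterwards."""
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=(
            f"Write a 200-word draft section for an article targeting the "
            f"keyword '{keyword}'. Cover this point: {outline_point}. "
            f"Plain, factual tone; a human editor will polish it."
        ),
        max_tokens=400,
        temperature=0.7,
    )
    return response.choices[0].text.strip()

draft = draft_section("best standing desks", "how desk height affects posture")
print(draft)  # hand off to a writer for editing, fact-checking, and tone
```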

2. 3D Interior Designers Will Soon Be Obsolete, Thanks to AI

NeRF (neural radiance fields) just got a big upgrade.

[Embedded video: a NeRF fly-through of an interior scene built from a handful of photos]

The above clip was built from just a handful of interior photographs. NeRF seamlessly wove those photos together into a single 3D scene, and, with the help of style transfer, it lets you instantly change the colors, mood, and furniture.

The library in question (NeRF Studio) is still in development, but it's already clear that it has the potential to disrupt the $4.1 billion interior design software industry.

And it's not just because NeRF is free: it's also orders of magnitude faster than traditional methods of design.

In the past, you had to take dozens of photos of a space, create a detailed floor plan, and then pay someone hundreds (or thousands) of dollars for a mockup.

More recently, solutions like Matterport have gained popularity; they use 3D scanning to quickly create a model of a space. But Matterport quality is notoriously crappy, and the imaging process is slow and non-intuitive.

NeRF, on the other hand, takes just a few minutes.

It won't be long before everybody in the 3D interior design industry makes the switch, and when they do, the market is in for a massive upset.

Instant render services

Alongside NeRF-based 3D capture, instant render services are another AI-driven approach that's quickly gaining popularity.

Interior AI is a free design mockup app that takes images of your room and then uses Stable Diffusion to superimpose new imagery, complete with furniture and other objects, to show you how your space could look.

Interior AI's output, while not incredible, already serves a market niche. And given how fast image generation AI is improving (we went from blurry, 16x16 cows to completely realistic human faces in just six years), similar services will soon outperform human designers on every level.
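Interior AI hasn't published its exact pipeline, but you can approximate the idea with the open-source diffusers library and Stable Diffusion's image-to-image mode. Treat this as a rough sketch; the model ID, prompt, and settings are assumptions:

```python
# Rough approximation of the "restyle my room" idea with Stable Diffusion
# image-to-image (Interior AI's real pipeline isn't public; this is generic).
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# A single photo of the existing room, resized to something SD-friendly.
room = Image.open("my_living_room.jpg").convert("RGB").resize((768, 512))

restyled = pipe(
    prompt=(
        "scandinavian style living room, light oak furniture, "
        "soft natural lighting, interior design photo"
    ),
    image=room,
    strength=0.6,        # how far the result may drift from the original photo
    guidance_scale=7.5,  # how strongly to follow the prompt
).images[0]

restyled.save("restyled_living_room.png")
```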

How can we take advantage of this?

In design? There's a ton of quick money on the table.

  • Learn how NeRF works, and begin implementing it into your workflow (see the sketch after this list)
  • In addition to the improved output, create information products (course, ebook, etc.) teaching other designers to do the same
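For that first bullet, here's a minimal sketch of the photos-to-NeRF step, driving nerfstudio's command-line tools from Python. The commands follow nerfstudio's documented workflow (ns-process-data, ns-train); check the docs for current flags, since the project is moving fast:

```python
# Minimal photos-to-NeRF sketch using nerfstudio's CLI from Python.
# Commands follow nerfstudio's documented workflow; verify flags against
# the current docs before relying on this.
import subprocess

photos_dir = "interior_photos"     # a handful of phone photos of the room
processed_dir = "processed_scene"  # camera poses + resized images land here

# 1. Estimate camera poses from the raw photos (runs COLMAP under the hood).
subprocess.run(
    ["ns-process-data", "images", "--data", photos_dir,
     "--output-dir", processed_dir],
    check=True,
)

# 2. Train the default 'nerfacto' model on the processed scene. On a recent
#    GPU this typically takes minutes, which is the whole point.
subprocess.run(["ns-train", "nerfacto", "--data", processed_dir], check=True)
```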

Alternatively: in architecture or real estate? You're about to get a lot more efficient.

  • Use NeRF to quickly create 3D models of properties
  • Automate the ideation process with instant render AI
  • Generate models of what a space could look like, then send to a designer for touchups

You don't have to be on the receiving end of major disruptions to your industry. Plan ahead, make some changes to your workflow, and profit.

3. AI Texture Generation Will Soon Give Companies Like Quixel A Run For Their Money

Speaking of 3D: AI texture generation has made massive leaps in the past few years, and it's about to give companies like Quixel a run for their money.

For those not in the know, Quixel creates 3D textures that are used in video games, movies, and other CGI.

But these textures take a ton of time and money to create: photographers have to travel to remote locations and photograph real-world surfaces from many angles to build a comprehensive texture library.

Now? AI can do all of that work in a fraction of the time, and for a fraction of the cost.

[Embedded video: demo of Runway.ml's AI texture generation feature]

Runway.ml recently released a cutting-edge texture generation feature that helps designers, artists, and 3D professionals make textures in a tenth of the time. And while it's not quite on par with Quixel's Megascans library yet, it's getting close, and fast.

Plus (and this is very important) it can generate an infinite variety of textures that don't exist in the real world.

What does this mean?

The 3D mapping and modelling industry is worth approximately $13B, according to Mordor Intelligence.

GitHub: ashawkey/stable-dreamfusion, a PyTorch implementation of text-to-3D DreamFusion powered by Stable Diffusion.

But in conjunction with Stable Diffusion-powered text-to-3D models like DreamFusion (above), texture generators will almost certainly put vast swathes of this industry out of work in the very near future. An entire industry is poised to disappear virtually overnight.

How can we take advantage of this?

Luckily, if you're a forward-thinker, there's plenty of time to prepare and make a splash.

  • First, learn how to use Stable Diffusion to generate your own textures, or use a pre-made solution like Runway.ml (see the sketch after this list)
  • Create a pipeline that generates and upscales high-quality 3D textures
  • Before the market dries up, sell as many as you can on marketplaces like CGTrader. When sales start slowing down (and they will), pitch the pipeline itself for extra runway
  • Create information products that teach CGI professionals how to begin working with AI
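For the first two bullets, here's a rough sketch of a text-to-texture step with the open-source diffusers library. The circular-padding patch is a common community trick for making the output tile seamlessly; the model ID and prompt are placeholders:

```python
# Rough text-to-texture sketch with Stable Diffusion via diffusers.
# The circular-padding patch is a community trick for seamless tiling;
# model ID and prompt are placeholders, not a specific recommendation.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Make every convolution wrap around the image edges so the texture tiles.
for model in (pipe.unet, pipe.vae):
    for module in model.modules():
        if isinstance(module, torch.nn.Conv2d):
            module.padding_mode = "circular"

texture = pipe(
    prompt="seamless mossy cobblestone texture, top-down, photorealistic, 4k",
    height=512,
    width=512,
).images[0]

texture.save("mossy_cobblestone_albedo.png")  # feed into an upscaler next
```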

While it's always sad to see a creative industry get automated, there's no denying the economic potential. And what's more: AI 3D asset generation will have tremendous (positive) impacts on media, storytelling, and our culture. At least we'll have something to watch when we're unemployed and lying on the couch.


That's a wrap!

Enjoyed this? Consider sharing it with someone you know. And if someone forwarded this to you, get the next newsletter by signing up here.

See you next week.

– Nick