What other news is worth mentioning?
- First of all, an incredible stat: ChatGPT might have reached 100 million users by January! For context, TikTok, one of the most recent successful consumer apps, took nine months to hit that mark. That gives you a sense of the massive scale of ChatGPT's adoption.
- In addition, OpenAI finally (and officially) announced the paid version of ChatGPT at $20/mo (ChatGPT Plus). This is an interesting price point: it shows that OpenAI wants the tool to serve B2B, yes, but also to potentially become a premium consumer tool. Indeed, the pricing is not far from a Netflix subscription! Will it pull it off?
- Another key point: while the Plus version is priced at $20/mo right now, we might assume that OpenAI will release a more powerful premium version at a higher price point to tackle B2B. That segment, if priced well, could become an incredible cash cow for OpenAI!
- This week OpenAI also released an AI detection tool, and I've seen a lot of people commenting that the game was over for AI content creation. That doesn't make sense to me: AI detection is a cat-and-mouse game. Of course, if OpenAI's ChatGPT were the only AI content generation tool out there, OpenAI would no doubt have an advantage in catching AI-generated content. But since AI-generated content can also come from other language models, this becomes a real cat-and-mouse game. And even if ChatGPT were the only content generation tool, smart AI developers could still build engines on top of it that make the generated content indistinguishable from human writing! Indeed, I played with AI detection in late December, and we also launched a tool here, which was slightly updated in early January. Again, what matters in AI detection is the classification model, and that isn't static: it needs continuous updating as large language models get better and as other developers build content engines on top of them. So for those who believe the results of AI detection tools religiously, you might be in for a great disappointment! Unless you build a company investing millions a month in AI detection technology (as large language models become more and more complex), this will always be a cat-and-mouse game!
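The point that AI detection is "just" a classification model can be made concrete with a toy sketch. The few labeled samples below are entirely made up for illustration, and a real detector would train on huge corpora; but the mechanics are the same, which is why the verdicts shift whenever the training data (or the generators) change.

```python
# Toy sketch: AI-text detection framed as binary classification.
# The labeled samples are invented for illustration only; this is
# NOT a real detector, just the shape of one.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training samples: 1 = AI-generated, 0 = human-written.
texts = [
    "As an AI language model, I can provide a comprehensive overview.",
    "In conclusion, it is important to note the following key points.",
    "lol that game last night was wild, can't believe we lost again",
    "grabbed coffee with Sam, we argued about pricing for an hour",
]
labels = [1, 1, 0, 0]

# A classifier over text features; this is the "classification model"
# that has to be retrained as language models improve.
detector = make_pipeline(TfidfVectorizer(), LogisticRegression())
detector.fit(texts, labels)

# The output is a probability, not an oracle: it only reflects whatever
# the model was trained on, hence the cat-and-mouse dynamic.
probs = detector.predict_proba(["It is important to note these points."])
print(probs[0][1])  # estimated probability the text is AI-generated
```

The takeaway of the sketch: the detector's answer is a function of its training set, so a new generator (or a rewriting engine layered on top of an old one) can push text outside the distribution the detector learned.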
- In the meantime, Microsoft seems to be moving fast to integrate OpenAI's technology into its products.
- This, of course, has awakened the sleepy AI giant: Google! Indeed, it seems that Sergey Brin, co-founder of Google, was reviewing the code for LaMDA, the company's large language model (a GPT-3 competitor), which might be the underlying model for Google's ChatGPT-like tool!
- Indeed, Google is under a huge amount of pressure, as its revenue growth slowed substantially in the last quarter of 2022, and the only segment that stayed strong was Google Cloud (which, though, runs at negative margins as Google tries to win cloud deals).
- In fact, as I explained in AI business models, AI supercomputers (part of the cloud infrastructure at Microsoft and Google) have become a key component of the AI race!
So, if you are Google, you want to quickly close the market gap opened by ChatGPT (which over time might turn into a Google killer) and get back on track in the AI race!
This will help not only to keep Google's dominant position, but also to strengthen Google's Cloud segment, which in the future might be the company's most important segment and the infrastructure that will power the AI Industrial Revolution!
As I explained in yesterday's newsletter, today ChatGPT is trapped in a web app: it doesn't access the web (for now) and can't be hooked into your device (for now).
And yet, once it does, prompt engineering and in-context learning might enable a set of custom experiences we've never seen before.
That might unleash what I like to call real-world generative experiences.
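To make "in-context learning" concrete: instead of retraining the model, you prepend a few worked examples to the prompt and let the model infer the pattern. The classification task and examples below are invented for illustration; the sketch only builds the prompt string and doesn't call any model API.

```python
# Minimal sketch of an in-context-learning (few-shot) prompt.
# The task and examples are made up for illustration.
few_shot_examples = [
    ("Order #1832 arrived broken.", "complaint"),
    ("Love the new dashboard, great work!", "praise"),
    ("How do I reset my password?", "question"),
]

def build_prompt(new_message: str) -> str:
    """Assemble a few-shot classification prompt for a language model."""
    lines = ["Classify each customer message as complaint, praise, or question.", ""]
    for text, label in few_shot_examples:
        lines.append(f"Message: {text}")
        lines.append(f"Label: {label}")
        lines.append("")
    # The model is expected to complete the final "Label:" line.
    lines.append(f"Message: {new_message}")
    lines.append("Label:")
    return "\n".join(lines)

prompt = build_prompt("The app crashes every time I open it.")
print(prompt)
```

The same mechanism, pointed at data from the web or from your device, is what could turn a generic chatbot into those custom, real-world experiences.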