If people always demanded answers to those questions, we wouldn’t have speculative bubbles. For now, everybody still seems to believe the “it’s the worst it’ll ever be right now” and the “just more scaling bro” answers.
Pitching “scaling solves everything” to consumers is going to be a hard sell.
If we look at general-purpose computation, which enjoyed decades of actual scaling-solves-everything growth, two influences made that message resonate with customers:
Clear existing applications where more power made the experience straightforwardly better. Your spreadsheet took an hour to recalculate at 8 MHz and 20 minutes at 25 MHz. A lot of the “bigger model” stuff is plateauing, with marginal or spotty gains. If I feed another 5 Internets of data to ChatGPT, will that summarized email be that much better?
New applications that could be demoed on specialised, low-volume hardware and scaled down to consumers as more power became available. Think of early CGI on hardware costing tens of millions, and now you can run Blender on a $149 laptop. Since most commercial AI plays are hosted services, there’s not much opportunity to tease that way anymore.
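The spreadsheet figures in the first point are consistent with runtime scaling roughly linearly with clock speed; a minimal sketch of that arithmetic (the 60-minute / 8 MHz figures come from the example above, and the linear-scaling assumption is mine):

```python
def recalc_minutes(baseline_minutes: float, baseline_mhz: float, new_mhz: float) -> float:
    # Naive model: fixed amount of work, so runtime is inversely
    # proportional to clock speed. Ignores memory, I/O, architecture.
    return baseline_minutes * baseline_mhz / new_mhz

print(recalc_minutes(60, 8, 25))  # 19.2, close to the "20 minutes at 25 MHz" figure
```

That near-linear payoff is exactly what current “bigger model” scaling no longer delivers to the end user.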