One of the ideas circulating among the AI Downers on social media is that AI is nothing more than a repackaged search engine (and, according to some, will make us all much dumber). I started thinking about this after browsing coverage of Apple’s event, where they announced a bunch of ChatGPT integrations into their products:
In Apple's lone example, there was a "help" intent, with the input saying to "help me plan a five-course meal" given certain ingredient limitations. That sort of ultra-specific input is something you can't do with a traditional search engine.
In some respects, this is not all that dissimilar to:
“While it’s still in its early phase, artificial intelligence will one day accomplish things that humans could have never even dreamed of doing,” said Parker, who, by all accounts, has never stretched himself to do something he found difficult; has never created anything truly original; and, deep down, has absolutely zero understanding of what makes things good, enjoyable, or rewarding.
But! But I think the actual utility here comes down to what’s common knowledge and what’s not - i.e. what normal people have a frame of reference for already.
For a normal person, the mental gymnastics required to plan a five-course meal with specific ingredients using plain-old search are very different than those required to figure out the arcane rules of probate court. Here’s what I mean - pretty much everyone has an existing mental frame of reference for:
What food is, i.e. things that are edible;
What foods go with other foods, at least in general (sliced carrots don’t go well in breakfast cereal, but sliced bananas do); and
The idea that a meal can have different courses, such as:
Appetizer
Salad
Main
Dessert
What’s more, pretty much everyone is able to google something like “ideas for five-course meal with salmon” and get results that they can quickly scan and incorporate into their existing mental model for what food is and what meals should be like. Thanks to years of SEO, that stuff, replete with stories of grandma’s famous five-course salmon meals, already exists on Al Gore’s internet! So it’s not that you can’t plan a five-course meal with specific ingredients using search, it’s that it takes 2 or 3 extra mental steps to do so.
I contrasted this with the “arcane rules of probate court” above - this is where I think the difference between AI and search can be used to the legal help seeker’s advantage. While everyone has a mental frame of reference for food, pretty much nobody has the mental frame of reference for:
Testamentary procedures;
The rule against perpetuities;
The difference between:
Tenants in common
Joint tenants
And so forth - it’s been a long time since I took Wills & Trusts in law school.
My point is this:
The difference between googling “ideas for 5-course meals with salmon” and asking ChatGPT “Give me some ideas for 5-course meals with salmon” isn’t all that big.
The difference between a non-lawyer googling “probate” and asking ChatGPT “help me understand probate procedure when my father died with no will and I have 3 siblings” is potentially huge in terms of helpfulness.
And yes, there are legitimate concerns about accuracy and other things, but I don’t buy the argument that AI is “just search” for topics where people lack the existing mental framework to make sense of the results. This is part of why I think AI can be a big help in accessing legal help, whether that means finding a lawyer or simply getting good answers from ChatGPT.