Everything net
Everything relevant to the thought... GitHub Copilot is promising, but it's not human-level yet. It might be possible for self-driving cars to use LLMs as foundations to build models that can predict the social behavior of humans on the street. ...
... More precisely, LLMs model the concepts in natural languages using the language itself (albeit in a different syntax). Obviously, LLMs don't need to learn the concepts from scratch; they already have the words encoded. ...
... To perform cognitive tasks, LLMs need to learn the specific relationships between specific concepts, and those relationships can be connections between a group of words, e.g., "swan", "black", "is": here "swan" and "black" are two concepts, while "is" is the relationship between them. ...
... Thus, to truly encode a specific relationship, LLMs need to have connections embedded with syntax, e.g., they need to encode the relationship between "black" and "swan" as "swan is black". I.e., a language phrase encodes the relationship. ...
... Thus one might be able to say that LLMs model the world in language. It might use a grammar totally different from natural language, but it is a syntax nonetheless, and it's quite possible that this syntax is inspired by the syntax of natural language. ...
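To make the point concrete, here is a toy sketch in plain Python (purely illustrative, not a claim about how any real LLM stores knowledge): the same fact can be held either as an explicit (subject, predicate, object) triple or as the language phrase "swan is black" that an LLM actually encounters in training text. The Relation class and phrase_to_relation helper are hypothetical names invented for this sketch.

# Toy sketch: a relationship between two concepts, viewed two ways.
from dataclasses import dataclass

@dataclass(frozen=True)
class Relation:
    subject: str    # e.g. "swan"
    predicate: str  # e.g. "is"
    obj: str        # e.g. "black"

    def as_phrase(self) -> str:
        # Linearize the triple into the language phrase that encodes it.
        return f"{self.subject} {self.predicate} {self.obj}"

def phrase_to_relation(phrase: str) -> Relation:
    # Naive inverse: split a three-word phrase back into a triple.
    subject, predicate, obj = phrase.split()
    return Relation(subject, predicate, obj)

if __name__ == "__main__":
    r = Relation("swan", "is", "black")
    print(r.as_phrase())                              # swan is black
    print(phrase_to_relation("swan is black") == r)   # True

The only point of the sketch is that the phrase and the triple carry the same relational content; in an actual LLM the "triple" side would be implicit in the weights rather than an explicit data structure.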
... This might also be achievable through a collaboration between LLMs and humans. Then LLMs can generate new theorems on their own. ...
... There are no new capabilities that can provide viable new products beyond what GPT-3 already does (mainly Copilot). That being said, the chatbot format did provide a great way for a wider community to explore potential use cases with LLMs, and hence directions for future development (#GPT4?). ...
... The hope was that as LLMs become better (which they seem to be doing consistently), this will have a positive effect on robotics. ...