Everything net
Everything relevant to the thought... How to cultivate Asian heritage languages in the classroom for children from Asian immigrant families? For teachers who don't understand Asian heritage languages, how to encourage children's multilingual development in the classroom? ...
233 characters.
... pace seems to be slowing 1/ https://t.co/1a4025hPpu Couple this with expectations data: expected inflation up a lot for next year but not so much over next 5 years, suggesting consumers expect shock to be temporary 2/ https://t.co/JGj8kRXLhI https://t.co/PvRlPZ0vTp So far, then, no signs of a 70s-type ...
521 characters.
... One of the worst aspects of current politics is the bipartisan consensus that America kind of sucks. On the left you have the relentless negativity of woke politics. On the right you have the relentless negativity of Trumpism. ...
452 characters.
... We find that all three theories specifically relate conscious function to some aspect of domain-general intelligence in humans. ...
... With this insight, we turn to the field of Artificial Intelligence (AI) and find that, while still far from demonstrating general intelligence, many state-of-the-art deep learning methods have begun to incorporate key aspects of each of the three functional theories. ...
1330 characters.
... I'm reading the numbers right, around 800K bitcoins mined in 2021; at $50K each, that's around 0.2% of US GDP 5/ By contrast, residential investment peaked at almost 7% of GDP and fell by more than 4% 6/ https://t.co/PDSNM4BV7l And there surely isn't enough leveraged buying of crypto to create 2008-type ...
1419 characters.
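As a quick sanity check on the arithmetic in that excerpt, here is a back-of-the-envelope sketch in Python. The coin count and price are taken from the excerpt; the roughly $23 trillion figure for 2021 US nominal GDP is an approximate value assumed here, not from the note:

```python
# Back-of-the-envelope check of the claim that 2021 bitcoin mining output
# was on the order of 0.2% of US GDP (coin count and price from the excerpt;
# GDP is an approximate 2021 value assumed for illustration).
coins_mined_2021 = 800_000        # bitcoins mined in 2021, per the excerpt
price_per_coin = 50_000           # USD per coin, per the excerpt
us_gdp_2021 = 23e12               # USD, roughly 2021 US nominal GDP (assumed)

mining_value = coins_mined_2021 * price_per_coin   # ~$40 billion
share_of_gdp = mining_value / us_gdp_2021

print(f"Mining value: ${mining_value / 1e9:.0f}B, "
      f"share of GDP: {share_of_gdp:.2%}")          # ~0.17%, i.e. about 0.2%
```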
... Many aspects of government are coasting along, overseen by agencies created between 1932 and 1972 but whose enabling legislation is increasingly out of date. ...
920 characters.
... I can also believe that money is more long-run neutral in high inflation environments. ...
2224 characters.
... For example, she has shown that speakers of languages that use different words for different types of spatial relationships (e.g. "left" versus "right") are better at remembering the location of objects than speakers of languages that do not make this distinction. ...
264 characters.
... The brain uses a mental language to represent and organize complex ideas and concepts. This mental language is thought to be distinct from natural languages like English or Spanish, and it is believed to be the medium through which we think and process information. ...
... According to the LOTH, the structure and content of this mental language are shaped by the structure and content of the natural languages that we learn, but it is not identical to any one natural language. ...
... Instead, it is thought to be a universal language that is used by all humans to represent and process complex ideas. Link: https://plato.stanford.edu/entries/language-thought/ ...
651 characters.
... As suggested in [Being You](https://www.anilseth.com/being-you/), perception is a top-down "controlled hallucination": the brain first predicts the "things" it expects to perceive, and then the sensory input those things would cause. ...
... The brain then verifies or corrects those predicted things against the sensory input it actually receives. ...
324 characters.
... Human perception is a controlled/controlling hallucination process in which the brain 1. constantly generates hypotheses about the surrounding world, 2. makes predictions about incoming sensory signals based on these hypotheses, and 3. corrects the hypotheses according to the differences between observations (actual ...
342 characters.
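A minimal sketch of the generate-predict-correct loop described in the two excerpts above, assuming a toy one-dimensional "world" where the brain's hypothesis is a single estimated value and correction is proportional to the prediction error. The variable names and the learning rate are illustrative, not from the notes:

```python
import random

# Toy predictive-processing loop: the "brain" keeps a hypothesis about a
# hidden quantity, predicts the sensory signal that hypothesis implies,
# and corrects the hypothesis by a fraction of the prediction error.
true_value = 4.2        # the hidden state of the world
hypothesis = 0.0        # the brain's current best guess
learning_rate = 0.2     # how strongly prediction errors update the guess

for step in range(20):
    predicted_signal = hypothesis                          # 1. predict incoming signal
    observed_signal = true_value + random.gauss(0, 0.3)    # 2. noisy sensory input
    prediction_error = observed_signal - predicted_signal  # 3. compare
    hypothesis += learning_rate * prediction_error         # 4. correct the hypothesis

print(f"final hypothesis: {hypothesis:.2f} (true value: {true_value})")
```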
... More precisely, LLMs model the concepts of natural languages using the language itself (albeit in a different syntax). Obviously, LLMs don't need to learn the concepts from scratch; the concepts are already encoded as words. ...
... More importantly, they don't need to learn a representation of the kinds of relationships between concepts; those are also encoded as words in the language, such as "is," "belong to," "cause," etc. Here comes the more speculative part. ...
... To perform cognitive tasks, LLMs need to learn the specific relationships between specific concepts, and those relationships can be connections within a group of words, e.g., "swan," "black," and "is," where "swan" and "black" are two concepts while "is" is the relationship between them. ...
... However, such a group of words can have multiple interpretations. Thus, to truly encode a specific relationship, LLMs need connections embedded with syntax; e.g., they need to encode the relationship between "black" and "swan" as "swan is black." That is, ...
... a language phrase encodes the relationship. Thus one might say that LLMs model the world in language. Their grammar might be totally different from natural language's, but it is a syntax nonetheless, and it is quite possible that this syntax is inspired by the syntax of natural language. ...
... The ability to model concepts using words, phrases and even sentences combined with syntax is critical. [It might be the reason we humans reached our level of intelligence](https://www.themind.net/hypotheses/8yof9E9YTYu4vHQI4qgBcw). ...
1529 characters.
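A small sketch of the point made above that a bare group of words underdetermines the relationship while a syntactic phrase pins it down. The tiny subject-relation-object "parser" below is purely illustrative and is not a claim about how LLMs actually represent such phrases:

```python
# An unordered set of words cannot say which concept plays which role:
# {"dog", "bites", "man"} could mean "dog bites man" or "man bites dog".
bag = {"dog", "bites", "man"}
print(bag == {"man", "bites", "dog"})   # True: the set loses the distinction

# A phrase with word order (a minimal stand-in for syntax) keeps the roles.
def parse_svo(phrase: str) -> tuple[str, str, str]:
    """Read a three-word phrase as a (subject, relation, object) triple."""
    subject, relation, obj = phrase.split()
    return subject, relation, obj

print(parse_svo("swan is black"))   # ('swan', 'is', 'black')
print(parse_svo("dog bites man"))   # ('dog', 'bites', 'man')
print(parse_svo("man bites dog"))   # ('man', 'bites', 'dog'), a different fact
```

Word order here is only a stand-in for syntax; as the note says, an LLM's internal "grammar" could look quite different from natural language's, but it would still have to carry the same role information.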
... That is, we assign language tokens to objects, and not just for communication, e.g., when young children name their dolls, or when someone comes up with a new concept and is eager to find a linguistic name for it even before there is any need to communicate it. ...
251 characters.