Everything net
Everything relevant to the thought... How to cultivate Asian heritage languages in the classroom for children from Asian immigrant families? For teachers who don't understand Asian heritage languages, how can they encourage children's multilingual development in the classroom? ...
233 characters.
... Thread… Here’s my basic understanding of the model: the economy has some industries that are capital-intensive and others that are not. When the central bank makes interest rates artificially low, it makes capital investment cheap and skews the economy toward capital-intensive sectors (a toy calculation after this note makes the mechanism concrete). ...
... The period through the end of 2007 arguably fits the Austrian model: there was arguably overinvestment in residential home construction. In 2006 and 2007 the home-building industry was contracting while other industries were still growing. But in mid-2008, the situation changed. ...
... I can’t figure out how to explain this period with an Austrian model. I don’t see why anyone would consider this kind of mass unemployment necessary or how it set us up for stronger growth later. ...
... But that does not make sense to me. The central question of macroeconomics is explaining why economies sometimes have periods of elevated unemployment, where not just one industry shrinks but almost all of them do at once. ...
2948 characters.
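A toy calculation of the mechanism described above, entirely my own sketch rather than anything from the thread: lowering the discount rate raises the present value of long-horizon (capital-intensive) projects far more than short-horizon ones, which is how cheap credit skews investment. The cashflow sizes and rates are assumptions for illustration.

```python
# Toy sketch (not from the thread): present value of a constant annual
# cashflow, compared at a "natural" rate vs an artificially low one.

def npv(cashflow: float, years: int, rate: float) -> float:
    """Present value of `cashflow` received each year for `years` years."""
    return sum(cashflow / (1 + rate) ** t for t in range(1, years + 1))

for years in (2, 30):  # short-horizon vs capital-intensive long-horizon
    pv_natural = npv(100, years, 0.06)  # assumed "natural" rate
    pv_cheap = npv(100, years, 0.02)    # assumed artificially low rate
    gain = pv_cheap / pv_natural - 1
    print(f"{years}-year project: PV rises {gain:.0%} when rates fall 6% -> 2%")
# Output: the 2-year project's PV rises ~6%, the 30-year project's ~63%,
# so cheap credit disproportionately favors the capital-intensive sector.
```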
... This makes no sense to me. To start with the most obvious point: I think everyone would agree that a central bank can induce a recession (a decline of real output) if it tries hard enough. ...
... Even if that spending is restored later, that's several years when you don't have scientists and engineers working on important problems. Presumably that translates to lost inventions. ...
2224 characters.
... I expect to do a bit more of this in 2022. Our subscription revenues aren't yet close to replacing my Ars salary. My wife earns enough as a physician that I can take a big pay cut without big financial hardship for our family. I'm very lucky. ...
911 characters.
... With this insight, we turn to the field of Artificial Intelligence (AI) and find that, while still far from demonstrating general intelligence, many state-of-the-art deep learning methods have begun to incorporate key aspects of each of the three functional theories. ...
... Given this apparent trend, we use the motivating example of mental time travel in humans to propose ways in which insights from each of the three theories may be combined into a unified model. ...
1330 characters.
... Where causality comes in is when we venture to model the 1/2 https://t.co/AmgYUOenjq ...
301 characters.
... If I'm reading the numbers right, around 800K bitcoins mined in 2021; at $50K each, that's around 0.2% of US GDP 5/
By contrast, residential investment peaked at almost 7% of GDP and fell by more than 4% 6/ https://t.co/PDSNM4BV7l
And there surely isn't enough leveraged buying of crypto to create 2008 ...
1419 characters.
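The back-of-the-envelope numbers in the thread check out. A quick sketch; the 2021 US GDP figure (~$23.3T) is my assumption, not from the thread:

```python
# Rough check of the thread's arithmetic. The GDP figure is assumed
# (~$23.3T for 2021); coin count and price are the thread's round numbers.
btc_mined_2021 = 800_000
btc_price = 50_000            # USD per coin
us_gdp_2021 = 23.3e12         # USD, assumed

mining_value = btc_mined_2021 * btc_price   # $40B
print(f"Mining value: ${mining_value / 1e9:.0f}B")
print(f"Share of GDP: {mining_value / us_gdp_2021:.2%}")  # ~0.17%, i.e. ~0.2%
```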
... More precisely, LLMs model the concepts in natural languages using the language itself (albeit in a different syntax). Obviously, LLMs don't need to learn the concepts from scratch; they are already encoded as words. ...
... More importantly, they don't need to learn a representation of the kinds of relationships between concepts; those are also encoded as words in the language, such as "is," "belongs to," "causes," etc. Here comes the more speculative part. ...
... To perform cognitive tasks, LLMs need to learn the specific relationships between specific concepts, and those relationships can be connections between a group of words, e.g., "swan," "black," "is": here "swan" and "black" are two concepts, while "is" is the relationship between them. ...
... Thus, to truly encode a specific relationship, LLMs need connections embedded with syntax: e.g., they need to encode the relationship between "black" and "swan" as "swan is black." I.e., a language phrase encodes the relationship (sketched after this note). ...
... Thus one might say that LLMs model the world in language. It might be a totally different grammar from natural language, but it is a syntax nonetheless, and it's quite possible that this syntax is inspired by the syntax of natural language. ...
... The ability to model concepts using words, phrases and even sentences combined with syntax is critical. [It might be the reason we humans reached our level of intelligence](https://www.themind.net/hypotheses/8yof9E9YTYu4vHQI4qgBcw). ...
1529 characters.
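A minimal sketch of the "phrase encodes the relationship" idea above. The `Fact` structure and its serialization are purely my illustration of the hypothesis, not a claim about how any actual LLM stores knowledge:

```python
# Illustrative only: concept-relationship-concept facts serialized as
# syntax-ordered phrases, as the note speculates LLMs effectively do.
from typing import NamedTuple

class Fact(NamedTuple):
    subject: str   # a concept, e.g. "swan"
    relation: str  # a relationship word, e.g. "is"
    obj: str       # a concept, e.g. "black"

    def as_phrase(self) -> str:
        # Word order (syntax) carries the direction of the relationship:
        # "swan is black" encodes a different fact than "black is swan".
        return f"{self.subject} {self.relation} {self.obj}"

for fact in [Fact("swan", "is", "black"), Fact("swan", "belongs to", "birds")]:
    print(fact.as_phrase())
```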
... Columns learn from prediction errors. They can predict raw sensory inputs; they can also predict the signals other columns produce from sensory inputs. Thus, learning can happen both when there are raw sensory input prediction errors and when there are prediction errors on other columns' signals. ...
... Learning in columns can easily be hierarchical - naturally, the models (or knowledge) learned by them are hierarchical too. That being said, there is no reason to believe that the hierarchy is a neat pyramid with clear-cut divisions between layers. ...
... Any column can learn from any other column as long as its signals are useful. It's just that learning, and thus models, can happen orders removed from the raw input signals (a toy sketch follows this note). ...
946 characters.
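A toy sketch of the learning rule described above, entirely my own illustration (real cortical-column proposals are far richer): every column predicts whatever signal it watches, raw input or another column's output, and updates only from the prediction error, so hierarchy needs no special machinery.

```python
# Toy illustration: columns that learn from prediction errors. A column's
# input can be raw sensory data OR another column's output, so hierarchical
# learning falls out of the same rule.
class Column:
    def __init__(self, learning_rate: float = 0.1):
        self.estimate = 0.0
        self.lr = learning_rate

    def step(self, signal: float) -> float:
        error = signal - self.estimate      # prediction error
        self.estimate += self.lr * error    # learn only when prediction fails
        return self.estimate                # output other columns may predict

sensory = Column()
higher = Column()  # learns from sensory's output, a step away from raw input
for raw in [1.0, 1.0, 1.0, 0.0, 0.0]:
    higher.step(sensory.step(raw))
print(round(sensory.estimate, 3), round(higher.estimate, 3))
```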
... In some sense, language and its syntax provide the programming language for brains and reduce the need for specialized neural circuitries. ...
365 characters.
... The brain uses a mental language to represent and organize complex ideas and concepts. This mental language is thought to be distinct from natural languages like English or Spanish, and it is believed to be the medium through which we think and process information. ...
... According to the LOTH, the structure and content of this mental language are shaped by the structure and content of the natural languages that we learn, but it is not identical to any one natural language. ...
... Instead, it is thought to be a universal language that is used by all humans to represent and process complex ideas. Link: https://plato.stanford.edu/entries/language-thought/ ...
651 characters.
... [Data source](https://app.dealroom.co/transactions.rounds/f/growth_stages/not_mature/rounds/not_GRANT_SPAC%20PRIVATE%20PLACEMENT/tags/not_outside%20tech/technologies/anyof_artificial%20intelligence_deep%20learning_machine%20learning?showScale=absolute&showStats=YEAR&statsType=rounds) ...
292 characters.
... This might also be achievable through a collaboration between LLMs and humans. Then LLMs can generate new theorems on their own. ...
128 characters.
... Boroditsky has conducted a number of studies that have shown how the language we speak can influence our perception of time, space, and other aspects of our environment. For example, speakers of languages that use different words for different types of snow (e.g. "wet snow" versus "dry snow") are better at discriminating between different types of snow than speakers of languages that do not make this distinction. ...
416 characters.
... Our brains can a) store models of the world in sentences, as a kind of linguistic memory, e.g. "Deer come to this spot when there is a drought," and b) construct new knowledge/predictions by composing new sentences following syntax rules. E.g. ...
... We don't have dedicated circuitries for each model expressed in linguistic memory; we just need the basic circuitries for language processing (a minimal sketch follows this note). Note that this hypothesis is different from linguistic determinism. ...
844 characters.
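A minimal sketch of the two mechanisms above: (a) sentences as stored models and (b) new predictions via a generic syntax rule. The single "X when Y" rule and the contents of `memory` are hypothetical, my own illustration of the note's idea:

```python
# Illustrative only: (a) world models stored as plain sentences, and
# (b) new predictions derived by one generic syntactic rule, with no
# circuitry dedicated to any particular model.
memory = [
    "deer come to this spot when there is a drought",
    "there is a drought",
]

def derive(memory: list[str]) -> list[str]:
    """Generic rule: from 'X when Y' and 'Y', predict 'X'."""
    predictions = []
    for sentence in memory:
        if " when " in sentence:
            consequence, condition = sentence.split(" when ", 1)
            if condition in memory:
                predictions.append(consequence)
    return predictions

print(derive(memory))  # ['deer come to this spot']
```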