Everything net
Everything relevant to the thought... In popular media, there is often a connection drawn between the advent of awareness in artificial agents and those same agents simultaneously achieving human- or superhuman-level intelligence. ...
... We find that all three theories specifically relate conscious function to some aspect of domain-general intelligence in humans. ...
... With this insight, we turn to the field of Artificial Intelligence (AI) and find that, while still far from demonstrating general intelligence, many state-of-the-art deep learning methods have begun to incorporate key aspects of each of the three functional theories. ...
... Given this apparent trend, we use the motivating example of mental time travel in humans to propose ways in which insights from each of the three theories may be combined into a unified model. ...
... What's funny is that there are a bunch of countries that experienced similarly large housing booms in the 2000s, never had a comparable crash, and are now at even higher levels. https://t.co/EbuqEoSkxP https://t.co/KnigE1W2SJ To be fair, some countries, including Spain and Ireland, did have US-style ...
... I've been noting that we're currently seeing a surge in real house prices up to 2000s-bubble levels 1/ https://t.co/ukUAXznGpk But the 2000s bubble was geographically very uneven: prices surged in cities with strict zoning, but not in places where developers were free to sprawl => elastic housing ...
... Meaning that an economy has a long run potential level of output that it will return to regardless of what policy the central bank pursues. This makes no sense to me. ...
... In the Austrian theory, it’s important that central banks don’t interrupt this process by pushing interest rates back down to unnaturally low levels, because that interferes with this necessary re-allocation process. Ok so let’s think about 2006 to 2009. ...
... Thanks to the intrinsic set of fixed relationships between them, this formalization provides a helpful structure to our inquiry. It enables the learning cycle of observations -> hypotheses -> predictions -> corrections against new observations. ...
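As a toy illustration of that cycle (my own sketch, not from the source), here is a minimal Python loop where the "world" is a coin with an unknown bias: the hypotheses are candidate bias values, each hypothesis predicts every observation, and the correction step reweights hypotheses by how well they predicted.

```python
import random

# A minimal sketch of the observations -> hypotheses -> predictions ->
# corrections cycle. The setup (a coin with unknown bias) is invented
# for illustration.

TRUE_BIAS = 0.7  # hidden parameter of the "world"

def observe():
    """Draw one observation from the world (1 = heads, 0 = tails)."""
    return 1 if random.random() < TRUE_BIAS else 0

# Hypotheses: candidate values for the coin's bias, all equally weighted.
hypotheses = [h / 10 for h in range(11)]
weights = {h: 1.0 for h in hypotheses}

for _ in range(200):
    obs = observe()                           # observation
    for h in hypotheses:
        # Each hypothesis predicts the probability of this observation...
        predicted_p = h if obs == 1 else 1 - h
        # ...and the correction step reweights it by how well it predicted.
        weights[h] *= max(predicted_p, 1e-9)

print(max(weights, key=weights.get))  # typically 0.7, the true bias
```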
... It's hard to imagine higher-level cognitive faculties without some form of hierarchical information processing. ATB (A Thousand Brains) proposed that such a hierarchy corresponds to the hierarchy of objects in the real world. This might be a bit too speculative. Columns learn from prediction errors. ...
... Thus, learning can happen both when there are prediction errors on raw sensory input and when there are prediction errors on signals from other columns. Learning in columns can easily be hierarchical, and naturally the models (or knowledge) learned from them are hierarchical too. ...
... Any column can learn from any other column as long as its signals are useful. It's just that learning, and thus the resulting models, can happen several levels removed from the raw input signals. ...
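A rough sketch of this, using a very crude stand-in for columns (this is not the actual Thousand Brains model): a lower "column" learns to predict raw sensory input from its own state, while a higher "column" learns to predict the lower column's state, i.e. it learns one level removed from the raw signal.

```python
import numpy as np

# Both "columns" update their weights from their own prediction errors;
# only the lower one ever sees the raw input.

rng = np.random.default_rng(0)
W_low = rng.normal(scale=0.1, size=(4, 4))   # lower column: state -> input
W_high = rng.normal(scale=0.1, size=(4, 4))  # higher column: state -> lower state
lr = 0.05

for _ in range(1000):
    x = rng.normal(size=4)                    # raw sensory input
    z_low = np.tanh(x)                        # lower column's internal state
    err_low = x - W_low @ z_low               # prediction error on raw input
    W_low += lr * np.outer(err_low, z_low)    # learn from the input error

    z_high = np.tanh(z_low)                   # higher column's internal state
    err_high = z_low - W_high @ z_high        # error on another column's signal
    W_high += lr * np.outer(err_high, z_high) # learn without seeing raw input
```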
... The brain learns by constantly making predictions and then making corrections (in its wiring) that bring those predictions closer to the actual results. ...
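In its simplest form this predict-and-correct loop is just a delta-rule update, where the "wiring" (here a weight vector) moves to shrink the prediction error; a minimal sketch, with the linear "world" invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0, 0.5])  # hidden structure of the world
w = np.zeros(3)                      # the learner's wiring, initially blank
lr = 0.1

for _ in range(500):
    x = rng.normal(size=3)
    prediction = w @ x                    # predict the outcome
    actual = true_w @ x                   # observe the actual result
    w += lr * (actual - prediction) * x   # correct the wiring toward it

print(np.round(w, 2))  # approaches [ 2. -1.  0.5]
```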
... Human brains have no biological apparatus for logical thinking (or, more broadly speaking, System 2 thinking). This contrasts with human language faculties, which are more likely supported by prewired circuitry. ...
... GitHub Copilot is promising, but it's not human-level yet. It might be possible for self-driving systems to use LLMs as foundations to build models that can predict the social behavior of humans on the street. ...
... This is how humans reached this level of intelligence. In some sense, language and its syntax provide the programming language for brains and reduce the need for specialized neural circuitry. ...
... To reach the same level of intelligence without a vastly larger amount of neural circuitry, artificial neural networks need to be able to do the same thing. ...
... High-level concepts and the relationships between them exist linguistically in our brains, and cognitive functions based on these concepts and relationships are also encoded in sentence-like linguistic memories. Our brains can a) store models of the world in this sentence-like linguistic memory. ...
... High-level human cognitive functions are the enterprise of our brains employing these two faculties. We don't have dedicated circuitry for each model expressed in linguistic memory; we just need the basic circuitry for language processing. ...
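As a toy illustration of this claim (the sentence format and the matcher below are invented for the sketch), one generic "language circuit", here a naive rule matcher, can execute any model written into linguistic memory as sentences, with no fact-specific circuitry:

```python
# World knowledge stored as sentences, executed by one generic mechanism.

linguistic_memory = [
    "socrates is a human",
    "if X is a human then X is mortal",
    "if X is mortal then X will die",
]

def run_language_circuit(memory):
    """One generic mechanism: repeatedly match 'if ... then ...' sentences
    against known facts and add each conclusion as a new sentence."""
    facts = {s for s in memory if not s.startswith("if ")}
    rules = [s for s in memory if s.startswith("if ")]
    changed = True
    while changed:
        changed = False
        for rule in rules:
            cond, conc = rule[3:].split(" then ")
            for fact in list(facts):
                # Bind X by matching the condition pattern against the fact.
                if cond.startswith("X ") and fact.endswith(cond[2:]):
                    x = fact[: -len(cond[2:])].strip()
                    new_fact = conc.replace("X", x)
                    if new_fact not in facts:
                        facts.add(new_fact)
                        changed = True
    return facts

print(run_language_circuit(linguistic_memory))
# Derives "socrates is mortal" and "socrates will die" without any
# socrates-specific circuitry.
```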