The whole neocortex, regardless of its specific ...

... In popular media, there is often a connection drawn between the advent of awareness in artificial agents and those same agents simultaneously achieving human or superhuman level intelligence. ...

... In this work, we explore the validity and potential application of this seemingly intuitive link between consciousness and intelligence. ...

... We do so by examining the cognitive abilities associated with three contemporary theories of conscious function: Global Workspace Theory (GWT), Information Generation Theory (IGT), and Attention Schema Theory (AST). ...

... We find that all three theories specifically relate conscious function to some aspect of domain-general intelligence in humans. ...

... With this insight, we turn to the field of Artificial Intelligence (AI) and find that, while still far from demonstrating general intelligence, many state-of-the-art deep learning methods have begun to incorporate key aspects of each of the three functional theories. ...

... Given this apparent trend, we use the motivating example of mental time travel in humans to propose ways in which insights from each of the three theories may be combined into a unified model. ...

... We believe that doing so can enable the development of artificial agents which are not only more generally intelligent but are also consistent with multiple current theories of conscious function. ...


... Collective Intelligence ...


... [https://www.inc.com/bill-murphy-jr/how-emotionally-intelligent-people-use-send-a-bible-rule-to-become-remarkably-more-memorable.html](https://www.inc.com/bill-murphy-jr/how-emotionally-intelligent-people-use-send-a-bible-rule-to-become-remarkably-more-memorable.html) ...


emotional intelligence

... You don't need to go to CPT to argue that "deep down" there is no such thing as causality. Newtonian mechanics is also governed by algebraic equalities, so it is symmetric, telling us that f=ma is as valid as a=f/m. ...

... Where causality comes in is when we venture to model the 1/2 https://t.co/AmgYUOenjq ...


... Thread… Here’s my basic understanding of the model: the economy has some industries that are capital intensive and others that are not. When the central bank makes interest rates artificially low, it makes capital investment cheap and skews the economy toward capital intensive sectors. ...

... As a result, inflation starts to inch up, and central banks are forced to raise interest rates to cool things off. So far this is an entirely conventional account of how business cycles work. But now things get weird. ...

... The period through the end of 2007 arguably fits the Austrian model. There was arguably overinvestment in residential home construction. In 2006 and 2007 the home-building industry was contracting while other industries were still growing. But in mid-2008, the situation changed. ...

... I can’t figure out how to explain this period with an Austrian model. I don’t see why anyone would consider this kind of mass unemployment necessary or how it set us up for stronger growth later. ...


... Yes, I know the HODLers see it as a buying opportunity, and they could be right — not doing price predictions, just trying to think this through 1/ First: crypto faithful comparing this to "crypto winter" of 2017-18, which was comparable in percentage terms. ...

... bitcoins mined in 2021; at $50K each, that's around 0.2% of US GDP 5/ By contrast, residential investment peaked at almost 7% of GDP and fell by more than 4% 6/ https://t.co/PDSNM4BV7l And there surely isn't enough leveraged buying of crypto to create 2008-type financial stability risks, although things ...


... So if Newt Gingrich forced the BLS to lower the CPI as a backdoor way of cutting Social Security payments, did he force the BEA to do the same thing so it wouldn't look suspicious? ...


... pivot, which I was able to incorporate 1/ https://t.co/LwvwHJ6lgv I say sort-of pivot because while the Fed has stopped saying "transitory", it still seems to believe that much of the inflation we're currently experiencing will fade away without the need to impose a lot of economic pain 2/ One odd thing ...

Thanks to the intrinsic set of fixed relationships between them, this formalization provides a helpful structure to our inquiry. It enables the learning cycle: observations -> hypotheses -> predictions -> correction against new observations. This can be argued for [philosophically](https://www.themind.net/hypotheses/W2wRBi5mSeGueEYevUjMzw) and [neuroscientifically](https://www.themind.net/hypotheses/M4p8C9lOTRu8ipf5zGtEJA).
76.3%
**Observation:** An observed empirical fact, indisputable to reasonable people.

**Hypothesis:** A conjecture generalized from observations (induction) or deduced from other hypotheses (deduction). Collectively, hypotheses constitute our model (or theory) of the world. A hypothesis, often together with other hypotheses, can produce predictions. We use the broadest sense of this word, not limited to untested theories of how something works: it encompasses assumption, theory, putative knowledge, and the narrow sense of hypothesis.

**Prediction:** A yet-to-be-made observation (or observational categorical, per Quine). It is usually based on one or more hypotheses.

The relationships between these categories of thoughts are fixed. In other words, there is an algebra over these elements of thought. This theory (or "hypothesis," as defined in itself) has [a philosophical basis](https://www.themind.net/hypotheses/M1qolEkbTje29ze62yEfQg): our knowledge of the world consists solely of prediction models. It also has [a neuroscientific basis](https://www.themind.net/hypotheses/M4p8C9lOTRu8ipf5zGtEJA) (and [an independent one](https://www.themind.net/hypotheses/n3Tx6wlrSWOjsXHSYggrFQ)): our brain's intelligence is solely the collective work of neurons trained to predict their inputs.
71.9%
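The fixed relationships between these three categories can be sketched as data structures in Python. This is a minimal illustration of one turn of the learning cycle; all class and variable names are hypothetical, not from the source.

```python
from dataclasses import dataclass, field

@dataclass
class Observation:
    fact: str  # an observed empirical fact

@dataclass
class Hypothesis:
    statement: str
    # induced from observations or deduced from other hypotheses
    sources: list = field(default_factory=list)

@dataclass
class Prediction:
    expected: str
    based_on: list  # one or more hypotheses

def check(prediction, observation):
    """Close the loop: a new observation confirms or refutes a prediction."""
    return prediction.expected == observation.fact

# One turn of the cycle, using the deer example from the notes below:
obs = Observation("during the last drought, deer were seen at this spot")
hyp = Hypothesis("deer come to this spot when there is a drought", sources=[obs])
pred = Prediction("deer at this spot during the next drought", based_on=[hyp])
new_obs = Observation("deer at this spot during the next drought")
confirmed = check(pred, new_obs)  # the observation matches: hypothesis supported
```

The point of the sketch is the algebra: observations feed hypotheses, hypotheses produce predictions, and predictions are corrected only by further observations.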

... Columns learn from prediction errors. They can predict raw sensory inputs; they can also predict signals produced by other columns from sensory inputs. Thus, learning can happen when there are errors in predicting raw sensory inputs as well as errors in predicting other columns' signals. ...

... Learning in columns can easily be hierarchical - naturally, models (or knowledge) learned from them are hierarchical. That being said, there is no reason to believe that the hierarchy is a neat pyramid with clear-cut division between layers. ...

... It's just that learning, and thus models, can occur many levels removed from the raw input signals. ...
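The column-level rule described above (predict, then update only from the prediction error) can be illustrated with a toy sketch. The `Column` class and its scalar estimate are hypothetical simplifications for illustration, not a cortical model.

```python
class Column:
    """Toy predictive unit: keeps a running estimate of its input and
    updates that estimate only from the prediction error."""

    def __init__(self, lr=0.1):
        self.estimate = 0.0  # the column's current prediction
        self.lr = lr

    def step(self, signal):
        error = signal - self.estimate    # prediction error
        self.estimate += self.lr * error  # learning happens only via the error
        return error

# A higher column can predict a lower column's output instead of the raw
# input; that is all the (non-pyramidal) hierarchy described above requires.
low, high = Column(), Column()
for _ in range(500):
    low.step(1.0)            # constant "sensory" input for the sketch
    high.step(low.estimate)  # higher column predicts the lower column's signal
```

Both estimates converge toward the input value, and the higher column never sees the raw signal at all: its learning is driven entirely by errors in predicting the lower column.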


p/brain

... Models can be seen as the mathematical representation of hypotheses ...


... More precisely, LLMs model the concepts in natural languages using the language itself (albeit in a different syntax). Obviously, LLMs don't need to learn the concepts from scratch; they already have encoded words. ...

... Thus one might be able to say that LLMs model the world in language. It might be a totally different grammar from natural language, but it is a syntax nonetheless, and it's quite possible that this syntax is inspired by the syntax of natural language. ...

... The ability to model concepts using words, phrases, and even sentences combined with syntax is critical. [It might be the reason we humans reached our level of intelligence](https://www.themind.net/hypotheses/8yof9E9YTYu4vHQI4qgBcw). ...


... High-level concepts and the relationships between them exist linguistically in our brains, and cognitive functions based on these concepts and relationships are also encoded in sentence-like linguistic memories. Our brains can a) store models of the world in this sentence-like linguistic memory, ...

... e.g. "Deer come to this spot when there is a drought.", and b) construct new knowledge/predictions by composing new sentences following syntax rules, e.g. "There is a drought now; if we go to this spot we might find deer." ...

... High-level human cognitive functions are the enterprise of our brain employing these two faculties. We don't have dedicated circuitry for each model expressed in linguistic memory; we just need the basic circuitry for language processing. ...
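Faculties (a) and (b) above can be sketched minimally: sentence-like models stored as data, plus one general rule-applier, with no dedicated circuit per stored model. All names are illustrative, not from the source.

```python
# Faculty (a): models of the world stored as sentence-like
# (condition, consequence) pairs rather than as dedicated circuits.
models = [("there is a drought", "deer come to this spot")]

def derive_predictions(current_facts):
    """Faculty (b): one general syntax rule ('if <condition> then
    <consequence>') applied to current facts composes new sentence-like
    predictions. Adding a model means adding data, not circuitry."""
    return [consequence for condition, consequence in models
            if condition in current_facts]

new_sentences = derive_predictions({"there is a drought"})
# new_sentences == ["deer come to this spot"]
```

The design point mirrors the text: the only machinery is the generic matcher; every stored sentence reuses it.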


p/brain

... This is how humans reached this level of intelligence. In some sense, language and its syntax provide the programming language for brains and reduce the need for specialized neural circuitry. ...

... To reach the same level of intelligence without a vastly larger number of neural circuits, artificial neural networks need to be able to do the same thing. ...


p/investing AI

... Because this self-supervised learning process mimics the brain's learning mechanism: make predictions and learn from prediction errors. ...
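A minimal sketch of why self-supervised learning needs no human labels: the training target is constructed from the data itself, so the prediction-error signal comes for free. The function name is hypothetical.

```python
def make_next_token_pairs(tokens):
    """Self-supervised target construction: each token's 'label' is simply
    the next token in the sequence, so the model can make a prediction and
    learn from its prediction error without any human annotation."""
    return [(tokens[i], tokens[i + 1]) for i in range(len(tokens) - 1)]

pairs = make_next_token_pairs(["the", "deer", "came", "to", "the", "spring"])
# pairs[0] == ("the", "deer"): the model is trained to predict the right
# element of each pair from the left one.
```

This is the same loop as the brain's mechanism described above: predict the next input, compare against what actually arrives, and learn from the difference.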


... A collective digitalized mind, i.e. digitalized and connected thoughts/ideas from multiple users, may not by itself automatically achieve collective intelligence (the capability to solve problems collectively). ...


p/Collective Intelligence p/The Mind Net