Everything net
Everything relevant to the thought... In popular media, there is often a connection drawn between the advent of awareness in artificial agents and those same agents simultaneously achieving human or superhuman level intelligence. ...
... In this work, we explore the validity and potential application of this seemingly intuitive link between consciousness and intelligence. ...
... We find that all three theories specifically relate conscious function to some aspect of domain-general intelligence in humans. ...
... With this insight, we turn to the field of Artificial Intelligence (AI) and find that, while still far from demonstrating general intelligence, many state-of-the-art deep learning methods have begun to incorporate key aspects of each of the three functional theories. ...
... Given this apparent trend, we use the motivating example of mental time travel in humans to propose ways in which insights from each of the three theories may be combined into a unified model. ...
... We believe that doing so can enable the development of artificial agents which are not only more generally intelligent but are also consistent with multiple current theories of conscious function. ...
... How can teachers cultivate Asian heritage languages in the classroom for children from Asian immigrant families? For teachers who don't speak those heritage languages themselves, how can they encourage children's multilingual development in the classroom? ...
... [https://www.inc.com/bill-murphy-jr/how-emotionally-intelligent-people-use-send-a-bible-rule-to-become-remarkably-more-memorable.html](https://www.inc.com/bill-murphy-jr/how-emotionally-intelligent-people-use-send-a-bible-rule-to-become-remarkably-more-memorable.html) ...
... When the central bank makes interest rates artificially low, it makes capital investment cheap and skews the economy toward capital-intensive sectors. But that cheap credit hasn't actually created any real resources, so you end up with increased spending on both capital and consumer goods. ...
... There was arguably overinvestment in residential home construction. In 2006 and 2007 the home-building industry was contracting while other industries were still growing. But in mid-2008, the situation changed. ...
... If I'm reading the numbers right, around 800K bitcoins mined in 2021; at $50K each, that's around 0.2% of US GDP 5/ By contrast, residential investment peaked at almost 7% of GDP and fell by more than 4% 6/ https://t.co/PDSNM4BV7l And there surely isn't enough leveraged buying of crypto to create 2008 ...
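The comparison in the thread can be checked with back-of-the-envelope arithmetic. A minimal sketch, assuming the thread's own figures (~800K BTC mined in 2021 at ~$50K each) and a rough US GDP of $23 trillion for that year:

```python
# Back-of-the-envelope check of the thread's figures. The BTC numbers
# come from the thread itself; the ~$23T US GDP is an assumed round
# figure for 2021, not authoritative data.
btc_mined = 800_000
btc_price = 50_000
us_gdp = 23e12  # assumed 2021 US GDP in dollars

mining_value = btc_mined * btc_price          # $40 billion
share_of_gdp = mining_value / us_gdp
print(f"Mined BTC value: ${mining_value / 1e9:.0f}B, "
      f"{share_of_gdp:.2%} of GDP")           # ~0.17%, i.e. "around 0.2%"
```

This supports the thread's point: even wiped out entirely, the flow is an order of magnitude smaller than the ~4-point swing in residential investment around 2008.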
... Human brains have no biological apparatus for logical thinking (or, more broadly, System 2 thinking). This is in contrast with human language faculties, which are more likely supported by prewired circuitry. ...
... We can understand it through biology, evolutionary theory, and social science. Almost every aspect of human behavior can be explained by the internal logic these scientific studies have uncovered. ...
... Only when we truly understand the origin of these behaviors can we make meaningful choices about what it means to be human. ...
... Boroditsky has conducted a number of studies that have shown how the language we speak can influence our perception of time, space, and other aspects of our environment. For example, speakers of languages that use different words for different types of snow (e.g. ...
... "wet snow" versus "dry snow") are better at discriminating between different types of snow than speakers of languages that do not make this distinction. ...
... More precisely, LLMs model the concepts in natural languages using language itself (albeit in a different syntax). LLMs don't need to learn these concepts from scratch; the words that encode them are already there. ...
... More importantly, they don't need to learn a representation for the kinds of relationships between concepts; those are also encoded in words of the language, such as "is," "belongs to," and "causes." Here comes the more speculative part. ...
... Thus, to truly encode a specific relationship, LLMs need connections embedded with syntax: to represent the relationship between "black" and "swan," for instance, they would encode the phrase "swan is black." In other words, a language phrase encodes the relationship. ...
... Thus one might say that LLMs model the world in language. It might be a totally different grammar from natural language, but it is a syntax nonetheless, and it is quite possible that this syntax is inspired by the syntax of natural language. ...
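The "a phrase encodes the relationship" idea can be made concrete with a toy sketch. This is purely illustrative (not how LLMs actually store knowledge, and all names here are hypothetical): a relationship between two concepts is stored as a language-like phrase and can be queried back.

```python
# Toy illustration only: represent a relationship between concepts
# as a language-like phrase ("swan is black"), then query it back.
# This is a hypothetical sketch, not an actual LLM mechanism.
relations = {}

def encode(subject, relation, obj):
    """Store a relationship; the phrase itself carries the syntax."""
    phrase = f"{subject} {relation} {obj}"
    relations[(subject, relation)] = obj
    return phrase

def query(subject, relation):
    """Recover the object of a previously encoded relationship."""
    return relations.get((subject, relation))

encode("swan", "is", "black")
encode("thunder", "is caused by", "lightning")
print(query("swan", "is"))  # black
```

The point of the sketch is only that subject, relation word, and object together form a phrase, so the grammar of the phrase, whatever grammar that is, does the representational work.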
... The ability to model concepts using words, phrases and even sentences combined with syntax is critical. [It might be the reason we humans reached our level of intelligence](https://www.themind.net/hypotheses/8yof9E9YTYu4vHQI4qgBcw). ...
... More specifically, a network of ideas, thoughts, propositions, observations, things that live in our minds. Imagine we can connect things in our minds with things in other people's minds. ...
... As the book "Where Good Ideas Come From: The Natural History of Innovation" explains, being able to make connections between ideas in different minds has been critical for human innovation. ...
... The brain predicts the "things" it expects to see, and then the sensory input those things would cause. It then verifies or corrects its conception of those things against the sensory input it actually receives. ...
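The predict-verify-correct loop can be sketched as a toy update rule. This is a minimal scalar illustration of the note's idea (not a neuroscience model); the names and the learning rate are hypothetical:

```python
# Minimal sketch of a predict-verify-correct loop: a belief about a
# "thing" is repeatedly corrected toward actual sensory input.
# Toy scalar model; function names and learning_rate are assumptions.
def perceive(prediction, sensory_input, learning_rate=0.5):
    """Nudge the predicted value toward the observed value."""
    error = sensory_input - prediction     # prediction error
    return prediction + learning_rate * error

belief = 0.0                  # the brain's initial conception
observations = [1.0, 1.0, 1.0, 1.0]
for obs in observations:
    belief = perceive(belief, obs)
print(belief)  # 0.9375 -- converging toward the observed 1.0
```

Each step halves the remaining prediction error, so the conception converges on what the senses actually report.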
... For example, she has shown that speakers of languages that describe spatial relationships in absolute terms (e.g. "north of" rather than "left of") are better at remembering the location and orientation of objects than speakers of languages that rely only on relative terms. ...
... That is, we assign language tokens to objects, not just for communication: for example, when young children name their dolls, or when someone comes up with a new concept and is eager to find a linguistic name for it, even before any need to communicate it. ...