Everything net
Everything relevant to the thought... With this insight, we turn to the field of Artificial Intelligence (AI) and find that, while still far from demonstrating general intelligence, many state-of-the-art deep learning methods have begun to incorporate key aspects of each of the three functional theories. ...
... But that cheap credit hasn’t actually created any real resources, so you end up with increased spending on both capital and consumer goods. As a result, inflation starts to inch up, and central banks are forced to raise interest rates to cool things off. ...
... But now things get weird. In the Austrian theory, a recession is a process of resource re-allocation from capital-intensive to capital-light industries. Society needs to produce fewer factories and machine tools and more beer and pairs of pants (or whatever). ...
... In the Austrian theory, it’s important that central banks don’t interrupt this process by pushing interest rates back down to unnaturally low levels, because that interferes with this necessary re-allocation process. Ok so let’s think about 2006 to 2009. ...
... Instead of re-allocating workers and other resources from home building to other sectors, you suddenly had almost every industry laying off workers—even ones that were not capital intensive and did not see strong growth in the 2000s. ...
... This @JosephPolitano writeup is invaluable for understanding how this works. https://t.co/mmlaHCOVYW https://t.co/xtCyVhGzZl The use of OER to measure housing costs strikes a lot of people as goofy when they first hear about it. ...
... But if you think through the alternatives it does seem like the least-bad option. Just throwing home prices into the basket is weird because most homeowners don't buy a home every year. It also doesn't account for the fact that mortgage rates change over time. ...
... You could survey homeowners on their mortgage payments, but then would you price it based on the mortgages people actually pay (which would include people who bought homes 20 years ago and therefore have tiny payments) or what they would pay if they bought their house now? ...
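The trade-off above can be made concrete with a little arithmetic. A hedged sketch with made-up numbers (not BLS methodology or data): if home prices jump while market rents rise modestly, putting the sale price in the basket produces a large "shelter inflation" reading even though most owners bought nothing that year, while the owners'-equivalent-rent approach tracks the flow of housing services actually consumed.

```python
# Illustrative sketch with synthetic numbers -- NOT actual BLS methodology.
home_price = {2020: 300_000, 2021: 350_000}   # hypothetical sale prices
market_rent = {2020: 1_500, 2021: 1_575}      # hypothetical monthly market rents

# Option 1: put the sale price itself in the basket.
# Yields a big one-year jump even though most owners bought nothing this year.
price_based = home_price[2021] / home_price[2020] - 1

# Option 2: owners' equivalent rent -- what the home would rent for.
# Prices the housing services consumed, not the asset transaction.
oer_based = market_rent[2021] / market_rent[2020] - 1

print(f"price-in-basket: {price_based:.1%}, OER: {oer_based:.1%}")
```

The point of the sketch is only that the two conventions can diverge sharply in a hot housing market, which is why the choice matters for the headline number.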
... You don't need to go to CPT to argue that "deep down" there is no such thing as causality. Newtonian mechanics is also governed by algebraic equalities, so it is symmetric, telling us that f=ma is as valid as a=f/m. ...
... There's a widespread view in inflation hawk circles that the BLS came under political pressure (in the 1990s and maybe before) to modify the CPI to reduce the measured inflation rate and hence reduce how much the government would have to pay in Social Security benefits. ...
... The CPI, the one used for adjusting Social Security benefits, tends to come in higher than the PCE index. Right now, for example, PCE inflation is 5.7 percent while CPI inflation is 7 percent. ...
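To see why the gap matters, here is a back-of-the-envelope compounding exercise using the two rates quoted in the thread and a hypothetical benefit level (the $1,000 monthly benefit and the ten-year horizon are illustrative assumptions, not data from the thread):

```python
# Hypothetical arithmetic: how a persistent CPI-PCE gap compounds in a benefit
# indexed to one measure or the other. Rates are the thread's snapshot figures;
# the benefit level and horizon are made up for illustration.
cpi, pce = 0.07, 0.057
benefit = 1000.0          # hypothetical monthly benefit
years = 10

cpi_indexed = benefit * (1 + cpi) ** years   # indexed to CPI
pce_indexed = benefit * (1 + pce) ** years   # indexed to PCE

print(round(cpi_indexed), round(pce_indexed))
```

Even a 1.3-point annual gap compounds into a materially larger benefit over a decade, which is exactly why the choice of index was politically charged.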
... So if Newt Gingrich forced the BLS to lower the CPI as a backdoor way of cutting Social Security payments, did he force the BEA to do the same thing so it wouldn't look suspicious? ...
... For 25 years prior to 2020, the prices of durable goods like cars, washing machines, and couches fell every single year. Then in 2021 that suddenly and dramatically reversed. https://t.co/BTtqgHOKxB This has some important implications. One is that this isn't just an American problem. ...
... It takes a year or two to spin up the capacity, but it's not rocket science. These are also largely global industries, so they won't be constrained by US labor supply. ...
... I'm reading @mattyglesias and thinking that bubble-phobia was a major factor behind the bad economic performance of the Bush and Obama years. https://t.co/p8zhQAIA2D https://t.co/xBi4kjjAXK In the early 2000s, people exaggerated the frothiness of the tech bubble and the harms from its crash. ...
... When a sluggish recovery finally started to gain steam in 2004-5, triggering a housing boom, people once again over-estimated its frothiness. This bubble frame caused the Fed to react too slowly to the onset of the Great Recession from December 2007 to September 2008. ...
... Then the idea that 1999 and 2006 were driven by unsustainable bubbles, as opposed to just being healthy economic booms, prevented people from recognizing how far the US economy was below potential from 2010 to 2015. ...
... People in the early 2010s assumed we had to live with a permanently shitty economy to avoid having more bubbles. But now we know that was wrong. The economy was still way below potential in 2015, and it's possible to recover rapidly from a recession with appropriate macro policy. ...
... THAT said, if you are A.I. and profitable, you will demand double the valuation instantly. Nothing beats controlling your own destiny. ...
... Human perception is a controlled/controlling hallucination process in which the brain 1. constantly generates hypotheses about the world around it, 2. makes predictions about incoming sensory signals based on these hypotheses, and 3. corrects those hypotheses according to the differences between observations (actual sensory signals) and predictions. ...
... The controlled hallucination process is also a hypotheses-prediction-observation correction learning process. Brains can learn brand new concepts by generating new hypotheses about the world, making predictions about the sensory input, and correcting according to observation (actual sensory input) ...
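The hypothesize-predict-correct loop described above can be sketched in a few lines. This is a minimal toy, not a model of any actual neural mechanism: a scalar "hypothesis" is nudged toward incoming signals by its own prediction error (the function name and learning rate are illustrative choices):

```python
# Toy sketch of the hypothesis -> prediction -> error -> correction loop.
# Names and the learning rate are illustrative assumptions, not neuroscience.
def perceive(observations, hypothesis=0.0, lr=0.1):
    """Refine a hypothesis about a hidden quantity from a stream of signals."""
    for obs in observations:
        prediction = hypothesis           # predict the incoming sensory signal
        error = obs - prediction          # prediction error
        hypothesis += lr * error          # correct the hypothesis toward the signal
    return hypothesis

# Fed a steady signal of 5.0, the hypothesis converges toward it.
estimate = perceive([5.0] * 50)
```

The same loop doubles as a learning rule: a brand-new hypothesis (here, the 0.0 prior) is shaped entirely by the stream of observations it tries to predict.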
... It's hard to imagine higher-level cognitive faculties without some form of hierarchical information processing. ATB proposed that such a hierarchy corresponds to the hierarchy of objects in the real world. This might be a bit too speculative. Columns learn from prediction errors. ...
... They can predict raw sensory inputs; they can also predict signals by other columns produced from sensory inputs. Thus, learning can happen when there are raw sensory input prediction errors as well as when there are other column signal prediction errors. ...
... That being said, there is no reason to believe that the hierarchy is a neat pyramid with a clear-cut division between layers. Any column can learn from any other column as long as its signals are useful. It's just that learning, and thus models, can happen many steps removed from the raw input signals. ...
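One way to make the "columns predicting other columns" idea concrete is a two-unit toy, where the higher unit never sees the raw input at all and learns only from the lower unit's output signal. This is a deliberately crude sketch under that one assumption, not a claim about cortical wiring:

```python
# Toy sketch: a higher "column" learns by predicting a lower column's output
# signal rather than the raw input -- learning one step removed from sensation.
# The Column class and its update rule are illustrative inventions.
class Column:
    def __init__(self, lr=0.2):
        self.estimate = 0.0
        self.lr = lr

    def step(self, signal):
        error = signal - self.estimate     # prediction error on this input
        self.estimate += self.lr * error   # learn from the error
        return self.estimate               # output signal, usable by other columns

lower, higher = Column(), Column()
for raw in [1.0] * 100:                    # a constant raw sensory stream
    out = lower.step(raw)                  # lower column predicts the raw input
    higher.step(out)                       # higher column predicts the lower's signal
```

Both units end up modeling the world, but the higher one did so purely through another column's classification signal, which is the point of the thread's hypothesis.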
... High-level concepts and the relationships between them exist linguistically in our brains, and cognitive functions based on these concepts and relationships are also encoded in sentence-like linguistic memories. Our brains can a) store models of the world in this sentence-like linguistic memory, e.g. ...
... "Deers come to this spot when there is a drought." and b) Construct new knowledge/predictions by constructing new sentences following syntax rules E.g. "there is a drought now, if we go to this spot we might find deers." ...
... High-level human cognitive functions are the enterprise of our brains employing these two faculties. We don't have dedicated circuitry for each model expressed in linguistic memory; we just need the basic circuitry for language processing. ...
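The store-a-sentence / derive-a-sentence pair above can be sketched as a tiny rule system: world knowledge lives as sentence-like rules, and one generic inference procedure composes them with current facts to yield new predictions. Everything here (the rule, the facts, the function) is a hypothetical example, not a cognitive model:

```python
# Sketch: world models stored as sentence-like rules, plus one generic
# inference procedure -- no dedicated circuit per rule. All content is made up.
rules = {("drought", "go to the spot"): "might find deer"}  # stored "sentence"
facts = {"drought"}                                         # current observations

def predict(action):
    """Derive a prediction by composing stored rules with current facts."""
    for (condition, act), outcome in rules.items():
        if condition in facts and act == action:
            return outcome
    return None

prediction = predict("go to the spot")
```

Note that adding a new "model" is just adding an entry to `rules`; the inference machinery (`predict`) is shared, which mirrors the thread's claim that only generic language-processing circuitry is needed.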
... Boroditsky has conducted a number of studies that have shown how the language we speak can influence our perception of time, space, and other aspects of our environment. For example, speakers of languages that use different words for different types of snow (e.g. ...
... We can understand it through biology, evolutionary theory, and social science. Almost every aspect of human behavior can be explained by the internal logic uncovered by these scientific studies. ...
... It's only when we truly understand the origin of these behaviors that we can possibly make some choices in deciding the meaning of being human. ...
... Cortical columns make classifications based on raw sensory inputs as well as classification signals from other columns. They learn by predicting incoming raw sensory inputs as well as classification signals from other columns. ...
... One column's prediction error helps its own learning, as well as other columns' learning when the prediction was made with input from their classification signals. What do you think of my hypothesis? ...
... **ChatGPT**: Your hypothesis describes a process that is similar to how the brain is thought to process and classify sensory information. Cortical columns, which are groups of neurons in the cerebral cortex, are thought to play a role in processing and classifying sensory information. ...
... They do this by receiving input from sensory organs and then making predictions about what this input represents. These predictions are then compared to the actual input, and any discrepancies between the two are used to update the column's understanding of the sensory information. ...
... It is also thought that cortical columns communicate with each other, sharing information about the sensory input they have received and the predictions they have made. ...
... This can help to improve the overall accuracy of the predictions made by the brain, as different columns may have different strengths and areas of expertise in processing different types of sensory information. ...
... Overall, your hypothesis captures some of the key principles of how the brain processes and classifies sensory information, and it is generally consistent with current understanding of this process. ...
... However, it is important to note that the brain is a complex and multifaceted organ, and our understanding of how it works is still evolving. There may be additional factors and processes at play that are not captured in your hypothesis. ...
... In some sense, language and its syntax provide the programming language for brains, reducing the need for specialized neural circuitry. ...
... To reach the same level of intelligence without vastly more neural circuitry, artificial neural networks need to be able to do the same thing. ...