View Full Version : The Political Economy of Chip Density
jake williams
16th September 2009, 22:57
A comrade of mine is an electrical engineering prof. He's doing a class right now on microchip manufacture.
He has this theory that a lot of the economic problems right now are due to, basically and as far as I understood what he was saying, the declining rate of increase in chip density. Basically, we're reaching a point where we can't increase computer capability at the rate we're used to, and this cuts into profits, and it contributes (at least contributes) to an economic crisis.
Thoughts? I thought it was at least a really interesting argument.
Paul Cockshott
16th September 2009, 23:08
The decline in the rate of increase in density has long been predicted. Moore himself was reputedly giving lectures 20 years ago saying he would give Moore's law only another 10 years; then 10 years ago he said he would give it another 10.
Jokes aside, we are going to hit a limit set by the thinness of the insulation at some point, though that may be some distance off.
The effect on the rate of profit is not obvious. A slackening of computer progress would tend to slow the moral depreciation of capital stock, and through non-obvious mechanisms this would reduce the equilibrium rate of profit. See the paper "From Adam Smith to the Equilibrium Rate of Profit" I have just posted on the Reality website ( I am not allowed to post links so do a google search on "Reality Political Economy Pages")
thethinkingchimp
16th September 2009, 23:20
Computerworld - Researchers at IBM are experimenting with a combination of DNA molecules and nanotechnology to create tiny circuits that could form the basis of smaller, more powerful and energy-efficient computer chips that also are easier and cheaper to manufacture.
IBM said last week that it's looking to use the DNA molecules as scaffolding so carbon nanotubes can assemble themselves into precise patterns.
The ability for the DNA structures to self-assemble would lead to greater precision in the design and manufacture of chips, said Greg Wallraff, an IBM Research scientist working on the project. He noted that implementation is still years away.
Dan Olds, an analyst at Gabriel Consulting Group Inc., said that "harnessing biological processes and building blocks" could significantly cut chip-building costs.
I would be ecstatic if this actually gets implemented. You never know, though; a few years ago Microsoft rushed to get a patent on using human skin as a viable conductor. In the future, if this process is found effective and used, computer chips may be able to keep getting faster each year for a few years longer. There is more about this topic to be found in a simple Google search.
RedAnarchist
16th September 2009, 23:20
( I am not allowed to post links so do a google search on "Reality Political Economy Pages")
Offtopic, but in regards to links, you'll be able to do so after you reach 25 posts. This is simply an anti-spam feature of the forum.
JJM 777
17th September 2009, 11:40
He has this theory (...) we're reaching a point where we can't increase computer capability at the rate we're used to, and this cuts into profits, and it contributes (at least contributes) to an economic crisis.
This theory makes sense from a Capitalist point of view, but no sense at all from a Socialist point of view.
You don't need to be a rocket scientist to notice that computers are getting a bit more effective all the time, without any giant leaps. And just coincidentally, also Microsoft operating systems are getting a bit heavier all the time, requiring more memory and CPU speed from the computer.
I have worked in the computer administration of a big company, and 5 years ago we assessed that a 400 MHz CPU and 128 MB of memory were the minimum for computers running Windows XP in the company. Any hardware that didn't meet these minimums was thrown away 5 years ago, because it was useless for the company.
Over time, as the Windows updates and service packs made the XP operating system heavier, we had to raise these minimum requirements, because users were complaining that the old hardware could no longer run Windows XP comfortably.
- We changed the minimum requirement to 600 MHz CPU and 256 MB memory. Throw away some old computers and buy new ones.
- Later we again had to change the minimum requirement to an 800 MHz CPU and 384 MB of memory. Again throw away some old computers and buy new ones.
- And again later we had to change the minimum requirement to 1 GHz CPU and 512 MB memory. Again throw away some old computers and buy new ones.
So the computer technology is getting faster, and people buy new computers as their old computers are "not fast enough" any more. (They were fast enough before, but thanks to the new heavy updates and service packs of Windows XP, they are not fast enough any more.)
From a Capitalist viewpoint this makes perfect sense: people buy something new all the time, money rotates in the economy. So people have jobs and everyone is merry at the stock market.
From a Socialist or ecological viewpoint the "short product lifecycle", a key element of modern Capitalism, makes absolutely no sense at all. People work to get money to buy products they already have at home, but the short-lifecycle effect forces them to throw away what they have, so that they must buy again the same thing they already had.
From a Socialist viewpoint, such a rat race does not mean "economic affluence". Quite the contrary: it means that people work like mad without really improving their standard of living at all. But from a Capitalist viewpoint it means great economic affluence -- for the people who are in a position to fleece a share of all money transactions on the market whenever something is produced or sold in society.
jake williams
17th September 2009, 16:19
From a Capitalist viewpoint this makes perfect sense: people buy something new all the time, money rotates in the economy. So people have jobs and everyone is merry at the stock market.
From a Socialist or ecological viewpoint the "short product lifecycle", a key element of modern Capitalism, makes absolutely no sense at all. People work to get money to buy products they already have at home, but the short-lifecycle effect forces them to throw away what they have, so that they must buy again the same thing they already had.
He explicitly said that he thought the rate of growth in computer technology was manipulated to create that sort of planned obsolescence. He wasn't agreeing with it; he was arguing that that's what was going on.
I also didn't really think that computer technology growth could have that sort of a rate-of-profit effect at this stage - but given it's his professional field I think it's worth considering.
Psy
20th September 2009, 18:39
It is more than bloated software (though that plays a part). Look at arcade boards and video game consoles, where you now see diminishing returns: rather than hardware trailing the demands of developers, as in the past, hardware is now ahead of what developers demand.
MarxSchmarx
23rd September 2009, 08:06
I always thought the solution wasn't faster chips but more chips (multicore) and that it is only a matter of time before software catches up. So I guess I disagree with the premise that improved computational efficiency has started to grind to a halt.
Anywho:
He has this theory that a lot of the economic problems right now are due to, basically and as far as I understood what he was saying, the declining rate of increase in chip density. Basically, we're reaching a point where we can't increase computer capability at the rate we're used to, and this cuts into profits, and it contributes (at least contributes) to an economic crisis.
Thoughts?
It is a ridiculous, myopic argument, so absurd only an academic would make it. Maybe, MAYBE he could have claimed that if the Soviets had our computing capability they might have gotten a little further along in the Cold War.
But as regards the current economic crisis, hasn't this guy heard anything of the housing bubble in America? Unsustainable consumer, government, and corporate debt? Not to mention growing inequality, imperialist globalization, Mideast oil. Hell-ooo?? It's all I've heard about for the last year. Under what rock has this person been living??
Typical ivory-tower: caught up in thinking your work is so damned important, you can choose to know so little about so much. Oh, and if you don't have to offer any evidence except for trends in your own field, all the better to abet your argument.
Psy
27th September 2009, 01:39
I always thought the solution wasn't faster chips but more chips (multicore) and that it is only a matter of time before software catches up. So I guess I disagree with the premise that improved computational efficiency has started to grind to a halt.
Multicores are harder to program and require more complex code that takes more cycles to run; thus multicore only delays the problem of stagnant clock speeds.
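The diminishing return Psy describes is usually formalized as Amdahl's law: if a fraction p of a program can run in parallel, the speedup on n cores is at most 1/((1-p) + p/n). A minimal sketch (the 90% figure is an illustrative assumption, not a number from the thread):

```python
def amdahl_speedup(p, n):
    """Upper bound on speedup when fraction p of the work runs on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a program that is 90% parallel tops out below 10x,
# no matter how many cores are added.
for n in (2, 4, 16, 1024):
    print(n, round(amdahl_speedup(0.9, n), 2))
```

The serial 10% dominates as n grows, which is exactly why adding cores delays rather than solves the clock-speed stall.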
revolt4thewin
29th September 2009, 02:10
Transistor layers can be stacked two or three deep at most, but that would require more exotic cooling, while optical interconnects will improve IPC.
MarxSchmarx
1st October 2009, 06:02
Multicores are harder to program and require more complex code
Right, but the point is that this is just a software issue; it is not insurmountably complex, and if you can scale by expanding the number of chips, the sky's the limit (at least in theory, excluding issues like connectors).
Psy
1st October 2009, 14:32
Right, but the point is that this is just a software issue; it is not insurmountably complex, and if you can scale by expanding the number of chips, the sky's the limit (at least in theory, excluding issues like connectors).
The problem is threads waiting for other threads. For example, even with a math co-processor, the CPU has to idle if it needs the answer to the equation it handed the co-processor before it can run the next line of code, because it will be using that answer in the next operation.
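The idle-CPU scenario above is a plain data dependency: the main thread must block on the offloaded result before it can proceed. A minimal sketch using Python futures (the squaring workload is invented purely for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

def slow_math(x):
    # Stand-in for work handed off to a co-processor.
    return x * x

with ThreadPoolExecutor() as pool:
    fut = pool.submit(slow_math, 21)
    # The "next line" depends on the offloaded answer, so .result()
    # blocks here -- the main thread idles exactly as described above.
    answer = fut.result()
    print(answer + 1)  # dependent operation can only run now
```

No amount of extra cores helps with this step; the dependency forces the wait.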
jake williams
1st October 2009, 20:14
Right, but the point is that this is just a software issue; it is not insurmountably complex, and if you can scale by expanding the number of chips, the sky's the limit (at least in theory, excluding issues like connectors).
The other point he made was that with parallel processing etc. the increase is more arithmetic than geometric, and moreover your cooling has to increase in proportion to the increase in processing power, rather than hardly increasing at all, which vastly increases size. His point wasn't that there's no more progress to be made, but that at least for the foreseeable future we won't see the kind of RATE of progress we're used to (and, he argues, are economically dependent on).
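The arithmetic-versus-geometric distinction can be made concrete with a toy comparison (the six generations and unit increments are invented for illustration): density scaling compounds each generation, while bolting on cores adds a roughly fixed increment per generation.

```python
generations = range(6)

# Geometric growth: capability doubles each generation (Moore-style scaling).
geometric = [2 ** g for g in generations]

# Arithmetic growth: each generation adds one fixed increment (one more core).
arithmetic = [1 + g for g in generations]

print(geometric)   # compounding
print(arithmetic)  # fixed steps
```

After six generations the compounding path is more than five times ahead, and the gap itself keeps growing, which is the professor's point about the rate of progress.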
MarxSchmarx
2nd October 2009, 08:19
The problem is threads waiting for other threads. For example, even with a math co-processor, the CPU has to idle if it needs the answer to the equation it handed the co-processor before it can run the next line of code, because it will be using that answer in the next operation.
You are correct. That's an inherent problem in any serial code. Given that most serial codes have been optimized, I think the problem lies in running multiple serial codes. For example, a system of 100 evolving differential equations can usually be treated as 100 separate one-equation serial codes with sufficient communication between nodes. The latter is a hardware problem that multicore chips can still make progress on.
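The 100-equation decomposition can be sketched in a few lines: each equation updates from its own state plus one neighbour's value from the previous step, so all 100 updates are independent within a step and could run on separate cores, exchanging boundary values between steps. The coupling constants, step size, and initial data below are invented for illustration.

```python
def step_decoupled(y, dt, coupling):
    """One explicit-Euler step for dy_i/dt = -y_i + coupling * y_{i-1}.

    Each update reads only y[i] and the neighbour value y[i-1] from the
    PREVIOUS step, so the 100 updates are mutually independent -- the
    per-step neighbour read stands in for inter-node communication.
    """
    n = len(y)
    return [y[i] + dt * (-y[i] + coupling * y[i - 1]) for i in range(n)]

y = [1.0] * 100  # 100 equations, uniform invented initial data
for _ in range(10):
    y = step_decoupled(y, dt=0.01, coupling=0.5)
```

The per-step synchronization is exactly the communication cost MarxSchmarx calls a hardware problem: the arithmetic parallelizes; the neighbour exchange does not go away.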
The other point he made was that with parallel processing etc. the increase is more arithmetic than geometric, and moreover your cooling has to increase in proportion to the increase in processing power, rather than hardly increasing at all, which vastly increases size. His point wasn't that there's no more progress to be made, but that at least for the foreseeable future we won't see the kind of RATE of progress we're used to (and, he argues, are economically dependent on).
Fair enough. But I am still somewhat skeptical that progress won't be geometric. This is probably true in the consumer PC market, but for high-end stuff, going from 16 cores to 1,000 cores by 2011 is pretty routine, and there seem to be few theoretical or a priori engineering constraints on pulling off ever larger and more efficient multicore processors.
Psy
2nd October 2009, 14:33
You are correct. That's an inherent problem in any serial code. Given that most serial codes have been optimized, I think the problem lies in running multiple serial codes. For example, a system of 100 evolving differential equations can usually be treated as 100 separate one-equation serial codes with sufficient communication between nodes. The latter is a hardware problem that multicore chips can still make progress on.
Right, but if you look at gaming, which has been the largest push for more performance, they have mostly pushed large sub-routines onto their own processors, the largest being the graphics and sound processors (which at one time were handled by the CPU). Yet the PS3 has shown that it is difficult to divide up loads effectively beyond that, so some PS3 games leave many processors totally idle, because it is too much work for game programmers to break up the workload any further.
MarxSchmarx
5th October 2009, 05:33
Right, but if you look at gaming, which has been the largest push for more performance, they have mostly pushed large sub-routines onto their own processors, the largest being the graphics and sound processors (which at one time were handled by the CPU). Yet the PS3 has shown that it is difficult to divide up loads effectively beyond that, so some PS3 games leave many processors totally idle, because it is too much work for game programmers to break up the workload any further.
I can't speak for sound, but I know that for graphics the processors can trade a little accuracy for efficiency, allowing approximations to be spread over multiple cores to create the illusion, if you will, of what is being depicted, well enough that the human eye can't detect the difference.
But on some level, even with games, it is an open question whether people's expectations will outpace the state of the art. This is especially true in first-person shooters, where the marginal benefit of a handful of milliseconds versus a handful of nanoseconds is negligible from the standpoint of 99% of users.
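The accuracy-for-efficiency trade mentioned above can be sketched with a generic numeric trick (not taken from any particular GPU): a cheap square root seeded crudely and refined by only two Newton steps, wrong by well under one percent, which is far below what an eye would notice in a rendered frame.

```python
import math

def approx_sqrt(x, iterations=2):
    """Cheap square root: crude seed plus a few Newton refinement steps.

    Fewer iterations means faster but less accurate -- the same kind of
    per-pixel trade renderers make when they approximate.
    """
    guess = x / 2 if x > 1 else x  # crude starting guess
    for _ in range(iterations):
        guess = 0.5 * (guess + x / guess)
    return guess

exact = math.sqrt(2.0)
cheap = approx_sqrt(2.0)
print(abs(cheap - exact) / exact)  # relative error well under 1%
```

Dialing `iterations` up or down moves along exactly the accuracy/cost curve the post describes.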
Powered by vBulletin® Version 4.2.5 Copyright © 2020 vBulletin Solutions Inc. All rights reserved.