100+ GHz computer chips.
http://www.theregister.co.uk/2010/02/05/ibm_graphene_transistor/
Neat.
Now let's think up ways to saturate the additional capacity!
I'm most excited about the prospect of realtime raytracing. I've been programming with graphics shaders again recently, and although the available power there is phenomenal, there are too many limitations built into the hardware pipelines - they are streamlined like a racecar - you can't take them off-road smoothly.
Presumably it would allow for faster networking, although I'm a little stumped at what use that would be -- there isn't an equivalent increase in storage capacity that I know of. Though perhaps you could use that to your advantage by generating so much spurious traffic that your ISP can't possibly keep records of your activity - P2P live untraceable streaming?
Storage did a similar jump a while back; this levels that ratio a bit.
The limitation on internet speeds is last-mile stuff; the backbone is a long way from being strained. Getting the speed up is about building infrastructure.
I just want to rub my balls on this. I hope that costs less than actually buying one.
I for one am excited about speed increases on computers. It literally takes weeks to run a maximum parsimony analysis on sequences with most computers, which leads to using distance-related phenetic approaches like maximum likelihood and Bayesian instead because they are faster. The more computing power out there, the sooner systematics can be rid of those error-inducing algorithms.
I should explain that these won't necessarily make anything faster. I'm not sure about what Kai mentions, but for general day-to-day computing, the only things a normal person might do that would be made faster by one of these are video encoding and gaming (and that stuff will be very rapidly adapted to graphics cards, no doubt).
Bottlenecks in RAM, hard drive, and network still remain. Though Parkinson's law has never been wrong yet, so expect to need to buy one of these in order to run Windows 9.
Oh, and I will be laughing my ass off if IBM keeps up its current fab technology deals and AMD/ATI gets this but Intel doesn't. :lulz:
I'm tired of Intel, and I hope this sinks them.
I hope they'll bundle these systems with racetrack memory, which IBM is also developing. Perhaps that could solve parts of the bottleneck.
Quote from: Requia ☣ on February 06, 2010, 07:01:49 PM
I should explain that these won't necessarily make anything faster. I'm not sure about what Kai mentions, but for general day-to-day computing, the only things a normal person might do that would be made faster by one of these are video encoding and gaming (and that stuff will be very rapidly adapted to graphics cards, no doubt).
Bottlenecks in RAM, hard drive, and network still remain. Though Parkinson's law has never been wrong yet, so expect to need to buy one of these in order to run Windows 9.
Awww. :sad:
Quote from: Requia ☣ on February 06, 2010, 07:01:49 PM
I should explain that these won't necessarily make anything faster. I'm not sure about what Kai mentions, but for general day-to-day computing, the only things a normal person might do that would be made faster by one of these are video encoding and gaming (and that stuff will be very rapidly adapted to graphics cards, no doubt).
Bottlenecks in RAM, hard drive, and network still remain. Though Parkinson's law has never been wrong yet, so expect to need to buy one of these in order to run Windows 9.
From what Kai said I assume that it is an algorithm, in which case an increase in CPU speed would vastly decrease the time it takes for the algorithm to be completed.
If you can keep all the data in registers or cache, then yeah... but the question is how much data are you comparing between different taxa? E.g. here (http://www.mun.ca/biology/scarr/2900_Parsimony_Analysis.htm) it looks (from my uneducated quick scan) like it's comparing DNA sequences... in which case you're talking quite a bit of data -- wiki puts the human genome at around 750 MB uncompressed. CPU cache sizes are (I think - I don't keep up) still in the single figures of MB, so I'd guess you'd have to be focussed upon a pretty small chunk of the genome for this to give a significant boost, assuming nothing else changes.
Given my current laptop has a 1.66 GHz computer chip, one of these would be nice. Shit, I can't even play games that came out a year before this model.
Quote from: FP on February 06, 2010, 11:31:59 PM
If you can keep all the data in registers or cache, then yeah... but the question is how much data are you comparing between different taxa? E.g. here (http://www.mun.ca/biology/scarr/2900_Parsimony_Analysis.htm) it looks (from my uneducated quick scan) like it's comparing DNA sequences... in which case you're talking quite a bit of data -- wiki puts the human genome at around 750 MB uncompressed. CPU cache sizes are (I think - I don't keep up) still in the single figures of MB, so I'd guess you'd have to be focussed upon a pretty small chunk of the genome for this to give a significant boost, assuming nothing else changes.
I think I read that the new 100 GHz chips also have gargantuan caches, on the order of 1 GB.
Which is stupidly large, if this is true.
Unlikely. I can definitely see card-slot processors with huge L3 caches making a comeback, but that much memory would take up a lot of physical space with the large process being used at the moment.
As you say. Any proc that fast is pointless without a gigantic cache to match.
Another article made the 1 GB cache claim:
http://www.theinquirer.net/inquirer/news/1046135/new-chinese-chips-pose-threat-intel
Quote
Using 128-bit technology, the architecture, codenamed 'Long March', could deliver performance of up to 100GHz in as little as 18 months, say researchers. Built on a new 90 nm (nanometer) process, the Long March chips feature 1Gb level 2 cache and prototypes have been demonstrated running at 5GHz with no special cooling.
Quote from: Cain on February 06, 2010, 11:47:16 PM
Given my current laptop has a 1.66 GHz computer chip, one of these would be nice. Shit, I can't even play games that came out a year before this model.
Hmm, my (very rough) estimates put this on the same order of power as a top-end nVidia card... you really could do laptop gaming with it. Not needing to spend 300 dollars on an extra part is actually pretty useful.
Of course, the gaming companies will probably start making games that require even more hardware as soon as these hit the shelves.
RE: Sig,
A) That's the Inquirer.
B) Pentium 4, what?
C) Oh, it's from 2003.
D) Scam. To start with, you'd never be allowed to sell a Pentium 4 pin-compatible processor in the US. Intel has kept a tight rein on that bit of IP ever since it finally decoupled itself from AMD.
Quote from: Requia ☣ on February 07, 2010, 12:17:22 AM
Quote from: Cain on February 06, 2010, 11:47:16 PM
Given my current laptop has a 1.66 GHz computer chip, one of these would be nice. Shit, I can't even play games that came out a year before this model.
Hmm, my (very rough) estimates put this on the same order of power as a top-end nVidia card... you really could do laptop gaming with it. Not needing to spend 300 dollars on an extra part is actually pretty useful.
Maybe. I would think it was OK, given how it handles my current games, but a lot of stuff I look at suggests 2.2 GHz as a minimum, up to three in some cases. And obviously, I don't want to spend money just to have to send something back or wait until I upgrade.
I meant that the IBM chip could game.
Graphics power normally has little to do with the processor, because, well...
high-end Intel i7: ~60 GFlops
$50 low-end ATI card: ~90 GFlops.
A good processor will give a system that little extra something, but it's the graphics power that really does it, and it's hard to cram graphics power into a low-power, poorly cooled device like a laptop.
Quote from: Requia ☣ on February 07, 2010, 12:23:25 AM
I meant that the IBM chip could game.
Graphics power normally has little to do with the processor, because, well...
high-end Intel i7: ~60 GFlops
$50 low-end ATI card: ~90 GFlops.
A good processor will give a system that little extra something, but it's the graphics power that really does it, and it's hard to cram graphics power into a low-power, poorly cooled device like a laptop.
Actually the new Nvidia ION processor can play most games on medium. Hell, it can even play Crysis on low.
So I would suggest getting that if you want awesome graphics in a small laptop.
ION isn't a cpu.
Quote from: FP on February 06, 2010, 11:31:59 PM
If you can keep all the data in registers or cache, then yeah... but the question is how much data are you comparing between different taxa? E.g. here (http://www.mun.ca/biology/scarr/2900_Parsimony_Analysis.htm) it looks (from my uneducated quick scan) like it's comparing DNA sequences... in which case you're talking quite a bit of data -- wiki puts the human genome at around 750 MB uncompressed. CPU cache sizes are (I think - I don't keep up) still in the single figures of MB, so I'd guess you'd have to be focussed upon a pretty small chunk of the genome for this to give a significant boost, assuming nothing else changes.
Say you are doing a parsimony analysis on a 500 bp (base pair) long sequence of COI mtDNA (cytochrome oxidase I mitochondrial DNA) for 40 taxa, in which you have to compare the character state (C, T, G or A) of every position (the positions are considered homologues), determine maximum congruence (how well the characters fit to each other) while computing consistency (how well the characters fit to the topology), to infer a cladogram (a dichotomously branching phylogenetic tree)... yeah, it takes time.
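To make the shape of that computation concrete, here's a rough Python sketch of Fitch's small-parsimony scoring for a single character position on one fixed tree. It's only an illustration of the per-site, per-topology work, not how any real phylogenetics package necessarily implements it; the tree, sequences and function names are all made up.

# Rough sketch of Fitch small-parsimony scoring on a single fixed, rooted
# binary tree. The tree shape, states and names below are invented for
# illustration; real parsimony software does far more than this.

def fitch_score(tree, char_states):
    """Return the minimum number of state changes for one character.

    tree        -- nested tuples, e.g. (("A", "B"), ("C", "D"))
    char_states -- dict mapping taxon name -> state at this position
    """
    changes = 0

    def state_set(node):
        nonlocal changes
        if isinstance(node, str):          # leaf: its observed state
            return {char_states[node]}
        left, right = node
        l, r = state_set(left), state_set(right)
        common = l & r
        if common:                         # children agree: no change needed here
            return common
        changes += 1                       # disjoint sets cost one state change
        return l | r

    state_set(tree)
    return changes


# Toy example: 4 taxa, one aligned position each.
tree = (("A", "B"), ("C", "D"))
position = {"A": "C", "B": "T", "C": "T", "D": "T"}
print(fitch_score(tree, position))   # -> 1 change on this topology

The per-tree cost is tiny for 500 bp x 40 taxa (the raw character matrix is only ~20 KB, which fits comfortably in cache), so the weeks Kai mentions come from the number of candidate topologies to search, which grows explosively with the number of taxa.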
Quote from: Requia ☣ on February 07, 2010, 12:23:25 AM
it's hard to cram graphics power into a low-power, poorly cooled device like a laptop.
That is what I was responding to. I probably should have indicated that, lol...
Quote from: Kai on February 07, 2010, 02:46:49 AM
Quote from: FP on February 06, 2010, 11:31:59 PM
If you can keep all the data in registers or cache, then yeah... but the question is how much data are you comparing between different taxa? E.g. here (http://www.mun.ca/biology/scarr/2900_Parsimony_Analysis.htm) it looks (from my uneducated quick scan) like it's comparing DNA sequences... in which case you're talking quite a bit of data -- wiki puts the human genome at around 750 MB uncompressed. CPU cache sizes are (I think - I don't keep up) still in the single figures of MB, so I'd guess you'd have to be focussed upon a pretty small chunk of the genome for this to give a significant boost, assuming nothing else changes.
Say you are doing a parsimony analysis on a 500 bp (base pair) long sequence of COI mtDNA (cytochrome oxidase I mitochondrial DNA) for 40 taxa, in which you have to compare the character state (C, T, G or A) of every position (the positions are considered homologues), determine maximum congruence (how well the characters fit to each other) while computing consistency (how well the characters fit to the topology), to infer a cladogram (a dichotomously branching phylogenetic tree)... yeah, it takes time.
That sounds like these chips would help a lot, since the data set isn't that huge.
Quote from: Kai on February 07, 2010, 02:46:49 AM
Quote from: FP on February 06, 2010, 11:31:59 PM
If you can keep all the data in registers or cache, then yeah... but the question is how much data are you comparing between different taxa? E.g. here (http://www.mun.ca/biology/scarr/2900_Parsimony_Analysis.htm) it looks (from my uneducated quick scan) like it's comparing DNA sequences... in which case you're talking quite a bit of data -- wiki puts the human genome at around 750 MB uncompressed. CPU cache sizes are (I think - I don't keep up) still in the single figures of MB, so I'd guess you'd have to be focussed upon a pretty small chunk of the genome for this to give a significant boost, assuming nothing else changes.
Say you are doing a parsimony analysis on a 500 bp (base pair) long sequence of COI mtDNA (cytochrome oxidase I mitochondrial DNA) for 40 taxa, in which you have to compare the character state (C, T, G or A) of every position (the positions are considered homologues), determine maximum congruence (how well the characters fit to each other) while computing consistency (how well the characters fit to the topology), to infer a cladogram (a dichotomously branching phylogenetic tree)... yeah, it takes time.
fuck, I'd love to take a class or two on the hardcore computational algorithms that must be involved with this.
is it based on http://en.wikipedia.org/wiki/Longest_common_substring_problem ?
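In case it helps picture what that wiki link is about, here's a minimal dynamic-programming sketch of the longest common substring problem -- just the textbook O(n*m) table, with no claim that parsimony software actually uses it:

# Minimal dynamic-programming solution to the longest common substring
# problem (textbook version, purely illustrative).

def longest_common_substring(a, b):
    # table[i][j] = length of the longest common suffix of a[:i] and b[:j]
    best_len, best_end = 0, 0
    prev = [0] * (len(b) + 1)
    for i in range(1, len(a) + 1):
        curr = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                curr[j] = prev[j - 1] + 1
                if curr[j] > best_len:
                    best_len, best_end = curr[j], i
        prev = curr
    return a[best_end - best_len:best_end]


print(longest_common_substring("GATTACA", "TTACCA"))  # -> "TTAC"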
Quote from: Horrendous Foreign Love Stoat on February 07, 2010, 07:20:53 AM
Yeah. It would be nice, but as soon as we've bought the 100+ GHz chip, Left 4 Dead 5, Halo 6 and Civilisation 7 will require the 200+ GHz chip...
That'll probably be around the time game content development is outsourced to 'sweatshops' to deal with the need for a team of thousands just to keep up with market expectations of progress.
I imagine that a super-fast processor could be leveraged with clever coding to significantly speed up something where the biggest bottleneck is storage speed. That said, increases in processor speed (and decreases in storage cost, be it cache, RAM, disk space, etc.) have historically always eventually become excuses for lazier programming, negating their actual upsides. Then again, 100 GHz is enough of a jump that it may take a few years before everyone starts using bogo-sort and similar brain-damaged algorithms and the speed of typical applications bloats back down to average.
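For anyone who hasn't met it, bogo-sort really is as brain-damaged as it sounds: shuffle until sorted, with an expected running time around O(n * n!). A tiny sketch, purely for illustration:

# Bogosort: keep shuffling until the list happens to be sorted.
# Even a 100 GHz chip would choke on a few dozen elements.
import random

def bogosort(items):
    while any(items[i] > items[i + 1] for i in range(len(items) - 1)):
        random.shuffle(items)
    return items

print(bogosort([3, 1, 2]))  # -> [1, 2, 3] (eventually)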
Quote from: Enki v. 2.0 on February 07, 2010, 04:27:24 PM
I imagine that a super-fast processor could be leveraged with clever coding to significantly speed up something where the biggest bottleneck is storage speed.
Good point, I suppose the idea of speed/memory tradeoffs [see: rainbow tables] works the other way around as well.
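As a made-up example of the tradeoff running in that direction -- burning CPU cycles to save storage/network bandwidth -- you can compress data before it ever touches the slow link. The file path and payload here are invented:

# Illustrative only: spend CPU on compression so fewer bytes have to
# cross the slow storage or network link.
import zlib

def write_compressed(path, data, level=6):
    with open(path, "wb") as f:
        f.write(zlib.compress(data, level))

def read_compressed(path):
    with open(path, "rb") as f:
        return zlib.decompress(f.read())

payload = b"ACGT" * 1_000_000          # highly redundant, compresses well
write_compressed("/tmp/example.z", payload)
assert read_compressed("/tmp/example.z") == payload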
Quote from: Requia ☣ on February 07, 2010, 12:17:22 AM
Quote from: Cain on February 06, 2010, 11:47:16 PM
Given my current laptop has a 1.66 GHz computer chip, one of these would be nice. Shit, I can't even play games that came out a year before this model.
Hmm, my (very rough) estimates put this on the same order of power as a top-end nVidia card... you really could do laptop gaming with it. Not needing to spend 300 dollars on an extra part is actually pretty useful.
Of course, the gaming companies will probably start making games that require even more hardware as soon as these hit the shelves.
RE: Sig,
A) That's the Inquirer.
B) Pentium 4, what?
C) Oh, it's from 2003.
D) Scam. To start with, you'd never be allowed to sell a Pentium 4 pin-compatible processor in the US. Intel has kept a tight rein on that bit of IP ever since it finally decoupled itself from AMD.
Ack, rebuttal fail.
I'm just going to stop making claims about CS/CE for a while, and merely phrase my thoughts as musings in the form of questions.
Boring. Let me know when they can graft this chip to my brain along with a terabyte of storage and one of those things that let me stream infomercials 24/7 into my prefrontal cortex.
...I hate to ask, really, but why infomercials?
Quote from: Annabel the Destroyer on February 08, 2010, 05:32:29 AM
Boring. Let me know when they can graft this chip to my brain along with a terabyte of storage and one of those things that let me stream infomercials 24/7 into my prefrontal cortex.
Just hook an AM microwave transmitter to the audio output of a television receiver box and aim it at your head. Cheaper.
Quote from: Annabel the Destroyer on February 08, 2010, 05:32:29 AM
Boring. Let me know when they can graft this chip to my brain along with a terabyte of storage and one of those things that let me stream infomercials 24/7 into my prefrontal cortex.
Aren't we supposed to solve the world food problem first, or something?
Or is that just the restriction for being allowed to travel the stars?
Quote from: Horrendous Foreign Love Stoat on February 10, 2010, 09:14:34 AM
On a serious note, how come we just cannot clone people without brains as meat, in sinister vat farms, and well, harvest the shit out of em for organs, long pork and face bacon and whatnot, and use the liquidized slurry remainders to feed up the next batch?
Apparently, vat-grown meat tastes awful because it doesn't get any exercise.
Quote from: Enki v. 2.0 on February 10, 2010, 11:40:30 AM
Quote from: Horrendous Foreign Love Stoat on February 10, 2010, 09:14:34 AM
On a serious note, how come we just cannot clone people without brains as meat, in sinister vat farms, and well, harvest the shit out of em for organs, long pork and face bacon and whatnot, and use the liquidized slurry remainders to feed up the next batch?
Apparently, vat-grown meat tastes awful because it doesn't get any exercise.
Hmm, maybe this is also going to be a problem with the "eat the rich" program.
Quote from: Enki v. 2.0 on February 10, 2010, 11:40:30 AM
Quote from: Horrendous Foreign Love Stoat on February 10, 2010, 09:14:34 AM
On a serious note, how come we just cannot clone people without brains as meat, in sinister vat farms, and well, harvest the shit out of em for organs, long pork and face bacon and whatnot, and use the liquidized slurry remainders to feed up the next batch?
Apparently, vat-grown meat tastes awful because it doesn't get any exercise.
Then use electroconvulsive therapy on it too.