Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Topics - P3nT4gR4m

Pages: [1] 2 3 4 ... 16
News is a proper lulzfest right now (so much so I'm actually watching it)

In a nutshell, UK retailers have imported the great American tradition of slashing prices for a day then inviting everyone to come along to the stores and beat the shit out of each other.

So that's how it was advertised and it's Scotland so, in the interests of keeping alive our culture of drunken violence, my fellow countrymen took the retailers up on their generous offer of somewhere to go and kick fuck out of each other.

So a couple of stores got a bit trashed in the process and there was plenty of carnage. As advertised. Now the retailers are all upset cos... they got what they asked for?


So I'm sitting with P3nTw1F3 watching Interstellar at the local cinema last night. It's 15 mins from the end, the movie has been pretty fucking sweet so far, and then suddenly it freezes and we're left staring at a frozen image of Anne Hathaway while the staff try in vain to get mission control down south to reboot the projector. Turns out the IP tards don't trust local staff to run their projection systems, just in case they nip back after hours and shoot a telesync for piratebay. The upshot is I'm going to have to nip along to piratebay tonight and download a telesync just so I can find out how the fucking movie ends  :argh!:

In the meantime, feel free to regale me with outlandish tales of what happens in the final 15 mins.

Techmology and Scientism / Rosetta has landed!
« on: November 12, 2014, 03:05:49 pm »

Bring and Brag / Human
« on: October 03, 2014, 11:27:25 pm »
I stand here disconnected
and the visions fall like rain
there's a quiet desperation
flicker knives on feet of shame

You twist my head rejected
whisper solitude insane
I join the revolution
and I ride the crazy train

And the devils bring tomorrow
as the gods steal yesterday
This is not the world we dreamed of
It's the reason humans pray

Your touch ignites my terror
self loathing rears its head
Fragile silken barbs of will
twist and tear the words you said

See the fool who stands before you
spend your pity on his lies
Drifting consequence of knowledge
as we spit our last goodbyes

And the devils give us feelings
as the gods build fires of hate
We are slaves to our emotions
this is every human's fate

Techmology and Scientism / Promising development in solar
« on: October 02, 2014, 08:57:39 pm »
two-dimensional metallic dielectric photonic crystal

Here was me thinking materials science was all about the nanotubes and the graphene but fuck me there's so much more to it than that!  :eek:

Think for Yourself, Schmuck! / Reconventional Wisdom
« on: June 29, 2014, 12:55:57 pm »
What if some of the things that have always been wrong in the past aren't wrong any more?

Conversely, what if some of the things that we believe to be good ideas have become bad ones in current times?

What if, maybe, some things that even failed spectacularly one or more times only failed because of when they happened, the historical environment they happened in?

What if there's a whole bunch of stuff that's always been absolutely true and generally accepted to be true for so long that nobody thinks about it any more and merely dismisses the notion out of hand?

What if mixed in with flying cats and being able to walk through walls, there's some things that actually would be good ideas now?

Maybe it's a political system or an economics system. Maybe it's technological, maybe it's sociological. Logistics...

There's a fuckload of bad ideas out there to choose from. Question is - are they still bad ideas?

Techmology and Scientism / New theory of quantums
« on: June 25, 2014, 10:40:58 am »
Not sure about the source website - it's walking the fine line between science and woo - but if the information is accurate, maybe there's an alternative way of quantums that makes a lot more sense to me than Copenhagen?

Techmology and Scientism / Welcome to the Machine
« on: June 17, 2014, 08:40:52 pm »
Storage - 160PB. Latency - 250ns  :eek:

And the OS is open source. Hands up who wants one?

All Our Patent Are Belong To You

20th century industry - competition was the order of the day. Hardly surprising, given we're a warlike species. The keywords were Hostile Takeover, Price Wars, smashing the competition.

21st century industry - it's all about collaboration. Keywords are Open Source, Developers, Ecosystem, Partners.

Quantum mindfuck alert!  :eek:

Quote: "...could solve certain problems that could not be solved by any non-quantum computer, even if the entire mass and energy of the universe was at its disposal and molded into the best possible computer."

This is not a proper computer but, in much the same way a GPU isn't a proper computer, it's mindfuckingly fast at a specific narrow subset of computation.

First the caveat (the text in white letters on the graph). D-Wave has not built a general-purpose quantum computer. Think of it as an application-specific processor, tuned to perform one task — solving discrete optimization problems. This happens to map to many real world applications, from finance to molecular modeling to machine learning, but it is not going to change our current personal computing tasks. In the near term, assume it will apply to scientific supercomputing tasks and commercial optimization tasks where a heuristic may suffice today, and perhaps it will be lurking in the shadows of an Internet giant’s data center improving image recognition and other forms of near-AI magic. In most cases, the quantum computer would be an accelerating coprocessor to a classical compute cluster.
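To put some flesh on "discrete optimization problems" - here's a hedged sketch in plain Python (toy numbers, invented names, nothing to do with D-Wave's actual API) of the QUBO form these annealers are built around: minimise a quadratic function over binary variables. A classical machine can only brute-force it, which is the whole point:

```python
from itertools import product

def qubo_energy(x, Q):
    """Energy of binary assignment x under QUBO matrix Q:
    E(x) = sum over i, j of Q[i][j] * x[i] * x[j]."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def brute_force_minimum(Q):
    """Try every binary string. Exponential in n, which is exactly
    why special-purpose hardware for this is interesting."""
    n = len(Q)
    return min(product([0, 1], repeat=n), key=lambda x: qubo_energy(x, Q))

# Toy instance: reward switching x0 or x1 on, penalise having both on,
# penalise x2 entirely. Minimum energy is -1.
Q = [[-1, 2, 0],
     [0, -1, 0],
     [0, 0, 1]]
best = brute_force_minimum(Q)
```

An annealer attacks the same objective physically instead of enumerating all 2^n assignments - but the problem statement is this small.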


The team found that in U6, the Prp24 protein and RNA—like two partners holding hands—are intimately linked together in a type of molecular symbiosis. The structure yields clues about the relationship and the relative ages of RNA and proteins, once thought to be much wider apart on an evolutionary time scale.

"What's so cool is the degree of co-evolution of RNA and protein," Brow says. "It's obvious RNA and protein had to be pretty close friends already to evolve like this."

So I'd never thought much about the period in history when a puddle of amino acids turned into DNA. This article mentions that RNA evolved first and I've always had this vague gap in my head where - amino acids, and then something something something, and then BINGO - nanofactories!

The thing with gaps in my head is, quite often, I don't notice them. The other thing is, once I become aware of them, some administrative function demands they are filled in with as much reliable information as is available. So I come to the good people of PD, cos I know that, in matters such as these, it's often faster than Google.

Is there an equivalent to this somewhere with chemicals on the left and cells on the right?

This is not an argument. I'm not stating a position. This is a quarter baked thought that's begun materialising in my head and I'm writing it down to explore the development of a seed of an idea that I'm not too clear on yet, in the hope that highly intelligent random internet people might provide some input.

The reason I'm stating this is because (knowing me) I'll forget to e-prime or whatever the fuck and say things like "It's like this" or "that's the reason" as my train of thought goes barrelling down the track, and I want to get it straight from the get-go that I'm not assuming any of this bullshit, I'm merely examining it as an alternative to current models.

I will not defend it if challenged, because it's random crap that my brain is coming up with. I'm not even sure what the hell it is yet but it's piqued my interest so here goes...

I'm a software engineer. I'm a hacker. I work with software. My job is to communicate massively complex sets of instructions and conditional logic to machines which carry out these instructions on datasets and then communicate the results back to me or to the end user.

Something that I've always been vaguely aware of but never given much thought to is what it must be like for the people (a significant percentage?) who don't actually understand exactly what software is, how it behaves, and what it represents as a manifest phenomenon at the granular levels at which all software engineers encounter and interact with it.

So in general software geek parlance these individuals are commonly termed "users" with a small -u- (we're an elitist bunch, us code nerds) and these are the people who don't think of what they're doing as software, rather as the abstractions the designers expose them to. They're using a spreadsheet. They're sending an email. They're playing Angry Birds.

Then there's people who understand a fair bit more about the mechanics of the stuff we're dealing with, but their knowledge rarely extends very far below the abstract interfaces of the users. They might kinda grok what a config file does or be able to apply some basic conditional logic in their Excel formulae. We call them "Superusers", by way of differentiating.

Superusers get a capital letter at the start of their name and some of them, the chosen few, will be elevated to the exalted status of "Admin". Unlike the user and the Superuser, the admin is someone who we can leave in a room full of sharp objects and reasonably expect nothing too bad to happen as a result.

The admin may be someone who has absolute command of the system he administrates and yet, at the same time, from the engineer's point of view, he's just another user. He may have a perfect understanding of the operations of the system at the interface level, but he's still dealing with a construct, an emergent property, an abstract representation of the software itself.

So what is this software? What is it made of? How does it work? In a sense I could describe it as a language of syntax and semantics which we use to describe existent systems, existent materially or in the abstract, in a way which effectively mimics or parallels the operations of the systems they describe.

That old cliché about language being "alive", referring allegorically to poetry and prose? No, I mean these "languages" are alive in the "Frankenstein" sense. The words and the sentences speak themselves, and they grow and sprout and branch in direct correlation to any existent system they represent and, as a result, fluency in these languages grants the hacker something that's akin to a new sense, a new way of interpreting external information - the sense of code.

A coder develops an intrinsic sense of code which, once developed, isn't something that you just turn off when you step away from the computer, any more than you'd turn your sense of sound off when you stopped listening to the radio. Everything a coder observes, using the traditional, biological senses, can also be observed and filtered via the sense of code.

If all existent phenomena can be considered as a system, then all existent phenomena can be expressed, examined and extrapolated in code. If this is the case, then it follows that we can examine the efficiency, the accuracy and the performance of these systems, simply by analysing the resulting code descriptions.

Software has advanced rapidly over the last couple of decades, the systems that software represents refined and shaped and moulded by way of the software itself, which was quickly able to figure out increasingly efficient ways of carrying out the end goals any given system was required to achieve.

However, not all systems have been described in code. Many human systems, especially in the interpersonal and up to the societal sphere of interaction, still remain clunky and old and inefficient when observed with a sense of code. What a sense of code is, in the context of this framework, is a method of describing, objectively, the system being observed. This objectivity is the kicker. By the same token that it could be argued that objectivity is one of the fundamental strengths of scientific investigation, so too does code benefit from having objectivity built in.

As I said earlier, code can be used to describe all systems - biological, neurological, sociological, psychological - and, by declaring these systems in code and examining their operations through the lens of a well developed sense of code, maybe these systems can be analysed objectively, in a way that allows us to develop and evolve them through the iterative process of refinement which is a coder's stock in trade.

What I'm getting at here is not necessarily simulation; rather it's simply a description of a system of operations, described in a language that mirrors the operations of the system analogously. I find it difficult, at this stage, to further elaborate and communicate my idea in terms which a user, someone with little or no sense of code, would understand. How do you describe the colour blue to a blind person, or the sound of rain to someone who's deaf? Perhaps there's a coder reading this now who gets the drift of where I'm going and can think of a way to explain this? Maybe, like me, you've never given much thought to this "sense of code" idea but, now that you have, it seems to hold some merit?
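For anyone without a sense of code wondering what "describing a system in code" even looks like, here's the crudest sketch I can think of (plain Python, all names made up for illustration) - a supermarket checkout queue, an everyday human system, described rather than simulated:

```python
from collections import deque

class CheckoutQueue:
    """A bare-bones description of a first-come-first-served queue -
    the supermarket checkout, written down in code."""

    def __init__(self):
        self.people = deque()

    def join(self, person):
        # New arrivals go to the back of the line.
        self.people.append(person)

    def serve(self):
        # Whoever has waited longest gets served next.
        return self.people.popleft() if self.people else None

q = CheckoutQueue()
for name in ["Alice", "Bob", "Carol"]:
    q.join(name)
first = q.serve()
```

Nothing clever going on, but notice the description is objective and executable: the fairness rule ("longest wait is served first") isn't an opinion about the system, it's right there in `serve`, and you could refine or replace it and immediately see what the new system does.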

Techmology and Scientism / David Eagleman - What the actual fuck?
« on: May 31, 2014, 05:43:42 am »
This guy is just a crackpot, talking gobshite, right? :eek:


The ENIAC computer was switched on in 1946 and thus began the information revolution. It had a CPU clock speed of 100 kHz (100,000 machine cycles per second) and a proto-ALU that could perform a (then) staggering 357 multiplications or 35 division operations per second, with an operational uptime of around 50%.

The original form factor weighed in at 30 tons and stood an impressive 8 feet tall, 100 feet long and 3 feet deep. It set the US Army back an estimated six million bucks in today's money.


Ten years later IBM released the 305 RAMAC, introducing the world's first commercially available hard disk storage system. It was considerably smaller than the ENIAC at 30 foot by 50 foot, the hard drive taking up a further 16 square feet for a capacity of around 5 megs.

For all its diminutive stature, it could out-think the ENIAC by a factor of about 10 and, more importantly, it could commit its instructions and the results of its calculations to memory. It retailed for $100,000 or leased for $3,500 a month. All in, 1,000 units shipped before it became obsolete in 1962.


Ten years later HP entered the computer market with its "go-anywhere, do-anything" 2116A. This tiny device could fit comfortably in a small office. Integrated chips had by now replaced the vacuum valves of previous generations and the latest and greatest machines could now store tens of thousands of instructions and values in true 16-bit registers, churning out calculations in the order of tens of thousands per second.

The 2116A was one of these machines and you could have one delivered to your place of business for between $25,000 and $50,000, depending on options and upgrades.


Three years later the P3nT4gR4m MkI Biocomputational Device was released. None sold so far.


Seven years later Apple created the Apple I and sold it, in kit form, to home users. This was around the birth of personal computing but it's a bit unfair to compare the specs of what was essentially a whole new sub-revolution, so let's stay with the business sphere for a little longer and give this fledgling platform a little time to mature, shall we?

The pinnacle of business in '76 was the Cray-1. It was a big bugger at 58 cubic feet and weighing in around 2.5 tons, but by fuck it was fast. 166 million floating-point operations per second fast. One of these bad boys would set you back a cool $10 million, but at least you had the option of more reasonably priced, albeit significantly less powerful, alternatives.


Ten years later IBM released their first laptop. At $2,000 a pop, it came with a quarter of a megabyte of RAM and an 80C88 processor clocked at 4.7 MHz, and it was portable at a smidgeon under 6 kilos on the scales. But I'm comparing apples and oranges again - what else was happening?

Well, since '76 we'd been fast approaching the end of mainframes, with minis and superminis like the IBM 6150 arriving on the scene, supporting multiple "dumb terminal" users all vying for compute cycles at the timeslice trough. With 16 megabytes of RAM and a 300 meg hard drive, these shrunk the average mainframe room into a form factor resembling a modern desktop PC on steroids. It performed a couple of hundred thousand floating point operations per second.


Ten years later Intel released their 6th generation Pentium. Remember the first picture I posted, the old B&W one with the room full of wires and flashy lights? Well the picture above is the equivalent of several million of those. Five and a half million transistors in all, squeezed down to a little square slab of silicon about the size of your fingernail. Instructions per second? 500-odd million.


Ten years later, the Intel Core 2 Extreme - 49,161 million instructions carried out, to the letter, every second.
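Back-of-envelope, and comparing apples (multiplications) to oranges (general instructions), the jump from the ENIAC to the Core 2 Extreme works out at roughly a hundred-million-fold. In Python, using the figures above:

```python
# Figures from the timeline above (and yes, mult/s vs instructions/s
# is a crude comparison - this is scale, not rigour).
eniac_mults_per_sec = 357            # ENIAC, 1946
core2_instr_per_sec = 49_161e6       # Core 2 Extreme, sixty-odd years on

speedup = core2_instr_per_sec / eniac_mults_per_sec
# speedup comes out around 1.4e8 - order of a hundred million times faster.
```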

In other news, computation has spawned a new platform - the Smartphone which, contrary to the name, has very little to do with telephony and is in fact an ultra-portable mobile computing platform.


