ACCELER8OR

Oct 23 2011

Graphene: And Here Comes “Film” Computing

""){ ?> By Valkyrie Ice



You might recall I’ve talked about graphene repeatedly, and yes, this is yet another piece about it. In one H+ magazine article I discussed some of the remarkable properties of graphene. In another I touted a few developments that highlighted exactly how fast graphene research was moving. Just recently, I discussed IBM’s working graphene broadband frequency mixer and how it showed that graphene was well on its way to practical use in the electronics industry.

If you recall, I also spoke about the major remaining hurdle to graphene processors: the way graphene reacts when layered on a chip, clinging to the “hills and valleys” of the silicon and losing its extraordinary conductivity. I also discussed a couple of possible workarounds, one of which was layering graphene on a boron nitride surface. Well, here we are just a few months later, and what do I read but this?

That’s right. Layering graphene like an Oreo filling between two layers of boron nitride effectively eliminates the “hills and valleys” effect, creating an “isolation” zone in which the graphene is effectively uninfluenced by its environment. This means that the last hurdles to using graphene in electronics are falling fast, and that electronics using nothing but carbon are theoretically possible.

Yes, you heard that right. I already discussed the methods for creating graphene “nanoribbon” circuits, squared “U” transistors, controlled graphene “magnets”, quantum well “Qdot” displays, and other electronic devices. Now imagine putting all of these advances into one single device.

Because you see, I already did, several years ago, during my first articles on VR. I’ve been watching graphene development intensely because I have long known that sooner or later these issues would be solved, and that “film” computers could be one result.

What is a “film” computer? It should be pretty obvious given the article referenced above. Imagine a “sheet” of BN/G/BN in which the graphene layer has been patterned into a variety of electronic devices, from batteries to processors to memory. For the sake of simplicity, let’s imagine that layer has the rough computing power of a single iPhone 4. Now add a hundred more layers, maybe even a few thousand more, until that “sheet” is as thick as a piece of paper. If you are truly imagining that, the amount of computing power I’m discussing here is a little scary. Then let’s add a few more layers, configured to be cameras, display pixels, and solar panels.
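
To put a rough number on “a little scary,” here’s a back-of-envelope sketch in Python. The layer thickness, the paper thickness, and the per-layer performance figure are all illustrative assumptions for the sake of the thought experiment, not measured values.

```python
# Back-of-envelope estimate of a "film" computer's aggregate compute.
# All figures below are illustrative assumptions, not measured values.

PAPER_THICKNESS_M = 100e-6   # a sheet of paper is roughly 0.1 mm thick
LAYER_THICKNESS_M = 2e-9     # assume ~2 nm per BN/graphene/BN stack
IPHONE4_GFLOPS = 1.6         # rough figure assumed for an iPhone 4-class chip

layers = PAPER_THICKNESS_M / LAYER_THICKNESS_M
total_gflops = layers * IPHONE4_GFLOPS

print(f"Layers in a paper-thick stack: {layers:,.0f}")
print(f"Aggregate compute: {total_gflops / 1e3:,.1f} TFLOPS "
      f"(assuming one iPhone 4 worth of compute per layer)")
```

Under those assumptions you get on the order of fifty thousand layers in a paper-thin sheet, and tens of teraflops of aggregate compute. The exact numbers don’t matter; the point is the scale.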

What does that give us? How about an iPad-like device in which the entire surface acts as a display and camera, enabling touchscreen-like interaction over the entire surface. Imagine it weighing so little that the wind could blow it around, yet offering all the power of a corporate data center’s worth of computers. Imagine it never needing to be plugged in because it’s powered by ambient light and can store several months’ worth of energy from a few hours’ exposure.

Sound radical? An iPad on steroids? Let’s take that sheet of “paper” and wrap it around our eyes. Hell, make it a few thousand layers thicker and it will still barely weigh enough to be noticeable. Toss in a few hundred lidars and you’ve got a basic VR rig that overlays your vision and turns your entire world into an interactive computer interface. Layer it between cloth and your entire wardrobe could have more computing power than the fastest supercomputer of today. Cover your wall in it and you could play your favorite FPS life-sized.

That’s what I mean by “film” computing: being able to layer processing power over or into nearly anything. Silicon cannot do this, because it is rigid and inflexible, but graphene’s material and electronic properties lead to entirely new possibilities in both electronics and product design. Once we learn how to “print” graphene circuits, we could apply them to nearly everything, for almost any purpose.

And this news is just one more step towards that future. Piece by piece, the puzzle is coming together.

  • By Ice Ice Babay, October 24, 2011 @ 11:17 am

    I would gouge out my own eyes before I would allow them to be contaminated and ruined by some technological garbage which is also likely to cause massive inflammation/immune response and thus probably cancer as well. Man is not a machine. The cyborg is the ‘Nouveau Nosferatu’, it is undead. So destroy all cyborgs on sight.

    “The target of the Butlerian Jihad was a machine-attitude as much as the machines,” Leto said. “Humans had set those machines to usurp our sense of beauty, our necessary selfdom out of which we make living judgments. Naturally, the machines were destroyed.”

  • By Valkyrie Ice, October 24, 2011 @ 6:42 pm

    Better grab a spoon and start gouging then.

  • By elmo, October 24, 2011 @ 7:36 pm

    Nope. Sorry. Your fantasies have no more ability to be realized in reality than the fantasies of E.E. Doc Smith.

    and if they could, you would be opposed and stopped by a number of factors economic and political.

    your sick fascist delusions are nothing but DnD and comic books.
    your “singularity” aka “Nerd Rapture” is exactly as irrational and mythical as the Christian Rapture.

  • By elmo, October 25, 2011 @ 9:13 am

    with the hundreds of billions ‘o $ invested in silicon fabs, it will be a minimum of 20 years before any potential graphene chips are mass manufactured _at all_ ….

  • By Joe, November 2, 2011 @ 2:22 pm

    So, Ice Ice, are you killing people with cochlear implants now? What implant makes someone worth murdering? Elmo, market forces will determine if it’s worth retooling for any particular tech. Some of the stupidest assertions have been made by those simply trying to be sensible. We shall see…
