ACCELER8OR

Oct 23 2011

Graphene: And Here Comes “Film” Computing


You might recall I’ve talked about graphene repeatedly, and yes, this is yet another piece about it. In one H+ Magazine article I discussed some of graphene’s remarkable properties. In another I highlighted a few developments that showed just how fast graphene research was moving. More recently, I discussed IBM’s working graphene broadband frequency mixer, and how it proved that graphene was well on its way to practical use in the electronics industry.

If you recall, I also spoke about the major remaining hurdle to graphene processors: how graphene reacts to being layered onto a chip, clinging to the “hills and valleys” of the silicon and losing its extraordinary conductivity in the process. I also discussed a couple of possible workarounds, one of which was layering graphene on a boron nitride surface. Well, here we are just a few months later, and what do I read but this?

That’s right. Sandwiching graphene like an Oreo filling between two layers of boron nitride effectively eliminates the “hills and valleys” effect, creating an “isolation” zone in which graphene is essentially uninfluenced by its environment. This means that the last hurdles to using graphene in electronics are falling fast, and that electronics using nothing but carbon are theoretically possible.

Yes, you heard that right. I’ve already discussed the methods for creating graphene “nanoribbon” circuits, squared “U” transistors, controlled graphene “magnets”, quantum well “Qdot” displays, and other electronic devices. Now imagine putting all of these advances into one single device.

Because, you see, I already did, several years ago, in my first articles on VR. I’ve been watching graphene development intently, because I’ve long known that sooner or later researchers would solve these issues, and that “film” computers could be the result.

What is a “film” computer? It should be pretty obvious given the article referenced above. Imagine a “sheet” of BN/G/BN in which the graphene layer has been patterned into a variety of electronic devices: batteries, processors, memory, and so on. For the sake of simplicity, let’s imagine that layer has the rough computing power of a single iPhone 4. Now add a hundred more layers, maybe even a few thousand more, until that “sheet” is as thick as a piece of paper. If you are truly imagining that, the amount of computing power I’m discussing here is a little scary. Then let’s add a few more layers, configured as cameras, display pixels, and solar panels.
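To put rough numbers on that, here’s a back-of-envelope sketch in Python. Every figure in it is a loose assumption of mine rather than a measured value: a BN/G/BN stack taken as roughly 1 nm thick, a sheet of paper as roughly 0.1 mm, and an iPhone 4’s compute pegged at something like 1 GFLOPS.

```python
# Back-of-envelope: how many BN/G/BN compute layers fit in a paper-thin sheet?
# All figures are rough illustrative assumptions, not measured values.

GRAPHENE_MONOLAYER_NM = 0.34  # one graphene layer is ~0.34 nm thick
BN_MONOLAYER_NM = 0.33        # one boron nitride layer is ~0.33 nm thick
STACK_NM = BN_MONOLAYER_NM + GRAPHENE_MONOLAYER_NM + BN_MONOLAYER_NM  # one BN/G/BN sandwich

PAPER_THICKNESS_NM = 100_000  # a sheet of paper is ~0.1 mm = 100,000 nm

layers = PAPER_THICKNESS_NM / STACK_NM
print(f"Layers in a paper-thin sheet: ~{layers:,.0f}")  # ~100,000

# Assume each patterned layer matches an iPhone 4: on the order of 1 GFLOPS (a guess).
PER_LAYER_GFLOPS = 1.0
total_tflops = layers * PER_LAYER_GFLOPS / 1_000
print(f"Aggregate compute: ~{total_tflops:,.0f} TFLOPS")  # ~100 TFLOPS
```

With those guesses, a paper-thin stack holds on the order of a hundred thousand compute layers, which works out to roughly 100 TFLOPS.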

What does that give us? How about an iPad-like device in which the entire surface acts as both display and camera, enabling touchscreen-style interaction over the entire surface. Imagine it weighing so little that the wind could blow it around, yet offering all the power of a corporate data center’s worth of computers. Imagine it never needing to be plugged in, because it’s powered by ambient light and can store several months’ worth of energy from a few hours of exposure.
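How plausible is that last claim? Here’s another hedged sketch; the panel area, light intensity, conversion efficiency, and especially the milliwatt-scale average power draw are all assumptions I’ve chosen purely for illustration.

```python
# Rough energy budget for the "charge in hours, run for months" claim.
# Every number below is an illustrative assumption.

AREA_M2 = 0.045           # roughly iPad-sized surface (~24 cm x 19 cm)
SUNLIGHT_W_PER_M2 = 1000  # direct sunlight, the standard reference figure
EFFICIENCY = 0.10         # assumed conversion efficiency of the photovoltaic layers
CHARGE_HOURS = 4          # "a few hours" of exposure

harvested_wh = AREA_M2 * SUNLIGHT_W_PER_M2 * EFFICIENCY * CHARGE_HOURS
print(f"Harvested: ~{harvested_wh:.0f} Wh")  # ~18 Wh

DRAW_W = 0.01             # assumed average draw of 10 mW, far below today's tablets
runtime_days = harvested_wh / DRAW_W / 24
print(f"Runtime on stored energy: ~{runtime_days:.0f} days")  # ~75 days
```

Under those assumptions, a single afternoon in the sun buys a couple of months of runtime, but only if the device sips power at milliwatt scale. That is exactly the kind of efficiency graphene electronics would have to deliver.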

Sound radical? An iPad on steroids? Let’s take that sheet of “paper” and wrap it around our eyes. Hell, make it a few thousand layers thicker and it would still barely weigh enough to notice. Toss in a few hundred lidars and you’ve got a basic VR rig that overlays your vision and turns your entire world into an interactive computer interface. Layer it between cloth and your entire wardrobe could have more computing power than today’s fastest supercomputer. Cover your wall in it and you could play your favorite FPS life-sized.

That’s what I mean by “film” computing: being able to layer processing power over, or into, nearly anything. Silicon cannot do this, because it is rigid and inflexible, but graphene’s material and electronic properties open up entirely new possibilities in both electronics and product design. Once we learn how to “print” graphene circuits, we could be applying them to nearly everything, for almost any purpose.

And this news is just one more step towards that future. Piece by piece, the puzzle is coming together.
