Sunday, June 27, 2010

Future Tech: Quantum Redux

One of my favorite Physics topics was always "Particle in a Box." Mostly I like the name. It's just fun to say. This weekend I started explaining the concept to somebody before I got distracted on a tangent, but it got me thinking. Classical physics is nice, neat, and well buttoned up. Yet we are now entering uncharted territory in the subatomic realm. I have literally been in the thick of this acceleration, an inflection point in the words of Ray Kurzweil, who wrote an amazing book about the acceleration of technology development, The Singularity Is Near: When Humans Transcend Biology. He points out that the incredible and increasing rate of technological change dictates that it will soon be the dominant force of evolution in our lives.


Ten years ago we sequenced the human genome. The biology front is certainly exciting. By some accounts, Craig Venter and his team have created a fully synthetic bacterium. Companies like Affymetrix, Illumina, and others have created rapid sequencing platforms we can use for countless applications. I used Affymetrix arrays working with Justin Borevitz when he was at the Salk Institute to investigate alternative splicing. The company where I work, Cyntellect, has developed a line of cellular analysis instruments for stem cell and cancer applications, which let you manipulate, analyze, and quantify cells used for therapeutic purposes, drug discovery, or drug production.


On the Neuroscience front, IBM already built Deep Blue, the first machine to beat a reigning world chess champion. Now they are working on a machine, "Watson," to win at Jeopardy! by fundamentally understanding natural language! We already have autonomous cars that can handle urban and outdoor environments courtesy of DARPA, who also funded a project I worked on to build soccer-playing Segway robots at the Neurosciences Institute a few years ago. I also happen to believe that in a short time a combination of memristors, which act much like synapses in silicon, and quantum computers will allow us to create systems which can process both logical and analog problems with incredible capacity.


Harnessing true AI and cellular machinery certainly depends on nanotech. The space elevator is now theoretically possible due to the strength of carbon nanotubes, which also promise to make incredibly efficient and compact transistors. 3D holographic chips promise to revolutionize storage capacity beyond today's amazing multi-layer DVD and Blu-ray discs. MEMS machines are now being built using semiconductor manufacturing techniques. At Luxtera we worked on optical structures built directly into silicon, and built 40G optical cables incorporating holographic optical inputs, using a Mach-Zehnder interferometer to modulate the light. MEMS accelerometers, gyroscopes, oscillators, and other devices are incredibly cheap and have no macroscopic moving parts, so they use very little power and rarely break. How do you think those Wii controllers last so long and are so cheap? How do you think a $100 8-megapixel camera now has motion compensation built in?
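For the curious, a Mach-Zehnder modulator splits the light into two arms, phase-shifts one arm relative to the other with the drive voltage, and recombines them. The ideal transfer function is a standard textbook result (my notation here, not a Luxtera spec):

$$I_{\mathrm{out}} = I_{\mathrm{in}}\cos^2\!\left(\frac{\Delta\phi}{2}\right), \qquad \Delta\phi = \pi\,\frac{V}{V_\pi}$$

where V_pi is the drive voltage that swings the output from fully on to fully off.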


So on to the Quantum world, and Modern Physics.


When I was a kid, somewhere around age 10 to 12, my family rented an RV and we drove to Canada and back. During this trip I got my hands on a first edition copy of The Illustrated Brief History of Time by Stephen Hawking. This book had a lot of content on matters of cosmology, the big bang, and other classical physics questions. It also covered relativity, which was the first great non-intuitive scientific theory, as well as the underpinnings of Quantum Theory. I believe this book sparked my deep interest in fundamental physics, and later the whole evolution of Quantum Mechanics. From then on I was often found reading interesting, cutting-edge physics books. I never felt drawn to research it myself, but found it a fascinating subject. Later, when I was 19, I took Quantum Physics with Vivek Sharma at UCSD. This guy was SO passionate about the subject, he had a good sense of humor, and was a good teacher. It was his lecture on "Particle In a Box" that I always remember. When I read this New Scientist article today about Quantum Machinery I figured it was time to write an article about Quantum Theory.


Last week I wrote the first part of an article covering the art of temperature sensing, specifically using an infrared temperature sensor, which works off the principle of black body radiation. A black body is an idealized notion in physics: an object which absorbs all incident radiation completely. In the early 1900s this hypothetical notion gave rise to the "ultraviolet catastrophe," which was basically this: if the Rayleigh-Jeans law held true, an object should emit INFINITE energy at short wavelengths. Max Planck resolved the spectrum by QUANTIZING the energy of the radiating oscillators, and Einstein soon took the idea further by quantizing light itself. This notion of quantization, of light and really of all physical properties, was the birth of Quantum Mechanics and Quantum Theory. A weird world where things don't make sense. A real twilight zone under what seemed to be a universe of sense.
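To make the catastrophe concrete, here are the two spectral radiance formulas in their standard textbook form (not from the article above):

$$B_\nu^{\mathrm{RJ}}(T) = \frac{2\nu^2 kT}{c^2} \qquad\text{vs.}\qquad B_\nu^{\mathrm{Planck}}(T) = \frac{2h\nu^3}{c^2}\,\frac{1}{e^{h\nu/kT}-1}$$

The Rayleigh-Jeans form grows without bound as frequency increases, so integrating over all frequencies gives infinite total power. Planck's version agrees with it at low frequencies, but the exponential in the denominator kills the emission at high frequencies, and the catastrophe disappears.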


Ahhh now particle in a box. This is one of the few quantum systems that can be solved analytically, so it's one of the first ones you learn. It is also the model which explains why electrons can only circle the nucleus at certain energy levels (resulting in the emission of discrete QUANTA of energy, in the form of light), which underpins our understanding of chemical bonding behavior. My favorite side effect, which I understand via the less idealized notion of a potential well, is that due to particle-wave duality, particles can probabilistically tunnel through barriers because of their wave behavior. This "Quantum Tunneling" property is what allows the plug on every device you plug in to work despite a layer of oxidation caused by the atmosphere.
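For reference, the standard textbook results (my notation): a particle of mass m confined to a box of width L can only take the discrete energies

$$E_n = \frac{n^2\pi^2\hbar^2}{2mL^2}, \qquad n = 1, 2, 3, \ldots$$

and for a finite barrier of height V_0 and width a (with particle energy E < V_0), the tunneling probability falls off roughly exponentially:

$$T \approx e^{-2\kappa a}, \qquad \kappa = \frac{\sqrt{2m(V_0 - E)}}{\hbar}$$

Notice the box never allows E = 0: even the ground state moves, which is the famous zero-point energy.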


This subject is thoroughly argued, and hotly debated. It's still a great area of mystery. I could write more for hours, but I figure that's a pretty good introduction. Some other interesting articles....


Quantum Cryptography
http://en.wikipedia.org/wiki/Quantum_cryptography


Wormholes 
http://en.wikipedia.org/wiki/Wormhole

Quantum Teleportation
http://www.research.ibm.com/quantuminfo/teleportation/

Large Hadron Collider
http://lhc.web.cern.ch/lhc/

Laser Interferometer Gravitational Wave Observatory
http://www.ligo.caltech.edu/

Monday, June 14, 2010

Infrared -vs- Thermal Junction Temperature Sensing Part 1.

Recently I came upon a situation where I needed to take very detailed temperature measurements across a large number of locations to validate an environmental control system. Alas, this was a subject I knew very little about. In my scenario I was attempting to validate the spread across numerous points, using a series of thermocouples to evaluate two optical temperature sensors in an environmental control system. So I had to do some research.

In my previous experience there had been two scenarios in which you measured temperature. For embedded devices, a digital temperature sensor that came calibrated (like a DS1820 from Maxim/Dallas Semi) was usually easy to use, and its 1-Wire bus could be easily bit banged with any old microcontroller, meaning you code the bus in C as opposed to needing hardware support like I2C/TWI etc. In IC design situations, bandgap voltage references and Voltage Proportional To Absolute Temperature (VPTAT) circuits do the job. In my experience they were usually used to give "perfect" reference currents for delicate analog circuits like the optical chips Luxtera makes. Of course that's until process variation kicks in, but that's outside the scope here.
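To give a flavor of what bit banging the DS1820's 1-Wire bus looks like, here's a minimal sketch in C. The gpio_* and delay_us() helpers are hypothetical stand-ins for whatever your microcontroller toolchain provides; the microsecond figures are the standard 1-Wire timings from the datasheet.

#include <stdint.h>

/* Hypothetical GPIO helpers -- swap in your MCU's register accesses. */
extern void gpio_drive_low(void);   /* pull the 1-Wire line low             */
extern void gpio_release(void);     /* release the line; pull-up takes over */
extern int  gpio_read(void);        /* sample the line: 0 or 1              */
extern void delay_us(uint32_t us);  /* busy-wait for N microseconds         */

/* Reset pulse: returns 1 if a device answers with a presence pulse. */
static int onewire_reset(void)
{
    int present;
    gpio_drive_low();
    delay_us(480);               /* hold low at least 480 us           */
    gpio_release();
    delay_us(70);                /* device pulls low ~15-60 us later   */
    present = (gpio_read() == 0);
    delay_us(410);               /* let the time slot finish           */
    return present;
}

/* Write one bit: a short low pulse means 1, a long one means 0. */
static void onewire_write_bit(int bit)
{
    gpio_drive_low();
    delay_us(bit ? 6 : 60);
    gpio_release();
    delay_us(bit ? 64 : 10);
}

/* Read one bit: start the slot, release quickly, sample within 15 us. */
static int onewire_read_bit(void)
{
    int bit;
    gpio_drive_low();
    delay_us(6);
    gpio_release();
    delay_us(9);
    bit = gpio_read();
    delay_us(55);
    return bit;
}

Wrap those in byte loops (bits go out LSB first) and you can issue Convert T (0x44) and Read Scratchpad (0xBE) to pull a reading out of the DS1820 with nothing but one spare pin.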

So I will start with the fun stuff. Infrared optical temperature sensing is basically a practical application of the black body radiation theory that was the first big step from classical physics to the quantum era. Nice! In a blackbody, all light is absorbed and re-emitted as a thermal spectrum. To me this is basically turning order into entropy. We know that the spectrum of a flame changes depending on its temperature. In fact the spectrum, and in particular the spectral peak, of the emitted light correlate well with the temperature of an object.
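That "spectral peak tracks temperature" relationship is Wien's displacement law, in its standard form:

$$\lambda_{\mathrm{max}} = \frac{b}{T}, \qquad b \approx 2.898\times 10^{-3}\ \mathrm{m\cdot K}$$

which is why a flame shifts from red toward blue-white as it gets hotter: higher temperature pushes the peak to shorter wavelengths.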

You can use the Stefan-Boltzmann equation, add in an emissivity factor, and get pretty good measurements. Emissivity is basically a measure of how close a material comes to an ideal blackbody, and therefore how well suited it is to this type of measurement.
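With the emissivity factor folded in, the Stefan-Boltzmann law for radiated power per unit area reads (standard form):

$$j = \varepsilon\,\sigma T^4, \qquad \sigma \approx 5.67\times 10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}}$$

where epsilon = 1 for a perfect blackbody. The sensor measures j over its band and inverts this relation to get T.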

Typical Emissivity Values

Material                   Emissivity
Nickel                     0.05
Aluminum Paint             0.45
Asbestos: Fabric           0.78
Plastic: acrylic, clear    0.94
Plastic: Black             0.95


As you can see there is quite a range. Nickel is terrible. Graphite is almost ideal. In our case clear and black plastic also happen to be pretty good. There are other factors (good application note) like sample-to-spot ratio (the beam's "dot" should be small compared to the sample size), and getting the beam close to the sample to reduce stray radiation. With a good material the results are very accurate, assuming there's no dirty air, smoke, stray light, and some other pretty obvious things. The part we're using claims 0.5°C accuracy, which is identical to the DS1820, and the results are instant (or basically the speed of light and a look-up table), while the 1820 has thermal mass which must equalize with its surroundings.
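As a toy illustration of that emissivity correction (not the firmware of the part we're actually using), here's the Stefan-Boltzmann inversion in C. For simplicity it pretends the sensor sees the whole spectrum as a power density in W/m²; a real part only sees a band and leans on calibration tables.

#include <stdio.h>
#include <math.h>

#define SIGMA 5.670e-8  /* Stefan-Boltzmann constant, W/(m^2 K^4) */

/* Invert j = emissivity * SIGMA * T^4 to recover temperature in kelvin.
 * j_measured: radiated power density in W/m^2 (idealized, full spectrum)
 * emissivity: 0..1, from a table like the one above
 */
static double ir_temperature_k(double j_measured, double emissivity)
{
    return pow(j_measured / (emissivity * SIGMA), 0.25);
}

int main(void)
{
    double j = 460.0; /* example reading, W/m^2 */

    /* The same reading implies wildly different temperatures depending
     * on the material -- which is why emissivity matters so much. */
    printf("Black plastic (e=0.95): %.1f K\n", ir_temperature_k(j, 0.95));
    printf("Nickel        (e=0.05): %.1f K\n", ir_temperature_k(j, 0.05));
    return 0;
}

Run it and the black plastic reading comes out around 304 K (a warm room), while the same signal off nickel would imply a scorching ~635 K, which is exactly why you check the table before trusting the gun.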

In part 2 I will examine thermocouple-based temperature sensing in comparison.