Wednesday, November 24, 2010

Future Tech II: Supersolids, Metamaterials, a Varying Fine Structure Constant, and Maxwell's Demon!

Interesting things are afoot these days. It's very exciting. Medicine, technology, software & networking, neuroscience, genetics, and many other fields are blowing up. Right now, however, we are experiencing some massive discoveries in the areas of fundamental physics and our understanding of materials. About six weeks ago I sent my friend Aaron Smith of the University of Arizona this blog post on Maxwell's Demon. I had not heard of this "thought experiment" but it was pretty cool.

The blog post references an article in Nature Physics about some theoretical physicists who have come up with a more concrete realization of an age-old problem known as "Maxwell's Demon." Basically it's a thought experiment that seeks to throw a wrench into the second law of thermodynamics. Aaron recommends an article by Charles Bennett in SciAm, "Demons, Engines, and the Second Law," and I found another interesting PDF from Bennett. Kind of like a math proof, this article reminds me that much of science is figuring out why something doesn't make sense.

Many people have poked holes in the theory/postulation, and Szilard, a Hungarian physicist, basically showed mathematically that INFORMATION is ENERGY! That's what I think is super cool. The Japanese scientists created an experiment that demonstrates this in the real world. Not information like the contents of a book, but to some degree the information gained by measuring a system can be converted into usable energy. Unfortunately, it would take more information than fills just about every computer in the world to power a spaceship, but nano-machines would require a reasonable amount and might someday be powered by "information-heat engines." The experiment converted information to energy with roughly 28% efficiency, according to The Economist. (This doesn't take into account the measuring device and feedback control, which when accounted for push the efficiency far into the negative. It is still a proof of principle because those energy costs can theoretically be made arbitrarily small.)
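To put a number on "information is energy" (my own back-of-the-envelope, using the Szilard/Landauer limit of kT·ln 2 of work per bit): at room temperature a single bit buys you an almost comically small amount of energy, which is why spaceships are out but nano-machines might be in.

```python
import math

def work_per_bit(temperature_kelvin):
    """Maximum work extractable from one bit of information
    (the Szilard/Landauer limit, k_B * T * ln 2)."""
    k_b = 1.380649e-23  # Boltzmann constant, J/K
    return k_b * temperature_kelvin * math.log(2)

w = work_per_bit(300.0)  # roughly room temperature
print(f"{w:.2e} J per bit")  # on the order of 3e-21 J
# A modest 1 kJ task (lifting 100 kg by about a meter) would need ~3e23 bits.
print(f"{1000.0 / w:.1e} bits for 1 kJ")
```

Hotter demons do better: the extractable work scales linearly with temperature.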

This reminds me of how a black hole sucking everything in would seem to violate conservation. (Huh? Don't think that's right for energy: black holes gain energy when they suck in matter and energy, and they lose energy when they radiate.) Hawking radiation is an example of how particle and antiparticle pairs materialize on opposite sides of the event horizon, conserving energy as it imparts momentum to one of the pair. Or something like that.

A couple more interesting things I read lately.

Not sure about a quantum supersolid, but it sounds sick. We know about superconductors and other crazy states of matter: quark-gluon plasma, superfluids, and the recently demonstrated trapped antihydrogen atoms at CERN. These are all things that probably don't happen in the real world outside of exotic places like the event horizon of a black hole, crazy collapsing stars and nebulae, or early in the life of the universe. Who KNOWS what's coming soon on so many fronts. Gonna be exciting times soon.

Alpha, a.k.a. the fine structure constant, could vary with time or space. This could be a major discovery: it would require a theory beyond the standard model of particle physics and cosmology, which would point a finger directly at string theory.

I'm sure you've heard buzz about metamaterials, and some recent stuff about a suit that could seemingly warp space by changing the speed of light. I'm sure we have some interesting things to learn from proving why this can never be made, or, if it can be, something even cooler.

Dirty Fuel, Dirty Ships, Dirty Air

Recently I have been reading about how dirty our system of seaborne transportation is. There are a lot of numbers out there, but many are confusing. From an article in the Guardian, which I consider reputable:

- 1.12bn tonnes of CO₂, or nearly 4.5% of all global emissions of the main greenhouse gas!!!

Dude. We know CO2 is bad. What's frustrating is that there are some VERY easy things we could do to cut these numbers, as well as pollution. From the New York Times: shipping company Maersk cut its fuel usage by 30% (and most CO2 emissions come from fuel usage) JUST BY CUTTING the speed of its ships in half! The article also observes that we could cut American cars' fuel use and CO2 emissions by 20% by going 10mph slower, from 65 to 55. I say do it, and stop building roads. It's definitely an indirect way to encourage more efficient public transportation. Imagine if we could confine heavy transport to certain times of day in big cities, easing traffic and cutting big trucks' fuel usage.
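A rough way to see why slowing down pays off so dramatically (my own simplification, not from either article): aerodynamic drag force grows roughly with the square of speed, so the drag energy burned per mile grows with the square too, and halving speed cuts it to about a quarter.

```python
def drag_energy_per_distance_ratio(v_new, v_old):
    """Ratio of aerodynamic drag energy per unit distance at two speeds.
    Drag force scales roughly as v^2, so energy per distance does too
    (and drag POWER scales as v^3)."""
    return (v_new / v_old) ** 2

# Half speed: drag energy per mile drops to a quarter.
print(drag_energy_per_distance_ratio(12.5, 25.0))  # 0.0625... no wait: 0.25
# 65 mph -> 55 mph: drag energy per mile drops to about 72%.
print(drag_energy_per_distance_ratio(55.0, 65.0))
```

Real fuel savings are smaller than the pure drag ratio because engines, rolling resistance, and (for ships) hull drag all behave differently, but the square-law trend is why modest slowdowns produce outsized savings.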

The International Maritime Organization is a UN body responsible for creating rules and regulations for international waters, including policies for pollution reduction. They have passed rules which will require the sulphur content of fuels used by ships in certain areas to be reduced. Alas, this doesn't apply in other areas. I'd love to see cleaner fuel everywhere, as well as a reduction in speed by cargo/tanker vessels. There's a lot we can do here without a ton of technology! Technology is cool too: a company has devised a massive sail for cargo ships, and ships can also use solar power for electric motors with zero emissions!

Sunday, October 3, 2010

The Holographic Principle and Information

So I read about a couple of interesting developments in the end-all physics category recently. One big one was the demonstration of something like Hawking radiation. This radiation was predicted by Stephen Hawking (hence the name) based on quantum particles materializing on either side of the event horizon of a black hole. A virtual event horizon was created using some interesting physical properties in a contrived way. The reason this is such a big deal is that the notion of a black hole "destroying" information (i.e. swallowing EVERYTHING at the event horizon) violates basic accepted principles of conservation. This good article from Scientific American covers the subject in greater depth.

I emailed my friend Dr. Aaron Smith, a physicist at the University of Arizona, a post I wrote a few months ago about the state of technology and innovation as I see it. My note got lost and he found it a bit later, and he was telling me about the holographic principle, which I think is best demonstrated at the boundary of a black hole, and even the boundary of the universe. I asked him to write on the subject for me, and his exposé is below.
Holography and information are two key notions that are beginning to invade the world of physics.  Beware, mind-melting may occur if you read further!

In short, the holographic principle in physics says that a given theory in D-dimensions is equivalent to a different theory that describes the D-1 dimensional boundary of that D-dimensional space.  So, the physical principles that describe our universe must be equivalent to a different set of physical principles that describe the behavior of the boundary of our universe.  Confused yet?  A convenient way to think about this is that everything in the universe can be represented as information, or encoded somehow, and everything on the boundary of the universe is just another representation of this same information, encoded differently.  A radical interpretation is emerging:  the universe may be made up entirely of information...
What realm of physics did this bizarre realization emerge from?  Black hole thermodynamics.  But before diving in, we need a quick review of entropy.

Entropy is a macroscopic property of a system originating from the microscopic details. Given a collection of particles with macroscopic properties (temperature, pressure, volume), there is a specific number of ways of reconfiguring the particles while still maintaining the same macro-properties. For example, take a gas in a tank at high temperature: if one particle moves slightly faster and another moves slightly slower, then the micro-configuration is different but the macro-properties are unchanged.

Entropy is simply the number of ways that a system can be reconfigured and still have the same macro-properties (actually, entropy = logarithm of the number of ways, but that is irrelevant for now).  
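A toy illustration of that definition (my addition, not Aaron's): treat N coins as the system and the number of heads as the macro-property. The entropy of a macrostate is then the log of how many coin arrangements produce that heads count.

```python
import math

def entropy(n_coins, n_heads):
    """Boltzmann-style entropy: the log of the number of
    micro-configurations (coin arrangements) consistent with
    the macro-property (total heads count)."""
    return math.log(math.comb(n_coins, n_heads))

# "All heads" has exactly one arrangement: zero entropy.
print(entropy(100, 100))  # 0.0
# "Half heads" can happen in ~1e29 different ways: near-maximal entropy,
# and correspondingly more information is needed to pin down which one.
print(entropy(100, 50))
```

High-entropy macrostates take more information to specify, which is exactly the entropy/information link described above.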

Entropy is intimately related to information: high entropy means it takes more information to specify the micro-details of a system.  Entropy and information may in fact be one and the same.

It turns out that the entropy of an isolated system can never decrease, which basically means that information can never be erased.  If you disagree because you erase information on your computer every day, then consider the fact that there is far more information in your computer than the bits you know of; there are all the details about every electron flowing throughout, the tiny amounts of radiation released, and so on.  So when you erase something on your computer, the information isn't truly gone, it has just been re-encoded in the microscopic properties of particles and fields in and around your computer.  But don't worry about your security, extracting that re-encoded information is next-to-impossible.  

And now, onto black holes.  A black hole has mass, charge, and angular momentum (they can spin) and has only one unique configuration once you specify those three parameters--at least that's what general relativity tells us (the "no hair theorem").  This suggests that a black hole has very low entropy since very little information is needed to fully describe it.  This leads to a disturbing paradox.  When a system with high entropy falls into a black hole, the entropy of that system must disappear and the information describing it would thus be erased.  This cannot be, as it violates sacred laws of physics.  The resolution is that the entropy of a black hole is actually quite large, ask Stephen Hawking if you don't believe me, and so it must have microscopic parts that can be reconfigured in many different ways and leave the bulk properties unchanged.  General relativity is thus missing some very important details.

This is all pretty abstract but what does it mean?  One more detour, first.  

The boundary of a black hole, the event horizon, is a region in which space-time is so warped that the interior and exterior are not causally connected; anything inside can never affect anything outside.  Hmmm, weird.  When a system falls into a black hole, an outside observer never actually sees it get past the boundary!  This is because of that warping of space-time.  An outside observer sees the infalling system get closer to the surface, eventually becoming a distortion of the surface.  The infalling system goes right through the boundary, no problem, and sees itself in the interior but can no longer see anything outside.  This means that the information describing the infalling system is now contained within the black hole and is also written onto the boundary somehow.

Here is the crux, folks: a description of the boundary of a black hole is informationally equivalent to a description of the interior.  At last, holography!

This holographic understanding of black holes has led to many new ideas in physics and has the potential to revolutionize the discipline.  Let's explore some more of the realm of holography.

A direct consequence of black hole entropy is that a given region of space-time has an upper bound on its total entropy.  In other words, you can't cram an infinite amount of information into a finite region of space.  This 'information capacity' is highly suggestive.  Space-time itself must be indivisible on some length scale; the universe must be truly digital!  Current estimates suggest that a spherical region of space, one meter in radius, can hold a maximum of about 10^70 bits of information (1 with 70 zeros after it)!  But remember, the universe is holographic, so it is really the area of the region of space we must consider, not the volume; two volumes of space will have a different information capacity if their boundaries have a different area.  In fact, the information capacity of a volume of space is simply proportional to the area.
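That 10^70 figure is easy to sanity-check (my own back-of-the-envelope, assuming the standard Bekenstein-Hawking counting of one quarter of a nat per Planck area, divided by ln 2 to get bits):

```python
import math

def holographic_bits(radius_m):
    """Maximum bits storable in a sphere, proportional to its AREA,
    not its volume: S = A / (4 * l_p^2) nats, over ln(2) for bits."""
    planck_length = 1.616e-35  # meters
    area = 4.0 * math.pi * radius_m ** 2
    return area / (4.0 * planck_length ** 2 * math.log(2))

bits = holographic_bits(1.0)
print(f"~10^{math.log10(bits):.0f} bits in a 1 m sphere")
# Doubling the radius quadruples the capacity (area scaling, not volume).
print(holographic_bits(2.0) / holographic_bits(1.0))  # 4.0
```

The area scaling in the last line is the whole point of the holographic bound: an 8x bigger volume only gets 4x the capacity.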

Another consequence.  Modern particle physics (AKA quantum field theory) must be extremely redundant.  The standard model of particle physics says there are many different particles, each with a large number of internal degrees of freedom.  A simple calculation shows that the amount of information that can be encoded in a region of space, using fundamental particles such as electrons and quarks, is far greater than what holographic reasoning suggests.  Therefore, particle physics is not a fundamental theory, but an effective theory in which the true information content of the universe is encoded redundantly.  What is the fundamental theory then?  The best known candidate is string theory, in which all things in existence can be broken down into very tiny vibrating strings; an electron is fundamentally a vibrating string and a quark is too, just vibrating differently.

Gravity, dark energy, dark matter, cosmic inflation, the big bang, the nature of time, and many other ideas in physics have seen the holographic principle lend an interesting perspective.  It is all very cutting edge stuff and a lot of speculation is going on right now.

So what was the definition of the holographic principle?  Well, it is too poorly understood to give a rigorous definition at this point in time.  But, as holographic ideas help us to better understand physics, we in turn better understand the mysterious holographic principle.  The race is on to figure it out, and it will likely be figured out by the current generation of physicists.  Keep your eyes and ears open folks!

For more, read the Scientific American article written by highly regarded physicist Jacob Bekenstein.  He is the one who originally proposed in the 70s that black holes have large entropy, proportional to the area of the event horizon.

The Wikipedia page also has some good insight and citations for further reading.

In the end that's a pretty cool reductionist viewpoint as I see it. The universe is nothing but our observed reality: information. Light and fundamental particles (gravity loops, quarks, strings, whatever) are all quantized, because they are just information as we see them. Of course there are lots of details, but after a century of basic physics becoming increasingly convoluted, we are ready for some understanding and simplification. Let's see what happens.

Wednesday, September 8, 2010

"Carpe Diem"

How does one, exactly, seize the day? I think it's a pretty complicated balance. In triathlon you balance consistency with a focused plan that gradually prepares you for race day incrementally, focusing on ALL the aspects that congeal into a passionate performance with perfect execution. I learned a lot about life as I slowly accepted you couldn't just train hard all the time without rest, recovery, and good nutrition.

So the reality is that you can't just go hard all the time, just about anywhere. I like to say that even a Ferrari needs an oil change! We all know the law of diminishing returns in economics terms: increase any one factor of production alone, like labor, and your rate of return diminishes. It's all about balance. If you build a faster car without better brakes, better handling, better cooling, and all the other things that matter, you will end up with a car that slams into the wall.

I believe it's very American to think "bigger, better, faster, more!" but not exclusively. As an engineer it's very easy to get sucked into doing too much. What a lot of people don't tell you is that when you don't focus on balance and recovery, YOUR utility decreases, just like they postulate in economics. I had to learn this the hard way, getting burned out at a few jobs in my time. One thing I was always good at was going hard. Of course that's only half the equation. You can only be on "output" for so long before you need some "input" time. What I learned gradually as I grew was how to be ready to POUNCE!

In one of my favorite books of fiction as a child, The Celestine Prophecy, some guru comes around and tells the protagonist that you should never feel bad for being lucky, for taking an opportunity that presents itself. "There are no coincidences," he says, and I think it's a good way to approach some parts of your life. The problem is, if you are running around talking on your cell phone, overloaded with work and duties, and never stop to smell the roses, you will just WALK RIGHT BY some of the most amazing things in the world. A very interesting social experiment by a famous musician, Joshua Bell, is a great example of this. The reality is that it takes a lot of work to keep your life simple and open enough to truly embrace the world as a source of inspiration while devoting yourself fully to your passion, but that is the key.

I read an excellent article last week on keeping your life simple to allow you to release your passion. I like some of the points the author highlights about leaving free time in your schedule, or "underscheduling" as he calls it. I also deeply believe that much of this time needs to be "off the grid." When I'm off the grid I am unfindable, often to people's frustration. That doesn't mean I'm not with some people sometimes, but I'm not letting the buzz of the world interfere with my experience. Not only must you be rested and have free time, but just as important is the "go hard" part of getting your work done. How can you focus on the beautiful world around you if you are always worried about things you didn't finish?

Alas, once again, it comes down to balance. I know that the tightrope between working hard and playing hard will forever be a challenge to walk.

Friday, August 27, 2010

The Tao of Triathlon

I recently read a few articles by Shane Eversfield I found through an email from USA Triathlon. He often writes about the physical, spiritual, and mental aspects of endurance training. I asked him for a guest post, as I feel he has a lot of insight to offer.

Movement patterns profoundly affect the brain's function. Case in point: I practice T'ai Chi daily. Using a book briefly in the beginning to "estimate the basics", I have continued to practice and refine the movements on my own for over 30 years now. Primary guidance comes from a diligent quest for perfect balance and orientation - a deep challenge as I move very slowly through the form, often with my eyes closed. After 8 years of self-guidance, I read a book about Taoism, an ancient Chinese way of life inextricably linked to T'ai Chi. The yin-yang symbol? Taoist. It expresses the cosmic dance of polar opposites - essentially, the animation of our universe. Not really a religion, or a philosophy, "Tao" translates as "the Way". Engaging an inquisitive "beginner's mind", the Taoist disciple embarks on a lifelong quest to investigate functional principles of our universe and to diligently train their application. Taoism is a way of perceiving, responding to and moving through the world around us. So is Triathlon.
As I read this book on Taoism, I realized that I was... well, Taoist. This didn't happen from reading ancient texts or from living in a remote Chinese village with Taoist sages. (Heck, I was a young hippy-artist living in the northeast US.) My Tao transformation occurred through the movements of T'ai Chi. Taoism is now intrinsic to the way I think, perceive and respond - to the way I live. So is Triathlon.
As triathletes, we're on a path to enjoy and master three basic activities from childhood. Each of these involves a repetitive movement pattern, coordinating opposite arm and leg movements through pelvic core stabilization. (Yes, even cycling.) Equally important, each of these childhood activities requires a unique and complex orientation with gravity. This is profound, given that up to ninety percent of your neurological energy is invested in balance - orienting your body to gravity. (Contemplate balance and orientation deeply while you train.) Like T'ai Chi, each of these basic childhood activities affects the way we perceive and interact with ourselves and the world. Put ‘em together, and you've got a powerful kinetic trinity. Tao of Triathlon.
Just like juggling, triathlon is a feat of timing, dexterity and balance, dynamically orchestrating three elements. Training effectively towards ambitious performance goals requires vigilance and honesty in the ongoing assessment of one's strengths and weaknesses. It demands a continuous response that is equally evidence-based science and creative intuition. Humility, self-honesty, curiosity and knowledge are essential.
Human nature provides us with nesting instincts; we gravitate towards our strengths, stay within the comfort zone, and avoid the dark forests of uncertainty. Well, there's no "nesting" in multisport. We're all familiar with that humbling "day-of-reckoning" feeling on race morning, as we toe the line with pale, tender feet. I wonder, is that what makes us so friendly and cooperative in the transition area before the big showdown?
Tao says embrace vulnerability and imbalance as opportunities for improvement well ahead of race day. Triathletes who are weak cyclists often elect to participate in group rides with experienced road cyclists. Criticism, embarrassment and humility be damned, the rewards of experience gained outweigh the rookie's discomfort. Drop the fear; embrace uncertainty as the ultimate opportunity.
In the real world, versatility ultimately triumphs over specialization: change is inevitable.
Beyond the relentless quest for swimming, biking and running mastery, experienced triathletes know there is a fourth element in triathlon: the art of transition. More than a quick gear and clothing change; it's an instant transition from sleek efficient swimmer, to strong efficient cyclist, to swift efficient runner. In under a minute, it's possible to transform from one movement pattern, from one orientation with gravity, from one integration with equipment to another one entirely.
Athletic excellence in a single sport trains mastery of a single identity. The swift transitions of multisport challenge the athlete to fully engage, and then completely detach from, each identity. Ego is the collection of identities one assumes in the roles of everyday life. A well-balanced individual chooses his/her identities functionally - as tools in a constructive, brilliant life. Dysfunctionality is a strong attachment to a specific identity, an unwillingness to let go of one role when it no longer serves in the moment.
A ludicrous example of such an attachment: Tommy Triathlete rolls into T2, fastest bike split of the day, and transitions to run. However, Tommy just can't let go of his prowess as a cyclist and insists on wearing bike shoes and carrying his bike for the entire run. Even with the fastest bike split, that finish line is a long way off lugging a bike. Multisport transition develops a functional relationship with ego through the capacity and the will to engage and detach.
Function and brilliance - Tao of Triathlon: Swim, bike, run. Balance, orient, transition.
This article originally appeared in Hammer Nutrition Endurance News, Issue #69. Copyright 2010 Shane Eversfield

Check out Shane's book, Zendurance. Shane also writes regularly for Total Immersion, a popular system for teaching swimming.

Sunday, August 1, 2010

Infrared -vs- Thermal Junction Temperature Sensing Part 2

Thermal Junction/Thermocouple Sensing

Typical thermocouple-based temperature sensing is based on the voltage generated across some sort of thermal junction. This junction is ideally two dissimilar metals, where the voltage will be proportional to the temperature of the environment, after taking into account the time lag of heat absorption.

Voltage/Temperature Relationship

Typically there is a nonlinear relationship, and the main limitation is accuracy. "System errors of less than one degree Celsius can be difficult to achieve." The relationship between the temperature difference and the output voltage of a thermocouple is derived from a complex summation of coefficients based on the metal type, and results in a distinctly non-linear relationship.

K Type Thermocouples

Type K thermocouples are the most common general-purpose TCs in use. They are made of a chromel-alumel junction with a sensitivity of approximately 41 µV/°C. According to the Omega NIST reference, type K thermocouples have a maximum error of 2.2°C, with 0.5°C being more typical.

Cold Junction Compensation (CJC)

A thermocouple only reports the temperature difference between its measuring junction and a reference ("cold") junction, so the reference temperature must be known; the compensation works a bit like noise cancellation in audio. Classically, an independent junction was maintained at a fixed temperature (an ice bath). More commonly, a thermistor or a diode (like a PN junction whose voltage varies predictably with temperature) is used to measure the cold junction. Frequently as well, temperature sensors and a lookup table can be used to extract the CJC temperature indirectly.
Some examples, and a thorough treatment of circuits for this purpose, are included in the Maxim application note cited below, "Implementing Cold-Junction Compensation in Thermocouple Applications."
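As a minimal sketch of the arithmetic involved (my own illustration; a real design would use the NIST polynomial tables instead of a constant 41 µV/°C linearization):

```python
K_TYPE_SENSITIVITY = 41e-6  # volts per degree C, approximate linearization

def thermocouple_temp_c(measured_volts, cold_junction_c):
    """Hot-junction temperature from a K-type thermocouple reading.
    The thermocouple only reports the hot/cold junction DIFFERENCE,
    so the cold-junction temperature (measured with a thermistor or
    diode) must be added back: that is cold-junction compensation."""
    return cold_junction_c + measured_volts / K_TYPE_SENSITIVITY

# 4.1 mV measured with the terminal block sitting at 25 C -> roughly 125 C.
print(thermocouple_temp_c(4.1e-3, 25.0))
```

Notice that an error in the cold-junction measurement shifts the final answer degree-for-degree, which is why the CJC sensor's accuracy matters as much as the thermocouple's.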


Thermocouples have a very non-linear relationship to temperature, but they are versatile and usable for measurements which will frequently cover a large range. They are more commonly used in industrial applications where such temperature variation is common.
For more accurate measurements, it is common to use a resistance thermometer, commonly referred to as an RTD and typically made of platinum. For applications under 600 degrees C, RTDs are slowly eroding the use of thermocouples due to dramatically improved accuracy and repeatability.

Sunday, June 27, 2010

Future Tech: Quantum Redux

One of my favorite physics topics was always "Particle in a Box." Mostly I like the name. It's just fun to say. This weekend I started explaining the concept to somebody before I got distracted on a tangent, but it got me thinking. Classical physics is nice, neat, and well buttoned up. Yet we are now entering uncharted territory in the subatomic realm. I have literally been in the thick of this acceleration, or inflection point in the words of Ray Kurzweil, who wrote an amazing book about the acceleration of technology development, The Singularity Is Near: When Humans Transcend Biology. He points out that the incredible, and increasing, rate of technological change dictates that it will soon be the dominating force of evolution in our lives.

10 years ago we sequenced the human genome. The biology front is certainly exciting. By some accounts, Craig Venter and his team have created a fully synthetic bacterium. Companies like Affymetrix, Illumina, and others have created rapid sequencing platforms we can use for countless applications. I used Affymetrix Arrays working with Justin Borevitz when he was at the Salk Institute to investigate Alternative Splicing. The company where I work, Cyntellect, has developed a line of Cellular Analysis Instruments which can be used for Stem Cell, and Cancer Applications which allows you to manipulate, analyze, and quantify cells used for therapeutic purposes, drug discovery, or drug production.

On the Neuroscience front, IBM already built Deep Blue, the first machine to beat a reigning world champion at chess. Now they are working on a machine, "Watson," to win at Jeopardy by fundamentally understanding natural language! We already have autonomous cars that can handle urban and outdoor environments courtesy of DARPA, who also funded a project I worked on to build soccer-playing Segway robots at the Neuroscience Institute a few years ago. I also happen to believe that in a short time a combination of memristors (which behave something like silicon synapses) and quantum computers will allow us to create systems which can process both logical and analog problems with incredible capacity.

Harnessing true AI, and cellular machinery, certainly depends on nanotech. The space elevator is now theoretically possible due to the strength of carbon nanotubes, which also promise to make incredibly efficient, compact transistors. 3D holographic chips promise to revolutionize storage capacity beyond today's amazing multi-level DVD and Blu-ray discs. MEMS machines are now being built using semiconductor manufacturing techniques. At Luxtera we worked on optical structures built directly into silicon, and built 40G optical cables incorporating holographic optical inputs, using a Mach-Zehnder interferometer to modulate the light. MEMS accelerometers, gyroscopes, oscillators, and other devices are incredibly cheap, and their microscopic moving structures use very little power and rarely break. How do you think those Wii controllers last so long and are so cheap? How do you think a $100 8-megapixel camera now has motion compensation built in?

So on to the Quantum world, and Modern Physics.

When I was a kid, somewhere around 10-12, my family rented an RV and we drove to Canada and back. During this trip I got my hands on a first edition copy of Stephen Hawking's The Illustrated A Brief History of Time. This book had a lot of content on matters of cosmology, the big bang, and other classical physics questions. It also covered relativity, which was the first great non-intuitive scientific theory, as well as the underpinnings of quantum theory. I believe this book triggered me to become deeply interested in fundamental physics, and later the whole evolution of quantum mechanics. From then on I was often found reading interesting, cutting-edge physics books. I never felt drawn to research, but felt that it was a fascinating subject. Later, when I was 19, I took quantum physics with Vivek Sharma at UCSD. This guy was SO passionate about the subject, he had a good sense of humor, and he was a good teacher. It was his lecture on "Particle in a Box" that I always remember. When I read this New Scientist article today about quantum machinery, I figured it was time to write an article about quantum theory.

Last week I wrote the first part of an article covering the art of temperature sensing, specifically using an infrared temperature sensor, which works off the principle of black body radiation. A black body is an idealized notion in physics: an object which absorbs all incident radiation completely. In the early 1900s this hypothetical notion gave rise to the "ultraviolet catastrophe": if the Rayleigh-Jeans law held true, an object should emit INFINITE energy at ultraviolet wavelengths. Max Planck resolved this by QUANTIZING the emitted energy, and Einstein later took the quantization of light itself seriously to explain the photoelectric effect. This notion of quantization of light, and really of all physical properties, was the birth of quantum mechanics and quantum theory. A weird world where things don't make sense. A real twilight zone under what seemed to be a universe of sense.
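The catastrophe is easy to see numerically (my own sketch, using the standard Rayleigh-Jeans and Planck spectral radiance formulas): the two laws agree at long wavelengths, but the classical one blows up as the wavelength shrinks while the quantized version stays finite.

```python
import math

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def rayleigh_jeans(wavelength, temp):
    """Classical spectral radiance: diverges as wavelength -> 0."""
    return 2.0 * C * KB * temp / wavelength ** 4

def planck(wavelength, temp):
    """Quantized spectral radiance: finite at every wavelength."""
    return (2.0 * H * C ** 2 / wavelength ** 5) / math.expm1(H * C / (wavelength * KB * temp))

T = 5000.0  # roughly a hot filament, in kelvin
# Long wavelength (1 mm): the two laws agree almost perfectly.
print(planck(1e-3, T) / rayleigh_jeans(1e-3, T))   # ~1.0
# Ultraviolet (100 nm): the classical prediction is wildly too large.
print(rayleigh_jeans(1e-7, T) / planck(1e-7, T))
```

That last ratio, many orders of magnitude, is the "infinite energy in the ultraviolet" problem that quantization fixed.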

Ahhh, now particle in a box. This is one of the few quantum problems which can be solved analytically, so it's one of the first ones you learn. It is also the theory which explains why electrons can only circle the nucleus at certain energy levels (resulting in the emission of discrete QUANTA of energy, in the form of light), which underpins our understanding of chemical bonding behavior. My favorite side effect, which I understand via the less idealized notion of a potential well, is that due to particle-wave duality, particles can probabilistically tunnel through barriers thanks to their wave behavior. This "quantum tunneling" property is what allows the plug on every device you plug in to work despite a layer of oxidation caused by the atmosphere.
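The box energies come out in closed form, E_n = n²h²/(8mL²), which is a big part of why it's the first problem you solve. A quick sketch for an electron confined to a 1 nm box (my own numbers):

```python
H = 6.626e-34           # Planck constant, J*s
M_ELECTRON = 9.109e-31  # electron mass, kg
EV = 1.602e-19          # joules per electron-volt

def box_energy_ev(n, length_m, mass=M_ELECTRON):
    """Allowed energy for quantum number n of a particle in a 1D box:
    E_n = n^2 * h^2 / (8 m L^2). Only these discrete levels exist --
    the hallmark of quantization."""
    return n ** 2 * H ** 2 / (8.0 * mass * length_m ** 2) / EV

for n in (1, 2, 3):
    print(f"n={n}: {box_energy_ev(n, 1e-9):.3f} eV")
# Levels scale as n^2, so the gaps between adjacent levels grow with n.
```

Shrink the box and the levels spread apart as 1/L², which is a decent intuition for why confinement at atomic scales forces electrons onto widely separated energy levels.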

This subject is thoroughly argued, and hotly debated. It's still a great area of mystery. I could write more for hours, but I figure that's a pretty good introduction. Some other interesting articles....

Quantum Cryptography


Quantum Teleportation

Large Hadron Collider

Laser Interferometer Gravitational Wave Observatory

Monday, June 14, 2010

Infrared -vs- Thermal Junction Temperature Sensing, Part 1.

Recently I came upon a situation where I needed to take very detailed temperature measurements across a large number of locations to validate an environmental control system. Alas, this was a subject I knew very little about. In my scenario I was attempting to characterize the temperature spread across numerous points, using a series of thermocouples to evaluate two optical temperature sensors in the environmental control system. So I had to do some research.

In my previous experience there had been two scenarios in which you measured temperature. For embedded devices, a digital temperature sensor that came calibrated (like a DS1820 from Maxim/Dallas Semi) was usually easy to use, and could easily be bit-banged with any old microcontroller, meaning you code the bus protocol in software (plain C toggling GPIO pins) instead of needing hardware support like I2C/TWI. In IC design situations, you would use a bandgap voltage reference along with a proportional-to-absolute-temperature (PTAT) circuit. In my experience these were usually used to give "perfect" reference currents for delicate analog circuits, like the optical chips Luxtera makes. Of course that's until process variation kicks in, but that's outside the scope here.
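One practical detail of talking to a DS1820 over that bit-banged 1-Wire bus is verifying the CRC byte the sensor appends to its scratchpad. Here's a sketch (in Python for readability, though on a real micro this would be C) of the standard Dallas/Maxim CRC-8 plus the DS1820's 0.5 °C temperature decoding; the example scratchpad bytes are hypothetical:

```python
def crc8_maxim(data):
    """Dallas/Maxim 1-Wire CRC-8: polynomial x^8 + x^5 + x^4 + 1, LSB-first."""
    crc = 0
    for byte in data:
        for _ in range(8):
            mix = (crc ^ byte) & 0x01
            crc >>= 1
            if mix:
                crc ^= 0x8C
            byte >>= 1
    return crc

def decode_ds1820(lsb, msb):
    """Convert DS1820 temperature register bytes to Celsius (0.5 C steps)."""
    raw = (msb << 8) | lsb
    if raw & 0x8000:          # negative temperatures are two's complement
        raw -= 1 << 16
    return raw / 2.0

# Hypothetical 8 data bytes read back from the scratchpad over the bus:
scratchpad = [0x32, 0x00, 0x4B, 0x46, 0xFF, 0xFF, 0x02, 0x10]
crc = crc8_maxim(scratchpad)
# Appending a correct CRC always makes the running CRC come out zero,
# which is the usual way to check a received frame:
assert crc8_maxim(scratchpad + [crc]) == 0
print(decode_ds1820(scratchpad[0], scratchpad[1]))  # 0x0032 -> 25.0 C
```

The 0x32/0x00 pair decoding to +25.0 °C matches the example encoding in the DS1820 datasheet; everything else here is just a generic frame-check pattern.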

So I will start with the fun stuff. Infrared optical temperature sensing is basically a practical application of the black body radiation theory that was one of the first big steps from classical physics into the quantum era. Nice! A black body absorbs all incident light and re-emits it as a thermal spectrum. To me this is basically turning order into entropy. We know that the spectrum of a flame changes depending on its temperature. In fact, both the shape of the spectrum and its spectral peak correlate well with the temperature of an object.
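The peak-to-temperature relationship is Wien's displacement law, λ_max = b / T. A quick sketch of my own showing both directions of the calculation:

```python
b = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength(T):
    """Wavelength (m) at which a black body at temperature T (K) emits most strongly."""
    return b / T

def temperature_from_peak(wavelength):
    """Invert Wien's law: infer temperature from where the spectrum peaks."""
    return b / wavelength

# The sun's spectrum peaks around 500 nm (green), implying a surface near 5800 K:
print(f"{temperature_from_peak(500e-9):.0f} K")
# A human body at ~310 K peaks deep in the infrared, around 9.3 microns:
print(f"{peak_wavelength(310) * 1e6:.2f} um")
```

That second number is why these sensors work in the infrared: everything near room temperature glows brightly there, invisibly to our eyes.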

You can use the Stefan-Boltzmann equation, add in an emissivity factor, and get pretty good measurements. Emissivity is basically a measure of how close a material comes to an ideal black body (1.0 being ideal), and therefore how suitable it is for this type of measurement.
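A minimal sketch of that calculation (my own illustration, ignoring reflected ambient radiation, which real sensors also compensate for):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiated_flux(T, emissivity=1.0):
    """Power per unit area radiated by a surface at temperature T (K)."""
    return emissivity * SIGMA * T**4

def temperature_from_flux(flux, emissivity=1.0):
    """What an IR thermometer effectively does: T = (flux / (e * sigma))^(1/4)."""
    return (flux / (emissivity * SIGMA)) ** 0.25

# Round trip for a black-plastic-like surface (emissivity ~0.95) at 300 K:
f = radiated_flux(300.0, 0.95)
print(temperature_from_flux(f, 0.95))  # recovers 300 K
# Plugging in the wrong emissivity skews the reading low:
print(temperature_from_flux(f, 1.00))
```

The fourth-power relationship is also why the measurement is fairly forgiving: a modest error in flux only produces a small error in the inferred temperature.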

Typical Emissivity Values (approximate figures from standard published tables):

Aluminum Paint: ~0.45
Asbestos: Fabric: ~0.78
Plastic: acrylic, clear: ~0.94
Plastic: Black: ~0.95
Nickel, polished: ~0.05
Graphite: ~0.98

As you can see, there is quite a range. Nickel is terrible; graphite is almost ideal. In our case, clear and black plastic also happen to be pretty good. There are other factors (good application note), like sample-to-spot ratio (the "dot" from the beam should be small compared to the sample size), and getting the beam close to the sample to reduce stray radiation. With a good material the results are very accurate, assuming there's no dirty air, smoke, stray light, or other fairly obvious interference. The part we're using claims 0.5 °C accuracy, which is identical to the DS1820, and the results are essentially instant (basically the speed of light and a lookup table), while the DS1820 has thermal mass which must equalize with its surroundings.

In part 2 I will examine Thermocouple based temperature sensing in comparison.

Thursday, May 20, 2010

Denialism and FUD!

FUD. Fear, Uncertainty, and Doubt. A term used in tech circles. Intel, for example, was purported to have used FUD tactics to intimidate competitors and the market. They would announce a product aggressively and encourage people to WAIT instead of buying competitors' products. Then, mysteriously, these big products would get delayed or cancelled. Who knows; products get delayed. SCO -vs- IBM exemplified another form, where SCO made a lot of claims with NO concrete details. Mathematically this is like throwing out unproven corner cases as arguments before they are substantiated.

I notice this behavior pattern in people a lot. They are angry, usually for a reason that has nothing to do with you, and they attack something lame. It's usually a good sign that you shouldn't take it personally. A few times my dad has yelled at me for swimming too much. What the hell? Oh, you're mad about something else. I look for this in myself too. When I use a lame excuse to get emotional, I know I should pull back.

Global warming. Fact versus fiction. THERE IS A TON ON BOTH SIDES. Clearly there are a TON of lame excuses used to counter global warming data. BUT we know a lot of truths: CO2 DOES increase temperatures. Temps are going up, though this has happened before, and the urban heat island effect skews some of the readings. And alas, correlation does NOT equal causality. I want to take care of the environment, but I don't think SCARING people into doing it is the right way.

I read an article about denialism in New Scientist. They talk about how people get stuck denying things that are obvious, including global warming, though again I stipulate: if you are saying "temps are going up," I agree 100%. If you say "humans are causing temps to go up," then there is a lot of room to debate.
Whatever they are denying, denial movements have much in common with one another, not least the use of similar tactics (see "How to be a denialist"). All set themselves up as courageous underdogs fighting a corrupt elite engaged in a conspiracy to suppress the truth or foist a malicious lie on ordinary people. This conspiracy is usually claimed to be promoting a sinister agenda: the nanny state, takeover of the world economy, government power over individuals, financial gain, atheism.
I dig that quote. I always laugh when people justify conspiracy theories. New Scientist also comments on the limitations of some types of arguments:
Similarly, global warming, evolution and the link between tobacco and cancer must be taken on trust, usually on the word of scientists, doctors and other technical experts who many non-scientists see as arrogant and alien.
So, a call to action. For years people didn't believe cigarettes were bad, because of lame excuses. We KNOW texting and using cell phones while driving is dangerous, yet I see people still doing it all the time. For years we had people who could only call Bush "stupid," just as we now have people who just call Obama a "Socialist" instead of arguing policy. There's a lot of this kind of thinking out there, and many people buy into it, ESPECIALLY in politics. In Buddhist parlance this results from dependent thinking as opposed to independent thinking. Let's spend our time contemplating the REAL world, not the world people want us to see.

I should confess this post was inspired by my friend Greta in Brazil, and by reading this essay by Ralph Waldo Emerson on Self-Reliance.

Whoa, check this out. This guy is an awesome skeptic who seems to embody my thoughts on this subject well:

He's got cool stuff about intelligent design, and other pseudoscience subjects.

Saturday, May 8, 2010

Roundup Resistant Weeds ATTACK!

I was reading coverage in the NY Times about how Roundup-resistant weeds are proliferating. This is similar in cause and effect to the overuse of antibiotics accelerating the development of resistant bacteria. I believe this is once again an example of shortsightedness on the part of the farming industry, though logical in a competitive market.

Once again I have to go back to Natural Capitalism for some startling statistics and information about the state of our farming industry. On the energy side, at least, things have actually been improving:

American farms have doubled their direct and indirect energy efficiency since 1978. They use more efficiently manufactured fertilizer, diesel engines, bigger and multifunction farm machinery, better drying and irrigation processes and controls, and herbicides instead of plowing to control weeds. 

Even more worrisome is that we are destroying the genetic diversity of our crop species:

Clear-cutting at the microscopic level of DNA may be creating the gravest problem of all. The world's farming rests on an extraordinarily narrow genetic base. Of the 200,000 species of wild plants, notes biogeographer Jared Diamond, "only a few thousands are eaten by humans, and just a few hundred of those have been more or less domesticated." Three-quarters of the world's food comes from only seven crop species, wheat, rice, corn, potatoes, barley, cassava (manioc), and sorghum.

Not only are we reducing genetic diversity, but our "single-crop mindset" is also creating fertile breeding grounds for pests, and a disaster waiting to happen:

The single-crop mentality both ignores nature's tendency to foster diversity and worsens the ancient battle against pests. Monocultures are rare in nature, in part because they create paradises for plant diseases and insects, as science writer Janine Benyus puts it, they are like equipping a burglar with the keys to every house in the neighborhood; they're an all-you-can-eat restaurant for pests. Disease already damages or destroys 13 percent of the world's crops, insects 15 percent, and weeds 12 percent; in all, two-fifths of the world's harvest is lost in the fields, and after some more spoils, nearly half never reaches a human mouth.

Hopefully we can get ahead of the curve on this one, before disaster strikes.
Another example: recent New Scientist coverage of a Nature article about new pests evolving to attack genetically modified crops:
The rise of mirids has driven Chinese farmers back to pesticides - they are currently using about two-thirds as much as they did before Bt cotton was introduced. As mirids develop resistance to the pesticides, Wu expects that farmers will soon spray as much as they ever did.

Tuesday, May 4, 2010

First offshore wind farm

I already read an article dissing the fact that the first offshore wind farm was built in bureaucracy central (the east coast), versus in the gulf off Texas (basically no rules). Still, this announcement of a big offshore wind farm being built is pretty exciting.

As usual there were a bunch of complaints (this time from Native American tribes, not environmentalists), but luckily we were able to navigate the rough waters and get it done. I don't want to assign blame or credit politically. I'm just happy it happened.

Monday, April 12, 2010

Simple Sustainability

Wow. I just came across this site via a link from an article I read on Slashdot. It's sort of a place to post modern ideas on resource and energy efficiency. Just a couple of the articles on it were awesome enough that I wanted to share the site immediately.

A pretty awesome eco-integrated concept building that I would love to see actually get built. Could we get some financing for projects that will reduce our energy footprint and pay for themselves?

And this one about chemical-sniffing phones is an awesome idea. What if everyone was just assigned random chemicals for their phone to sense? I think we could justify adding sensors for industrial waste and other problem pollutants as well.

This story about paint-on, connected solar cells is pretty interesting too. They comment that it's not yet commercially proven, but it's a pretty awesome concept.

All posted today ?!?