27 November, 2009

1905: Annus Mirabilis - The photoelectric effect

1905 was a great year for physics. In that year a 26 year old patent examiner in Bern, Switzerland published four fundamental papers in four disparate areas of the field. The topics included special relativity, the relationship between energy and matter, Brownian motion and the subject of this post, the photoelectric effect.

The photoelectric effect paper was probably Einstein's most practical next to the Brownian motion paper, in that it answered a long standing problem in electromagnetic theory that had stood as an embarrassment to classical physics. The embarrassment was a legacy of the work of James Clerk Maxwell and his fundamental equations of electromagnetism. By describing the energy of propagating fields as continuous waves, Maxwell did the astonishing: he explained the riddle of the relationship between electricity and magnetism in clear mathematical terms, and he showed that light must itself be an electromagnetic wave, since all such waves propagate at the speed of light "c", roughly 186,000 miles per second.


The use of continuous waves to describe radiation, however, led to serious difficulties when attempting to calculate the energy radiating from theoretical systems known as "black bodies". A black body could easily be approximated by taking an ingot of steel with a hole bored into it. According to classical electromagnetic theory, the radiation from this hole should carry ever more energy at ever higher frequencies, summing to an infinite total, but experiment clearly showed that such bodies had limited, fixed energy release patterns, so something was wrong.

Near the end of the 19th century the brilliant physicist Max Planck theorized that the infinite energy black body problem could be solved IF the energy of light emanating from the material was discrete in some way, or "quantized". However, he was unable to provide a mechanism connecting the smooth, continuous field to the quanta of light, or photons as they came to be called. At least this was the case until Einstein's arrival. Einstein took the job at Bern to work on his ideas in physics (but also because he had trouble finding a teaching position!). His main aim was to answer the question he claimed had plagued him since he was 16, when he asked what it would be like to ride a beam of light: what would he see? He answered that question, also in 1905, in his special relativity paper, which I'll talk about in another post. Along the way he apparently amused himself by solving a couple of other huge problems in physics at the time, one of which was the black body problem.

To be clear, Einstein wasn't directly trying to solve the black body problem; he was trying to explain why a metal surface shined with light would eject electrons or carry an induced current. Somehow the energy of the light was being absorbed by the metal, some of it knocking out electrons and some driving a current flow. Einstein's solution took Planck's idea of quanta, tied it to the constant Planck had discovered (symbolically represented as h in physics), and proposed that light itself is absorbed and emitted in packets governed by the famous equation E = hν, where ν is the frequency of the light. In semiconductor physics circles this equation is more important than F = ma or E = mc^2, because it allowed theoretical results, using a slight modification to Maxwell's wave formulation, to match experimental results in the black body problem.
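The fix can be seen numerically. Here is a minimal sketch (standard physical constants; the temperature and frequencies are illustrative choices, not values from any particular experiment) comparing the classical spectral radiance, which grows without bound, against Planck's quantized form, which stays finite:

```python
import math

# Physical constants (SI units)
H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann's constant, J/K

def rayleigh_jeans(nu, temp):
    """Classical spectral radiance: grows as nu^2, without bound."""
    return 2.0 * nu**2 * K * temp / C**2

def planck(nu, temp):
    """Planck's quantized spectral radiance: suppressed once h*nu >> k*T."""
    return (2.0 * H * nu**3 / C**2) / (math.exp(H * nu / (K * temp)) - 1.0)

T = 5000.0   # an illustrative black body temperature, K
low = 1e12   # low frequency: the two laws agree to within ~1%
high = 3e15  # high frequency: the classical law wildly overshoots Planck's
```

At `low` the ratio of the two is essentially 1; at `high` the quantized form is smaller by many orders of magnitude, which is exactly the suppression that kills the infinite total energy.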

If energy could only be released in packets of size "hν", the infinite energy problem would go away; this holds if the material of the black body, and all material that radiates, is restricted to "hν" units of radiative absorption and emission. This victory could be said to be Einstein's most fruitful in a practical sense, as it spawned more real technology than any of his other work, including Special and General Relativity. A minor caveat, though: the quantization that emerged from Einstein's paper went on to describe a related but different phenomenon, the photovoltaic effect. The difference is subtle. In the photovoltaic effect the liberated electrons remain in the material and are free to flow; the binding energy for ejecting electrons in the photoelectric effect is different from the conduction energy required to induce the photovoltaic effect. The latter phenomenon is what led to the later inventions discussed below.
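Einstein's photoelectric relation can be sketched directly: the maximum kinetic energy of an ejected electron is K_max = hν − φ, where φ is the work function of the metal. A minimal sketch (the sodium work function is an illustrative textbook figure):

```python
H = 6.626e-34    # Planck's constant, J*s
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electron-volt

def photon_energy_ev(wavelength_m):
    """Energy E = h*nu = h*c/lambda of one photon, in electron-volts."""
    return H * C / wavelength_m / EV

def max_kinetic_energy_ev(wavelength_m, work_function_ev):
    """Einstein's photoelectric relation, K_max = h*nu - phi.
    Returns 0 if the photon can't liberate an electron at all."""
    return max(0.0, photon_energy_ev(wavelength_m) - work_function_ev)

SODIUM_PHI = 2.28  # illustrative work function for sodium, eV

# 400 nm violet light carries ~3.1 eV per photon, enough to eject electrons...
k_violet = max_kinetic_energy_ev(400e-9, SODIUM_PHI)
# ...while 700 nm red light (~1.8 eV) ejects nothing, however intense the beam.
k_red = max_kinetic_energy_ev(700e-9, SODIUM_PHI)
```

That last line is the whole surprise of the effect: brightness doesn't matter, only frequency does, because each electron absorbs one quantum at a time.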

So surfaces emit energy in "hν" sized packets, so what?

The significance of Einstein's equation for the electron ejection and current flow problems, and for the black body problem, was that it let a great embarrassment of EM theory be overcome, but beyond that, what has come from the realization? A first practical utilization of this quantum understanding of electrons in materials came when Shockley, Bardeen and Brattain invented the transistor in 1947. The transistor was realized by mating two different materials called semiconductors at a junction: one material was an electron donor, with free electrons ready to give; the other was an electron receiver, with free "holes" (or open valence shells) for accepting electrons. When mated in the double junction fashion used at Bell Labs, it was hoped that a "bias" current applied between two junctions could modulate a much larger current, effectively amplifying the smaller one.

It worked. This success ushered in the age of semiconductor electronics, which accelerated in the 50's with the transistor radio and other devices and took off in the 60's and 70's. However, a side effect of mating electron rich and electron poor materials was what happened as electrons jumped the gap of the so called "depletion region" between them: in accordance with Einstein's relation for quantized light, when current flowed through the junction, photons (in the infrared range at the time) would be produced. This single junction effect had actually been discovered decades before but was not exploited until...


Technically a single junction device is a diode, a device known to electronics and electrical power generation for well over a century (before semiconductors) that only allows current flow in one direction...but its optical properties went unexplored for decades.

The fact that diodes create light (infrared) when run in specific operating regimes was not seen as valuable, so this reverse of the photovoltaic effect hibernated: flowing current through a material in just the right manner could liberate photons of a specific frequency ν. In transistor design this is actually a bad thing, as it is energy that leaves the circuit and does not aid in the amplification of the bias signal, which is the desired result. The emission itself had been observed as early as 1907, shortly after Einstein's paper, but the light produced was infrared; it took almost 60 years to produce visible light LED's and another 15 years to reduce costs enough for them to be included in practical devices. If you've seen a 7 segment display on an old calculator, you were looking at a bunch of early LED's. Today, LED lighting is everywhere and has diversified from the early red LED's to colors across the visible spectrum, including combined options to create white light. Many cities have begun replacing their old bulb based street lights with packages of highly efficient and color pure LED lights; these packages are significantly more power efficient and thus will save cities millions in energy costs, and they are also environmentally friendly, as energy not used means less carbon emitted.
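The color of an LED follows directly from E = hν: a junction whose bandgap is E_g emits photons of wavelength λ = hc/E_g, so a bigger gap means a shorter (bluer) wavelength. A quick sketch (the bandgap values are illustrative textbook figures for two common LED materials):

```python
H = 6.626e-34    # Planck's constant, J*s
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electron-volt

def emission_wavelength_nm(bandgap_ev):
    """Wavelength lambda = h*c / E_g of the photon released when an
    electron drops across a junction's bandgap, in nanometers."""
    return H * C / (bandgap_ev * EV) * 1e9

# GaAs, the material behind early infrared diodes (~1.42 eV gap)
ir_nm = emission_wavelength_nm(1.42)   # lands just past visible red
# GaAsP, used in the first visible red LEDs (~1.9 eV gap)
red_nm = emission_wavelength_nm(1.9)   # lands in the visible red
```

This is why visible and then blue LED's took decades: each new color required engineering a material with a wider, well behaved bandgap.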

The LED based street lights are also noticeably more color pure and brighter from much further distances, potentially allowing for a reduction in accidents. As a replacement for incandescent and fluorescent bulbs, LED lighting promises incredibly energy efficient and color pure light options without the potential environmental hazards associated with older technologies. In the most recent iterations of the technology, rather than use crystalline semiconductors to produce the light, researchers are using organic molecules that release visible light photons when current flows through them; these new LED's, called O (organic) LED's for their use of organic compounds, are poised to revolutionize display technologies from hand held phones to large screen TV's.

OLED's can be even more efficient than regular LED's, can be embedded in flexible or transparent membranes or surfaces, and can produce color ranges not possible with CRT (cathode ray tube), plasma or LCD based technologies.

Image Sensors

So a direct result of Einstein's explanation of the photoelectric effect, run in reverse, is clear and ever present, but what of the original mode of light driving current in a metal? This aspect of the paper was explored in the late 60's by engineers at Bell Labs who were on a quest to find imaging technology for what was believed to be the pending big business of video phones. They investigated a grid of photo diodes used to capture light in wells and induce individual currents, which could then be used to infer the luminous intensity of the impinging light; sufficiently large arrays of these diodes could then serve as a sensor array for optical uses, and the CCD, or charge coupled device, was born. The technology changed hands and advanced as array densities improved and the methods for reading and processing the signals generated in the thousands and then millions of wells were refined; soon the CCD was being used in high end video cameras to capture light using a trichromatic process and then synthesize a full color image. It wasn't until the mid 90's that the technology really took off in production, as the unique nature of the circuitry for CCD's that made them expensive became less of an issue. CCD's were used by Kodak to replace chemical film, and the digital camera revolution was born. Japanese companies picked up on the power of the sensor technology and devoted large amounts of R&D to creating more advanced sensors; Nikon, Canon, Sony and others joined the fray, leading to the growth the digital photography market has seen in the last 15 years.
A parallel development over the last 10 years has been the ability to retool existing MOS (metal oxide semiconductor) based production plants to produce sensors; these C(complementary)MOS sensors have recently matched and exceeded the former advantages of CCD's for most applications, but at much reduced production cost, allowing such sensors to be placed in cell phones, web cams and home surveillance cameras in amazing numbers.
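At each well of a CCD or CMOS sensor the bookkeeping is essentially the photoelectric effect: photons arrive, a fraction of them (the quantum efficiency) liberate electrons, and the accumulated charge is read out as the pixel's brightness. A minimal sketch with made up illustrative figures (pixel size, exposure, efficiency and well capacity are all assumptions for the example, not real device specs):

```python
def electrons_collected(photon_flux, area_m2, exposure_s,
                        quantum_efficiency, full_well=20000):
    """Electrons accumulated in one pixel well during an exposure.
    photon_flux: photons per m^2 per second hitting the sensor.
    quantum_efficiency: fraction of photons that liberate an electron.
    full_well: capacity beyond which the pixel saturates (clips to white)."""
    e = photon_flux * area_m2 * exposure_s * quantum_efficiency
    return min(e, full_well)

# Illustrative figures: a 5 micron pixel, 1/100 s exposure, 50% QE
pixel_area = (5e-6) ** 2
dim = electrons_collected(1e16, pixel_area, 0.01, 0.5)     # modest signal
bright = electrons_collected(1e20, pixel_area, 0.01, 0.5)  # saturates the well
```

The clipping at `full_well` is why overexposed highlights in a digital photo blow out to pure white: the well simply cannot hold any more liberated electrons.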

Solar Cells

An obvious use for the photovoltaic effect emerged as semiconductor materials continued to fall in price, in conjunction with their use as receptor wells in image sensors: collect photons from the sun and use them to generate electricity. Solar cells are less picky about the type of energy input (ideally you want to take in all photon energy), so materials that allow the surface to conduct current under as many frequencies of light as possible will induce the largest current flow.

The original solar cells were very inefficient at liberating this current flow but were good enough to enable solar powered calculators by the late 70's; the models today can easily perform vast computation using the residual light of incandescent bulbs indoors. However, until recently solar cells remained sensitive only to specific wavelengths of incoming light, making them inefficient as power generation devices for larger loads. Recent advances in nano-materials, though, have opened the possibility of gathering light across the optical spectrum, to create solar cells that might convert close to half the incoming photon energy into conduction electrons. This technology promises to revolutionize how we generate power worldwide in the next decade.
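The arithmetic behind why efficiency matters so much is simple: delivered power is just incoming radiant power times the converted fraction. A hedged sketch (the efficiency figures are illustrative; roughly 1000 W/m² is the standard test condition value for full sun):

```python
def panel_output_watts(irradiance_w_m2, area_m2, efficiency):
    """Electrical power from a solar panel: incoming radiant power
    times the fraction converted to conduction electrons."""
    return irradiance_w_m2 * area_m2 * efficiency

FULL_SUN = 1000.0  # W/m^2, standard test condition irradiance

# One square meter of panel: an early-era ~6% cell vs. an advanced ~40% cell
early = panel_output_watts(FULL_SUN, 1.0, 0.06)
advanced = panel_output_watts(FULL_SUN, 1.0, 0.40)
```

Going from a few percent to several tens of percent multiplies the usable output of the same rooftop area several times over, which is the whole economic case for the broadband absorbing materials described above.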

In recognition of his explanation of light quanta and the work that fell out of it (I covered only the practical devices; the idea of quantization had even wider ranging influence on quantum mechanics itself, but I won't go into that here), Einstein won the 1921 Nobel Prize in Physics, amazingly the only one he ever won.

Quite a trip, isn't it? In just over 100 years, Einstein's diversion has flowered into a multi-billion dollar industry for creating light and for capturing it. Think about that the next time you are stopped at a traffic light or are taking a snapshot with your camera phone, and take a moment to say thanks to Einstein.


22 November, 2009

Ever lives... the continuum

I recently answered the question of my life's purpose in an FB comment response, excerpted in its entirety below.

The "purpose" of life is what you make it of course, a relative thing, I don't use "purpose" in some grand metaphysical sense disconnected from people that is often solely the stance of those with a religious bent...no, I feel personally fulfilled that I am the guide of my life, what ever I chose to do it is me that is the conductor of the great symphony that attends the short time I have on this rock. (longer if we succeed in eradicating the disease of aging in the next 20 years as we are on schedule to do!)

I am contented to know that our consciousness is a blip on the continuum, one that through our work over centuries, passed on from one to the next, we are able to extend temporally, opening a wider stage over which to do our life dance. Ultimately, in the infinite expanse of existence we are an infinitesimal blip among many such blips that came infinitely before and will come infinitely after; I can't think of anything more beautiful than that continuum. If I could call anything God it would be the continuum of spaces and times created by the march of energy about nothing (since energy itself is a function of time after all, it can be created and destroyed so long as the dance continues forever!). As an undergraduate engineering student I imagined the universe in this way, a transitory blip of "up" over the "down" soon to come, and now with the discovery of dark energy and its suggestive alignment with the quantum scale zero point energy, it seems my ideas were aligned with a deep truth of reality...that all of the energy that makes up our universe comes from the nothing of the void, and that we ourselves owe existence to a dance of the void over a longer universal time scale...which itself is still a blip on the continuum. The beauty of that symmetry at all scales is what has me convinced of its truth, still full of mystery but without hocus pocus, still full of wonder but without paradox. Complex but astoundingly simple.

To me, that is fucking awesome!



21 November, 2009

Email will never go obsolete.

About as definitive a post title as one can get. What follows is an analysis of why I am convinced that email, the electronic message delivery system that has been in use on the internet for over 30 years, is here to stay...even in the face of more real time oriented communication technologies.

How it works:

In order to fully understand why email is here to stay, we need to take a look at the history and the internals of the communication medium. The purpose of email is stated in the RFC (Request For Comments) document that specifies the technology and the protocols. RFC 821, written in 1982, states:

The objective of Simple Mail Transfer Protocol (SMTP) is to transfer
mail reliably and efficiently.

SMTP is independent of the particular transmission subsystem and
requires only a reliable ordered data stream channel.

Several things are important about this statement. First, note that "simple" is in the name; when it was designed, it was more important that messages were delivered (at some time) than that they arrived on a specific schedule. In many ways this makes it an electronic analog of physical mail systems, which more or less don't guarantee arrival time (unless you pay for that, of course).

The next, and possibly most important, aspect of this description, the one that gives email the widespread use it enjoys today, is "independent of the particular transmission subsystem": different networks can mediate email. In the early 80's there were many transmission subsystems in use, ethernet was not the only method of networking computers, and a communication system that spanned the systems of the time was required; email had to work across all those systems in an agnostic manner. The solution was to tie the mail protocol to the transfer protocols that sat on top of the various types of networks in use, TCP (Transmission Control Protocol)/IP (Internet Protocol) and its suite of ports or "sockets". SMTP was given port 25 for exclusive use into machines for the purpose of mediating mail messages. The dual service of routing received messages to users was called POP (Post Office Protocol) and was originally separate from SMTP; today most mail servers host both the message transfer function (SMTP) and the local mail delivery function (POP). The important point about these protocols is that they are low level, meaning they are part of the actual communication plumbing of the internet itself; as such, any computer on the internet can enable applications to perform the respective functions by opening those ports and connecting to other computers.
Now, having an SMTP server on a network allows that network to receive and forward messages, but only by being aware of other SMTP servers on other networks; this forms a hierarchy of servers that creates the global email system. Networks opt into this global system by registering a particular SMTP server on an internal network with an outside facing DNS (Domain Name System) record. DNS is the global address book of the entire internet, a hierarchical set of yellow pages for all the computers on the net that allows data to move from a source computer in one network to a destination computer in another. The data could be an SMTP message, it could be video or audio, it could be video or audio embedded in an SMTP message; DNS doesn't care, its only purpose is to resolve names so that messages can be routed from source to destination. The DNS system is managed by ICANN (the Internet Corporation for Assigned Names and Numbers), which leases out domain names to governments, educational institutions, corporations and individuals. Because DNS is a public global service it is extremely robust to outages; data is routed around nodes that fail to respond to a forward request. Without DNS, computers would have to know the specific IP address of the destination computer to send data to it; by offloading that lookup to a resident set of DNS servers that keep an index of names to numbers (IP's), this very simple but often repeated function is pushed to the network in a sense. It is this great power that the email system piggybacks on and that ensures email has global reach.

When you open a new yahoo email account, you don't have to worry about being able to send mail to your colleagues in your company mail system (so long as that system is connected to the wider web via DNS and SMTP), and you don't have to worry about sending or receiving mail from your family on their various cell phone or broadband service provider email addresses. The existing network of public DNS servers and their associated SMTP routing hosts ensures that messages arrive at the destination reliably.

How it has grown:

The SMTP based email system has changed since its description in 1982. Back then SMTP only allowed simple text messages, without any graphics or HTML (the language that web pages are written in and that browsers can interpret); SMTP predates the www (world wide web) by 9 years, so it was designed for much simpler messaging requirements. Today, SMTP messages can contain video, audio and HTML, and can carry foreign characters outside the Roman set known in the PC world as ASCII text. The addition of MIME types to describe these more complex data types, and to send them as messages without reducing the robustness of the service, is a testament to how reliable it really is as a message delivery platform.
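Python's standard library shows how naturally MIME layers onto the original text-only envelope. A minimal sketch (the addresses are illustrative) building a message that carries a plain text body plus an HTML alternative, using the stdlib `email` package:

```python
from email.message import EmailMessage

# Build a MIME message: a plain-text body with an HTML alternative,
# exactly the kind of rich payload bolted onto 1982-era SMTP.
msg = EmailMessage()
msg["From"] = "alice@example.com"   # illustrative addresses
msg["To"] = "bob@example.com"
msg["Subject"] = "MIME over SMTP"
msg.set_content("Plain text for the oldest clients.")
msg.add_alternative("<p>HTML for <b>modern</b> clients.</p>", subtype="html")

# The serialized form is plain 7-bit-safe text: what actually
# travels over port 25, readable by a 1982 server in transit.
wire_format = msg.as_string()
```

Note the design: the rich content is encoded *inside* ordinary headers and text, so every intermediate SMTP server can relay it without understanding it, which is why MIME never broke the robustness of the system.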

The new guard:

The emergence of the real time web has provided new options for message delivery. IM, a messaging system for real time communication that is nearly 20 years old, has now moved from stand alone installed clients running on computers to web based clients that can be accessed using a browser on a computer, laptop or smart phone. IM enables real time communication, but because it is real time it is subject to the hiccups that email was designed to route around, since delivery time was never meant to be guaranteed with SMTP. So IM serves a permanently different niche for communication purposes than email. Another new technology is the micro-blogging method used by companies like Twitter; the short form 140 character tweets enable bits of data to go out to whoever wishes to subscribe to a sender's message stream by "following" them. This enables message delivery without requiring registration with an SMTP server (only sign up for the Twitter service) and allows participants to gather large collections of followers to whom they can send messages en masse in a way that is anonymous (you don't necessarily know who your message is going to; with email you must explicitly state your list of recipients).

What is different:

The main difference between the email system and these other forms of messaging stands out quite clearly when one tries to send an IM to someone who isn't on the same IM network, or tries to send a short form message to someone who isn't on Twitter. These later forms of messaging are locked in the silos of the companies that provide the services; there is no equivalent of a global DNS network that ensures routing of messages across IM or "short form" networks. A simulation of this functionality has been arrived at by converting the messages into an open XML based format that allows different networks to capture and convert messages between them, but this doesn't guarantee the richness of the respective systems is conveyed in the message stream (for example, connecting two users from MSN Messenger to AOL Instant Messenger via such a gateway may allow the message and formatting to transit but may fail on transfer of a file through that message exchange). Unlike SMTP, where the protocol of communication is agreed upon by all makers of email servers, IM servers enable additional functionality that may not cross over to other IM networks. This prevents IM from attaining the level of reliability that email enjoys. Also, IM doesn't provide a way to route messages meant for other IM services or networks through an intermediate system or systems as email does; this significantly reduces the scope and robustness of IM services and locks their usefulness in to a particular IM implementation.

Additionally, IM was not made to compete with email on this front but to complement it; as stated earlier, for delivery of messages that are not time critical the more robust email system will always have a niche. The newest messaging idea is the short form messaging of Twitter; like IM it is a specific protocol implementation whose functions could vary from implementation to implementation, though currently it doesn't even face that problem, as Twitter is the only major provider of a network for short form messaging. Again, it addresses a communication use case that is distinct from email.

Email satisfies a niche of communication that doesn't require a guaranteed delivery time but does require delivery. It is integrated into the backbone of the internet itself and predates the www; this integration gives it great robustness to failures in networks. The ability to route messages to other email servers through intermediate servers gives it a supreme advantage over IM and short form messaging by giving it global scope. The additions that enable delivery of rich messages containing HTML and audio or video media allow it, in fact, to serve as a more static version of the real time messaging systems, making it the only option for such functions when it is infeasible or impossible to use the other protocols. As the internet advances, the need for a queued, non real time messaging system like email will not go away; in fact it will become more useful, as a system for storing notifications and events that must be received at some time, and as a global backup to similar functionality provided by independent IM and short form messaging networks.


http://www.faqs.org/rfcs/rfc821.html (the first RFC for SMTP, the heart of email)

http://www.faqs.org/rfcs/rfc1939.html (POP3, the latest Post Office Protocol for retrieving local mail from a mail server)

http://www.lemis.com/email/email-rfc.html (a list of email related RFC's)

01 November, 2009

Taking global warming's heat and using it...

A recent article in the Times Online describes an idea put forward by Art Rosenfeld, a member of the California Energy Commission, that simply painting roof and roadway surfaces in light, reflective colors could reduce carbon emissions. This got me thinking of another possibility that I've entertained in the past: instead of reflecting the light into space, we could try to harness it.

Existing research on building more efficient solar cells is moving forward all over the world, and if we can create significantly efficient panel designs we could solve the heat absorption problem and acquire energy (avoiding the need to burn fossil fuels) all in one solution. It is true that trapping the energy for use could kill two birds with one stone, but it would be orders of magnitude more expensive than just painting or coloring surfaces. The advantage would be the power generation of the panels, which depending on efficiency could pay for themselves in a short time. Say solar cells are used to absorb the radiative energy and turn it into electricity. A first issue that must be addressed is that traditional solar cells are designed for optimal photo conduction at visible frequencies of light, while much of the heating radiation from the sun is in the infrared regime (IR-A and IR-B)...so any created solar panels would have to be custom designed to absorb those frequencies, or there will be no benefit in reducing their effects. Recent developments in nanotube fabrication can help here: by creating a complex surface of nanotubes of various lengths corresponding to the wavelengths of sunlight that we want to absorb, all those frequencies could be absorbed and converted to usable electricity.

However, even after capture the real problem is what Rachel mentioned: storage. It is something that I and many other engineers have tried to devote some mind time to, and there are ideas on the drawing board, but it is a very hard problem; what is needed is a very efficient way to store electricity. Large capacitors work but require massive surface area; magnetic storage works but creates powerful magnetic fields; chemical batteries and fuel cells are all quite inefficient. More research toward finally understanding and creating room temperature superconductors would solve this problem overnight, allowing near limitless storage of current in such materials for on demand utilization.
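The storage numbers explain the pessimism. A capacitor holds E = ½CV², and a quick hedged estimate (the device figures are illustrative, roughly the class of a large commercial ultracapacitor) shows how modest that is:

```python
def capacitor_energy_joules(capacitance_f, voltage_v):
    """Energy stored in a capacitor: E = 0.5 * C * V^2."""
    return 0.5 * capacitance_f * voltage_v ** 2

# An enormous 3000 F ultracapacitor rated at 2.7 V (illustrative figures)
ultracap_j = capacitor_energy_joules(3000.0, 2.7)

# For scale: a single AA alkaline battery holds on the order of 10 kJ
# in a fraction of the volume (rough illustrative figure).
AA_BATTERY_J = 10000.0
```

Even a physically large capacitor stores about as much energy as one small battery; capacitors win on power delivery and cycle life, not on energy density, which is exactly the gap the storage problem asks us to close.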

Other novel solutions can be employed whereby we use our rapidly advancing knowledge of biotechnology to solve the problem. Recent attempts to break down oil using bacteria have found success, and another team was able to create bacteria that do the opposite...create oil as a waste product (see link below). It may be possible to create a species of bacteria or fungus that converts sunlight to alcohol: mate the genes for photosynthesis with the sugar processing genes of an existing fungus (yeast), allowing the yeast to convert light into alcohol. This sounds like science fiction, but many plant/animal hybrid organisms have been created for scientific purposes (tracers and such, or plants with animal genes). This would be a practical way of creating clean fuel (alcohol) without requiring sugar as a food source (diverting from tightened supplies of the plants that produce it), and it would avoid the carbon tax of oil. It would not address the light reflection problem, but could be developed in conjunction with the technologies mentioned above to bridge the gap and reduce carbon emissions significantly from current levels, while extracting energy and fuel in the process. At the accelerating rate at which the Earth is warming we have precious little time to develop efficient solutions; let us hope they can be achieved before it is too late.


http://www.sciencedaily.com/releases/2005/05/050517063708.htm (oil eating bacteria)

http://www.greenoptimistic.com/2008/06/19/nanotube-solar-cells-improve-efficiency-10-times/ (carbon nanotubes for more efficient solar cells)

(oil excreting bacteria)