Monthly Archives: May 2016

Is Your Solar Panel Installed the Right Way?

Although few people may have noticed, the cost of solar photovoltaic (PV) cells has been dropping for years. As the technology took off, costs plummeted over the first 12 years. Between 2005 and 2009, however, global demand surged, making it difficult for supply to keep up. As manufacturing picked up after 2009, solar PV cell prices resumed their steady downward trend. Today, switching to PV cells makes sense for companies purely on economics.

As solar grows more attractive, we see a clear preference for adding solar capacity over new wind capacity. Navigant Research predicts in a recent report that declining prices will push the global solar PV market beyond $134 billion by 2020, a phenomenal increase of 50% from this year. That corresponds to a capacity addition of nearly 435 gigawatts.

However, getting the maximum benefit from solar PV cells requires mounting them the right way. As the sun traverses the sky during the day, the PV cells must either track it or be mounted at the optimal fixed angle to catch the most sunlight. Tracking the sun automatically requires elaborate sensing and expensive drive mechanisms, so most people prefer fixed installations that are simple to put up and maintain.

Another thing to consider is the latitude of the installation site. If your location is below 25 degrees latitude, tilt the solar panel towards the sun by the latitude itself; at 25 degrees latitude, the panel tilts by 25 degrees. Above 25 degrees, add five degrees of tilt for each additional five degrees of latitude, up to 40 degrees. At and beyond 40 degrees latitude, add a flat 20 degrees of tilt to the latitude. This is the general rule of thumb people follow for solar PV panel installation. Consequently, most installations in the northern hemisphere have the panels facing south to catch the maximum amount of sunlight.
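The rule of thumb above can be sketched as a small function. The breakpoints at 25 and 40 degrees follow the article's rule; the function itself is only an illustration, not a formula from any installer's handbook.

```python
def panel_tilt(latitude_deg: float) -> float:
    """Rule-of-thumb fixed-mount tilt for a solar panel.

    Below 25 degrees latitude, tilt equals the latitude.
    From 25 up to 40 degrees, add 5 degrees of tilt for each
    additional 5 degrees of latitude.  At 40 degrees and
    beyond, add a flat 20 degrees to the latitude.
    """
    lat = abs(latitude_deg)      # same rule in either hemisphere
    if lat <= 25:
        return lat
    if lat < 40:
        return lat + (lat - 25)  # +5 degrees per extra 5 degrees
    return lat + 20

print(panel_tilt(20))   # 20
print(panel_tilt(30))   # 35
print(panel_tilt(45))   # 65
```

So a panel in Austin, Texas (about 30 degrees north) would be tilted roughly 35 degrees under this rule.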

Researchers at the Pecan Street Research Institute have discovered ways to further fine-tune the positioning and tilt of solar panels to extract somewhat more power. During their research on the impact of residential solar power on the power grid, they found that panels facing west rather than the customary south could generate about 2% more power.

For a long time, homeowners, utilities, and architects believed that in the northern hemisphere, south-facing solar panels would receive maximum exposure to the sun. However, when studying home installations in Austin, Texas, Pecan Street researchers found this was not quite true. They noticed south-facing panels generating less energy, while west-facing panels generated more power in the afternoon, when energy demand peaked.

During peak demand, a typical Austin home with solar panels reduces its reliance on the power grid by as much as 54%. For homes with west-facing panels, this figure shot up to 65%, a significant saving. Merely shifting the orientation, therefore, may yield a significant gain in usable solar power.

Conducting Elastic Fibers for Artificial Muscles and Electronic Devices

A study at the University of Texas at Dallas shows how scientists wrapped electrically conducting carbon nanotube sheets around a rubber core to create super-elastic fibers. These fibers conduct electricity and have some special electronic properties as well.

The elasticity of the fibers is phenomenal. They can stretch to more than 14 times their original length, and the process is reversible: the fibers regain their initial length once the stretching force is removed. As the fibers extend to 14 times their length, their electrical conductivity increases 200-fold.

Scientists have conceived several uses for these fibers. Their high elasticity and conductivity when stretched could make them ideal as minute interconnects in electronic circuits. This could help upgrade diverse applications such as flexible charging cables for mobile devices, robots with longer reach, and trouble-free pacemaker leads.

These fibers differ from conventional conductive fibers in a major respect. The conductivity of ordinary fibers falls when they stretch, because stretching reduces the cross-sectional area available to the current. The conductivity of the new fibers, on the other hand, increases when stretched.

Dr. Ray Baughman, director of the NanoTech Institute at Dallas, has authored a paper on the subject. He explains that the enhanced elasticity is due to the buckled structure that develops as the nanotube sheets wrap around the rubber core. Buckles form both along the length and around the circumference of the fibers.

According to Dr. Zunfeng Liu, a research associate at the institute, the two-dimensional buckling maintains the alignment of the rubber core and the nanotubes. This prevents the resistance of the fibers from rising while they stretch.

Liu reveals that until now, no material has been able to function over so large a range of strain. This feature makes it possible to create artificial muscles with rotational properties. Dr. Haines, a research associate at the university and a coauthor of the paper, said this aspect of the artificial muscles could be useful for rotating mirrors in optical circuits.

Researchers have found several other uses for the nanotube-sheathed elastomers. A particularly valuable device formed from the new material is a fiber capacitor. In this device, a thin coating of rubber covers a core of nanotube fibers, and another sheath of nanotube fibers covers the rubber. The inner and outer nanotube layers form the two electrodes, while the rubber layer between them serves as the dielectric.

Nan Jiang, a member of the scientific team working on the project, demonstrated in the NanoTech Institute laboratory how to construct these conductive elastomers in varying sizes, from 150 microns upward. The width of the rubber core determines the size.

The easy availability and low cost of rubber cores make the technology easy to commercialize. Baughman's team at the institute has developed a process for converting carbon nanotubes into large sheets, which could facilitate fabricating diverse applications with the elastic conducting fibers.

What is Hyperscale Cooling?

Most of us are familiar with heat generation in electronic gadgets and the methods for removing it. Heat sinks are commonly recognized for their function of removing unwanted heat by conduction and convection. Most engineers know how to keep the temperature of components on their printed circuit boards, such as ICs, below some maximum allowable value. They are also aware of the heat within the overall enclosure, whether a standard rack of boards, a power supply, or a DVR.

Engineers follow several techniques to remove heat from ICs, boards, or enclosures, often involving one or more heat sinks, heat pipes, heat spreaders, cold plates, and fans. More sophisticated methods use active cooling approaches, such as air conditioning or liquid flowing through embedded pipes. All these techniques are good for handling heat loads up to a few kilowatts.

However, things change when megawatts are involved. Consider, for instance, a hyperscale data center offering a massively scalable compute architecture. Usually made up of small individual servers called nodes, a hyperscale data center provides computing power, storage, and networking, with the nodes clustered together and managed as a single entity. Inexpensive, off-the-shelf servers form the nodes, and more are attached as demand increases. Although no formal standard defines the minimum power dissipation that counts as hyperscale, it is safe to say the range starts at hundreds of kilowatts and extends into megawatts.

According to the first law of thermodynamics, energy cannot be created or destroyed. The heat removed from the object being cooled must therefore be delivered to some other location that can absorb it. When cooling a hyperscale data center, the problem lies in finding somewhere to dump this enormous quantity of heat.

BSRIA, an organization involved in testing, instrumentation, research, and consultancy for construction and building services, recently conducted a market study. It offers valuable insight into the cooling options and trends available to hyperscale data centers.

The BSRIA report summary plots the feasibility and popularity of various techniques against data-center temperatures in a four-quadrant graph. Among the options shown are reducing dissipation with modular DC power supplies and variable-speed drives, cooling techniques ranging from adiabatic evaporation to liquid cooling, and allowing a rise in server-inlet temperature. The graph also plots growth potential against the investment each approach requires; the most popular is adiabatic/evaporative cooling.

The adiabatic/evaporative cooling process uses a natural phenomenon to regulate temperature. A large fan draws warm air through pads moistened with water, which evaporates. Evaporating water removes huge quantities of heat, chilling the air, which is then pushed out into the room. Temperature control is a simple matter of adjusting the cooler's airflow.
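A back-of-envelope calculation shows why evaporation removes so much heat. The sketch below assumes a latent heat of vaporization of roughly 2.4 MJ/kg for water at typical ambient temperatures; the figure and the 1 MW load are illustrative, not from the BSRIA study.

```python
# Latent heat of vaporization of water at ambient temperatures,
# roughly 2.4 MJ/kg (an assumed round figure for illustration).
LATENT_HEAT_J_PER_KG = 2.4e6

def water_flow_kg_per_h(heat_load_w: float) -> float:
    """Water evaporation rate needed to absorb heat_load_w watts."""
    kg_per_s = heat_load_w / LATENT_HEAT_J_PER_KG
    return kg_per_s * 3600

# A hypothetical 1 MW hyperscale heat load:
print(round(water_flow_kg_per_h(1e6)))  # 1500 kg of water per hour
```

In other words, evaporating about a tonne and a half of water per hour can carry away a full megawatt, which is why the technique scales to hyperscale loads.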

For data centers and other facilities, the adiabatic/evaporative process has saved the industry millions of liters of water, since older cooling towers polluted the water they used. Adiabatic cooling units also cut electricity consumption by more than 40%.

Raspberry Pi Zero for a Real-Time Sensor Dashboard

Using the Raspberry Pi, or RBPi, single-board computer (SBC) and a few applications from Google, you can build a functional dashboard showing real-time readings from sensors. Google offers its App Engine as a Platform as a Service (PaaS): you can deploy and run your own applications on Google's infrastructure without bothering about setting up hardware, servers, or operating systems.

Google also offers the free and powerful Google Charts, which you can use as a simple charting tool for plotting the sensor data into line charts. An HTML5 template generator such as Initializr is also useful for generating templates for the dashboard. Initializr bundles several useful frontend resources such as Bootstrap and jQuery.

RBPi Zero is the perfect hardware platform to use for this project. This SBC is a full-fledged computer, but smaller than a credit card. It features a single-core CPU running at 1 GHz and 512 MB RAM. Along with a 40-pin GPIO header, the RBPi Zero has USB and a mini HDMI port.

When you connect a few sensors to the GPIO pins, the RBPi Zero sends their data to the Google App Engine. On the dashboard, you can watch the values and charts update in real time as new data arrives from the sensors. GitHub carries the instructions for building and deploying both the RBPi Zero app and the App Engine dashboard.

For this project, Java is the programming language, as both the RBPi Zero and the Google App Engine support it; the RBPi side uses the Pi4J library to access the GPIO. Those who prefer Python can easily change the code, as both the RBPi and the Google App Engine support Python as well. Since the latest version of Raspbian, the RBPi's operating system, comes pre-installed with Oracle Java 8, it is easy to deploy and run an executable JAR on the RBPi Zero.

The JAR acts as the go-between for the sensors and the Google App Engine: it reads inputs from the sensors and passes them on. You can use Apache Maven to compile and build the code on the RBPi Zero. Of course, you may also build the code on your laptop or desktop and copy the resulting JAR over to the RBPi Zero.

You can use Cloud Endpoints on the Google App Engine side. This powerful service creates a backend API from annotations and includes client libraries for web and mobile. It generates a Java-based Android client for use with the RBPi Zero application. Google OAuth 2.0 authenticates the API for installed applications.

The RBPi Zero based hardware provides readings from three sensors: voltage generated by a solar cell, temperature from an analog temperature sensor, and illuminance in lux from a photocell. A 10-bit analog-to-digital converter with an SPI interface is necessary to convert the analog signals to a digital format suitable for the RBPi Zero. All the sensors work from a 3.3 V supply, which the RBPi Zero is capable of sourcing.
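The scaling from raw ADC counts to physical units can be sketched in a few lines. The 10-bit range and 3.3 V reference come from the article; the temperature scaling assumes a TMP36-style sensor (0.5 V offset, 10 mV per degree C), which is a hypothetical choice since the article does not name the part.

```python
VREF = 3.3       # supply/reference voltage, per the article
ADC_MAX = 1023   # full scale of a 10-bit converter

def raw_to_volts(raw: int) -> float:
    """Convert a 10-bit ADC reading (0-1023) to volts."""
    return raw * VREF / ADC_MAX

def volts_to_celsius(v: float) -> float:
    """TMP36-style scaling (assumed sensor): 0.5 V offset,
    10 mV per degree Celsius."""
    return (v - 0.5) * 100.0

# A mid-scale reading of 512 corresponds to about 1.65 V:
print(round(raw_to_volts(512), 2))        # 1.65
print(volts_to_celsius(0.75))             # 25.0 degrees C
```

The same pattern applies to the solar-cell voltage and the photocell; only the final scaling function differs per sensor.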

Play Chess with the Raspberry Pi

You could be an ardent chess player searching for a worthy opponent. A human opponent may not always be conveniently available, but a computerized player can be relied upon at any time of your choosing. With the Raspberry Pi single-board computer, or RBPi, you can now play a full game of chess, provided you are willing to build a chessboard first.

You will need an Arduino to control the chessboard and an RBPi to run the actual chess engine, Stockfish, along with Chessboard, a chess-rules library. The entire arrangement is automated: plug in the parts, press the green button, and start playing. Without an automated arm, you move the pieces manually and the computer signals its move by flashing LEDs. You get 21 levels of play along with the ability to set the computer's personality, cowardly or aggressive.

Apart from the personality setting and the 21 levels, Stockfish offers several features. Choose to play black or white, against the computer or another human. Along with providing hints when you are stuck, it recognizes and makes special moves such as castling, en passant, and pawn promotion. It validates every move against the rules of chess, signaling errors and allowing retractions. The engine plays at grandmaster strength, with an Elo rating of about 2900.

Although the RBPi alone could control the board and run the chess engine, using an Arduino relieves the RBPi of many tasks, speeding up the engine running on it. Since the Arduino runs no operating system, Stockfish cannot run on it, and although chess programs exist for the Arduino, none is as strong as Stockfish. Moreover, if you use a computerized arm, the Arduino can operate its motors. The combination of RBPi and Arduino works efficiently for the chessboard.

You can make the board out of wood or plastic, according to the materials readily available. A chessboard has 64 squares in alternating black and white. To sense the pieces, you need a reed switch under each square, wired as a matrix of eight rows and eight columns with a single reed switch straddling each junction. Numbering the rows 1-8 and the columns A-H, a command such as E2E4 tells the computer to move the piece from square E2 to square E4.
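Decoding such a command into matrix coordinates is straightforward. The sketch below (in Python for brevity, though the build itself uses Arduino/RBPi code) follows the A-H column and 1-8 row numbering described above, mapping each to 0-7 indices.

```python
def parse_move(cmd: str):
    """Split a four-character command such as 'E2E4' into
    (column, row) index pairs for the 8x8 switch matrix.
    Columns A-H map to 0-7; rows 1-8 map to 0-7."""
    cmd = cmd.upper()

    def square(sq: str):
        col = ord(sq[0]) - ord('A')
        row = int(sq[1]) - 1
        return col, row

    return square(cmd[:2]), square(cmd[2:])

print(parse_move("E2E4"))  # ((4, 1), (4, 3))
```

The controller scans the reed-switch matrix for a change at the source junction, then at the destination, and sends the resulting string to the chess engine.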

To let the computer signal its moves with LEDs, you need a second matrix similar to that of the reed switches, with LEDs at the junction points instead. Using sockets for both the reed switches and the LEDs is advisable, as it eases maintenance. Unlike reed switches, LEDs are polarized and must be oriented correctly to function; sockets make it easy to re-orient any inserted the wrong way. The Arduino controls both matrices with data from the RBPi.

Emulating Brain Functions with a Memristor Computer

Chip design at the atomic level may require emulating the functioning of the human brain while upholding Moore's Law; in fact, this might be the only way forward. Forward-looking semiconductor organizations such as Intel, HP, and Samsung know this, and it has sparked intense interest in the study of memristors.

Among all known electronic components, memristors alone are capable of emulating the brain. It is common knowledge that a human brain performs far better than the fastest supercomputer while consuming only about 20 watts of power. Supercomputers can only simulate the brain, not emulate it, yet they consume thousands of watts and cost millions of dollars.

Knowm Inc. of Santa Fe, N.M., has expanded its portfolio with three different types of memristors. It offers a new high-density die-only option, along with all the raw data manufacturers need to perform their own characterization. Although another organization, HP/Hynix, is also trying to build commercial memristors, Knowm has beaten it to market by diversifying its offering. Knowm now offers three models with slow, medium, or fast hysteresis, based on the metal they are made with: tungsten, tin, or chromium.

Regardless of the metal ion used in its manufacture, all memristors work the same way. The device consists of two electrodes, with a layer of metal located close to one of them. As a voltage is applied across the electrodes, metal ions move through the device towards the electrode at the lower potential. The device has a layer of amorphous chalcogenide material, which is its active layer; metal ions moving through it form conductive pathways between the electrodes. As the pathways span the active layer, the device resistance drops. When the applied potential is reversed, the conductive channels dissolve and the resistance increases.
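This voltage-dependent resistance can be captured with the linear ion-drift memristor model, a textbook simplification rather than the physics of Knowm's actual chalcogenide devices; all the constants below are illustrative. The internal state x in [0, 1] tracks how far the conductive pathways have formed.

```python
# Linear ion-drift sketch of a memristor (illustrative constants).
R_ON, R_OFF = 100.0, 16000.0  # pathway fully formed / fully dissolved
K = 1e4                       # ion-mobility constant (assumed)
DT = 1e-4                     # time step, seconds

def simulate(voltage: float, steps: int, x: float = 0.1) -> float:
    """Return the final resistance after applying a constant voltage."""
    for _ in range(steps):
        r = R_ON * x + R_OFF * (1.0 - x)  # resistance between electrodes
        i = voltage / r                   # current through the device
        x += K * i * DT                   # ions drift with the current
        x = min(max(x, 0.0), 1.0)         # state stays physical
    return R_ON * x + R_OFF * (1.0 - x)

r_start = R_ON * 0.1 + R_OFF * 0.9
r_forward = simulate(+1.0, 2000)  # positive bias: pathways grow
r_reverse = simulate(-1.0, 2000)  # reversed bias: pathways dissolve
print(r_forward < r_start < r_reverse)  # True
```

A positive bias lowers the resistance and a negative bias raises it, reproducing the bipolar switching described above.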

This characteristic makes the memristor a bipolar device. It also emulates the way neurons in the brain learn: neurons send pulses through their synapses to strengthen them, equivalent to lowering the resistance, or withhold pulses, causing the synapses to atrophy, equivalent to raising it.

IBM, together with DARPA, the Defense Advanced Research Projects Agency, is developing programs to simulate the process on digital computers. Their software will span the spectrum of accuracy in modeling how synapses work. So far, however, memristors are the only true emulators of the brain and its functions. Some researchers are even erecting scaffolding to connect memristor-based emulators to large-scale models, creating the need for components such as those Knowm now offers.

Knowm’s offering is a treasure trove for researchers: raw data from over 1,000 experiments. This helps tremendously, as there is no well-defined specification for characterizing memristors properly; it allows researchers to generate their own characterization data based on the properties of their choice among the slow, medium, and fast hysteresis types. Knowm offers 16 memristors in a single ceramic dual-in-line package, while the die-only option holds 180 memristors.