Stories tagged IBM

Feb
21
2010

Solar cells made from common materials

Solar cells for everyone. Courtesy Dominic

Solar cells produce less than 1/1000 of the Earth's electricity. This is mainly because they are expensive and made from rare, hard-to-obtain materials.
An IBM research team, managed by David Mitzi, is working on photovoltaic cells made from common materials.

The new solar cells are also cheaper to manufacture, using a “printing” technique: a hydrazine solution containing copper and tin, with nanoparticles of zinc dispersed within it, is spin-coated and then heat treated in the presence of selenium or sulfur vapor. PhysOrg

9.6% Efficiency

This new material, called kesterite, was 6.8% efficient in 2009. IBM has increased the efficiency to 9.6% and plans to push it above 11 percent, which would be comparable to or better than traditional solar cells.
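
As a rough illustration of what those percentages mean: a cell's efficiency is just electrical power out divided by solar power in. Here is a quick Python sketch, assuming the standard test irradiance of 1,000 watts per square meter and a hypothetical one-square-meter cell (the wattage numbers are only chosen to reproduce the percentages above):

    # Rough illustration of what a cell-efficiency percentage means.
    # Assumes standard test conditions: 1,000 watts of sunlight per square meter.

    IRRADIANCE_W_PER_M2 = 1000.0

    def cell_efficiency(power_out_watts, cell_area_m2):
        """Efficiency = electrical power out / solar power in."""
        power_in_watts = IRRADIANCE_W_PER_M2 * cell_area_m2
        return power_out_watts / power_in_watts

    # A hypothetical 1 square-meter cell producing 96 watts would be 9.6% efficient;
    # 68 watts corresponds to the 6.8% figure from 2009.
    print(cell_efficiency(96.0, 1.0))   # 0.096  (9.6%)
    print(cell_efficiency(68.0, 1.0))   # 0.068  (6.8%)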

Abstract of the published paper: “High-Efficiency Solar Cell with Earth-Abundant Liquid-Processed Absorber,” Advanced Materials

Feb
06
2010

Graphene. Courtesy Carbophiliac

Graphene is great

Graphene is a single-atom-thick layer of carbon atoms in a honeycomb-like arrangement (read more about graphene here on ScienceBuzz.org).

Graphene transistors are the fastest

Transistors are like valves that can turn the flow of electricity off and on. Computers use transistors and logic circuits to solve all kinds of problems, and those problems can be solved faster if the transistors can turn on and off faster. Transistors made out of graphene can now switch on and off 100 billion times per second (100 gigahertz). State-of-the-art silicon transistors of the same gate length switch at about 40 gigahertz.
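
To put those frequencies in perspective, here is a quick back-of-the-envelope Python sketch (nothing more) of how long a single on/off cycle lasts at each speed:

    # Back-of-the-envelope: how long is one on/off cycle at these frequencies?
    # Period = 1 / frequency, converted to picoseconds.

    def switching_period_ps(frequency_ghz):
        frequency_hz = frequency_ghz * 1e9
        return 1.0 / frequency_hz * 1e12

    print(switching_period_ps(100))  # graphene transistor: 10 picoseconds per cycle
    print(switching_period_ps(40))   # comparable silicon transistor: 25 picoseconds per cycle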

IBM develops next-generation transistors

IBM just announced this breakthrough in the journal Science.

Uniform and high-quality graphene wafers were synthesized by thermal decomposition of a silicon carbide (SiC) substrate. The graphene transistor itself utilized a metal top-gate architecture and a novel gate insulator stack involving a polymer and a high dielectric constant oxide. The gate length was modest, 240 nanometers, leaving plenty of space for further optimization of its performance by scaling down the gate length. ScienceDaily

Nov
23
2009

Reverse engineering the brain. Courtesy Thomas Schultz

Engineering computers that can think

Even simple brains, like those of a mouse, are amazing. A brain the size of a thimble that requires almost no energy can navigate mazes, survive severe weather, and escape from a cat. Will we ever create a computer capable of such adaptable and creative "thinking"? One approach is to reverse engineer the brain of a mouse, rat, or cat.

Computer simulation achieves cat brain complexity

Dharmendra S. Modha leads an IBM team that is attempting to understand and build such a brain as cheaply as possible. The team's latest achievement is a brain simulation with 1 billion spiking neurons and 10 trillion individual learning synapses.

Synapses are the key

Synapses are junctions between neurons and a key to how a brain learns. The strength of the chemical reactions within the synapses changes as the animal interacts with its environment. These synaptic junctions are thought to encode our individual experience.
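
To get a feel for the two ideas involved, here is a toy Python sketch of a "spiking" neuron and a synapse whose strength grows with use. It is only an illustration of the concept, not IBM's cortical simulator, and every name and number in it is made up:

    # Toy sketch: a neuron that "spikes" when its input builds past a threshold,
    # and a synapse whose weight strengthens with correlated activity.

    class Synapse:
        def __init__(self, weight=0.2, learning_rate=0.05):
            self.weight = weight
            self.learning_rate = learning_rate

        def transmit(self, spike):
            return self.weight if spike else 0.0

        def strengthen(self):
            # Crude stand-in for synaptic plasticity: use it and it gets stronger.
            self.weight += self.learning_rate

    class SpikingNeuron:
        def __init__(self, threshold=1.0, leak=0.9):
            self.potential = 0.0
            self.threshold = threshold
            self.leak = leak

        def step(self, input_current):
            # Charge leaks away a little each step, then new input is added.
            self.potential = self.potential * self.leak + input_current
            if self.potential >= self.threshold:
                self.potential = 0.0   # reset after firing
                return True            # spike!
            return False

    synapse = Synapse()
    neuron = SpikingNeuron()
    for t in range(20):
        incoming_spike = (t % 2 == 0)           # a simple repeating input pattern
        fired = neuron.step(synapse.transmit(incoming_spike))
        if fired and incoming_spike:
            synapse.strengthen()                # "learning" from correlated activity
        print(t, round(synapse.weight, 2), fired)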

The problem with today's computers

Regular computer architecture has a separation between computation and memory, so data must constantly be shuttled between the two. John Backus described the problem in his 1977 Turing Award lecture:

“Surely there must be a less primitive way of making big changes in the store than by pushing vast numbers of words back and forth through the von Neumann bottleneck. Not only is this tube a literal bottleneck for the data traffic of a problem, but, more importantly, it is an intellectual bottleneck that has kept us tied to word-at-a-time thinking instead of encouraging us to think in terms of the larger conceptual units of the task at hand. Thus programming is basically planning and detailing the enormous traffic of words through the von Neumann bottleneck, and much of that traffic concerns not significant data itself, but where to find it.”

DARPA's SyNAPSE program

The goal of a DARPA program known as SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) is to create new electronics hardware and architecture that can understand, adapt, and respond to a changing environment.

What is cognitive computing?

Cognitive computing is the quest to engineer mind-like intelligent machines by reverse-engineering the computational function of the brain.

There is no precise definition or specification of the human mind, but we understand it as a collection of processes of sensation, perception, action, cognition, emotion, and interaction. The mind seems to integrate sight, hearing, touch, taste, and smell effortlessly into a coherent whole, and to act in a context-dependent way in a changing, uncertain environment. It effortlessly creates categories of time, space, and object, and the interrelationships between them.

Learn more about cognitive computing

Fortran punch card: I remember punching out code on hundreds of these cards. Courtesy Arnold Reinhold
On Oct. 15, 1956, John W. Backus published a manual explaining a new way to program computers.

“John Backus and his Fortran project members almost single-handedly invented the ideas of both programming languages and (optimizing) compilers as we know them today." Wired

Instead of writing complex machine code, which took weeks, programmers could write Fortran code in hours, and it was much easier.
I was able to learn Fortran back in the late '60s. It even satisfied my foreign language requirement!

Sep
06
2009


Single molecule, one million times smaller than a grain of sand, pictured for the first time

Read more about this at MailOnline or read the IBM Zurich press release.

We demonstrate imaging of molecules with unprecedented atomic resolution by probing the short-range chemical forces with use of noncontact atomic force microscopy. The key step is functionalizing the microscope’s tip apex with suitable, atomically well-defined terminations, such as CO molecules. Science Magazine

Growing computers with DNA

Scientists from the California Institute of Technology and IBM have, for the first time, coaxed components made from DNA to self-organize in a way that could serve as a template upon which additional components, like wires and switches, could attach.

This technique, which "grows" nano circuits rather than "tooling" them, could result in smaller circuits and save millions of dollars.

Learn more at SiliconValley.com:
IBM scientists take big step toward DNA microchips

Apr
11
2008

Hard drive storage 3,000,000 times cheaper

Fortran punch card. Courtesy Arnold Reinhold
I bought a hard drive yesterday that can store a million MB of data (one terabyte) for $202. That is about 20 cents per gigabyte. My first hard drive purchase cost over $600 for just one megabyte.
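
Here is that comparison worked out in a few lines of Python; the prices and capacities are just the rough figures quoted above:

    # The cost comparison above, worked out.
    dollars_then, megabytes_then = 600.0, 1.0   # first hard drive: roughly $600 for 1 MB
    dollars_now, megabytes_now = 202.0, 1e6     # new drive: $202 for a million MB (1 TB)

    cost_per_mb_then = dollars_then / megabytes_then   # $600 per megabyte
    cost_per_mb_now = dollars_now / megabytes_now      # about $0.0002 per megabyte

    print(cost_per_mb_now * 1000)               # roughly $0.20 per gigabyte
    print(cost_per_mb_then / cost_per_mb_now)   # roughly 3,000,000 times cheaper per megabyte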

First computer "bugs"

The first computer I got to play with used relays. I programmed it by moving wires to create a circuit, called a "flip-flop," that let it play tic-tac-toe. The relays used electromagnets to open and close electrical contacts, and if a bug got in between the contacts the program failed to work and had to be "debugged".
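
For the curious, here is a tiny Python sketch of what a flip-flop does: it sits in one of two stable states and "remembers" the last input it was given. A relay version works on the same principle, just with electromagnets instead of code:

    # A minimal software illustration of a set/reset flip-flop: one remembered bit.

    class SRFlipFlop:
        def __init__(self):
            self.state = False      # starts "off"

        def set(self):
            self.state = True       # flip to "on" and stay there

        def reset(self):
            self.state = False      # flip back to "off" and stay there

        def output(self):
            return self.state       # the remembered bit

    ff = SRFlipFlop()
    ff.set()
    print(ff.output())   # True -- and it stays True until reset() is called
    ff.reset()
    print(ff.output())   # False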

Paper punch programming

When I switched majors in college from engineering into education I needed to take a foreign language. Luckily I was allowed to use my class in Fortran (a computer language) to qualify. In the Fortran course we stored instructions and data on punch cards. The holes in the cards allowed electrical contact between the appropriate circuits within a huge mainframe computer.

Magnetic tape data storage

Before I had enough money to buy that first hard drive, I used magnetic tape. Audio pulses on a regular audio cassette would be converted into connections made within an integrated circuit composed of millions of transistor switching circuits.

IBM calls its new data storage "racetrack"

Soon personal memory devices will hold thousands of movies, run for weeks on one battery, and last for decades.

IBM just announced another breakthrough in data storage that could lead to electronic devices capable of storing far more data in the same amount of space than is possible today, with lightning-fast boot times, far lower cost and unprecedented stability and durability.

To learn more, click on the video below.

Source: IBM Press release

IBM's computer Deep Blue defeated world chess champion Garry Kasparov on May 11, 1997. Read a Wired interview with one of its coders here.

Aug
13
2006

Molecular memory: photo from Wikimedia Commons

IBM memory the size of one molecule

Electronic memory circuits "remember" by switching between two distinct conductive states (on or off). These conductive states need to be stable and allow non-destructive sensing of their bit state. In the August 4 issue of SMALL, IBM researchers Heike Riel and Emanuel Lörtscher reported on such a single-molecule switch and memory element.
With the dimensions of a single molecule on the order of one nanometer (one millionth of a millimeter), molecular electronics redefines the ultimate limit of miniaturization, far beyond that of today's silicon-based technology (about 100 times smaller).

How small can we go?

Whenever memory technology appears to approach physical limitations, a new paradigm allows ever smaller memories. The evolution of memory started with electromechanical relays. Then came vacuum tubes, transistors, ferrite cores, integrated circuits, magnetic tapes and discs, optical discs, and holographic discs. Now we have memory the size of a molecule. Will atom-sized memory be next?

Aug
11
2006

IBM PC: source, Wikimedia Commons

The first IBM PC was $1265

I remember watching with envy as an evolving crop of computers was offered for sale. In 1965, using a kit, I wired together some flip-flop relays into a computing machine that could hold its own playing tic-tac-toe. The Osborne 1 was really cool because it was "portable" (23.5 lbs), but at $1,795 it was out of my reach.

The history of computing hardware starting in the 1960s begins with the development of the integrated circuit (IC), which was later used for the first computer kits and home computers in the 1970s with the MITS Altair, Apple II and Commodore PET and eventually for personal and business computers such as the IBM PC and Apple Macintosh in the 1980s. Wikipedia

The IBM PC was released on August 12, 1981

IBM, with its reputation for business machines, used a computer chip from Intel and an operating system from Microsoft, and changed the world of computing. Small businesses could now purchase mainframe computing power for a little over $1,000.

Read more at Low End Mac and Wikipedia
Here is PC World's exclusive review of the very first IBM PC.