Monday, April 21, 2008

Making the World a Billion Times Better - an article in The Washington Post

By Ray Kurzweil
Sunday, April 13, 2008; Page B04

MIT was so advanced in 1965 (the year I entered as a freshman) that it actually had a computer. Housed in its own building, it cost $11 million (in today's dollars) and was shared by all students and faculty. Four decades later, the computer in your cellphone is a million times smaller, a million times less expensive and a thousand times more powerful. That's a billion-fold increase in the amount of computation you can buy per dollar.
Yet as powerful as information technology is today, we will make another billion-fold increase in capability (for the same cost) over the next 25 years. That's because information technology builds on itself -- we are continually using the latest tools to create the next so they grow in capability at an exponential rate. This doesn't just mean snazzier cellphones. It means that change will rock every aspect of our world. The exponential growth in computing speed will unlock a solution to global warming, unmask the secret to longer life and solve myriad other worldly conundrums. [read entire article]
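Kurzweil's billion-fold figure can be sanity-checked with a few lines of arithmetic. The sketch below (a back-of-the-envelope calculation, not anything from the article itself) converts a billion-fold gain over 25 years into an implied doubling period:

```python
# Back-of-the-envelope check: how many doublings make a billion-fold
# increase, and how often must capability double to fit 25 years?
import math

def doublings_for_factor(factor):
    """Number of doublings needed to reach a given growth factor."""
    return math.log2(factor)

# A billion-fold increase is about 30 doublings (2**30 ~ 1.07e9).
doublings = doublings_for_factor(1e9)            # ~29.9

# Spread over 25 years, that is one doubling roughly every 10 months.
months_per_doubling = 25 * 12 / doublings        # ~10.0

print(round(doublings, 1), round(months_per_doubling, 1))
```

A doubling every ten months or so is in line with the historical price-performance curves Kurzweil cites elsewhere.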

Wednesday, April 16, 2008

A QUOTE: Napoleon

«Sachez écouter, et soyez sûr que le silence produit souvent le même effet que la science.»

("Know how to listen, and be assured that silence often produces the same effect as knowledge.")

Tuesday, April 15, 2008

The Future Is Now

By Joel Achenbach
Sunday, April 13, 2008; B01

http://www.washingtonpost.com/wp-dyn/content/article/2008/04/11/AR2008041103328.html

The most important things happening in the world today won't make tomorrow's front page. They won't get mentioned by presidential candidates or Chris Matthews or Bill O'Reilly or any of the other folks yammering and snorting on cable television.

They'll be happening in laboratories -- out of sight, inscrutable and unhyped until the very moment when they change life as we know it.

Science and technology form a two-headed, unstoppable change agent. Problem is, most of us are mystified and intimidated by such things as biotechnology, or nanotechnology, or the various other -ologies that seem to be threatening to merge into a single unspeakable and incomprehensible thing called biotechnonanogenomicology. We vaguely understand that this stuff is changing our lives, but we feel as though it's all out of our control. We're just hanging on tight, like Kirk and Spock when the Enterprise starts vibrating at Warp 8.

What's unnerving is the velocity at which the future sometimes arrives. Consider the Internet. This powerful but highly disruptive technology crept out of the lab (a Pentagon think tank, actually) and all but devoured modern civilization -- with almost no advance warning. The first use of the word "internet" to refer to a computer network seems to have appeared in this newspaper on Sept. 26, 1988, in the Financial section, on page F30 -- about as deep into the paper as you can go without hitting the bedrock of the classified ads.

The entire reference: "SMS Data Products Group Inc. in McLean won a $1,005,048 contract from the Air Force to supply a defense data network internet protocol router." Perhaps the unmellifluous compound noun "data network internet protocol router" is one reason more of us didn't pay attention. A couple of months later, "Internet" -- still lacking the "the" before its name -- finally elbowed its way to the front page when a virus shut down thousands of computers. The story referred to "a research network called Internet," which "links as many as 50,000 computers, allowing users to send a variety of information to each other." The scientists knew that computer networks could be powerful. But how many knew that this Internet thing would change the way we communicate, publish, sell, shop, conduct research, find old friends, do homework, plan trips and on and on?

Joe Lykken, a theoretical physicist at the Fermilab research center in Illinois, tells a story about something that happened in 1990. A Fermilab visitor, an English fellow by the name of Tim Berners-Lee, had a new trick he wanted to demonstrate to the physicists. He typed some code into a little blank box on the computer screen. Up popped a page of data.

Lykken's reaction: Eh.

He could already see someone else's data on a computer. He could have the colleague e-mail it to him and open it as a document. Why view it on a separate page on some computer network?

But of course, this unimpressive piece of software was the precursor to what is known today as the World Wide Web. "We had no idea that we were seeing not only a revolution, but a trillion-dollar idea," Lykken says.

Now let us pause to reflect upon the fact that Joe Lykken is a very smart guy -- you don't get to be a theoretical physicist unless you have the kind of brain that can practically bend silverware at a distance -- and even he, with that giant cerebral cortex and the billions of neurons flashing and winking, saw the proto-Web and harrumphed. It's not just us mortals: even scientists don't always grasp the significance of innovations. Tomorrow's revolutionary technology may be in plain sight, but everyone's eyes, clouded by conventional thinking, just can't detect it. "Even smart people are really pretty incapable of envisioning a situation that's substantially different from what they're in," says Christine Peterson, vice president of Foresight Nanotech Institute in Menlo Park, Calif.

So where does that leave the rest of us?

In technological Palookaville.

Science is becoming ever more specialized; technology is increasingly a series of black boxes, impenetrable to all but a few. Americans' poor science literacy means that science and technology exist in a walled garden, a geek ghetto. We are a technocracy in which most of us don't really understand what's happening around us. We stagger through a world of technological and medical miracles. We're zombified by progress.

Peterson has one recommendation: Read science fiction, especially "hard science fiction" that sticks rigorously to the scientifically possible. "If you look out into the long-term future and what you see looks like science fiction, it might be wrong," she says. "But if it doesn't look like science fiction, it's definitely wrong."

That's exciting -- and a little scary. We want the blessings of science (say, cheaper energy sources) but not the terrors (monsters spawned by atomic radiation that destroy entire cities with their fiery breath).

Eric Horvitz, one of the sharpest minds at Microsoft, spends a lot of time thinking about the Next Big Thing. Among his other duties, he's president of the Association for the Advancement of Artificial Intelligence. He thinks that, sometime in the decades ahead, artificial systems will be modeled on living things. In the Horvitz view, life is marked by robustness, flexibility, adaptability. That's where computers need to go. Life, he says, shows scientists "what we can do as engineers -- better, potentially."

Our ability to monkey around with life itself is a reminder that ethics, religion and old-fashioned common sense will be needed in abundance in decades to come (see the essay on page B1 by Ronald M. Green). How smart and flexible and rambunctious do we want our computers to be? Let's not mess around with that Matrix business.

Every forward-thinking person almost ritually brings up the mortality issue. What'll happen to society if one day people can stop the aging process? Or if only rich people can stop getting old?

It's interesting that politicians rarely address such matters. The future in general is something of a suspect topic . . . a little goofy. Right now we're all focused on the next primary, the summer conventions, the Olympics and their political implications, the fall election. The political cycle enforces an emphasis on the immediate rather than the important.

And in fact, any prediction of what the world will be like more than, say, a year from now is a matter of hubris. The professional visionaries don't even talk about predictions or forecasts but prefer the word "scenarios." When Sen. John McCain, for example, declares that radical Islam is the transcendent challenge of the 21st century, he's being sincere, but he's also being a bit of a soothsayer. Environmental problems and resource scarcity could easily be the dominant global dilemma. Or a virus with which we've yet to make our acquaintance. Or some other "wild card."

Says Lykken, "Our ability to predict is incredibly poor. What we all thought when I was a kid was that by now we'd all be flying around in anti-gravity cars on Mars."

Futurists didn't completely miss on space travel -- it's just that the things flying around Mars are robotic and take neat pictures and sometimes land and sniff the soil.

Some predictions are bang-on, such as sci-fi writer Arthur C. Clarke's declaration in 1945 that there would someday be communications satellites orbiting the Earth. But Clarke's satellites had to be occupied by repairmen who would maintain the huge computers required for space communications. Even in the late 1960s, when Clarke collaborated with Stanley Kubrick on the screenplay to "2001: A Space Odyssey," he assumed that computers would, over time, get bigger. "The HAL 9000 computer fills half the spaceship," Lykken notes.

Says science-fiction writer Ben Bova, "We have built into us an idea that tomorrow is going to be pretty much like today, which is very wrong."

The future is often viewed as an endless resource of innovation that will make problems go away -- even though, if the past is any judge, innovations create their own set of new problems. Climate change is at least in part a consequence of the invention of the steam engine in the early 1700s and all the industrial advances that followed.

Look again at the Internet. It's a fantastic tool, but it also threatens to disperse information we'd rather keep under wraps, such as our personal medical data, or even the instructions for making a fission bomb.

We need to keep our eyes open. The future is going to be here sooner than we think. It'll surprise us. We'll try to figure out why we missed so many clues. And we'll go back and search the archives, and see that thing we should have noticed on page F30.

achenbachj@washpost.com

Joel Achenbach is a reporter on the national staff of The Washington Post.

Thursday, April 10, 2008

Internet black holes mapped by a virtual Hubble

Inovação Tecnológica - 2008-04-10
Topic: Internet access

The experience is frustrating: you try to visit a website and it does not respond, despite repeated attempts. The server may be down or undergoing maintenance. But the reason can be far more mysterious.

"There is an assumption that if you have a working connection to the Internet, then you have access to the entire Internet. We found that this is not quite the case," explains Ethan Katz-Bassett, one of the creators of the virtual Hubble.

Black holes on the Internet

Scientists at the University of Washington, in the United States, have discovered that there are genuine black holes on the Internet: communication outages that prevent users in one geographic region from reaching servers located in another, even when those servers are working correctly.

The data collected so far show that more than 7 percent of all Internet servers fell into virtual "black holes" at least once over a three-week period.

Virtual Hubble

To locate these black holes, the scientists built a system named Hubble, after the space telescope that searches for celestial bodies in deep space. Instead of scanning galaxies, the virtual Hubble maps the mysteries of the routers and fiber-optic cables that make up the worldwide computer network.

The global network is monitored from 100 computers on the PlanetLab research network, distributed across some 40 countries. The monitoring system sends out "virtual probes," small test programs that check whether a given computer is responding. These programs currently reach 90% of the entire Internet.
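The "virtual probe" idea can be illustrated with a minimal sketch. The function below is an assumption of mine, not the Hubble system's actual probing code (which uses its own measurement tools): it simply checks whether a TCP connection to a host succeeds within a timeout.

```python
# Minimal sketch of a reachability probe in the spirit of the
# "virtual probes" described above. Host, port, and timeout values
# are illustrative; the real system uses dedicated measurement tools.
import socket

def is_reachable(host, port=80, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Run from many vantage points at once, even a probe this simple reveals the asymmetries the article describes: the same host answering from one network and silent from another.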

Partial reachability

Because the probes originate from many different PlanetLab computers, the virtual Hubble can detect computers that are reachable from one point but not from another, a situation known as partial reachability. Brief communication failures are ignored: to be recorded, a problem must be confirmed in two consecutive attempts of 15 minutes each.
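The two rules in that paragraph (compare vantage points, and only flag problems seen in two consecutive rounds) can be sketched as a small classifier. The names and data shapes here are my own illustration, not the project's code:

```python
# Sketch of the partial-reachability rule described above: a host is
# flagged only when the same problem appears in two consecutive
# monitoring rounds. Each round maps vantage-point name -> reachable?

def classify(rounds):
    """rounds: list of dicts {vantage_point: bool}.
    Returns 'reachable', 'unreachable', or 'partial', considering
    only the last two consecutive rounds (transient failures ignored)."""
    if len(rounds) < 2:
        return "reachable"  # not enough evidence to flag a problem

    def status(round_):
        values = set(round_.values())
        if values == {True}:
            return "reachable"
        if values == {False}:
            return "unreachable"
        return "partial"  # mixed results across vantage points

    s1, s2 = status(rounds[-2]), status(rounds[-1])
    return s1 if s1 == s2 else "reachable"  # only persistent problems count
```

For example, a host seen by vantage point A but not by B in two successive rounds is classified as "partial" -- exactly the black-hole signature the map displays.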

The Internet's sky map is updated every 15 minutes and covers the whole planet. It shows every location currently experiencing problems, with computers that are unreachable from other points on the network even though they are operating correctly.

A proactive tool

Each flag on the map corresponds to hundreds or even thousands of computers. Alongside the map, a list of the servers' IP addresses is shown, indicating the country where each is located.

The main goal of the map produced by the virtual Hubble is to provide a new tool for network administrators, who can be proactively alerted to problems in their area or on their routers.

Wednesday, April 9, 2008

Science and Innovation for diplomats

The Minister of Science and Technology, Sergio Rezende, gave a lecture on Monday about Science, Technology and Innovation to students of the diplomats' training course, in the auditorium of the Instituto Rio Branco, in Brasília.

The talk included a brief history of Brazil's trajectory in the S&T sector, from the 1990s, a period marked by crises and a drop in the number of scholarships offered by CNPq/MCT, to the present day. According to Rezende, by the turn of the century the country had already trained 50,000 researchers with doctorates.

Between 1999 and 2002, the S&T system went through a transition. During this period, CNPq created new funding modalities and the Sectoral Funds were established, aimed at guaranteeing the expansion and stability of the flow of resources to the area.

According to figures provided by the minister, over the following four years the sector grew thanks to the expansion of federal funding. Investment in S&T drove the scientific community to grow by at least 20% compared with the 1970s. In 2006, R$ 1.1 billion was invested in the scientific-technological sector and CNPq awarded 65,000 scholarships. In the same year, 9,600 doctorates were granted in Brazil.

The publication of Brazilian scientific articles in indexed international journals also grew by 8.2%, four times the world average (1.9%).

Today, Brazil has more than 80,000 researchers with doctorates. It has the largest and most highly qualified science and technology community in Latin America. The minister also highlighted the country's most successful economic sectors in terms of technological development: oil, agribusiness and aeronautics.
(Monalisa Silva, MCT Communications Office)

Tuesday, April 8, 2008

Breakthrough In Biofuel Production Process

ScienceDaily (2008-04-08) -- Researchers have made a breakthrough in the development of "green gasoline," a liquid identical to standard gasoline yet created from sustainable biomass sources like switchgrass and poplar trees. ... > read full article

The Sunday Times - superfast internet

From The Sunday Times

April 6, 2008

Coming soon: superfast internet

Jonathan Leake, Science Editor

THE internet could soon be made obsolete. The scientists who pioneered it have now built a lightning-fast replacement capable of downloading entire feature films within seconds.

At speeds about 10,000 times faster than a typical broadband connection, “the grid” will be able to send the entire Rolling Stones back catalogue from Britain to Japan in less than two seconds.

The latest spin-off from Cern, the particle physics centre that created the web, the grid could also provide the kind of power needed to transmit holographic images; allow instant online gaming with hundreds of thousands of players; and offer high-definition video telephony for the price of a local call.

David Britton, professor of physics at Glasgow University and a leading figure in the grid project, believes grid technologies could “revolutionise” society. “With this kind of computing power, future generations will have the ability to collaborate and communicate in ways older people like me cannot even imagine,” he said.

The power of the grid will become apparent this summer after what scientists at Cern have termed their “red button” day - the switching-on of the Large Hadron Collider (LHC), the new particle accelerator built to probe the origin of the universe. The grid will be activated at the same time to capture the data it generates.

Cern, based near Geneva, started the grid computing project seven years ago when researchers realised the LHC would generate annual data equivalent to 56m CDs - enough to make a stack 40 miles high.
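The "56m CDs / 40-mile stack" figures check out with standard disc dimensions. The sketch below assumes a 700 MB data CD and a 1.2 mm disc thickness (common values, not stated in the article):

```python
# Sanity check on the LHC data figures: 56 million CDs a year.
CDS = 56_000_000
CD_CAPACITY_MB = 700      # standard 80-minute data CD (assumption)
CD_THICKNESS_MM = 1.2     # standard disc thickness (assumption)

data_pb = CDS * CD_CAPACITY_MB / 1e9               # petabytes per year
stack_miles = CDS * CD_THICKNESS_MM / 1e6 / 1.609  # mm -> km -> miles

print(round(data_pb, 1), round(stack_miles))
```

That works out to roughly 39 petabytes a year, and the stack comes to about 42 miles -- consistent with the article's "40 miles high."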

This meant that scientists at Cern - where Sir Tim Berners-Lee invented the web in 1989 - would no longer be able to use his creation for fear of causing a global collapse.

This is because the internet has evolved by linking together a hotchpotch of cables and routing equipment, much of which was originally designed for telephone calls and therefore lacks the capacity for high-speed data transmission.

By contrast, the grid has been built with dedicated fibre optic cables and modern routing centres, meaning there are no outdated components to slow the deluge of data. The 55,000 servers already installed are expected to rise to 200,000 within the next two years.

Professor Tony Doyle, technical director of the grid project, said: “We need so much processing power, there would even be an issue about getting enough electricity to run the computers if they were all at Cern. The only answer was a new network powerful enough to send the data instantly to research centres in other countries.”

That network, in effect a parallel internet, is now built, using fibre optic cables that run from Cern to 11 centres in the United States, Canada, the Far East, Europe and around the world.

One terminates at the Rutherford Appleton laboratory at Harwell in Oxfordshire.

From each centre, further connections radiate out to a host of other research institutions using existing high-speed academic networks.

It means Britain alone has 8,000 servers on the grid system – so that any student or academic will theoretically be able to hook up to the grid rather than the internet from this autumn.

Ian Bird, project leader for Cern’s high-speed computing project, said grid technology could make the internet so fast that people would stop using desktop computers to store information and entrust it all to the internet.

“It will lead to what’s known as cloud computing, where people keep all their information online and access it from anywhere,” he said.

Computers on the grid can also transmit data at lightning speed. This will allow researchers facing heavy processing tasks to call on the assistance of thousands of other computers around the world. The aim is to eliminate the dreaded “frozen screen” experienced by internet users who ask their machine to handle too much information.

The real goal of the grid is, however, to work with the LHC in tracking down nature’s most elusive particle, the Higgs boson. Predicted in theory but never yet found, the Higgs is supposed to be what gives matter mass.

The LHC has been designed to hunt out this particle - but even at optimum performance it will generate only a few thousand of the particles a year. Analysing the mountain of data will be such a large task that it will keep even the grid’s huge capacity busy for years to come.

Although the grid itself is unlikely to be directly available to domestic internet users, many telecoms providers and businesses are already introducing its pioneering technologies. One of the most potent is so-called dynamic switching, which creates a dedicated channel for internet users trying to download large volumes of data such as films. In theory this would give a standard desktop computer the ability to download a movie in five seconds rather than the current three hours or so.
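The dynamic-switching claim implies a concrete speedup and data rate. The movie size below is my assumption (a standard-definition film of roughly 700 MB); the 5-second and 3-hour figures are from the article:

```python
# Implied numbers behind "a movie in five seconds rather than three hours".
MOVIE_MB = 700            # assumed size of a standard-definition film
OLD_SECONDS = 3 * 3600    # ~3 hours on a 2008 broadband connection
NEW_SECONDS = 5           # with a dedicated dynamic-switching channel

speedup = OLD_SECONDS / NEW_SECONDS          # 2160x faster
new_rate_mbit = MOVIE_MB * 8 / NEW_SECONDS   # ~1120 Mbit/s sustained

print(int(speedup), round(new_rate_mbit))
```

A sustained rate above a gigabit per second to a desktop was far beyond consumer broadband in 2008, which is why the article frames dynamic switching as a grid technology trickling down rather than an incremental upgrade.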

Additionally, the grid is being made available to dozens of other academic researchers including astronomers and molecular biologists.

It has already been used to help design new drugs against malaria, the mosquito-borne disease that kills 1m people worldwide each year. Researchers used the grid to analyse 140m compounds - a task that would have taken a standard internet-linked PC 420 years.

“Projects like the grid will bring huge changes in business and society as well as science,” Doyle said.

“Holographic video conferencing is not that far away. Online gaming could evolve to include many thousands of people, and social networking could become the main way we communicate.

“The history of the internet shows you cannot predict its real impacts but we know they will be huge.”

Computer Memory In Artificial Atoms: Carbon Nanotubes Can Rev Up Speed, Accuracy Of Data Storage

ScienceDaily (2008-04-08) -- Nano-physicists have made a discovery that could change the way data is stored on computers. In the future it will be possible to store data much faster, and with more accuracy. A computer has two equally important elements: computing power and memory. Traditionally, scientists have developed these two elements in parallel. Now computer scientists have made a step towards a new means of data-storage, in which electricity and magnetism are combined in a new transistor concept. ... > read full article