GIGAOM: the world’s largest cloud

Lessons learned from the world’s largest cloud
GIGAOM, Jul 1, 2014
https://www.youtube.com/watch?v=I9R4P0TLViA
Urs Hölzle — SVP, Technical Infrastructure and Google Fellow, Google

09:50 internally, a lot of our things are running close to the bare metal
10:00 virtualization does have some overhead
14:30 the cloud needs to be much easier to use, much more scalable, much more cost-effective, much more elastic

A Virtual Outbreak Offers Hints Of Ebola’s Future

A Virtual Outbreak Offers Hints Of Ebola’s Future
August 14, 2014
http://www.npr.org/blogs/health/2014/08/14/340346575/a-virtual-outbreak-offers-hints-of-ebolas-future

“I’ve spent a lot of time doing computer models of disease transmission, but rarely does it involve something in Africa. Africa is often overlooked,” says Bryan Lewis, a computational epidemiologist at Virginia Tech.

“Some of those factors are the ones that are hard to measure,” he says. “You’ve got to choose how much of this complexity you care to explicitly represent.”

“At the moment, these models — at least for Sierra Leone and Liberia — we aren’t putting in any mitigating factors.”
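
The article doesn’t publish the groups’ models, but disease-transmission models of this kind are typically compartmental. A minimal SEIR-style sketch follows, in which the transmission rate beta is held constant over time, which is what running “without mitigating factors” amounts to. The population and parameter values are illustrative placeholders, not figures from the Virginia Tech or CDC models.

def seir_step(s, e, i, r, beta, sigma, gamma, dt=1.0):
    """One step of a susceptible-exposed-infectious-removed model.
    Keeping beta constant is the no-mitigating-factors assumption:
    nothing such as isolation or behavior change ever slows spread."""
    n = s + e + i + r
    new_exposed = beta * s * i / n * dt   # new infections
    new_infectious = sigma * e * dt       # incubation period ends
    new_removed = gamma * i * dt          # recovery or death
    return (s - new_exposed,
            e + new_exposed - new_infectious,
            i + new_infectious - new_removed,
            r + new_removed)

# Hypothetical population of one million with ten initial cases.
s, e, i, r = 999_990.0, 0.0, 10.0, 0.0
for day in range(120):
    s, e, i, r = seir_step(s, e, i, r, beta=0.3, sigma=1 / 10, gamma=1 / 8)
print(f"day 120: ~{int(i):,} infectious, ~{int(r):,} removed")

Putting mitigation in would mean replacing the constant beta with one that declines as interventions take hold, which is exactly the hard-to-measure complexity Lewis describes.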

Given that all this modeling is as much an art as a science, different groups working on the problem have been comparing notes. They’ve also been fielding calls from government officials and policymakers.

Martin Meltzer, who heads up the unit at the Centers for Disease Control and Prevention that’s been creating computer models of the outbreak, says that people always ask him the same two questions: “How many people are going to die, and when is this going to end?”

Computer sharing loses momentum

Computer sharing loses momentum
Competition and education needed to keep people engaged.
04 February 2014
http://www.nature.com/news/computer-sharing-loses-momentum-1.14666

… But enthusiasm is waning. The 47 projects hosted on BOINC, the most popular software system for @home efforts, have 245,000 active users among their 2.7 million registrants, down from a peak of about 350,000 active users in 2008.

David Anderson, who leads BOINC (Berkeley Open Infrastructure for Network Computing) and is a computer scientist at the University of California, Berkeley, has several explanations for the slip. He says media coverage has declined now that volunteer computing is more than 15 years old. A shift to mobile-computing devices has probably also hurt — BOINC can run on an Android phone while charging, but uses too much battery power when unplugged.
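
That charging-only behavior is easy to picture as a gating loop around the work cycle. The sketch below is purely hypothetical; none of these helpers (device_is_charging, fetch_work_unit, and so on) are part of the real BOINC client or API, they just stand in for platform and project-server calls.

import time

def device_is_charging() -> bool:
    return False  # placeholder: would query the platform's battery status

def fetch_work_unit():
    return object()  # placeholder: would download a task from the project server

def compute(work_unit):
    return "result"  # placeholder: would run the science application

def upload_result(result):
    pass  # placeholder: would send the finished result back

def volunteer_loop(poll_seconds=60):
    """Donate compute cycles, but only while plugged in, so the science
    workload never drains the volunteer's battery."""
    while True:
        if device_is_charging():
            upload_result(compute(fetch_work_unit()))
        else:
            time.sleep(poll_seconds)  # back off until power returns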

The Chinese Academy of Sciences estimates that US$20 million has been saved since it launched CAS@home in September 2010, by using donated computing power rather than buying it from a company such as Amazon.

… funding bodies might at some point enforce the use of volunteer computing whenever possible, rather than allowing grant money to be used for supercomputer time or cloud-based services.

For volunteer computing to be used in a bigger way, participation rates need to go up. Perhaps the most obvious motivator — money — is deemed a bad idea. “Small amounts of money are too trivial, and may be almost insulting,” says Grey. “It goes against the idea of volunteering.” Only one BOINC project — IBM’s World Community Grid, an umbrella initiative that oversees a batch of biomedical projects aimed at goals such as drug discovery — has partnered with a scheme that allows volunteers to earn virtual cash (which can be exchanged for real money) for their time.
This had a measurable but small overall impact, says Anderson, earning the grid as many as 15,000 new volunteers, bringing the total so far up to almost 650,000.

IBM’s Neurosynaptic Chip Mimics Human Brain

IBM’s Neurosynaptic Chip Mimics Human Brain
William Murray, Electrical Engineer
9/18/2013
http://www.eetimes.com/author.asp?section_id=36&doc_id=1319523

IBM is releasing to early adopters a neurosynaptic computation chip that mimics the neurons and synapses of the brain. This chip is based on a new neuron model developed by IBM researchers.

Using artificial neural networks is not new — what is new is the innovative neuron model developed by the IBM team, and the packing of everything into a high-density, low-power ASIC. Each artificial neuron requires about 1,200 ASIC gates, coupled with a synapse implemented in crossbar RAM. Many of these artificial neurons and synapses fit in a single device, where they perform massively parallel processing.

Up until now, most computer chips have employed a von Neumann-type architecture with an ALU and RAM. These devices execute instructions in series, with multiple CPUs and/or ALU pipelining used to improve performance for a given precision. By comparison, IBM’s device mimics the neurons in the human brain, using an “integrate up the inputs” function and then firing an output pulse onto the output network. With the addition of feedback, and other constants and variables, various sophisticated transfer functions can be realized.
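
The “integrate up the inputs, then fire” behavior described above is essentially an integrate-and-fire neuron, with the crossbar RAM holding the synaptic weights. Here is a minimal software sketch of that dynamic; the sizes, threshold, and leak factor are made-up illustrative values, not parameters of IBM's chip.

import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_neurons = 16, 4
weights = rng.uniform(0.0, 1.0, size=(n_inputs, n_neurons))  # "crossbar" synapses
potential = np.zeros(n_neurons)
THRESHOLD, LEAK = 4.0, 0.9

for t in range(20):
    in_spikes = rng.random(n_inputs) < 0.3              # random input pulses
    potential = LEAK * potential + in_spikes @ weights  # integrate up the inputs
    fired = potential >= THRESHOLD                      # fire an output pulse
    potential[fired] = 0.0                              # reset after firing
    if fired.any():
        print(f"t={t:2d} neurons fired: {np.flatnonzero(fired)}")

Feedback, in this picture, would amount to routing some of the output pulses back into the input vector, which is how the more sophisticated transfer functions the article mentions could be realized.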

————————————————-

original source:
Neurosynaptic chips
Sep. 19, 2013
https://www.research.ibm.com/cognitive-computing/neurosynaptic-chips.shtml