“A Whale Of A Failure”: The Romney Campaign Sure Had Some Bad Smartphone Apps

The Romney campaign was so very ambitious with its smartphone apps, and so very bad at them. The latest and most epic failure was the so-called Orca app the campaign had built to count people who had voted, which crashed repeatedly throughout Election Day and, even when it was working, didn’t really do its job. Politico and Ace of Spades both have in-depth reports on how poorly the thing performed, but regardless of the technical failings, the baffling thing is that they didn’t test it. Politico’s Maggie Haberman and Alexander Burns reported: “Among other issues, the system was never beta-tested or checked for functionality without going live before Election Day, two sources said. It went live that morning but was never checked for bugs or efficiencies internally.” With a record of electronic gaffes as bad as the Romney campaign’s, that just seems insane.

The campaign really should have known better than not to test. It had already made at least two disastrous attempts at making a killer app that would get all the smartphone types chattering. First, there was the campaign’s official app, which became a literal gaffe machine when it prominently misspelled America on its welcome screen, promising users “a better Amercia.” Then, of course, there was the vice-presidential choice app, which promised users they would “be the first to find out” when Romney finally tapped his running mate. The app got scooped by about seven hours. Romney can, of course, do whatever he wants now that he’s not campaigning. But we’d humbly suggest he choose a field other than app development.

 

By: Adam Martin, Daily Intel, November 9, 2012

November 11, 2012 | Election 2012

“Improving The Quality Of Life”: It’s Time To Get Serious About Science

Some policymakers, including certain senators and members of Congress, cannot resist ridiculing any research project with an unusual title. Their press releases are perhaps already waiting in the drawer, with blanks for the name of the latest scientist being attacked. The hottest topics for ridicule involve sex, exotic animals and bugs.

The champion of mocking science was the late William Proxmire, whose Golden Fleece Awards enlivened dull Senate floor proceedings from 1975 until 1988. His monthly awards became a staple of news coverage. He generated good laughs back home by talking about a “wacko” in a lab coat experimenting with something seemingly stupid. Proxmire did not invent the mad-scientist stereotype, but he did much to popularize it.

The United States may now risk falling behind in scientific discoveries as other countries increase their science funding. We need to get serious about science. In fact, maybe it’s time for researchers to fight back, to answer every punch line with a comeback.

Toward that end, we are announcing this week the winners of the first Golden Goose Awards, which recognize the often-surprising benefits of science to society. Charles H. Townes, for example, is hailed as a primary architect of laser technology. Early in his career, though, he was reportedly warned not to waste resources on an obscure technique for amplifying radiation waves into an intense, continuous stream. In 1964, he shared the Nobel Prize in Physics with Nikolay Basov and Alexander Prokhorov.

Similarly, research on jellyfish nervous systems by Osamu Shimomura, Martin Chalfie and Roger Y. Tsien unexpectedly led to advances in cancer diagnosis and treatment, increased understanding of brain diseases such as Alzheimer’s, and improved detection of poisons in drinking water. In 2008, the trio received the Nobel Prize in Chemistry for this initially silly-seeming research. Four other Golden Goose Award winners — the late Jon Weber as well as Eugene White, Rodney White and Della Roy — developed special ceramics based on coral’s microstructure that are now used in bone grafts and prosthetic eyes.

Across society, we don’t have to look far for examples of basic research that paid off. Larry Page and Sergey Brin, then a National Science Foundation fellow, did not intend to invent the Google search engine. Originally, they were intrigued by a mathematical challenge, so they developed an algorithm to rank Web pages. Today, Google is one of the world’s most highly valued brands, employing more than 30,000 people.
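The ranking idea Page and Brin pursued, now known as PageRank, is simple to sketch: a page matters if pages that matter link to it, and you can compute that by repeatedly passing each page’s score along its outgoing links. The short Python sketch below illustrates the idea on a made-up four-page link graph; the graph, the damping factor and the function name are illustrative assumptions, not Google’s actual code.

# A toy sketch of the PageRank idea: a page's score is the chance that a
# "random surfer" who follows links (and occasionally jumps anywhere) lands on it.
# Illustrative assumptions only, not Google's implementation.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}                # start uniform
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                                   # dangling page: spread its score evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))

On this toy graph the most heavily linked-to page comes out on top; the point, as the op-ed notes, is that the commercial payoff grew out of an interesting mathematical question rather than a business plan.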

It is human nature to chuckle at a study titled “Acoustic Trauma in the Guinea Pig,” yet this research led to a treatment for hearing loss in infants. Similar examples abound. Transformative technologies such as the Internet, fiber optics, the Global Positioning System, magnetic resonance imaging (MRI), computer touch-screens and lithium-ion batteries were all products of federally funded research.

Yes, “the sex life of the screwworm” sounds funny. But a $250,000 study of this pest, which is lethal to livestock, has, over time, saved the U.S. cattle industry more than $20 billion. Remember: The United States itself is the product of serendipity: Columbus’s voyage was government-funded. Remember, too, that basic science, the seed corn of innovation, is primarily supported by the federal government — not industry, which is typically more interested in applied research and development.

While some policymakers continue to mock these kinds of efforts, researchers have remained focused on improving our quality of life. Scientific know-how, the engine of American prosperity, is especially critical amid intense budgetary pressures. Federal investments in R&D have fueled half of the nation’s economic growth since World War II. This is why a bipartisan team of U.S. lawmakers joined a coalition of science, business and education leaders to launch the Golden Goose Awards.

Federal support for basic science is at risk: We are already investing a smaller share of our economy in science than seven other countries, including Japan, Taiwan and South Korea. Since 1999, the United States has increased R&D funding, as a percentage of the economy, by 10 percent. Over the same period, the share of R&D in the economies of Finland, Germany and Israel has grown about twice as fast. In Taiwan, it has grown five times as fast; in South Korea, six times as fast; in China, 10 times as fast. In the United States, meanwhile, additional cuts to non-defense R&D spending have been proposed. If budget-control negotiations fail, drastic across-the-board cuts will take effect in January that could decimate entire scientific fields.

Columbus thought he knew where he was going, but he didn’t know what he had found until many years later. He was searching for the Orient, but he discovered something even better: the New World.

Let’s honor our modern-day explorers. We need more of them. They deserve the last laugh.

 

By: Jim Cooper and Alan I. Leshner, The Washington Post, September 9, 2012

September 10, 2012 | Science

What Steve Jobs’s Legacy Says About Innovation

In the wake of Apple Computer cofounder Steve Jobs’s death, it’s become almost a truism that he provided consumers what they needed before they even knew they needed it.

I think it’s true not only in the case of the revolutionary products that Jobs marshaled into existence, but of many, many consumer goods that seemed exotic or pointless at first, and then became ubiquitous.

It’s the nature of innovation, the “novus.” The New Thing.

There’s an important moral dimension to it, too, I think—this idea of “needing” consumer goods. Pro-innovation people—the vast majority of us—love new things. We love things that make our lives simpler, easier, more enriching, or just more fun.

Take the vacuum cleaner.

I remember well a lefty history professor in college, lecturing in a disdainful deterministic tone about the vacuum cleaner. Did it make housewives’ lives easier—or did it impel them to remove household dust that had previously been a nonissue?

On the one hand, Christine Rosen’s 2006 essay in The New Atlantis, “Are We Worthy of Our Kitchens?”, was a definitive takedown of such thinking. There have been real gains in human welfare due to industrial-era electronic technology:

Despite its humble status … the electric washing machine represents one of the more dramatic triumphs of technological ingenuity over physical labor. Before its invention in the twentieth century, women spent a full day or more every week performing the backbreaking task of laundering clothes. Hauling water (and the fuel to heat it), scrubbing, rinsing, wringing—one nineteenth-century American woman called laundry “the Herculean task which women all dread.” No one who had the choice would relinquish her washing machine and do laundry the old-fashioned way.

Then again, even with all of our fancy time-saving gadgets, has family/domestic life really improved? She continues:

Judging by how Americans spend their money—on shelter magazines and kitchen gadgets and home furnishings—domesticity appears in robust health. Judging by the way Americans actually live, however, domesticity is in precipitous decline. Families sit together for meals much less often than they once did, and many homes exist in a state of near-chaos as working parents try to balance child-rearing, chores, long commutes, and work responsibilities. As Cheryl Mendelson, author of a recent book on housekeeping, observes, “Comfort and engagement at home have diminished to the point that even simple cleanliness and decent meals—let alone any deeper satisfactions—are no longer taken for granted in many middle-class homes.” Better domestic technologies have surely not produced a new age of domestic bliss.

True, no?

And who can deny the moral, or at least McLuhan-esque, dimension of “gadget love”?

There’s no simple answer to these questions—and I ponder them anew every time I interact with an Apple product. (Like right now, as I type.)

I’m far from a Mac nerd, but I am, in my own way, a heavy user. My iPod battery has been broken for months, and I haven’t gotten around to replacing it. Lately, the idea of driving without ready access to my entire music library—something that would have been unthinkable for most of my lifetime—is a continual annoyance.

And when I first bought that iPod, I found myself mired in a sort of technological obsessive-compulsive disorder:

With 1,000-plus CDs that I’d ideally like to upload—because you can’t let all those free gigabytes starve, not with so many of the world’s poor children starving for gigabytes—the process of ripping, in short order, became an object of dread and crippling self-doubt. Unripped CDs now taunt me in their unripped-ness. I can almost hear them, in their half-broken jewel cases and water-stained leaflets, in their state of 20th-century plastic inertness, laugh at me.

I’ve also found the aesthetic, near-cultic magnetism of Apple products a little creepy, too:

When I read stories about iPod users rhapsodizing about how their iPods are profound reflections of their personalities; how their iPod shuffle mechanism has the seemingly mystical ability to randomly spit out the right song for the right moment; how life screeches to a halt when their iPod suffers a technical glitch [um, yes — S.G.]—when I read these stories I think of Mr. McLuhan’s chapter on “gadget lovers.”

Riffing on the Greek myth of Narcissus, Mr. McLuhan wrote that technology gadgets were like narcotic extensions of the self; we worship them as idols and thus become a self-enclosed system.

Sound familiar?

“Servomechanism” was the term of art that Mr. McLuhan employed: a device that controls something from a distance.

He said of gadget love: “We must, to use them at all, serve these objects, these extensions of ourselves, as gods or minor religions. An Indian is the servomechanism of his canoe, as the cowboy of his horse or the executive of his clock.”

When you think of mere gadgets in such terms, it’s no wonder there’s been such an outpouring of grief over the loss of Steve Jobs.

But who among us is willing to pull a modern-day Thoreau and wall ourselves off from innovation?

It’s part of the human condition, I suppose.

 

By: Scott Galupo, U.S. News and World Report, October 6, 2011

October 7, 2011 | Capitalism, Corporations, Economy