If it's technology, I want to know about it.
Category Archives: Computers
January 27, 2012
The LSE study selected two industries, aerospace and smartphone services, and examined the impact of cloud computing on them across the UK, US, Germany, and Italy between 2010 and 2014. The study was underwritten by Microsoft.
Investing in cloud computing is contributing to growth and job creation both in the fast-growing, high-tech smartphone services industry and in the longstanding, slow-growth aerospace sector, the study claims. In addition, the cloud is directly creating employment through the construction, staffing, and supply of the data centers that host it. Using cloud computing enables businesses of all sizes to be more productive by freeing managerial staff and skilled employees to concentrate on more profitable areas of work.
There will be a new range of employment opportunities opening up as a result of the shift to cloud as well. As the study points out, “as firms shift from proprietary application servers towards virtualization and cloud computing, related skills will be in demand among employers. New direct hires and upskilling for public cloud enablement result in higher-than-average salaries.”
Of the countries analyzed in the study, the US is leading the way in terms of cloud job creation. US cloud-related jobs in the smartphone sector are set to grow to 54,500 in 2014. This is compared to a projected 4,040 equivalent jobs in the UK. The authors of the study say that this can be attributed, in part, to lower electricity costs and less restrictive labor regulation compared to Europe.
Small to medium-size businesses will benefit as well. In the smartphone sector alone, “cloud computing will form the basis for a rapid expansion and high-start-up rate among SMEs 2010-2014 in all four markets in services,” the study says.
The study also shows that there is in fact little risk of unemployment from investing in the cloud, as companies are more likely to move and re-train current staff. This would be alongside the hiring of new staff, likely to be in a higher salary bracket, who have the necessary skills for using virtual data-handling systems.
But researchers found that the level of impact the cloud has on a business or department’s growth and productivity depends on a number of factors, primarily the type of sector in which the business is involved and the regulatory environment in which it operates.
Unsurprisingly, the cloud has a much greater effect on the web-centered smartphone services industry than on traditional high-tech manufacturing, with expansion and a high start-up rate forecast among small and medium-size businesses in 2010-2014. For example, in the UK from 2010 through 2014, growth in cloud-related jobs in the smartphone services sector is set to be 349%, compared with 52% in aerospace. The equivalent German, Italian, and US growth rates will be 280% vs. 33%, 268% vs. 36%, and 168% vs. 57% respectively.
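As a sanity check on figures like these, a growth percentage is just the relative change from the baseline year. A minimal sketch (the job counts below are made-up illustrative numbers, not figures from the study):

```python
def pct_growth(start, end):
    """Percentage growth from a baseline value to a final value."""
    return (end - start) / start * 100

# Illustrative only: a sector going from 1,000 cloud-related jobs in 2010
# to 4,490 in 2014 would show the 349% growth quoted for UK smartphone
# services above.
print(round(pct_growth(1000, 4490)))  # 349
```

The same arithmetic applied to the aerospace figure means a sector at 52% growth adds roughly half its baseline headcount over the period, which is why the study calls it slow-growth by comparison.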
The study’s authors, Jonathan Liebenau, Patrik Karrberg, Alexander Grous and Daniel Castro, also talk about the direct and indirect employment and business opportunities that will stem from cloud, which may not be apparent at first. “Our analysis shows jobs shifting from distributed data processing facilities to consolidated data centers, resulting in a drop in data processing jobs overall as efficiency gains occur especially through public cloud services,” they write. “We see a reduction in IT administrators within large firms in smartphone businesses (and most likely in many other similar sectors) compared to their level of employment otherwise expected by taking into account overall IT spending.”
They add that direct and indirect employment gains will be seen in the construction of new data centers needed to accommodate the public cloud businesses, and an “unanticipated effect is in job creation of site maintenance, janitorial staff and security guards in newly built data centers. Overall, more than 30% of short-term new employment in cloud services originates from the construction of data centers and outfitting them accounts for around another third.” Almost 25% of new jobs accrue from direct employment in public cloud services firms, they add.
Then there’s the “cloud dividend” that enterprises will see as the cloud infrastructure develops. These gains will be “in the form of…continue reading at source.
January 19, 2012
Cloud computing is one of the hottest buzzwords in technology. It appears 48 million times on the Internet. But amidst all the chatter, there is one question about cloud computing that has never been answered: Who said it first?
Some accounts trace the birth of the term to 2006, when large companies such as Google and Amazon began using “cloud computing” to describe the new paradigm in which people are increasingly accessing software, computer power, and files over the Web instead of on their desktops.
But Technology Review tracked the coinage of the term back a decade earlier, to late 1996, and to an office park outside Houston. At the time, Netscape’s Web browser was the technology to be excited about, the Yankees were playing Atlanta in the World Series, and the Taliban was celebrating the sacking of Kabul. Inside the offices of Compaq Computer, a small group of technology executives was plotting the future of the Internet business and calling it “cloud computing.”
Their vision was detailed and prescient. Not only would all business software move to the Web, but what they termed “cloud computing-enabled applications” like consumer file storage would become common. For two men in the room, a Compaq marketing executive named George Favaloro and a young technologist named Sean O’Sullivan, cloud computing would have dramatically different outcomes. For Compaq, it was the start of a $2-billion-a-year business selling servers to Internet providers. For O’Sullivan’s startup venture, it was a step toward disenchantment and insolvency.
Cloud computing still doesn’t appear in the Oxford English Dictionary. But its use is spreading rapidly because it captures a historic shift in the IT industry as more computer memory, processing power, and apps are hosted in remote data centers, or the “cloud.” With billions of dollars of IT spending in play, the term itself has become a disputed prize. In 2008, Dell drew outrage from programmers after attempting to win a trademark on “cloud computing.” Other technology vendors, such as IBM and Oracle, have been accused of “cloud washing,” or misusing the phrase to describe older product lines.
Like “Web 2.0,” cloud computing has become a ubiquitous piece of jargon that many tech executives find annoying, but also hard to avoid. “I hated it, but I finally gave in,” says Carl Bass, president and CEO of Autodesk, whose company unveiled a cloud-computing marketing campaign in September. “I didn’t think the term helped explain anything to people who didn’t already know what it is.”
The U.S. government has also had trouble with the term. After the country’s former IT czar, Vivek Kundra, pushed agencies to move to cheaper cloud services, procurement officials faced the question of what, exactly, counted as cloud computing. The government asked the National Institute of Standards and Technology to come up with a definition. Its final draft, released this month, begins by cautioning that “cloud computing can and does mean different things to different people.”
“The cloud is a metaphor for the Internet. It’s a rebranding of the Internet,” says Reuven Cohen, cofounder of Cloud Camp, a course for programmers. “That is why there is a raging debate. By virtue of being a metaphor, it’s open to different interpretations.” And, he adds, “it’s worth money.”
Part of the debate is who should get credit for inventing the idea. The notion of network-based computing dates to the 1960s, but many believe the first use of “cloud computing” in its modern context occurred on August 9, 2006, when then-Google CEO Eric Schmidt introduced the term at an industry conference. “What’s interesting [now] is that there is an emergent new model,” Schmidt said. “I don’t think people have really understood how big this opportunity really is. It starts with the premise that the data services and architecture should be on servers. We call it cloud computing—they should be in a ‘cloud’ somewhere.”
The term began to see wider use the following year, after companies including Amazon, Microsoft, and IBM started to tout cloud-computing efforts as well. That was also when it first appeared in newspaper articles, such as a New York Times report from November 15, 2007, that carried the headline “I.B.M. to Push ‘Cloud Computing,’ Using Data From Afar.” It described vague plans for “Internet-based supercomputing.”
Sam Johnston, director of cloud and IT services at Equinix, says cloud computing took hold among techies because it described something important. “We now had a common handle for a number of trends that we had been observing, such as the consumerization and commoditization of IT,” he wrote in an e-mail.
Johnston says it’s never been clear who coined the term. As an editor of the Wikipedia entry for cloud computing, Johnston keeps a close eye on any attempts at misappropriation. He was first to raise alarms about Dell’s trademark application and this summer he removed a citation from Wikipedia saying a professor at Emory had coined the phrase in the late 1990s. There have been “many attempts to coopt the term, as well as various claims of invention,” says Johnston.
That may explain why cloud watchers have generally disregarded or never learned of one unusually early usage—a May 1997 trademark application for “cloud computing” from a now-defunct company called NetCentric. The trademark application was for “educational services” such as “classes and seminars” and was never approved. But the use of the phrase was not coincidental. When Technology Review tracked down NetCentric’s founder, O’Sullivan, he agreed to help dig up paper copies of 15-year-old business plans from NetCentric and Compaq. The documents, written in late 1996, not only extensively use the phrase “cloud computing,” but also describe in accurate terms many of the ideas sweeping the Internet today…continue to source.
October 19, 2011
By LEE FERRAN
Oct. 18, 2011
A new computer virus using “nearly identical” parts of the cyber superweapon Stuxnet has been detected on computer systems in Europe and is believed to be a precursor to a new Stuxnet-like attack, a major U.S.-based cyber security company said today.
Stuxnet was a highly sophisticated computer worm that was discovered last year and was thought to have successfully targeted and disrupted systems at a nuclear enrichment plant in Iran. At the time, U.S. officials said the worm’s unprecedented complexity and potential ability to physically sabotage industrial control systems — which run everything from water plants to the power grid in the U.S. and in many countries around the world — marked a new era in cyber warfare.
Though no group claimed responsibility for the Stuxnet worm, several cyber security experts have said it is likely a nation-state created it and that the U.S. and Israel were on a short list of possible culprits.
Whoever it was, the same group may be at it again, researchers said, as the authors of the new virus apparently had access to original Stuxnet code that was never made public.
The new threat, discovered by a Europe-based research lab and dubbed “Duqu,” is not designed to physically affect industrial systems as Stuxnet was; it is apparently used only to gather information on potential targets that could be helpful in a future cyber attack, cyber security giant Symantec said in a report today.
“Duqu shares a great deal of code with Stuxnet; however, the payload is completely different,” Symantec said in a blog post.
Duqu is designed to record key strokes and gather other system information at companies in the industrial control system field and then send that information back to whomever planted the bug, Symantec said.
If successful, the information gleaned from those companies through Duqu could be used in a future attack on any industrial control system in the world where the companies’ products are used — from a power plant in Europe to an oil rig in the Gulf of Mexico.
“Right now it’s in the reconnaissance stage, you could say,” Symantec Senior Director for Security Technology and Response, Gerry Egan, told ABC News. “[But] there’s a clear indication an attack is being planned.”
Duqu is also not designed to spread on its own…continue reading.
October 8, 2011
A computer virus has infected the cockpits of America’s Predator and Reaper drones, logging pilots’ every keystroke as they remotely fly missions over Afghanistan and other warzones.
The virus, first detected nearly two weeks ago by the military’s Host-Based Security System, has not prevented pilots at Creech Air Force Base in Nevada from flying their missions overseas. Nor have there been any confirmed incidents of classified information being lost or sent to an outside source. But the virus has resisted multiple efforts to remove it from Creech’s computers, network security specialists say. And the infection underscores the ongoing security risks in what has become the U.S. military’s most important weapons system.
“We keep wiping it off, and it keeps coming back,” says a source familiar with the network infection, one of three that told Danger Room about the virus. “We think it’s benign. But we just don’t know.”
Military network security specialists aren’t sure whether the virus and its so-called “keylogger” payload were introduced intentionally or by accident; it may be a common piece of malware that just happened to make its way into these sensitive networks. The specialists don’t know exactly how far the virus has spread. But they’re sure that the infection has hit both classified and unclassified machines at Creech. That raises the possibility, at least, that secret data may have been captured by the keylogger, and then transmitted over the public internet to someone outside the military chain of command.
Drones have become America’s tool of choice in both its conventional and shadow wars, allowing U.S. forces to attack targets and spy on its foes without risking American lives. Since President Obama assumed office, a fleet of approximately 30 CIA-directed drones have hit targets in Pakistan more than 230 times; all told, these drones have killed more than 2,000 suspected militants and civilians, according to the Washington Post. More than 150 additional Predator and Reaper drones, under U.S. Air Force control, watch over the fighting in Afghanistan and Iraq. American military drones struck 92 times in Libya between mid-April and late August. And late last month, an American drone killed top terrorist Anwar al-Awlaki — part of an escalating unmanned air assault in the Horn of Africa and southern Arabian peninsula.
But despite their widespread use, the drone systems are known to have security flaws. Many Reapers and Predators don’t encrypt the video they transmit to American troops on the ground. In the summer of 2009, U.S. forces discovered “days and days and hours and hours” of the drone footage on the laptops of Iraqi insurgents. A $26 piece of software allowed the militants to capture the video.
The lion’s share of U.S. drone missions are flown by Air Force pilots stationed at Creech, a tiny outpost in the barren Nevada desert, 20 miles north of a state prison and adjacent to a one-story casino. In a nondescript building, down a largely unmarked hallway, is a series of rooms, each with a rack of servers and a “ground control station,” or GCS. There, a drone pilot and a sensor operator sit in their flight suits in front of a series of screens. In the pilot’s hand is the joystick, guiding the drone as it soars above Afghanistan, Iraq, or some other battlefield.
Some of the GCSs are classified secret, and used for conventional warzone surveillance duty. The GCSs handling more exotic operations are top secret. None of the remote cockpits are supposed to be connected to the public internet. Which means they are supposed to be largely immune to viruses and other network security threats.
But time and time again, the so-called “air gaps” between classified and public networks have been bridged, largely through the use of discs and removable drives. In late 2008, for example, the drives helped introduce the agent.btz worm to hundreds of thousands of Defense Department computers. The Pentagon is still disinfecting machines, three years later.
Use of the drives is now severely restricted throughout the military. But the base at Creech was one of the exceptions, until the virus hit. Predator and Reaper crews use removable hard drives to load map updates and transport mission videos from one computer to another. The virus is believed to have spread through these removable drives. Drone units at other Air Force bases worldwide have now been ordered to stop their use.
In the meantime, technicians at Creech are trying to get the virus off the GCS machines. It has not been easy. At first, they followed removal instructions posted on the website of the Kaspersky security firm. “But the virus kept coming back,” a source familiar with the infection says. Eventually, the technicians had to use a software tool called BCWipe to completely erase the GCS’ internal hard drives. “That meant rebuilding them from scratch” — a time-consuming effort...continue reading at source.
June 26, 2011
A Stretchy Sensing Tool for Surgery
Monday, March 7, 2011
By Katherine Bourzac
A new surgical tool covered in stretchable sensors could reduce the time required to map electrical problems in the heart from over an hour to just a few minutes. The tool could be one of the first commercial applications for an innovative method for making dense arrays of stretchable, biocompatible electronics using high-performance materials including silicon. The tool, which senses temperature and electrical activity, could also lead to better monitoring during other types of surgery, potentially reducing the rate of complications.
Putting such devices on a stretchy surface is not possible using conventional electronics manufacturing. The stretchable silicon electronics used were developed by John Rogers, professor of materials science and engineering at the University of Illinois at Urbana-Champaign and a cofounder of MC10, a startup that is commercializing the technology. Researchers at MC10 are leading the development of the catheters and are also developing the electronics for other applications.
The surgical tool has performed well in animal tests designed to mimic a disorder called atrial fibrillation. This results from electrical problems in the heart tissue around the pulmonary vein, which carries blood back to the heart from the lungs. The condition, in which the upper chambers of the heart quiver instead of beating, is seen in over 2 million Americans, and in 15 percent of all people who have strokes. Atrial fibrillation is difficult to control with drugs, and the drugs that are used, including blood thinners, can have serious side effects. But the problem can be corrected with surgery. First, surgeons map the source of the electrical problem with a probe, and then they knock out the electrical trouble spots by heating and damaging those tissues.
The new multifunctional surgical tools could help speed this surgery, lowering the risk that something will go wrong.
Mapping electrical activity in heart tissue is conventionally done using a tool called a balloon catheter—a soft, inflatable probe fitted with one or two electrodes. The catheter is moved back and forth over the damaged tissue, taking thousands of electrical readings one at a time, and these become the basis for a map of electrical activity. But the process is time-consuming—in the case of some fibrillations it takes over an hour.
June 1, 2011
Found this infographic while Stumbling.