John Romant's Technology Blog

If it's technology, I want to know about it.

Category Archives: Entertainment Technologies

Cloud Computing: Who said the phrase first?

Cloud computing is one of the hottest buzzwords in technology. It appears 48 million times on the Internet. But amidst all the chatter, there is one question about cloud computing that has never been answered: Who said it first?

Some accounts trace the birth of the term to 2006, when large companies such as Google and Amazon began using “cloud computing” to describe the new paradigm in which people are increasingly accessing software, computer power, and files over the Web instead of on their desktops.

But Technology Review tracked the coinage of the term back a decade earlier, to late 1996, and to an office park outside Houston. At the time, Netscape’s Web browser was the technology to be excited about, the Yankees were playing Atlanta in the World Series, and the Taliban was celebrating the sacking of Kabul. Inside the offices of Compaq Computer, a small group of technology executives was plotting the future of the Internet business and calling it “cloud computing.”

Their vision was detailed and prescient. Not only would all business software move to the Web, but what they termed “cloud computing-enabled applications” like consumer file storage would become common. For two men in the room, a Compaq marketing executive named George Favaloro and a young technologist named Sean O’Sullivan, cloud computing would have dramatically different outcomes. For Compaq, it was the start of a $2-billion-a-year business selling servers to Internet providers. For O’Sullivan’s startup venture, it was a step toward disenchantment and insolvency.

Cloud computing still doesn’t appear in the Oxford English Dictionary. But its use is spreading rapidly because it captures a historic shift in the IT industry as more computer memory, processing power, and apps are hosted in remote data centers, or the “cloud.” With billions of dollars of IT spending in play, the term itself has become a disputed prize. In 2008, Dell drew outrage from programmers after attempting to win a trademark on “cloud computing.” Other technology vendors, such as IBM and Oracle, have been accused of “cloud washing,” or misusing the phrase to describe older product lines.

Like “Web 2.0,” cloud computing has become a ubiquitous piece of jargon that many tech executives find annoying, but also hard to avoid. “I hated it, but I finally gave in,” says Carl Bass, president and CEO of Autodesk, whose company unveiled a cloud-computing marketing campaign in September. “I didn’t think the term helped explain anything to people who didn’t already know what it is.”

The U.S. government has also had trouble with the term. After the country’s former IT czar, Vivek Kundra, pushed agencies to move to cheaper cloud services, procurement officials faced the question of what, exactly, counted as cloud computing. The government asked the National Institute of Standards and Technology to come up with a definition. Its final draft, released this month, begins by cautioning that “cloud computing can and does mean different things to different people.”

“The cloud is a metaphor for the Internet. It’s a rebranding of the Internet,” says Reuven Cohen, cofounder of Cloud Camp, a course for programmers. “That is why there is a raging debate. By virtue of being a metaphor, it’s open to different interpretations.” And, he adds, “it’s worth money.”

Part of the debate is who should get credit for inventing the idea. The notion of network-based computing dates to the 1960s, but many believe the first use of “cloud computing” in its modern context occurred on August 9, 2006, when then-Google CEO Eric Schmidt introduced the term at an industry conference. “What’s interesting [now] is that there is an emergent new model,” Schmidt said. “I don’t think people have really understood how big this opportunity really is. It starts with the premise that the data services and architecture should be on servers. We call it cloud computing—they should be in a ‘cloud’ somewhere.”

The term began to see wider use the following year, after companies including Amazon, Microsoft, and IBM started to tout cloud-computing efforts as well. That was also when it first appeared in newspaper articles, such as a New York Times report from November 15, 2007, that carried the headline “I.B.M. to Push ‘Cloud Computing,’ Using Data From Afar.” It described vague plans for “Internet-based supercomputing.”

Sam Johnston, director of cloud and IT services at Equinix, says cloud computing took hold among techies because it described something important. “We now had a common handle for a number of trends that we had been observing, such as the consumerization and commoditization of IT,” he wrote in an e-mail.

Johnston says it’s never been clear who coined the term. As an editor of the Wikipedia entry for cloud computing, Johnston keeps a close eye on any attempts at misappropriation. He was the first to raise alarms about Dell’s trademark application, and this summer he removed a Wikipedia citation claiming that a professor at Emory had coined the phrase in the late 1990s. There have been “many attempts to coopt the term, as well as various claims of invention,” says Johnston.

That may explain why cloud watchers have generally disregarded or never learned of one unusually early usage—a May 1997 trademark application for “cloud computing” from a now-defunct company called NetCentric. The trademark application was for “educational services” such as “classes and seminars” and was never approved. But the use of the phrase was not coincidental. When Technology Review tracked down NetCentric’s founder, O’Sullivan, he agreed to help dig up paper copies of 15-year-old business plans from NetCentric and Compaq. The documents, written in late 1996, not only extensively use the phrase “cloud computing,” but also describe in accurate terms many of the ideas sweeping the Internet today…continue to source.


Sony’s new 3mm lenticular sheet allows 3D viewing without glasses.

Sony Corp will release in Europe a sheet that enables a notebook PC to display 3D images viewable with the naked eye, without special glasses.

The newly developed sheet, which enables 3D images viewable with the naked eye.

The company exhibited it at IFA 2011, the largest consumer electronics trade show in Europe, which opened Sept 2, 2011, in Berlin, Germany. The sheet was developed for the “Vaio VPCSE1Z9E (S series),” a notebook PC equipped with a 15.5-inch color LCD panel that will be launched in Europe in October 2011, and it will go on sale at the same time as the notebook PC.

The sheet attached to a notebook PC. The face recognition function is seen at the upper left corner of the screen.

The sheet is almost the same size as a 15.5-inch LCD panel, and it is about 3mm thick. It is attached to the front of the notebook PC’s LCD panel. The naked-eye 3D display is realized with the lenticular method, which creates parallax by arraying long, thin lenses with a semicircular cross section.

The sheet will come with dedicated application software that uses the notebook PC’s Web camera to determine the position of the user’s face and adjust the 3D images so that they look optimal from that position.

A face can be detected at a distance of 30cm to 1m from the display and at a horizontal angle of 60 to 120° to it; the application also takes the height of the face into account when optimizing the 3D images. The sheet is priced at 129 euros (approx US$183)…visit source article.
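For the curious, here is a minimal Python sketch of the working-range check such face-tracking software might perform, using only the figures quoted above (30cm to 1m from the display, 60 to 120° horizontally). The function name and coordinate convention are my own assumptions, not anything published by Sony.

```python
import math

# Hypothetical sketch: decide whether a detected face is inside the working
# range Sony quotes for the sheet's head-tracking software (30 cm to 1 m from
# the display, 60 to 120 degrees horizontally). The face position is assumed
# to come from the notebook's webcam in metres, relative to the screen centre.

def face_in_tracking_range(x_m: float, z_m: float) -> bool:
    """x_m: horizontal offset from the screen centre; z_m: distance from the screen."""
    distance = math.hypot(x_m, z_m)
    # Angle measured in the horizontal plane; 90 degrees is straight on.
    angle_deg = math.degrees(math.atan2(z_m, x_m))
    return 0.30 <= distance <= 1.00 and 60.0 <= angle_deg <= 120.0

if __name__ == "__main__":
    print(face_in_tracking_range(0.0, 0.5))   # straight on, 50 cm -> True
    print(face_in_tracking_range(0.6, 0.2))   # far off-axis     -> False
```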

AMD Releases Quad Buffer SDK for AMD HD3D technology to Accelerate the Development of Stereo 3D.

August 17, 2011 —

SUNNYVALE, CA — (Marketwire) — 08/18/11 — AMD (NYSE: AMD) today announced the availability of the AMD Quad Buffer SDK for AMD HD3D technology, delivering a vital tool to developers engaged in building immersive stereo 3D capabilities into upcoming game titles. Concurrently, new passive and active monitors from Acer, LG, Samsung, and Viewsonic have further expanded ecosystem support for AMD HD3D technology. End-users whose systems include an AMD A-Series APU or an HD3D-capable AMD Radeon™ HD 5000 or HD 6000 series graphics product now have even more choice, thanks to the Open Stereo 3D initiative, in building their stereo 3D gaming or Blu-ray 3D playback systems.

“AMD HD3D technology has reached critical mass, with more games, more movies, and supporting hardware and software from many of the industry’s leading vendors,” stated Matt Skynner, corporate vice president and general manager, AMD Graphics Division. “The addition of the Quad Buffer SDK can help our many developer partners make stereo 3D a standard part of future game titles.”

AMD Quad Buffer SDK

A big part of enabling stereo 3D support is the ability of AMD graphics hardware to drive four frame buffers simultaneously. AMD Quad Buffer SDK, available on AMD Developer Central, is designed to enable game and application developers to accelerate development time of stereo 3D within their titles. The SDK provides clear guidelines on how to implement stereo 3D to help ensure that it can be enjoyed across the expanding ecosystem of monitors and stereo 3D glasses supporting AMD HD3D technology. Additionally, the quad buffer can be used to add native support for stereo 3D in video games and supports DirectX® 9, 10 and 11.
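To make the idea concrete, here is a small Python/NumPy sketch (not AMD’s SDK, which is DirectX-oriented and whose actual calls are not reproduced here) of the per-eye asymmetric-frustum projection matrices a quad-buffered stereo renderer computes before drawing into its left and right back buffers. Every numeric parameter below is an illustrative assumption.

```python
import numpy as np

# Generic illustration (not AMD's Quad Buffer SDK): the per-eye projection
# setup that a quad-buffered stereo renderer draws into its left and right
# back buffers, using the common "parallel axis, asymmetric frustum" approach.

def stereo_projections(fov_y_deg, aspect, near, far, eye_sep, convergence):
    """Return (left, right) 4x4 projection matrices with asymmetric frusta."""
    top = near * np.tan(np.radians(fov_y_deg) / 2.0)
    bottom = -top
    half_width = top * aspect
    # Horizontal shift of each eye's frustum so both converge at 'convergence'.
    shift = (eye_sep / 2.0) * near / convergence
    mats = []
    for sign in (+1.0, -1.0):          # +1 = left eye, -1 = right eye
        left = -half_width + sign * shift
        right = half_width + sign * shift
        m = np.zeros((4, 4))
        m[0, 0] = 2 * near / (right - left)
        m[1, 1] = 2 * near / (top - bottom)
        m[0, 2] = (right + left) / (right - left)
        m[1, 2] = (top + bottom) / (top - bottom)
        m[2, 2] = -(far + near) / (far - near)
        m[2, 3] = -2 * far * near / (far - near)
        m[3, 2] = -1.0
        mats.append(m)
    return mats[0], mats[1]

# Illustrative values: 60° vertical FOV, 16:9 aspect, 6.5 cm eye separation.
left_proj, right_proj = stereo_projections(60, 16 / 9, 0.1, 100.0, 0.065, 2.0)
```

In a real renderer, each eye’s view matrix would also be translated by half the eye separation along the horizontal axis before its frame is submitted to the corresponding back buffer.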

Monitors & 3D Glasses

Computer monitors supporting AMD HD3D technology are now shipping from several major vendors, including Acer, LG, Samsung, and Viewsonic. The approach to stereo 3D varies from monitor to monitor, but all of them enable an incredibly immersive stereo 3D experience…continue reading.

Sony Unveils HD Recording Digital Binoculars With 2D & 3D Capture

New Sony HD-recording Digital Binoculars models, DEV-3 and DEV-5, have been announced. The new models, the “World’s First Digital Binoculars With HD Video Recording, Zoom, Autofocus and SteadyShot Image Stabilization”, allow users to capture “can’t miss” moments in 1080 AVCHD 2.0 video, 3D, 7.1 MP images, and full stereo sound. According to Sony Electronics:

“Now consumers can watch birds, wildlife, sports action and more in steady, sharply-focused close-up views, while capturing their subjects in crisp Full HD. These new models add entirely new levels of flexibility and convenience to viewing, recording and enjoying your favorite images and scenes.”

The binoculars feature an ergonomic grip, a “stealth” design, a rechargeable battery pack good for about three hours of 2D recording, and a GPS receiver that allows automatic geo-tagging of pictures (DEV-5 model only). Both models autofocus electronically at any magnification (in 2D), and the DEV-3 and DEV-5 have 10x and 20x optical zoom, respectively. The new binoculars will be available for purchase for $1,400 and $2,000 this coming November.

What do you think of Sony’s HD Binoculars? Pretty neat, right? Maybe we will see some higher-quality fan footage from sporting events this Christmas…visit original post.

THE LION KING 3D Conversion Images Show Off Depth of Field.

Original article by Bill Graham, posted August 9th, 2011 at 6:38 pm

[Image: the-lion-king-3d-scar-conversion-image-slice]

Disney’s The Lion King will return to theaters this year, in 3D for the very first time, on September 16th. The film was a childhood favorite of mine, and every time I hear “The Circle of Life,” I get goosebumps. Needless to say, I look forward to seeing the film on the big screen, something I may have done when I was little but can’t recall. However, I do wonder how a film from the ’90s will hold up, animation-wise, and how a 3D conversion of it will fare on the big screen.

Today, Disney sent over some images showing just what the conversion process entails, including adding notes of depth and then using filters to key in on what will be in the foreground, background, and everywhere in between. The actual work is far more involved, but these images give a great idea of what the conversion entails at a basic, easy-to-understand level. Hit the jump to view those images, including a description of what we are looking at, a discussion with stereographer Robert Neuman about the procedure itself, and my impressions of the scenes they showed before Cars 2.

First, let’s get to the good stuff. Disney sent over two scenes that show off the process: the first is of Pride Rock and the second is of Scar. The basic process is to take a finished image, add a layer marking depth details, and then use a layering system keyed to those depth markings to tell the computer where each part of the image should sit in depth. Here are those images [click to enlarge]:

[Images: lion-king-3d-image, lion-king-3d-image-1, lion-king-3d-image-2, lion-king-3d-image-3, lion-king-3d-image-4, lion-king-3d-image-5]

Here are the captions Disney sent over as well, explaining the images in more detail:

1. The original film image.

2. The 3D Depth Map created by Robert Neuman, the 3D Stereographer on the film. Positive numbers refer to the amount of pixels the image will come out of the screen and negative numbers refer to the amount of pixels the image will go deeper into the screen, creating the 3D depth.

3. Grey Scale – The final image in the computer representation of depth. Darker images will be furthest away, and lighter images will be closer to the viewer.
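As a rough illustration of what captions 2 and 3 describe, here is a toy Python sketch (nothing like Disney’s actual pipeline) that turns a grey-scale depth map into signed per-pixel shifts and uses them to synthesize a left/right stereo pair from a flat frame. Every value and name in it is made up for the example.

```python
import numpy as np

# Toy depth-image-based rendering: lighter depth values (closer to the viewer,
# per caption 3) get a positive pixel shift out of the screen plane; darker
# values recede behind it, echoing the positive/negative numbers in caption 2.

def stereo_pair(image, depth, max_shift_px=8):
    """image: HxW array; depth: HxW floats in [0, 1], where 1 = nearest."""
    h, w = image.shape
    # Map depth to signed disparity; 0.5 sits on the screen plane.
    disparity = np.rint((depth - 0.5) * 2 * max_shift_px).astype(int)
    left = np.zeros_like(image)
    right = np.zeros_like(image)
    cols = np.arange(w)
    for y in range(h):
        lx = np.clip(cols + disparity[y], 0, w - 1)   # near objects shift right in the left view
        rx = np.clip(cols - disparity[y], 0, w - 1)   # and left in the right view
        left[y, lx] = image[y]     # occlusion holes are simply ignored here
        right[y, rx] = image[y]
    return left, right

# Tiny synthetic example: a bright square "popping out" of a mid-grey frame.
frame = np.full((64, 64), 0.4)
depth = np.full((64, 64), 0.5)                        # screen plane
frame[20:40, 20:40], depth[20:40, 20:40] = 1.0, 0.9   # near object
left_view, right_view = stereo_pair(frame, depth)
```

Real conversion work also has to fill the occlusion holes this naive shift leaves behind, which is part of why the process is so much more difficult than the images alone suggest.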

…click here to continue reading the original article.

Astronomy’s 3D Revolution

I ran across this article at technologyreview.com

Simple 3D tools could bring astronomy alive for scientists and the public alike. But the techniques are woefully underused, argue two astronomers.

When it comes to scientific visualisations, biochemists are the undisputed champions. These guys embraced 3D techniques to represent complex molecules at the dawn of the computer age. That’s made a huge difference to the way researchers understand and appreciate each other’s work. In fact, it’s fair to say that biochemistry would be a poorer science without efficient 3D visualisation tools.

Now, Frederic Vogt and Alexander Wagner at the Australian National University argue that astronomy could benefit in a similar way from simple 3D tools.

“Stereo pairs are not merely an ostentatious way to present data, but an enhancement in the communication of scientific results in publications because they provide the reader with a realistic view of multi-dimensional data, be it of observational or theoretical nature,” they say…continue reading.
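For readers who want to try this, here is a minimal Python/matplotlib sketch in the spirit of what Vogt and Wagner describe: the same 3D data set rendered from two viewpoints a few degrees apart and placed side by side as a stereo pair. The spiral data and viewing angles are invented purely for the example.

```python
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 (registers the 3d projection)

# Made-up 3D data: a noisy spiral standing in for a multi-dimensional data set.
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 400)
x, y, z = np.cos(t), np.sin(t), t / (4 * np.pi)
x += 0.05 * rng.standard_normal(t.size)
y += 0.05 * rng.standard_normal(t.size)

# Render the same scene from two azimuths a few degrees apart: a stereo pair.
fig = plt.figure(figsize=(8, 4))
for i, azim in enumerate((-63, -57)):
    ax = fig.add_subplot(1, 2, i + 1, projection="3d")
    ax.scatter(x, y, z, s=4)
    ax.view_init(elev=20, azim=azim)
    ax.set_title(f"azim = {azim}°")
plt.tight_layout()
plt.savefig("stereo_pair.png", dpi=150)
```

Swapping the two panels switches the pair between parallel and cross-eyed viewing.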

Technicolor Acquires LaserPacific and Cinedigm Key Assets

I got my start in the entertainment industry at Laser Pacific, during the Leon Silverman era. Under Leon, Laser Pacific developed Emmy Award-winning technologies that spurred innovation throughout the entire industry and beyond. Hopefully Technicolor can rekindle the spirit of innovation that Leon Silverman fostered at Laser Pacific.

See the article below:

Technicolor has been doing some shopping to dramatically boost its digital cinema business. The Paris-based company announced July 27 an agreement to acquire Cinedigm Digital Cinema Corp.’s physical and electronic theatrical distribution assets, and on the same day announced the acquisition of LaserPacific’s postproduction assets.

The Cinedigm deal will grow Technicolor’s satellite presence by 40 percent, expanding it to over 1,100 locations in North America. Distribution assets, replication equipment, and at least 300 satellite roof rights are also included, among other things. Additionally, Technicolor will license some of Cinedigm’s key software and become its preferred partner for related post-production services…continue to original post.

Apple doesn’t fall far from the tree. The Death of Final Cut Pro.

UPDATE 9/8/2011: Apple is back to selling the old version of Final Cut Pro…thanks!

After spending the day brainstorming with various high-level technologists in the entertainment industry, I have concluded that Apple has decided to end development and support of its professional editing application, Final Cut Pro. Check out this article for more information about Apple users complaining about Final Cut Pro X being lean.

Forget the decade of Apple customer loyalty to Final Cut Pro. Apple has gone the way of the North Koreans on this one. Final Cut Pro was like another failed desert war: “long-term and no hope for victory.”

All the hopes and dreams of the creative community that were riding on the shoulders of Apple as a platform are now being sold down the river for bottom-line profit margins. When you tell the consumer it’s too expensive to develop and support a professional editing suite, you are telling them you are giving up on a huge community. Creativity will be stifled until editors complete the migration to Avid and Adobe Premiere, because that’s all we seem to have left at this time.

Could there be a deal coming between Avid and Steve Jobs? Only time will tell. We’re watching, and speculating based on the best knowledge we have. Technologists at the highest levels of the entertainment industry are in agreement that this is happening with Apple. Let’s see what unfolds.

It looks like Bill Gates might have had it right. It’s always been PC and will remain so for the foreseeable future. Who’s going to be the next to use a form of free Unix as an operating system and make you pay for it?

As our chief scientist Robert Haussmann at nanotechniq says, “Go Jobsy, you are a marketing genius.”

More information is coming as we analyze the data….
