Computers will someday soon automatically provide short video digests of a day in your life, your family vacation or an eight-hour police patrol, say computer scientists at The University of Texas at Austin.

The researchers are working to develop tools to help make sense of the vast quantities of video that will be produced by wearable camera technology such as Google Glass and Looxcie.
"The amount of what we call 'egocentric' video, which is video shot from the perspective of a person who is moving around, is about to explode," said Kristen Grauman, associate professor of computer science in the College of Natural Sciences. "We're going to need better methods for summarizing and sifting through this data."

Grauman and her colleagues developed a technique that uses machine learning to automatically analyze recorded video and assemble a short "story" of the footage that is more coherent than what existing methods produce.
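The article does not detail the researchers' method, but the general idea of condensing long footage into a short "story" can be illustrated with a much simpler, hypothetical keyframe-selection approach: cluster per-frame feature vectors and keep the frame nearest each cluster center. This is only a minimal sketch of that idea, not the Grauman group's algorithm; the function name, the synthetic "video," and the farthest-point initialization are all illustrative assumptions.

```python
import numpy as np

def summarize_frames(features, k=3, iters=10):
    """Pick k representative keyframes by clustering per-frame features.

    features: (n_frames, dim) array, one feature vector per video frame.
    Returns sorted indices of the frames closest to each cluster centroid.
    NOTE: a toy k-means sketch, not the method described in the article.
    """
    # Farthest-point initialization: spread starting centroids apart.
    centroids = [features[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(features - c, axis=1) for c in centroids], axis=0)
        centroids.append(features[d.argmax()])
    centroids = np.array(centroids)

    for _ in range(iters):
        # Assign each frame to its nearest centroid.
        dists = np.linalg.norm(features[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned frames.
        for j in range(k):
            if (labels == j).any():
                centroids[j] = features[labels == j].mean(axis=0)

    # Keyframe = the actual frame nearest each final centroid.
    dists = np.linalg.norm(features[:, None] - centroids[None], axis=2)
    return sorted(set(dists.argmin(axis=0)))

# Toy "video": three visually distinct 10-frame segments, stand-ins for scenes.
rng = np.random.default_rng(1)
video = np.concatenate([
    np.full((10, 8), 0.0), np.full((10, 8), 5.0), np.full((10, 8), 10.0)
]) + rng.normal(0, 0.1, (30, 8))
print(summarize_frames(video, k=3))  # one representative frame per segment
```

In a real system the feature vectors would come from image descriptors rather than synthetic numbers, and the published work adds the notion of narrative ordering that plain clustering lacks.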
Better video summarization should prove important in helping military commanders manage data coming in from soldiers' cameras, helping investigators sift through cellphone video in the wake of disasters such as the Boston Marathon bombing, and helping senior citizens compensate for memory loss with video summaries of their days, said Grauman.

"There's research showing that if people suffering from memory loss wear a camera that takes a snapshot once a minute, and then they review those images at the end of the day, it can help their recall," said Grauman.