

In advance of its Intel Developer Forum in San Francisco, Intel researchers are talking today about the toughest computing problems they face. Among them is the task of connecting the physical and digital worlds in a way that is believable to users and then getting a computer to anticipate what users want.

Key to this is getting computers to act more like humans, with a keen sense of “situational awareness,” or what is going on around someone at a given time, where they are, what they want to do, what they’re feeling and what they ought to know.

Accordingly, Intel Research is pouring a lot of money and effort into technologies that can make it much easier to connect the physical world to the digital world, said Andrew Chien, director of Intel Research. To really make it effortless for users, computer researchers have to make breakthroughs in a variety of fields: visual computing (look out, Nvidia), sensors that can translate physical data into digital data that computers can process, and mobile technologies that deliver computing power and connectivity in a power-efficient way.

Jim Held, an Intel fellow, said that Intel is closely watching the development of virtual worlds such as Second Life as the likely path for future computing interfaces. He noted that more than 2,000 virtual worlds are in development and that the number of users already in virtual worlds has surpassed 300 million. Second Life is a compute-intensive application, he said, consuming 70 percent of a processor’s available computing power and 35 percent to 70 percent of a computer’s graphics capability. Part of the challenge is crunching all of the data associated with user-generated content.

Over time, he believes that such worlds will combine digital 3-D animations with real-world data, so that 3-D data can be overlaid on top of a real video image to help users identify locations or play new kinds of “augmented reality” games.

The sensing technologies under research span a broad range. At the microscopic level, Intel is studying how to track stem cell growth at nanometer scale — following individual cells and measuring their rates of reproduction and movement.

“On the basis of how they behave, you can classify them on how rapidly they’re reproducing,” Chien said.

Intel is also doing skin cancer research, using sensors to capture images of skin lesions and determine whether those lesions change over time and whether they are cancerous. The DermFind research is being done by May Chen and other researchers in Intel’s Pittsburgh research office. Their system automatically processes the lesion data, archives it, and lets doctors look up changes in the lesions.

On a higher level, computers need to get much better at perceiving daily life: recognizing everyday objects, navigating a path through the physical world, and reading faces and the emotional expressions on them. In geek terms, real-time video detection requires four teraflops of computing power at 10 kilowatts of power. That’s not possible with today’s computers.
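To put those figures in perspective, a quick back-of-the-envelope calculation works out the implied efficiency. The teraflops and kilowatt numbers come from the article; the desktop power budget is an assumption for illustration only.

```python
# Numbers cited in the article: real-time video detection needs ~4 teraflops,
# which at the time would draw ~10 kilowatts of power.
required_flops = 4e12   # 4 teraflops
power_draw_w = 10_000   # 10 kilowatts

efficiency = required_flops / power_draw_w  # flops per watt today
print(f"Implied efficiency: {efficiency / 1e9:.1f} GFLOPS/W")  # 0.4

# Assumption (not from the article): a desktop-class budget of ~100 W.
desktop_budget_w = 100
needed = required_flops / desktop_budget_w
print(f"Needed: {needed / 1e9:.0f} GFLOPS/W -> a {needed / efficiency:.0f}x gain")
```

That roughly 100x efficiency gap is one way to read Chien’s claim that the task “is not possible with today’s computers.”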

“Collecting data from sensors and getting to high-level results such as a ‘call for help’ are really difficult,” Chien said.

Some of these sensor projects are amusing. One Intel project determines which member of a household is watching TV at any given time based on how they use the remote control. The remote contains accelerometers that detect usage patterns — which way it’s pointing, for instance — and those patterns differ enough from one family member to the next to tell them apart.

“That gives us info on who is using what in a given living room without a lot of heavy-duty instrumentation,” Chien said.
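The underlying idea — matching a sensor reading to the nearest known user profile — can be sketched with a nearest-centroid classifier. The features (mean tilt, gesture speed) and the user profiles below are invented for illustration; Intel’s actual features and method are not described in the article.

```python
import math

# Hypothetical per-user feature centroids: (mean tilt in degrees, gesture speed).
# These profiles are made up for illustration.
PROFILES = {
    "parent": (10.0, 0.5),
    "teen": (35.0, 2.0),
    "child": (60.0, 3.5),
}

def identify(tilt: float, speed: float) -> str:
    """Return the user whose profile centroid is nearest to the reading."""
    return min(
        PROFILES,
        key=lambda name: math.dist((tilt, speed), PROFILES[name]),
    )

print(identify(33.0, 1.8))  # -> teen
```

A real system would learn the profiles from labeled accelerometer traces rather than hard-coding them, but the matching step looks much the same.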

Mary Smiley, director of the Intel Emerging Technologies Lab, showed off a “proactive wellness” application. A mobile phone with the appropriate sensors could display a person’s heart rate, posture (sitting, for example), and other data about what they were doing at any given time, along with their progress toward the day’s energy-expenditure and calorie-intake goals. When the researcher, Intel’s Junaith Shahabdeen, started running, the data on the phone showed his heart rate climbing, along with the message, “Calm down, please.” Shahabdeen said he has been using the device during daytime hours for weeks and has learned that his desk job keeps him too sedentary.
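The demo suggests a simple rule layer mapping sensor readings to coaching messages. The sketch below is a guess at that kind of rule; the heart-rate threshold and the sedentary message are assumptions, not Intel’s actual implementation (only the “Calm down, please.” message appears in the article).

```python
# Assumed cutoff between rest and exertion -- not from the article.
RESTING_MAX_BPM = 100

def wellness_message(heart_rate_bpm: int, posture: str) -> str:
    """Map a sensor reading to a coaching message, as the demo phone did."""
    if heart_rate_bpm > RESTING_MAX_BPM:
        return "Calm down, please."          # the message shown in the demo
    if posture == "sitting":
        return "You've been sedentary -- consider moving."  # invented message
    return "Keep it up."                     # invented message

# Example: a reading like Shahabdeen's while running.
print(wellness_message(140, "running"))
```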

It’s time, she said, for humans to stop adapting to computers. “They should adapt to us,” she said.
