The World’s First ‘Hands On Search’

Introducing the world’s first ‘Hands On Search’, allowing visually impaired children in Tokyo to search with their hands. The vending machine-like pod recognises your spoken search, then recreates the result with a 3D printer. For the first time, visually impaired people will be able to touch what they can’t see, anything from giraffes to skyscrapers.


Could future devices read images from our brains?

As an expert on cutting-edge digital displays, Mary Lou Jepsen studies how to show our most creative ideas on screens. And as a brain surgery patient herself, she is driven to know more about the neural activity that underlies invention, creativity, thought. She meshes these two passions in a rather mind-blowing talk on two cutting-edge brain studies that might point to a new frontier in understanding how (and what) we think.


Enabling speed reading

The reading game is about to change forever. With Spritz, which is coming to the Samsung Galaxy S5 and Samsung Gear 2 watch, words appear one at a time in rapid succession. This allows you to read at speeds of between 250 and 1,000 words per minute. The typical college-level reader reads at a pace of between 200 and 400 words per minute.

Other apps have offered similar kinds of rapid serial visual presentation to improve reading speed and convenience on mobile devices in the past. What Spritz does differently is manipulate the format of the words to line them up with the eye’s natural motion of reading. The “Optimal Recognition Point” (ORP) sits slightly left of the centre of each word, and is the precise point at which our brain deciphers each jumble of letters. The unique aspect of Spritz is that it identifies the ORP of each word, makes that letter red and presents all of the ORPs at the same spot on the screen. This way, our eyes don’t move at all as we see the words, so we can process each word immediately rather than spend time scanning for the next one.
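To make the mechanism concrete, here's a minimal Python sketch of the idea: pick a pivot letter slightly left of centre, colour it red, and pad every word so that letter lands in the same column. The `orp_index` heuristic and the terminal display are illustrative assumptions, not Spritz's actual (proprietary) algorithm.

```python
# Minimal sketch of ORP-style word alignment, assuming a terminal that
# understands ANSI colour codes. Illustration only, not Spritz's algorithm.
import time

RED, RESET = "\033[31m", "\033[0m"

def orp_index(word: str) -> int:
    """Pick a recognition point slightly left of the word's centre."""
    return max(0, (len(word) - 1) // 2 - (1 if len(word) > 5 else 0))

def render(word: str, pivot_col: int = 12, width: int = 30) -> str:
    """Pad the word so its ORP letter always sits in the same column,
    and colour that letter red, as Spritz does."""
    i = orp_index(word)
    left_pad = " " * max(0, pivot_col - i)
    right_pad = " " * max(0, width - len(left_pad) - len(word))
    marked = word[:i] + RED + word[i] + RESET + word[i + 1:]
    return left_pad + marked + right_pad

def rsvp(text: str, wpm: int = 250) -> None:
    """Flash one aligned word at a time at the requested reading speed."""
    for word in text.split():
        print(render(word), end="\r", flush=True)
        time.sleep(60.0 / wpm)
    print()

if __name__ == "__main__":
    rsvp("The quick brown fox jumps over the lazy dog", wpm=250)
```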

This is 250 words per minute. Harry Potter and the Philosopher's Stone is 76,944 words long. At that rate you could read HP1 in just over 5 hours.




350 words per minute doesn't seem that much faster. 3 hours and 40 minutes to finish Potter.
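The arithmetic behind those estimates is easy to check, using the word count quoted above:

```python
# Quick check of the reading-time figures quoted above.
HP1_WORDS = 76_944  # Harry Potter and the Philosopher's Stone, as cited above

def reading_time(words: int, wpm: int) -> str:
    minutes = words / wpm
    hours, mins = divmod(round(minutes), 60)
    return f"{hours} h {mins:02d} min"

for wpm in (250, 350):
    print(f"{wpm} wpm: {reading_time(HP1_WORDS, wpm)}")
# 250 wpm: 5 h 08 min  -> "just over 5 hours"
# 350 wpm: 3 h 40 min
```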

Science quote from their web page:
When reading, only around 20% of your time is spent processing content. The remaining 80% is spent physically moving your eyes from word to word and scanning for the next [Optimal Recognition Point].




Now it's getting harder to follow. It probably takes time to get used to, but still, I can't imagine being able to concentrate on this for too long. If you could keep up with this pace for two and a half hours, you could read Harry Potter 1 from cover to cover.




Boston-based Spritz, which says it has been in "Stealth Mode" for nearly three years, is working on licensing its technology to software developers, ebook makers and even wearable manufacturers.

Here's a little bit more about how it works: In every word you read, there is an "Optimal Recognition Point” or ORP. This is also called a "fixation point." The "fixation point" in every word is generally immediately to the left of the middle of a word, explains Kevin Larson, of Microsoft's Advanced Reading Technologies team. As you read, your eyes hop from fixation point to fixation point, often skipping significantly shorter words.

"After your eyes find the ORP, your brain starts to process the meaning of the word that you’re viewing," Spritz explains on its website. Spritz indicates the ORP by making it red, and positions each word so that the ORP is at the same point, so your eyes don't have to move. That's what makes it different from RSVP speed reading, which just shows you words in rapid succession with no regard to the ORP. Here's a graphic that shows how Spritz keeps your eyes still while reading:



via HuffingtonPost

A New Car UI

How touch screen controls in cars should work

The problem: Several automotive companies have begun replacing traditional controls in their cars with touch screens. Unfortunately, their eagerness to set new trends in hardware is not matched by their ambition to create innovative software experiences for these new input mechanisms. Instead of embracing new constraints and opportunities, they merely replicate old button layouts and shapes on these new, flat, glowing surfaces. So even controls for air conditioning and infotainment - which are commonly used while driving - now lack any tactile feedback and require the driver's dexterity and attention to operate. Considering that distracted driving is the number one cause of car accidents, this is not a step in the right direction.

The solution: A new mode that can be invoked at any time. It clears the entire screen of those tiny, intangible control elements and makes way for big, forgiving gestures that can be performed anywhere. In place of the lost tactile feedback, the interface leverages the driver's muscle memory to ensure they can control crucial features without taking their eyes off the road.

via Matthaeus Krenn
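Here's a rough sketch of how such a mode might be wired up: the whole screen becomes one gesture surface, a coarse attribute of the gesture selects a control, and a big vertical drag adjusts it. The finger-count mapping below is entirely hypothetical, an illustration of the idea rather than Krenn's actual design.

```python
# Sketch only: the whole screen is one gesture surface; no small targets to aim for.
# The finger-count -> control mapping is hypothetical.
from dataclasses import dataclass

CONTROLS = {2: "volume", 3: "cabin temperature", 4: "fan speed"}  # illustrative mapping

@dataclass
class GestureState:
    fingers: int = 0
    start_y: float = 0.0
    value_at_start: float = 0.0

def on_touch_start(state: GestureState, fingers: int, y: float, current_value: float) -> None:
    """Touch anywhere on the screen: the finger count selects the control."""
    state.fingers, state.start_y, state.value_at_start = fingers, y, current_value

def on_drag(state: GestureState, y: float, screen_height: float) -> tuple[str, float] | None:
    """A big, forgiving vertical drag adjusts the selected control."""
    control = CONTROLS.get(state.fingers)
    if control is None:
        return None
    delta = (state.start_y - y) / screen_height            # dragging up increases the value
    new_value = min(1.0, max(0.0, state.value_at_start + delta))
    return control, new_value

# Example: three fingers placed anywhere, dragged up a quarter of the screen.
state = GestureState()
on_touch_start(state, fingers=3, y=600.0, current_value=0.5)
print(on_drag(state, y=400.0, screen_height=800.0))  # ('cabin temperature', 0.75)
```

Because the driver only has to remember "how many fingers" and "drag up or down", the control works by muscle memory alone, without looking at the screen.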

Feel textures on a screen

Fujitsu has been demonstrating a prototype tablet featuring haptic technology that lets the user feel the texture of the on-screen image.

The Fujitsu prototype uses ultrasonic inducers to vibrate the screen at different frequencies, creating a cushion of high-pressure air above the screen that can be varied based on your fingertip's position on the X-Y axis. Match that with an on-screen image and you have something pretty magical - different 'surfaces' in the image feel different to the touch.
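As a rough sketch of the software side, under the assumption of a per-pixel "roughness" map for the image: sample the value under the fingertip on every touch-move event and feed it to the vibration driver. The `set_ultrasonic_level` callback is an invented stand-in for whatever interface the hardware actually exposes; Fujitsu's real friction-modulation scheme is more involved.

```python
# Illustrative only: maps a per-pixel "roughness" texture to a drive level
# at the current touch position. set_ultrasonic_level() is a stand-in for
# the real driver interface, which isn't described in the article.

def roughness_at(texture: list[list[float]], x: float, y: float) -> float:
    """Look up the texture value (0.0 = smooth, 1.0 = rough) under the fingertip.
    x and y are normalised touch coordinates in the range 0.0 .. 1.0."""
    h, w = len(texture), len(texture[0])
    col = min(w - 1, max(0, int(x * w)))
    row = min(h - 1, max(0, int(y * h)))
    return texture[row][col]

def on_touch_move(texture, x: float, y: float, set_ultrasonic_level) -> None:
    """Called on every touch-move event. The whole panel vibrates for this one
    contact point, which is why the prototype is limited to a single finger."""
    set_ultrasonic_level(roughness_at(texture, x, y))

# Example: a 2x2 "image" that is smooth on the left and rough on the right.
texture = [[0.0, 1.0],
           [0.0, 1.0]]
on_touch_move(texture, x=0.8, y=0.3,
              set_ultrasonic_level=lambda v: print(f"drive level: {v}"))
```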




"This technology enables tactile sensations — either smooth or rough, which had until now been difficult to achieve — right on the touch-screen display," Fujitsu said in a statement. "Users can enjoy realistic tactile sensations as they are applied to images of objects displayed on the screen."



The technology currently works with only a single point of contact - the whole screen reacts to that one touch, so feedback can't be localised more precisely - but it's very early days and will undoubtedly evolve. Down the line, a version that could replicate a two-thumb control pad on screen would transform mobile gaming.

Fujitsu's track record of bringing its technology experiments to market, either in its own products or licensed to other vendors, is excellent - about 90% make it to manufacture. It hopes this one will hit retail in 2015.

High speed robot hand

This short video shows a series of demonstrations of a highly capable robotic hand executing tasks at a pace no human could ever match. It dribbles a ball so fast all you see is a blur, grains of rice aren't too small for it to handle, and watching the hand catch a phone is not just wild, it's downright creepy. Its responses to complex, three-dimensional motion show none of the overcompensation a human hand would - we lowly humans could never replicate it.

Exoskeletal robot suit to make astronauts super-strong



X1 was initially designed as a human assist device to allow persons with paraplegia to walk again. Strategically designed motors allow for high torque applications such as stair climbing, while multiple points of adjustment allow for a wide range of users. We are now exploring space applications for exoskeletons, such as amplifying astronaut strength, or even as exercise devices for long duration missions.

Read more over on TechLand