The dominance of touchscreen user interfaces will decrease over the next five years as more sensors are introduced to mainstream products and entirely new product form factors emerge, necessitating new user interfaces such as voice, gesture, eye-tracking and neural control, according to a new report from ABI Research.

The report examines popular user interface (UI) methods as well as the natural sensory technologies moving from research labs into future commercial products.

“Touch got mobile device usability to where it is today, but touch will become one of many interfaces for future devices as well as for new and future markets,” Jeff Orr, senior practice director at ABI, said in a statement. “The really exciting opportunity arrives when multiple user interfaces are blended together for entirely new experiences.”

Among 11 unique features, from wireless connectivity to embedded sensors, ABI Research found that hand and facial gesture recognition will experience the greatest growth in future smartphone and tablet shipments, with compound annual growth rates of 30% and 43%, respectively, from 2014 to 2019.

The impact of UI innovation in mobile devices will be felt across a range of applications, the report says, and as mobile applications integrate more technology, the UI must be kept simple enough to be intuitive.

“Packing a mobile device with sensors goes little beyond being a novelty,” Orr said. “Complexity contradicts good UI design and a critical mass of engaging mobile applications are required for mainstream adoption.”
