
iPhone Siri: Is the 1987 Knowledge Navigator Finally Here?

Monday, 26 September 2011


http://www.evdoinfo.com/images/ios5.png

Back in 1987, Apple created a "proof of concept" video showing how people might interact with their devices in the future. That vision never really materialized, BUT the next iPhone (iPhone 5) may actually move us in that direction. Over the last year there have been reports that Apple partnered with Nuance (a speech technology company) and acquired Siri (another speech company), and leaks suggest this powerful speech and speech-recognition feature will finally have enough horsepower thanks to the dual-core A5 processor rumored to be in the iPhone 5.

Here is what the iPhone is rumored to be able to do:

How it works (according to 9to5Mac):

To activate, the user holds down the home button for a couple of seconds (loads much quicker than Voice Control because of the A5 chip/RAM) and then the microphone interface “slides up” from the bottom in a clever animation. The speech interface doesn’t cover your entire view, just about the bottom fourth of the display – like the multitasking/app-switcher function. The feature even works from the lock screen.

Since there are so few direct commands with Voice Control, Apple slides the command options by on a waveform. Assistant is packed with seemingly endless possibilities, so Apple instead has a small “info” button which one can click to view some of the most commonly spoken commands. This command view not only shows command types, but actually provides some sample phrases; Apple obviously wants their implementation to be as intuitive as possible. In the middle of the Assistant interface – next to the small command samples button – is a silver microphone icon with an orbiting purple flare. The flare notes that your iPhone is ready to receive commands.

Assistant taps into many aspects of the iPhone, according to people familiar with the feature and SDK findings. For example, one can say “make appointment with Mark Gurman for 7:30 PM” and Assistant will create the appointment in the user’s calendar. On the subject of events, Assistant also allows users to set reminders through the iOS 5 Reminders application. For example, a user could say “remind me to buy milk when I arrive at the market.” Another example would be integration with the iOS Maps application. A user could ask “how do I get to Staples Center?” and Assistant will use the user’s current location via GPS to provide directions.
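To make those command examples concrete, here is a minimal, purely hypothetical sketch in Python of how transcribed phrases could be routed to the right iOS app. The patterns and app names below are assumptions that mirror the rumored examples; nothing here comes from Apple's SDK:

```python
import re

# Hypothetical command patterns mirroring the rumored examples above;
# nothing here reflects Apple's actual implementation.
INTENTS = [
    (re.compile(r"make (?:an )?appointment with (?P<who>.+) for (?P<when>.+)", re.I), "Calendar"),
    (re.compile(r"remind me to (?P<task>.+) when I arrive at (?P<place>.+)", re.I), "Reminders"),
    (re.compile(r"how do I get to (?P<destination>.+?)\??$", re.I), "Maps"),
]

def route(utterance):
    """Match a transcribed utterance to a target app and extract its details."""
    for pattern, app in INTENTS:
        match = pattern.match(utterance.strip())
        if match:
            return app, match.groupdict()
    return None, {}

print(route("make appointment with Mark Gurman for 7:30 PM"))
# -> ('Calendar', {'who': 'Mark Gurman', 'when': '7:30 PM'})
print(route("How do I get to Staples Center?"))
# -> ('Maps', {'destination': 'Staples Center'})
```

A real system would of course use statistical speech and language models rather than fixed patterns, but the basic job is the same: map an utterance to an app and pull out the details that app needs.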

Another interesting Assistant feature is the ability to create and send an SMS or iMessage with just your voice. For example, you can say “send a text to Mark saying I’ll be running late to lunch!” – and it will send. This is a super compelling feature for people who cannot physically or safely take the time to type out a text message. Users can also choose to have Assistant read back unsent text messages to ensure the system interpreted the speech correctly. If the text is written correctly, the user simply says “yes;” if not, the user says “no” and Assistant will ask the user to speak again. Apple is also working to allow users to ask for a specific song to be played; Voice Control only allows albums, artists, and playlists to be chosen with your voice.
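As a rough illustration of that read-back loop, here is a hypothetical Python sketch; the speak, listen, and send callbacks stand in for the phone's text-to-speech, speech-recognition, and messaging layers, none of which are public:

```python
def confirm_and_send(draft, speak, listen, send):
    """Hypothetical read-back loop: speak the dictated draft, send on 'yes',
    ask for a fresh dictation on anything else. Not Apple's actual logic."""
    while True:
        speak(f'Your message says: "{draft}". Should I send it?')
        if listen().strip().lower().startswith("yes"):
            send(draft)
            return
        speak("OK, please say the message again.")
        draft = listen()  # new transcription replaces the rejected draft

# Toy run with canned answers standing in for real speech I/O:
answers = iter(["no", "I'll be running late to lunch", "yes"])
confirm_and_send(
    "I'll be running late to launch",   # deliberately misheard draft
    speak=print,
    listen=lambda: next(answers),
    send=lambda text: print("SENT:", text),
)
```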

One of the key elements of Assistant is the conversation view. The system will actually speak back and forth with the user to gather the information it needs to provide the best results. The user essentially can hold a conversation with their iPhone as if it were another human being. For example, if a user is making a meeting with me, they will say “set up meeting with Mark” and the first “bubble” of the conversation thread will show that. After that, the system will speak back: “which e-mail address should Mark be notified at, work or personal?” This question will both be spoken out loud by the iPhone Assistant and shown as a new “bubble” in the conversation thread. The user will then respond with the e-mail address they want to notify me at, and the appointment will be made. The iPhone will even show a quick glance at a calendar view to confirm the appointment. If Assistant were sending an SMS, as another example, a mini SMS view would appear so the user gets a quick glance at the SMS thread.
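That back-and-forth is essentially what dialog systems call “slot filling”: the assistant keeps asking follow-up questions until every detail the task needs is known. Here is a minimal, purely hypothetical sketch of the idea, reusing the e-mail question from the example above; none of these names are Apple's:

```python
def schedule_meeting(contact, known, ask):
    """Hypothetical slot-filling loop, not Apple's actual logic.
    `known` holds details already parsed from the initial command;
    `ask` speaks a question aloud and returns the user's spoken answer."""
    questions = {
        "time": f"What time is the meeting with {contact}?",
        "email": f"Which e-mail address should {contact} be notified at, work or personal?",
    }
    for slot, question in questions.items():
        if slot not in known:
            known[slot] = ask(question)
    return {"contact": contact, **known}

# Toy run: the time was already in the spoken command, so only e-mail is asked.
details = schedule_meeting("Mark", {"time": "7:30 PM"}, ask=lambda question: "work")
# -> {'contact': 'Mark', 'time': '7:30 PM', 'email': 'work'}
```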

Assistant is essentially a personal assistant living in your phone. The speech interpretation is so accurate that users do not even have to speak very clearly or in a slow, robotic tone, according to a source familiar with the software. Users can simply talk the way they would talk to another person, and the iPhone with Assistant will do its best to interpret the speech and provide accurate results.

Does this sound like something you heard about a long time ago?

Apple created a proof-of-concept video in 1987 called the "Knowledge Navigator" (see the YouTube video). While iOS 5 isn't exactly that, it sounds like Apple is heading toward this kind of two-way interaction with your device.


Last Updated ( Wednesday, 05 October 2011 )
 