Friday, 26 June 2009
Well, I said I didn't have the cash to try out any of the voice guided turn by turn software for the iphone 3gs just yet. However then I saw the British Isles version of the new Navigon app for £37.99. The app can be used in portrait or landscape mode, and without an active data connection, which is pretty handy as you don't incur any more sneaky charges once you've bought the thing. Navigon say they will be updating the app with more features in the future, and there is an American version on the way.
A session of googling the likely forums revealed this app has a pedestrian mode (as well as car, bicycle, motorbike, lorry - but no public transit yet). The price goes up on the 30th of June to sixty quid (all bar a penny), so I bought it on a whim, because I don't already own any gps software or hardware. And guess what ? The app is ninety-nine percent (in my experience) accessible with VoiceOver.
If I were to go through every nuance of the app I'd be here all night, and dear reader, you would die of boredom. The highlights include being able to set your home address and having the app take you home with a handy Take Me Home button right on the main menu. The ability to look up contacts from your address book and navigate to them by car, bike, lorry (heh !) or pedestrian modes. Let's be honest - the Shank's Pony mode is the one we're interested in as visually impaired people after all. And being able to browse for Points of Interest such as atms and restaurants nearby.
The only unlabeled buttons I've found so far are in the search for point of interest section. However I expect I haven't explored all the ins and outs quite yet, so don't quote me on those being absolutely the only unlabeled buttons. There are three direct access buttons at the bottom of the search for p.o.i. screen which VO reports as "button", which one can customise in the options to search for your choice of p.o.i., such as atm, train station, shopping, whatever, for quick access. It becomes obvious that they are for this purpose when you fiddle with the options on this screen because you can set them up yourself. You'd just have to remember what you set them to.
As I expected, VO will not read street names when you pass your finger over the map. You can show a map rather than have to put in a route if you want to see what is round you, but VO isn't going to tell you any of the visual info sadly. This is going to be down to the map rather than VO - basically the map in this case is just a big picture after all I guess. And we all know how screen readers treat pictures.
But putting in routes via an address is ok once you get the hang of it - it requires a town or city first, then a road, then the number. Below the text field for inputting the info are choices it thinks you might want, for example if you put an N in the field when putting a city or town in, a selectable list of towns and cities beginning with N will appear below the text field. There's a Next button down on the bottom right of the keyboard to advance through this process, or picking one of the list choices advances you automatically.
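For the programmers among you, that suggestion list is a classic prefix filter. Here's a rough sketch of the idea in Python - the town names are made up by me, and the real app obviously does this internally :

```python
def suggest(prefix, places):
    """Return the places whose names start with what's been typed,
    the way the app narrows its list as you enter each letter."""
    prefix = prefix.strip().lower()
    return sorted(p for p in places if p.lower().startswith(prefix))

towns = ["Norwich", "Nottingham", "Newcastle", "London", "Leeds"]

# typing a single N narrows the choices to the three N towns
print(suggest("N", towns))
# each further keystroke narrows the list again
print(suggest("No", towns))
```

Picking one of the suggestions then skips the rest of the typing, which is exactly what the selectable list below the text field does for you.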
Picking contacts to navigate to is a matter of choosing the contacts button from the main menu, and picking one out of your address book.
Once the app accepts the address to navigate to there will be a More button at the bottom of the screen where you can pick what the app calls a Route Profile. A screen asking for a Speed Profile will appear - this is where you choose pedestrian, car, whatever (you have to do this to get the right mode, so to speak, if you're walking it). There's also a save as a favourite button for the address you've put in - a sort of customisable point of interest if you like. Once you've picked your route profile (you will only have to do it once - the choice sticks unless you change it in the future), go back to the previous screen and press Start Navigation (top right of screen). Then get going !
The polite female voice gives you accurate directions - in my test from my house to my fiance's anyway. She piped up to tell me to turn right and left as I was on the corner of roads (very useful), told me how many meters to walk and then turn, what roads I was turning onto, when I was on the road featuring my destination, and when I was approaching my destination, and when I'd gotten there. "You have reached your destination" rang out within about ten feet of my fiance's garden gate I'd say.
And when I had reached home again on the return, she informed me pretty much within two feet of the turn to my gate, as my fiance's guide dog indicated the turn.
Unlike Trekker you can't browse the route beforehand (so far), and it won't tell you the roads coming up on the left or right unless they are on your route (so far). However in the map screen (Show Map from the main menu), the name of the road is displayed at the bottom of the screen - and this is read by VO. It's a lot bigger an area to tap than the current location button in Google Maps, too. So if you were to make a turn and wonder what road you were on, tapping there would cause VO to tell you.
Whether it's for you or not will depend on your own individual perceptions, but what I can say is that the app is accessible in all the right areas. There are no nasty blank areas that are vital to putting in routes or making choices. Once you figure your way around the app, I reckon it's a winner for the money personally when faced with other accessible choices.
Your mileage, as ever, may vary - if you'll pardon the pun !
Thursday, 25 June 2009
Whilst trip trapping through the app store yesterday, I happened upon a free trial of an OCR program for the iphone 3gs called OcrNow ! Lite. There are others, but this one had a free version which gives you ten goes to see if it works for you. "I'll have some of that", I thought.
OCR, for those who may not know, stands for "optical character recognition", or reading words, to put it simply. Us visually impaired types might have more experience with this than most because it's software which allows us to read printed letters and the like which we maybe wouldn't normally be able to read. You pop your letter on a scanner, scan it, and the software takes a picture of the letter and converts the words in the image into plain text, which you can then read with a screen reader, edit, or whatever else you like - put straight into the trash bin (as is the case for me with junk mail) or run around the place screaming in horror (as is the case for me with household bills).
The iphone 3gs has an improved camera over the previous models (although the app does work on the previous iphones too) and finding this app gave me a thought about what use I could put it to, being a nearly blind bint and all.
Basically, the software takes a picture of your real world item containing the text you want to convert, uploads it to an online server, and then emails it back to you as an attached file in whatever format you choose - options being, pdf, plain text or rich text. You can also have the image itself included if you want. There are similar things floating about for Symbian phones, so I gather, though I've not played with them. Obviously to use this you need an active data connection.
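To make that round trip clearer, here's a toy sketch of the flow in Python. Every function name here is a stub I've invented purely to show the shape of the thing - it is not the app's real code or the server's real protocol :

```python
def take_picture():
    # stand-in for the iphone camera: returns raw image bytes
    return b"pretend image data"

def upload_for_ocr(image, output_format="txt", return_image=False):
    # stand-in for the upload to the ocr server; in reality this is
    # an HTTP request carrying the image and your chosen options
    assert output_format in ("pdf", "txt", "rtf")
    return {"job_id": 1, "format": output_format, "image_included": return_image}

def email_result(job):
    # the server does the recognition and emails the result back
    # to your nominated address as an attachment in your chosen format
    return "attachment." + job["format"]

job = upload_for_ocr(take_picture(), output_format="txt")
print(email_result(job))
```

Snap, upload, wait for the email - that's the whole dance, which is why the active data connection matters.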
The ocrNow ! app is very accessible via VoiceOver on the iphone 3gs. Hooray !
On opening the app, flicking from the top gets you the app title, an "add" button (this is where you go when you want to get started), and then a set of instructions which tell you what happens and how to do it.
At the bottom we have an "ocrnow" button (the "go" button for when you've taken your picture, basically), a delete button, and an options button.
The options button takes you to a screen with a "using ocr now" tab, a "connection details" button, and a "jobs" button.
The using ocr now section gives you some tips about how to get the best results out of the program, more on that later. The connection settings department is where you will find the nominated email address for converted text to be sent, and the ocr server address (I advise not to fiddle with this bit, lol). The jobs button shows you a list of your jobs, how they were interpreted - for example how many characters and words the program recognised, and how many words of those were in the dictionary, plus any "suspicious characters", lol, the program detected. This doesn't mean how many dodgy reprobate types the program might have scanned your surroundings for (though wouldn't that be a triumph of an app ? heh heh) but how many characters the program couldn't quite suss. More about this section in a bit.
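If you're curious what's behind those numbers, here's a guess at the sort of tally the jobs screen is reporting, sketched in Python. The dictionary and the sample words are entirely my own invention for illustration :

```python
def job_stats(recognised_words, dictionary, suspicious_chars):
    """Summarise an ocr job the way the app's jobs screen seems to:
    total words, how many were found in the dictionary, and how many
    characters the engine wasn't sure about ("suspicious")."""
    in_dict = sum(1 for w in recognised_words if w.lower() in dictionary)
    return {
        "words": len(recognised_words),
        "in_dictionary": in_dict,
        "suspicious_characters": suspicious_chars,
    }

dictionary = {"dear", "madam", "thank", "you", "for", "your", "letter"}
# "y0u" and "lett3r" are typical ocr mangling - zero for o, three for e
words = ["Dear", "Madam", "thank", "y0u", "for", "your", "lett3r"]
print(job_stats(words, dictionary, suspicious_chars=2))
```

A low in-dictionary count relative to the word total is a decent hint that your picture was wonky and worth retaking.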
Now, here's where I have to point something out. When I started to fiddle with this app, I didn't have the idea of actually reviewing such things. It only occurred to me that my messing about with such things could possibly benefit others when I'd fiddled with a couple of programs and found my way around them already. So when you start this app for the very first time, you may well get a setup screen that tells you what to do and asks you for an email address to send the output to. I can't remember - it was yesterday when I first met the app and I have serious memory problems much of the time. I will try to remember to document the next program I "review" right from the off though.
So, to the meat of the matter.
On the first screen of the app, you have your add button as I said. Press this to get started scanning your text. The camera opens and tells you "viewfinder, image". Here is where the tips the program gives you in the using ocr now section really help. It recommends keeping the item flat, for example on a desk or on a wall. It recommends good light (well, we'll have to chance it on that one I guess). It recommends holding the phone about 13 inches away from the paper if you're doing an a4 letter, for a3 about 20 inches. Try to keep the border of the picture in line with the border of the paper, it says. Well, those of us who can see a bit might have better success with this one - but it is worth remembering something that caught me out a few times - the camera on the iphone is in the very top right corner of the rear of the phone if you hold it facing you with the home button at the bottom. I guess most of us would imagine it was in the middle of the phone, but it isn't. This makes lining stuff up a bit less instinctive but there you go.
Once you think you've lined up your paper, gently double tap the screen somewhere in the middle. This focuses the camera on the item below it. You'll hear a sound when it focuses. I say do it gently because whatever movement you put on the phone will likely mess with the position of it and thus the focus. Double tap the take picture button to take a picture (or use the split tap method if you prefer - I reckon this is better for taking pictures, as it moves the phone less). It's probably worth messing about with the camera if you're so inclined beforehand to find your way around it.
When you've done this, you will get a preview of the image and a "retake" button at the bottom of the screen, and a "use" button. Double tap the use button to use the image (or retake if you've loused it up).
Then you're taken back to the previous screen, which will be showing a preview of your image, and giving you the magic "ocrNow!" button at the bottom left of the screen. Activate it !
You're now taken to a screen which allows you to set the options for your output. From top to bottom, you get the subject text field where you will likely want to put the subject of your email.
Then flicking past that gets you a choice of pdf, text or rtf buttons - then you're told this is the output format you're choosing. Ass backwards I know but you can't have it all. Your previous choice of format is remembered and indicates as selected.
Flicking by gets you a yes button and a no button, then a "return image" description - again backwards. Check yes if you want to have the image emailed to you too.
Then you have your ocr now button, which is where the magic happens (or a cancel button next to it if you have given up all hope by this point).
Activating the ocrnow button sends the image to the server. The screen changes to a progress bar but doesn't say anything when it's done, so give it a wee while. If you've got push notification for your email (assuming you're sending it to yourself and want to get the result on your iphone) you'll know when it's done as an email will come in. If not, about thirty seconds should do it. There's a button at the bottom of the screen which changes from "cancel" to "continue" when it's finished, so if you're light of touch this is a good way to find out when the magic's happened. Double tapping continue takes you back to the screen with the add button for further excitement (or boredom, depending on your point of view.)
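That cancel-to-continue flip is, under the bonnet, just the app checking on the job until the server says it's finished. A toy version of the pattern in Python - the status function is a stub I've made up, pretending the job finishes on the third check :

```python
import itertools

def job_status(poll_number):
    # stand-in for asking the server how the job is going; here we
    # just pretend it finishes on the third check
    return "done" if poll_number >= 3 else "processing"

def wait_for_job():
    """Poll until the job finishes, the way the app's button flips
    from "cancel" to "continue" when the server is done."""
    for n in itertools.count(1):
        if job_status(n) == "done":
            return "continue"
        # a real client would sleep a second or so between checks

print(wait_for_job())
```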
Fire up your email on the iphone if you want instant gratification and you're not near your pooter, and look for an attachment right at the bottom of the blurb that the company who make the app write you when they send you your output. Flicking through this takes a while. Double tap the attached file to get your outputted text, where you can read it at your leisure.
I have to admit I totally missed the attachment hiding at the end of the blurb to begin with and even emailed the company asking if the text could be sent inline with the email as opposed to attached, as I couldn't read attachments using my iphone (though in the back of my mind I recalled that you could - I just couldn't find them). What a silly cow I am ! They were very kind and said they would definitely consider this option in a future version of the app, by the way. They must have thought I was a right eejit though.
If you don't want to wait for email, in the jobs section of the ocr now app you get a list of the jobs you've done (in the program, not in real life, heh) and options to delete them or download them. Choosing download gets you an option to view the output - which will then go to a screen which will read it all to you, but is not interactive so it will read it in one big lump (keep up at the back there, lol) or to email the output, and you can choose where to email it this time as opposed to having a default address set up.
So - does it work and how well ?
Well, yes, it does work. I snapped a letter I had been sent on a4 paper, and after a few tries of lining it up I managed to get an output that sent the entire letter back to me in a format I could actually read, with very few mistakes or missed characters. Anyone familiar with ocr will know that sometimes you do get a few letters wrong here and there, but you know what the word is supposed to be by context and common sense. I tried a book cover - pretty successful but not quite as easy to line up as the letter. I got real words out of it though, and they even made sense ! And a minuscule chocolate wrapper that even sighted folks might struggle to read I'd wager. I got enough to know that it was 54 percent cocoa solids and made by Kraft Foods. I have no idea where I had lined it up in the viewfinder though.
I suspect this concept might be more successful for those with some remaining vision than those without. I have some left so I was able to line the letter up ok with a good light and contrasting background, but not the smaller things I tried. All of this will depend on one's level of sight and perseverance. I imagine some enterprising and inventive types could make a frame for the iphone and the items to be scanned which makes the process of lining up and focusing a snap, but that would rather destroy the on the go nature of the app. Still, if you don't have a solution already, that would be an idea worth considering I reckon - because compared to scanning software and a camera for your computer, or a flatbed scanner and software, or a standalone solution (that's my credit card I hear screaming in the background at the mention of such things !), the app is dirt cheap.
The beauty of it is that there is a trial version which will do you ten goes, and the program is fully accessible with VoiceOver on the iphone, so you can try it yourself and see if it works for you. To buy the full version is £7.99.
I'm going to give it a go tonight at the pub and see how it copes with menus and the like.
If you've got some residual vision and don't want to spring for a mobile magnifier, or don't want to carry one around in addition to your iphone, ocrNow ! may well be what you are looking for ... pun intended.
Wednesday, 24 June 2009
As the turn by turn navigation apps start to roll in now that Apple has opened the doors to development of such things, voice guided navigation is now available for the iphone. AT&T have got one going for a monthly subscription in the US, so will O2 follow suit in the UK ? And even more importantly, will such things work with the iphone 3gs accessibility feature, VoiceOver ?
Often it's the maps that make such apps for mobile devices cost an arm and a leg, but I can't help thinking that those of us who are visually impaired and who need an accessible solution for navigation tend to get even more shafted by and large.
A prime example being Wayfinder Access.
In order to use Wayfinder Access, you have to have Nuance Talks or MobileSpeak on your mobile. Talks is £150 for an IMEI based license, or £280 for a sim based license. MobileSpeak was £149 from Enableeyes last time I looked. Wayfinder Access itself is anything from £150 upwards depending on what maps you want. You need a data connection to use it. And of course you'll need your mobile phone to install all this on.
Wayfinder Navigator, however (which isn't accessible, so this is the version the "normal" people get to use), is available on a 12 month basis (or a 36 month basis) at £49.99 for 12 months. You need a data connection for this too.
Having messed about with both of these, I wonder what Wayfinder Access provides, with regards to actual information to guide you, that Navigator doesn't. In my experience, nothing. It just works with a third party speech solution. So why is Wayfinder Access three times the price ? Oh, sorry I forgot - because it's a limited market (the stock answer). Perhaps if it wasn't so bloody expensive more visually impaired people would buy it.
Trekker Breeze from HumanWare is £485. Brilliant though it is - an all in one device about the size of a tv remote - a friend of mine who has one says the gps signal often drops out on buses. Dearie me. For that money I'd want it to be bulletproof. And don't even go there with the original Trekker, a bastardized Dell pda cum braille / tactile add on with extra bluetooth receiver and speaker. For the £1300 my fiance paid for it three years ago, it is fugly, and the development and support of it now very likely orphaned by HumanWare. And you look like a tit with all that gear hanging off you. Don't mention the 100 quid they would like to charge you for a new speaker - merely a rechargeable clip on effort with a run of the mill 3.5 jack to connect it to the pda headphone port - should yours crap out at the first sign of ambient moisture either.
Loadstone is a wonderful concept, free, and accessible (and coded by two blind blokes) - however as the "map" data is actually provided by users previously sharing route and area data with the program (PointShare), its use is limited to those of us who live in an area already explored by other Loadstone users. Or those of us who are adventurous, or who have access to sighted guides to help us go over our routes prior to needing to traverse them alone. I wonder if an iphone version will surface ? I hope so. The iphone platform could offer so much more data provided by users for such an app, as there are so many of the sleek little Apple beasts out there.
For our sighted friends, we have an assortment of wonderful gps systems with voice guided navigation. In-car sat nav without voice guided navigation would be pretty unwise after all, so voice guides are necessary (and these days, usually standard). A quick look in Dixons reveals gps systems from 50 quid upwards, maps pre loaded. And you get the physical device included and ready to go !
Stick one in your silent but deadly hybrid car, listen to the dulcet tones of Celia (as my dad calls the mournful lady imprisoned in his TomTom), pootle blithely along and confuse a blind bod trying to cross the road today, why don't you. lol.
I understand that just because most of my friends are visually impaired, it doesn't mean the rest of the world is too. I know I am in a minority. I know assistive devices are a limited market - but what sticks in my craw is that the tech for the things we need is already out there and being used in devices for the fully sighted every day. Yet when the same thing is in an accessible form, it costs a shed load more. And often the very people who could use this assistive tech are the people who just don't have the money for this sort of thing.
I have been delighted to find that the majority of third party apps I've tried on the iphone 3gs have been at least partly accessible with VoiceOver. Some of them are completely accessible, and most of them entirely usable even if they don't speak everything on the screen. This is great because - ah, heaven ! - the visually impaired amongst us aren't charged any more than our sighted counterparts when we buy them. Perhaps, thanks to Apple's taking the accessibility bull by the horns with the iphone 3gs, the days of blind bods getting held over the proverbial barrel that is access to mobile phone software and given a good rogering are over. I really really hope so.
But I can't help fearing that when it comes to an accessible voice guided turn by turn gps solution, we may find ourselves yet again bent over and looking at the rough ground of harsh reality whilst a company like Wayfinder Systems takes advantage of us. Here's hoping I'm wrong. Currently I don't have the money to give the current iphone turn by turn offerings that feature voice guidance a try on the accessibility front. If anyone else does, please let me know !
Monday, 22 June 2009
Ok, it seems I am yet again taking back wot I said before - Fring (www.fring.com) does accept voice calls on the iphone 3gs, with VoiceOver active. The Fring app itself seems mainly accessible, except it does not speak the status of your buddies, so you have to guess if they're online or not. I gather the online status is shown by a green icon as opposed to a grey one for offline. The accessibility is good because Fring has a Skype add-on, and you can make calls to Skypers straight from the Fring app without having to have your Skype buddy call you first (as documented below in my murmurings about Skype).
At the bottom of the Fring screen you get a buddy list pane icon, a history pane button, the dialler pane button, the goto pane button, and a more pane button.
Fring starts up with the pane you had in the session before - so on first use it will be the buddy pane (as I recall).
The more pane is where you actually add buddies, and add-ons for Fring such as Skype, to enable you to make Skype calls - result -, or if you like, to add AIM or ICQ plugins, and a few others. If you've not set Fring up before this is where you'll want to go first, to add plugins for the services you need. Once you add in, say, the Skype plugin and set it up with your username and password for Skype, Fring automatically imports your Skype contacts. Ditto with any other plugins you add in.
The buddy pane (first on the left at the bottom) is where you find your buddies, and activating a buddy by clicking their name reveals (from top left travelling right) a back button, their name and user id, then two buttons. The first on the left is the "chat" button (VO just says "button"). The second on the right (both about in the dead middle of the screen) is the "call" button (again VO just says "button"). Sometimes I have found that you have to flick back and forth between these to get VO to say "button" for some reason, sometimes it merely clicks.
Activating the call button - well, calls the contact - if the contact is on a service that supports voice calls through Fring. Skype does. I haven't tried the others because I only have Skype contacts.
Flicking around the screen that appears when in a call gets you the contact name, call in progress, the time of the call in minutes and seconds, and the final button at the end of the right flick is the end call button, which sometimes reveals itself in VO as "end call" button, or sometimes just "button". It's directly above the tactile home screen button anyhow.
Activating the chat button chats the contact - no surprises there. You get a back button to return to the previous pane, the contact's name and the date and time of the chat, and then the messages in the chat (if there's any, if it's a new chat you get "empty list".) Flick along to the right to get a text field (approx half way down the screen), and after double clicking to activate, type your chat message on the keyboard which pops up (bottom half of screen, where the keyboard usually is on the 3gs).
There's a clear text button to the right of the text field (if you mess up typing your message or decide not to send it afterall I assume), and then the send button. Guess what that one does ? Heh.
You'll get a tone on the receipt of any chat messages from your contact, but the text they send is not read out automatically. You have to either flick left of the text field to read the latest chat message, or head for the top third to top half of the screen to find it by finger, where it will be displayed in a timeline of the chat along with the previous messages in conversation form.
The history pane shows your history of chats and calls. It reads ok.
The dialler pane in Fring isn't accessible at all, so if you were hoping to dial cell contacts from it (which is what the dialler pane is for, apparently) you're out of luck.
The goto pane reveals your current chats or calls, if you want to switch between them I imagine.
That's about it I think.
Sometimes, as in a few iphone 3gs apps I've tried, VO will read content on Fring panes that you might have used previously. From what I can gather, it acts as if it's reading a pane which is underneath the pane you're using as opposed to the one you're actually using. Odd, but I suspect there are a few bugs that Apple need to iron out in the next os update. Coming out of the current app and then back in seems to sort it out.
I tell porky pies - I got it wrong yesterday. Skype for iphone is semi accessible with VoiceOver and will accept and make calls.
The only parts which aren't accessible are the contact list and the chats pane, which is a bummer because if you can't access your contacts you can't make calls. However this can be gotten around by having that contact make a call to you first, then they appear in your history list, which is accessible. You can then call or chat them from there, but you can't read their chat replies. Boo hoo.
So - there's hope for it if you can get your contacts to make contact with you first. This is what works for me, anyhow.
Sunday, 21 June 2009
I got my shiny new iphone 3gs yesterday. After some messing about with o2, being on hold for hours, and issues with my credit card (that were not of my making fortunately), I finally managed to order a 16gb black version. It turned up at quarter to nine yesterday morning. I imagine the dogs' few barks at the courier will have pissed my neighbours off no end, but there you go.
Playing with it, I have a few observations about the ins and outs of VoiceOver on the 3GS, which I will rabbit on about - starting with the issues I have with it first.
The rotor control does not work exactly as described by Apple, well, for me at least. The way I read the instructions seems to imply that one puts two fingers on the screen and rotates them together as if turning a dial to switch between the various options that the rotor controls.
I couldn't get this to work no matter what - but what does work for me is to put one finger on the screen and keep it touching the screen, and then put another finger next to it and rotate that finger, keeping the first finger still on the screen. You will then hear a twisting dial sound, and the characters, words or whatever options (which change depending on what app you're in) will be read out. Just stop turning when you get to the one you want.
As for selecting items, I favour the split tap method - passing one finger over the screen, touching it lightly, and when the item you want to use or select is read out, I then tap anywhere on the screen with another finger to open or activate it. It is important to keep the first navigation finger, as I call it, touching the screen and on the item you want whilst tapping with the other, or activation finger, as I have christened it. If you move your selection finger before tapping with the activation finger even slightly you may find the item you wanted isn't selected anymore, as items can be very close together.
There is a bug in the Notes program that intermittently does not activate the on-screen keyboard when editing the text, so the audio prompt will say "Double tap to edit", and you will double tap, then the keyboard will not appear and you'll be left wondering what is going on. Exiting the Notes app and then reentering it seems to sort it out ninety nine percent of the time.
I had the original iPhone 2G when it came out, but had to stop using it when my remaining vision became too poor to use it (I went onto a Nokia E90 with Mobile Squeak, what a nightmare). Before I gave it up I imagine people got some very odd and garbled text messages from me however. But I have retained a vague idea of the layout of the iPhone keyboard, apps, and so on, which is helping me now. Having a vague idea where things are helps I think.
My bf, who whilst being lovely has the patience of a gnat (and no sight at all), finds the iPhone likeable but frustrating as he finds it hard to visualise the layout of things. Whatever I do to try and explain it isn't helping. It seems hard for him to not accidentally touch the screen and put his navigation out, if that makes sense, which means he often activates items he didn't mean to and sometimes moves the insertion point of text without meaning to. And he has bigger fingers than my little fairy-like fingers. Heh heh.
This is why I'm using the navigation finger plus activation finger method - it works better for me as the selection finger is always safely on the screen. Perhaps being a pianist is also helping here - I'm used to the left and right hands doing different things in tandem.
Inputting text is easier in landscape mode I find, and here I can rest the phone in my curled fingers as if I were holding a game controller. I use the left hand thumb to navigate and select the left keyboard keys, and the right thumb for activation - and the right thumb to navigate and select the keys on the right side of the keyboard, with the left thumb to activate. It takes a bit of getting used to but I am fairly flying along now.
I have yet to figure out how to copy and paste (although I have done it accidentally, lol) - VoiceOver changes all the standard gestures, and how to do this whilst using VO doesn't seem to be documented yet.
Disappointingly, Maps doesn't read the names of roads as you pass your finger over the screen. I thought it wouldn't, but I had hoped. But it will read your current locale, and directions can be gotten in a list which makes them easy to flick through. I have yet to go outside and see how the program actually works on the street, and how much voice feedback one will get.
Everything else built in I've tried so far works without a hitch.
Some of the third party apps I had on my old iPhone don't work with VO on the 3GS - Ocarina for one. I imagine this is because the iPhone doesn't do multitasking and VO uses the audio driver, meaning it isn't available for such things as Ocarina, Zephyr, and Stylophone. Skype is accessible, and one can receive chats and reply to them, but not initiate them. Fring and Skype seem not to be happy about voip calls - possibly because of the audio driver issue. So far they've all crashed on me when attempting to receive a call.
Twitterrific works great - it all seems to be accessible, though I have only just downloaded it so I am still finding my way around it.
As often is the case with Apple, the VO documentation for the iPhone is a bit sparse and not totally complete right now - better than for the original VoiceOver on Tiger though. That really was flying blind back then. I'm sure as time goes on, more and more people using the 3GS with VO will appear and we will share tips and tricks. Meantime I'm playing and learning, and waiting for a review to cover the bits I am having trouble with from Lioncourt.com - I gather Josh is going to write one soon.
I'm very pleased with the iPhone 3GS so far. It is very responsive and sleek. And the voice - a version of OSX's Vicky I imagine - is quite passable, rather than the robotesque Nokia voices I am used to, though not quite up to Alex on Leopard. It is very encouraging to see a mobile phone that is accessible out of the box - more importantly, one that doesn't look like a brick or is designed for a special need that is accessible out of the box. This is a mainstream phone, accessible to us blind buggers, without having to pay out a hundred and fifty notes on separate software on top of the price of the phone and contract. Perhaps other phone manufacturers will follow suit, though I am not going to hold my breath on it.
All in all, the iPhone 3GS is - in my humble opinion - well worth the money and the eighteen month contract I have just signed up for. It has surpassed my expectations already. Result.