Showing posts with label iphone. Show all posts

Monday, December 07, 2009

Google Goggles: Why Didn’t I Think of That?

So, Google Goggles is now in beta in Google Labs and available on Android phones.
Humorous name aside, the product looks to be a huge leap forward in the field of visual search — by which I mean, you point a camera at something and Google figures out what it is.
Here’s a little video explanation.



As the TechCrunch article mentions, it’s somewhat similar to ShopSavvy. I’ve used the ShopSavvy demo video in my last couple of presentations, replacing another video I had been using of an iPhone app called Bionic Eye. That made me think of an earlier post of mine where I said:
This is a nice little app for what it does, but imagine what it’s going to evolve into: a portable heads-up display for everything. Yes, right now it lists restaurants, subway stations (in certain cities), and wifi hotspots, but it’s not that hard to extrapolate a few years into the future where this app – or something like it – connects you to all the available information about whatever you’re looking at.

It doesn’t really matter whether it’s on an iPhone-type device, or whether it’s mounted on your eyeglasses, it’s going to be with you effectively 24/7/365 (only “effectively” because you can still choose to turn it off), have 99% uptime, and is going to get better every hour of every day as more information is added to it. Practically every urban location will be geotagged and infotagged (think Google Street View on steroids), extending further and further beyond urban areas with each passing year. In fact, I imagine the app will evolve into a two-way app, with users adding to the database as they go about their daily routines, constantly adding more locations and more data to the database.

Perhaps a few more years down the road artificial intelligence object-recognition software will be embedded, maybe even with some simple sensors to analyze the material it’s looking at, so that the app will be able to peer into just about any object and return information about its chemical composition, various useful facts about it, and ways the object can be used.
Huh. Maybe I shouldn't have changed my major.

Seriously, though, the truth is ending up stranger than fiction...

Sunday, September 27, 2009

We Have the Technology

When I was growing up I liked watching The Six Million Dollar Man on television. Looking back, it was a pretty hokey show, but I really liked it at the time. In the opening for the show, there’s a line that says, “We have the technology.” I thought of that – for pretty obvious reasons if you’re familiar with the show – when viewing the video for the Bionic Eye iPhone application.



This is a nice little app for what it does, but imagine what it’s going to evolve into: a portable heads-up display for everything. Yes, right now it lists restaurants, subway stations (in certain cities), and wifi hotspots, but it’s not that hard to extrapolate a few years into the future where this app – or something like it – connects you to all the available information about whatever you’re looking at.

It doesn’t really matter whether it’s on an iPhone-type device, or whether it’s mounted on your eyeglasses, it’s going to be with you effectively 24/7/365 (only “effectively” because you can still choose to turn it off), have 99% uptime, and is going to get better every hour of every day as more information is added to it. Practically every urban location will be geotagged and infotagged (think Google Street View on steroids), extending further and further beyond urban areas with each passing year. In fact, I imagine the app will evolve into a two-way app, with users adding to the database as they go about their daily routines, constantly adding more locations and more data to the database.

Perhaps a few more years down the road artificial intelligence object-recognition software will be embedded, maybe even with some simple sensors to analyze the material it’s looking at, so that the app will be able to peer into just about any object and return information about its chemical composition, various useful facts about it, and ways the object can be used.

I know that scenario is frightening to a lot of folks, and certainly there are more and more privacy and ethical issues we are going to have to figure out as a society. But, for the moment, let’s focus on the incredibly positive side of this – what kind of learning apps can be built on this platform? What will we be able to do as teachers and students that we can barely even conceive of today, but will be commonplace in the very near future? What happens when the sum total of the world’s knowledge – updated in real time – is available in a portable heads-up display?

Just imagine the possibilities. How many years is it going to be before we see something of this sophistication? I don’t know. My guess is more than three and less than thirty. So you’ve got to ask the question, does your school/district want to be ahead of the curve in figuring out best practices, or behind it?