This app tells you what it is looking at and vibrates based on how confident it is. To use it, wave the phone around you; as the vibration grows stronger, you are getting closer to an object the app recognizes.
The convolutional neural network inside the app recognizes 1,000 object classes from ImageNet. The app has been tested on Nexus 4 and Nexus 5 phones. Please report issues so I can note them here.
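The behavior described above can be sketched in miniature: take the classifier's top-1 probability over the 1,000 classes, vibrate in proportion to it, and speak the label only once confidence clears a threshold. This is a hedged illustration, not the app's actual code; `feedback`, `speak_threshold`, and the toy label list are all hypothetical.

```python
import math

# Hedged sketch (not the app's real code): mapping a classifier's
# top-1 confidence over its classes to confidence-thresholded feedback.

def softmax(scores):
    """Convert raw class scores to probabilities."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def feedback(scores, labels, speak_threshold=0.3):
    """Return (vibration_strength, spoken_label_or_None).

    Vibration strength scales with top-1 confidence; the label is
    spoken only when confidence clears the threshold, matching the
    description's confidence-thresholded behavior.
    """
    probs = softmax(scores)
    top = max(range(len(probs)), key=probs.__getitem__)
    confidence = probs[top]
    spoken = labels[top] if confidence >= speak_threshold else None
    return confidence, spoken

# Toy example with three hypothetical labels instead of 1,000:
labels = ["coffee mug", "toaster", "tabby cat"]
strength, spoken = feedback([2.0, 0.1, 0.3], labels)
print(round(strength, 2), spoken)  # strong vibration, label spoken
```

With uniform scores the confidence stays near 1/N, so nothing is spoken at a high threshold and the vibration stays weak, which matches the "getting warmer" experience the description promises.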
I'm releasing this tool for the blind for free because I hope it will help people live better lives.
Visit my homepage here: http://josephpcohen.com/
It's a good idea, but it's currently very inaccurate.
Literally tells you what it thinks it's looking at. Amazing software; expect it to be bought out by somebody.
Really cool of you, I hope your Kickstarter succeeds!
This is the coolest app ever, and I hope the things we look at are helping it somehow.
The UI is ugly, but the app is interesting.
The voice doesn't always work, even on a positive identification, but its accuracy is amazing.
Works 50% of the time but is really fun and I've had a lot of laughs.
Nexus 4 Android Marshmallow 6.0.1
It's not working
It's really hard work. You deserve 5 stars.
I hope this will help the blind.
Impressive app, keep it up.
Horrible and Inaccurate
Nice initiative, and I loved it.
Apparently I had a jellyfish, a blue whale and a rhinoceros beetle hiding in my room! Who knew?!!!
The camera is upside down on the Nexus 5X. The dev needs to detect the camera orientation and compensate, or switch to the Camera2 API when available.
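The fix this reviewer suggests amounts to rotating each preview frame by the sensor's reported orientation before classifying or displaying it. A minimal sketch, assuming frames are plain 2-D arrays; on Android the angle would come from `CameraCharacteristics.SENSOR_ORIENTATION` plus the display rotation, and `compensate`/`rotate90` are hypothetical names:

```python
# Hedged sketch (not the app's code): compensating for sensor
# orientation by rotating the captured frame in 90-degree steps.

def rotate90(frame):
    """Rotate a 2-D frame 90 degrees clockwise."""
    return [list(row) for row in zip(*frame[::-1])]

def compensate(frame, orientation_degrees):
    """Rotate the frame so a rotated sensor feed appears upright."""
    if orientation_degrees % 90 != 0:
        raise ValueError("orientation must be a multiple of 90 degrees")
    for _ in range((orientation_degrees // 90) % 4):
        frame = rotate90(frame)
    return frame

# An upside-down sensor (180 degrees) reported on some devices:
frame = [[1, 2],
         [3, 4]]
print(compensate(frame, 180))
```

Applying the 180-degree case flips the toy frame to `[[4, 3], [2, 1]]`, i.e. the image turned upright, which is exactly the symptom reported on the Nexus 5X.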
Tested it for my daughter, who is blind. It was describing objects incorrectly: my hardwood floor was a Pomeranian dog, toothpaste was a Band-Aid, the door handle was a blower, the lamp was a vase, and it didn't know the bed, the door, etc. It needs major improvement. Once that is done I will reinstall the app. But it did get the toilet, shower curtain, and clock.
Galaxy S6: BlindTool works for a few seconds, then loses connectivity with the camera, after which it no longer reconnects. Worse, no camera apps work at all until the phone is rebooted.
It needs some work, but fantastic work!
It seems that the app can't understand most things; you need to work harder on it. I see it might have a great future for use with Google Glass or other future devices. I also think it would be a good idea to use waves, like a remote control, to tell how close you are to objects.
Excellent start! Thank you also for making this available for free for the blind. I hope that the range of objects that can be reliably identified in daily living situations will broaden in due course: use the Teradeep neural network? It would be nice to have an option to turn vibration off, because it does not add anything to the already confidence-thresholded speech feedback.
However, my Jack Russell was a Rottweiler, my mum was a cradle wearing a cowboy hat, my hand was a wooden spoon, and my cat was skiperg (wrong spelling) on the last word. I love the idea and hope it improves greatly, as it would be very helpful.
A load of rubbish; not one description was right. We have no tripod or grand piano, and our hi-fi looks nothing like a microwave. Don't waste time downloading it!
It is very surprising how accurate this app is. I'm not sure about the technicalities, but there is some very impressive technology here. It still needs work, obviously, but I bet companies interested in advanced AI will probably buy this.
The application needs improvement recognizing vehicles, and it doesn't recognize objects indoors. It would be nice if the application would tell a blind person when it is safe to cross streets.
WOW... if you have trouble understanding what you see, this is your app. Without further words! Perfect for all people of the world!
Really nice work. But I think that instead of giving very general classes, you should focus on classes for everyday household things.
This app is really impressive, and it works surprisingly well. I'd be very interested in learning more about the algorithms behind it.
Does not work at all on a Samsung S4. It could not recognize a coffee cup.
A nice app using MXNet deep learning. It is not based on Google's TensorFlow but on DMLC's great MXNet. MXNet also has a nice API that lets anyone create their own deep learning app; just search for MXNet on Google.
Excellent start! Thank you also for making this available for free for the blind. I hope the range of objects that can be reliably identified will broaden in due course, but I have informed the users of my own app (The vOICe for Android sensory substitution for the blind app) about the availability of BlindTool. The earlier crashing upon device rotation is now fixed. Keep up the good work!
I love the concept, and I'm excited to see where you take this project!
Cool concept. I wonder which network you are using; the one that comes with TensorFlow?
This app is incredibly accurate. I can't believe any old phone can analyze pictures like that. I'd say 1/3 of realistically identifiable things are correct, and that's amazing. Stability issues are prevalent, but that's minor compared to its overall functionality. I think it could also benefit from some basic options, like turning the speaking on/off. Edit: my phone is a OnePlus Two on CM12.1.
It still needs some work, but it's a great idea. Looking forward to future updates. Also, it's not speaking on my HTC M8.
A fun app to experiment with. As others, looking forward to where this type of technology takes us in the future.
Nice idea, I must say!! But unfortunately it thought my headphones were a dumbbell, my wallet was a punching bag, and my hand was a matchstick...
I wasn't able to name one thing in a more complex room. It does speak. Very interesting. HTC M8.