Describes photos using audio for blind and visually-impaired users
|Demo Video||View on GitHub|
aiEyes is an open source app that helps blind and visually-impaired users see the world with the help of artificial intelligence. Built with the Ionic framework, the Azure Computer Vision API, and the Google Translate API, it can describe pictures to the user.
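At its core the app chains four steps: capture a photo, caption it with Azure Computer Vision, translate the caption into the user's language, and speak it aloud. A minimal sketch of that pipeline, with the four steps injected as functions so the flow itself is clear (all names here are illustrative, not the actual aiEyes source):

```typescript
// Each step is passed in as a function, standing in for the real plugins:
// Ionic Native Camera, Azure Computer Vision, Google Translate, and
// Ionic Native Text To Speech.
type TakePicture = () => Promise<string>;                       // base64 image
type DescribeImage = (image: string) => Promise<string>;        // caption
type Translate = (text: string, lang: string) => Promise<string>;
type Speak = (text: string) => Promise<void>;

async function describePhoto(
  takePicture: TakePicture,
  describeImage: DescribeImage,
  translate: Translate,
  speak: Speak,
  lang: string
): Promise<string> {
  const image = await takePicture();                 // 1. capture photo
  const caption = await describeImage(image);        // 2. caption it
  const localized = await translate(caption, lang);  // 3. translate caption
  await speak(localized);                            // 4. read it aloud
  return localized;
}
```

Injecting the steps also makes the flow easy to exercise with stubs, without a device camera or network access.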
Inspired by @adrielcafe's BeMyEyesXamarinApp.
| Technology | Description |
|---|---|
| Ionic framework | A beautiful, free, and open source mobile SDK for developing native and progressive web apps with ease. |
| Azure Computer Vision API | Extracts rich information from images to categorize and process visual data, with machine-assisted moderation of images to help curate your services. |
| Google Translate API | A free and unlimited API for Google Translate :dollar::no_entry_sign: |
| Server google-translate-api | A simple server using a free and unlimited API for Google Translate. |
| Ionic Native Camera | This plugin defines a global `navigator.camera` object, which provides an API for taking pictures and for choosing images from the system's image library. |
| Ionic Native File and File Transfer | These plugins allow you to upload and download files. |
| Ionic Native Text To Speech | Text-to-speech plugin. |
| Ionic Native Vibration | Vibrates the device. |
| Ionic Native Screen Orientation | Cordova plugin to set/lock the screen orientation in a common way. |
| Ionic Native Globalization | Obtains information and performs operations specific to the user's locale, language, and timezone. |
| Ionic Native Storage | Native storage of variables on Android and iOS. |
| NGX-Translate | An internationalization library for Angular 2+. It lets you define translations for your content in different languages and switch between them easily. |
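The Azure Computer Vision "analyze image" response returns candidate captions with confidence scores under `description.captions`. A small sketch of picking the most confident caption from that response (the response shape is taken from Azure's API, but this helper is an illustration, not the app's actual code):

```typescript
// Shape of the relevant part of an Azure Computer Vision analyze response.
interface AzureCaption { text: string; confidence: number; }
interface AzureAnalyzeResponse {
  description?: { captions?: AzureCaption[] };
}

// Returns the caption text Azure is most confident about, or null if the
// response contains no captions.
function pickBestCaption(res: AzureAnalyzeResponse): string | null {
  const captions = res.description?.captions ?? [];
  if (captions.length === 0) return null;
  return captions.reduce((a, b) => (b.confidence > a.confidence ? b : a)).text;
}
```

The chosen caption text is what would then be handed to the translation and text-to-speech steps.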