
# Watson Translator

The Watson Translator app leverages the IBM Watson Language Translation and Text To Speech services with the open source Tesseract OCR engine to create a sample interface that allows you to extract text from an image, translate that text to another language, and even generate spoken audio in another language from that translation. The app has also been configured to use Bluemix Mobile Client Access for operational analytics at runtime.

Screenshot from app

Note: This should not be confused with any of IBM Research's OCR (Optical Character Recognition) or Natural-Scene OCR work. The OCR in this example is based on the open source Tesseract project, which works best with dark text on a light background.


## Bluemix Services Used

  1. Mobile Client Access - Capture analytics and logs from mobile apps running on devices
  2. Watson Language Translation - Language translation service
  3. Watson Text To Speech - Audible language synthesis from text

## Setting Up The Bluemix Backend

  1. Create a Bluemix Account

    Sign up for Bluemix, or sign in using an existing account.

  2. From the Bluemix Dashboard, click on the "Create App" link, then select the "Mobile" template and walk through the process of creating your app infrastructure. Remember the app name; you will need it later.

  3. From the app dashboard, add the Watson Language Translation and Watson Text To Speech services.

    Screenshot from app

  4. Download and install the Cloud Foundry CLI tool. This will be used to deploy your Node.js back end.

  5. Clone the app to your local environment from your terminal using the following command:

```
git clone https://github.com/triceam/Watson-Translator.git
```
  6. Back in the command line terminal, cd into this newly created directory, then go into the /server directory.

  7. Connect to Bluemix in the command line tool and follow the prompts to log in.

```
$ cf api https://api.ng.bluemix.net
$ cf login
```
  8. Push it to Bluemix. This will automatically deploy the back end app and start the Node.js instance. Replace "AppName" with the name of your app on Bluemix.

```
$ cf push AppName
```
  9. Voila! You now have your very own API instance up and running on Bluemix. You can test your Node.js app deployment at https://yourapp.mybluemix.net (use your actual app route); a quick check from the command line is sketched below. Next we need to configure the mobile client application.
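Before moving on to the client, it can help to confirm the back end is reachable. The commands below are a minimal sketch, assuming the "AppName" and yourapp.mybluemix.net placeholders from the steps above; substitute your own app name and route.

```
# Check that the app is running on Bluemix and note its route (replace AppName)
$ cf app AppName

# Confirm the Node.js back end responds at that route
$ curl -I https://yourapp.mybluemix.net
```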

## Setting Up The Mobile App

The native iOS application requires Xcode 6.4 running on a Mac to compile and deploy to either the iOS Simulator or a development device. Xcode 6.4 is required to target Apple WatchOS 1.0.

  1. If you do not already have it, download and install CocoaPods.

  2. In a terminal window, cd into the /client directory (from your local project directory).

  3. Run the pod install command to set up the Xcode project and download all dependencies (a quick CocoaPods check is sketched after the command below).

```
$ pod install
```
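If pod install fails or the pod command is not found, the following is a minimal sketch of how to check for and install CocoaPods; it assumes a standard RubyGems setup.

```
# Check whether CocoaPods is installed and which version is available
$ pod --version

# Install (or update) CocoaPods via RubyGems if it is missing
$ sudo gem install cocoapods
```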
  4. This will create an Xcode workspace file. Open the generated .xcworkspace file in Xcode.

  5. Configure Facebook as the identity provider. This configuration requires settings both on the Mobile Client Access service and in the native code project (inside the "client" directory for this project). In the local project, be sure to follow the instructions for both sections: [configuring the Facebook SDK][facebook_sdk_config] and [configuring Facebook authentication][facebook_auth]. Be sure to make the appropriate changes inside of Info.plist.

  3. Open the "AppDelegate.m" file. Update the connection to Bluemix on line 24 to include your app's route and GUID.

```
[imfClient initializeWithBackendRoute:@"bluemix app route"
                           backendGUID:@"bluemix app guid"];
```

You can access the route and app GUID under "Mobile Options" on the app dashboard.


  1. Open the file "AudioManager.m" and update the urlString instance to use your application route on Bluemix.
```
NSString *urlString = [NSString stringWithFormat:@"https://appname.mybluemix.net/synthesize?text=%@&download=1&voice=%@&accept=audio/flac", phrase, voice ];
```
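You can also exercise the back end's /synthesize endpoint directly, independent of the app. This is a minimal sketch, assuming your own route; the voice value is only an example and may not match the one the app passes at runtime.

```
# Request spoken audio for a phrase from the deployed /synthesize endpoint
# (replace appname with your route; en-US_MichaelVoice is just an example value)
$ curl -o hello.flac "https://appname.mybluemix.net/synthesize?text=Hello%20world&download=1&voice=en-US_MichaelVoice&accept=audio/flac"
```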
  1. Open the file "TranslationManager.m" and update the request url on line 24 to use your application route on Bluemix.
```
IMFResourceRequest * imfRequest = [IMFResourceRequest requestWithPath:@"https://appname.mybluemix.net/translate" method:@"GET" parameters:params];
```
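The /translate endpoint can be checked the same way. Note that the query parameter names below (text, source, target) are assumptions for illustration only; this README does not show the params dictionary the app sends, so confirm the actual names against the Node.js code in the /server directory.

```
# Call the deployed /translate endpoint directly
# NOTE: text, source, and target are assumed parameter names;
# verify them against the /server code before relying on this
$ curl "https://appname.mybluemix.net/translate?text=Hello&source=en&target=es"
```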
  9. Now you are all set! Launch the app on a development device or in the iOS Simulator. Capture an image containing text, extract the text with OCR, translate it to another language, and play back the spoken audio generated from the translation.

Dependencies & 3rd Party Libraries

The following 3rd party libraries were used in the creation of this demo:

  1. GPUImage - GPU accelerated image processing
  2. OrigamiEngine - Audio playback for the FLAC file format
  3. Tesseract OCR - Optical Character Recognition

## Troubleshooting

To troubleshoot the server side of your Bluemix app, the most useful source of information is the logs. To see them, run:

```
$ cf logs <application-name> --recent
```
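If the recent logs are not enough, you can stream logs live while reproducing the problem and check the app's state; both are standard cf commands.

```
# Stream logs live while you exercise the app (Ctrl+C to stop)
$ cf logs <application-name>

# Show whether the app is running, along with instance and memory status
$ cf app <application-name>
```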
