# Watson Translator

The Watson Translator app combines the IBM Watson Language Translation and Text to Speech services with the open source Tesseract OCR engine in a sample interface that lets you extract text from an image, translate that text to another language, and even generate spoken audio from the translation. The app is also configured to use Bluemix Mobile Client Access for operational analytics at runtime.
Note: This should not be confused with any of IBM Research's OCR (Optical Character Recognition) or Natural-Scene OCR work. The OCR in this example is based on the open source Tesseract project, which works best with dark text on a light background.
- Mobile Client Access - Capture analytics and logs from mobile apps running on devices
- Watson Language Translation - Language translation service
- Watson Text To Speech - Audible language synthesis from text
- Create a Bluemix account. Sign up for Bluemix, or sign in using an existing account.
- From the Bluemix Dashboard, click the "Create App" link, select the "Mobile" template, and walk through the process of creating your app infrastructure. Remember the app name; you are going to need it later.
- From the app dashboard, add the Watson Language Translation and [Watson Text To Speech][stt] services.
- Download and install the Cloud Foundry CLI tool. This will be used to deploy your Node.js back end.
- Clone the app to your local environment from your terminal using the following command:
git clone https://github.com/triceam/Watson-Translator.git
- Back in the command line terminal, cd into the newly created directory, then go into the /server directory.
- Connect to Bluemix with the command line tool and follow the prompts to log in.
$ cf api https://api.ng.bluemix.net
$ cf login
- Push it to Bluemix. This will automatically deploy the back end app and start the Node.js instance. Replace "AppName" with the name of your app on Bluemix.
$ cf push AppName
- Voila! You now have your very own API instance up and running on Bluemix. Next we need to configure the Mobile Client application. You can test your Node.js app deployment at https://yourapp.mybluemix.net (use your actual app route).
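The cf push step above reads its deployment settings from a manifest.yml in the /server directory. If you need to change the app name or memory allocation, a minimal manifest looks something like this (the values shown are assumptions; check the file in /server for the real ones):

```yaml
# Hypothetical manifest.yml sketch -- not the project's actual file.
applications:
- name: AppName          # must match your app name on Bluemix
  memory: 256M
  command: node app.js   # assumed entry point for the Node.js back end
```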
The native iOS application requires Xcode 6.4 running on a Mac to compile and deploy on either the iOS Simulator or on a development device. Xcode 6.4 is required to target Apple WatchOS 1.0.
- If you do not already have it, download and install CocoaPods.
- In a terminal window, cd into the /client directory (from your local project directory).
- Run the pod install command to set up the Xcode project and download all dependencies.
$ pod install
- This will create an Xcode workspace file. Open the generated .xcworkspace file in Xcode.
- Configure Facebook as the identity provider. This configuration requires settings both on the Mobile Client Access service and in the native code project (inside the "client" directory for this project). In the local project, be sure to follow the instructions for both sections: [configuring the Facebook SDK][facebook_sdk_config] and [configuring Facebook authentication][facebook_auth]. Be sure to make the appropriate changes inside Info.plist.
- Open the "AppDelegate.m" file. Update the connection to Bluemix on line 24 to include your app's route and GUID.
[imfClient initializeWithBackendRoute:@"bluemix app route"
backendGUID:@"bluemix app guid"];
You can access the route and app GUID under "Mobile Options" on the app dashboard.
- Open the file "AudioManager.m" and update the urlString value to use your application route on Bluemix.
NSString *urlString = [NSString stringWithFormat:@"https://appname.mybluemix.net/synthesize?text=%@&download=1&voice=%@&accept=audio/flac", phrase, voice ];
- Open the file "TranslationManager.m" and update the request url on line 24 to use your application route on Bluemix.
IMFResourceRequest * imfRequest = [IMFResourceRequest requestWithPath:@"https://appname.mybluemix.net/translate" method:@"GET" parameters:params];
- Now you are all set! Launch the app in the iOS Simulator or on a development device. Capture an image containing text, extract the text with the built-in OCR, translate it to another language, and play back the synthesized audio of the translation.
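The synthesize URL that AudioManager.m builds above can also be reproduced outside the app for a quick sanity check of the back end. This small Node.js sketch (synthesizeURL is a hypothetical helper, not part of the project) mirrors the client's format string and percent-encodes the phrase:

```javascript
// Hypothetical helper mirroring the URL construction in AudioManager.m.
// encodeURIComponent guards against spaces and non-ASCII text in the phrase.
function synthesizeURL(route, phrase, voice) {
  return route + '/synthesize' +
         '?text=' + encodeURIComponent(phrase) +
         '&download=1' +
         '&voice=' + encodeURIComponent(voice) +
         '&accept=audio/flac';
}

console.log(synthesizeURL('https://appname.mybluemix.net',
                          'Hello world', 'en-US_MichaelVoice'));
```

Paste the printed URL (with your real app route) into a browser to confirm the back end responds with FLAC audio.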
The following 3rd party libraries were used in the creation of this demo:
- GPUImage - GPU accelerated image processing
- OrigamiEngine - Audio playback for the FLAC file format
- Tesseract OCR - Optical Character Recognition
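These libraries are pulled in by CocoaPods during the pod install step. The project's Podfile likely declares them along these lines (a sketch with assumed pod names and platform version, not the repository's exact file):

```ruby
# Hypothetical Podfile sketch -- the Podfile in /client is authoritative.
platform :ios, '8.0'

pod 'GPUImage'          # GPU accelerated image processing
pod 'OrigamiEngine'     # audio playback for the FLAC file format
pod 'TesseractOCRiOS'   # Tesseract OCR engine for iOS
```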
To troubleshoot the server side of your Bluemix app, the most useful source of information is the logs. To see them, run:
$ cf logs <application-name> --recent