
dialogflow-android-client's Introduction

DEPRECATED Android SDK for api.ai

Deprecated
This Dialogflow client library and Dialogflow API V1 have been deprecated and will be shut down on October 23rd, 2019. Please migrate to Dialogflow API V2 and the V2 client library.


The API.AI Android SDK makes it easy to integrate speech recognition with the API.AI natural language processing API on Android devices. API.AI lets you use voice commands and integrate with dialog scenarios defined for a particular agent in API.AI.

Two permissions are required to use the API.AI Android SDK (declare them in your AndroidManifest.xml as shown below):

  • android.permission.INTERNET for internet access
  • android.permission.RECORD_AUDIO for microphone access
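
In the manifest these are the same two lines used later in the tutorial:

    <uses-permission android:name="android.permission.INTERNET"/>
    <uses-permission android:name="android.permission.RECORD_AUDIO" />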

Add these dependencies to your project to use the SDK:

compile 'ai.api:sdk:2.0.7@aar'
// api.ai SDK dependencies
compile 'com.android.support:appcompat-v7:23.2.1'

Currently, speech recognition is performed using Google's Android SDK, either on the client device or in the cloud. Recognized text is passed to API.AI through HTTP requests. You can also try the Speaktoit recognition engine (use AIConfiguration.RecognitionEngine.Speaktoit).
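
The recognition engine is selected when the AIConfiguration is created. A minimal sketch of both options (CLIENT_ACCESS_TOKEN is a placeholder for your agent's client access token):

```java
// Google's recognizer (the default "System" engine)
final AIConfiguration googleConfig = new AIConfiguration("CLIENT_ACCESS_TOKEN",
        AIConfiguration.SupportedLanguages.English,
        AIConfiguration.RecognitionEngine.System);

// Speaktoit recognition engine
final AIConfiguration speaktoitConfig = new AIConfiguration("CLIENT_ACCESS_TOKEN",
        AIConfiguration.SupportedLanguages.English,
        AIConfiguration.RecognitionEngine.Speaktoit);
```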

Authentication is accomplished by setting the client access token when initializing an AIConfiguration object. The client access token specifies which agent will be used for natural language processing.

Note: The API.AI Android SDK only makes query requests, and cannot be used to manage entities and intents. Instead, use the API.AI user interface or REST API to create, retrieve, update, and delete entities and intents.

The API.AI Android SDK comes with a simple sample that illustrates how voice commands can be integrated with API.AI. Use the following steps to run the sample code:

  1. Have an API.AI agent created that has entities and intents. See the API.AI documentation on how to do this.
  2. Open Android Studio.
  3. Import the api-ai-android-master directory.
  4. Open the SDK Manager and be sure that you have installed Android Build Tools 19.1.
  5. In the Project browser, open apiAISampleApp/src/main/java/ai.api.sample/Config.
  6. Towards the top of the file, you will see a declaration of a static final string called ACCESS_TOKEN. Set its value to be the client access token of your agent.
  7. Attach an Android device, or have the emulator set up with an emulated device.
  8. From the Run menu, choose Debug (or click the Debug symbol). Choose your device.
  9. You should see an app running with three buttons: Listen, StopListen, and Cancel.
  10. Click Listen and say a phrase that will be understood by your agent. Wait a few seconds. The JSON returned by the API.AI service will appear.

This section describes what you need to do to get started with your own app that uses the API.AI Android SDK. The first part provides an overview of how to use the SDK, and the second part is a tutorial with detailed step-by-step instructions for creating your own app.

If you are an experienced developer, you can use the brief integration instructions.

Overview

To implement speech recognition and natural language processing features in your app, you must first add the API.AI SDK library to your project. There are two ways to accomplish this. The first way is recommended:

  1. Add a dependency to your build.gradle file by adding the following line. (In the sample app, apiAISampleApp/build.gradle shows how to do this.)

    compile 'ai.api:sdk:2.0.7@aar'
    
  2. (Not recommended) Download the library source code from GitHub and attach it to your project.

Now you can use API.AI service features in your app with either the integrated speech recognition or your own speech recognition.

Using integrated speech recognition

Once you've added the SDK library, follow these steps:

  1. Add two permissions into the AndroidManifest:

    • android.permission.INTERNET
    • android.permission.RECORD_AUDIO
  2. Create a class that implements the AIListener interface. This class will process responses from API.AI. (Note: AIRequest and AIResponse are not part of "ai.api:sdk:2.0.7@aar"; they are part of "ai.api:libai:1.6.12". If you have not added it yet, add

    compile 'ai.api:libai:1.6.12'

to your app-level Gradle file.)

```java
public interface AIListener {
    void onResult(AIResponse result); // here process response
    void onError(AIError error); // here process error
    void onAudioLevel(float level); // callback for sound level visualization
    void onListeningStarted(); // indicate start listening here
    void onListeningCanceled(); // indicate stop listening here
    void onListeningFinished(); // indicate stop listening here
}
```
  3. Create an instance of AIConfiguration, specifying the access token, locale, and recognition engine.

    final AIConfiguration config = new AIConfiguration("CLIENT_ACCESS_TOKEN",
                AIConfiguration.SupportedLanguages.English,
                AIConfiguration.RecognitionEngine.System);
  4. Use the AIConfiguration object to get a reference to the AIService, which will make the query requests.

    AIService aiService = AIService.getService(context, config);
  5. Set the AIListener instance for the AIService instance.

    aiService.setListener(yourAiListenerInstance);
  6. Launch listening from the microphone via the startListening method. The SDK will start listening for the microphone input of the mobile device.

    aiService.startListening();
  7. To stop listening and start the request to the API.AI service using the current recognition results, call the stopListening method of the AIService class.

    aiService.stopListening();
  8. To cancel the listening process without sending a request to the API.AI service, call the cancel method of the AIService class.

    aiService.cancel();
  9. If there are no errors, you can get the result using the AIResponse.getResult method. From there, you can obtain the action and parameters (a combined sketch follows this list).

    public void onResult(final AIResponse response) {
        final Result result = response.getResult();
        Log.i(TAG, "Action: " + result.getAction());
        // process the rest of the response object here
    }
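
Putting these steps together, here is a minimal sketch of an Activity that uses the integrated speech recognition. Class, layout, and method names are illustrative, and CLIENT_ACCESS_TOKEN is a placeholder for your agent's client access token:

```java
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.view.View;

import ai.api.AIListener;
import ai.api.android.AIConfiguration;
import ai.api.android.AIService;
import ai.api.model.AIError;
import ai.api.model.AIResponse;

public class VoiceActivity extends AppCompatActivity implements AIListener {

    private AIService aiService;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_voice); // hypothetical layout

        // Step 3: configuration with your agent's client access token.
        final AIConfiguration config = new AIConfiguration("CLIENT_ACCESS_TOKEN",
                AIConfiguration.SupportedLanguages.English,
                AIConfiguration.RecognitionEngine.System);

        // Steps 4-5: obtain the service and register this class as the listener.
        aiService = AIService.getService(this, config);
        aiService.setListener(this);
    }

    // Step 6: start listening, e.g. wired to a button's onClick in the layout.
    public void listenButtonOnClick(final View view) {
        aiService.startListening();
    }

    // Step 9: process the result.
    @Override
    public void onResult(final AIResponse response) {
        // use response.getResult() to obtain the action and parameters
    }

    @Override
    public void onError(final AIError error) {
        // handle the error
    }

    @Override public void onAudioLevel(final float level) {}
    @Override public void onListeningStarted() {}
    @Override public void onListeningCanceled() {}
    @Override public void onListeningFinished() {}
}
```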

Using your own speech recognition

This section assumes that you have performed your own speech recognition and that you have text that you want to process as natural language. Once you've added the SDK library, follow these steps:

  1. Add this permission into the AndroidManifest:

    • android.permission.INTERNET
  2. Create an instance of AIConfiguration, specifying the access token, locale, and recognition engine. You can specify any recognition engine, since that value will not be used.

  3. Create an AIDataService instance using the configuration object.

  4. Create an empty AIRequest instance. Set the request text using the setQuery method.

  5. Send the request to the API.AI service using the method aiDataService.request(aiRequest).

  6. Process the response.

The following example code sends a query with the text "Hello". First, it initializes the aiDataService and aiRequest instances:

final AIConfiguration config = new AIConfiguration(ACCESS_TOKEN, 
    AIConfiguration.SupportedLanguages.English, 
    AIConfiguration.RecognitionEngine.System);

final AIDataService aiDataService = new AIDataService(config);

final AIRequest aiRequest = new AIRequest();
aiRequest.setQuery("Hello");

Then it calls the aiDataService.request method. Note that you must call aiDataService.request from a background thread, for example by using the AsyncTask class.

new AsyncTask<AIRequest, Void, AIResponse>() {
    @Override
    protected AIResponse doInBackground(AIRequest... requests) {
        final AIRequest request = requests[0];
        try {
            final AIResponse response = aiDataService.request(aiRequest);
            return response;
        } catch (AIServiceException e) {
            e.printStackTrace(); // handle or log the service error
        }
        return null;
    }
    @Override
    protected void onPostExecute(AIResponse aiResponse) {
        if (aiResponse != null) {
            // process aiResponse here
        }
    }
}.execute(aiRequest);

Getting results

After implementing the AIListener interface, you can get the response from api.ai inside your listener like this:

public void onResult(final AIResponse response) {
   // Use the response object to get all the results
}

Here is how to get different parts of the result object:

  • Get the status

    final Status status = response.getStatus();
    Log.i(TAG, "Status code: " + status.getCode());
    Log.i(TAG, "Status type: " + status.getErrorType());
  • Get resolved query

    final Result result = response.getResult();
    Log.i(TAG, "Resolved query: " + result.getResolvedQuery());
  • Get action

    final Result result = response.getResult();
    Log.i(TAG, "Action: " + result.getAction());
  • Get speech

    final Result result = response.getResult();
    final String speech = result.getFulfillment().getSpeech();
    Log.i(TAG, "Speech: " + speech);
  • Get metadata

    final Result result = response.getResult();
    final Metadata metadata = result.getMetadata();
    if (metadata != null) {
      Log.i(TAG, "Intent id: " + metadata.getIntentId());
      Log.i(TAG, "Intent name: " + metadata.getIntentName());
    }
  • Get parameters

    final Result result = response.getResult();
    final HashMap<String, JsonElement> params = result.getParameters();
    if (params != null && !params.isEmpty()) {
      Log.i(TAG, "Parameters: ");
      for (final Map.Entry<String, JsonElement> entry : params.entrySet()) {
          Log.i(TAG, String.format("%s: %s", entry.getKey(), entry.getValue().toString()));
      }
    }

This section contains a detailed tutorial about creating a new app and connecting it to API.AI.

Create a new app

Follow these steps to set up your environment and create a new Android app with API.AI integration:

  1. Create an API.AI agent with entities and intents, or use one that you've already created. See the API.AI documentation for instructions on how to do this.
  2. Open Android Studio. (Download it if you don't have it.)
  3. From the start screen (or the File menu), choose New Project....
    New Project
  4. In the New Project dialog, fill Application name and Company Domain, then click Next.
    New project dialog
  5. Choose the minimum SDK for the project; the minimum supported by the API.AI SDK is API level 9 (Gingerbread). Click Next.
    Min SDK
  6. Select Blank Activity and click Next.
  7. Enter the main activity name and click Finish.

Integrate with the SDK

Next you will integrate with the SDK to be able to make calls. Follow these steps:

  1. Open AndroidManifest.xml under app/src/main.

  2. Just above the <application> tag, add these lines to give the app permission to access the internet and the microphone:

    <uses-permission android:name="android.permission.INTERNET"/>
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
  3. Save AndroidManifest.xml.

  4. Next, you need to add a new dependency for the API.AI library. Right click on your module name (it should be app) in the Project Navigator and select Open Module Settings. Click on the Dependencies tab. Click on the + sign on the bottom left side and select Library dependency.
    Add dependency

  5. In the dialog that opens, search for ai.api, choose the latest ai.api:sdk item (2.0.7 at the time of writing), and append @aar to the end of the library name (see image), then click OK.
    Add dependency

    • You also need to add the dependencies of the SDK library: com.android.support:appcompat-v7, com.google.code.gson:gson, and commons-io:commons-io. Add them in the same way (or see the equivalent build.gradle snippet after this list).
  6. Open MainActivity.java under app/src/main/java/com.example.yourAppName.app, or whatever your package name is.

  7. Expand the import section and add the following lines to import the necessary API.AI classes:

    import ai.api.AIListener;
    import ai.api.android.AIConfiguration;
    import ai.api.android.AIService;
    import ai.api.model.AIError;
    import ai.api.model.AIResponse;
    import ai.api.model.Result;
    import com.google.gson.JsonElement;
    import java.util.Map;
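
If you prefer to edit build.gradle directly instead of using the Module Settings dialog from steps 4 and 5, the equivalent dependency entries look roughly like this (the versions shown are the ones referenced elsewhere in this document; use whatever versions Android Studio resolves for you):

    dependencies {
        compile 'ai.api:sdk:2.0.7@aar'
        // SDK library dependencies
        compile 'com.android.support:appcompat-v7:23.2.1'
        compile 'com.google.code.gson:gson:2.3.1'
        compile 'commons-io:commons-io:2.4'
    }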

Create the user interface

  1. Open activity_main.xml under app/src/main/res/layout. This will open the layout in the designer.
    activity_main.xml in Designer

  2. Select and delete the "Hello World" TextView.

  3. Drag a Button (under Widgets) to the top of the screen. Change the id property to "listenButton" and the text property to "Listen".
    Listen button

  4. Drag a Plain TextView (under Widgets) under the button. Expand it so that it covers the rest of the bottom of the screen. Change the id property to "resultTextView" and the text property to an empty string.
    Result TextView

  5. Now return to the MainActivity.java file. Add three import statements to access our widgets:

    import android.view.View;
    import android.widget.Button;
    import android.widget.TextView;
  6. Create two private members in MainActivity for the widgets:

    private Button listenButton;
    private TextView resultTextView;
  7. At the end of the onCreate method, add these lines to initialize the widgets:

    listenButton = (Button) findViewById(R.id.listenButton);
    resultTextView = (TextView) findViewById(R.id.resultTextView);

Create the AI Service and Listener

  1. Use the MainActivity as the class that will be called when events occur by having it implement the AIListener interface. Replace the class declaration with this:

    public class MainActivity extends ActionBarActivity implements AIListener {
    
  2. In the MainActivity class, create a private member for the AIService class named aiService.

    private AIService aiService;
  3. In the onCreate method, add the following lines to set up the configuration to use system speech recognition. Replace CLIENT_ACCESS_TOKEN with your client access token.

     final AIConfiguration config = new AIConfiguration("CLIENT_ACCESS_TOKEN",
            AIConfiguration.SupportedLanguages.English,
            AIConfiguration.RecognitionEngine.System);

    Api keys

  4. Below this line, initialize the AI service and add this instance as the listener to handle events.

    aiService = AIService.getService(this, config);
    aiService.setListener(this);
  5. Add a method to start listening on the button click:

    public void listenButtonOnClick(final View view) {
        aiService.startListening();
    }
  6. Return to activity_main.xml and click on the Listen button. In the properties pane, set the onClick property to listenButtonOnClick.

  7. Add the following method to show the results when the listening is complete:

    public void onResult(final AIResponse response) {
        Result result = response.getResult();
    
        // Get parameters
        String parameterString = "";
        if (result.getParameters() != null && !result.getParameters().isEmpty()) {
            for (final Map.Entry<String, JsonElement> entry : result.getParameters().entrySet()) {
                parameterString += "(" + entry.getKey() + ", " + entry.getValue() + ") ";
            }
        }
    
        // Show results in TextView.
        resultTextView.setText("Query:" + result.getResolvedQuery() +
            "\nAction: " + result.getAction() +
            "\nParameters: " + parameterString);
    }
  8. Add the following method to handle errors:

    @Override
    public void onError(final AIError error) {
        resultTextView.setText(error.toString());
    }
  9. Add the following empty methods to implement the AIListener interface:

    @Override
    public void onListeningStarted() {}
    
    @Override
    public void onListeningCanceled() {}
    
    @Override
    public void onListeningFinished() {}
    
    @Override
    public void onAudioLevel(final float level) {}

Run the App

  1. Attach an Android device to your computer or have a virtual device ready.
  2. Make sure that your module is selected in the dropdown, and then click the Debug button.
    Debug button
  3. The app should now be running on your device or virtual device. Click the Listen button and then speak a phrase that will work with your intent. Wait a few seconds. The result should appear in the result TextView.
    Result

Feature examples

User specified contexts

To specify additional contexts in the query, you can use the RequestExtras object.

First, create a list of the contexts you need:

List<AIContext> contexts = new ArrayList<>();
contexts.add(new AIContext("firstContext"));
contexts.add(new AIContext("secondContext"));

Then create a RequestExtras instance and use it for the request:

RequestExtras requestExtras = new RequestExtras(contexts, null);
aiService.startListening(requestExtras);

User specified entities

To specify user entities in the query, you can use the RequestExtras object.

First, create a list of the entities you need:

final Entity myDwarfs = new Entity("dwarfs");
myDwarfs.addEntry(new EntityEntry("Ori", new String[] {"Ori", "Nori"}));
myDwarfs.addEntry(new EntityEntry("Bifur", new String[] {"Bofur","Bombur"}));
final List<Entity> entities = Collections.singletonList(myDwarfs);

Then create a RequestExtras instance and use it for the request:

RequestExtras requestExtras = new RequestExtras(null, entities);
aiService.startListening(requestExtras);

You can also upload user entities with a separate method:

aiService.uploadUserEntities(entities);

Bluetooth support

Do these steps to make the SDK work with Bluetooth devices:

  1. Create an implementation of the BluetoothController class near your Application class:

    private class BluetoothControllerImpl extends BluetoothController {
    
        public BluetoothControllerImpl(Context context) {
            super(context);
        }
    
        @Override
        public void onHeadsetDisconnected() {
            Log.d(TAG, "Bluetooth headset disconnected");
        }
    
        @Override
        public void onHeadsetConnected() {
            Log.d(TAG, "Bluetooth headset connected");
    
            if (isInForeground() && !bluetoothController.isOnHeadsetSco()) {
                bluetoothController.start();
            }
        }
    
        @Override
        public void onScoAudioDisconnected() {
            Log.d(TAG, "Bluetooth sco audio finished");
            bluetoothController.stop();
    
            if (isInForeground()) {
                bluetoothController.start();
            }
        }
    
        @Override
        public void onScoAudioConnected() {
            Log.d(TAG, "Bluetooth sco audio started");
        }
    
    }
  2. Add an integer field to your Application class to count activities, and a BluetoothControllerImpl field for Bluetooth management:

    private int activitiesCount;
    private BluetoothControllerImpl bluetoothController;
  3. Add these helper methods to your Application class:

    protected void onActivityResume() {
        if (activitiesCount++ == 0) { // on become foreground
            bluetoothController.start();
        }
    }
    
    protected void onActivityPaused() {
        if (--activitiesCount == 0) { // on become background
            bluetoothController.stop();
        }
    }
    
    private boolean isInForeground() {
        return activitiesCount > 0;
    }
  4. You need to call these methods from onPause and onResume of every Activity; this can be done with a base class for all your activities:

    public class BaseActivity extends ActionBarActivity {
    
        private AIApplication app;
    
        private static final long PAUSE_CALLBACK_DELAY = 500;
    
        private final Handler handler = new Handler();
        private Runnable pauseCallback = new Runnable() {
            @Override
            public void run() {
                app.onActivityPaused();
            }
        };
    
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            app = (AIApplication) getApplication();
        }
    
        @Override
        protected void onResume() {
            super.onResume();
            app.onActivityResume();
        }
    
        @Override
        protected void onPause() {
            super.onPause();
            handler.postDelayed(pauseCallback, PAUSE_CALLBACK_DELAY);
        }
    }

A complete example can be found in the Sample Application.

  • If you get an "INSTALL_FAILED_OLDER_SDK" error when trying to install the app, check that you have Android SDK 19 and Build Tools 19.1 installed.

How to make contributions?

Please read and follow the steps in the CONTRIBUTING.md.

License

See LICENSE.

Terms

Your use of this sample is subject to, and by using or downloading the sample files you agree to comply with, the Google APIs Terms of Service.

This is not an official Google product.

dialogflow-android-client's People

Contributors

a23sokolov, anuragsidana, artemgoncharuk, enlighter, istima, pgruenbaum, tahnik, thegeekybaniya, xvir


dialogflow-android-client's Issues

Dependencies out of Date

compile 'com.android.support:appcompat-v7:23.2.1'
compile 'com.google.code.gson:gson:2.3.1'
compile 'commons-io:commons-io:2.4'

These are all out of date. Here are the most recent ones.
compile 'com.android.support:appcompat-v7:25.3.0'
compile 'com.google.code.gson:gson:2.8.0'
compile 'commons-io:commons-io:2.5'

App crashes on startup with NoClassDefFound error

Hi everyone!
I'm currently trying to implement API.AI by following the provided tutorial. When I try to get the app to run on the emulator, it crashes on startup with the following error.
screen shot 2017-06-22 at 3 44 11 pm

I noticed the error has to do with something on line 35 of my code, but I'm simply following the tutorial and I'm not sure what the issue is (the token was erased from the code below for confidentiality).
screen shot 2017-06-22 at 3 46 34 pm

I also tried to use the provided sample app but the same error occurred.
I would really appreciate it if someone could provide me with some insights!
Thank you so much in advance!

ai.api.AIServiceException: Can't connect to the api.ai service.

Hi, I am facing an issue:
When I use speech recognition APIs like Bing Speech, Nuance, or IBM, it works fine.
But when I use the Google Cloud Speech API (the GitHub demo URL is https://github.com/GoogleCloudPlatform/android-docs-samples/tree/master/speech/Speech), which uses gRPC, the issue in the title happens.
The log is below:

ai.api.AIServiceException: Can't connect to the api.ai service.
    at ai.api.AIDataService.doTextRequest(AIDataService.java:389)
    at ai.api.AIDataService.request(AIDataService.java:147)
    at ai.api.AIDataService.request(AIDataService.java:117)
    at com.amyrobotics.amya_one.main.HardWakeMutualVoiceService$6.doInBackground(HardWakeMutualVoiceService.java:511)
    at com.amyrobotics.amya_one.main.HardWakeMutualVoiceService$6.doInBackground(HardWakeMutualVoiceService.java:506)
    at android.os.AsyncTask$2.call(AsyncTask.java:292)
    at java.util.concurrent.FutureTask.run(FutureTask.java:237)
    at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
    at java.lang.Thread.run(Thread.java:818)
Caused by: javax.net.ssl.SSLHandshakeException: Handshake failed
    at com.android.org.conscrypt.OpenSSLSocketImpl.startHandshake(OpenSSLSocketImpl.java:390)
    at com.android.okhttp.Connection.upgradeToTls(Connection.java:201)
    at com.android.okhttp.Connection.connect(Connection.java:155)
    at com.android.okhttp.internal.http.HttpEngine.connect(HttpEngine.java:276)
    at com.android.okhttp.internal.http.HttpEngine.sendRequest(HttpEngine.java:211)
    at com.android.okhttp.internal.http.HttpURLConnectionImpl.execute(HttpURLConnectionImpl.java:382)
    at com.android.okhttp.internal.http.HttpURLConnectionImpl.connect(HttpURLConnectionImpl.java:106)
    at com.android.okhttp.internal.http.DelegatingHttpsURLConnection.connect(DelegatingHttpsURLConnection.java:89)
    at com.android.okhttp.internal.http.HttpsURLConnectionImpl.connect(HttpsURLConnectionImpl.java:25)
    at ai.api.AIDataService.doTextRequest(AIDataService.java:368)
    ... 10 more
Caused by: javax.net.ssl.SSLProtocolException: SSL handshake aborted: ssl=0xaf458600: Failure in SSL library, usually a protocol error
    error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure (external/openssl/ssl/s23_clnt.c:770 0xac139001:0x00000000)
    at com.android.org.conscrypt.NativeCrypto.SSL_do_handshake(Native Method)
    at com.android.org.conscrypt.OpenSSLSocketImpl.startHandshake(OpenSSLSocketImpl.java:318)
    ... 19 more
I/System.out: error message is Can't connect to the api.ai service.

The error occurs in the code below:

final AIRequest aiRequest = new AIRequest(text);
new AsyncTask<AIRequest, Void, AIResponse>() {
    @Override
    protected AIResponse doInBackground(AIRequest... requests) {

I suspect the error is related to port 443, which is used by Google Cloud Speech for setAccessToken.
But when I changed the Google Speech port to 442, it didn't work, and Google Speech reported an error.
So I don't know how to solve this, please help.

Class loading warnings due to Log4J and JMX

I'm seeing these warnings on application startup

Rejecting re-init on previously-failed class java.lang.Class<org.apache.logging.log4j.core.lookup.JmxRuntimeInputArgumentsLookup>: java.lang.NoClassDefFoundError: Failed resolution of: Ljava/lang/management/ManagementFactory;

Likely the Java VM Android is running does not have those classes. Is there any way we can exclude them from being loaded or referenced? The app still runs fine though.

RecognitionEngine Speaktoit is not working

The application works fine when the RecognitionEngine is set to System (AIConfiguration.RecognitionEngine.System).
When I changed the RecognitionEngine from "System" to "Speaktoit" (AIConfiguration.RecognitionEngine.Speaktoit), the application stopped working.
The following exception occurs:

07-06 18:01:39.339 21358-22502/ai.api.sample W/System.err: java.io.IOException: Stream closed
07-06 18:01:39.339 21358-22502/ai.api.sample W/System.err: at java.io.BufferedInputStream.getInIfOpen(BufferedInputStream.java:151)
07-06 18:01:39.339 21358-22502/ai.api.sample W/System.err: at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
07-06 18:01:39.339 21358-22502/ai.api.sample W/System.err: at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
07-06 18:01:39.339 21358-22502/ai.api.sample W/System.err: at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:287)
07-06 18:01:39.339 21358-22502/ai.api.sample W/System.err: at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:350)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:179)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at java.io.InputStreamReader.read(InputStreamReader.java:184)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at java.io.Reader.read(Reader.java:140)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.util.IOUtils.copy(IOUtils.java:140)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.util.IOUtils.copy(IOUtils.java:131)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.util.IOUtils.readAll(IOUtils.java:126)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.util.IOUtils.readAll(IOUtils.java:102)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.util.IOUtils.readAll(IOUtils.java:114)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.http.HttpClient.getErrorString(HttpClient.java:158)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.AIDataService.doSoundRequest(AIDataService.java:799)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.AIDataService.doSoundRequest(AIDataService.java:743)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.AIDataService.voiceRequest(AIDataService.java:288)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.AIDataService.voiceRequest(AIDataService.java:253)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.services.SpeaktoitRecognitionServiceImpl$RecognizeTask.doInBackground(SpeaktoitRecognitionServiceImpl.java:380)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.services.SpeaktoitRecognitionServiceImpl$RecognizeTask.doInBackground(SpeaktoitRecognitionServiceImpl.java:362)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at android.os.AsyncTask$2.call(AsyncTask.java:305)
07-06 18:01:39.341 21358-22502/ai.api.sample W/System.err: at java.util.concurrent.FutureTask.run(FutureTask.java:237)
07-06 18:01:39.341 21358-22502/ai.api.sample W/System.err: at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:243)
07-06 18:01:39.341 21358-22502/ai.api.sample W/System.err: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
07-06 18:01:39.341 21358-22502/ai.api.sample W/System.err: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
07-06 18:01:39.341 21358-22502/ai.api.sample W/System.err: at java.lang.Thread.run(Thread.java:761)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: 2017-07-06 18:01:39,341 ERROR An exception occurred processing Appender Console java.lang.NullPointerException: Attempt to get length of null array
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.util.ReflectionUtil.getCurrentStackTrace(ReflectionUtil.java:274)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.impl.ThrowableProxy.(ThrowableProxy.java:116)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.impl.Log4jLogEvent.getThrownProxy(Log4jLogEvent.java:323)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.pattern.ExtendedThrowablePatternConverter.format(ExtendedThrowablePatternConverter.java:64)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.pattern.PatternFormatter.format(PatternFormatter.java:36)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.layout.PatternLayout.toSerializable(PatternLayout.java:196)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.layout.PatternLayout.toSerializable(PatternLayout.java:55)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.layout.AbstractStringLayout.toByteArray(AbstractStringLayout.java:71)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.append(AbstractOutputStreamAppender.java:108)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:99)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:430)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:409)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:367)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.Logger.logMessage(Logger.java:112)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.spi.AbstractLogger.logMessage(AbstractLogger.java:727)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.spi.AbstractLogger.logIfEnabled(AbstractLogger.java:716)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.spi.AbstractLogger.error(AbstractLogger.java:354)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at ai.api.AIDataService.doSoundRequest(AIDataService.java:813)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at ai.api.AIDataService.doSoundRequest(AIDataService.java:743)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at ai.api.AIDataService.voiceRequest(AIDataService.java:288)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at ai.api.AIDataService.voiceRequest(AIDataService.java:253)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at ai.api.services.SpeaktoitRecognitionServiceImpl$RecognizeTask.doInBackground(SpeaktoitRecognitionServiceImpl.java:380)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at ai.api.services.SpeaktoitRecognitionServiceImpl$RecognizeTask.doInBackground(SpeaktoitRecognitionServiceImpl.java:362)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at android.os.AsyncTask$2.call(AsyncTask.java:305)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at java.util.concurrent.FutureTask.run(FutureTask.java:237)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:243)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at java.lang.Thread.run(Thread.java:761)
07-06 18:01:39.349 21358-22502/ai.api.sample E/TEST: doInBackground e*****ai.api.AIServiceException: Can't make request to the API.AI service. Please, check connection settings and API.AI keys.
07-06 18:01:39.391 21358-21358/ai.api.sample D/ai.api.sample.AIButtonSampleActivity: onCancelled
07-06 18:01:39.391 21358-21358/ai.api.sample D/ai.api.sample.AIButtonSampleActivity: onError*

Kindly help me to solve this issue.

Trim empty parameters

Parameters in the response JSON could look like:

"parameters": {
    "date": "2015-03-19",
    "date-time": "",
    "time": "",
    "text": "feed it",
    "priority": "",
    "remind": "remind"
    },

So it is hard to check for existing parameters with

result.getParameters().containsKey("key")

Parameters with empty values should be trimmed.
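
A possible client-side workaround (a sketch, not part of the SDK) is to drop empty string values before checking for keys:

```java
import java.util.HashMap;
import java.util.Map;

import com.google.gson.JsonElement;

// Hypothetical helper: keep only parameters whose value is a non-empty string.
static Map<String, JsonElement> trimEmptyParameters(final Map<String, JsonElement> parameters) {
    final Map<String, JsonElement> trimmed = new HashMap<>();
    for (final Map.Entry<String, JsonElement> entry : parameters.entrySet()) {
        final JsonElement value = entry.getValue();
        final boolean isEmptyString = value.isJsonPrimitive() && value.getAsString().isEmpty();
        if (!isEmptyString) {
            trimmed.put(entry.getKey(), value);
        }
    }
    return trimmed;
}
```

With this, trimEmptyParameters(result.getParameters()).containsKey("date-time") returns false when the value was an empty string.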

Annotation processor dependency error

I'm running 'com.android.tools.build:gradle:2.4.0-alpha7' on AS 2.4 Preview 7 and getting the following error:

Error:Execution failed for task ':zapImoveisApp:javaPreCompileZapDebug'.

Annotation processors must be explicitly declared now. The following dependencies on the compile classpath are found to contain annotation processor. Please add them to the annotationProcessor configuration.
- log4j-core-2.2.jar
Alternatively, set android.defaultConfig.javaCompileOptions.annotationProcessorOptions.includeCompileClasspath = true to continue with previous behavior. Note that this option is deprecated and will be removed in the future.
See https://developer.android.com/r/tools/annotation-processor-error-message.html for more details.

As you can see on the provided url documentation, it's possible to disable the verification for this error using the following gradle config, but must be used as a temporary solution, because it may be removed in the future.

javaCompileOptions { annotationProcessorOptions { includeCompileClasspath false } }

So we need an alternative solution that fits the new annotation processor dependency configuration.

Insufficient Permissions

With the standard two INTERNET and RECORD_AUDIO permissions, it throws an Insufficient Permissions error. You must add the ACCESS_NETWORK_STATE permission for it to work.

I am setting the session id in AIRequest object but its giving some other session id in response

    AIRequest request = new AIRequest();

    request.setQuery(message);

    request.setSessionId(MyData.getInstance(MyApplication.getInstance()).getGatewayId() + "_" +
            MyData.getInstance(MyApplication.getInstance()).getUsername());

    return aiService.textRequest(request);

When I get the AIResponse object, it gives me a different session id.
How can I set my custom session id for a particular user?

Settings Activity not opening

Actual Behaviour
Nothing happens when we click on the settings option in options menu in AIServiceSampleActivity and AIButtonSampleActivity
Expected Behaviour
Settings activity should be opened when we click on settings item in options menu
I would like to resolve this issue!

1 second delay seems to be too low

This commit 334e165 introduced:

Made google recognizer cancellation after 1 sec from last partial results for google app starts with v5.9.26.

This is a really short amount of time if you want to input a whole sentence; you need to speak really fast.
I think it should be at least configurable by SDK user.

Google speech recognition language not setup correctly for English

The supported languages are defined in

ailib/src/main/java/ai/api/AIConfiguration.java

     /**
         * Currently supported languages
         */
        public enum SupportedLanguages {
            English("en"),

    ...
     public static SupportedLanguages fromLanguageTag(final String languageTag) {
                switch (languageTag) {
                   case "en":
                        return English;

The language string does not correctly set up the Google recognizer, taking into account country accent, i.e. en-US vs en-GB.

The android specification for EXTRA_LANGUAGE
http://developer.android.com/reference/android/speech/RecognizerIntent.html#EXTRA_LANGUAGE
shows that it should be Optional IETF language tag (as defined by BCP 47), for example "en-US"

I suggest adding additional enums for at least "en-US" and "en-GB".

I've tested replacing "en" with "en_GB" in the following code
i.e. final String language = "en_GB";

ailib/src/main/java/ai/api/services/GoogleRecognitionServiceImpl.java

    private Intent createRecognitionIntent() {
            final Intent sttIntent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
            sttIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                    RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);

            final String language = config.getLanguage().replace('-', '_');

            sttIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, language);
            sttIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE, language);

and I get much better speech recognition for my UK accent once this has been implemented.

Only English speech recognition

I tried to follow the presented tutorial, so I'm currently having a very simple AI-application. Now I changed the supported language to German, but still the app only recognizes english words. This is my configuration:

final AIConfiguration config = new AIConfiguration("---", AIConfiguration.SupportedLanguages.German, AIConfiguration.RecognitionEngine.System);

My dependencies:

compile 'com.android.support:appcompat-v7:25.0.1'
testCompile 'junit:junit:4.12'
compile 'ai.api:sdk:2.0.0@aar'
compile 'ai.api:libai:1.2.1'
compile 'com.android.support:appcompat-v7:25.0.1'
compile 'com.google.code.gson:gson:2.8.0'
compile 'commons-io:commons-io:20030203.000550'

hello

What is the minimum supported version?

Extra field in Request Json

I need to send some additional data using Webhook (e.g. current logged in username). Is there any possibility that I could add some extra field in request json?

Request Json:
{
"extra_field": "abc"
"lang": "en",
"status": {
"errorType": "success",
"code": 200
},
"timestamp": "2017-02-09T16:06:01.908Z",
"sessionId": "1486656220806",
"result": {
"parameters": {
"city": "Rome",
"name": "Ana"
},
"contexts": [],
"resolvedQuery": "my name is Ana and I live in Rome",
"source": "agent",
"score": 1.0,
"speech": "",
"fulfillment": {
"messages": [
{
"speech": "Hi Ana! Nice to meet you!",
"type": 0
}
],
"speech": "Hi Ana! Nice to meet you!"
},
"actionIncomplete": false,
"action": "greetings",
"metadata": {
"intentId": "9f41ef7c-82fa-42a7-9a30-49a93e2c14d0",
"webhookForSlotFillingUsed": "false",
"intentName": "greetings",
"webhookUsed": "true"
}
},
"id": "ab30d214-f4bb-4cdd-ae36-31caac7a6693",
"originalRequest": {
"source": "google",
"data": {
"inputs": [
{
"raw_inputs": [
{
"query": "my name is Ana and I live in Rome",
"input_type": 2
}
],
"intent": "assistant.intent.action.TEXT",
"arguments": [
{
"text_value": "my name is Ana and I live in Rome",
"raw_text": "my name is Ana and I live in Rome",
"name": "text"
}
]
}
],
"user": {
"user_id": "PuQndWs1OMjUYwVJMYqwJv0/KT8satJHAUQGiGPDQ7A="
},
"conversation": {
"conversation_id": "1486656220806",
"type": 2,
"conversation_token": "[]"
}
}
}
}

Hard to detect final speech recognition result when using GoogleRecognitionServiceImpl

This is an enhancement proposal, I hope I'm not missing anything.

When you use the GoogleRecognitionServiceImpl as the AIService, there's no easy way of notifying the user of their final speech output when it's submitted to API.AI.

You can use the PartialResultsListener with GoogleRecognitionServiceImpl, but you get something like 2 callbacks to PartialResultsListener.onPartialResults(List<String>) after receiving the AIListener.onListeningFinished() callback.

Here's the logcat output from my phone to illustrate the issue, refer to the last 3-4 lines:
api_ai_partial_result_listener_output_logcat

The really hacky, nasty and potentially fragile way to detect the final input is to just wait for 2 callbacks after AIListener.onListeningFinished().

I'd like to modify the GoogleRecognitionServiceImpl.InternalRecognitionListener.onResults(Bundle) handling by adding something like a QuerySubmitListener (prevents breaking the PartialResultsListener contract). The QuerySubmitListener would notify developers of the query or queries and the confidences (right about here https://github.com/api-ai/apiai-android-client/blob/master/ailib/src/main/java/ai/api/services/GoogleRecognitionServiceImpl.java#L392).

Happy to create a PR if this sounds OK

Sample App cannot run

Hello. I'm having trouble with the Android Studio documentation.

Now return to the MainActivity.java file. Add three import statements to access our widgets:

import android.view.View;
import android.widget.Button;
import android.widget.TextView;

Create two private members in MainActivity for the widgets:

private Button listenButton;
private TextView resultTextView;
At the end of the OnCreate method, add these lines to initialize the widgets:

listenButton = (Button) findViewById(R.id.listenButton);
resultTextView = (TextView) findViewById(R.id.resultTextView);


I'm sorry, but can you give an example of where these go? My app closes immediately.

Error: java.lang.NoClassDefFoundError: org.apache.commons.io.Charsets

I did add the commons-io:commons-io:20030203.000550 dependency and am still getting these errors.
Here's the error log:

java.lang.RuntimeException: An error occured while executing doInBackground()
at android.os.AsyncTask$3.done(AsyncTask.java:300)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:355)
at java.util.concurrent.FutureTask.setException(FutureTask.java:222)
at java.util.concurrent.FutureTask.run(FutureTask.java:242)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
at java.lang.Thread.run(Thread.java:841)
Caused by: java.lang.NoClassDefFoundError: org.apache.commons.io.Charsets
at ai.api.AIDataService.doTextRequest(AIDataService.java:342)
at ai.api.AIDataService.request(AIDataService.java:125)
at ai.api.services.GoogleRecognitionServiceImpl$1.doInBackground(GoogleRecognitionServiceImpl.java:147)
at ai.api.services.GoogleRecognitionServiceImpl$1.doInBackground(GoogleRecognitionServiceImpl.java:139)
at android.os.AsyncTask$2.call(AsyncTask.java:288)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)

Getting Following exception: Caused by: java.lang.AbstractMethodError: abstract method "java.util.TimeZone ai.api.AIServiceContext.getTimeZone()"

03-01 12:28:58.999 6355-9154/? E/AndroidRuntime: FATAL EXCEPTION: AsyncTask #2
Process: com.hexa.helloaction, PID: 6355
java.lang.RuntimeException: An error occurred while executing doInBackground()
at android.os.AsyncTask$3.done(AsyncTask.java:309)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:354)
at java.util.concurrent.FutureTask.setException(FutureTask.java:223)
at java.util.concurrent.FutureTask.run(FutureTask.java:242)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)
Caused by: java.lang.AbstractMethodError: abstract method "java.util.TimeZone ai.api.AIServiceContext.getTimeZone()"
at ai.api.AIDataService.getTimeZone(AIDataService.java:962)
at ai.api.AIDataService.request(AIDataService.java:172)
at ai.api.AIDataService.request(AIDataService.java:148)
at ai.api.services.GoogleRecognitionServiceImpl$2.doInBackground(GoogleRecognitionServiceImpl.java:166)
at ai.api.services.GoogleRecognitionServiceImpl$2.doInBackground(GoogleRecognitionServiceImpl.java:158)
at android.os.AsyncTask$2.call(AsyncTask.java:295)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234) 
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113) 
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588) 
at java.lang.Thread.run(Thread.java:818) 

Bluetooth

Recognition doesn't work with Bluetooth headset

How to create a new agent?

I want to create a new agent from the client. What should I do? Can you write an example for me? Thank you very much.

Caused by: java.lang.NumberFormatException: For input string: "simple_response"

When I send a message to the API.AI server, the client crashes with the error below:

java.lang.RuntimeException: An error occurred while executing doInBackground()
at android.os.AsyncTask$3.done(AsyncTask.java:325)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:354)
at java.util.concurrent.FutureTask.setException(FutureTask.java:223)
at java.util.concurrent.FutureTask.run(FutureTask.java:242)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:243)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
at java.lang.Thread.run(Thread.java:761)
Caused by: java.lang.NumberFormatException: For input string: "simple_response"
at java.lang.Integer.parseInt(Integer.java:521)
at java.lang.Integer.parseInt(Integer.java:556)
at com.google.gson.JsonPrimitive.getAsInt(JsonPrimitive.java:260)
at ai.api.GsonFactory$ResponseItemAdapter.deserialize(GsonFactory.java:78)
at ai.api.GsonFactory$ResponseItemAdapter.deserialize(GsonFactory.java:71)
at com.google.gson.internal.bind.TreeTypeAdapter.read(TreeTypeAdapter.java:69)
at com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper.read(TypeAdapterRuntimeTypeWrapper.java:41)
at com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter.read(CollectionTypeAdapterFactory.java:82)
at com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter.read(CollectionTypeAdapterFactory.java:61)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:129)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:220)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:129)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:220)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:129)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:220)
at com.google.gson.Gson.fromJson(Gson.java:887)
at com.google.gson.Gson.fromJson(Gson.java:852)
at com.google.gson.Gson.fromJson(Gson.java:801)
at com.google.gson.Gson.fromJson(Gson.java:773)
at ai.api.AIDataService.request(AIDataService.java:193)
at ai.api.AIDataService.request(AIDataService.java:148)
at ai.api.AIDataService.request(AIDataService.java:124)
at ninhv.vl.vlchat.views.chat.ChatActivity$1.doInBackground(ChatActivity.java:90)
at ninhv.vl.vlchat.views.chat.ChatActivity$1.doInBackground(ChatActivity.java:85)
at android.os.AsyncTask$2.call(AsyncTask.java:305)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:243) 
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133) 
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607) 
at java.lang.Thread.run(Thread.java:761) 

And here is the code in the doInBackground method:

@Override
protected AIResponse doInBackground(AIRequest... requests) {
    AIRequest request = requests[0];
    try {
        Log.e("REQUEST", request.toString());
        return aiDataService.request(request);
    } catch (AIServiceException e) {
    }
    return null;
}

App Crashes after listening command ( FATAL EXCEPTION: AsyncTask )

After implementing the Android SDK,

The app crashes after it listens to my request.

Here is the monitor log, please help.
FATAL EXCEPTION: AsyncTask #3
Process: com.aiapp.user.homeai, PID: 15096
java.lang.RuntimeException: An error occurred while executing doInBackground()
    at android.os.AsyncTask$3.done(AsyncTask.java:309)
    at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:354)
    at java.util.concurrent.FutureTask.setException(FutureTask.java:223)
    at java.util.concurrent.FutureTask.run(FutureTask.java:242)
    at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
    at java.lang.Thread.run(Thread.java:818)
Caused by: java.lang.NumberFormatException: Invalid int: "simple_response"
    at java.lang.Integer.invalidInt(Integer.java:138)
    at java.lang.Integer.parse(Integer.java:410)
    at java.lang.Integer.parseInt(Integer.java:367)
    at java.lang.Integer.parseInt(Integer.java:334)
    at com.google.gson.JsonPrimitive.getAsInt(JsonPrimitive.java:260)
    at ai.api.GsonFactory$ResponseItemAdapter.deserialize(GsonFactory.java:78)
    at ai.api.GsonFactory$ResponseItemAdapter.deserialize(GsonFactory.java:71)
    at com.google.gson.internal.bind.TreeTypeAdapter.read(TreeTypeAdapter.java:69)
    at com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper.read(TypeAdapterRuntimeTypeWrapper.java:41)
    at com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter.read(CollectionTypeAdapterFactory.java:82)
    at com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter.read(CollectionTypeAdapterFactory.java:61)
    at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:129)
    at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:220)
    at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:129)
    at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:220)
    at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:129)
    at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:220)
    at com.google.gson.Gson.fromJson(Gson.java:887)
    at com.google.gson.Gson.fromJson(Gson.java:852)
    at com.google.gson.Gson.fromJson(Gson.java:801)
    at com.google.gson.Gson.fromJson(Gson.java:773)
    at ai.api.AIDataService.request(AIDataService.java:193)
    at ai.api.AIDataService.request(AIDataService.java:148)
    at ai.api.services.GoogleRecognitionServiceImpl$2.doInBackground(GoogleRecognitionServiceImpl.java:166)
    at ai.api.services.GoogleRecognitionServiceImpl$2.doInBackground(GoogleRecognitionServiceImpl.java:158)
    at android.os.AsyncTask$2.call(AsyncTask.java:295)
    at java.util.concurrent.FutureTask.run(FutureTask.java:237)
    at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
    at java.lang.Thread.run(Thread.java:818)

Crash doing voice recognition

11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: FATAL EXCEPTION: AsyncTask #3
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: Process: com.getscarlett.android, PID: 8094
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: java.lang.RuntimeException: An error occured while executing doInBackground()
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at android.os.AsyncTask$3.done(AsyncTask.java:304)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:355)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.FutureTask.setException(FutureTask.java:222)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.FutureTask.run(FutureTask.java:242)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.lang.Thread.run(Thread.java:818)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: Caused by: java.lang.NullPointerException: lock == null
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.io.Reader.(Reader.java:64)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.io.InputStreamReader.(InputStreamReader.java:120)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at org.apache.commons.io.IOUtils.copy(IOUtils.java:1906)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at org.apache.commons.io.IOUtils.toString(IOUtils.java:778)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at ai.api.AIDataService.doTextRequest(AIDataService.java:319)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at ai.api.AIDataService.doTextRequest(AIDataService.java:282)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at ai.api.AIDataService.request(AIDataService.java:107)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at ai.api.services.GoogleRecognitionServiceImpl$1.doInBackground(GoogleRecognitionServiceImpl.java:146)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at ai.api.services.GoogleRecognitionServiceImpl$1.doInBackground(GoogleRecognitionServiceImpl.java:138)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at android.os.AsyncTask$2.call(AsyncTask.java:292)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.FutureTask.run(FutureTask.java:237)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231) 
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112) 
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587) 
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.lang.Thread.run(Thread.java:818) 

Language default when not in "SupportedLanguages"

Hi,

I'm using the Android SDK, building a simple test app to know if I'm going to use your services or another :-)

When I try this:
final AIConfiguration.SupportedLanguages lang = AIConfiguration.SupportedLanguages.fromLanguageTag("nl");

The "lang" is always English. How come it doesn't support Dutch when your doc says it does (https://docs.api.ai/docs/languages)? Did I miss something?

Thank you !

Application crashing after receiving speech input

I followed the instructions and tried to run it on my Nexus 6P. After receiving the speech input, the application crashes without showing any error message. Please help.

here is my MainActivity code.

package poc.apiai.com.firstaiapiapp;

import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;

import ai.api.AIConfiguration;
import ai.api.AIListener;
import ai.api.AIService;
import ai.api.model.AIError;
import ai.api.model.AIResponse;
import ai.api.model.Result;
import com.google.gson.JsonElement;
import java.util.Map;

import android.view.View;
import android.widget.Button;
import android.widget.TextView;

public class MainActivity extends AppCompatActivity implements AIListener {

private Button listenButton;
private TextView resultTextView;
private AIService aiService;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    listenButton = (Button) findViewById(R.id.listenButton);
    resultTextView = (TextView) findViewById(R.id.resultTextView);

    final AIConfiguration config = new AIConfiguration("abc",
            AIConfiguration.SupportedLanguages.English,
            AIConfiguration.RecognitionEngine.System);

    aiService = AIService.getService(this, config);
    aiService.setListener(this);
}

public void listenButtonOnClick(final View view) {
    aiService.startListening();
}

@Override
public void onResult(AIResponse response) {
    Result result = response.getResult();

    // Get parameters
    String parameterString = "";
    if (result.getParameters() != null && !result.getParameters().isEmpty()) {
        for (final Map.Entry<String, JsonElement> entry : result.getParameters().entrySet()) {
            parameterString += "(" + entry.getKey() + ", " + entry.getValue() + ") ";
        }
    }

    // Show results in TextView.
    resultTextView.setText("Query:" + result.getResolvedQuery() +
            "\nAction: " + result.getAction() +
            "\nParameters: " + parameterString);
}

@Override
public void onError(AIError error) {
    resultTextView.setText(error.toString());
}

@Override
public void onAudioLevel(float level) {

}

@Override
public void onListeningStarted() {

}

@Override
public void onListeningCanceled() {

}

@Override
public void onListeningFinished() {

}

}

Looking fwd to your reply.

Thanks

How to get all the text response?

In the Android SDK I'm getting the response like this:

final Result result = response.getResult();
final String speech = result.getFulfillment().getSpeech();
Log.i(TAG, "Speech: " + speech);

This gives only one response. How can I get all the text responses?
Please help me out.

ProGuard

Lack of ProGuard guidance in readme.

NetworkOnMainThreadException

Hey!

When trying to do what the tutorial says in my own application, I get a NetworkOnMainThreadException. How can I fix it? I already tried working with a Thread, but that didn't work.

Get only the exact response

Hello!

I may have overlooked this, but how can I just get things like "weather.search" or the content of "speech"?

Many thanks
