Online media playback

Excerpted from http://arm9.org.ru/archiver/?tid-554.html

 

Develop an IMS SIP client on Android

Develop an IMS SIP client on Android. Part 1
This is the first of a series of 3 sample posts on how to develop for Android using ImsInnovation.com as the IMS SIP provider. The samples have been developed using Eclipse on Mac and Windows. Linux should work equally well, although it has not been tested.
Preparation
First of all, to develop on Android one needs the SDK:
Android Download instructions.
Android Installation Instructions.
If you already have an Eclipse installation, such as the SDS (Service Developer Studio), you can use it.
Otherwise one can be fetched here:
SDS Download site.
or just plain Eclipse:
[url=http://www.eclipse.org/downloads/]http://www.eclipse.org/downloads/[/url]
Start Eclipse and install the ADT (Android Development Tools).
The smoothest way is to add it as a remote site for software updates in Eclipse.
Point the IDE to search for the latest ADT under:
[url=https://dl-ssl.google.com/android/eclipse/]https://dl-ssl.google.com/android/eclipse/[/url]
The other thing that is required is to sign up for a SIP IMS account.
If you do not have one already, it can be created here:
IMS User provisioning page.
Follow the link to registration.
Sample1
The first sample demonstrates how to build a simple chat client for SIP instant messages over IMS. Besides showing the communication framework provided by Ericsson, the sample also demonstrates power management: waking up the screen and releasing screen locks in Android.
For people not accustomed to Android development, it also shows the event-driven style of programming for truly multithreaded devices.
The first step is to download the sample bundle, unpack it, and import it into Eclipse (File/Import [existing project]).
Download the sample zip.
After unpacking the zip file, the jar containing the ImsInnovation API is located under the lib directory.
There is also a bin directory with the .apk file, which can be used to install the sample without compiling it.
If you just want to try out the sample and have followed the installation instructions from the Android SDK, you should now have the adb command in your path. Start the emulator from the SDK, or plug in a real phone with the USB cable. (Windows users will have to install a driver that is provided as part of the SDK.) To check that everything is correct, an adb command can be used to verify that adb can talk to the emulator or the phone:
$> adb devices
List of devices attached
HT845GZ53016 device
Now, by typing “adb install imsLabsSample.apk”, the application will be uploaded to the device.
On a real phone the primary network interface is the Wireless LAN driver, and the secondary is the APN with 3G mobile access. A good test is to open the built-in web browser and make sure it has an Internet connection and fetches pages correctly. If that is the case, there is a fairly good chance that the sample will work too. Now tap the arrow to expand the file view on the Android device and locate the IMS Android Sample with the nice Ericsson icon.
Since we are developers, we want to know what is happening. A good command to see what the device is doing is “adb logcat”. It attaches to the device and starts printing a lot of information. If this command was run prior to starting the sample, the SIP registration to the IMS system is logged on screen. The sequence is terminated by a 200 OK on the REGISTER, at the same time as the sample GUI displays the message {IMS Connected}.
In case of error there is no message saying that the connection failed; the user is only informed when trying to send an IM, at which point an error message states that there is no message service to use.
The binary is provisioned with a demo user, so others may be running it at the same time and causing clashes. The best thing to do is therefore to compile your own version with a unique user ID registered to the imsinnovation.com domain. But the prebuilt binary is fine for getting quick feedback and a feeling for what the code does.
A simple test is to send an instant message to yourself: just type some message and press the send button. The SIP message will be sent to an Ericsson basement in Sweden and hopefully come back. Having two clients is much more fun, but for that we first need to recompile, since the testing is not much fun if both use the same login ID. (It should work even so, due to SIP forking: both the terminal that sent the message and the terminal that just registered will receive the message.)
Now let's look at the code and how to compile the example.
A graphical user interface element in Android usually extends android.app.Activity.
To also include the IMS SIP communication framework, we have developed an extended version of the Activity class. Instead of extending the Activity class you should extend the
com.ericsson.labs.android.ImsActivity class.
For more info look at the javadoc.
So in the sample here is how the chat service is defined:
public class ImsChat extends ImsActivity {

while the ImsActivity is defined as:
public class ImsActivity extends Activity {

Now one interesting event in the Android application life-cycle is the onCreate() function call.
It is important to understand the life cycle:
[url=http://developer.android.com/reference/android/app/Activity.html]http://developer.android.com/reference/android/app/Activity.html[/url]
In order for the IMS SIP connection to be established in the right way, the ImsActivity onCreate should be overridden:
@Override
        public void onCreate(Bundle savedInstanceState) {
                //Call super to initialize the network and IMS connection
                super.onCreate(savedInstanceState);
                setContentView(R.layout.main);

                //Hook up the send button generated by main.xml
                sendButton = (Button) findViewById(R.id.SendButton);
                sendButton.setOnClickListener(sendListener);

                //Hook up the two EditText fields generated by main.xml
                textMessage = (EditText) findViewById(R.id.EditMessage);
                textToURI = (EditText) findViewById(R.id.EditTextToURI);
                textToURI.setText("sip:android@imsinnovation.com");
        }

First there is a call to the base class ImsActivity, super.onCreate().
Then R.layout points out “main” as the graphical layout to show.
It is located in the sample's res/layout directory.
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    >
    <LinearLayout android:layout_height="wrap_content" android:id="@+id/LinearFlatLayout" android:layout_gravity="center_horizontal" android:layout_width="fill_parent">
        <TextView android:layout_height="wrap_content" android:text="@string/send_to" android:layout_width="wrap_content"/>
        <EditText android:layout_height="wrap_content" android:id="@+id/EditTextToURI" android:layout_width="fill_parent"/>
    </LinearLayout>
    <LinearLayout android:layout_height="wrap_content" android:id="@+id/LinearFlatLayout" android:layout_gravity="center_horizontal" android:layout_width="fill_parent">
        <TextView android:layout_height="wrap_content" android:layout_width="wrap_content" android:text="@string/message"/>
        <EditText android:layout_height="wrap_content" android:layout_width="fill_parent" android:id="@+id/EditMessage"/>
    </LinearLayout>
    <Button android:text="@string/send" android:layout_height="wrap_content" android:layout_width="wrap_content" android:layout_gravity="right" android:id="@+id/SendButton"/>
</LinearLayout>

The GUI is defined in XML, and all labels are stored in a separate XML file for easy internationalization. The rest of the code in onCreate() ties the graphical components to Java objects, like:
sendButton = (Button) findViewById(R.id.SendButton);

A button needs a listener, while the EditText widget is preset with the text string of our own SIP identity. Of course, the XML presentation of the GUI is not as nice as a graphical one, so the preferred way of developing the GUI is in the Eclipse ADT layout mode.
Picture 1

Now the next event that the software has to handle is the onLogin() callback defined by the ImsActivity class. Here the login credentials are provided and the unique user ID and password provisioned. In a full-fledged application these values would be queried by a separate login Activity.
config.put("ImsInnovation-RegisterURI", "sip:imsinnovation.com");
config.put("ImsInnovation-ProxyURI", "sip:193.180.168.44:35060");
config.put("ImsInnovation-PubUI", "sip:youruser@imsinnovation.com");
config.put("ImsInnovation-PriUI", "youruser");
config.put("ImsInnovation-Password", "yourpass");
config.put("ImsInnovation-Realm", "imsinnovation.com");

super.onLogin(config, "3gpp-application.com.imsinnovation.sample", false);

The config object is passed into the function by the ImsActivity, and when this function exits it is used to create an ImsInnovation connection. Execution continues with the call to super.onLogin() on the ImsActivity. The first parameter is the login properties config. The second is an IARI identifier, which can be used to deploy endpoint services on the server side, but also to separate multiple applications running on the same client. The third parameter states whether the GUI thread should block while setting up the communication with the IMS core. If true, the GUI is drawn once the 200 OK on the REGISTER has arrived. If false, the GUI is drawn as quickly as possible and the user can start typing right away, but if the user wants to send an IM before everything is initialized we have to guard against that.
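To make the blocking flag concrete: in blocking mode the GUI thread is essentially parked on a latch that the SIP socket thread releases when the 200 OK arrives. Here is a minimal plain-Java sketch of that idea, using only standard classes; all the names are illustrative, not part of the ImsInnovation API.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;

public class BlockingFlagDemo {
    // Stand-in for blocking mode: wait for the "200 OK" before proceeding.
    static String connectBlocking() {
        final CountDownLatch registered = new CountDownLatch(1);
        final AtomicReference<String> service = new AtomicReference<>();
        new Thread(() -> {                  // plays the role of the SIP socket thread
            try { Thread.sleep(50); } catch (InterruptedException ignored) {}
            service.set("connected");       // the MessageService becomes available
            registered.countDown();         // ...as the 200 OK arrives
        }).start();
        try {
            registered.await(5, TimeUnit.SECONDS); // what "blocking == true" amounts to
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return service.get();               // never null once the latch is released
    }

    public static void main(String[] args) {
        System.out.println("service = " + connectBlocking());
    }
}
```

In non-blocking mode the await is simply skipped, which is exactly why the send handler later has to null-check the MessageService.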
The next step in the initialization is to wait for the IMS framework to get connected.
The onConnected() function is called with a reference to the ImsInnovation object.
At this point the innovation object should be fully initialized. There is actually no need to keep a reference to it, since in the event that it changed, the overriding ImsActivity class would get the corresponding callback invoked, like the disConnected function that is implemented in this sample. The main parts of the code in onConnected() do the following. A ServiceListener is implemented, and its pageMessageReceived callback is the most important part: it notifies the client that someone has sent an IM. When a page message is received we call a helper function, alertOnMessage(); we will come back to what it does later. For now let's assume that it delivers the IM to the end user in a nice way. The next thing onConnected() does is register the created ServiceListener with the innovation object. After that, the MessageService reference is retrieved from the service listener, so that when the user presses the send button we can create a new IM and send it.
The sending is a reaction to the button listener we registered in onCreate(): the listener callback onClick() is called. First of all we check whether the MessageService is null. If it is, it means that we initialized the framework in non-blocking mode and the end user is trying to send an IM before the core is initialized. Otherwise everything is as it should be, and we create and send the IM to the party specified by the To EditText.
messageService.sendQuickMessage(textToURI.getText().toString(),
        textMessage.getText().toString().getBytes("UTF-8"), "text/plain");

This method takes a byte array, and we specify that the content is text/plain. It might look complicated, but the reason is to enable sending other kinds of content as a quick message: a picture or an audio clip could also be transferred this way. In general it is not recommended to transfer large content; we will show in part 3 of the Android samples how a file transfer should be set up. A quick message should be comparable to an SMS or an MMS, where a couple of kilobytes is no problem to transfer.
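Since the payload is just bytes plus a MIME type, preparing a text message boils down to a UTF-8 encoding step. Here is a small plain-Java sketch; the 4 KB guard is our own assumption about what “a couple of kilobytes” means, not a limit stated by the API.

```java
import java.nio.charset.StandardCharsets;

public class QuickMessagePayload {
    // Same encoding the sample uses, just without the checked exception.
    static byte[] encodeText(String message) {
        return message.getBytes(StandardCharsets.UTF_8);
    }

    // Rough guard: keep quick messages in SMS/MMS territory; larger content
    // belongs in a file transfer session (part 3). The 4 KB cap is an assumption.
    static boolean smallEnough(byte[] payload) {
        return payload.length <= 4 * 1024;
    }

    public static void main(String[] args) {
        byte[] body = encodeText("Hello from Android!");
        System.out.println(body.length + " bytes, ok=" + smallEnough(body));
    }
}
```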
The last interesting piece of code is the alertOnMessage() function.
For just displaying a quick dialog, the android.widget.Toast class is used.
There is a special caveat, since all UI interactions need to be performed by the same thread. In this case we got the callback from the SIP stack for an incoming IM, which means we got it from the socket thread and not the UI thread. It is not always easy to know which thread you are on, but with some reasoning, or after the hard trial and error of getting an exception, it can be figured out.
        final Activity ctx = ImsChat.this;
        ctx.runOnUiThread(new Runnable() {
                public void run() {
                        Toast.makeText(ctx, message, Toast.LENGTH_LONG).show();
                }
        });
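The same hand-off pattern can be sketched in plain Java, with a single-threaded executor standing in for the Android UI thread: the socket thread never touches the pretend UI state directly, it only posts work to the one thread that owns it, just as runOnUiThread guarantees on the device. The names here are illustrative.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class UiThreadHandoff {
    static List<String> deliver(String message) {
        // Exactly one thread owns the "widgets", like the Android UI thread.
        ExecutorService uiThread = Executors.newSingleThreadExecutor();
        final List<String> screen = new ArrayList<>(); // pretend UI state

        // Stand-in for the SIP socket thread delivering an incoming IM:
        // instead of touching the UI state, it posts a task, as runOnUiThread does.
        Thread socketThread = new Thread(() -> uiThread.execute(() -> screen.add(message)));
        socketThread.start();
        try {
            socketThread.join();
            uiThread.shutdown();
            uiThread.awaitTermination(5, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return screen; // only ever updated by the "UI" thread
    }

    public static void main(String[] args) {
        System.out.println(deliver("incoming IM"));
    }
}
```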

The rest of the alertOnMessage() function shows how other useful Android APIs can be used. Displaying the IM when the user is looking at the screen only works for demos. In real life the phone is in a pocket or on your desk. The KeyguardManager is a system service representing the screen lock that prevents the phone from being touched by mistake while in your pocket or purse. We call:
km.inKeyguardRestrictedInputMode()

The return value tells whether the screen lock is on. This gives us a hint whether the user is interacting with the phone or not. In case the screen is locked, we do a number of things to attract the user's attention. For that we are going to need two more basic Android services: the Vibrator to alert the end user by vibrating the phone, and the PowerManager to wake up the phone and turn on the screen backlight.
The vibrator starts buzzing in a predefined pattern:
Vibrator vibrator = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);
long[] pattern = { 500, 200 };
vibrator.vibrate(pattern, 0);

With the PowerManager and KeyguardManager we retrieve one lock object for each service, which allows us to wake the device and release the key lock temporarily.
wl.acquire();
kl.disableKeyguard();
try {
        Thread.sleep(3500); //Alert for 3.5 sec
} catch (InterruptedException e) {}

As the code comment says, the end user is alerted for 3.5 seconds. If by then the user has not started to interact, the vibrator stops, the screen lock is restored, and the power goes back to the resting state it was in when the IM was received. It is all done by this code:
wl.release();
kl.reenableKeyguard();
vibrator.cancel();
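As a sanity check on the timing: vibrate(pattern, 0) alternates off/on durations starting with the off time, so { 500, 200 } is a 500 ms pause followed by a 200 ms buzz, repeated from index 0. Over the 3.5-second alert that gives five full cycles; a quick bit of plain Java confirms the arithmetic:

```java
public class AlertTiming {
    static final long[] PATTERN = { 500, 200 }; // 500 ms pause, 200 ms buzz
    static final long ALERT_MS = 3500;          // the Thread.sleep in the sample

    // One pass through the off/on pattern.
    static long cycleMillis(long[] pattern) {
        long sum = 0;
        for (long ms : pattern) sum += ms;
        return sum;
    }

    // Full pause-and-buzz cycles that fit into the alert window.
    static long buzzes(long alertMs, long[] pattern) {
        return alertMs / cycleMillis(pattern);
    }

    public static void main(String[] args) {
        System.out.println(buzzes(ALERT_MS, PATTERN) + " buzzes in " + ALERT_MS + " ms");
    }
}
```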

That concludes the code part of this sample and shows how a few lines of code can make a complex chat client with the help of the ImsInnovation.jar and the Android SDK.
There are two more files in the sample that we did not mention. One is the nice Ericsson icon representing the application. It is stored under the res/drawable directory as icon.png.
The last file is the metadata about the application, which specifies how everything fits together: the AndroidManifest.xml.
ADT helps you generate the majority of the file.
The application is defined by this XML tag:
<application android:icon="@drawable/icon" android:label="@string/app_name">

This is also where the connection to the icon is made.
Another important piece is the activity declaration under the application tag.
<activity android:name=".ImsChat" android:label="@string/app_name">
        <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
        </intent-filter>
</activity>

This tells the OS that, when the application is launched, the ImsChat class contains the main activity to start.
The last important piece is the set of permissions required for this application to execute.
This particular application requires 6 permissions. The first 3 are tied to the IMS SIP framework and the ImsInnovation.jar. Without these it would not operate correctly.
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE"/>
<uses-permission android:name="android.permission.WAKE_LOCK"/>

The SIP stack uses IP connectivity; it also uses WiFi and prevents it from going to sleep while WiFi is connected. The other 3 permissions are tied to alerting the end user on incoming messages.
<uses-permission android:name="android.permission.DEVICE_POWER"/>
<uses-permission android:name="android.permission.VIBRATE"/>
<uses-permission android:name="android.permission.DISABLE_KEYGUARD"/>

Now, having the code covered, the next step is to build and install it. Eclipse normally builds everything automatically. There is a strange-looking class called com.ericsson.labs.sample.androidchat.R.
It is automatically generated by the ADT plugin when the project is built.
The best way is to right-click on the project and choose “Run As” -> Android Application.
Picture 2

Now, if a real phone is connected with USB, the application will be executed on the phone. If it is not connected, a new instance of the emulator is started by Eclipse ADT and the application installed there. There is also a way to configure ADT to ask every time where an Android application should be executed; that way multiple emulators can be started at the same time, as well as a real phone connected with the USB cable. One note: in the same menu there is an Android Tools option. If Export Unsigned Application Package is chosen, be aware that the .apk needs to be signed. This is the way to deliver real applications for production distribution: it needs to be signed with a real certificate, and then it can be uploaded to Android Market. The other way, and how the .apk sample file is built, is debug-mode signing by the ADT plugin. The developer phones will allow this kind of APK to be installed. Commercial phones like the T-Mobile G1 will not; the owner of such a phone must go to the phone configuration GUI and enable installation from unknown sources:
“Settings->Applications->Unknown sources” needs to be checked/enabled
Now that everything is crystal clear, I bet we will be able to spice it up a bit in the next sample.
There are some things that could be done to make this sample even more exciting; it can be explored further and perhaps donated back to the labs portal. Here is a list of possible extensions:
•        login page
•        history window for the latest messages
•        tabs for communication with multiple people
•        integration with the presence buddy list
•        voice signal for the incoming message
All of the above should be quite trivial extensions, but for simplicity the sample only demonstrates the minimal set needed to implement a chat IM service.
Feedback:
We welcome any feedback, good or bad, on the posted samples.
It would also be interesting to hear from the community what kind of device and network operator was used during the tests. The basic testing was done on the emulator and a G1 developer phone using WLAN and 3G on the AT&T network.
Develop an IMS SIP client on Android. Part 2
This is a series of 3 sample posts on how to develop for Android using ImsInnovation.com as the IMS SIP provider. This sample demonstrates the presence and group list management parts of the API. As the backend in the imsinnovation.com domain, an off-the-shelf Ericsson product, PGM, acts as the presence and XDM server.
Preparation
A lot has happened since part 1 was posted!
Google has released Android 1.5 (Cupcake) with a lot of nice new features, and the G2 HTC Magic has been released; to a large extent it is the same phone as the G1, but without a physical keyboard. This sample was originally written for Android 1.0, but covering the changes and giving links on how to upgrade might be useful. There should be no problem compiling the sample for 1.0 if you remove the 1.5-specific things from the AndroidManifest. The bundled binary version is compiled for the 1.5 release.
This means that you have to download a new SDK and a new version of the Eclipse plugin. Eclipse should detect and download the latest version of the ADT plugin if you followed the instructions in Sample 1. Make sure to change the location of the ADT to point to the Android 1.5 SDK library.
When running on a G1 developer device there is some official information from HTC on how to upgrade:
[url=http://www.htc.com/www/support/android/adp.html]http://www.htc.com/www/support/android/adp.html[/url]
When running on a T-Mobile commercial phone, the operator should push out 1.5 Cupcake OTA (Over The Air provisioning). But for all the developers that are, like me, impatient, there is a link on how to upgrade yourself. I do not know how supported this is, but I did it and it worked fine. [url=http://forum.xda-developers.com/showthread.php?t=511011]http://forum.xda-developers.com/showthread.php?t=511011[/url]
Both ways have been successfully tested at least on the US phones.
With Cupcake there are also some changes in how the emulator behaves. There are 3 profiles that the emulator can operate in: Android 1.1, Android 1.5, and 1.5 with the Google extra APIs.
In our case we need the last one, since that contains the extra Maps libraries.
For every emulator instance the developer needs to create an AVD (Android Virtual Device):
android create avd -n maps -c 50M -t 3
This command creates an emulator template called “maps” with an emulated SD card of 50 MBytes.
To start the emulator call:
emulator @maps
Other useful commands are:
android list avds
and
android delete avd -n maps
There is also a helper GUI for doing this in Eclipse, opened by pushing the phone button.

Sample2
Now that the emulator is set up, let's look at how the presence application is implemented.
Let's start with the main GUI page to get a better understanding of the use case.

The Send To field specifies the SIP URI of a person whose presence state changes you want to subscribe to. The Mood field contains your own presence mood state. The corresponding Subscribe and Publish buttons actually trigger the events. The Map button is used to invoke a Google Activity that displays a map across the entire screen.
Here is what the XML for this GUI looks like:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    >
        <LinearLayout android:layout_height="wrap_content" android:id="@+id/LinearFlatLayout" android:layout_gravity="center_horizontal" android:layout_width="fill_parent">
                <TextView android:layout_height="wrap_content" android:text="@string/send_to" android:layout_width="wrap_content"/>
                <EditText android:layout_height="wrap_content" android:id="@+id/EditTextToURI" android:layout_width="fill_parent"/>
        </LinearLayout>
        <LinearLayout android:layout_height="wrap_content" android:id="@+id/LinearFlatLayout" android:layout_gravity="center_horizontal" android:layout_width="fill_parent">
                <TextView android:layout_height="wrap_content" android:layout_width="wrap_content" android:text="@string/mood"/>
                <EditText android:layout_height="wrap_content" android:layout_width="fill_parent" android:id="@+id/MoodMessage"/>
        </LinearLayout>
        <LinearLayout android:layout_height="wrap_content" android:id="@+id/LinearFlatLayout" android:layout_width="fill_parent"         android:layout_gravity="center_horizontal" android:gravity="right">
                <Button android:layout_height="wrap_content" android:layout_width="wrap_content" android:layout_gravity="right" android:id="@+id/MapButton" android:text="@string/map"/>
                <Button android:layout_height="wrap_content" android:layout_width="wrap_content" android:layout_gravity="right" android:id="@+id/PublishButton" android:text="@string/publish"/>
                <Button android:layout_height="wrap_content" android:layout_width="wrap_content" android:layout_gravity="right" android:id="@+id/SubscribeButton" android:text="@string/subscribe"/>
        </LinearLayout>
</LinearLayout>
No strange new things here; it is pretty similar to Sample 1. The novelty comes with the Maps Activity, which needs to be a separate, independent Activity. The reason is that it not only shows the map but also presents overlays and keeps a map cache. For that purpose it does its own memory management and threading, so it cannot be a simple View component but has to be an Activity of its own.
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    >
        <com.google.android.maps.MapView
                android:id="@+id/MapView"
                android:apiKey="0tr_XJhNmMlXLfgxAyl61t3RhI4QMkmlVZ2xcbA"
                android:layout_width="fill_parent"
                android:layout_height="fill_parent"
                android:clickable="true" />
</LinearLayout>
The most important thing to note is the apiKey attribute. It needs to be updated with one generated for the local machine that is going to compile the sample. If it is not modified, the map will turn out black. Here are the instructions to follow on how to generate an API key.
Here is the description:
[url=http://code.google.com/android/add-ons/google-apis/mapkey.html]http://code.google.com/android/add-ons/google-apis/mapkey.html[/url]
and the actual page where you generate your own key:
[url=http://code.google.com/android/add-ons/google-apis/maps-api-signup.html]http://code.google.com/android/add-ons/google-apis/maps-api-signup.html[/url]
In the Eclipse preferences, check the menu Android -> Build for the field debug.keystore. On OS X the keystore is located in the .android directory in your home directory. This is the seed for your key generation.
Paste the generated key into the map.xml described above.
Now that the XML layout is covered, let's look at how it hooks into the Java code.
public class ImsPresence extends ImsActivity {
        // Handle to the presence service
        PresenceService presenceService;
        // Object representing the logged in user
        Presentity self;

        // Button to directly display the map view
        Button mapButton;
        // Button to publish presence mood
        Button publishButton;
        // Button to subscribe for presence state notifications
        Button subscribeButton;
        // Text widget to set the text to send in the IM
        EditText moodMessage;
        // SIP URI to send the IM to
        EditText textToURI;

        //Buddy list name used for storing contacts in XDM
        final static String BUDDY_LIST = "myBuddies";

        @Override
        public void onCreate(Bundle savedInstanceState) {
                // Call super to initialize the network and IMS connection
                super.onCreate(savedInstanceState);
                setContentView(R.layout.main);

                // Hook up the three buttons generated by main.xml
                // Map button to display the map Activity
                mapButton = (Button) findViewById(R.id.MapButton);
                mapButton.setOnClickListener(sendListener);
                //Publish button to send mood and location
                publishButton = (Button) findViewById(R.id.PublishButton);
                publishButton.setOnClickListener(sendListener);
                //Subscribe for notifications of changed presence state
                subscribeButton = (Button) findViewById(R.id.SubscribeButton);
                subscribeButton.setOnClickListener(sendListener);

                // Hook up the two EditText fields generated by main.xml
                moodMessage = (EditText) findViewById(R.id.MoodMessage);
                // The SIP URI to get notifications from.
                textToURI = (EditText) findViewById(R.id.EditTextToURI);
                textToURI.setText("sip:android@imsinnovation.com");
        }

Some of the graphical components that the application will interact with are wired up in the onCreate method. There are also 3 presence-related member fields. The PresenceService is the main entry point to the imsinnovation framework. The Presentity is a reference for setting the logged-in user's presence state. The buddy list constant string is used to name an XCAP list that we will use to store contacts. There is always a default list, but the end user can create new ones, for example one list for buddies, one for co-workers, and one for family.
The onLogin() method looks identical to the one in Sample 1 and sets the credentials.
Most of the initialization of the presence framework happens in the onConnected() method. It is also here that the ServiceListener is implemented and set.
presenceService.getWatcherList().setAllowAll(true);
is called to allow anyone to subscribe to the logged-in user's presence. In normal operation a person would prefer to know who is getting the presence information. In that case it is possible to allow each user individually and get notifications when new users are interested in your presence changes. Here we have short-circuited that functionality for the sake of simplicity.
private  OnClickListener sendListener = new OnClickListener() {
        public void onClick(View v) {
                if (v.getId() == R.id.MapButton) { //Bring up the map
                        Intent i = new Intent(ImsPresence.this, PresenceMapActivity.class);
                        startActivity(i);
                } else if (presenceService == null) { // Not connected yet!
                        String o = "The IMS framework is not initialized. - No Message Service";
                        Toast.makeText(ImsPresence.this, o, Toast.LENGTH_LONG).show();
                } else if (v.getId() == R.id.PublishButton) {
                        publish();
                } else if (v.getId() == R.id.SubscribeButton) {
                        try {
                                subscribeForPresence();
                        } catch (Exception e) {
                                Toast.makeText(ImsPresence.this,
                                        "Failed to Subscribe for presence.",
                                        Toast.LENGTH_LONG).show();
                                Log.e("Sample", "", e);
                        }
                }
        }
};
This is the callback handler for the various buttons. When the Map button is pushed, a new Intent is created. The way Android navigates between screens is by sending Intents. Either you send an Intent specifying what content you are sending, and the Android platform chooses the correct Activity to start (or presents a list if multiple match), or you statically name the Activity class that should be invoked, as this sample does.
The other two buttons are for publishing your own presence, including your GPS coordinates, and for specifying a SIP URI to subscribe to for presence changes. The easiest way of testing the application is to first publish your own presence and then subscribe to your own presence state. The value submitted in the mood field should come back as the result of a successful subscription.
At the same time, if a GPS position was retrieved, the map activity should be brought up on screen.
Now let's look at the publish code:
private void publish() {
        //Store the mood in the pres doc note field
        self.setNote(moodMessage.getText().toString());
        //Set willingness to communicate
        self.setOverridingWillingness(Presentity.WILLINGNESS_POSITIVE);

        //Get the current location
        LocationManager lm = (LocationManager) getSystemService(LOCATION_SERVICE);
        String best = lm.getBestProvider(new Criteria(), true);
        Location l = lm.getLastKnownLocation(best);
        //Format location to be in epsg format
        String phoneLocation = ImsUtil.getFormatedLocation(l,true);
        GeographicalLocation gl = new GeographicalLocation();
        gl.setPoint("Me", "epsg:4326", phoneLocation);
        //Add to presence document object
        self.setGeographicalLocation(gl);
        try {
                self.publish(); //Sends the publish message
        } catch (ServiceClosedException e) {
                Toast.makeText(ImsPresence.this, "Presence service closed.",
                        Toast.LENGTH_LONG).show();
        }
}

The method collects the mood string from the EditText box, then gets a reference to the LocationManager singleton in the Android platform. Note that it asks for the best possible provider, which will trigger a GPS fix attempt if there is no recent position cached in the system. This is fairly battery intensive, so ask yourself whether cell triangulation would be good enough instead.
The location is then converted to a format compatible with our Java ME tutorial. The last step is to publish the information to the presence server.
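If cell-tower accuracy is good enough for your use case, the Criteria object passed to getBestProvider can request a coarse, low-power provider, which typically selects the network (cell/Wi-Fi) provider instead of GPS. A sketch of this battery-friendlier variant:

```java
//Ask for a coarse, low-power provider instead of the best one
Criteria criteria = new Criteria();
criteria.setAccuracy(Criteria.ACCURACY_COARSE); //cell/Wi-Fi is enough
criteria.setPowerRequirement(Criteria.POWER_LOW);

LocationManager lm = (LocationManager) getSystemService(LOCATION_SERVICE);
String provider = lm.getBestProvider(criteria, true); //enabled providers only
Location l = (provider != null) ? lm.getLastKnownLocation(provider) : null;
```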
Now let's look at the subscribe side of the sample:
private void subscribeForPresence() throws IllegalArgumentException,
                        ServiceClosedException, IllegalListStateException {
        //Recreates the list to be sure of clean slate
        presenceService.removePeerPresentityList(BUDDY_LIST);
        final PeerPresentityList buddies = presenceService.createPeerPresentityList(BUDDY_LIST);

        //Set listener for notification callback's
        buddies.setPresenceListener(new PresenceListener() {
                public void presentityInvalid(String presentity) {
                        printWithToast(ImsPresence.this, "Invalid presence SIP URI: " + presentity);
                }

                public void presentityUpdated(PeerPresentity peer,
                                boolean isActive) {
                        printWithToast(ImsPresence.this, "Update from : "
                                + peer.getPeerUri() + " note : " + peer.getNote());
                        GeographicalLocation gl = peer.getGeographicalLocation();
                        //If presence document contains geo-location show the map
                        if (gl != null && gl.getCoordinate() != null) {
                                Intent i = new Intent(ImsPresence.this, PresenceMapActivity.class);
                                i.putExtra(PresenceMapActivity.EXTRA_URI, peer.getPeerUri());
                                i.putExtra(PresenceMapActivity.EXTRA_NOTE, peer.getNote());
                                i.putExtra(PresenceMapActivity.EXTRA_LOCATION, gl.getCoordinate());
                                startActivity(i);
                        }
                }

        });
        //Need to implement this listener since buddies can only be added
        //to a presence list when it is in a stable state.
        buddies.setStateListener(new AsynchronousListener() {
                boolean added = false;
                public void stateChanged(Asynchronous source, int newState) {
                        if (!added && newState == Asynchronous.STABLE) {
                                try {
                                        //Add the SIP URI from the GUI text field
                                        buddies.addPresentity(textToURI.getText().toString());
                                        added = true;

                                } catch (Exception e) {
                                        e.printStackTrace();
                                        printWithToast(ImsPresence.this, "Adding presentity failed.");
                                }
                        }
                }

        });
        //Subscribes for changes of any buddy in the list
        //NOTE: since the list is re-created every time there will be at most
        //one entry in the list, but nothing prevents a list from holding
        //multiple buddies. This is done for simplicity and robustness.

        buddies.subscribe();
}
First of all, the code removes any previous list just to make it easier to play around with. The list is then re-created, so there is only one subscription at a time in this sample.
The next step is to set a callback listener that is triggered when a SIP NOTIFY message containing the presence document is received. When there is location information in the document, the position is added to the map Intent as extra information so that the map can draw it as an overlay on top of the Google map view. The last part adds the buddy to the contact XCAP list. The list must be in a stable state (synchronized with the server side) before an entry can be added.
The last piece of code is from the onCreate method of the Map Activity.
String uri = getIntent().getStringExtra(EXTRA_URI);
String note = getIntent().getStringExtra(EXTRA_NOTE);
String loc = getIntent().getStringExtra(EXTRA_LOCATION);

//Overlay on the map to center around the current phone position
final  MyLocationOverlay overlay = new MyLocationOverlay(this, map);
//overlay.enableCompass(); //Does not work due to bug in 1.5 sensors
overlay.enableMyLocation();
overlay.runOnFirstFix(new Runnable() {
        public  void run() {  
                mc.setZoom(11); //Good level to start with
                //Center the map around current location
                mc.animateTo(overlay.getMyLocation());
        }
});

map.getOverlays().add(overlay); //Add the current blue dot

if ( uri != null && loc != null ) {
        //Reformat the location to Google maps format
        Location l = ImsUtil.parseFormatedLocation(loc);
        Double geoLat = l.getLatitude()*1E6;
        Double geoLng = l.getLongitude()*1E6;
        final GeoPoint gp = new GeoPoint(geoLat.intValue(), geoLng.intValue());
        Log.d("Sample", "gp = "+gp);

        //Get the GTalk icon for online
        Drawable icon = getResources().getDrawable( android.R.drawable.presence_online );
        //Add a specific overlay printing presence information
        MyItemizedOverlay<OverlayItem> overlay2 = new MyItemizedOverlay<OverlayItem>(icon);
        overlay2.addItem(new OverlayItem(gp, uri, note));
        map.getOverlays().add(overlay2); //add it to the map
}
Here the Intent extras are collected. If the map button was clicked they are absent, but if the map was triggered by a presence notification they will be set, as we saw above. We add some nice zooming capabilities to the map and center it on the current location. If there is location information, it is formatted to the Google Maps format and handed to a customized Overlay class that prints it on top of the map.
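The conversion to the Google Maps format boils down to expressing latitude and longitude in microdegrees (degrees multiplied by 1E6, truncated to an int), since GeoPoint takes integer microdegrees. The arithmetic in isolation, as a plain-Java sketch:

```java
public class MicroDegrees {
    //Convert decimal degrees to the integer microdegrees GeoPoint expects;
    //the cast truncates, matching the Double.intValue() call in the sample
    static int toMicroDegrees(double degrees) {
        return (int) (degrees * 1E6);
    }

    public static void main(String[] args) {
        System.out.println(toMicroDegrees(37.5));    // 37500000
        System.out.println(toMicroDegrees(-122.25)); // -122250000
    }
}
```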
Note that the icon we use for plotting the user is reused from the OS presence-online icon. Reusing OS assets like this makes it easier for users to recognize themselves.
Most of the drawing action happens in MyItemizedOverlay, and it is implemented in a generic way. Since we only allow one buddy in the list, there will be at most one item in the overlay.
@Override
public  void draw(Canvas canvas, MapView mapView, boolean shadow) {
        super.draw(canvas, mapView, shadow); //Draw the generic things
        Projection projection = mapView.getProjection();
        Paint paint = new Paint();
        paint.setTextSize(14);
        Log.d("Size", "=" + paint.getTextSize());
        for (OverlayItem oi : items) {
                //Get the Point x,y of the green dot
                Point p = projection.toPixels(oi.getPoint(), null);
                //Calculate offset not to print on marker
                int xoff = marker.getIntrinsicWidth() / 2;
                //Draw text user part of SIP URI and mood.
                canvas.drawText(getUriUser(oi.getTitle()),p.x+xoff,p.y-7, paint);
                canvas.drawText(oi.getSnippet(), p.x + xoff, p.y + 7, paint);
        }
}
The marker itself is painted by the superclass ItemizedOverlay. But to show who published what, the user part of the SIP URI is painted together with the mood note to the right of the green dot.
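The getUriUser helper is not shown in the listing; a plausible implementation (an assumption for illustration, not necessarily the sample's actual code) simply strips the scheme and the host part from the SIP URI:

```java
public class SipUriUtil {
    //Extract the user part of a SIP URI,
    //e.g. "sip:alice@imsinnovation.com" -> "alice"
    static String getUriUser(String uri) {
        String user = uri;
        int colon = user.indexOf(':');
        if (colon >= 0) {
            user = user.substring(colon + 1); //drop "sip:"/"sips:" scheme
        }
        int at = user.indexOf('@');
        if (at >= 0) {
            user = user.substring(0, at); //drop "@host" part
        }
        return user;
    }

    public static void main(String[] args) {
        System.out.println(getUriUser("sip:alice@imsinnovation.com")); // alice
        System.out.println(getUriUser("bob@example.com"));             // bob
    }
}
```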

Last but not least, let's look at how all this is configured in the manifest XML file.
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
        package="com.ericsson.labs.sample.androidpresence" android:versionCode="1"
        android:versionName="1.0.0">
        <application android:icon="@drawable/icon" android:label="@string/app_name">
                <uses-library android:name="com.google.android.maps"/>
                <activity android:name=".ImsPresence" android:label="@string/app_name">
                        <intent-filter>
                                <action android:name="android.intent.action.MAIN"/>
                                <category android:name="android.intent.category.LAUNCHER"/>
                        </intent-filter>
                </activity>
                <activity android:name=".PresenceMapActivity">
                </activity>
        </application>
        <uses-permission android:name="android.permission.INTERNET"/>
        <uses-permission android:name="android.permission.ACCESS_WIFI_STATE"/>
        <uses-permission android:name="android.permission.WAKE_LOCK"/>
        <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
        <uses-sdk android:minSdkVersion="3"></uses-sdk>
</manifest>
The difference from the first sample is pretty much the last two lines: extra permissions are needed to access GPS information, plus a statement that this application needs SDK version 3, which includes the Google Maps libraries. In addition, a uses-library element is set at the application level. The rest is identical to the first sample, except that we now have to declare the second activity, our map display. All Activities that are displayed need to be declared in the manifest.
So this concludes the second sample; the complete code, with a binary build, can be downloaded for further experimentation with the framework.
When executing this sample on the Android emulator, one fact quickly becomes obvious: the emulator does not have a built-in GPS, so where do we get the location from?
Luckily there is a way to provide it to the emulator.
The default emulator has a listener socket where the developer can connect with a telnet client. By default the port is 5554 (5556 on a second emulator), but it can also be set with flags when starting the emulator.
telnet localhost 5554
Then enter:
geo fix -122.08 37.41
Now, pushing the map button in the application, there should be a blue dot showing the current position. This coordinate is not a great one, since it puts you on Hwy 101 in 10 lanes of congestion, but feel free to find your own favorite coordinates to test with.
The real G1 or G2 phones will give you a fair position, so this is only for emulator testing.
Here is the source, eclipse project and latest jar files.
There is also a PDF version of the blog post available.
Develop an IMS SIP client on Android. Part 3
This is a series of 3 sample posts on how to develop for Android using ImsInnovation.com as the IMS SIP provider. This sample demonstrates the file transfer part of the API. As the backend in the imsinnovation.com domain, an off-the-shelf Ericsson IMS core product with a Session Border Gateway facilitates the resources.
Preparation
Once again, a lot has happened since part 2 was posted!
Google has released Android 2.0 (Eclair) with a lot of nice new features, and the Android framework is diverging. Even if vendors claim the changes are purely cosmetic UI improvements, that is not entirely true. In this sample we are going to transfer pictures and video, since they are the most interesting file content on an Android phone. The sample has been tested on the HTC developer phones and on the HTC Hero; the HTC Sense framework does not behave the same way as the stock ADP2 HTC developer phone.
This sample was written for Android 1.5 but has also been verified on 1.6 and a 2.1 emulator.
You should have the SDK installed, and the Eclipse ADT plugin is recommended.
Please look at Sample 1 and Sample 2 for more details on setup.
This means that you have to download a new SDK and a new version of the Eclipse plugin.
Eclipse should detect and download the latest version of the ADT plugin if you followed the instructions in Sample 1. Make sure to change the location of the ADT to point to the Android 1.5 SDK library.
There are also a couple of changes in our Android API.
The ImsInnovation class has some new methods for debugging. Previous versions produced a lot of output, but writing a more product-quality application requires the ability to disable logging.
So if logging should be enabled, there are a couple of steps that need to be taken.
First of all we need to set in our code:
ImsActivity.setDebug(true);
Then it is good practice to log the majority of things at the Log.d debug level.
The imsInnovation.jar framework has a couple of logging tags of its own, using labels such as ImsInnovation and presence.
Android logging defaults to the INFO level.
Here is an example of how to enable the DEBUG level for all the ImsInnovation logging.
adb shell
setprop log.tag.ImsInnovation DEBUG
Note that this sample uses the tag "Sample"!
A strange thing about the Android logging framework is that something logged with Log.d will turn up in the log even when the default level is INFO. This is not consistent with how normal logging frameworks like java.util.logging and log4j work.
But if Log.isLoggable("Sample", DEBUG) is called, it would return false.
So in this case we need to set the property for that tag too.
setprop log.tag.Sample DEBUG
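With the property set, debug statements can be guarded so the message string is only built when the tag is actually enabled. A sketch of the pattern (the logTransfer method is a hypothetical example):

```java
private static final String TAG = "Sample";

void logTransfer(int bytesSent, int total) {
        //Only build the message when DEBUG is enabled for this tag,
        //i.e. after: adb shell setprop log.tag.Sample DEBUG
        if (Log.isLoggable(TAG, Log.DEBUG)) {
                Log.d(TAG, "Sent " + bytesSent + " of " + total + " bytes");
        }
}
```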
Since this sample is all about transferring files we will need two phones or two emulator instances.
Another important thing is that for this sample we need an SD card in the phone, or an SD card image in the emulator, because Android stores pictures and video on the SD card.
Here is how the configuration of a new Android Virtual Device looks in Eclipse.

Illustration 1: Eclipse ADT, Android AVD manager
Note that I have used an 8MB size for the virtual SD card image. Even though Android claims that smaller images can be used, at least some versions of the emulator on OS X are unable to mount a smaller SD card image. This may not be true for other OS configurations, but 8MB has been a working size for me.
Sample3
Now that the emulator is set up, let's look at how the file transfer application is implemented.
The entire code for this sample can be found here.
First we are going to look at the UI, in order to get a better understanding of what the application actually does, including a helper UI for displaying a received image or video. Then we will look at the main activity, ImsFiletransfer, and see how the UI connects to the ImsInnovation framework. One of the key success factors of an Android application is smooth OS integration, and for this purpose we will look at how our application can integrate with the Android content sharing framework. Finally, we will conclude by implementing the FiletransferEventListener, updating the progress bars, and handling a received file in such a way that it is imported into the Android native media library.
UI
Let's start with the main GUI page to get a better understanding of the use case.

Illustration 2: Sample 3 main screen, successful connection to framework
The Send To field specifies the SIP URI of the person to send a file to. The Title field contains a custom text label. Then there are two progress bars, one for sending and one for receiving a file.
Here is how the XML for this GUI looks:
<?xml version="1.0" encoding="utf-8"?>

<RelativeLayout android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        xmlns:android="http://schemas.android.com/apk/res/android"
        android:id="@+id/MainLayout">

        <TextView android:text="@string/send_to"
                android:layout_width="wrap_content"
                android:id="@+id/TextSendTo"
                android:gravity="center_vertical"
                android:layout_height="wrap_content"
                android:layout_alignBaseline="@+id/EditTextToURI"/>
        <EditText android:layout_height="wrap_content"
                android:id="@+id/EditTextToURI"
                android:layout_width="fill_parent"
                android:layout_toRightOf="@+id/TextSendTo"/>

        <TextView android:layout_height="wrap_content"
                android:layout_width="wrap_content"
                android:id="@+id/TextTitle"
                android:text="@string/message"
                android:layout_below="@+id/EditTextToURI"
                android:layout_alignBaseline="@+id/EditMessage"/>
        <EditText android:layout_height="wrap_content"
                android:layout_width="fill_parent"
                android:id="@+id/EditMessage"
                android:layout_toRightOf="@+id/TextTitle"
                android:layout_below="@+id/EditTextToURI"/>

        <Button android:text="@string/send"
                android:layout_height="wrap_content"
                android:layout_width="wrap_content"
                android:id="@+id/SendButton"
                android:layout_below="@+id/EditMessage"
                android:layout_alignParentRight="true"/>

        <TextView android:layout_height="wrap_content"
                android:layout_width="wrap_content"
                android:id="@+id/TextSendbar"
                android:layout_below="@+id/SendButton"
                android:text="@string/sent"/>
        <ProgressBar android:layout_height="wrap_content"
                style="?android:attr/progressBarStyleHorizontal"
                android:layout_width="fill_parent"
                android:layout_below="@+id/TextSendbar"
                android:id="@+id/ProgressBarSent"/>

        <TextView android:layout_height="wrap_content"
                android:layout_width="wrap_content"
                android:id="@+id/TextReceivebar"
                android:layout_below="@+id/ProgressBarSent"
                android:text="@string/received"/>
        <ProgressBar android:layout_height="wrap_content"
                style="?android:attr/progressBarStyleHorizontal"
                android:layout_width="fill_parent"
                android:layout_below="@+id/TextReceivebar"
                android:id="@+id/ProgressBarReceived"/>
</RelativeLayout>
Note that this time we chose to use a RelativeLayout instead of the previous LinearLayout.
The reason is to demonstrate the preferred way of designing a UI. A single, un-nested LinearLayout is fine, but more complex GUIs should use more capable layouts rather than nested LinearLayouts, in order not to waste memory on the device. The biggest difference is that each object needs to specify how it should be drawn in relation to another object. The layout starts by default in the top left corner of the window unless other properties are specified, but it is entirely possible to start at the bottom or center and draw the layout from there. This layout can be found in the resources layout directory and is called main.xml.
There is also another layout that we use for displaying the received picture or video.
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout
  xmlns:android="http://schemas.android.com/apk/res/android"
  android:layout_width="fill_parent"
  android:layout_height="fill_parent">

        <VideoView android:id="@+id/VideoViewShared"
                android:layout_width="fill_parent"
                android:layout_height="fill_parent"
                android:visibility="gone"/>
        <ImageView android:id="@+id/ImageViewShared"
                android:layout_height="fill_parent"
                android:layout_width="fill_parent"
                android:visibility="gone"/>
</LinearLayout>
This layout is built with a LinearLayout since it is very simple. It has an ImageView and a VideoView.
Both are listed as "gone" by default, meaning that they are neither drawn nor measured by the layout. They also cannot be displayed at the same time, since each wants to fill the parent, but we will only enable one at a time depending on whether we received a picture or a video.
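A sketch of how the viewer activity might enable just one of the two "gone" views based on the received content type. The contentType and fileUri variables are assumptions for illustration:

```java
//Show exactly one of the two "gone" views, depending on content type
VideoView videoView = (VideoView) findViewById(R.id.VideoViewShared);
ImageView imageView = (ImageView) findViewById(R.id.ImageViewShared);

if (contentType.startsWith("video/")) {
        videoView.setVisibility(View.VISIBLE);
        videoView.setVideoURI(fileUri); //fileUri: where the file was stored
        videoView.start();
} else { //image/*
        imageView.setVisibility(View.VISIBLE);
        imageView.setImageURI(fileUri);
}
```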
Now that the XML layout is covered, let's look at how it hooks into the Java code.
Main Activity – Integrating to ImsInnovation framework
public class ImsFiletransfer extends ImsActivity {
        // Message service for implementing a file transfer service
        // using SIP INVITE message
        MessageService messageService;
        // Button to send a file
        Button sendButton;
        // Text widget to set the text label of the file
        EditText textTitle;
        // SIP URI to send the file to
        EditText textToURI;

        ProgressBar transferSent;
        ProgressBar transferReceived;

        byte[] bytes;

        @Override
        public void onCreate(Bundle savedInstanceState) {
                // Call super to initialize the network and IMS connection
                super.onCreate(savedInstanceState);
                setContentView(R.layout.main);

                // Hook up the send button generated by main.xml
                sendButton = (Button) findViewById(R.id.SendButton);
                sendButton.setOnClickListener(sendListener);

                // Hook up the two EditText fields generated by main.xml
                textTitle = (EditText) findViewById(R.id.EditMessage);
                textToURI = (EditText) findViewById(R.id.EditTextToURI);
                textToURI.setText("dream@imsinnovation.com");

                transferSent = (ProgressBar)
                        findViewById(R.id.ProgressBarSent);
                transferReceived = (ProgressBar)
                        findViewById(R.id.ProgressBarReceived);

                setDebug(true);
        }
The code looks very similar to the previous samples. The main difference is that we are now using ProgressBar components. The other notable difference is that we set the debug flag to true.
The onLogin() method is identical to samples 1 and 2 and sets the credentials.
Most of the initialization of the file transfer framework happens in the onConnected() method. It is also here that the ServiceListener is implemented and set.
// Implement the ServiceListener
messageService = new MessageService(innovation);
messageService.setMessageServiceListener(new MessageServiceListener() {

//Receives a FT request for sharing content
public void interactionRequest(PeerConnection peer,
                String initialContentType) {
        peer.setEventListener(new FiletransferEventListener(
                        initialContentType));
        String title = peer.getSession().getSessionDescriptor().
                        getSessionInfo();
        printWithToast(ImsFiletransfer.this, "Receiving "+
                        initialContentType.substring(0,
                                        initialContentType.indexOf('/'))+
                        " : "+title);
        peer.acceptContent(initialContentType, true);
}
The incoming SIP INVITE message will contain a session description. SIP establishes a session between the two endpoints, and the session we are trying to establish is described using the Session Description Protocol (SDP). The file transfer itself uses another Internet standard protocol called the Message Session Relay Protocol (MSRP). Both SIP and MSRP are very similar to HTTP. One difference between SIP and HTTP is that SIP is asynchronous while HTTP is synchronous; another is that every SIP client also needs to act as a SIP server.
Here we are looking at the receiving end. We are acting as the server and receive an INVITE to communicate. The SDP body of the SIP message describes what IP and port we should establish the connection on. By calling peer.acceptContent, the framework sends a SIP 200 OK message, just as it would in HTTP; the only difference is that the 200 OK also contains an SDP body, completing the negotiation.
This establishes an MSRP connection, but understanding the details is not really necessary.
Just for reference, MSRP looks quite similar to SIP and HTTP: it consists of a request line, a set of headers, and a body transferring the actual byte content of the file. The reason for using a specific protocol is performance. MSRP headers are identical in syntax, with the difference that they must appear in a particular order, which makes implementations more efficient and robust. MSRP also makes it possible to pause and resume a file transfer, and to detect missing content that might have resulted from a software crash.
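For illustration, an MSRP SEND request carrying one chunk of a file looks roughly like this. All values here are made up, and this is only an approximation of the wire format; see RFC 4975 for the exact syntax:

```
MSRP d93kswow SEND
To-Path: msrp://receiver.example.com:9000/kjhd37s2;tcp
From-Path: msrp://sender.example.com:7394/2s93i9ek;tcp
Message-ID: 12339sdqwer
Byte-Range: 1-2048/153402
Content-Type: image/jpeg

...2048 bytes of file content...
-------d93kswow+
```

The trailing continuation flag (`+` here) tells the receiver that more chunks of the same message are coming; the final chunk ends with `$` instead.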
Your code can also choose to reject content depending on the originator, the size, the content type, or any other criterion that your service logic might want to enforce.
The other special line of code transfers a custom label. The SDP request body is used to transport the custom string; in this sample, the session info parameter carries it.
peer.getSession().getSessionDescriptor().getSessionInfo();
The last important piece of code is the FiletransferEventListener.
It is called when content starts arriving. The same listener is also used on the transmitting side, and we will cover it in detail.
Android Content Sharing framework integration
Let's look at the transmitting side. It starts with selecting a file for transfer. For this purpose we use the native Android support for content sharing. The magic happens in the AndroidManifest.xml:
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
        package="com.ericsson.labs.sample.androidfiletranfer" android:versionCode="1"
        android:versionName="1.0.0">
        <application android:icon="@drawable/icon" android:label="@string/app_name">
                <activity android:name=".ImsFiletransfer" android:label="@string/app_name"
                        android:screenOrientation="portrait">
                        <intent-filter>
                                <action android:name="android.intent.action.SEND" />
                                <category android:name="android.intent.category.DEFAULT" />
                                <data android:mimeType="image/*" />
                        </intent-filter>
                        <intent-filter>
                                <action android:name="android.intent.action.SEND" />
                                <category android:name="android.intent.category.DEFAULT" />
                                <data android:mimeType="video/*" />
                        </intent-filter>
                        <intent-filter>
                                <action android:name="android.intent.action.MAIN" />
                                <category android:name="android.intent.category.LAUNCHER" />
                        </intent-filter>
                </activity>
                <activity android:name=".ViewActivity"/>
        </application>
        <uses-permission android:name="android.permission.INTERNET"/>
        <uses-permission android:name="android.permission.ACCESS_WIFI_STATE"/>
        <uses-permission android:name="android.permission.WAKE_LOCK"/>
        <uses-sdk android:minSdkVersion="3"/>
</manifest>
Most of the file is straightforward, but the magic happens in the Activity declaration for ImsFiletransfer. There are two additional intent filters; both have the action "android.intent.action.SEND" and the DEFAULT category. The key is the MIME type: the first filter specifies the content type image/* while the second targets video/*.
So whenever an application wishes to share content with another application, the Android OS resolves which other applications are capable of handling such content. We have just told the Android OS that we can handle video and images of any kind.
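The wildcard matching that the OS performs against these filters can be mimicked in plain Java; this hypothetical helper (not part of the Android API) shows the idea of matching a concrete type like image/jpeg against a filter pattern like image/*:

```java
public class MimeMatch {
    //Return true when a concrete MIME type matches a filter such as
    //"image/*", "video/*", or an exact type like "image/png"
    static boolean matches(String filter, String type) {
        if (filter.endsWith("/*")) {
            String main = filter.substring(0, filter.length() - 1); //"image/"
            return type.startsWith(main);
        }
        return filter.equals(type);
    }

    public static void main(String[] args) {
        System.out.println(matches("image/*", "image/jpeg")); // true
        System.out.println(matches("video/*", "image/jpeg")); // false
    }
}
```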
One good way of generating an action that invokes our sample is to use the Android Camera, Camcorder, or Gallery applications. Here is how it looks.

Illustration 3: Android Camera app on 1.6
This is how the Camera application looks on an Android 1.6 device. When the share option is selected, the Android Package Manager examines all the installed applications for a suitable match.
The matches are presented in a list to the end user.

Illustration 4: Share options menu rendered by the Android OS
In this case we want to select the IMS FileTransfer application to trigger our sample code.
What happens next is that our sample is started just as if someone had clicked on its icon, but with one major difference: the Intent contains extra information about the content we want to share. The Android OS saves pictures and videos on the SD card and uses a content provider URI to determine the location of the content the end user wants to share. With this in mind, the sample can have been started in two ways: from the icon (receiving side) or from the share Intent (sending side).
When the main GUI is displayed, the sender can enter the destination address and the title we want the receiver to see when deciding whether it is willing to receive the content.
Next the sender hits the Send button to send the initial SIP INVITE, starting the transfer.
private OnClickListener sendListener = new OnClickListener() {
        public void onClick(View v) {
        .
        .
        .
                        try {
                                String ct = getIntent().getType();

                                PeerConnection peer = messageService
                                        .getPeerConnection("sip:"+textToURI.getText()
                                        .toString());
                                peer.setEventListener(new FiletransferEventListener(ct));

                                //Fetches the URI selected by another Android app.
                                //Best example is the Android camera or camcorder.
                                Uri uri = getIntent().getParcelableExtra(
                                        "android.intent.extra.STREAM");


                                InputStream is = getContentResolver().
                                        openInputStream(uri);
                                ByteArrayOutputStream baos = new ByteArrayOutputStream(
                                        is.available());
                                byte[] b = new byte[4096];
                                int len;
                                //Read until EOF; available() can return 0
                                //before the stream is exhausted
                                while ((len = is.read(b)) != -1) {
                                        baos.write(b, 0, len);
                                }
                                is.close();
                                bytes = baos.toByteArray();
                                baos.close();
                                transferSent.setMax(bytes.length);
                                transferSent.setProgress(0);
                                transferReceived.setProgress(0); //reset        

                                peer.getSession().getSessionDescriptor().
                                        setSessionInfo(textTitle.getText().toString());
                                peer.startSharingContent(ct);
                        } catch (IOException e) {
                                Log.e("Sample", "", e);
                        }
                }
        }
};
First we look at the Intent that got us started, specifically at its type. Since we specified image or video, it should be one of the two. On the next line we initialize the message service to address the person we want to send the file to: we ask the message service for a new peer connection containing the SIP identity of the receiver, taken from the EditTextToURI widget.
Filetransfer Event Listener
Next we create a FiletransferEventListener that will call us back when the receiver has accepted and is ready to start receiving content. The following line fetches the content URI from the Intent that triggered the sender application.
After this, a couple of lines fetch the actual content and convert it to a Java byte array. For this purpose we use the ContentResolver core Android class to get hold of a standard Java stream.
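As a plain-Java sketch of this conversion (the `StreamUtil` class and method names are ours, not part of the sample): the sample loops on `is.available() > 0`, which works for local content-resolver streams, but `available()` returning 0 does not in general mean end-of-stream. Looping until `read()` returns -1 is the generally safe pattern.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical helper, not part of the sample API: reads a stream fully
// into a byte array using the end-of-stream sentinel instead of available().
public class StreamUtil {
    public static byte[] toBytes(InputStream is) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        byte[] b = new byte[4096];
        int len;
        while ((len = is.read(b)) != -1) { // -1 signals end of stream
            baos.write(b, 0, len);
        }
        is.close();
        return baos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[10000];
        byte[] copy = toBytes(new ByteArrayInputStream(data));
        System.out.println(copy.length); // 10000
    }
}
```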
We initialize the progress bars to zero. We store the title string from the other EditText widget so that it is transmitted in the body of the Invite (SDP).
The last line sends the actual SIP invite to the receiving side advertising the content type.
peer.startSharingContent(ct);
Now that we have seen the sending and receiving side initialization it is time to look at the FiletransferEventListener implementation.
class FiletransferEventListener implements EventListener {
                int total = 0;
                String contentType = "image/jpeg";

                public FiletransferEventListener( String contentType ) {
                        this.contentType = contentType;
                }

                public void contentAccepted(PeerConnection connection,
                                String contentType) {
                        try {
                                if (bytes != null) { //On the sender side
                                        Log.d("Sample", "Sending bytes = "+bytes.length);
                                        connection.sendContent(contentType, bytes);
                                } else { //receiver side.
                                        transferReceived.setProgress(0); //reset
                                }
                        } catch (Exception e) {
                                Log.e("Sample", "", e);
                        }
                }

                public void contentRejected(PeerConnection connection,
                                String contentType) {
                        Log.d("Sample", "Content Rejected!");
                }

                public void contentSharingStopped(PeerConnection connection,
                                String contentType) {
                        Log.d("Sample", "Content Sharing Stopped!");
                }

                public void newContentType(PeerConnection connection,
                                String contentType) {
                        connection.acceptContent(contentType, true);
                }

                public void receiveComplete(PeerConnection connection, String id,
                                byte[] data) {
                        Log.d("Sample", "Receive Complete!");
                        Log.d("Sample", "Got data len = " + data.length+
                                " type = "+contentType);
                        connection.stopSharingContent(contentType);
                        printWithToast(ImsFiletransfer.this,"Receive Complete!");

                        String fileType = ".tmp";
                        if("image/jpeg".equals(contentType ) ) fileType = ".jpg";
                        else if("video/3gpp".equals(contentType ) ) fileType = ".3gp";
                        else if("video/mp4".equals(contentType ) ) fileType = ".mp4";
                        else Log.d("Sample", "Add a mapping for content type = "
                                +contentType);

                        File dir = new File("/sdcard/DCIM/Camera");
                        if( !dir.exists() ) dir.mkdirs();
                        final File media = new File(dir,System.currentTimeMillis()+
                                fileType);
                        try {
                                FileOutputStream fos = new FileOutputStream(media);
                                fos.write(data);
                                fos.flush();
                                fos.close();

                                MediaScannerConnectionClient mscc = new
                                        MediaScannerConnectionClient() {

                                        public void onScanCompleted(String path, Uri uri)
                                        {
                                                if( uri != null ) {
                                                        Intent i = new Intent(
                                                                ImsFiletransfer.this,
                                                                ViewActivity.class);
                                                        i.setDataAndType(uri, contentType);
                                                        startActivity(i);
                                                }
                                        }

                                        public void onMediaScannerConnected() {
                                        }
                                };

                                final MediaScannerConnection scanner = new
                                        MediaScannerConnection(
                                        ImsFiletransfer.this,mscc) {

                                        @Override
                                        public void onServiceConnected(
                                                        ComponentName className,
                                                        IBinder service) {
                                                super.onServiceConnected(className,
                                                        service);
                                                try {
                                                        scanFile(media.getCanonicalPath(),
                                                                contentType);
                                                        disconnect();
                                                } catch (IOException e) {
                                                        Log.d("Sample",
                                                                "Failed in file scanning");
                                                }
                                        }
                                };
                                scanner.connect();

                        } catch (IOException e) {
                                Log.e("Sample", "Can not store Media!");
                        }
                }

                public boolean receiveProgress(PeerConnection connection, String id,
                                String contentType, int receivedBytes, int totalBytes) {
                        if( total == 0 ) {
                                total = totalBytes;
                                transferReceived.setMax(totalBytes);
                        }
                        if( receivedBytes > transferReceived.getProgress() )
                                transferReceived.setProgress(receivedBytes); //received
                        return true;
                }

                public void sendComplete(PeerConnection connection, String id) {
                        connection.stopSharingContent(contentType);
                        Log.d("Sample", "Send Complete!");
                        printWithToast(ImsFiletransfer.this,"Sent completed");
                }

                public boolean sendProgress(PeerConnection connection, String id,
                                int sentBytes, int totalBytes) {
                        if( total == 0 ) {
                                total = totalBytes;
                                transferSent.setMax(totalBytes);
                        }
                        if( sentBytes > transferSent.getProgress() )
                                transferSent.setProgress(sentBytes); //sent
                        return true;
                }

                public void transferCanceled(PeerConnection connection, String id) {
                        Log.d("Sample", "Transfer Canceled!");
                }
        }
This code might look a little intimidating at first, since it contains both the sending and the receiving side, but with some guidance it will hopefully be crystal clear.
The constructor records the content type we are transferring. It will be needed on the receiving end to know how to render the content, so it is saved in a local class variable.
The contentAccepted() callback is invoked on both the sender and the receiver side. On the sender side it is the cue to start sending the content, since the receiver was willing to accept it.
connection.sendContent(contentType, bytes);
The content type comes from the method call, while the actual bytes come from our conversion when the send button was pressed. On the receiving side we simply reset the progress bar to zero, since a previous file transfer might have left it filled.
The following callbacks do not play a significant part in this sample but can be very useful in a production-grade application. contentRejected() tells the application that the receiver does not wish to accept this file. contentSharingStopped() indicates that the receiver for some reason wishes to pause the transfer, while newContentType() is invoked so the application can indicate whether it supports the proposed content type. In our case we are happy to accept any content type, since we know that the Android OS will only invoke the application for images and video.
connection.acceptContent(contentType, true);
Let's leave the receiveComplete() function for later examination, since it is the most complex one, and look at the rest for now.
The send and receive progress functions are quite simple and are in charge of updating the sender and receiver progress bars. Since, as described earlier, MSRP chunks large content, we get nice steady progress whenever a chunk is sent. One thing worth noting when watching a live file transfer is that the receiver progress bar will be slightly ahead of the sender progress bar. At first this might feel strange, since intuitively the sender should be ahead. But the framework calls this function not when a chunk was sent, but when the receiver has acknowledged the reception of a sent chunk. That explains the strange phenomenon, and it makes the sender bar very accurate.
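As a hypothetical illustration of the chunking behavior (the `ChunkProgress` class, method name, and 4096-byte chunk size are our assumptions, not the framework's documented values), the cumulative counts a sendProgress-style callback sees climb one chunk at a time until they reach the total size:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch, not framework code: models how per-chunk
// acknowledgments produce a steady sequence of progress updates.
public class ChunkProgress {
    static final int CHUNK = 4096; // assumed chunk size, for illustration only

    // Returns the cumulative byte counts a progress callback would observe,
    // one entry per acknowledged chunk.
    public static List<Integer> progressSteps(int totalBytes) {
        List<Integer> steps = new ArrayList<>();
        for (int sent = 0; sent < totalBytes; ) {
            sent += Math.min(CHUNK, totalBytes - sent); // last chunk may be short
            steps.add(sent);
        }
        return steps;
    }

    public static void main(String[] args) {
        System.out.println(progressSteps(10000)); // [4096, 8192, 10000]
    }
}
```

The last update always equals the total size, which is why setting the progress bar's maximum to totalBytes makes it finish exactly full.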
The send completed function just calls to stop the content sharing.
connection.stopSharingContent(contentType);

What happens under the hood is that a SIP BYE message is sent, terminating the Invite session we started earlier. We could also have continued to send more content, taking advantage of the open MSRP TCP session, but in our sample we only transfer one file at a time.
So let's go back and look at what happens on the receiver side when the entire content has been received.
On the receiver side we also call the stop content sharing function. It does the same thing here; this is a SIP-specific measure for robustness, since either side may actually "hang up".
A couple of lines follow that take the byte content and store it as a normal Linux file on the SD card. The mapping between content type and file extension is not mandatory; it simply makes the raw file look nicer and more correct if someone wants to read it off the filesystem. The Android framework is pretty good at guessing the content anyway, so we could have left it out.
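The mapping itself can be pulled out into a small helper; this sketch mirrors the if/else chain in receiveComplete() (the `ExtensionMap` class and method names are ours, not part of the sample):

```java
// Hypothetical helper mirroring the mapping in receiveComplete(): pick a
// file extension from the MIME type so the saved file looks right on disk.
public class ExtensionMap {
    public static String extensionFor(String contentType) {
        if ("image/jpeg".equals(contentType)) return ".jpg";
        if ("video/3gpp".equals(contentType)) return ".3gp";
        if ("video/mp4".equals(contentType)) return ".mp4";
        return ".tmp"; // unknown types fall back, as in the sample
    }

    public static void main(String[] args) {
        System.out.println(extensionFor("image/jpeg")); // .jpg
        System.out.println(extensionFor("audio/amr"));  // .tmp
    }
}
```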
Android media scanner integration
The next step, which does all the magic from the Android framework point of view, is a bit tricky at first. This is because it is done as an anonymous inner class and is not in logical chronological order.
MediaScannerConnectionClient mscc = new MediaScannerConnectionClient() {

        public void onScanCompleted(String path, Uri uri) {
                if( uri != null ) {
                        Intent i = new Intent(ImsFiletransfer.this,
                                ViewActivity.class);
                        i.setDataAndType(uri, contentType);
                        startActivity(i);
                }
        }
};
The MediaScannerConnectionClient is a helper class for the media scanner that is called back when a successful scan has completed. At that point we have a URI from the content resolver service provider that we want to pass to our ViewActivity for display. One important note: you need to pass both the data and the type in the Intent at the same time. If this is done in two function calls, the first one is ignored. A bit strange Android behavior, but some things have to be learned the hard way.
Next is the media scanner code.
final MediaScannerConnection scanner = new MediaScannerConnection(
                ImsFiletransfer.this,mscc) {

        @Override
        public void onServiceConnected(ComponentName className,
                                        IBinder service) {
                super.onServiceConnected(className, service);
                try {
                        scanFile(media.getCanonicalPath(), contentType);
                        disconnect();
                } catch (IOException e) {
                        Log.d("Sample", "Failed in file scanning");
                }
        }                        
};
scanner.connect();
Here we have to pass a reference to our client, which will display the file when all is done, together with a reference to the executing activity, which is used as a Context for the media resolving. Since the scanner is a real Android service, we need to wait for the callback telling us that we are connected and the RPC is completed. We then scan the file and disconnect. At this point our callback should be invoked in the scanner client, and we will start our view activity.
Finally, here is a code snippet from the onCreate() function of the view activity.
Intent intent = getIntent();

ImageView image = (ImageView) findViewById(R.id.ImageViewShared);

VideoView video = (VideoView) findViewById(R.id.VideoViewShared);

Log.d("Sample","View activity - got type = "+intent.getType());

if( intent.getType().startsWith("image/")) {
        image.setVisibility(View.VISIBLE);
        image.setImageURI(getIntent().getData() );
} else if( intent.getType().startsWith("video/")) {
        video.setVisibility(View.VISIBLE);
        video.setVideoURI(getIntent().getData());
        video.start();
}
The rest is quite simple. Remember that we declared both the image view and the video view as Gone? Now, depending on whether we got an image or a video, we make the corresponding rendering surface Visible. Then we set the URI of the content and we are pretty much done; the received content should now be displayed on the screen.
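The branching logic reduces to checking the top-level MIME type; this sketch captures it as a testable helper (the `ViewDispatch` class, method, and returned widget names are our illustration, not sample code):

```java
// Hypothetical sketch of the view dispatch in onCreate(): the top-level
// MIME type decides which widget is made visible.
public class ViewDispatch {
    public static String widgetFor(String mimeType) {
        if (mimeType.startsWith("image/")) return "ImageViewShared";
        if (mimeType.startsWith("video/")) return "VideoViewShared";
        return "none"; // the sample is only ever invoked for images and video
    }

    public static void main(String[] args) {
        System.out.println(widgetFor("image/jpeg")); // ImageViewShared
        System.out.println(widgetFor("video/mp4"));  // VideoViewShared
    }
}
```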
The call to start the video might be redundant on some phones, but we found that at least on the HTC Hero we need to invoke it. A note on the choice of implementation is also in order: it is anything but simple, since there are multiple approaches and not all of them work on different Android phones.
Another approach we might have taken is to fire an Intent and let the Android framework, rather than our own Activity, render the content. This works fine too. The drawbacks are that you cannot customize the UI experience, you do not have the same control over what is returned when the display activity finishes, and when multiple apps are running your app might very well have been killed to free up some memory.
There are also a couple of ways to include the content in an Intent to the Android framework without manually saving the file on the SD card. This is a tricky field, since it did not work on the HTC Sense framework. Another way is to call a static function on android.provider.MediaStore.
This was also a minefield, since under the hood it tries to convert pictures to Bitmaps, hitting all kinds of limitations and throwing exceptions due to the large content sizes. It is a bit ironic that this happens for a picture taken with the built-in 3 MP camera, leaving you helpless and on your own. Luckily the solution above worked out well, even though after a quick look at the provided APIs it was not our first or even second choice initially.
This pretty much concludes the sample, but we also want to post a couple of screenshots to make it easier to follow the entire flow of the application.
If anyone wants to play around a bit more, there are some interesting things to try out. For instance, there is a nice app on the Android Market called PicSay that can be used in the chain before transferring the file. Instead of choosing the file transfer app from the camera's share list, you choose the PicSay app, then modify and enhance the picture with a cool effect. The PicSay app also hooks into this framework and shares the picture onwards, at which point you choose our file transfer app. In the same way, our sample could be enhanced to bring up a sharing option on the receiving side so that the user can decide what to do with received content. This is a very powerful concept, letting each application focus on what it is best designed to do.

Illustration 5: Sender side, Title : "also nice"         
Illustration 6: Receiver side, Title receiving : "nice desk"

Illustration 7: Sender progress bar         
Illustration 8: Receiver progress bar
________________________________________
Summary
So this concludes the sample. One thing worth thinking about is that this actually solves a real problem: reaching mobile devices that are often behind Network Address Translators or firewalls. This way there is a constant channel, and you can push and receive content at any time. It can be programmed either to let the user opt in for sending or receiving, or as an infrastructure background task for data exchange.
Running the sample
Make sure to change the following lines in onLogin()
config.put("ImsInnovation-PubUI", "sip:changeme@imsinnovation.com");
config.put("ImsInnovation-PriUI", "changeme");
config.put("ImsInnovation-Password", "secret");
Two user accounts are needed, and in order to keep the sample clean of a lot of login-form code the values are hardcoded. Unfortunately this means having to modify and compile the sample twice: first add user A, recompile and execute; then change the code for user B, recompile and execute.
If user A is to send a file to user B, start the B side by clicking on the ImsFileTransfer icon. On the A side, go to the camera application, take a picture and hit the sharing option. The A application will be started from the camera application. Just after the start, a Toast message saying IMS Connected should be displayed; if not, there are network problems or problems with the user credentials used for login.
