Managing Your App's Memory

 http://developer.android.com/training/articles/memory.html




In this document

1. How Android Manages Memory
   1. Sharing Memory
   2. Allocating and Reclaiming App Memory
   3. Restricting App Memory
   4. Switching Apps
2. How Your App Should Manage Memory
   1. Use services sparingly
   2. Release memory when your user interface becomes hidden
   3. Release memory as memory becomes tight
   4. Check how much memory you should use
   5. Avoid wasting memory with bitmaps
   6. Use optimized data containers
   7. Be aware of memory overhead
   8. Be careful with code abstractions
   9. Use nano protobufs for serialized data
   10. Avoid dependency injection frameworks
   11. Be careful about using external libraries
   12. Optimize overall performance
   13. Use ProGuard to strip out any unneeded code
   14. Use zipalign on your final APK
   15. Analyze your RAM usage
   16. Use multiple processes

See Also

·        Investigating Your RAM Usage

Random-access memory (RAM) is a valuable resource in any software development environment, but it's even more valuable on a mobile operating system where physical memory is often constrained. Although Android's Dalvik virtual machine performs routine garbage collection, this doesn't allow you to ignore when and where your app allocates and releases memory.

In order for the garbage collector to reclaim memory from your app, you need to avoid introducing memory leaks (usually caused by holding onto object references in global members) and release any Reference objects at the appropriate time (as defined by lifecycle callbacks discussed further below). For most apps, the Dalvik garbage collector takes care of the rest: the system reclaims your memory allocations when the corresponding objects leave the scope of your app's active threads.

This document explains how Android manages app processes and memory allocation, and how you can proactively reduce memory usage while developing for Android. For more information about general practices to clean up your resources when programming in Java, refer to other books or online documentation about managing resource references. If you’re looking for information about how to analyze your app’s memory once you’ve already built it, read Investigating Your RAM Usage.

How Android Manages Memory


Android does not offer swap space for memory, but it does use paging and memory-mapping (mmapping) to manage memory. This means that any memory you modify—whether by allocating new objects or touching mmapped pages—remains resident in RAM and cannot be paged out. So the only way to completely release memory from your app is to release object references you may be holding, making the memory available to the garbage collector. The one exception is that any files mmapped in without modification, such as code, can be paged out of RAM if the system wants to use that memory elsewhere.

Sharing Memory

In order to fit everything it needs in RAM, Android tries to share RAM pages across processes. It can do so in the following ways:

·        Each app process is forked from an existing process called Zygote. The Zygote process starts when the system boots and loads common framework code and resources (such as activity themes). To start a new app process, the system forks the Zygote process then loads and runs the app's code in the new process. This allows most of the RAM pages allocated for framework code and resources to be shared across all app processes.

·        Most static data is mmapped into a process. This not only allows that same data to be shared between processes but also allows it to be paged out when needed. Examples of static data include: Dalvik code (by placing it in a pre-linked .odex file for direct mmapping), app resources (by designing the resource table to be a structure that can be mmapped and by aligning the zip entries of the APK), and traditional project elements like native code in .so files.

·        In many places, Android shares the same dynamic RAM across processes using explicitly allocated shared memory regions (either with ashmem or gralloc). For example, window surfaces use shared memory between the app and screen compositor, and cursor buffers use shared memory between the content provider and client.

Due to the extensive use of shared memory, determining how much memory your app is using requires care. Techniques to properly determine your app's memory use are discussed in Investigating Your RAM Usage.

Allocating and Reclaiming App Memory

Here are some facts about how Android allocates then reclaims memory from your app:

·        The Dalvik heap for each process is constrained to a single virtual memory range. This defines the logical heap size, which can grow as it needs to (but only up to a limit that the system defines for each app).

·        The logical size of the heap is not the same as the amount of physical memory used by the heap. When inspecting your app's heap, Android computes a value called the Proportional Set Size (PSS), which accounts for both dirty and clean pages that are shared with other processes—but only in an amount that's proportional to how many apps share that RAM. This (PSS) total is what the system considers to be your physical memory footprint. For more information about PSS, see the Investigating Your RAM Usage guide; a short sketch after this list shows one way to read your own process's PSS at runtime.

·        The Dalvik heap does not compact the logical size of the heap, meaning that Android does not defragment the heap to close up space. Android can only shrink the logical heap size when there is unused space at the end of the heap. But this doesn't mean the physical memory used by the heap can't shrink. After garbage collection, Dalvik walks the heap and finds unused pages, then returns those pages to the kernel using madvise. So, paired allocations and deallocations of large chunks should result in reclaiming all (or nearly all) the physical memory used. However, reclaiming memory from small allocations can be much less efficient because the page used for a small allocation may still be shared with something else that has not yet been freed.
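The following sketch is not part of the original article; the class name and log tag are hypothetical. It shows one way to read the PSS of your own process at runtime using android.os.Debug, with the same dalvik/native/other breakdown that appears in the dumpsys meminfo output later in this document:

import android.os.Debug;
import android.util.Log;

public class MemoryStats {
    private static final String TAG = "MemoryStats"; // hypothetical log tag

    public static void logPss() {
        // Fill a MemoryInfo structure with the current process's memory counters.
        Debug.MemoryInfo memoryInfo = new Debug.MemoryInfo();
        Debug.getMemoryInfo(memoryInfo);

        // getTotalPss() returns the proportional set size in kilobytes.
        Log.d(TAG, "Total PSS: " + memoryInfo.getTotalPss() + " kB"
                + " (dalvik=" + memoryInfo.dalvikPss
                + ", native=" + memoryInfo.nativePss
                + ", other=" + memoryInfo.otherPss + ")");
    }
}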

Restricting App Memory

To maintain a functional multi-tasking environment, Android sets a hard limit on the heap size for each app. The exact heap size limit varies between devices based on how much RAM the device has available overall. If your app has reached the heap capacity and tries to allocate more memory, it will receive an OutOfMemoryError.

In some cases, you might want to query the system to determine exactly how much heap space you have available on the current device—for example, to determine how much data is safe to keep in a cache. You can query the system for this figure by calling getMemoryClass(). This returns an integer indicating the number of megabytes available for your app's heap. This is discussed further below, under Check how much memory you should use.

Switching Apps

Instead of using swap space when the user switches between apps, Android keeps processes that are not hosting a foreground ("user visible") app component in a least-recently used (LRU) cache. For example, when the user first launches an app, a process is created for it, but when the user leaves the app, that process does not quit. The system keeps the process cached, so if the user later returns to the app, the process is reused for faster app switching.

If your app has a cached process and it retains memory that it currently does not need, then your app—even while the user is not using it—is constraining the system's overall performance. So, as the system runs low on memory, it may kill processes in the LRU cache beginning with the process least recently used, but also giving some consideration toward which processes are most memory intensive. To keep your process cached as long as possible, follow the advice in the following sections about when to release your references.

More information about how processes are cached while not running in the foreground and how Android decides which ones can be killed is available in the Processes and Threads guide.

How Your App Should Manage Memory


You should consider RAM constraints throughout all phases of development, including during app design (before you begin development). There are many ways you can design and write code that lead to more efficient results, through aggregation of the same techniques applied over and over.

You should apply the following techniques while designing and implementing your app to make it more memory efficient.

Use services sparingly

If your app needs a service to perform work in the background, do not keep it running unless it's actively performing a job. Also be careful to never leak your service by failing to stop it when its work is done.

When you start a service, the system prefers to always keep the process for that service running. This makes the process very expensive because the RAM used by the service can’t be used by anything else or paged out. This reduces the number of cached processes that the system can keep in the LRU cache, making app switching less efficient. It can even lead to thrashing in the system when memory is tight and the system can’t maintain enough processes to host all the services currently running.

The best way to limit the lifespan of your service is to use an IntentService, which finishes itself as soon as it's done handling the intent that started it. For more information, read Running in a Background Service.
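As a rough sketch (the class name and intent extra are hypothetical, not from the original article), an IntentService handles each intent on a worker thread and then stops itself, so it never keeps the process alive while idle:

import android.app.IntentService;
import android.content.Intent;

public class DownloadService extends IntentService {

    public DownloadService() {
        super("DownloadService"); // name used for the background worker thread
    }

    @Override
    protected void onHandleIntent(Intent intent) {
        // Runs on a worker thread; when this method returns (and no other
        // intents are queued), the service stops itself automatically.
        String url = intent.getStringExtra("extra_url"); // hypothetical extra
        // ... download the content referenced by url ...
    }
}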

Leaving a service running when it’s not needed is one of the worst memory-management mistakes an Android app can make. So don’t be greedy by keeping a service for your app running. Not only will it increase the risk of your app performing poorly due to RAM constraints, but users will discover such misbehaving apps and uninstall them.

Release memory when your user interface becomes hidden

When the user navigates to a different app and your UI is no longer visible, you should release any resources that are used by only your UI. Releasing UI resources at this time can significantly increase the system's capacity for cached processes, which has a direct impact on the quality of the user experience.

To be notified when the user exits your UI, implement the onTrimMemory() callback in your Activity classes. You should use this method to listen for the TRIM_MEMORY_UI_HIDDEN level, which indicates your UI is now hidden from view and you should free resources that only your UI uses.

Notice that your app receives the onTrimMemory() callback with TRIM_MEMORY_UI_HIDDEN only when all the UI components of your app process become hidden from the user. This is distinct from the onStop() callback, which is called when an Activity instance becomes hidden, which occurs even when the user moves to another activity in your app. So although you should implement onStop() to release activity resources such as a network connection or to unregister broadcast receivers, you usually should not release your UI resources until you receive onTrimMemory(TRIM_MEMORY_UI_HIDDEN). This ensures that if the user navigates back from another activity in your app, your UI resources are still available to resume the activity quickly.
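A minimal sketch of this pattern in an Activity (the helper method is hypothetical, not from the original article):

import android.app.Activity;
import android.content.ComponentCallbacks2;

public class MainActivity extends Activity {

    @Override
    public void onTrimMemory(int level) {
        super.onTrimMemory(level);
        if (level == ComponentCallbacks2.TRIM_MEMORY_UI_HIDDEN) {
            // The entire UI of this process is now hidden from the user:
            // release resources that exist only to draw the UI.
            releaseUiResources(); // hypothetical helper, e.g. clear bitmap caches
        }
    }

    private void releaseUiResources() {
        // ...
    }
}

Handling of the other trim levels is shown in the sketch at the end of the next section.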

Release memory as memory becomes tight

During any stage of your app's lifecycle, the onTrimMemory() callback also tells you when the overall device memory is getting low. You should respond by further releasing resources based on the following memory levels delivered by onTrimMemory():

·        TRIM_MEMORY_RUNNING_MODERATE

Your app is running and not considered killable, but the device is running low on memory and the system is actively killing processes in the LRU cache.

·        TRIM_MEMORY_RUNNING_LOW

Your app is running and not considered killable, but the device is running much lower on memory so you should release unused resources to improve system performance (which directly impacts your app's performance).

·        TRIM_MEMORY_RUNNING_CRITICAL

Your app is still running, but the system has already killed most of the processes in the LRU cache, so you should release all non-critical resources now. If the system cannot reclaim sufficient amounts of RAM, it will clear all of the LRU cache and begin killing processes that the system prefers to keep alive, such as those hosting a running service.

Also, when your app process is currently cached, you may receive one of the following levels from onTrimMemory():

·        TRIM_MEMORY_BACKGROUND

The system is running low on memory and your process is near the beginning of the LRU list. Although your app process is not at a high risk of being killed, the system may already be killing processes in the LRU cache. You should release resources that are easy to recover so your process will remain in the list and resume quickly when the user returns to your app.

·        TRIM_MEMORY_MODERATE

The system is running low on memory and your process is near the middle of the LRU list. If the system becomes further constrained for memory, there's a chance your process will be killed.

·        TRIM_MEMORY_COMPLETE

The system is running low on memory and your process is one of the first to be killed if the system does not recover memory now. You should release everything that's not critical to resuming your app state.

Because the onTrimMemory() callback was added in API level 14, you can use the onLowMemory() callback as a fallback for older versions, which is roughly equivalent to the TRIM_MEMORY_COMPLETE event.

Note: When the system begins killing processes in the LRU cache, although it primarily works bottom-up, it does give some consideration to which processes are consuming more memory and will thus provide the system more memory gain if killed. So the less memory you consume while in the LRU list overall, the better your chances are to remain in the list and be able to quickly resume.
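The following sketch (the helper methods are hypothetical, not from the original article) shows one way an Activity might respond to these levels, with onLowMemory() kept as the pre-API-14 fallback:

import android.app.Activity;
import android.content.ComponentCallbacks2;

public class MemoryAwareActivity extends Activity {

    @Override
    public void onTrimMemory(int level) {
        super.onTrimMemory(level);
        switch (level) {
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_MODERATE:
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_LOW:
                trimExpendableCaches();         // hypothetical: shrink caches a bit
                break;
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_CRITICAL:
            case ComponentCallbacks2.TRIM_MEMORY_BACKGROUND:
            case ComponentCallbacks2.TRIM_MEMORY_MODERATE:
                releaseNonCriticalResources();  // hypothetical: drop most caches
                break;
            case ComponentCallbacks2.TRIM_MEMORY_COMPLETE:
                releaseEverythingPossible();    // hypothetical: keep only app state
                break;
            default:
                break;
        }
    }

    @Override
    public void onLowMemory() {
        super.onLowMemory();
        releaseEverythingPossible(); // fallback for devices below API level 14
    }

    private void trimExpendableCaches() { /* ... */ }
    private void releaseNonCriticalResources() { /* ... */ }
    private void releaseEverythingPossible() { /* ... */ }
}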

Check how much memory you should use

As mentioned earlier, each Android-powered device has a different amount of RAM available to the system and thus provides a different heap limit for each app. You can call getMemoryClass() to get an estimate of your app's available heap in megabytes. If your app tries to allocate more memory than is available here, it will receive an OutOfMemoryError.

In very special situations, you can request a larger heap size by setting the largeHeap attribute to "true" in the manifest <application> tag. If you do so, you can call getLargeMemoryClass() to get an estimate of the large heap size.

However, the ability to request a large heap is intended only for a small set of apps that can justify the need to consume more RAM (such as a large photo editing app). Never request a large heap simply because you've run out of memory and you need a quick fix—you should use it only when you know exactly where all your memory is being allocated and why it must be retained. Yet, even when you're confident your app can justify the large heap, you should avoid requesting it to whatever extent possible. Using the extra memory will increasingly be to the detriment of the overall user experience because garbage collection will take longer and system performance may be slower when task switching or performing other common operations.

Additionally, the large heap size is not the same on all devices and, when running on devices that have limited RAM, the large heap size may be exactly the same as the regular heap size. So even if you do request the large heap size, you should call getMemoryClass() to check the regular heap size and strive to always stay below that limit.
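As a sketch of how this value can be used (the one-eighth sizing rule and the class name are assumptions, not requirements from this article), a memory cache can be sized from getMemoryClass() so it stays well below the heap limit:

import android.app.ActivityManager;
import android.content.Context;
import android.graphics.Bitmap;
import android.util.LruCache;

public class BitmapCacheFactory {

    public static LruCache<String, Bitmap> create(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        int memoryClassMb = am.getMemoryClass(); // regular per-app heap limit, in MB

        // Use roughly 1/8 of the regular heap limit for the cache (a common,
        // conservative rule of thumb), measured here in kilobytes.
        int cacheSizeKb = (memoryClassMb * 1024) / 8;

        return new LruCache<String, Bitmap>(cacheSizeKb) {
            @Override
            protected int sizeOf(String key, Bitmap bitmap) {
                // Size entries by their actual byte count, in kilobytes.
                return bitmap.getByteCount() / 1024;
            }
        };
    }
}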

Avoid wasting memory with bitmaps

When you load a bitmap, keep it in RAM only at the resolution you need for the current device's screen, scaling it down if the original bitmap is a higher resolution. Keep in mind that an increase in bitmap resolution results in a corresponding squared increase in memory needed, because both the X and Y dimensions increase.
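A minimal sketch (the class and method names are hypothetical) of decoding a resource bitmap no larger than the size actually needed, using BitmapFactory.Options.inSampleSize:

import android.content.res.Resources;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

public class BitmapLoader {

    public static Bitmap decodeScaled(Resources res, int resId,
                                      int reqWidth, int reqHeight) {
        // First pass: read only the bitmap bounds, without allocating pixels.
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inJustDecodeBounds = true;
        BitmapFactory.decodeResource(res, resId, options);

        // Pick the largest power-of-two sample size that keeps both decoded
        // dimensions at or above the requested size.
        int inSampleSize = 1;
        while ((options.outHeight / (inSampleSize * 2)) >= reqHeight
                && (options.outWidth / (inSampleSize * 2)) >= reqWidth) {
            inSampleSize *= 2;
        }

        // Second pass: decode the downsampled bitmap for real.
        options.inJustDecodeBounds = false;
        options.inSampleSize = inSampleSize;
        return BitmapFactory.decodeResource(res, resId, options);
    }
}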

Note: On Android 2.3.x (API level 10) and below, bitmap objects always appear as the same size in your app heap regardless of the image resolution (the actual pixel data is stored separately in native memory). This makes it more difficult to debug the bitmap memory allocation because most heap analysis tools do not see the native allocation. However, beginning in Android 3.0 (API level 11), the bitmap pixel data is allocated in your app's Dalvik heap, improving garbage collection and debuggability. So if your app uses bitmaps and you're having trouble discovering why your app is using some memory on an older device, switch to a device running Android 3.0 or higher to debug it.

For more tips about working with bitmaps, read Managing Bitmap Memory.

Use optimized data containers

Take advantage of optimized containers in the Android framework, such as SparseArray, SparseBooleanArray, and LongSparseArray. The generic HashMap implementation can be quite memory inefficient because it needs a separate entry object for every mapping. Additionally, the SparseArray classes are more efficient because they avoid the system's need to autobox the key and sometimes value (which creates yet another object or two per entry). And don't be afraid of dropping down to raw arrays when that makes sense.
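A small sketch (the class is hypothetical) of using SparseArray where a HashMap<Integer, String> would otherwise box every key and allocate an entry object per mapping:

import android.util.SparseArray;

public class ViewTypeRegistry {

    private final SparseArray<String> labelsByViewType = new SparseArray<String>();

    public void register(int viewType, String label) {
        labelsByViewType.put(viewType, label); // no Integer boxing, no entry object
    }

    public String labelFor(int viewType) {
        return labelsByViewType.get(viewType); // returns null if absent
    }
}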

Be aware of memory overhead

Be knowledgeable about the cost and overhead of the language and libraries you are using, and keep this information in mind when you design your app, from start to finish. Often, things on the surface that look innocuous may in fact have a large amount of overhead. Examples include:

·        Enums often require more than twice as much memory as static constants. You should strictly avoid using enums on Android (see the sketch at the end of this section for an int-constant alternative).

·        Every class in Java (including anonymous inner classes) uses about 500 bytes of code.

·        Every class instance has 12-16 bytes of RAM overhead.

·        Putting a single entry into a HashMap requires the allocation of an additional entry object that takes 32 bytes (see the previous section about optimized data containers).

A few bytes here and there quickly add up—app designs that are class- or object-heavy will suffer from this overhead. That can leave you in the difficult position of looking at a heap analysis and realizing your problem is a lot of small objects using up your RAM.
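A minimal sketch (the names are hypothetical) of replacing a small enum with static int constants, avoiding the extra class and per-constant object overhead:

public final class DownloadState {
    public static final int STATE_IDLE = 0;
    public static final int STATE_RUNNING = 1;
    public static final int STATE_DONE = 2;

    private DownloadState() {} // no instances; this is just a constant holder

    // Instead of: enum DownloadState { IDLE, RUNNING, DONE }
    // callers pass and store plain ints, e.g. setState(DownloadState.STATE_RUNNING).
}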

Be careful with code abstractions

Often, developers use abstractions simply as a "good programming practice," because abstractions can improve code flexibility and maintenance. However, abstractions come at a significant cost: generally they require a fair amount more code that needs to be executed, requiring more time and more RAM for that code to be mapped into memory. So if your abstractions aren't supplying a significant benefit, you should avoid them.

Use nano protobufs for serialized data

Protocol buffers are a language-neutral, platform-neutral, extensible mechanism designed by Google for serializing structured data—think XML, but smaller, faster, and simpler. If you decide to use protobufs for your data, you should always use nano protobufs in your client-side code. Regular protobufs generate extremely verbose code, which will cause many kinds of problems in your app: increased RAM use, significant APK size increase, slower execution, and quickly hitting the DEX symbol limit.

For more information, see the "Nano version" section in the protobuf readme.

Avoid dependency injection frameworks

Using a dependency injection framework such as Guice or RoboGuice may be attractive because they can simplify the code you write and provide an adaptive environment that's useful for testing and other configuration changes. However, these frameworks tend to perform a lot of process initialization by scanning your code for annotations, which can require significant amounts of your code to be mapped into RAM even though you don't need it. These mapped pages are allocated into clean memory so Android can drop them, but that won't happen until the pages have been left in memory for a long period of time.

Be careful about using external libraries

External library code is often not written for mobile environments and can be inefficient when used for work on a mobile client. At the very least, when you decide to use an external library, you should assume you are taking on a significant porting and maintenance burden to optimize the library for mobile. Plan for that work up-front and analyze the library in terms of code size and RAM footprint before deciding to use it at all.

Even libraries supposedly designed for use on Android are potentially dangerous because each library may do things differently. For example, one library may use nano protobufs while another uses micro protobufs. Now you have two different protobuf implementations in your app. This can and will also happen with different implementations of logging, analytics, image loading frameworks, caching, and all kinds of other things you don't expect. ProGuard won't save you here because these will all be lower-level dependencies that are required by the features for which you want the library. This becomes especially problematic when you use an Activity subclass from a library (which will tend to have wide swaths of dependencies), when libraries use reflection (which is common and means you need to spend a lot of time manually tweaking ProGuard to get it to work), and so on.

Also be careful not to fall into the trap of using a shared library for one or two features out of dozens of other things it does; you don't want to pull in a large amount of code and overhead that you don't even use. At the end of the day, if there isn't an existing implementation that is a strong match for what you need to do, it may be best if you create your own implementation.

Optimize overall performance

A variety of information about optimizing your app's overall performance is available in other documents listed in Best Practices for Performance. Many of these documents include optimization tips for CPU performance, but many of these tips also help optimize your app's memory use, such as by reducing the number of layout objects required by your UI.

You should also read about optimizing your UI with the layout debugging tools and take advantage of the optimization suggestions provided by the lint tool.

Use ProGuard to strip out any unneeded code

The ProGuard tool shrinks, optimizes, and obfuscates your code by removing unused code and renaming classes, fields, and methods with semantically obscure names. Using ProGuard can make your code more compact, requiring fewer RAM pages to be mapped.

Use zipalign on your final APK

If you do any post-processing of an APK generated by a build system (including signing it with your final production certificate), then you must run zipalign on it to have it re-aligned. Failing to do so can cause your app to require significantly more RAM, because things like resources can no longer be mmapped from the APK.
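For example, a typical invocation (the file names here are placeholders) aligns the signed APK on 4-byte boundaries:

zipalign -v 4 your_app-unaligned.apk your_app.apk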

Note: Google PlayStore does not accept APK files that are not zipaligned.

Analyze your RAM usage

Once you achieve a relatively stable build, begin analyzing how much RAM your app is using throughout all stages of its lifecycle. For information about how to analyze your app, read Investigating Your RAM Usage.

Use multiple processes

If it's appropriate for your app, an advanced technique that may help you manage your app's memory is dividing components of your app into multiple processes. This technique must always be used carefully and most apps should not run multiple processes, as it can easily increase—rather than decrease—your RAM footprint if done incorrectly. It is primarily useful to apps that may run significant work in the background as well as the foreground and can manage those operations separately.

An example of when multiple processes may be appropriate is when building a music player that plays music from a service for a long period of time. If the entire app runs in one process, then many of the allocations performed for its activity UI must be kept around as long as it is playing music, even if the user is currently in another app and the service is controlling the playback. An app like this may be split into two processes: one for its UI, and the other for the work that continues running in the background service.

You can specify a separate process for each app component by declaring the android:process attribute for each component in the manifest file. For example, you can specify that your service should run in a process separate from your app's main process by declaring a new process named "background" (but you can name the process anything you like):

<service android:name=".PlaybackService"
         android:process=":background" />

Your process name should begin with a colon (':') to ensure that the process remains private to your app.

Before you decide to create a new process, you need to understand the memory implications. To illustrate the consequences of each process, consider that an empty process doing basically nothing has an extra memory footprint of about 1.4MB, as shown by the memory information dump below.

adb shell dumpsys meminfo com.example.android.apis:empty

** MEMINFO in pid 10172 [com.example.android.apis:empty] **
                  Pss     Pss  Shared Private  Shared Private    Heap    Heap    Heap
                Total   Clean   Dirty   Dirty   Clean   Clean    Size   Alloc    Free
               ------  ------  ------  ------  ------  ------  ------  ------  ------
  Native Heap       0       0       0       0       0       0    1864    1800      63
  Dalvik Heap     764       0    5228     316       0       0    5584    5499      85
 Dalvik Other     619       0    3784     448       0       0
        Stack      28       0       8      28       0       0
    Other dev       4       0      12       0       0       4
     .so mmap     287       0    2840     212     972       0
    .apk mmap      54       0       0       0     136       0
    .dex mmap     250     148       0       0    3704     148
   Other mmap       8       0       8       8      20       0
      Unknown     403       0     600     380       0       0
        TOTAL    2417     148   12480    1392    4832     152    7448    7299     148

Note: More information about how to read this output is provided in Investigating Your RAM Usage. The key data here is the Private Dirty and Private Clean memory, which shows that this process is using almost 1.4MB of non-pageable RAM (distributed across the Dalvik heap, native allocations, book-keeping, and library-loading), and another 150K of RAM for code that has been mapped in to execute.

This memory footprint for an empty process is fairly significant and it can quickly grow as you start doing work in that process. For example, here is the memory use of a process that is created only to show an activity with some text in it:

** MEMINFO in pid 10226 [com.example.android.helloactivity] **
                  Pss     Pss  Shared Private  Shared Private    Heap    Heap    Heap
                Total   Clean   Dirty   Dirty   Clean   Clean    Size   Alloc    Free
               ------  ------  ------  ------  ------  ------  ------  ------  ------
  Native Heap       0       0       0       0       0       0    3000    2951      48
  Dalvik Heap    1074       0    4928     776       0       0    5744    5658      86
 Dalvik Other     802       0    3612     664       0       0
        Stack      28       0       8      28       0       0
       Ashmem       6       0      16       0       0       0
    Other dev     108       0      24     104       0       4
     .so mmap    2166       0    2824    1828    3756       0
    .apk mmap      48       0       0       0     632       0
    .ttf mmap       3       0       0       0      24       0
    .dex mmap     292       4       0       0    5672       4
   Other mmap      10       0       8       8      68       0
      Unknown     632       0     412     624       0       0
        TOTAL    5169       4   11832    4032   10152       8    8744    8609     134

The process has now almost tripled in size, to 4MB, simply by showing some text in the UI. This leads to an important conclusion: If you are going to split your app into multiple processes, only one process should be responsible for UI. Other processes should avoid any UI, as this will quickly increase the RAM required by the process (especially once you start loading bitmap assets and other resources). It may then be hard or impossible to reduce the memory usage once the UI is drawn.

Additionally, when running more than one process, it's more important than ever that you keep your code as lean as possible, because any unnecessary RAM overhead for common implementations is now replicated in each process. For example, if you are using enums (though you should not use enums), all of the RAM needed to create and initialize those constants is duplicated in each process, and any abstractions you have with adapters and temporaries or other overhead will likewise be replicated.

Another concern with multiple processes is the dependencies that exist between them. For example, if your app has a content provider that you have running in the default process which also hosts your UI, then code in a background process that uses that content provider will also require that your UI process remain in RAM. If your goal is to have a background process that can run independently of a heavy-weight UI process, it can't have dependencies on content providers or services that execute in the UI process.

 
