I'm getting a serious memory leak when assigning textures at runtime.
When useNewList is true, Task Manager shows Unity's memory usage growing by ~20 MB per second.
The full project (183 KB zipped) can be downloaded here:
https://goo.gl/axFJDs
Here's my code:
using UnityEngine;
using System.Collections;
using System.Collections.Generic;

public class testMemory : MonoBehaviour {

    public bool useNewList = false;
    Texture2D newTexture;
    public byte color;
    List<Color> newColors;
    List<byte> staticPixelColorsList;
    Color tempColor = Color.white;

    // Use this for initialization
    void Start () {
        newTexture = new Texture2D(0, 0);
        newColors = new List<Color>();
        staticPixelColorsList = new List<byte>(120000);
        GetComponent<Renderer>().material.mainTexture = newTexture;
    }

    // Update is called once per frame
    void Update () {
        // Make sure we're using different colors for every frame
        color = (byte)((Time.time * 255) % 255);

        if (useNewList)
        {
            // This option causes a memory leak, but it's the one I need
            // because I'm getting the texture data from an external source
            int newWidth = Random.Range(200, 400);
            int newHeight = Random.Range(200, 400);

            // Create a new list each time
            List<byte> newPixels = new List<byte>(newWidth * newHeight);
            for (int i = 0; i < newWidth * newHeight; i++)
            {
                newPixels.Add((byte)(color * i / 120000f));
            }
            setTexture(newPixels, newWidth, newHeight);

            // Clear the list (yeah, right)
            newPixels.Clear();
        }
        else
        {
            // Use the same list, but assign new colors
            for (int i = 0; i < 120000; i++)
            {
                staticPixelColorsList.Add((byte)(color * i / 120000f));
            }
            setTexture(staticPixelColorsList, 300, 400);
            staticPixelColorsList.Clear();
        }
    }

    void setTexture(List<byte> inputPixels, int inputWidth, int inputHeight)
    {
        newTexture.Resize(inputWidth, inputHeight);

        float colorValue;
        // Convert each input byte to Unity's "Color"
        for (int n = 0; n < newTexture.width * newTexture.height; n++)
        {
            colorValue = inputPixels[n] / 255.0f;
            tempColor.r = tempColor.g = tempColor.b = colorValue;
            newColors.Add(tempColor);
        }

        // Actually set the texture pixels
        newTexture.SetPixels(newColors.ToArray());
        newTexture.Apply();
        newColors.Clear();
    }
}
Thanks in advance.
Talk1:
"but it's the one I need because I'm getting the texture data from an external source" So your real code does or doesn't use a list of byte?
Talk2:
Golly, why are you updating a texture CPU-side every frame? That'll be quite slow
Talk3:
This is the code used to reproduce the problem. MickyDuncan, the texture comes from a camera and I do need to update it every frame. T.Kiley, when useNewList is false I'm using a pre-allocated list with other values. The problem may be that runtime-allocated lists are not GC'd.
Solutions1
According to this MSDN article, the List<T>.Clear method resets the Count property to 0 and releases all references to elements of the collection.
However, the Capacity remains unchanged, so the same amount of memory stays allocated. To release this memory, call the TrimExcess method or set the Capacity property manually.
Maybe instead of creating a new newPixels list each time Update is called, you could declare it once outside the function (perhaps at class scope) and refill it on each Update. Clearing would be optional, since the contents are overwritten every call. This could also improve performance, since your code won't have to release every member of the collection at the end of every update.
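The Clear/TrimExcess behaviour described above can be demonstrated with plain .NET lists, outside Unity. This is a minimal sketch (the 120000-element size and byte element type just mirror the question's list):

```csharp
using System;
using System.Collections.Generic;

class ListCapacityDemo
{
    static void Main()
    {
        // Pre-allocate a list the way the question's Start() does
        var pixels = new List<byte>(120000);
        for (int i = 0; i < 120000; i++)
            pixels.Add((byte)(i % 256));

        // Clear() resets Count but keeps the backing array allocated
        pixels.Clear();
        Console.WriteLine(pixels.Count);     // 0
        Console.WriteLine(pixels.Capacity);  // still 120000

        // TrimExcess() shrinks the capacity down to the current Count,
        // releasing the backing array's memory to the GC
        pixels.TrimExcess();
        Console.WriteLine(pixels.Capacity);  // 0
    }
}
```

So Clear() alone never returns the list's memory; but note that a list whose capacity stays allocated is steady-state memory use, not a per-frame growth, so it would not by itself explain a leak of ~20 MB per second.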
Talk1:
To check this is correct, the OP could try calling System.GC.Collect() to check whether the memory is then freed (i.e. not a memory leak, simply an inefficient use of memory).
Talk2:
I've added 'newPixels.TrimExcess(); System.GC.Collect();' after 'newPixels.Clear();'. It didn't make any difference. Nor did using a single list instead of a new one each Update().
Talk3:
@T.Kiley Calling GC.Collect() after List.Clear() on a list of value types will have no effect.
Talk4:
No - I was proposing doing it at the start of the next frame where the last frames newPixels will have gone out of scope and therefore be available for GC
Talk5:
@T.Kiley I understand, that's fine.
Solutions2
Apparently this is a bug with Texture2D.Apply(). It's so deep that Unity's profiler did not show the leak, but external tools did. Unity has acknowledged the problem and is looking into a fix; no date was given. Thanks for the help :-)
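For readers hitting the same symptom while a fix is pending: a hypothetical variant of the question's script (a sketch, not the accepted workaround, since none was given) keeps one fixed-size texture and one reusable Color32 buffer, so Update-path code avoids Resize, the intermediate List<Color>, and the per-frame ToArray() allocation. The class name, SetFrame method, and the fixed 400x400 bound are assumptions, not Unity API or the OP's code:

```csharp
using UnityEngine;
using System.Collections.Generic;

public class ReusableTextureUpdater : MonoBehaviour
{
    // Upper bound taken from the question's Random.Range(200, 400)
    const int MaxWidth = 400;
    const int MaxHeight = 400;

    Texture2D texture;
    Color32[] pixelBuffer;  // reused every frame, never reallocated

    void Start()
    {
        texture = new Texture2D(MaxWidth, MaxHeight, TextureFormat.RGBA32, false);
        pixelBuffer = new Color32[MaxWidth * MaxHeight];
        GetComponent<Renderer>().material.mainTexture = texture;
    }

    // Copy incoming grayscale bytes into the reusable buffer and upload it.
    // Assumes inputPixels holds at least MaxWidth * MaxHeight elements.
    public void SetFrame(List<byte> inputPixels)
    {
        for (int n = 0; n < pixelBuffer.Length; n++)
        {
            byte v = inputPixels[n];
            pixelBuffer[n] = new Color32(v, v, v, 255);
        }
        texture.SetPixels32(pixelBuffer);  // no ToArray(), no List<Color>
        texture.Apply();
    }
}
```

Whether this sidesteps the reported Apply() leak is untested here; at minimum it removes the managed allocations per frame, which makes external memory measurements easier to interpret.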