My question is related to a Kaggle data science competition. I'm trying to read an image from a one-dimensional array containing 8-bit grayscale pixel values (0 to 255) for a 28x28 image. So the array indices run from 0 to 783, where each pixel is encoded as x = i * 28 + j.
Converted into a two-dimensional 28x28 matrix, this looks like:
000 001 002 003 ... 026 027
028 029 030 031 ... 054 055
056 057 058 059 ... 082 083
| | | | ... | |
728 729 730 731 ... 754 755
756 757 758 759 ... 782 783
For image manipulation (resizing, skewing) I would like to read that array into an in-memory PIL image. I did some research on Matplotlib's image functions, which seem most promising; NumPy's array functions are another option.
What I'm looking for is a code example that shows how to load that one-dimensional array via NumPy, Matplotlib, or anything else. Or how to convert that array into a two-dimensional image, using for instance numpy.vstack, and then read it as an image.
Solution
You can convert a NumPy array to a PIL image using Image.fromarray:
import numpy as np
from PIL import Image
# Dummy data: Image.fromarray with mode 'L' expects 8-bit values,
# so the array must be uint8 (randint's upper bound is exclusive).
arr = np.random.randint(0, 256, size=(28*28,), dtype=np.uint8)
img = Image.fromarray(arr.reshape(28, 28), mode='L')
The 'L' mode indicates that the array values represent luminance, so the result is a grayscale image.
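Once the data is a PIL image, the resizing and skewing mentioned in the question are one-liners. A minimal sketch, using a sequential dummy array in place of a real Kaggle row (the shear coefficients in the affine transform are arbitrary example values):

```python
import numpy as np
from PIL import Image

# Hypothetical flat array standing in for one image row of the dataset.
arr = np.arange(784, dtype=np.uint8)

# Reshape to 28x28 and wrap as a grayscale PIL image.
img = Image.fromarray(arr.reshape(28, 28), mode="L")

# Resize to double the dimensions.
bigger = img.resize((56, 56))

# Apply a horizontal shear via an affine transform
# (coefficients a, b, c, d, e, f map output (x, y) to input (a*x+b*y+c, d*x+e*y+f)).
skewed = img.transform((28, 28), Image.AFFINE, (1, 0.3, 0, 0, 1, 0))

# Convert back to a NumPy array for further processing.
back = np.asarray(img)
```

Note that `np.asarray(img)` round-trips the image back to a 28x28 array, so you can move freely between the two representations.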