iOS provides a built-in API for compressing an image into JPEG data:
- NSData *UIImageJPEGRepresentation(UIImage *image, CGFloat compressionQuality);
On the iPhone there are two simple ways to get an image's data: UIImageJPEGRepresentation and UIImagePNGRepresentation. UIImageJPEGRepresentation takes two parameters, the image and a compression quality factor, while UIImagePNGRepresentation takes only the image. Comparing the two in practice, UIImagePNGRepresentation(UIImage* image) returns far more data than UIImageJPEGRepresentation(UIImage* image, 1.0). For example, for the same scene shot with the camera, UIImagePNGRepresentation() returned about 199 KB of data, while UIImageJPEGRepresentation(UIImage* image, 1.0) returned only about 140 KB, more than 50 KB less.
If the image does not need to be especially sharp, lowering the second parameter of UIImageJPEGRepresentation reduces the data size dramatically. For the same photo, UIImageJPEGRepresentation(UIImage* image, 1.0) returned about 140 KB, but after changing the compression quality, UIImageJPEGRepresentation(UIImage* image, 0.5) returned only a little over 11 KB, a huge reduction, and visually the quality did not noticeably suffer. So when you need an image's data, prefer UIImageJPEGRepresentation, and tune the compression quality to your actual use case to shrink the data size further.
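To reproduce that comparison on your own images, a quick sketch like the one below can be used; LogEncodedSizes is just an illustrative name, not a UIKit API, and the actual byte counts will of course vary with the photo you pass in.
- #import <UIKit/UIKit.h>
- // Sketch: log the encoded size of the same image as PNG, as JPEG at quality 1.0,
- // and as JPEG at quality 0.5.
- static void LogEncodedSizes(UIImage *image)
- {
- NSData *pngData = UIImagePNGRepresentation(image);
- NSData *jpegFull = UIImageJPEGRepresentation(image, 1.0);
- NSData *jpegHalf = UIImageJPEGRepresentation(image, 0.5);
- NSLog(@"PNG: %lu bytes, JPEG 1.0: %lu bytes, JPEG 0.5: %lu bytes",
- (unsigned long)pngData.length, (unsigned long)jpegFull.length,
- (unsigned long)jpegHalf.length);
- }
Passing in a camera photo gives numbers of the same kind as the 199 KB / 140 KB / 11 KB figures quoted above.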
For example, after picking a photo with UIImagePickerController, the original image can be scaled down and then encoded with a very low JPEG quality:
- // Grab the original photo from the picker's info dictionary
- UIImage *imageNew = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
- // imageWithImage:scaledToSize: is a scaling helper (a sketch of it follows below)
- imageNew = [self imageWithImage:imageNew scaledToSize:CGSizeMake(100, 100)];
- // 0.0001 is an extremely aggressive compression quality; raise it if the result is too blurry
- NSData *imageData = UIImageJPEGRepresentation(imageNew, 0.0001);
- m_selectImage = [UIImage imageWithData:imageData];
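The snippet above calls an imageWithImage:scaledToSize: helper that is not shown in the original post; a minimal sketch of such a helper, using the same drawing approach as the scaleToSize:size: category method further down, might look like this:
- // Hypothetical helper assumed by the snippet above; same idea as scaleToSize:size: below.
- - (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
- {
- UIGraphicsBeginImageContext(newSize);
- [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
- UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
- UIGraphicsEndImageContext();
- return newImage;
- }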
The .h file (UIImageExt.h):
- #import <Foundation/Foundation.h>
- #import <UIKit/UIKit.h>
- @interface UIImage (UIImageExt)
- - (UIImage *)scaleToSize:(UIImage *)img size:(CGSize)size;
- - (UIImage *)imageByScalingAndCroppingForSize:(CGSize)targetSize;
- @end
- #import "UIImageExt.h"
- @implementation UIImage (UIImageExt)
- - (UIImage *)scaleToSize:(UIImage *)img size:(CGSize)size{
- // Create a bitmap context and make it the current drawing context
- UIGraphicsBeginImageContext(size);
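- // Note: UIGraphicsBeginImageContext renders at a scale of 1.0; if Retina-quality output
- // matters, UIGraphicsBeginImageContextWithOptions(size, NO, 0.0) is the usual alternative
- // (a scale of 0.0 means "use the device's screen scale").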
- // Draw the image at the new size
- [img drawInRect:CGRectMake(0, 0, size.width, size.height)];
- // Grab the resized image from the current context
- UIImage* scaledImage = UIGraphicsGetImageFromCurrentImageContext();
- // Pop the current context off the stack
- UIGraphicsEndImageContext();
- // Return the resized image
- return scaledImage;
- }
- - (UIImage*)imageByScalingAndCroppingForSize:(CGSize)targetSize
- {
- UIImage *sourceImage = self;
- UIImage *newImage = nil;
- CGSize imageSize = sourceImage.size;
- CGFloat width = imageSize.width;
- CGFloat height = imageSize.height;
- CGFloat targetWidth = targetSize.width;
- CGFloat targetHeight = targetSize.height;
- CGFloat scaleFactor = 0.0;
- CGFloat scaledWidth = targetWidth;
- CGFloat scaledHeight = targetHeight;
- CGPoint thumbnailPoint = CGPointMake(0.0,0.0);
- if (CGSizeEqualToSize(imageSize, targetSize) == NO)
- {
- CGFloat widthFactor = targetWidth / width;
- CGFloat heightFactor = targetHeight / height;
- if (widthFactor > heightFactor)
- scaleFactor = widthFactor; // width fills the target exactly; excess height is cropped
- else
- scaleFactor = heightFactor; // height fills the target exactly; excess width is cropped
- scaledWidth = width * scaleFactor;
- scaledHeight = height * scaleFactor;
- // center the image
- if (widthFactor > heightFactor)
- {
- thumbnailPoint.y = (targetHeight - scaledHeight) * 0.5;
- }
- else if (widthFactor < heightFactor)
- {
- thumbnailPoint.x = (targetWidth - scaledWidth) * 0.5;
- }
- }
- UIGraphicsBeginImageContext(targetSize); // this will crop
- CGRect thumbnailRect = CGRectZero;
- thumbnailRect.origin = thumbnailPoint;
- thumbnailRect.size.width = scaledWidth;
- thumbnailRect.size.height = scaledHeight;
- [sourceImage drawInRect:thumbnailRect];
- newImage = UIGraphicsGetImageFromCurrentImageContext();
- if(newImage == nil)
- NSLog(@"could not scale image");
- //pop the context to get back to the default
- UIGraphicsEndImageContext();
- return newImage;
- }
- @end
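Putting the pieces together, a usage sketch along these lines combines the category with JPEG compression; thumbnailDataForImage: is just an illustrative method name, and the class it lives in is assumed to import UIImageExt.h:
- #import "UIImageExt.h"
- // Sketch: aspect-fill the image into a 100x100 thumbnail, then compress it to JPEG.
- - (NSData *)thumbnailDataForImage:(UIImage *)sourceImage
- {
- UIImage *thumbnail = [sourceImage imageByScalingAndCroppingForSize:CGSizeMake(100, 100)];
- return UIImageJPEGRepresentation(thumbnail, 0.5);
- }
Note that imageByScalingAndCroppingForSize: scales the image to fill the whole target rect, so the edges of a non-square photo are cropped away; scaleToSize:size: instead stretches the entire image to the target size without preserving its aspect ratio.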