Re: Scaling a UIImage

2020-11-04 Thread Alex Zavatone via Cocoa-dev
I’ve got more if you need them: desaturate an image, remove all color, tint a 
bitmap (as opposed to using tintColor, this actually makes a new tinted bitmap), 
find an image in the bundle, create enabled and disabled images from source 
images, and so on.

Sometimes we want to use the actual tint, and other times, if we desaturate an 
image and then apply a tint when creating a new image, we end up with a new image 
that is always that way.
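
For instance, the desaturate-and-tint idea boils down to something like the sketch 
below. This is just an illustration using Core Image, not the exact code from my 
category, and the function name is made up:

// Desaturate and tint in one pass with CIColorMonochrome, producing a new UIImage
// that is permanently tinted (no reliance on tintColor at draw time).
// Requires: #import <CoreImage/CoreImage.h>
static UIImage *TintedCopyOfImage(UIImage *source, UIColor *tint)
{
    if (source.CGImage == NULL) {
        return nil;   // CIImage-backed or empty images are not handled in this sketch
    }
    CIImage *input = [CIImage imageWithCGImage:source.CGImage];
    CIFilter *mono = [CIFilter filterWithName:@"CIColorMonochrome"];
    [mono setValue:input forKey:kCIInputImageKey];
    [mono setValue:[CIColor colorWithCGColor:tint.CGColor] forKey:kCIInputColorKey];
    [mono setValue:@1.0 forKey:kCIInputIntensityKey];

    CIContext *context = [CIContext context];
    CGImageRef rendered = [context createCGImage:mono.outputImage fromRect:input.extent];
    if (rendered == NULL) {
        return nil;
    }
    UIImage *result = [UIImage imageWithCGImage:rendered
                                          scale:source.scale
                                    orientation:source.imageOrientation];
    CGImageRelease(rendered);
    return result;
}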

These are iOS specific, and you can save them however you want, but I’ve also got 
a save-to-Documents-folder-as-data (not as an image) and a load-image-from-Documents.
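
The save-to-Documents idea is basically along these lines (a sketch, not my actual 
code; the helper names are just for illustration):

// Save a UIImage into the Documents folder as data, and load it back later.
static NSURL *DocumentsURLForName(NSString *name)
{
    NSURL *docs = [[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory
                                                         inDomains:NSUserDomainMask].firstObject;
    return [docs URLByAppendingPathComponent:name];
}

static BOOL SaveImageDataToDocuments(UIImage *image, NSString *name)
{
    NSData *data = UIImagePNGRepresentation(image);   // or UIImageJPEGRepresentation(image, 0.8)
    return [data writeToURL:DocumentsURLForName(name) atomically:YES];
}

static UIImage *LoadImageFromDocuments(NSString *name)
{
    NSData *data = [NSData dataWithContentsOfURL:DocumentsURLForName(name)];
    return data ? [UIImage imageWithData:data] : nil;
}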

Let me know which ones you might want and I can send them to you offlist.

Cheers,
Alex Zavatone



> On Nov 4, 2020, at 5:10 PM, Carl Hoefs  wrote:
> 
> Thanks for the UIImage category resizing methods! They are quite useful.
> 
> -Carl
> 
>> On Nov 4, 2020, at 2:17 PM, Alex Zavatone wrote:
>> 
>> Sorry for the delay.  I hope these do what you need.
>> Of course you’ll need to add checks to make sure that you’re not dividing by 
>> zero or nil.
>> 
>> 
>> 
>> // Alex Zavatone 4/2/16.
>> + (UIImage *)imageWithImage:(UIImage *)image 
>> scaledToHeight:(CGFloat)newHeight 
>> {
>> CGFloat ratio = newHeight / image.size.height;
>> CGFloat newWidth = image.size.width * ratio;
>> 
>> CGSize newSize = CGSizeMake(newWidth, newHeight);
>> //UIGraphicsBeginImageContext(newSize);
>> // In next line, pass 0.0 to use the current device's pixel scaling 
>> factor (and thus account for Retina resolution).
>> // Pass 1.0 to force exact pixel size.
>> UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
>> [image drawInRect:CGRectMake(0, 0, newWidth, newHeight)];
>> UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
>> UIGraphicsEndImageContext();
>> return newImage;
>> }
>> 
>> 
>> + (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize 
>> {
>> //UIGraphicsBeginImageContext(newSize);
>> // In next line, pass 0.0 to use the current device's pixel scaling 
>> factor (and thus account for Retina resolution).
>> // Pass 1.0 to force exact pixel size.
>> UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
>> [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
>> UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
>> UIGraphicsEndImageContext();
>> return newImage;
>> }
>> 
>> + (UIImage *)imageWithImage:(UIImage *)image 
>> scaledToPercentage:(CGFloat)newScale 
>> {
>> CGSize newSize = CGSizeMake(image.size.width * newScale, image.size.height * newScale);
>> UIImage *newImage = [self imageWithImage:image scaledToSize:newSize];
>> 
>> return newImage;
>> }
>> 
>> 
>> Cheers,
>> Alex Zavatone
>> 
>> 
>>> On Nov 3, 2020, at 10:34 AM, James Crate via Cocoa-dev <cocoa-dev@lists.apple.com> wrote:
>>> 
>>> On Nov 2, 2020, at 5:59 PM, Carl Hoefs via Cocoa-dev <cocoa-dev@lists.apple.com> wrote:
>>> 
 I have an iOS app that interacts with a macOS server process. The iOS app 
 takes a 3264x2448 camera image, scales it to 640x480 pixels, and makes a 
 JPEG representation of it to send to the server:
>>> 
>>> I have code that does pretty much the same thing, in Swift though so you’ll 
>>> need to convert the API calls to ObjC. Since you’re taking a picture, you 
>>> could use the AVCapturePhoto directly. 
>>> 
>>> 
>>>let capture : AVCapturePhoto
>>>private lazy var context = CIContext()
>>> 
>>>lazy var remotePreviewImage: Data? = {
>>>guard let cgImage = 
>>> self.capture.cgImageRepresentation()?.takeRetainedValue() else { return nil 
>>> }
>>> 
>>>var baseImg = CIImage(cgImage: cgImage)
>>> 
>>>if let orientation = self.capture.metadata[ 
>>> String(kCGImagePropertyOrientation) ] as? Int32 {
>>>baseImg = baseImg.oriented(forExifOrientation: orientation)
>>>}
>>> 
>>>let scalePct = [800.0 / baseImg.extent.size.width, 800.0 / 
>>> baseImg.extent.size.height].max() ?? 0.3
>>>let transformedImg = baseImg.transformed(by: 
>>> CGAffineTransform(scaleX: scalePct, y: scalePct))
>>>print("generated remote preview image \(transformedImg.extent.size)")
>>> 
>>>let colorspace : CGColorSpace = baseImg.colorSpace ?? 
>>> CGColorSpace(name: CGColorSpace.sRGB)!
>>>let compressionKey = CIImageRepresentationOption(rawValue: 
>>> kCGImageDestinationLossyCompressionQuality as String)
>>>let data = self.context.jpegRepresentation(of: transformedImg, 
>>> colorSpace: colorspace,
>>>   options: [compressionKey 
>>> : 0.6])
>>>print("photo generated preview \(data?.count ?? 0) bytes")
>>>return data
>>>}()
>>> 
>>> 
>>> I had a previous version that used ImageIO. I don’t remember why I switched 
>>> but I still had the commented code hanging around.  
>>> 
>>> //lazy var 

Re: Scaling a UIImage

2020-11-04 Thread Carl Hoefs via Cocoa-dev
Thanks for the UIImage category resizing methods! They are quite useful.

-Carl

> On Nov 4, 2020, at 2:17 PM, Alex Zavatone  wrote:
> 
> Sorry for the delay.  I hope these do what you need.
> Of course you’ll need to add checks to make sure that you’re not dividing by 
> zero or nil.
> 
> 
> 
> // Alex Zavatone 4/2/16.
> + (UIImage *)imageWithImage:(UIImage *)image 
> scaledToHeight:(CGFloat)newHeight 
> {
> CGFloat ratio = newHeight / image.size.height;
> CGFloat newWidth = image.size.width * ratio;
> 
> CGSize newSize = CGSizeMake(newWidth, newHeight);
> //UIGraphicsBeginImageContext(newSize);
> // In next line, pass 0.0 to use the current device's pixel scaling 
> factor (and thus account for Retina resolution).
> // Pass 1.0 to force exact pixel size.
> UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
> [image drawInRect:CGRectMake(0, 0, newWidth, newHeight)];
> UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
> UIGraphicsEndImageContext();
> return newImage;
> }
> 
> 
> + (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize 
> {
> //UIGraphicsBeginImageContext(newSize);
> // In next line, pass 0.0 to use the current device's pixel scaling 
> factor (and thus account for Retina resolution).
> // Pass 1.0 to force exact pixel size.
> UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
> [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
> UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
> UIGraphicsEndImageContext();
> return newImage;
> }
> 
> + (UIImage *)imageWithImage:(UIImage *)image 
> scaledToPercentage:(CGFloat)newScale 
> {
> CGSize newSize = CGSizeMake(image.size.width * newScale, image.size.height * newScale);
> UIImage *newImage = [self imageWithImage:image scaledToSize:newSize];
> 
> return newImage;
> }
> 
> 
> Cheers,
> Alex Zavatone
> 
> 
>> On Nov 3, 2020, at 10:34 AM, James Crate via Cocoa-dev <cocoa-dev@lists.apple.com> wrote:
>> 
>> On Nov 2, 2020, at 5:59 PM, Carl Hoefs via Cocoa-dev <cocoa-dev@lists.apple.com> wrote:
>> 
>>> I have an iOS app that interacts with a macOS server process. The iOS app 
>>> takes a 3264x2448 camera image, scales it to 640x480 pixels, and makes a 
>>> JPEG representation of it to send to the server:
>> 
>> I have code that does pretty much the same thing, in Swift though so you’ll 
>> need to convert the API calls to ObjC. Since you’re taking a picture, you 
>> could use the AVCapturePhoto directly. 
>> 
>> 
>>let capture : AVCapturePhoto
>>private lazy var context = CIContext()
>> 
>>lazy var remotePreviewImage: Data? = {
>>guard let cgImage = 
>> self.capture.cgImageRepresentation()?.takeRetainedValue() else { return nil }
>> 
>>var baseImg = CIImage(cgImage: cgImage)
>> 
>>if let orientation = self.capture.metadata[ 
>> String(kCGImagePropertyOrientation) ] as? Int32 {
>>baseImg = baseImg.oriented(forExifOrientation: orientation)
>>}
>> 
>>let scalePct = [800.0 / baseImg.extent.size.width, 800.0 / 
>> baseImg.extent.size.height].max() ?? 0.3
>>let transformedImg = baseImg.transformed(by: 
>> CGAffineTransform(scaleX: scalePct, y: scalePct))
>>print("generated remote preview image \(transformedImg.extent.size)")
>> 
>>let colorspace : CGColorSpace = baseImg.colorSpace ?? 
>> CGColorSpace(name: CGColorSpace.sRGB)!
>>let compressionKey = CIImageRepresentationOption(rawValue: 
>> kCGImageDestinationLossyCompressionQuality as String)
>>let data = self.context.jpegRepresentation(of: transformedImg, 
>> colorSpace: colorspace,
>>   options: [compressionKey : 
>> 0.6])
>>print("photo generated preview \(data?.count ?? 0) bytes")
>>return data
>>}()
>> 
>> 
>> I had a previous version that used ImageIO. I don’t remember why I switched 
>> but I still had the commented code hanging around.  
>> 
>> //lazy var remotePreviewImage: Data? = {
>> //guard let data = self.capture.fileDataRepresentation() else { 
>> return nil }
>> //guard let src = CGImageSourceCreateWithData(data as NSData, nil) 
>> else { return nil }
>> //let thumbOpts = [
>> //kCGImageSourceCreateThumbnailFromImageAlways: true,
>> //kCGImageSourceCreateThumbnailWithTransform: true,
>> //kCGImageSourceThumbnailMaxPixelSize: 800,
>> //] as [CFString : Any]
>> //
>> //if let cgImage = CGImageSourceCreateThumbnailAtIndex(src, 0, 
>> thumbOpts as CFDictionary) {
>> //// create jpg data
>> //let data = NSMutableData()
>> //
>> //if let dest = CGImageDestinationCreateWithData(data, 
>> kUTTypeJPEG, 1, nil) {
>> //CGImageDestinationAddImage(dest, cgImage, 
>> 

Re: Scaling a UIImage

2020-11-04 Thread Alex Zavatone via Cocoa-dev
Sorry for the delay.  I hope these do what you need.
Of course you’ll need to add checks to make sure that you’re not dividing by 
zero or nil.



// Alex Zavatone 4/2/16.
+ (UIImage *)imageWithImage:(UIImage *)image scaledToHeight:(CGFloat)newHeight 
{
    CGFloat ratio = newHeight / image.size.height;
    CGFloat newWidth = image.size.width * ratio;

    CGSize newSize = CGSizeMake(newWidth, newHeight);
    //UIGraphicsBeginImageContext(newSize);
    // In the next line, pass 0.0 to use the current device's pixel scaling factor
    // (and thus account for Retina resolution). Pass 1.0 to force exact pixel size.
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newWidth, newHeight)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}


+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize 
{
    //UIGraphicsBeginImageContext(newSize);
    // In the next line, pass 0.0 to use the current device's pixel scaling factor
    // (and thus account for Retina resolution). Pass 1.0 to force exact pixel size.
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}

+ (UIImage *)imageWithImage:(UIImage *)image scaledToPercentage:(CGFloat)newScale 
{
    // Scale width and height by the same factor.
    CGSize newSize = CGSizeMake(image.size.width * newScale, image.size.height * newScale);
    UIImage *newImage = [self imageWithImage:image scaledToSize:newSize];

    return newImage;
}
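
If you want the nil/zero checks mentioned above baked in, a rough sketch might look 
like this (the method name is just for illustration):

// Guarded variant: bail out instead of dividing by zero or drawing a nil image.
+ (UIImage *)imageWithImage:(UIImage *)image safelyScaledToHeight:(CGFloat)newHeight
{
    if (image == nil || image.size.height == 0.0 || newHeight <= 0.0) {
        return nil;
    }
    CGFloat ratio = newHeight / image.size.height;
    CGSize newSize = CGSizeMake(image.size.width * ratio, newHeight);
    return [self imageWithImage:image scaledToSize:newSize];
}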


Cheers,
Alex Zavatone


> On Nov 3, 2020, at 10:34 AM, James Crate via Cocoa-dev 
>  wrote:
> 
> On Nov 2, 2020, at 5:59 PM, Carl Hoefs via Cocoa-dev 
>  wrote:
> 
>> I have an iOS app that interacts with a macOS server process. The iOS app 
>> takes a 3264x2448 camera image, scales it to 640x480 pixels, and makes a 
>> JPEG representation of it to send to the server:
> 
> I have code that does pretty much the same thing, in Swift though so you’ll 
> need to convert the API calls to ObjC. Since you’re taking a picture, you 
> could use the AVCapturePhoto directly. 
> 
> 
>let capture : AVCapturePhoto
>private lazy var context = CIContext()
> 
>lazy var remotePreviewImage: Data? = {
>guard let cgImage = 
> self.capture.cgImageRepresentation()?.takeRetainedValue() else { return nil }
> 
>var baseImg = CIImage(cgImage: cgImage)
> 
>if let orientation = self.capture.metadata[ 
> String(kCGImagePropertyOrientation) ] as? Int32 {
>baseImg = baseImg.oriented(forExifOrientation: orientation)
>}
> 
>let scalePct = [800.0 / baseImg.extent.size.width, 800.0 / 
> baseImg.extent.size.height].max() ?? 0.3
>let transformedImg = baseImg.transformed(by: CGAffineTransform(scaleX: 
> scalePct, y: scalePct))
>print("generated remote preview image \(transformedImg.extent.size)")
> 
>let colorspace : CGColorSpace = baseImg.colorSpace ?? 
> CGColorSpace(name: CGColorSpace.sRGB)!
>let compressionKey = CIImageRepresentationOption(rawValue: 
> kCGImageDestinationLossyCompressionQuality as String)
>let data = self.context.jpegRepresentation(of: transformedImg, 
> colorSpace: colorspace,
>   options: [compressionKey : 
> 0.6])
>print("photo generated preview \(data?.count ?? 0) bytes")
>return data
>}()
> 
> 
> I had a previous version that used ImageIO. I don’t remember why I switched 
> but I still had the commented code hanging around.  
> 
> //lazy var remotePreviewImage: Data? = {
> //guard let data = self.capture.fileDataRepresentation() else { 
> return nil }
> //guard let src = CGImageSourceCreateWithData(data as NSData, nil) 
> else { return nil }
> //let thumbOpts = [
> //kCGImageSourceCreateThumbnailFromImageAlways: true,
> //kCGImageSourceCreateThumbnailWithTransform: true,
> //kCGImageSourceThumbnailMaxPixelSize: 800,
> //] as [CFString : Any]
> //
> //if let cgImage = CGImageSourceCreateThumbnailAtIndex(src, 0, 
> thumbOpts as CFDictionary) {
> //// create jpg data
> //let data = NSMutableData()
> //
> //if let dest = CGImageDestinationCreateWithData(data, 
> kUTTypeJPEG, 1, nil) {
> //CGImageDestinationAddImage(dest, cgImage, 
> [kCGImageDestinationLossyCompressionQuality: 0.6] as CFDictionary)
> //CGImageDestinationFinalize(dest)
> //}
> //print("getPhoto generated preview \(data.count) bytes for 
> RemoteCapture")
> //return data as Data
> //}
> //return nil
> //}()
> 
> 
> Jim Crate
> 

Re: Scaling a UIImage

2020-11-03 Thread James Crate via Cocoa-dev
On Nov 2, 2020, at 5:59 PM, Carl Hoefs via Cocoa-dev 
 wrote:

> I have an iOS app that interacts with a macOS server process. The iOS app 
> takes a 3264x2448 camera image, scales it to 640x480 pixels, and makes a JPEG 
> representation of it to send to the server:

I have code that does pretty much the same thing, though it’s in Swift, so you’ll 
need to convert the API calls to ObjC. Since you’re taking a picture, you could 
use the AVCapturePhoto directly. 


    let capture : AVCapturePhoto
    private lazy var context = CIContext()

    lazy var remotePreviewImage: Data? = {
        guard let cgImage = self.capture.cgImageRepresentation()?.takeRetainedValue() else { return nil }

        var baseImg = CIImage(cgImage: cgImage)

        if let orientation = self.capture.metadata[ String(kCGImagePropertyOrientation) ] as? Int32 {
            baseImg = baseImg.oriented(forExifOrientation: orientation)
        }

        let scalePct = [800.0 / baseImg.extent.size.width, 800.0 / baseImg.extent.size.height].max() ?? 0.3
        let transformedImg = baseImg.transformed(by: CGAffineTransform(scaleX: scalePct, y: scalePct))
        print("generated remote preview image \(transformedImg.extent.size)")

        let colorspace : CGColorSpace = baseImg.colorSpace ?? CGColorSpace(name: CGColorSpace.sRGB)!
        let compressionKey = CIImageRepresentationOption(rawValue: kCGImageDestinationLossyCompressionQuality as String)
        let data = self.context.jpegRepresentation(of: transformedImg, colorSpace: colorspace,
                                                   options: [compressionKey : 0.6])
        print("photo generated preview \(data?.count ?? 0) bytes")
        return data
    }()


I had a previous version that used ImageIO. I don’t remember why I switched but 
I still had the commented code hanging around.  

    //lazy var remotePreviewImage: Data? = {
    //    guard let data = self.capture.fileDataRepresentation() else { return nil }
    //    guard let src = CGImageSourceCreateWithData(data as NSData, nil) else { return nil }
    //    let thumbOpts = [
    //        kCGImageSourceCreateThumbnailFromImageAlways: true,
    //        kCGImageSourceCreateThumbnailWithTransform: true,
    //        kCGImageSourceThumbnailMaxPixelSize: 800,
    //    ] as [CFString : Any]
    //
    //    if let cgImage = CGImageSourceCreateThumbnailAtIndex(src, 0, thumbOpts as CFDictionary) {
    //        // create jpg data
    //        let data = NSMutableData()
    //
    //        if let dest = CGImageDestinationCreateWithData(data, kUTTypeJPEG, 1, nil) {
    //            CGImageDestinationAddImage(dest, cgImage, [kCGImageDestinationLossyCompressionQuality: 0.6] as CFDictionary)
    //            CGImageDestinationFinalize(dest)
    //        }
    //        print("getPhoto generated preview \(data.count) bytes for RemoteCapture")
    //        return data as Data
    //    }
    //    return nil
    //}()
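
If it helps with the conversion, the ImageIO path above comes out roughly like this 
in ObjC. This is an untested sketch; the function name is made up, and it needs 
ImageIO plus MobileCoreServices (for kUTTypeJPEG):

// Build a downscaled JPEG from raw photo data using an ImageIO thumbnail.
static NSData *ScaledJPEGDataFromPhotoData(NSData *photoData)
{
    CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)photoData, NULL);
    if (src == NULL) { return nil; }

    NSDictionary *thumbOpts = @{
        (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (id)kCGImageSourceCreateThumbnailWithTransform   : @YES,
        (id)kCGImageSourceThumbnailMaxPixelSize          : @800,
    };
    CGImageRef thumb = CGImageSourceCreateThumbnailAtIndex(src, 0, (__bridge CFDictionaryRef)thumbOpts);
    CFRelease(src);
    if (thumb == NULL) { return nil; }

    NSMutableData *jpegData = [NSMutableData data];
    CGImageDestinationRef dest = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)jpegData,
                                                                  kUTTypeJPEG, 1, NULL);
    if (dest != NULL) {
        CGImageDestinationAddImage(dest, thumb,
            (__bridge CFDictionaryRef)@{ (id)kCGImageDestinationLossyCompressionQuality : @0.6 });
        CGImageDestinationFinalize(dest);
        CFRelease(dest);
    }
    CGImageRelease(thumb);
    return jpegData.length > 0 ? jpegData : nil;
}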


Jim Crate



Re: Scaling a UIImage

2020-11-02 Thread Alex Zavatone via Cocoa-dev
I’ll dig up a utility class that I made for UIImage that has this as well as others.

Cheers.

> On Nov 2, 2020, at 4:59 PM, Carl Hoefs via Cocoa-dev 
>  wrote:
> 
> How can I correctly scale a UIImage from 3264x2448 down to 640x480 pixels?
> 
> 
> I have an iOS app that interacts with a macOS server process. The iOS app 
> takes a 3264x2448 camera image, scales it to 640x480 pixels, and makes a JPEG 
> representation of it to send to the server:
> 
>  NSData *dataObj = UIImageJPEGRepresentation(origImage,0.5);
> 
> The server reads the bytes, and creates an NSImage:
> 
>  NSImage *theImage = [[NSImage alloc] initWithData:imageData];
>  NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] 
> initWithData:[theImage TIFFRepresentation]];
> 
> But at this point, imageRep.pixelsWide=1280 and imageRep.pixelsHigh=960!
> 
> If I write theImage to disk and look at it with Preview, it displays onscreen 
> as 640x480 but Preview's Inspector Tool shows it to be 1280x960.
> 
> On the iOS app side, here's the UIImage category method I'm using to scale 
> the image:
> 
> + (UIImage *)imageWithImage:(UIImage *)image 
>   scaledToSize:(CGSize)newSize 
> {
>UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
>[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
>UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
>UIGraphicsEndImageContext();
>return newImage;
> }
> 
> Any suggestions?
> -Carl
> 



Re: Scaling a UIImage

2020-11-02 Thread David Duncan via Cocoa-dev


> On Nov 2, 2020, at 3:20 PM, Carl Hoefs  wrote:
> 
> Okay. It was my understanding that -TIFFRepresentation was the only way to 
> get serializable image data bytes... What is a more efficient way to do this?

If you want serializable data bytes, sure, but that's not what you appear to be 
doing in the code you sent; you appear to want a bitmap representation. Presumably 
you go on to do something other than serialize it (I imagine you want to draw it 
in some way).

In general, if you want something you can draw, you can get a 
-CGImageForProposedRect:context:hints:, and if you need an NSBitmapImageRep you can 
use -bestRepresentationForRect:context:hints:. But assuming you're trying to get 
that NSImage somewhere other than into (say) an NSImageView, there may be 
better APIs or pathways to that solution.
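
For example, something along these lines (a sketch reusing the names from your code):

// Pull a CGImage out of the received NSImage without a TIFF round trip.
NSImage *theImage = [[NSImage alloc] initWithData:imageData];
NSRect proposedRect = NSMakeRect(0, 0, theImage.size.width, theImage.size.height);
CGImageRef cgImage = [theImage CGImageForProposedRect:&proposedRect context:nil hints:nil];
// Pixel dimensions, independent of the image's point size:
size_t pixelsWide = CGImageGetWidth(cgImage);
size_t pixelsHigh = CGImageGetHeight(cgImage);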

> -Carl
> 
> 
>> On Nov 2, 2020, at 3:09 PM, David Duncan wrote:
>> 
>> Also any code using -TIFFRepresentation for any reason other than to get 
>> actual TIFF data is likely suspect. There are absolutely more efficient ways 
>> to do this.
> 



Re: Scaling a UIImage

2020-11-02 Thread Carl Hoefs via Cocoa-dev
Okay. It was my understanding that -TIFFRepresentation was the only way to get 
serializable image data bytes... What is a more efficient way to do this?
-Carl


> On Nov 2, 2020, at 3:09 PM, David Duncan  wrote:
> 
> Also any code using -TIFFRepresentation for any reason other than to get 
> actual TIFF data is likely suspect. There are absolutely more efficient ways 
> to do this.



Re: Scaling a UIImage

2020-11-02 Thread Carl Hoefs via Cocoa-dev
Yes! That's what I overlooked. "native" isn't what I intended.

Thanks!
-Carl


> On Nov 2, 2020, at 3:09 PM, David Duncan  wrote:
> 
>>   UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
> 
> Explicitly pass 1 here.
> 
> 



Re: Scaling a UIImage

2020-11-02 Thread David Duncan via Cocoa-dev


> On Nov 2, 2020, at 2:59 PM, Carl Hoefs via Cocoa-dev 
>  wrote:
> 
> How can I correctly scale a UIImage from 3264x2448 down to 640x480 pixels?
> 
> 
> I have an iOS app that interacts with a macOS server process. The iOS app 
> takes a 3264x2448 camera image, scales it to 640x480 pixels, and makes a JPEG 
> representation of it to send to the server:
> 
>  NSData *dataObj = UIImageJPEGRepresentation(origImage,0.5);
> 
> The server reads the bytes, and creates an NSImage:
> 
>  NSImage *theImage = [[NSImage alloc] initWithData:imageData];
>  NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] 
> initWithData:[theImage TIFFRepresentation]];

Do you specifically need an NSBitmapImageRep? What is your next step here?

Also any code using -TIFFRepresentation for any reason other than to get actual 
TIFF data is likely suspect. There are absolutely more efficient ways to do 
this.

But which way you should do it depends on your next step...

> 
> But at this point, imageRep.pixelsWide=1280 and imageRep.pixelsHigh=960!

You got 1280x960 because you’re on a 2x device and your resizing code didn’t 
specify a scale for the image (you passed 0.0, which means to use the “native” 
scale).
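
In other words, on the iOS side, something like this (a sketch of the same drawing 
code with the scale pinned):

// Pass 1.0 so the bitmap is exactly newSize pixels, regardless of the screen scale.
UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0);
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
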

> 
> If I write theImage to disk and look at it with Preview, it displays onscreen 
> as 640x480 but Preview's Inspector Tool shows it to be 1280x960.
> 
> On the iOS app side, here's the UIImage category method I'm using to scale 
> the image:
> 
> + (UIImage *)imageWithImage:(UIImage *)image 
>   scaledToSize:(CGSize)newSize 
> {
>UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);

Explicitly pass 1 here.

>[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
>UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
>UIGraphicsEndImageContext();
>return newImage;
> }
> 
> Any suggestions?
> -Carl
> 



Scaling a UIImage

2020-11-02 Thread Carl Hoefs via Cocoa-dev
How can I correctly scale a UIImage from 3264x2448 down to 640x480 pixels?


I have an iOS app that interacts with a macOS server process. The iOS app takes 
a 3264x2448 camera image, scales it to 640x480 pixels, and makes a JPEG 
representation of it to send to the server:

  NSData *dataObj = UIImageJPEGRepresentation(origImage,0.5);

The server reads the bytes, and creates an NSImage:

  NSImage *theImage = [[NSImage alloc] initWithData:imageData];
  NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithData:[theImage TIFFRepresentation]];

But at this point, imageRep.pixelsWide=1280 and imageRep.pixelsHigh=960!

If I write theImage to disk and look at it with Preview, it displays onscreen 
as 640x480 but Preview's Inspector Tool shows it to be 1280x960.

On the iOS app side, here's the UIImage category method I'm using to scale the 
image:

+ (UIImage *)imageWithImage:(UIImage *)image 
               scaledToSize:(CGSize)newSize 
{
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}

Any suggestions?
-Carl
