iOS - Efficient downsampling with Core Image


Per the Getting the Best Performance page:

Use Core Graphics or Image I/O functions to crop or downsample, such as the functions CGImageCreateWithImageInRect or CGImageSourceCreateThumbnailAtIndex.

However, I'm wondering how true this is if you're working solely in Core Image for your image processing. If you have an image that needs to be downsampled and filtered, along with other operations, wouldn't it be less efficient to convert to a CGImage, downsample, and then convert back to a CIImage for the remaining steps?
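For reference, the Image I/O approach the documentation recommends can be sketched as follows. This is an illustrative helper, not code from the question; the API names (CGImageSourceCreateWithURL, CGImageSourceCreateThumbnailAtIndex and the kCGImageSource... option keys) are the real Image I/O functions, and the key advantage is that the image is decoded directly at the target size rather than at full resolution first:

```swift
import ImageIO
import CoreGraphics
import Foundation

/// Downsample an image on disk with Image I/O, decoding straight
/// to the target size so the full-resolution bitmap never has to
/// be held in memory.
func downsampledImage(at url: URL, maxPixelSize: Int) -> CGImage? {
    // Don't decode the full image up front.
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else {
        return nil
    }
    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,  // ignore any embedded thumbnail
        kCGImageSourceShouldCacheImmediately: true,          // decode now, at thumbnail size
        kCGImageSourceCreateThumbnailWithTransform: true,    // apply EXIF orientation
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    return CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions)
}
```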

I'm wondering if it's better to stay within the Core Image framework when downsampling is part of the image processing algorithm you're performing. If the approach above is faster I'd give it a try, but I'm not sure there's any other way to downsample as fast as possible. Unfortunately, CILanczosScaleTransform is horribly slow; I wish Core Image had a faster built-in way to scale images besides this.
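For context, this is roughly how CILanczosScaleTransform is used — the in-framework scaling option being dismissed as too slow. A minimal sketch (the helper name is mine; the filter name and the kCIInput... keys are Core Image's own):

```swift
import CoreImage

/// Scale a CIImage with CILanczosScaleTransform. High quality,
/// but reportedly slow compared to CG/Image I/O downsampling.
func lanczosScaled(_ image: CIImage, by scale: CGFloat) -> CIImage? {
    guard let filter = CIFilter(name: "CILanczosScaleTransform") else {
        return nil
    }
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(scale, forKey: kCIInputScaleKey)       // e.g. 0.25 for quarter size
    filter.setValue(1.0, forKey: kCIInputAspectRatioKey)   // preserve aspect ratio
    return filter.outputImage
}
```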

I'm using the code below, found here:

http://flexmonkey.blogspot.com/2014/12/scaling-resizing-and-orienting-images.html

    extension UIImage {
        public func resizeToBoundingSquare(_ boundingSquareSideLength: CGFloat) -> UIImage {
            // Scale by whichever dimension is larger, so the result fits the square.
            let imgScale = self.size.width > self.size.height
                ? boundingSquareSideLength / self.size.width
                : boundingSquareSideLength / self.size.height
            let newWidth = self.size.width * imgScale
            let newHeight = self.size.height * imgScale
            let newSize = CGSize(width: newWidth, height: newHeight)
            UIGraphicsBeginImageContext(newSize)
            self.draw(in: CGRect(x: 0, y: 0, width: newWidth, height: newHeight))
            let resizedImage = UIGraphicsGetImageFromCurrentImageContext()
            UIGraphicsEndImageContext()
            return resizedImage!
        }
    }

After downsizing things and/or making the various pixel sizes consistent, I use CI filters, both custom and chained. I'm not seeing any performance or memory issues.
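The pipeline described here — downsample first outside Core Image, then convert once and chain filters — might be sketched like this. The filter choices (CIGaussianBlur, CIColorControls) and parameter values are purely illustrative stand-ins for whatever custom/chained filters are actually used:

```swift
import CoreImage
import CoreGraphics

/// Downsample-first pipeline: wrap an already-downsampled CGImage as
/// a CIImage once, then chain filters with no further conversions.
func filteredImage(from downsampled: CGImage) -> CIImage {
    var image = CIImage(cgImage: downsampled)

    // Illustrative filter chain — substitute your own filters here.
    if let blur = CIFilter(name: "CIGaussianBlur") {
        blur.setValue(image, forKey: kCIInputImageKey)
        blur.setValue(2.0, forKey: kCIInputRadiusKey)
        image = blur.outputImage ?? image
    }
    if let controls = CIFilter(name: "CIColorControls") {
        controls.setValue(image, forKey: kCIInputImageKey)
        controls.setValue(1.1, forKey: kCIInputContrastKey)
        image = controls.outputImage ?? image
    }
    return image
}
```

Because every filter here consumes and produces a CIImage, Core Image can concatenate the chain into a single render pass; the only CGImage/CIImage conversion happens once, at the start.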
