Some Important Aspects of UIKit Performance

Even though many articles and WWDC sessions are devoted to UIKit performance, this topic still remains obscure to a lot of iOS developers. For this reason we decided to put together the most crucial questions and issues that a fast and smooth UI primarily depends on.

The first issue we are going to draw your attention to is color blending and what it actually is.

Color blending

Blending is the stage of frame rendering that computes the final pixel color. Each UIView (each CALayer, to be precise) affects the color of the final pixels through a combination of properties such as alpha, backgroundColor, and opaque, together with all of the overlying views.

Let’s start from the most used UIView properties, such as UIView.alpha, UIView.opaque and UIView.backgroundColor.

Opaque vs Transparent

UIView.opaque is a hint for the renderer that allows the view to be treated as a fully opaque tile, thus improving drawing performance. Opaque means: “Don't draw anything underneath”. It allows the renderer to skip drawing the underlying views as well as the color blending needed to produce the final color. It simply uses the topmost color of the view.
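As a minimal illustration (our own snippet, not part of the original sample), marking a view with a solid background as opaque lets the renderer take this fast path:

```objc
// A fully opaque view: the renderer can skip blending it with anything beneath.
UIView *tile = [[UIView alloc] initWithFrame:CGRectMake(0.f, 0.f, 100.f, 100.f)];
tile.backgroundColor = [UIColor whiteColor];
tile.opaque = YES; // YES is the default, but keep it truthful: the view really is opaque
```

The flag is only a hint; setting it to YES on a view that is actually transparent produces drawing artifacts rather than a speedup.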


If the value of the alpha property is less than 1, the opaque flag (even if its value is YES) will be ignored.

- (void)loadView {
 [super loadView];

 UIView *view = [[UIView alloc] initWithFrame:CGRectMake(35.f, 35.f, 200.f, 200.f)];
 view.backgroundColor = [UIColor purpleColor];
 view.alpha = 0.5f;

 [self.view addSubview:view];
}

Even though the default value of opaque is YES, we get color blending as a result, because we made our view transparent by setting its alpha property to less than 1.

How to test?

NOTE: If you want accurate information about real performance, you should test your app on a real iOS device, not the Simulator. The CPU of an iOS device is slower than your Mac's, and the GPU of an iOS device is very different from the one in a Mac.

In the iOS Simulator Debug menu you can find an item called ‘Color Blended Layers’. It makes the debugger show blended view layers, those where multiple semitransparent layers overlap one another. View layers that are drawn on top of each other with blending enabled are highlighted in red, while view layers that are drawn without blending are highlighted in green.

To use the Core Animation tool from Instruments, we need to have a real device connected.

Core Animation interface

Here you can find the ‘Color Blended Layers’ option as well.


Alpha channel in images

The same issue arises when we try to figure out how changing the alpha channel affects UIImageView transparency (along with the alpha property of UIImageView itself). Let’s use a category on UIImage to produce another image with a custom alpha channel:

#import "UIImage+ESImageByAlphaComponent.h"

@implementation UIImage (ESImageByAlphaComponent)

- (UIImage *)imageByApplyingAlpha:(CGFloat)alpha {
 UIGraphicsBeginImageContextWithOptions(self.size, NO, 0.0f);

 CGContextRef ctx = UIGraphicsGetCurrentContext();
 CGRect area = CGRectMake(0, 0, self.size.width, self.size.height);

 CGContextScaleCTM(ctx, 1, -1);
 CGContextTranslateCTM(ctx, 0, -area.size.height);
 CGContextSetBlendMode(ctx, kCGBlendModeMultiply);
 CGContextSetAlpha(ctx, alpha);

 CGContextDrawImage(ctx, area, self.CGImage);
 UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
 UIGraphicsEndImageContext();

 return newImage;
}

@end

We are going to consider 4 cases:

  1. The instance of UIImageView has the default value of the alpha property (1.0) and the image doesn’t have an alpha channel.
  2. The instance of UIImageView has the default value of the alpha property (1.0) and the image has an alpha channel modified to 0.5.
  3. The instance of UIImageView has a modified value of the alpha property and the image doesn’t have an alpha channel.
  4. The instance of UIImageView has a modified value of the alpha property and the image has an alpha channel modified to 0.5.

- (void)viewDidLoad {
 [super viewDidLoad];

 UIImage *image = [UIImage imageNamed:@"flower.jpg"];
 UIImage *imageWithAlpha = [image imageByApplyingAlpha:0.5f];

 //The 1st case
 [self.imageViewWithImageHasDefaulAlphaChannel setImage:image];

 //The 2nd case
 [self.imageViewWihImageHaveCustomAlphaChannel setImage:imageWithAlpha];

 //The 3rd case
 self.imageViewHasChangedAlphaWithImageHasDefaultAlpha.alpha = 0.5f;
 [self.imageViewHasChangedAlphaWithImageHasDefaultAlpha setImage:image];

 //The 4th case
 self.imageViewHasChangedAlphaWithImageHasCustomAlpha.alpha = 0.5f;
 [self.imageViewHasChangedAlphaWithImageHasCustomAlpha setImage:imageWithAlpha];
}


The iOS Simulator highlights the blended view layers. So even if our UIImageView instance has an alpha property with the default value of 1.0, but its image has a modified alpha channel, we still get a blended layer.

Apple’s official documentation encourages us to pay attention to blended view layers:

“Reduce the amount of red in your app when this option is selected to dramatically improve your app’s performance. Blended view layers are often the cause of slow table scrolling.”

Rendering transparent layers requires additional computation: the system has to blend each layer with the layer below it to compute the final color before drawing.
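As a rough sketch of what that per-pixel work looks like, here is the classic “source over” blend for a single color channel (the helper name is ours, and real compositing also deals with premultiplied alpha):

```objc
// A sketch of per-channel "source over" blending (premultiplication omitted).
// The GPU performs the equivalent of this for every blended pixel, every frame.
static inline CGFloat ESBlendedChannel(CGFloat top, CGFloat topAlpha, CGFloat bottom) {
    return top * topAlpha + bottom * (1.f - topAlpha);
}
// E.g. a channel value of 1.0 at alpha 0.5 over a white background (1.0)
// stays 1.0, while over a black background (0.0) it becomes 0.5.
```

An opaque layer reduces this to a simple copy of the top color, which is exactly why the renderer can skip the work.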

Offscreen rendering

Offscreen rendering is drawing that cannot be done with hardware acceleration (on the GPU) and has to be performed on the CPU instead.

At the low level it looks like this: while rendering a layer that requires offscreen rendering, the GPU stops the rendering pipeline and passes control to the CPU. The CPU in its turn performs all the necessary operations (e.g. your fancy stuff in drawRect:) and returns control to the GPU along with the already rendered layer. The GPU draws it and the rendering pipeline keeps going.

In addition, the offscreen rendering requires allocation of the additional memory for a so-called backing store. Meanwhile, this is not needed for the hardware-accelerated layers.

Onscreen rendering

Offscreen rendering

So what effects / settings lead to the offscreen rendering evil? Let's go through them:

  • custom drawRect: (any, even if you simply fill the background with color)
  • CALayer corner radius
  • CALayer shadow
  • CALayer mask
  • any custom drawing using CGContext

We can easily detect offscreen rendering using Core Animation tool in Instruments if the option Color Offscreen-Rendered Yellow is on. Each place where offscreen rendering occurs will be covered with a yellow overlay.



Now we are going to consider a couple of cases and test their performance. We will try to find solutions that improve performance while still realizing your vision of good design.

Test environment:

  • Device: iPhone 4 with iOS 7.1.1 (11D201) on board
  • Xcode: 5.1.1 (5B1008)
  • MacBook Pro 15 Retina (ME294) with OS X 10.9.3 (13D65) on board

Rounded corners

Let’s create a simple tableView with a custom cell and put a UIImageView and a UILabel into our cell prototype. Remember the good old times when the buttons were round? To achieve this fancy effect in a tableView we need to set CALayer.cornerRadius and set CALayer.masksToBounds to YES.

- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
 NSString *identifier = NSStringFromClass(CRTTableViewCell.class);

 CRTTableViewCell *cell = [self.tableView dequeueReusableCellWithIdentifier:identifier];

 cell.imageView.layer.cornerRadius = 30;
 cell.imageView.layer.masksToBounds = YES;
 cell.imageView.image = [UIImage imageNamed:@"flower.jpg"];
 cell.textLabel.text = [NSString stringWithFormat:@"Cell:%ld", (long)indexPath.row];

 return cell;
}

Despite the fact that we have achieved the desired effect, even without Instruments it is obvious that the performance is very far from the recommended 60 FPS. But we are not going to gaze into a crystal ball for untested numeric answers. Instead, we’ll just measure the performance with Instruments.

First of all, let’s enable the Color Offscreen-Rendered Yellow option. All the UIImageView instances in each cell are covered with a yellow overlay.


We should also check the performance with the Core Animation and OpenGL ES Driver tools in Instruments.

Speaking of the OpenGL ES Driver tool, what does it give us? To understand how it works, let’s look at the GPU internals. The GPU has two components: the Renderer and the Tiler. The Renderer’s responsibility is to draw the vertex data, while the order and composition are defined by the Tiler. The Tiler’s job is to subdivide the frame into tiles and determine their visibility; only the visible tiles are then passed to the Renderer (i.e. rendering is deferred).

If the value of Renderer Utilization is higher than ~50%, it suggests that the animation may be fill-rate limited. If Tiler Utilization is higher than ~50%, it suggests that the animation may be geometry limited, meaning that there may be too many layers on the screen.


So now it is clear that we should look for another approach that achieves the desired effect and at the same time improves the performance. Let’s use a category on UIImage to round the corners instead of using the cornerRadius property.

@implementation UIImage (YALExtension)

- (UIImage *)yal_imageWithRoundedCornersAndSize:(CGSize)sizeToFit {
 CGRect rect = (CGRect){0.f, 0.f, sizeToFit};

 UIGraphicsBeginImageContextWithOptions(sizeToFit, NO, UIScreen.mainScreen.scale);
 [[UIBezierPath bezierPathWithRoundedRect:rect cornerRadius:sizeToFit.width] addClip];

 [self drawInRect:rect];
 UIImage *output = UIGraphicsGetImageFromCurrentImageContext();
 UIGraphicsEndImageContext();

 return output;
}

@end

And now we are going to modify the implementation of the dataSource method cellForRowAtIndexPath:.

- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
 NSString *identifier = NSStringFromClass(CRTTableViewCell.class);
 CRTTableViewCell *cell = [self.tableView dequeueReusableCellWithIdentifier:identifier];

 cell.myTextLabel.text = [NSString stringWithFormat:@"Cell:%ld", (long)indexPath.row];
 UIImage *image = [UIImage imageNamed:@"flower.jpg"];
 cell.imageViewForPhoto.image = [image yal_imageWithRoundedCornersAndSize:cell.imageViewForPhoto.bounds.size];

 return cell;
}
The drawing code is called only once, when the object is first placed on the screen. The result is cached in the view’s CALayer and can be animated without additional drawing. Although this approach is slower than the Core Animation methods, it converts a per-frame cost into a one-time cost.
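Going one step further, that one-time cost can even be moved off the main thread. This is a sketch of ours, not part of the original sample; it assumes `image`, `size`, and the cell from the surrounding code, and a real implementation must also guard against cell reuse:

```objc
// Hypothetical refinement: render the rounded image on a background queue,
// then assign it on the main queue. Before assigning, a production version
// should verify the cell has not been reused for another row in the meantime.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    UIImage *rounded = [image yal_imageWithRoundedCornersAndSize:size];
    dispatch_async(dispatch_get_main_queue(), ^{
        cell.imageViewForPhoto.image = rounded;
    });
});
```

This keeps even the one-time drawing out of the scroll-critical main run loop.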

Before we get back to performance measuring, let’s check the offscreen rendering one more time.




57 – 60 FPS! We managed to double the performance and to considerably reduce both Tiler Utilization and Renderer Utilization.

Avoid abusing -drawRect: overrides

Keep in mind that -drawRect: also leads to offscreen rendering, even in a simple case when all you need is to fill the background with color.

This is especially wasteful when you implement -drawRect: only for such simple operations as setting a background color, instead of taking advantage of the UIView backgroundColor property, which is available out of the box.

This approach is unreasonable for two reasons.

Firstly, system views may implement private drawing methods to render their content, and it is obvious that Apple puts effort into optimizing drawing. We should also remember the backing store: a new backing image for the view, with pixel dimensions equal to the view size multiplied by contentsScale, which is cached until the view needs to update it.

Secondly, if we avoid overriding -drawRect:, we don’t need to allocate the additional memory for the backing store and zero it out on every new drawing cycle.
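To make the contrast concrete, here is a sketch of the wasteful variant (the subclass name is hypothetical):

```objc
// Wasteful: overriding -drawRect: just for a flat color forces a backing
// store and CPU drawing for every instance of this hypothetical subclass.
@implementation CRTSolidView

- (void)drawRect:(CGRect)rect {
    [[UIColor purpleColor] setFill];
    UIRectFill(rect);
}

@end
```

The cheap alternative is a single property assignment, `view.backgroundColor = [UIColor purpleColor];`, with no override and no backing store at all.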



Another way to speed up offscreen rendering is to use the CALayer.shouldRasterize property. It hints the drawing system to render the layer once and cache its contents until the layer needs to be redrawn.

However, despite the potential performance gain, if the layer needs to be redrawn too often the extra cost of caching makes it useless, because the system will rasterize the layer after each redraw.

In the end the usage of CALayer.shouldRasterize depends on a certain use-case and Instruments.
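A typical (hypothetical) use on a table view cell looks like this; the rasterizationScale line matters, because rasterization happens at 1x by default and would look blurry on Retina screens:

```objc
// Cache the cell's layer as a bitmap; worth it only if its content is
// stable between frames (no per-frame redraw).
cell.layer.shouldRasterize = YES;
cell.layer.rasterizationScale = UIScreen.mainScreen.scale;
```

Verify the effect with the Color Hits Green and Misses Red option in the Core Animation tool: red overlays mean the cache is being discarded and re-rasterized too often.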

Shadows & shadowPath

Shadows can make a UI more beautiful, and on iOS adding a shadow effect takes next to nothing:

cell.imageView.layer.shadowRadius = 30;
cell.imageView.layer.shadowOpacity = 0.5f;


With Color Offscreen-Rendered Yellow turned on we can see that the shadow adds offscreen rendering: by default it causes Core Animation to compute the shadow path in real time, which lowers FPS.

What does Apple say?

“Letting Core Animation determine the shape of a shadow can be expensive and impact your app’s performance. Rather than letting Core Animation determine the shape of the shadow, specify the shadow shape explicitly using the shadowPath property of CALayer. When you specify a path object for this property, Core Animation uses that shape to draw and cache the shadow effect. For layers whose shape never changes or rarely changes, this greatly improves performance by reducing the amount of rendering done by Core Animation.”

So basically we need to provide a cached shadow path (CGPath) to Core Animation, which is quite easy to do:

UIBezierPath *shadowPath = [UIBezierPath bezierPathWithRect:cell.imageView.bounds];
[cell.imageView.layer setShadowPath:shadowPath.CGPath];

Just with one line of code we removed the offscreen rendering and enhanced the performance by miles.

So, as you can see, a lot of UI-related performance issues can be solved quite easily. One small remark: don’t forget to measure both before and after the optimization. :)
