Swift - UIView draw 1 pixel-width line

Two issues here:

  1. CoreGraphics drawing functions work in terms of points (the unit of screen layout, which stays roughly constant in physical size across devices), not pixels. The number of pixels per point depends on the device's screen scale: the iPad 2 and the original iPad mini are the only 1x devices supported in iOS 7 and later; the iPhone 4 and later, iPad 3 and later, and iPad mini 2 and later are all 2x; and the iPhone 6/6s/7 Plus are 3x. So if you want a one-device-pixel hairline, you need a 0.5-point line width on most current devices (and a 0.33-point width on the Plus phones).

  2. A stroke extends an equal distance to either side of its path. So if you draw a 1-point line from (10.0, 10.0) to (10.0, 20.0), it actually spans from x = 9.5 to x = 10.5 — on a 1x display, antialiasing shades both the pixel column at x = 9 and the one at x = 10 at 50% of the line color, instead of shading one column at 100%. To fix this, you need to position your line so it falls entirely within a pixel (a device pixel, not a layout point); see the snippet after this list.
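
Putting those two points together before the full example below: the hairline width is 1 / scale, and a coordinate can be snapped so the stroke lands entirely inside one device pixel. Here's a minimal sketch of that arithmetic — the pixelAlignedX helper is a made-up name for illustration, not part of the original test case:

import UIKit

// Hypothetical helper: snap an x-coordinate (in points) to the center of the
// nearest device pixel, so a stroke of width 1 / scale fills exactly one
// pixel column instead of straddling two.
func pixelAlignedX(_ x: CGFloat, scale: CGFloat) -> CGFloat {
    // round to the nearest pixel boundary, then step half a pixel inward
    ((x * scale).rounded() / scale) + 1 / (2 * scale)
}

let scale = UIScreen.main.scale
let hairlineWidth = 1 / scale             // 1pt at 1x, 0.5pt at 2x, ~0.33pt at 3x
let alignedX = pixelAlignedX(50, scale: scale)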

So, to get a one-pixel hairline, you need to both reduce your line width and offset the points you're drawing by an amount that varies with the screen's scale factor. Here's an expanded version of your test case that does both:

// pass in the scale of your UIScreen
func drawHairline(in context: CGContext, scale: CGFloat, color: CGColor) {

    // pick which row/column of pixels to treat as the "center" of a point
    // through which to draw lines -- favor true center for odd scales, or
    // offset to the side for even scales so we fall on pixel boundaries
    let center: CGFloat
    if Int(scale) % 2 == 0 {
        center = 1 / (scale * 2)
    } else {
        center = 0
    }

    let offset = 0.5 - center // use the "center" choice to create an offset
    let p1 = CGPoint(x: 50 + offset, y: 50 + offset)
    let p2 = CGPoint(x: 50 + offset, y: 75 + offset)

    // draw line of minimal stroke width
    let width = 1 / scale
    context.setLineWidth(width)
    context.setStrokeColor(color)
    context.beginPath()
    context.move(to: p1)
    context.addLine(to: p2)
    context.strokePath()
}
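
For context, here's one way to call it from a view's draw(_:) — a sketch, with HairlineView being a made-up name, using the scale of whichever screen the view is actually on and falling back to the main screen:

import UIKit

class HairlineView: UIView {
    override func draw(_ rect: CGRect) {
        guard let context = UIGraphicsGetCurrentContext() else { return }
        // prefer the scale of the screen this view is attached to
        let scale = window?.screen.scale ?? UIScreen.main.scale
        drawHairline(in: context, scale: scale, color: UIColor.red.cgColor)
    }
}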

The center calculation generalizes the issue of having to choose which sub-point pixel to draw your line through. You have to draw through the center of the point (offset 0.5) to shade a whole pixel on a 1x display, or to shade only the middle pixel of the point on a 3x display; but on a 2x display, offset 0.5 falls in between the two pixels that make up one point. So for 2x, you have to choose offset 0.25 or offset 0.75 — the center branch makes that choice for even or odd scale factors in general.
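
Tracing that math for the scale factors in play makes the choices concrete:

import CoreGraphics

for scale in [CGFloat(1), 2, 3] {
    let center: CGFloat = Int(scale) % 2 == 0 ? 1 / (scale * 2) : 0
    let offset = 0.5 - center
    print("scale \(scale): width \(1 / scale), offset \(offset)")
}
// scale 1.0: width 1.0,  offset 0.5  -> fills the point's single pixel
// scale 2.0: width 0.5,  offset 0.25 -> fills one of the point's two pixels
// scale 3.0: width 0.33, offset 0.5  -> fills the middle of the point's three pixels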

Note #1: I changed your test case to draw a vertical line, because it's easier to see the effect of antialiasing that way. A diagonal line will get some antialiasing no matter what, but a vertical or horizontal line will get no antialiasing if it's of the right width and in the right place.

Note #2: iPhone 6/6s/7 Plus has a logical scale of 3.0 but a physical display scale of about 2.61 (the 3x rendering is downsampled to the screen) — you might want to experiment with screen.scale versus screen.nativeScale to see which gives you better-looking results.
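
You can compare the two at runtime — a quick sketch:

import UIKit

let screen = UIScreen.main
print(screen.scale)        // logical scale: 3.0 on the Plus phones
print(screen.nativeScale)  // physical scale: about 2.61 on iPhone 6/6s/7 Plus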


I use this code for a view with a 1-pixel bottom line. Other examples render badly on 3x displays such as the 6 Plus.

import UIKit

class ViewWithBottomLine: UIView {
    @IBInspectable var separatorColor: UIColor = .lightGray // substitute your default color

    override func draw(_ rect: CGRect) {
        super.draw(rect)

        guard let context = UIGraphicsGetCurrentContext() else {
            return
        }

        let scale = UIScreen.main.scale

        let width = 1 / scale
        let offset = width / 2

        context.setLineWidth(width)
        context.setStrokeColor(separatorColor.cgColor)
        context.beginPath()
        context.move(to: CGPoint(x: 0, y: rect.maxY - offset))
        context.addLine(to: CGPoint(x: rect.maxX, y: rect.maxY - offset))
        context.strokePath()
    }
}
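
If you're instantiating the view in code rather than Interface Builder, a usage sketch might look like this (the frame and color are arbitrary examples):

let separator = ViewWithBottomLine(frame: CGRect(x: 0, y: 0, width: 320, height: 44))
separator.separatorColor = .darkGray   // example color
separator.contentMode = .redraw        // re-run draw(_:) when the view resizes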