What exactly is UIFont's point size?
I am struggling to understand exactly what the point size in UIFont
means. It's not pixels, and it doesn't appear to match the standard typographic definition of a point, which is 1/72 of an inch.
I worked out the pixel size using -[NSString sizeWithFont:]
of fonts at various sizes and got the following:
| Point Size | Pixel Size |
| ---------- | ---------- |
| 10.0 | 13.0 |
| 20.0 | 24.0 |
| 30.0 | 36.0 |
| 40.0 | 47.0 |
| 50.0 | 59.0 |
| 72.0 | 84.0 |
| 99.0 | 115.0 |
| 100.0 | 116.0 |
(I did [@"A" sizeWithFont:[UIFont systemFontOfSize:theSize]]
)
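A minimal sketch of such a measurement loop (assuming the same, now-deprecated -[NSString sizeWithFont:] call) would look roughly like this:

```objc
#import <UIKit/UIKit.h>

static void LogPointToPixelTable(void) {
    CGFloat sizes[] = { 10.0, 20.0, 30.0, 40.0, 50.0, 72.0, 99.0, 100.0 };
    for (size_t i = 0; i < sizeof(sizes) / sizeof(sizes[0]); i++) {
        UIFont *font = [UIFont systemFontOfSize:sizes[i]];
        // sizeWithFont: reports the size in logical points, not device pixels
        // (on a 1x display the two coincide).
        CGSize measured = [@"A" sizeWithFont:font];
        NSLog(@"%.1f pt -> measured height %.1f", sizes[i], measured.height);
    }
}
```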
And looking at the 72.0
point size, that is not one inch, since this is on a device with a DPI of 163; one inch should be 163.0 pixels, right?
Can anyone explain what a "point" in UIFont
terms is then? i.e. is my method above wrong, and if I used something else would I see that the font is 163 pixels tall at 72 points? Or is a point defined relative to something else entirely?
A font has an internal coordinate system (think of it as a unit square) within which a glyph's vector coordinates are specified at whatever arbitrary size accommodates all the glyphs in the font, plus or minus any margin the font designer chooses.
At 72.0 points the font's unit square is one inch. Glyph x of font y has an arbitrary size in relation to this inch square. Thus a font designer can make a font that appears large or small in relation to other fonts. This is part of the font's 'character'.
So, drawing an 'A' at 72 points tells you that it will be twice as high as an 'A' drawn at 36 points in the same font - and absolutely nothing else about what the actual bitmap size will be.
i.e. for a given font, the only way to determine the relationship between point size and pixels is to measure it.
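A quick sketch of what that means in practice, using UIFont's own metrics rather than rendered pixels (Helvetica here is just an arbitrary example):

```objc
UIFont *small = [UIFont fontWithName:@"Helvetica" size:36.0];
UIFont *large = [UIFont fontWithName:@"Helvetica" size:72.0];
// Every metric of a given font scales linearly with the point size,
// but the absolute values are the font designer's choice.
NSLog(@"capHeight ratio: %.3f", large.capHeight / small.capHeight);  // ~2.000
NSLog(@"capHeight at 72pt: %.1f pt (not 72, and not an inch of pixels)", large.capHeight);
```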
I am not sure how -[NSString sizeWithFont:] measures the height. Does it use the line height or the distance between the extremes of the Bézier outlines? What text did you use?
I believe -[UIFont lineHeight] would be a better way to measure the height.
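For example, a rough sketch (all of these UIFont metrics are reported in points):

```objc
UIFont *font = [UIFont systemFontOfSize:72.0];
// descender is negative; lineHeight is roughly ascender - descender,
// plus any leading the font specifies.
NSLog(@"ascender %.1f, descender %.1f, leading %.1f, lineHeight %.1f",
      font.ascender, font.descender, font.leading, font.lineHeight);
```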
Edit:
Also, note that none of the measurement methods returns the size in pixels. It returns the size in points. You have to multiply the result by [UIScreen mainScreen].scale.
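Something along these lines (a sketch; sizeWithFont: is deprecated, but it matches the measurements in the question):

```objc
UIFont *font = [UIFont systemFontOfSize:20.0];
CGSize sizeInPoints = [@"A" sizeWithFont:font];      // logical points
CGFloat scale = [UIScreen mainScreen].scale;         // 1.0, 2.0 or 3.0
CGSize sizeInPixels = CGSizeMake(sizeInPoints.width * scale,
                                 sizeInPoints.height * scale);
NSLog(@"%@ pt -> %@ px",
      NSStringFromCGSize(sizeInPoints), NSStringFromCGSize(sizeInPixels));
```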
Note the difference between typographic points used when constructing the font and points from the iOS default logical coordinate space. Unfortunately, the difference is not explained very clearly in the documentation.
I agree this is very confusing. I'll try to give some basic explanation here to make things clearer.
First, the DPI (dots per inch) concept comes from printing on physical paper, and so do fonts. An inch is too large for typical text sizes, so the point, which settled at 1/72 of an inch (the exact value evolved over history), was invented to describe the physical printed size of text conveniently. So yes, if you are writing a document in Word or other word-processing software for printing, you will get exactly one-inch-tall text if you use a 72pt font.
Second, the theoretical text height is usually different from the height of the rendered strokes you can actually see. The original idea of text height came from the physical type used for printing: all letters were cast on blocks that share the same body height, and that body height is what the point size measures. Depending on the letters and the font design, the visible part of the text may be a little shorter than this theoretical height. Helvetica Neue is actually very standard in this respect: if you measure from the top of a letter "k" to the bottom of a letter "p", it will match the font's point height.
Third, computer displays muddled DPI, and the definition of the point along with it. The resolution of a computer display is described by its native pixels, such as 1024 x 768 or 1920 x 1080. Software doesn't really care about the physical size of your monitor, because the physical resolution isn't high enough to scale screen content to true size the way printing does without everything turning fuzzy. Instead, software uses a very simple, fixed convention: one DPI value no matter which monitor you use. For Windows it's 96 DPI; for Mac it's 72 DPI. That is to say, no matter how many physical pixels make up an inch on your monitor, software ignores it. When the operating system renders text at 72pt, it is always 96px high on Windows and 72px high on Mac. (That's why Microsoft Word documents always look smaller on a Mac and you usually need to zoom to 125%.)
Finally, on iOS it's very similar. Whether it's an iPhone, iPod touch, iPad, or Apple Watch, iOS uses a fixed 72 DPI for non-Retina screens, 144 DPI for @2x Retina displays, and 216 DPI for the @3x Retina display used on the iPhone 6 Plus.
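A toy sketch of that model (the 72/144/216 figures are the mental model described above, not constants exposed by any iOS API):

```objc
CGFloat scale = [UIScreen mainScreen].scale;   // 1.0, 2.0 or 3.0
// One point is 1/72 of a *logical* inch, so under this fixed-DPI model:
CGFloat logicalDPI = 72.0 * scale;             // 72, 144 or 216
NSLog(@"logical DPI %.0f: 72pt of text spans about %.0f px, whatever the panel's physical DPI",
      logicalDPI, 72.0 * scale);
```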
Forget about real inches. They only exist in actual printing, not on displays. For software drawing text on your screen, a point is just an artificial ratio to physical pixels.
I first wondered if this had something to do with the way [CSS pixels are defined at 96 per "inch"][1] while UI layout points are defined at 72 per "inch". (Where, of course, an "inch" has nothing to do with a physical inch.) Why would web standards factor into UIKit business? Well, you may note when examining stack traces in the debugger or crash reports that there's some WebKit code underlying a lot of UIKit, even when you're not using UIWebView. Actually, though, it's simpler than that.
First, the font size is measured from the lowest descender to the highest ascender in regular Latin text -- e.g. from the bottom of the "j" to the top of the "k", or for convenient measure in a single character, the height of "ƒ". (That's U+0192 "LATIN SMALL LETTER F WITH HOOK", easily typed with option-F on a US Mac keyboard. People used it to abbreviate "folder" way back when.) You'll notice that when measured with that scheme, the height in pixels (on a 1x display) matches the specified font size -- e.g. with [UIFont systemFontOfSize:14], "ƒ" will be 14 pixels tall. (Measuring the capital "A" only accounts for an arbitrary portion of the space measured in the font size. That portion may change at smaller font sizes; when rendering font vectors to pixels, "hinting" modifies the results to produce more legible onscreen text.)
However, fonts contain all sorts of glyphs that don't fit into the space defined by that metric: letters with diacritics above an ascender in Eastern European languages, and all kinds of punctuation marks and special characters that need a much larger "layout box". (See the Math Symbols section in Mac OS X's Special Characters window for plenty of examples.)
In the CGSize returned by -[NSString sizeWithFont:], the width accounts for the specific characters in the string, but the height only reflects the number of lines. Line height is a metric specified by the font, and related to the "layout box" encompassing the font's largest characters.
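A small sketch to illustrate (using the same deprecated API as the question):

```objc
UIFont *font = [UIFont systemFontOfSize:30.0];
CGSize a    = [@"A" sizeWithFont:font];
CGSize jay  = [@"j" sizeWithFont:font];
CGSize word = [@"Åjp" sizeWithFont:font];
// The widths differ with the characters, but all three heights are identical:
// the font's lineHeight, regardless of which glyphs actually appear.
NSLog(@"heights: %.1f %.1f %.1f; lineHeight: %.1f",
      a.height, jay.height, word.height, font.lineHeight);
```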